An overview & case study
Effective software supply chain pipelines are integral to businesses and organizations that depend on new digital products and services.
Whether they are looking to engage with customers in different ways, open new revenue channels, or streamline operations, companies need to be able to deliver these new services quickly and predictably with high levels of user satisfaction.
That is much easier said than done, though. Modern software supply chain pipelines have continued to grow in complexity and sophistication as a result of fragmented toolsets, open-source software adoption and its attendant zero-day vulnerabilities, microservices architectures, and hybrid and multi-cloud deployments.
Managing those pipelines involves the thoughtful planning and implementation of a combination of processes, including Continuous Integration, Continuous Delivery, Continuous Deployment, and Continuous Testing. In some cases, this may also require specialized expertise and tools.
But for those organizations that can master these processes, the benefits are significant.
Metadata gathered throughout the pipeline can be stored in both raw and processed formats to further automate, govern, analyze, and audit the produced software package and its deployment. This can transform an organization’s software supply chain pipeline from process-driven to data-driven.
(Standard Modern Software Supply Chain Pipeline Flow)
Different organizations have different compliance, security, and acceptance criteria depending on their business domain and practices.
These criteria must be met before promoting new features to production. Performing these tasks manually can be time-consuming and error-prone, and can impact the overall quality and timeliness of software features. In many cases, these criteria are based on rigid pass/fail mechanisms that do not accurately portray the quality of a prospective software service.
Instead, leading delivery organizations today can create custom policies which audit the metadata in the software supply chain pipeline and use it to automatically govern, analyze, and deploy new software packages to the production environment. See the case study later in this paper for an example of this more automated approach.
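To illustrate, the sketch below shows one way a promotion policy could audit pipeline metadata automatically. The field names and thresholds are hypothetical examples for this paper, not a reference to any specific client implementation or product API.

# Illustrative policy check over pipeline metadata. All field names and
# thresholds are hypothetical examples, not a specific product's API.
from dataclasses import dataclass

@dataclass
class BuildMetadata:
    unit_test_pass_rate: float   # 0.0 - 1.0
    code_coverage: float         # 0.0 - 1.0
    open_critical_vulns: int
    api_spec_validated: bool

def meets_promotion_policy(m: BuildMetadata) -> bool:
    """Return True only if every governance criterion is satisfied."""
    return (
        m.unit_test_pass_rate == 1.0
        and m.code_coverage >= 0.80
        and m.open_critical_vulns == 0
        and m.api_spec_validated
    )

if __name__ == "__main__":
    candidate = BuildMetadata(1.0, 0.85, 0, True)
    print("promote" if meets_promotion_policy(candidate) else "hold")

Because the decision is computed from data the pipeline already collects, the same check can run on every build without manual review.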
The benefits realized by leading delivery organizations that are able to automate their supply chain governance are substantial.
The healthcare organization in this case study has built its reputation on patient care and is well known for providing its patients with exceptional experiences and quality outcomes.
This commitment extended to all the ways its patients interacted with the organization, including an increasing number of digital patient services.
Each digital product or service had to meet a long, strict list of acceptance criteria before it could be promoted to a higher-level environment. The checklist included more than 50 items that had to be executed and analyzed manually before a software release could be considered ready for production – or meet the definition of done (DoD). This significantly impacted the organization’s ability to deliver new features and versions of its software services.
To better understand how to optimize this process, the company’s digital team reviewed every single recent release from concept to production.
That analysis revealed a highly manual, time-consuming, and error-prone DoD adherence process. From there, the decision was made to automate it with the help of Apexon.
The Apexon team started by helping the company understand how the data from the existing tools could be leveraged to quickly assess whether a software service element met the acceptance criteria.
Working together, we then built a solution to push metadata into the pipeline and automate analysis of that data, checking whether a release met both pre- and post-commit acceptance criteria and governing deployments accordingly.
For pre-commit quality checks, the client installed a set of tools and hooks that triggered checks every time a developer attempted to commit code to the local repository. The system permitted the commit only if all quality checks and gates were cleared.
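As a minimal sketch of how such a gate can work, the following script could be invoked from a Git pre-commit hook. The secret-matching pattern is an illustrative assumption; a real gate would use the team’s own rules and scanners.

#!/usr/bin/env python3
# Sketch of a pre-commit check that blocks commits containing what looks
# like a hard-coded API key. The regex is an illustrative assumption.
import re
import subprocess
import sys

KEY_PATTERN = re.compile(
    r"""(api[_-]?key|secret)\s*[:=]\s*['"][A-Za-z0-9]{16,}['"]""",
    re.IGNORECASE,
)

def staged_files() -> list[str]:
    """List files staged for the current commit."""
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
        capture_output=True, text=True, check=True,
    )
    return [f for f in out.stdout.splitlines() if f]

def main() -> int:
    for path in staged_files():
        try:
            text = open(path, encoding="utf-8", errors="ignore").read()
        except OSError:
            continue
        if KEY_PATTERN.search(text):
            print(f"Blocked: possible hard-coded API key in {path}")
            return 1  # a non-zero exit status aborts the commit
    return 0

if __name__ == "__main__":
    sys.exit(main())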
For post-commit quality checks, the client’s CI engine executed a set of scripts and tests after the code had been pushed to the remote repository, and updated the status once the artifact cleared all quality checks and gates.
The solution is entirely modular: each DoD item is an independent, plug-and-play module. This makes it flexible enough for additional quality gates to be added quickly.
Apexon provided a framework with Git hooks that acts as a quality gate: code that does not meet the DoD is flagged and held. The DoD items are verified at commit time to ensure there are no hard-coded API keys and to validate the schema of the OpenAPI specification stored in the code.
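The snippet below sketches the second kind of commit-time check under simple assumptions: it only confirms that a spec file (here the hypothetical openapi.yaml) parses and contains the required top-level OpenAPI fields. A production gate would run a full validator.

# Minimal sketch of a commit-time OpenAPI structure check. The file path
# is a hypothetical example; a real gate would use a full spec validator.
import sys
import yaml  # PyYAML

REQUIRED_TOP_LEVEL = ("openapi", "info", "paths")

def check_openapi_spec(path: str) -> list[str]:
    """Return a list of structural problems found in the spec file."""
    with open(path, encoding="utf-8") as fh:
        spec = yaml.safe_load(fh) or {}
    errors = [f"missing required field: {key}"
              for key in REQUIRED_TOP_LEVEL if key not in spec]
    if not isinstance(spec.get("paths", {}), dict):
        errors.append("'paths' must be a mapping of routes")
    return errors

if __name__ == "__main__":
    problems = check_openapi_spec("openapi.yaml")  # hypothetical path
    for p in problems:
        print(p)
    sys.exit(1 if problems else 0)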
Apexon also provided a framework within the Jenkins pipeline that offered an automated way to (sketched in part after this list):
Simulate error paths and automatically validate the corresponding exception/error codes and generated application logs.
Run APIs and inspect the response headers to confirm they match the Swagger file.
Check for unauthenticated POST calls upon commit.
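The sketch below illustrates the last two checks in that list, using the requests library. The base URL, endpoint path, and expected headers are hypothetical placeholders for values a real pipeline would read from the Swagger/OpenAPI file.

# Illustrative post-commit API checks. URL, path, and expected headers
# are invented placeholders, not the client's actual configuration.
import requests

BASE_URL = "http://localhost:8080"  # assumed test deployment
EXPECTED_HEADERS = {"Content-Type": "application/json"}  # from the spec

def headers_match_spec(path: str) -> bool:
    """Call the API and confirm the declared response headers appear."""
    resp = requests.get(f"{BASE_URL}{path}", timeout=10)
    # Substring match tolerates suffixes such as "; charset=utf-8".
    return all(v in resp.headers.get(k, "") for k, v in EXPECTED_HEADERS.items())

def post_requires_auth(path: str) -> bool:
    """An unauthenticated POST should be rejected (401/403), not accepted."""
    resp = requests.post(f"{BASE_URL}{path}", json={}, timeout=10)
    return resp.status_code in (401, 403)

if __name__ == "__main__":
    ok = headers_match_spec("/patients") and post_requires_auth("/patients")
    print("quality gate:", "passed" if ok else "failed")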
Apexon also delivered a DoD dashboard for viewing the evaluation status of all the DoD items and quality gates analyzed.
Clicking a DoD item on the dashboard displays its Passed/Failed status and other relevant information.
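As an illustration only, the per-item status records behind such a dashboard might look like the following; every field name and value here is invented for the example.

# Invented example of per-item DoD status records a dashboard could render.
dod_statuses = [
    {"item": "No hard-coded API keys", "status": "Passed"},
    {"item": "OpenAPI schema valid", "status": "Passed"},
    {"item": "Unauthenticated POST rejected", "status": "Failed",
     "detail": "POST /patients returned 200 without credentials"},
]

# Clicking an item in the UI would surface the optional "detail" field.
for record in dod_statuses:
    print(f"{record['item']}: {record['status']}")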
The initial proof-of-concept (POC) implementation showed that the automated framework reduces the cost and time required for DoD adherence while improving quality.
It also advanced the client’s cause of delivering an exceptional patient care experience.