
Shifting to a Secure Development Lifecycle

Jason R. Weiss
June 1, 2023
5 min read

SDLC is an acronym that commonly resolves to one of two meanings: Systems Development Lifecycle or Software Development Lifecycle. Today, a third turning point in software experience is upon us, Security Experience (SX), because the risks in the software powering our software-defined world have become unacceptably high. The emphasis on SX has led to a growing recognition that SDLC carries a third definition: Secure Development Lifecycle, with an explicit emphasis on supply chain risk management (SCRM).

A supply chain is simply the collection of ingredients and activities required to produce something. For example, consider plywood. Plywood is a key ingredient in building homes, and wood is plentiful in North America. Plywood manufacturing requires adhesives, and adhesive manufacturing requires chemicals. When storms disrupt the Gulf Coast refinery operations required to create certain chemicals, the effect cascades through the system, impacting the ability to build homes in New England. Supply chains are messy and complicated, and where there is only one manufacturer operating one plant that creates the specialty adhesive required for plywood manufacturing, the supply chain breaks.

In the 1990s, when Client/Server architecture was dominant, software development required nothing more than your tools of choice, say PowerBuilder and a Microsoft SQL Server database. That was the end of the list! Tools like PowerBuilder were complete development monoliths. It was your code editor, your only code editor; it managed the source code check-in/check-out process; it was used to build both client-side business logic in PowerScript and server-side logic in SQL; it included built-in database drivers; and it had a finite set of user interface widgets to build from. If there was a problem with the code, it offered the integrated debugger. There were only two roles: a PowerBuilder developer, in this example, and a database administrator.

Modern software development is profoundly different. Popular integrated development environments (IDEs) are language agnostic, working equally well whether the language is JavaScript, Go, or Rust. While git is the dominant software configuration management tool, the git experience varies across implementations: GitHub, GitLab, Bitbucket, Harness, etc. Languages like Go rely on a suite of command line tools with their own unique flags, and there are multiple debuggers to choose from, each with its own strengths and weaknesses. Containerization and orchestration with Kubernetes introduce another discrete set of tools that range from support for declarative configuration to service mesh design and configuration. Modern web applications sit behind web application firewalls (WAFs) and load balancers, and application performance monitoring (APM) tooling is a prerequisite for successful debugging in a complex microservices architecture. The Cloud provider selected to host the application introduces yet another layer of complexity.

The modern software supply chain is big and messy. The supply chain has gone from requiring only two pieces of software and two roles to requiring dozens of highly specialized tools split across multiple disciplines: backend engineer, frontend engineer, architect, DevOps engineer, Cloud engineer. The most prolific Go developer may have neither experience with nor an understanding of the setup and configuration required to host Kubernetes in a specific Cloud provider.

And herein lies the problem in a world where cyber threat actors are looking for ways to compromise software. Today, software defaults to a permissive state, and hardening guides are available for download for folks looking to deploy it more securely. The vast majority of these developer tools are open source, and vetting the authenticity of an open source contributor is particularly challenging. At its core, this means each tool needs to be uniquely tuned and vetted to become production ready. As DevOps folks seek to automate tasks and activities that are performed repeatedly, they often glue tools together in ways the individual tool creators may never have considered. All of this creates an attack surface so expansive that no one person can track the complete software supply chain of a modern application. And that is a huge problem.
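To make that vetting step concrete, here is a minimal sketch in Go of one common control: comparing a downloaded tool binary's SHA-256 digest against a value pinned when the tool was approved. The tool name, the placeholder digest, and the verifytool command are illustrative assumptions, not part of any particular product.

```go
// verifytool: a minimal sketch of vetting a tool binary by comparing its
// SHA-256 digest against a value pinned at approval time.
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"os"
)

// pinnedDigests maps an approved tool to the SHA-256 digest recorded when
// it was vetted. The value below is a hypothetical placeholder.
var pinnedDigests = map[string]string{
	"kubectl": "REPLACE-WITH-DIGEST-RECORDED-AT-VETTING-TIME",
}

// sha256File streams a file through SHA-256 and returns the hex digest.
func sha256File(path string) (string, error) {
	f, err := os.Open(path)
	if err != nil {
		return "", err
	}
	defer f.Close()

	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		return "", err
	}
	return hex.EncodeToString(h.Sum(nil)), nil
}

func main() {
	// Usage: verifytool <tool-name> <path-to-binary>
	if len(os.Args) != 3 {
		fmt.Fprintln(os.Stderr, "usage: verifytool <tool-name> <path>")
		os.Exit(2)
	}
	name, path := os.Args[1], os.Args[2]

	want, ok := pinnedDigests[name]
	if !ok {
		fmt.Fprintf(os.Stderr, "%s is not in the approved tooling inventory\n", name)
		os.Exit(1)
	}

	got, err := sha256File(path)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if got != want {
		fmt.Fprintf(os.Stderr, "digest mismatch for %s: got %s\n", name, got)
		os.Exit(1)
	}
	fmt.Printf("%s verified against pinned digest\n", name)
}
```

A check like this catches a tampered or substituted binary, but only for tools someone remembered to pin, which is exactly why the inventory question below matters.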

If you do not know what you have, how can you ensure it is secured?

Can you point to a singular location that captures, with 100% accuracy, every single tool used by every single member of the software team to design, create, test, deploy, and operate the application? If a development team cannot provide this on demand with full accuracy, and cannot attest to the veracity and authenticity of each of those tools, then it cannot be sure that the software touching its source code, intermediate artifacts, or final executable is not creating an attack surface that could be, or already is being, exploited by a cyber attacker.

The journey to a secure development lifecycle capable of creating resilient software cannot be depicted as a marathon. Why? Because a marathon has a clear start and end, and we all know that software is never done. Software teams must begin the arduous process of assembling a complete list of the tools used across all roles and all environments (dev/test/prod), and their SDLC needs to establish a governance process to ensure that when a backend, frontend, architect, DevOps, or Cloud engineer decides to introduce a new tool into the process, that tool is vetted and captured in the tooling inventory list.
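What might one entry in that tooling inventory look like? Below is a minimal sketch in Go; the schema, field names, and the sample kubectl entry are assumptions for illustration, not an established standard.

```go
// inventory: a minimal sketch of one record in a tooling inventory.
package main

import (
	"encoding/json"
	"fmt"
)

// Tool captures who uses a tool, where it runs, and how it was vetted.
type Tool struct {
	Name         string   `json:"name"`
	Version      string   `json:"version"`
	Supplier     string   `json:"supplier"`
	SourceURL    string   `json:"sourceUrl"`
	SHA256       string   `json:"sha256"`
	Roles        []string `json:"roles"`        // e.g., backend, DevOps
	Environments []string `json:"environments"` // dev/test/prod
	ApprovedBy   string   `json:"approvedBy"`
	ApprovedOn   string   `json:"approvedOn"`
}

func main() {
	// A hypothetical, illustrative entry; the digest is a placeholder.
	entry := Tool{
		Name:         "kubectl",
		Version:      "1.27.2",
		Supplier:     "Kubernetes project",
		SourceURL:    "https://dl.k8s.io/release/v1.27.2/bin/linux/amd64/kubectl",
		SHA256:       "placeholder-digest-recorded-at-vetting-time",
		Roles:        []string{"DevOps engineer", "Cloud engineer"},
		Environments: []string{"dev", "test", "prod"},
		ApprovedBy:   "security-review-board",
		ApprovedOn:   "2023-05-15",
	}

	// Emit the record as JSON; in a sketch we ignore the marshal error.
	out, _ := json.MarshalIndent(entry, "", "  ")
	fmt.Println(string(out))
}
```

In practice, records like this would live in version control alongside the governance process itself, so that adding or upgrading a tool leaves an auditable trail.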

In the next blog in this series, we’ll take a deeper dive into the specifics of software supply chain risk management and we’ll also explore the cardinality of the software bill of materials (SBOM).
