The White House and the Executive Office of the President have just issued a memorandum directing the heads of federal executive departments and agencies to enhance the security of the software supply chain through secure software development practices.
The guidelines come soon after discussions conducted at the August 2022 U.S. Open Source Software Initiative Workshop, which I attended. The workshop was hosted by the Executive Office of the President of the United States, the National Science Foundation (NSF), and the National Institute of Standards and Technology (NIST). It brought together stakeholders from the open-source software (OSS) community, the private sector, academia, and the U.S. Government, to improve the security of the open-source software ecosystem. And it built upon previous U.S. Government activity in this space.
The U.S. Congress recently passed the “Strengthening American Cybersecurity Act of 2022,” which consolidated the executive order on cybersecurity issued by the office of the U.S. President in the spring of 2021 (Executive Order 14028) and built upon the “Federal Information Security Modernization Act of 2022.”
The 2022 cybersecurity act authorizes an escalation of security due diligence activity to counter threats against critical infrastructure and the federal government. Among a raft of measures, it requires the Cybersecurity and Infrastructure Security Agency (CISA) to perform ongoing and continuous assessments of federal risk posture, and it demands full reporting and other actions to address cybersecurity incidents.
The new guidelines consolidate this by identifying the specific responsibilities of agencies and the government in policing software and protecting the software supply chain. Federal agencies may now only use software from producers who can attest to complying with the government-specified secure software development practices.
Why have these guidelines been published now? What might be the most significant considerations that arise from the U.S. government’s activity? Where does the cybersecurity industry stand on them, and what are the potential implications? Here’s my view.
I think four significant considerations arise that explain why the U.S. government has now shone a spotlight on the issue of cybersecurity.
The software and cybersecurity industries have so far cautiously welcomed these recommendations. On one hand, they demonstrate that the U.S. government takes the issue of cybersecurity increasingly seriously, and where government leads, business will follow, especially when lucrative government contracts are at stake.
However, there is some skepticism about the feasibility of some of the government’s aspirations, in particular the goal of completely eradicating software vulnerabilities from the government’s software supply chain. The escalating use of open source software, and the frequency and volume of changes to components, dependencies, and updates, make this impractical, if not impossible. While this standpoint is understandable as an ideal, it is arguably somewhat simplistic, because it doesn’t take into consideration techniques like effective usage analysis and prioritization, which enable organizations to streamline and accelerate security scanning and remediation by checking whether vulnerabilities are actually exploitable in any given context.
Why context is important
Let’s say you have a library containing one hundred different utility functions, but you only use five of them. There might be a critical vulnerability in one of the other ninety-five, but if those functions are never invoked and are thus inactive, they pose no threat. One of the things that a solution like Mend can do is check whether you’re actually calling a vulnerable component or function. If it’s secure, you get a green light, so to speak, and you can use it. If not, it’s blocked, so you’re not exposed to risk, because code that can never execute can’t be exploited. Prioritizing vulnerabilities this way is faster and more efficient than trying to detect and fix every vulnerability, which is impractical.
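The idea behind this kind of effective usage analysis can be sketched as a reachability check over a call graph. The function names and the graph below are purely illustrative, not Mend’s actual implementation: a vulnerability is only flagged if the vulnerable function can be reached from the application’s entry point.

```python
# Toy "effective usage" check: a vulnerability matters only if the
# vulnerable function is reachable from the application's entry point.
# The call graph and function names are illustrative assumptions.

from collections import deque

# call_graph[f] = functions that f calls directly
call_graph = {
    "main": ["parse_config", "format_report"],
    "parse_config": ["read_file"],
    "format_report": [],
    "read_file": [],
    # The library ships many more functions that are never called:
    "legacy_deserialize": ["unsafe_eval"],  # vulnerable, but unreachable
    "unsafe_eval": [],
}

def reachable(entry: str) -> set:
    """Return every function reachable from `entry` via breadth-first search."""
    seen = {entry}
    queue = deque([entry])
    while queue:
        for callee in call_graph.get(queue.popleft(), []):
            if callee not in seen:
                seen.add(callee)
                queue.append(callee)
    return seen

vulnerable = {"unsafe_eval"}        # functions with known CVEs
active = reachable("main")
exploitable = vulnerable & active   # empty set => the flaw is inactive
```

Here `unsafe_eval` carries a known flaw, but because no path from `main` ever reaches it, the intersection is empty and the finding can be deprioritized.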
Similarly, from a methodological perspective, it’s valuable to consider the context in which software or components are deployed, because different contexts can either increase or reduce risk. For instance, if you’re using software strictly for internal purposes, such as a travel expenses application, and there’s a potential DDoS risk in it, it’s unlikely to have a serious impact, because the potential issue stays within your organization. If the application is exposed on only a single internal IP address, it can’t be attacked at scale, so the vulnerability is not really effective in this context.
Support will be forthcoming
Nevertheless, the industry recognizes that the U.S. government is trying to move the needle in the right direction. To start with, it’s trying to push the use of SBOMs, vulnerability detection, and minimization. It’s quite open to hearing from the industry and learning from us. That’s positive, and consequently, I think the industry will get behind what it’s doing and why. However, the challenge may prove to be in the implementation.
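To make the SBOM push concrete: an SBOM is simply a machine-readable inventory of the components a piece of software ships with. A minimal fragment in the CycloneDX JSON format might look like the following (the component shown is just an example entry, not a recommendation):

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "version": 1,
  "components": [
    {
      "type": "library",
      "name": "log4j-core",
      "version": "2.14.1",
      "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"
    }
  ]
}
```

Given such an inventory, an agency or its suppliers can automatically match listed components and versions against vulnerability databases, which is precisely what makes SBOMs useful for supply chain assurance.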
Firstly, any regulation is difficult. Nearly all regulation has unintended consequences, and if, in the process of implementation, people aren’t convinced by the outcomes, or feel it’s pointless, then we’ll lose them. They won’t believe the process will make any effective difference, and they’ll feel it’s just additional bureaucracy. It’s therefore really important that the industry and the government keep the channels of communication about these unintended consequences open, so that together we find the best ways to address the challenge of software vulnerability management. In my view, it’s up to our industry to actively push the agenda and work proactively with the government to make regulations as effective as possible. After all, the nature of security is that it continually changes, and any policies that are put in place will benefit from an ongoing relationship with the very industry whose raison d’être is to keep abreast of new security challenges.
Encouraging a “trickle-down” of best practice
Just as importantly, the U.S. government can be a kind of “market-maker.” What I mean by this is that it can influence the way both the public and private sectors behave, thanks to its active promotion of issues such as cybersecurity. This extends beyond imposing regulations. It sets a precedent for what it believes to be best practice, and it creates a trickle-down effect.
I’ll illustrate this. If the government requires its suppliers to have fully documented software with robust vulnerability fixes, the size and value of its contracts make it worthwhile for suppliers to have these measures in place. We can reasonably assume that suppliers to the U.S. government are most likely to be the big beasts of the software world, such as Microsoft and Oracle. In turn, these big corporations will want their suppliers to show similar due diligence, so they can confidently assure the government that their software supply chains are secure. Theoretically, in time, everyone, from the largest corporations to the smaller SMBs, will have an incentive to prioritize software and application security.
The U.S. government has embarked upon a journey toward best practice, and I believe we should support it. It’s both a challenge and an opportunity for us in the security sector. It’s a challenge because we must be engaged with the government, however tricky the consultation process might be, and we must advise it wisely so that it introduces the most effective measures to safeguard software and application security. But it is also a great opportunity to nurture a valuable partnership with the government and encourage it to implement the best possible solutions that will be adopted throughout business and industry – which, ultimately, will benefit us all.
Want to know more about the importance of SBOMs in protecting the software supply chain? Read this white paper for more details.