
White House Issues New Guidelines on Software Supply Chain Security – What Are the Challenges and Possible Outcomes?

What are the implications of the new White House guidelines on cyber security?

The White House and the Executive Office of the President have just issued a memorandum for the heads of U.S. government and federal executive departments and agencies for enhancing the security of the software supply chain through secure software development practices.

The guidelines come soon after discussions conducted at the August 2022 U.S. Open Source Software Initiative Workshop, which I attended. The workshop was hosted by the Executive Office of the President of the United States, the National Science Foundation (NSF), and the National Institute of Standards and Technology (NIST). It brought together stakeholders from the open-source software (OSS) community, the private sector, academia, and the U.S. Government to improve the security of the open-source software ecosystem, and it built upon previous U.S. Government activity in this space.

The U.S. Congress recently passed the “Strengthening American Cybersecurity Act of 2022”, which followed the executive order on cybersecurity issued by the office of the U.S. President in the spring of 2021 and built upon the “Federal Information Security Modernization Act of 2022”.

The 2022 cybersecurity act authorizes an escalation of security due diligence activity to avoid threats against critical infrastructure and the federal government. Among a raft of measures, it requires the Cybersecurity and Infrastructure Security Agency (CISA) to perform ongoing and continuous assessments of federal risk posture and it demands full reporting and other actions to address cybersecurity incidents.

The new guidelines consolidate this by identifying the specific responsibilities of agencies and the government in policing software and protecting the software supply chain. Federal agencies must now only use software provided by software producers who can attest to complying with the government-specified secure software development practices. The main points of the guidelines are:

  • Agencies are required to obtain a self-attestation for all third-party software from the software producer before using the software. This includes software renewals and major version changes.
  • Agencies may obtain from software producers artifacts that demonstrate conformance to secure software development practices, as needed, the primary example being software bills of materials (SBOMs); a minimal illustrative example follows this list.
  • Agencies shall inventory all software subject to the requirements of the memorandum, with a separate inventory for “critical software.”
  • Agencies shall develop a consistent process to communicate relevant requirements in this memorandum to vendors, and ensure attestation letters not posted publicly by software providers are collected in one central agency system.
  • Agency CIOs, in coordination with agency-requiring activities and agency CAOs, shall assess organizational training needs and develop training plans for the review and validation of full attestation documents and artifacts.
  • The President’s Office of Management and Budget (OMB), in consultation with CISA and the General Services Administration (GSA), will establish requirements for a centralized repository for software attestations and artifacts, with appropriate mechanisms for protection and sharing among federal agencies.
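
For illustration, here is a minimal sketch of what such an SBOM artifact might look like, built and serialized in Python in a CycloneDX-style JSON layout. The component name, version, and package URL are hypothetical, and a real SBOM would enumerate every component and dependency in the product, not just one:

    import json

    # A minimal, hypothetical SBOM in the CycloneDX JSON format.
    # Component name, version, and package URL are illustrative only.
    sbom = {
        "bomFormat": "CycloneDX",
        "specVersion": "1.4",
        "version": 1,
        "components": [
            {
                "type": "library",
                "name": "example-http-client",
                "version": "2.3.1",
                "purl": "pkg:pypi/example-http-client@2.3.1",
                "licenses": [{"license": {"id": "Apache-2.0"}}],
            },
        ],
    }

    # Serialize the SBOM so it can be attached as a procurement artifact.
    print(json.dumps(sbom, indent=2))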

Why have these guidelines been published now? What might be the most significant considerations that arise from the U.S. government’s activity? Where does the cybersecurity industry stand on them, and what are the potential implications? Here’s my view.

Related: A Guide to Standard SBOM Formats

Significant considerations

I think four significant considerations explain why the U.S. government has now shone a spotlight on the issue of cybersecurity:

  • Cyberthreats have escalated considerably in frequency and severity over the last two years. In 2020, a number of federal agencies and large corporations were compromised by malicious code that was added into the SolarWinds software. Such vulnerabilities have seriously threatened the delivery of government services and put at risk huge amounts of personal information and business data in the private sector.
  • In the briefing document that accompanies the memorandum of new guidelines, the U.S. government articulated its concerns about open source software that originates from criminal organizations, or from locations or foreign governments whose interests are deemed incompatible with, or in opposition to, those of the U.S. and which seek ways to compromise U.S. digital infrastructure.
  • Organizations that seek to work with federal authorities must be completely transparent as regards their software supply chain. They must divulge what components and dependencies they use and what risks these pose, and they must ensure that their usage is compliant with all usage licenses. To all intents and purposes, this is a requirement that all contractors of the U.S. government and its agencies provide a comprehensive SBOM.
  • Organizations are not allowed to sell software to the U.S. government and associated federal agencies if it contains any known vulnerabilities. Congress aims to ensure that no vulnerabilities reach these bodies, either knowingly or unknowingly, but it remains to be seen whether this “zero vulnerability” goal is practical.

Industry reception

The software and cybersecurity industries have so far cautiously welcomed these recommendations. On one hand, they demonstrate that the U.S. government takes the issue of cybersecurity increasingly seriously, and where government leads, business will follow, especially when lucrative government contracts are at stake.

However, there is some skepticism about the feasibility of some of the aspirations of the government and its advisors, in particular the aspiration to completely eradicate vulnerabilities from the government’s software supply chain. The escalating use of open source software, and the frequency and volume of changes in components, dependencies, and updates, make this impractical, if not impossible. And while this goal is understandable as an ideal, it is arguably somewhat simplistic, because it doesn’t take into consideration techniques such as effective usage analysis and prioritization, which enable organizations to streamline and accelerate security scanning and remediation by checking whether vulnerable code is actually used in any given context.

Why context is important

Let’s say you have a library containing 100 different utility functions, but you only use five of them. There might be a critical vulnerability in one of the other 95, but if those functions are never invoked and are thus inactive, they pose no threat. One of the things that a solution like Mend can do is check whether you’re actually calling a vulnerable component or function. If it’s secure, you get a green light, so to speak, and you can use it. If not, it’s blocked, so you’re not exposed to risk, because you can’t be exploited through code that never executes. Prioritizing vulnerabilities in this way is faster and more efficient than trying to detect and fix every vulnerability, which is impractical.
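
To make the idea concrete, here is a minimal sketch of this kind of reachability check in Python. The call graph and all function names are hypothetical, and a real tool such as Mend derives the call graph from the code itself, but the principle is the same: a vulnerability is only “effective” if the vulnerable function can actually be reached from the application’s entry point.

    # Minimal sketch of "effective usage" analysis: a vulnerability only matters
    # if the vulnerable function is reachable from the application's entry point.
    # The call graph and vulnerable-function list below are hypothetical.
    from collections import deque

    call_graph = {
        "app.main": ["utils.parse_config", "utils.send_request"],
        "utils.parse_config": [],
        "utils.send_request": ["utils.encode_payload"],
        "utils.encode_payload": [],
        "utils.legacy_deserialize": [],  # vulnerable, but never called
    }

    vulnerable_functions = {"utils.legacy_deserialize", "utils.encode_payload"}

    def reachable_from(entry_point, graph):
        """Return every function reachable from the given entry point."""
        seen, queue = set(), deque([entry_point])
        while queue:
            fn = queue.popleft()
            if fn in seen:
                continue
            seen.add(fn)
            queue.extend(graph.get(fn, []))
        return seen

    reachable = reachable_from("app.main", call_graph)
    effective = vulnerable_functions & reachable   # actually invokable: prioritize
    inactive = vulnerable_functions - reachable    # present but never called

    print("Prioritize:", sorted(effective))    # ['utils.encode_payload']
    print("Deprioritize:", sorted(inactive))   # ['utils.legacy_deserialize']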

Similarly, from a methodological perspective, it’s valuable to consider the context in which software or components are deployed, because different contexts can either increase or reduce risk. For instance, if you’re using software strictly for internal purposes, such as a travel expenses application, and it carries a potential DDoS risk, that risk is unlikely to have a serious impact, because the potential issue stays within your organization. If the application is only exposed on a single internal IP address, the attack can’t be widely distributed, so the vulnerability is not really effective in this context.
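
As an illustration only, a prioritization step along these lines might weight a finding’s base severity by the deployment context of the affected application. The context categories and weights below are hypothetical assumptions, not an industry standard:

    # Illustrative only: a toy prioritization that scales a finding's base
    # severity by the deployment context of the affected application.
    # The context categories and weights are hypothetical assumptions.
    CONTEXT_WEIGHTS = {
        "internet_facing": 1.0,   # exposed to arbitrary external traffic
        "internal_only": 0.4,     # e.g., an internal travel-expenses app
        "air_gapped": 0.1,        # no network exposure at all
    }

    def contextual_priority(base_severity: float, context: str) -> float:
        """Scale a CVSS-like base score (0-10) by deployment context."""
        return round(base_severity * CONTEXT_WEIGHTS[context], 1)

    # The same DDoS-style finding (base score 7.5) matters far less internally.
    print(contextual_priority(7.5, "internet_facing"))  # 7.5
    print(contextual_priority(7.5, "internal_only"))    # 3.0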

Support will be forthcoming

Nevertheless, the industry recognizes that the U.S. government is trying to move the needle in the right direction. To start with, it’s trying to push the use of SBOMs, vulnerability detection, and minimization. It’s quite open to hearing from the industry and learning from us. That’s positive, and consequently, I think the industry will get behind what it’s doing and why. However, the challenge may prove to be in the implementation.

Any regulation is difficult. Nearly all regulation has unintended consequences, and if, in the process of implementation, people aren’t convinced by the outcomes, or feel it’s pointless, then we’ll lose them. They won’t believe the process will make any effective difference, and they’ll feel it’s just additional bureaucracy. It’s therefore really important that the industry and the government keep the channels of communication about these unintended consequences open, so that together we find the best ways to address the challenge of software vulnerability management. In my view, it’s up to our industry to actively push the agenda and work proactively with the government to make regulations as effective as possible. After all, the nature of security is that it continually changes, and any policies that are put in place will benefit from an ongoing relationship with the very industry whose raison d’être is to keep abreast of new security challenges.

Encouraging a “trickle-down” of best practice

Just as importantly, the U.S. government can be a kind of “market-maker.” What I mean by this is that it can influence the way both the public and private sectors behave, thanks to its active promotion of issues such as cybersecurity. This extends beyond imposing regulations. It sets a precedent for what it believes to be best practice, and it creates a trickle-down effect.

I’ll illustrate this. If the government requires its suppliers to have fully documented software with robust vulnerability fixes, the size and value of its contracts make it worthwhile for suppliers to have these measures in place. We can reasonably assume that suppliers to the U.S. government are most likely to be the big beasts of the software world, such as Microsoft and Oracle. In turn, these big corporations will want their suppliers to show similar due diligence, so they can confidently assure the government that their software supply chains are secure. Theoretically, in time, everyone, from the largest corporations to the smaller SMBs, will have an incentive to prioritize software and application security.

The U.S. government has embarked upon a journey toward best practice, and I believe we should support it. It’s both a challenge and an opportunity for us in the security sector. It’s a challenge because we must be engaged with the government, however tricky the consultation process might be, and we must advise it wisely so that it introduces the most effective measures to safeguard software and application security. But it is also a great opportunity to nurture a valuable partnership with the government and encourage it to implement the best possible solutions that will be adopted throughout business and industry – which, ultimately, will benefit us all.

Want to know more about the importance of SBOMs in protecting the software supply chain?

Meet The Author

Rhys Arkins

Rhys Arkins is Vice President of Product Management, responsible for developer solutions at Mend.io. He was the founder of Renovate Bot – an automated tool for software dependency updating, which was acquired by Mend.io in 2019. Rhys is particularly fond of automation and a firm believer in never sending humans to do a machine’s job.
