This is part of an ongoing series of Covington blogs on implementation of Executive Order 14028, “Improving the Nation’s Cybersecurity,” issued by President Biden on May 12, 2021 (the “Cyber EO”). The first blog summarized the Cyber EO’s key provisions and timelines, and the subsequent blogs described the actions taken by various government agencies to implement the Cyber EO from June 2021 through March 2024. This blog describes key actions taken to implement the Cyber EO, as well as the U.S. National Cybersecurity Strategy, during April 2024. It also describes key actions taken during April 2024 to implement President Biden’s Executive Order on Artificial Intelligence (the “AI EO”), particularly its provisions that impact cybersecurity, national security, and secure software.
NIST Publishes Initial Draft Handbook on Secure IoT Development
On April 3, NIST released an initial public draft of a cybersecurity handbook that outlines considerations for developing and deploying Internet of Things (IoT) products across sectors. The handbook is intended to help identify and mitigate the risks that may be associated with these products. Among other things, it outlines approaches to cybersecurity in IoT products, including with respect to product architecture and deployment. Consistent with the government's focus on supply chain security, the handbook also addresses cybersecurity considerations relating to the hardware and software components of IoT products, and it provides examples of how these practices can be implemented, including with respect to deployment.
New FAR Part 40 Established
On April 10, the FAR Council released a Request for Information (RFI) relating to the final FAR rule establishing FAR Part 40, which will house information security and supply chain security requirements. The final rule was published in the Federal Register on April 1. The RFI proposes a two-part test to determine whether a requirement should appear in the new Part 40: if the scope of a security requirement applies beyond information and communications technology (ICT), it would be placed in new FAR Part 40; if the scope of the requirement is limited to ICT, it would remain in current FAR Part 39 (“Acquisition of Information Technology”). The FAR Council is seeking comments on the contents of this FAR part through June 10, 2024.
NSA Issues Guidance on Safe Deployment of AI
On April 15, the National Security Agency’s Artificial Intelligence Security Center released guidance on strengthening AI system security. The guidance focuses heavily on ensuring that known cybersecurity vulnerabilities in AI systems are appropriately mitigated, providing methodologies and controls to protect against, detect, and respond to malicious activity targeting AI systems and related data and services, and improving the confidentiality, integrity, and availability of AI systems. The document is intended to be used by organizations that are deploying and operating externally developed AI systems on premises or in private cloud environments, especially those in high-threat, high-value environments.
DHS Collaborates with Open Source Foundation to Release New Tool for Creating and Translating SBOMs
On April 16, the Open Source Foundation collaborated with the Department of Homeland Security Science and Technology Directorate and the Cybersecurity and Infrastructure Security Agency (CISA) to develop a new tool that allows organizations, including government organizations, to read and generate Software Bills of Materials (SBOMs). The tool is open source and therefore can be further developed as needs evolve. The tool, known as “protobom,” can be accessed and downloaded here. It is unclear how this tool, and/or others, may be relied on by agencies as they implement the secure software development framework that we have written about previously.
GAO Recommends That CISA Produce List of Critical Software Identified by Federal Agencies Pursuant to Cyber EO
On April 18, the Government Accountability Office (GAO) issued a report surveying the status of implementation of the Cyber EO. GAO found that agencies had implemented 16 of the 17 requirements in Section 4 of the EO, which addresses enhanced mechanisms to ensure the integrity of the software used by federal supply chain partners, but highlighted action needed in the remaining area. The report recommended, among other things, that CISA issue its list of software and software product categories that are considered to be critical software, that CISA direct the Cyber Safety Review Board to document steps taken or planned to implement the recommendations provided to the President for improving the board’s operations, and that OMB demonstrate that it has conducted cost analyses for the implementation of recommendations related to the sharing of threat information and resourcing needs for the implementation of an endpoint detection and response capability.
DOD Initiates Vulnerability Disclosure Program for Defense Contractors
On April 19, the Department of Defense (“DoD”) Cyber Crime Center (“DC3”) and the Defense Counterintelligence and Security Agency (“DCSA”) announced a new Defense Industrial Base Vulnerability Disclosure Program (“DIB-VDP”). The program stems from a one-year pilot conducted by DoD and will onboard and integrate participants so that vulnerability threat assessments can be conducted on those participants’ voluntarily identified assets and platforms.
CISA Issues Guidelines for Critical Infrastructure to Assess AI Risk
On April 29, the Cybersecurity and Infrastructure Security Agency (“CISA”) released AI safety and security guidelines for use by critical infrastructure owners and operators. The guidelines outline the findings of CISA’s cross-sector analysis of AI risks, including cross-sector AI use cases and patterns in adoption. The analysis focuses on three risk types: attacks using AI, attacks that target AI systems, and failures in AI design and implementation. The guidelines that arose from this analysis are intended to mitigate the identified cross-sector AI risks to critical infrastructure.
NIST Issues Four AI Guidance Documents
On April 30, 2024, the National Institute of Standards and Technology (“NIST”) issued four significant guidance documents pursuant to the AI EO. These documents are: (1) a draft generative AI companion guide for NIST’s Secure Software Development Framework (SSDF); (2) a draft generative AI profile for NIST’s AI Risk Management Framework; (3) a draft plan for global engagement on AI safety standards; and (4) draft guidance on “reducing risks posed by synthetic content.” Comments on each of these documents are due by June 2, 2024.
The draft generative AI companion guide for the SSDF may prove to be the most impactful of these documents for government contractors. Federal agencies are currently required by OMB Memoranda M-22-18 and M-23-16 to obtain “self-attestation forms” from producers of certain “software” used by the agency attesting that the software was developed in compliance with certain principles in the SSDF. Such self-attestations are required for “critical software” by June 8, 2024, and for non-critical software by September 8, 2024. The term “software” is broadly defined to include almost all types of software, including products that contain software. Thus, contractors may already be required to provide SSDF attestations regarding AI products or services to the extent they are incorporated in or associated with “software” subject to the attestation requirements. To the extent the NIST generative AI companion guide results in additional or different SSDF requirements for generative AI, such requirements may be incorporated into the SSDF attestation forms required from software producers.