“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”
– National Institute of Standards and Technology (NIST)
The following timeline plots some significant events, policies, methodologies, and organizations relevant to the Department of Defense’s (DoD) efforts to adopt cloud computing and commercial software development and acquisition best practices from the private sector.
Although cloud computing has a complex history that can be told from many perspectives and traced to many origins, Salesforce.com, launched in 1999, is often credited with introducing some of the first successful cloud services to see broad (and today almost ubiquitous) adoption by commercial enterprises. At the time, Salesforce offered Customer Relationship Management (CRM) software through the cloud while competing CRM applications were still desktop-based. This model is known as Software-as-a-Service (SaaS), a term that has come to encompass both a deployment model and a business model, and one that has transformed the way software is developed, deployed, and maintained.
The SaaS approach had implications for how software could be developed, deployed, and maintained. Beginning in the 1990s, developers sought ways to evolve beyond legacy software development workflows. Among the results, Agile development would transform the software development process into an iterative one in which design, development, and user feedback form a cycle that supports a continuous integration and continuous delivery (CI/CD) pipeline. The idea truly took shape in 2000 and 2001, when 17 programmers met to discuss the state of efficient software development and the need to escape documentation-laden development workflows. These programmers were: Kent Beck, Mike Beedle, Arie van Bennekum, Alistair Cockburn, Ward Cunningham, Martin Fowler, James Grenning, Jim Highsmith, Andrew Hunt, Ron Jeffries, Jon Kern, Brian Marick, Robert C. Martin, Steve Mellor, Ken Schwaber, Jeff Sutherland, and Dave Thomas. According to “History: The Agile Manifesto,” “representatives from Extreme Programming, SCRUM, DSDM, Adaptive Software Development, Crystal, Feature-Driven Development, Pragmatic Programming, and others sympathetic to the need for an alternative to documentation driven, heavyweight software development processes convened.” From their two meetings in 2000 and 2001 they drafted the “Agile Manifesto.” This combination of new modes of delivery with more responsive and iterative software development saw growing adoption by commercial enterprises.
As the private sector evolved how it built and deployed software at a faster pace than the public sector, the Federal Information Security Management Act (FISMA) was enacted in 2002 as Title III of the E-Government Act. FISMA assigned responsibility to the National Institute of Standards and Technology (NIST) and the Office of Management and Budget (OMB) to provide guidance and compliance standards for security across all government agencies. FISMA 2002 would later be amended by the Federal Information Security Modernization Act of 2014. The legislation took the first steps toward establishing a centralized, enterprise-level cybersecurity standard for the Federal Government. It called for all Federal agencies and contractors to develop and implement security plans as well as monitor and report findings. FISMA requirements include certifications and accreditations, categorizing data and systems by risk level, and keeping an inventory of information systems.
Amazon launched Amazon Web Services (AWS) internally in 2002 to better manage its growing e-commerce operations. The intent was to help companies build their own e-commerce sites on top of Amazon’s e-commerce engine, but the effort proved challenging. In 2006, Amazon took AWS public, offering it as an Infrastructure-as-a-Service (IaaS) product. Today, AWS is the largest cloud service provider in the world for both the private and public sectors.
Zimki was the first Platform-as-a-Service (PaaS). It was launched in 2005 by Fotango, a Canon Europe subsidiary, targeted toward developers, and delivered using a utility computing model. Organizations purchase utility computing services to outsource IT infrastructure and management. This delivery model also lets users pay for services based on actual demand rather than flat rates, offering a more scalable solution for organizations’ fluctuating needs. With Zimki, users did not have to worry about the infrastructure underlying their applications. Zimki used the term “realms” to describe what we now understand as environments. “Users could easily pull code from these realms into their local machines to develop, and seamlessly clone their code across development, staging, and production realms,” according to Porter. Zimki would meet its end in 2007, but not before leaving a mark on computing history.
In 2007, a change in the National Security Agency’s (NSA) underlying security approaches led to the publication of the DoD Information Assurance Certification and Accreditation Process (DIACAP). DIACAP was enacted to support FISMA, standardize information system requirements, and create a process for authorizing these systems to operate within the DoD IT environment. DIACAP replaced the DoD IT Security Certification & Accreditation Process (DITSCAP), published in 1997. DITSCAP was very similar to DIACAP except that it focused on accrediting single IT entities, whereas DIACAP approached accreditation from a DoD-wide enterprise level. DIACAP relied on information assurance (IA) controls, which are safeguards and countermeasures designed to protect the confidentiality, integrity, and availability of information within a system or organization. DIACAP shifted the DoD toward an enterprise-level view of accreditation, implemented a standardized control set based on FISMA, opened a web-based support portal, and emphasized the need for regular security posture reviews. DIACAP’s shortcomings amounted to rigidity and a lack of translatability across the rest of the Federal Government. The copious documentation required, and the work it necessitated, made DIACAP a long process averaging six months. The process had to be repeated every three years, or whenever a significant update was made to the information system.
DevSecOps is a discipline that is in many ways a natural progression of the principles set by Agile. DevSecOps combines development, security, and operations in software development into one process instead of siloing these teams and responsibilities. DevSecOps adds security testing and coordination to all phases of the DevOps approach to the software lifecycle. This starts at the very beginning of the build process and is incorporated throughout – rather than, for example, saving vulnerability tests for the final software review stages. DevOps and DevSecOps are often described as a culture rather than a simple method or strategy.
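To make the “security in every phase” idea concrete, the sketch below shows a build gate in which security checks run beside unit tests on every commit rather than at the end of the lifecycle. It is a minimal illustration, assuming a Python project with pytest, bandit, and pip-audit installed; the stage list and commands are illustrative and not drawn from any specific DoD pipeline.

```python
"""Minimal DevSecOps-style pipeline gate: security checks run alongside
tests on every commit instead of at a final review stage.
Assumes a Python project with pytest, bandit, and pip-audit installed;
stage names and commands are illustrative only."""
import subprocess
import sys

# Each stage is a (name, command) pair; security stages sit beside the tests.
STAGES = [
    ("unit tests", ["pytest", "-q"]),
    ("static analysis (SAST)", ["bandit", "-r", "src"]),
    ("dependency audit", ["pip-audit"]),
]

def run_pipeline() -> int:
    for name, cmd in STAGES:
        print(f"--- {name} ---")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            # Fail fast: a security finding blocks the build just like a failing test.
            print(f"Stage failed: {name}")
            return result.returncode
    print("All stages passed; artifact is eligible for delivery.")
    return 0

if __name__ == "__main__":
    sys.exit(run_pipeline())
```

Because every stage runs on every change, vulnerabilities surface while the code is still cheap to fix, which is the core argument for shifting security left.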
In 2010, NIST published the Risk Management Framework (RMF), which the DoD would adopt to replace DIACAP. The RMF changed much of the language around IA. It allows multiple authorizing officials (AOs) to share responsibility for authorizing systems and the associated risks, whereas DIACAP designated a single AO. The switch to multiple AOs gave each AO more bandwidth and gave those seeking authorization greater access to an AO at any given time. The RMF also integrated risk management activities into the development lifecycle and focused on continuous risk monitoring during operation. While the RMF is a strong mandate for security, the methods used to meet the framework’s requirements are no longer sufficient. The continuous updating and deployment of today’s commercial software keeps it relevant and useful, but that cadence of continuous delivery cannot be maintained under RMF. The processes involved in complying with RMF were, and still are, manual: source code is reviewed by human eyes and exercised in a sandbox. Without automation, the RMF struggles to contend with commercial technology sector practices.
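As a sketch of the kind of automation the paragraph above calls for, the example below captures the output of a static scan as a timestamped, machine-readable evidence record on every build, instead of relying on a periodic manual review. It assumes bandit is installed; the directory names and record format are notional and are not part of NIST RMF itself.

```python
"""Illustrative sketch of automated assessment evidence: each build captures a
timestamped, machine-readable scan record rather than waiting for a manual
review every three years. Assumes bandit is installed; paths are notional."""
import datetime
import json
import pathlib
import subprocess

def collect_evidence(source_dir: str = "src", out_dir: str = "evidence") -> pathlib.Path:
    # Run a static scan and keep its JSON output as an auditable artifact.
    # bandit returns a non-zero code when findings exist, so we do not use check=True.
    scan = subprocess.run(
        ["bandit", "-r", source_dir, "-f", "json"],
        capture_output=True, text=True,
    )
    record = {
        "collected_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "source_dir": source_dir,
        "findings": json.loads(scan.stdout or "{}"),
    }
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    path = out / f"scan-{record['collected_at'].replace(':', '-')}.json"
    path.write_text(json.dumps(record, indent=2))
    return path

if __name__ == "__main__":
    print(f"Evidence written to {collect_evidence()}")
```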
In 2010, the first Federal CIO, Vivek Kundra, announced the “Cloud First” policy, shifting government focus toward transitioning IT infrastructure to the cloud. In 2011, the Cloud First policy was adopted to accelerate government cloud adoption. This was a turn in the right direction, but the steep learning curve and lack of clear guidance on cloud technology limited agencies’ ability to act. These obstacles were compounded by the decentralized nature of on-premises IT infrastructure across local, state, and federal entities.
After the Cloud First policy set government sights on cloud adoption, the Federal Risk and Authorization Management Program (FedRAMP), established in 2011, set standards specifically for agencies using cloud technologies. FedRAMP is overseen by two entities: the Joint Authorization Board (JAB) and the Program Management Office (PMO). FedRAMP applies to government-owned private clouds and commercial cloud service providers (CSPs). FedRAMP provides a general authorization to operate (ATO) route valid across all agencies, as opposed to FISMA’s agency-by-agency approach. FedRAMP is limited in that it is only suitable for software that handles unclassified data. Higher-sensitivity information, and the agencies that deal with it, still maintain their own cybersecurity protocols. The DoD follows its own more rigorous version of FedRAMP, called FedRAMP Plus, that is not overseen by the JAB. FedRAMP has authorized more than 260 cloud solutions since its inception, but it continues to struggle to scale to meet the demand of the enormous and ever-expanding ecosystem of software companies seeking a pathway to the public sector software marketplace.
In 2013, AWS was awarded the Commercial Cloud Services (C2S) contract, brokered by the Central Intelligence Agency, to provide cloud services to all 17 agencies of the Intelligence Community (IC) for 10 years, valued at some $600 million. The first intelligence agency to host a major operational capability in the C2S environment was the National Geospatial-Intelligence Agency, with the Map of the World application. The success of C2S opened the door for AWS to expand the security clearance levels of its cloud service offerings over the years, including the AWS Secret Region (also known as the Secret Commercial Cloud Services contract, or SC2S), launched in 2017. It is an offshoot of the AWS Top Secret Region, or C2S, launched three years prior and exclusive to the Intelligence Community. According to AWS, this infrastructure made them “the first and only commercial cloud provider to offer regions to serve government workloads across the full range of data classifications, including Unclassified, Sensitive, Secret, and Top Secret.” In addition, in 2020 the CIA awarded the Commercial Cloud Enterprise (C2E) contract to five companies: AWS, IBM, Oracle, Microsoft, and Google. The IC provisioned $10 billion over a ten-year period for the C2E contract to assemble a multi-provider enterprise cloud.
18F was started in 2014 by a group of Presidential Innovation Fellows in an effort to improve and modernize government technology. It is an office of federal employees within the General Services Administration (GSA) that uses both static and dynamic code analysis to ensure the code being pushed is secure, eliminating the need for Excel compliance spreadsheets. 18F would not be the only organization to come at the software acquisition problem from a new angle.
The Defense Innovation Unit (DIU) is a DoD organization started in 2015 to help the DoD acquire commercial technology faster. Originally named DIUx (the “x” for experimental), DIU also assists in fielding and scaling these technologies for the DoD. According to diu.mil, “DIU is the only DoD organization focused exclusively on fielding and scaling commercial technology across the U.S. military at commercial speeds.” DIU partners with organizations across the DoD and the National Security Community to connect them with leading commercial-sector tech businesses in areas such as artificial intelligence (AI), energy, and space.
Cloud One, initiated in 2017 by the Air Force, is a hosting service and platform that offers secure government cloud environments from providers such as AWS, Microsoft, and Google to the DoD. In addition to hosting, users can efficiently develop cloud-native apps using a DevSecOps development framework. Cloud One’s technical teams also assist with migrating customer systems to the cloud. Cloud One started as an Air Force solution but has scaled to serve the entire DoD.
AFWERX is a Technology Directorate of the Air Force Research Laboratory (AFRL) and the innovation arm of the Department of the Air Force. It was officially announced on July 21, 2017. AFWERX’s goal is to advance technology acquisition for the Air Force by making capability transitions agile and affordable. AFWERX and AFVentures were the first DoD innovation initiatives to leverage the Small Business Innovation Research (SBIR) budget to engage with commercial industry.
Kessel Run is the operational name of the Air Force Life Cycle Management Center’s (AFLCMC) Detachment 12. Kessel Run was started in 2017 with the goal of revolutionizing DoD software acquisition. It has made significant strides in this direction by relying on Air Force Airmen equipped with commercial-industry software development training to continuously deliver valuable software to DoD users. These Kessel Run Airmen build, test, deliver, operate, and maintain cloud-based infrastructure and warfighting software applications. Today, Kessel Run is recognized as an accomplished DoD software factory.
Cloud Smart was established to succeed Cloud First and advance the mission it set in motion. It provides practical, actionable recommendations and guidance on government cloud adoption. Cloud Smart identifies three pillars for successful cloud adoption: security, procurement, and workforce. Its action items include reducing application portfolios, assessing customer experience and user needs, and training IT staff.
Platform One was established in December 2019 by the Air Force with the mission statement to “Accelerate Secure Software Delivery for the DoD.” Platform One provides DoD Enterprise DevSecOps Initiative (DSOP) solutions for federal software acquisition entities, delivered through an infrastructure as code (IaC) model. Its offerings include hardened, Cloud Native Computing Foundation (CNCF)-compliant Kubernetes distributions, infrastructure as code playbooks, and hardened containers, all pre-approved for DoD use. Platform One was also able to lobby for policy change so that it does not have to collect source code and work through compliance checklists, instead leveraging hardened containers and automated testing. Platform One is meant to serve as the model for open architecture services across the DoD.
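As an illustration of the policy-as-code spirit behind hardened, pre-approved containers, the sketch below checks that every image referenced in a Deployment-style Kubernetes manifest comes from an approved registry. The registry name, manifest path, and use of PyYAML are assumptions for the example; this is not Platform One tooling.

```python
"""Minimal policy-as-code sketch: every container image in a Kubernetes
manifest must come from an approved, hardened registry. The allow-list and
manifest layout below are placeholders, not actual DoD endpoints."""
import sys
import yaml  # PyYAML

APPROVED_REGISTRIES = ("registry.hardened.example.mil/",)  # placeholder allow-list

def images_in_manifest(path: str):
    # For Deployment-style resources, containers live under spec.template.spec.
    with open(path) as f:
        for doc in yaml.safe_load_all(f):
            if not isinstance(doc, dict):
                continue
            spec = doc.get("spec", {}).get("template", {}).get("spec", {})
            for container in spec.get("containers", []) + spec.get("initContainers", []):
                yield container.get("image", "")

def check(path: str) -> bool:
    ok = True
    for image in images_in_manifest(path):
        if not image.startswith(APPROVED_REGISTRIES):
            print(f"DISALLOWED image source: {image}")
            ok = False
    return ok

if __name__ == "__main__":
    sys.exit(0 if check(sys.argv[1]) else 1)
```

Run against a manifest (for example, `python check_images.py deployment.yaml`), a check like this can gate a pipeline so that only images from hardened sources reach a cluster.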
While the DoD was nurturing this boom in tech intrapreneurship, it was also devising plans to court big tech. Few government cloud deals have made as much of an impact on the cloud acquisition conversation as the Pentagon’s 2019 Joint Enterprise Defense Infrastructure (JEDI) contract. The contract would have authorized a $10 billion budget over a ten-year period in an effort to unify the DoD’s data in a single cloud. The initiative became controversial and, although awarded, was protested. Due to schedule setbacks and lengthy legal battles, the Pentagon dropped the contract in 2021, but not the requirement underlying it.
After JEDI, the Pentagon launched the Joint Warfighter Cloud Capability (JWCC) contract. JWCC addresses the competition issue by including multiple cloud providers in the deal, namely Amazon, Microsoft, Google, and Oracle, and authorizes $9 billion over five years. It has faced setbacks as well, though. In March 2022, the Pentagon’s chief information officer announced that the contract would be delayed until December, eight months later than the previously planned April date for the first award.
The Army Software Factory opened on April 15, 2021, in Austin, Texas, on Austin Community College’s Rio Grande campus. The Army Software Factory is a training pipeline for Soldiers and Army Civilians to learn modern software development practices in order to prototype, develop, and deliver applications for real operations. The ultimate goal is to cultivate software development talent from within the Army to further modernization efforts.
How the DoD develops, acquires, and maintains software is an ongoing story. The mounting efforts to date reflect a growing realization across the enterprise, including among senior leadership, that more must be done, not just in the name of cost savings and efficiency but in order to remain competitive and keep pace with near-peer competitors and potential adversaries.