
Digital Considerations for Acquisition Documents

The DAF Digital Transformation Office (DTO) recognizes that as programs transform to incorporate digital acquisition, these core documents may still be required (i.e., they will not be represented as the source of truth within a model, or they will be the source of truth in a model yet still be needed by Senior Acquisition Leaders in document form) and will need to include digital acquisition planning information.

The DTO has reviewed the core acquisition documents: the Acquisition Strategy Plan (ASP), Systems Engineering Plan (SEP), Test and Evaluation Master Plan (TEMP), and Life Cycle Sustainment Plan (LCSP), and provides the information below to help.  The ASP template is available for download as a PowerPoint file from the directory to the right.  Word documents of the SEP, TEMP, and LCSP information below are also available for download.  As the DTO progresses, these data products are expected to be updated, and comments from programs that will make them better products for the overall acquisition community are welcome.

Acquisition Strategy Plan (ASP):

Task 1:  Identify areas within ASP template that need to address the DTO Objectives.

  • SAF/AQX maintains the Acquisition Strategy briefing template for programs where the Service Acquisition Executive (SAE) is the Decision Authority (e.g. large MDAP ACAT I size programs).
  • Each Program Executive Officer directorate or program may tailor their ASP Templates as required.
  • DTO recommended updates were officially implemented as of February 2021.  Changes were primarily made in the notes pages to support communication of digital acquisition strategies and impacts.  Programs should verify directorate/division ASP template configuration with their Program Execution Group.
  • Particular slides with significant content changes include:

    • Business

      • Slide 17 “Business Strategy”
      • Slide 19 “Competition Strategy”
      • Slide 21 “Contract Incentives”
      • Slide 22 “Intellectual Property (IP) Strategy”
      • Slide 35 “Program Office”
    • Technical
      • Slide 24-25 “Systems Engineering (SE)”
      • Slide 26 “Digital Engineering/MBSE Strategy”
      • Slide 29 “Product Support Strategy/Digital Engineering”
      • Slide 30 “Test and Evaluation (T&E)”
      • Slide 38 “Agile Software Development”
  • These changes align with the SAF/AQ Digital Building Code memo dated 2022

Systems Engineering Plan (SEP):

This section of the page was updated on October 25, 2021.  It previously listed several additions to the OSD SEP Outline version 3.0; many of those additions were incorporated into version 4.0 of the OSD SEP Outline.  The OSD guidance, to include the SEP Outline, can be found on the Engineering Reference for Program Offices.

Test and Evaluation Master Plan (TEMP)


Task 1: Identify areas within TEMP template that need to address transformation objectives.

  • Content is more important than format.  Do not repeat content in the various sections, and consolidate as practicable.  DOT&E TEMP Guidebook 3.1, dated 19 Jan 2017, was used as the reference.  The TEMP should be specific to the program and tailored to meet program needs.
  • Note: DOT&E pilot program format (see 8 Apr 2019 memo) has a reduced number of sections (3).  Therefore content may need to go in another section, or be incorporated via reference, for TEMPs using the abbreviated format.

In the order appearing in the template:

  • 1.3.1 Program Background
    • Include reference to Acquisition Strategy sections that address Digital Acquisition / Digital Engineering Objectives.  The TEMP supports the Acquisition Strategy with a Test & Evaluation Strategy.
    • Describe whether the program has chosen a model-supported, model-integrated, or model-centric strategy.  A program’s test strategy needs to reflect the acquisition strategy digital objectives and how they will be utilized/supported.
      • Is a Government Reference Model (GRM) and/or Acquisition Reference Model (ARM) being used on the program?  If so, testers need to understand the former, and help build the latter.
  • 1.3.5 System Engineering Requirements
    • Include reference to Systems Engineering Plan (SEP) sections addressing Digital Engineering Objectives.  Expected new sections include:
      • Modeling strategy
      • Key modeling activities
      • Technical reviews – how models are used for continuous/dynamic review vs. static data (e.g., PowerPoint charts)
    • TEMP supports the SEP with a strategy to meet applicable systems engineering needs along and across the “V” – particularly system level verification.
    • Reference AF Digital Guide “Key Digital Engineering Features” product(s), and other applicable briefings.  Explain how T&E objectives are leveraging (or driving) key digital features chosen for the program. 
    • Examples:
      • Will Technical Performance Measure (TPM) status be available to all stakeholders via a dynamic digital environment rather than monthly contractual updates emailed to one particular group in the program office?  Describe here (or in another section where the program’s Integrated Digital Environment (IDE) is explained) how this will be implemented, to include Integrated Test Team (ITT) access.
      • If program is model-centric, describe early tester participation in building model frameworks/views (e.g., the ARM) and populating with T&E content.  This will be necessary whenever models will be used to “output” system level test plans, especially with resource-loaded content and schedule linkages. 
      • If program is model-supported or model-collaborative, then testers still need to collaborate with the prime contractor and program office via some defined level of access to models in order to begin detailed test planning.  Examples include aircraft envelope expansion predictions and test point matrices.
  • 2.2 Common T&E Database Requirements
    • This is required by AFI 99-103, paragraph 5.18.  Implementation is left to the program.
    • The requirement for a description of provisions/methods for “…accessing, collecting, validating, and sharing data as it becomes available from contractor testing, Government DT, OT…” is already in the template.  However, more detail specific to implementation of the intended digital environment is needed.  The typical description seen in most TEMPs is very generic and needs more critical thought. 
    • The goal is an “Integrated Digital Environment” (the guide provides an illustration of one possible implementation only) that includes testers (DT & OT), the contractor, program office engineering, other functional specialties, independent certification authorities, and others designated by the ITT as needing access.
      • The “Common T&E Database” could certainly be a subset of the program’s IDE.
      • Describe “how” test data will be shared.  Use references to other program documents when pertinent.  The contract needs to have more details than the TEMP as the TEMP is not a contractual document.
      • Is the “Common T&E Database” cloud-based?  If not, who hosts it and controls access?  Is there one data repository, or several (possibly driven by security levels)?  What procedures and applications will be used?  Is the “data dictionary” specified as a contract deliverable?  Is raw or reduced data needed, and/or products such as quick-look reports?
        • “Raw data” volumes will be very large; account for this when planning the infrastructure and transfer methods
      • Does the test community have access to requirements database? (E.g., DOORS)
      • Does the test community have access to contractor problem report database?
      • Can the above information needs be satisfied by one overarching IDE?  Or are other, more stove-piped implementations necessary?  Short of access, static “snapshots” of database content would have to be used.
      • There are limitations when interfacing with Government networks; e.g., transferring large amounts of data (100s of GB to TB) via standard AFNET does not work very well (a rough transfer-time sketch appears after this TEMP outline).
        • Example: Transfer of sled test video from contractor network via logging into server from AFNET did not work at all when attempted by one recent program.  The local base network “choked” on very large data files and timed out.
        • In the 21st century we should not be hand carrying hard drives of data between test & data analysis locations on a regular basis.  Typically contractors do not allow Government hard drives to be connected to their networks, and the test team will have to get special permission to connect contractor-owned drives to a Government network.
        • Consider dedicated point-to-point AFNET lines.  The 412th Test Wing has successfully implemented this between Edwards AFB and contractor facilities; the cost is low and the benefit to the program is high.
      • Classified vs. unclassified.  Is a new DREN terminal or TACLANE needed?  Does the contract DD Form 254 have the right boxes checked?  (E.g., does the contractor need a COMSEC account to store keys for encryption devices?  Many devices can now be keyed remotely.)
      • Consider a “sandbox” approach to implement an IDE.  Pilot tools deployed on Cloud-1 would likely be program-funded.  Check if a PEO-funded “Integrated Digital Sandbox” has been deployed & is available to the program.  GBSD & ABMS used this approach.
      • A minimal IDE would be used for exchanging documents/artifacts (e.g., SharePoint).  While useful, this is not realizing the full potential of “going digital.”  Documents are static artifacts unless stakeholders have the ability to asynchronously collaborate on maturing them.
      • What user functions & features do testers need?  Is user access from primary work PC needed?  Does access location matter (i.e. classified vs. unclassified facility, unique PC configuration)?
      • What level of collaboration with the contractor is planned for test planning & execution?  “Insight” vs. “Oversight” for contractor DT&E?
      • Will cybersecurity and airworthiness processes demand large amounts of data to be reviewed by the Government, and how will the ITT facilitate this flow of data? (ITT has a vested interest but may not be the primary player in facilitating flow of data – depends on the scope of the test program in supporting certifications.)
      • Will CDRL deliverables (e.g., test plans) be digitally available for dynamic and collaborative review prior to formal delivery? [As opposed to the way it is generally done today – informal draft document via email, if at all, and formal delivery with contractual overhead activity and deadlines.]
      • Are the program requirements for an IDE described in the contract?  Briefly describe these in TEMP, or provide direct references to statement of work etc.
  • 3.1 T&E Strategy
    • Concise summary description of how test program is implementing or leveraging Digital Engineering – should reference applicable sections of supported program documents
      • Digital Engineering (DE) Implementation – is there a separate plan? Or, summarized in the acquisition strategy and SEP?
      • How much collaboration with the prime contractor is planned? 
      • Use of Model-Based Engineering / Model-Based Systems Engineering (MBE/MBSE) for test planning, test readiness reviews, & test execution
      • Examples: Analysis via models to build a test matrix in a detailed test plan.  Model views utilized for test readiness reviews.  Comparing predictions from models to actual test results, and the process by which models are updated/improved during development of the system.
      • Use of Integrated Digital Environment (IDE)
        • How is data (of all forms needed for test planning and independently evaluating test results) efficiently accessed by stakeholders?  This will likely include model data in some form.
    • How does the test strategy support acquisition strategy and system engineering needs (including verification at the various levels – system of systems, system, subsystem, etc.)?
      • Tools: Verification Cross Reference Matrix (VCRM) and Integrated Test Event Matrix (ITEM) – the former is part of the system specification, and the latter is derived from it.  These should be digitally available to the ITT via an IDE.
    • Test strategy must be synchronized with contract language – confirm here
      • What does the Government test team need from the contractor in terms of digital access and/or delivery?
        • Access to dynamic vs. static warehoused data
        • Access to models – views, ability to provide input and observe output, etc.  Which models does the ITT really need?
        • Access to simulations, or ability to run independently
        • Technical Data Packages (e.g., modifications to test aircraft)
        • Cybersecurity testing – facilities, test items, other support
        • Laboratory testing
    • Digital (IT) infrastructure – services & transport
      • What type of connectivity is needed between contractor, program office, and test organizations?  Cloud-based?  Direct?  Indirect (through intermediary)?
      • DREN? AFNET? Contractor-managed network?
      • Capacity needed? (Consider file sizes; documents vs. flight test raw data)
    • Software tools (identify any that require approval for use on Gov’t networks; long lead times are typical and need to be addressed early)
      • E.g. CAD tools, Cameo, PLM TeamCenter or Windchill, AFSIM
  • 3.2.3 Modeling & Simulation (DT&E utilization) – the existing language in the template already includes:
    • “Describe the key models and simulations and their intended use. Include the developmental test objectives to be addressed using M&S to include any approved operational test objectives.”
    • “Identify who will perform M&S verification, validation, and accreditation.”
    • “Identify data needed and the planned accreditation effort.”
    • “Identify how the developmental test scenarios will be supplemented with M&S, including how M&S will be used to predict the Sustainment KPP and other sustainment considerations.”
    • “Identify developmental M&S resource requirements in Part IV.”

Augmenting questions for critical thought:

    • What are the modeling objectives for the program’s effort?  In other words, what does the Gov’t need to be able to do with the model(s)?
      • These are NOT modeling requirements
    • What are the modeling capabilities needed for DT&E, and when?
      • Predict, Test, Validate – how will models be used, and for what ends?
      • Descriptive models vs. Analytical models, and integrating analytical with descriptive
      • Expect model(s) to evolve if system/product is in development
      • Examples: analytical models to define performance characteristics, e.g., flight sciences (aerodynamics, propulsion, flying qualities), airframe structures (loads, flutter), and mission systems (electronic warfare, radar, weapons usage).
      • Examples: descriptive models of a test plan review process, or stakeholder relationships within a program including all test organizations and certifying authorities.
      • Training for testers on use of models?  Formal vs. informal?
    • What is needed from specific models for independent evaluation?  (Definition of modeling elements and data, if not addressed in another plan like the SEP.  If not in the TEMP, needs to be documented by ITT such that PMO can implement in a contract.)
      • Output of models alone (artifact)?
      • Electronic access to contractor models with ability to input parameters and observe/collect output?  Is ability to access & utilize model views needed?
      • Delivery of models to Government (including test community)?  Format – Is source code needed, just executable code, and/or summary design information (e.g. control law block diagrams)?  One delivery vs. multiple?  Timing?
      • Delivery of drawings for special instrumentation “temporary” modifications: what, how, and when; 2D vs. 3D; and what tools are needed to view/modify them.  See the latest standards for sustainment TDPs – these should use the same format (e.g., MIL-STD-31000B).
    • Who is responsible for managing models?  [There will be overlap with PMO engineering requirements here, which should be documented in the acquisition strategy and SEP.  Focus on access to and usage of models for system level verification that the independent testers need to write reports, as agreed to with the PMO.  Whether the Government or Contractor actually manages the models may not be critical to testers, as long as AFTC has access.]
  • 3.3 Developmental Test Approach
    • 3.3.1 Mission oriented approach.  This section describes how to test the system in a mission context.  Will any previously addressed models, simulations, and/or analysis be used?  Likely necessary for non-testable scenarios, including those too expensive or resource intensive to execute in practice.
    • 3.3.2 Developmental Test Events (Description, Scope, and Scenario) and Objectives [Is anything needed in this section that has not already been addressed above?]
  • 3.5.3 Modeling & Simulation (OT&E utilization)
    • This is OTA content (AFOTEC or MAJCOM test organization).  If OTA needs access to contractor-produced & maintained models, this generally needs to be written into the contract.
  • 4.2 Test Resource Summary – 10 subsections including:
    • 4.2.3 Test Instrumentation
      • If the contractor is performing instrumentation design, will the Government be able to review the design digitally?  How?
      • How will final design / drawings be delivered?  3D models, or 2D representation of 3D data?  (Latter case is less desired – old way of doing things – Government should be using 3D tools rather than paying contractor to reduce 3D models to 2D artifacts.)
        • Does AFTC require new software tools to utilize 3D data?
      • Is a Technical Data Package (TDP) required?  Generally yes, if a Government-owned test asset is being permanently modified and will be managed/sustained by the Government in the long term.
        • Review T-2 modification documentation & requirements
        • Review TDP requirements & what is really needed [need a good link to references]
    • 4.2.8 Models, Simulations, and Test Beds
      • See existing language in the DOT&E Guide – what needs to be augmented as a result of program’s chosen digital acquisition strategy?
      • Are there any resource needs specific to Government use of planned models, simulations, and Test-Beds to execute current test program, possibly including provisioning for future capability updates?
      • Which digital and/or modeling capabilities & training does the Government need to have in place before award?  Must be ready for execution so the Government isn’t behind on day one!
    • 4.2.10 Special Requirements
      • Anything that doesn’t quite fit above or was not previously included, such as:
      • Licensing and/or AF approval for data processing tools/means/methods, use of unique Government or contractor databases, or IT infrastructure unique to the program or enterprise
      • Examples: Use of Distributed Test Operations, AFNET data “pipe” installation at contractor facilities, remote access to contractor models / databases / data storage, new or modified DREN point of service.
      • Unique contractual arrangements such as “Design Agent” construct.
      • Reference to a DE/MBE/MBSE Implementation Plan the program has produced, and what ITT stakeholders need to support it. 
      • Specialized DE/MBSE Training for DT&E personnel.
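
As a rough illustration of the data-transfer limitations noted under section 2.2 above, the sketch below estimates how long a given data volume takes to move over a given link.  This is a minimal planning aid only; the link speeds, efficiency factor, and 500 GB data volume are illustrative assumptions, not measured AFNET or DREN figures.

```python
# Illustrative transfer-time estimate for test-data infrastructure planning.
# All link speeds and data volumes are assumed values for this sketch only.

def transfer_hours(data_gb: float, link_mbps: float, efficiency: float = 0.6) -> float:
    """Estimate hours to move data_gb gigabytes over a link_mbps link.

    The efficiency factor roughly accounts for protocol overhead,
    contention, and retries on a shared network.
    """
    data_megabits = data_gb * 8_000            # 1 GB is roughly 8,000 megabits
    effective_mbps = link_mbps * efficiency    # usable fraction of the link
    return data_megabits / effective_mbps / 3600

if __name__ == "__main__":
    test_data_gb = 500  # assumed single-test data volume (within the "100s of GB" range noted above)
    for label, mbps in [("shared 100 Mbps base connection", 100),
                        ("dedicated 1 Gbps point-to-point line", 1000)]:
        hours = transfer_hours(test_data_gb, mbps)
        print(f"{label}: ~{hours:.1f} hours for {test_data_gb} GB")
```

Even this coarse estimate (roughly 18.5 hours versus under 2 hours for the assumed values) shows why the Common T&E Database discussion should settle transfer paths and capacity before test execution begins.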

Lifecycle Sustainment Plan (LCSP)


Task 1: Identify areas within LCSP template that need to address transformation objectives.

Digital Considerations within the Lifecycle Sustainment Plan (LCSP)

Introduction

The product support strategy should align with the model-based acquisition strategy – as defined below:

  • Modeling objectives are used to select one of the model-based acquisition types:
    • Model-supported acquisition: models are used to support various engineering activities, including the production of key documents for contractual purposes
    • Model-collaborative acquisition:  Models form part of the contractual artifacts but as secondary or complementary artifacts
    • Model-centric acquisition:  Models are primary artifacts (with the capability to generate required documentation)

__________________________________________________________________________

1. Product Support Strategy (Section 3)

  • This section directly relates to the Acquisition Strategy Sections 5.5 and 6.2, and therefore should address the model-based acquisition strategy (e.g., model-supported, model-collaborative, model-centric)
  • Summarize the product support strategy for meeting sustainment requirements necessary to satisfy the model-based acquisition strategy requirements (Technical Data Strategy/Intellectual Property Strategy)
  • Use of digital twin for product support decisions

2. Cybersecurity (Section 3.1.4)

  • Address considerations for cybersecurity within the Program Protection Plan (PPP), which is an annex to the LCSP
  • Address how supportability and/or sustainment efforts support compliance with the PPP and the AFLCMC System Security Engineering Standard Processes within the model-based environment

3. Influencing Design and Sustainment (Section 5)

  • Identify model-based requirements that affect system’s design and performance
  • Identify impacts of the model-based requirements to a system’s product support strategy, planning, and implementation

Beyond this section, the LCSP addresses each of the 12 product support elements.

4. Design Interface (Section 9.1)

  • Should articulate model-based requirements as a design consideration, as outlined in the SEP

5. Sustaining Engineering (Section 9.3)

  • Document the Failure Reporting, Analysis, and Corrective Action System (FRACAS) to include how the FRACAS data will be used from initial modeling and analysis through the fielding of the system.

6. Maintenance Planning and Management (Section 9.4.3)

  • Outline the maintenance concepts for hardware and software, to include considerations for maintenance (depot activation requirements) in a model-based environment
  • Considerations include manpower skills, support equipment needs, how the program will determine repair time, testability requirements, etc. within a model-based environment
  • Identify the use of preventive maintenance strategies that rely on the model-based environment, such as Condition Based Maintenance Plus (CBM+)
  • Identify how the program will acquire and manage the data needed to populate the supply and maintenance systems that support maintenance concepts (e.g., CBM+) in a model-based environment; a notional sketch follows this list
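
The CBM+ bullets above can be made concrete with a minimal sketch: compare a monitored parameter against a limit predicted by the program's degradation model and flag a preventive maintenance action.  All component names, field names, and threshold values below are hypothetical placeholders, not program data.

```python
# Notional CBM+ check: flag preventive maintenance when a monitored
# parameter approaches a model-predicted limit. Names and values are
# hypothetical placeholders only.
from dataclasses import dataclass

@dataclass
class ComponentHealth:
    component_id: str
    measured_vibration_g: float      # latest sensor reading from the fielded system
    model_limit_g: float             # limit predicted by the degradation model
    flight_hours_since_overhaul: float

def maintenance_due(health: ComponentHealth, margin: float = 0.9) -> bool:
    """Return True when the reading reaches `margin` of the modeled limit."""
    return health.measured_vibration_g >= margin * health.model_limit_g

pump = ComponentHealth("hyd_pump_A1", measured_vibration_g=4.6,
                       model_limit_g=5.0, flight_hours_since_overhaul=312.0)
if maintenance_due(pump):
    print(f"Schedule preventive maintenance for {pump.component_id}")
```

In practice the limit would come from the program's authoritative models and the readings from the maintenance data systems identified above; the sketch only shows the shape of the comparison.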

7. Supply Support (9.4.4)

  • Should include considerations for provisioning and cataloguing processes (provisioning technical documentation delivery/storage) in a model-based environment

8. Packaging, Handling, Storage, and Transportation (PHS&T) (9.4.5)

  • Identify the program strategy for safely packaging, handling, storing, and transporting the system, as well as any special requirements and interfaces with agencies or DoD components responsible for transporting the system within the model-based environment.  Include product support requirements for tracking shipped components.

9. Technical Data (9.4.6)

  • Define the technical data strategy for support of a model-based environment
  • Define the program’s approach to managing the data during acquisition and sustainment (e.g., access, method of delivery, format, and storage) within a model-based environment; a notional deliverable-record sketch follows this list
  • Technical data rights strategy in support of the model-based requirements
  • If operating in an integrated data environment, address network compatibility issues and mitigation steps for operating in the model-based environment
  • Document the logistics data enterprise architecture generated, identifying electronic data repositories, information exchange requirements, and/or usage
  • Identification of preliminary engineering/product support data needed
  • Process by which technical orders (TOs) will transfer from acquisition to sustainment
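
One lightweight way to manage "access, method of delivery, format, and storage" for each technical data item, as called for above, is to track every deliverable as a structured record.  The sketch below is illustrative only; the field names, CDRL number, and repository name are assumptions, not a prescribed schema.

```python
# Notional record for tracking a technical data deliverable in a
# model-based environment. Field names and values are illustrative only.
from dataclasses import dataclass

@dataclass
class TechDataDeliverable:
    cdrl_number: str        # hypothetical CDRL identifier
    title: str
    data_format: str        # e.g., native 3D model vs. 2D PDF derivative
    rights_marking: str     # e.g., Unlimited, Government Purpose, Limited
    delivery_method: str    # e.g., IDE repository access vs. physical media
    repository: str         # where the authoritative copy is stored

deliverable = TechDataDeliverable(
    cdrl_number="A012",
    title="Support equipment installation drawings",
    data_format="Native 3D CAD per MIL-STD-31000 TDP",
    rights_marking="Government Purpose Rights",
    delivery_method="Contractor IDE with Government read access",
    repository="Program PLM environment",
)
print(f"{deliverable.cdrl_number}: {deliverable.title} [{deliverable.rights_marking}]")
```

A record like this can also anchor the TO transfer process noted above, since it captures where each item lives and under what rights as the program moves from acquisition to sustainment.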

10. Support Equipment (9.4.7)

  • Requirements for the overall support strategy for support equipment (SE), to include identification of the following: support equipment documents, supply support, interim spares, manpower, training, technical data, maintenance level and maintenance task requirements, computer resource support, calibration, facility requirements, support equipment for the SE itself, hand tools, and depot-level support equipment
  • Identify the support equipment strategy aligned with the model-based requirements (maintenance concept, acquisition strategy); e.g., will support equipment drawings be delivered as PDFs or as 3D models?
  • Considerations for newly designed support equipment technical data that can be used in a model-based environment
  • Delivery of support equipment technical packages in a model-based environment
  • Considerations for developmental versus non-developmental support equipment

11. Training and Training Support (9.4.8)

  • Address potential security issues for working in an integrated environment (e.g., integration of training systems with vendors)
  • Trainers and simulators – digital twin technology

12. Manpower (9.4.9)

  • Manpower requirements aligned with the maintenance and support equipment strategy
  • Considerations for manpower requirements for operating in a model-based environment

13. Facilities and Infrastructure (9.4.10)

  • Address use of digital facilities drawings
  • Civil engineering considerations

14. Computer Resources (9.4.11)

  • Identify, plan, resource, and acquire facilities, hardware, software, documentation, manpower and personnel necessary for planning and management of mission critical computer hardware and software systems.  Programs should coordinate and implement agreements necessary to manage technical interfaces, manage work performed by maintenance activities, and establish/update plans for periodic test and certification activities required throughout the life-cycle.
  • Program’s support plan for software/system in an integrated data environment
  • Identification of all systems/software used for operating in the model-based environment
  • Licensing agreements for operating within the model-based tools
  • Configuration management approach to include obsolescence, deficiency, modification, hardware/software baseline, and requirements management within the model-based environment
  • Software baseline delivery methods
  • Owning and managing the technical baseline
  • Strategy for managing the technical baseline into sustainment (e.g., access to data); a notional baseline-verification sketch follows this list
  • Considerations for open system architecture (document the OSA strategy)
  • Hosting and infrastructure strategy
  • Cybersecurity compliance
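
As a minimal illustration of the software baseline delivery and technical baseline management bullets above, the sketch below records the digest of a formally delivered baseline and verifies a later copy against it before acceptance.  The baseline identifier and digest are hypothetical placeholders.

```python
# Notional configuration-management check: confirm a delivered software
# baseline matches the digest recorded at formal delivery. Identifiers
# and digests are hypothetical placeholders.
import hashlib
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Compute the SHA-256 digest of a delivered file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Digest captured when the baseline was formally delivered and accepted.
recorded_baselines = {
    "OFP_v2.3.1": "<digest recorded at formal delivery>",
}

def verify_delivery(baseline_id: str, delivered: Path) -> bool:
    """Return True if the delivered file matches the recorded baseline digest."""
    expected = recorded_baselines.get(baseline_id)
    return expected is not None and file_sha256(delivered) == expected
```

The same bookkeeping extends naturally to the obsolescence, deficiency, and modification tracking listed above, since each change produces a new, verifiable baseline entry.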