DOT&E TEMP Guidebook 3.1
TEMP Guidebook 3.1a (Complete PDF, ~4 MB)
Table of Contents
♦ New Features of Guidebook 3.1 (for all users)
Part I - Introduction
♦ 1.1 - PURPOSE
♦ 1.2 - MISSION DESCRIPTION
♦ 1.2.1 - Mission Overview
♦ 1.2.2 - Concept of Operations
♦ CONOPS Guidance and Examples
♦ 1.2.3 - Operational Users
♦ 1.3 - SYSTEM DESCRIPTION
♦ Cybersecurity OT&E Guidance and Example
♦ 1.3.1 - Program Background
♦ 1.3.2 - Key Interfaces
♦ 1.3.3 - Key Capabilities
♦ 1.3.4 - System Threat Assessment
♦ Threat Representation Guidance and Examples
♦ Cybersecurity OT&E Guidance and Example
♦ 1.3.5 - Systems Engineering (SE) Requirements
♦ Reliability Growth Guidance
♦ 1.3.6 - Special Test or Certification Requirements
♦ Threat Representation Guidance and Examples
♦ 1.3.7 - Previous Testing
♦ LFT&E Strategy Guidance
Part II - Test Program Management and Schedule
♦ 2.1 - T&E MANAGEMENT
♦ 2.1.1 - T&E Organizational Construct
♦ LFT&E Strategy Guidance
♦ 2.2 - COMMON T&E DATA BASE REQUIREMENTS
♦ 2.3 - DEFICIENCY REPORTING
♦ Defense Business Systems Guidance and Examples
♦ 2.4 - TEMP UPDATES
♦ 2.5 - INTEGRATED TEST PROGRAM SCHEDULE
♦ Figure 2.1 - Integrated Test Program Schedule
Part III - Test and Evaluation Strategy and Implementation
♦ 3.1 - T&E STRATEGY
♦ Integrated Testing Guidance and Best Practices
♦ 3.1.1 - Decision Support Key
♦ 3.2 - DEVELOPMENTAL EVALUATION APPROACH
♦ 3.2.1 - Developmental Evaluation Framework
♦ 3.2.2 - Test Methodology
♦ 3.2.3 - Modeling and Simulation
♦ 3.2.4 - Test Limitations and Risks
♦ Test Limitations Guidance and DT Examples
♦ 3.3 - DEVELOPMENTAL TEST APPROACH
♦ 3.3.1 - Mission-Oriented Approach
♦ Integrated Testing Guidance and Best Practices
♦ 3.3.2 - Developmental Test Events and Objectives
♦ Integrated Testing Guidance and Best Practices
♦ Software Algorithm Testing Guidance and Examples
♦ Reliability Growth Guidance
♦ Cybersecurity OT&E Guidance
♦ 3.4 - CERTIFICATION FOR INITIAL OPERATIONAL TEST AND EVALUATION (IOT&E)
♦ IOT&E Entrance Criteria Guidance and Examples
♦ 3.5 - OPERATIONAL EVALUATION APPROACH
♦ Mission Focused Evaluation Guidance and Examples
♦ Baseline Evaluation Guidance with Best Practices
♦ End-to-End Operational Testing Guidance and Examples
♦ Integrated Testing Guidance and Best Practices
♦ Integrated Survivability Assessment Guidance and Best Practices
♦ Force Protection Evaluation Guidance
♦ Cybersecurity OT&E Guidance
♦ Survey Design and Administration Guidance
♦ 3.5.1 - Operational Test Events and Objectives
♦ Realistic Operational Conditions Guidance and Examples
♦ OT of Software Intensive Systems Guidance and Examples
♦ Cybersecurity OT&E Guidance
♦ 3.5.2 - Operational Evaluation Framework
♦ Operational Evaluation Framework Guidance with Examples
♦ Test Instrumentation Guidance and Examples
♦ Software Evaluation Guidance with Examples
♦ Mission Focused Metrics Guidance with Examples
♦ Scientific Test and Analysis Techniques Guidance with Examples
♦ Production Representative Test Articles Guidance and Examples
♦ Test Resources Guidance and Examples
♦ 3.5.3 - Modeling and Simulation
♦ M&S for OT&E Guidance and Examples
♦ 3.5.4 - Test Limitations
♦ Test Limitations Guidance and Examples
♦ 3.6 - LIVE FIRE EVALUATION APPROACH
♦ LFT&E Strategy Guidance
♦ Integrated Survivability Assessment Guidance and Best Practices
♦ Force Protection Evaluation Guidance
♦ 3.6.1 - Live Fire Test Objectives
♦ 3.6.2 - Modeling and Simulation
♦ M&S for LFT&E Guidance and Examples
♦ 3.6.3 - Test Limitations
♦ Test Limitations Guidance and LFT&E Examples
♦ 3.7 - OTHER CERTIFICATIONS
♦ 3.8 - FUTURE TEST AND EVALUATION
Part IV - Resource Summary
♦ 4.1 - Introduction
♦ Test Resources Guidance and Examples
♦ 4.2 - TEST RESOURCE SUMMARY
♦ 4.2.1 - Test Articles
♦ Production Representative Test Articles Guidance and Examples
♦ 4.2.2 - Test Sites
♦ 4.2.3 - Test Instrumentation
♦ Test Instrumentation Guidance and Examples
♦ 4.2.4 - Test Support Equipment
♦ 4.2.5 - Threat Representation
♦ Threat Representation Guidance and Examples
♦ Cybersecurity Resources Guidance and Examples
♦ 4.2.6 - Test Targets and Expendables
♦ 4.2.7 - Operational Force Test Support
♦ 4.2.8 - Models, Simulations, and Test Beds
♦ 4.2.9 - Joint Operational Test Environment
♦ 4.2.10 - Special Requirements
♦ 4.3 - FEDERAL, STATE, AND LOCAL REQUIREMENTS
♦ 4.4 - MANPOWER / PERSONNEL TRAINING
♦ 4.5 - TEST FUNDING SUMMARY
♦ Test Funding Guidance and Examples
Appendix A - Bibliography
♦ This appendix is self-explanatory. No guidance or examples are provided.
Appendix B - Acronyms
♦ This appendix is self-explanatory. No guidance or examples are provided.
Appendix C - Points of Contact
♦ This appendix is self-explanatory. No guidance or examples are provided.
Appendix D - Scientific Test and Analysis Techniques
♦ Appendix D is not required if the scope of the T&E strategy is fully explained and justified by scientific techniques in the body of the TEMP.
♦ Guidance
♦ STAT Guidance
♦ DOE Guidance
♦ Bayesian Guidance
♦ Examples
♦ STAT Common Designs
♦ Observational Example
♦ Bayesian Example
♦ DOE Examples
♦ a. DOE TEMP Body Example
♦ b. DOE Appendix D Artillery Example
♦ c. DOE Appendix D Precision Guided Weapon Example
♦ d. DOE Appendix D Software Intensive System Example
Appendix E - Cybersecurity
♦ Appendix E is not required if the cybersecurity strategy is fully explained in the body of the TEMP.
♦ Cybersecurity OT&E Guidance
Appendix F - Reliability Growth Plan
♦ Appendix F is not required if the reliability growth strategy is fully explained in the body of the TEMP.
♦ Guidance
♦ Reliability Growth
♦ Reliability Growth for Ships
♦ Reliability Test Planning
♦ Examples
♦ Reliability Growth Example
♦ Software Reliability Tracking Example
♦ New Ship Example
♦ Mature Ship Example
Appendix G - Requirements Rationale
♦ Appendix G is not required if the rationale for requirements is fully explained in reference documents or in the body of the TEMP.
♦ Requirements Rationale Guidance