
PROGRAM EVALUATION

Number: DAO 216-21

Effective Date: 2021-08-26

SECTION 1. PURPOSE.

.01 This Department Administrative Order (DAO) establishes Department standards for developing and using program evaluation findings and other evidence to improve efficiency and impact.

.02 It also establishes a system for regular review of programs and processes to improve their value and customer experience.

SECTION 2. AUTHORITY.

.01 This Order is issued pursuant to the Foundations for Evidence-Based Policymaking Act of 2018 ("Evidence Act"), Pub. L. No. 115-435, 132 Stat. 5529 (2019).

SECTION 3. SCOPE.

.01 This Order applies to all bureaus, offices, and operating units of the Department. In accordance with the American Inventors Protection Act, Pub. L. No. 106-113, the U.S. Patent and Trademark Office shall be subject to the policy direction of the Department as provided in this Order, but otherwise shall retain responsibility for decisions regarding the management and administration of its operations and shall exercise independent control of its budget allocations and expenditures, personnel decisions and processes, procurements, and other administrative and management functions.

.02 Funding opportunities and awards, such as grants, contracts, and cooperative agreements, will be conducted in accordance with the standards and principles contained in this policy.

SECTION 4. DEFINITIONS.

.01 Evaluation - As defined by 5 U.S.C. 311(3), "evaluation" means using systematic data collection and analysis of one or more programs, policies, or organizations to assess their effectiveness and efficiency. Evaluations can provide critical information to inform decisions about current and future programming, policies, and organizational operations. Evaluations can include an assessment of projects or interventions within a program or aspects of a policy or unit within an organization. Evaluations may address questions related to implementation; the effectiveness of specific strategies; and/or factors that relate to variability regarding effectiveness. Evaluations can also examine questions related to understanding the contextual factors surrounding a program, as well as how to effectively target specific populations or groups for an intervention. Evaluations can and should be used for learning and improvement purposes, as well as accountability. See OMB Memorandum M-20-12.

.02 Impact Evaluations - This type of evaluation estimates and compares outcomes with and without the program, policy, or organization. Impact evaluations include both experimental (i.e., randomized controlled trials) and quasi-experimental (i.e., comparison group that did not receive the service or intervention) designs. An impact evaluation can help answer questions such as "Does it work?" and "Did the intervention cause the observed outcomes?"

.03 Evidence - Defined by 44 U.S.C. 3561(6), "evidence" means information produced using statistical methods for a statistical purpose. Evidence, as applied in this DAO, and in the context of the Federal Performance Framework for improving organizational and agency performance, includes the available body of facts or information indicating whether a belief or proposition is true or valid. As such, evidence can be quantitative or qualitative and may come from a variety of sources, including foundational fact finding (e.g., aggregate indicators, exploratory studies, descriptive statistics, and other research), performance measurement, policy analysis, and program evaluation (see OMB Memorandum M-19-23). Evidence has varying degrees of credibility, and the strongest evidence generally comes from a portfolio of high-quality, credible sources rather than a single study.

SECTION 5. STANDARDS.

.01 Sections 6 and 7 of this DAO establish the approach offices and bureaus will use to develop and document priorities for evaluations and evidence. The standards listed in this section will underpin the design, execution, and deployment of evidence and evaluations.

a. Relevance and Utility - Evaluations must address questions of importance (i.e., high impact and aligned with Department priorities) and serve the information needs of stakeholders. Evaluations should present findings that are actionable and timely. Findings should inform actions such as budgeting, program improvement, process design, accountability, management, regulatory action, and policy development.

b. Rigor - Evaluations must produce findings that Federal agencies and their stakeholders can confidently rely upon and provide clear explanations of limitations. An evaluation must have the most appropriate design and methods to answer key questions, while balancing its goals, scale, timeline, feasibility, and available resources.

c. Independence and Objectivity - Evaluations must be objective. Evaluators should involve stakeholders but have an appropriate level of independence in the planning and conducting of evaluations and in the interpretation and dissemination of findings, avoiding conflicts of interest, bias, and other partiality.

d. Transparency - Evaluations must be transparent. Evaluation design and findings should be made public to the greatest extent possible and should only be withheld for legal, ethical, or national security concerns. Decisions about an evaluation's purpose and objectives (including internal versus public use), the stakeholders who will have access to details of the work and findings, the design and methods, and the timeline and strategy for releasing findings should be clearly documented before the evaluation is conducted. Decisions should consider any legal, ethical, national security, or other constraints for disclosing information. Findings should provide enough detail so that others can review, interpret, or replicate/reproduce the work.

e. Ethics - Evaluations must be conducted to the highest ethical standards. Evaluations should be planned and implemented to safeguard the dignity, rights, safety, and privacy of participants and other stakeholders. Evaluators should be equitable, fair, and just, and consider cultural and contextual factors that could influence the findings or their use.

SECTION 6. EVALUATION/EVIDENCE DOCUMENTS.

.01 The documents listed in this section are part of the Federal Performance Framework that is standard for all major agencies. The Framework creates a high-level but uniform approach to organizational learning and evolution. The Framework sets a schedule and specifications for: strategic planning, budgeting, monitoring plan implementation, developing evidence on effectiveness and impact, and using the evidence to improve program delivery, return on investment, and subsequent plans. All Department bureaus and offices are required to collaborate on the development and refinement of the required plans and reports.

.02 Multi-Year Learning Agenda - The Evidence Act of 2018 requires that agencies develop Learning Agendas as part of the Strategic Planning process that coincides with presidential term cycles. In the year after a president's inauguration, agencies are required to develop a four-year Strategic Plan and a multi-year Learning Agenda of evidence and evaluations that refine the agency's approach to its Strategic Objectives. The Commerce Department Strategic Plan and Learning Agenda are developed through a cross-bureau collaboration led by the Department Performance Improvement Officer (PIO), the Evaluation Officer (EO), and the Director of the Office of Policy and Strategic Planning in the immediate Office of the Secretary.

.03 Annual Evaluation Plan - The Department is required to develop a fiscal year Evaluation Plan that is submitted to OMB in September with the draft Agency Performance Plan and Report and the initial budget proposal. The most significant questions selected for evaluation in the year of the proposed budget are reported in the Department Annual Evaluation Plan. In collaboration with bureau staff identified by bureau leadership, the Evaluation Officer (EO) issues annual instructions on the process, templates, and schedule for developing the Department Annual Evaluation Plan. Bureaus have bureau-specific processes for composing and clearing the sections of the plan they submit for Department consideration.

.04 Capacity Assessment for Research, Evaluation, Statistics, and Analysis - The Department is required to develop an assessment of resources available for evidence and evaluation projects and identify any resource or skill gaps that present challenges. The assessment will be done every four years on the same schedule as the Strategic Plan and Learning Agenda. Led by the EO, with the Statistical Official (SO), Chief Data Officer (CDO), and bureau representation, the assessment will address the coverage, quality, methods, effectiveness, and independence of the statistics, evaluation, research, and analysis efforts of the Department. The EO will establish the approach and format for the assessment in consultation with executive and staff collaborators.

.05 Annual Strategic Review - An Annual Strategic Review is conducted every spring and is the process for multi-bureau Strategic Objective collaborators to assess progress on their Strategic Objectives. This process, led by the Performance Improvement Officer (PIO) and the Deputy PIO, should provide input on priority research questions for the Annual Evaluation Plan.

.06 Bureau Learning Agendas, Evaluation Plans, and Capacity Assessments - Bureaus are encouraged to develop bureau-specific Learning Agendas, Evaluation Plans, and Capacity Assessments. The multi-year Learning Agendas can include both evidence and evaluations and can be synchronized with planning and evaluation cycles specific to the bureau and existing systems for stakeholder and advisory group input. Bureau plans facilitate the development of evaluations and evidence that support multiple levels of the organization. The elements of bureau plans and assessments that are most significant to the Department's Strategic Plan will also be included in the Department-level documents.

SECTION 7. EVIDENCE/EVALUATION SYSTEMS.

.01 Impact Evaluations will be included in Learning Agendas and Annual Evaluation Plans of the Department in accordance with Section 7.02. Impact Evaluations address the extent to which a program, strategy, or policy is achieving the public benefits it was designed to achieve, such as additional employment opportunities, accelerated innovation, or increased public safety. Impact Evaluations typically require inferential statistics and/or expertise in other advanced research methods. Bureaus will establish systems to ensure that policies, programs, and procedures reflect impact evaluation findings on the relative effectiveness of alternatives.

.02 If a bureau has developed a Learning Agenda, the question(s) addressed, and schedule should be established in that document. Otherwise, a bureau will conduct at least one policy/program evaluation using a rigorous methodology in each four-year Strategic Plan cycle. The question(s) addressed may be broad and assess the impact of an entire program or initiative. They may be narrow and assess a specific strategy or targeting approach. The question(s) should be selected based on the potential impact of the findings on effectiveness or efficiency, the feasibility of the research, and resources available for the evaluation. Priority should be given to questions that recur in deliberations with OMB and/or Congress or have been recommended by stakeholder advisory committees, the Government Accountability Office or the Inspector General.

.03 Process Evaluations will be conducted by all bureaus and all offices reporting to the Deputy Secretary or Chief Financial Officer. A process is the sequence of steps taken to create or deliver a product or service, e.g., the steps to reach a decision on a patent application, the sequence of actions taken to fill a position vacancy, or the steps a person follows to submit or review a grant application. Processes that are fundamental to delivering services to the public or Department staff must be periodically assessed in accordance with bureau/office Learning Agendas. A bureau will conduct at least one process assessment in each Strategic Plan cycle. This review may cover an entire service delivery process or a segment that is important to the quality and timeliness of service. Reviews will assess whether the process is meeting standards for cycle time, error rate, and customer satisfaction, and whether new technology should be employed to improve service. A reliable process map and revised performance standards are recommended products of the review.

.04 To maintain process performance on an ongoing basis, all major processes should have a designated process owner. The process owner is charged with regularly monitoring data on cycle time, error rate, rework, and customer satisfaction and with taking action when standards for the process are compromised. Process owners are authorized to work pre-emptively and cross-functionally to correct problems as they emerge. Process performance is central to each owner's individual performance plan.

SECTION 8. ROLES AND RESPONSIBILITIES.

.01 Deputy Secretary - The Deputy Secretary is the Chief Operating Officer of the Department and will receive a biannual report from the Department EO on compliance with this DAO. Bureaus will provide the Deputy Secretary with an abstract on the findings of evaluations and the related improvement actions within a month of completing an evaluation/evidence project.

.02 Evaluation Officer - A Department EO is designated pursuant to the Evidence Act. The EO will collaborate with and assist bureaus and offices in complying with this DAO; will facilitate cross-bureau/office collaboration on evaluations; and will oversee compliance with all evaluation requirements established in law or OMB directives. The EO will identify impediments to Commerce staff using data to develop evidence and will work with the Chief Data Officer (CDO) and the Statistical Official (SO) to improve access and usability of data.

.03 The Chief Data Officer, Statistical Official, and Performance Improvement Officer - The CDO, SO, and PIO will collaborate with the EO on development and implementation of the Learning Agenda and the Annual Evaluation Plan by taking actions that facilitate data availability (CDO), the development of rigorous evaluation methodology (SO), incorporating evidence in the Strategic Plan, and the Annual Performance Plan and Report (PIO). Further, the SO will champion confidentiality protection, appropriate data access, and full compliance with legal or ethical restrictions on use or release of data.

.04 Bureau Heads - Bureau leadership is responsible for incorporating evaluation and evidence into internal decision making and program and process development. Bureau Heads will designate a bureau Evaluation Lead to collaborate with the Department EO and support a bureau culture of continuous improvement and learning. Bureau Evaluation Leads will have standard language describing this function in their performance plans. Bureau Heads will ensure that all Senior Executives contribute to developing the Department Annual Evaluation Plan and have improvement projects and related performance indicators in their performance plans.

SECTION 9. BUDGET INTERFACE.

.01 Budget Submissions - Systematic and regular development of evidence and evaluations is required by the Evidence Act. To the extent possible, bureau budget submissions to the Department should clearly identify the resources proposed for developing evidence and evaluations. Available evaluation and evidence findings should be foundational to budget narratives and justifications.

SECTION 10. HUMAN CAPITAL FOR EVIDENCE AND EVALUATION.

.01 Department/Bureau Staff - To advance staff skills in data analysis and evaluation, bureaus/offices are encouraged to engage their staff and cross-functional teams in evidence building and evaluations. Process evaluations often are best accomplished by the staff who execute and use a process. This approach not only improves process performance but facilitates implementation and avoids increased costs in a resource constrained environment. However, to maintain objectivity, an evaluation should be led by a qualified individual who is not a stakeholder in the outcome. Program staff should help evaluators understand the program and key questions. Evaluation staff should help the program staff understand the research methodology used, the rationale for its use, and limitations of findings. Evaluation teams may also include process/program customers (user or beneficiary); at minimum, their input should be part of the research approach.

.02 Contractor Support - Major evaluations may be conducted via contract. Alternatively, to control costs and build internal evaluation capacity, contractors may play the more limited role of providing expert advice to internal teams. While maintaining objectivity, all evaluations should involve the staff who administer a program or provide a service on the front line.

SECTION 11. DATA FOR EVIDENCE AND EVALUATION.

.01 Statistical and Administrative Data - Evaluators and analysts should use existing administrative and statistical data for their analysis whenever feasible, consistent with applicable laws and regulations. This will build internal skills in accessing and analyzing government data and will reduce the cost and burden of data collection. Evaluators and analysts will identify difficulties in accessing and using data and involve the CDO, SO and EO in resolving the problems.

SECTION 12. RESOURCES.

a. Foundations for Evidence-Based Policymaking Act of 2018;

b. OMB Memorandum M-19-23, Phase 1 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Learning Agendas, Personnel, and Planning Guidance; and

c. OMB Memorandum M-20-12, Phase 4 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Program Evaluation Standards and Practices.

SECTION 13. EFFECT ON OTHER ORDERS.

This Order establishes Department Administrative Order 216-21, Program Evaluation.

Signed by: Acting Chief Financial Officer & Assistant Secretary for Administration

Approved by: Deputy Secretary of Commerce

Office of Primary Interest: Office of the Under Secretary for Economic Affairs