
Directorate of Intelligence

Notes on Analytic Product Evaluation

April 1995

Note 2


Access and Credibility

This is the second in a series of Product Evaluation Staff notes to clarify the standards used for evaluating DI assessments and to provide tradecraft tips for putting the standards into practice.

Enhancing the policy utility and analytic quality of DI papers and briefings centers on promoting analysts' access to and credibility with the US officials who count most in policymaking, warfighting, and law enforcement.

The two standards are complementary. Without credibility, analysts will lose their access to the hands-on policy and operational officials who need sound and reliable intelligence analysis to succeed in their professional missions. Without access, even unquestionably objective assessments do little to promote the effective use of intelligence by consumers.

As indicated below, the nine standards the Product Evaluation Staff uses to evaluate DI analytic performance reflect the importance of increasing access and protecting credibility:

Increasing Access

Addressing US Interests. The most important and demanding charge given to DI analysts is to assist US officials in fulfilling their obligation to design, implement, and monitor national security policy. The highest standard is to structure DI assessments to underscore what officials need to know to get their jobs done (for example, about vulnerabilities and strengths of adversaries). When this is not practical, DI deliverables should contain a carefully crafted section that addresses implications, dangers, and opportunities with the role of US policymakers, warfighters, and law enforcement officials as action officers clearly in mind.

Sophistication of the Analysis/Depth of Research. Especially in broadly distributed and fully developed assessments, analysts have to convey their distinctive expertise on an issue. This demonstration of authority to speak to the issue should be based on some combination of research methodology, firsthand knowledge of the country or subject, all-source databases, closeness to collectors, and clear articulation of facts and assumptions - attributes that make a busy policymaker want to come back to the analyst for additional information and analysis.

Unique Intelligence Information. Especially in quick turnaround deliverables for small numbers of key officials, the analyst should make appropriate use of unique, at times highly classified, information that provides insights not otherwise available to well-informed officials. Policymakers and warfighters who have the action on key issues almost always have the clearances as well. At times they need assistance, however, in understanding the context and character of intelligence from clandestine collection and other special sources.

Effective Summary. The analyst's charge is to convey the distinctive values of the paper in the prescribed space. To maximize service to heavily engaged officials, the most important findings and judgments should be highlighted within a bare-bones depiction of the general context. Even more than in the main text, the summary should be made "actionable" via emphasis on US policy implications.

Maintaining Credibility

The Facts - Or What We Know. With emphasis on what is new, different, and attention-worthy, a DI assessment should set out what the Directorate knows with sufficient confidence to warrant reliance by policymakers and warfighters in planning and executing US courses of action. When relevant, DI deliverables should also address what analysts do not know that could have significant consequences for the issue under consideration. If the paper notes intelligence gaps, it should, when appropriate, suggest collection strategies to fill the gaps.

Sources of the Facts - How We Know It. Within the general rules for using evidence, DI assessments have to depict the sources of information on which consumers are asked to rely. Direct evidence (for example, imagery and most intercepts) should be distinguished from testimonial evidence (for example, most clandestine and embassy reporting). On complex matters (for example, the attitudes and plans of foreign leaders), analysts should make explicit their levels of confidence in the evidence.


Conclusions. Assessments should enunciate the conclusory findings from the hard evidence (for example, well-documented events) in terms of trends, patterns, and precedents that underscore dangers or opportunities for US policymakers, warfighters, and law enforcement officials. When appropriate, the calculus that leads to the extension of the factual base to an actionable finding should be spelled out (for example, rules used to establish degrees of risk in dual-purpose technological transfers).

Clear Articulation of Assumptions. When analysts address uncertainty - matters that require interpretations and estimates that go well beyond the hard evidence - their argumentation must clarify the premises, suppositions, and other elements of critical thinking that underlie the judgments. For example, effective argumentation for consumers, who will often have their own strong opinions, requires the analyst to clarify not only degree of confidence in key assumptions but also the criticality of the latter to bottom-line judgments.


Outlook. The outlook sections of estimative assessments (for example, the likely course and impact of political, economic, and military developments in foreign countries) should identify the dynamics that will have the greatest impact on subsequent developments. In other words, what are the drivers that will determine the outcome, or what drivers would have to change to alter the outcome?
