Research Article | SCIENCE POLICY

A multistage crucible of revision and approval shapes IPCC policymaker summaries


Science Advances  05 Aug 2016:
Vol. 2, no. 8, e1600421
DOI: 10.1126/sciadv.1600421

Abstract

Intergovernmental Panel on Climate Change (IPCC) member governments approve each report’s summary for policymakers (SPM) by consensus, discussing and agreeing on each sentence in a plenary session with scientist authors. A defining feature of IPCC assessment, the governmental approval process builds joint ownership of current knowledge by scientists and governments. The resulting SPM revisions have been extensively discussed in anecdotes, interviews, and perspectives, but they have not been comprehensively analyzed. We provide an in-depth evaluation of IPCC SPM revisions, establishing an evidential basis for understanding their nature. Revisions associated with governmental review and approval generally expand SPMs, with SPM text growing by 17 to 53% across recent assessment reports. Cases of high political sensitivity and failure to reach consensus are notable exceptions, resulting in SPM contractions. In contrast to recent claims, we find that IPCC SPMs are as readable, for multiple metrics of reading ease, as other professionally edited assessment summaries. Across reading-ease metrics, some SPMs become more readable through governmental review and approval, whereas others do not. In an SPM examined through the entire revision process, most revisions associated with governmental review and approval occurred before the start of the government-approval plenary session. These author revisions emphasize clarity, scientific rigor, and explanation. In contrast, the subsequent plenary revisions place greater emphasis on policy relevance, comprehensiveness of examples, and nuances of expert judgment. Overall, the value added by the IPCC process emerges in a multistage crucible of revision and approval, as individuals together navigate complex science-policy terrain.

Keywords
  • Climate change
  • Intergovernmental Panel on Climate Change
  • assessment
  • science-policy interface
  • climate change communications

INTRODUCTION

The mandate of the Intergovernmental Panel on Climate Change (IPCC) is to provide comprehensive, balanced, and policy-neutral stocktaking of current knowledge on climate change, its causes and impacts, and the options for response (1). Since 1988, IPCC assessments have provided a powerful and enduring foundation for decision-making and public understanding, and they have also pointed to opportunities for scientific discovery. Expert author teams develop IPCC assessments of the scientific literature through multiple rounds of drafting and monitored scientific review, incorporating feedback of experts and governments from around the world (2).

A defining feature of IPCC assessment is governmental approval of each report’s summary for policymakers (SPM). Following years of report development, governmental approval builds shared ownership of IPCC assessment by both scientists and governments. The process begins with circulation of an SPM government review draft among IPCC member governments, once the underlying assessment report is complete (2). Government reviewers provide written comments on the SPM draft. Addressing these government review comments, the SPM scientist author team then revises the draft in advance of an in-person government-approval plenary session. In the plenary session, IPCC member governments approve the SPM by consensus, proceeding sentence by sentence with scientist authors (2). SPM approval requires agreement that the SPM is consistent with the underlying assessment report. Government delegates make verbal comments as an approval plenary proceeds, reiterating submitted written comments, raising new issues or possible amendments, or responding to the plenary’s ongoing dialogue. To reach consensus, the scientist author team, led by an IPCC working group’s scientist cochairs or the IPCC chair, provides explanations and proposes revisions in real time to address government comments raised from the floor.

Hundreds of individuals participate in IPCC SPM approval plenaries, which are intense, diversely interpreted experiences. Anecdotes and perspectives on IPCC governmental approval have been widely documented in the scientific literature (3–11), media (12–15), blogs (16, 17), and scientific memoirs (18–20). Approval-plenary interpretations for the recent IPCC Fifth Assessment Report (AR5) range from assertions of politicization resulting in summaries by policymakers (17, 21, 22) to underscoring of the unique value added (9). Others have simultaneously called for improving SPM readability for nonspecialists (23, 24).

Despite the importance of and spotlight on IPCC SPM governmental approval, the resulting SPM revisions have not been comprehensively analyzed. One figure in Barkemeyer et al. (24) compares a simple reading-ease metric (25) across SPM drafts, providing a glimpse of the value of objective analyses of SPM revisions. Revisions associated with governmental review and approval may affect SPMs in a variety of ways. Within a single SPM, revisions could increase readability and accessibility, improve scientific rigor, and enhance relevance to decision-making. Alternatively, revisions could politicize or distort scientific assessment or degrade its clarity. In other cases, revisions may not meaningfully change the SPM, although the approval process itself may build governmental understanding and ownership of the science.

Here, we provide an in-depth evaluation of IPCC SPM revisions associated with governmental review and approval. Focusing on the eight SPMs of the IPCC Fourth Assessment Report (AR4) and AR5, we ask fundamental questions: How and why do SPMs change during governmental review and approval? What are the effects of revisions? For future IPCC SPM approvals, what lessons does our analysis imply? We shed light on IPCC SPM revisions through three specific analyses: (i) For the eight AR4 and AR5 SPMs, we evaluate SPM expansion and contraction resulting from the government-approval process. (ii) For the SPM for which we have highest familiarity [Working Group II (WGII) AR5], we comprehensively survey revisions made and their purposes, both before and during the government-approval plenary session. (iii) For the eight SPMs, we complete a linguistic analysis of readability metrics to evaluate the effects of revisions made. Our overall goal is to establish an evidential basis that helps ground understanding of IPCC governmental approval, including value added and subtracted in a unique science-policy interaction.

RESULTS

IPCC SPMs expand, mostly, through governmental review and approval

The IPCC’s AR4 and AR5 include eight separate SPMs for Working Group I (WGI) (physical science basis) (26, 27), WGII (impacts, adaptation, and vulnerability) (28, 29), Working Group III (WGIII) (mitigation) (30, 31), and the Synthesis Report (SYR) (32, 33). Revisions associated with governmental review and approval expand text overall for all eight SPMs (Fig. 1A; t tests: P = 0.02 for AR4, P = 0.003 for AR5). Not including words associated with figures and tables, AR4 SPMs increase from a mean of 5572 words (range, 4917 to 6807) to 7270 words (range, 6422 to 8166), corresponding to increases of 17% (WGII AR4) to 53% (WGIII AR4). AR5 SPMs increase from a mean of 8309 words (range, 7451 to 9250) to 10,518 words (range, 9283 to 11,732), corresponding to increases of 20% (WGI AR5) to 35% (SYR AR5). The percentage text increase due to revisions is similar across AR4 and AR5 SPMs (t test: P = 0.61), although AR5 SPMs are significantly longer than AR4 SPMs (t tests: government review drafts, P = 0.004; final SPMs, P = 0.003).

Fig. 1 Overall, IPCC SPMs generally expand following revisions associated with governmental review and approval.

(A) For the SPMs (WGI, WGII, WGIII, and SYR) of two IPCC assessment reports (AR4 and AR5), word length increases in all cases. (B and C) The number of figures (B) and tables (C) increases for many but not all SPMs. In each panel, the government review draft is the SPM draft circulated to IPCC member governments for written government review comments. The subsequent author-revised draft (shown only for the WGII AR5 SPM) is developed in advance of the plenary session in which IPCC member governments approve the SPM. The final SPM is the approved SPM, which includes further revisions from the plenary session as well as final production (that is, layout and minor copyediting). Words associated with figures and tables are not included in totals in (A). The percentage increase in word length, from government review draft to final SPM, is shown for each SPM in (A). For the WGII AR5 SPM, the percentage increase from the government review draft to the author revision draft (13%) is also listed.

Revisions associated with governmental review and approval differentially affect the number of figures (Fig. 1B), as well as figure panels (fig. S1), across IPCC AR4 and AR5 SPMs. The total number of figures increases in half of the AR4 and AR5 SPMs (WGI AR4 and AR5, SYR AR4, and WGII AR5) and decreases in two SPMs (WGII AR4 and WGIII AR5). Figure panels are lost in three SPMs (WGII AR4 and AR5 and WGIII AR5). For the WGIII AR5 SPM, the 10 figure panels and three figures that were lost showed greenhouse gas emissions with countries categorized by region or income group.

Revisions associated with governmental review and approval also differentially affect the number of tables in IPCC AR4 and AR5 SPMs (Fig. 1C). The number of tables increases in most AR4 and AR5 SPMs (WGI AR4 and AR5, WGIII AR4 and AR5, SYR AR4, and WGII AR5), decreasing only in the WGII AR4 and remaining unchanged only in the SYR AR5.

Individual paragraphs both grow and shrink in revisions associated with governmental review and approval (Fig. 2). As a measure of text contraction versus expansion, we calculate a ratio F as the sum of words deleted divided by the sum of words added. F is greatest, exceeding 0.3, for the WGII AR4 and WGIII AR5 (Fig. 2), highlighting substantial deletions that occurred during their revision.
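
In symbols, this ratio is simply

F = \frac{\sum \text{words deleted}}{\sum \text{words added}},

so values of F near zero indicate revision dominated by additions, whereas larger values highlight substantial deletions relative to additions.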

Fig. 2 Individual IPCC SPM paragraphs differentially grow and shrink across SPMs, following revisions associated with governmental review and approval.

For the SPMs (WGI, WGII, WGIII, and SYR) of two IPCC assessment reports (AR4 and AR5), net word changes are shown for different paragraph revision categories, comparing individual paragraphs in the government review draft to their counterparts in the final SPM. The paragraph revision categories are defined as follows: paragraphs that expanded, paragraphs that were newly added, paragraphs for which material was rearranged given splitting and/or merging of paragraphs, paragraphs that contracted, and paragraphs that were deleted. For each SPM, F is the sum of words deleted divided by the sum of words added. Words associated with figures and tables are not included in depicted net word changes.

Across AR5 SPM revisions, we also evaluate several examples of politically sensitive material emphasized in published perspectives and notes (7, 8, 11, 16, 17, 21, 34–36). For some of these examples, political sensitivity may have been amplified, given the relevance for ongoing climate negotiations toward the 2015 Paris Agreement under the United Nations Framework Convention on Climate Change (UNFCCC). In our observations, supported by published notes, these examples were generally discussed at particular length during government-approval plenary sessions, and sensitivity about them sometimes affected more than one AR5 approval session. For the WGI AR5 SPM, findings on observed temperature increase expand by 64% (from 132 words in the government review draft to 217 words in the final SPM), findings on decadal variability and the so-called “hiatus,” featured in preplenary media coverage, expand by 137% (from 54 to 128 words), and findings on cumulative emissions relevant to climate policy targets expand by 90% (from 145 to 275 words). For the WGII AR5 SPM, material on transformation, that is, fundamental change relevant to climate change response, contracts by 6% (from 163 words in the government review draft to 153 words in the final SPM); findings on global economic impacts, featured in media coverage of the leaked government review draft, expand by 129% (from 113 to 259 words); and a box presenting findings relevant to UNFCCC Article 2 expands by 33% (from 445 to 593 words). For the WGIII AR5 SPM, a section introducing approaches to mitigation, including ethical dimensions, expands by 13% (from 911 words in the government review draft to 1027 words in the final SPM, comparing analogous paragraphs across SPM versions), whereas a section on international cooperation on mitigation contracts by 66% (from 677 to 233 words, comparing analogous paragraphs). For the SYR AR5 SPM, a box presenting information relevant to UNFCCC Article 2 is deleted in its entirety (156 words in the government review draft).

WGII AR5 SPM author revisions outnumber plenary revisions and differ in purpose

The IPCC WGII AR5 SPM includes 886 total revisions associated with governmental review and approval. Of these revisions, 598 are author revisions introduced in full or in part in the author-revised draft developed in advance of the government-approval plenary session. The remaining 288 revisions are plenary revisions made during the plenary session. Revisions here are defined at the smallest practical scope of change; for example, if adjustments to a sentence’s component phrases are made for different reasons, we tally them as separate revisions.

We assign two revision purposes to each revision in the WGII AR5 SPM, spanning text, figures, and tables. Table S1 describes the 11 revision-purpose categories, and data file S1 lists the SPM revisions and their purposes. Overall, most revisions focus on improving clarity, scientific rigor, explanation, and comprehensiveness of examples and concepts (Fig. 3A). These four categories together account for 58% of total revision purposes. Most author revisions focus on clarity, scientific rigor, and explanation (Fig. 3B). These three categories account for 50% of author revision purposes. By contrast, plenary revisions focus more heavily on policy relevance, comprehensiveness, and explanation, as well as nuances of expert judgment (emphasis and balance). These categories account for 61% of plenary revision purposes.

Fig. 3 Author revisions, rather than plenary revisions, dominate IPCC WGII AR5 SPM changes associated with governmental review and approval.

We assign two purposes to each revision in the WGII AR5 SPM, spanning text, figures, and tables (see data file S1). Table S1 describes the 11 revision-purpose categories. (A) Distribution of revision purposes for the 886 total revisions in the SPM, comparing the government review draft to the final SPM. (B) Distribution of revision purposes for the 598 author revisions introduced in full or in part in the author-revised draft developed in advance of the government-approval plenary session. (C) Distribution of revision purposes for the additional 288 plenary revisions made during the plenary session. For each panel, the percentage of purposes in each revision category is indicated at the end of each bar. For 136 revisions, a single revision purpose fully characterizes the revision. The corresponding purpose designation of “no secondary purpose” [136 for (A), 107 for (B), and 29 for (C)] is not plotted within the figure. Of the total revisions (A), 67.5% were author revisions (B) and 32.5% were plenary revisions (C).

The ratio of author to plenary revision purposes is shown in Fig. 4 for each revision category. Revisions focused on background context, scientific rigor, typographical adjustment, and clarity are made predominantly as author revisions. By contrast, revisions related to the plenary process, policy relevance, comprehensiveness, and nuances of expert judgment (balance and emphasis) receive greater relative emphasis during plenary revisions. Plenary revisions outnumber author revisions for only two revision purposes: revisions resulting from the rules and conventions of the approval process and revisions addressing policy relevance.

Fig. 4 Author and plenary revisions in the IPCC WGII AR5 SPM emphasize different purposes.

Each revision in the SPM is assigned two purposes (see data file S1). For each revision-purpose category (defined in table S1), this figure shows the proportion of purposes made as author versus plenary revisions. Author revisions were introduced in full or in part in the author-revised SPM draft developed in advance of the government-approval plenary session. Plenary revisions were made during the plenary session. The number of purposes in each revision category and stage is indicated at the end of each bar.

Revision purposes differ across WGII AR5 SPM text, figures, and tables (fig. S2). Text and figure revisions focus on clarity, scientific rigor, and explanation (fig. S2, A and B). By contrast, revisions increasing policy relevance and comprehensiveness are the most common table revisions (fig. S2C).

For both paragraphs that grow and paragraphs that shrink, clarity and scientific rigor are dominant revision purposes (fig. S3). Revisions to paragraphs that grow also emphasize explanation and comprehensiveness (fig. S3A). Revisions to paragraphs that shrink include greater focus on nuances of expert judgment (emphasis) and typographical adjustments (fig. S3B).

IPCC SPMs become more and less readable through governmental review and approval

We evaluate SPM readability through multiple metrics of text reading ease (Fig. 5). In reporting readability of SPM text, we exclude text associated with figures and tables, as well as references to the underlying report.

Fig. 5 IPCC SPMs become more and less readable through governmental review and approval across linguistic reading-ease metrics.

(A to F) For the SPMs (WGI, WGII, WGIII, and SYR) of two IPCC assessment reports (AR4 and AR5), six reading-ease metrics are plotted for the government review draft versus the final SPM. For each metric, higher values correspond to greater reading ease. SPM values falling above each plot’s diagonal line indicate increased readability for the metric in the final SPM as compared to the government review draft; values below the diagonal line indicate decreased readability for the metric. (A) Flesch Reading Ease is a simple metric of text ease, calculated as a function of sentence and word length. (B to F) Coh-Metrix metrics characterize multiple dimensions of text ease. (B) High values of narrativity indicate text that communicates a story or sequence with familiar words, whereas low values reflect informational texts with unfamiliar words. (C) Higher values of syntactic simplicity indicate more simple, familiar syntactic structures and shorter sentences, whereas lower values correspond to structurally complex sentences. (D) Referential cohesion captures the extent to which words and ideas are explicitly connected across the text; higher values reflect more overlap in words and ideas. (E) Deep cohesion encompasses the use of connecting words that clarify relationships among events and concepts across the text; higher values correspond to greater use of connecting words. (F) Higher values of word concreteness indicate words that are more concrete, meaningful, and imaginable, as compared to abstract. Reference texts are assessment and report summaries in which scientist authors worked with input from professional editors to clarify text. Reference-text values are plotted along the diagonal in each panel.

Flesch Reading Ease is a simple metric of text reading ease, calculated as a function of sentence and word lengths (25). High Flesch Reading Ease scores indicate short sentences and words, whereas low scores indicate long sentences and words. Half of the SPMs (final SPMs for WGI AR4 and AR5, WGII AR4, and SYR AR4) have Flesch Reading Ease scores similar to World Bank (37–40) and U.S. National Climate Assessment (41) report summary text for which scientist authors worked with professional science editors to improve readability (Fig. 5A). By contrast, the other SPMs (WGIII AR4 and AR5, WGII AR5, and SYR AR5) have Flesch Reading Ease scores notably lower than these reference texts. Flesch Reading Ease increases for half of the SPMs (WGI AR4 and AR5, SYR AR4, and WGII AR5) following revisions associated with governmental review and approval (Fig. 5A). [This result for revisions contrasts with the conclusion of Barkemeyer et al. (24); in our analysis, we adopt the more rigorous approach of consistently excluding references to the underlying reports, which skew word- and sentence-length measures and are not parsed by the reader as normal text, and we also consistently exclude text associated with figures and tables, which was inconsistently included across SPM text version files used by Barkemeyer et al. (24).]
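
For reference, the standard formulation of this metric (25) combines average sentence length with average syllables per word:

\text{Flesch Reading Ease} = 206.835 - 1.015\left(\frac{\text{total words}}{\text{total sentences}}\right) - 84.6\left(\frac{\text{total syllables}}{\text{total words}}\right),

so longer sentences and more polysyllabic words both lower the score.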

Flesch Reading Ease has been used for decades as a simple, easy-to-calculate metric of text reading ease. However, in multiple ways, it inadequately captures essential features of readability for scientific texts (42, 43). First, it does not characterize meaning, cohesion, or text-genre differences fundamental in shaping reading experiences, especially for nonexperts; we consider these dimensions further in the Discussion. Second, metrics based on sentence and word lengths become less successful in predicting difficulty for reading levels above sixth to eighth grade, diminishing their relevance for IPCC SPMs. As a further specific example, high SPM Flesch Reading Ease scores can be driven by frequent use of short acronyms and units (for example, RF, CO₂, ppm, and W m⁻²), especially for WGI, emphasizing the limits of simple word-length measures in indicating readability for scientific texts. For IPCC AR4 and AR5 SPMs, we therefore evaluate changes in multiple dimensions of text reading ease on the basis of in-depth linguistic analyses through Coh-Metrix (42–44). Here, we report results for five overarching readability metrics that reflect multiple levels of language and discourse important for understanding SPM readability and comprehension.

First, narrativity (Fig. 5B) indicates the degree to which text is story-like (higher z-score values) as compared to informational (lower z-score values). All AR4 and AR5 SPMs have low narrativity values, reflecting informational text. The SPM narrativity values broadly fall in the value range for World Bank and U.S. National Climate Assessment reference texts (37–41); only the WGII AR5 SPM falls below all reference values. Three SPMs (WGIII AR5 and SYR AR4 and AR5) have improved narrativity following governmental review and approval.

Second, syntactic simplicity (Fig. 5C) captures the degree to which syntax is simple and familiar with shorter sentences (higher z-score values) as compared to structurally complex (lower z-score values). AR4 and AR5 SPMs have syntactic-simplicity values broadly within the value range for World Bank and U.S. National Climate Assessment reference texts (37–41); the WGIII AR4 and WGII AR5 SPMs slightly exceed all reference values, whereas the WGI AR4 and AR5 fall below. Three SPMs (WGII AR4, SYR AR4, and WGI AR5) have improved syntactic simplicity following governmental review and approval.

Referential cohesion (Fig. 5D) encompasses the extent to which words and ideas are explicitly connected across the text, and deep cohesion (Fig. 5E) reflects use of connecting words to clarify relationships among events and concepts. Most SPMs become more cohesive following governmental review and approval. Only one SPM (WGII AR4) has decreased referential cohesion following governmental review and approval, and only three SPMs (WGIII AR4, WGI AR5, and SYR AR5) have decreased deep cohesion. All SPMs have deep-cohesion values that fall within the value range for World Bank and U.S. National Climate Assessment reference texts (37–41), whereas SPM referential-cohesion values fall both above and below the reference values.

Finally, word concreteness (Fig. 5F) indicates words that are more concrete, meaningful, and imaginable (higher z-score values), as compared to abstract (lower z-score values). With the exception of the WGII AR5 SPM, AR4 and AR5 SPMs have word-concreteness values substantially below the value range for World Bank and U.S. National Climate Assessment reference texts (37–41). Word concreteness increases following governmental review and approval for five SPMs (WGI AR4, WGII AR4, WGIII AR4, WGI AR5, and WGIII AR5).

Across reading-ease metrics, two SPMs consistently score highest in terms of readability: WGI AR5 (highest final SPM readability score in Fig. 5, A, B, and D) and WGII AR5 (highest final SPM readability score in Fig. 5, C, E, and F).

DISCUSSION

Government approval of IPCC SPMs deeply embeds member governments in finalizing summaries of what is known and unknown across domains of climate-change knowledge. The unique process increases the traction and relevance of SPM assessment findings, extending their influence beyond that of the underlying report and the abundant scientific and technical literature on which it is based (7, 9, 45, 46). Here, we discuss how and why SPMs change during governmental review and approval, considering lessons for the future.

Perhaps most obviously, revisions associated with governmental review and approval generally expand IPCC SPMs. In all SPMs analyzed, the text in main SPM sections and boxes lengthens overall, and in many cases, the number of figures and tables increases as well (Fig. 1). Why do SPMs expand versus contract, and to what effect?

In the case of the WGII AR5 SPM, revisions made during the government-approval plenary session contribute disproportionately to SPM expansions, and SPM expansions especially add explanations and examples that increase accessibility and relevance. That is, 67.5% of revisions in the WGII AR5 SPM were introduced in the author-revised draft developed in advance of the government-approval plenary session (Fig. 3), but these revisions contributed only 54% of the total length increase (Fig. 1A). By contrast, 32.5% of revisions occurred during the plenary session and contributed 46% of the total length increase. In WGII AR5 SPM paragraphs that grow, as compared to those that shrink, more revisions focus on purposes of explanation and comprehensiveness, for example, making implications of conclusions more explicit or adding examples of interest to policymakers (fig. S3). These trends suggest that SPM expansions, especially during approval plenaries, may advance SPM relevance and accessibility for decision-making.

SPM expansion is the most common outcome following revisions associated with governmental review and approval (Figs. 1 and 2). However, it does not always occur. In our observations, supported by published perspectives and notes (7, 8, 11, 16, 17, 21, 34–36), notable contractions have occurred in cases of high political sensitivity and failure to reach consensus, including cases of more broadly diminished cooperative spirit within a government-approval plenary session. We calculate a ratio F as a measure of contraction across an SPM (Fig. 2). This ratio is highest, exceeding 0.3, for WGII AR4 and WGIII AR5. Previous analysis (24) and published perspectives and notes (7, 8, 11, 16, 17, 19, 21, 47, 48) have indicated that the approval plenaries for these SPMs were marked by particularly high political tensions and disagreements. It is possible that high plenary tensions resulted in high F ratios, that the WGII AR4 and WGIII AR5 SPMs already exhibited substantial rewriting in the author revision draft, which challenged real-time governmental review and thereby exacerbated plenary mood, or that both dynamics occurred. These two SPMs also lost the greatest number of figures (Fig. 1B) and figure panels (fig. S1). Especially in cases of broadly diminished cooperative spirit and more pervasive failure to reach consensus, SPM revisions marked by extensive contractions (Fig. 2 and fig. S1) may indeed constrain the scope and balance of SPM conclusions, in this way potentially politicizing, distorting, or degrading SPM content. These instances underscore inevitable trade-offs, tensions, and potential conflicts between increasing policy relevance and impact and maintaining scientific credibility in interactions among experts and decision-makers (46, 49).

However, in most cases, political sensitivity does not result in failure to reach consensus. High-interest material relevant to climate negotiations is typically discussed at length in approval plenaries (21, 34–36, 47, 48, 50, 51). The IPCC AR5 examples that we analyze suggest that revisions of particularly sensitive or relevant material may often expand SPM text, more than occurs for less sensitive SPM text. For instance, large expansions are observed for WGI AR5 findings on observed temperature increase (64%), decadal variability (137%), and cumulative emissions (90%) and for WGII AR5 findings on global economic impacts (129%). Nonetheless, consensus failure and resulting SPM contractions have been notable, for example, as occurred for emission figures categorizing countries by region or income group (10 figure panels and three figures lost; WGIII AR5), for findings on international cooperation (66% reduction; WGIII AR5), and for information related to UNFCCC Article 2 (box lost; SYR AR5). There may be some topics with such political sensitivity that governmental approval is fundamentally challenged. At the same time, plenary-session dynamics also matter, with management by the scientist cochairs or chair figuring prominently in these dynamics. In contrast to the SYR AR5 box that was lost, a WGII AR5 SPM box presenting findings relevant to UNFCCC Article 2 was approved, expanding by 33%, following extensive plenary discussions. To address challenges of political sensitivity, Dubash et al. (7) call for continued emphasis on coproduction, rather than political insulation, for future SPMs. Before and during approval plenaries, scientist authors can be ambitious and creative in understanding political interpretations and ways that SPM presentations can navigate them in a process of joint fact finding (52). These orientations can minimize consensus failures and ultimately increase the influence of science on policy framings, priorities, and actions (7, 9, 46).

For the WGII AR5 SPM, author revisions introduced in advance of the government-approval plenary session differ from revisions made during the plenary, potentially pointing to ways that approval sessions add value. As a first difference, author revisions are substantially more numerous than plenary revisions (Fig. 3). This difference is perhaps not surprising because scientist authors develop preplenary revisions over weeks to months, whereas plenary revisions, sometimes involving hours of discussion of individual conclusions, are made over the course of several days and sometimes nights.

As a second difference, WGII AR5 SPM author and plenary revisions emphasize different purposes. Our analysis of revision purposes relies on our deep familiarity with SPM revisions made and their rationales in response to government comments before and during the approval session. Compared to author revisions, plenary revisions particularly emphasize policy relevance (15% of plenary revision purposes versus 6% of author revision purposes in Fig. 3), comprehensiveness (15% versus 8%), and nuances of expert judgment (17% versus 12%, combining emphasis and balance). Relatively fewer plenary revisions address clarity (13% of plenary revision purposes versus 20% of author revision purposes in Fig. 3), scientific rigor (10% versus 18%), and typographical adjustment (4% versus 8%). These different emphases suggest that plenary revisions expand beyond author revisions in response to written government comments. Governmental approval of three lengthy tables proceeded relatively rapidly. This perhaps occurred because government requests for additional policy-relevant examples could be more readily accommodated within example-based tabular SPM material (26% of table revision purposes are policy relevance and 23% are comprehensiveness of examples, substantially higher percentages than for other SPM material; fig. S2). Plenary revisions are introduced by the scientific IPCC working-group cochairs or the IPCC chair, with the requirement of consistency with the underlying scientific assessment. Government plenary comments, as compared to their preplenary written comments, may place greater focus on policy-relevant dimensions of conclusions. Even accounting for this possibility, scientist authors appear to address government comments more thoroughly in the context of real-time interactive feedback during the plenary, especially with respect to policy relevance, comprehensiveness of examples, and nuances of expert judgment. Broader science-policy analyses have found that active, iterative, and inclusive communication between experts and decision-makers enhances knowledge-based foundations for action (46). Our results perhaps suggest greater scientist author responsiveness to member governments during approval plenaries, when communication is in-person and interactive and when government comments must be addressed through explanations or revisions to achieve consensus.

On the basis of Flesch Reading Ease, Barkemeyer et al. (24) conclude that IPCC SPMs have low readability compared to tabloid, quality, and scientific media and that SPM readability often decreases following governmental approval. Flesch Reading Ease (25), which is a function of sentence and word lengths, does not characterize meaning, cohesion, or text-genre differences. Metrics based on sentence and word lengths are also less successful in predicting difficulty for reading levels above sixth to eighth grade (42), affecting their performance for IPCC SPMs. By contrast, the additional Coh-Metrix metrics that we evaluate encompass multiple levels of language and discourse important for understanding SPM readability and comprehension (42–44). We thereby capture fundamental dimensions of SPM readability missed entirely in Flesch Reading Ease scores. Further, we consider reference texts (37–41) that are professionally edited report summaries presenting a series of conclusions. Their styles range from concise key message paragraphs extracted from the U.S. National Climate Assessment (41) to more discursive presentations (40). Our analysis of multiple metrics of text reading ease and more comparable reference texts can therefore suggest more nuanced directions for increasing SPM readability in the future. Below, we highlight three central conclusions emerging from our Coh-Metrix analysis.

First, contrary to recent assertions (23, 24), IPCC SPMs are as readable as other professionally edited assessment summaries for multiple metrics of reading ease (Fig. 5). This finding indicates the importance of comparing SPMs to analogous reference texts. An SPM will never be as easy to read as a tabloid newspaper article (24) because the fundamentally different genres have different purposes. Efforts to increase SPM readability will benefit from recognizing their importance as deeply informational, rather than narrative, resources (narrativity z scores < −1 in Fig. 5B). The scientifically informational nature of SPMs, even with enhanced readability, implies a need for additional materials that can introduce and unfold them in highly accessible story-based formats. For example, the SPM figure additions following governmental review and approval (Fig. 1B) suggest that graphical representations increase accessibility of complex information for nonspecialists. Additionally, IPCC AR5 leaders went far beyond SPM text to explain assessment conclusions, encouraging extensive media interviews and debuting compact briefs, films, animations, infographics, and more interactive Web sites. These materials, developed in collaboration with science journalists and other communications professionals, can provide narrative summaries akin to popular and scientific media coverage. They can build from the fundamental strengths of IPCC SPMs as definitive encapsulations of current knowledge. In enhancing SPM readability and narrative, expecting newspaper-story reading ease (23, 24) may both disappoint and fail to capitalize on the resource.

Second, acknowledging the role of cohesion in informational documents is essential for understanding IPCC SPM readability and how to increase it. That is, increasing a text’s cohesion involves explicitly linking words and ideas across a text (Fig. 5D) and using connecting words to clarify relationships among events and concepts (Fig. 5E). Cohesion improves readability of scientific texts for nonexpert readers, although enhancing cohesion can decrease measures such as Flesch Reading Ease and syntactic simplicity (42–44). Indeed, most SPMs become more cohesive following governmental review and approval, and it is therefore not surprising that Flesch Reading Ease (Fig. 5A) and syntactic simplicity (Fig. 5C) often decrease. Advancing SPM readability and accessibility requires a more holistic look at cohesion in addition to syntactic simplicity and sentence and word lengths. Increasing cohesion may be a primary means through which revisions following governmental review and approval advance decision-maker understanding of scientific SPMs (42, 44), thereby enabling decision-makers to better link knowledge to action (46).

Third, the most consistent difference between IPCC SPMs and other professionally edited assessment summaries is the greater use of abstract, jargon-heavy wording in SPMs (all SPMs except WGII AR5 in Fig. 5F). This result suggests that professional editing may contribute most substantially to accessible, meaningful wordings. Among IPCC SPMs, the WGII AR5 may have high word concreteness because of the emphasis placed on language editing during its development. Jargon can substantially hinder mutual understanding between experts and decision-makers, decreasing the effectiveness of science-policy interactions (46). Our analysis therefore emphasizes a ready target for increasing SPM readability: Wherever possible, discipline-specific jargon should be eliminated (23). Where it remains, it should be accessibly introduced. Inputs from professional science editors may be particularly effective in introducing SPM drafting strategies and edits that advance accessible, meaningful wordings. These inputs are particularly important in international assessment contexts, where English may not be the first or even the second language of participating scientist authors. In responding to government comments, scientist authors can also prioritize communication and understanding unimpeded by jargon.

Overall, the SPM writing challenge is being scientifically rigorous, unambiguous, relevant, and compelling, while using words that nonexperts can understand and syntax that is easy to read. Although metrics of reading ease imperfectly capture the entirety of a reading experience, our analysis nonetheless suggests priorities for the future. The IPCC’s next SPM drafting efforts have opportunities for increasing accessibility through prose that is cohesive, jargon-limited, and syntactically simple. Future SPMs could be conceptualized as containing (i) components that aim for high readability, for example, inserting short narrative overviews, highlight boxes, or top-level summary statements as in WGI AR5 SPM headline statements; (ii) sections that provide more technical assessment findings and support; (iii) accessible graphics that illustrate and unfold the state and story of current scientific understanding; and (iv) tables that provide examples, including those of high interest and relevance. Coaching and advice from professional science editors and other communications specialists would benefit multiple drafting stages, particularly in advance of approval plenaries. SPM releases should also feature complementary products, ranging from short briefing notes to videos, developed in collaboration with communications professionals. AR5 emphases on SPM headlines, vibrant wordings, graphics, and derivative products provide starting points for increasing ambition across such dimensions in the future.

The IPCC’s process of SPM governmental review and approval is a unique and complex science-policy interaction that increases the relevance and impact of IPCC assessment. Indeed, its complexity challenges understanding of the process. Abundant anecdotes and perspectives have typified scientific-literature and media coverage to date. Our study delves further, more comprehensively shedding light on how and why SPMs change, although in many ways we can only probe the most ready entry points into multifaceted science-policy exchanges. IPCC SPM approval is both a scientific process and a social process, as underscored in our analyses. Navigating the in-person government-approval plenary session, in particular, requires rigorous scientific expertise. However, it also requires creativity and determination to sustain open communication, mutual understanding, and cooperative spirit, especially for politically sensitive topics and conclusions. As climate continues to change and responses unfold, the value of the IPCC’s SPM governmental approval process will persist. The complexity of this science-policy terrain is also certain to remain, as scientists and governments together discuss, debate, and eventually create joint ownership of current knowledge, advancing foundations for action.

MATERIALS AND METHODS

We analyzed IPCC SPM revisions associated with governmental review and approval as follows: (i) For the eight AR4 and AR5 SPMs, we evaluated SPM expansion and contraction resulting from governmental review and approval. (ii) For the WGII AR5 SPM, we comprehensively surveyed revisions made and their purposes, both before and during the government-approval plenary session. (iii) For the eight SPMs, we completed a linguistic analysis of readability metrics to evaluate the effects of revisions made.

IPCC SPM expansion and contraction through governmental review and approval

Overall numbers of words, figures, and tables were measured for the SPMs (WGI, WGII, WGIII, and SYR) of two IPCC assessment reports (AR4 and AR5). For each SPM, counts were made for both the government review draft and the final SPM. For the WGII AR5 SPM, counts were additionally made for the intermediate author revision draft. Word counts were calculated from prepared SPM text files, using a macro in Microsoft Word 2016 for Mac, version 15.19.1. Word counts excluded words associated with figures and tables (therefore excluding text appearing within figures and tables, their footnotes, and their captions); word counts included text of all other SPM paragraphs in main SPM sections and boxes (including associated footnotes and references). t tests were used to evaluate changes in the number of words across versions of the AR4 and AR5 SPMs. Tests compared word counts for government review drafts versus final SPMs for both the AR4 and AR5, and overall SPM lengths and percentage increases were compared across the AR4 and AR5. For the AR4 and AR5 SPMs, we additionally counted the number of figure panels that were added and deleted following governmental review and approval, comparing figures in the government review draft to their counterparts in the final SPM.
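
As an illustration of these comparisons, a minimal Python sketch (not the workflow used here) could run such t tests once per-SPM totals have been tabulated; the counts shown are placeholders within the reported ranges, not the actual data.

```python
# Minimal sketch (not the authors' scripts) of the word-count comparisons,
# assuming per-SPM totals have already been tabulated. The numbers below are
# placeholders within the ranges reported in the text, not the actual data.
from scipy import stats

# Order: WGI, WGII, WGIII, SYR (placeholder values)
ar5_review_draft = [9250, 7451, 8700, 7850]
ar5_final_spm = [11200, 9283, 11732, 9860]
ar4_final_spm = [8166, 6500, 7990, 6422]  # placeholders

# Paired test: do AR5 SPMs lengthen from government review draft to final SPM?
t_rel, p_rel = stats.ttest_rel(ar5_review_draft, ar5_final_spm)
print(f"AR5 draft vs. final: t = {t_rel:.2f}, p = {p_rel:.3f}")

# Two-sample test: are final AR5 SPMs longer than final AR4 SPMs?
t_ind, p_ind = stats.ttest_ind(ar4_final_spm, ar5_final_spm)
print(f"AR4 vs. AR5 final length: t = {t_ind:.2f}, p = {p_ind:.3f}")
```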

We also evaluated growth and shrinkage of individual SPM paragraphs. For each AR4 and AR5 SPM, we classified the revision of each paragraph as follows, comparing individual paragraphs in the government review draft to their counterparts in the final SPM: paragraphs that expanded, paragraphs that were newly added, paragraphs for which material was rearranged given splitting and/or merging of paragraphs, paragraphs that contracted, and paragraphs that were deleted. We then calculated net word changes for each paragraph revision category. Across paragraph revision categories, we computed a ratio F for each SPM as the sum of words deleted divided by the sum of words added. Additionally, we tabulated net word changes for examples of particularly politically sensitive material in each AR5 SPM.
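
A minimal sketch of the F calculation, assuming per-paragraph word counts have been tabulated for the government review draft and the final SPM (the counts below are placeholders, not data from any SPM), could look as follows.

```python
# Illustrative sketch of the contraction ratio F, assuming per-paragraph word
# counts have been tabulated for the government review draft and the final SPM.
# The counts below are placeholders; added or deleted paragraphs have a count
# of 0 in one of the two versions.
paragraph_counts = [
    (120, 180),  # paragraph that expanded
    (0, 95),     # paragraph that was newly added
    (140, 90),   # paragraph that contracted
    (75, 0),     # paragraph that was deleted
]

words_added = sum(max(final - draft, 0) for draft, final in paragraph_counts)
words_deleted = sum(max(draft - final, 0) for draft, final in paragraph_counts)

F = words_deleted / words_added if words_added else float("nan")
print(f"F = {F:.2f}")  # F values above ~0.3 marked the most contraction-heavy SPMs
```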

WGII AR5 SPM revisions through governmental review and approval

We cataloged each revision to the IPCC WGII AR5 SPM, comparing the final SPM to the government review draft. Informed by global environmental assessment analyses [for example, Cash et al. (46)], we developed a qualitative typology of specific revision purposes, inductively and fully defining the categories (table S1) through examination of revisions in the WGII AR5 SPM. We defined each revision at the smallest practical scope of change. For example, if adjustments to a sentence’s component phrases were made for different reasons, we tallied them as separate revisions. On the basis of the typology, the specific purposes of each revision were manually coded and then verified in a second pass (coding and verification completed by K.J.M.), with two primary purposes assigned to each revision. Assignments were queried through evaluation by a second coder (P.T.F.), with each disagreement analyzed in depth to resolve the discrepancy. We recorded whether each revision was present in full or in part in the author-revised draft developed in advance of the government-approval plenary session (defined as an author revision present in the author revision draft) or whether it was instead made during the plenary session (defined as a plenary revision). We then computed the frequency of different revision purposes for revisions overall; for the subsets of author and plenary revisions; for SPM text, figures, and tables; and for paragraphs that grew versus those that shrank.

Linguistic analysis of IPCC SPM revisions

For the SPMs (WGI, WGII, WGIII, and SYR) of two IPCC assessment reports (AR4 and AR5), linguistic metrics of readability were calculated, comparing the government review draft to the final SPM. Text files were prepared from each SPM, excluding all text associated with figures and tables and excluding references to the underlying report (generally consisting of strings of numbers separated by punctuation and some words). Flesch Reading Ease (25) was calculated using text evaluation tools in Microsoft Word 2016 for Mac, version 15.19.1. Coh-Metrix 3.0 (42–44) was used to evaluate five additional dimensions of text ease, based on multiple levels of language and discourse: narrativity, syntactic simplicity, referential cohesion, deep cohesion, and word concreteness. Coh-Metrix results are reported as z scores indicating SDs above or below the mean of the Touchstone Applied Science Associates corpus, which includes >30,000 science, social studies/history, and language arts texts across grade levels. For our SPM readability analyses, reference texts were drawn from several assessment and report summaries in which scientist authors worked with input from professional editors to clarify text: the U.S. National Climate Assessment (41), the World Bank Turn Down the Heat reports (37–39), and the World Bank Shock Waves report (40). For the U.S. National Climate Assessment, report summary sections presenting report findings and key messages were combined into a single file for analysis. For the World Bank reports, text was analyzed for each report summary, excluding figures, tables, case study boxes, and reference lists.
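
As a rough illustration of this text preparation and of the length-based metric (the reported Flesch Reading Ease values were obtained with Microsoft Word’s tools, and the Coh-Metrix dimensions require the Coh-Metrix system itself), the steps could be sketched as follows; the regular expressions, syllable heuristic, and file name are illustrative simplifications rather than the procedures used here.

```python
# Rough sketch (not the workflow used here): strip report references made up of
# numbers and punctuation, then compute Flesch Reading Ease with a crude
# vowel-group syllable heuristic. The regular expressions, syllable counting,
# and file name are illustrative simplifications.
import re

def strip_report_references(text: str) -> str:
    # Remove bracketed or braced strings consisting mostly of section numbers,
    # which would otherwise distort word- and sentence-length measures.
    return re.sub(r"[\[{][\d\s.,;:\-]+[\]}]", "", text)

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels, with a minimum of one.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+(?:[-'][A-Za-z]+)*", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

with open("spm_text_prepared.txt") as f:  # hypothetical file name
    cleaned = strip_report_references(f.read())
print(f"Flesch Reading Ease: {flesch_reading_ease(cleaned):.1f}")
```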

SUPPLEMENTARY MATERIALS

Supplementary material for this article is available at http://advances.sciencemag.org/cgi/content/full/2/8/e1600421/DC1

fig. S1. IPCC SPMs gain and lose figure panels through governmental review and approval.

fig. S2. Revision purposes differ across IPCC WGII AR5 SPM text, figures, and tables.

fig. S3. Revision purposes differ across IPCC WGII AR5 SPM paragraphs that grow versus those that shrink.

table S1. Revisions to IPCC SPMs have different purposes.

data file S1. Revisions to the IPCC WGII AR5 SPM emphasize different purposes; most revisions are author rather than plenary revisions.

This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial license, which permits use, distribution, and reproduction in any medium, so long as the resultant use is not for commercial advantage and provided the original work is properly cited.

REFERENCES AND NOTES

Acknowledgments: We thank M. Oppenheimer and M. Vardy for feedback on the manuscript, T. E. Bilir for file preparations, and N. Dowell for extended assistance with the Coh-Metrix analyses. Funding: Support for this work was provided by the Alexander von Humboldt Foundation. Author contributions: All authors designed the experiments, K.J.M. and P.T.F. completed the experiments and the analysis of collected data, and K.J.M. wrote the manuscript with inputs from all authors. Competing interests: The authors declare that they have no competing interests. Data and materials availability: All data needed to evaluate the conclusions in the paper are present in the paper and/or the Supplementary Materials. Additional data related to this paper may be requested from the authors.
