Participatory Technology Assessment

The participatory technology assessment method involves three phases: problem framing, ECAST deliberation, and results integration. Its overarching aim is to include more diverse perspectives in science policy decision-making processes.

Problems and Purpose

Many scientific and technological issues with which policymakers grapple involve an array of complex social, moral, and ethical issues that technical experts alone cannot address. The purpose of the Expert and Citizen Assessment of Science and Technology (ECAST) network’s participatory technology assessment (pTA) method is to support democratic science policy decision-making by including a broader set of voices and perspectives. 

Origins and Development

The idea of citizen involvement in decision-making processes on science and technology gained traction in Europe in the late 1980s with the Danish Board of Technology and the use of consensus conferences. In 2010, a group of educators, researchers, and policy practitioners launched the Expert and Citizen Assessment of Science and Technology (ECAST) network to conduct pTAs on complex, contested, and emergent science, technology, and societal issues in the United States. The network was based on the following objectives [1]: 

  1. Participation and expertise: Incorporate effective citizen-participation methods to complement expert analysis.
  2. 21st-century structure: Develop a partially decentralized, agile, and collaborative organizational structure that seeks effective technology assessment (TA), low cost, and timeliness.
  3. Continual innovation in concepts and practices: Encourage, evaluate and, as warranted, adopt new TA concepts and methods.
  4. Nonpartisan structure and governance: Establish the ethos and institutional structures needed to ensure that any new TA institution is strictly nonpartisan. When there are strongly divergent normative perspectives on a particular topic, individual TA projects can benefit from a balanced, overtly value-pluralistic or multi-partisan approach.
  5. Commitment to transparent process and public results.

After a demonstration project in which the Danish Board of Technology provided input to the United Nations Convention on Biological Diversity, ECAST began developing its pTA method for a project with the National Aeronautics and Space Administration’s (NASA) Asteroid Initiative. One of the two objectives of this project was to “develop and apply a participatory technology assessment that elicited nuanced information from a diverse group of citizens whose insights would not otherwise be available to decision-makers” [2]. Since then, the method has been refined through more than 25 deliberative processes.

Participant Recruitment and Selection

The objective of participant selection is not to achieve statistical representation, but rather to reflect the diversity of the community and include a plurality of voices. Participants are selected through email lists, social and traditional media, institutional partnering, and canvassing. Each participant receives a stipend (usually $100).

How it Works: Process, Interaction, and Decision-Making

The pTA method includes three phases of participatory activity:

1) Problem Framing: This first stage consists of open-framing focus groups with 15-20 selected citizens and a stakeholder design workshop. The focus groups elicit citizen perspectives on an issue with minimal or no background material to limit expert issue framing. Following the focus groups, experts and stakeholders are convened for a stakeholder design workshop in which they provide input on which topic-specific questions could most benefit from citizen input, and what information citizens would need to know in order to deliberate effectively. 

2) ECAST Deliberation: Drawing on the Danish Board of Technology’s World Wide Views method, the deliberation process convenes around 100 selected citizens at each site. This stage includes a learning phase (an information packet sent two weeks in advance, briefing materials, videos, stakeholder cards, etc.) and a deliberation phase (small groups with a facilitator, worksheets, etc.). 

3) Results Integration: This last stage is the analysis of both qualitative data (worksheets, notes from observers at the tables, etc.) and quantitative data (pre- and post-surveys, ratings and rankings from individual worksheets) collected during the event. After a preliminary analysis of the data, a second stakeholder workshop is convened to present preliminary deliberation results and solicit feedback on directions for deeper analysis. This process helps connect citizen input to decision-making processes. The final results are disseminated through various means (e.g. peer-reviewed publications and presentations to policymakers and the participants).



See Also

Danish Board of Technology


[1] Sclove, Richard. 2010. Reinventing Technology Assessment: A 21st Century Model. Woodrow Wilson International Center for Scholars.

[2] Tomblin, David, et al. 2015. Informing NASA’s Asteroid Initiative - A Citizens’ Forum.

External Links



Kaplan, L., Nelson, J., Tomblin, D., Farooque, M., Lloyd, J., Neff, M., Bedsted, B., & Sarewitz, D. 2019. Exploring Democratic Governance of Geoengineering Research Through Public and Stakeholder Engagement. Report, Consortium for Science, Policy & Outcomes, Washington, DC, November.


This entry is based on an interview with Leah Kaplan from the Arizona State University Consortium for Science, Policy, and Outcomes.