International Expert Speaks on Technology Assessment
Where and how should nuclear waste be buried? What ethical principles should govern autonomous cars? These and similar questions all fall within the realm of technology assessment – an area of knowledge that emerged in the 1950s and has since gained significant prominence. On December 5, Dr Armin Grunwald of Karlsruhe Institute of Technology came to HSE to give a talk on this topical issue.
Associate Professor at the School of Philosophy Alexander Mikhailovsky and Dr Armin Grunwald
Introducing Dr Grunwald to the audience, Alexander Mikhailovsky of HSE’s School of Philosophy stressed the importance of a lecture on technology assessment taking place at HSE. ‘HSE strives to become a truly comprehensive university that can serve as a platform for interdisciplinary dialogue and as a source of expertise on the social implications of technological developments. So it is only natural that the discussion of technology assessment, which in fact lies at the intersection of the humanities and technology, happens here, within these walls.’
As Professor Grunwald explained, technological advance affects almost all areas of human life. Besides its overwhelmingly positive effects, it also raises questions of unintended consequences, tensions with democracy, the role of citizens, and sustainability in the face of environmental issues. In his talk, Professor Grunwald spoke about the background of technology assessment, described its framework, and outlined the type of thinking associated with it.
Origins of Technology Assessment
Technology assessment (TA) was developed more than 50 years ago, but its roots can be traced back to the 1920s and the theories developed by sociologist William Ogburn and philosopher John Dewey. Technology assessment gained more prominence in the 1950s and 1960s, when economic competition and the Cold War dictated the need for more solid policy advice. It can also be linked to the emergence of futurist thinking and the invention of the scenario technique, that is, thinking in alternatives while planning. The first TA institution, the Office of Technology Assessment, was founded at the U.S. Congress in 1972.
Related notions appeared in response to the new challenges, including impact assessment, technology foresight, technology futures assessment, ELSI studies (ethical, legal, and social implications), and RRI (responsible research and innovation).
A good example illustrating the importance of technology assessment is the failure of nuclear power in Germany. Initially, researchers were overly optimistic about nuclear power and its potential. In their over-optimism, however, they did not think about possible accidents or about what to do with nuclear waste at the end of a nuclear plant’s life. It is now obvious that lifecycle thinking is vitally important, and it is imperative that researchers and policy makers consider how a technology will be used and disposed of at the very end of its lifetime.
In the 1980s, parliamentary TA was already in place in several European countries, and in 1990 the European Parliamentary Technology Assessment network (EPTA) was founded, which now has 22 members, including Russia. In 2012–2015, the European Commission funded a project to further develop TA across Europe, and today we are moving towards a global TA network that would include Russia, China, Australia, and Latin America.
Over the course of its history, technology assessment has undergone several transformations. First of all, it shifted its focus from technological determinism to the shaping of technology. Technology is no longer seen as a self-developing entity that society has to adapt to, but rather as the outcome of a social process that shapes it in accordance with human values and interests.
Furthermore, instead of a purely prognostic orientation, it began to consider a broader range of futures, developing scenarios and engaging in deeper reflection on alternatives, as it became clear that it is not possible to build an accurate model of society that would yield unequivocal predictions.
It also became more inclusive and participatory rather than expert-oriented. What is more, researchers are now often engaged participants in innovation processes – they are no longer distant observers but try to contribute to transformations towards a more sustainable society.
Finally, technology assessment ceased to claim value neutrality and abandoned its positivistic approach, which proved impossible to implement. It now reflects on normativity explicitly.
Technology Assessment Today
At present, technology assessment mainly serves three functions. First, TA is used as policy advice in the form of parliamentary TA. For instance, the Office of Technology Assessment at the German Bundestag provides strategic advice to ministries and authorities about scientific and technological developments. Its role is to inform MPs about the possible implications of technology and the policy fields they affect. What is important here is that parliamentary TA be free of lobbying influence.
Secondly, TA contributes to public debate by creating awareness, organising public events and discussions, and expanding its media presence. There is now a broad movement to involve stakeholders and society at large in the technology debate.
Finally, technology assessment has become an integral part of engineering projects. It adds responsibility and accountability to the design process by building in thinking about technological consequences and implications.
Conceptual Framework of TA
Although there is a high diversity of addressees, concepts, methods, and institutions of technology assessment, there is still a common framework that it generally follows. TA enhances the reflexivity of debates and decision making on shaping technological advance and making use of its outcomes.
TA generally involves thinking about possible consequences (anticipation of future developments), broad public discussion of the issues, and complexity management, since it is necessary to restrict the scope of the futures analysed. The outcomes of TA should make a difference in the real world and should inform political decisions.
TA is rooted in thinking about consequences in advance and using this reasoning to improve the decisions to be made. The main challenge, however, is obtaining prospective knowledge: we have no knowledge about the future, only data from the past. It is also necessary to develop assessment criteria.
Quite often, new technologies raise ethical questions, especially questions about responsibility. This is true, for example, of autonomous cars, which need to be safely integrated into traffic. There are also viability issues – for instance, a high-speed train developed in Germany was never used there because it would have required completely new infrastructure. It turned out to be an ingenious technology without a market. Heated debates have always accompanied nanotechnology – whether it would bring paradise or a catastrophe of self-organising technology taking over. This fear has now returned with the emergence of AI.
Generally, it is anticipated that by creating predictions and scenarios for the future and considering possible expectations, fears, visions, and so on, TA can create a scope of technology futures that can be assessed and then used in decision making.
However, the question arises about the quality of knowledge about possible consequences because some futures might be completely speculative. Several issues need to be addressed, including:
- What is known about prospective consequences and impacts of new technology?
- What could be known in case of more research, and which uncertainties or fields of ignorance will be pertinent?
- How can different uncertainties be qualified and compared to each other? How can different or diverging “futures” be ranked with respect to quality?
- What is at stake in worst case scenarios?
In an ideal situation, there can be an image of ‘one future’, with the range of possibilities expanding slightly over time. An alternative to this ‘prognostic’ mode is the ‘scenario-based’ orientation, with a set of different but plausible futures. In certain cases, however, we know nothing about the consequences and deal with arbitrary futures based on inconsistent input data. This third type of orientation is called hermeneutic: it is based on enlightened deliberation via the analysis of expectations, fears, visions, utopias, and dystopias, and it results in diverging and contested futures. This type of analysis is geared more towards learning about ourselves, our perspectives, attitudes, and conflicts than towards informing technology-related decisions. This is precisely the case with AI, which raises questions about humanity, its perceptions and feelings, rather than about the future.
In conclusion, Professor Grunwald emphasised that TA’s main mission is enhancing reflexivity in order to improve the quality and legitimacy of decisions on technological advance and the use of its outcomes. It therefore has to be conducted in a highly interdisciplinary manner and in touch with society. It also needs to carefully consider technology futures with extremely different properties.
Prof. Dr. Armin Grunwald studied physics, mathematics, and philosophy at the universities of Münster and Cologne. Since 1999, Professor Grunwald has been head of the Institute for Technology Assessment and Systems Analysis (ITAS) at Karlsruhe Institute of Technology (KIT), and since 2002 he has been director of the Office of Technology Assessment at the German Bundestag (TAB). Since 2007, he has also been full professor of philosophy of technology at KIT. Armin Grunwald is a member of several advisory boards of ministries and authorities in Germany and in the European Union. He is also a member of the Executive Board of the National Academy of Science and Engineering in Germany.
The Institute for Technology Assessment and Systems Analysis at KIT was established in the 1960s. It brings together engineers, natural scientists, and philosophers engaged in studying the consequences of new scientific and technological developments, evaluating them on a scientific basis, and analysing the complex interrelationships between technology, science, and society. They also provide knowledge for the responsible design and societal embedding of new technologies, give policy advice (e.g. to the German Bundestag), and participate in public debate.
Photos by Alexandra Balzhieva