
Policy brief
Enhancing EU legal frameworks for Carbon Dioxide Removal

Publication | 28 February 2023

In short

Carbon Dioxide Removal (CDR) is a type of climate engineering technique, also known as a “negative emissions technique”, that removes atmospheric CO2 and stores it in geological, terrestrial, or oceanic reservoirs.

This policy brief sets out recommendations based on the regulatory challenges related to CDR that were identified in our analysis of EU laws and policies. We address them to EU policymakers and officials involved in the preparation of legislative or policy initiatives related to climate action, climate technologies, climate engineering, geoengineering, carbon removal, and CDR specifically.

To protect and uphold ethical, fundamental rights and sustainability considerations in the research, development and deployment of CDR, TechEthos encourages policymakers to:

  • Clarify the EU’s terminology and rationale for the use of terms, including climate engineering, geoengineering, carbon removal and CDR, and pursue the harmonisation of terms to bring them in line with the terminology of the Intergovernmental Panel on Climate Change (IPCC);
  • Clarify what role – if any – CDR has to play in meeting the EU’s legally binding target of net-zero by 2050;
  • Explicitly incorporate EU fundamental rights into policies and decision-making processes governing CDR techniques in the EU;
  • Clarify the legal status of carbon removals and recognise them as distinct from emission reductions;
  • Define the sustainability requirements for CDR, particularly those in the context of the Sustainable Development Goals (SDGs), the EU Taxonomy Regulation, and the Carbon Removal Certification Framework (CRCF) initiative;
  • Pursue greater international collaboration in relation to CDR to promote the standardisation of removal accounting to avoid double counting, and the enforcement of such standards;
  • Review the adequacy of environmental liability regimes in relation to CDR activities in the EU, including research and deployment.

Find out more about each recommendation by downloading the policy brief below.

Authors

Julie Vinders, Trilateral Research (TRI), Ben Howkins, TRI.

Date of publication

28 February 2023

Cite this resource

Vinders, J., Howkins, B. (2023). Enhancing EU legal frameworks for Carbon Dioxide Removal. Extract from Deliverable 6.2 for the European Commission. TechEthos Project Deliverable. Available at: www.techethos.eu


Publication
Enhancing EU law on emerging technologies: Our recommendations

Publication | 28 February 2023

In short

This report presents a series of policy briefs which offer recommendations to policymakers at the EU level to enhance legal frameworks for the governance of climate engineering (Carbon Dioxide Removal – CDR – and Solar Radiation Modification – SRM), neurotechnologies and digital extended reality (XR).

The recommendations are based on the legal and policy analysis of TechEthos: an in-depth look at international and EU law and policy and a series of national legal case studies. These findings were discussed and validated through consultation meetings with 14 policymakers at the European Commission.

Authors

Julie Vinders, Trilateral Research (TRI), Ben Howkins, TRI

Date of publication

28 February 2023

Status

Draft version submitted to the European Commission for review

Cite this resource

Vinders, J., Howkins, B. (2023). Policy briefs on enhancing EU legal frameworks. Deliverable 6.2 for the European Commission. TechEthos Project Deliverable. Available at: www.techethos.eu


Deliverable
National legal case studies on emerging technologies

Publication | 30 July 2022

In short

Climate engineering, neurotechnologies, and digital extended reality (XR) present many significant legal issues that impact socio-economic equality and fundamental rights. In most cases, there is only a limited body of comprehensive or dedicated national law governing these technology families, though many elements of the technologies are subject to existing national legal frameworks.

This report explores and analyses relevant national laws in nine case studies across these three technology families: climate engineering in Australia, Austria and the United Kingdom; neurotechnologies in Germany, Ireland and the United States; and XR in France, Italy and the United Kingdom.

You can explore each individual case study or download the full comparative report further below.

  • Climate Engineering in Australia

    Download

  • Climate Engineering in Austria

    Download

  • Climate Engineering in the United Kingdom

    Download

  • Neurotechnologies in Germany

    Download

  • Neurotechnologies in Ireland

    Download

  • Neurotechnologies in the United States

    Download

  • Digital Extended Reality in Italy

    Download

  • Digital Extended Reality in France

    Download

  • Digital Extended Reality in the United Kingdom

    Download

Together with TechEthos’ analysis of international and EU law and policy, this analysis will serve as the basis for future work in the TechEthos project involving the development of recommendations for the adjustment or enhancement of legal frameworks at the national and/or international level, as well as policy briefs on the possible need for dedicated legislation at the EU level.

Authors

Julie Vinders, Trilateral Research (TRI), Ben Howkins, TRI, Nicole Santiago, TRI, Iva-Nicole Mavrlja, TRI, Rowena Rodrigues, TRI, Wenzel Mehnert, AIT Austrian Institute of Technology (AIT), Domenico Piero Muscillo, Associazione Italiana per la Ricerca Industriale (Airi), Gustavo Gonzalez, Airi, Marco Liut, Airi, Sara Morisani, Airi, Andrea Porcari, Airi, Laurynas Adomaitis, CEA, Alexei Grinbaum, CEA, Kathleen Richardson, De Montfort University (DMU), Nitika Bhalla, DMU, Lisa Häberlein, European Network of Research Ethics Committees (EUREC), Bennet Francis, University of Twente (UT), Dominic Lenzi, UT.

Date of publication

30 December 2022

Status

Draft version submitted to the European Commission for review

Cite this resource

Vinders, J., et al. (2022). TechEthos D4.2: Comparative analysis of national legal case studies. Deliverable 4.2 for the European Commission. TechEthos Project Deliverable. Available at: www.techethos.eu


Policy brief
XR and General Purpose AI: from values and principles to norms and standards

Policy brief | 22 February 2023

In short

The TechEthos project addressed the ethical challenges of eXtended Reality and Natural Language Processing. These topics belong to the larger area of General Purpose Artificial Intelligence.

We take the position that values and principles alone are not enough for AI regulation. European policymakers should go beyond merely listing such values and principles, because manufacturers may not immediately understand how to implement them in the design of AI systems. To make EU regulation effective, we offer an operationalisation of these values and principles in the form of suggested norms and standards.

This policy brief lists new and emerging issues to supplement, enhance and update the Assessment List for Trustworthy Artificial Intelligence (ALTAI) developed by the High-Level Expert Group on AI. Based on our analysis, we formulate specific recommendations for AI regulation.

Authors

Laurynas Adomaitis, French Alternative Energies and Atomic Energy Commission (CEA), Alexei Grinbaum, CEA.

Date of publication

22 February 2023

Cite this resource

Adomaitis, L. and Grinbaum, A. (2023). XR and General Purpose AI: from values and principles to norms and standards. TechEthos Project Policy Brief. Available at: www.techethos.eu


Tool
The TechEthos game: Ages of Technology Impact

Developed in co-creation with science engagement professionals and game experts, the TechEthos game aims at capturing societal attitudes, values and concerns towards Digital Extended Reality, Neurotechnologies and Climate Engineering.

Have a look at the game rulebook and discover how participants will forge their ideal world.

A co-creation process

The TechEthos game was developed using the Triadic Game Design methodology in a series of workshops with science engagement professionals and game experts. This allowed the key principles of the game to emerge and guide the design choices. This process has been captured in a project report.

Image gallery

The TechEthos game is available in 7 languages (English, German, Czech, Romanian, Serbian, Spanish, and Swedish). Here is a glimpse of what it looks like:

  • TechEthos game version in Spanish. Photo: Parque de las Ciencias.
  • TechEthos game voting cards. Photo: Marko Risovic / Center for the Promotion of Science.
  • TechEthos board and cards. Photo: Marko Risovic / Center for the Promotion of Science.
  • TechEthos game workshop at the Center for the Promotion of Science. Photo: Marko Risovic / Center for the Promotion of Science.
  • TechEthos cards from the Natural Language Processing (NLP) deck. Photo: Marko Risovic / Center for the Promotion of Science.
  • TechEthos game box created at the Center for the Promotion of Science. Photo: Marko Risovic / Center for the Promotion of Science.
  • TechEthos World Card. Photo: Marko Risovic / Center for the Promotion of Science.
  • Internal workshop at the Center for the Promotion of Science. Photo: Marko Risovic / Center for the Promotion of Science.
  • TechEthos game workshop at the Center for the Promotion of Science. Photo: Marko Risovic / Center for the Promotion of Science.

Interested in trying the game?

Get in touch

Greta Alliaj
Ecsite – European Network of Science Centres and Museums

galliaj@ecsite.eu


Deliverable
Tools to develop and advance scenarios dealing with the ethics of new technologies

Publication | 15 December 2022

In short

This report describes the process of co-creation of the TechEthos game that was developed to enhance the TechEthos scenarios.

It also presents the results of employing the Triadic Game Design methodology as an approach to working with expert game design stakeholders across the dedicated workshops in order to resolve emerging value tensions in game design.

The game resulting from these co-creation activities will be used in conjunction with the TechEthos scenarios, with both expert and citizen participants, to surface ethical issues and concerns in those scenarios and thereby help make the scenarios more comprehensive in their breadth.

Authors

Steven Umbrello, Delft University of Technology (TUD), Pieter Vermaas, TUD, Cristina Paca, European Network of Science Centres and Museums (Ecsite), Greta Alliaj, Ecsite, Andrew Whittington-Davis, Ecsite, Fabrice Jouvenot, Ecsite, Michael J. Bernstein, AIT Austrian Institute of Technology (AIT), Wenzel Mehnert, AIT, Masafumi Nishi, AIT, Eva Buchinger, AIT.

Date of publication

15 December 2022

Status

Draft version submitted to the European Commission for review

Cite this resource

Umbrello, S., Vermaas, P., Paca, C., Alliaj, G., Nishi, M., Whittington, A., Jouvenot, F., Bernstein, M.J., Mehnert, W., Buchinger, E. (2022). Tools to develop and advance scenarios dealing with the ethics of new technologies. TechEthos Project Deliverable. Available at: www.techethos.eu.


Newsletter #4
Spotlight on TechEthos analysis of international & EU law on new and emerging technologies

Newsletter | 09 November 2022

In short

Welcome to the fourth instalment of the TechEthos newsletter. This issue highlights the key findings of the TechEthos analysis of international and EU law and policy. Have a look and learn more about our human rights impact assessment, get familiar with the key role of our Advisory and Impact Board, and discover materials and resources that we think you may find useful. Additional resources, tools and events relevant to the TechEthos community complete this edition.

Date of publication

9 November 2022


Publication
Moral Equivalence in the Metaverse

Publication | 17 November 2022

In short

This scientific paper dives into the question “Are digital subjects in virtual reality morally equivalent to human subjects?”, from the perspective of cognitive and emotional equivalence. It builds on TechEthos’ analysis of ethical issues concerning Digital Extended Reality and expands significantly on the question of moral transfer, including themes of identity, action, responsibility, and imitating human language and appearance.

Authors

Alexei Grinbaum, Commissariat à l’Énergie Atomique et aux Énergies Alternatives (CEA), Laurynas Adomaitis, CEA.

Date of publication

11 October 2022

Cite this paper

Grinbaum, A., Adomaitis, L. (2022). Moral Equivalence in the Metaverse. Nanoethics. 16, 257-270. https://doi.org/10.1007/s11569-022-00426-x


Neurotechnologies through the lens of human rights law
06 October 2022

Authored by: Ben Howkins and Julie Vinders
Reviewed by: Corinna Pannofino and Anaïs Resseguier

Article | 06 October 2022

Technological innovation can both enhance and disrupt society in various ways. It often raises complex legal questions regarding the suitability of existing laws to maximise the benefits to society, whilst also mitigating potentially harmful consequences. Some emerging technologies even challenge us to rethink the ways in which our fundamental rights as human beings are protected by law. What happens, for example, to the right to not self-incriminate if advanced neurotechnologies in the courtroom can provide insights into a defendant’s mental state?

As a means of directly accessing, analysing, and manipulating the neural system, neurotechnologies have a range of applications, presenting the potential for both enhancements to and interferences with protected human rights. In an educational context, insights into how the brain works during the learning process, gained from neuroscience research and the use of neurotechnologies, may lead to more effective teaching methods and improved outcomes linked to the right to education. This may enhance the rights of children, particularly those with disabilities, yet more research is required to assess the potential for long-term impacts on brain development. The (mis)use of neurotechnologies in the workplace, meanwhile, may negatively affect an individual’s enjoyment of the right to rest and leisure by increasing workload, or instead positively enhance this right by improving efficiency and creating more time and varied opportunities for rest and leisure.

Neurotechnologies and medical treatment

The primary application of neurotechnologies is in a clinical context, both as a means of improving understanding of patients’ health and as a means of administering clinical treatment, the effects of which have the potential to enhance various protected human rights in conjunction with the right to health. For example, neurotechnologies may facilitate communication in persons whose verbal communication skills are impaired, the benefits of which are directly linked to the right to freedom of expression. Additionally, neurotechnologies may be used to diagnose and treat the symptoms of movement disorders such as Parkinson’s disease, thereby potentially enhancing the rights to dignity and autonomy of persons with disabilities.

However, the clinical use of neurotechnologies requires compliance with legal and bioethical principles such as consent and the right to refuse treatment, without which the protected rights of users may be interfered with. A particular concern is that the clinical use of neurotechnologies may lead to infringements of the right to non-discrimination, specifically in the form of neurodiscrimination, whereby insights from brain data processed by neurotechnologies form the basis of differential treatment between individuals, for instance in insurance and employment contexts. From this a key consideration emerges: whether brain data is adequately protected by the existing right to privacy, or whether there is a need for a putative right to mental privacy, amongst a range of novel human rights protections, including a right to cognitive liberty, a right to mental integrity and a right to psychological continuity. The essential premise behind these proposed ‘neurorights’ is that the existing human rights framework needs revising to ensure individuals are adequately protected against certain neuro-specific interferences, including the proposed ‘neurocrime’ of brain-hacking.

Neurotechnologies and the legal system

Neurotechnologies are also increasingly being used in the justice system, where they may enhance an individual’s right to a fair trial, for instance by ‘establishing competency of individuals to stand trial’ and informing rules on the appropriate ‘age of criminal responsibility’. However, the use of neurotechnologies may also interfere with access to justice and the right to a fair trial. For example, advanced neurotechnologies capable of gathering data on mental states consistent with one’s thoughts and emotions risk interfering with the right to the presumption of innocence or the privilege against self-incrimination. An additional consideration in this context is the right of individuals to choose whether or not to benefit from scientific progress, meaning that individuals cannot be compelled by States to use neurotechnologies, except in certain limited circumstances determined by law. The enforced use of neurotechnologies in justice systems could therefore interfere with the right to opt against “benefitting” from scientific progress, as well as the right to a fair trial and access to justice.

Neurotechnologies and future human rights challenges

Finally, whilst this study has highlighted the ways in which neurotechnologies may already affect the enjoyment of fundamental human rights, the potential for enhancements to and interferences with these protected rights may increase as the technological state of the art progresses. For example, although primarily contemplated within the realm of science fiction, in a future reality the use of neurotechnologies may challenge the strictness of the dichotomy between ‘life’ and ‘death’ by enabling ‘neurological functioning’ to be sustained independently of other bodily functions. This may affect States’ obligations to ensure the full enjoyment of the right to life, while also raising questions around the appropriate regulation of commercial actors seeking to trade on the promise of supposed immortality.

Next in TechEthos – is there a need to expand human rights law?

The study highlights the importance of bringing a human rights law perspective into the development of neurotechnologies. The human rights impact assessment is a mechanism designed to help ensure that new and emerging technologies, including neurotechnologies, develop in a manner that respects human rights, while also enabling the identification of potential gaps and legal uncertainties early on in the development stage. The analysis also raises the question as to whether further legislation may be required to address these gaps. Crucial to this question is the need to strike a balance between ensuring technological development does not interfere with fundamental human rights protections and avoiding overregulating emerging technologies at an early stage and thereby stifling further development.

Read more about the human rights law implications of climate engineering and digital extended reality.

Read the report


Digital extended reality through the lens of human rights law
06 October 2022

Authored by: Ben Howkins and Julie Vinders
Reviewed by: Corinna Pannofino and Anaïs Resseguier

Article | 06 October 2022

Technological innovation can both enhance and disrupt society in various ways. It often raises complex legal questions regarding the suitability of existing laws to maximise the benefits to society, whilst also mitigating potentially harmful consequences. Some emerging technologies even challenge us to rethink the ways in which our fundamental rights as human beings are protected by the law. For instance, how might digital extended reality (XR) affect online safety and the emerging rights to be online and to disconnect?

XR technologies have an assortment of uses and applications, from gaming and filmmaking to healthcare and education. Each use case of XR creates the potential for enhancements to and interferences with various human rights, including new and emerging rights, such as the right to a healthy environment, a right to disconnect and a right to be online.
The use of XR gaming applications, for instance, is consistent with the right to benefit from scientific progress and may enhance the right to rest and leisure of all users. It may benefit persons with disabilities in particular, whose right to autonomy, for instance, may be enhanced by being able to access leisure experiences perhaps otherwise unattainable in the physical world. However, the use of XR gaming applications may also lead to increased incidences of cyberbullying, harassment, and virtual sexual assault, the experiencing of which may interfere with the realisation of the rights of women and children, in particular.

XR and possible use cases

In addition to particular use cases, there are also a variety of contexts in which the use of XR technologies may lead to both positive and negative impacts on the realisation of fundamental rights. In the clinical context, for instance, XR may enhance the right of healthcare professionals to just and favourable conditions of work when used to provide low-risk, hyper-realistic training experiences designed to improve overall healthcare provision. For patients, meanwhile, the clinical use of XR may lead to benefits linked to the right to health. Such applications may also enhance other protected rights, with the use of XR technologies to treat psychological trauma, for instance, potentially enhancing the right to dignity of victims of criminal offences. There is a risk, however, that the use of XR in a clinical setting could interfere with these protected rights, for instance if patients experience short- or long-term health-related harms through the use of XR, such as motion sickness and depersonalisation/derealisation disorder.

Developing XR in accordance with human rights

In an educational context, the use of XR technologies may lead to improved learning outcomes linked to the right to education, including by accommodating specific educational needs, the benefits of which relate to the enjoyment of the rights of persons with disabilities on the basis of non-discrimination. Similarly, the incorporation of XR into the judicial system may enhance an individual’s right to a fair trial by improving the accessibility of legal proceedings, enabling evidential recreation of events, and helping to provide legal professionals with anti-bias training in order to maintain fairness. In both contexts, however, there is also a risk that the use of XR may lead to interferences with these rights, particularly if adopted without consideration of the potential drawbacks.

As such, the use of XR as an educational tool, for instance, should be informed by research on information overload and the possible effects on brain and neurological development in children, without which potentially safer and more effective teaching measures may be deprioritised or underfunded. Likewise, the use of XR in legal proceedings should be guided by factors including the suitability of virtual participation and the accessibility of XR technology. The right of access to justice may otherwise be undermined, for instance by serving to promote a form of participation of inferior type or quality in comparison to in-person participation, or by exacerbating existing accessibility issues faced by disadvantaged parties.

XR and future human rights challenges

There are certain human rights considered in this study for which XR technologies may enhance enjoyment while also raising challenging issues which fall short of constituting interferences. In relation to the right to freedom of expression, for instance, XR applications may facilitate new forms of creative expression in a variety of mediums, including music, narrative storytelling, and art. Yet there are also concerns related to the appropriate treatment of content in XR depicting violence, pornography, hate speech and mis/disinformation. This creates a tension between the right of everyone to freedom of expression and the obligation on States to protect users of XR from potentially harmful content and interferences with other fundamental rights. In seeking to resolve this conflict, States are required to strike a balance between unrestricted freedom and legitimate limitations.

Next in TechEthos – is there a need to expand human rights law?

The study highlights the importance of bringing a human rights law perspective into the development of new and emerging technologies, including XR. The human rights impact assessment is a mechanism designed to help ensure that such technologies develop in a manner that respects human rights, while also enabling the identification of potential gaps and legal uncertainties early on in the development stage. The analysis also raises the question as to whether further legislation may be required to address these gaps. Crucial to this question is the need to strike a balance between ensuring technological development does not interfere with fundamental human rights protections and avoiding overregulating emerging technologies at an early stage and thereby stifling further development.

Read more about the human rights law implications of climate engineering and neurotechnologies.

Read the report
