Responsible Data Futures: From Privacy Rights to Environmental Impact

There are two sides to any coin. While we are being flooded with digital tools and connected devices promising to make our lives easier and more comfortable, these developments raise questions about ethics, responsibility and sustainability. What are we giving up in return for this comfort? What are we losing? And how can we protect ourselves against the flipside of the coin of digital services? In4Art approaches these critical questions through the unique lens of art-led innovation practices, demonstrating how art-driven innovation can shape more responsible and human-centred technological development.

We have developed three distinct yet interconnected projects aimed at informing and enhancing responsible innovation in data management and AI development. Each project addresses a different facet of the data ethics challenge: Vocal Values Principles protects personal data rights and the privacy of voice data; E-missions tackles the environmental impact of digital behaviour; and the Data Donor Register explores the ethical implications of biometric data collection. By creating space for experimentation and understanding, In4Art facilitates discussions that bridge the gap between technological capabilities, (non)human needs and social contexts.

In each of the projects, three key principles from our working method stood out:

1. Artistic Experimentation: Enabling artistic practices to explore and visualize complex technological concepts
2. Interdisciplinary Collaboration: Bringing together diverse stakeholders to ensure multiple perspectives inform development
3. Practical Implementation: Translating artistic insights into concrete frameworks and tools

The Human Side of Data: Data Donor Register

The Data Donor Register (DaDoR) represents a fundamental rethinking of data ownership and privacy in the digital age, directly inspired by artistic exploration. The project's genesis lies in "Prosthetic X" with Isaac Monté, in which we looked at health by shifting the lens from the inside of the body to the outside, imagining the types of assistive support an aging population will need to age healthily. The prosthetics are part of Internet of Things systems, and that led to a crucial insight: if physical organs require careful protocols for donation and use, shouldn't our data receive similar consideration?

Prosthetic X - Data Donor

This resulted in the development of DaDoR as part of the Prosthetic X project. The project conceptualizes personal health-related data as a "digital organ" requiring its own codicil, implementing privacy-by-design principles that ensure data protection is built into systems from the ground up rather than added as an afterthought. The concrete outcome was the creation of a comprehensive data donor codicil with six distinct options that give users granular control over their data:

1. Strictly Personal: Data remains private and is only used for personal early warning systems
2. Private Sharing: Enables selective sharing with family members and trusted individuals for health monitoring
3. Monitoring: Allows sharing with healthcare professionals for diagnosis and treatment
4. Scientific Research: Contributes to scientific studies while maintaining privacy
5. Reward-Based: Enables compensated data sharing for marketing and business optimization
6. Post-Mortem: Controls data access and usage after death

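The codicil itself is conceptual, but the six options above can be sketched as a simple data model. The names and structure below are hypothetical, purely to illustrate how granular, opt-in permissions could be recorded and checked:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class SharingOption(Enum):
    """The six codicil options of the (conceptual) Data Donor Register."""
    STRICTLY_PERSONAL = auto()    # private; personal early-warning use only
    PRIVATE_SHARING = auto()      # family members and trusted individuals
    MONITORING = auto()           # healthcare professionals
    SCIENTIFIC_RESEARCH = auto()  # privacy-preserving research contributions
    REWARD_BASED = auto()         # compensated commercial use
    POST_MORTEM = auto()          # access and usage after death

@dataclass
class DataDonorCodicil:
    """A donor's declared preferences: one explicit grant per option."""
    choices: set[SharingOption] = field(default_factory=set)

    def permits(self, option: SharingOption) -> bool:
        # Anything not explicitly granted is denied (opt-in by default).
        return option in self.choices

# A donor who allows clinical monitoring and research, nothing else:
codicil = DataDonorCodicil({SharingOption.MONITORING,
                            SharingOption.SCIENTIFIC_RESEARCH})
assert codicil.permits(SharingOption.MONITORING)
assert not codicil.permits(SharingOption.REWARD_BASED)
```

The opt-in default mirrors the privacy-by-design intent: absent a grant, no use is permitted.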
This innovative framework draws inspiration from medical donor registers, creating a balanced approach that serves all stakeholders while maintaining individual autonomy. The codicil system, although at this stage imaginary, is particularly relevant in the context of wearables and IoT devices, where data ownership and privacy concerns are increasingly critical and a shift towards user-centred terms of reference would rebalance the relationship. Law graduates investigated how such a system could be implemented from a legal and standardization perspective, and found a maze of overlapping regulations, including the GDPR (General Data Protection Regulation), the Data Act, and the AI Act. While GDPR Article 20 established the right to data portability, and the Data Act (applicable from September 2025) aims to give users more control over IoT-generated data, there remains a gap between these frameworks and the kind of granular, user-centric control envisioned in the Data Donor Register. The European Health Data Space (EHDS) regulation, proposed in 2022, offers perhaps the closest parallel, as it specifically addresses health data sharing and reuse while maintaining strong privacy protections.

For the moment, the Data Donor Register has been presented as part of the Prosthetic X exhibition, shown in six venues across the Netherlands and Belgium between 2021 and 2024.

Environmental Consciousness: E-missions.nl

The E-missions.nl project transforms the invisible carbon footprints of digital activities into clear calculators, with tips and tricks for consuming digital services more responsibly. Launched in 2021, one year before the generative AI revolution marked by the public launch of ChatGPT, the tool addresses the hidden, real-life effects of our energy- and water-hungry digital lives. Although not yet updated to account for AI workloads, the tool remains useful for nine other energy-consuming aspects of our digitized societies, from video streaming to online gaming, and from emailing to social media.

Digital Etiquette - CO2 E-missions

A cornerstone of the project is its Digital Etiquette initiative, which moves beyond mere awareness to foster organizational and social change. Through a dedicated plug-in on the platform, institutions, companies, and government bodies can engage in structured dialogue about their digital practices. This tool facilitates team-building and goal-setting around sustainable digital behaviour, encouraging organizations to establish their own digital etiquette. Its impact is evident in its adoption during “digital clean-up days,” where it serves as a reference point for organizational change.

The project combines practical solutions with artistic interpretation, exemplified by Leanne Wijnsma's installation "Sensing CO2", a sensory experience in which seven distinct flavours represent the CO2 emissions of different types of emails. This playful, sensory performance supports awareness and was also integrated into the "Big Data Show" to foster digital literacy among young people in the Netherlands.

The calculators are based on the research of Jens Gröger of the German Öko-Institut, providing concrete data to support these behavioural changes and leading to many surprising findings. For example, did you know that a message containing a single image requires roughly ten times the energy of a text-only message? Unless the image says more than a thousand words, it is questionable whether images are needed as often as they are shared. These metrics inform the platform's recommendations for sustainable digital practices, such as lowering the streaming quality on your video services or disabling the autocomplete functionality of your search engine. There are many such simple etiquettes, which anyone can follow, that immediately lower the energy burden of their digital actions.
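
The arithmetic behind such a calculator is straightforward. The sketch below is illustrative only: the ten-to-one image-to-text ratio comes from the text above, but the baseline of 1 gram of CO2 per plain-text email is an assumed placeholder, not a figure taken from the Öko-Institut research the platform actually uses:

```python
# Illustrative only: BASE_EMAIL_G is a hypothetical baseline, not an
# Öko-Institut figure. IMAGE_MULTIPLIER reflects the ~10x ratio for a
# message that contains a single image versus a text-only message.
BASE_EMAIL_G = 1.0       # assumed grams of CO2 per plain-text email
IMAGE_MULTIPLIER = 10    # image email costs ~10x a text-only one

def email_footprint_g(count: int, with_image: bool = False) -> float:
    """Rough CO2 estimate, in grams, for a batch of emails."""
    per_mail = BASE_EMAIL_G * (IMAGE_MULTIPLIER if with_image else 1)
    return count * per_mail

# One email per day for a year, with and without an attached image:
print(email_footprint_g(365, with_image=True))   # 3650.0 g
print(email_footprint_g(365, with_image=False))  # 365.0 g
```

Even with a tiny per-message baseline, the multiplier compounds over daily habits, which is exactly the behavioural lever the platform's tips target.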

The project advocates for systemic changes, suggesting that energy-efficient settings should become standard by default – similar to organ donor registers – with users having to actively opt out rather than in. This approach, combined with the social framework of Digital Etiquette, creates strategic input for reducing digital environmental impact through both technical solutions and behavioural change.

Voice as Identity: Vocal Values Principles

The Vocal Values Principles (VVP) project demonstrates how artistic inquiry can shape ethical frameworks for emerging technologies, particularly addressing a critical gap in the EU AI Act and Data Act regarding voice data protection. Through collaborative efforts bringing together over 20 experts from diverse fields, including singers, ethnomusicologists, legal specialists, and AI ethicists, the project, developed with Jonathan Reus in close collaboration with AI researcher Wiebke Toussaint, established starting principles for transparent and traceable voice technology development.

The project came at a key moment, as more and more generative AI platforms incorporate voice-based models for text-to-speech, speech-to-image and sophisticated voice synthesis and cloning technologies. For this, custom voices, trained on existing voices, are generated. But from which data sources? And how are those sources attributed? These questions, relevant economically as well as socially in the context of data commodification, overlap with what author and researcher Shoshana Zuboff warns of: "Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioural data." This captures the tension between treating voice as data and preserving its human essence. The ability to create "digital voice twins" that can attend meetings or make calls on our behalf raises profound questions about authenticity, consent, and the very nature of human interaction. This project creates much-needed spaces to explore both the potential and the pitfalls of these technologies before they become ubiquitous.

Voice represents far more than mere sound waves or data points – it embodies our identity, emotions, cultural heritage, and humanity itself. As AI technologies increasingly treat voice as just another dataset to be captured, processed, and replicated, what we risk losing is its profound personal and cultural significance. The human voice exists in an intricate ecosystem of creation, transmission, and reception, where each utterance carries not just information but markers of individual identity, cultural context, and emotional state.

In light of this urgent need to establish safeguards, it is significant that voice data, as a form of biometric data, receives special attention in the EU AI Act. The Act classifies systems using voice characteristics in emotion recognition or biometric categorization as high-risk applications, acknowledging how vocal features – from tone and volume to speech patterns – can reveal sensitive personal information. This designation is crucial as voice represents one of the most intimate forms of biometric data, capable of revealing emotional states, health conditions, and personal identity markers.

To address these emerging challenges systematically, the project developed a structured framework focusing on three core areas that emerged from extensive analysis of how voice data is created, shared and used:

1. Consent: Recognizing that consent is not binary but multidimensional and context-dependent, requiring flexible and nuanced frameworks.
2. Compensation: Addressing the complex valuation of voice data, particularly considering cultural heritage and vocal labour.
3. Control and Enforcement: Ensuring individuals maintain agency over their voice data through technological and legal frameworks.

The project has evolved into a living document at vocalvalues.org, offering a comprehensive framework that considers the entire voice data ecosystem, from initial creators (senders) through various handlers and receivers. The first tests were linked to performances and discussions during the exhibition "Poetics of Prompting", for which a collaboration with The Hmm was also established as part of the project's development. This initiative demonstrates how artistic practices can bridge the gap between technical development and human experience in AI systems, while addressing questions of ethics, ownership, and identity. As voice AI technologies evolve, they promise to fundamentally reshape our relationship with voice in daily life. The VVP project serves as a crucial intervention, using artistic experimentation to illuminate potential futures, both promising and concerning, before they become reality. By examining what gets preserved or lost in the training of AI models, what biases might be embedded, and how these technologies might transform language itself, the project helps surface considerations that might otherwise be overlooked in the rush to innovation. It represents an ongoing dialogue rather than a fixed framework for examining the complex relationship between human voice and digital technology.

Synthesis and Conclusion

In4Art's approach demonstrates that artistic practices are not merely supplementary to technological development but integral to setting more responsible and human-centred directions. The projects showcase how we, through collaborative artistic thinking and experimentation, try to translate abstract technical concepts into tangible experiences, whether through the metaphor of digital organs in the Data Donor Register, the sensory representation of carbon emissions in E-missions.nl, or the collaborative exploration of voice data ethics in VVP.

Looking forward, an important element in this data ethics discourse will be labour conditions and a just transition. To start understanding this field, our newest project extends to emerging challenges such as fair labour practices in repair and maintenance, and the linking of physical and digital interaction points.

It’s important to note that these frameworks and approaches are continuously evolving works in progress. As technology and societal needs advance, the ethical frameworks and methodologies must adapt accordingly. This ongoing development reflects the dynamic nature of data ethics and responsible innovation in the digital age.

For further exploration of these topics, we encourage you to visit: