
Echoes in Eternity: The Ethical Frontier of Generative Artificial Intelligence and Posthumous Data Protection

Updated: May 28



Data Protection and Posthumous Data

Introduction

Generative artificial intelligence (GAI) convincingly mimics human voices, recreates facial expressions, and simulates personality traits. As a result, the boundary between life and digital afterlife is increasingly blurred. From chatbots channeling the voices of lost loved ones to interactive avatars of deceased celebrities and historical figures, we are entering a world where digital resurrection is no longer science fiction. It is a present reality. However, amid this technological revolution, a critical question remains unaddressed: Who owns the personal data of the dead?


Most global data privacy and protection laws and regulations, such as the European Union’s General Data Protection Regulation (EU GDPR) and the California Consumer Privacy Act (CCPA), focus exclusively on the rights of living individuals. They overlook the vast and growing ecosystem of post-mortem digital footprints: voice recordings, social media activity, biometric patterns, and personal documents, collectively known as “posthumous data.” GAI systems increasingly mine and repurpose this data to build synthetic identities or simulate lifelike personas. Consequently, the absence of clear legal and ethical boundaries raises concerns about consent, cultural sensitivity, dignity, and identity.


This article dives deep into the uncharted legal and moral territory of digital legacy and posthumous data protection. It explores the fragmented global legal and regulatory landscape, analyzes real-world applications of GAI that use posthumous data, and outlines a comprehensive, future-facing roadmap. In doing so, it challenges ethicists, individuals, GAI technology developers, and policymakers to grapple with a provocative and urgent question: Should posthumous data be governed, and if so, by whom and to what end?


Key Terms

To fully engage with the complexities surrounding data protection for deceased individuals, it is crucial to clarify the terminology that frames the discussion. The following key terms provide foundational context for understanding how posthumous data intersects with GAI and digital legacy.

  • Biometric Data: Unique biological or behavioral characteristics used to identify individuals, such as voiceprints, facial geometry, and retinal patterns. Biometric data remains sensitive even after death.

  • Dead Data: Digital information left behind by individuals after death, including emails, photos, social media posts, biometric records, and voice recordings.

  • Digital Executor: A person designated to manage a deceased individual's digital assets and online accounts, including decisions on personal data and AI likeness usage.

  • Digital Resurrection: The recreation or simulation of deceased individuals using AI technologies, typically through voice synthesis, chatbots, or avatars.

  • Emotional AI (Affective Computing): Systems that analyze and respond to human emotions. Relevant when AI mimics or interacts with bereaved individuals.

  • GAI: A class of artificial intelligence models capable of generating new content, such as text, audio, images, or video, based on patterns learned from training data.

  • Post-Mortem Data Rights: Legal or ethical rights concerning personal data access, control, and processing after an individual’s death.

  • Right to Be Forgotten: The ability to request deletion of personal data under laws like the EU GDPR and similar data protection laws and regulations. Generally, it applies only to the living.

  • Synthetic Identity: A fabricated or partially fabricated digital identity that may incorporate data from deceased persons.

  • Synthetic Media: AI-generated multimedia content that mimics or fabricates real-world likenesses, speech, or behaviors.

  • Voice Cloning: A generative AI technique that uses a small audio data sample to create synthetic replicas of a person's voice.


Understanding Posthumous Data

Navigating the legal and ethical complexities of using personal data after death requires a thorough appreciation of the breadth and sensitivity of the data involved. Posthumous data encompasses the extensive digital footprint individuals leave behind after death. It includes emails, text messages, biometric information, social media profiles, cloud storage content, voice recordings, and more. Unlike physical belongings, digital legacies remain perpetually accessible, susceptible to processing, and increasingly vulnerable to exploitation by emerging GAI technologies.


GAI models, which learn and evolve from vast data repositories, find posthumous data particularly valuable. These sophisticated models leverage such data in numerous innovative yet ethically challenging ways, including:

  • Creating therapeutic applications designed for grief counseling and legacy preservation, thus extending emotional connections beyond natural human boundaries.

  • Developing personalized chatbots capable of simulating realistic conversations with deceased individuals, offering emotional comfort or closure to bereaved families.

  • Generating compelling creative content across industries such as film, literature, and virtual storytelling, often blurring the line between tribute and exploitation.

  • Training advanced natural language models to replicate authentic human behaviors, speech patterns, and emotional nuances.


Profound ethical dilemmas emerge when posthumous data is repurposed without explicit consent, often in ways the individuals concerned never foresaw or intended. This practice poses urgent questions about data protection: Who can consent to using a deceased individual's posthumous data? Can consent be ethically inferred or legitimately delegated post-mortem? Additionally, is it morally acceptable for GAI technologies to reconstruct or simulate someone's identity without explicit lifetime authorization?


These developments fundamentally challenge traditional views on data ownership, personal autonomy, digital dignity, and human identity. Floridi and Taddeo (2016) underscore the necessity for our ethical frameworks to adapt swiftly, aligning with rapid advancements in AI. They highlight the risks of digital legacies becoming commodities used for profit, persuasion, or emotional manipulation. Understanding posthumous data thus extends far beyond technological considerations. It delves deeply into identity, memory, and humanity's digital existence. As GAI technological capabilities expand, global data protection laws and regulations must urgently and proactively address the critical issue: What data protection rights and ethical protections should extend beyond an individual's lifetime?


Global Legal Frameworks Governing Posthumous Data Protection

While most data protection laws and regulations apply only to living individuals, a few jurisdictions offer explicit or sector-specific protections for posthumous data. These protections vary widely in scope, enforcement, and retention timelines (Bradshaw et al., 2010). Table 1 summarizes select jurisdictions that have addressed this issue to varying extents:

Table 1: Jurisdictions with Posthumous Data Protections

| Jurisdiction | Law/Regulation | Coverage for Deceased Persons | Retention Period (if specified) | Authorized Representatives | Key Notes |
|---|---|---|---|---|---|
| Argentina | Ley 25.326 | Partial | Not defined | Legal heirs or relatives | Heirs may access data for legal or administrative purposes. |
| Australia (NSW) | Health Records and Information Privacy Act | Health records only | Health records: 7 years (adults) after last service; until age 25 (minors) | Executors or legal representatives | Limited to public health data. |
| Canada (Ontario) | Personal Health Information Protection Act (PHIPA) | Health sector only | Health records: minimum 10 years from last entry (adults); minors: 10 years after reaching 18 | Estate trustees or executors | Robust protection for medical data. |
| Canada (British Columbia) | Freedom of Information and Protection of Privacy Act | Health and public records | Varies by agency | Executor or family | Access granted on a case-by-case basis. |
| France | Loi Informatique et Libertés (Article 85) | Explicit | Civil/identity data: 15 years (biometric data, adults); 10 years (minors) | Heirs or designated individuals | Individuals can specify post-mortem data handling instructions. |
| Germany | Civil Code (BGB §823) | Civil personality rights | Civil/identity data: varies; telecom data retained 10 weeks | Family members | Based on reputation and dignity protections. |
| Hungary | Act CXII of 2011 | Explicit | Civil/identity data: up to 10 years post-contract termination | Legal heirs | Allows delegation of data rights after death. |
| Japan | Act on the Protection of Personal Information (APPI) | None | Not applicable | Not applicable | No post-mortem coverage. |
| New Zealand | Health Information Privacy Code 2020 | Limited (health data only) | Health records: 10 years from the last service provided | Health agencies may refuse access to protect privacy | Ethical considerations for sensitive records. |
| Singapore | Personal Data Protection Act (PDPA) | None | Not applicable | Not applicable | No legal recognition of post-mortem rights. |
| South Africa | Protection of Personal Information Act (POPIA) | None | Not applicable | Not applicable | Applies only to living individuals. |
| South Korea | Civil/Criminal Code (Article 308) | Dignity-based (not data-specific) | No fixed data retention rules | Families may sue for posthumous harm | Indirect protection via defamation statutes. |
| Spain | Organic Law 3/2018 (LOPDGDD) | Explicit | Not defined | Heirs or designated individuals | Heirs can exercise rights unless the deceased prohibited it. |
| United Kingdom | UK GDPR + DPA 2018 | None | Sector-specific only; indefinite for historically significant data | Depends on archival or estate arrangements | Public interest exemptions apply in some cases. |
| United States (California) | CCPA / CPRA | None | Not applicable | Estate plans or contract-based tools only | No formal legal mechanism for digital legacy. |

 Key Observations:

  • Explicit Protections: France, Hungary, and Spain explicitly define post-mortem data rights, providing clear directives or allowing heirs specific access and control.

  • Health-Specific Protections: Australia (NSW), Canada (Ontario), and New Zealand primarily focus protections on health-related data, offering specific retention guidelines.

  • Limited or No Protections: Jurisdictions such as Japan, South Africa, and the United States (California) do not provide formal mechanisms for post-mortem data rights, leading to significant gaps in protection.

  • Variability in Representation: Jurisdictions like Germany and South Korea approach post-mortem protections indirectly, often through dignity-based legal provisions, rather than explicit data privacy laws.


Civil/identity data, cultural archives, and healthcare are among the domains with more clearly defined retention timelines. Understanding these sectoral data retention requirements is critical to developing compliant and respectful GAI applications.


Table 2: Retention Patterns by Data Type

Table 2 summarizes representative data retention requirements by data type; a minimal sketch of how such rules might be encoded in practice follows the key observations below.

| Data Type | Typical Retention Period | Jurisdictions with Explicit Rules |
|---|---|---|
| Civil/Identity Data | France: 15 years (biometric data, adults); 10 years (minors). Germany: varies; telecom data retained 10 weeks. Hungary: up to 10 years post-contract termination. | France, Germany, Hungary |
| Cultural Archives | China: defined by state regulations, regularly transferred for central management. EU: historical records selected within 15 years. UK: indefinite for historically significant data. | China, European Union, United Kingdom |
| Health Records | Australia (NSW): 7 years (adults) after last service; until age 25 (minors). Canada (Ontario): minimum 10 years from last entry (adults); minors: 10 years after reaching 18. New Zealand: 10 years from the last service provided. | Australia (NSW), Canada (Ontario), New Zealand |

Key Observations:

  • Civil/Identity Data: Retention periods significantly differ across jurisdictions; France retains biometric data for 15 years, Germany has short retention for telecom data (10 weeks), and Hungary retains data up to 10 years after contract termination.

  • Cultural Archives: Indefinite retention for historically significant records is common, though specifics vary. The UK maintains permanent archives, the EU sets selection criteria within 15 years, and China centrally regulates archival management.

  • Health Records: Diverse retention periods exist, from Ontario’s minimum of 10 years (longer for minors) to NSW’s 7-year standard for adults (longer for minors), and New Zealand’s consistent 10-year retention from the last health service.
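To make these obligations concrete, the following sketch shows one way a GAI data pipeline might encode a few of the retention rules summarized in Table 2 and compute when a record reaches the end of its retention window. This is a minimal, illustrative Python sketch, not legal advice: the class and function names are invented, the day-based approximations are simplifications, and conditions such as minor status or contract-termination dates are deliberately omitted.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional


@dataclass(frozen=True)
class RetentionRule:
    """Simplified sector-specific retention rule (illustrative only)."""
    jurisdiction: str
    data_type: str       # e.g., "health", "civil_identity", "telecom"
    retention_days: int  # approximate statutory period in days


# Approximate values drawn from Table 2; real rules carry many conditions
# (minors, last-service dates, contract termination) that are omitted here.
RULES = [
    RetentionRule("France", "civil_identity", 15 * 365),    # biometric data, adults
    RetentionRule("Hungary", "civil_identity", 10 * 365),   # post-contract termination
    RetentionRule("Germany", "telecom", 10 * 7),             # telecom data, 10 weeks
    RetentionRule("Australia (NSW)", "health", 7 * 365),     # adults, after last service
    RetentionRule("Canada (Ontario)", "health", 10 * 365),   # minimum, from last entry
    RetentionRule("New Zealand", "health", 10 * 365),        # from last service provided
]


def retention_deadline(jurisdiction: str, data_type: str, anchor: date) -> Optional[date]:
    """Return the date on which the record's retention window ends, if a rule is known."""
    for rule in RULES:
        if rule.jurisdiction == jurisdiction and rule.data_type == data_type:
            return anchor + timedelta(days=rule.retention_days)
    return None  # no explicit rule encoded for this combination


if __name__ == "__main__":
    # Example: an Ontario health record whose last entry was 1 June 2015.
    print(retention_deadline("Canada (Ontario)", "health", date(2015, 6, 1)))
```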


While these examples illustrate incremental progress, they also highlight the lack of harmonization across jurisdictions. This inconsistency makes it challenging for global GAI developers to implement universally compliant practices. Having examined the legal landscape, we now explore the ethical and cultural implications of using posthumous data.


Ethical and Legal Risks Associated with Posthumous Data Use and Generative AI

As GAI technologies advance rapidly, their utilization of posthumous data presents complex ethical and legal challenges. These technologies offer remarkable possibilities, such as digitally memorializing loved ones or historical figures. They also raise significant concerns regarding consent, cultural sensitivities, data protection, and emotional well-being.


The ethical and legal issues surrounding posthumous data use and GAI require a comprehensive examination. Table 3 outlines key risk categories and associated concerns and provides actionable recommendations to address and mitigate these challenges.


Table 3: Ethical and Legal Risks Associated with Posthumous Data Use and Generative AI

| Risk Category | Key Issues | Recommendations for Mitigation |
|---|---|---|
| Algorithmic Bias and Inaccuracy | AI perpetuates biases from training data. | Emphasize diverse training datasets and bias audits. |
| Commercial Exploitation | Unauthorized use of deceased persons' likenesses for profit. | Reference notable legal cases addressing unauthorized commercial use. |
| Consent Ambiguity | Lack of explicit posthumous consent for data use. | Implement digital wills or data trustee designations. |
| Cultural and Religious Conflicts | Potential disrespect or taboo violations in specific cultural contexts. | Cite specific doctrines or cultural perspectives prohibiting posthumous representation. |
| Data Ownership and Control | Ambiguity in managing and inheriting digital assets. | Reference established frameworks like RUFADAA (U.S.) for digital asset inheritance. |
| Emotional Harm to Survivors | Psychological distress caused by digital resurrections. | Incorporate grief counseling guidelines and research on digital legacies. |
| Ethical Oversight Deficiency | Absence of established ethical guidelines for generative AI use. | Create ethical frameworks and oversight committees. |
| Identity Misrepresentation | AI-generated personas misrepresent the deceased. | Refer to laws protecting posthumous publicity rights (e.g., California Civil Code §3344.1). |
| Legal Gaps and Jurisdictional Variability | Inconsistent laws governing post-mortem data across jurisdictions. | Advocate for international harmonization efforts or treaties. |
| Technological Misuse | Malicious exploitation of AI technologies, such as deepfakes. | Document concrete examples of misuse; establish robust regulatory frameworks. |

To manage these risks effectively, stakeholders, including policymakers, ethicists, technology developers, and society at large, must collaborate to establish clear ethical standards and robust regulatory frameworks. Such proactive approaches help ensure that GAI respects dignity, privacy, emotional safety, and cultural sensitivity, and they will ultimately support responsible and ethical innovation in the digital age. Next, we look at several real-world applications that use and process posthumous data in GAI systems and technologies.


Real-World Applications

Today's technological landscape offers compelling examples of GAI used to replicate or simulate deceased individuals. These applications employ advanced GAI models capable of producing realistic content, from text and voice to images and video, that can convincingly mimic a person's appearance, speech, or personality.

  • D-ID: This deep-learning-based technology transforms static photographs into lifelike talking videos, often combined with voice cloning to create realistic digital personas. Such synthetic media highlights the emerging capabilities and ethical responsibilities associated with AI-driven recreations of human likeness (D-ID, n.d.).

  • HereAfter AI: This GAI platform creates voice-based chatbots trained on personal interviews and recordings. By leveraging voice cloning and sophisticated natural language processing, HereAfter AI enables meaningful and emotionally resonant conversations with deceased loved ones (HereAfter AI, n.d.).

  • Kanye West's Hologram of Robert Kardashian: This well-publicized instance combined pre-scripted dialogue with GAI-generated facial animation and synthetic voice replication, illustrating digital resurrection's significant cultural impact and ethical complexities (BBC News, 2020).

  • Project December: Utilizing advanced GPT models, Project December facilitates interactive conversations with simulated personas, including those modeled after deceased individuals, through highly realistic natural language generation. This technology exemplifies the emotionally powerful potential and ethical challenges of recreating human interaction digitally (Roose, 2020).

  • Respeecher: Known for its generative voice synthesis, Respeecher clones and recreates the voices of real individuals, including deceased celebrities and public figures. The technology has notable applications in entertainment and creative industries, significantly raising the stakes around consent and ethical usage (Respeecher, n.d.).


These applications demonstrate GAI's robust capabilities and profound ethical implications. As these technologies become increasingly prevalent, there is an urgent need for comprehensive legal and regulatory frameworks and ethical standards to govern their responsible use, underscoring the case for effective GAI governance policies.


Practical Legal and Regulatory GAI Policy Recommendations

The European Parliament’s “Guidelines on Ethics in AI” (2019) exemplified the EU’s efforts to align current and future GAI-focused regulations with ethical AI practices, and the EU AI Act (2024) is the culmination of those efforts. To effectively manage the complexities surrounding GAI's use of posthumous data, other jurisdictions must align their GAI governance efforts with their AI ethics and data protection strategies. These strategies must balance technological innovation with ethical, legal, and regulatory compliance while respecting individual dignity both in life and after death.


Recommended actions include:

  • AI Ethics Reviews: Introduce mandatory oversight committees and ethical review processes for AI technologies that utilize posthumous data. These bodies should evaluate the ethical implications, potential emotional harm, and cultural sensitivities.

  • Digital Legacy Clauses: Integrate explicit digital consent provisions within estate planning processes, ensuring individuals clearly define how their posthumous data may be managed, utilized, or restricted after death (a minimal illustrative sketch follows this list).

  • Family Opt-Out Mechanisms: Establish accessible and straightforward procedures enabling surviving family members or designated representatives to request the removal of deceased relatives' data from GAI training datasets, safeguarding familial data protection and emotional well-being.

  • Post-Mortem Privacy Rights: Advocate for comprehensive, universally recognized privacy rights that continue to protect individuals' data after death and prevent unauthorized or exploitative use.

  • Standardized Retention Rules: Develop internationally aligned sector-specific retention guidelines and standards for posthumous data, clearly defining the duration and conditions under which such data can be stored, accessed, and utilized.
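To illustrate how digital legacy clauses and family opt-out rights might be operationalized in software, the sketch below models a hypothetical, machine-readable post-mortem data directive. The schema, field names, and permitted-use categories are assumptions made for illustration; they do not reflect any existing statute, estate-planning standard, or vendor API.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional, Set


class PostMortemUse(Enum):
    """Categories of posthumous processing a person might address in advance."""
    VOICE_CLONING = "voice_cloning"
    CHATBOT_SIMULATION = "chatbot_simulation"
    MODEL_TRAINING = "model_training"
    ARCHIVAL_DISPLAY = "archival_display"


@dataclass
class DigitalLegacyDirective:
    """Hypothetical machine-readable digital legacy clause.

    Captures the kinds of instructions an estate plan or data trustee
    designation could record: who decides, what is permitted, and what
    must be deleted or withheld after death.
    """
    subject_id: str                           # pseudonymous identifier of the data subject
    digital_executor: Optional[str]           # designated person or trustee, if any
    permitted_uses: Set[PostMortemUse] = field(default_factory=set)
    prohibited_uses: Set[PostMortemUse] = field(default_factory=set)
    delete_after_years: Optional[int] = None  # requested retention limit, if any

    def allows(self, use: PostMortemUse) -> bool:
        """A use is allowed only if explicitly permitted and not prohibited."""
        return use in self.permitted_uses and use not in self.prohibited_uses


# Example: archival display permitted, chatbot simulation expressly refused.
directive = DigitalLegacyDirective(
    subject_id="subject-0042",
    digital_executor="appointed-trustee",
    permitted_uses={PostMortemUse.ARCHIVAL_DISPLAY},
    prohibited_uses={PostMortemUse.CHATBOT_SIMULATION},
    delete_after_years=10,
)
assert directive.allows(PostMortemUse.ARCHIVAL_DISPLAY)
assert not directive.allows(PostMortemUse.CHATBOT_SIMULATION)
```

In practice, such a directive would be only one input among many; applicable law and the wishes of surviving family members would still need to be consulted before any posthumous use.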


These recommendations provide foundational safeguards, support a roadmap for GAI governance, and strengthen posthumous data protection.


Proposed Roadmap for GAI Governance and the Data Protection of Posthumous Data

Understanding global legal frameworks leads naturally to the ethical and cultural concerns that influence attitudes toward posthumous data use. Cultural and ethical standards vary significantly worldwide, and respecting diverse views on dignity and memory is crucial when considering AI ethics (UNESCO, 2023). As GAI continues bridging the gap between technology and human memory, it is essential to acknowledge and address the profound implications, including cultural sensitivity, potential emotional distress, and the heightened risk of reputational harm (Wagner, 2018).


Key ethical dimensions include:

  • Consent: Did the individuals explicitly authorize using their digital identity or data after death? Without explicit consent, utilizing posthumous data raises ethical dilemmas regarding autonomy and personal integrity.

  • Cultural Sensitivity: Cultural and religious beliefs significantly influence how societies perceive death and remembrance. Digital resurrection may be considered disrespectful or taboo in specific cultural or spiritual contexts, making culturally informed AI development vital (Barnes, 2025).

  • Emotional Harm: The use of AI to recreate or simulate deceased loved ones can lead to emotional trauma or prolonged grief for surviving relatives, who might struggle with distinguishing digital interactions from genuine emotional connections.

  • Misuse and Disinformation: GAI-generated recreations or personas could be manipulated for unethical purposes, including damaging reputations, spreading misinformation, or misleading the public.


Allen and Rothman (2024) discussed the importance of “post-mortem privacy” and the need to protect data protection rights in death. They highlighted the ethical considerations associated with post-mortem privacy, emphasizing that data protections should extend to the deceased as well as the living (Allen & Rothman, 2024). Having outlined the ethical dimensions associated with posthumous data protection, we now turn to a structured roadmap that offers practical insight into how GAI development can navigate, rather than transgress, these ethical boundaries.


A comprehensive and structured roadmap is proposed to responsibly guide the development and deployment of GAI technologies utilizing posthumous data. This roadmap emphasizes ethical integrity, legal clarity, and practical applicability, ensuring respectful and responsible use of posthumous data. The key focus areas are as follows:


Best Practices

  • Codify explicit digital legacy clauses within estate planning and broader data governance frameworks.

  • Establish family opt-out rights, empowering surviving relatives to control or prevent the use of deceased individuals' likenesses and personal data.

  • Standardize data retention and deletion protocols across critical sectors such as healthcare, entertainment, and archival management.

  • Promote the widespread adoption of GAI ethics certifications among developers and platforms involved in digital resurrection initiatives.

Ethical Considerations

  • Embed fundamental ethical principles such as dignity, explicit consent, cultural sensitivity, and emotional safety into GAI technologies' core design and deployment practices.

  • Establish dedicated ethical review boards for overseeing projects involving the use or simulation of deceased individuals' data.

  • Facilitate active engagement and consultation with bioethicists, religious authorities, mental health professionals, and cultural experts throughout the GAI technology development lifecycle.

Legal Analysis

  • Undertake comprehensive comparative legal reviews to identify jurisdictions recognizing post-mortem data rights.

  • Analyze gaps in current international legal frameworks, including Brazil’s General Data Protection Law, the CCPA, China’s Personal Information Protection Law, and the EU GDPR, to highlight areas requiring further regulation or clarification.

  • Encourage international harmonization by advocating for model laws, standards, or treaties addressing digital legacies and posthumous data protections.

Harbinja (2017, 2023) argues for a harmonized legal and regulatory approach to digital inheritance and posthumous data protection, one that addresses jurisdictional inconsistencies and uncertainties.

Real-World Applications

  • Document and critically analyze practical use cases such as HereAfter AI, Project December, and Respeecher to understand benefits, limitations, and potential ethical risks.

  • Implement robust transparency mechanisms and audit trails to monitor how GAI models collect, store, and utilize posthumous data.

  • Require that consent logs or opt-out registries be established and maintained for deceased individuals whose data might be publicly accessible (see the sketch after this list).
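As a rough illustration of how an opt-out registry and audit trail could work together, the sketch below filters a training corpus against a set of opted-out subject identifiers and logs every exclusion. The record structure, registry format, and log fields are hypothetical assumptions rather than an established industry mechanism.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, Iterable, Iterator, List, Set


@dataclass(frozen=True)
class Record:
    """A training-corpus item attributed to an identifiable, possibly deceased, person."""
    subject_id: str
    source: str
    text: str


def filter_against_opt_outs(
    records: Iterable[Record],
    opt_out_registry: Set[str],
    audit_log: List[Dict[str, str]],
) -> Iterator[Record]:
    """Yield only records whose subjects are not in the opt-out registry.

    Every exclusion is appended to a simple audit trail so that families,
    regulators, or reviewers can later verify that opt-outs were honored.
    """
    for record in records:
        if record.subject_id in opt_out_registry:
            audit_log.append({
                "subject_id": record.subject_id,
                "source": record.source,
                "action": "excluded_by_opt_out",
                "timestamp": datetime.now(timezone.utc).isoformat(),
            })
            continue
        yield record


# Example usage with a registry built from family opt-out requests.
opt_outs = {"subject-0042"}
audit: List[Dict[str, str]] = []
corpus = [
    Record("subject-0042", "voicemail-archive", "sample text"),
    Record("subject-0417", "public-interview", "sample text"),
]
kept = list(filter_against_opt_outs(corpus, opt_outs, audit))
print(len(kept), len(audit))  # 1 record kept, 1 exclusion logged
```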


Yang (2024) highlighted the growing use of deepfakes by Chinese companies to comfort grieving relatives. This roadmap offers a detailed and actionable guide to fostering human-centered and ethically sound GAI practices that address this business practice and others like it. It highlights the need to respect the dignity of deceased individuals and to safeguard surviving family members from potential emotional and reputational harm.


Conclusion

As GAI blurs the boundaries between life and death, it compels us to radically rethink our conceptions of data protection, dignity, and digital legacy. The digital footprints left behind by the deceased are more than mere data. They represent enduring elements of personal identity, memory, and human connection. These elements command ethical reverence and thoughtful legal protection, far beyond current practices that often cease at death’s threshold.


In an age where GAI technologies routinely transcend the limits traditionally imposed by mortality, we face critical decisions: Will we protect the integrity and dignity of individuals even after death? Or will we allow unchecked innovation to potentially exploit and commodify the most intimate aspects of human identity? To address these profound and complex questions, global policymakers, technologists, and society must urgently and collectively embrace the idea of posthumous privacy as a fundamental human right.


Only by proactively defining clear ethical standards and robust regulatory frameworks can we ensure that technology honors, rather than undermines, the sanctity of human legacy.

Our choices will profoundly shape our digital future, influencing how future generations remember and interact with those who came before. Let us ensure that our technological advancements reflect our highest ethical standards, while preserving humanity's dignity and integrity in life and beyond.


Ethical and Policy Reflection Questions

Using generative AI to simulate or memorialize deceased individuals introduces complex ethical dilemmas and policy challenges. To guide responsible decision-making in this evolving space, the following questions are intended to spark critical reflection among ethicists, GAI technology developers, individuals, and policymakers:

  1. Commercialization Risks: To what extent is it ethical for companies to profit from synthetic recreations of the deceased, particularly celebrities or public figures?

  2. Consent and Autonomy: How can we respect the autonomy of individuals who never had the opportunity to consent to their data being used after death?

  3. Cultural Sensitivity: How can organizations ensure that AI-generated resurrections align with cultural, religious, and spiritual beliefs about death and remembrance?

  4. Digital Executor Role: Should individuals be required to designate a digital executor or data trustee to manage their AI-relevant digital legacy?

  5. Dignity and Legacy: What ethical boundaries should exist when recreating the voice, appearance, or personality of someone who has died?

  6. Emotional Safety: What safeguards should be in place to prevent emotional manipulation or psychological harm to those interacting with AI simulations of loved ones?

  7. Family and Community Rights: Should surviving relatives have the legal or moral authority to approve or deny using a deceased person's likeness or data?

  8. Global Governance: What international frameworks or treaties could harmonize post-mortem data rights in cross-border AI development?

  9. Regulatory Scope: Should data protection laws be updated globally to include post-mortem data rights explicitly, and if so, how?

  10. Technological Accountability: How can we audit and trace generative models that retain sensitive or identifiable data in latent memory? 

  

References

  1. Allen, A.L. & Rothman, J.E. (2024, November). Post-mortem privacy. Michigan Law Review. https://michiganlawreview.org/journal/postmortem-privacy/.

  2. Australia-New South Wales (NSW) Government (2025). State records NSW. https://www.nsw.gov.au/departments-and-agencies/dciths/state-records-nsw.

  3. Barnes, E. (2025, January 10). When AI brings back the dead: Balancing comfort and consequences. VKTR. https://www.vktr.com/ai-ethics-law-risk/when-ai-brings-back-the-dead-balancing-comfort-and-consequences/.

  4. BBC News. (2020, October 30). Kanye West gives Kim Kardashian a hologram of her late father. https://www.bbc.com/news/entertainment-arts-54731382.

  5. Bradshaw, S., Millard, C., & Walden, I. (2010, September 2). Contracts for clouds: Comparison and analysis of the terms and conditions of cloud computing services. Queen Mary University of London School of Law, 19(3), 187–223. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1662374

  6. Bundesnetzagentur. (2025). https://www.bundesnetzagentur.de/EN/Home/home_node.html.

  7. Canadian Medical Protective Association. (2025). Canada (Ontario) health records retention. https://www.cmpa-acpm.ca/en/home.

  8. Carballo, R. (2023, December 11). Using AI to talk to the dead. The New York Times. https://www.nytimes.com/2023/12/11/technology/ai-chatbots-dead-relatives.html.

  9. China-NPC Observer. (2025). Archives law. https://npcobserver.com/legislation/archives-law/.

  10. CNIL. (2025). https://www.cnil.fr/en/professionnel.

  11. D-ID. (n.d.). AI-generated digital humans. https://www.d-id.com.

  12. DLA Piper. (2024, January 11). Data protection laws in Hungary. https://www.dlapiperdataprotection.com/index.html?t=retention-of-personal-data&c=HU.

  13. Edwards, L., & Veale, M. (2018). Slave to the algorithm? Why a right to an explanation is probably not the remedy you are looking for. Duke Law & Technology Review, 16(1), 18–84. https://doi.org/10.2139/ssrn.2972855.

  14. European Commission. (2025). Document management and archival policy. https://commission.europa.eu/about/service-standards-and-principles/transparency/information-and-document-management/archival-policy/document-management-and-archival-policy_en.

  15. European Parliament. (2019, September). EU guidelines on ethics in artificial intelligence: Context and implementation. https://www.europarl.europa.eu/RegData/etudes/BRIE/2019/640163/EPRS_BRI(2019)640163_EN.pdf.

  16. European Union. (2024). The EU Artificial Intelligence Act: Up-to-date developments and analyses of the EU AI Act. https://artificialintelligenceact.eu/.

  17. Floridi, L., & Taddeo, M. (2016). What is data ethics? Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2083), 20160360. https://doi.org/10.1098/rsta.2016.0360.

  18. González Fuster, G. (2021). The emergence of personal data protection as a fundamental right of the EU. Springer. https://academic.oup.com/idpl/article-abstract/5/1/91/622976.

  19. Harbinja, E. (2023). Digital Death, Digital Assets and Post-mortem Privacy. Edinburgh University Press. http://www.jstor.org/stable/10.3366/j.ctv32vqmk8.

  20. Harbinja, E. (2017, February 22). Post-mortem privacy 2.0: theory, law, and technology. International Review of Law, Computers & Technology, 31(1), 26–42. https://doi.org/10.1080/13600869.2017.1275116.

  21. HereAfter AI. (n.d.). Talk to loved ones, even after they’re gone. https://www.hereafter.ai.

  22. Respeecher. (n.d.). Voice cloning for content creators. https://www.respeecher.com.

  23. UNESCO. (2023). UNESCO’s recommendation on the ethics of artificial intelligence: Key facts. https://unesdoc.unesco.org/ark:/48223/pf0000385082.page=4.

  24. United Kingdom National Archives. (2025). Advice on retention. https://www.nationalarchives.gov.uk/information-management/manage-information/policy-process/disposal/advice-on-retention/.

  25. Wagner, B. (2018, December 27). Ethics as an escape from regulation: From ethics-washing to ethics-shopping? In E. Bayamlioğlu, I. Irion, & R. Leenes (Eds.), Being profiled: Cogitas ergo sum (pp. 84–89). Amsterdam University Press. https://uplopen.com/chapters/e/10.1515/9789048550180-016.

  26. Yang, Z. (2024, May 7). Deepfakes of your dead loved ones are a booming Chinese business. MIT Technology Review. https://www.technologyreview.com/2024/05/07/1092116/deepfakes-dead-chinese-business-grief/.

 

 

 
 
 
