AI, Digital Identity, and the Productization of Self

Despite the noise surrounding AI, I felt compelled to share some of my thoughts on the implications the commercialization of AI may have on our ‘digital self’.

[Image: a woman’s reflection in the mirror appearing as a robot]

Like so many of us, I continue to be fascinated by the recent advancements in GenAI and the opportunities they open up. The intersection of AI and identity resonates with me deeply, so I felt compelled to share some of my thoughts on what its commercialization may mean for our ‘digital self’.

Identity is an extremely complex topic spanning a multitude of research fields, and with recent advancements in AI it is reasonable to anticipate significant interplay and mutual influence between the two areas. While academic research in this field continues, it is being swiftly outpaced by technology companies racing to productize AI as quickly as possible. That pace is, as with any other digital product, driven primarily by the race to be first to market, and far less by the short- and long-term implications this productization may have on identities: moral, social, political, and beyond. In this article, I intend to bring into sharper focus some of the implications such commercialization of AI may have on our ‘digital self’.

What is the ‘digital self’? How does it differ from my ‘self’ in the real world? As a ubiquitous part of our personal and professional lives, technology is changing not only how we do things but also who we are. The ‘digital self’ represents how we exist, interact, and are perceived in the world of technology, across the variety of identities we assume as we traverse it. It may include our LinkedIn, Facebook, Twitter or Instagram presence, but it can also include our presence on the innocuous (grocery and shopping apps, photo editors) and the extremely personal (dating profiles, OnlyFans subscriptions, or even more intimate things). While we may see these as different facets of ourselves, technology is inevitably connecting them, often without our knowledge or consent.

Our digital identity is both the product of our social interaction through technology and a force shaping the societies that help create it. I assume most readers are familiar with the attention economy, the fundamental force behind the explosion of digital content created by individuals and organizations alike, all in an attempt to grab consumers’ attention. The platforms we use to create content are free to use, but in exchange, we don’t own the content itself, much as we may dislike that. To attract attention to our content, we can create identities that differ somewhat from reality; after all, are you the same person at home as you are at work, or the same with your friends as with strangers? In creating these identities, we then project images of them, and sometimes those images diverge from reality, polished to appear more interesting or compelling (or hireable). Digital media opens up the possibility of becoming who we want to be, creating our own identity (or identities), instead of settling for who we are.

So, with advancements in AI, it is only a matter of time before we see these identities captured in a digital product that lets us interact with them as authentic personalities. In recent months we have witnessed multiple products that capture a person’s body of knowledge (all digitally available content authored by an individual) and, through the application of large language models, offer it as an interactive chatbot. For example, we can have a convincing conversation with Socrates, Galileo, or Tesla. AI-generated digital replicas of real experts (Dr. Martin Seligman, or Deepak Chopra) are becoming a reality for which we don’t appear to be nearly ready. We can even create a version of ourselves and offer it to others to interact with. What is there not to like? What can go wrong?
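Under the hood, most of these products appear to follow a similar pattern: retrieve the most relevant passages from a person’s corpus of writing, then have a large language model answer in that person’s voice, grounded in those passages. The Python sketch below is a minimal illustration of that retrieval-augmented pattern under my own assumptions, not a description of any specific product; complete() is a placeholder for whatever LLM API would actually be called, and the naive word-overlap retrieval stands in for proper embedding-based search.

```python
# Minimal sketch of the "personal body of knowledge as a chatbot" pattern.
# Hypothetical throughout: complete() stands in for a real LLM API call, and
# word-overlap retrieval stands in for embedding-based search.

def complete(prompt: str) -> str:
    # A real product would send `prompt` to a large language model here.
    return "[LLM answer grounded in the retrieved passages would appear here]"

def retrieve(corpus: list[str], question: str, k: int = 3) -> list[str]:
    """Rank passages by naive word overlap with the question."""
    q_words = set(question.lower().split())
    ranked = sorted(corpus,
                    key=lambda p: len(q_words & set(p.lower().split())),
                    reverse=True)
    return ranked[:k]

def ask_digital_self(corpus: list[str], persona: str, question: str) -> str:
    passages = retrieve(corpus, question)
    prompt = (
        f"You are a digital replica of {persona}. Answer only from these "
        "excerpts of their own writing:\n\n" + "\n---\n".join(passages) +
        f"\n\nQuestion: {question}\nAnswer in their voice:"
    )
    return complete(prompt)

# Example: a tiny "body of knowledge" assembled from someone's posts.
corpus = [
    "Risk should be priced before it is taken, not after.",
    "My approach to compliance starts with mapping data flows.",
    "Good architecture is mostly about deciding what to leave out.",
]
print(ask_digital_self(corpus, "a risk-management consultant",
                       "How do you think about risk?"))
```

The point of the sketch is simply that the ‘replica’ is only as authentic, and only as controllable, as the corpus and the prompt that wrap it.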

Back in 2016 (in the digital world, that’s a while ago), Mark Manson wrote an article, “Future of Self”, in which he prophetically mused on an idea: what if your memories, ideas, expertise, … could be captured in a series of bytes, transferred on a digital medium to Mars, and made accessible there in digital form? He even went on to speculate about a bionic (real-body) version of ‘you’, ultimately asking: would that entity (real or digital) still be you? The article, like everything else Mark publishes, is entertaining and witty, but full of deep thinking. I strongly recommend it.

Well, it seems that, much sooner than Mark thought, it is possible to capture at least a portion of our ‘digital selves’ (as in the case of Dr. Seligman) and open it for public use. While Dr. Seligman agreed to unlimited use of his body of knowledge, even after his death, it is reasonable to expect this paradigm to be ripe for productization. Why not capture any other personal or business-specific knowledge and offer it as a product? In the same fashion that Paul Simon or Elton John sold their musical catalogues, a high-profile consultant in risk management, regulatory compliance, or software architecture could assemble their knowledge, attach a conversational chatbot to it, and sell it as a digital product. Multiple companies could use the same product, calling into question the need for high-profile consultants. Copyright is easy to enforce when a reseller sells an Elton John song, but who owns the knowledge assembled by a tax advisory consultant? Furthermore, through service-level agreements, these products could be made modifiable and extensible. How should we regulate such change, and who owns the modified, ‘new’ product? Do these ‘products’ survive their creators, as in Dr. Seligman’s case?

Consider the following: a university, trusted with the transfer and accreditation of knowledge, assembles the entire body of knowledge required to earn a Bachelor’s in Computer Science and to solve computer science problems. There are no exams, no professors, no classes. The university packages this up into a GenAI product and calls it “Digital Bob, BSc”. Can companies now hire “Bob” for a fee paid to the university? As a digital worker, Bob never needs a sick day or any sort of leave, and has all the expertise they need. It seems the concepts of education and the workforce may be facing some fundamental and radical challenges.

What about a personal body of knowledge? Some people have substantial content across multiple social networks, in their blogs, photo albums, e-mails, and so on. Is it possible to construct someone’s identity by centralizing all that content and offering it as an interactive digital product? We can already do that: create a version of the self. Can surviving family members use this to interact with departed loved ones? A huge question. There is already technology doing this (HereAfter, StoryFile, and others); it is a source of comfort for some, but it leaves others squeamish and disturbed. A telling example is Robin Williams’ family fighting staunchly against people using his voice in AI. Who is that ‘new’ person? Can that digital (re)incarnation be modified? And if so, who is it then; still the same individual?

If you change all the ship's parts, is it still the same ship?

This reminds me of the well-known philosophical thought experiment of the “Ship of Theseus”, which asks: if all of a ship’s parts are replaced over a long period of time, is it still the same ship?

What answers do we have today to some of these questions? I believe it is so early that we are not even asking the right questions, let alone providing the right answers. Tracey Follows, a renowned futurist, offers a deeper view in her book “The Future of You”, in which she thoroughly examines the complex questions surrounding our digital identities and encourages readers to actively engage with them.

So, how do we obtain and maintain possession of our digital selves? Doing that may not be trivial. The consequences of not doing so might be detrimental to who we are, and how we work and live. Does it mean interacting with the internet less? Does it mean using more technology that allows us to obscure our identities, so AI doesn’t have access?

A framework outlining both regulatory and technical components might be a good starting point for discussion and further development. From a technology point of view, we would need alignment on these components:

  • Data portability and interoperability: Ability to easily access, download, and transfer all your digital data, regardless of platforms and services hosting it. Standardized formats and open APIs would be crucial for seamless data portability.
  • Decentralized storage solutions: Taking ownership and control of our own data back from large corporations is of critical importance. Blockchain technology may offer potential solutions here.
  • Strong encryption and user-controlled access: Owners should have complete control over their data. Robust encryption mechanisms and granular access controls would let them decide what information gets shared, and with whom (a minimal sketch of this idea follows this list).
  • Identity management tools: User-friendly tools for managing your digital identity across various platforms would be essential. This could include features for setting privacy preferences, tracking data flows, and managing consent for data sharing.
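To make the portability and access-control points above a bit more concrete, here is a minimal, illustrative sketch rather than a reference implementation: a personal data export in a plain JSON format, encrypted locally, with the owner deciding which consumers receive the decryption key. It assumes Python with the cryptography package; the export schema and the grant_access/revoke_access helpers are hypothetical names invented for this example.

```python
# Illustrative sketch: a portable, encrypted personal-data export where the
# owner controls who can read it. Schema and helper names are hypothetical.
import json
from cryptography.fernet import Fernet  # pip install cryptography

# 1. Data portability: a simple, standardized JSON export of "my" data.
export = {
    "schema": "personal-data-export/v0",   # hypothetical format identifier
    "profiles": {"linkedin": "...", "instagram": "..."},
    "posts": ["First blog post", "Second blog post"],
}
export_bytes = json.dumps(export).encode("utf-8")

# 2. User-controlled access: encrypt locally; only key holders can read it.
key = Fernet.generate_key()
encrypted_export = Fernet(key).encrypt(export_bytes)

# 3. Granular consent: the owner decides which consumers get the key.
access = {}                      # consumer -> key (stand-in for real key management)

def grant_access(consumer: str) -> None:
    access[consumer] = key

def revoke_access(consumer: str) -> None:
    access.pop(consumer, None)   # new exports would be re-encrypted with a fresh key

def read_export(consumer: str) -> dict:
    consumer_key = access[consumer]          # KeyError if access was never granted
    return json.loads(Fernet(consumer_key).decrypt(encrypted_export))

grant_access("my-photo-app")
print(read_export("my-photo-app")["posts"])
```

A real system would need per-recipient keys or envelope encryption so that revoking one consumer does not mean rotating a single shared secret, but the division of responsibilities is the essential point: the owner holds the data and the keys, and platforms only ever see ciphertext.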

In parallel, we must work on a regulatory framework, with the following components:

  • Digital Identity: A governance and provisioning framework for the management of citizens’ identities (real and digital).
  • Data ownership rights: Clear legal frameworks are needed to establish and enforce your rights to ownership and control of your digital data.
  • Data protection regulations: Further strengthening and thorough application of existing data protection laws like GDPR is needed.
  • Digital literacy and education: People should have easy-to-access, easy-to-understand knowledge and skills to manage their digital identities effectively and make informed choices about their online presence.
  • Collective action and advocacy: Individual efforts are not enough. As Follows puts it, “… states are behaving like a tech corporation, and tech corporations are behaving like a state…”. It may feel a little idealistic, but we need them working together on digital rights, advocating for stricter data protection laws, and holding themselves accountable for establishing and maintaining these frameworks.

It is evident that, in the (near) future, we must assert ownership over our digital selves. In the current technological landscape (often referred to as Web 2.0), however, this is a daunting task. Without ownership of our content (a concept potentially attainable in a decentralized Web 3.0) and the establishment of unique digital identification, two key milestones, meaningful progress toward a satisfactory solution remains elusive. Given our governments’ notoriously slow (re)action to AI’s revolutionary disruption, it is not hard to see this void being filled by well-intentioned but profit-driven tech companies, which will have little time to consider anything but the monetary aspects of solutions in this space. If our past record is any guide, the future may not be as well prepared (read: regulated) as we hope. We find ourselves at the mercy of the evolutionary forces of these tech companies, forces which have served us relatively well in the physical world. Let us hope they can guide us wisely in the digital realm as well.

There is no simple answer to the complex identity questions surrounding our digital future. While seeking those answers, we need to become informed participants in shaping the way our online identities evolve. By understanding the issues, adopting a critical mindset, and collectively taking proactive steps, we can navigate the challenges and opportunities of our digital lives with greater agency and responsibility, so that, instead of being replaced, we are augmented by our digital counterparts and become even better humans.

DISCLAIMER: The views and opinions expressed in this article are solely the author's own and do not reflect the official policy or position of any other individual, organization, or company. The author takes full responsibility for the content and any potential errors or omissions.
