In November 2022, The House of Ethics was among the first organizations to refuse to use ChatGPT. Many voices questioned why we decided on such a drastic corporate move. First, for reasons of data ethics, and second, because general-purpose transformer technology is simply not fit for our analytical core activity: ethics.
Now that an understanding of the inherent risks and harms generated by this untrustworthy technology is infiltrating the mainstream and the collective conscience, people and businesses are assessing it differently. Beyond the fact that generative AI sits on an ethical permafrost, private and business users now sense the gradual erosion of our societal, human and democratic values, with the Atlantis of Fakery finally surfacing after a slow defrosting.
Major lawsuits are surfacing, like the recent New York Times-OpenAI case, and more editors, artists and writers are rebelling and legally seeking justice after the industry-scale IP violations, copyright infringements and data theft by OpenAI (ChatGPT and DALL-E), Microsoft (Bing), Stability AI (Stable Diffusion) and Midjourney. After leaked conversations showed Midjourney developers discussing the "laundering" of a database of artists, policymakers realized they had only been glancing at the tip of the iceberg. Not to mention the programmed collapse of large language models endlessly fed with fake and inaccurate synthetic data.
But the technological problems seem minimal compared to the human and societal harm this unleashed prototypical technology is causing. Mis- and disinformation, dark and synthetic data contaminating the Internet and polluting corporate networks, and the undermining of fundamental democratic values are only a few of the immediate harms we are already witnessing: major ethical disregard culminating in the trampling of fundamental human rights.
All these phenomena converged after years of "ghosting ethics", catalyzed at turbo speed by the generative AI superpowers over the past two years. Not only has Big Tech been nudging global markets and consumers into large-scale "genAI dark patterns", but the rapid and widespread enrollment of individual users, who become (voluntary or involuntary) accomplices, has also been speeding up the ethical downfall.
"Ghosting ethics" is more than just turning a blind eye to ethical considerations. Beyond basic indifference, it also derives from negligence: the failure of responsibility to protect our common future from large latent human, societal and ecological harms. And, finally, from the obnubilation of a soul lured by shallow promises of more, better and faster.
Was Kierkegaard right when he stated that "Every notable historical era will have its own Faust"?
Ghosting: a Widespread Cyber-Physical Practice
To most social media users, ghosting people is nothing new. Who has not been abruptly cut off from all communication by a person, without any explanation or excuse?
In 2015, to “ghost” somebody officially entered the Collins English Dictionary as one of the “words of 2015”. The dictionary defined the term as:
“ending a relationship by ignoring all communication from the other person.”
The original term dates back to the early 2000s. A New York Times article explained the trend, reporting how the term originated in the dating universe, where people mostly tried to escape unwanted connections by playing dead. It then slipped from dating into the business world, where companies suddenly started to ghost customers.
In 2016, ghosting was the Millennials' favorite term and practice. The same year, BBC 5 reported a baffling business case in which Martin Brent, a freelance photographer, was systematically ghosted by his insurer, Hiscox. Hiscox simply stopped returning Brent's calls and emails, playing dead to its own customer. The case ended up at the Supreme Court.
Eight years later, in 2024, a new evolution in the ghosting world occurred: from ghosting somebody to ghosting ethics.
With the explosion of large language models and generative AI, ghosting ethics seems to have developed into a popular sport. Everything hints towards a correlation, although confirming causation would require deeper research into the phenomenon.
From Ghosting People to Ghosting Ethics
Ghosting ethics is a multi-step process operating on individual, corporate and governmental levels, raising questions of individual responsibility and collective accountability.
Ethics ghosting: a three-step process
- The lowest stage of ghosting ethics might be uninformed, unintentional use or participation. At this level, ignorance of the context or the technology prevails, and the act of ghosting ethics may not be entirely intentional.
- Intentional ghosting ethics starts when a choice emerges, when notions such as benefit, profit and self-serving outcomes enter the equation. Here begins real ethics ghosting, as practiced by people, businesses and national institutions.
- The third and most unethical stage of ghosting ethics is fine-tuned, institutionalized ethics-washing. Here, ghosting ethics is part of a strategy, a goal and a purpose. Ethical principles are systematically overridden and bypassed while, paradoxically, still being invoked. It is the highest and most Machiavellian face of ghosting ethics.
This modus operandi has been applied by Big Tech leaders like Sam Altman or Elon Musk, using specifically crafted rhetoric: they benevolently warn all of humanity of its imminent extinction by the very technology they developed, thus parading as saviors from the problem they unethically created.
In this context, how is Goethe's 18th-century bestselling drama Faust, the story of the scientist who sold his soul to the devil for unlimited knowledge, an anachronistic, distorted mirror of our cyber-physical times?
More so, how do cyber-ethical ghosting behaviors nurture endemic ghosting ethics?
Fractionally Selling the Soul?
The recent BBC article "What the myth of Faust can teach us" by Benjamin Ramm delivers a quick introduction to the idea of the devil's bargain (der Teufelspakt) in modern times.
“The legend seems to have particular resonance at times of moral crisis.” Benjamin Ramm
For "ghosting ethics", however, the phenomenon is more endemic than the classical bargain Ramm describes. In Goethe's Faust, the already highly knowledgeable scientist Dr. Faust engages in a pact with the devil, selling his soul for unlimited knowledge: the insatiable quest for more.
In the digital age, the sale of the individual soul happens in fractional instances, little by little: minimal, insipid but frequent Faustian choices, starting with the ticking of apparently insignificant "Terms and Conditions" boxes.
The chain reaction of ghosting ethics flows from data to identity to soul to loss of freedom. That’s why collective ethics needs to find a strong ally in human rights, especially with the upcoming boom of neuro-tech, nano-robotics, e-health and genomics.
For Faust, the devil Mephistopheles is a visible seducer, whereas with generative AI and novel pervasive technologies the "digital devil" is mostly invisible, a ghost in the machine.
The report "Ghost in the machine – Addressing the consumer harms of generative AI" by the Norwegian Consumer Council (June 2023) is an extensive illustration of the invisible harms of generative AI. Not only does generative AI try to blend seamlessly with humans, but "most people may never realize that they are interacting with an AI-powered system."
From 'Impostor Syndrome' to Endemic Impostorship
With a 24/7/365 AI tool at hand, nearly everybody can prompt immediate magic and augmentation, thus tokenizing instant gratification.
Like Mephisto talking to Faust: “Sample every possible delight… grasp at what you want!”
Digital times are also times of rewards and instant gratification, as underscored by Ramm: "The Faust legend has thrived in secular consumer societies, particularly in a culture of instant gratification."
The paradox of the impostor meets the myth of Santa Claus. According to the French semiologist Roland Barthes, "le mythe du Père Noël" (the myth of Father Christmas) makes you believe in something you know perfectly well does not exist. The same occurs with the "Impostor Syndrome": you might believe that you are a good writer or a perfect marketer, but at the end of the day you know you are not, especially once the machine is unplugged.
Such conflicting yet coexisting knowledge and realities extend into colliding and multiplying ethical ramifications, or "ethical dissonance", as described in a previous article. Ethical dissonance is a particularly frequent cognitive phenomenon in cyber-physical times, and a powerful enabler of ghosting ethics.
From Self-Serving to Responsible Self
For decades, Zuckerberg's motto "Move fast and break things" has been a golden mantra for Silicon Valley's tech lords, without regard for collateral long-term damage known as "technical debt" and, more significantly, the recently analyzed "ethical debt".
The fact is that the general adoption of general-purpose transformer models in no way makes them more ethical or more trustworthy.
Summing up the rights does not balance out the wrongs.
Whereas Faust, as a scientist of the Enlightenment, attempts to forge a better world, the purposes of generative AI developers, marketers and users seem far more self-serving.
Faust to Gretchen: “My sweet, believe me, what’s called intellect / Is often shallowness and vanity”
Individual responsibility to counter the laissez-aller
Even though Goethe never refers explicitly to individual responsibility as understood in modern terms, he underlines that Faust chooses his own actions.
Even if guided or seduced by Mephistopheles, for Goethe the protagonist-scientist is solely responsible for his actions, and accountable for his choices. This is clearly stated when the devil reminds Faust who destroyed Gretchen, Faust’s love.
Mephistopheles: “Who was it who ruined her? I, or you?”
Similarly, we are responsible for our actions. We are responsible when buying into the frenzied hyper-marketing of generative AI tools open to everyone.
The recently launched GPT Store by OpenAI might be seen as another "move fast and break things" attempt to secure a market position.
But beyond technology and business, the intention of Big Tech might be more insidious: to break up and fragment their own accountability and responsibility and dispatch it to millions of users, thus collectively spreading their "ethical debt".
They drag users and businesses along as accomplices in a spiraling banalization of wrongful business practices, superspreading the systemic adoption of unethical tech commodities at high speed.
In effect, they build a herd immunity against accountability and responsibility for unethical and untrustworthy tech products.
This free technological self-service spurs and banalizes daily infringements and plagiarism, amplifying a care-free entitlement to permissively use, and mostly steal, content, concepts, ideas and words, thus nurturing a massive laissez-aller of ethics.
All of this is fed by ongoing and misleading narratives based on unethical consequentialist reasoning by the Big Tech elite, such as OpenAI's Sam Altman stating that it is "'impossible' to create AI tools like ChatGPT without copyrighted material", openly blaming the data victims for harming the "product innovation cycle". The same rhetoric has been used by Microsoft CEO Nadella, shaming and blaming users for Bing going rogue after long or extended chat sessions of 15 or more questions.
Both are rhetorical pirouettes.
"Da steh ich nun, ich armer Tor! Und bin so klug als wie zuvor." ("Here I stand, a poor fool, and am no wiser than before.")
Maybe a web3 version of the Faust legend could be a wake-up call to be wary of the cult of the ego, the seductions of quick fame and the celebration of power, as underlined by Ramm. But again: Nihil Novi Sub Sole…
Maybe one lesson to be learned is that ghosting anything in life, be it people or ethics, is not a sustainable choice. We need respectful relations, sustainable outlooks and authentic beliefs. By ghosting ethics, the win might be quick and the prize alluring, but the cost is definitely too high.
In any case, let's cherish Goethe's "Zwei Seelen wohnen ach in meiner Brust" ("Two souls, alas, dwell within my breast"), and it is up to us to choose and embrace them wholeheartedly, knowing that each of them completes our personal Ethos.
Contributions to "Ghosting Ethics" by Three Professionals
Edmund White - Chief Commercial Officer - MedTech, Digital Health, and Clinical Research
The rise of AI creates an insidious temptation – to “ghost” our ethics and conceal the role of these technologies in our achievements. This Faustian bargain promises great strides without struggle, but risks normalizing an impostor culture that could undermine healthcare’s foundation of trust.
Working in the field of healthcare, I believe medical institutions, when adopting AI systems, should strive to establish transparent policies and training to prevent deception: the ghosting of ethics. Achievements attributed to practitioners should fully reflect their skills, not merely an algorithm's capabilities. Leaders within these institutions must embrace integrity rather than chasing accolades and metrics at the cost of credibility.
Individual healthcare workers also carry a responsibility. The impostor syndrome cannot become endemic impostorship, where we routinely claim credit for AI-generated work as our own. Our duty of care means earning accomplishments fairly and giving proper attribution. Medical professionals can collaborate with AI constructively while retaining public trust. With ethical integrity as our guide, instead of technological masks, healthcare can hopefully progress humanely.
Sabine Haselbeck - Data Privacy Engineer - Datenschutz Auditorin - TÜV
It's possible that we have already found the "Weltenformel" (world formula) to which Goethe dedicated his whole life: algorithms able to provide answers to all our problems. But beware! It will be a Faustian bargain: almost all AI companies are like Mephistopheles, and we are the hapless, vain Faust!
Just as Faust was tempted into a sinister, unethical deal to further his career, his romantic life and his social standing, so are we faced with difficult decisions when it comes to using artificial intelligence, which steals our personality, our personal data, our ideas, art and creations.
However, let's be cautious! We know exactly how the tempted impostor ends. This transaction is going to be highly Faustian!
But remember: just like Goethe's Zauberlehrling (The Sorcerer's Apprentice), arguably the best example of an outraged impostor, our robotic brooms can easily run amok if we ghost ethics. So let's be wise in how we use them!
Christian Jäggi - President of DigitalBasel
What pact would Goethe have had Faust forge with the devil if Faust had lived today? Let’s leave the Gretchen question aside for a moment. Would the academic Faust have traded privacy and authenticity in favour of a new world that gives him access to almost unlimited knowledge through generative AI?
Perhaps Faust would have made a deal with Elon Musk. OpenAI had waved the flag of ethical principles and was founded as an open source organisation. However, the organisation then changed its legal structure to become a profit-maximising company. Musk criticised the lack of security at OpenAI and left the board in 2018. What really motivated him to do so remains hidden, as he stayed on as a shareholder and now presents himself as the father of ChatGPT. Ethical concerns or not, he benefited from an incredible increase in value through his investment of $50m in 2015. OpenAI is now valued at $80bn.
But how does the new "knowledge" come about? "Alles nur geklaut" ("It's all stolen"), 'Die Prinzen' once sang, as if they had suspected it. No trace of ethics. The large language model, with its frighteningly simple algorithm, appears to generate knowledge, but it is in fact knowledge imposture. It collects information from the Internet's sea of data and creates nice-sounding sentences with statistically thrown-together words, sometimes serving up the crudest lies. Simulated feelings make everything even stranger and ethically even more dubious. And yet: what comes out is great, sometimes brilliant. So was the (p)act worth it? We are all a bit Faustian, ghosting ethics.