ChatGPT, simulation and mutual illusions regarding love and intersubjectivity

 

Slavoj Zizek once wrote in Abercrombie & Fitch's (in)famous 2003 Back to School catalogue: "The only successful sexual relationship occurs when the fantasies of the two partners overlap. If the man fantasizes that making love is like riding a bike and the woman wants to be penetrated by a stud, then what truly goes on while they make love is that a horse is riding a bike... with a fantasy like that, who needs a personality?"

            I have often said in the past that this idea, that we do not form attachments to real people per se but to our own internal representations of them (which are, in effect, fictional characters), is something Jung and Lacan agreed on.

Wondering about some specific examples of this mutually illusory nature of social interaction, I thought of three: the first is a real example, the second is a hypothetical scenario I made up myself, and the third is a prediction about the future of online dating and artificial intelligence.

 

EXAMPLE 1:

 

In 2007, a Bosnian couple cheated on each other… with each other. Both of them created anonymous accounts in online chatrooms in order to cheat on their spouse virtually, and eventually their two fake accounts found each other: the wife went by the username “Sweetie” and the husband by “Prince of Joy”. The husband fell in love with the “Sweetie” character, not knowing that his real wife was behind it, and the wife likewise fell in love with “Prince of Joy”, not knowing that her real husband was behind the account. If we actually fell in love with “real people” per se, then when they both found out they had been cheating on each other “with each other”, they would not have counted it as “cheating”. But instead, both of them were furious, accused each other of cheating and divorced. This shows how in human relationships (and intersubjectivity in general) we get attached to images and not realities: whether something counts as cheating has nothing to do with what happened per se, but with what each person thought happened. Since the husband did not know that “Sweetie” was his wife’s account, and since the wife did not know that “Prince of Joy” was her husband’s account, they might as well have been different people, because, and this is the important point, they were different people for each of them.

But what is even more important to notice here are their confessions. The husband said: “I still find it hard to believe that Sweetie, who wrote such wonderful things, is actually the same woman I married and who has not said a nice word to me for years.” The wife said: “I was suddenly in love. It was amazing, we seemed to be stuck in the same kind of miserable marriages. How right that turned out to be.” We can notice in both of their reports that they did not fall in love with a real person per se; they fell in love with an image in their head of what that person might be like, and that image was extremely different from the image each of them had of their “real spouse”. Neither of them could quite believe that the person they met online was the same as their real spouse: everything about them seemed so different – their personalities just straight-up changed.

 

EXAMPLE 2:

 

            It is common in some places for closeted gays and lesbians to pretend to be in a heterosexual relationship with each other, in order not to raise suspicion that they are homosexual, especially in homophobic countries. A hypothetical scenario I recently imagined is this: two straight men, who are best friends, each have a crush on one of two lesbians. They go to their lesbian crushes and lie that they are gay and in a relationship with each other, and that they want to “pretend” to be in a relationship with the lesbians in order not to raise suspicion about their homosexuality (when in reality, they just wanted to hold hands with them and so on). Later on they find out that their “lesbian” crushes were not actually lesbians and had, in fact, done the exact same thing: they were two straight women, best friends, who had crushes on the guys pretending to be gay and wanted to be with them. Thus, the guys thought they were playing 4D chess when in fact they were being played in exactly the same way.

            So, in reality, we actually had four straight people pretending to be four gay people who were pretending to be four straight people. This is a double simulation: a straight person pretending to be a gay person who is pretending to be straight.

            This recalls two recurring themes in Zizekian/Hegelian philosophy and Lacanian psychoanalysis:

1.     There is often more truth in the “illusion” or fantasy than what is behind it. The lie that they were telling the public (we are straight and we are in love with this person of the opposite sex) had more truth in it than the lie that they told each other (we are gay and are pretending to be straight) – they just weren’t aware that the lie was true.

2.     A double negation, as Zizek often says, should not be viewed strictly from the standpoint of mathematics and formal logic, where negating a predicate twice is the same as doing nothing to it (not-not-something = something). Instead, a double negation brings something new to the initial statement, something that is “technically the same but not really”: a straight person pretending to be a gay person who is pretending to be straight is not the same thing as a straight person simply not lying or pretending at all. In other words, a person who lies about their sexual orientation twice does not produce the same effect as simply not lying: I am straight, I lie that I am a gay person who is lying that he is straight, and even though I technically end up telling the truth about my sexual orientation (“I am straight”), I am still caught up in the loop of double illusions.
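The contrast above can be made concrete with a toy sketch (the function names and the way pretense is modeled are my own illustrative assumptions, not anything from logic textbooks): in Boolean logic, double negation collapses back to the original value, but if we model a person as a surface claim plus a history of pretenses, two pretenses can restore the original claim while leaving a record that the pretending ever happened.

```python
def formal_double_negation(claim: bool) -> bool:
    """In formal logic, negating twice returns the original value."""
    return not (not claim)


def layered_pretense(actual: str, layers: list[str]) -> tuple[str, list[str]]:
    """Toy model of a person as (surface claim, history of pretenses).

    Each pretense replaces the surface claim but is also appended to the
    history, so two pretenses can restore the original claim without
    erasing the fact that any pretending took place.
    """
    surface = actual
    history = []
    for pretended_identity in layers:
        history.append(pretended_identity)
        surface = pretended_identity
    return surface, history


# Formal logic: not-not collapses to identity.
assert formal_double_negation(True) is True

# The essay's scenario: straight -> pretends gay -> pretends straight.
surface, history = layered_pretense("straight", ["gay", "straight"])
assert surface == "straight"            # technically the original claim...
assert history == ["gay", "straight"]   # ...but the loop of illusions remains
```

The point of the sketch is only that the two final states differ: the surface claims match, but one carries a non-empty history of pretense and the other does not.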

 

EXAMPLE 3:

 

We can already see how far the AI bot known as “ChatGPT” has evolved: it can write essays, solve homework, help with research and even write code. In fact, just today my college professor in our object-oriented programming course told us that he gave ChatGPT our C++ exam problems and it scored 8/10.

This makes us wonder – how far can technology go when it comes to online seduction? What if ChatGPT, or a similar bot, can give the impression of such a realistic social interaction that you could never trust any written exchange with a stranger unless you saw them on webcam? We know this is already possible, but only for short-term social interaction; what if these bots evolve enough to sustain online friendships for months or years on end, remembering every detail of past conversations?

This is why I predict that in 5-10 years, it will become impossible to hit on people by chatting them up on Tinder, Facebook, Instagram or any social media that functions by text. The fear is not that the person you are talking to is a bot – the fear will be that they are a human using a bot. Imagine this: I see a hot girl on Instagram, I feed ChatGPT a few personal details about myself and about her, and then I tell ChatGPT to go seduce her. Then I, the real human, simply watch the conversation unfold before my eyes while ChatGPT does all the work.

But what if she does the same? What if she is also a human using a bot? She sees me and likewise uses ChatGPT to respond to my messages.
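The loop described above can be sketched as a toy program (everything here is hypothetical: `scripted_reply` is a stand-in for a real language-model API call, which this sketch deliberately does not make): two bots pass each other's messages back and forth while the two humans only watch.

```python
def scripted_reply(persona: str, incoming: str) -> str:
    """Placeholder 'model': returns a canned charming line per persona.

    In the imagined scenario this would be a call to ChatGPT or a similar
    bot; here it is stubbed so the sketch is self-contained.
    """
    return f"{persona}: that's so interesting, tell me more about '{incoming}'"


def simulate_conversation(turns: int) -> list[str]:
    """Alternate two bots, each replying to the other's last message."""
    transcript = []
    message = "hi there"
    speakers = ["his bot", "her bot"]
    for turn in range(turns):
        speaker = speakers[turn % 2]
        message = scripted_reply(speaker, message)
        transcript.append(message)
    return transcript


transcript = simulate_conversation(4)
# Every line of the "romance" was machine-written; neither human typed a word.
assert all(line.startswith(("his bot", "her bot")) for line in transcript)
```

The structural point is that the two humans sit entirely outside the loop: the transcript is generated bot-to-bot, with each human believing only the other side is automated.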

I will be behind my screen, watching the conversation unfold, and thinking: “Hah, she actually fell for it! She’s actually texting me while I’m letting the AI do all the work!”. And she will be behind her own screen, watching the conversation unfold and thinking: “Hah, he actually fell for it! He’s actually texting me while I’m letting the AI do all the work!”.

But, behind this illusion, what if we both fall in love, and I actually fall in love with the way “she” texts me, not knowing that it’s a robot talking to “me”, and she falls in love with the way “I” text her, not knowing that it’s a robot talking to “her”? And we both fall in love with “each other” without ever exchanging a word, when it’s just two robots doing all the talking? The future is now. And the more terrifying question is – how much of this hypothetical scenario is simply the more upfront and “extreme” version of what we were already doing anyway in real life?

 
