- TheWhiteBox by Nacho de Gregorio
CarynAI proves people are falling in love with AI... and the dangers of it
TheTechOasis
This Week's AI Insight
Last night I talked with Sigmund Freud, the neurologist who founded psychoanalysis, dead since 1939.
We talked about people's lack of capacity to take accountability for their actions, and we even had time to agree on how right the Stoics were about life:
"Pain in life does not depend on the events that caused it, but on how you frame those events."
Of course, I wasn't talking with the real Sigmund, but an AI chatbot based on him created by CharacterAI.
"Discussing with Sigmund Freud in the style of Inge Morath" Source: Author
Honestly, it felt like a graceful discussion that I couldn't have had with anyone in my close circle of friends and family.
It was actually useful, and at times I felt I was really talking to the man.
Because we humans love to anthropomorphize stuff, even if we know for a fact that it's not human.
Naturally, plenty of startups are ready to profit from that.
The New Way of Fighting Loneliness
CharacterAI is one of the hottest new start-ups in the Generative AI industry.
But instead of creating a simple chatbot, CharacterAI has taken a different path.
They allow people to talk with chatbots that imitate the expressions, ways of talking, and even ways of thinking of various "characters," famous or not, like Steve Jobs, Elon Musk, or, as I experienced myself, Sigmund Freud.
From a marketing perspective, these tools are framed as a way to fight the loneliness pandemic that's spreading throughout the world, especially among young men.
Now, a Snapchat influencer named Caryn Marjorie has taken this to a completely new level with CarynAI, a tool that allows you to date "her" for $1/minute.
This seems like a crazy idea, but it's actually killing it.
Feeling attached to a Robot
In the first week since launch, the product already generated $100k in revenue.
And according to her own estimates, the product will soon be generating $5 million a month.
Remember, these are people paying to talk to a robot that offers an immersive experience that will make you feel you're talking to the real Caryn.
People know it's not her, yet they're not only willing to pay, but they can't wait to do so.
"A man proposing to a woman robot in the style of artist Lauri Simmons" Source: Author
This may seem crazy to you, but people going nuts over AI lovers isn't something new.
When Replika, another similar offering, closed down the erotica upgrades of its chatbots to users, people went crazy about it.
Reddit was flooded with people claiming their AI partners had been "lobotomized."
Now, just take a look at how these people refer to their AI boyfriends and girlfriends, and you soon realize something's not right.
They are treating them like real humans.
And itās no surprise to see why people view them as ideal partners.
As Time pointed out in an article, "A relationship with an AI could offer nearly all of the emotional support that a human partner does without any of the messy, complicated expectations of reciprocation."
Of course, when these concerns are raised with companies like CharacterAI, they quickly fall back on the message of "fighting loneliness," pointing out that "some people really need to feel they have someone to talk to at all times."
But this is clearly a double-edged sword.
Easing loneliness or accentuating it?
While we can accept that these tools can "help" lonely people by providing "something" that listens to them, the risk of those people becoming overly attached, to the point of love, is clear.
In that scenario, when those people feel their AI boyfriend or girlfriend is all they need, loneliness can seem nonexistent to them, when in reality they are becoming even lonelier, as what little need they had to socialize is completely absorbed by their new AI lovers.
And this problem, considering AI's growth, could become even worse with embodied intelligence.
Humanoids that feel human
With Large Language Models, no matter how conscious you are of their inhuman nature, you can't help but feel you're talking to a human.
Needless to say, they've been trained on all the knowledge on the Internet, so they've learned to imitate us extremely well.
However, they are still a chat interface and nothing more.
But what will happen when these language models get embedded into physical robots?
With examples like DeepMindās soccer robots, results in the field of AI agents are improving by orders of magnitude every year.
And with huge companies like Tesla heavily investing in humanoid robots, it's a matter of time before we reach a point of convergence.
Soon, robots will not only look like us and interact with their environment as well as we do, but they will also be capable of imitating our language and mannerisms to the extent that humanoids become an indistinguishable part of society.
Imagine a ChatGPT that looks like a human, moves like a human, and has sex like a human. A true embodied intelligence.
In that scenario, what will be the incentive for sexually repressed people to continue trying to socialize and reproduce with other humans when ChatGPT-esque sexual robots will do anything they want when they want to?
Scarily… none.
Key AI concepts you've learned by reading this piece:
- Embodied Intelligence
- CharacterAI
- The convergence of machines' language and physical capabilities
Top AI news for the week
- Apple prohibits use of ChatGPT, plans on developing its own LLM
- Father of Deep Learning asks for AI to be regulated
- Italy creates fund for AI-driven unemployment
- AI chiefs to join European elite at next secretive Bilderberg meeting
- To open-source or not to open-source, the Meta vs Microsoft approach
- French court approves AI surveillance for the upcoming Olympics
- Video of new state-of-the-art AI humanoid Phoenix released
- Google claims GenAI will be key to drug discovery