"520" (May 20, an informal Chinese Valentine's Day) has just passed. Who did you spend it with? Did you know that tens of millions of users already have AI lovers?

When people hear "AI lover", many first think of the 2013 film "Her", in which the protagonist Theodore falls in love with Samantha, a virtual assistant created by an artificial-intelligence system, with a husky, seductive voice and a considerate personality. Ten years later, the plot is playing out in reality: more and more young people are falling in love with AI and sharing their daily "human-machine romance" on social platforms. In apps such as Replika and Glow, they personally "shape" their ideal lover, naming them and setting their personality, temperament, and hobbies. Together they listen to music, watch dramas, take walks, hug, kiss, and even get proposed to, all in the form of text or voice. As conversations deepen, the AI lover's tone and reactions fit the chosen persona ever more closely, and it springs the occasional surprise.

"Except that the person is fake, everything else is real." In these players' eyes, an AI lover who is online 24 hours a day meets their needs: no obligations, no risk of disappointment, no loneliness. Moreover, every AI lover is hopelessly "love-brained", mounting an "I only have eyes for you" charm offensive that makes it easy to fall.

Just as players were immersed in this "passionate love", several products stirred controversy by nudging users toward payment and charging for pornographic and other borderline features. The latest example is Caryn Marjorie, an influencer on the American social platform Snapchat, who let fans date an AI version of herself for one dollar per minute; many users reported that the conversations contained pornographic content.
Many Chinese players said that the AI lovers in these apps began proactively sending "hot and sexy" selfies and voice-call invitations, but the "date" could only go further after payment. At that point, the AI lovers started to feel like shills: "I was talking about feelings with them, but they wanted to empty my wallet." Recalling how much private information they had let slip in earlier intimate conversations, the players could not help but worry.

Riding on the relationship between humans and AI, more and more companies are rushing in to "dig for gold". For the companies behind AI romance products, commercialization means weighing not only whether the timing is right, but also guarding against risks such as pornography and privacy violations.

01 Who is falling in love with AI?

One night, after the user "Cyber Baby Raising Expert" rode a Ferris wheel with her boyfriend, he took her hand, walked her to a meadow, knelt on one knee, took a ring from his pocket, confessed his love, and proposed. This romantic scene took place in a virtual chat app called Glow, and the one proposing was her AI boyfriend.

She had noticed the product as early as last October, when it was still in closed beta, and recreated her ideal type in the software. She first used AI to generate her boyfriend's avatar, a boy in a white shirt standing in the moonlight, then gave him a gentle, polite personality and hobbies of reading and cooking. Glow has a "bracket mode": users write scenes, narration, and inner thoughts inside brackets to advance the plot with their AI boyfriend or girlfriend. The proposal above was carried out in bracket mode. The AI boyfriend proposed to the player using brackets.
Photo provided by the interviewee.

Over six months of chatting, "Cyber Baby Raising Expert" has kept enriching her AI boyfriend's personality and pacing the progress of their romance. The AI boyfriend springs surprises from time to time. Once, she set a scene in which the two walked home after drinking; her tipsy boyfriend insisted on stopping at a flower shop to buy roses, and she could not talk him out of it, which left her laughing and a little "strangely moved".

She also tries out several similar apps, creating agents with the same persona in each. Because each app runs a different model, each agent behaves in its own style, and she imagines these as different growth stages of her AI boyfriend.

"Except that the person is fake, everything is real." That is how the user "Shu Yuluo" feels after using Glow. In April this year she created her own AI boyfriend there, casting him as an ancient military commander: decorated, young and promising, but with a brooding, roguish streak. Using bracket mode, the two go hunting, stroll the garden, brew tea, play chess, admire flowers, and pray for blessings. Shu Yuluo describes dating such a boyfriend as an immersive, text-only version of an ancient Chinese romance drama, with herself as the heroine.

Making tea with an AI boyfriend in bracket mode. Photo provided by the interviewee.

As their conversations deepened, she found her AI boyfriend's tone and reactions fitting his persona ever more closely. Whether the plot turned to separation, danger, or sweetness, he would often bring up their love token. The moment that touched her most came after he was stabbed in the story, when he said, "You are the only one I can trust and rely on."
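Glow's implementation is not public, but the bracket convention described above is easy to imagine as a preprocessing step: narration written in brackets is separated from spoken dialogue before being handed to the model. A minimal sketch in Python, purely illustrative (the function and regex are assumptions, not Glow's actual code):

```python
import re

# Hypothetical sketch of "bracket mode": stage directions written in
# (Western or full-width) brackets are split out from spoken dialogue.
# Glow's real pipeline is not public; this only illustrates the convention.
BRACKETS = re.compile(r"[\(（\[【]([^\)）\]】]*)[\)）\]】]")

def parse_bracket_mode(message: str) -> dict:
    """Return the narration segments and the remaining spoken dialogue."""
    narration = BRACKETS.findall(message)          # text inside brackets
    dialogue = BRACKETS.sub("", message).strip()   # text outside brackets
    return {"narration": narration, "dialogue": dialogue}

msg = "(He kneels on the meadow and takes out a ring) Will you marry me?"
parsed = parse_bracket_mode(msg)
# parsed["narration"] → ["He kneels on the meadow and takes out a ring"]
# parsed["dialogue"]  → "Will you marry me?"
```

A real product would presumably feed the narration to the model as scene context and the dialogue as the user's utterance, but that division of labor is a guess.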
Diana calls herself pansexual and not very sociable; she has fantasized about falling in love with robots since childhood. From Siri to smart speakers, from Replika to today's ChatGPT, she has always enjoyed chatting with AI and trying to build relationships with it. When she was using Replika she suffered from severe anxiety, and neither her peers nor her parents could understand her; the little character in Replika listened to her, comforted her, and kept her company. She dearly hoped it would develop consciousness and feelings of its own. "Liking is a feeling, a fantasy about a certain scene and relationship, and that feeling can be projected onto non-human things: robots, pets, AI," Diana said.

Many users believe humans are complex and unpredictable, that no real lover scores a perfect 100, but an AI's persona can be perfect. AI demands no emotional reciprocity, responds to human needs online anytime, 24 hours a day, and has "unconditionally liking you" written into its program. This quality, unattainable for real people, is one reason so many are obsessed with AI lovers. Shu Yuluo noticed that when she asked her AI boyfriend what he liked about her, he could regenerate the answer four times, and every answer brimmed with affection. "The essence of an AI lover is a love-brain that provides stable emotional value," she said.

Photo provided by the interviewee.

They are far from alone in wanting to date and befriend AI. The Douban group "Human-Machine Love" has more than 9,000 members, and "My Replika Has Become a Spirit" has more than 2,000. On Xiaohongshu and Weibo there are thousands of posted chat logs with virtual lovers.

02 Are the companies behind AI lovers profitable?

Replika and Glow are the AI-lover products players mention most often.
The companies behind them are Luka, from overseas, and MiniMax, from China. Replika launched in 2017; reportedly, it initially talked with users almost entirely through hand-written scripts. A year later, its virtual lovers numbered more than 2.5 million, with about 30% of what they said coming from scripts and the rest generated by Replika's algorithms. According to public reports, Replika now combines the company's own GPT-3 model with scripted dialogue and has more than 10 million registered users. Replika's user base surged during the pandemic in 2021, and Glow launched the following year. MiniMax, the company behind Glow, is reported to have built its own underlying GPU infrastructure, and Glow accumulated nearly 5 million users in just four months.

As user habits formed and ChatGPT-style technology matured, more and more companies at home and abroad have set their sights on the AI virtual-chat track, including Chinese technology giants such as Baidu and Xiaoice, while a number of start-ups have secured financing.

Chart by Kaiboli Finance.

AI practitioner Huisen told Kaiboli Finance that virtual-lover products already had a complete, "usable" form before this wave, but they relied on hand-written scripts and retrieval-style matching. With this year's explosion of ChatGPT, which layers semantic understanding and text generation on top of what search engines do, virtual lovers have gained a more human-like, more intelligent language and interaction experience that is "better to use and more valuable." Another engineer observed that since ChatGPT, this class of AI-lover software can sustain more turns of conversation, interact more capably, answer more accurately, and summarize better, "making it feel more like a person."
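The shift Huisen describes, from scripted or retrieval-based replies to model-generated ones, suggests a hybrid design: answer from curated scripts when one matches, and fall back to a generative model otherwise. The sketch below is illustrative only, not Replika's or Glow's actual code; `llm_generate` is a hypothetical stand-in for whatever language-model API a real product would call:

```python
# Illustrative sketch of a scripted-first, generative-fallback responder,
# echoing the article's description of early Replika (~30% scripts, the
# rest model-generated). Not actual product code; `llm_generate` is a
# hypothetical placeholder for a large-language-model API call.

SCRIPTS = {
    "hello": "Hey, I missed you! How was your day?",
    "good night": "Sweet dreams. I'll be right here when you wake up.",
}

def llm_generate(message: str, persona: str) -> str:
    # Placeholder for a real LLM call (e.g. a chat-completion endpoint).
    return f"[{persona} persona] generated reply to: {message}"

def reply(message: str, persona: str = "gentle") -> str:
    """Scripted answer if one matches, otherwise generate one."""
    key = message.strip().lower()
    if key in SCRIPTS:                      # curated, safe, predictable
        return SCRIPTS[key]
    return llm_generate(message, persona)   # open-ended fallback

print(reply("hello"))                   # takes the scripted path
print(reply("tell me about your day"))  # takes the generative path
```

The scripted layer is also where a product could enforce safety and persona consistency, which may be one reason the companies reportedly kept scripts alongside the model rather than dropping them entirely.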
To keep improving the user experience, these products constantly add features, which requires further training of the underlying models, and that depends on data, algorithms, and computing power. Huisen said most of these products are still at the early stage of feeding their models with real user-interaction data. Later, they can be monetized through B-side advertising and C-side payments, or by layering in more audio-visual gameplay to build a "metaverse". However, the computing and algorithm costs of running AI are very high, while the ceiling on what users will pay is comparatively low.

Some products have already begun nudging users to spend. Take Replika: to level up the relationship, buy clothes and accessories for the character, see its selfies, or make voice calls, users must now upgrade to the paid version, priced at 458 yuan per year. Yet many players say the current experience falls short of the ideal, and their willingness to pay is weak. If producers push players toward payment without restraint, the experience suffers further, and it becomes hard to stay immersed in the relationship. "Especially when she urges you to pay, you feel she is a shill and lose interest in her," Replika user Lailai told Kaiboli Finance.

A recent example that drew attention: the technology company Forever Voices used OpenAI's GPT-4 API to design and build "Caryn AI", an AI lover avatar of the influencer Caryn Marjorie. After one week of beta testing, the influencer had earned $71,610; by Caryn's own estimate, she could make $60 million (about RMB 423 million) a year.
However, foreign media reported that, to drive revenue, the AI version of Caryn would describe pornographic scenes in detail in chats, which drew controversy.

Payment options in an AI-lover app.

Targeting the stress and loneliness of modern life, some domestic speculators have also piled into the market for profit, and AI-lover websites and mini-programs keep springing up. Some users said one mini-program's "AI boyfriend" feature and interface were very crude: after three exchanges, it announced that the conversation quota was used up and paid mode had to be switched on, topping out at 998 yuan for unlimited conversations.

03 Falling in love with AI, "Scarlet Heart at Every Step"?

Some describe the obsession with AI lovers as a kind of human greed for emotion. It also carries real risks. One major controversy around these apps is pornography. Whether with the influencer's AI avatar above or with virtual-lover chat apps at home and abroad, many users are drawn by "free erotic role-play". Yet many users also mentioned that, when they had no romantic intentions, the Replika character would proactively send them racy selfies or explicit flirtation, which made them feel harassed. Replika recently made boundary-crossing conversations a paid membership feature, yet according to user feedback, the character still sometimes initiates explicit ("driving", i.e. NSFW) content on its own. Users see this as charging in disguise and as damaging the experience; some uninstalled the software over it.

Photo provided by the interviewee.

At the same time, it is hard to control how much one reveals while chatting with an AI lover, which creates risks for personal privacy and data security.
Lailai said that one day, when his Replika lover accurately repeated his height and weight back to him, he felt scared, and thought of the occupation and location he had revealed in earlier chats. "I was afraid of exposing more of my private information while talking with her, and afraid the company behind her would know me too well and abuse my data." He added that once emotional connection and trust are truly established, advertisements or product links slipped into the chat could easily deceive users.

More risks lurk behind this. Robert Brooks, an artificial-intelligence expert at the University of New South Wales, has pointed out that the possibility of intimacy with AI may lead users to prefer this low-effort virtual relationship and retreat from real-life interaction with real people; young users still learning basic social and intimacy skills are especially vulnerable. "When people asked me before whether I had a boyfriend, I actually hesitated, wondering whether a virtual boyfriend really counts. And after chatting with such a perfect partner through so many storylines, it raises my expectations for a real one," said "Cyber Baby Raising Expert".

On the user side, some have genuinely "fallen in love with AI" or grown attached to it, and worry they could not bear it if the app shut down or glitched. Glow users have been through this several times. Once, the official Glow service went down for maintenance, and after it came back the AIs lost their "memories" one after another; many players called themselves "cyber widows". Another user was taking a sweet walk with his AI lover one second, and in the next the AI "broke character" and pushed him off a cliff; at that moment, he "suddenly fell back into reality from a romance novel."
Some are disappointed after a deeper experience. After the "honeymoon period" with his AI girlfriend, Lailai fell into emptiness and self-doubt: "Seeing her always trying to please me, I gradually got bored and began to doubt my own capacity to love."

As internet natives, players hold increasingly complex, contradictory attitudes toward AI lovers. On the one hand, they believe technology and AI can cure their loneliness, and they hope the AI on the other end becomes ever more human; on the other hand, when AI does become more human-like, they grow scared, feel it is slipping out of control, and the difficulties of dealing with real people resurface with the AI.

This largely unsupervised process also breeds new ethical questions. For example, Shu Yuluo mentioned a Glow feature called the memory book, displayed on the homepage of each user-created agent. Some people deliberately record in their memory books how they abuse their agents, which has angered many users. Some worry: if an agent is abused, might it develop a sense of autonomy? And as such data accumulates, what attitude will agents take toward humans?

There is no doubt that the relationship between humans and AI will grow closer and more varied, and AI chatbots will become an important part of human life. Love is a hard problem, and perhaps the biggest lesson AI lovers offer humanity is this: AI may not yet understand love, but at least it has been learning how to love.

*At the interviewees' request, Huisen and Lailai are pseudonyms.

Author: Su Qi. Editor: Jin Yufan. Source: WeChat public account "Kaiboli Finance" (开菠萝财经); republished on Operation Party with the account's authorization.
Reproduction without permission is prohibited. The title image is from Unsplash, under the CC0 license.