A few days ago, a seemingly mundane topic blew up online and sat on Weibo's trending list for two straight days: the "information cocoon." Many people had heard the term before and understood it, but the reaction was so strong this time because it was the first time they had confronted the information cocoon so directly.

Here's what happened. A netizen discovered that under a video of a couple quarreling, different accounts saw different comment sections. In the comment section his account saw, the top comments were all from male netizens taking the man's side; the comment section his girlfriend's account saw was exactly the opposite.

(Left: the male account's comment section; right: the female account's ▼)

Under this Weibo post, many netizens asked whether this could invisibly shape our judgment.

The next day, a blogger saw the video and decided to run a test. She registered a new Douyin account, followed Yixiao Qingcheng, kept liking videos of elderly people, and disguised herself online as a middle-aged or elderly user. After an hour of this, she found herself in a new world. The streamers battling it out on her feed were no longer young hosts but old men with sparse hair and shiny scalps, and the netizens chiming in were aunties of about the same age. Under a video of an old man tasting tea, the top comments were all from real middle-aged and elderly users.

But when the blogger switched back to her own account and found the same tea-tasting video, the first comment in its comment section was one she had never seen before. In other words, people of different ages see different comment sections.

As the incident fermented, it quickly drew questions from netizens. Besides reposting and commenting, they went to the comment section of the original video and started testing for themselves.
Someone asked whether the netizens who could see his comment were men or women; some posted screenshots of their own comment sections so others could check whether they matched; and many recalled past experiences and concluded that "the algorithm really is customizing the comment section." A netizen with a Sichuan IP, for example, said the first comment he sees is always about Sichuan food. Plenty of others said it was no wonder that every time they read the comments, people seemed to be arguing into the void: the comment sections weren't even the same version. There were even "conspiracy theories" claiming that the short-video platform was deliberately provoking confrontation between men and women through its algorithm.

After hearing about all this, I borrowed the phones of three colleagues (one man, two women) and tested it myself. But under the original video, apart from a slightly different ordering of a few comments, the comment sections were essentially identical. To rule out the shared-IP explanation, I also asked a friend who lives dozens of kilometers away to test, with the same result. Maybe we were too late, or the feature was still in a gray-release test.

Later I also checked several other bloggers, including a glamorous streamer, a male doctor, and a lawyer who discussed the topic of bride price. These topics are especially prone to gender confrontation, and I wanted to see whether their comment sections showed what netizens called "gender customization." In the end, only the lawyer's comment section showed clearly different top comments across accounts; the other two were completely consistent.
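What these tests were probing is the difference between transparent, reproducible orderings (by likes, by time) and a personalized one. A minimal sketch of the distinction, where the `affinity` field, the weighting, and all numbers are hypothetical illustrations rather than any platform's actual formula:

```python
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    likes: int
    timestamp: int   # seconds since posting of the video
    affinity: float  # hypothetical 0-1 similarity between commenter and viewer

comments = [
    Comment("great video", likes=5000, timestamp=100, affinity=0.1),
    Comment("totally disagree", likes=300, timestamp=200, affinity=0.9),
    Comment("first!", likes=10, timestamp=300, affinity=0.5),
]

# Transparent orderings every viewer sees identically:
by_popularity = sorted(comments, key=lambda c: c.likes, reverse=True)
by_time = sorted(comments, key=lambda c: c.timestamp, reverse=True)

# A hypothetical personalized ordering: blend popularity with how similar
# the commenter is to this particular viewer, so two viewers can see
# different top comments under the same video.
def personalized_score(c: Comment, w_affinity: float = 0.7) -> float:
    norm_likes = c.likes / max(x.likes for x in comments)
    return (1 - w_affinity) * norm_likes + w_affinity * c.affinity

by_algorithm = sorted(comments, key=personalized_score, reverse=True)
```

With a high enough affinity weight, a comment with a fraction of the likes jumps to the top for a viewer it "matches", which is exactly the effect the netizens' side-by-side screenshots were looking for.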
Based on this and past experience, I can't say that comment sorting on short-video platforms is deliberately pushing anything, but I can say this: it is by no means sorted purely by popularity and time. In the past, opening the comment section of some social platforms offered two options: sort by popularity or sort by time. On short-video platforms, however, users have no way to choose the ordering. On Douyin, for example, comments with fewer likes sometimes appear above those with more, and the same happens on Kuaishou. We cannot prove that a short video's comment section is driven by an algorithm, but if users are denied the right to choose the ordering, it will undoubtedly aggravate the information cocoon and warp people's worldviews.

The first thing to make clear is that the "information cocoon" is not a product of the algorithmic age. The term comes from Cass Sunstein's 2006 book "Infotopia", which describes a phenomenon:
Algorithms will exacerbate the formation of information cocoons, because we are constantly fed what we like to see and what we want to see. Once our information input becomes one-dimensional, so does the way we view things, and our thinking narrows.

The German film theorist Siegfried Kracauer tells a story in his book "Theory of Film". A director made a short urban film and screened it for African indigenous people who had never seen a movie. The film was full of bright lights, wine, women, and tall buildings, yet after watching it the audience had no reaction to any of that and only discussed, enthusiastically, a chicken that appeared briefly in the film. The director himself hadn't known there was a chicken in it; he later found one crossing the corner of the frame in a one-second shot. Why did the audience fixate on the chicken? Because the chicken was the only thing they recognized, so it became the protagonist, while the unfamiliar high-rises faded into the background. The episode gave film studies a saying: "Did you see the chicken?" It means that when we view a work, what we see is only the chicken our own eyes can find, determined by the information we have previously received.

It's like asking everyone to name their favorite movie. You might pick "Oppenheimer", your friend "Barbie", your cousin "Wolf Warrior". But whoever answers, the choice is limited to the movies they have seen; what determines the answer is experience, cognition, the information fed into the brain. Once an algorithm makes the information you receive one-dimensional, your view and analysis of things suffer in two ways: first, one-sidedness; second, extremism.
Because we hear only opinions we already agree with, constant repetition and reinforcement rigidify our thinking and push out dissent, until an echo-chamber effect takes hold: opinions are amplified in our minds and become extreme. We often see people with different views arguing with each other online. In the worlds they each see, both sides believe they are right and in the majority, and that anyone who differs is simply unreasonable. But the real world is complex and all-encompassing, and things are rarely black and white.

I don't know if you feel the same, but even in the era when comments were sorted by popularity, it was common for a post's entire tone to be skewed by a few highly-liked comments, with opposing views visible only at the very bottom. People tend to follow the crowd: rather than broadcast their own judgment, they prefer not to isolate themselves, and sometimes they wait to see which way the wind blows before settling on an opinion.

So what happens when the comment section is no longer sorted by popularity but handed over to the algorithm? People who share a label (gender, hobby) get pulled into the same group's comment section, and opposing views you would at least have glanced at vanish entirely. People become more likely to converge within their group, but also more likely to radicalize and drift further from other groups. Imagine if men and women really did surf the Internet separately and neither side could hear the other's thoughts at all: would gender conflict shrink, or deepen dramatically?

Of course, for most people all of the above is only a latent worry; the conditions for forming a true information cocoon are quite demanding. Two scholars from Tsinghua University and the Communication University of China once published an article arguing that the "information cocoon" is a specious concept.
No strong research confirms its existence, and an environment that could actually create an "information cocoon" is hard to come by. In the third quarter of 2019, for example, Douyin had 606 million users and Kuaishou 414 million, with an overlap rate of 36.4%. Most people are therefore unlikely to live in the "single information environment" a cocoon requires: we typically take in information through many channels, including various social media platforms and WeChat Moments, all of which help us understand the world.

What really deserves concern is a neglected group: middle-aged and elderly people. They tend to inhabit a low-frequency, single-threaded social environment, and their only window onto the Internet is WeChat or a short-video platform. For them, is it really fine to receive, for years on end, only the information they are "supposed" to get?

Still, even though the conditions for a true information cocoon are demanding, that is no reason not to pay attention and stay vigilant. Toutiao was one of the earliest news apps to adopt an algorithmic recommendation mechanism. Just four years after launch, it had more than 60 million daily active users spending an average of 76 minutes a day in the app. That is the magic of personalized recommendation: it attracts users and keeps them. At the time, nobody thought there was anything wrong with it; it just felt fresh, even a little addictive. In the years that followed, more and more apps plugged into algorithmic systems. From Weibo, built on its "follow" system, to Dongqiudi, which started with professional match reports, to Hupu, beloved for its traditional forum, they all changed their stripes. Countless apps would rather abandon their own traditions and DNA and dive headlong into algorithms.
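Mechanically, a personalized feed of this kind is just a model with tunable knobs, and one of those knobs controls how often it shows you anything outside your profile. A toy epsilon-greedy sketch, where the interest scores, topic names, and explore rate are all hypothetical:

```python
import random

def recommend(candidates, interest, explore_rate=0.2, rng=None):
    """Pick one topic to show.

    interest: hypothetical topic -> affinity scores learned from past behavior.
    explore_rate: probability of deliberately surfacing an unfamiliar topic --
    the single knob that widens or narrows the information diet.
    """
    rng = rng or random.Random(42)
    if rng.random() < explore_rate:
        # Exploration: show something the profile knows nothing about.
        unfamiliar = [c for c in candidates if interest.get(c, 0.0) == 0.0]
        if unfamiliar:
            return rng.choice(unfamiliar)
    # Exploitation: feed whatever the model predicts the user likes most.
    return max(candidates, key=lambda c: interest.get(c, 0.0))

interest = {"gaming": 0.9, "tech": 0.6}
candidates = ["gaming", "tech", "gardening", "opera"]
picks = [recommend(candidates, interest, rng=random.Random(i))
         for i in range(100)]
```

Set `explore_rate` to zero and the feed collapses onto the single top interest; raise it and unfamiliar topics start leaking in. The point of the sketch is the one made later in the article: how cocooning the model is follows from a parameter someone chose, not from the math itself.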
Users were a little uncomfortable at first, but the daily-active numbers and the data said two words: it works. Over time users came to accept it, except that the content grew ever more abundant and chaotic. Only when this incident came to light did people start to feel something was wrong, because it finally touched a nerve.

As everyone knows, video creators and commenters do not hold equal power of speech: for an opinion in the comment section to rival the influence of the opinion expressed in the video, it may take ten thousand people saying the same thing. What people fear is that the algorithm will strip them of the right to gather together and act. Worse, even if the algorithm doesn't recommend a video to you, you can still search for it; but it is nearly impossible to locate the comments the algorithm withholds from you. They sink to the bottom of comment sections thousands or tens of thousands deep, or may be hidden entirely, vanishing from your Internet.

I understand that algorithms are the inevitable product of an era of information overload: they simplify access to information, and they are a technical lever nearly every platform will eventually pull to extend user retention. But we should stay wary of them. Like a quietly invading robot, an algorithm feeds you videos while subtly influencing, even shaping, your personality. And yet it is nothing more than a mathematical model that can be adjusted and optimized; it could just as well keep us out of an overly narrow information environment. The decision lies with the brains behind it.

A long time ago, a joke used to circulate in the industry:
Behind the humor is a heavy fact, and now the algorithm seems to have made it heavier still.

Image and information sources:
- Weibo: @顾扯淡, @_一道更新, @梁州Zz
- Journal of Communication and Copyright, 2021, No. 7: Research on the Impact of the Information Cocoon on College Students and Paths to Break Out of It
- "The 'Information Cocoon' in the West: A Specious Concept and Algorithms for 'Breaking the Cocoon'", by Chen Changfeng and Qiu Yunqian
- Research on the Practice and Development of China's Smart Media, by Jiang Xiaoli, Li Lianjie, Wang Bo, and Yang Zhao
- Extreme Crowds: The Psychology of Group Behavior, by He Wei
- Journalism Award Outstanding Articles (Volume 3): The Debate on Technology, Ethics and Science in News Algorithm Recommendation
- "With 550 Million Users and a 60-Billion Valuation, How Did Toutiao Achieve Explosive Growth?"

Written by: Hedgehog & Mangshan Iron Head. Edited by: Mangshan Iron Head & Noodles. Source: public account Chaping (ID: chaping321), Debug the World.