Facebook whistleblower: Like and Share features bring stress and anxiety to young users

Tencent Technology News, October 26: Likes and shares helped make Facebook the social media giant it is today. Now, the company’s own documents show it is grappling with their effects.
In 2019, Facebook researchers began a new study of one of the platform’s most basic features: the Like button. According to company documents, they examined what people would do if Facebook removed the prominent thumbs-up icon and other emoji reactions from posts on Instagram, its photo-sharing app. They found that the buttons sometimes caused Instagram’s youngest users “stress and anxiety,” especially when a post did not get enough likes from friends.
But the researchers also found that when the Like button was hidden, users interacted less with posts and ads. At the same time, hiding it did not alleviate teenagers’ social anxiety, and young users did not share more photos as Facebook had hoped. The results were mixed.
According to the documents, Facebook CEO Mark Zuckerberg and other executives discussed hiding the Like button for more Instagram users. In the end, a larger test was rolled out only in a limited form, to “build a positive press narrative” around Instagram.
The research on the Like button is one example of how Facebook has questioned the basic features of its social network. As the company confronts crisis after crisis over misinformation, privacy and hate speech, a central question is whether the platform’s basic way of working is at fault; in essence, whether the features that make Facebook what it is are the problem. Beyond the Like button, Facebook has also scrutinized the Share button, which lets users instantly spread content posted by others; the Groups feature, used to form digital communities; and other tools that define the online behavior and interactions of more than 3.5 billion people. The research, laid out in thousands of pages of internal documents, underscores how the company has repeatedly grappled with what it created.
What the researchers found was often far from positive. Time and again, they determined that people misused key features, or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics,” the basics of how the product functioned, that had let misinformation and hate speech flourish on the site. “The mechanics of our platform are not neutral,” they concluded. “We also have compelling evidence that our core product mechanics, such as virality, recommendations and optimizing for engagement, are a significant part of why these types of speech flourish on the platform. If integrity takes a hands-off stance for these problems, whether for technical (precision) or philosophical reasons, then the net result is that Facebook, taken as a whole, will be actively (if not necessarily consciously) promoting these types of activities. The mechanics of our platform are not neutral.”
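To make “optimizing for engagement” concrete: ranking systems of this general kind score each candidate post by its predicted engagement and surface the highest scorers, so content that provokes strong reactions rises regardless of its accuracy. The sketch below is a minimal illustration of that idea only, not Facebook’s actual system; the post fields and weights are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float     # hypothetical model outputs, invented
    predicted_shares: float    # for this illustration
    predicted_comments: float

def engagement_score(post: Post) -> float:
    # Illustrative weights only: shares and comments count for more
    # than likes because they spread content further.
    return (1.0 * post.predicted_likes
            + 5.0 * post.predicted_shares
            + 3.0 * post.predicted_comments)

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Pure engagement ranking: nothing here asks whether a post is
    # accurate or harmful, only how strongly people will react to it.
    return sorted(candidates, key=engagement_score, reverse=True)
```

Because the score rewards predicted reactions alone, a false but outrage-inducing post can outrank an accurate but dull one; that is one sense in which such mechanics are “not neutral.”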
The documents, which include slides, internal discussion threads, charts, memos and presentations, do not show what Facebook did after receiving the findings. In recent years, the company has changed some features, making it easier for people to hide posts they do not want to see and turning off recommendations of political groups to reduce the spread of misinformation. But the core way Facebook operates, a network where information can spread rapidly and where people can accumulate friends, followers and likes, has ultimately remained largely unchanged. Some current and former executives say many major changes to Facebook were blocked in the service of growth and keeping users engaged. Facebook is currently valued at more than $900 billion.
Brian Boland, a Facebook vice president who left last year, said: “As an employee, you can have fairly open conversations inside Facebook. Actually getting changes made can be much harder.”
The company documents are part of “The Facebook Papers,” disclosures provided to the Securities and Exchange Commission and to the U.S. Congress by the whistleblower Frances Haugen, a former Facebook employee, and her legal counsel. Ms. Haugen had earlier given the documents to The Wall Street Journal. This month, a congressional staff member supplied the redacted disclosures to more than a dozen news organizations.
In a statement, Facebook spokesman Andy Stone criticized articles based on the documents, saying they rested on a “false premise.” “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie,” he said. Stone said Facebook had invested $13 billion and hired more than 40,000 people to keep people safe, adding that the company has called for “updated regulations where democratic governments set industry standards to which we can all adhere.”
Zuckerberg said in a post this month that it was “deeply illogical” to suggest the company prioritizes harmful content, because Facebook’s advertisers do not want to buy ads on a platform that spreads hate and misinformation. “At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he said.
The foundation of success

When Zuckerberg founded Facebook in a Harvard dormitory 17 years ago, the site’s mission was to connect people on campus and sort them into digital groups by shared interests and locations. Growth exploded in 2006 when Facebook introduced the News Feed, a central stream of photos, videos and status updates posted by users’ friends. In 2009, Facebook added the Like button. The little thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s defining features. Facebook also allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.
That gave Facebook insight into people’s activities and sentiments outside its own site, so it could better target them with advertising. Likes also signified what users wanted to see more of in their News Feeds, so people would spend more time on Facebook.
Facebook also added Groups, where people could join private channels to talk about specific interests, and Pages, which let businesses and celebrities amass large followings and broadcast messages to those fans.
Another innovation was the Share button, which people used to instantly repost photos, videos and messages from others to their own News Feed or elsewhere. An automated recommendation system also suggested new groups, friends or Pages based on people’s previous online behavior.
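As a rough illustration of how behavior-based recommendation works in general (a generic sketch, not Facebook’s system; the users, groups and scoring rule are invented), such systems often suggest items that co-occur with what a user already belongs to:

```python
from collections import Counter

# Hypothetical membership data, invented for the example.
memberships = {
    "alice": {"gardening", "baking"},
    "bob":   {"gardening", "hiking"},
    "carol": {"gardening", "hiking", "conspiracy_corner"},
}

def recommend_groups(user: str, k: int = 3) -> list[str]:
    """Suggest groups joined by users who share a group with `user`."""
    mine = memberships[user]
    votes: Counter[str] = Counter()
    for other, theirs in memberships.items():
        if other == user or not (mine & theirs):
            continue  # only consider users who overlap with `user`
        for group in theirs - mine:
            votes[group] += 1  # one "vote" per overlapping neighbor
    return [group for group, _ in votes.most_common(k)]

print(recommend_groups("alice"))  # ['hiking', 'conspiracy_corner']
```

Even in this toy version, a single shared interest (“gardening”) is enough to surface a fringe group one hop away, the dynamic an internal researcher later described as leading users to conspiracy theories “very quickly.”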
But the documents show these features had side effects. Some people used likes to compare themselves unfavorably with others. Others used the Share button to spread information so quickly that false or misleading content went viral within seconds. When people were asked to recall an experience that triggered negative social comparison on Instagram, they tended to attribute the bad feeling to the number of likes.
Facebook has said its internal research was meant in part to pinpoint problems that could be fixed, making its products safer. Adam Mosseri, the head of Instagram, said research on users’ well-being had led to investments in anti-bullying measures on Instagram.
The Facebook Papers
Internal documents leaked by a former Facebook employee have given the public a close look at the operations of the secretive social media company and renewed calls for better oversight of its wide reach into users’ lives.
In September, the media began publishing “The Facebook Files,” a series of reports based on leaked Facebook documents. The series revealed evidence that Facebook knew Instagram, one of its products, was worsening body-image issues among teenagers. On October 3, the whistleblower, Haugen, appeared on the television program 60 Minutes and revealed that she, a former Facebook product manager who left the company in May, was behind the disclosure of the internal documents.
On October 5, Haugen testified before a U.S. Senate subcommittee that Facebook was willing to use hateful and harmful content on its site to keep users coming back. Facebook executives, including Zuckerberg, said her allegations were untrue. Haugen also filed complaints with the Securities and Exchange Commission. Subsequently, a congressional staff member provided the documents, known as “The Facebook Papers,” to a number of news organizations, including news agencies.
The Facebook Papers show what the company knew about extremist groups on its site trying to polarize American voters before the election. They also reveal that internal researchers repeatedly determined how Facebook’s key features amplified toxic content on the platform. Jane Lytvynenko, a senior researcher at the Harvard Kennedy School’s Shorenstein Center who studies social networks and misinformation, said that with so many problems tracing back to core features, Facebook cannot simply tweak itself into a healthier social network. “When we talk about the Like button, the Share button, the News Feed and their power, we’re essentially talking about the infrastructure the network is built on. The crux of the problem here is the infrastructure itself,” she said.
Self-examination
As Facebook’s researchers dug into how its products worked, the worrying findings piled up. In a July 2019 study of Groups, researchers tracked how members of those communities became targets for misinformation. The starting point, the researchers said, was people known as “invite whales,” who sent invitations to others to join a private group.
Invite whales, the study said, could effectively pull thousands of people into new groups, so that communities ballooned almost overnight. They could then post promotions of racial violence or other harmful content into those groups. Another 2019 report examined how some people accumulated large followings on their Facebook Pages, often using posts about cute animals and other innocuous topics. Once a page grew to thousands of followers, the founder would sell it. According to the study, buyers then used the pages to show followers misinformation or politically divisive content.
The documents show that as researchers studied the Like button, executives also considered hiding the feature on Facebook itself. In September 2019, the company removed likes from users’ Facebook posts in a small experiment in Australia. It wanted to see whether the change would reduce users’ stress and social comparison, which in turn might encourage people to post more often. But people did not share more posts after the button was removed, and Facebook chose not to run a broader test, noting that likes were “quite low on the long list of problems we need to solve.”

Company researchers also evaluated the Share button last year. In a September 2020 study, a researcher wrote that the share buttons in the News Feed were “designed to attract attention and encourage engagement,” but that, left unchecked, they could “amplify bad content and sources,” such as bullying and posts with borderline nudity. That is because the features let people share posts, videos and information without a second thought.
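The amplification the researcher described is essentially a branching process: if each viewer reshares a post to enough people, every wave of viewers is larger than the last. The arithmetic below is a toy illustration with invented numbers, not data from the documents.

```python
# Toy reshare cascade: each viewer independently reshares to f friends
# with probability p. All numbers are invented for illustration.
p, f = 0.02, 200       # 2% reshare rate, 200 friends reached per reshare
r = p * f              # expected new viewers per viewer: 4.0
viewers = 1.0
for generation in range(6):
    print(f"generation {generation}: ~{int(viewers)} viewers")
    viewers *= r       # r > 1 means each wave outgrows the last
```

With these numbers a single post reaches roughly a thousand people within five reshare generations; with r below 1, the cascade dies out instead. Friction that makes people pause before resharing lowers p, and with it r.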
A common thread in the documents is how Facebook employees argued for changes to the way the social network worked, and often blamed executives for standing in the way. In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests Pages and Groups to people, saying it can “very quickly lead users to conspiracy theories and groups.”
“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy movements such as QAnon and anti-vaccine and COVID-19 conspiracy theories. “It has been painful to observe,” the researcher added. (Compiled by Tencent Technology / Wuji)