Do members pay higher prices? How can big-data “killing the familiar” be curbed? Experts explain


According to a CCTV News report on January 28, many consumers have been asking: why do ticket prices keep climbing even outside the peak tourist season? Why does merely mentioning a product in conversation seem to summon related ads on every major platform in the blink of an eye? Why does opening an app produce dozens of pages of privacy policy that no one can finish reading, which you must nevertheless tap “agree” to get past? And when you finally decide to stop using the app, why is the account-deletion button nowhere to be found?
In the era of big data, we can find the information we want faster than ever, yet it seems harder and harder to avoid hidden algorithmic traps.
Do members pay more than non-members? Platforms use algorithms to “kill the familiar”
Not long ago, an article questioning whether a food-delivery platform’s algorithm was “killing the familiar” (charging loyal customers more) sparked heated discussion. Mr. Xiao, a paying member of the platform, pointed out that for identical orders, the delivery fee he was quoted was often higher than a non-member’s.
Mr. Xiao:
I often order the same takeout. One day, after I opened a membership, I found the delivery fee had tripled. I kept trying for almost an hour that day, and the fee stayed six yuan higher than usual.
Mr. Xiao told reporters that over the following week he often ordered takeout with two phones at once, one logged into his member account and one into a non-member account, running side-by-side comparisons. The delivery fees always differed. He compiled these observations into an article published on his official account.
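Mr. Xiao’s two-phone method is, in effect, a paired comparison: the same order quoted to both accounts at the same time, with the per-order difference recorded. As an illustration only (the fee amounts below are invented, not Mr. Xiao’s actual data), such logged observations could be summarized like this:

```python
# Hypothetical paired comparison of delivery fees quoted simultaneously to a
# member account and a non-member account for the same order.
# All numbers are invented for illustration.

from statistics import mean

# Each pair: (member_fee_yuan, non_member_fee_yuan) for one identical order
paired_quotes = [
    (9.0, 3.0),
    (8.5, 2.5),
    (9.0, 3.0),
    (8.0, 2.0),
]

# Per-order difference: how much more the member account was charged
differences = [m - n for m, n in paired_quotes]

avg_gap = mean(differences)
print(f"average member surcharge: {avg_gap:.2f} yuan")  # 6.00 with these numbers
```

Pairing the quotes by order and time controls for factors such as distance and demand, which is what makes a persistent gap hard to explain away as a caching error.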
Mr. Xiao:
After my article came out, many netizens left messages in my backstage describing the same treatment of members. For example, a mouse that costs a non-member 100 yuan is priced at 110 yuan for a member. If prices go up once I pay for membership, what is the point of being a member?
After the article triggered heated discussion, the delivery platform contacted Mr. Xiao. Its reply to his query: the difference in delivery fees was an error caused by the system cache.
Mr. Xiao:
I tried many times, and it was like this for a whole week, so that explanation does not hold. What location cache? Ordering from the same place, the difference was between the member and non-member accounts. The technical friends I consulted said a caching error could not possibly be this large.
“Killing the familiar” ≠ ordinary differential pricing, and related complaints keep increasing
Mr. Xiao’s doubts never received a clear answer, but his experience brought the topic of big-data “killing the familiar” back into public view. The reporter interviewed the China Consumer Association and legal experts on the issue.
Chen Jian, director of the Complaints Department of the China Consumer Association:
Consumers often complain: I am a long-standing member, yet my price is higher than an ordinary user’s. Such problems arise on ride-hailing platforms, movie-ticketing platforms, travel-booking platforms, and so on. A second scenario: when a consumer browses the same page repeatedly, the price rises. Afraid of further increases, or seeing that, say, air tickets are running out, the consumer hurries to place an order, only to watch the price fall back soon afterwards.
According to experts, the crucial step in “killing the familiar” with big data is becoming very “familiar” with customers and offering them differentiated choices. Differentiated offers have long existed in some business models and drew little criticism.
Pei Wei, associate professor, School of Law, Beijing University of Aeronautics and Astronautics:
The phenomenon behind big-data “killing the familiar” is differential pricing. Take insurance: different people’s premiums, insured amounts, or coverage periods differ. Differential pricing alone is not necessarily a bad thing. What we object to is differential pricing without justification.
Liang Zheng, vice president of the Institute of Artificial Intelligence International Governance, Tsinghua University:
Compulsory traffic insurance decides whether to give you a discount or raise your premium based on your accident record and driving behavior. That is an objective, widely accepted personal difference. What worries people now is that my personal information is used to build a data portrait of me, which is then used to price me differently.
Experts note that big-data “killing the familiar” differs from ordinary differential pricing: it ignores consumers’ right to know and their own initiative, running counter to the purpose for which consumers contributed their data. Under China’s current legal system, the Price Law, the consumer protection law, and the Anti-Monopoly Law all regulate conduct in which “killing the familiar” may be illegal.
Pei Wei, associate professor, School of Law, Beijing University of Aeronautics and Astronautics:
In 2020 a standard for online tourism services was issued, requiring online travel operators and service providers not to abuse big data or related technologies to treat users differently according to their personal characteristics. The consumer protection law also applies: the right to fair trade means prices should be reasonable. “Killing the familiar” may further involve parts of the Anti-Monopoly Law. Problems remain, but our legislation keeps responding to them and strives to protect users’ legitimate rights and interests.
According to the China Consumer Association, consumer complaints about big-data “killing the familiar” are increasing. At the core is the platforms’ use of algorithms, mainly recommendation algorithms, pricing algorithms, review algorithms, ranking algorithms, probability algorithms, and traffic algorithms.
Chen Jian, director of the Complaints Department of the China Consumer Association:
For example, when a consumer buys goods or services, the platform forms personalized recommendations based on the pages visited, the consumer’s preferences, even movement tracks. The operator may have built a precise personal data portrait and then recommends only goods and services matching that portrait, leaving the consumer’s right to know seriously impaired.

How does an internet platform become so “familiar” with every user? The answer: user portraits. Behind an accurate user portrait lies the vast amount of personal data the platform has accumulated.
Liang Zheng, vice president of the Institute of Artificial Intelligence International Governance, Tsinghua University:
With my data, how accurate a characterization of me have you made? If it is too accurate, people worry about being manipulated. Digging a little deeper, what I care about is so-called discriminatory pricing, and whether the information behind it was improperly collected and used.
Platforms accumulate massive personal data to “compute” user portraits
A user portrait built on big data can tell a business whether you prefer spicy or sour food, and let algorithms rank restaurants to match your taste. Platforms usually collect this user data through mobile apps, and our dealings with those apps raise constant worries of their own.
How do apps achieve such precise, intelligent pushes? Is the “app eavesdropping” so widely perceived by users real?
The App Governance Working Group is composed of network-security legal and technical experts from the National Information Security Standardization Technical Committee, the China Consumer Association, the China Internet Association, and the China Cyberspace Security Association.
The technical experts first demonstrated a simulated “app eavesdropping” test program they had developed. The test proved that “eavesdropping” can indeed be carried out while the program runs in the foreground.
A further comparative experiment showed that when the test program is pushed to the background, or when the phone is locked, recording can continue for a while before terminating automatically. The duration of continued recording under a locked screen differs slightly across mobile operating systems.
So, is this technically feasible “eavesdropping” actually being abused by apps on the market?
He Yanzhe, technical expert of the App Governance Working Group:
In our actual eavesdropping tests, we found no app that exhibited this behavior of recording and uploading voice information.
According to the working group’s experts, although technically feasible, this method is costly, inefficient, and carries high legal risk.
Experts from the working group added that both industry standards and the technology itself are further strengthening the transparency of calls to sensitive permissions such as the microphone, so that phone users are kept informed.
He Yanzhe, technical expert of the App Governance Working Group:
Some mobile operating systems now prompt in real time when the microphone is in use. We call it a recording indicator: when an app calls the microphone, a status-bar icon, a red dot, or a reminder appears.
Just after discussing fitness plans with friends, you find shopping platforms recommending fitness products in the blink of an eye; after searching for hot-pot ingredients on one platform, many platforms start pushing hot-pot ingredients and even cookware. How is such precise personalized recommendation achieved?
He Yanzhe, technical expert of the App Governance Working Group:
An app can build a 360-degree portrait of a user. That portrait, I believe, is the result of years of accumulation, and of multiple channels.
According to the experts, the portrait’s accuracy comes mainly from big-data analysis of our purchase records, browsing records, search records, and even the list of apps we have installed.
He Yanzhe, technical expert of the App Governance Working Group:
Why do you sometimes feel overheard? The big data may not come from the current app at all; it may come from other apps. Operations you performed elsewhere are aggregated, and may even be linked to your friends or to people in your area. Their actions can then become a direction for the ads pushed to you.
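The cross-app aggregation He Yanzhe describes can be pictured, in greatly simplified form, as merging event streams from several sources that share a user identifier into one interest profile. This is an illustrative sketch only; the app names, events, and categories are all hypothetical, and real profiling systems are far more elaborate:

```python
# Simplified sketch: signals from different apps, once linked to the same
# user identifier, merge into a single interest profile.
# All app names, events, and categories are hypothetical.

from collections import Counter

# Event streams as (user_id, interest_category) pairs from separate apps
shopping_app = [("u1", "fitness"), ("u1", "fitness"), ("u1", "kitchenware")]
search_app = [("u1", "hotpot"), ("u1", "hotpot"), ("u2", "travel")]
social_app = [("u1", "hotpot")]  # e.g. inferred from a friend's activity

def build_profile(user_id, *streams):
    """Count interest signals for one user across every linked stream."""
    counts = Counter()
    for stream in streams:
        for uid, category in stream:
            if uid == user_id:
                counts[category] += 1
    return counts

profile = build_profile("u1", shopping_app, search_app, social_app)
# The top categories become the "direction" for ad pushes
print(profile.most_common(2))  # [('hotpot', 3), ('fitness', 2)]
```

Note that the strongest signal here (“hotpot”) was never produced inside any single app, which is why a push can feel like eavesdropping even when no microphone was involved.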
Experts suggest that users open the permission settings of their phone’s operating system to see which apps are authorized to use the microphone, and revoke that authorization for any app at any time, as their own needs dictate.
More attention should be paid to apps’ illegal collection of personal information
Even data that is not personal information on its own can, once aggregated at scale, reveal previously unknown connections between things. So in the working group’s view, the greater concern is apps’ illegal collection and use of personal information. Recent tests of apps on the market turned up many problems, such as privacy policies that are too long for users to understand and accounts that cannot be deleted.
He Yanzhe, technical expert of the App Governance Working Group:
Governance will focus on minimizing the collection of personal information. We have in fact solved part of the problem over the past two years, but the pace and the results are not yet ideal. A privacy policy may exist, but is it well written? Account deletion may exist, but are its conditions reasonable? Problems remain in these areas.
Take an example. During testing we found a wallpaper app whose privacy policy ran to more than 10,000 words. Look at what it says: I believe no one ever finishes reading it; even people like us, who read privacy policies every day, get a headache. Take the location-information clause: the app gives many reasons for wanting location data, but in my view none of them has anything to do with wallpaper.
Pei Wei, associate professor, School of Law, Beijing University of Aeronautics and Astronautics:
Whether privacy policies genuinely serve users and protect individuals is, at present, a broken state of affairs. So one concern going forward is that this kind of protection has become a mere formality.

The App Governance Working Group has set up a reporting mailbox and a dedicated personal-information complaint channel on its WeChat official account. Experts told reporters that the difficulty of deleting accounts is another problem drawing large numbers of user complaints.
Ding Xiaodong, vice president of the Institute of Future Rule of Law, Renmin University of China:
In information law we often speak of a “digital brand”: in the networked society, every footprint is permanent. China’s personal information protection law contains special provisions on this; as a legal right and interest, it belongs to the individual’s right to deletion.
Pei Wei, associate professor, School of Law, Beijing University of Aeronautics and Astronautics:
The national standard on personal-information security, part of China’s information-security technology standards, speaks specifically to account deletion: when you delete an account or stop using a service, the way and means should be as convenient as when you began using the service, and the information you are required to provide during deletion should not exceed the identity information you provided when using the app.
Experts said that in the circulation of data elements, the collection of personal information is only one link. Protecting personal information also requires attention to data storage, processing, and circulation.
Ding Xiaodong, vice president of the Institute of Future Rule of Law, Renmin University of China:
If the front-end link, data collection, is shut off entirely, it will greatly hamper the development of artificial intelligence and the data industry. How to reconcile the two? A very important aspect, I think, is that once personal information is collected it must be managed under a strict life cycle: anonymization and obfuscation mechanisms; data-minimization principles in processing, so that only necessary data is handled; and especially high standards in storage and circulation, and in protection against leaks.
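Two of the mechanisms Ding Xiaodong names, data minimization and identifier removal, can be sketched as a simple pipeline: keep only the fields a given purpose requires, and replace the direct identifier with a salted hash. This is an illustrative sketch only; the field names and salt are hypothetical, and a salted hash alone is pseudonymization, not full anonymization, which demands considerably more:

```python
# Illustrative sketch of data minimization plus pseudonymization.
# Field names and the salt are hypothetical; real anonymization requires
# much more than this (this alone only pseudonymizes the record).

import hashlib

SALT = b"rotate-me-regularly"  # hypothetical secret salt, kept server-side

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with an irreversible salted hash."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:16]

def minimize(record: dict, needed_fields: set) -> dict:
    """Keep only the fields the current processing purpose requires."""
    out = {k: v for k, v in record.items() if k in needed_fields}
    out["uid"] = pseudonymize(record["phone"])  # never keep the raw phone
    return out

raw = {"phone": "13800000000", "name": "Xiao", "city": "Beijing",
       "gps_track": [], "order_total": 35.0}

# For fee analytics, only the city and the order amount are necessary
safe = minimize(raw, needed_fields={"city", "order_total"})
print(sorted(safe))  # ['city', 'order_total', 'uid']
```

The point of the sketch is the life-cycle discipline: the name, phone number, and location track never enter the analytics store at all, rather than being collected first and filtered later.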
The law still has shortcomings, and drawing the security boundary needs exploration
Today, the convenience technology brings and the protection of personal information sit at the two ends of a swinging balance. What role does law play in striking that balance?
Ding Xiaodong, vice president of the Institute of Future Rule of Law, Renmin University of China:
Our traditional consumer protection framework includes the application of contract law to standard terms, of tort law to infringement, and many questions of administrative supervision, and these laws can be invoked to a certain extent. But because many problems stem from new technologies involving personal information and algorithms, shortcomings remain, and the legislature is actively following up. When the Civil Code comes into force, it may provide substantial legal support for protecting personal information, and the personal information protection law (draft) and the data security law (draft) will certainly add further protection.
Today each of us produces a great deal of information. Experts say that in social life driven by algorithms and big data, law plays a fundamental role, and drawing the security boundary requires interaction among regulators, platforms, and users.
Liang Zheng, vice president of the Institute of Artificial Intelligence International Governance, Tsinghua University:
We cannot solve this simply by banning things. As technology develops it will bring new problems, but solving those problems also depends on technology itself.
Tian Tian, industry insider:
Data is a very important factor of production: it should connect the upstream and downstream of industry chains, and data circulation, including data transactions, should be promoted. But because the data contains large amounts of personal privacy and commercial data assets, sharing, circulating, and using it is genuinely difficult. Academia is now advancing technologies such as federated learning, along with related standards and specifications, precisely to solve this problem. We should not give up eating for fear of choking just because data-security problems exist.
Technical experts engaged in artificial-intelligence security research told reporters that current technical means can already deliver the same analytical results while the data stays stored where it originated.
Tian Tian, industry insider:
It is as if the data were no longer stored in plaintext: through encryption of one kind or another, artificial-intelligence computation can be performed on the encrypted data. The perceived effect is that my data assets were never disclosed or handed to anyone, no private information leaked, and yet I obtained a model trained on richer data.
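The federated-learning idea Tian Tian alludes to, training on data that never leaves its source, can be sketched at toy scale: each party fits a model locally on its own private data and shares only the fitted parameters, which a coordinator averages. This is a minimal illustrative sketch, not any production framework; the secure aggregation and encryption layers that real systems add on top are omitted:

```python
# Toy federated averaging: raw data stays with each party; only the
# locally fitted parameters (slope, intercept of y = a*x + b) are shared.

import random

def fit_line(xs, ys):
    """Ordinary least squares on one party's private data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

random.seed(0)

# Three parties each hold private samples of the same underlying relation
# y = 2x + 1 (plus noise); the samples themselves never leave the party.
parties = []
for _ in range(3):
    xs = [random.uniform(0, 10) for _ in range(50)]
    ys = [2 * x + 1 + random.gauss(0, 0.1) for x in xs]
    parties.append((xs, ys))

# Each party shares only its fitted parameters with the coordinator
local_models = [fit_line(xs, ys) for xs, ys in parties]

# The coordinator averages the parameters (equal weights for equal-size data)
a = sum(m[0] for m in local_models) / len(local_models)
b = sum(m[1] for m in local_models) / len(local_models)
print(f"global model: y = {a:.2f}x + {b:.2f}")  # close to y = 2.00x + 1.00
```

The coordinator ends up with a model close to the one it would get from pooling all the data, yet it only ever saw six numbers, matching the effect described above: no data assets change hands, but a model trained on richer data is obtained.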
Liang Zheng, vice president of the Institute of Artificial Intelligence International Governance, Tsinghua University:
Laws and regulations should set the bottom line: we draw red lines that must not endanger public safety. In the digital economy, some business models iterate so quickly that we cannot wait for legislation. For each problem there can be a corresponding technical solution; the key is to make the rules clear.