Digital biomass. How do internet platforms prey on us?

instytutsprawobywatelskich.pl

Katarzyna Szymielewicz and I talk about how online platforms grind us down and how not to become digital biomass.

(This interview is an edited and extended version of the "Are you aware?" podcast episode titled "Are you digital biomass? Google, smartphone and your life" of January 30, 2024.)

Katarzyna Szymielewicz

Lawyer specialising in human rights and new technologies, social activist and publicist. Co-founder and president of the Panoptykon Foundation, vice-president of the European Digital Rights network from 2012 to 2020. A graduate of the Faculty of Law and Administration of the University of Warsaw and of Development Studies at the School of Oriental and African Studies. In the past, a lawyer at the global law firm Clifford Chance and a member of the Social Council to the Minister for Digital Affairs. Member of Ashoka, a global network of social entrepreneurs. She has published, among others, in The Guardian, Polityka, Gazeta Wyborcza, Dziennik Gazeta Prawna and Magazyn Pismo.

Rafał Górski: What does it mean that we are being ground into digital biomass?

Katarzyna Szymielewicz: It's a brutal phrase that I use to shake consumers of internet services out of their sense of comfort – a false one, in my opinion – because we have no subjectivity there and we do not control the process into which internet service providers draw us. The aim of this process is to exploit and commercialize our attention and, along the way, to gain knowledge that allows this to be done even more effectively.

Internet platforms never treated us like customers, from the very beginning. We could never control the logic of their algorithms or influence what digital service designers call "user experience".

Platforms such as Google and Facebook lured us with the promise of a free and, on top of that, personalized service – "tailored", as Mark Zuckerberg put it. In the real world, when we order something tailored, like shoes, we pay more, not less. And here I get a personalized service and I don't pay! In practice, of course, I do pay – perhaps more than for any other service. This point is made by Tristan Harris, a former Google employee who founded the Center for Humane Technology, one of the institutions most critical of big tech. He put it perversely: no service in human history has been more costly for us than the "free" one. As a society, we pay for it with mental health, addictions, money spent on unnecessary products and wrong political choices.

According to Harris, the harm done to us by the business model of online platforms can be summed up in one phrase: downgrading humanity – degrading us as humanity. This is because algorithms prey on our basic, primal needs and instincts. They reinforce our vanity, our selfishness and our tendency to blame others for our problems. They encourage polarization, escape into information bubbles and the comfort of conspiracy theories. They fuel hatred and online harassment campaigns.

All of this is a side effect of the business model of large platforms, yet it is not at all accidental – these social effects are built into the operation of recommender systems, which have one task: to engage us at any cost. This keeps advertising profits high, because only an engaged, active user can see and respond to ads. According to Harris, we are watching a kind of "race to the bottom" between platforms: whoever better figures out and exploits our basic instincts will be most effective at engaging us, and will therefore achieve the highest advertising profits.

It is not people who decide what will be shown to them online and what they will experience, but the company with its learning algorithms.

On the one hand, platforms process our digital footprints into marketing profiles – that is, they try to work out a pattern of behaviors, traits and needs to which the served content can then be matched. On the other hand, separate sets of algorithms select the content we experience in such a way that we – the consumers, the people being processed into digital biomass – keep our attention on it and keep clicking as long as possible, even when we feel tired, saturated, annoyed or sufficiently inspired by that content.

The key to correctly predicting what we will click on (and so what will effectively engage us) is the data inferred from our behavior. And the most valuable data, from the platform's point of view, is not the information we consciously share, but the metadata and behavioral observations we do not control at all: the phone's location, the IP addresses we log in from, the links we click, the posts that trigger our reactions, the people we interact with, as well as when we use the application, the position in which we hold the phone, how quickly we type, and even the way we physically handle the device.
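
To make this mechanism concrete, here is a minimal, purely illustrative Python sketch. The profile fields, weights and scoring function are invented assumptions for illustration – not any platform's actual model – but they show the shape of the logic described above: rank a feed solely on predicted engagement, using behavioral signals the user never consciously provided.

```python
# Toy illustration (not any platform's real code) of engagement-driven
# feed ranking: behavioral signals in, "most clickable" content out.
from dataclasses import dataclass

@dataclass
class BehaviorProfile:
    night_owl: float          # 0..1, inferred from login times (assumed signal)
    outrage_affinity: float   # 0..1, inferred from past reactions (assumed signal)

@dataclass
class Post:
    post_id: str
    emotional_charge: float   # 0..1, estimated from the content
    novelty: float            # 0..1

def predicted_engagement(user: BehaviorProfile, post: Post) -> float:
    """Stand-in for a learned click/watch-time model.

    The weights are invented; the point is the objective: the score
    measures expected attention, not the user's wellbeing."""
    return (0.6 * user.outrage_affinity * post.emotional_charge
            + 0.3 * post.novelty
            + 0.1 * user.night_owl)

def rank_feed(user: BehaviorProfile, candidates: list[Post]) -> list[Post]:
    # Whatever is predicted to keep the user clicking goes to the top.
    return sorted(candidates,
                  key=lambda p: predicted_engagement(user, p),
                  reverse=True)

if __name__ == "__main__":
    user = BehaviorProfile(night_owl=0.8, outrage_affinity=0.9)
    feed = rank_feed(user, [
        Post("calm-news", emotional_charge=0.2, novelty=0.7),
        Post("outrage-bait", emotional_charge=0.95, novelty=0.4),
    ])
    print([p.post_id for p in feed])  # ['outrage-bait', 'calm-news']
```

The toy example's point is the objective function: nothing in it measures harm or benefit to the person, only the probability of another click – which is exactly the asymmetry the interview describes.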

This brutal machinery is optimised for commercial gain, not for the good of the human being called a "user" – someone who is made dependent and processed, not a client who decides for themselves. The term "digital biomass" was coined to illustrate this grinding mechanism.

What can a citizen do to avoid becoming digital biomass?

There is no simple answer to that question. It is certainly not to keep using the current model as if it were neutral for us – because it is not, and we should change it.

Companies offer us privacy settings and advertising settings. We can spend a lot of time with the "sliders" available on online platforms and try to adjust them. This is not pointless, because to some degree we can improve our experience and protect ourselves from the most toxic or aggressive forms of what awaits us online. We can calibrate the filter that lets certain information through rather than other information; we can disable notifications, which I encourage everyone to do; we can set time limits to restrain ourselves; we can also remove unnecessary apps from the phone or use them only when necessary.

At the same time, privacy settings give us only an illusion of control. In fact, all UX paths lead to the place where we are tracked and our data from different sources is combined into behavioral profiles. Notifications, the interface, content-recommendation algorithms, even the device we use (the vivid colors of the smartphone) – everything is designed to influence dopamine secretion and, in the longer term, to make us dependent. Thanks to the mechanism of variable reinforcement, known from gambling, we check app notifications, statistically, up to 150 times a day.

Platforms employ the best scientists and spend billions of dollars to exploit the evolutionary conditioning of our brains. That is why this is not an equal fight.

It is certainly not the case that as citizens we can do nothing, but those are changes within a paradigm that imposes an asymmetry of power on us – we are not the client; we do not pay for the service and we do not decide what experience is served to us – the advertiser does. We can start by changing the business model, that is, the way profit is generated. As consumers we can try to adopt the American model, in which a third party – a trusted partner – negotiates on our behalf what happens to our data, or provides its own algorithm that corrects the toxic one drawing us into a spiral of content consumption. Such overlays on online platforms are technologically possible, but this means entering increasingly sophisticated services and paying real money for them.
