The surveillance society – will the West overtake China?

instytutsprawobywatelskich.pl

Katarzyna Szymielewicz and Prof. David Lyon talk about how we are watched by the state and by corporations, the details of monitoring systems in China and India, the situation in Poland, and the role of civil society in the fight for freedom.

David Lyon

Sociologist, professor emeritus of sociology and law at Queen's University, Canada, former director of the Surveillance Studies Centre, member of the editorial boards of many journals, including Surveillance & Society and The Information Society. One of the main theorists of the "surveillance society". For over 25 years he has studied surveillance, regularly publishing books and articles on these issues, including The Electronic Eye (1994), Surveillance Society (2001), Surveillance after September 11 (2003), Surveillance Studies (2007), Identifying Citizens (2009), Liquid Surveillance (co-written with Zygmunt Bauman, 2013) and Surveillance after Snowden (2015). His books have been translated into 17 languages and have become classics in the field of surveillance studies.

Katarzyna Szymielewicz

Lawyer specialising in human rights and new technologies, social activist and commentator. Co-founder and president of the Panoptykon Foundation, vice-president of the European Digital Rights network from 2012 to 2020. A graduate of the Faculty of Law and Administration of the University of Warsaw and of Development Studies at the School of Oriental and African Studies. Formerly a lawyer at the international law firm Clifford Chance and a member of the Social Council to the Minister for Digital Affairs. Member of Ashoka, an international network of social entrepreneurs. She has published, among others, in The Guardian, Polityka, Gazeta Wyborcza, Dziennik Gazeta Prawna and Magazyn Pismo.

(The text is a translated, edited and shortened record of the debate held on 30 September 2023 in Warsaw during the 5th Geopolitical Forum organised by the Civil Affairs Institute.)

Anna Turner: The history of Poland is a fascinating context for discussions on surveillance. Older people remember government surveillance under totalitarianism, while the younger generation experiences different forms of control, including the monitoring and processing of data by global corporations. Today surveillance technologies affect virtually everyone, especially internet users. Let us begin our conversation with your understanding of the surveillance society: can you identify the key moments that have shaped it in recent years?

David Lyon: This is an important question and there are various answers to it. I think about surveillance in terms of the data that concerns us – such a broad understanding has become important for many issues.

The concept of surveillance, which is certainly familiar in Poland, is primarily about state surveillance. Today, in many if not most countries in the world, the state is strongly linked to commercial activity, to corporations, and so these two entities work closely together. It often turns out that the state uses commercial data. During the pandemic, the Canadian federal government purchased data from a telecom service provider (data from mobile phones) and used it to try to monitor the spread of the virus. This is an obvious example of how a state can rely on data from commercial companies. The issue is now much more complicated.

So when I talk about surveillance, I mean interest in every action, in personal life, in every activity that reveals to others any information about us which can then be gathered in some way.

It is not only literal visual surveillance; today it is primarily surveillance through mobile devices.

Katarzyna Szymielewicz: I will refer to how the mission and scope of the Panoptykon Foundation has evolved over almost 15 years. As David pointed out, the primary objective of our activity was to analyse the state's surveillance practices affecting citizens. Facing the global war on terrorism and the question of any protection of human rights, it became clear to us that we had crossed the line described by philosophers such as Foucault or Agamben, beyond which no one is protected from the terror of the state. If we, as a civilisation, as a community, accept (we never accepted it, but politicians presented it as if we had) killing someone in defence of our society, it also means that we ourselves can be killed in defence of a society from which we have been excluded. Watching this dynamic of state surveillance, of deciding who is within society and who is outside it – who deserves to live and who is to die – was the first big idea behind our work 15 years ago.

The more we explored this issue, the more we discovered the complex dynamics of power that David mentioned, in which the market and the state work hand in hand to justify the need to monitor citizens, as well as the production of the tools and infrastructure that enable it – it is essentially one ecosystem. The turning point in this discussion was the revelations of Edward Snowden, who proved beyond doubt that this was the case: that the data collected by commercial companies previously associated with freedom, access to information and the reputation of being "the coolest companies in the world", such as Google, Facebook and others, were an active part of the surveillance apparatus. It then became clear to us that the main objective of our work should be to look at the practices used by these companies. This does not mean that state surveillance is no longer dangerous. It is, but we realised that this is how the world works.

We will probably not be able to replace a state that surveils its citizens with one that does not, so we had better create some kind of regulation that protects citizens' rights. To some extent surveillance is necessary for the state to fulfil its functions, to protect us when we really need protection and to organise public services when we need them, but surveillance in the internet services market should not be part of the package.

Over the last five years, many of our activities have concerned such companies and their regulation, also because surveillance has become virtually invisible and elusive to customers. When we encounter state surveillance, we at least know that someone is keeping us under control. We may feel intimidated, threatened, uncomfortable. It is completely different in a commercial environment, where surveillance is sold to us as convenience: "Do nothing. Don't decide. We'll do it for you."

I think that this new wave of surveillance, based on tools that provide comfort, involves people becoming increasingly passive and "happy" – they do not choose this consciously, but they feel happy, withdrawing from active choices and simply giving in to suggestions, recommendations and targeted advertisements, watching companies shape their lives.

This is probably much more dangerous to the public than state surveillance, because it is much harder for us to stop – to see what is happening and to try to question it. This is what I consider to be the main surveillance problem right now.

As a researcher, I am fascinated by comparative analysis, especially of what societies have in common and where they differ. Research in Western countries shows that we have a mostly negative view of surveillance and of our data being used without our consent. However, it is worth noting that acceptance of surveillance practices increases when these activities are motivated by security concerns. In other words, I cannot tolerate a situation where my data is used without my knowledge, but I change my mind when I am convinced that it is necessary to ensure security.

China is a country whose attitude to surveillance is completely different from ours. I will refer to a study from a fascinating book published a few weeks ago by Ariane Ollier-Malaterre, entitled "Living with Digital Surveillance in China: Citizens' Narratives on Technology, Privacy and Governance". The author concludes: "The Chinese respondents who took part in the study believe that technological development will restore China to its former glory, solving all of China's problems. They accept surveillance techniques because they see the government as a trusted guardian, almost a parent, who is needed when 'moral quality' is lacking. In other words, respondents say it is a form of discipline needed to counter chaos in such a large country."

So we see how significantly the perception of surveillance techniques in China differs, both in comparison with research carried out in Western countries and in relation to the narrative presented by Western media, in which the Chinese Social Credit System is presented as an example of Machiavellian, totalitarian control. What is your opinion on this, and is there anything we can learn from the Chinese?

D.L.: The situation in China is fascinating and very different from what we experience in Canada and, as I understand it, from what you experience here.

Cultural differences make it impossible for me to draw simple comparisons between these cultures – as Westerners we are very different from the Chinese, who are shaped by the heritage of Confucianism.

The United States in particular, but also many other Western countries, tend to see China as some kind of avoidable dystopia, which leads to increasing tensions in bilateral relations, with very little understanding in the West of what is actually happening in China. I agree that Ariane's book is a successful attempt to overturn some stereotypes and gross errors in our thinking about China.

You have pointed to a different way of seeing state surveillance. This is related to the national humiliation that the Chinese experienced in many ways in the last century. For example, the Japanese invasion of the 1930s is still a source of tension in China. This humiliation affects the way the Chinese think about the relationship between citizen and state. Here we are dealing with a fundamental difference from Western countries. As far as the Social Credit System is concerned, the Chinese fear the shame of a low score – this shame is, in my opinion, much stronger than in Western societies. We are talking about similar phenomena, but experienced in different ways because of cultural conditions.

In China we are dealing not so much with surveillance capitalism – to refer to the title of Shoshana Zuboff's 2019 book – as with state surveillance capitalism.

We cannot simply apply Zuboff's diagnosis to the Chinese reality, as the state factor plays a much greater role there than in Poland, Canada, Great Britain or many other Western countries.

Therefore, before we get into the details, let us be careful not to extrapolate Western realities hastily onto Chinese ones and not to adopt erroneous assumptions, either about the motivations that guide the behaviour of Chinese citizens or about the goals that guide the actions of the Chinese government. Personally, I am not an enthusiast of Xi Jinping's rule – but I try to understand his motivations and assumptions, taking into account the cultural context in which he operates.

K.S.: I fully agree with David and, like him, I avoid comparing us with China. It seems ridiculous to me. This is not a criticism of your question – I know you ask it because that is the media narrative, which you also mention. China has no designs on us. I mean, if they have plans, they do not focus on Poland but cover the whole world and a great game with much more powerful players. China has become a kind of smokescreen that lets us hide the problems we face in the West: saying "we are not China" closes a debate that should be going on. I think this is fundamentally wrong. I think we should pay close attention to China. Like David, I am not in favour of their practices as such, but the consistency, precision and rationality with which they are implemented are impressive.

I will give you two examples I did not know about until I started asking myself questions about China. According to the researchers I spoke to during my own work on this issue, the Social Credit System was designed specifically for social integration. It is therefore the same situation as in other developing countries, such as India, where people do not have identity documents, half the population is uncounted and unidentified, and citizens have no identity in relation to the state. These are completely different realities from ours, because we are counted, identified and monitored.

So countries such as China and India build systems for the social integration of their residents so that they can take out loans, travel or receive benefits. We, on the other hand, impose our cognitive filters on these processes and criticise governments for obtaining information about citizens, without knowing what their starting point was and what challenges these programmes face.

This was one of the reflections that came to me while discussing the Chinese Social Credit System with a person who knows Chinese society much better than I do.

Another example is big companies such as TikTok, which have met with great criticism in Europe – and there are good reasons for it. However, in the West we do not really understand how these companies operate in China – my observations show that they are controlled there and that their activities are subordinated to a state whose policy determines their direction. I repeat, I am not a great fan of these solutions, but if there is a strong state that is able to control what surveillance companies do, and they must not cross certain red lines – for example, they cannot offer children what they offer children here: addictive technology, content that children should never see, manipulation of their minds (and apparently this is not the case in China, because the relationship between these companies and the state looks different) – then perhaps this is something we could learn from the Chinese.

D.L.: Let me address the issue that Kasia has just raised. In the West we often refer to China as a kind of dystopia to be avoided. It is a path we do not want to follow, and we make quite a few assumptions about it – for example, it is suggested, or even stated, that the Chinese Social Credit System was developed by the Chinese state to track and control every citizen. Yes, the social credit systems that exist in China are primarily managed by the government, but they mainly assess corporate activity. They do not collect data on individual citizens or consumers.

There is no unified system in China called Social Credit. Since 2014 there has been a plan to develop certain aspects of the Social Credit System, to be implemented by 2020. That is not what happened. Across the country, many people have opposed specific parts of it, and in many cities corporations reject certain elements of the Social Credit System because they believe it is inadequate for the task the government has set it. So let us not imagine that there is a unified system of top-down totalitarian control in China, exercised by the authorities in Beijing. It is much more complicated, much more fluid and much more open to contestation. Over time, corporations have opposed, changed and withdrawn some of its elements.

Kasia also referred to India. India's population will soon exceed China's. If we are looking for a unified system that covers every citizen of a country, why not look at India – which is never mentioned as a state we want to imitate or avoid? India has one central, state-organised biometric registration system [Aadhaar – ed.]; to build it, the Prime Minister of India brought in the head of the largest Indian technology corporation, Infosys. There are currently 1.4 billion registered people. In technological and administrative terms this system is astounding, built at an incredible pace and based on iris scanning (as well as facial photographs and fingerprints). Scans of 1.4 billion human irises sit in one unified, comprehensive system in India. It plays an extremely important role in Indian politics, but of course it is also the subject of much controversy. Nevertheless, in this case we are dealing with a biometric system that was initiated by the state, implemented with the support of a large corporation, and that actually covers every citizen.

K.S.: We can also mention companies such as Meta or Alphabet, which run identity-based systems used by billions of people and serve as online identity providers; perhaps in the future – hopefully not – access to their services will be obtained through fingerprints, which will become the default way to log in. It is just a matter of programming the devices, the mobile phones and tablets we use to access these services. Such a solution is justified as being fast, and people like speed, reliability, intuitiveness, the promise that they will not be hacked – and they do not even think twice about handing their data to private companies, although they object to the state obtaining it. Of course the state will get it too. It already has it! At airports in the European Union our fingerprints are taken and there is nothing we can do about it. My point is that such a future awaits us here too, so it is better to focus on the practices of power here and now, rather than raising colonial objections and teaching others how to protect the privacy of their citizens.

I must say that I have also noticed the condescending tone of some comments in the Western media, indicating a rather complete lack of understanding of local circumstances. Ariane Ollier-Malaterre's study shows that many Chinese citizens do not even know about the existence of the Social Credit System, and digital surveillance programmes are not something that concerns them very much.

Innovation is emerging rapidly and on a large scale. How serious are surveillance practices in the West, and is enough being done to control and regulate them legally? Is it even possible for changes in the law, which are usually rather slow, to keep pace with the rapid development of technology?

D.L.: We need to consider what is actually happening in our societies. Shoshana Zuboff's work on surveillance capitalism is very helpful in this regard. I disagree with some of the author's proposals, but we respect each other. I think Shoshana has hit on the core of some really important aspects of today's surveillance practices, noting that corporations are heavily involved in collecting personal data and that this data is obtained from our everyday behaviour. So we are not dealing with some alien force that extracts and aggregates information about us. Rather, the point is that we, through the online activity tied to our use of digital devices, constantly produce data that is then collected. This data is very valuable and can be monetised. This is how the big digital corporations profit, manipulating the data with algorithms, using it for their own purposes and reselling it to others.

I mentioned earlier the example of the Canadian corporation Telus, which during the pandemic sold the Public Health Agency of Canada data on mobile phone use and mobile traffic. None of the 33 million people whose data was sold to the government knew that such a transaction was taking place. The pandemic was used as an excuse to capture data quickly, and I imagine that people at the Public Health Agency of Canada did not even think that anyone would ask them questions about it. But they did. I believe serious consideration should be given to how citizens' data is now being collected. Today, even data on our relations with the state is often gathered in the commercial sphere. Of course, there are still government security agencies that have access to and use our data.

Today, however, the key question concerns the commercial use of personal data, which can also be used by state authorities

– from the police (who love to gain access to data provided by corporations) to public health agencies, security agencies, and various government institutions whose activities often depend on information. Sometimes our data is collected by these institutions themselves, but it is now increasingly obtained from commercial agencies and corporations.

K.S.: Let me go back to the question of whether it is possible for the law to keep up with technology: I think it should not. The law should never be created before the problems are defined – otherwise we would perceive such laws as authoritarian, despotic and dystopian. It would be as if someone knew better than we do how to prevent problems before we have even defined them. A very good example of a law created on the basis of a fair definition of the problem is the General Data Protection Regulation (GDPR), as it is built on assumptions that have existed since the 1970s.

We are talking about more than 50 years during which the idea that no data on a person should be collected without a valid reason defined in law has operated in our reality. That reason may be the best interest of that person, or their consent. But it can also be a state policy that justifies the need for such action. We will probably agree that this is a very good rule. However, it creates a reality in which the state can implement a hypothetical policy requiring citizens to smile when crossing the border – and introduce facial scans to make sure more of them smile. But we can then say: no, that is against the law. And fight it.

If we as citizens know what we are defending – if we are motivated to defend our freedom – we can win. If not, that is, if we actually give in to narratives that offer us a false promise of security in exchange for our freedom, then even the best assumptions will not help us, because we will not defend them in a specific case when our freedom is violated. Or, even worse, we will be convinced that this is a situation where our consent should not matter.

So much for the state; but let us talk about a market that has been regulated by data protection law for decades, and yet companies like Alphabet (formerly known as Google) or Meta (formerly known as Facebook) have developed an astonishing surveillance machine that even China would not be able to build on its own, without the involvement of commercial companies (as it does now). How was that possible? On the one hand, these companies were formed in a country where there were no such regulations, namely the United States. There is now a fierce discussion in the United States about how such a situation could have occurred. I hear from American politicians and academics how much they regret it. But there were reasons why these regulations were not introduced – reasons of economic development and a particular approach to freedom as the default value, until the harm done to the public became obvious enough to limit corporations' freedom to operate. We do not see how much corporations do to defend this freedom of action – and American society chose this narrative. It accepted the actions of the digital giants because it was seduced by promises of comfort, development, free and attractive services, and so on.

That is how it started. Then these processes reached Europe and, even though we had our rules and regulations, these large market players managed to circumvent them in many respects, mainly thanks to their incredible ability to create narratives. It took our courts, the European Commission and even the NGOs representing citizens a long time to counter them effectively by formulating a counter-narrative. Shoshana Zuboff's book played a key role in this process, so regardless of what I think of her argument, I like her general way of presenting the problem of surveillance capitalism and holding companies responsible for creating this system and bypassing many safeguards. Zuboff's book was one of many warning signals, such as the Cambridge Analytica scandal or Edward Snowden's revelations. Decision-makers understood what they were dealing with – and that the essence of the problem is not the shoe advertisements displayed to consumers with their consent in services such as Facebook. That is not what this is about. The problem is the behavioural surplus that Zuboff defined, and which we have also talked about today.

It is about data about us – about our behaviour, choices and preferences – that is collected and used without our consent or even without our awareness.

For a long time companies avoided consequences – although there were regulations covering precisely these issues – by arguing that this was not personal data. This shows what the real problem with technologies is. Very often we do not understand them well enough to create adequate regulations or rules. If we had had a different narrative and a better understanding of what happens on the other side of our screens, we could have better applied the regulations that existed before Facebook was born and stopped these practices. However, this was beyond the capacity of our societies.

People like us – NGOs, university lecturers, hackers, groups such as the Chaos Computer Club in Germany or the Electronic Frontier Foundation in the USA – warned that these processes were taking place, but it remained niche, avant-garde, not fully understood, until big names like Zuboff appeared, or popular films such as "The Social Dilemma" on Netflix, which changed the narrative. It took two decades for our societies to understand what was happening. That is the problem. I would not place all the blame on the law, and I would never encourage lawmakers to act faster before problems are defined. For too long we presented issues around the use of services such as Facebook to the public as a problem of individual choice, not a huge social problem. Now that has changed: we have new regulations, we discuss social harm rather than individual failings, but it took us two decades. The question is: can we speed up? Can we analyse the phenomena generated by new technologies more quickly? If we work at this rate and need two more decades to understand how new services operate, that does not bode well for preserving freedom.

D.L.: I agree with you, Kasia. It is important to place what we are discussing in the right context. Just as cultural factors matter in China and India, so in the West I note two aspects that I think are crucial here. One is the belief that technology is the key to solving all our problems, which is part of the idea of technological solutionism. It is only a myth, but companies want us to believe in it. The second is our perception of our own actions through the prism of convenience. Convenience has been raised to the highest value, although I think it should not occupy that place. So when an iPhone or another device of this kind is sold to us, the main argument for its purchase is usually convenience (not to mention that we pay several hundred dollars more for that convenience). The idea of convenience has been implanted very effectively in our minds as consumers.

I fully agree with Kasia that cultural factors have played an essential role in the failure of state institutions to introduce regulations limiting the actions of the digital giants. The fact that these companies operate as if they answer to no one is primarily due to the cultural background of belief that we have technological answers to every challenge – and also to the belief that convenience is an inherent human value.

When you talk about this, I recall an article that you wrote with Zygmunt Bauman and others, "After Snowden: Rethinking the Impact of Surveillance", in which you define three factors that affect the acceptance of surveillance practices: fear, entertainment and familiarity. I have already mentioned fear, which is often fuelled by government agencies convincing us of the need to monitor data and information in order to ensure citizens' security. Entertainment has evolved from the fairly simple mechanisms on which social media initially relied, such as contact with long-lost friends, to the convenience that has become a key value. Familiarity with surveillance techniques, in turn, is nothing other than the ubiquitous presence of surveillance around us, in so many ways and in so many places that we no longer notice it – and yet some of us still try to take steps to protect our privacy on the internet.

This brings me to another question, about responsibility. I will refer here to a Eurobarometer survey asking respondents who they think should ensure that the personal data they provide on the internet is collected, stored and transmitted safely: the government, internet companies, or they themselves? In most countries respondents answered that they themselves were responsible. Does it not seem surprising that people feel they have some kind of control over the processing of their data, although in fact – knowing how surveillance capitalism works – there is not much they can do?

K.S.: It is not how they feel, it is what they have been told. I see a similarity here to environmental narratives, when it became quite clear to the world's largest companies that the problems had been noticed and they would not get away with polluting our planet (this was about 20 years ago). The change of narrative, financed by these companies, was often carried out in a fake way, so that the audience got the impression they were dealing with civic campaigns, for example to reduce plastic consumption or air travel. It is great when consumers' behaviour becomes more responsible, but this is the final piece of the puzzle, because real power always lies with those who create trends, produce goods and then sell them to us. For big companies, changing the way they produce – for example, reducing plastic use – is a matter of one decision, while consumers are constrained at so many levels by time, economic pressure or lack of access to alternative goods that companies' attempts to shift responsibility onto them, for example for the climate crisis, are simply unfair, and we should fight this kind of narrative decisively.

At the same time, I believe there are actions anyone can take to protect their privacy. For example: do not always take your phone with you. Or think twice before you install anything on it, and do not let the device use your location unless absolutely necessary. So there are things we can do, and very expensive devices, such as iPhones, help us make these choices – because we pay the manufacturer for greater protection. But is this an option available to the average consumer? Not at all. It is a luxury service for the few, yet sold as your choice: "Do you want to be protected? Buy an even more expensive device. Think twice before you do something." That is not fair.

We must target those who have the power to change the ecosystem, to change the logic of the services and the business models behind them – the behavioural surplus so precisely described by Zuboff. We should never allow companies to collect and exploit our "behavioural surplus" – for them it is only data, but for us it is our life, the digitised traces of our lives, which should never become part of the service. And that is why it is the responsibility of the companies: as individuals, we cannot remove the traces of our lives from the devices and services we use in this life. It is impossible. I can opt out of sharing my location or receiving notifications, but I cannot stop sharing my behavioural data with Google, because their services work to some extent on the basis of that data. This must change, and we must constantly exert pressure and demand accountability from the digital giants.

D.L.: I agree that as individuals we could be more careful. Perhaps not as careful as I try to be. I do not have a mobile phone, which is a real inconvenience for people who want to contact me. So, not sharing the belief that convenience is the highest value known to human beings, I become an inconvenience to others. But that is a different story.

The problems we are talking about are not individual problems. We may experience them as individuals, but they are social in nature.

How we are perceived by digital corporations is not based only on our own data stream; it also depends on those with whom we are connected and in contact. This group membership builds our profiles. Once you are present online in any way, your profile is created on the basis of your contacts – both business and private.

So this is not just about us as individuals – and it is really important that we understand this. This is an area in which, in my opinion, civil society is crucial. It is civil society organisations, such as the Panoptykon Foundation, that raise these issues and propose solutions to the problems they diagnose. In the US, computer scientist and activist Joy Buolamwini founded the Algorithmic Justice League to help algorithm developers understand that social justice issues are built into the way algorithms are created, and that algorithms themselves can be grossly unfair and discriminatory. Linnet Taylor works on data justice, thinking particularly of those who are economically disadvantaged and who are usually disproportionately harmed not only because of the position they are already in, but also because of corporate profiling. Civil society action moves us away from thinking in terms of the individual versus the state. Civil society groups are always looking for ways to inform government authorities, which are responsible for citizens, that a particular kind of technology has a negative impact on certain social groups, on their life chances, and certainly on their development as human beings.

The last question is addressed to Katarzyna Szymielewicz and concerns the situation in Poland. Does the Polish state monitor its citizens without their knowledge and consent, and if so, to what extent, and how does this relate to the proposed new changes to the Electronic Communications Law?

K.S.: In our view the Polish context is not unique, and we have been studying it for over a decade. The scale of surveillance by the state in Poland is not shocking. However, it should be noted that we have less and less clarity on this issue. When we started our activity as the Panoptykon Foundation, we obtained information on the scale of surveillance by sending requests for access to public information. Later, a law came into force requiring the publication of statistics on how, and on what scale, the state services use surveillance tools. This solution worked for a long time, and only recently has it changed under the current government [the governing party at the time was Law and Justice – ed.]. The information we have, which is not very detailed, speaks of a large figure of 1.8 million data points on citizens – based on data retention by telecommunications companies.

So this is not about eavesdropping on phone calls or reading text messages or emails, but about the location of devices: who talked to whom, which phones travel together, and so on. Such large numbers usually arise from the way mobile phone base stations (known as BTS) operate. If the police want to check whether a particular device was in a specific location at a certain time – or which devices were present together – they usually have to collect data from the whole area and thus gain access to huge amounts of data. I am just giving an example of how the state services use data, not arguing that 1.8 million data points is fine. I do not know whether it is.

In our opinion, the problem is not how many times the services have checked up on someone or how many data points they have technically obtained, but how this data was used, whether the scale of the action was proportionate to its purpose, and whether data irrelevant to the case was immediately deleted without any further consequences.

However, if we consider another scenario, in which the police use a bomb alert or another event – which can be easily staged – to intercept data from one BTS in the centre of Warsaw, creating a data pool that can then be used operationally, that is where we have a worrying situation. To sum up: the scale of data collection by state investigative and intelligence services would not worry me very much if I understood how the data is then used – and we do not know that.

In Poland we currently face the problem of a lack of effective oversight of this area. We have courts that decide on phone wiretaps, but that is a completely different scale – thousands, not millions, per year. In practice, courts receive applications that are poorly justified and not detailed enough for the court to handle the case responsibly. And since decisions must be made very quickly, 98% of applications are approved – which means the process is almost automatic, and as such is criticised by members of the judiciary themselves. Judges are under pressure and, having in practice no tools to properly scrutinise an application, they usually approve it, reasoning that if the data were misused by the services, this could be verified while the case is pending, as data obtained under judicial control becomes part of the case file.

We may therefore assume that if the data is misused, this should be visible to the judge in the course of the proceedings – and once the case is closed, the data obtained by the services for the investigation should be destroyed. Is that what happens?

Well, probably not. In Poland there was a major scandal over the services' use of the Pegasus spyware, which, in addition to tracking, eavesdropping on and watching the smartphone user in real time, allows access to all information stored on the device, as well as staging provocations, planting compromising content, and creating content that never existed (e.g. email messages in the user's email account). In our opinion, the use of such software should not be treated the same way as phone tapping. I think we could win this dispute in court, but so what? These things happen.

So, as I have already mentioned, the main problem is not the scale but the ability to hold the perpetrators of abuse to account, which in our country does not function. The Panoptykon Foundation has brought a case on behalf of me, my colleague from the foundation – Wojciech Klicki – and several other lawyers who have reason to believe that we were under surveillance for a time. We argue that we should be informed of this fact once the investigation is closed. We hope that the European Court of Human Rights in Strasbourg will confirm that this standard should apply in every European country – and that in Poland we will have the right to oblige the services to notify people who were under surveillance once the investigation is closed, in order to increase the accountability and transparency of these actions. This is one example of a legal safeguard we do not have.

As I mentioned earlier, we also lack effective oversight of digital surveillance, and access to data stored by telecommunications companies is remote, without any involvement of the judiciary. The authorities have also attempted to change Polish law to make it even softer and more flexible for law enforcement – but this was stopped by social protests in which we took part as the Panoptykon Foundation. The purpose of that draft law was to extend the existing data retention mechanism to online services, which would certainly increase the pool of data available for remote access without any oversight.

The state services would thus have access not only to data from my phone, coming from telecommunications operators, but also to data stored by every internet service provider – reaching much deeper into our lives, allowing insight into the logs of potentially every online activity, every email sent, every chat conversation, every message in a messenger app, and so on. Today, people who fear being monitored through data retention by telecommunications companies can use safer chat applications such as Signal or Telegram. These are not controlled by telecommunications companies or Big Tech, and their users feel that at least this space of safe communication remains for them. If these services were subject to the same data retention obligations, we would lose it. So the fight continues, and so far we have stopped this attempt in Poland.

Thank you for the discussion.
