We talk to Katarzyna Szymielewicz and Professor David Lyon about how we are surveilled by the state and by corporations, the details of monitoring systems in China and India, the situation in Poland, and the role of civil society in the fight for freedom.
(The text is a translated, edited and abridged transcript of a debate held on 30 September 2023 in Warsaw during the 5th Geopolitical Forum organised by the Civil Affairs Institute.)
David Lyon – sociologist, Professor Emeritus of Sociology and Law at Queen's University, Canada, former director of the Surveillance Studies Centre, and a member of the editorial boards of a number of journals, including Surveillance & Society and The Information Society. One of the foremost theorists of the "surveillance society", he has been active in surveillance studies for over 25 years and has regularly published books and articles on these issues, including The Electronic Eye (1994), Surveillance Society (2001), Surveillance after September 11 (2003), Surveillance Studies (2007), Identifying Citizens (2009), Liquid Surveillance (co-written with Zygmunt Bauman, 2013) and Surveillance after Snowden (2015). His books have been translated into 17 languages and have become classics of the surveillance studies literature.
Katarzyna Szymielewicz – lawyer, social activist and commentator, member of the Digitisation Council at the Ministry of Digitisation, co-founder and president of the Panoptykon Foundation, a Polish NGO defending human rights in the context of contemporary forms of surveillance. For more than a decade, she has coordinated efforts for better regulation of technology companies in Poland and the European Union. She hosts the Panoptykon 4.0 podcast and publishes in opinion-forming media. A member of Ashoka, the global network of social entrepreneurs, she holds a degree in law from the University of Warsaw and a degree in development studies from the School of Oriental and African Studies.
Anna Turner – Assistant Professor at the Institute of Philosophy and Sociology of the Polish Academy of Sciences, where she works on issues related to the impact of new technologies on society. She conducts international research on public attitudes towards surveillance and privacy. She is a member of the teams of two large research projects: the Polish Panel Survey POLPAN, where she deals with topics related to threats, and the International Social Survey Programme, where she participates as an Advisory Expert in the development of the Digital Societies module. Author of publications and participant in academic conferences. Vice-chair of the Digital Sociology Section of the Polish Sociological Association.
Anna Turner: Poland's past provides a fascinating context for discussions of surveillance. Older people remember government surveillance during the totalitarian period, while the younger generation experiences different forms of control, including monitoring and data processing by multinational corporations. Today, surveillance technologies affect virtually everyone, especially internet users. Let's start our conversation by asking about your understanding of the surveillance society: can you identify crucial moments that have shaped it in recent years?
David Lyon: It's an important question, and there are various answers to it. I think of surveillance in relation to the data that affects us – such a broad understanding has become central to many issues.
The notion of surveillance, which is surely familiar in Poland, refers primarily to state surveillance. Nowadays, in many if not most countries in the world, the state is strongly linked to commercial activities and to corporations, and so these two kinds of entities work closely together. The state is often found to be using data from the commercial sphere. During the pandemic, the Canadian federal government bought data from a telecommunications provider (it was mobile phone data) and used it to try to monitor the spread of the virus. This is an obvious example of how the state can rely on data from commercial companies. It is now a much more complicated issue.
So, when I talk about surveillance, I mean an interest in every action, every aspect of personal life, every activity that reveals to others some information about us, which can then be collected in some way. It's not just literal visual surveillance; today it is primarily surveillance through mobile devices.
Katarzyna Szymielewicz: I will refer to how the mission and scope of the Panoptykon Foundation has evolved over almost 15 years. As David noted, the initial purpose of our activities was to analyse state surveillance practices affecting citizens. In a situation where we were dealing with the global war on terror and the undermining of any protection of human rights, it became clear to us that we had crossed the line drawn by philosophers such as Foucault and Agamben, beyond which no one is safe from the terror of the state. If we, as a civilisation, as communities, accept (we never did, but politicians presented it as if we had) the killing of a person in defence of our society – this also means that we ourselves can be killed in defence of a society from which we have been excluded. Looking at this dynamic of state surveillance and deciding who fits in and who is outside society – who deserves to survive and who deserves to die – was the first big idea behind our work 15 years ago.
The more we explored the issue, the more we discovered the complex power dynamics mentioned by David, in which the market and the state work hand in hand to justify the need for surveillance of citizens and to produce the tools and infrastructure that enable it – it is fundamentally one and the same ecosystem. The turning point in this discussion was the Edward Snowden revelations, which proved beyond doubt that this was the case: that data collected by commercial companies previously associated with freedom and access to information, and with a reputation as "the coolest companies in the world", such as Google, Facebook and others, was an active part of the surveillance apparatus. It then became clear to us that the main focus of our work should be the practices of these companies. This is not to say that state surveillance is no longer dangerous. It is, but we understood that this is how the world works.
We will probably not be able to replace a surveillance state with a non-surveillance state, so it is better to create some kind of regulation to protect citizens' rights. To some degree, surveillance is necessary for the state to fulfil its functions, to protect us when we really need protecting and to organise public services when we need them; but in the market for online services, surveillance should not be part of the package.
Over the past five years, much of our work has been about precisely such companies and their regulation, not least because surveillance has literally become invisible and elusive for customers. When we encounter state surveillance, we at least know that someone is exercising power over us. We may feel intimidated, threatened, uncomfortable. It is completely different in a commercial environment, where surveillance is sold to us as a convenience: "Do nothing. Don't decide. We'll do it for you."
I think this new wave of surveillance, based on tools of comfort, is linked to people becoming more passive and "happy" – they don't consciously choose it, but they feel happy, withdrawing from active choices and simply giving in to suggestions, recommendations and targeted advertising, watching how companies shape their lives. This is probably much more dangerous for society than surveillance by the state, because it is much harder for us to stop – to notice what is going on and try to question it. This is what I see as the main problem with surveillance at the moment.
A.T.: As a researcher, I am fascinated by comparative analysis, especially in terms of what societies have in common and what makes them different. Research in Western countries shows that we have a mostly negative view of surveillance and of the use of our data without our consent. However, it is worth noting that acceptance of surveillance practices increases when these actions are motivated by security concerns. In other words, I do not tolerate my data being used without my knowledge, but I change my mind when I am convinced that it is necessary for security.
A country with a very different attitude to surveillance from ours is China. I will refer to a study whose findings come from a fascinating book published a few weeks ago by Ariane Ollier-Malaterre, entitled "Living with Digital Surveillance in China: Citizens' Narratives on Technology, Privacy and Governance". In the conclusion, the author writes: "Chinese respondents to the survey believe that the development of technology will restore China to its former glory, solving all of China's problems. They accept surveillance techniques because they see the government as a trusted guardian, almost a parent, who is needed when 'moral quality' is lacking. In other words, respondents say it is a form of discipline needed to counter chaos in such a huge country."
We can thus see how perceptions of surveillance techniques in China differ significantly, both in comparison with studies conducted in Western countries and in the context of the narrative presented by the Western media, in which the Chinese Social Credit System is portrayed as an example of Machiavellian, totalitarian control. What is your opinion on this, and is there anything we can learn from the Chinese?
D.L.: The situation in China is fascinating and very different from what we experience in Canada and, I understand, from what you experience here. The cultural differences make it impossible, in my opinion, to draw simple comparisons between these cultures – as Westerners we are very different from the Chinese, who are shaped by a heritage of Confucianism. The US in particular, but also many other Western countries, tend to see China as some kind of avoidable dystopia, leading to growing tensions in bilateral relations, with very little understanding in the West of what is actually happening in China. I agree that Ariane's book is a successful attempt to dispel some of the stereotypes and glaring errors in our thinking about China.
You have highlighted a different perception of state surveillance. This has to do with the national humiliation that the Chinese have experienced in many ways over the last century. The invasion by Japan in the 1930s, for example, is still a subject of tension in China. This humiliation affects the way the Chinese think about the relationship between citizen and state. Here we have a fundamental difference with Western countries. When it comes to the Social Credit System, the Chinese are afraid of being shamed because of a low score on a credit report – this shame is, in my opinion, much stronger than in Western societies. We are talking about similar phenomena, but they are experienced differently because of cultural backgrounds.
What we are dealing with in China is not so much surveillance capitalism – to borrow the title of Shoshana Zuboff's 2019 book – as state surveillance capitalism. We cannot simply apply Zuboff's diagnoses to describe the Chinese reality, as the state factor plays a much greater role there than in Poland, Canada, the UK or many other Western countries. Therefore, before delving into specific issues, let us be careful not to hastily extrapolate Western realities to Chinese ones, and not to make erroneous assumptions either about the motivations that drive the behaviour of Chinese citizens or about the goals that guide the actions of the Chinese government. Personally, I am not enthusiastic about Xi Jinping's rule – but I try to understand his motivations and assumptions, taking into account the cultural context in which he operates.
K.S.: I fully agree with David and, like him, I avoid comparing us with China. Such attempts seem downright ridiculous to me. This is not a criticism of your question – I know you are asking it because that is the media narrative, as you yourself mention. China has no intentions towards us; that is, if they have any plans, they are not focused on Poland but involve the whole world and great games with much more powerful players. China has become a kind of smokescreen behind which we hide the problems we face in the West, and the message that "we are not China, after all" is intended to shut down the discussion that should be going on. I think this is a fundamentally flawed approach. I think we should keep a close eye on China. Like David, I am not a fan of their practices as such, but the consistency, precision and prudence with which they are implemented is impressive.
Let me give two examples that I did not know about until I started asking myself questions about China. According to the researchers I spoke to during my own study of the issue, the Social Credit System was designed specifically for social inclusion. So it is the same situation as in other developing countries, such as India, where people have no identity documents, half the population is uncounted and unidentified, and citizens have no identity in relation to the state. These are very different realities from ours, because we are counted, identified and monitored. So countries like China and India are creating systems to socially integrate residents so that they can take out loans, travel or receive benefits. We, on the other hand, apply our cognitive filters to these processes and criticise the acquisition of information about citizens by the governments there, without being aware of what their starting point was and what challenges these programmes are responding to. This was one of the reflections that came to me while discussing China's Social Credit System with someone who knows Chinese society much better than I do.
Another example is big companies such as TikTok, which have come in for a great deal of criticism in Europe – and with good reason. Yet in the West, we don't really understand how these companies operate in China – from my observations, they are controlled and their operations are governed by the state, whose policies set their direction. To reiterate, I am not a great fan of these solutions, but if we have a strong state that is able to control what the monitoring companies do, and these companies cannot cross certain red lines – for example, they cannot offer to children there what they offer to children here: making them addicted to technology, providing them with content that children should never watch, manipulating their minds (and clearly this is not the case in China, because the relationship between these companies and the state looks different) – then perhaps this is something we could learn from the Chinese.
D.L.: I will refer to the point Kasia has just made. In the West, we often cite the example of China as a kind of dystopia to be avoided. It is a path we don't want to follow, because we make a lot of assumptions about it – for example, it is suggested, or even stated outright, that the Chinese Social Credit System was developed by the Chinese state to track and control every citizen. Yes, the social credit systems that exist in China are primarily run by the government, but they mainly measure the activities of corporations. They do not collect data on individual citizens or consumers.
There is no unified system called Social Credit in China. Since 2014, there has been a plan to develop some aspects of the Social Credit System, which was supposed to be realised by 2020. This has not happened. Across the country, many people have objected to specific parts of it, and in many cities corporations are rejecting certain elements of the Social Credit System because they feel it is inadequate to the task for which the government set it up. So let us not imagine that China has a unified system of top-down, totalitarian control exercised by the authorities in Beijing. It is far more complex, far more fluid and far more open to contestation. Over time, corporations have opposed, changed and also withdrawn some elements of it.
Kasia also referred to India. India's population will soon surpass that of China. If we are looking for a uniform system that covers every citizen of a country, why are we not looking at India – which is never mentioned as a country whose solutions we want to emulate or avoid? India has one central, state-organised biometric enrolment system [Aadhaar – editor's note] – the Indian Prime Minister invited the head of India's largest technology corporation, Infosys, to set it up. There are currently 1.4 billion people registered with it. Technologically and administratively, it is a staggering system, built at a pace that is hard to believe and based on iris scans (as well as facial photos and fingerprints). The scans of 1.4 billion human irises are held in one unified, comprehensive system in India. It fulfils an extremely important function in Indian politics, but it is of course also the subject of much controversy. Nevertheless, what we have here is a biometric system that was initiated by the state but realised with the support of a large corporation, and which effectively covers all citizens.
K.S.: We can also mention companies like Meta or Alphabet, which have identity-based systems used by billions of people. They serve as online identity providers, and perhaps in the future – I hope not – their services will be accessed using fingerprints, which will become the default way to enter the system. It is just a matter of programming the devices, the mobile phones and tablets, that we use to access these services. This solution makes sense because it is fast, and people like the speed, the reliability, the intuitiveness, the promise that they won't be hacked – and they don't even think about handing over their data to private companies, although they are opposed to the state acquiring it. Of course the state will get it too. It already has it! We are fingerprinted at airports in the European Union and there is nothing we can do about it. My point is that such a future awaits us here too, so it is better to concentrate on the practices of the authorities here and now, instead of making colonial claims to teach others how to protect the privacy of their citizens.
A.T.: I have to admit that I too have noticed the patronising tone of some of the commentary in the Western media, indicating a rather complete lack of understanding of local conditions. Research by Ariane Ollier-Malaterre shows that many Chinese citizens are not even aware of the existence of the Social Credit System, and that digital surveillance programmes are not something that concerns them very much.
Innovations emerging at a dizzying pace are being implemented rapidly and on a large scale. How serious are surveillance practices in the West, and is enough being done to control and regulate them from a legal point of view? Is it at all possible for changes in the law, which is usually rather slow, to keep pace with such rapid developments in technology?
D.L.: We need to think about what is actually happening in our societies. I think Shoshana Zuboff's work on surveillance capitalism is very helpful in this regard. I disagree with some of the author's conclusions, but we respect each other. I think Shoshana hits the nail on the head on some really important aspects of today's surveillance practices, noting that corporations are heavily involved in the collection of personal data and that this data is extracted from our everyday behaviour. So what we are dealing with here is not the action of some alien force finding and aggregating information about us. Rather, the point is that it is we, through the online activities associated with our use of digital devices, who are constantly producing data that is then collected. This data is very valuable and can be monetised. This is how the big digital corporations make money: by processing this data with algorithms to use it for their own purposes and resell it to others.
I gave the example earlier of the Canadian corporation Telus, which during the pandemic sold mobile phone usage and mobile traffic data to the Public Health Agency of Canada. None of the 33 million people whose data was sold to the government knew that such a transaction was taking place. The pandemic was used as an excuse for a quick data grab, and I imagine the people at the Public Health Agency of Canada didn't even think anyone would ask them questions about it. But they did. I think there needs to be a serious rethink of how citizens' data is currently captured and collected. Today, even data about our relationship with the state is often collected in the commercial sphere. Of course, there are still government security agencies that have access to and use our data.
Today, however, the key question concerns the commercial use of personal data, which can also be used by government bodies – from the police (who love to access data provided by corporations) to public health agencies, security agencies, and a variety of government institutions that often depend on this information. Sometimes data about us is collected by these institutions themselves, but nowadays it is increasingly obtained from commercial agencies and corporations.
K.S.: Let me come back to the question of whether it is possible for the law to keep up with the development of technology: I believe that it should not even try. After all, the law should never arise before problems are defined – otherwise we would see it as authoritarian, despotic and dystopian, as if the state knew better than us how to prevent problems before we have defined them. A very good example of a law that was drafted on the basis of sound problem definition is the General Data Protection Regulation (GDPR), as it is built on assumptions that have existed since the 1970s.
We are thus talking about more than 50 years during which the idea that no data should be collected about a person without a valid reason specified in the law has worked well in our reality. This reason could be the best interest of that person or their consent. But it could also be a policy implemented by a state that is able to justify the necessity of such an action. We would probably agree that this is a very good principle. However, it creates a reality in which the state could implement a hypothetical policy whereby citizens are required to smile when crossing the border – and introduce facial scanning to ensure that more of them smile. We, however, can then say: no, this is illegal. And take up the fight on this.
If we as citizens know what we are defending – if we are motivated to defend our freedom – we can win. If we don't – that is, if we actually succumb to narratives that offer us the false promise of safety in exchange for our freedom – even the best assumptions won't help us, because we won't defend them in the specific case where our freedom is violated. Or worse, we will allow ourselves to be persuaded that this is a situation in which our consent should not matter.
So much for the state; but let's talk about a market that has been regulated by data protection laws for decades, and yet companies such as Alphabet (formerly known as Google) and Meta (formerly known as Facebook) have developed an astonishing surveillance apparatus that even China would not have been able to build on its own, without the involvement of commercial companies (as it does now). How was this possible? For one thing, these companies were created in a country where there was no such regulation, namely the United States. There is currently a fierce discussion around this issue in the US, with questions being asked about how this situation could have happened. I constantly hear from US politicians and scientists how much they regret it. But there were reasons why these regulations were not put in place: economic development, and a specific approach to freedom as a default value as long as the harm to society is not obvious enough to justify limiting the freedom of corporations to act. We fail to see how much corporations do to defend this freedom of action of theirs – and the American public has chosen this narrative. It has accepted the actions of the digital giants because it has been seduced by promises of comfort, growth, free and attractive services, and so on.
That's how it started. And then these processes reached Europe and – even though we had our laws and regulations – these large market players managed to circumvent them in many ways, mainly thanks to their incredible ability to craft a narrative. It took a long time for our courts, the European Commission and even NGOs representing citizens to effectively counter them by formulating a counter-narrative. Shoshana Zuboff's book played a key role in this process, so regardless of what I think of her argument, I like the overall way she presents the problem of surveillance capitalism and holds companies responsible for creating this system and circumventing many of the safeguards. Zuboff's book was one of many warning signs, alongside the Cambridge Analytica scandal and the Edward Snowden revelations. Decision-makers understood what they were dealing with – and that the crux of the problem was not shoe advertisements displayed to consumers with their consent on services such as Facebook. That is not the issue here. The problem is the behavioural surplus, as defined by Zuboff, which we have also discussed today. It is the data about us – about our behaviour, our choices, our preferences – which is collected and used without our consent or even without our awareness.
For a long time, companies avoided the consequences – although there were regulations covering precisely these issues – by arguing that this was not personal data. This shows what the real problem with technologies is: very often we do not understand them well enough to put adequate regulations or rules in place. If we had had a different narrative and a better understanding of what happens on the other side of our screens, we could have applied more effectively the regulations that were already in place before Facebook was created, and stopped these practices. However, this was beyond our societies' capabilities.
People like us – NGOs, university lecturers, hackers, groups like the Chaos Computer Club in Germany and the Electronic Frontier Foundation in the US – warned that these processes were happening, but it was a niche, avant-garde concern, not fully understood until big names like Zuboff or popular films like The Social Dilemma on Netflix came along and changed the narrative. It took our societies two decades to understand what was happening. Therein lies the problem. I wouldn't put all the blame on the law, and I would never encourage lawmakers to move faster, even before the problems are defined. For too long we presented the issues surrounding the use of services such as Facebook to the public as a problem of individual choice rather than a huge social problem. Now that has changed: we have new legislation, and social harm rather than individual loss is being discussed, but it has taken us two decades. The question is: can we speed up? Can we analyse the phenomena generated by new technologies faster? If we work at the current pace and need two more decades to understand how new services work, that is not a recipe for maintaining freedom.
D.L.: I agree with you, Kasia. It is important to place what we are discussing in the right context. Just as cultural factors play an important role in China or India, in the West I notice two aspects that I think play a key role in this situation. One is the belief that technology is the key to solving all our problems, which fits with the idea of technological solutionism. This is just a myth, but companies very much want us to believe in it. The second component is our perception of our own actions through the lens of convenience. Convenience has been elevated to the highest value, although in my opinion it should not occupy that place. So when we are sold an iPhone or another such device, the main argument for buying it is usually convenience (not to mention the fact that we pay a few hundred dollars more for that convenience). The idea of convenience has been very effectively implanted in our minds as consumers.
I fully agree with Kasia that cultural factors have been instrumental in the failure of state institutions to introduce regulations that limit the activities of the digital giants. However, the fact that these companies operate as if they were accountable to no one in particular stems from that cultural background: the belief that we have technological answers to all challenges, and the belief that convenience is an inherent human value.
A.T.: When you talk about this, I am reminded of an article you wrote with Zygmunt Bauman, among others, "After Snowden: Rethinking the Impact of Surveillance", in which you identify three factors that influence the acceptance of surveillance practices: fear, fun and familiarity. I have spoken about fear before – it is often relied upon by government agencies who argue for the need to monitor data and information to keep citizens safe. Fun has evolved from the fairly simple mechanisms that social media initially relied on, such as contact with long-lost friends, to convenience, which has become a key value. Familiarity with surveillance techniques, in turn, is nothing more than the ubiquity of surveillance, surrounding us in so many ways and in so many places that we no longer notice it; yet some of us still try to take steps to protect our online privacy.
This brings me to my next question, about responsibility. I will refer here to a Eurobarometer survey in which respondents were asked who they thought should ensure that the personal data they provide on the internet is collected, stored and transferred securely: the government, internet companies, or themselves? In most countries, respondents said that they themselves were responsible. Doesn't it seem surprising that people feel they have some control over the processing of their data, even though in reality – knowing the rules of surveillance capitalism – there is little they can do?
K.S.: It's not about how they feel about it, but about what they've been told. I see a parallel here to the narratives associated with environmentalism, when it became quite clear to the world's biggest companies that the problems had been noticed and they would no longer get away with polluting our planet (this was some 20 years ago). The change in narrative – funded by these companies – was often done in a sham way, so that the viewer could get the impression of dealing with citizen campaigns to, for example, reduce plastic consumption or reduce air travel. It's great when consumer behaviour changes to become more responsible, but that is the last piece of the puzzle, because the real power always lies with those who create trends, produce goods and then sell them to us. For large companies, changing the way they produce – for example, reducing their use of plastic – is a matter of a single decision; whereas consumers are constrained on so many levels, by time, economic pressures or a lack of access to other goods, that an attempt by companies to shift responsibility for, say, the climate crisis onto them is simply unfair, and we should definitely fight this kind of narrative.
At the same time, I believe there are steps each individual can take to protect their privacy. For example: don’t always take your phone with you. Or think twice before installing anything on it, and don’t allow the device to use your location unless absolutely necessary. So there are things we can do, and very expensive devices such as iPhones help us make these choices – because we pay the manufacturer for more protection. But is this an option available to the average consumer? Not at all. It is a luxury service for the few, but marketed as your choice: 'You want to be protected? Buy an even more expensive device. Think twice before doing something’. This is not fair.
We must target those who have the power to change the ecosystem, to change the logic of the services and the business models behind them – the behavioural surplus so accurately described by Zuboff. We should never allow companies to collect and exploit our 'behavioural surplus’ – to them it’s just data, but to us it’s our lives, digitised traces of our lives that should never become part of a service. And that is why the responsibility lies with companies, because as individuals we cannot remove the traces of our lives from the devices and services we use as part of those lives. That is not feasible. I can opt out of sharing my location or receiving notifications, but I can’t opt out of sharing my behavioural data with Google, because their services run on that data to some extent. This needs to change, and we need to keep up the pressure and demand accountability from the digital giants.
D.L.: I agree that as individuals we could be more careful. Perhaps not as careful as I try to be. I don’t have a mobile phone, which is a real inconvenience for people who want to contact me. So, by not sharing the belief that convenience is the highest value known to human beings, I become an inconvenience to others. But that is another story.
The problems we are talking about are not individual problems. We may experience them as individuals, but they are social in nature. How we are perceived by digital corporations is not just based on the data stream coming from us, but also depends on those we are connected to and interact with. It is membership in groups that builds our profiles. When you are online in any way, your profile is built from your contacts – both business and personal.
No one should pretend that this is just about us as individuals – and it is really important that we realise this. This is an area where I think civil society has a key role to play. It is civil society organisations, such as the Panoptykon Foundation, that take up these issues and propose solutions to the problems they diagnose. In the US, computer scientist and activist Joy Buolamwini founded the Algorithmic Justice League to help programmers who create algorithms understand that social justice issues are built into the way algorithms are made, and that algorithms themselves can be grossly unfair and discriminatory. Linnet Taylor addresses issues of data justice by thinking specifically about those who are economically disadvantaged and who tend to be disproportionately harmed not only by the position they already hold, but also by corporate profiling. Civil society action takes us away from thinking in terms of the individual as opposed to the state. Civil society groups are always looking for ways to alert government authorities – who have a duty to citizens – that a particular kind of technology is having a negative impact on certain groups in society, on their life chances and certainly on their development as human beings.
A.T.: My last question is to Katarzyna Szymielewicz and concerns the situation in Poland. Does the Polish state surveil its citizens without their knowledge and consent, and if so, to what extent? And how does this relate to the recently proposed changes to the Electronic Communications Law?
K.S.: The Polish context is not unique in our view, and we have been studying it for more than a decade. The scale of state surveillance in Poland is not shocking. However, it should be noted that we have less and less clarity on this issue. When we started our activities as the Panoptykon Foundation, we obtained information about the scale of surveillance by filing requests for access to public information. Later, a law came into force requiring the state to publish statistics on how and to what extent state services use surveillance tools. This solution had been in place for a long time and only recently changed under the current government [the ruling party at the time was Law and Justice – editor’s note]. The information we have, which is not very detailed, speaks of a large number – 1.8 million data points on citizens – based on data retention by telecom companies.
So it’s not about listening in on phone calls or reading text messages or emails; it’s about locating devices: who has been talking to whom, which phones are travelling together, and so on. But such large numbers are usually due to the way mobile phone base stations (so-called BTSs) work. If the police want to check whether a particular device was in a particular location at a particular time – or which devices were present there together – it is usually necessary to collect data from the entire location, and thus to access huge amounts of data. I’m just giving an example of how government services use data, and I’m not arguing that 1.8 million data points is OK. I don’t know if it is OK.
In our view, the issue is not how many times the services checked a person or how many data points they technically obtained, but how the data was used, whether the scale of the action was fit for purpose, and whether data that was irrelevant to the case was immediately deleted without any other consequences. However, if we consider another scenario, in which the police use as a pretext a bomb alert or another event that can be easily staged to capture data from a single BTS in the centre of Warsaw – thus creating a pool of data to then use operationally – then we already have a worrying situation. To summarise: the scale of data acquisition by state investigative and intelligence services does not worry me very much, provided I understand how the data is then used – which we do not know.
In Poland, we are currently facing a lack of effective oversight in this area. We have courts which decide on phone tapping, but this is on a completely different scale – thousands, not millions, per year. In practice, the courts receive applications that are poorly justified and too lacking in detail for the court to review the case responsibly. And because decisions have to be made very quickly, the consequence is that 98% of applications are approved – meaning that this happens almost automatically, and as such it is criticised by members of the judiciary. Judges are under pressure and – in practice lacking the tools to thoroughly review a request – tend to approve it, reasoning that if the services misuse the data, this will be verifiable while the case is pending, as data obtained under the control of the court becomes part of the case file.
We can therefore assume that if there is misuse of data, it should become evident to the judge as the case develops – and when the case is closed, the data acquired by the services for the investigation should be destroyed. Is this happening? Well, probably not. In Poland, we had a major scandal over the use of Pegasus spyware by the services against, among others, opposition politicians. In addition to tracking, eavesdropping on and spying on a smartphone user in real time, Pegasus allows access to all information stored on the device, as well as staging provocations, planting compromising content, and creating content that never existed (e.g. emails in the user’s email account). In our opinion, the use of this kind of software should not be put on the same level as phone tapping. I think we would win this argument in court, but so what? These things do happen.
Therefore, as I have already mentioned, the fundamental problem is not the scale, but the possibility of holding the perpetrators of abuse accountable, which does not function in our country. The Panoptykon Foundation has brought a case on behalf of myself, my colleague from the foundation – Wojciech Klicki – and several other lawyers who have reason to believe that we have been under surveillance for some time. We argue that we should have been informed of this once the investigations were closed. We hope that the European Court of Human Rights in Strasbourg will confirm that this standard should apply in every European country – and that we will have a law in Poland obliging the services to notify those under surveillance once investigations are closed, so as to increase the accountability and transparency of these activities. This is one example of the legal safeguards we do not have.
As I mentioned earlier, we also lack effective oversight of digital surveillance: access to data stored by telecom companies takes place remotely, without any involvement of the judiciary. The authorities also attempted to change Polish law to be even more lenient and flexible for law enforcement agencies – however, this was stopped as a result of public protests in which we participated as the Panoptykon Foundation. The aim of that draft law was to extend the existing data retention mechanism to online services, which would certainly have increased the pool of data available for remote access without any oversight.
So the government services would get access not only to my phone data from telecoms operators, but also to the data held by all ISPs – which would reach much deeper into our lives, exposing the logs of potentially all activity undertaken online: every email sent, every chat conversation, every instant-messenger message, and so on. At the moment, people who are worried about surveillance via data retention by telecoms companies can use more secure chat apps such as Signal or Telegram. These are controlled neither by telecoms companies nor by Big Tech, so their users feel that at least this space of secure communication remains for them. If these services were subject to the same data retention obligations, we would lose them. So the fight is still on, and for the time being we have stopped this attempt in Poland.
A.T.: Thank you for the discussion.