How our personal data became a bargaining chip of political forces in elections

Members of the Working Group of the Verkhovna Rada on Reforming the Legislation of Ukraine on Personal Data Protection and Processing (established by the Verkhovna Rada Committee on Digital Transformation and the Verkhovna Rada Committee on Human Rights, Deoccupation and Reintegration of Temporarily Occupied Territories in Donetsk, Luhansk Regions and the Autonomous Republic of Crimea, the City of Sevastopol, National Minorities and Interethnic Relations) continue to work on a draft of a new Law of Ukraine "On Personal Data Protection".

Why does Ukraine need new data protection legislation? This area often remains "in the shadows", although it may affect the results of the upcoming Ukrainian elections, so strengthening the legislation and bringing it in line with modern challenges is now a pressing issue.

In a new special project, the Centre for Democracy and Rule of Law explains what personal data is, how it is used in the electoral sphere, the most high-profile scandals with voter data leaks and what neglect of privacy rules and unregulated circulation of personal data lead to.

Let's analyze the world practice in data protection and how this experience can be applied in Ukraine.
In the 21st century, we face the pervasive technologization of all processes, yet we still do not understand what providing and processing personal data means, what it affects, and what end result we, as consumers, get.

When receiving services, we hand over a large amount of our data: for example, by filling out questionnaires, registering at hospitals, shops and sports clubs, installing applications, and so on. We give the most information about ourselves to social media.
Personal data is the new "currency" with which people pay for access to information. Political forces hunt for this data and use it against us to influence voters' will in elections. Knowing everything about people, politicians can push voters to vote the way they want.
Would you believe us if we said that technologies that destroy democracy are being used against voters? And if we give examples of such cases and prove that your personal data is in danger, do you promise to protect it better?

In this article, you will learn what personal data is, how it is used in the electoral sphere and the most high-profile scandals involving voter data, the consequences of neglecting privacy rules and the unregulated circulation of personal data used for political purposes.

Let's take it from the top: what personal data is
Your personal data means any information by which you can be easily identified from among others. The Law of Ukraine "On Personal Data Protection" stipulates that personal data is information or a set of information about an individual who is identified or can be specifically identified.

Personal data is divided into general and sensitive.
General – full name, number / code of the identification document, place of residence, telephone number, email, citizenship, education, financial status, marital status.

Sensitive – racial or ethnic origin, political, religious or ideological beliefs, membership in political parties and trade unions, criminal record, health data, sexual life, biometric (DNA, voice, retina, fingerprints, facial images, height, weight) and genetic (hereditary traits, ways of inheriting characteristics within a relevant group of people, genetic information (genes) relating to any aspect of health or disease) data.
In Ukraine, sensitive data include:

racial or ethnic origin, political, religious or ideological beliefs, membership in political parties and trade unions, criminal record, and data relating to health, sexual life, biometric or genetic data.

As history shows, the use of personal data (especially sensitive data) for advertising and political purposes goes to ridiculous lengths and violates the law. And the political preferences of voters, which are sensitive data, are exactly what politicians need to exert their influence.
Politics borrowed everything from advertising
The US retail chain Target (and not only it) assigns each of its customers an ID – a unique code with which it can track everything the customer buys.

This unique code is linked to information about the goods that the buyer prefers, as well as:

  • Demographic information;
  • Age;
  • Marital status;
  • Whether they have children;
  • In which part of the city they live;
  • Probable salary;
  • How much time the buyer needs to get to the Target store;
  • Whether the buyer has recently moved;
  • What credit cards they use;
  • Which websites they visit.
There are also companies that sell personal data (commercial profilers), from which the necessary information can be bought.

So Target can easily buy customer information about:

  • Ethnicity;
  • Employment history;
  • Favorite magazines;
  • Whether the buyer was bankrupt;
  • Whether the buyer was divorced;
  • Where they studied;
  • Topics they discuss online;
  • What coffee they like, what napkins they buy, what breakfasts they have, what they read and much more.
All this data is aggregated into a customer profile in order to understand the customer's behavior and habits and to offer them the "right" products. Profiling has given rise to the "customer relationship management" industry.

Most likely you will not even know that such information about you is collected, because you are not informed about it (*carefully read the consent to the processing of personal data to know what information, by whom and why it will be processed, to whom it is transferred / sold).
For example, a single visit to a website collects data about a person's activity on the site, their gender, IP address, cookies, the device they used, their interactions with the website, mobile applications and social media, their purchase history and repeat purchases, product preferences and product selection criteria.
Thus, companies can both collect customer data using sophisticated software and purchase it. On the personal data market, companies sell customer data to third parties, and the data regularly changes hands.
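To make the idea of profiling more concrete, here is a minimal, purely illustrative sketch (the customer IDs, purchase records and category names are invented for the example and have nothing to do with any retailer's actual systems): it aggregates raw purchase events into a per-customer profile that a marketer could then query.

```python
from collections import defaultdict

# Invented sample of raw purchase events: (customer_id, category, amount_usd)
purchases = [
    ("c-101", "baby care", 24.99),
    ("c-101", "groceries", 63.10),
    ("c-101", "baby care", 12.50),
    ("c-202", "electronics", 199.00),
    ("c-202", "groceries", 41.75),
]

def build_profiles(events):
    """Aggregate purchase events into simple per-customer profiles."""
    profiles = defaultdict(lambda: {"total_spent": 0.0, "by_category": defaultdict(float)})
    for customer_id, category, amount in events:
        profile = profiles[customer_id]
        profile["total_spent"] += amount
        profile["by_category"][category] += amount
    return profiles

profiles = build_profiles(purchases)
for customer_id, profile in profiles.items():
    # The category with the highest spend is a crude proxy for the customer's interests.
    top_category = max(profile["by_category"], key=profile["by_category"].get)
    print(customer_id, round(profile["total_spent"], 2), top_category)
```

Real commercial profilers combine hundreds of such signals, including data bought from third parties, but the principle is the same: raw events are reduced to a profile that predicts what to show each person.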
Too intimate
Target analyst Andrew Pole created a pregnancy-prediction model that determines that a woman is pregnant and sends her targeted advertising (cribs, toys, images of babies, etc.). Run against Target's national customer database, the algorithm returns tens of thousands of women who are most likely pregnant. Because of this model, Target faced a situation that became notorious: the father of a high-school girl accidentally found out about her pregnancy through the coupons (maternity clothes, nursery furniture, baby photos) that Target was sending her.

Target, of course, distanced itself from the case, but the company's marketing department then conducted a study and found that pregnant women do not like having their reproductive lives monitored or being sent such blatant advertising. So Target began sending mixed advertising in which baby products looked random (lawn mowers next to diapers). After this marketing move, Target's mom-and-baby sales skyrocketed.

Modeling customers' behavior and habits so that they buy more is a science. Behavioral psychology research consumes enormous financial resources and intellectual effort, so nothing here is done at random.

The better the algorithm is designed, the more money will be made on you.
Not dangerous enough?
Metromail, one of the largest commercial profilers, used prison inmates to enter people's personal information from questionnaires into computers in order to build profiles. As a result, a man imprisoned for rape harassed Beverly Dennis, whose full profile (25 pages of personal data) he had received from Metromail; in a letter to Beverly, the prisoner threatened her. This led to the lawsuit Beverly Dennis v. Metromail, as a result of which Metromail was banned from using prisoners to process personal data. This story took place before the 2000s.

Of course, in those days the outdated legal regulation of personal data protection in the United States lagged behind the methods used to profile people and could not protect data properly. At the same time, court cases over personal data protection date back to 1985, which shows that the problem of profiling has existed for a long time.
In fact, personalization and targeting are not bad. People receive information about products and services based on their interests, which eliminates the need to search among hundreds of thousands of irrelevant goods.

However, a consumer has the right to know and understand what information is collected about them, how the algorithms work, who buys their data and how to prohibit such activities with respect to themselves.
How are elections connected with all this?
When it comes to elections and the information sent to voters, the practice of profiling and targeting takes on a negative connotation, because it discriminates against the audience through the selective disclosure of information. Modeling customer behavior has flowed seamlessly into modeling voter behavior. If algorithms have crept even into reproductive matters, imagine what they can do with the information you voluntarily post on social media, such as your political views.

Technology has gone much further and now not only studies us but also influences our major decisions. It is one thing when the decision is to buy another brand of coffee, and quite another when it is whom to vote for in an election.
(Non) voluntary information diet: how voters are forced to vote for "the right person"
Everybody seems to have heard about Cambridge Analytica and Facebook. In reality, the affair is much more sophisticated.
Realizing the potential and effectiveness of personalized advertising, politicians have been picking up profiling and targeting methods since the mid-2000s. Building on the success of personalized advertising, algorithms with even more accurate predictions were developed, based on data such as a person's extraversion or introversion, their values, thoughts, attitudes and interests. This helped to better identify the individual characteristics of people that reveal their preferences.

To understand how deeply digital technology and behavioral modeling have penetrated the minds of voters, we will explain how they were covertly persuaded to vote for the right candidate and what methods were used. Of course, without the personal data theft, it would hardly have succeeded.

Interestingly, not all voters were manipulated. So how were the "chosen" ones identified?
USA: Cambridge Analytica and the Facebook user data leak
Let's take a look at the history of the British company Cambridge Analytica (hereinafter referred to as CA).

CA was a subsidiary of SCL Group / SCL Elections (hereinafter referred to as SCL). Since the 1990s, SCL Group has worked in the field of behavioral research and communication strategies in the military and political spheres: it helped develop psychological warfare strategies as a contractor for the US and British militaries during operations in Afghanistan and Iraq, and referred to itself as a "global election management agency". By studying the "audience", SCL changed the behavior of potential voters to suit the client's order and influenced the course of elections in Italy, Latvia, Albania, Romania, South Africa, India, Indonesia, the Philippines, Thailand, Colombia, many other countries ... and in Ukraine.

CA itself was established to work on US elections. The company entered the market in 2012 and took part in 44 election campaigns: races for the US Congress (including the Senate) and, in 2014, state-level elections. The presidential elections were no exception. CA used in-depth analysis of voter data, including social media data, to build strategic communication during election campaigns – namely, to create psychological profiles and develop and send personalized campaign advertising to voters.

The activities of SCL Group and CA ran smoothly until the end of the 2016 US presidential election.
Background
"The rules don't matter for them.
For them, this is a war, and it's all fair."
– about CA executives
Christopher Wiley – former CA Director of Research
In 2014, the Republican Party commissioned CA's services. After receiving $15 million, CA faced a challenge: it lacked the voter data needed to profile voters and influence their behavior.

So they needed to come up with something. That same year, data analyst Alex Kogan developed the application "This Is Your Digital Life" – a test with statements like "I seldom feel blue" and answer options such as "Very Inaccurate", "Very Accurate" and "I don't know". At the end, the test gave a humorous verdict about the user's character traits.
At the same time, the test was designed on the basis of serious psychological research and allowed determining a person's character traits: openness to experience, conscientiousness, extraversion, agreeableness and neuroticism (the "Big Five"). To take the test, the user had to grant permission to use their data (according to the terms of use, for academic purposes). The application was implemented through the Facebook Open Graph platform, which gave external application developers direct access to Facebook users.

In 2015, the application was used by 270 thousand users.

Back then, Facebook had weak data protection and access rules: it allowed data collection to improve user interaction but banned its sale and use in advertising. Because Facebook did not prohibit data collection, the application had access not only to the data of the users who installed it, but also to the data of their Facebook friends, who had not installed the application and had no idea it could access their data. After estimating the number of friends the application's users had, Mark Zuckerberg (Facebook's founder) said that the application could have collected data on up to 87 million users.
What the users of this application did not know was that Kogan's company Global Science Research had signed a cooperation agreement with CA, under which the following user data was to be transferred (sold) to CA:

  • Data about users from other platforms owned by Facebook, including Instagram and WhatsApp;
  • Information about which advertisers send ads to users;
  • Applications and websites that are installed and visited by users and that use Facebook services;
  • Location of users;
  • Payments of users processed on Facebook;
  • Connections with other Facebook users;
  • Messages, photos and other content sent by other users;
  • All information that users disclosed on Facebook (name, gender, political preferences, relationship status, religious beliefs, etc.);
  • User's actions on Facebook;
  • Likes.
CA used the results of Kogan's test and all the data obtained from Facebook to develop an algorithm that analyzed voters and identified the personality traits linked to their voting behavior. Why? Because character traits influence political views. With voter data – place of residence, political preferences, character, complete psychological profiles – the CA algorithm effectively identified emotional voters who were still hesitating over their candidate.

These voters were sent targeted advertising for the candidate who had ordered CA's services, built around emotional triggers for each individual voter. These methods formed the basis of CA's work on Ted Cruz's campaign in 2015 and then on Donald Trump's presidential campaign in 2016.
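CA's actual models have never been published, so purely as a hypothetical illustration of the kind of selection described above (the trait scores, threshold and field names below are invented), here is a toy sketch that picks undecided, emotionally reactive profiles out of a voter list for targeted messaging.

```python
from dataclasses import dataclass

@dataclass
class VoterProfile:
    voter_id: str
    neuroticism: float   # 0.0-1.0 trait score, invented for the example
    openness: float      # 0.0-1.0 trait score, invented for the example
    decided: bool        # whether the voter has already settled on a candidate

def select_targets(profiles, neuroticism_threshold=0.6):
    """Return undecided voters with high emotional reactivity - the kind of
    narrow segment a psychographic targeting operation would prioritize."""
    return [
        p for p in profiles
        if not p.decided and p.neuroticism >= neuroticism_threshold
    ]

voters = [
    VoterProfile("v1", neuroticism=0.8, openness=0.3, decided=False),
    VoterProfile("v2", neuroticism=0.2, openness=0.7, decided=False),
    VoterProfile("v3", neuroticism=0.9, openness=0.5, decided=True),
]

for target in select_targets(voters):
    # In a real campaign each selected voter would then receive an ad
    # tailored to their psychological profile.
    print(f"send emotionally framed ad to {target.voter_id}")
```

The point of the sketch is the funnel itself: out of an entire voter file, only a narrow, emotionally susceptible and still-undecided segment is chosen to receive the message.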
"But for Facebook, we would not have won"
– Theresa Hong, Digital Content Director of the 2016 Trump campaign, on D. Trump's victory in the 2016 election

Facebook claims that, although data collection was not banned, Kogan violated Facebook's terms of use by passing user data on to CA. In reality, such data exchange is profitable for Facebook.

In a data-leak situation, both politicians and Facebook win – but voters do not. Having received user data, politicians understand "who their voter is". The more voters there are, the more advertising must reach them, and advertising on Facebook has to be paid for. For example, in the 2020 US presidential election, D. Trump spent $20 million on 218 thousand advertising posts on Facebook, while T. Steyer spent $16.8 million on almost 13 thousand posts.
The Guardian infographics
By the end of 2020, Facebook had 2.8 billion users; as of January 2021, 97% of them were adults, i.e. potential voters.

Campaigning methods based on voter data have become more "sophisticated". One form of campaigning sent to hesitant voters is digital morphing.
Morphing is a technique that lets you watch one picture gradually change into another.
In the early 1990s, the technology was digitized to make it more realistic. Digital image morphing is now used in election campaigns.

For example, we have two pictures of different people's faces, and we want certain features of the person in the first picture to be blended into the face in the second. In the first picture we mark the areas of the nose, eyes and head shape, and we mark the same points in the second. Special software then aligns the two pictures: the marked points from the first picture are overlaid on and modify the corresponding points in the second.
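The geometric warping along the marked points requires dedicated software, but the final blending stage can be sketched in a few lines. Below is a minimal, purely illustrative example (the file names are placeholders, and it assumes the two photos are already aligned so that the marked points coincide) that mixes a small share of one face into another using the Pillow library:

```python
from PIL import Image

# Placeholder file names; both images must have the same size and be roughly
# pre-aligned so that the marked facial landmarks coincide.
candidate = Image.open("candidate.jpg").convert("RGB")
voter = Image.open("voter.jpg").convert("RGB")
candidate = candidate.resize(voter.size)

# Mix 20% of the candidate's face into the voter's face. Full morphing software
# also warps the geometry along the marked points before blending, so the result
# looks like one natural face rather than a double exposure.
morphed = Image.blend(voter, candidate, 0.2)
morphed.save("morphed.jpg")
```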
(in the middle: a face subjected to digital morphing)
Photo by Fraunhofer Institute for Telecommunications
The result is barely noticeable facial changes. In the 21st century, digital morphing has become a popular technology among politicians. An experiment conducted by Professors J. Bailenson, S. Iyengar, N. Yee and N. Collins showed that similarity between the faces of candidates and voters can affect voting results: voters tend to choose a candidate who resembles them.

Given the revolution in information technology, political strategists will increasingly resort to face transformation as a form of campaigning. In the 2016 US presidential election, CA used this technology to send "adapted" advertising to "vulnerable voters".
Such "hidden" methods require strict legal regulation, as digital morphing as a form of campaigning is a deliberate manipulation of the will of voters.
Parties and candidates are interested in getting a potential voter to join them and espouse their "party views". But not all of them care about the objectivity of the information they send to their voters or about the "independence" of their choice. Here social media help them: based on users' preferences, algorithms create an information bubble.

Have you noticed that on social media you are surrounded by monotonous information selected on the basis of your interests? For example, if you love shopping and search for the relevant products, social media algorithms will fill your feed with your favorite goods. The same is true of political preferences: the more you search for and study one political force and one political view, the less other information gets into your feed. And the more ads a social network can show in your feed, the more money it earns.
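Here is a toy sketch of how such feed filtering works in principle (the posts, topics and scoring rule are invented; real recommender systems are vastly more complex): posts matching topics the user already engages with are ranked higher, so the feed drifts toward a single viewpoint.

```python
# Invented engagement history: how often the user interacted with each topic.
user_interests = {"party_A": 14, "football": 9, "party_B": 1}

posts = [
    {"id": 1, "topic": "party_A"},
    {"id": 2, "topic": "party_B"},
    {"id": 3, "topic": "football"},
    {"id": 4, "topic": "party_A"},
]

def rank_feed(posts, interests):
    """Order posts by how strongly the user engaged with their topic before."""
    return sorted(posts, key=lambda post: interests.get(post["topic"], 0), reverse=True)

for post in rank_feed(posts, user_interests):
    print(post["id"], post["topic"])
# The party_A posts come first and the lone party_B post last:
# the more you engage with one view, the less you see of the others.
```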
Thanks to targeted advertising:
  • Politicians get a voter who supports them;
  • Social media get money;
  • Voters get nothing, because their choice was influenced by "dirty" political technologies.
Implications for Facebook and Cambridge Analytica
CA's illegal activities were exposed in March 2018 thanks to journalistic investigations by The Guardian and The New York Times, which received information from former CA employees. This triggered a number of official investigations. The UK Information Commissioner's Office applied for a search warrant for CA's servers, which was granted on March 23, 2018; on May 1, CA and SCL filed for bankruptcy, which made a full investigation impossible – CA's assets were liquidated before it could be completed. Before the scandal, CA had admitted that it held about 5,000 data points on each US voter whose personal data it had obtained. And although it was stated that all the data had been destroyed and no longer exists… we have reasonable doubts about this.

Former CA executives and employees founded a number of CA-like "successor firms". In June 2018, Auspex International was founded to influence politics and society in Africa and the Middle East. Emerdata Limited was also established; CA employees joined it, and it has the subsidiaries SCL Group Ltd and SCL Analytics Ltd. In May 2018, the data analysis company Data Propria was launched. It is run by former CA officials and handled Illinois Governor Bruce Rauner's campaign and Donald Trump's campaign in the 2020 presidential election, using the same Facebook user data.

As for Facebook, in July 2018 the UK Information Commissioner's Office announced its intention to fine the social network $663 thousand (the maximum fine at the time) for violating personal data protection requirements. In July 2019, the US Federal Trade Commission voted to impose a $5 billion fine on Facebook (the largest such fine in US history), because the social network, despite being aware of the leak, had not rectified the situation for two years.

Facebook also paid $100 million to settle a dispute with the US Securities and Exchange Commission over "misleading investors about the risks they face as a result of misuse of user data".

As a result of the scandal, Facebook introduced two advertising transparency mechanisms: the "Why am I seeing this post?" button (which explains why the user sees a particular ad) and "Ad Preferences" (which shows the information Facebook has collected about the user and its sources). Even so, it is still impossible to say that the social network protects data better – starting with the leak of the data of 530 million users in 2019.
If you are waiting for the end of the story of SCL and CA, or for social media to finally ensure the protection of their users' data, don't hold your breath. Still, states have learned a painful but necessary lesson from this unprecedented abuse of confidential information.
GDPR sets the world standards for data protection
In order to avoid repeated cases of data leakage and misuse (for example, by CA), to let people understand what is happening with their data and control its processing, and, most importantly, to create stricter and more progressive personal data legislation, in 2016 the European Parliament adopted the General Data Protection Regulation (hereinafter, the GDPR) to replace the 1995 Data Protection Directive. The GDPR became applicable on 25 May 2018.

The GDPR applies both in the EU and abroad:

  • if companies outside the EU collect data on individuals within the EU,
  • or EU companies collect data on people outside the EU.
The document largely preserves the principles of the Directive, while introducing new ones, such as the "right to be forgotten". The GDPR has direct legal force throughout the EU and is applicable to all data protection authorities and courts.
The GDPR, inter alia, specified that:
  • Information on how data will be collected and used must be provided in clear and transparent wording;
  • A person has the right to object to the processing of their data for direct marketing purposes, including profiling;
  • A person must have the right to withdraw their consent to the processing of their data at any time;
  • Genetic and biometric data are sensitive information;
  • A person has (under certain circumstances) the right to require the controller to delete their data and to take reasonable measures to inform third parties about the deletion;
  • The burden of proving consent to data processing lies with the data controller (the organization that initiates the processing of personal data, is responsible for carrying it out properly, ensures the rights of data subjects and reports to the supervisory authority);
  • Violations of the GDPR are punishable by fines (up to EUR 10 or 20 million, or 2% or 4% of the company's annual turnover, depending on the tier of the violation – see the sketch after this list), as well as by blocking the operation of websites and data processing tools;
  • Independent regulators must be established that will be responsible for compliance with GDPR requirements, will deal exclusively with personal data issues, will have investigative powers and will be able to impose sanctions.
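To see how the two-tier fine cap mentioned above works in practice, here is a small sketch (the turnover figures are made up): under the GDPR the applicable maximum is the greater of the fixed amount and the share of annual turnover, so for large companies the percentage cap dominates.

```python
def gdpr_max_fine(annual_turnover_eur: float, severe: bool) -> float:
    """Upper bound of a GDPR fine: the greater of the fixed cap and the
    turnover-based cap (EUR 10M / 2% for the lower tier, EUR 20M / 4% for the higher)."""
    fixed_cap = 20_000_000 if severe else 10_000_000
    turnover_cap = (0.04 if severe else 0.02) * annual_turnover_eur
    return max(fixed_cap, turnover_cap)

# A small firm with EUR 50M turnover vs. a platform with EUR 70B turnover (made-up figures).
print(gdpr_max_fine(50_000_000, severe=True))       # 20,000,000 - the fixed cap applies
print(gdpr_max_fine(70_000_000_000, severe=True))   # 2,800,000,000 - 4% of turnover applies
```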
Since the entry into force of the GDPR (2018-2021), fines amounting to $332.4 million have been imposed for violating its requirements
The GDPR has set high standards for the protection of personal data. But the document's main positive impact is the example it sets for other states: following its adoption, countries began to improve their national legislation on personal data protection and bring it into line with GDPR standards.
US legislation after CA
There is no single law in the United States governing personal data. There are a number of federal and state laws that are classified by sector of action (financial services, healthcare, telecommunications, education) and data specifics.

For example, information about children is protected at the federal level by the Children's Online Privacy Protection Act, which prohibits the online collection of any information about children under the age of 13. The Video Privacy Protection Act protects records of the rental or sale of videotapes and similar audiovisual media from unauthorized disclosure. The Driver's Privacy Protection Act regulates the privacy and disclosure of information collected by state Departments of Motor Vehicles. However, even this multitude of laws did not help protect citizens' privacy on Facebook.

Following the example of the GDPR, the state of California (Facebook headquarters) developed and in 2018 passed the California Consumer Privacy Act (CCPA), which came into force in 2020. The law regulates personal data used for commercial purposes, takes into account modern realities and protects personal data from sale.

The law gives consumers the right to:

  • Know what personal information about them is collected, where it was obtained from, what it is used for, whether it was passed on to third parties and whether it was sold and to whom. Companies are required, through their privacy policies or otherwise, to disclose information about consumer rights, categories of personal information, purpose of collection, categories of personal information sold or disclosed during the last 12 months when collecting personal data. Consumers have the right to request this information from companies through toll-free numbers and websites;
  • Require companies to delete their personal information on demand;
  • Receive services and prices from companies on equal terms. Companies do not have the right to discriminate against consumers – to refuse to provide goods and services, to set different prices for goods or services, to provide goods and services with different quality for consumers who enjoy their right to privacy.

The Law applies to companies:

  • whose gross income exceeds $ 25 million;
  • which process the personal data of 50 thousand or more consumers or households;
  • which receive 50 percent or more of the annual income from the sale of personal data of California residents.

Violations are punishable by a fine of up to $7,500 for each intentional violation of the CCPA and up to $2,500 for any other violation.

In 2020, the California Privacy Rights Act (CPRA) was adopted; it amends and extends the CCPA and establishes a new body in California – the California Privacy Protection Agency – to monitor compliance with both laws.

The CPRA introduces a category of sensitive personal information, obliges websites to allow users to prohibit the sale of their personal information to third parties via a "Do Not Sell or Share My Personal Information" button, and requires companies to provide users with a "Limit the Use of My Sensitive Personal Information" link.

Citizens have the right to correct inaccurate information, to opt out of automated decision-making (prohibiting the use of their data for targeted advertising profiling), to know about automated decision-making (what technologies are used in processing and what results they produce), and to restrict the use of their sensitive personal information.

The issue of targeted behavioral advertising has also been settled: it is divided into personalized behavioral advertising (which you can opt out of) and non-personalized advertising (which you cannot).
The CPRA somewhat changes the scope of businesses to which it applies, namely companies and organizations:
  • whose gross income exceeds $ 25 million;
  • which process the personal data of 100 thousand or more consumers or households a year;
  • which receive 50 percent or more of the annual income from the sale or exchange of personal data of California residents.
How personal data manipulations turned into a political weapon worldwide
UK, Canada, Kenya, the Philippines... Ukrainians have reason to be concerned as well. Let's start with the role of Cambridge Analytica and AggregateIQ in Brexit
The UK Independence Party and the political group Leave.eu, which was set up to campaign for Brexit, hired CA to work on the campaign for the referendum on Britain's EU membership (Brexit). According to former CA business development director Brittany Kaiser, Leave.eu used CA voter datasets to send political messages on social media. The aim was to sway public opinion towards Britain's withdrawal from the European Union.
In addition to CA, the Canadian digital advertising and software development company AggregateIQ (AIQ) also worked on Brexit. It was independently hired by Brexit supporters – Vote Leave (which paid AIQ £2.9 million), BeLeave (£625 thousand), Veterans for Britain (£100 thousand) and the Democratic Unionist Party of Northern Ireland (£32 thousand) – to develop software to aggregate personal data and influence voters through social media.

AIQ's business concept was to collect and analyze personal data in order to personalize political slogans and send them to voters on social media, persuading them to vote according to the client's wishes (in this case, for withdrawal from the EU).

AIQ collaborated with SCL Elections during the Brexit campaign: AIQ was SCL's IT contractor, and in particular developed the Project Ripon software and advised on its use. Using algorithms fed with Facebook data, Project Ripon allowed targeted advertising to be sent to voters. According to Facebook, AIQ placed 1,390 ads on behalf of pages related to the Brexit referendum campaign. Former CA employees claim that AIQ was SCL's back office and also held the SCL database.
How football fans, personal data and Brexit are connected
There are various inventive ways to obtain voters' personal data. Vote Leave invited football fans to take part in a quiz: whoever correctly predicted the outcome of every match played at the 2016 European Championship would win £50 million. The prize was meant to draw attention to how much Britain's EU membership allegedly cost – £350 million a week (presented as a disadvantage). The UK's statistics authority said the figure was manipulative and untrue.

To take part in the quiz, a person had to answer survey questions and provide their name, address, telephone number and the option they intended to vote for in the referendum. Data collection was the main purpose of the quiz, in which the chances of winning were 2,250,000,000,000,000,000 to 1.

The information collected was passed on to AIQ, which processed it and identified the respondents' data using their Facebook profiles.

So be careful when filling out the questionnaires and pay attention to the questions you are asked. They may be pursuing a goal other than the one they told you about.
Were there consequences for CA and AIQ?
To understand the connection between CA, AIQ and SCL, it should be added that SCL was not a random startup.

Thus, Patricia "Patty" Barron, a former Deputy Secretary of State for Defense in the UK Parliament, sat on SCL's board of directors, and Lord Marland, a former UK Prime Ministerial Trade Envoy, was a shareholder in SCL and CA. So although the Information Commissioner's Office, in its investigation into the use of data analytics in political campaigns, found connections between SCL, CA and AIQ and violations in their actions, no one was held accountable.
Regulation
The history of personal data protection legislation in the UK is long: in 1984 the Data Protection Act was adopted, in 1987 – the Access to Personal Files Act. These laws were later replaced by the Data Protection Act of 1998, which implemented the Directive on the Protection of Individuals with regard to the Processing of Personal Data and on the Free Movement of Such Data.

Neither this nor the election law protected British voters from the misuse of their personal data.

Following the adoption of the GDPR, the UK revised its data protection policy and in 2018 adopted an updated Data Protection Act implementing GDPR standards. The 1998 law was outdated and did not ensure sufficient protection in the digital age (in particular, it did not address cookies and similar technologies). The 2018 law guarantees the protection of users' digital footprints, prevents data from being sold to third parties without a person's consent, provides for the right to be forgotten, gives a clear interpretation of exceptions, and implements the GDPR in the UK.
CA is also known for influencing elections in several other countries
Canada
The personal data of more than 620,000 Canadian Facebook users was passed on to and processed by CA, which undermined voters' confidence in political campaigns. After the leak of Facebook users' data and its use by CA in the US election, Canada studied the issue in order to protect the privacy of its citizens and bring its legislation in line with GDPR standards.

Even before the GDPR, Canada had problems with personal data protection due to the limited powers of its regulators. Supervision of compliance with personal data law in Canada is shared between the federal government and the provinces. In particular, the federal regulator, the Privacy Commissioner of Canada, had no right to prosecute political parties for breaches of personal data laws or to impose administrative penalties. Only one province, British Columbia, has such powers.

  • Parties also collect data on voters: the example of British Columbia
British Columbia law allows parties to obtain voter lists (including names and addresses) from the Chief Electoral Officer of British Columbia. For parties, this is the basic information used to profile their voters. Personal information is also obtained through door-to-door campaigning (one of the oldest methods): campaigners or a candidate come to the voter's home, create a positive impression of the party or candidate and persuade the voter to vote. By opening the door to the campaigner, the voter may agree to provide contact information, and such collection of information is lawful. What the voter does not know, and is never asked to consent to, is the collection of information about ethnicity (remember that in Ukraine this is sensitive information and its collection is prohibited), disability, and so on.

  • Regulation
After examining the "gray areas" of Canadian law, the House of Commons Committee on Access to Information, Privacy and Ethics recommended that the Government of Canada:
  • Apply confidentiality legislation to political parties;
  • Introduce requirements for organizations and political parties that use data in targeted political advertising and psychological profiling on the transparency of personal data collection through social media and other online platforms.
The question arose of revising the legislation, in particular strengthening the protection of citizens' personal data and giving the Privacy Commissioner broader powers under the Personal Information Protection and Electronic Documents Act, in particular to:
  • Impose fines on anyone who violates the relevant legislation;
  • Seize documents during the investigation;
  • Provide the Competition Bureau, other Canadian and international regulators with information about the investigation.
Under the updated 2018 Cyber Security Strategy, a state security body was created – the Canadian Centre for Cyber Security. In 2019, Canada introduced the Digital Charter (a plan from Canadians to Canadians), which set out a detailed plan for achieving the goals of protecting confidential information. All willing citizens were involved in its development.

The Elections Modernization Act proposes to include in the Canadian Elections Act a provision requiring political parties to adopt and maintain privacy policies and to publish them online.

In the article "Big Data and Democracy: Regulators Perspective", British Columbia Information and Privacy Commissioner Michael McEvoy notes that some British Columbia political parties have expressed concern after the CA and Facebook scandals about how the law affects their ability to reach and "communicate" with voters as the regulation of personal data will only get the situation worse. And here we can't but disagree with McEvoy's position that after the "data leak" the number of citizens who will not vote at all due to distrust has increased. Only if political parties ensure confidentiality, protection of personal data and their fair use, will voters actively participate in political campaigns.
India
SCL India, headquartered in India, offered political campaign management services, including social media strategy development, election campaigning, online reputation management and day-to-day social media account management.

The Indian National Congress commissioned CA to conduct an in-depth analysis of the electorate and influence voters, including in elections to the Bihar Legislative Assembly.

The voter data came from the same application, "This Is Your Digital Life".

It was installed by 335 people in India, giving access to the data of more than half a million Indian voters.

SCL India is known to have conducted a campaign in India similar to that in the United States.

  • Regulation
As of 2018, there was no special legislation in India governing the protection of personal data or the confidentiality of information. The only laws that mentioned personal data were the Information Technology Act and the Contracts Act.

In 2011, the Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information Rules were adopted, which defined personal and sensitive data (passwords, financial information, health status, sexual orientation and human biometric information). At the same time, information that was freely available was not covered by the protection, as it was not considered confidential.

A person's name, location, preferences and social media friends list were not protected. The rules did not apply to data not stored on electronic media.

In 2019, a Personal Data Protection Bill was introduced, which was developed on the basis of the GDPR.

The bill provides for the establishment of a central regulatory body and regulates the personal data of individuals and its processing and storage by both the state and companies registered in India. At the same time, it has been criticized, so work on personal data protection legislation continues.
Kenya
CA was also involved in Kenya's 2013 and 2017 presidential elections, hired by 360 Degrees Media Consulting (a media and education consulting company). In a survey of 47,000 Kenyans during the 2013 election, CA examined key national and local political issues, the level of trust in key politicians, voting behavior and intentions, and the sources of information respondents used.

Based on this data, media campaigns were developed in 2017 in which presidential candidate Raila Odinga was portrayed as a bloodthirsty politician sympathetic to the terrorist group Al-Shabaab and with no plans for the country's development, while the other candidate, Uhuru Kenyatta, was portrayed as tough on terrorism and a conscientious person well versed in economic matters. Unsurprisingly, voters preferred the second candidate.

  • Regulation
Kenya's national legislation did not provide substantial protection for its citizens' personal data.

Although the Kenyan Constitution recognizes the human right to confidentiality, only in 2019 did Kenya adopt the Personal Data Protection Act, which defines what information is confidential, regulates data processing, defines the rights of personal data subjects, such as knowing the purpose of data use, prohibiting processing of their data, etc.

The responsibilities of data processors are also defined: for example, before collecting data they must inform the person of their rights, the fact of collection, the purpose of collection, the third parties to which the data will be transferred, and so on.
Philippines
In 2016, candidate Rodrigo Duterte used the services of CA to help in the presidential race and won.

CA also worked on the 2013 Philippine Senate and congressional elections. An archived version of the SCL Group website shows the company boasting of its role in rebranding a client (it does not say which one). The section on participation in the Philippine elections, since removed from the SCL website, contained the following: "Before the election, our client was widely perceived as a good and decent person. In particular, his team believed that these qualities were potentially beneficial. However, research has shown that voters prefer resilience and determination. In this regard, SCL has used the country's crime problems to rebrand the client as a strong, no-nonsense man of action, who would appeal to the true values of the voters."

Facebook Chief Technology Officer Mike Schroepfer said that the personal data of 1.2 million Filipino users had been transferred to CA. Notably, as of 2019, Filipinos led the world in social media use for the fourth year in a row: of a population of nearly 107 million, 75 million use Facebook. For this state, then, protecting its citizens' personal data is an extremely pressing issue.

In the Philippines, the Data Privacy Act was adopted in 2012, but the regulator was not established until 2016. Right after its creation, it began strengthening the protection of Filipinos' personal data and issued the Implementing Rules and Regulations of the Data Privacy Act, which describe in detail the requirements for processing personal data and the sanctions for violating the law.

During 2016-2020, the Philippines adopted a number of guidelines (for example, on reporting requirements for personal data leakage), memoranda (for example, on the management of personal data leakage) to protect the personal data of its citizens as much as possible.
Ukraine
There is no doubt that in Ukraine, too, politicians are willing to use aggressive forms of political technology, in particular, by resorting to SCL services.

Below is a message from the SCL Elections archives which demonstrates the latter's likely involvement in the 2004 Orange Revolution.
There is also evidence that one of the Ukrainian political parties hired CA in 2017 (when the company was already under investigation). It is unknown which one, though.

So we too should be concerned about the "latest methods" of interfering with the will of Ukrainian voters and think about how to protect them.
How the personal data of Ukrainian voters is collected and what's wrong with the legislation
A study by Tetiana Bohdanova, the NGO Digital Security Lab and OPORA describes how the parties "Voice", "European Solidarity", "Servant of the People", "Fatherland" and "Opposition Platform – For Life" obtained Ukrainians' data, in particular through their websites. Four of the five parties used analytics services that track user activity online and send data to third parties. The websites of all the above parties used cookies and trackers supplied by social media platforms (Facebook, YouTube) for advertising purposes.
Cookies are files that store information about your actions and settings on a website and help the site organize information so as to simplify your stay on it. Trackers collect information about the websites a person visits, their IP address, the devices they use, browser history, network connection data, etc., and transmit this data to their creator. In addition to online traffic, trackers can also collect information about age, place of residence and preferences (the Facebook "Like" button is a tracker). Trackers build an accurate user profile. Third-party tracker owners can process this information and sell it on the black market.
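To make the mechanics concrete, here is a minimal sketch of how a third-party tracker of the kind described above can work (a hypothetical toy server written with Flask; the endpoint, cookie name and port are invented): a "pixel" request embedded on many websites sets a persistent identifier cookie and records which page the visitor came from, so the tracker's owner can stitch together a browsing profile.

```python
import uuid
from flask import Flask, request, Response

app = Flask(__name__)

@app.route("/pixel.gif")
def pixel():
    # Reuse the visitor's identifier if the cookie already exists, otherwise mint one.
    visitor_id = request.cookies.get("uid") or uuid.uuid4().hex
    referring_page = request.headers.get("Referer", "unknown")
    # Every page that embeds this pixel reports the visit back to the tracker's owner.
    print(f"visitor {visitor_id} viewed {referring_page}")
    # "No content" is enough: what matters is that the request and the cookie reached the tracker.
    response = Response(status=204)
    response.set_cookie("uid", visitor_id, max_age=60 * 60 * 24 * 365)
    return response

if __name__ == "__main__":
    app.run(port=5000)
```

Because the same identifier cookie is sent with every request to the tracker's domain, visits to unrelated websites that embed the same pixel end up linked to one person.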
Only two parties had privacy policies on their websites that informed visitors about the use of cookies, but even they did not ask for consent to the use of cookies, did not offer alternative choices, and did not warn users that they handed over personal data the moment they visited the website.

As a result, none of the parties complied with the requirements of the Law of Ukraine "On Personal Data Protection", and the parties' websites did not protect the collected data.

In addition to websites, the parties also collected electoral data from social media and registration forms.

Therefore, collecting the personal data of Ukrainian voters is much easier than one might imagine: voters themselves provide the necessary information without understanding the risks of its further use. Voter data can also simply be bought on the black market, as data leaks are common in Ukraine. For example, in 2020 there was a large-scale leak of data offered for sale through a Telegram bot. The investigation resulted in the seizure of 30 databases on 20 million individuals and legal entities, containing information about citizens from state registers, banks, postal operators, telecommunications companies and the voter register. The complete database cost $10 thousand.
What is wrong with Ukrainian legislation?
The Law on Personal Data Protection (hereinafter, the Law), which entered into force in 2011, is outdated. It can neither fully protect against unauthorized actions with personal data nor effectively prevent them. The current legislation provides for ridiculously low fines for violating personal data law (a maximum of ₴34 thousand), which is no deterrent for large companies. In the event of a leak, personal data owners, controllers and third parties are not obliged to notify the regulator, as the Law does not regulate this issue – a serious gap. Accordingly, no one needs to reinvent the wheel to obtain voters' personal data.
Will there be better legislation in Ukraine?
Ukraine is moving in the right direction to ensure an adequate level of personal data protection.

As part of the Association Agreement between the European Union and the European Atomic Energy Community and their Member States, of the one part, and Ukraine, of the other part, an action plan has been developed to improve the legislation on personal data protection and bring it in line with the GDPR.

Ukraine desperately needs a powerful regulator in this area.
At present, compliance with the legislation on personal data protection is ensured by the VRU Commissioner for Human Rights.

At the same time, the body's workload, lack of resources and legal restrictions do not allow it to exercise the required level of control and protection. For example, it has no punitive function and cannot impose fines for offenses; all it can do is draw up administrative offense reports and send them to court, which slows down the decision-making process.
Because of the short timeframe for prosecution – 3 months from the date of the violation – only a small share of protocols (relative to the number of complaints received) results in a court decision, as many cases reach the Commissioner after the deadline has already passed.
Currently, a joint working group of the two parliamentary Committees and the Office of the Verkhovna Rada Commissioner for Human Rights has developed a new draft law on personal data protection, which should comply with international standards and the provisions of the GDPR.
A separate draft law is being developed to establish a body that will regulate the protection of personal data and access to information. It will perform both control and methodological functions, in particular, develop codes of conduct for personal data protection in relevant areas of activity, train citizens and government officials.
The amount of fines for violating the law on personal data will also be increased.
We therefore hope that the new legislation will be able to protect voters' personal data from illegal use, both in election campaigns and in other political campaigns.
Conclusions: How can voters protect personal data and what can the state do
The information isolation of voters prevents pluralism of information, which adversely affects democratic processes. To make informed decisions, people need versatile and complete information, not just filtered information. Such manipulations therefore threaten democracies: they undermine voters' independence and deprive them of the opportunity to form an opinion that reflects their own views rather than those of political forces – and thus affect the independence of choice.

Targeted advertising, developed on the basis of personal data of voters, is a non-transparent and powerful technology that must be regulated at the level of national legislation.

In turn, national legislation must meet modern challenges, protect personal data from illegal transactions and be so detailed and well-written that it serves as a preventive mechanism rather than just being used in the event of infringements.
To do this, the state needs to:
1. Develop detailed legislation in the field of personal data protection, draft high-quality provisions that take into account national shortcomings as well as international experience, and bring the legislation into line with the GDPR.
2. Provide a clear definition of consent to the processing of personal data. Enshrine that a request for consent must be made in a clear and accessible form, using plain and simple wording, and that before giving consent a person must be informed of the right to withdraw it at any time.
3. Oblige data owners to inform persons whose data will be processed about: the purpose of processing; whether the data will be transferred or sold to third parties; the data retention period and the criteria for determining it; the rights to correction, deletion, restriction of processing and objection to processing; the right to file a complaint with the supervisory body; and whether a profile of the person will be created.
4. Statutorily enshrine a prohibition on discrimination against persons who object to the processing of their data.
5. Provide a person with an alternative choice if they refuse to consent to processing; give the person the opportunity to choose which of their personal data may be processed.
6. Oblige the owners of personal data to notify the person of any change in the purpose of processing their personal data.
7. Oblige the data owner to immediately notify a person when their personal data has been obtained from someone other than them, and to notify the regulator of violations of personal data protection, including data leaks.
8. Establish an independent body – a Personal Data Commissioner – with expanded powers for high-quality and fast operation, in particular the authority to impose fines for violations of personal data legislation. It is important to provide this body with adequate funding, which will, among other things, allow it to introduce educational courses for citizens, businesses, government agencies and local governments, which in turn will create public demand for quality control in this area.
9. Increase fines for violations of personal data protection legislation.
10. Settle the issue of transferring the personal data of Ukrainian citizens to third countries.
11. Investigate the problem of targeted and personalized campaigning sent to voters, identify which forms of political targeting and campaign advertising are the most dangerous in terms of influencing the will of voters, and prohibit such forms at the legislative level.
12. Use California's experience in regulating personalized behavioral advertising, in particular give citizens the right to know what technologies are used to process their data and what results such processing produces, and to prohibit the use of their data in profiling for targeted advertising.
13. Introduce awareness-raising on personal data protection for citizens, including adding relevant topics to the school curriculum.
In turn, the voter should:
1. Read the privacy policies (and review them from time to time) of all applications installed on your gadgets and of online tests / games, as well as the information that websites collect when you visit them.
2. If you are not informed about the purpose of data processing, if you are not offered alternatives, or if you do not agree to the processing – avoid such applications, websites, etc.
3. Do not grant applications permissions that do not affect their operation in any way, so as not to give them an excessive amount of information about yourself.
4. Do not disclose your personal data to third parties, as it may fall into the hands of fraudsters.
5. When your relationship with a personal data owner ends, require the deletion of your data.
6. Find out how your personal data is used: who receives it, for what purposes it is collected, whether it will be sold, etc.
7. Do not limit yourself to the same type of political information on social media; explore topics comprehensively.
8. Review the account settings on your phone and computer to choose what data, and how much of it, you allow to be collected and processed.
9. Before deleting an application, delete your account in it, because deleting the application does not end the processing of your personal data.
10. Do not take entertaining tests such as "What Type of Flower Am I?" if the application requests too much data (e.g., access to your microphone, messages, etc.), and do not download such applications. Or at least read what information they collect and to whom they pass it on.
11. Do not use websites that give you no choice about granting permission to use your personal data.
12. Give preference to a variety of information, not just what lands in your feed, because only through full awareness can you make an informed choice.
13. Take courses on personal data protection.
And remember – the world never stops, so you have to keep up with it.
Yevhenyia Stadnik
CEDEM lawyer
September 27, 2021