Out of Control: a deep dive into digital advertising and data-sharing practices


An expected, but scary, report on digital advertising through dating mobile applications running on Google's Android OS.

On 14 January 2020, Forbrukerrådet, the Norwegian Consumer Council (“NCC”), released on its designated web page a 186-page study called “OUT OF CONTROL: How consumers are exploited by the online advertising industry“. Supported by a 93-page technical report, it explains how the digital advertising industry (the “AdTech” industry) exploits consumers' personal data collected through dating mobile applications, using and monetizing such data for its own business interests and those of “shadow companies”, and giving consumers no choice other than not using the App if they want to avoid profiling and the use of their information.

The AdTech players usually conduct tracking and profiling activities, such as behavioral targeted advertising, while sharing and transmitting electronic information to a wide range of third parties. Nothing new under the sun with this report. Except that reading it provides much more clarity about the transmission mechanisms between App providers and the third parties active in the advertising industry and real-time bidding (defined below). Some of those mobile application publishers use anonymized or aggregated data to segment specific targeted audiences, whereas others simply collect personal data “in the clear”, without the end-user knowing it or agreeing to such practices.

The Norwegian Consumer Council filed three complaints with the Norwegian Data Protection Authority for breach of the GDPR. And because this report shows that people are affected globally, it is likely that we will hear of more actions around the world (even class actions). To read more on the complaints filed against Grindr & Cie, click below:

_______________________________

ICO UK WARNS ADTECH COMPANIES TO EXPECT ENFORCEMENT ACTIONS

In the meanwhile, further to the report, Simon McDougall, Executive Director for Technology and Innovation at the ICO (UK Information Commissioner’s Office), issued a clear message to the AdTech industry on the ICO's blog on 17 January 2020:

“There is a significant lack of transparency due to the nature of the supply chain and the role different actors play. Our June 2019 report identified a range of issues. We are confident that any organisation that has not properly addressed these issues risks operating in breach of data protection law. […] We gave industry six months to work on the points we raised, and offered to continue to engage with stakeholders. If these measures are fully implemented they will result in real improvements to the handling of personal data within the adtech industry. […] We will continue to engage with industry where we think engagement will deliver the most effective outcome for data subjects. […] Those who have ignored the window of opportunity to engage and transform [the ICO gave six months from September 2019] must now prepare for the ICO to utilise its wider powers.”

Very good timing for issuing this warning to the industry. In March 2020, once the given six-month period expires, we should expect further guidance and more enforcement actions if the industry has not convinced the authority that it has abandoned its doubtful practices.

The ICO UK is one of the most active and productive data protection authorities in the EU. Together with the CNIL, it has issued guidance for the AdTech industry, which you can find at the end of this article for more information and references.

_______________________________

UNDERSTANDING THE HARM TO INDIVIDUALS

The study offers a deep-dive analysis into the data-sharing practices of AdTech companies communicating via mobile applications. Unless manufacturers include built-in, easily understandable opt-in consent or opt-out mechanisms, together with clear information about the use of personal data and what it means for them, consumers are unlikely to understand what is going on inside the App. This is even more true when those Apps are used by teenagers or kids.

Out of Control describes, with examples, how tracking and profiling activities can be used for data-driven persuasion, in particular with the collection of sensitive information. Data-driven models do not only serve dark patterns for commercial purposes, nudging consumers into buying things they initially did not want. According to the report, persuasion based on personal data and profiling may also lead to (among others):

  • discrimination
  • harassment
  • manipulation of information and influences of opinions
  • impairment of freedom of expression through the “chilling effect” (people no longer feel free to express themselves because they perceive they are being watched or under surveillance)
  • fraud

all of which can cause serious harm to individuals if the data falls into the wrong hands. To summarize:

Companies can use tracking and profiling activities for data-driven persuasion, which can lead to serious harm to the consumers whose personal data is collected, including discrimination, fraud and manipulation, as observed in the Cambridge Analytica case.

This is where the real danger is.

Such collection by shadow companies can erode trust in information and opinions, and impair how we all perceive today's society: information is pushed to anyone's eyes and mind, quickly consumed, and unconsciously stored and memorized through data-driven persuasion. Reality becomes the one others want us to believe. According to an Amnesty International report (quoted on p. 43 of the Out of Control report), the technology used can dramatically impair people's fundamental human rights:

These capabilities mean there is a high risk that the companies could directly harm the rights to freedom of thought, conscience and religion and freedom of opinion and expression through their use of algorithmic systems.

The study shows that some mobile applications (such as Perfect365) can communicate and exchange data with more than 70 external third parties and vendors, including social media platforms such as Facebook and Twitter, which demonstrates how widespread the practice is.

Looking at the data flows and the complexity of this ecosystem, it becomes clear that consumers no longer have any control over their personal data.

_______________________________

EXAMINED MOBILE APPLICATIONS

The study was conducted by the NCC together with 10 other non-profit organizations, including the FRC, the French-speaking Swiss consumer federation, and outlines the breadth and scope of data-sharing practices through selected mobile applications running on Google's Android operating system. The study focuses on online dating platforms, such as Grindr (the subject of three formal complaints to the Norwegian data protection authority), OkCupid and Tinder. A lot of information is processed through those applications, including location data, e-mail addresses, and sensitive personal data such as sexual preferences and health-related information, including HIV status.

Other mobile applications are included in the study.

For example, it analyses make-up Apps (such as Perfect365, based on augmented reality) allowing live photo editing before sharing on social media; ovulation calendar and period trackers, such as MyDays (which, as standalone software, may qualify as a medical device); religious Apps used to receive reminders of events tied to religious matters (such as a Muslim prayer app); and Apps for children (My Talking Tom 2), among others. Most of those applications have been downloaded millions of times and have huge communities of consumers, including minors. The study observes how publishers (the software editors of the Apps – see below) share personal data with a range of third parties in an AdTech ecosystem built to monetize personal data.

_______________________________

GETTING INTO MORE DETAILS: PLAYERS OF THE ADTECH INDUSTRY

The online advertising industry offers marketers different ways to publish their content online.

A company can either pay (alone or through a third-party broker) a fixed price for a space on an internet page or to display an ad in a mobile App. When a company displays its ad, it will pay either a fixed fee or a variable amount, based on the value of placing an advertisement at a certain time (lunch time, night) or to a certain segmented audience (type of audience, territory, etc.). It can also decide to bid on the value of a keyword. The value of such a keyword constantly evolves based on an algorithm. Such algorithms are used, for example, by Google on its search engine to display advertisements. The cost goes up if the keyword is popular and down if few people search for it. Online users then see sponsored content (content pushed to the public at a certain place on the website) related to the keyword entered into the search engine, and the marketer pays Google on a per-click basis.

A company can also use real-time bidding (RTB), an instantaneous online auction. Real-time bidding means that advertising buyers bid on an impression (the number of times a piece of content appears in someone's feed; a viewer does not have to engage with the post for it to count as an impression). If the bid is won, the content of the buyer instantly displays on the publisher's site, page, network or App. This is one of the standard business models for companies sponsoring their content online. The ad is displayed and the marketer pays an amount based on the value of the winning bid and the allocated budget. See the ICO UK's guidance for more information on privacy in RTB (link at the end of the article).
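
To make the mechanism more concrete, here is a deliberately simplified sketch of an RTB auction in Python. It is illustrative only: the buyer names and prices are invented, real exchanges follow protocols such as OpenRTB, and actual auction rules (first-price vs second-price) vary by platform.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Bid:
    buyer: str        # the marketer or demand-side platform placing the bid
    price_cpm: float  # price offered per thousand impressions

def run_auction(bids: List[Bid]) -> Optional[Bid]:
    """Return the winning bid (highest price), or None if nobody bid."""
    if not bids:
        return None
    return max(bids, key=lambda b: b.price_cpm)

# Hypothetical bids received within milliseconds for a single impression in a publisher's App
bids = [Bid("travel_brand", 2.10), Bid("shoe_retailer", 3.40), Bid("car_vendor", 1.75)]
winner = run_auction(bids)
print(winner)  # Bid(buyer='shoe_retailer', price_cpm=3.4) -> this buyer's ad is displayed instantly
```

The point of the sketch is the speed and automation: the whole exchange happens while the page or App screen is loading, which is why the user never sees who bid on their attention.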

The different stakeholders of the AdTech market usually are:

  • Publishers provide information and interactive services to users. The publisher is the editor of the mobile application and offers locations in the App for advertisers to display their ads. Publishers are paid by the number of clicks on an ad, but often cannot know which ads will be displayed, because this is out of their control.

    –> App providers (publishers) are data controllers.

  • Marketers are entities that want to acquire and retain valuable customers, and to find and influence users across the digital world. They can include retailers, grocery stores, consumer goods brands, device makers, car vendors, the travel and hospitality industry, telecom and financial services providers, and many other providers of products and services.

    –> Marketers involved in the purchase of targeted advertising can be considered joint controllers, even if they do not actually process personal data themselves, as long as they determine the means and purposes of the processing.

  • Advertising networks, analytics vendors, and data brokers. This is another category that includes a number of actors in the digital marketing and AdTech industry, and it is normally hidden from consumers. Unlike publishers and marketers, these third-party vendors mostly do not have any direct relationship with users.

    –> Those third-party vendors, which receive personal data from the Apps, are either processors, separate controllers, or joint controllers, depending on how and under what terms they use the personal data.

  • Major platforms such as Google and Facebook are, for a large part, involved in the AdTech industry. They offer services and play an important role either to: (a) sell digital advertising based on user data (YouTube as a publisher of ads on video content); or (b) sell sponsored posts on their platforms (Facebook, Twitter, LinkedIn). These major platforms also act as third-party vendors, providing digital advertising services through other entities and divisions operating under other brand names.

    –> Major platforms can either be data controllers, processors or joint controllers depending on the types of services that they offer.

  • Consumers are the last players in this economy; they are called data subjects under the GDPR.

_______________________________

WHAT DATA IS SHARED AND WITH WHOM

The technical report is a supporting document for the complaints filed with the Norwegian data protection authority. It shows the data flows and the third parties involved in this massive data-sharing ecosystem. We read what categories of personal data are constantly collected by the publishers and communicated to, or made accessible to, their business partners. The parameters analyzed in the technical report include: Advertising ID, IP address, MAC address, GPS location, device information, device configuration, Wi-Fi networks, the name of the App making the requests, account information, and user data such as age and gender. They can also include e-mail addresses and sensitive personal data.
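
To make these categories concrete, here is a purely illustrative sketch of what a single tracking event sent by an App to a third-party vendor might look like. The field names, values and App identifier are hypothetical and are not taken from the report; only the categories of data mirror the parameters listed above.

```python
import json

# Hypothetical tracking event: field names and values are invented, but the categories
# mirror the parameters analysed in the technical report (advertising ID, IP address,
# GPS location, device information, App name, account data such as age and gender).
tracking_event = {
    "advertising_id": "38400000-8cf0-11bd-b23e-10b96e40000d",  # resettable Android Advertising ID
    "ip_address": "203.0.113.42",
    "gps": {"lat": 59.91, "lon": 10.75},                        # precise location
    "device": {"model": "Pixel 3", "os": "Android 10", "language": "nb-NO"},
    "app": "com.example.datingapp",                             # the App making the request
    "user": {"age": 29, "gender": "male"},                      # account / profile data
}

# Serialized and sent over the network to an advertising or analytics endpoint
payload = json.dumps(tracking_event)
print(payload)
```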

The example of the Android Advertising ID

The Out of Control report and its supporting technical document underline which identifiers are used to track users' information, especially the Android Advertising ID (“AAID”).

The AAID provides a user-specific, unique, resettable ID to be used for advertising and provides linkability, in the sense that it gives advertisers an easy way to connect the dots between multiple apps or multiple sources.
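
A minimal sketch of why this linkability matters: if two unrelated Apps send events carrying the same advertising ID to the same third party, that third party can trivially join the two data sets into one richer profile. The Apps, field names and values below are invented for illustration.

```python
# Invented example: two different Apps report events to the same third-party vendor.
events_from_dating_app = [
    {"ad_id": "aaid-1234", "app": "dating_app", "sexual_orientation": "gay"},
]
events_from_religious_app = [
    {"ad_id": "aaid-1234", "app": "religious_app", "religion": "muslim"},
]

# "Connecting the dots": joining on the shared advertising ID builds a combined profile.
profiles = {}
for event in events_from_dating_app + events_from_religious_app:
    profile = profiles.setdefault(event["ad_id"], {"apps_seen": set()})
    profile["apps_seen"].add(event["app"])
    profile.update({k: v for k, v in event.items() if k not in ("ad_id", "app")})

print(profiles["aaid-1234"])
# {'apps_seen': {'dating_app', 'religious_app'}, 'sexual_orientation': 'gay', 'religion': 'muslim'}
```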

This AAID is embedded in any Android device. As a so-called “anonymous” ID, it provides advertising companies with a unique device identifier for advertising and tracking, which can be used to correlate data from multiple sources and services (technical report p. 12). Although Android smartphone users may reset their Android Advertising ID, the report shows that, combined with other unique identifiers, resetting the Android Ads ID may become useless, because third parties will often be able to re-identify users. Finally, the interesting part linking to privacy settings is the following (p. 13):

“There is an option in the Android system settings to opt out of targeted advertising based on the advertising ID, but this mechanism appears to be largely trust-based, as it requires the app developers to actively check for and honour an opt-out, rather than supporting the opt-out natively in the platform. At the same time, end users have no way to evaluate whether a given app is respecting Android’s privacy controls, and may believe that the control is effective even if they are not”.

The conclusion is clear: there is no way for an Android user to obtain a guarantee from Google/Android that, even when the AAID is deactivated, other companies will not use the ID to target them with online content!
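
The "trust-based" nature of the opt-out described in the quote above can be illustrated with a small, hypothetical sketch (this is not Android's actual API): the platform exposes a flag, but nothing downstream forces an App or SDK to honour it.

```python
# Hypothetical illustration: the opt-out flag only has effect if the developer checks it.

def ad_request_honouring_opt_out(advertising_id: str, limit_ad_tracking: bool) -> dict:
    """A well-behaved SDK: respects the user's opt-out flag."""
    if limit_ad_tracking:
        return {"ad_id": None, "personalised": False}
    return {"ad_id": advertising_id, "personalised": True}

def ad_request_ignoring_opt_out(advertising_id: str, limit_ad_tracking: bool) -> dict:
    """A careless SDK: reads the ID and ignores the flag entirely."""
    return {"ad_id": advertising_id, "personalised": True}

# The user has opted out, but only one of the two SDKs respects that choice.
print(ad_request_honouring_opt_out("aaid-1234", limit_ad_tracking=True))  # {'ad_id': None, 'personalised': False}
print(ad_request_ignoring_opt_out("aaid-1234", limit_ad_tracking=True))   # {'ad_id': 'aaid-1234', 'personalised': True}
```

As the report notes, the end user has no way to tell which of the two behaviours a given App actually implements.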

Wide range of personal data shared with a wide range of third parties

Usually, most of the data communicated to third parties relates to GPS location, names and surnames, contact details, and preferences. However, in other cases, the report shows that very intrusive personal data is collected and shared with third parties that have no valid purpose for holding it. The OkCupid App shared users' answers to in-App questions with Braze (a marketing automation company), including answers to questions such as:

  • Do you have student debt?
  • Do you prefer hardcore or softcore when it comes to your porn?
  • Are you jewish?
  • How does the idea of being slapped hard in the face during sex make you feel?
  • Is it easy for you to achieve orgasm?
  • Do you enjoy exercise?
  • Generally, do you enjoy being drunk?
  • Is climate change real?
  • Is the US educational system designed to benefit the rich?
  • Would you ever date someone that is hiv positive?

Such data is extremely sensitive and can lead to serious harm for persons whose personal data falls into the wrong hands. Another scandal arose in 2018, when Grindr was found to share HIV status data with two other companies, Apptimize and Localytics (p. 23 of the technical report).

_______________________________

INTRUSIVE AND NON-INTRUSIVE BUSINESS MODELS 

Although some sectors are regulated and advertising there is prohibited or restricted to protect consumers (doctors, alcohol, tobacco, pharma and other sectors), advertising companies can conduct their business legally to the extent that they follow the relevant legislation, including at least data privacy and consumer protection rules. In this regard, data protection laws (including the ePrivacy Directive) and consumer protection laws mean the laws of each country where users reside (except that the GDPR applies everywhere, when in scope), which can become a challenge to implement. In the AdTech sector, given the intrusiveness of the processing activities, codes of conduct (such as the IAB codes of conduct) and principles of ethics are emerging (see the Principles and Practices for Advertising Ethics). Although those texts are not law, they help companies adopt advertising practices that are fair and ethical. This is a growing area that any company, not only the AdTech sector, should look at to embed those principles in its culture. In a world where data collection and sharing practices are commonly performed without transparency, ethics can help rebuild consumers' trust.

An interesting part of the study explains that several business models are available, which may or may not involve the processing of personal data of consumers using an App. In reality, the more personal the profile is, the more accurately companies can target users with content that may interest them. And a large number of companies still operate in the shadows, taking advantage of tracking and profiling with a debatable legitimate purpose for holding such information (report pp. 5 and 45).

The consequences of a company using personal data, or not, are massive. Let's take an example on the use of personal data:

Does a third party, used by the data controller to back up information on hosting servers, really need access to political views, sexual preferences and orientation, or religious beliefs? Using a subcontractor to store personal data on its servers is perfectly okay, even if the data leaves the EU (provided appropriate transfer safeguards are in place). The cloud vendor should however, as data processor (subcontractor), follow all instructions of the data controller and comply at least with all the requirements of art. 28 (data processing agreement) and art. 32 (appropriate security measures) of the GDPR. If this backup company uses personal data for its own purposes, say to make a profit from it, the cloud provider becomes a data controller – i.e. responsible for handling personal data – and will have to comply with all data protection laws, including all GDPR requirements.

This means that any third party holding consumers' personal data obtained through the App and deciding what to do with that data (“for its own purposes”), such as making money out of it, should inform the user about the processing, explain why it holds the data, base the processing activity on a valid legal justification, such as consent or a legitimate interest, and comply with all other obligations under data privacy laws.

This is unlikely to be the case for shadow companies that use personal data that is not anonymized in order to make their own decisions.

Alternative business models vs the sad reality

The report reminds us that companies can use non-intrusive business models to conduct advertising on mobile applications.

This is the case for subscription services. Subscription-based models, often used by online media and newspapers, require users to pay for the work done by journalists. Donations – the Wikipedia and The Guardian business models – can also work. As opposed to behavioral targeted advertising, which requires creating user profiles, companies can choose contextual advertising as a business model, where ads are targeted based on the content the consumer is looking at, rather than on the profile of the consumer her- or himself.

The problem for players of the AdTech industry is that using less intrusive technologies may lead to a “race to the bottom”, with the side effect of “depressing the value of data, which may fuel further extensive sharing of personal data, leading to market inefficiencies from a competition point of view” (pp. 53 and 54 of the report).

In practice, for mobile Apps and social networks, sharing personal data is often the counterpart of using the free version of an App. In the mobile gaming industry, a free game almost always includes other ways to monetize the App: just think about Candy Crush. These can include: (a) forcing the user to watch a video; (b) asking the user to share the results of a game with friends; (c) sharing the user's personal data with third parties; or (d) offering direct purchases of items that make you progress in your game journey. A free version of an App usually forces the consumer to give something in return for its free use, therefore making the App not really “free” anymore. Many initiatives are emerging to monetize personal data in a transparent manner. In the case of companies doing market research or surveys, the publisher remunerates consumers in exchange for answers to certain questions. It can also be based on the physical performance recorded in an App owned by an insurance provider of which the consumer is already a client.

Often, even paid Apps, subscription-based models and other less intrusive business models still involve tracking and profiling of users. Those practices are very often carried out without the users' knowledge, with very little transparency and weak consent mechanisms.

_______________________________

FREE VS PAID: REASONABLE EXPECTATIONS OR DATA HOLD-UP?

Should the Out of Control report be a surprise to the general public? Absolutely not! The subject is not new and those business models have been around for years. However, the technology is evolving, and with machine learning and other, cleverer algorithms, it will become ever harder to understand the consequences for one's privacy of using an online service through a mobile application.

Consumers can expect ads in free or even paid versions.

Users of mobile Apps, in particular social networks, should expect a difference in the business models between free and paid Apps. Running a mobile application that works as any kind of social network requires a connection. Thus, by design, it is not possible to use a social network without opening your phone to the outside world. For games this might be different, but the publisher will often block the game if the internet connection is not active. Now, when a user decides to use a free version rather than a paid one, they can assume and expect that, with no money coming in, the App would not be maintained or would simply disappear from the market and App stores. Remember video games decades ago: a one-off payment allowed anyone to own the game and play it forever without any connection.

When we think of current business models, Facebook is the best example of the next-generation business model. According to one study, Facebook's 2018 revenues were almost entirely based on advertising (98%). This shows how lucrative it is when the publisher offers a platform with millions of consumers, earning money with a pay-per-click model. In the case of Grindr, the App has been downloaded more than 100 million times. In this sense, consumers can expect to see ads when using free versions of a platform or a service, because this model has become mainstream. Compared to premium, subscription-based or paid versions, free versions almost always use advertisements. Advertising business models may or may not be based on personal data, but they almost always require a connection to the Internet or the transmission of information via the App.

Learning from the cookie wall debate?

The next question is: does a consumer have to accept being targeted with ads in order to use a free (or even paid) service? The answer is not clear from a legal point of view.

Although the situations differ, one may compare free Apps and user choice to the discussion on cookies and tracking technologies in the EU (i.e. article 5(3) of the ePrivacy Directive). As a reminder, the cookie wall debate relates to granting or denying access to an internet user depending on whether she or he accepts the placement of cookies on the device. If the user says yes, which allows online tracking and potentially the creation of profiles, the user can pass the wall and access the site. If the user says “no, I don't want to be tracked on your site”, then the user cannot access the website. In my opinion, the situation is similar with free mobile Apps and social networks.

The cookie wall debate is still ongoing, and no court has yet ruled on whether a website can bar users who do not agree to be tracked. In the context of cookies at least, we now know, further to the recent guidance and approach of most data protection authorities (CNIL and ICO UK), that there is no exception to consent other than when the tracking technology is necessary for the performance of the service. Therefore, consent is almost always required. Cookies are regulated by the principles of the ePrivacy Directive, but the consent and transparency obligations have to follow the GDPR (as seen in the recent CJEU Planet49 case).

The Out of Control report explains that Grindr and its third parties go far beyond such practices. Their activities involve tracking, profiling and transmission of sensitive personal data to external companies that may have no valid ground to hold the data and do not inform individuals properly. There is also almost no valid legal justification, as consent and explicit consent are not properly obtained in the App.

When a user knows in advance which company is going to do what, with what data, and can easily choose what to do and whether to continue using the service, this could be fine (subject to the view that a cookie wall may not be lawful). This is the argument of cookie wall supporters, who claim that nobody is forced to use an App or a service: it is always possible to choose another one if the user is not happy. This holds as long as the user's choice to leave the website or the App comes before the cookie is placed on the device. On the contrary, people who do not support the cookie wall concept claim that such a mechanism does not give the user a real choice, because it should be possible to use the service without detriment: users should be allowed to read the content while refusing the placement of cookies as the price of entering the website.

Although both arguments are defensible, banning cookie walls means finding other ways to monetize the use of online services, especially in a data-driven economy.

In any case, if a user has no information and no choice, and uses a service that collects such amounts of data and shares them with so many other companies, this is not just unethical anymore; it constitutes a sneaky and severe violation of that person's privacy.

_______________________________

COMPLAINTS TO AUTHORITIES: NORWAY VS SWITZERLAND APPROACH

NORWAY – A GDPR COUNTRY

Norway is a GDPR country. As an EEA country, it has implemented a sanction mechanism based on Regulation 2016/679, better known as the GDPR. This includes appointing a data protection authority capable of taking enforcement action for violations of data protection laws, as is the case in all EU/EEA countries. Fines can go up to 2% or 4% of a private company's annual worldwide turnover. As a reminder, the GDPR applies regardless of the location of the data controller if it uses personal data of individuals located in the EU. The controller is the company deciding the purposes and the means of the processing of personal data. In this sense, the GDPR is very powerful against most companies (compared to tech giants that can easily survive fines), because companies cannot exit the EU to escape the application of this law. Also, due to its sanction mechanisms, it can be used as a preventive and dissuasive tool.

The NCC filed three complaints with the Norwegian Data Protection Authority against Grindr and companies accessing consumers' personal data through the App. The three complaints are very similar in content. The main focus is the absence of valid consent as the legal justification for using the data. They also mention Grindr's dominant market position. This is interesting, because we are starting to see an interplay between privacy and antitrust laws, where data controllers may abuse their dominant positions.

In this article, I previously discussed Google's €50M fine by the CNIL. I also analyzed the first GDPR enforcement cases to understand the level of fines under this EU data privacy law, and in this other article I compared the legal regime in force before the GDPR, mentioning the Equifax and Cambridge Analytica cases.

SWITZERLAND – A NON GDPR COUNTRY

The situation is slightly different in Switzerland.

The French-speaking Swiss consumer federation (FRC) also requested that the Swiss Federal Data Protection Commissioner (Swiss Commissioner) take further action. The effect of this request will be very limited, given that the Swiss Commissioner can only issue non-binding recommendations or go before the Federal Administrative Court to obtain a judgment of principle. As such, there is still no direct enforcement or sanction power for the Commissioner. The Swiss complaint will therefore lead, at most, to a statement or recommendation, within the means available under the Swiss Data Protection Act.

Further to a massive data breach at Swisscom, I commented (in French) on the weakness of the Swiss Data Protection Act (“Swiss DPA”) in terms of enforcement. Although Switzerland benefits from an adequacy decision from the EU Commission and its law is comprehensive and solid, it lacks enforcement and direct sanction mechanisms, compared to EU legislation, to dissuade bad players from exploiting data without consequences. Currently, the Swiss DPA only allows someone to initiate criminal proceedings with a maximum criminal fine of CHF 10K. The revised text of the Swiss DPA currently under discussion will still not grant the Commissioner sanctioning powers in case of infringement of the Swiss Federal Data Protection Act, but will increase individual criminal liability with fines of up to CHF 250K.

Data protection is a balance between the fundamental rights of individuals and the rights of others to use such information for their own purposes.

Swiss politicians have chosen their side. To date, the legislator clearly gives more weight to the right of companies to process personal data than to protecting the fundamental rights and freedoms of Swiss citizens. As a result, Swiss citizens' claims to more control over their personal data will remain weak as long as companies can decide not to apply privacy rules to them. This remains a political choice.

_______________________________

TOP 5 PRIVACY RECOMMENDATIONS FOR THE ADTECH INDUSTRY

The following recommendations apply to AdTech players, including publishers of mobile applications, but also to other industries.

1. DETERMINE YOUR ROLE

You should first understand whether your company acts as a data controller, a data processor or a joint controller. Your role must be assessed for each processing activity. This first step is absolutely key: it will determine your company's level of responsibility and govern how to act with your business partners, in particular fulfilling compliance obligations, having the right contracts in place (art. 28 GDPR), limiting your liability and indemnity obligations, and determining the roles and responsibilities of your counterparts for the processing of personal data in each situation.

2. KNOW WHAT DATA YOU HOLD, WHY AND WITH WHOM YOU SHARE IT

WHAT. Consumer data (B2C) requires the same care as B2B information. However, when children's privacy or sensitive data is involved, extra care is required. On social networks, sensitive personal data will very often, if not always, be involved. Beyond other rules that may apply, such as consumer protection, age verification and parental authorization may be important. Once you have determined what information your company collects, you can apply the data minimization principle (don't collect more than what you need for what you want to achieve, i.e. only collect what is strictly necessary). Once you know which categories of data you hold about whom, you can apply appropriate security measures and data minimisation, and assess the risk associated with such data in case you suffer a breach.

WHY. You need a purpose to process personal data and must inform people about it. If you collect personal data just to hold it, sell it or keep it for future use, this is likely to be against the GDPR and, generally, many other data privacy laws around the world. The purpose needs to be explained to the data subjects (users/consumers) through privacy notices, for each processing activity, with the right justification for each purpose.

WITH WHOM. Ensure you have all appropriate safeguards and legal mechanisms to share personal data with others, including knowing whether the other party acts as your data processor or as another controller. Collecting and sharing data with other vendors is part of the connected world. Nowadays, data, including personal data, is transferred to many third parties, including to countries with lower data protection standards than the EU or Switzerland, such as India or the USA. When transferring personal data, your company must ensure it has the right to pass it on to third parties. If you rely on consent, you cannot simply pass that consent on to your third party when the third party becomes a controller: in that case, the other party will have to inform consumers and collect consent or explicit consent itself. If not, the practice will infringe data privacy laws.

3. USE THE RIGHT LEGAL BASIS AND COMPLY WITH IT

First check whether you can rely on a legal basis other than consent under art. 6 GDPR (which lists six legal bases) and/or whether explicit consent is needed under art. 9 GDPR (if you hold sensitive personal data). The ICO UK stated in June 2019 that, in the case of marketers, it is unlikely that a legal basis other than consent can be used. Also, tracking technologies are subject to the ePrivacy Directive, which is now clearly understood to require consent whenever the placement of a tracking technology on a device is not necessary for the performance of the service. The ICO states, for sensitive data: “market participants must modify existing consent mechanisms to collect explicit consent, or they should not process this data at all” (guidance, June 2019, p. 16). Therefore, you had better use an appropriate consent mechanism (not bundled; freely given, informed, unambiguous, etc.) if you do not want to hear from authorities or from data subjects lodging complaints.

4. USE TRANSPARENT PRACTICES TO BUILD TRUST

Privacy notices not only support consent compliance, but are meant to inform people about what exactly you hold about them. They explain what you will do with the data, how you protect it and with whom you share it (among other things). AdTech companies operating as data controllers need to find creative ways to draft (concisely, clearly, etc.) and display their privacy notices in an easily understandable way that will not discourage the user from reading, understanding or … using the App! Hiring pragmatic, business-oriented and creative privacy lawyers might be worth the price, as privacy on mobile applications is harder to achieve given the smaller screens of the devices. Demonstrating valid consent is still, and will remain, a challenge.

5. APPLY PRIVACY BY DESIGN AND BY DEFAULT 

The architecture and the data flows in mobile applications and social networks can become extremely complex. Build and maintain software taking into account the data lifecycle (data retention and data minimization), with built-in opt-in consent, privacy notices, appropriate security measures, and features and technology that give control to users, and always apply the least intrusive setting by default (turned off if not strictly necessary) when the user starts using the service.

AND … FOLLOW INDUSTRY-SPECIFIC GUIDANCE (IAB, ISBA, ICO UK), DON'T FORGET EPRIVACY AND THINK ABOUT DATA ETHICS!

The future will tell whether privacy laws will influence business models in the field of digital advertising…

To go further with the authorities' guidance:

By Gabriel Avigdor | NTIC.ch | datalex.ch | penalex.ch