The gig economy: Digital platform workers vs. personal data

In recent years, the gig economy, based on a model of flexible employment using online platforms, has grown rapidly. In the EU, it is estimated that 43 million people will be employed through such platforms in 2025. A heated discussion is underway about new regulations for this sector, in particular the employment model for such workers, but also the making of automated decisions about them using various types of algorithms. What should the organisers of job platforms keep in mind in light of the GDPR? We discuss this using the example of decisions issued by the Italian Data Protection Authority concerning food delivery platforms.

Contractors’ personal data—what to keep in mind?

Delivery of services through popular digital platforms involves the processing of personal data not only of the apps’ customers, but also of contractors, e.g. food delivery people. In this article, we focus on the processing of contractors’ data.

The contractor’s personal data processed by the platform organisers includes not only their name, but also location, route, number and types of orders completed, length of time taken to deliver orders, messages the contractor sends through the app, and complaints about the contractor.

From the perspective of data protection, the key issue is that the entities organising the platforms are controllers of the contractors’ personal data. Therefore, they have a number of obligations under the EU’s General Data Protection Regulation, such as:

  • Ensuring adequate information regarding the processing of personal data
  • Determining the appropriate basis for data processing
  • Compliance with the principle of privacy by default, so that only data necessary to achieve the specific processing purpose is processed
  • Ensuring data subjects can exercise their rights related to processing of their personal data.

Below, we will look at selected duties based on decisions issued by the Italian Data Protection Authority (Garante per la protezione dei dati personali).

The business of organisers of job platforms is closely intertwined with data processing, and thus these companies must not treat compliance with the GDPR as an incidental aspect of their business. In Italy in 2021, the food delivery industry became painfully aware of this when the data protection authority imposed hefty fines for violations in the handling of contractors’ personal data: EUR 2.5 million (decision of 22 July 2021, no. 9685994) and EUR 2.6 million (decision of 10 June 2021, no. 9675440).

Transparency

One of the controller’s primary duties is the duty to inform. Under the GDPR, operators of platforms used by contractors must individually provide each contractor with information on how their personal data will be processed. The scope of this information is defined by Art. 13 and 14 of the GDPR, and in particular includes information on:

  • Who is the data controller
  • For what purposes and on what basis the data is processed
  • Who will have access to the data
  • Whether the data will be transferred outside the European Economic Area.

The information should be provided in such a way that the contractor can actually become acquainted with it, i.e. in principle it should be formulated in a language the contractor uses and presented in a concise, transparent and easily accessible form, using clear and plain language.

The Italian regulator accused platform operators of numerous irregularities in implementing their informational duties. The data processing clauses included in particular:

  • Vague information on the period of data retention
  • Incorrect information on the processing of geolocation data (suggesting that the data was not collected continuously).

Moreover, the platforms did not include information on:

  • Automated decision-making, including profiling
  • Decision-making rules
  • The significance and anticipated consequences of such processing for the data subject.

As a result, the contractors did not have full information on operation of the mechanism, and did not know how the system processing their data affected, for example, the orders they received.

From the Italian authority’s decisions, it is apparent that drafting an appropriate information clause is not simple. A number of elements must be taken into account, and a balance must be struck between clarity of the message and the proper level of detail.

Automated decision-making process

Often, organisers of job platforms introduce various innovative solutions to streamline the app. For example, an algorithm may assign orders to contractors who have met certain criteria. As a result, the application may either block a contractor or allow the contractor to handle a specific order (e.g. depending on his or her position in internal statistics).

From the perspective of data protection provisions, this is a situation where contractors are subject to a decision based solely on automated processing of their personal data (data regarding how they perform their services) which produces legal consequences for them or similarly significantly affects them. Pursuant to Art. 22(2)(a) GDPR, this practice is not prohibited if, among other things, such an automated decision is necessary for conclusion or performance of a contract between the controller and the data subject (e.g. a courier). In such a case, under Art. 22(3), “the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.”

Therefore, organisers of job platforms should assess whether the operation of their systems leads to automated decision-making regarding contractors within the meaning of Art. 22 GDPR. If so (and most often this will be the case), they should implement appropriate mechanisms ensuring at least the possibility of:

  • Human intervention in making such a decision
  • The contractor expressing his or her own position
  • Contesting the decision, e.g. when the contractor objects to it (a simplified sketch of such safeguards follows below).
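To make this more concrete, the sketch below shows, purely as an illustration and with invented names, scores and thresholds that do not come from the decisions discussed here, how an automated order-assignment step could be paired with the Art. 22(3) safeguards: the applied decision-making rule is recorded, the contractor can express his or her own position, and a contested decision is escalated to a human reviewer rather than remaining final.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified model of an automated order-assignment decision with
# Art. 22(3) GDPR safeguards layered on top: the applied rule is recorded, the
# contractor can express a point of view, and a contested decision is escalated
# to a human reviewer instead of remaining final.

@dataclass
class AssignmentDecision:
    contractor_id: str
    order_id: str
    accepted: bool                              # outcome of the automated step
    reason: str                                 # the decision-making rule applied
    contractor_statement: Optional[str] = None  # right to express a point of view
    needs_human_review: bool = False            # right to obtain human intervention


def automated_assignment(contractor_score: float, threshold: float,
                         contractor_id: str, order_id: str) -> AssignmentDecision:
    """Purely automated step: assign the order only if the contractor's internal
    score (a stand-in for 'position in internal statistics') meets the threshold."""
    accepted = contractor_score >= threshold
    reason = (f"score {contractor_score:.2f} "
              f"{'meets' if accepted else 'is below'} threshold {threshold:.2f}")
    return AssignmentDecision(contractor_id, order_id, accepted, reason)


def contest_decision(decision: AssignmentDecision, statement: str) -> AssignmentDecision:
    """Safeguard: record the contractor's position and flag the decision for human review."""
    decision.contractor_statement = statement
    decision.needs_human_review = True
    return decision


if __name__ == "__main__":
    decision = automated_assignment(contractor_score=0.61, threshold=0.75,
                                    contractor_id="C-102", order_id="O-889")
    print(decision.accepted, "|", decision.reason)

    # The contractor disagrees with the automated outcome and contests it
    decision = contest_decision(decision, "The low score reflects a week of sick leave.")
    print("escalated to human reviewer:", decision.needs_human_review)
```

The point of such a structure is that the automated outcome is never the end of the road: every decision carries its reasoning with it and can be routed to a person when the contractor objects.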

The Italian regulator found that the data controllers had not implemented these kinds of measures as required under the GDPR.

Data protection impact assessment

Pursuant to Art. 35(1) GDPR, “Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.”

And under Art. 35(3), a data protection impact assessment (DPIA) is required in particular for “a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person.”

Job platforms are involved in:

  • Providing services through an innovative digital platform
  • Detailed tracking (including geolocation) of contractors within the ordering system
  • Profiling and automated decision-making with respect to contractors.

Therefore, it would be hard to argue that this type of activity is not subject to Art. 35 GDPR and that the data controller does not have to conduct and document a DPIA.

This assessment (carried out even before the app starts operating) should be a starting point for adequately addressing the risks associated with the designed operation of the app with regard to processing of personal data, in particular to adequately implement the processing principles under the GDPR, including legality of processing, transparency, privacy by default, and privacy by design.

This was also the view taken by the Italian regulator, which found that the entities engaging couriers had violated the GDPR by failing to conduct a DPIA.

Proposed provisions

Platform workers are often unaware of how the algorithms that make decisions concerning them work and how their data is used. Therefore, work is underway on the Directive on improving working conditions in platform work. Negotiations between the Council of the European Union and the European Parliament will resume under the Belgian Presidency (1 January – 30 June 2024).

Among other things, the proposal provides for an explicit obligation to inform digital platform workers in detail regarding the use of automated monitoring and automated decision-making systems. A ban on the processing of certain data (e.g. regarding mental state) through these systems is to be introduced, as well as a requirement for human monitoring of the impact on platform workers of particular decisions taken or supported by automated monitoring or decision-making systems.

The proposal also addresses the employment model of platform workers. In this regard, the possible changes would be revolutionary: under the Council’s proposal, digital platform workers (who in practice are now mostly self-employed) would be presumed to be in an employment relationship (a legal presumption) if at least three of seven criteria indicated in the directive are met, for example:

  • The digital job platform supervises the performance of work including by electronic means
  • The digital job platform requires the person performing platform work to respect specific rules with regard to appearance, conduct towards the recipient of the service or performance of the work
  • The digital job platform restricts workers’ freedom to organise their work, including through sanctions, by limiting the discretion to choose their own working hours or periods of absence.
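Purely as an illustration of how the presumption is meant to operate (the criteria labels below are paraphrased from the three examples above, the remaining entries are placeholders, and the final list depends on the text that is eventually adopted), the mechanism boils down to counting how many of the listed criteria a platform meets:

```python
# Hypothetical sketch of the "at least three of seven criteria" presumption in the
# proposed directive. Only the three criteria quoted above are named; the remaining
# four are placeholders, as the final list depends on the adopted text.
CRITERIA = {
    "supervision_of_work_by_electronic_means",
    "rules_on_appearance_and_conduct",
    "restrictions_on_organising_work",
    "criterion_4",
    "criterion_5",
    "criterion_6",
    "criterion_7",
}


def employment_presumed(criteria_met: set[str], threshold: int = 3) -> bool:
    """The legal presumption of an employment relationship applies when the
    platform meets at least `threshold` of the criteria listed in the directive."""
    return len(criteria_met & CRITERIA) >= threshold


# Example: a platform meeting the three criteria listed above triggers the presumption
print(employment_presumed({
    "supervision_of_work_by_electronic_means",
    "rules_on_appearance_and_conduct",
    "restrictions_on_organising_work",
}))  # True
```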

Organisers of job platforms in the gig economy should keep a close eye on the progress of this proposal. Although the provisions have not yet been enacted and it is unclear when they will take effect, it is worth remembering that platforms are already required to comply with the GDPR, and failure to do so could put them at risk of fines.

Karolina Romanowska, adwokat, Data Practice—Data Economy, Wardyński & Partners

Aleksandra Drożdż, adwokat, M&A and Corporate practice, Wardyński & Partners

Łukasz Rutkowski, attorney-at-law, Data Practice—Data Economy, Wardyński & Partners