Frank, the algorithm used by Deliveroo, does not weed out discrimination. In late December 2020, the Court of Bologna issued a historic ruling, a landmark both for its outcome – it found the system discriminatory and awarded 50,000 euros in punitive damages – and for the judge’s reasoning. The internal documentation submitted by the union federations and the workers’ testimonies paint an accurate picture of the system of rules, incentives and sanctions that governed the food-delivery service: a model that discriminates indirectly, because it produces disadvantageous effects without taking due account of the differences among diverse cases. According to the Court, the algorithm was effective at planning and managing business flows by dispatching a large pool of available workers, but it allowed neither organisational adaptability nor flexibility of judgement (see: Il tuo capo è un algoritmo. Contro il lavoro disumano).
The case, brought by Italy’s most representative labour union, Cgil, revealed that Deliveroo’s riders were evaluated primarily on two metrics: reliability and participation (past tense, because the company claims to have “adjusted” the statistics used for its slots under the new contracts signed in November, which are in any case widely contested). The combination of these metrics gave each worker an internal ranking; the higher the ranking, the more likely a worker was to be offered new jobs rather than downgraded. Workers with good ratings were among the first allowed to book the most coveted shifts and could also turn down the most inconvenient ones. However, any cancellation in the 24 hours before a shift weighed against future calls. On returning from a period of absence, whatever the reason (health problems, family care duties, or collective action), workers could be automatically downgraded and forced to start all over again, climbing the ranking from scratch.
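The mechanics described above can be sketched in a few lines of hypothetical Python. To be clear, the metric names, weights and scoring formula below are illustrative assumptions, not Deliveroo’s actual code; the point is that the discriminatory effect the Bologna court identified is visible in the logic itself, since the scoring is blind to the reason a shift was missed.

```python
# Illustrative sketch of a "credits-based" ranking system.
# NOT Deliveroo's actual code: metric names, weights and the
# scoring formula are hypothetical assumptions for illustration.

from dataclasses import dataclass


@dataclass
class Rider:
    name: str
    reliability: float = 0.0    # share of booked shifts actually worked
    participation: float = 0.0  # share of high-demand slots attended


def score(rider: Rider) -> float:
    """Combine the two metrics into a single ranking score.

    The function never sees *why* a shift was missed: illness, family
    care duties and strike action lower the score exactly like a
    no-show. This blindness is the indirect discrimination at issue.
    """
    return 0.6 * rider.reliability + 0.4 * rider.participation


def shift_priority(riders: list[Rider]) -> list[Rider]:
    """Higher-scored riders get to book the coveted shifts first."""
    return sorted(riders, key=score, reverse=True)


# A rider who was on strike for a week and one who simply skipped
# shifts end up indistinguishable to the ranking.
striker = Rider("striker", reliability=0.5, participation=0.5)
no_show = Rider("no_show", reliability=0.5, participation=0.5)
assert score(striker) == score(no_show)
```

Under these assumptions, the only remedy available to the system’s designers would be to feed the reason for an absence into the scoring, which is precisely the “due account of differences” the court found missing.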
Platforms often claim that their workers are independent contractors because they can “turn off” the app or decline to “log in” to the internal staffing system. Yet many judgments around the world, and many observers before them, have noted that availability to accept shifts, together with the number of deliveries performed and customers’ ratings, contributes to defining the ranking, which is far from a faithful representation of reality. This “credits-based” model engenders severe subordination towards the users and the platform, whose mood is unfathomable.
Autonomy and independence are only “virtual”, and courts are gradually moving beyond formalism to scrutinise the managerial prerogatives exercised by apps, which are increasingly equated with traditional employers. If some progress has been made on the (mis)classification of the working relationship, it is also about time to open up the “black boxes” under the EU General Data Protection Regulation, which limits the use of “automated individual decision-making”, and to make the mechanics of algorithms transparent, impartial and contestable.
The profiling of workers and clients is now the core of the business for platform-economy operators, and the only really promising one, judging from financial statements in which negative signs abound across almost all indicators, despite the growth in turnover driven by the pandemic. Internal “reputation” plays a prominent role, influencing a worker’s potential compensation, all the more so in a regime of exclusivity. The exercise of control and disciplinary powers is facilitated by opaque and misleading systems that deliberately reproduce the business strategies management imposes on the algorithms’ programmers.
In recent weeks, the ecosystem of platform work has been thoroughly shaken in Italy. First, the Court of Palermo reclassified a Glovo delivery rider as an employee, a first for the country. Italian courts had lagged behind in reclassification cases, recognising employment status only after courts in many other countries had already done so; this case, by contrast, puts them at the forefront of litigation. It is the first time a court has questioned the operation of an algorithmic management system and declared that algorithms may well discriminate against sick workers or workers involved in union action.
Algorithms, therefore, are far from neutral tools and can be subject to judicial review. The European Commission, for its part, has announced a proposal for a directive to improve the working conditions of platform workers. We can thus debunk three myths once and for all: the alleged autonomy of workers in deciding whether, when and how much to work; the “superhuman” objectivity of the algorithms that organise the work; and the mirage of a workforce made up entirely of young people looking for pocket money. Now reality knocks at the door.