Algorithmic management, as a new kind of organizational system and technology at work, brings several challenges for the law, especially labour law - from the processing of personal data, the assignment of work and the assessment of work results to safety and health protection at work. A proper assessment of the risks of algorithmic management by the platform provider is therefore a necessary step before its introduction, in order to avoid the negative effects of breaching the law.
In this context, in December 2021 the European Commission presented a proposal for a directive on improving working conditions in platform work ("Directive proposal") that imposes several duties on digital labour platforms (providers) concerning algorithmic management, i.e. automated monitoring and decision-making systems. Under the Directive proposal, platforms must provide information on how algorithmic management works and consult its aspects with trade unions or platform workers. A duty to ensure human review of significant algorithmic decisions should also bring a human element into an otherwise machine-driven environment. The question is whether these duties in the Directive proposal are sufficient or rather lenient. Either way, much work remains for the national legislators of the EU Member States, their courts and administrative authorities.
The research and the poster are based on case studies using comparative and analytical methods. The case studies stem from judgments and other sources describing the working of particular algorithmic management systems used by platforms, especially in the EU and the USA, with the aim of identifying acceptable settings of algorithmic management and formulating basic principles that should be respected. In particular, the paper devotes close attention to discriminatory algorithmic settings, or settings that can violate the principle of equal treatment.
Discrimination and unequal treatment can manifest themselves in various forms within the algorithmic management of platforms: from the initial selection of platform workers, the allocation of work or access to certain types of assignments and the amount of remuneration, to the type of contract concluded between the platform provider and the platform worker. The fact that some platform workers carry out activities as self-employed persons without legal protection and entitlements, while others work as employees, may itself be discriminatory or unequal, e.g. where a platform chooses one business model in some countries and a different one in others, or where it disadvantages minorities. From a technical point of view, however, the fundamental question is whether it is possible for the system to encompass and avoid all possible forms of discrimination and unequal treatment and, strictly speaking, also the bias and prejudice usually held by people.
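To make this question more concrete, the following is a minimal sketch of how one narrow form of unequal treatment, a disparity in work-allocation rates between groups of platform workers, could be detected statistically. The grouping, the sample data and the 0.8 ("four-fifths") threshold are assumptions made for illustration only; they are not taken from the Directive proposal or from any real platform.

```python
# Hypothetical illustration: a disparate-impact check on work allocation.
# Groups, data and the 0.8 threshold are assumptions for this sketch.

from collections import defaultdict

def allocation_rates(records):
    """Compute the share of assignment requests that were allocated, per worker group."""
    offered = defaultdict(int)
    assigned = defaultdict(int)
    for group, got_assignment in records:
        offered[group] += 1
        if got_assignment:
            assigned[group] += 1
    return {g: assigned[g] / offered[g] for g in offered}

def disparate_impact(records, threshold=0.8):
    """Flag groups whose allocation rate falls below the threshold
    relative to the most favoured group (a "four-fifths rule" style check)."""
    rates = allocation_rates(records)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Example: (worker group, whether an assignment was allocated)
sample = [("A", True), ("A", True), ("A", False),
          ("B", True), ("B", False), ("B", False)]
print(allocation_rates(sample))   # {'A': ~0.67, 'B': ~0.33}
print(disparate_impact(sample))   # {'A': False, 'B': True} -> group B flagged
```

Even such a simple check illustrates the limitation raised above: it detects only the disparities it is designed to measure, along the group boundaries it is given, and therefore cannot by itself guarantee that all forms of discrimination, unequal treatment or underlying bias are avoided.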