Uber accused of using discriminatory facial verification software
An employment tribunal claim has been filed against Uber by one of its drivers over its use of allegedly racially discriminatory facial-verification software.
Uber uses facial-verification software to log drivers into the app as part of its identity-verification system, "Real-Time ID Check", which is intended to reduce account misuse. When logging in, a driver takes a selfie-style photo (the "Live Photo"), which is compared with the photo Uber holds on file (the "File Photo"). Drivers can choose either automated or human verification of the Live Photo against the File Photo. If a driver opts for automated review and the Live Photo does not match the File Photo, the Live Photo is passed to a human reviewer. If the human reviewer decides that the Live Photo and File Photo do not match, the driver is "waitlisted" for 24 hours and unable to work. After the 24-hour wait, the driver can take another Live Photo, which is reviewed only by a human. If the reviewer decides that the second Live Photo still does not match the File Photo, the driver's account is deactivated.
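The decision flow described above can be sketched in code. This is a minimal illustration of the reported process only, not Uber's actual implementation; the function names and the match-checking callables (`auto_match`, `human_match`) are hypothetical stand-ins.

```python
def real_time_id_check(live_photo, file_photo, auto_match, human_match):
    """Sketch of the reported login flow (hypothetical; not Uber's code).

    auto_match / human_match are placeholder callables standing in for the
    automated comparison and the human reviewer's decision, respectively.
    """
    if auto_match(live_photo, file_photo):
        return "logged_in"
    # Automated mismatch: the Live Photo goes to a human reviewer.
    if human_match(live_photo, file_photo):
        return "logged_in"
    # Human reviewer also finds a mismatch: 24-hour waitlist.
    return "waitlisted_24h"

def after_waitlist(second_live_photo, file_photo, human_match):
    """Second attempt after the 24-hour wait: human review only."""
    if human_match(second_live_photo, file_photo):
        return "logged_in"
    # A second human-confirmed mismatch deactivates the account.
    return "deactivated"
```

As the sketch makes visible, an automated mismatch is never final on its own in the reported process: deactivation requires two human-confirmed mismatches separated by the 24-hour waitlist.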
The Claimant submits that the software is indirectly racially discriminatory: his account was deactivated after the facial-verification software decided that his Live Photo and File Photo did not match. It is contended that the underlying algorithm, which is made by Microsoft, is less effective at recognising people of colour. Reports of the case point to studies finding that several facial-recognition software packages have higher error rates when identifying individuals with darker skin than when identifying those with lighter skin.