ETHICAL AI
As the technology moves beyond basic consumer needs, what is the true cost of AI in surveillance, and is it out of control already?
WRITTEN BY: ALEX TUCK
Artificial intelligence (AI) is embedded in airports, entertainment venues, stadiums, hotels, casinos, shopping centres and, in particular, police forces. In 2019, at least 75 countries were using AI-enabled surveillance. Irish start-up Liopa is trialling a phone application that can interpret phrases mouthed by people, and Amsterdam-based VisionLabs claims it can tell if someone is showing anger, disgust, fear, happiness, surprise or sadness, with the aim of tracking productivity and even making hiring decisions.
“Limiting data collection by central authorities is a good way of limiting government interference”
ALAN CALDER, CEO, GRC INTERNATIONAL GROUP
Concerns around facial recognition
Nigel Jones, Co-Founder of the Privacy Compliance Hub, former Google executive and head of its legal team for Europe, said: “While undoubtedly technology can be a force for good, and can and does track and find criminals, it can go too far and actually become a threat to people going about their everyday lawful business. Today, from unlocking your iPhone to employers tracking productivity and police forces surveilling protests, facial recognition technology is becoming more and more embedded into our everyday lives.

“But there are several reasons for concern when it comes to facial recognition software and privacy. These tools work a lot better in the lab than they do in the real world. Small factors such as light, shade and how the image is captured can affect the result,” said Jones.
Jones says that many critics argue that users have not consented to being scanned, and that going out in public should not be considered consent to be tracked. Even knocking on a friend’s door (complete with a Ring doorbell) could see you added to a police database of images.
Without law and order, facial recognition for surveillance could be dangerous for those caught on