Technology Magazine February 2023 | Page 127

AI / ML
But it is important to consider regional differences, including cultural and legal ones, when looking to address AI bias. "For example, while race can be used as one important reference characteristic when assessing for bias within the United States, in another country it might be appropriate to put increased focus on ethnicity as a protected characteristic and consider individuals who identify with the largest ethnic group as a key reference class in bias auditing," says Heather Domin, IBM's Program Director, AI Governance.
Domin authored the company's report Standards for Protecting At-Risk Groups in AI Bias Auditing with colleagues Jamie VanDodick, Director of Tech Ethics Project Office and Governance; Calvin Lawrence, Distinguished Engineer, Chief Architect Cognitive Solutions & Innovation (AI) Public Sector; and Francesca Rossi, IBM Fellow and AI Ethics Global Leader.
Protected characteristics – a matter of debate

While auditors and developers will require guidance and standards to conduct consistent bias audits on AI systems, protected characteristics and their associated classes will remain a matter of local and legal debate; the IBM team notes that there will always be some level of discretion across contexts and locations.
"It is also important that developers of AI be able to reflect their own ethical standards when evaluating AI systems for bias, which may go beyond regulatory requirements," they say. "Therefore, the tooling that auditors and developers use to conduct bias testing should remain flexible and allow users to easily adapt testing to local norms, stakeholder requirements, and legal expectations for the location and context in which the system is deployed."
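To make the idea of flexible bias testing concrete, here is a minimal sketch of an audit check where the protected attribute and reference class are configurable parameters, so the same test can be adapted to different regional norms (e.g. race in one jurisdiction, ethnicity with the largest ethnic group as the reference class in another). All function names, data, and thresholds here are hypothetical illustrations, not IBM's actual tooling.

```python
# Hypothetical sketch: a configurable bias audit based on the
# disparate impact ratio. The protected attribute and reference
# class are parameters, reflecting the article's point that audits
# must adapt to local norms and legal expectations.

from collections import defaultdict

def selection_rates(records, protected_attr):
    """Rate of positive outcomes per class of the protected attribute."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for r in records:
        group = r[protected_attr]
        totals[group] += 1
        positives[group] += r["outcome"]
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact(records, protected_attr, reference_class):
    """Ratio of each class's selection rate to the reference class's.
    A common rule of thumb flags ratios below 0.8 (the "four-fifths
    rule"), though thresholds are themselves a matter of local policy."""
    rates = selection_rates(records, protected_attr)
    ref_rate = rates[reference_class]
    return {g: rate / ref_rate for g, rate in rates.items()}

# Toy data: the same decisions audited against two different
# protected attributes, as different regions might require.
records = [
    {"ethnicity": "A", "race": "X", "outcome": 1},
    {"ethnicity": "A", "race": "X", "outcome": 1},
    {"ethnicity": "B", "race": "Y", "outcome": 1},
    {"ethnicity": "B", "race": "Y", "outcome": 0},
]

print(disparate_impact(records, "ethnicity", reference_class="A"))
print(disparate_impact(records, "race", reference_class="X"))
```

The key design choice is that nothing about the protected characteristic is hard-coded: swapping the attribute or the reference class is a one-argument change, which is the kind of adaptability the IBM team argues auditing tools should offer.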