International Business Machines (IBM) will stop researching, developing, and selling products that include facial recognition software. Arvind Krishna, IBM’s CEO, wrote a letter to Congress explaining why this decision was made. The company fears that the technology is being used for mass surveillance and ethnic profiling, in violation of civil rights. Now seems to be the right moment to start a dialogue about whether and how facial recognition technology should be used by law enforcement agencies.
Facial recognition software can be the ideal tool to keep confidential data secure from outsiders. Investigative and security services use this tool to find and monitor criminals and suspects. On the surface, it seems that facial recognition software is a great tool to ensure our safety.
At the same time, the software has been under attack for a while. Opponents and civil rights activists argue that large-scale use of facial recognition software violates civil rights and privacy. More and more big companies share these doubts. These groups are therefore calling for regulations on facial recognition software.
Ethical and Moral Grounds
Sundar Pichai, CEO of Alphabet (Google’s parent company), said earlier this year that in his opinion government regulation needs to play a large part in the use of facial recognition software. In a letter to the Financial Times he referred to the rules on online privacy that are already in place. These, he said, can serve as the basis for new regulations on the use of facial recognition, which Pichai believes should be written into law.
If it were up to the European Commission, the use of facial recognition would be subject to strict regulations. The EU’s executive body wants to limit ‘random’ use of the technology to protect the privacy of Europeans. Ideally, it would ban facial recognition from public spaces. Microsoft and Amazon have also been asking for regulation of the technology.
IBM agrees. Krishna decided to write the letter to Congress because the deaths of George Floyd, Ahmaud Arbery, Breonna Taylor, and other African-Americans “remind us that the fight against racism is as urgent as ever”. IBM wants to work with Congress to achieve justice and racial equality.
IBM wants to focus on three policy areas: police reform, responsible use of technology, and broadening skills and educational opportunities. When police officers use excessive force, they need to be held accountable. As things stand, officers usually face no consequences. According to Krishna, this is in violation of the American Constitution. Laws need to be in place that will make officers think twice about using excessive force.
Technology can increase transparency and protect society, but it should not be used to discriminate or deepen inequality. That is why IBM is critical of facial recognition software. The software is easily misused by governments and security services; mass surveillance and racial profiling are examples of that misuse. It violates civil rights and limits our freedom. According to the CEO, “now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies”.
He says that artificial intelligence (AI) is “a powerful tool” to keep Americans safe. “But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported”. Regulations are therefore needed to ensure that governments and investigative and security services use AI responsibly.
The CEO ends his letter by recommending that Congress expand certain programs that train people for ‘new collar’ jobs. These are jobs that require specialized skills, such as positions in cybersecurity and cloud computing, skills that are not necessarily taught in a traditional four-year college degree. Programs such as P-TECH and Pell Grants are needed to educate people for these roles.