In January, Liz O’Sullivan wrote a letter to her boss at artificial intelligence startup Clarifai, asking him to set ethical limits on its Pentagon contracts. WIRED had previously revealed that the company worked on a controversial project processing drone imagery.

O’Sullivan urged CEO Matthew Zeiler to pledge the company would not contribute to the development of weapons that decide for themselves whom to harm or kill. At a company meeting a few days later, O’Sullivan says, Zeiler rebuffed the plea, telling staff he saw no problems with contributing to autonomous weapons. Clarifai did not respond to a request for comment.

O’Sullivan decided to take a stand. “I quit,” she says. “And cried through the weekend.” Come Monday, though, she took a previously planned trip to an academic conference on fairness and transparency in technology. There she met Adam Wenchel, who previously led Capital One’s AI work, and the pair got talking about the commercial opportunity of helping companies keep their AI deployments in check.

O’Sullivan and Wenchel are now among the cofounders of startup Arthur, which provides tools to help engineers monitor the performance of their machine learning systems. The tools are meant to make it easier to spot problems such as a financial system making biased lending or investment decisions. Arthur is one of several companies, large and small, trying to profit from building digital safety equipment for the AI era.

Researchers and tech companies are raising alarms about AI going awry, such as facial recognition algorithms that are less accurate on black faces. Microsoft and Google now caution investors that their AI systems may cause ethical or legal problems. As the technology spreads into other industries such as finance, healthcare, and government, so must new safeguards, says O’Sullivan, who is Arthur’s VP of commercial operations. “People are starting to realize how powerful these systems can be, and that they need to take advantage of the benefits in a way that is responsible,” she says.

Arthur and similar startups are tackling a drawback of machine learning, the engine of the recent AI boom. Unlike ordinary code written by humans, machine learning models adapt themselves to a particular problem, such as deciding who should get a loan, by extracting patterns from past data. Often, the many changes made during that adaptation, or learning, process aren’t easily understood. “You’re kind of having the machine write its own code, and it’s not designed for humans to reason through,” says Lukas Biewald, CEO and founder of startup Weights & Biases, which offers its own tools to help engineers debug machine learning software.
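Biewald’s point can be made concrete with a toy example. The sketch below (all data, numbers, and feature names are hypothetical) shows a model learning a loan-approval rule from past data by gradient descent. The result of the "learning" is just a set of weights, not logic a human can read:

```python
import math

# Hypothetical past loan data: (income fraction, debt ratio) -> 1 repaid, 0 defaulted.
examples = [
    ((0.65, 0.2), 1), ((0.80, 0.1), 1), ((0.90, 0.3), 1),
    ((0.30, 0.6), 0), ((0.45, 0.5), 0), ((0.25, 0.7), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Gradient descent "writes" the decision rule by repeatedly adjusting the weights.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(5000):
    for (x1, x2), y in examples:
        err = sigmoid(w[0] * x1 + w[1] * x2 + b) - y
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(income, debt_ratio):
    """Probability the applicant repays, according to the learned model."""
    return sigmoid(w[0] * income + w[1] * debt_ratio + b)

# The learned "code" is just three numbers -- not a rationale a human can read off.
print(f"w={w}, b={b:.2f}, approve 0.70/0.2: {predict(0.70, 0.2) > 0.5}")
```

Even in this tiny case, the model’s behavior lives entirely in opaque numeric weights; a real system with millions of parameters is far harder still to reason through, which is the gap monitoring tools aim to cover.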

Researchers describe some machine learning systems as “black boxes,” because even their creators can’t always describe exactly how they work, or why they made a particular decision. Arthur and others don’t claim to have fully solved that problem, but offer tools that make it easier to observe, visualize, and audit machine learning software’s behavior.

The large tech companies most heavily invested in machine learning have built similar tools for their own use. Facebook engineers used one called Fairness Flow to make sure its job ad recommendation algorithms work for people of different backgrounds. Biewald says that many companies without large AI teams don’t want to build such tools for themselves, and will turn to companies like his own instead.
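In spirit, such a check can be quite simple. Here is a minimal sketch (with hypothetical logged decisions; this is not Facebook’s actual Fairness Flow) of auditing an algorithm’s outcome rates across groups:

```python
from collections import defaultdict

# Hypothetical log of an algorithm's decisions: (group, approved).
decisions = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def approval_rates(log):
    """Approval rate per group from a decision log."""
    approved, total = defaultdict(int), defaultdict(int)
    for group, ok in log:
        total[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / total[g] for g in total}

rates = approval_rates(decisions)
gap = max(rates.values()) - min(rates.values())
print(rates, f"gap={gap:.2f}")  # a gap above a chosen threshold flags the model for review
```

Production systems add statistical rigor and scale, but the underlying idea, comparing how a model treats different groups of people, is the same.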

Weights & Biases customers include Toyota’s autonomous driving lab, which uses its software to monitor and record machine learning systems as they train on new data. That makes it easier for engineers to tune the system to be more reliable, and speeds investigation of any glitches encountered later, Biewald says. His startup has raised $20 million in funding. The company’s other customers include independent AI research lab OpenAI. It uses the startup’s tools in its robotics program, which this week demonstrated a robotic hand that can (sometimes) solve a modified Rubik’s Cube.
