It should be noted that Pierre Omidyar, the founder of eBay, and Reid Hoffman, the co-founder of LinkedIn, announced the formation of a similar research effort earlier this year. Bill Gates and Elon Musk have also expressed interest in these issues. While their accumulated findings may not agree, it is beneficial that society will have so many well-funded research efforts whose work it can study and debate. JL
Dave Gershgorn reports in Quartz:
When it comes to developing artificial intelligence, the largest technology companies in the world are all-in. Google and Microsoft say they’re “AI-first,” and businesses like Facebook and Amazon wouldn’t be possible without the scalable personalization that AI allows.
But if you look for research on how artificial intelligence affects society—like how algorithms used in criminal justice can discriminate against people of color, or whether data used to train AI contains implicit bias against women and minorities—there’s almost no academic or corporate research to be found.
Kate Crawford, principal researcher at Microsoft Research, and Meredith Whittaker, founder of Open Research at Google, want to change that. They announced the AI Now Institute, a research organization to explore how AI is affecting society at large. AI Now will be cross-disciplinary, bridging the gap between data scientists, lawyers, sociologists, and economists studying the implementation of artificial intelligence.
“The amount of money and industrial energy that has been put into accelerating AI code has meant that there hasn’t been as much energy put into thinking about social, economic, ethical frameworks for these systems,” Crawford tells Quartz. “We think there’s a very urgent need for this to happen faster.”
AI Now released a report last month that outlined many of the issues the institute’s researchers will explore more fully. The founders initially plan to hire fewer than 100 researchers.
The organization’s advisory board includes California Supreme Court Justice Mariano-Florentino Cuéllar, NAACP Legal Defense Fund president Sherrilyn Ifill, and former White House CTO Nicole Wong. Other board members are Cynthia Dwork, a co-inventor of differential privacy (an approach that has become a standard for protecting individuals’ data in large databases), and Mustafa Suleyman, cofounder of DeepMind.
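For readers unfamiliar with the concept, differential privacy works by adding carefully calibrated random noise to query results so that the output reveals little about any single person's record. Below is a minimal illustrative sketch of one common mechanism (Laplace noise on a counting query); the function name, threshold, and data are hypothetical, and real deployments rely on audited libraries rather than a toy like this:

```python
import numpy as np

def dp_count(values, epsilon=1.0, threshold=50):
    """Return a differentially private count of values above a threshold.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon provides epsilon-differential privacy for this query.
    """
    true_count = sum(1 for v in values if v > threshold)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: a noisy count over hypothetical survey responses.
ages = [23, 41, 35, 62, 57, 19, 44, 71, 38, 29]
print(dp_count(ages, epsilon=0.5))  # noisy count of respondents over 50
```

Smaller values of epsilon mean more noise and stronger privacy; the reported count is useful in aggregate while obscuring whether any particular individual is in the data.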
The institute will be based at New York University, where many academics studied the artificial neural networks responsible for today’s AI boom. AI Now is partnered with eight NYU schools, including the NYU School of Law and the Steinhardt School of Culture, Education, and Human Development.
AI Now will focus on four major themes:

- Bias and inclusion (how bad data can disadvantage people)
- Labor and automation (who doesn’t get hired when AI does the choosing)
- Rights and liberties (how government use of AI affects the way it interacts with citizens)
- Safety and critical infrastructure (how to make sure decisions in areas like healthcare are made safely and without bias)

Crawford and Whittaker have worked for years on such issues within Google and Microsoft. One barrier to creating solutions to AI’s societal problems is the lack of a shared language between the people who build AI and those studying its implications and effects.
“Part of what we’re doing is talking to the people who build the systems about the real practices and processes around this. Where are the assumptions?” says Whittaker. “And what don’t you know, that you would want to know, if you were going to do this in a way that you felt was responsible?”