Alondra Nelson Wants to Make Science and Tech More Just

The pandemic taught us a lesson that we needed to learn again, says Alondra Nelson: Science and technology have everything to do with issues of society, inequality, and social life.

After a year in which science became politicized amid a pandemic and a presidential campaign, in January President-elect Joe Biden appointed Nelson deputy director of science and society in the White House Office of Science and Technology Policy, a new position. Nelson will build a science and society division within the OSTP aimed at addressing issues ranging from data and democracy to STEM education. In another first, Biden made his science adviser, Eric Lander, who is also director of OSTP, a member of his cabinet.

Nelson has spent her career at the intersection of race, tech, and society, writing about topics like how Afrofuturism can make the world better and how the Black Panthers used health care as a form of activism, which led the organization to develop an early interest in genetics. She’s the author of several books, including The Social Life of DNA, which looks at the rise of the consumer genetics testing industry and how a desire to learn about their lineage led Black and Mormon people to become early users of the technology.

Nelson is a professor at the Institute for Advanced Study in Princeton, New Jersey. Before her appointment, she was writing a book about the OSTP and the major scientific initiatives of the Obama administration, including a series of reports on AI and government policy.

In her first formal remarks in her new role in January, Nelson called science a social phenomenon and said technology such as artificial intelligence can reveal or reflect dangerous social architectures that undergird the pursuit of scientific progress. In an interview with WIRED, Nelson said the Black community in particular is overexposed to the harms of science and technology and is underserved by the benefits.

In the interview, she talked about the Biden administration’s plans for scientific moonshots, why the administration has no formal position on banning facial recognition, and issues related to emerging technology and society that she thinks must be addressed during the administration’s time in office. An edited transcript follows.

WIRED: In January you talked about the “dangerous social architecture that lies beneath the scientific progress that we pursue” and cited gene editing and artificial intelligence. What prompted you to mention gene editing and AI in your first public remarks in this role?

Alondra Nelson: I think what genetic science and AI share is that they are data centric. There are things that we know about data and how data analysis works at scale that are as true of large-scale genomic analysis as they are of machine learning in some regard, and so these are kind of foundational. What I think we still need to address as a nation are questions about the provenance of data analyzed with AI tools and questions about who gets to make decisions about what variables are used and what questions are posed of scientific and technical research. What I hope is different and distinctive about this OSTP is a sense of honesty about the past. Science and technology have harmed some communities, left others out, and excluded people from doing the work of science and technology.

Working in an administration that on day one identified racial equity and restoring trust in government as key issues means that the work of science and technology policy has to be really honest about the past. Part of restoring trust in government, and in the ability of science and technology to do any kind of good in the world, is being open about the history of science and technology’s flaws and failures.

Unfortunately, there are lots of examples. Next month marks another anniversary of the Associated Press story that exposed the Tuskegee syphilis study almost 50 years ago. Then of course we’ve got issues in AI, where research shows that the data being used is incomplete, and that incompleteness means systems make inferences that are incomplete and inaccurate and, when used in social services and the criminal justice system in particular, have real, disproportionately harmful effects on Black and brown communities.

Lander said in his confirmation hearing that OSTP will address discrimination stemming from algorithmic bias. How will that work?
