SAN FRANCISCO, June 20 (Reuters) – The risks of artificial intelligence to national security and the economy need to be addressed, U.S. President Joe Biden said on Tuesday, adding he would seek expert advice.
“My administration is committed to safeguarding Americans’ rights and safety while protecting privacy, to addressing bias and misinformation, to making sure AI systems are safe before they are released,” Biden said at an event in San Francisco.
Biden met a group of civil society leaders and advocates, who have previously criticized the influence of major tech companies, to discuss artificial intelligence.
“I wanted to hear directly from the experts,” he said.
Several governments are considering how to mitigate the dangers of the emerging technology, which has experienced a boom in investment and consumer popularity in recent months after the release of OpenAI’s ChatGPT.
Biden’s meeting on Tuesday included Tristan Harris, executive director of the Center for Humane Technology, Algorithmic Justice League founder Joy Buolamwini and Stanford University Professor Rob Reich.
Regulators globally have been scrambling to draw up rules governing the use of generative AI, which can create text and images, and whose impact has been compared to that of the internet.
Biden has also recently discussed AI with other world leaders, including British Prime Minister Rishi Sunak, whose government will hold the first global summit on artificial intelligence safety later this year. Biden is expected to raise the topic with Indian Prime Minister Narendra Modi during the Indian leader's ongoing U.S. visit.
European Union lawmakers agreed last week to changes in draft rules on artificial intelligence proposed by the European Commission, in a bid to set a global standard for a technology used in everything from automated factories to self-driving cars to chatbots.
Reporting by Trevor Hunnicutt in San Francisco
Writing by Kanishka Singh
Editing by Chris Reese, Alistair Bell and Matthew Lewis