Google CEO Sundar Pichai warns against unregulated AI deployment

AFP

In an interview on CBS’ 60 Minutes, Alphabet Inc and Google CEO Sundar Pichai stressed the importance of regulating artificial intelligence technology to avoid potential harm.

Pichai spoke of the urgency of deploying AI in a beneficial way, while acknowledging its potential to cause harm if deployed wrongly.

Google, one of the leaders in developing and implementing AI across its services, has been deliberately measured and cautious in deploying the technology. However, OpenAI’s ChatGPT has set off a race to bring AI tools to market at a much faster pace.

While Google is playing catch-up in its push to infuse its products with generative AI, Pichai warned against companies being swept up in the competitive dynamics.

"One of the points they have made is, you don’t want to put out a tech like this when it’s very, very powerful because it gives society no time to adapt," Pichai said. "I think that’s a reasonable perspective. I think there are responsible people there trying to figure out how to approach this technology, and so are we."

Pichai also highlighted the risks of generative AI, specifically deep-fake videos, which can cause harm to society. He stressed the need for regulation and consequences for creating deep-fake videos. "Anybody who has worked with AI for a while, you know, you realise this is something so different and so deep that we would need societal regulations to think about how to adapt," he said.

Former Google CEO Eric Schmidt also urged global tech companies to come together and develop standards and appropriate guardrails, warning that any slowdown in development would "simply benefit China".

Pichai acknowledged the fast pace of AI technology, and emphasised the need for responsible individuals and companies to work together to regulate its use. "We don’t have all the answers there yet, and the technology is moving fast," Pichai said. "So does that keep me up at night? Absolutely."
