Tech leaders were invited to the White House and instructed to protect people from the dangers that artificial intelligence (AI) poses. Sam Altman of OpenAI, Satya Nadella of Microsoft, and Sundar Pichai of Google were told they had a ‘moral’ duty to defend society, the BBC reported.
The White House made it clear that further industry regulation was possible. Public fascination with recently released AI products like ChatGPT and Bard is growing.
These tools let ordinary people work with ‘generative AI,’ which can quickly summarize information from many sources, debug computer code, and produce convincingly human-sounding presentations and even poetry, among other things.
Because they provide a concrete example of the possible benefits and drawbacks of the new technology, their implementation has generated fresh discussion about the place of AI in society.
When they gathered at the White House, the technology leaders were advised that the government was open to new regulations and laws covering artificial intelligence, and they were told it was their responsibility to guarantee the safety and security of their services.
According to a report from the White House, the National Science Foundation will invest $140 million (£111 million) in seven new AI research centers.
Both lawmakers and tech leaders have been calling for stronger regulation of the rapidly expanding field of AI.
Geoffrey Hinton, known as the ‘godfather’ of AI, left his job at Google earlier than expected, saying he regretted his work. He told the BBC that some of the risks associated with AI-powered chatbots were ‘quite scary.’
There are concerns that AI may swiftly replace people’s jobs, and that chatbots like ChatGPT and Bard may be inaccurate and disseminate misleading information.
There are also concerns that generative AI may violate copyright law. Voice-cloning AI could worsen fraud, and AI-generated videos can be used to spread false information.