Biden issues executive order to create safeguards for AI | ET REALITY


President Biden on Monday signed a far-reaching executive order on artificial intelligence, requiring companies to inform the federal government about the risks that their systems could help countries or terrorists make weapons of mass destruction. The order also seeks to diminish the dangers of “deepfakes” that could influence elections or defraud consumers.

“Deep fakes use AI-generated audio and video to defame reputations, spread fake news, and commit fraud,” Biden said at the signing of the order at the White House. He described his concern that scammers could take three seconds of a person’s voice and manipulate its content, turning an innocent comment into something more sinister that would quickly go viral.

“I’ve seen one from me,” Biden said, referring to an experiment his staff showed him to point out that a well-built AI system could convincingly create a presidential statement that never happened and thus trigger a political or national security crisis. “I said, ‘When the hell did I say that?’”

The order is an effort by the president to demonstrate that the United States, considered the leading power in rapidly advancing artificial intelligence technology, will also take the lead in regulating it. Europe is already moving forward with its own rules, and Vice President Kamala Harris will travel to Britain this week to represent the United States at an international conference hosted by that country’s Prime Minister Rishi Sunak.

“We have a moral, ethical and social duty to ensure that AI is adopted and advanced in a way that protects the public from potential harm,” Harris said at the White House. She added: “We intend for the actions we are taking at the national level to serve as a model for international action.”

But the order issued by Biden, the result of more than a year of work by several government departments, is limited in scope. While Biden has broad powers to regulate how the federal government uses artificial intelligence, he has less ability to reach out to the private sector. Although he said his order “represents bold action,” he acknowledged that “we still need Congress to act.”

Still, Biden made clear that he intended the order to be the first step in a new era of regulation for the United States, as he seeks to put up barriers to a global technology that offers great promise — diagnosing diseases, predicting floods and other effects of climate change, improving safety in the air and at sea — but also carries significant dangers.

“One thing is clear: To realize the promise of AI and avoid the risks, we must govern this technology,” Biden said. “In my opinion, there is no other way around it.”

The order focuses on safety and security mandates, but also contains provisions to encourage AI development in the United States, including attracting foreign talent to American companies and laboratories. Biden acknowledged that another element of his strategy is to stop China’s advances. He was specifically referring to new regulations — tightened two weeks ago — to deny Beijing access to the most powerful computer chips needed to produce so-called large language models, the mass of information with which artificial intelligence systems are trained.

While companies often chafe at new federal regulations, executives at companies like Microsoft, Google, OpenAI, and Meta have said they fully expect the United States to regulate the technology, and some executives have, surprisingly, seemed a bit relieved. Companies say they are concerned about corporate liability if the most powerful systems they use are abused. And they hope that putting a government imprimatur on some of their AI-based products can ease concerns among consumers.

The CEOs of Microsoft, Google, OpenAI and another AI startup, Anthropic, met with Harris in May, and in July they and three other companies voluntarily committed to security testing their systems.

“We like the focus on innovation, the steps the US government is taking to build an AI workforce, and the ability for smaller companies to get the computing power they need to develop their own models,” Robert L. Strayer, executive vice president at the Information Technology Industry Council, a trade group representing big technology companies, said Monday.

At the same time, several companies have warned against mandates for federal agencies to step up surveillance of anticompetitive conduct and consumer harm. The U.S. Chamber of Commerce on Monday expressed concern about new consumer protection directives, saying the Federal Trade Commission and the Consumer Financial Protection Bureau “should not view this as a license to do whatever they want.”

The executive order’s security mandates for companies were created by invoking a Korean War-era law, the Defense Production Act, which the federal government uses in what Biden called “the most urgent moments.” The order requires companies that deploy the most advanced artificial intelligence tools to test their systems to ensure they cannot be used to produce biological or nuclear weapons. Companies must report the results of those tests to the federal government, although they do not have to be made public.

The order also requires cloud service providers to report foreign customers to the federal government. It also recommends placing watermarks on photographs, videos and audio developed using artificial intelligence tools. Watermarks help trace the origin of online content and are used to combat deepfakes and manipulated images and text used to spread misinformation.

Biden, trying to make watermarks sound useful to Americans, said, “When your loved ones hear your voice on a phone, they’ll know it’s really you.”

In a speech Wednesday at the U.S. Embassy in London, Harris will announce new initiatives that build on the executive order, according to the White House. And at the British summit the following day, she will urge world leaders to consider the potentially calamitous risks of AI in the future, as well as the current dangers of bias, discrimination and misinformation.

Many of the order’s directives will be difficult to implement, said Sarah Kreps, a professor at Cornell University’s Technology Policy Institute. It calls for the rapid hiring of AI experts in the government, but federal agencies will be challenged to match salaries offered in the private sector. The order calls for privacy legislation, although more than a dozen bills have stalled in the divided Congress, she said.

“It demands a lot of action that will probably go unanswered,” Ms. Kreps said.
