{"id":27411,"date":"2023-07-21T09:01:19","date_gmt":"2023-07-21T09:01:19","guid":{"rendered":"https:\/\/entertainment.runfyers.com\/index.php\/2023\/07\/21\/top-ai-companies-visit-the-white-house-to-make-voluntary-safety-commitments-techcrunch\/"},"modified":"2023-07-21T09:01:19","modified_gmt":"2023-07-21T09:01:19","slug":"top-ai-companies-visit-the-white-house-to-make-voluntary-safety-commitments-techcrunch","status":"publish","type":"post","link":"https:\/\/entertainment.runfyers.com\/index.php\/2023\/07\/21\/top-ai-companies-visit-the-white-house-to-make-voluntary-safety-commitments-techcrunch\/","title":{"rendered":"Top AI companies visit the White House to make &#8216;voluntary&#8217; safety commitments | TechCrunch"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p id=\"speakable-summary\">While substantive AI legislation may still be years away, the industry is moving at light speed and many \u2014 including the White House \u2014 are worried that it may get carried away. So the Biden administration has collected \u201cvoluntary commitments\u201d from 7 of the biggest AI developers to pursue shared safety and transparency goals ahead of a planned Executive Order.<\/p>\n<p>OpenAI, Anthropic, Google, Inflection, Microsoft, Meta, and Amazon are the companies taking part in this non-binding agreement, and will send representatives to the White House to meet with President Biden today.<\/p>\n<p>To be clear, there is no rule or enforcement being proposed here \u2014 the practices agreed to are purely voluntary. 
And although no government agency will hold a company accountable if it shirks a few, any lapses will likely be a matter of public record.<\/p>\n<p>Here\u2019s the list of attendees at the White House gig:<\/p>\n<ul>\n<li>Brad Smith, President, Microsoft<\/li>\n<li>Kent Walker, President, Google<\/li>\n<li>Dario Amodei, CEO, Anthropic<\/li>\n<li>Mustafa Suleyman, CEO, Inflection AI<\/li>\n<li>Nick Clegg, President, Meta<\/li>\n<li>Greg Brockman, President, OpenAI<\/li>\n<li>Adam Selipsky, CEO, Amazon Web Services<\/li>\n<\/ul>\n<p>No underlings, but no billionaires, either. (And no women.)<\/p>\n<p>The seven companies (and likely others that didn\u2019t get the red carpet treatment but will want to ride along) have committed to the following:<\/p>\n<ul>\n<li>Internal and external security tests of AI systems before release, including adversarial <a href=\"https:\/\/learn.microsoft.com\/en-us\/azure\/ai-services\/openai\/concepts\/red-teaming\" target=\"_blank\" rel=\"noopener\">\u201cred teaming\u201d<\/a> by experts outside the company.<\/li>\n<li>Share information across government, academia, and \u201ccivil society\u201d on AI risks and mitigation techniques (such as preventing \u201cjailbreaking\u201d).<\/li>\n<li>Invest in cybersecurity and \u201cinsider threat safeguards\u201d to protect private model data like weights. This is important not just to protect IP but because premature wide release could represent an opportunity for malicious actors.<\/li>\n<li>Facilitate third-party discovery and reporting of vulnerabilities, e.g. 
a bug bounty program or domain expert analysis.<\/li>\n<li>Develop robust watermarking or some other way of marking AI-generated content.<\/li>\n<li>Report AI systems\u2019 \u201ccapabilities, limitations, and areas of appropriate and inappropriate use.\u201d Good luck getting a straight answer on this one.<\/li>\n<li>Prioritize research on societal risks like systematic bias or privacy issues.<\/li>\n<li>Develop and deploy AI \u201cto help address society\u2019s greatest challenges\u201d like cancer prevention and climate change. (Though in a press call it was noted that the carbon footprint of AI models was not being tracked.)<\/li>\n<\/ul>\n<p>Though the above are voluntary, one can easily imagine that the threat of an Executive Order \u2014 they are \u201ccurrently developing\u201d one \u2014 is there to encourage compliance. For instance, if some companies fail to allow external security testing of their models before release, the E.O. may <em>develop<\/em> a paragraph directing the FTC to look closely at AI products claiming robust security. (One E.O. is already in force asking agencies to watch out for bias in the development and use of AI.)<\/p>\n<p>The White House is plainly eager to get out ahead of this next big wave of tech, having been caught somewhat flat-footed by the disruptive capabilities of social media. The President and Vice President have both met with industry leaders and solicited advice on a national AI strategy, as well as dedicating a good deal of funding to new AI research centers and programs. 
Of course, the national science and research apparatus is well ahead of them, as this highly comprehensive (though necessarily slightly out of date) <a href=\"https:\/\/www.anl.gov\/ai-for-science-report\" target=\"_blank\" rel=\"noopener\">research challenges and opportunities report from the DOE and National Labs shows<\/a>.<\/p>\n<\/div>\n<p><a href=\"https:\/\/techcrunch.com\/2023\/07\/21\/top-ai-companies-visit-the-white-house-to-make-voluntary-safety-commitments\/\" target=\"_blank\" rel=\"noopener\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>While substantive AI legislation may still be years away, the industry is moving at light speed and many \u2014 including the White House \u2014 are worried that it may get carried away. So the Biden administration has collected \u201cvoluntary commitments\u201d from 7 of the biggest AI developers to pursue shared safety and transparency goals ahead [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":27412,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14],"tags":[],"class_list":{"0":"post-27411","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-tech"},"_links":{"self":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/27411","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/comments?post=27411"}],"version-history":[{"count":0,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/
v2\/posts\/27411\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media\/27412"}],"wp:attachment":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media?parent=27411"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/categories?post=27411"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/tags?post=27411"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}