{"id":152382,"date":"2025-02-25T22:43:24","date_gmt":"2025-02-25T22:43:24","guid":{"rendered":"https:\/\/entertainment.runfyers.com\/index.php\/2025\/02\/25\/anthropics-latest-flagship-ai-might-not-have-been-incredibly-costly-to-train-techcrunch\/"},"modified":"2025-02-25T22:43:24","modified_gmt":"2025-02-25T22:43:24","slug":"anthropics-latest-flagship-ai-might-not-have-been-incredibly-costly-to-train-techcrunch","status":"publish","type":"post","link":"https:\/\/entertainment.runfyers.com\/index.php\/2025\/02\/25\/anthropics-latest-flagship-ai-might-not-have-been-incredibly-costly-to-train-techcrunch\/","title":{"rendered":"Anthropic&#8217;s latest flagship AI might not have been incredibly costly to train | TechCrunch"},"content":{"rendered":"<div>\n<p id=\"speakable-summary\" class=\"wp-block-paragraph\">Anthropic\u2019s newest flagship AI model, Claude 3.7 Sonnet, cost \u201ca few tens of millions of dollars\u201d to train using less than 10^26 FLOPs of computing power. <\/p>\n<p class=\"wp-block-paragraph\">That\u2019s according to Wharton professor\u00a0Ethan Mollick, who in an X post on Monday relayed a clarification he\u2019d received from Anthropic\u2019s PR. \u201cI was contacted by Anthropic who told me that Sonnet 3.7 would not be considered a 10^26 FLOP model and cost a few tens of millions of dollars,\u201d <a href=\"https:\/\/x.com\/emollick\/status\/1894258450852401243\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">he wrote<\/a>, \u201cthough future models will be much bigger.\u201d<\/p>\n<p class=\"wp-block-paragraph\">TechCrunch reached out to Anthropic for confirmation but hadn\u2019t received a response as of publication time.<\/p>\n<p class=\"wp-block-paragraph\">Assuming Claude 3.7 Sonnet indeed cost just \u201ca few tens of millions of dollars\u201d to train, not factoring in related expenses, it\u2019s a sign of how relatively cheap it\u2019s becoming to release state-of-the-art models. 
Claude 3.5 Sonnet, its predecessor, released in fall 2024, <a href=\"https:\/\/darioamodei.com\/on-deepseek-and-export-controls\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">similarly cost a few tens of millions of dollars to train<\/a>, Anthropic CEO Dario Amodei revealed in a recent essay.<\/p>\n<p class=\"wp-block-paragraph\">Those totals compare favorably to the training price tags of 2023\u2019s top models. To develop its GPT-4 model, OpenAI spent more than $100 million, <a href=\"https:\/\/www.techradar.com\/pro\/openai-spent-usd80m-to-usd100m-training-gpt-4-chinese-firm-claims-it-trained-its-rival-ai-model-for-usd3-million-using-just-2-000-gpus\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">according<\/a> to OpenAI CEO Sam Altman. Meanwhile, Google spent close to $200 million to train its Gemini Ultra model, a Stanford study <a href=\"https:\/\/aiindex.stanford.edu\/wp-content\/uploads\/2024\/04\/HAI_2024_AI-Index-Report.pdf\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">estimated<\/a>.<\/p>\n<p class=\"wp-block-paragraph\">That being said, Amodei expects future AI models to <a href=\"https:\/\/www.businessinsider.com\/anthropic-ceo-cost-10-billion-train-ai-years-language-model-2024-4\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">cost billions of dollars<\/a>. Certainly, training costs don\u2019t capture work like safety testing and fundamental research. 
Moreover, as the AI industry embraces \u201creasoning\u201d models that work on problems for <a href=\"https:\/\/techcrunch.com\/2024\/11\/20\/ai-scaling-laws-are-showing-diminishing-returns-forcing-ai-labs-to-change-course\/\" target=\"_blank\" rel=\"noopener\">extended periods of time<\/a>, the computing costs of running models will likely continue to rise.<\/p>\n<\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/techcrunch.com\/2025\/02\/25\/anthropics-latest-flagship-ai-might-not-have-been-incredibly-costly-to-train\/\" target=\"_blank\" rel=\"noopener\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Anthropic\u2019s newest flagship AI model, Claude 3.7 Sonnet, cost \u201ca few tens of millions of dollars\u201d to train using less than 10^26 FLOPs of computing power. That\u2019s according to Wharton professor\u00a0Ethan Mollick, who in an X post on Monday relayed a clarification he\u2019d received from Anthropic\u2019s PR. \u201cI was contacted by Anthropic who told me 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":152383,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14],"tags":[],"class_list":{"0":"post-152382","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-tech"},"_links":{"self":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/152382","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/comments?post=152382"}],"version-history":[{"count":0,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/152382\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media\/152383"}],"wp:attachment":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media?parent=152382"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/categories?post=152382"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/tags?post=152382"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}