{"id":45909,"date":"2023-10-14T20:08:35","date_gmt":"2023-10-14T20:08:35","guid":{"rendered":"https:\/\/entertainment.runfyers.com\/index.php\/2023\/10\/14\/how-roboticists-are-thinking-about-generative-ai-techcrunch\/"},"modified":"2023-10-14T20:08:35","modified_gmt":"2023-10-14T20:08:35","slug":"how-roboticists-are-thinking-about-generative-ai-techcrunch","status":"publish","type":"post","link":"https:\/\/entertainment.runfyers.com\/index.php\/2023\/10\/14\/how-roboticists-are-thinking-about-generative-ai-techcrunch\/","title":{"rendered":"How roboticists are thinking about generative AI | TechCrunch"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p id=\"speakable-summary\"><em>[A version of this piece first appeared in TechCrunch\u2019s robotics newsletter, Actuator. <a href=\"https:\/\/link.techcrunch.com\/join\/134\/signup-all-newsletters\" target=\"_blank\" rel=\"noopener\">Subscribe here<\/a>.]<\/em><\/p>\n<p>The topic of generative AI comes up frequently in my newsletter, Actuator. I admit that I was a bit hesitant to spend more time on the subject a few months back. Anyone who has been reporting on technology for as long as I have has lived through countless hype cycles and been burned before. Reporting on tech requires a healthy dose of skepticism, hopefully tempered by some excitement about what can be done.<\/p>\n<p>This time out, it seemed generative AI was waiting in the wings, biding its time, waiting for the inevitable cratering of crypto. As the blood drained out of that category, projects like ChatGPT and DALL-E were standing by, ready to be the focus of breathless reporting, hopefulness, criticism, doomerism and all the different K\u00fcbler-Rossian stages of the tech hype bubble.<\/p>\n<p>Those who follow my stuff know that I was never especially bullish on crypto. Things are, however, different with generative AI. 
For starters, there\u2019s near-universal agreement that artificial intelligence\/machine learning broadly will play a more central role in our lives going forward.<\/p>\n<p>Smartphones offer great insight here. Computational photography is something I write about somewhat regularly. There have been great advances on that front in recent years, and I think many manufacturers have finally struck a good balance between hardware and software when it comes to both improving the end product and lowering the barrier to entry. Google, for instance, pulls off some truly impressive tricks with editing features like<a href=\"https:\/\/techcrunch.com\/2023\/10\/04\/google-announces-ai-powered-photo-editing-features-for-new-pixel-phones\/\" target=\"_blank\" rel=\"noopener\"> Best Take and Magic Eraser<\/a>.<\/p>\n<p>Sure, they\u2019re neat tricks, but they\u2019re also useful, rather than being features for features\u2019 sake. Moving forward, however, the real trick will be seamlessly integrating them into the experience. With ideal future workflows, most users will have little to no notion of what\u2019s happening behind the scenes. They\u2019ll just be happy that it works. It\u2019s the classic Apple playbook.<\/p>\n<p>Generative AI offers a similar \u201cwow\u201d effect out of the gate, which is another way it differs from its hype-cycle predecessor. When your least tech-savvy relative can sit at a computer, type a few words into a dialogue field and then watch as the black box spits out paintings and short stories, there isn\u2019t much conceptualizing required. That\u2019s a big part of the reason all of this caught on as quickly as it did \u2014 most times when everyday people get pitched cutting-edge technologies, it requires them to visualize how it might look five or 10 years down the road.<\/p>\n<p>With ChatGPT, DALL-E, etc., you can experience it firsthand right now. Of course, the flip side of this is how difficult it becomes to temper expectations. 
Much as people are inclined to imbue robots with human or animal intelligence, without a fundamental understanding of AI, it\u2019s easy to project intentionality here. But that\u2019s just how things go now. We lead with the attention-grabbing headline and hope people stick around long enough to read about the machinations behind it.<\/p>\n<p>Spoiler alert: Nine times out of 10 they won\u2019t, and suddenly we\u2019re spending months and years attempting to walk things back to reality.<\/p>\n<p>One of the nice perks of my job is the ability to break these things down with people much smarter than me. They take the time to explain things, and hopefully I do a good job translating that for readers (some attempts are more successful than others).<\/p>\n<p>Since it became clear that generative AI has an important role to play in the future of robotics, I\u2019ve been finding ways to shoehorn questions into conversations. I find that most people in the field agree with the statement in the previous sentence, and it\u2019s fascinating to see the breadth of impact they believe it will have.<\/p>\n<p>For example, in my<a href=\"https:\/\/techcrunch.com\/2023\/10\/01\/a-tale-of-two-research-institutes\/\" target=\"_blank\" rel=\"noopener\"> recent conversation with Marc Raibert and Gill Pratt<\/a>, the latter explained the role generative AI is playing in his team\u2019s approach to robot learning:<\/p>\n<blockquote>\n<p>We have figured out how to do something, which is to use modern generative AI techniques that enable human demonstration of both position and force to essentially teach a robot from just a handful of examples. The code is not changed at all. What this is based on is something called diffusion policy. It\u2019s work that we did in collaboration with Columbia and MIT. 
We\u2019ve taught 60 different skills so far.<\/p>\n<\/blockquote>\n<p><a href=\"https:\/\/techcrunch.com\/2023\/10\/07\/how-nvidia-became-a-major-player-in-robotics\/\" target=\"_blank\" rel=\"noopener\">Last week<\/a>, when I asked Nvidia\u2019s VP and GM of Embedded and Edge Computing, Deepu Talla, why the company believes generative AI is more than a fad, he told me:<\/p>\n<blockquote>\n<p>I think it speaks in the results. You can already see the productivity improvement. It can compose an email for me. It\u2019s not exactly right, but I don\u2019t have to start from zero. It\u2019s giving me 70%. There are obvious things you can already see that are definitely a step function better than how things were before. Summarizing something\u2019s not perfect. I\u2019m not going to let it read and summarize for me. So, you can already see some signs of productivity improvements.<\/p>\n<\/blockquote>\n<p>Meanwhile, during my<a href=\"https:\/\/techcrunch.com\/2023\/08\/17\/what-is-a-liquid-neural-network-really\/\" target=\"_blank\" rel=\"noopener\"> last conversation with Daniela Rus<\/a>, the MIT CSAIL head explained how researchers are using generative AI to actually design the robots:<\/p>\n<blockquote>\n<p>It turns out that generative AI can be quite powerful for solving even motion planning problems. You can get much faster solutions and much more fluid and human-like solutions for control than with model predictive solutions. I think that\u2019s very powerful, because the robots of the future will be much less roboticized. They will be much more fluid and human-like in their motions.<\/p>\n<p>We\u2019ve also used generative AI for design. This is very powerful. It\u2019s also very interesting, because it\u2019s not just pattern generation for robots. You have to do something else. It can\u2019t just be generating a pattern based on data. The machines have to make sense in the context of physics and the physical world. 
For that reason, we connect them to a physics-based simulation engine to make sure the designs meet their required constraints.<\/p>\n<\/blockquote>\n<p>This week, a team at Northwestern University<a href=\"https:\/\/news.northwestern.edu\/stories\/2023\/10\/instant-evolution-ai-designs-new-robot-from-scratch-in-seconds\/\" target=\"_blank\" rel=\"noopener\"> unveiled its own research<\/a> into AI-generated robot design. The researchers showcased how they designed a \u201csuccessfully walking robot in mere seconds.\u201d It\u2019s not much to look at, as these things go, but it\u2019s easy enough to see how, with additional research, the approach could be used to create more complex systems.<\/p>\n<p>\u201cWe discovered a very fast AI-driven design algorithm that bypasses the traffic jams of evolution, without falling back on the bias of human designers,\u201d said research lead Sam Kriegman. \u201cWe told the AI that we wanted a robot that could walk across land. Then we simply pressed a button and presto! It generated a blueprint for a robot in the blink of an eye that looks nothing like any animal that has ever walked the earth. I call this process \u2018instant evolution.\u2019\u201d<\/p>\n<p>It was the AI program\u2019s choice to put legs on the small, squishy robot. \u201cIt\u2019s interesting because we didn\u2019t tell the AI that a robot should have legs,\u201d Kriegman added. \u201cIt rediscovered that legs are a good way to move around on land. Legged locomotion is, in fact, the most efficient form of terrestrial movement.\u201d<\/p>\n<p>\u201cFrom my perspective, generative AI and physical automation\/robotics are what\u2019s going to change everything we know about life on Earth,\u201d Formant founder and CEO Jeff Linnell told me this week. \u201cI think we\u2019re all hip to the fact that AI is a thing and are expecting every one of our jobs, every company and student will be impacted. I think it\u2019s symbiotic with robotics. 
You\u2019re not going to have to program a robot. You\u2019re going to speak to the robot in English, request an action and then it will be figured out. It\u2019s going to be a minute for that.\u201d<\/p>\n<p>Prior to Formant, Linnell founded and served as CEO of Bot &amp; Dolly. The San Francisco\u2013based firm, best known for its work on the film <em>Gravity<\/em>, was hoovered up by Google in 2013 as the software giant set its sights on accelerating the industry (the best-laid plans, etc.). The executive tells me that his key takeaway from that experience is that it\u2019s all about the software (given the arrival of Intrinsic and Everyday Robots\u2019 absorption into DeepMind, I\u2019m inclined to say Google agrees).<\/p>\n<\/div>\n<p><a href=\"https:\/\/techcrunch.com\/2023\/10\/14\/how-roboticists-are-thinking-about-generative-ai\/\" target=\"_blank\" rel=\"noopener\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>[A version of this piece first appeared in TechCrunch\u2019s robotics newsletter, Actuator. Subscribe here.] The topic of generative AI comes up frequently in my newsletter, Actuator. I admit that I was a bit hesitant to spend more time on the subject a few months back. 
Anyone who has been reporting on technology for as long [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":45910,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14],"tags":[],"class_list":{"0":"post-45909","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-tech"},"_links":{"self":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/45909","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/comments?post=45909"}],"version-history":[{"count":0,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/45909\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media\/45910"}],"wp:attachment":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media?parent=45909"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/categories?post=45909"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/tags?post=45909"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}