{"id":74776,"date":"2024-02-10T00:24:16","date_gmt":"2024-02-10T00:24:16","guid":{"rendered":"https:\/\/entertainment.runfyers.com\/index.php\/2024\/02\/10\/how-to-fake-a-robotics-demo-for-fun-and-profit-techcrunch\/"},"modified":"2024-02-10T00:24:16","modified_gmt":"2024-02-10T00:24:16","slug":"how-to-fake-a-robotics-demo-for-fun-and-profit-techcrunch","status":"publish","type":"post","link":"https:\/\/entertainment.runfyers.com\/index.php\/2024\/02\/10\/how-to-fake-a-robotics-demo-for-fun-and-profit-techcrunch\/","title":{"rendered":"How to fake a robotics demo for fun and profit | TechCrunch"},"content":{"rendered":"<div>\n<p id=\"speakable-summary\"><span class=\"featured__span-first-words\">In March 2008<\/span>, a roboticist in winter wear gave Big Dog a big kick for the camera. The buzzing DARPA-funded robot stumbled, but quickly regained its footing amid the snowy parking lot. \u201cPLEASE DO NOT KICK THE WALKING PROTOTYPE DEATH MECH,\u201d pleads the video\u2019s top comment. \u201cIT WILL REMEMBER.\u201d<\/p>\n<p>\u201cCreepy as hell,\u201d notes another. \u201cImagine if you were taking a walk in the woods one day and saw that thing coming towards you.\u201d Gadget blogs and social media accounts variously tossed out words like \u201cterrifying\u201d and \u201crobopocalypse,\u201d in those days before Black Mirror gave the world an even more direct shorthand. Boston Dynamics had a hit. The video currently stands at 17 million views. It was the first of countless viral hits that continue to this day.<\/p>\n<p>It\u2019s hard to overstate the role such virality has played in Boston Dynamics\u2019 subsequent development into one of the world\u2019s most instantly identifiable robotics companies. Big Dog and its descendants like Spot and Atlas have been celebrated, demonized, parodied and even appeared in a Sam Adams beer ad. 
Along with developing some of the world\u2019s most advanced mechatronics, the Boston Dynamics team have proven themselves to be extremely savvy marketers.<\/p>\n<p>There\u2019s much to be said for the role such videos have played in spreading the gospel of robotics.<\/p>\n<p>It seems likely that videos like this have inspired the careers of countless roboticists who are currently thriving in the field. It\u2019s a model countless subsequent startups have adopted with varying degrees of success. Boston Dynamics certainly can\u2019t be held responsible for any of those companies that might have taken a few shortcuts along the way.<\/p>\n<p>In recent decades, viral robot videos have grown from objects of curiosity among the technorati to headline-grabbing hits filtered through TikTok and YouTube. As the potential rewards have increased, so too has the desire to soften the edges. Further complicating matters is the state of CGI, which has become indistinguishable from reality for many viewers. Confirmation bias, attraction to novelty and a lack of technical expertise all play key roles in our tendency to believe fake news and videos.<\/p>\n<p>You can forgive the average TikTok viewer, for instance, for not understanding the intricacies of generalization. Many roboticists have \u2014 perhaps unintentionally \u2014 added fuel to that fire by implying that the systems we\u2019re seeing in videos are \u201cgeneral purpose.\u201d Multi-purpose, perhaps, but we\u2019re still some ways off from robots that can perform any task not hampered by hardware limitations.<\/p>\n<p>More often than not, the videos you see are the product of months or years of work. Somewhere on a hard drive sit the hours of video that didn\u2019t make it into the final cut, featuring a robot stumbling, sputtering or stopping short. This is precisely why I\u2019ve encouraged companies to share some of these videos with the TechCrunch audience. Perhaps unsurprisingly, few have taken me up on the offer. 
I suspect much of this comes down to how people perceive such information. Among roboticists, the hours and days of trial and failure are an indication of how hard you\u2019ve worked to get to the final product. Among the general public, however, such robot failures may be seen as a failure on the part of the roboticists themselves.<\/p>\n<p>Back in a 2023 issue of Actuator (RIP), I praised Boston Dynamics for the <a href=\"https:\/\/techcrunch.com\/2023\/03\/09\/mistakes-were-made-and-thats-fine\/\" target=\"_blank\" rel=\"noopener\">\u201cblooper reel\u201d<\/a> they published featuring Atlas losing its footing and falling in between successful parkour moves. As usual, a lot more ended up on the cutting room floor than made the final cut. Even when not dealing with robots, that\u2019s just how things go.<\/p>\n<p>A few weeks back, I attended a talk by director Kelly Reichardt following a screening of her wonderful new(ish) film, \u201cShowing Up.\u201d She reiterated that old W.C. Fields chestnut about never working with children or animals. In most cases, I would probably add advanced mechatronics to that list.<\/p>\n<p>Along with CG\/renders, creative editing is just one of many potential ways to sweeten a robotics demo. More often than not, the intent is not malicious. A sentiment musicians frequently share with me on <a href=\"https:\/\/riylcast.com\/\" target=\"_blank\" rel=\"noopener\">my podcast<\/a> is that once a song is released into the world, you no longer have control over it. To a certain extent, I believe the same can be true with video. Choices are made to tighten things up and sweeten the presentation. These are an essential part of making consumable online videos. Especially in the age of TikTok, however, context is the first casualty.<\/p>\n<p>There\u2019s no rulebook for what information one needs to include in a robotics demo. 
The more I think about it, however, the more I believe there should be \u2014 at the very least \u2014 some well-defined guidelines. I am not a roboticist. I\u2019m just a nerd with a BA in creative writing. I do, however, regularly speak with people far smarter than myself about the subject.<\/p>\n<p>Just ahead of CES, a <a href=\"https:\/\/www.linkedin.com\/posts\/brad-porter-1a989_ces2024-activity-7149844183699079168-BxW_?utm_source=share&amp;utm_medium=member_desktop\" target=\"_blank\" rel=\"noopener\">LinkedIn post<\/a> caught my eye (as well as, it seems, the eyes of much of the robotics community). It was penned by Brad Porter, the <a href=\"https:\/\/www.co.bot\/\" target=\"_blank\" rel=\"noopener\">Collaborative Robotics<\/a> founder and CEO who formerly headed Amazon\u2019s industrial robotics efforts. I rarely recommend LinkedIn follows, but if you care about the space at all, he\u2019s a good one.<\/p>\n<p>In the piece, Porter notes that CES would likely be lousy with cool robotics demos (it was), but adds, \u201cthere are also a lot of amazing trick-shot videos out there. Separating reality from stagecraft is hard.\u201d The executive wasn\u2019t implying any of the negative baggage that a word like \u201cstagecraft\u201d might have in this context. He was instead simply suggesting that viewers approach such videos with a discerning and \u2014 perhaps \u2014 skeptical eye.<\/p>\n<p>I\u2019ve been covering this space for a number of years and have developed some of the skills to spot robotic kayfabe. But I still often lean on experts in the field like Porter when a demo feels off. Of course, not every viewer has my experience or access to these folks. They can, however, equip themselves with the knowledge of how such videos are sweetened \u2014 maliciously or otherwise.<\/p>\n<p>Porter identifies five different points. 
The first is \u201cstop-motion.\u201d This refers to a succession of rapid edits that make it appear as though the robot is moving in ways it\u2019s incapable of in real life.<\/p>\n<p>\u201cIf you see a robotics video with a lot of frame skips or camera cuts, [be] wary,\u201d he writes. \u201cYou\u2019ll notice Boston Dynamics videos are often one cut with no camera cuts, that\u2019s impressive.\u201d<\/p>\n<p>The second is simulation. This is, in practice, the CG example I gave above. Simulation has become a foundational tool in robotic deployment. It allows people to run thousands of scenarios simultaneously in seconds. Along with other computer graphics, robotic simulation has grown increasingly photorealistic in recent years. Creating and sharing a realistic simulation isn\u2019t a problem in and of itself. The issue, rather, arises when you pass off such things as reality.<\/p>\n<p>Issue three has a fun name. Wizard of Oz demos are called such due to the heavy lifting being done by the [person] behind the curtain (pay no attention). Porter cites Stanford\u2019s Mobile ALOHA demo as an example. I strongly believe there was no malice involved in the decision to run the (still extremely impressive) demo via off-screen teleop. In fact, the \u201crobot operator,\u201d Tony Zhao, appears in both the video and end credits.<\/p>\n<p>Unfortunately, the appearance occurs two-and-a-half minutes into a three-and-a-half minute demo. 
These days, however, we have to assume that:<\/p>\n<ol>\n<li>No one actually has the attention span to sit through two-and-a-half minutes of incredible robot footage anymore.<\/li>\n<li>This thing is going to get sliced up and stripped of all context.<\/li>\n<li>Your average TikTok or X (Twitter) viewer isn\u2019t going to hunt down the video\u2019s source.<\/li>\n<\/ol>\n<p>For another example that arrived shortly after Porter\u2019s post, take a look at Elon Musk\u2019s X video of the <a href=\"https:\/\/techcrunch.com\/2024\/01\/15\/elons-tesla-robot-is-sort-of-ok-at-folding-laundry-in-pre-scripted-demo\/\" target=\"_blank\" rel=\"noopener\">Optimus humanoid robot folding laundry<\/a>. The video ran with the text \u201cOptimus folds a shirt.\u201d Eagle-eyed viewers such as myself spotted something interesting in the lower right-hand corner: a gloved hand that occasionally popped partially into frame, matching the robot\u2019s movement.<\/p>\n<p>\u201cFraming the Optimus laundry video just a few more inches to the left and you would have missed what looks like a tele-op hand controlling Tesla Bot,\u201d I <a href=\"https:\/\/www.linkedin.com\/posts\/brianheater_framing-the-optimus-laundry-video-just-a-activity-7153177936097931265-93RB?utm_source=share&amp;utm_medium=member_desktop\" target=\"_blank\" rel=\"noopener\">noted at the time<\/a>. \u201cNothing wrong with tele-op, of course. It has some excellent applications, including training, troubleshooting and executing highly specialized tasks like surgery. But it\u2019s nice to know what we are (and are not) seeing. This strikes me as an obvious case of the original poster omitting key information, understanding that his audiences\/fans will fill in the gaps with what they believe they\u2019re seeing based on their feelings about the messenger.\u201d<\/p>\n<p>It may be wrong to accuse Musk of intentionally obfuscating the truth here. 
Twenty-three minutes after the initial tweet, he added, \u201cImportant note: Optimus cannot yet do this autonomously, but certainly will be able to do this fully autonomously and in an arbitrary environment (won\u2019t require a fixed table with box that has only one shirt).\u201d<\/p>\n<p>As not-Mark Twain <a href=\"https:\/\/www.nytimes.com\/2017\/04\/26\/books\/famous-misquotations.html\" target=\"_blank\" rel=\"noopener\">famously noted<\/a>, \u201ca lie can travel halfway around the world while the truth is still putting on its shoes.\u201d A similar principle can be applied to online video. The initial tweet isn\u2019t exactly a lie, of course, but it can certainly be classified as an omission. It\u2019s the old newspaper thing of hiding your corrections on page A12. Far more people will be exposed to the initial error.<\/p>\n<p>Again, I\u2019m not here to tell you whether or not that initial omission was intentional (if you choose to apply the benefit of the doubt here, you can absolutely see the follow-up tweet as a genuine clarification of incomplete context). In this specific instance, I suspect most opinions on the matter will be directly correlated with one\u2019s personal feelings about its author.<\/p>\n<p>Porter\u2019s next example is \u201cSingle-task Reinforcement Learning.\u201d You can do a deeper dive on reinforcement learning <a href=\"https:\/\/lamarr-institute.org\/blog\/reinforcement-learning-and-robotics\/\" target=\"_blank\" rel=\"noopener\">here<\/a>, but for the sake of brevity in a not-at-all brief article, let\u2019s just say it\u2019s a way to teach robots to perform tasks with repetitive real-world trial and error.<\/p>\n<p>\u201cOpen a door, stack a block, turn a crank,\u201d writes Porter. \u201cLearning these tasks is impressive and they look impressive and they are impressive. But a good RL engineer can make this work in a couple of months. One step harder is to make it robust to different subtle variations. 
But generalizing to multiple similar tasks is very hard. In order to be able to tell if it can generalize, look for multiple trained tasks.\u201d<\/p>\n<p>Like teleop, there\u2019s absolutely nothing wrong with reinforcement learning. These are both invaluable tools for training and operating robots. You just need to disclose them as clearly as possible.<\/p>\n<p>Porter\u2019s final tip is monitoring the environment and potential omissions. He cites the then-recent video of Figure\u2019s humanoid making coffee. \u201cFluid, single-cut, shows robustness to failure modes,\u201d he writes. \u201cStill just a single task, so claims of robotics\u2019 ChatGPT moment aren\u2019t in evidence here. Production quality is great. But you\u2019ll notice the robot doesn\u2019t lift anything heavier than a Keurig cup. Picking up mugs has been done, but they don\u2019t show that. Maybe the robot doesn\u2019t have that strength?\u201d<\/p>\n<p>When I spoke with Porter about the intricacies of the post today, he was once again quick to point out that these observations don\u2019t detract from what is genuinely impressive technology. The issue, however, is that our brains have the tendency to fill in gaps. We anthropomorphize or humanize robots and assume they learn the way we do, when in reality, watching a robot open one door absolutely doesn\u2019t guarantee that it can open another \u2014 or even the same door under different lighting. TV and movies have also given us unrealistic expectations of what robots can \u2014 and can\u2019t \u2014 do in 2024.<\/p>\n<p>One last point that didn\u2019t make it into the post is speed. The technology can be painfully slow at times, so it\u2019s common to speed things up. For the most part, universities and other research facilities do a good job noting this via a text overlay. This is the way to do it. Add the pertinent information on screen in a way that is difficult for a click-hungry influencer to crop out. 
In fact, this phenomenon is how 1X got its name.<\/p>\n<p>A recent video from the company showcasing its use of neural networks draws attention to this fact. \u201cThis video contains no teleoperation, no computer graphics, no cuts, no video speedups, no scripted trajectory playback,\u201d the company explains. \u201cIt\u2019s all controlled via neural networks.\u201d The result is a three-minute video that can feel almost painfully slow compared to other humanoid demos.<\/p>\n<p>As with the blooper videos, I applaud this \u2014 and any \u2014 form of transparency. For truly slow-moving robots, there\u2019s nothing wrong with speeding things up, so long as you stick to three important rules:<\/p>\n<ol>\n<li>Disclose<\/li>\n<li>Disclose<\/li>\n<li>Disclose<\/li>\n<\/ol>\n<p>Much like the songwriter, companies have to acknowledge that you can\u2019t control what happens to a video once it belongs to the world. But ask yourself: Did I do everything within my power to stem the spread of potential fakery?<\/p>\n<p>It\u2019s probably too much to hope that such videos are governed by the same truth in advertising legislation that governs television advertising. I would, however, love to see a group of roboticists join forces to standardize how such disclosures can \u2014 and should \u2014 work.<\/p>\n<\/div>\n<p><a href=\"https:\/\/techcrunch.com\/2024\/02\/09\/how-to-fake-a-robotics-demo-for-fun-and-profit\/\" target=\"_blank\" rel=\"noopener\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In March 2008, a roboticist in winter wear gave Big Dog a big kick for the camera. The buzzing DARPA-funded robot stumbled, but quickly regained its footing amid the snowy parking lot. \u201cPLEASE DO NOT KICK THE WALKING PROTOTYPE DEATH MECH,\u201d pleads the video\u2019s top comment. \u201cIT WILL REMEMBER.\u201d \u201cCreepy as hell,\u201d notes another. 
\u201cImagine [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":74777,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14],"tags":[],"class_list":{"0":"post-74776","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-tech"},"_links":{"self":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/74776","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/comments?post=74776"}],"version-history":[{"count":0,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/posts\/74776\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media\/74777"}],"wp:attachment":[{"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/media?parent=74776"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/categories?post=74776"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/entertainment.runfyers.com\/index.php\/wp-json\/wp\/v2\/tags?post=74776"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}