Congratulations!

[Valid RSS] This is a valid RSS feed.

Recommendations

This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.

Source: https://feeds.feedburner.com/IeeeSpectrumRobotics?format=xml

  1. <?xml version="1.0" encoding="utf-8"?>
  2. <rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Fri, 19 Apr 2024 11:43:54 -0000</lastBuildDate><item><title>Unlock the Future of Autonomous Drones with Innovative Secure Runtime Assurance (SRTA)</title><link>https://engineeringresources.spectrum.ieee.org/free/w_tecm20/prgm.cgi</link><description><![CDATA[
  3. <img src="https://spectrum.ieee.org/media-library/unlock-the-future-of-autonomous-drones-with-innovative-secure-runtime-assurance-srta.gif?id=52036464&width=1200&height=400&coordinates=0%2C17%2C0%2C17"/><br/><br/><p>The paper delves into the significance of Secure Runtime Assurance (SRTA) for the operational integrity and safety of autonomous robotics groups, with a focus on drones. It presents a comprehensive view of how SRTA has evolved from traditional runtime assurance methods to address the dynamic and complex nature of autonomous systems. Through integrating artificial intelligence and machine learning, SRTA seeks to tackle the multifaceted challenges autonomous systems face, highlighting the need for adaptive, scalable, and secure solutions. Emphasizing a hierarchical approach to decision-making, the paper also highlights the critical role of redundancy in ensuring reliability and anticipates future advancements in RTA technologies. This paper reflects an ongoing effort to harmonize safety and efficiency within regulatory frameworks for autonomous robotics.</p>]]></description><pubDate>Thu, 18 Apr 2024 15:52:55 +0000</pubDate><guid>https://engineeringresources.spectrum.ieee.org/free/w_tecm20/prgm.cgi</guid><category>Type:whitepaper</category><dc:creator>Technology Innovation Institute</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/unlock-the-future-of-autonomous-drones-with-innovative-secure-runtime-assurance-srta.gif?id=52036464&amp;width=980"></media:content></item><item><title>U.S. Commercial Drone Delivery Comes Closer</title><link>https://spectrum.ieee.org/us-drone-delivery-comes-closer</link><description><![CDATA[
  4. <img src="https://spectrum.ieee.org/media-library/image.webp?id=52004785&width=980"/><br/><br/><iframe frameborder="no" height="180" scrolling="no" seamless="" src="https://share.transistor.fm/e/7c352e00" width="100%"></iframe><p>
  5. <strong>Stephen Cass:</strong> Hello and welcome to <em><em>Fixing the Future</em></em>, an <em><em>IEEE Spectrum</em></em> podcast where we look at concrete solutions to tough problems. I’m your host,<a href="https://spectrum.ieee.org/u/stephen-cass" target="_self"> <u>Stephen Cass</u></a>, a senior editor at <em><em>IEEE Spectrum</em></em>. And before I start, I just want to tell you that you can get the latest coverage of some of Spectrum’s most important beats, including AI, climate change, and robotics, by signing up for one of our free newsletters. Just go to<a href="https://spectrum.ieee.org/newsletters/" target="_self"> <u>spectrum.ieee.org/newsletters</u></a> to subscribe. We’ve been covering the drone delivery company<a href="https://www.flyzipline.com/" rel="noopener noreferrer" target="_blank"> <u>Zipline</u></a> in <em><em>Spectrum</em></em> for several years, and I do encourage listeners to check out our great onsite reporting from Rwanda in 2019 when we visited one of<a href="https://spectrum.ieee.org/in-the-air-with-ziplines-medical-delivery-drones" target="_self"> <u>Zipline’s dispatch centers</u></a> for delivering vital medical supplies into rural areas. But now it’s 2024, and Zipline is expanding into commercial drone delivery in the United States, including into urban areas, and hitting some recent milestones. Here to talk about some of those milestones today, we have<a href="https://www.linkedin.com/in/keenanwyrobek/" rel="noopener noreferrer" target="_blank"> <u>Keenan Wyrobek</u></a>, Zipline’s co-founder and CTO. Keenan, welcome to the show.
  6. </p><p>
  7. <strong>Keenan Wyrobek: </strong>Great to be here. Thanks for having me.
  8. </p><p>
  9. <strong>Cass: </strong>So before we get into what’s going on with the United States, can you first catch us up on how things have been going on with Rwanda and the other African countries you’ve been operating in?
  10. </p><p>
  11. <strong>Wyrobek: </strong>Yeah, absolutely. So we’re now operating in eight countries, including here in the US. That includes a handful of countries in Africa, as well as Japan and Europe. So in Africa, it’s really exciting. So the scale is really impressive, basically. As we’ve been operating, started eight years ago with blood, then moved into vaccine delivery and delivering many other things in the healthcare space, as well as outside the healthcare space. We can talk a little bit about in things like animal husbandry and other things. The scale is really what’s exciting. We have a single distribution center there that now regularly flies more than the equivalent of once the equator of the Earth every day. And that’s just from one of a whole bunch of distribution centers. That’s where we are really with that operation today.
  12. </p><p>
  13. <strong>Cass: </strong>So could you talk a little bit about those non-medical systems? Because this was very much how we’d seen blood being parachuted down from these drones and reaching those distant centers. What other things are you delivering there?
  14. </p><p>
  15. <strong>Wyrobek: </strong>Yeah, absolutely. So start with blood, like you said, then vaccines. We’ve now done delivered well over 15 million vaccine doses, lots of other pharmaceutical use cases to hospitals and clinics, and more recently, patient home delivery for chronic care of things like hypertension, HIV-positive patients, and things like that. And then, yeah, moved into some really exciting use cases and things like animal husbandry. One that I’m personally really excited about is supporting these genetic diversity campaigns. It’s one of those things very unglamorous, but really impactful. One of the main sources of protein around the world is cow’s milk. And it turns out the difference between a non-genetically diverse cow and a genetically diverse cow can be 10x difference in milk production. And so one of the things we deliver is bull semen. We’re very good at the cold chain involved in that as we’ve mastered in vaccines and blood. And that’s just one of many things we’re doing in other spaces outside of healthcare directly.
  16. </p><p>
  17. <strong>Cass: </strong>Oh, fascinating. So turning now to the US, it seems like there’s been two big developments recently. One is you’re getting close to deploying Platform 2, which has some really fascinating tech that allows packages to be delivered very precisely by tether. And I do want to talk about that later. But first, I want to talk about a big milestone you had late last year. And this was something that goes by the very unlovely acronym of a BVLOS flight. Can you tell us what a BVLOS stands for and why that flight was such a big deal?
  18. </p><p>
  19. <strong>Wryobek: </strong>Yeah, “beyond visual line of sight.” And so that is basically, before this milestone last year, all drone deliveries, all drone operations in the US were done by people standing on the ground, looking at the sky, that line of sight. And that’s how basically we made sure that the drones were staying clear of aircraft. This is true of everybody. Now, this is important because in places like the United States, many aircraft don’t and aren’t required to carry a transponder, right? So transponders where they have a radio signal that they’re transmitting their location that our drones can listen to and use to maintain separation. And so the holy grail of basically scalable drone operations, of course, it’s physically impossible to have people standing around all the world staring at the sky, and is a sensing solution where you can sense those aircraft and avoid those aircraft. And this is something we’ve been working on for a long time and got the approval for late last year with the FAA, the first-ever use of sensors to detect and avoid for maintaining safety in the US airspace, which is just really, really exciting. That’s now been in operations in two distribution centers here, one in Utah and one in Arkansas ever since.
  20. </p><p>
  21. <strong>Cass: </strong>So could you just tell us a little bit about how that tech works? It just seems to be quite advanced to trust a drone to recognize, “Oh, that is an actual airplane that’s a Cessna that’s going to be here in about two minutes and is a real problem,” or, “No, it’s a hawk, which is just going about his business and I’m not going to ever come close to it at all because it’s so far away.
  22. </p><p>
  23. <strong>Wryobek: </strong>Yeah, this is really fun to talk about. So just to start with what we’re not doing, because most people expect us to use either a radar for this or cameras for this. And basically, those don’t work. And the radar, you would need such a heavy radar system to see 360 degrees all the way around your drone. And this is really important because two things to kind of plan in your mind. One is we’re not talking about autonomous driving where cars are close together. Aircraft never want to be as close together as cars are on a road, right? We’re talking about maintaining hundreds of meters of separation, and so you sense it a long distance. And drones don’t have right of way. So what that means is even if a plane’s coming up behind the drone, you got to sense that plane and get out of the way. And so to have enough radar on your drone that you can actually see far enough to maintain that separation in every direction, you’re talking about something that weighs many times the weight of a drone and it just doesn’t physically close. And so we started there because that’s sort of where we assumed and many people assume that’s the place to start. Then looked at cameras. Cameras have lots of drawbacks. And fundamentally, you can sort of-- we’ve all had this, you taken your phone and tried to take a picture of an airplane and you look at the picture, you can’t see the airplane. Yeah. It takes so many pixels of perfectly clean lenses to see an aircraft at a kilometer or two away that it really just is not practical or robust enough. And that’s when we went back to the drawing board and it ended up where we ended up, which is using an array of microphones to listen for aircraft, which works very well at very long distances to then maintain separation from those other aircraft.
  24. </p><p>
  25. <strong>Cass: </strong>So yeah, let’s talk about Platform 2 a little bit more because I should first explain for listeners who maybe aren’t familiar with Zipline that these are not the kind of the little purely sort of helicopter-like drones. These are these fixed wing with sort of loiter capability and hovering capabilities. So they’re not like your Mavic drones and so on. These have a capacity then for long-distance flight, which is what it gives them.
  26. </p><p>
  27. <strong>Wyrobek: </strong>Yeah. And maybe to jump into Platform 2— maybe starting with Platform 1, what does it look like? So Platform 1 is what we’ve been operating around the world for years now. And this basically looks like a small airplane, right? In the industry referred to as a fixed-wing aircraft. And it’s fixed wing because to solve the problem of going from a metro area to surrounding countryside, really two things matter. Your range and long range and low cost. And a fixed-wing aircraft over something that can hover has something like an 800% advantage in range and cost. And that’s why we did fix wing because it actually works for our customers for their needs for that use case. Platform 2 is all about, how do you deliver to homes and in metro areas where you need an incredible amount of precision to deliver to nearly every home. And so Platform 2—we call our drone zips—our drone, it flies out to the delivery site. Instead of floating a package down to a customer like Platform 1 does, it hovers. Platform 2 hovers and lowers down what we call a droid. And so the droids on tether. The drone stays way up high, about 100 meters up high, and the drone lowers down. And the drone itself-- sorry, the droid itself, it lowers down, it can fly. Right? So you think of it as like the tether does the heavy lifting, but the droid has fans. So if it gets hit by a gust of wind or whatnot, it can still stay very precisely on track and come in and deliver it to a very small area, put the package down, and then be out of there seconds later.
  28. </p><p>
  29. <strong>Cass: </strong>So let me get this right. Platform 2 is kind of as a combo, fixed wing and rotor wing. It’s like a VTOL like that. I’m cheating here a little bit because my colleague Evan Ackerman has a great Q&A on the <em><em>Spectrum</em></em> website with you, some of your team members about<a href="https://spectrum.ieee.org/delivery-drone-zipline-design" target="_self"> <u>the nitty-gritty of how that design was evolved</u></a>. But first off, it’s like a little droid thing at the end of the tether. How much extra precision do all those fans and stuff give you?
  30. </p><p>
  31. <strong>Wyrobek: </strong>Oh, massive, right? We can come down and hit a target within a few centimeters of where we want to deliver, which means we can deliver. Like if you have a small back porch, which is really common, right, in a lot of urban areas to have a small back porch or a small place on your roof or something like that, we can still just deliver as long as we have a few feet of open space. And that’s really powerful for being able to serve our customers. And a lot of people think of Platform 2 as like, “Hey, it’s a slightly better way of doing maybe a DoorDash-style operation, people in cars driving around.” And to be clear, it’s not slightly better. It’s massively better, much faster, more environmentally friendly. But we have many contracts for Platform 2 in the health space with US Health System Partners and Health Systems around the world. And what’s powerful about these customers in terms of their needs is they really need to serve all of their customers. And this is where a lot of our sort of-- this is where our engineering effort goes is how do you make a system that doesn’t just kind of work for some folks, and they can use it if they want to, but a health system is like, “No, I want this to work for everybody in my health network.” And so how do we get to that near 100 percent serviceability? And that’s what this droid really enables us to do. And of course, it has all these other magic benefits too. It makes some of the hardest design problems in this space much, much easier. The safety problem gets much easier by keeping the drone way up high.
  32. </p><p>
  33. <strong>Cass: </strong>Yeah, how high is Platform 2 hovering when it’s doing its deliveries?
  34. </p><p>
  35. <strong>Wyrobek: </strong>About 100 meters, so 300 plus feet, right? We’re talking about high up as a football field is long. And so it’s way up there. And it also helps with things like noise, right? We don’t want to live in a future where drones are all around us sounding like swarms of insects. We want drones to make no noise. We want them to just melt into the background. And so it makes that kind of problem much easier as well. And then, of course, the droid gets other benefits where for many products, we don’t need any packaging at all. We can just deliver the product right onto a table in your porch. And not just from a cost perspective, but again, from— we’re all familiar with the nightmare of packaging from deliveries we get. Eliminating packaging just has to be our future. And we’re really excited to advance that future.
  36. </p><p>
  37. <strong>Cass: </strong>From Evan’s Q&A, I know that a lot of effort went into making the droid element look rather adorable. Why was that so important?
  38. </p><p>
  39. <strong>Wryobek: </strong>Yeah, I like to describe it as sort of a cross between three things, if you kind of picture this, like a miniature little fan boat, right, because it has some fan, a big fan on the back, looks like a little fan boat, combined with sort of a baby seal, combined with a toaster. It sort of has that look to it. And making it adorable, there’s a bunch of sort of human things that matter, right? I want this to be something that when my grandmother, who’s not a tech-savvy, gets these deliveries, it’s approachable. It doesn’t come off as sort of scary. And when you make something cute, not only does it feel approachable, but it also forces you to get the details right so it is approachable, right? The rounded corners, right? This sounds really benign, but a lot of robots, it turns out if you bump into them, they scratch you. And we want you to be able to bump into this droid, and this is no big deal. And so getting the surfaces right, getting them— the surface is made sort of like a helmet foam. If you can picture that, right? The kind of thing you wouldn’t be afraid to touch if it touched you. And so getting it both to be something that feels safe, but is something that actually is safe to be around, those two things just matter a lot. Because again, we’re not designing this for some piloty kind of low-volume thing. Our customers want this in phenomenal volume. And so we really want this to be something that we’re all comfortable around.
  40. </p><p>
  41. <strong>Cass: </strong>Yeah, and one thing I want to pull out from that Q&A as well is it was an interesting note, because you mentioned it has three fans, but they’re rather unobtrusive. And the original design, you had two big fans on the sides, which was very great for maneuverability. But you had to get rid of those and come up with a three-fan design. And maybe you can explain why that was so.
  42. </p><p>
  43. <strong>Wryobek: </strong>Yeah, that’s a great detail. So the original design, the picture, it was like, imagine the package in the middle, and then kind of on either side of the package, two fans. So when you looked at it, it kind of looked like— I don’t know. It kind of looked like the package had big mouse ears or something. And when you looked at it, everybody had the same reaction. You kind of took this big step back. It was like, “Whoa, there’s this big thing coming down into my yard.” And when you’re doing this kind of user testing, we always joke, you don’t need to bring users in if it already makes you take a step back. And this is one of those things where like, “That’s just not good enough, right, to even start with that kind of refined design.” But when we got the sort of profile of it smaller, the way we think about it from a design experiment perspective is we want to deliver a large package. So basically, the droid needs to be as sucked down as small additional volume around that package as possible. So we spent a lot of time figuring out, “Okay, how do you do that sort of physically and aesthetically in a way that also gets that amazing performance, right? Because when I say performance, what I’m talking about is we still need it to work when the winds are blowing really hard outside and still can deliver precisely. And so it has to have a lot of aero performance to do that and still deliver precisely in essentially all weather conditions.
  44. </p><p>
  45. <strong>Cass: </strong>So I guess I just want to ask you then is, what kind of weight and volume are you able to deliver with this level of precision?
  46. </p><p>
  47. <strong>Wryobek: </strong>Yeah, yeah. So we’ll be working our way up to eight pounds. I say working our way up because that’s part of, once you launch a product like this, there’s refinement you can do overtime on many layers, but eight pounds, which was driven off, again, these health use cases. So it does basically 100 percent of what our health partners need to do. And it turns out it’s, nearly 100 percent of what we want to do in meal delivery. And even in the goods sector, I’m impressed by the percentage of goods we can deliver. One of our partners we work with, we can deliver over 80 percent of what they have in their big box store. And yeah, it’s wildly exceeding expectations on nearly every axis there. And volume, it’s big. It’s bigger than a shoebox. I don’t have a great-- I’m trying to think of a good reference to kind of bring it to life. But it looks like a small cooler basically inside. And it can comfortably fit a meal for four to give you a sense of the amount of food you can fit in there. Yeah.
  48. </p><p>
  49. <strong>Cass: </strong>So we’ve seen this history of Zipline in rural areas, and now we’re talking about expanding operations in more urban areas, but just how urban? I don’t imagine that we’ll see the zip lines of zooming around, say, the very hemmed-in streets, say, here in Midtown Manhattan. So what level of urban are we talking about?
  50. </p><p>
  51. <strong>Wryobek: </strong>Yeah, so the way we talk about it internally in our design process is basically we call three-story sprawl. Manhattan is the place where when we think of New York, we’re not talking about Manhattan, but most of the rest of New York, we are talking about it, right? Like the Bronx, things like that. We just have this sort of three stories forever. And that’s a lot of the world out here in California, that’s most of San Francisco. I think it’s something like 98 percent of San Francisco is that. If you’ve ever been to places like India and stuff like that, the cities, it’s just sort of this three stories going for a really long way. And that’s what we’re really focused on. And that’s also where we provide that incredible value because that’s also matches where the hardest traffic situations and things like that can make any other sort of terrestrial on-demand delivery be phenomenally late.
  52. </p><p>
  53. <strong>Cass: </strong>Well, no, I live out in Queens, so I agree there’s not much skyscrapers out there. Although there are quite a few trees and so on, but at the same time, there’s usually some sort of sidewalk availability. So is that kind of what you’re hoping to get into?
  54. </p><p>
  55. <strong>Wyrobek: </strong>Exactly. So as long as you’ve got a porch with a view of the sky or an alley with a view of the sky, it can be literally just a few feet, we can get in there, make a delivery, and be on our way.
  56. </p><p>
  57. <strong>Cass: </strong>And so you’ve done this preliminary test with the FAA, the BVLOS test, and so on. How close do you think you are to, and you’re working with a lot of partners, to really seeing this become routine commercial operations?
  58. </p><p>
  59. <strong>Wyrobek: </strong>Yeah, yeah. So at relatively limited scale, our operations here in Utah and in Arkansas that are leveraging that FAA approval for beyond visual line-of-sight flight operations, that’s been all day, every day now since our approval last year. With Platform 2, we’re really excited. That’s coming later this year. We’re currently in the phase of basically massive-scale testing. So we now have our production hardware and we’re taking it through a massive ground testing campaign. So this picture dozens of thermal chambers and five chambers and things like that just running to really both validate that we have the reliability we need and flush out any issues that we might have missed so we can address that difference between what we call the theoretical reliability and the actual reliability. And that’s running in parallel to a massive flight test campaign. Same idea, right? We’re slowly ramping up the flight volume as we fly into heavier conditions really to make sure we know the limits of the system. We know its actual reliability and true scaled operations so we can get the confidence that it’s ready to operate for people.
  60. </p><p>
  61. <strong>Cass: </strong>So you’ve got Platform 2. What’s kind of next on your technology roadmap for any possible platform three?
  62. </p><p>
  63. <strong>Wyrobek: </strong>Oh, great question. Yeah, I can’t comment on platform three at this time, but. And I will also say, Zipline is pouring our heart into Platform 2 right now. Getting Platform 2 ready for this-- the way I like to talk about this internally is today, we fly about four times the equator of the Earth in our operations on average. And that’s a few thousand flights per day. But the demand we have is for more like millions of flights per day, if not beyond. And so on the log scale, right, we’re halfway there. Three hours of magnitude down, three more zeros to come. And the level of testing, the level of systems engineering, the level of refinement required to do that is a lot. And there’s so many systems from weather forecasting to our onboard autonomy and our fleet management systems. And so to highlight one team, our system test team run by this really impressive individual named<a href="https://www.linkedin.com/in/juanalbanell/" rel="noopener noreferrer" target="_blank"> <u>Juan Albanell</u></a>, this team has taken us from where we were two years ago, where we had shown the concept at a very prototype stage of this delivery experience, and we’ve done the first order math kind of on the architecture and things like that through the iterations in test to actually make sure we had a drone that could actually fly in all these weather conditions with all the robustness and tolerance required to actually go to this global scale that Platform 2 is targeting.
  64. </p><p>
  65. <strong>Cass: </strong>Well, that’s fantastic. Well, I think there’s a lot more to talk about to come up in the future, and we look forward to talking with Zipline again. But for today, I’m afraid we’re going to have to leave it there. But it was really great to have you on the show, Keenan. Thank you so much.
  66. </p><p>
  67. <strong>Wyrobek: </strong>Cool. Absolutely, Stephen. It was a pleasure to speak with you.
  68. </p><p>
  69. <strong>Cass: </strong>So today on <em><em>Fixing the Future</em></em>, we were talking with Zipline’s Keenan Wyrobek about the progress of commercial drone deliveries. For <em><em>IEEE Spectrum</em></em>, I’m Stephen Cass, and I hope you’ll join us next time.
  70. </p>]]></description><pubDate>Wed, 17 Apr 2024 15:10:22 +0000</pubDate><guid>https://spectrum.ieee.org/us-drone-delivery-comes-closer</guid><category>Type:podcast</category><category>Fixing the future</category><category>Zipline</category><category>Drone delivery</category><category>Drones</category><dc:creator>Stephen Cass</dc:creator><media:content medium="image" type="image/jpeg" url="https://assets.rbl.ms/52004785/origin.webp"></media:content></item><item><title>Boston Dynamics’ Robert Playter on the New Atlas</title><link>https://spectrum.ieee.org/atlas-humanoid-robot-ceo-interview</link><description><![CDATA[
  71. <img src="https://spectrum.ieee.org/media-library/photo-montage-of-two-humanoid-robots-flanking-a-headshot-of-a-man-with-glasses-and-a-goatee.jpg?id=52019964&width=1200&height=400&coordinates=0%2C225%2C0%2C359"/><br/><br/><p>
  72. Boston Dynamics
  73. <a href="https://spectrum.ieee.org/hello-electric-atlas" target="_blank">has just introduced a new Atlas humanoid robot</a>, <a href="https://spectrum.ieee.org/boston-dynamics-atlas-retires" target="_blank">replacing the legendary hydraulic Atlas</a> and intended to be a commercial product. This is huge news from the company that has spent the last decade building the most dynamic humanoids that the world has ever seen, and if you haven’t <a href="https://spectrum.ieee.org/atlas-humanoid-robot" target="_blank">read our article about the announcement (and seen the video!)</a>, you should do that right now.
  74. </p><p>
  75. We’ve had about a decade of pent-up questions about an all-electric productized version of
  76. <a href="https://robotsguide.com/robots/atlas" target="_blank">Atlas</a>, and we were lucky enough to speak with
  77. <a href="https://www.linkedin.com/in/robert-playter-986b507/" rel="noopener noreferrer" target="_blank"><u>Boston Dynamics CEO Robert Playter</u></a> to learn more about where this robot came from and how it’s going to make commercial humanoid robots (finally) happen.
  78. </p><hr/><p>
  79. Robert Playter was the Vice President of Engineering at Boston Dynamics starting in 1994, which I’m pretty sure was back when Boston Dynamics still intended to be a modeling and simulation company rather than a robotics company. Playter became the CEO in 2019, helping the company make the difficult transition from R&D to commercial products with
  80. <a href="https://bostondynamics.com/products/spot/" target="_blank">Spot</a>, <a href="https://bostondynamics.com/products/stretch/" target="_blank">Stretch</a>, and now (or very soon)
  81. <a href="https://bostondynamics.com/atlas/" target="_blank">Atlas</a>.
  82. </p><p>
  83. We talked with Playter about what the heck took Boston Dynamics so long to make this robot, what the vision is for Atlas as a product, all that extreme flexibility, and what comes next.
  84. </p><p class="rm-anchors" id="top">
  85. <strong>Robert Playter on:</strong>
  86. </p><ul>
  87. <li><a href="#1">What Took So Long</a></li>
  88. <li><a href="#2">The Product Approach</a></li>
  89. <li><a href="#3">A General Purpose Robot?</a></li>
  90. <li><a href="#4">Hydraulic Versus Electric</a></li>
  91. <li><a href="#5">Extreme Range of Motion</a></li>
  92. <li><a href="#6">Atlas’ Head</a></li>
  93. <li><a href="#7">Advantages in Commercialization</a></li>
  94. <li><a href="#8">What’s Next</a></li>
  95. </ul><div class="horizontal-rule">
  96. </div><p>
  97. <strong><em>IEEE Spectrum</em>: So what’s going on?</strong><br/>
  98. </p><p>
  99. <strong>Robert Playter:</strong> Boston Dynamics has built an all-electric humanoid. It’s our newest generation of what’s been an almost 15-year effort in developing humanoids. We’re going to launch it as a product, targeting industrial applications, logistics, and places that are much more diverse than where you see <a href="https://robotsguide.com/robots/bdstretch" target="_blank">Stretch</a>—heavy objects with complex geometry, probably in manufacturing type environments. We’ve built our first robot, and we believe that’s really going to set the bar for the next generation of capabilities for this whole industry.
  100. </p><p class="rm-anchors" id="1">
  101. <strong>What took you so long?!</strong><br/>
  102. </p><p>
  103. <strong></strong><strong>Playter: </strong>Well, we wanted to convince ourselves that we knew how to make a humanoid product that can handle a great diversity of tasks—much more so than our previous generations of robots—including at-pace bimanual manipulation of the types of heavy objects with complex geometry that we expect to find in industry. We also really wanted to understand the use cases, so we’ve done a lot of background work on making sure that we see where we can apply these robots fruitfully in industry.
  104. </p><p>
  105. We’ve obviously been working on this machine for a while, as we’ve been doing parallel development with our legacy Atlas. You’ve probably seen some of the
  106. <a href="https://www.youtube.com/watch?v=LeeiN9smjjY" target="_blank">videos of Atlas moving struts around</a>—that’s the technical part of proving to ourselves that we can make this work. And then really designing a next generation machine that’s going to be an order of magnitude better than anything the world has seen.
  107. </p><p class="pull-quote">
  108. “We’re not anxious to just show some whiz-bang tech, and we didn’t really want to indicate our intent to go here until we were convinced that there is a path to a product.”
  109. <strong>—Robert Playter, Boston Dynamics</strong>
  110. </p><p class="rm-anchors" id="2">
  111. <strong>With Spot, it felt like Boston Dynamics developed the product first, without having a specific use case in mind: you put the robot out there and let people discover what it was good for. Is your approach different with Atlas?</strong>
  112. </p><p>
  113. <strong><strong>Playter:</strong></strong> You’re absolutely right. Spot was a technology looking for a product, and it’s taken time for us to really figure out the product market fit that we have in industrial inspection. But the challenge of that experience has left us wiser about really identifying the target applications before you say you’re going to build these things at scale.
  114. </p><p>
  115. Stretch is very different, because it had a clear target market. Atlas is going to be more like Stretch, although it’s going to be way more than a single task robot, which is kind of what Stretch is. Convincing ourselves that we could really generalize with Atlas has taken a little bit of time. This is going to be our third product in about four years. We’ve learned so much, and the world is different from that experience.
  116. </p><p>
  117. <a href="#top">[back to top]</a>
  118. </p><p class="rm-anchors" id="3">
  119. <strong>Is your vision for Atlas one of a general purpose robot?</strong>
  120. </p><p>
  121. <strong><strong>Playter:</strong></strong> It definitely needs to be a multi-use case robot. I believe that because I don’t think there’s very many examples where a single repetitive task is going to warrant these complex robots. I also think, though, that the practical matter is that you’re going to have to focus on a class of use cases, and really making them useful for the end customer. The lesson we’ve learned with both Spot and Stretch is that it’s critical to get out there and actually understand what makes this robot valuable to customers while making sure you’re building that into your development cycle. And if you can start that before you’ve even launched the product, then you’ll be better off.
  122. </p><p>
  123. <a href="https://spectrum.ieee.org/r/entryeditor/2667789605#top" target="_self">[back to top]</a>
  124. </p><p>
  125. <strong>How does thinking of this new Atlas as a product rather than a research platform change things?</strong>
  126. </p><p>
  127. <strong><strong>Playter:</strong></strong> I think the research that we’ve done over the past 10 or 15 years has been essential to making a humanoid useful in the first place. We focused on dynamic balancing and mobility and being able to pick something up and still maintain that mobility—those were research topics of the past that we’ve now figured out how to manage and are essential, I think, to doing useful work. There’s still a lot of work to be done on generality, so that humanoids can pick up any one of a thousand different parts and deal with them in a reasonable way. That level of generality hasn’t been proven yet; we think there’s promise, and that AI will be one of the tools that helps solve that. And there’s still a lot of product prototyping and iteration that will come out before we start building massive numbers of these things and shipping them to customers.
  128. </p><p class="pull-quote">
  129. “This robot will be stronger at most of its joints than a person, and even an elite athlete, and will have a range of motion that exceeds anything a person can ever do.”
  130. <strong>—Robert Playter, Boston Dynamics</strong>
  131. </p><p class="rm-anchors" id="4">
  132. <strong>For a long time, it seemed like hydraulics were the best way of producing powerful dynamic motions for robots like Atlas. Has that now changed?</strong>
  133. </p><p>
  134. <strong><strong>Playter:</strong></strong> We first experimented with that with the launch of Spot. We had the same issue years ago, and discovered that we could build powerful lightweight electric motors that had the same kind of responsiveness and strength, or let’s say sufficient responsiveness and strength, to really make that work. We’ve designed an even newer set of really compact actuators into our electric Atlas, which pack the strength of essentially an elite human athlete into these tiny packages that make an electric humanoid feasible for us. So, this robot will be stronger at most of its joints than a person, and even an elite athlete, and will have a range of motion that exceeds anything a person can ever do. We’ve also compared the strength of our new electric Atlas to our hydraulic Atlas, and the electric Atlas is stronger.
  135. </p><p>
  136. <a href="https://spectrum.ieee.org/r/entryeditor/2667789605#top" target="_self">[back to top]</a>
  137. </p><p class="rm-anchors" id="5">
  138. <strong>In the context of Atlas’ range of motion, that introductory video was slightly uncomfortable to watch, which I’m sure was deliberate. Why introduce the new Atlas in that way?</strong>
  139. </p><p>
  140. <strong>Playter:</strong> These high range of motion actuators are going to enable a unique set of movements that ultimately will let the robot be very efficient. Imagine being able to turn around without having to take a bunch of steps to turn your whole body instead. The motions we showed [in the video] are ones where our engineers were like, “hey, with these joints, we could get up like this!” And it just wasn’t something we had that really thought about before. This flexibility creates a palette that you can design new stuff on, and we’re already having fun with it and we decided we wanted to share that excitement with the world.
  141. </p><p>
  142. <a href="https://spectrum.ieee.org/r/entryeditor/2667789605#top" target="_self">[back to top]</a>
  143. </p><p class="pull-quote">
  144. “Everybody will buy one robot—we learned that with Spot. But they won’t start by buying fleets, and you don’t have a business until you can sell multiple robots to the same customer.”
  145. <strong>—Robert Playter, Boston Dynamics</strong>
  146. </p><p>
  147. <strong>This does seem like a way of making Atlas more efficient, but I’ve heard from other folks working on humanoids that it’s important for robots to move in familiar and predictable ways for people to be comfortable working around them. What’s your perspective on that?</strong>
  148. </p><p>
  149. <strong><strong>Playter:</strong></strong> I do think that people are going to have to become familiar with our robot; I don’t think that means limiting yourself to human motions. I believe that ultimately, if your robot is stronger or more flexible, it will be able to do things that humans can’t do, or don’t want to do.
  150. </p><p>
  151. One of the real challenges of making a product useful is that you’ve got to have sufficient productivity to satisfy a customer. If you’re slow, that’s hard. We learned that with Stretch. We had two generations of Stretch, and the first generation did not have a joint that let it pivot 180 degrees, so it had to ponderously turn around between picking up a box and dropping it off. That was a killer. And so we decided “nope, gotta have that rotational joint.” It lets Stretch be so much faster and more efficient. At the end of the day, that’s what counts. And people will get used to it.
  152. </p><p class="rm-anchors" id="6">
  153. <strong>What can you tell me about the head?</strong>
  154. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  155. <img alt="Humanoid robot with circular light in the location of the head" class="rm-shortcode" data-rm-shortcode-id="f48b7f943cf52db5c097a86c36f1f0a7" data-rm-shortcode-name="rebelmouse-image" id="46a51" loading="lazy" src="https://spectrum.ieee.org/media-library/humanoid-robot-with-circular-light-in-the-location-of-the-head.jpg?id=52020085&width=980"/>
  156. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-625737" placeholder="Add Photo Caption..." spellcheck="false">Boston Dynamics CEO Robert Playter said the head on the new Atlas robot has been designed not to mimic the human form but rather “to project something else: a friendly place to look to gain some understanding about the intent of the robot.”</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Boston Dynamics</small>
  157. </p><p>
  158. <strong><strong>Playter:</strong></strong> The old Atlas did not have an articulated head. But having an articulated head gives you a tool that you can use to indicate intent, and there are integrated lights which will be able to communicate to users. Some of our original concepts had more of a [human] head shape, but for us they always looked a little bit threatening or dystopian somehow, and we wanted to get away from that. So we made a very purposeful decision about the head shape, and our explicit intent was for it <em>not</em> to be human-like. We’re trying to project something else: a friendly place to look to gain some understanding about the intent of the robot.
  159. </p><p>
  160. The design borrows from some friendly shapes that we’d seen in the past. For example, there’s the old Pixar lamp that everybody fell in love with decades ago, and that informed some of the design for us.
  161. </p><p>
  162. <a href="https://spectrum.ieee.org/r/entryeditor/2667789605#top" target="_self">[back to top]</a>
  163. </p><p class="rm-anchors" id="7">
  164. <strong>How do you think the decade(s) of experience working on humanoids as well as your experience commercializing Spot will benefit you when it comes to making Atlas into a product?</strong>
  165. </p><p>
  166. <strong><strong>Playter:</strong></strong> This is our third product, and one of the things we’ve learned is that it takes way more than some interesting technology to make a product work. You have to have a real use case, and you have to have real productivity around that use case that a customer cares about. Everybody will buy <em>one</em> robot—we learned that with Spot. But they won’t start by buying fleets, and you don’t have a business until you can sell multiple robots to the same customer. And you don’t get there without all this other stuff—the reliability, the service, the integration.
  167. </p><p>
  168. <a href="https://spectrum.ieee.org/boston-dynamics-spot-robot-dog-now-available" target="_blank">When we launched Spot as a product several years ago</a>, it was really about transforming the whole company. We had to take on all of these new disciplines: manufacturing, service, measuring the quality and reliability of our robots and then building systems and tools to make them steadily better. That transformation is not easy, but the fact that we’ve successfully navigated through that as an organization means that we can easily bring that mindset and skill set to bear as a company. Honestly, that transition takes two or three years to get through, so all of the brand new startup companies out there who have a prototype of a humanoid working—they haven’t even begun that journey.
  169. </p><p>
  170. There’s also cost. Building something effectively at a reasonable cost so that you can sell it at a reasonable cost and ultimately make some money out of it, that’s not easy either. And frankly, without the support of
  171. <a href="https://spectrum.ieee.org/hyundai-buys-boston-dynamics" target="_blank">Hyundai</a> which is of course a world-class manufacturing expert, it would be really challenging to do it on our own.
  172. </p><p>
  173. So yeah, we’re much more sober about what it takes to succeed now. We’re not anxious to just show some whiz-bang tech, and we didn’t really want to indicate our intent to go here until we were convinced that there is a path to a product. And I think ultimately, that will win the day.
  174. </p><p>
  175. <a href="https://spectrum.ieee.org/r/entryeditor/2667789605#top" target="_self">[back to top]</a>
  176. </p><p class="rm-anchors" id="8">
  177. <strong>What will you be working on in the near future, and what will you be able to share?</strong>
  178. </p><p>
  179. <strong><strong>Playter:</strong></strong> We’ll start showing more of the dexterous manipulation on the new Atlas that we’ve already shown on our legacy Atlas. And we’re targeting proof of technology testing in factories at Hyundai Motor Group [HMG] as early as next year. HMG is really excited about this venture; they want to transform their manufacturing and they see Atlas as a big part of that, and so we’re going to get on that soon.
  180. </p><p>
  181. <a href="https://spectrum.ieee.org/r/entryeditor/2667789605#top" target="_self">[back to top]</a>
  182. </p><p>
  183. <strong>What do you think other robotics folks will find most exciting about the new Atlas?</strong>
  184. </p><p>
  185. <strong><strong>Playter:</strong></strong> Having a robot with so much power and agility packed into a relatively small and lightweight package. I’ve felt honored in the past that most of these other companies compare themselves to us. They say, “well, where are we on the Boston Dynamics bar?” I think we just raised the bar. And that’s ultimately good for the industry, right? People will go, “oh, wow, that’s possible!” And frankly, they’ll start chasing us as fast as they can—that’s what we’ve seen so far. I think it’ll end up pulling the whole industry forward.
  186. </p>]]></description><pubDate>Wed, 17 Apr 2024 13:15:16 +0000</pubDate><guid>https://spectrum.ieee.org/atlas-humanoid-robot-ceo-interview</guid><category>Boston dynamics</category><category>Humanoids</category><category>Atlas</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-montage-of-two-humanoid-robots-flanking-a-headshot-of-a-man-with-glasses-and-a-goatee.jpg?id=52019964&amp;width=980"></media:content></item><item><title>Hello, Electric Atlas</title><link>https://spectrum.ieee.org/atlas-humanoid-robot</link><description><![CDATA[
  187. <img src="https://spectrum.ieee.org/media-library/standing-humanoid-robot.jpg?id=52020105&width=1200&height=400&coordinates=0%2C956%2C0%2C956"/><br/><br/><p>Yesterday, <a href="https://spectrum.ieee.org/boston-dynamics-atlas-retires" target="_blank">Boston Dynamics bid farewell to the iconic Atlas humanoid robot</a>. Or, the hydraulically-powered version of Atlas, anyway—if you read between the lines of <a href="https://www.youtube.com/watch?v=-9EM5_VFlt8" target="_blank">the video description</a> (or even just read the actual lines of the video description), it was pretty clear that although <em><em>hydraulic</em></em> <a href="https://robotsguide.com/robots/atlas2016" target="_blank">Atlas</a> was retiring, it wasn’t the end of the Atlas humanoid program at Boston Dynamics. In fact, Atlas is already back, and better than ever.</p><p>Today, <a href="https://bostondynamics.com/blog/electric-new-era-for-atlas/" target="_blank">Boston Dynamics is introducing a new version of Atlas that’s all-electric</a>. It’s powered by batteries and electric actuators, no more messy hydraulics. It exceeds human performance in terms of both strength and flexibility. And for the first time, Boston Dynamics is calling this humanoid robot a <strong>product</strong>. We’ll take a look at everything that Boston Dynamics is announcing today, and have even more detail in <a href="https://spectrum.ieee.org/qa-boston-dynamics-robert-playter-on-the-new-atlas" target="_blank">this Q&A with Boston Dynamics CEO Robert Playter</a>.</p><hr/><p>Boston Dynamics’ new electric humanoid has been simultaneously one of the worst and best kept secrets in robotics over the last year or so. What I mean is that it seemed obvious, or even inevitable, that Boston Dynamics would take the expertise in humanoids that it developed with Atlas and combine that with its experience productizing a fully electric system like <a href="https://robotsguide.com/robots/spot" target="_blank">Spot</a>. But just because something <em><em>seems </em></em>inevitable doesn’t mean it actually <em><em>is</em></em> inevitable, and Boston Dynamics has done an admirable job of carrying on as normal while building a fully electric humanoid from scratch. And here it is:</p><p class="shortcode-media shortcode-media-youtube">
  188. <span class="rm-shortcode" data-rm-shortcode-id="be3da24d5993b8a9a91281d39a04b6cc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/29ECwExc-_M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  189. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=29ECwExc-_M" target="_blank"><br/></a></small>
  190. </p><p>It’s all new, it’s all electric, and some of those movements make me slightly uncomfortable (we’ll get into that in a bit). The blog post accompanying the video is sparse on technical detail, but let’s go through the most interesting parts:</p><blockquote>A decade ago, we were one of the only companies putting real R&D effort into humanoid robots. Now the landscape in the robotics industry is very different.</blockquote><p>In 2010, we took a look at all the <a href="https://spectrum.ieee.org/humanoid-robots-rise" target="_self"><u>humanoid robots then in existence</u></a>. You could, I suppose, argue that Honda was putting real R&D effort into ASIMO back then, but yeah, pretty much all those other humanoid robots came from research rather than industry. Now, it feels like we’re <a href="https://spectrum.ieee.org/humanoid-robots" target="_self"><u>up to our eyeballs in commercial humanoids</u></a>, but over the past couple of years, as startups have appeared out of nowhere with brand new humanoid robots, Boston Dynamics (to most outward appearances) was just keepin’ on with that R&D. Today’s announcement certainly changes that.</p><blockquote>We are confident in our plan to not just create an impressive R&D project, but to deliver a valuable solution. This journey will start with Hyundai—in addition to investing in us, the Hyundai team is building the next generation of automotive manufacturing capabilities, and it will serve as a perfect testing ground for new Atlas applications.</blockquote><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
  191. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="fb19c616d8f0a3959312461743ba9090" data-rm-shortcode-name="rebelmouse-image" id="e84d1" loading="lazy" src="https://spectrum.ieee.org/media-library/image.jpg?id=52020236&width=980" style="max-width: 100%"/>
  192. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Boston Dynamics</small></p><p>This is a significant advantage for Boston Dynamics—<a href="https://spectrum.ieee.org/hyundai-buys-boston-dynamics" target="_blank">through Hyundai</a>, they can essentially be their own first customer for humanoid robots, offering an immediate use case in a very friendly transitional environment. Tesla has a similar advantage with Optimus, but Boston Dynamics also has experience sourcing and selling and supporting Spot, which are those business-y things that seem like they’re not the hard part until they turn out to actually be the hard part.</p><blockquote>In the months and years ahead, we’re excited to show what the world’s most dynamic humanoid robot can really do—in the lab, in the factory, and in our lives.</blockquote><p>World’s most dynamic humanoid, you say? Awesome! Prove it! On video! With outtakes!</p><blockquote>The electric version of Atlas will be stronger, with a broader range of motion than any of our previous generations. For example, our last generation hydraulic Atlas (HD Atlas) could already lift and maneuver a wide variety of heavy, irregular objects; we are continuing to build on those existing capabilities and are exploring several new gripper variations to meet a diverse set of expected manipulation needs in customer environments. </blockquote><p>Now we’re getting to the good bits. It’s especially notable here that the electric version of Atlas will be “stronger” than the previous hydraulic version, because for a long time hydraulics were really the only way to get the kind of explosively powerful repetitive dynamic motions that enabled Atlas to do jumps and flips. And the switch away from hydraulics enables that extra range of motion now that there aren’t hoses and stuff to deal with. </p><p>It’s also pretty clear that the new Atlas is built to continue the kind of work that hydraulic Atlas has been doing, manipulating big and heavy car parts. This is in sharp contrast to most other humanoid robots that we’ve seen, which have primarily focused on moving small objects or bins around in warehouse environments. </p><p class="shortcode-media shortcode-media-youtube">
  193. <span class="rm-shortcode" data-rm-shortcode-id="b91aa350d6decc72a021e12ce095c26a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LeeiN9smjjY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  194. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=LeeiN9smjjY" target="_blank"><br/></a></small>
  195. </p><blockquote>We are not just delivering industry-leading hardware. Some of our most exciting progress over the past couple of years has been in software. In addition to our decades of expertise in simulation and model predictive control, we have equipped our robots with new AI and machine learning tools, like reinforcement learning and computer vision to ensure they can operate and adapt efficiently to complex real-world situations. </blockquote><p>This is all par for the course now, but it’s also not particularly meaningful without more information. “We will give our robots new capabilities through machine learning and AI” is what every humanoid robotics company (and most other robotics companies) are saying, but I’m not sure that we’re there yet, because there’s an “okay but how?” that needs to happen first. I’m not saying that it <em><em>won’t </em></em>happen, just pointing out that until it <em><em>does</em></em> happen, it <em><em>hasn’t</em></em> happened.</p><blockquote>The humanoid form factor is a useful design for robots working in a world designed for people. However, that form factor doesn’t limit our vision of how a bipedal robot can move, what tools it needs to succeed, and how it can help people accomplish more. </blockquote><p><a href="https://agilityrobotics.com/" target="_blank">Agility Robotics</a> has a similar philosophy with <a href="https://robotsguide.com/robots/digit" target="_blank">Digit</a>, which has a mostly humanoid form factor to operate in human environments but also uses a non-human leg design because Agility believes that it works better. Atlas is a bit more human-like with its overall design, but there are some striking differences, including both range of motion and the head, both of which we’ll be talking more about.</p><blockquote>We designed the electric version of Atlas to be stronger, more dexterous, and more agile. Atlas may resemble a human form factor, but we are equipping the robot to move in the most efficient way possible to complete a task, rather than being constrained by a human range of motion. Atlas will move in ways that exceed human capabilities.</blockquote><p>The introductory video with the new Atlas really punches you in the face with this: Atlas is <strong>not</strong> constrained by human range of motion and will leverage its extra degrees of freedom to operate faster and more efficiently, even if you personally might find some of those motions a little bit unsettling.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
  196. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="180c10b7ebe7c868b070cb71e2dc275f" data-rm-shortcode-name="rebelmouse-image" id="4498b" loading="lazy" src="https://spectrum.ieee.org/media-library/image.jpg?id=52020228&width=980" style="max-width: 100%"/>
  197. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Boston Dynamics</small></p><blockquote>Combining decades of practical experience with first principles thinking, we are confident in our ability to deliver a robot uniquely capable of tackling dull, dirty, and dangerous tasks in real applications. </blockquote><p>As <a href="https://spectrum.ieee.org/marco-hutter-ai-institute" target="_self"><u>Marco Hutter pointed out</u></a>, most commercial robots (humanoids included) are really only targeting tasks that are dull, because dull usually means repetitive, and robots are very good at repetitive. Dirty is a little more complicated, and dangerous is a lot more complicated than that. I appreciate that Boston Dynamics is targeting those other categories of tasks from the outset.</p><blockquote>Commercialization takes great engineering, but it also takes patience, imagination, and collaboration. Boston Dynamics has proven that we can deliver the full package with both industry-leading robotics and a complete ecosystem of software, services, and support to make robotics useful in the real world.</blockquote><p>There’s a lot more to building a successful robotics company than building a successful robot. Arguably, building a successful robot is not even the hardest part, long term. Having over 1500 Spot robots deployed with customers gives them a well-established product infrastructure baseline to expand from with the new Atlas.</p><div class="horizontal-rule"></div><p>Taking a step back, let’s consider the position that Boston Dynamics is in when it comes to the humanoid space right now.</p><p>The new Atlas appears to be a reasonably mature platform with explicit commercial potential, but it’s not yet clear if this particular version of Atlas is truly commercially viable, in terms of being manufacturable and supportable at scale—it’s Atlas 001, after all. There’s likely a huge amount of work that still needs to be done, but it’s a process that the company has already gone through with Spot. My guess is that Boston Dynamics has some catching up to do with respect to other humanoid companies that are already entering pilot projects.</p><p>In terms of capabilities, even though the new Atlas hardware is new, it’s not like Boston Dynamics is starting from scratch, since they’re already transferring skills from hydraulic Atlas onto the new platform. But, we haven’t seen the new Atlas doing any practical tasks yet, so it’s hard to tell how far along that is, and it would be premature to assume that hydraulic Atlas doing all kinds of amazing things in YouTube videos implies that electric Atlas can do similar things safely and reliably in a product context. There’s a gap there, possibly an enormous gap, and we’ll need to see more from the new Atlas to understand where it’s at.</p><p>And obviously, there’s a lot of competition in humanoids right now, although I’d like to think that the potential for practical humanoid robots to be useful in society is significant enough that there will be room for lots of different approaches. Boston Dynamics was very early to humanoids in general, but they’re somewhat late to this recent (and rather abrupt) humanoid commercialization push. This may not be a problem, especially if Atlas is targeting applications where its strength and flexibility sets it apart from other robots in the space, and if their depth of experience deploying commercial robotic platforms helps them to scale quickly.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  198. <img alt="" class="rm-shortcode" data-rm-shortcode-id="e46edaf32a56947615d5cf9d5c74525b" data-rm-shortcode-name="rebelmouse-image" id="b68e1" loading="lazy" src="https://spectrum.ieee.org/media-library/image.jpg?id=52020214&width=980"/>
  199. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Boston Dynamics</small></p><p>An electric Atlas may indeed have been inevitable, and it’s incredibly exciting to (finally!) see Boston Dynamics take this next step towards a commercial humanoid, which would deliver on more than <a href="https://spectrum.ieee.org/boston-dynamics-atlas-retires" target="_blank">a decade of ambition</a> stretching back through the DARPA Robotics Challenge to PETMAN. We’ve been promised more manipulation footage soon, and Boston Dynamics expects that Atlas will be in the technology demonstration phase in Hyundai factories as early as next year.</p><p>We have a lot more questions, but we have a lot more answers, too: you’ll find <a href="https://spectrum.ieee.org/atlas-humanoid-robot-ceo-interview" target="_blank">a Q&A with Boston Dynamics CEO Robert Playter right here</a>.</p>]]></description><pubDate>Wed, 17 Apr 2024 13:15:04 +0000</pubDate><guid>https://spectrum.ieee.org/atlas-humanoid-robot</guid><category>Boston dynamics</category><category>Humanoid robots</category><category>Atlas</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/standing-humanoid-robot.jpg?id=52020105&amp;width=980"></media:content></item><item><title>Boston Dynamics Retires Its Legendary Humanoid Robot</title><link>https://spectrum.ieee.org/boston-dynamics-atlas-retires</link><description><![CDATA[
  200. <img src="https://spectrum.ieee.org/media-library/an-animated-gif-of-a-white-humanoid-robot-bowing-to-the-camera.gif?id=52008041&width=1200&height=400&coordinates=0%2C92%2C0%2C92"/><br/><br/><p>In a new video posted today, Boston Dynamics is sending off its hydraulic Atlas humanoid robot. “For almost a decade,” the video description reads, “Atlas has sparked our imagination, inspired the next generations of roboticists, and leapt over technical barriers in the field. Now it’s time for our hydraulic Atlas robot to kick back and relax.”</p><hr/><p>Hydraulic Atlas has certainly earned some relaxation; Boston Dynamics has been absolutely merciless with its humanoid research program. This isn’t a criticism—sometimes being merciless to your hardware is necessary to push the envelope of what’s possible. And as spectators, we just just get to enjoy it, and this highlight reel includes unseen footage of Atlas doing things well along with unseen footage of Atlas doing things not so well. Which, let’s be honest, is what we’re all really here for.</p><p class="shortcode-media shortcode-media-youtube">
  201. <span class="rm-shortcode" data-rm-shortcode-id="734e61733fb08587ec08f9d5eb4aa131" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-9EM5_VFlt8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  202. </p><p>There’s so much more to the history of Atlas than this video shows. Atlas traces its history back to a DARPA project called <a href="https://web.archive.org/web/20091101232348/http://www.bostondynamics.com/robot_petman.html" rel="noopener noreferrer" target="_blank"><u>PETMAN</u></a> (Protection Ensemble Test Mannequin), which we first wrote about in 2009, so long ago that we had to dig up our own article on the <a href="https://web.archive.org/web/20091030004131/https://spectrum.ieee.org/blog/robotics/robotics-software/automaton/boston_dynamics_to_develop_twolegged_humanoid_and_a_new_hopping_robot_in_their_spare_time" target="_self"><u>Wayback Machine</u></a>. As contributor <a href="https://www.linkedin.com/in/mikelltaylor/" rel="noopener noreferrer" target="_blank"><u>Mikell Taylor</u></a> wrote back then:</p><p><em><em>PETMAN is designed to test the suits used by soldiers to protect themselves against chemical warfare agents. It has to be capable of moving just like a soldier—walking, running, bending, reaching, army crawling—to test the suit’s durability in a full range of motion. To really simulate humans as accurately as possible, PETMAN will even be able to “sweat”.</em></em></p><p>Relative to the <a href="https://spectrum.ieee.org/humanoid-robots-rise" target="_self"><u>other humanoid robots out there at the time</u></a> (the most famous of which, by far, was Honda’s ASIMO), PETMAN’s movement and balance were very, very impressive. Also impressive was the presumably unintentional way in which <a href="https://spectrum.ieee.org/boston-dynamics-new-petman-video-must-be-watched-with-this-soundtrack" target="_self"><u>this PETMAN video synced up with the music video to Stayin’ Alive by the Bee Gees</u></a>. Anyway, DARPA was suitably impressed by all this impressiveness, and <a href="https://spectrum.ieee.org/darpa-selects-boston-dynamics-humanoid-for-robotics-challenge" target="_self"><u>chose Boston Dynamics</u></a> to build another humanoid robot to be used for the DARPA Robotics Challenge. <a href="https://www.youtube.com/watch?v=zkBnFPBV3f0" rel="noopener noreferrer" target="_blank"><u>That robot was unveiled ten years ago</u></a>.</p><p>The DRC featured a [still looking for a collective noun for humanoid robots] of Atlases, and it seemed like Boston Dynamics was hooked on the form factor, because less than a year after the DRC Finals the company announced <a href="https://www.youtube.com/watch?v=rVlhMGQgDkY" rel="noopener noreferrer" target="_blank"><u>the next generation of Atlas</u></a>, which could do some useful things like move boxes around. Every six months or so, Boston Dynamics put out a new Atlas video, with the robot running or jumping or dancing or doing parkour, leveraging its powerful hydraulics to impress us every single time. There was really nothing like hydraulic Atlas in terms of dynamic performance, and you could argue that there still isn’t. This is a robot that will be missed.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  203. <img alt="A film strip of images showing a series of humanoid robots gradually getting sleeker and  more polished." class="rm-shortcode" data-rm-shortcode-id="b7ac6b41934884c3c52012ecb8d7b8cd" data-rm-shortcode-name="rebelmouse-image" id="63273" loading="lazy" src="https://spectrum.ieee.org/media-library/a-film-strip-of-images-showing-a-series-of-humanoid-robots-gradually-getting-sleeker-and-more-polished.jpg?id=52013935&width=980"/>
  204. <small class="image-media media-caption" placeholder="Add Photo Caption...">The original rendering of Atlas, followed by four generations of the robot.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Boston Dynamics/IEEE Spectrum</small></p><p>Now, if you’re wondering why Boston Dynamics is saying “it’s time for our <strong>hydraulic </strong>Atlas robot to kick back and relax,” rather than just “our <strong>Atlas </strong>robot,” and if you’re also wondering why the video description ends with “take a look back at everything we’ve accomplished with the Atlas platform “<strong>to date</strong>,” well, I can’t help you. Some people might attempt to draw some inferences and conclusions from that very specific and deliberate language, but I would certainly not be one of them, because I’m well known for never speculating about anything.</p><p>I would, however, point out a few things that have been obvious for a while now. Namely, that:</p><ul><li>Boston Dynamics has been focusing fairly explicitly on commercialization over the past several years</li><li>Complex hydraulic robots are not product friendly because (among other things) they tend to leave puddles of hydraulic fluid on the carpet</li><li>Boston Dynamics has been very successful with Spot as a productized electric platform based on <a href="https://spectrum.ieee.org/spot-is-boston-dynamics-nimble-new-quadruped-robot" target="_self"><u>earlier hydraulic research platforms</u></a></li><li>Fully electric commercial humanoids really seems to be where robotics is at right now</li></ul>There’s nothing at all new in any of this; the only additional piece of information we have is that the <strong>hydraulic</strong> Atlas is, as of today, retiring. And I’m just going to leave things there.]]></description><pubDate>Tue, 16 Apr 2024 15:25:55 +0000</pubDate><guid>https://spectrum.ieee.org/boston-dynamics-atlas-retires</guid><category>Boston dynamics</category><category>Humanoid robot</category><category>Atlas</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/an-animated-gif-of-a-white-humanoid-robot-bowing-to-the-camera.gif?id=52008041&amp;width=980"></media:content></item><item><title>Video Friday: Robot Dog Can’t Fall</title><link>https://spectrum.ieee.org/video-friday-robot-dog-can-t-fall</link><description><![CDATA[
  205. <img src="https://spectrum.ieee.org/media-library/image.gif?id=51980645&width=1200&height=400&coordinates=0%2C92%2C0%2C92"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://robocup.de/german-open/?lang=en">RoboCup German Open</a>: 17–21 April 2024, KASSEL, GERMANY</h5><h5><a href="https://www.xponential.org/xponential2024/">AUVSI XPONENTIAL 2024</a>: 22–25 April 2024, SAN DIEGO</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="c1lsfdhi3x8">I think suggesting that robots can’t fall is much less useful than instead suggesting that robots can fall and get quickly and easily get back up again.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="97f47a4f3eb75cff55ad8b1bdb5cc5e2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/c1LsfDhI3x8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fh3zbusmaau">Sanctuary AI says that this video shows Phoenix operating at “human-equivalent speed,” but they don’t specify which human or under which conditions. Though it’s faster than I would be, that’s for sure.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5bf9595dd22bf715055c6712d0ad61ff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FH3zbUSMAAU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sanctuary.ai/">Sanctuary AI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="a_080dmyimm">“Suzume” is an animated film by Makoto Shinkai, in which one of the characters gets turned into a three-legged chair:</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dcfbe85dab205cc3206375c08be0c2b2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/A_080DMYImM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Shintaro Inoue from JSK Lab at the University of Tokyo has managed to build a robotic version of that same chair, which is pretty impressive:</p><p class="shortcode-media shortcode-media-youtube">
  206. <span class="rm-shortcode" data-rm-shortcode-id="32806d2976b56829f3389611b926acaa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-f8LDlhmdBg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  207. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=-f8LDlhmdBg" target="_blank"><br/></a></small>
  208. </p><p>[ <a href="https://shin0805.github.io/chair-type-tripedal-robot/">Github</a> ]</p><p>Thanks, Shintaro!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="7xyvoe2m_tm"><em>Humanoid robot EVE training for home assistance like putting groceries into the kitchen cabinets.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="77bfc9f2e1218664f7780256a04e3649" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7xYVoE2M_TM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.1x.tech/">1X</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4aeyx7ukbp4"><em>This is the RAM—robotic autonomous mower. It can be dropped anywhere in the world and will wake up with a mission to make tall grass around it shorter. Here is a quick clip of it working on the Presidio in SF.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3eedb2f37ada94911f8bf9112adca599" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4aEyx7UKbp4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sheeprobotics.ai/">Electric Sheep</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lkg3wflkz4u"><em>This year, our robots braved a Finnish winter for the first time. As the snow clears and the days get longer, we’re looking back on how our robots made thousands of deliveries to S Group customers during the colder months.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2c3bc6bbfaa24d35cc123e6366d25681" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LKg3WFlKz4U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.starship.xyz/">Starship</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="6hhdpqnn4xg">Agility Robotics is doing its best to answer the (very common) question of “Okay, but what can humanoid robots actually do?”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ab8b8f41d0340abf4cc13880403fa920" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6HHdpQNN4Xg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube">
  209. <span class="rm-shortcode" data-rm-shortcode-id="1c3d1ececc33249dc6b32093c076d566" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bnbxS6MkSc8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  210. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=bnbxS6MkSc8" target="_blank"><br/></a></small>
  211. </p><p>[ <a href="https://agilityrobotics.com/">Agility Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="6kgb8pqvlyu">Digit is great and everything, but Cassie will always be one of my favorite robots.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="41cfcb20b802f07d5d5f569fd2fab75f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6kgB8PqvLYU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://engineering.oregonstate.edu/CoRIS">CoRIS</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iouj7y6dpey"><em>Adopting omnidirectional Field of View (FoV) cameras in aerial robots vastly improves perception ability, significantly advancing aerial robotics’s capabilities in inspection, reconstruction, and rescue tasks. We propose OmniNxt, a fully open-source aerial robotics platform with omnidirectional perception.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f5ed3afcc24762a73939938afb0ffb72" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IOuJ7Y6dpeY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hkust-aerial-robotics.github.io/OmniNxt/">OmniNxt</a> ]</p><div class="horizontal-rule"></div><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e33c564ba3072b207635c9efae392484" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lmzcFXVF_E0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The MAkEable framework enhances mobile manipulation in settings designed around humans by streamlining the process of sharing learned skills and experiences among different robots and contexts. Practical tests confirm its efficiency in a range of scenarios, involving different robots, in tasks such as object grasping, coordinated use of both hands in tasks, and the exchange of skills among humanoid robots.</em></blockquote><p>[ <a href="https://h2t.iar.kit.edu/pdf/PohlReister2024v2.pdf">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zc4-ypvsorc"><em>We conducted trials of Ringbot outdoors on a 400 meter track. With a power source of 2300 milliamp-hours and 11.1 Volts, Ringbot managed to cover approximately 3 kilometers in 37 minutes. We commanded its target speed and direction using a remote joystick controller (Steam Deck), and Ringbot experienced five falls during this trial.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="426d793dbd19a4ff5e32ff434dce1297" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zC4-yPVsOrc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10423226/">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ppvkshsvbvg">There is a notable lack of consistency about where exactly Boston Dynamics wants you to think Spot’s eyes are.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9423722cab25d1ffd3b5474c800a26ce" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PPVksHSvbVg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="t5xdomdpdzk">As with every single cooking video, there’s a lot of background prep that’s required for this robot to cook an entire meal, but I would utterly demolish those fries.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f51b987b6586ee273944b476dc898043" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/t5XDomdPdzk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dino-robotics.com/">Dino Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="8b1ngqm1zyk">Here’s everything you need to know about Wing delivery drones, except for how much human time they actually require and the true cost of making deliveries by drone, because those things aren’t fun to talk about.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8d52ebdd2b6c7f6b99af9cc589e5b215" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8B1NGqM1zYk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://wing.com/">Wing</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="op1my2-s-g0">This CMU Teruko Yata Memorial Lecture is by Agility Robotics’ Jonathan Hurst, on “Human-Centric Robots and How Learning Enables Generality.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6b24bb64f00eeda49d075b05386fd096" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OP1mY2-S-g0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Humans have dreamt of robot helpers forever. What’s new is that this dream is becoming real. New developments in AI, building on foundations of hardware and passive dynamics, enable vastly improved generality. Robots can step out of highly structured environments and become more human-centric: operating in human spaces, interacting with people, and doing some basic human workflows. By connecting a Large Language Model, Digit can convert natural language high-level requests into complex robot instructions, composing the library of skills together, using human context to achieve real work in the human world. All of this is new—and it is never going back: AI will drive a fast-following robot revolution that is going to change the way we live.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/yata-jonathan-hurst/">CMU</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 12 Apr 2024 15:11:20 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-dog-can-t-fall</guid><category>Deep robotics</category><category>Electric sheep</category><category>Sanctuary ai</category><category>Video friday</category><category>Robotics</category><category>Quadruped robots</category><category>Digit robot</category><category>Humanoid robots</category><category>Drones</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/image.gif?id=51980645&amp;width=980"></media:content></item><item><title>Pogo Stick Microcopter Bounces off Floors and Walls</title><link>https://spectrum.ieee.org/jumping-robot-quadrotor</link><description><![CDATA[
  212. <img src="https://spectrum.ieee.org/media-library/a-small-pogo-stick-shaped-robot-bounces-across-a-floor-with-a-blue-line-of-light-trailing-behind-it.jpg?id=51972177&width=1200&height=400&coordinates=0%2C1000%2C0%2C1000"/><br/><br/><p>
  213. We tend to think about hopping robots from the ground up. That is, they start on the ground, and then, by hopping, incorporate a aerial phase into their locomotion. But there’s no reason why aerial robots can’t approach hopping from the other direction, by adding a hopping ground phase to flight. Hopcopter is the first robot that I’ve ever seen give this a try, and it’s remarkably effective, combining a tiny quadrotor with a springy leg to hop hop hop all over the place.
  214. </p><hr/><p class="shortcode-media shortcode-media-youtube">
  215. <span class="rm-shortcode" data-rm-shortcode-id="6099d733ad295f53fe6f395431ba169f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Jd7IrVQHEok?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  216. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=Jd7IrVQHEok" target="_blank">Songnan Bai, Runze Ding, Song Li, and Bingxuan Pu</a></small>
  217. </p><p>
  218. So why in the air is it worth adding a pogo stick to an otherwise perfectly functional quadrotor? Well, flying is certainly a valuable ability to have, but does take a lot of energy. If you pay close attention to birds (acknowledged experts in the space), they tend to spend a substantial amount of time doing their level best <em><em>not</em></em> to fly, often by walking on the ground or jumping around in trees. Not flying most of the time is arguably one of the things that makes birds so successful—it’s that multimodal locomotion capability that has helped them to adapt to so many different environments and situations.
  219. </p><p>
  220. Hopcopter is multimodal as well, although in a slightly more restrictive sense: Its two modes are flying and intermittent flying. But the intermittent flying is very important, because cutting down on that flight phase gives Hopcopter some of the same efficiency benefits that birds experience. By itself, a quadrotor of hopcopter’s size can stay airborne for about 400 seconds, while Hopcopter can hop continuously for more than 20 minutes. If your objective is to cover as much distance as possible, Hopcopter might not be as effective as a legless quadrotor. But if your objective is instead something like inspection or search and rescue, where you need to spend a fair amount of time not moving very much, hopping could be significantly more effective.
  221. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  222. <img alt="A diagram of the Hopcopter system, and a closeup of the Hopcopter leg." class="rm-shortcode" data-rm-shortcode-id="a732711007d712541accc4581a80198a" data-rm-shortcode-name="rebelmouse-image" id="b25d9" loading="lazy" src="https://spectrum.ieee.org/media-library/a-diagram-of-the-hopcopter-system-and-a-closeup-of-the-hopcopter-leg.jpg?id=51972188&width=980"/>
  223. <small class="image-media media-caption" placeholder="Add Photo Caption...">Hopcopter is a small quadcopter (specifically a Crazyflie) attached to a springy pogo-stick leg.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Songnan Bai, Runze Ding, Song Li, and Bingxuan Pu</small>
  224. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  225. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="e9a42b12d009d89e6872c519011ea4cd" data-rm-shortcode-name="rebelmouse-image" id="62f1e" loading="lazy" src="https://spectrum.ieee.org/media-library/hopcopter-can-reposition-itself-on-the-fly-to-hop-off-of-different-surfaces.gif?id=51972247&width=980" style="max-width: 100%"/>
  226. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Hopcopter can reposition itself on the fly to hop off of different surfaces.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Songnan Bai, Runze Ding, Song Li, and Bingxuan Pu</small>
  227. </p><p>
  228. The actual hopping is mostly passive. Hopcopter’s leg is two rigid pieces connected by rubber bands, with a <a href="https://www.bitcraze.io/products/crazyflie-2-1/" rel="noopener noreferrer" target="_blank"><u>Crazyflie</u></a> microcopter stapled to the top. During a hop, the Crazyflie can add directional thrust to keep the hops hopping and alter its direction as well as its height, from 0.6 meters to 1.6 meters. There isn’t a lot of room for extra sensors on Hopcopter, but the addition of some stabilizing fins allow for continuous hopping without any positional feedback.
  229. </p><p>
  230. Besides vertical hopping, Hopcopter can also position itself in midair to hop off of surfaces at other orientations, allowing it to almost instantaneously change direction, which is a neat trick.
  231. </p><p>
  232. <span></span>And it can even do mid air somersaults, because why not?
  233. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  234. <img alt="" class="rm-shortcode" data-rm-shortcode-id="83daaab78c5d6665a3a65b6c3900977f" data-rm-shortcode-name="rebelmouse-image" id="d2508" loading="lazy" src="https://spectrum.ieee.org/media-library/hopcopter-s-repertoire-of-tricks-includes-somersaults.gif?id=51972259&width=980"/>
  235. <small class="image-media media-caption" placeholder="Add Photo Caption...">Hopcopter’s repertoire of tricks includes somersaults.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Songnan Bai, Runze Ding, Song Li, and Bingxuan Pu</small>
  236. </p><p>
  237. The researchers, based at the <a href="https://ris.bme.cityu.edu.hk/" target="_blank">City University of Hong Kong</a>, say that the Hopcopter technology (namely, the elastic leg) could be easily applied to most other quadcopter platforms, turning them into Hopcopters as well. And if you’re more interested in extra payload rather than extra endurance, it’s possible to use hopping in situations where a payload would be too heavy for continuous flight.
  238. </p><p>
  239. The researchers <a href="https://www.science.org/doi/10.1126/scirobotics.adi8912" target="_blank">published their work</a> 10 April in <em>Science Robotics</em>.
  240. </p>]]></description><pubDate>Fri, 12 Apr 2024 13:30:58 +0000</pubDate><guid>https://spectrum.ieee.org/jumping-robot-quadrotor</guid><category>Crazyflie</category><category>Jumping robots</category><category>Quadrotor</category><category>Science robotics</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-small-pogo-stick-shaped-robot-bounces-across-a-floor-with-a-blue-line-of-light-trailing-behind-it.jpg?id=51972177&amp;width=980"></media:content></item><item><title>Marco Hutter Wants to Solve Robotics’ Hard Problems</title><link>https://spectrum.ieee.org/marco-hutter-ai-institute</link><description><![CDATA[
  241. <img src="https://spectrum.ieee.org/media-library/the-ai-institutes-boston-headquarters-is-brimming-with-robotics-projects.gif?id=51962011&width=1200&height=400&coordinates=0%2C92%2C0%2C92"/><br/><br/><p>
  242. Last December, the <a href="https://theaiinstitute.com/" target="_blank">AI Institute</a> announced that it was <a href="https://theaiinstitute.com/news/marco-hutter-to-lead-zurich-office" target="_blank">opening an office in Zurich</a> as a European counterpart to its Boston headquarters and recruited Marco Hutter to helm the office. Hutter also runs the <a href="https://rsl.ethz.ch/" target="_blank">Robotic Systems Lab at ETH Zurich</a>, arguably best known as the origin of the <a href="https://www.anybotics.com/" target="_blank">ANYmal quadruped robot</a> (but it also does <a href="https://rsl.ethz.ch/robots-media.html" target="_blank">tons of other cool stuff</a>).</p><p>We’re doing our best to keep close tabs on <a href="https://spectrum.ieee.org/boston-dynamics-ai-institute-hyundai" target="_blank">the institute</a>, because it’s one of a vanishingly small number of places that currently exist where roboticists have the kind of long-term resources and vision necessary to make substantial progress on really hard problems that aren’t quite right for either industry or academia. The institute is still scaling up (and the branch in Zurich has only just kicked things off), but we did spot some projects that the Boston folks have been working on, and as you can see from the clips at the top of this page, they’re looking pretty cool.</p><p>
  243. Meanwhile, we had a chance to check in with Marco Hutter to get a sense of what the Zurich office will be working on and how he’s going to be solving all of the hard problems in robotics. All of them!</p><p><strong>How much can you tell us about what you’ll be working on at the AI Institute?</strong>
  244. </p><p>
  245. <strong>Marco Hutter:</strong> If you know the research that I’ve been doing in the past at ETH and with our startups, there’s an overlap on making systems more mobile, making systems more able to interact with the world, making systems in general more capable on the hardware and software side. And that’s what the institute strives for.
  246. </p><p>
  247. <strong>The institute describes itself as a research organization that aims to solve the most important and fundamental problems in robotics and AI. What do you think those problems are?</strong>
  248. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 image-crop-custom rm-float-right" data-rm-resized-container="25%" style="float: right;">
  249. <img alt="a man wearing a gray jacket and jeans sits in a chair." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="be1c89027fedde9b3000cd2efb96bcaf" data-rm-shortcode-name="rebelmouse-image" id="850e7" loading="lazy" src="https://spectrum.ieee.org/media-library/a-man-wearing-a-gray-jacket-and-jeans-sits-in-a-chair.jpg?id=51962144&width=379&height=467&quality=85&coordinates=246%2C66%2C175%2C0" style="max-width: 100%"/>
  250. <small class="image-media media-caption" placeholder="Add Photo Caption...">Marco Hutter is the head of the AI Institute’s new Zurich branch.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Swiss Robotics Day</small>
  251. </p><p>
  252. <strong>Hutter: </strong>There are lots of problems. If you’re looking at robots today, we have to admit that they’re still pretty stupid. The way they move, their capability of understanding their environment, the way they’re able to interact with unstructured environments—I think we’re still lacking a lot of skills on the robotic side to make robots useful in all of the tasks we wish them to do. So we have the ambition of having these robots taking over all these dull, dirty, and dangerous jobs. But if we’re honest, today the biggest impact is really only for the dull part. And I think these dirty and dangerous jobs, where we really need support from robots, that’s still going to take a lot of fundamental work on the robotics and AI side to make enough progress for robots to become useful tools.
  253. </p><p>
  254. <strong>What is it about the institute that you think will help robotics make more progress in these areas?</strong>
  255. </p><p>
  256. <strong>Hutter:</strong> I think the institute is one of these unique places where we are trying to bring the benefits of the academic world and the benefits from this corporate world together. In academia, we have all kinds of crazy ideas and we try to develop them in all different directions, but at the same time, we have limited engineering support, and we can only go so far. Making robust and reliable hardware systems is a massive effort, and that kind of engineering is much better done in a corporate lab.
  257. </p><p>
  258. You’ve seen this a little bit with the type of work my lab has been doing in the past. We built simple quadrupeds with a little bit of mobility, but in order to make them robust, we eventually had to <a href="https://www.anybotics.com/" target="_blank">spin it out</a>. We had to bring it to the corporate world, because for a research group, a pure academic group, it would have been impossible. But at the same time, you’re losing something, right? Once you go into your corporate world and you’re running a business, you have to be very focused; you can’t be that explorative and free anymore.
  259. </p><p>
  260. So if you bring these two things together through the institute, with long-term planning, enough financial support, and brilliant people both in the U.S. and Europe working together, I think that’s what will hopefully help us make significant progress in the next couple of years.
  261. </p><p class="pull-quote">“We’re very different from a traditional company, where at some point you need to have a product that makes money. Here, it’s really about solving problems and taking the next step.” <strong>—Marco Hutter, AI Institute</strong></p><p>
  262. <strong>And what will that actually mean in the context of <a href="https://theaiinstitute.com/research" target="_blank">dynamically mobile robots</a>?</strong><strong></strong></p><p>
  263. <strong></strong><strong></strong><strong>Hutter: </strong>If you look at Boston Dynamics’ Atlas doing parkour, or ANYmal doing parkour, these are still demonstrations. You don’t see robots running around in the forests or robots working in mines and doing all kinds of crazy maintenance operations, or in industrial facilities, or construction sites, you name it. We need to not only be able to do this once as a prototype demonstration, but to have all the capabilities that bring that together with environmental perception and understanding to make this athletic intelligence more capable and more adaptable to all kinds of different environments. This is not something that from today to tomorrow we’re going to see it being revolutionized—it will be gradual, steady progress because I think there’s still a lot of fundamental work that needs to be done.
  264. </p><p>
  265. <strong>I feel like the mobility of legged robots has improved a lot over the last five years or so, and a lot of that progress has come from Boston Dynamics and also from your lab. Do you feel the same?</strong>
  266. </p><p>
  267. <strong>Hutter: </strong>There has always been progress; the question is how much you can zoom in or zoom out. I think one thing has changed quite a bit, and that’s the availability of robotic systems to all kinds of different research groups. If you look back a decade, people had to build their own robots, they had to do the control for the robots, they had to work on the perception for the robots, and putting everything together like that makes it extremely fragile and very challenging to make something that works more than once. That has changed, which allows us to make faster progress.
  268. </p><p>
  269. <strong>Marc Raibert (founder of the AI Institute) likes to show videos of mountain goats to illustrate what robots should be (or will be?) capable of. Does that kind of thing inspire you as well?</strong>
  270. </p><p>
  271. <strong>Hutter:</strong> If you look at the animal kingdom, there’s so many things you can draw inspiration from. And a lot of this stuff is not only the cognitive side; it’s really about pairing the cognitive side with the mechanical intelligence of things like the simple-seeming hooves of mountain goats. But they’re really not that simple, they’re pretty complex in how they interact with the environment. Having one of these things and not the other won’t allow the animal to move across its challenging environment. It’s the same thing with the robots.
  272. </p><p>
  273. It’s always been like this in robotics, where you push on the hardware side, and your controls become better, so you hit a hardware limitation. So both things have to evolve hand in hand. Otherwise, you have an over-dimensioned hardware system that you can’t use because you don’t have the right controls, or you have very sophisticated controls and your hardware system can’t keep up.
  274. </p><p>
  275. <strong>How do you feel about all of the investment into humanoids right now, when quadrupedal robots with arms have been around for quite a while?</strong>
  276. </p><p><strong>Hutter:</strong> There’s a lot of ongoing research on quadrupeds with arms, and the nice thing is that these technologies that are developed for mobile systems with arms are the same technologies that are used in humanoids. It’s not different from a research point of view, it’s just a different form factor for the system. I think from an application point of view, the story from all of these companies making humanoids is that our environment has been adapted to humans quite a bit. A lot of tasks are at the height of a human standing, right? A quadruped doesn’t have the height to see things or to manipulate things on a table. It’s really application dependent, and I wouldn’t say that one system is better than the other.</p>]]></description><pubDate>Thu, 11 Apr 2024 19:21:31 +0000</pubDate><guid>https://spectrum.ieee.org/marco-hutter-ai-institute</guid><category>The ai institute</category><category>Boston dynamics</category><category>Zurich</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/the-ai-institutes-boston-headquarters-is-brimming-with-robotics-projects.gif?id=51962011&amp;width=980"></media:content></item><item><title>Ukraine Is the First “Hackers’ War”</title><link>https://spectrum.ieee.org/ukraine-hackers-war</link><description><![CDATA[
  277. <img src="https://spectrum.ieee.org/media-library/a-person-in-military-gear-launches-a-drone-skyward.jpg?id=51958871&width=1200&height=400&coordinates=0%2C292%2C0%2C292"/><br/><br/><p>Rapid and resourceful technological improvisation has long been a mainstay of warfare, but the war in Ukraine is taking it to a new level. This improvisation is most conspicuous in the ceaselessly evolving struggle between weaponized drones and electronic warfare, a cornerstone of this war.</p><p>Weaponized civilian <a href="https://www.economist.com/interactive/science-and-technology/2024/02/05/cheap-racing-drones-offer-precision-warfare-at-scale" rel="noopener noreferrer" target="_blank">first-person-view (FPV) drones</a> began dramatically reshaping the landscape of the war in the summer of 2023. Prior to this revolution, various commercial drones played critical roles, primarily for intelligence, surveillance, and reconnaissance. Since 2014, the main means of defending against these drones has been electronic warfare (EW), in its many forms. The iterative, lethal dance between drones and EW has unfolded a rich technological tapestry, revealing insights into a likely future of warfare where EW and drones intertwine.</p><p>After the <a href="https://www.atlanticcouncil.org/blogs/ukrainealert/putins-unpunished-crimean-crime-set-the-stage-for-russias-2022-invasion/" rel="noopener noreferrer" target="_blank">invasion of Crimea</a>, in 2014, Ukrainian forces depended heavily on commercial off-the-shelf drones, such as models from <a href="https://spectrum.ieee.org/the-consumer-electronics-hall-of-fame-dji-phantom-drone" target="_blank">DJI</a>, for reconnaissance and surveillance. These were not FPV drones, for the most part. Russia’s response involved deploying <a href="https://spectrum.ieee.org/the-fall-and-rise-of-russian-electronic-warfare" target="_self">military-grade EW systems</a> alongside law-enforcement tools like <a href="https://www.theverge.com/22985101/dji-aeroscope-ukraine-russia-drone-tracking" rel="noopener noreferrer" target="_blank">Aeroscope</a>, a product from DJI that allows instant identification and tracking of drones from their radio emissions. Aeroscope, while originally a standard tool used by law enforcement to detect and track illegal drone flights, soon revealed its military potential by pinpointing both the drone and its operator.</p><p class="pull-quote">On both sides of the line you’ll find much the same kind of people doing much the same thing: hacking.</p><p>This application turned a security feature into a significant tactical asset, providing Russian artillery units with precise coordinates for their targets—namely, Ukrainian drone operators. To circumvent this vulnerability, groups of Ukrainian volunteers innovated. By updating the firmware of the DJI drones, they closed the backdoors that allowed the drones to be tracked by Aeroscope. Nevertheless, after the start of the conflict in Crimea, commercial, off-the-shelf drones were considered a last-resort asset used by volunteers to compensate for the lack of proper military systems. To be sure, the impact of civilian drones during this period was not comparable to what occurred after the February 2022 invasion.<br/></p><p>As Russia’s “thunder-run” strategy became bogged down shortly after the invasion, Russian forces found themselves unexpectedly vulnerable to civilian drones, in part because most of their full-scale military EW systems were not very mobile.<br/></p><p class="shortcode-media shortcode-media-rebelmouse-image">
  278. <img alt="A dark, cloudy sky behind the silhouette of a drone and an anti-tank grenade" class="rm-shortcode" data-rm-shortcode-id="62189c77971ef659b9b96b63154278f3" data-rm-shortcode-name="rebelmouse-image" id="0a420" loading="lazy" src="https://spectrum.ieee.org/media-library/a-dark-cloudy-sky-behind-the-silhouette-of-a-drone-and-an-anti-tank-grenade.jpg?id=51959559&width=980"/>
  279. <small class="image-media media-caption" placeholder="Add Photo Caption...">During a training exercise in southern Ukraine in May 2023, a drone pilot maneuvered a flier to a height of 100 meters before dropping a dummy anti-tank grenade on to a pile of tires. The test, pictured here, worked—that night the pilot’s team repeated the exercise over occupied territory, blowing up a Russian armored vehicle. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Emre Caylak/Guardian/eyevine/Redux</small></p><p>The Russians could have compensated by deploying many <a href="https://dl.djicdn.com/downloads/AEROSCOPE/20201014/Aeroscope_AS-F1800_User_Manual_EN_v2.0.pdf" rel="noopener noreferrer" target="_blank">Aeroscope terminals</a> then, but they didn’t, because most Russian officers at the time had a dismissive view of the capabilities of civilian drones in a high-intensity conflict. That failure opened a window of opportunity that Ukrainian armed-forces units exploited aggressively. Military personnel, assisted by many volunteer technical specialists, gained a decisive intelligence advantage for their forces by quickly fielding fleets of hundreds of camera drones connected to simple yet effective battlefield-management systems. They soon began modifying commercial drones to attack, with grenade tosses and, ultimately, <a href="https://scrippsnews.com/stories/ace-ukrainian-fpv-drone-pilot-darwin-shows-war-s-explosive-evolution/" rel="noopener noreferrer" target="_blank">“kamikaze”</a> operations. Besides the DJI models, one of the key drones was the R18, an octocopter developed by the Ukrainian company <a href="https://aerorozvidka.ngo/" target="_blank">Aerorozvidka</a>, capable of carrying three grenades or small bombs. As casualties mounted, Russian officers soon realized the extent of the threat posed by these drones.</p><h2>How Russian electronic warfare evolved to counter the drone threat</h2><p>By spring 2023, as the front lines stabilized following strategic withdrawals and counteroffensives, it was clear that the nature of drone warfare had evolved. Russian defenses had adapted, deploying more sophisticated counter-drone systems. Russian forces were also beginning to use drones, setting the stage for the nuanced cat-and-mouse game that has been going on ever since.</p><p class="pull-quote">The modular construction of first-person-view drones allowed for rapid evolution to enhance their resilience against electronic warfare.</p><p>For example, early on, most Russian EW efforts primarily focused on jamming the drones’ radio links for control and video. This wasn’t too hard, given that DJI’s <a href="https://www.droneblog.com/dji-transmission-system/" rel="noopener noreferrer" target="_blank">OcuSync protocol</a> was not designed to withstand dense jamming environments. So by April 2023, Ukrainian drone units had begun pivoting toward first-person-view (FPV) drones with modular construction, enabling rapid adaptation to, and evasion of, EW countermeasures.</p><p>The Russian awakening to the importance of drones coincided with the stabilization of the front lines, around August 2022. Sluggish Russian offensives came at a high cost, with an increasing proportion of casualties caused directly or indirectly by drone operators. By this time, the Ukrainians were hacking commercial drones, such as DJI Mavics, to “anonymize” them, rendering Aeroscope useless. It was also at this time that the Russians began to adopt commercial drones and develop their own tactics, techniques, and procedures, leveraging their EW and artillery advantages while attempting to compensate for their delay in combat-drone usage. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  280. <img alt="A soldier sits on a sandy hill wearing special glasses and holding a remote to control a drone with a fake bomb which is in the air in front of him." class="rm-shortcode" data-rm-shortcode-id="81370d0ed1a4fae94f6fb11eed0df57a" data-rm-shortcode-name="rebelmouse-image" id="e8c08" loading="lazy" src="https://spectrum.ieee.org/media-library/a-soldier-sits-on-a-sandy-hill-wearing-special-glasses-and-holding-a-remote-to-control-a-drone-with-a-fake-bomb-which-is-in-the.jpg?id=51959583&width=980"/>
  281. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-773301" placeholder="Add Photo Caption..." spellcheck="false">On 4 March, a Ukrainian soldier flew a drone at a testing site near the town of Kreminna in eastern Ukraine. The drone was powered by a blue battery pack and carried a dummy bomb.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">David Guttenfelder/The New York Times/Redux<a href="https://archive.reduxpictures.com/#" rel="noopener noreferrer" target="_blank"></a></small></p><p>Throughout 2023, when the primary EW tactic employed was jamming, the DJI drones began to fall out of favor for attack roles. When the density of Russian jammer usage surpassed a certain threshold, DJI’s <a href="https://store.dji.bg/en/blog/what-is-dji-ocusync-and-how-does-it-work" rel="noopener noreferrer" target="_blank">OcuSync</a> radio protocol, which controls a drone’s flight direction and video, could not cope with it. Being proprietary, OcuSync’s frequency band and power are not modifiable. A jammer can attack both the control and video signals, and the drone becomes unrecoverable most of the time. As a result, DJI drones have lately been used farther from the front lines and relegated mainly to roles in intelligence, surveillance, and reconnaissance. Meanwhile, the modular construction of FPVs allowed for rapid evolution to enhance their resilience against EW. The Ukraine war greatly boosted the world’s production of FPV drones; at this point there are thousands of FPV models and modifications.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  282. <img alt="A soldier places his hand on a drone that carries a shell beneath it." class="rm-shortcode" data-rm-shortcode-id="cd5799a767f57087a4a2c0bbf845ceb9" data-rm-shortcode-name="rebelmouse-image" id="abd51" loading="lazy" src="https://spectrum.ieee.org/media-library/a-soldier-places-his-hand-on-a-drone-that-carries-a-shell-beneath-it.jpg?id=51959889&width=980"/>
  283. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-503656" placeholder="Add Photo Caption..." spellcheck="false">A “kamikaze” first-person-view drone with an attached PG-7L round, intended for use with an RPG-7 grenade launcher, is readied for a mission near the town of Horlivka, in the Donetsk region, on 17 January 2024. The drone was prepared by a Ukrainian serviceman of the Rarog UAV squadron of the 24th Separate Mechanized Brigade.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Inna Varenytsia/Reuters/Redux</small></p><p>As of early 2024, analog video signals are the most popular option by far. This technology offers drone operators a brief window of several seconds to correct the drone’s path upon detecting interference, for example as a result of jamming, before signal loss. Additionally, drone manufacturers have access to more powerful video transmitters, up to 5 watts, which are more resistant to jamming. Furthermore, the 1.2-gigahertz frequency band is gaining popularity over the previously dominant 5.8-GHz band due to its superior obstacle penetration and because fewer jammers are targeting that band.</p><p>However, the lack of encryption in analog video transmitter systems means that a drone’s visual feed can be intercepted by any receiver. So various mitigation strategies have been explored. These include adding encryption layers and using digital-control and video protocols such as <a href="https://www.hd-zero.com/" target="_blank">HDZero</a>, <a href="https://caddxfpv.com/" target="_blank">Walksnail</a>, or, especially, any of several new open-source alternatives.</p><p>In the war zone, the most popular of these open-source control radio protocols is ExpressLRS, or <a href="https://www.expresslrs.org/" rel="noopener noreferrer" target="_blank">ELRS</a>. Being open-source, ELRS not only offers more affordable hardware than its main rival, <a href="https://www.team-blacksheep.com/media/files/tbs-crossfire-manual.pdf" rel="noopener noreferrer" target="_blank">TBS Crossfire</a>, it is also modifiable via its software. It has been hacked in order to use frequency bands other than its original 868 to 915 megahertz. This adaptation produces serious headaches for EW operators, because they have to cover a much wider band. As of March 2024, Ukrainian drone operators are performing final tests on 433-MHz ELRS transmitter-receiver pairs, further challenging prevailing EW methods.</p><h2>Distributed mass in the transparent battlefield</h2><p>Nevertheless, the most important recent disruption of all in the drone-versus-EW struggle is distributed mass. Instead of an envisioned blitzkrieg-style swarm with big clouds of drones hitting many closely spaced targets during very short bursts, an ever-growing number of drones are covering more widely dispersed targets over a much longer time period, whenever the weather is conducive. Distributed mass is a cornerstone of the emerging <a href="https://mwi.westpoint.edu/preparing-to-win-the-first-fight-of-the-next-war/" rel="noopener noreferrer" target="_blank">transparent battlefield</a>, in which many different sensors and platforms transmit huge amounts of data that is integrated in real time to provide a comprehensive view of the battlefield. One offshoot of this strategy is that more and more kamikaze drones are directed toward a constantly expanding range of targets. Electronic warfare is adapting to this new reality, confronting mass with mass: massive numbers of drones against massive numbers of RF sensors and jammers.</p><p class="pull-quote">Ukraine is the first true war of the hackers.</p><p>Attacks now often consist of far more commercial drones than a suite of RF detectors or jammers could handle even six months ago. With brute-force jamming, even if defenders are willing to accept high rates of damage inflicted on their own offensive drones, these previous EW systems are just not up to the task. So for now, at least, the drone hackers are in the lead in this deadly game of “hacksymmetrical” warfare. Their development cycle is far too rapid for conventional electronic warfare to keep pace.</p><p>But the EW forces are not standing still. Both sides are either developing or acquiring civilian RF-detecting equipment, while military-tech startups and even small volunteer groups are developing new, simple, and good-enough jammers in essentially the same improvised ways that hackers would.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  284. <img alt="Two soldiers work on a piece of machinery consisting of a metal rectangular square with three heavy attached cables, as well as three vertical pieces coming out of it, while another man looks on." class="rm-shortcode" data-rm-shortcode-id="ce6940f5c30e39b75de0c7a865aea0a9" data-rm-shortcode-name="rebelmouse-image" id="75655" loading="lazy" src="https://spectrum.ieee.org/media-library/two-soldiers-work-on-a-piece-of-machinery-consisting-of-a-metal-rectangular-square-with-three-heavy-attached-cables-as-well-as.jpg?id=51959608&width=980"/>
  285. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-277510" placeholder="Add Photo Caption..." spellcheck="false">Ukrainian soldiers familiarized themselves with a portable drone jammer during a training session in Kharkiv, Ukraine, on 11 March 2024.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Diego Herrera Carcedo/Anadolu/Getty Images</small></p><p><span></span>Two examples illustrate this trend. Increasingly affordable, short-range jammers are being installed on tanks, armored personnel carriers, trucks, pickups, and even 4x4s. Although limited and unsophisticated, these systems contribute to drone-threat mitigation. In addition, a growing number of soldiers on the front line carry simple, commercial radio-frequency (RF) scanners with them. Configured to detect drones across various frequency bands, these devices, though far from perfect, have begun to save lives by providing precious additional seconds of warning before an imminent drone attack.</p><p>The electronic battlefield has now become a massive game of cat and mouse. Because commercial drones have proven so lethal and disruptive, drone operators have become high-priority targets. As a result, operators have had to reinvent camouflage techniques, while the hackers who drive the evolution of their drones are working on every modification of RF equipment that offers an advantage. Besides the frequency-band modification described above, hackers have developed and refined two-way, two-signal repeaters<strong> </strong>for drones. Such systems are attached to another drone that hovers close to the operator and well above the ground, relaying signals to and from the attacking drone. Such repeaters more than double the practical range of drone communications, and thus the EW “cats” in this game have to search a much wider area than before.</p><p>Hackers and an emerging cottage industry of war startups are raising the stakes. Their primary goal is to erode the effectiveness of jammers by attacking them autonomously. In this countermeasure, offensive drones are equipped with home-on-jam systems. Over the next several months, increasingly sophisticated versions of these systems will be fielded. These home-on-jam capabilities will autonomously target any jamming emission within range; this range, which is classified, depends on emission power at a rate that is believed to be 0.3 kilometers per watt. In other words, if a jammer has 100 W of signal power, it can be detected up to 30 km away, and then attacked. After these advances allow the drone “mice” to hunt the EW cat, what will happen to the cat?</p><p>The challenge is unprecedented and the outcome uncertain. But on both sides of the line you’ll find much the same kind of people doing much the same thing: hacking. Civilian hackers have for years lent their skills to such shady enterprises as narco-trafficking and organized crime. Now hacking is a major, indispensable component of a full-fledged war, and its practitioners have emerged from a gray zone of plausible deniability into the limelight of military prominence. Ukraine is the first true war of the hackers.</p><p>The implications for Western militaries are ominous. We have neither masses of drones nor masses of EW tech. What is worse, the world’s best hackers are completely disconnected from the development of defense systems. The Ukrainian experience, where a vibrant war startup scene is emerging, suggests a model for integrating maverick hackers into our defense strategies. As the first hacker war continues to unfold, it serves as a reminder that in the era of electronic and drone warfare, the most critical assets are not just the technologies we deploy but also the scale and the depth of the human ingenuity behind them.</p>]]></description><pubDate>Wed, 10 Apr 2024 14:05:08 +0000</pubDate><guid>https://spectrum.ieee.org/ukraine-hackers-war</guid><category>Electronic warfare</category><category>Fpv drone</category><category>Dji</category><category>Ukraine conflict</category><category>Drones</category><dc:creator>Juan Chulilla</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-person-in-military-gear-launches-a-drone-skyward.jpg?id=51958871&amp;width=980"></media:content></item><item><title>Video Friday: LASSIE On the Moon</title><link>https://spectrum.ieee.org/video-friday-lassie-moon</link><description><![CDATA[
  286. <img src="https://spectrum.ieee.org/media-library/the-legged-autonomous-surface-science-in-analog-environments-project-tests-a-quadrupedal-robot-on-mt-hood-in-oregon.png?id=51924196&width=1339&height=896&coordinates=148%2C63%2C433%2C121"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://robocup.de/german-open/?lang=en">RoboCup German Open</a>: 17–21 April 2024, KASSEL, GERMANY</h5><h5><a href="https://www.xponential.org/xponential2024/">AUVSI XPONENTIAL 2024</a>: 22–25 April 2024, SAN DIEGO</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="wbtyelffe1a"><em>USC, UPenn, Texas A&M, Oregon State, Georgia Tech, Temple University, and NASA Johnson Space Center are teaching dog-like robots to navigate craters of the moon and other challenging planetary surfaces in research funded by NASA.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ba9c963db526aa0e128ac6131bd7afb8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wBTyelFFE1A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://viterbischool.usc.edu/news/2024/04/teaching-robots-to-walk-on-the-moon-and-maybe-rescue-one-another/">USC</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vlxmc-eyfga"><em>AMBIDEX is a revolutionary robot that is fast, lightweight, and capable of human-like manipulation. We have added a sensor head and the torso and the waist to greatly expand the range of movement. Compared to the previous arm-centered version, the overall impression and balance has completely changed.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a1cb06c640a609ad516e35e2bd0cd66b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VLXmC-EyFgA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.naverlabs.com/en/ambidex">Naver Labs</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="kzfewd3-cok"><em>It still needs a lot of work, but the six-armed pollinator, Stickbug, can autonomously navigate and pollinate flowers in a greenhouse now.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0ddec0c33fedb5f77287ae855dd92cbe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kZFEwD3-cok?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I think “needs a lot of work” really means “needs a couple more arms.”</p><p>[ <a href="https://arxiv.org/abs/2404.03489">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="5gwqwmeuuom"><em>Experience the future of robotics as UBTECH’s humanoid robot integrates with Baidu’s ERNIE through AppBuilder! Witness robots [that] understand language and autonomously perform tasks like folding clothes and object sorting.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3384c7e48c643878f9e16f3bd4b9fe12" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5GWqwMEuUOM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ubtrobot.com/">UBTECH</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="kghj-qnjyls">I know the fins on this robot are for walking underwater rather than on land, but watching it move, I feel like it’s destined to evolve into something a little more terrestrial.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d8f90bb6c6c818de6d222fb0099d8acb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kGHJ-qnjyLs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10388469">Paper</a> ] via [ <a href="https://hero.postech.ac.kr/">HERO Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="s163k_yoyi4">iRobot has a new Roomba that vacuums and mops—and at $275, it’s a pretty good deal.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7a2fc3f5730c96bb8a8490868869e922" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/s163k_yoYi4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Also, if you are a robot vacuum owner, please, please remember to clean the poor thing out from time to time. Here’s how to do it with a Roomba:</p><p class="shortcode-media shortcode-media-youtube">
  287. <span class="rm-shortcode" data-rm-shortcode-id="74d1001c43a1b0f516cfee926ad216ff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kiSng9LeepQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  288. </p><p>[ <a href="https://www.irobot.com/en_US/roomba-combo-essential-robot/Y014020.html">iRobot</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="r_hqgkyj3o8"><em>The video demonstrates the wave-basin testing of a 43 kg (95 lb) amphibious cycloidal propeller unmanned underwater vehicle (Cyclo-UUV) developed at the Advanced Vertical Flight Laboratory, Texas A&M University. The use of cyclo-propellers allows for 360 degree thrust vectoring for more robust dynamic controllability compared to UUVs with conventional screw propellers.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="835555e2f0e083be18a6137b9b10be00" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/r_hqgKyj3O8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://avfl.engr.tamu.edu/">AVFL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="awqpb1lo8um">Sony is still upgrading Aibo with new features, like the ability to listen to your terrible music and dance along.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f5037a98d1fba89c90df4854a0db3388" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AwqPB1Lo8uM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://us.aibo.com/feature/rhythmdance.html">Aibo</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="5vh2_taaiaw"><em>Operating robots precisely and at high speeds has been a long-standing goal of robotics research. To enable precise and safe dynamic motions, we introduce a four degree-of-freedom (DoF) tendon-driven robot arm. Tendons allow placing the actuation at the base to reduce the robot’s inertia, which we show significantly reduces peak collision forces compared to conventional motor-driven systems. Pairing our robot with pneumatic muscles allows generating high forces and highly accelerated motions, while benefiting from impact resilience through passive compliance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6b09c3760f1f1e4bb163c7a4d1758c39" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5vH2_TAaiaw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.google.com/view/pamy2">Max Planck Institute</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="aygetczc4cc"><em>Rovers on Mars have previously been caught in loose soils, and turning the wheels dug them deeper, just like a car stuck in sand. To avoid this, Rosalind Franklin has a unique wheel-walking locomotion mode to overcome difficult terrain, as well as autonomous navigation software.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fe44b4c84c30085d5f4378f59ea880e4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ayGEtczc4Cc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.esa.int/Science_Exploration/Human_and_Robotic_Exploration/Exploration/ExoMars/ExoMars_rover">ESA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pknidnennbm"><em>Cassie is able to walk on sand, gravel, and rocks inside the Robot Playground at the University of Michigan.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a1aab5e7c8dd3c019f84fc7d6956a9bb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pKNiDnennBM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Aww, they stopped before they got to the fun rocks.</p><p>[ <a href="https://arxiv.org/abs/2403.02486">Paper</a> ] via [ <a href="https://robotics.umich.edu/">Michigan Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="2uh4uwk7u84">Not bad for 2016, right?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9561b75f025ea1a2a71a45fda8730afb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2Uh4UWK7U84?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.em.eng.chiba-u.jp/~namiki/index-e.html">Namiki Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mejftm_wjok"><em>MOMO has learned the Bam Yang Gang dance moves with its hand dexterity. :) By analyzing 2D dance videos, we extract detailed hand skeleton data, allowing us to recreate the moves in 3D using a hand model. With this information, MOMO replicates the dance motions with its arm and hand joints.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="908034304ba47f72ddd212b89200d85c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MEJfTm_wjOk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.google.com/view/sungjoon-choi/home">RILAB</a> ] via [ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="laejn2-cbtk">This UPenn GRASP SFI Seminar is from Eric Jang at 1X Technologies, on “Data Engines for Humanoid Robots.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="87c37e71e0f2bb9ee1ff4e88644223fa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/laeJn2-CBTk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>1X’s mission is to create an abundant supply of physical labor through androids that work alongside humans. I will share some of the progress 1X has been making towards general-purpose mobile manipulation. We have scaled up the number of tasks our androids can do by combining an end-to-end learning strategy with a no-code system to add new robotic capabilities. Our Android Operations team trains their own models on the data they gather themselves, producing an extremely high-quality “farm-to-table” dataset that can be used to learn extremely capable behaviors. I’ll also share an early preview of the progress we’ve been making towards a generalist “World Model” for humanoid robots.</em></blockquote><p>[ <a href="https://www.grasp.upenn.edu/events/spring-2024-grasp-sfi-eric-jang/">UPenn</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zmxpoknlxjw">This Microsoft Future Leaders in Robotics and AI Seminar is from Chahat Deep Singh at the University of Maryland, on “Minimal Perception: Enabling Autonomy in Palm-Sized Robots.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="82f42ad844259ce0816959df7192f7b8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZmxPoKNlXjw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The solution to robot autonomy lies at the intersection of AI, computer vision, computational imaging, and robotics—resulting in minimal robots. This talk explores the challenge of developing a minimal perception framework for tiny robots (less than 6 inches) used in field operations such as space inspections in confined spaces and robot pollination. Furthermore, we will delve into the realm of selective perception, embodied AI, and the future of robot autonomy in the palm of your hands.</em></blockquote><p>[ <a href="https://robotics.umd.edu/">UMD</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 05 Apr 2024 17:10:33 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-lassie-moon</guid><category>Video friday</category><category>Sony aibo</category><category>Naver labs</category><category>Ubtech</category><category>Irobot</category><category>Robotics</category><category>Icra</category><category>Quadruped robots</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/the-legged-autonomous-surface-science-in-analog-environments-project-tests-a-quadrupedal-robot-on-mt-hood-in-oregon.png?id=51924196&amp;width=980"></media:content></item><item><title>Toyota’s “Bubble-ized” Humanoid Grasps With Its Whole Body</title><link>https://spectrum.ieee.org/humanoid-robot-tri-punyo</link><description><![CDATA[
  289. <img src="https://spectrum.ieee.org/media-library/a-humanoid-robot-with-soft-puffy-sleeves-grips-several-full-grocery-bags-to-its-chest.png?id=51891898&width=1200&height=400&coordinates=0%2C142%2C0%2C143"/><br/><br/><p>When we think about robotic manipulation, the default is usually to think about grippers—about robots using manipulators (like fingers or other <a data-linked-post="2661758852" href="https://spectrum.ieee.org/video-friday-biological-end-effectors" target="_blank">end effectors</a>) to interact with objects. For most humans, though, interacting with objects can be a lot more complicated, and we use whatever body parts are convenient to help us deal with objects that are large or heavy or awkward.</p><p>This somewhat constrained definition of robotic manipulation isn’t robotics’ fault, really. The word <em>manipulation</em> itself comes from the Latin for getting handsy with stuff, so there’s a millennium or two’s-worth of hand-related inertia behind the term. The Los Altos, Calif.–based <a href="https://www.tri.global/news/toyota-research-institute-opens-its-doors-first-time-uncommon-look-how-technology-can-help" target="_blank">Toyota Research Institute</a> (TRI) is taking a more expansive view with <a href="https://medium.com/toyotaresearch/meet-punyo-tris-soft-robot-for-whole-body-manipulation-research-949c934ac3d8" rel="noopener noreferrer" target="_blank">their new humanoid, Punyo</a>, which uses its soft body to help it manipulate objects that would otherwise be pretty much impossible to manage with grippers alone.</p><p class="pull-quote">“An anthropomorphic embodiment allows us to explore the complexities of social interactions like physical assistance, nonverbal communication, intent, predictability, and trust, to name just a few.” <strong>—Alex Alspach, Toyota Research Institute (TRI)</strong></p><hr/><p>Punyo started off as just <a href="http://punyo.tech/bubblegripper" rel="noopener noreferrer" target="_blank">a squishy gripper at TRI</a>, but the idea was always to scale up to a big squishy humanoid, hence this concept art of a squishified <a href="https://spectrum.ieee.org/toyota-gets-back-into-humanoid-robots-with-new-thr3" target="_self">T-HR3</a>:</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  290. <img alt="Sketch of two humanoid robots, the rightmost robot containing body parts such as arms and torso that are inflated like a balloon" class="rm-shortcode" data-rm-shortcode-id="179f5347c5ee46bccfb3e7e4ab56cf29" data-rm-shortcode-name="rebelmouse-image" id="74fa7" loading="lazy" src="https://spectrum.ieee.org/media-library/sketch-of-two-humanoid-robots-the-rightmost-robot-containing-body-parts-such-as-arms-and-torso-that-are-inflated-like-a-balloon.png?id=51891895&width=980"/>
  291. <small class="image-media media-caption" placeholder="Add Photo Caption...">This concept image shows what Toyota’s T-HR3 humanoid might look like when “bubble-ized.”</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">TRI</small></p><p>“We use the term ‘bubble-ized,’ says <a href="https://www.tri.global/about-us/alex-alspach" target="_blank">Alex Alspach</a>, tech lead for Punyo at TRI. Alspach tells us that the concept art above doesn’t necessarily reflect what the Punyo humanoid will eventually look like, but “it gave us some physical constraints and a design language. It also reinforced the idea that we are after general hardware and software solutions that can augment and enable both future and existing robots to take full advantage of their whole bodies for manipulation.”</p><p>This version of Punyo isn’t quite at “whole” body manipulation, but it can get a lot done using its arms and chest, which are covered with air bladders that provide both sensing and compliance:</p><p class="shortcode-media shortcode-media-youtube">
  292. <span class="rm-shortcode" data-rm-shortcode-id="26b0e29f594d0ac2b337ad965b872657" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FY-MD4gteeE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  293. </p><p>Many of those motions look very humanlike, because this is how humans manipulate things. Not to throw too much shade at all those humanoid warehouse robots, but as is pointed out in the video above, using just our hands outstretched in front of us to lift things is not how humans do it, because using other parts of our bodies to provide extra support makes lifting easier. This is not a trivial problem for robots, though, because interactions between point contacts that are rigid (like how most robotics manipulators handle the world) are fairly well understood. Once you throw big squishy surfaces into the mix, along with big squishy objects, it’s just not something that most robots are ready for.</p><p class="pull-quote">“A soft robot does <em>not</em> interact with the world at a single point.” <strong>—Russ Tedrake, TRI</strong></p><p>“Current robot manipulation evolved from big, strong industrial robots moving car parts and big tools with their end effectors,” Alspach says. “I think it’s wise to take inspiration from the human form—we are strong enough to perform most everyday tasks with our hands, but when a big, heavy object comes around, we need to get creative with how we wrap our arms around it and position our body to lift it.”</p><p>Robots are notorious for lifting big and heavy objects, primarily by manipulating them with robot-y form factors in robot-y ways. So what’s so great about the human form factor, anyway? This question goes way beyond Punyo, of course, but we wanted to get the Punyo team’s take on humanoids, and we tossed a couple more questions at them just for fun.</p><p><strong><em>IEEE Spectrum</em>: So why humanoids?</strong></p><p><strong>Alspach:</strong> The humanoid robot checks a few important boxes. First of all, the environments we intend to work in were built for humans, so the humanoid form helps a robot make use of the spaces and tools around it. Independently, multiple teams at TRI have converged on bimanual systems for tasks like grocery shopping and food preparation. A chest between these arms is a simple addition that gives us useful contact surfaces for manipulating big objects, too. Furthermore, our human-robot interaction (HRI) team has done, and continues to do, extensive research with older adults, the people we look forward to helping the most. An anthropomorphic embodiment allows us to explore the complexities of social interactions like physical assistance, nonverbal communication, intent, predictability, and trust, to name just a few. </p><p class="pull-quote">“We focus not on highly precise tasks but on gross, whole-body manipulation, where robust strategies help stabilize and control objects, and a bit of sloppiness can be an asset.” <strong>—Alex Alspach, TRI</strong></p><p><strong>Does having a bubble-ized robot make anything more difficult for you?</strong></p><p><strong>Russ Tedrake, VP of robotics research: </strong>If you think of your robot as interacting with the world at a point—the standard view from, for example, impedance control—then putting a soft, passive spring in series between your robot and the world <em>does</em> limit performance. It reduces your control bandwidth. But that view misses the more important point. A soft robot does <em>not</em> interact with the world at a single point. Soft materials fundamentally change the dynamics of contact by deforming around the material—generating patch contacts that allow contact forces and moments not achievable by a rigid interaction.</p><p><strong>Alspach: </strong>Punyo’s softness is extreme compared to other manipulation platforms that may, say, just have rubber pads on their arms or fingers. This compliance means that when we grab an object, it may not settle exactly where we planned for it to, or, for example, if we bump that object up against the edge of a table, it may move within our grasp. For these reasons, tactile sensing is an important part of our solution as we dig into how to measure and control the state of the objects we manipulate. We focus not on highly precise tasks but on gross, whole-body manipulation, where robust strategies help stabilize and control objects, and a bit of sloppiness can be an asset. </p><p><strong>Compliance can be accomplished in different ways, including just in software. What’s the importance of having a robot that’s physically squishy rather than just one that acts squishily?</strong></p><p><strong>Andrew Beaulieu, Punyo tech lead:</strong> We do not believe that passive and active compliance should be considered mutually exclusive, and there are several advantages to having a physically squishy robot, especially when we consider having a robot operate near people and in their spaces. Having a robot that can safely make contact with the world opens up avenues of interaction and exploration. Using compliant materials on the robot also allows it to conform to complicated shapes passively in a way that would otherwise involve more complicated articulated or actuated mechanisms. Conforming to the objects allows us to increase the contact patch with the object and distribute the forces, usually creating a more robust grasp. These compliant surfaces allow us to research planning and control methods that might be less precise, rely less on accurate object localization, or use hardware with less precise control or sensing.</p><p><strong>What’s it like to be hugged by Punyo?</strong></p><p><strong>Kate Tsui, Punyo HRI tech lead: </strong>Although Punyo isn’t a social robot, a surprising amount of emotion comes through its hug, and it feels quite comforting. A <a href="https://www.youtube.com/watch?v=G8ZYgPRV5LY&t=145s" rel="noopener noreferrer" target="_blank">hug from Punyo</a> feels like a long, sustained, snug squeeze from a close friend you haven’t seen for a long time and don’t want to let go.</p><div class="horizontal-rule"><br/></div><p class="shortcode-media shortcode-media-rebelmouse-image">
  294. <img alt="A four-panel cartoon of a humanoid robot hauling boxes, adjusting a photo on the wall, carrying dirty dishes, and hugging someone" class="rm-shortcode" data-rm-shortcode-id="774f37fb6334339db68084253267e4b9" data-rm-shortcode-name="rebelmouse-image" id="be6ca" loading="lazy" src="https://spectrum.ieee.org/media-library/a-four-panel-cartoon-of-a-humanoid-robot-hauling-boxes-adjusting-a-photo-on-the-wall-carrying-dirty-dishes-and-hugging-someon.png?id=51891893&width=980"/>
  295. <small class="image-media media-caption" placeholder="Add Photo Caption...">A series of concept images shows situations in which whole-body manipulation might be useful in the home.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">TRI</small></p><p>(<em>Interview transcript ends.</em>)</p><p>Softness seems like it could be a necessary condition for bipedal humanoids working in close proximity to humans, especially in commercial or home environments where interactions are less structured and predictable. “I think more robots using their whole body to manipulate is coming soon, especially with the <a href="https://spectrum.ieee.org/tag/Humanoid-Robots" target="_blank">recent explosion of humanoids outside of academic labs</a>,” Alspach says. “Capable, general-purpose robotic manipulation is a competitive field, and using the whole body unlocks the ability to efficiently manipulate large, heavy, and unwieldy objects.”</p>]]></description><pubDate>Tue, 02 Apr 2024 15:32:17 +0000</pubDate><guid>https://spectrum.ieee.org/humanoid-robot-tri-punyo</guid><category>Emotion</category><category>Grippers</category><category>Humanoid robot</category><category>Humanoids</category><category>Punyo</category><category>Robotics</category><category>Soft robotics</category><category>Tactile sensing</category><category>Toyota research institute</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-humanoid-robot-with-soft-puffy-sleeves-grips-several-full-grocery-bags-to-its-chest.png?id=51891898&amp;width=980"></media:content></item><item><title>Video Friday: Co-Expression</title><link>https://spectrum.ieee.org/video-friday-co-expression</link><description><![CDATA[
  296. <img src="https://spectrum.ieee.org/media-library/image.png?id=51871627&width=1200&height=400&coordinates=0%2C61%2C0%2C379"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://robocup.de/german-open/?lang=en">RoboCup German Open</a>: 17–21 April 2024, KASSEL, GERMANY</h5><h5><a href="https://www.xponential.org/xponential2024/">AUVSI XPONENTIAL 2024</a>: 22–25 April 2024, SAN DIEGO, CA</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="pwttzr_wxuq"><em>Columbia engineers build Emo, a silicon-clad robotic face that makes eye contact and uses two AI models to anticipate and replicate a person’s smile before the person actually smiles—a major advance in robots predicting human facial expressions accurately, improving interactions, and building trust between humans and robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="baa3bc058dc0fbd9f1f2295f255efdf7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pWTTzR_wXuQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.engineering.columbia.edu/news/robot-can-you-say-cheese">Columbia</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ukw4b0t5wgc"><em>Researchers at Stanford University have invented a way to augment electric motors to make them much more efficient at performing dynamic movements through a new type of actuator, a device that uses energy to make things move. Their actuator, published 20 March in <u>Science Robotics</u></em><em>, uses springs and clutches to accomplish a variety of tasks with a fraction of the energy usage of a typical electric motor.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c3b47e1f229f21ffdbab0a2672bf9df5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UKW4b0t5Wgc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.stanford.edu/2024/03/20/new-efficient-motor-alternative-next-gen-robotics/">Stanford</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="u70lo78ykzg">I’m sorry, but the world does not need more drummers.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c65bba5cc02d08c5c7057eb71b333d4e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/u70LO78YKZg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robots.fourierintelligence.com/">Fourier Intelligence</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="d1mjnvrp0p0">Always good to see NASA’s Valakyrie doing research.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bb6fbab6241a89891992e848d97ec622" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/d1mjNvRp0p0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ntrs.nasa.gov/citations/20240002748">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="l0fqhjks8bu"><em>In challenging terrains, constructing structures such as antennas and cable-car masts often requires the use of helicopters to transport loads via ropes.Challenging this paradigm, we present Geranos: a specialized multirotor Unmanned Aerial Vehicle (UAV) designed to enhance aerial transportation and assembly. Our experimental demonstration mimicking antenna/cable-car mast installations showcases Geranos ability in stacking poles (3 kilograms, 2 meters long) with remarkable sub-5 centimeter placement accuracy, without the need of human manual intervention.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="48fd0c5e17be4e7e01c45621f774e503" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/l0fQhJks8BU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2312.01988">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="b-8n3g-m5-s"><em>Flyability’s Elios 2 in November 2020 helped researchers inspect Reactor 5 at the Chernobyl nuclear disaster site to determine whether any uranium was present in the area. Prior to this, Reactor 5 had not been investigated since the disaster in 1986.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f476352d09346f3f50e2962a29241b59" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b-8n3G-M5-s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flyability.com/nuclear">Flyability</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vhbfpyb-qxi"><em>Various musculoskeletal humanoids have been developed so far.  While these humanoids have the advantage of their flexible and redundant bodies that mimic the human body, they are still far from being applied to real-world tasks. One of the reasons for this is the difficulty of bipedal walking in a flexible body. Thus, we developed a musculoskeletal wheeled robot, Musashi-W, by combining a wheeled base and musculoskeletal upper limbs for real-world applications.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2487fb2def68fcb344c506f2b0e798d1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VhBfpYB-QxI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2403.11729">Paper</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bexqb_9mlgi"><em>A recent trend in industrial robotics is to have robotic manipulators working side-by-side with human operators. A challenging aspect of this coexistence is that the robot is required to reliably solve complex path-planning problems in a dynamically changing environment. To ensure the safety of the human operator while simultaneously achieving efficient task realization, this paper introduces... a scheme [that] can steer the robot arm to the desired end-effector pose in the presence of actuator saturation, limited joint ranges, speed limits, a cluttered static obstacle environment, and moving human collaborators.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="44b0d6d6f2388127f52e6410583c3ac0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Bexqb_9MLGI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.sciencedirect.com/science/article/abs/pii/S0736584523001862?via%3Dihub">Paper</a> ]</p><p>Thanks, Kelly!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jvqaq3rjcgm"><em>Our mobile manipulator Digit worked continuously for 26 hours split over the 3.5 days of Modex 2024, in Atlanta. Everything was tracked and coordinated by our newest product, Agility Arc, a cloud automation platform.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="68c3a2842f861ded765d05dab1789c62" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JVQaQ3RjcgM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zfuhi3_oixk"><em>We’re building robots that can keep people out of harm’s way: Spot enables operators to remotely investigate and de-escalate hazardous situations. Robots have been used in government and public safety applications for decades but Spot’s unmatched mobility and intuitive interface is changing incident response for departments in the field today.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e79b3aabbc98520c20eff83cb983b31b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zFUHi3_oiXk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/watch?v=zFUHi3_oiXk">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="edkwbm2rlog"><em>This paper presents a Bistable Aerial Transformer (BAT) robot, a novel morphing hybrid aerial vehicle (HAV) that switches between quadrotor and fixed-wing modes via rapid acceleration and without any additional actuation beyond those required for normal flight.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="49a1901389213165d9c189851e6c2c42" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eDKwBM2RLOg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://asmedigitalcollection.asme.org/mechanismsrobotics/article-abstract/doi/10.1115/1.4065159/1198861/Bistable-Aerial-Transformer-BAT-A-Quadrotor-Fixed?redirectedFrom=fulltext">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9aux90a9t8k"><em>Disney’s Baymax frequently takes the spotlight in many research presentations dedicated to soft and secure physical human-robot interaction (pHRI). KIMLAB’s recent paper in TRO showcases a step towards realizing the Baymax concept by enveloping the skeletons of PAPRAS (Plug And Play Robotic Arm System) with soft skins and utilizing them for sensory functions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7fdd5eb4ab27388a8b4320e85f5aaf00" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9aux90A9T8k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10473193">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="qji993ftvuo">Catch me if you can!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d71c2e8279f6dd15a5d9f9aa4569b585" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QJI993ftVuo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mrs.felk.cvut.cz/">CVUT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="crwoytb8qvu"><em>Deep Reinforcement Learning (RL) has demonstrated impressive results in solving complex robotic tasks such as quadruped locomotion. Yet, current solvers fail to produce efficient policies respecting hard constraints. In this work, we advocate for integrating constraints into robot learning and present Constraints as Terminations (CaT), a novel constrained RL algorithm.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3c13c6a994e2e6294c49bcb575d8bf08" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/crWoYTb8QvU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://constraints-as-terminations.github.io/">CaT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cudl-cp-lww"><em>Why hasn’t the dream of having a robot at home to do your chores become a reality yet? With three decades of research expertise in the field, roboticist Ken Goldberg sheds light on the clumsy truth about robots—and what it will take to build more dexterous machines to work in a warehouse or help out at home.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d7fe6bf21831513cc3fa73a6ca133497" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cUdl-Cp-LWw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ted.com/talks/ken_goldberg_why_don_t_we_have_better_robots_yet">TED</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fvtiohf17tg"><em>Designed as a technology demonstration that would perform up to five experimental test flights over a span of 30 days, the Mars helicopter surpassed expectations—repeatedly—only recently completing its mission after having logged an incredible 72 flights over nearly three years. Join us for a live talk to learn how Ingenuity’s team used resourcefulness and creativity to transform the rotorcraft from a successful tech demo into a helpful scout for the Perseverance rover, ultimately proving the value of aerial exploration for future interplanetary missions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2f0efe3dee717ce8871b0e2afdea494e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fVtiOhf17tg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mars.nasa.gov/technology/helicopter/">JPL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="c3mvrdczjei"><em>Please join us for a lively panel discussion featuring GRASP Faculty members Dr. Pratik Chaudhari, Dr. Dinesh Jayaraman, and Dr. Michael Posa. This panel will be moderated by Dr. Kostas Daniilidis around the current hot topic of AI Embodied in Robotics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5f6dc63462a7612204bf717ba4fcd69f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/c3mVrdCZJEI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://events.seas.upenn.edu/event/spring-2024-grasp-on-robotics-grasp-faculty-panel-ai-embodied-in-robotics/">Penn Engineering</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 29 Mar 2024 18:51:11 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-co-expression</guid><category>Icra</category><category>Robotics</category><category>Video friday</category><category>Spot robot</category><category>Boston dynamics</category><category>Quadruped robots</category><category>Humanoid robots</category><category>Drones</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=51871627&amp;width=980"></media:content></item><item><title>Boston Dynamics Unleashes New Spot Variant for Research</title><link>https://spectrum.ieee.org/boston-dynamics-research-spot</link><description><![CDATA[
  297. <img src="https://spectrum.ieee.org/media-library/a-photograph-of-a-yellow-and-black-robotic-dog.png?id=51860981&width=1200&height=400&coordinates=0%2C102%2C0%2C103"/><br/><br/><p>At Nvidia GTC last week, Boston Dynamics CTO Aaron Saunders gave a talk about deploying AI in real-world robots—namely, how Spot is leveraging <a href="https://bostondynamics.com/blog/starting-on-the-right-foot-with-reinforcement-learning/" target="_blank">reinforcement learning to get better at locomotion</a> (We spoke with Saunders last year <a href="https://spectrum.ieee.org/falling-robots" target="_blank">about robots falling over</a>). And Spot has gotten a <em>lot</em> better—a Spot robot takes a tumble on average once every 50 kilometers, even as the Spot fleet collectively walks enough to circle the Earth every three months.<strong></strong></p><p>That fleet consists of a lot of commercial deployments, which is impressive for any mobile robot, but part of the reason for that is because the current version of Spot is really not intended for robotics research, even though over 100 universities are home to at least one Spot. Boston Dynamics has not provided developer access to Spot’s joints, meaning that anyone who has wanted to explore quadrupedal mobility has had to find some other platform that’s a bit more open and allows for some experimentation.</p><p>Boston Dynamics is now announcing a new variant of Spot that includes a low-level application programming interface (API) that gives joint-level control of the robot. This will give (nearly) full control over how Spot moves its legs, which is a huge opportunity for the robotics community, since we’ll now be able to find out exactly what Spot is capable of.<strong> </strong>For example, we’ve already heard from <a href="https://theaiinstitute.com/" target="_blank">a credible source</a> that Spot is capable of running much, much faster than Boston Dynamics has publicly shown, and it’s safe to assume that a speedier Spot is just the start.</p><hr/><p class="shortcode-media shortcode-media-rebelmouse-image">
  298. <img alt="An animated GIF showing a yellow and black robotic dog jumping onto a stack of boxes in a research lab." class="rm-shortcode" data-rm-shortcode-id="ebd8b7f1e504207d880f702a2b27ad4e" data-rm-shortcode-name="rebelmouse-image" id="583b8" loading="lazy" src="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-yellow-and-black-robotic-dog-jumping-onto-a-stack-of-boxes-in-a-research-lab.gif?id=51860962&width=980"/>
  299. <small class="image-media media-caption" placeholder="Add Photo Caption...">This is an example of a new Spot capability when a custom locomotion controller can be used on the robot.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Boston Dynamics</small></p><p>When you buy a Spot robot from Boston Dynamics, it arrives already knowing how to walk. It’s very, very good at walking. Boston Dynamics is so confident in Spot’s walking ability that you’re only allowed high-level control of the robot: You tell it where to go, it decides how to get there. If you want to do robotics research using Spot as a mobility platform, that’s totally fine, but if you want to do research on quadrupedal locomotion, it hasn’t been possible with Spot. But that’s changing.</p><p>The <a href="https://resources.bostondynamics.com/spot-gtc-rl-researcher-kit" target="_blank">Spot RL Researcher Kit</a> is a collaboration between Boston Dynamics, Nvidia, and the <a href="https://spectrum.ieee.org/marc-raibert-boston-dynamics-instutute" target="_blank">AI Institute</a>. It includes a joint-level control API, an Nvidia Jetson AGX Orin payload, and a simulation environment for Spot based on Nvidia Isaac Lab. The kit will be officially released later this year, but Boston Dynamics is starting a slow rollout through an <a href="https://resources.bostondynamics.com/spot-gtc-rl-researcher-kit" target="_blank">early adopter beta program</a>.</p><p>From a certain perspective, Boston Dynamics did this whole thing with Spot backwards by first creating a commercial product and only then making it into a research platform. “At the beginning, we felt like it would be great to include that research capability, but that it wasn’t going to drive the adoption of this technology,” Saunders told us after his GTC session. Instead, Boston Dynamics first focused on getting lots of Spots out into the world in a useful way, and only now, when the company feels like it has gotten there, is the time right to unleash a fully featured research version of Spot. “It was really just getting comfortable with our current product that enabled us to go back and say, ‘How can we now provide people with the kind of access that they’re itching for?’ ”</p><p>Getting to this point has taken a huge amount of work for Boston Dynamics. Predictably, Spot started out as a novelty for most early adopters, becoming a project for different flavors of innovation groups within businesses rather than an industrial asset. “I think there’s been a change there,” Saunders says. “We’re working with operational customers a lot more, and the composure of our sales is shifting away from being dominated by early adopters and we’re starting to see repeat sales and interest in larger fleets of robots.”</p><p>Deploying and supporting a large fleet of Spots is one of the things that allowed Boston Dynamics to feel comfortable offering a research version. Researchers are not particularly friendly to their robots, because the goal of research is often to push the envelope of what’s possible. And part of that process includes getting very well acquainted with what turns out to be <em>not</em> possible, resulting in robots that end up on the floor, sometimes in pieces. The research version of Spot will include a mandatory <a href="https://support.bostondynamics.com/s/spot/spot-care-and-maintenance" rel="noopener noreferrer" target="_blank">Spot Care Service Plan</a>, which exists to serve commercial customers but will almost certainly provide more value to researchers, who want to see what kinds of crazy things they can get Spot to do.</p><p>Exactly how crazy those crazy things will be remains to be seen. Boston Dynamics is starting out with a beta program for the research Spots partially because it’s not quite sure yet how many safeguards to put in place within the API. “We need to see where the problems are,” Saunders says. “We still have a little work to do to really home in how our customers are going to use it.” Deciding how much Spot should be able to put itself at risk in the name of research may be a difficult question to answer, but I’m pretty sure that the beta program participants are going to do their best to find out how much tolerance Boston Dynamics has for Spot shenanigans. I just hope that whatever happens, they share as much video of it as possible.</p><p>The Spot Early Adopter Program for the new RL Researcher Kit is open for applications <a href="https://resources.bostondynamics.com/spot-gtc-rl-researcher-kit" rel="noopener noreferrer" target="_blank">here</a>.</p>]]></description><pubDate>Thu, 28 Mar 2024 19:03:56 +0000</pubDate><guid>https://spectrum.ieee.org/boston-dynamics-research-spot</guid><category>Nvidia gtc</category><category>Spot robot</category><category>Boston dynamics</category><category>Legged robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-photograph-of-a-yellow-and-black-robotic-dog.png?id=51860981&amp;width=980"></media:content></item><item><title>Video Friday: Project GR00T</title><link>https://spectrum.ieee.org/video-friday-project-gr00t</link><description><![CDATA[
  300. <img src="https://spectrum.ieee.org/media-library/image.png?id=51809103&width=1200&height=400&coordinates=0%2C5%2C0%2C289"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="kr7fazpfp6m"><em>See NVIDIA’s journey from pioneering advanced autonomous vehicle hardware and simulation tools to accelerated perception and manipulation for autonomous mobile robots and industrial arms, culminating in the next wave of cutting-edge AI for humanoid robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a27cd38a24bfa095f649fc6d991e401e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kr7FaZPFp6M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://developer.nvidia.com/project-GR00T">NVIDIA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="kf9wdqykyqq"><em>In release 4.0, we advanced Spot’s locomotion abilities thanks to the power of reinforcement learning. Paul Domanico, Robotics Engineer at Boston Dynamics talks through how Spot’s hybrid approach of combining reinforcement learning with model predictive control creates an even more stable robot in the most antagonistic environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="178b396a6f4f97c65eaf6d811055678e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Kf9WDqYKYQQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/blog/starting-on-the-right-foot-with-reinforcement-learning/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xpbwxlg-3bi"><em>We’re excited to share our latest progress on teaching EVEs general-purpose skills. Everything in the video is all autonomous, all 1X speed, all controlled with a single set of neural network weights.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8063efa9613368d91bf6717023905c63" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XpBWxLg-3bI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.1x.tech/">1X</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="v1lywsitgms">What I find interesting about the Unitree H1 doing a standing flip is where it decides to put its legs.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="293a5d3510defaeea3e5cf92306ff081" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/V1LyWsiTgms?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/h1/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xu0nl126vni"><em>At the MODEX Exposition in March of 2024, Pickle Robot demonstrated picking freight from a random pile similar to what you see in a messy truck trailer after it has bounced across many miles of highway.  The piles of boxes were never the same and the demonstration was run live in front of crowds of onlookers 25 times over 4 days.  No other robotic trailer/container unloading system has yet to demonstrate this ability to pick from unstructured piles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b8669b2761f5c6fc660d6ecaf4f0cdc7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xU0nl126VNI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://picklerobot.com/">Pickle</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="eu6zeu4l1wo"><em>RunRu is a car-like robot, a robot-like car, with autonomy, sociability, and operability. This is a new type of personal vehicle that aims to create a “Jinba-Ittai” relationship with its passengers, who are not only always assertive, but also sometimes whine.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b18e4ba60574e58009568f95179e81e7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eU6zEu4l1wo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.icd.cs.tut.ac.jp/index.php/portfolio/runru/">ICD-LAB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="29pjy6movcg"><em>Verdie went to GTC this year and won the hearts of people but maybe not the other robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d83a00d40fd309dd3e04998f762fe6b6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/29PjY6MovCg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sheeprobotics.ai/">Electric Sheep</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qd6fahnwaec"><em>The “DEEPRobotics AI+” merges AI capabilities with robotic software systems to continuously boost embodied intelligence. The showcased achievement is a result of training a new AI and software system.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="62163e44ba3ae46b9e53b55b6ded8792" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QD6fAhNWaec?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mbzwetk34wu">If you want to collect data for robot grasping, using Stretch and a pair of tongs is about as affordable as it gets.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c206029f9c639f7c09a696525e4a5302" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mbZwetk34WU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hello-robot.com/stretch-dex-teleop-kit">Hello Robot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xdgnhtppqc4">The real reason why Digit’s legs look backwards is so that it doesn’t bang its shins taking GPUs out of the oven.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a4416af0eeec006e74d24ffa946d5a1f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XdGNhtPpQC4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Meanwhile, some of us can bake our GPUs without even needing an oven.</p><p>[ <a href="https://agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="upnid_rwdni"><em>P1 is LimX Dynamics’ innovative point-foot biped robot, serving as an important platform for the systematic development and modular testing of reinforcement learning. It is utilized to advance the research and iteration of basic biped locomotion abilities. The success of P1 in conquering forest terrain is a testament to LimX Dynamics’ systematic R&D in reinforcement learning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dc62b5d9cf09598ba32a07169f7ce595" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UpNid_rWDnI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="8jkh_m7seg8">And now, this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="19d9b57d5e894e3889e75e4af2afaf51" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8jkH_M7Seg8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www-robot.mes.titech.ac.jp/index_e.html">Suzumori Endo Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jkp-rqnlw90"><em>Cooking in kitchens is fun. BUT doing it collaboratively with two robots is even more satisfying! We introduce MOSAIC, a modular framework that coordinates multiple robots to closely collaborate and cook with humans via natural language interaction and a repository of skills.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fd3b0f21d7d8f5f327ee320e03ff003c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jKp-RqNlW90?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://portal-cornell.github.io/MOSAIC/">Cornell</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="n5qtvjfdipm"><em>neoDavid is a Robust Humanoid with Dexterous Manipulation Skills, developed at DLR. The main focus in the development of neoDavid is to get as close to human capabilities as possible—especially in terms of dynamics, dexterity and robustness.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c5eac4d75bb54be38bd873cb05f124b1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/N5QTvjFdIPM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dlr.de/rm/desktopdefault.aspx/tabid-8017">DLR</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="2zafpyonbmk"><em>Welcome to our customer spotlight video series where we showcase some of the remarkable robots that our customers have been working on.  In this episode we showcase three Clearpath Robotics UGVs that our customers are using to create robotic assistants for three different applications.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9af0229923e2a903ad6be83f3f33b5b0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2zAfPYOnbMk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://clearpathrobotics.com/">Clearpath</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nfkqoakzq0u"><em>This video presents KIMLAB’s new three-fingered robotic hand, featuring soft tactile sensors for enhanced grasping capabilities. Leveraging cost-effective 3D printing materials, it ensures robustness and operational efficiency.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b21e6a2e00d16206ea307676602ea01a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nfkQoAKzq0U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fim3ta6p_d0"><em>Various perception-aware planning approaches have attempted to enhance the state estimation accuracy during maneuvers, while the feature matchability among frames, a crucial factor influencing estimation accuracy, has often been overlooked. In this paper, we present APACE, an Agile and Perception-Aware trajeCtory gEneration framework for quadrotors aggressive flight, that takes into account feature matchability during trajectory planning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8114b735e34b021fb77980a50a42aa5e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FIM3ta6p_d0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2403.08365">Paper</a> ] via [ <a href="https://uav.hkust.edu.hk/">HKUST</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="umtu2yjkc0y"><em>In this video, we see Samuel Kunz, the pilot of the RSL Assistance Robot Race team from ETH Zurich, as he participates in the CYBATHLON Challenges 2024. Samuel completed all four designated tasks—retrieving a parcel from a mailbox, using a toothbrush, hanging a scarf on a clothesline, and emptying a dishwasher—with the help of an assistance robot. He achieved a perfect score of 40 out of 40 points and secured first place in the race, completing the tasks in 6.34 minutes.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="83077fa2e74d44d92f800bbac3ae0b67" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/umTU2yjkc0Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://cybathlon.ethz.ch/en/teams/rsl">CYBATHLON</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dqggetklw8a"><em>Florian Ledoux is a wildlife photographer with a deep love for the Arctic and its wildlife. Using the Mavic 3 Pro, he steps onto the ice ready to capture the raw beauty and the stories of this chilly, remote place.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5b17eeac7b25427e40bd2abc1d379341" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dqGgETKLw8A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://viewpoints.dji.com/blog/florian-ledoux-polar-obsession">DJI</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 22 Mar 2024 18:07:12 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-project-gr00t</guid><category>Boston dynamics</category><category>Nvidia</category><category>Video friday</category><category>Robotics</category><category>Quadruped robots</category><category>Humanoid robots</category><category>Assistive technology</category><category>Drones</category><category>Robotic arm</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=51809103&amp;width=980"></media:content></item><item><title>Electroadhesion Heralds New Implant and Robot Tech</title><link>https://spectrum.ieee.org/electroadhesion-for-better-implants</link><description><![CDATA[
  301. <img src="https://spectrum.ieee.org/media-library/two-photos-show-raw-chicken-and-a-piece-of-tomato-sticking-to-a-black-rectangular-material-as-they-hang-down-under-the-force-of.jpg?id=51766341&width=1200&height=400&coordinates=0%2C246%2C0%2C247"/><br/><br/><p>Applying electricity for a few seconds to a soft material, such as a slice of raw tomato or chicken, can strongly bond it to a hard object, such as a graphite slab, without any tape or glue, a new study finds. This unexpected effect is also reversible—switching the direction of the electric current often easily separates the materials, scientists at the University of Maryland say. Potential applications for such “electroadhesion,” which can even work underwater, may include improved biomedical implants and biologically inspired robots.</p><p>“It is surprising that this effect was not discovered earlier,” says Srinivasa Raghavan, a professor of chemical and biomolecular engineering at the University of Maryland. “This is a discovery that could have been made pretty much since we’ve had batteries.”</p><p>In nature, soft materials such as living tissues are often bonded to hard objects such as bones. Previous research explored chemical ways to accomplish this feat, such as with glues that mimic how <a href="https://www.upi.com/Science_News/2004/01/13/Superglue-could-aid-shipping-medicine/28001074010500/" target="_blank">mussels stick to rocks and boats</a>. However, these bonds are usually irreversible.</p><p class="pull-quote">They tried a number of different soft materials, such as tomato, apple, beef, chicken, pork and gelatin...</p><p>Previously, Raghavan and his colleagues discovered that <a href="https://chbe.umd.edu/news/story/umd-research-team-createsnbsplsquoswitchablersquo-adhesive-for-repairing-cuts-and-tears-in-tissue" rel="noopener noreferrer" target="_blank">electricity could make gels stick to biological tissue</a>, a discovery that might one day lead to gel patches that can help repair wounds. In the new study, instead of bonding two soft materials together, they explored whether electricity could make a soft material stick to a hard object.</p><p>The scientists began with a pair of <a href="https://spectrum.ieee.org/solid-state-battery" target="_self">graphite</a> electrodes (consisting of an anode and a cathode) and an <a href="https://www.niehs.nih.gov/health/topics/agents/acrylamide" rel="noopener noreferrer" target="_blank">acrylamide</a> gel. They applied five volts across the gel for three minutes. Surprisingly, they found the gel strongly bonded onto the graphite anode. Attempts to wrench the gel and electrode apart would typically break the gel, leaving pieces of it on the electrode. The bond could apparently last indefinitely after the voltage was removed, with the researchers keeping samples of gel and electrode stuck together for months.</p><p>Howeve, when the researchers switched the polarity of the current, the acrylamide gel detached from the anode. Instead, it adhered onto the other electrode.</p><p>Raghavan and his colleagues experimented with this newfound electroadhesion effect a number of different ways. They tried a number of different soft materials, such as tomato, apple, beef, chicken, pork and gelatin, as well as different electrodes, such as copper, lead, tin, nickel, iron, zinc and titanium. They also varied the strength of the voltage and the amount of time it was applied.</p><p>The researchers found the amount of salt in the soft material played a strong role in the electroadhesion effect. The salt makes the soft material conductive, and high concentrations of salt could lead gels to adhere to electrodes within seconds.</p><p class="pull-quote">“It’s surprising how simple this effect is, and how widespread it might be”</p><p>The scientists also discovered that metals that are better at giving up their electrons, such as copper, lead and tin, are better at electroadhesion. Conversely, metals that hold onto their electrons strongly, such as nickel, iron, zinc and titanium, fared poorly.</p><p>These findings suggest that electroadhesion arises from chemical bonds between the electrode and soft material after they exchange electrons. Depending on the nature of the hard and soft materials, adhesion happened at the anode, cathode, both electrodes, or neither. Boosting the strength of the voltage and the amount of time it was applied typically increased adhesion strength.</p><p>“It’s surprising how simple this effect is, and how widespread it might be,” Raghavan says.</p><p>Potential applications for electroadhesion may include improving biomedical implants—the ability to bond tissue to steel or titanium could help reinforce implants, the researchers say. Electroadhesion may also help create biologically inspired robots with stiff bone-like skeletons and soft muscle-like elements, they add. They also suggest electroadhesion could lead to new kinds of batteries where soft electrolytes are bonded to hard electrodes, although it’s not clear if such adhesion would make much of a difference to a battery’s performance, Raghavan says.</p><p>The researchers also discovered that electroadhesion could occur underwater, which they suggest could open up an even wider range of possible applications for this effect. Typical adhesives do not work underwater, since many cannot spread onto solid surfaces that are submerged in liquids, and even those that can usually only form weak adhesive bonds due to interference from the liquid.</p><p>“It’s hard for me to pinpoint one real application for this discovery,” Raghavan says. “It reminds me of the researchers who made the discoveries behind Velcro or Post-it notes—the applications were not obvious to them when the discoveries were made, but the applications did arise over time.”</p><p>The scientists detailed <u><a href="http://dx.doi.org/10.1021/acscentsci.3c01593" rel="noopener noreferrer" target="_blank">their findings</a></u> online 13 March in the journal ACS Central Science.</p>]]></description><pubDate>Tue, 19 Mar 2024 17:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/electroadhesion-for-better-implants</guid><category>Implants</category><category>Bioengineering</category><category>Electroadhesion</category><dc:creator>Charles Q. Choi</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/two-photos-show-raw-chicken-and-a-piece-of-tomato-sticking-to-a-black-rectangular-material-as-they-hang-down-under-the-force-of.jpg?id=51766341&amp;width=980"></media:content></item><item><title>Nvidia Announces GR00T, a Foundation Model for Humanoids</title><link>https://spectrum.ieee.org/nvidia-gr00t-ros</link><description><![CDATA[
  302. <img src="https://spectrum.ieee.org/media-library/a-man-in-a-black-leather-jacket-stands-in-front-of-an-enormous-screen-displaying-flowcharts-and-graphics-of-robots-during-a-pres.jpg?id=51767293&width=1200&height=400&coordinates=0%2C534%2C0%2C0"/><br/><br/><p>Nvidia’s <a href="https://www.nvidia.com/gtc/" target="_blank">ongoing GTC developer conference</a> in San Jose is, unsurprisingly, almost entirely about AI this year. But in between the AI developments, Nvidia has also made a couple of significant robotics announcements.</p><p>First, there’s <a href="https://nvidianews.nvidia.com/news/foundation-model-isaac-robotics-platform" target="_blank">Project GR00T</a> (with each letter and number pronounced individually so as not to invoke the wrath of Disney), a <a href="https://spectrum.ieee.org/covariant-foundation-model" target="_blank">foundation model</a> for humanoid robots. And secondly, Nvidia has committed to be the founding platinum member of the <a href="https://osralliance.org/" target="_blank">Open Source Robotics Alliance</a>, a new initiative from the <a href="https://www.openrobotics.org/" target="_blank">Open Source Robotics Foundation</a> intended to make sure that the <a href="https://ros.org/" target="_blank">Robot Operating System</a> (ROS), a collection of open-source software libraries and tools, has the support that it needs to flourish.</p><h2>GR00T</h2><p>First, let’s talk about GR00T (short for “<a href="https://developer.nvidia.com/project-gr00t" target="_blank">Generalist Robot 00 Technology</a>”). The way that Nvidia presenters enunciated it letter-by-letter during their talks strongly suggests that in private they just say “Groot.” So the rest of us can also just say “Groot” as far as I’m concerned.</p><p>As a “general-purpose foundation model for humanoid robots,” GR00T is intended to provide a starting point for specific humanoid robots to do specific tasks. As you might expect from something being presented for the first time at an Nvidia keynote, it’s awfully vague at the moment, and we’ll have to get into it more later on. Here’s pretty much everything useful that Nvidia has told us so far:</p><blockquote>“Building foundation models for general humanoid robots is one of the most exciting problems to solve in AI today,” said Jensen Huang, founder and CEO of NVIDIA. “The enabling technologies are coming together for leading roboticists around the world to take giant leaps towards artificial general robotics.”<br/><br/>Robots powered by GR00T... will be designed to understand natural language and emulate movements by observing human actions—quickly learning coordination, dexterity and other skills in order to navigate, adapt and interact with the real world.</blockquote><p>This sounds good, but that “will be” is doing a lot of heavy lifting. Like, there’s a very significant “how” missing here. More specifically, we’ll need a better understanding of what’s underlying this foundation model—is there real robot data under there somewhere, or is it based on a massive amount of simulation? Are the humanoid robotic companies involved contributing data to improve GR00T, or instead training their own models based on it? It’s certainly notable that Nvidia is name-dropping most of the heavy-hitters in commercial humanoids, including <a href="https://www.1x.tech/" target="_blank">1X Technologies</a>, <a data-linked-post="2659620206" href="https://spectrum.ieee.org/agility-robotics-digit" target="_blank">Agility Robotics</a>, <a data-linked-post="2664258859" href="https://spectrum.ieee.org/humanoid-robot" target="_blank">Apptronik</a>, <a data-linked-post="2662332032" href="https://spectrum.ieee.org/boston-dynamics" target="_blank">Boston Dynamics</a>, <a data-linked-post="2667393441" href="https://spectrum.ieee.org/figure-robot-video" target="_blank">Figure AI</a>, <a href="https://fourierintelligence.com/" target="_blank">Fourier Intelligence</a>, <a data-linked-post="2660271022" href="https://spectrum.ieee.org/sanctuary-humanoid-robot" target="_blank">Sanctuary AI</a>, <a data-linked-post="2662333549" href="https://spectrum.ieee.org/quadruped-robot-unitree-go2" target="_blank">Unitree Robotics</a>, and <a href="https://www.pxing.com/en/index" target="_blank">XPENG Robotics</a>. We’ll be able to check in with some of those folks directly this week to hopefully learn more. </p><p>On the hardware side, Nvidia is also announcing a new computing platform called Jetson Thor:</p><p><em>Jetson Thor was created as a new computing platform capable of performing complex tasks and interacting safely and naturally with people and machines. It has a modular architecture optimized for performance, power and size. The SoC includes a next-generation GPU based on NVIDIA Blackwell architecture with a transformer engine delivering 800 teraflops of 8-bit floating point AI performance to run multimodal generative AI models like GR00T. With an integrated functional safety processor, a high-performance CPU cluster and 100GB of ethernet bandwidth, it significantly simplifies design and integration efforts.</em></p><p>Speaking of Nvidia’s Blackwell architecture—today the company also unveiled its <a href="https://spectrum.ieee.org/nvidia-blackwell" target="_blank">B200 Blackwell GPU</a>. And to round out the announcements, the chip foundry TSMC and Synopsys, an electronic design automation company, each said they will be moving Nvidia’s inverse lithography tool, <a href="https://spectrum.ieee.org/inverse-lithography" target="_blank">cuLitho</a>, into production.<strong></strong></p><h2>The Open Source Robotics Alliance</h2><p>The other big announcement is actually from the Open Source Robotics Foundation, which is launching the <a href="https://osralliance.org/" target="_blank">Open Source Robotics Alliance</a> (OSRA), a “new initiative to strengthen the governance of our open-source robotics software projects and ensure the health of the Robot Operating System (ROS) Suite community for many years to come.” Nvidia is an inaugural platinum member of the OSRA, but they’re not alone—other platinum members include Intrinsic and Qualcomm. Other significant members include Apex, Clearpath Robotics, Ekumen, eProsima, PickNik, Silicon Valley Robotics, and Zettascale.<strong></strong></p><p>“The [Open Source Robotics Foundation] had planned to restructure its operations by broadening community participation and expanding its impact in the larger ROS ecosystem,” explains Vanessa Yamzon Orsi, CEO of the Open Source Robotics Foundation. “<a href="https://spectrum.ieee.org/alphabet-intrinsic-open-robotics-acquisition" target="_blank">The sale of [Open Source Robotics Corporation]</a> was the first step towards that vision, and the launch of the OSRA is the next big step towards that change.”<strong></strong></p><p>We had time for a brief Q&A with Orsi to better understand how this will affect the ROS community going forward. </p><p><strong>You structured the OSRA to have a mixed membership and meritocratic model like the <a href="https://www.linuxfoundation.org/" target="_blank">Linux Foundation</a>—what does that mean, exactly?</strong></p><p><strong>Vanessa Yamzon Orsi:</strong> We have modeled the OSRA to allow for paths to participation in its activities through both paid memberships (for organizations and their representatives) and the community members who support the projects through their contributions. The mixed model enables participation in the way most appropriate for each organization or individual: contributing funding as a paying member, contributing directly to project development, or both.</p><p><strong>What are some benefits for the ROS ecosystem that we can look forward to through OSRA?</strong></p><p><strong>Orsi:</strong> We expect the OSRA to benefit the OSRF’s projects in three significant ways.</p><ul><li>By providing a stable stream of funding to cover the maintenance and development of the ROS ecosystem.</li><li>By encouraging greater community involvement in development through open processes and open, meritocratic status achievement.</li><li>By bringing greater community involvement in governance and ensuring that all stakeholders have a voice in decision-making.</li></ul><p><strong>Why will this be a good thing for ROS users?</strong></p><strong>Orsi:</strong> The OSRA will ensure that ROS and the suite of open source projects under the stewardship of Open Robotics will continue to be supported and strengthened for years to come. By providing organized governance and oversight, clearer paths to community participation, and financial support, it will provide stability and structure to the projects while enabling continued development.]]></description><pubDate>Mon, 18 Mar 2024 23:27:56 +0000</pubDate><guid>https://spectrum.ieee.org/nvidia-gr00t-ros</guid><category>Humanoid robots</category><category>Jensen huang</category><category>Nvidia</category><category>Open source</category><category>Open source robotics foundation</category><category>Qualcomm</category><category>Robotics</category><category>Ros</category><category>Tsmc</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-man-in-a-black-leather-jacket-stands-in-front-of-an-enormous-screen-displaying-flowcharts-and-graphics-of-robots-during-a-pres.jpg?id=51767293&amp;width=980"></media:content></item><item><title>How Zipline Designed Its Droid Delivery System</title><link>https://spectrum.ieee.org/delivery-drone-zipline-design</link><description><![CDATA[
  303. <img src="https://spectrum.ieee.org/media-library/a-collage-of-five-images-show-iterations-of-the-system-from-sketch-to-early-prototype-to-final-design.png?id=51721232&width=1200&height=400&coordinates=0%2C213%2C0%2C214"/><br/><br/><p>About a year ago, <a href="https://www.flyzipline.com/" rel="noopener noreferrer" target="_blank">Zipline</a> introduced <a href="https://spectrum.ieee.org/zipline-drone-delivery" target="_self">Platform 2</a>, an approach to precision urban drone delivery that combines a large hovering drone with a smaller package-delivery “Droid.” Lowered on a tether from the belly of its parent Zip drone, the Droid contains thrusters and sensors (plus a 2.5- to 3.5-kilogram payload) to reliably navigate itself to a delivery area of just one meter in diameter. The Zip, meanwhile, safely remains hundreds of meters up. After depositing its payload, the Droid rises back up to the drone on its tether, and off they go.</p><p>At first glance, the sensor and thruster-packed Droid seems complicated enough to be bordering on impractical, especially when you consider the relative simplicity of other drone delivery solutions, which commonly just drop the package itself on a tether from a hovering drone. I’ve been writing about robots long enough that I’m suspicious of robotic solutions that appear to be overengineered, since that’s always a huge temptation with robotics. Like, is this really the <em>best</em> way of solving a problem, or is it just the <em>coolest</em> way?</p><p>We know the folks at Zipline pretty well, though, and they’ve certainly made creative engineering work for them, <a href="https://spectrum.ieee.org/in-the-air-with-ziplines-medical-delivery-drones" target="_self">as we saw when we visited</a> one of their “nests” in rural Rwanda. So as Zipline nears the official launch of Platform 2, we spoke with Zipline cofounder and CTO <a href="https://www.linkedin.com/in/keenanwyrobek/" rel="noopener noreferrer" target="_blank">Keenan Wyrobek</a>, Platform 2 lead <a href="https://www.linkedin.com/in/zlaszlo/" rel="noopener noreferrer" target="_blank">Zoltan Laszlo</a>, and industrial designer <a href="https://www.linkedin.com/in/vdbgreg/" rel="noopener noreferrer" target="_blank">Gregoire Vandenbussche</a> to understand exactly why they think this is the best way of solving precision urban drone delivery.</p><hr/><p>First, a quick refresher. Here’s what the delivery sequence with the vertical takeoff and landing (VTOL) Zip and the Droid looks like:</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  304. <img alt="" class="rm-shortcode" data-rm-shortcode-id="589260fece8dc12da92042ed0c6d5dc8" data-rm-shortcode-name="rebelmouse-image" id="a4da6" loading="lazy" src="https://spectrum.ieee.org/media-library/image.gif?id=51730047&width=980"/>
  305. </p><p>The system has a service radius of about 16 kilometers (10 miles), and it can make deliveries to outdoor spaces of “any meaningful size.” Visual sensors on the Droid find the delivery site and check for obstacles on the way down, while the thrusters compensate for wind and movement of the parent drone. Since the big VTOL Zip remains well out of the way, deliveries are fast, safe, and quiet. But it takes two robots to pull off the delivery rather than just one.</p><p>On the other end is the infrastructure required to load and charge these drones. Zipline’s Platform 1 drones require a dedicated base with relatively large launch and recovery systems. With Platform 2, the drone drops the Droid into a large chute attached to the side of a building so that the Droid can be reloaded, after which it pulls the Droid out again and flies off to make the delivery:</p><p class="shortcode-media shortcode-media-youtube">
  306. <span class="rm-shortcode" data-rm-shortcode-id="5da3422218bfaf1f52ee8c6ff4cac7fa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2w-c0wA1uNc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  307. </p><p>“We think it’s the best delivery experience. Not the best drone delivery experience, the best <em>delivery</em> experience,” Zipline’s Wyrobek tells us. That may be true, but the experience also has to be practical and sustainable for Zipline to be successful, so we asked the Zipline team to explain the company’s approach to precision urban delivery.</p><h5>Zipline on: </h5><ul><li><a href="#zip1">Approach to drone delivery</a></li><li><a href="#zip2">Concept for Droid design</a></li><li><a href="#zip3">Designing for cuteness</a></li><li><a href="#zip4">Making pinpoint deliveries</a></li></ul><p class="rm-anchors" id="zip1"><strong><em>IEEE Spectrum</em>: What problems is Platform 2 solving, and why is it necessary to solve those problems in this specific way?</strong></p><p><strong>Keenan Wyrobek:</strong> There are literally billions of last-mile deliveries happening every year in [the United States] alone, and our customers have been asking for years for something that can deliver to their homes. With our long-range platform, Platform 1, we can float a package down into your yard on a parachute, but that takes some space. And so one half of the big design challenge was how to get our deliveries precise enough, while the other half was to develop a system that will bolt on to existing facilities, which Platform 1 doesn’t do.</p><p><strong>Zoltan Laszlo:</strong> Platform 1 can deliver within an area of about two parking spaces. As we started to actually look at the data in urban areas using publicly available lidar surveys, we found that two parking spaces serves a bit more than half the market. We want to be a universal delivery service.</p><p>But with a delivery area of 1 meter in diameter, which is what we’re actually hitting in our delivery demonstrations for Platform 2, that gets us into the high 90s for the percentage of people that we can deliver to.</p><p><strong>Wyrobek: </strong>When we say “urban,” what we’re talking about is three-story sprawl, which is common in many large cities around the world. And we wanted to make sure that our deliveries could be precise enough for places like that.</p><p><strong>There are some existing solutions for precision aerial delivery that have been operating at scale with some success, typically by winching packages to the ground from a VTOL drone. Why develop your own technique rather than just going with something that has already been shown to work?</strong></p><p><strong>Laszlo: </strong>Winching down is the natural extension of being able to hover in place, and when we first started, we were like, “Okay, we’re just going to winch down. This will be great, super easy.”</p><p>So we went to our test site in Half Moon Bay [on the Northern California coast] and built a quick prototype of a winch system. But as soon as we lowered a box down on the winch, the wind started blowing it all over the place. And this was from the height of our lift, which is less than 10 meters up. We weren’t even able to stay inside two parking spaces, which told us that something was broken with our approach.</p><p>The aircraft can sense the wind, so we thought we’d be able to find the right angle for the delivery and things like that. But the wind where the aircraft is may be different from the wind nearer the ground. We realized that unless we’re delivering to an open field, a package that does not have active wind compensation is going to be very hard to control. We’re targeting high-90th percentile in terms of availability due to weather—even if it’s a pretty blustery day, we still want to be able to deliver. </p><p><strong>Wyrobek: </strong>This was a wild insight when we really understood that unless it’s a perfect day, using a winch actually takes almost as much space as we use for Platform 1 floating a package down on a parachute. </p><p class="shortcode-media shortcode-media-youtube">
  308. <span class="rm-shortcode" data-rm-shortcode-id="be99a7a52d1e19e33079a58ddc3d714e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vsdjzzKBap8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  309. <small class="image-media media-caption" placeholder="Add Photo Caption...">Engineering test footage of Zipline’s Platform 2 docking system at their test site in Half Moon Bay in California.</small>
  310. </p><p class="rm-anchors" id="zip2"><strong>How did you arrive at this particular delivery solution for Platform 2?</strong></p><p><strong>Laszlo: </strong>I don’t remember whose idea it was, but we were playing with a bunch of different options. Putting thrusters on the tether wasn’t even the craziest idea. We had our Platform 1 aircraft, which was reliable, so we started with looking at ways to just make that aircraft deliver more precisely. There was only so much more we could do with passive parachutes, but what does an active, steerable parachute look like? There are remote-controlled paragliding toys out there that we tested, with mixed results—the challenge is to minimize the smarts in your parachute, because there’s a chance you won’t get it back. So then we started some crazy brainstorming about how to reliably retrieve the parachute. </p><p><strong>Wyrobek: </strong>One idea was that the parachute would come with a self-return envelope that you could stick in the mail. Another idea was that the parachute would be steered by a little drone, and when the package got dropped off, the drone would reel the parachute in and then fly back up into the Zip. </p><p><strong>Laszlo: </strong>But when we realized that the package has to be able to steer itself, that meant the Zip doesn’t need to be active. The Zip doesn’t need to drive the package, it doesn’t even need to see the package, it just needs to be a point up in the sky that’s holding the package. That let us move from having the Zip 50 feet up, to having it 300 feet up, which is important because it’s a big, heavy drone that we don’t want in our customer’s space. And the final step was adding enough smarts to the thing coming down into your space to figure out where exactly to deliver to, and of course to handle the wind.</p><p class="rm-anchors" id="zip3"><strong>Once you knew what you needed to do, how did you get to the actual design of the droid?</strong></p><p><strong>Gregoire Vandenbussche: </strong>Zipline showed me pretty early on that they were ready to try crazy ideas, and from my experience, that’s extremely rare. When the idea of having this controllable tether with a package attached to it came up, one of my first thoughts was that from a user standpoint, nothing like this exists. And the difficulty of designing something that doesn’t exist is that people will try to identify it according to what they know. So we had to find a way to drive that thinking towards something positive.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  311. <img alt="Two pen sketches side by side" class="rm-shortcode" data-rm-shortcode-id="cf538f94c3bfc00dd7d66eb1f95e5686" data-rm-shortcode-name="rebelmouse-image" id="7d601" loading="lazy" src="https://spectrum.ieee.org/media-library/two-pen-sketches-side-by-side.jpg?id=51729614&width=980"/>
  312. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-881043" placeholder="Add Photo Caption..." spellcheck="false">Early Droid concept sketches by designer Gregoire Vandenbussche featured legs that would fold up after delivery.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Zipline</small></p><p>First we thought about putting words onto it, like “hello” or something, but the reality is that we’re an international company and we need to be able to work everywhere. But there’s one thing that’s common to everyone, and that’s emotions—people are able to recognize certain things as being approachable and adorable, so going in that direction felt like the right thing to do. However, being able to design a robot that gives you that kind of emotion but also flies was quite a challenge. We took inspiration from other things that move in 3D, like sea mammals—things that people will recognize even without thinking about it.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  313. <img alt="Three pen sketches show a dolphin, the front of the droid design, and a more pulled back sketch." class="rm-shortcode" data-rm-shortcode-id="c851d6f5ef48fbc89ba497aea9f85e6d" data-rm-shortcode-name="rebelmouse-image" id="c34e9" loading="lazy" src="https://spectrum.ieee.org/media-library/three-pen-sketches-show-a-dolphin-the-front-of-the-droid-design-and-a-more-pulled-back-sketch.jpg?id=51729727&width=980"/>
  314. <small class="image-media media-caption" placeholder="Add Photo Caption...">Vandenbussche’s sketches show how the design of the Droid was partially inspired by dolphins.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Zipline</small></p><p><strong>Now that you say it, I can definitely see the sea mammal inspiration in the drone.</strong><br/></p><p><strong>Vandenbussche:</strong> There are two aspects of sea mammals that work really well for our purpose. One of them is simplicity of shape; sea mammals don’t have all that many details. Also, they tend to be optimized for performance. Ultimately, we need that, because we need to be able to fly. And we need to be able to convey to people that the drone is under control. So having something you can tell is moving forward or turning or moving away was very helpful.</p><p><strong>Wyrobek: </strong>One other insight that we had is that Platform 2 needs to be small to fit into tight delivery spaces, and it needs to feel small when it comes into your personal space, but it also has to be big enough inside to be a useful delivery platform. We tried to leverage the chubby but cute look that baby seals have going on.</p><p>The design journey was pretty fun. Gregoire would spend two or three days coming up with a hundred different concept sketches. We’d do a bunch of brainstorming, and then Gregoire would come up with a whole bunch of new directions, and we’d keep exploring. To be clear, no one would describe our functional prototypes from back then as “cute.” But through all this iteration eventually we ended up in an awesome place.</p><p><strong>And how do you find that place? When do you know that your robot is just cute enough?</strong></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  315. <img alt="A rendering of a grey and white device with a red tether on top." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="56b46fd757c867a311429401b0a26619" data-rm-shortcode-name="rebelmouse-image" id="893ed" loading="lazy" src="https://spectrum.ieee.org/media-library/a-rendering-of-a-grey-and-white-device-with-a-red-tether-on-top.jpg?id=51729818&width=980" style="max-width: 100%"/>
  316. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-336650" placeholder="Add Photo Caption..." spellcheck="false" style="max-width: 100%;">One iteration of the Droid, Vandenbussche determined, looked too technical and intimidating.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Zipline</small></p><p><strong>Vandenbussche:</strong> It’s finding the balance around what’s realistic and functional. I like to think of industrial design as taking all of the constraints and kind of playing Tetris with them until you get a result that ideally satisfies everybody. I remember at one point looking at where we were, and feeling like we were focusing too much on performance and missing that emotional level. So, we went back a little bit to say, where can we bring this back from looking like a highly technical machine to something that can give you a feeling of approachability?</p><p><strong>Laszlo:</strong> We spent a fair bit of time on the controls and behaviors of the droid to make sure that it moves in a very approachable and predictable way, so that you know where it’s going ahead of time and it doesn’t behave in unexpected ways. That’s pretty important for how people perceive it.</p><p>We did a lot of work on how the droid would descend and approach the delivery site. One concept had the droid start to lower down well before the Zip was hovering directly overhead. We had simulations and renderings, and it looked great. We could do the whole delivery in barely over 20 seconds. But even if the package is far away from you, it still looks scary because [the Zip is] moving faster than you would expect, and you can’t tell exactly where it’s going to deliver. So we deleted all that code, and now it just comes straight down, and people don’t back away from the Droid anymore. They’re just like, “Oh, okay, cool.”</p><p class="rm-anchors" id="zip4"><strong>How did you design the thrusters to enable these pinpoint deliveries?</strong></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  317. <img alt="An object in the air in a barren landscape." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="d185486b63263ea456d195b430275952" data-rm-shortcode-name="rebelmouse-image" id="800e6" loading="lazy" src="https://spectrum.ieee.org/media-library/an-object-in-the-air-in-a-barren-landscape.jpg?id=51729862&width=980" style="max-width: 100%"/>
  318. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-562587" placeholder="Add Photo Caption..." spellcheck="false" style="max-width: 100%;">Early tests of the Droid centered around a two-fan version.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Zipline</small></p><p><strong>Laszlo: </strong>With the thrusters, we knew we wanted to maximize the size of at least one of the fans, because we were almost always going to have to deal with wind. We’re trying to be as quiet as we can, so the key there is to maximize the area of the propeller. Our leading early design was just a box with two fans on it:</p><p>Two fans with unobstructed flow meant that it moved great, but the challenge of fitting it inside another aircraft was going to be painful. And it <em>looked</em> big, even though it wasn’t actually that big.</p><p><strong>Vandenbussche: </strong>It was also pretty intimidating when you had those two fans facing you and the Droid coming toward you.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  319. <img alt="Side by side images show a box with one white fan and a red box with two white fans." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="d172a7ee74891a25426187d029081aff" data-rm-shortcode-name="rebelmouse-image" id="22890" loading="lazy" src="https://spectrum.ieee.org/media-library/side-by-side-images-show-a-box-with-one-white-fan-and-a-red-box-with-two-white-fans.jpg?id=51729964&width=980" style="max-width: 100%"/>
  320. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">A single steerable fan [left] that acted like a rudder was simpler in some ways, but as the fan got larger, the gyroscopic effects became hard to manage. Instead of one steerable fan, how about two steerable fans? [right] Omnidirectional motion was possible with this setup, but packaging it inside of a Zip didn’t work.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Zipline</small></p><p><strong>Laszlo: </strong>We then started looking at configurations with a main fan and a second smaller fan, with the bigger fan at the back pushing forward and the smaller fan at the front providing thrust for turning. The third fan we added relatively late because we didn’t want to add it at all. But we found that [with two fans] the droid would have to spin relatively quickly to align to shifting winds, whereas with a third fan we can just push sideways in the direction that we need.</p><p><strong>What kind of intelligence does the Droid have?</strong></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  321. <img alt="A rendering of a rectangular object with thrusters" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="bfe4dc97b320d54ef2b8b762cf894b33" data-rm-shortcode-name="rebelmouse-image" id="8bdbc" loading="lazy" src="https://spectrum.ieee.org/media-library/a-rendering-of-a-rectangular-object-with-thrusters.jpg?id=51729830&width=980" style="max-width: 100%"/>
  322. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-63084" placeholder="Add Photo Caption..." spellcheck="false" style="max-width: 100%;">The current design of Zipline’s Platform 2 Droid is built around a large thruster in the rear and two smaller thrusters at the front and back.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Zipline</small></p><p><strong>Wyrobek: </strong>The Droid has its own little autopilot, and there’s a very simple communications system between the two vehicles. You may think that it’s a really complex coordinated control problem, but it’s not: The Zip just kind of hangs out, and the Droid takes care of the delivery. The sensing challenge is for the Droid to find trees and powerlines and things like that, and then find a good delivery site.</p><p><strong>Was there ever a point at which you were concerned that the size and weight and complexity would not be worth it?</strong></p><p><strong>Wyrobek: </strong>Our mindset was to fail fast, to try things and do what we needed to do to convince ourselves that it wasn’t a good path. What’s fun about this kind of iterative process is oftentimes, you try things and you realize that actually, this is better than we thought. </p><p><strong>Laszlo: </strong>We first thought about the Droid as a little bit of a tax, in that it’s costing us extra weight. But if your main drone can stay high enough up that it avoids trees and buildings, then it can just float around up there. If it gets pushed around by the wind, it doesn’t matter because the Droid can compensate. </p><p><strong>Wyrobek: </strong>Keeping the Zip at altitude is a big win in many ways. It doesn’t have to spend energy station-keeping, descending, and then ascending again. We just do that with the much smaller Droid, which also makes the hovering phase much shorter. It’s also much more efficient to control the small droid than the large Zip. And having all of the sensors on the Droid very close to the area that you’re delivering to makes that problem easier as well. It may look like a more complex system from the outside, but from the inside, it’s basically making all the hardest problems much easier.</p><div class="horizontal-rule"></div><p>Over the past year, Zipline has set up a bunch of partnerships to make residential deliveries to consumers using Droid starting in 2024, including prescriptions from <a href="https://consultqd.clevelandclinic.org/delivering-drugs-via-drone" target="_blank">Cleveland Clinic</a> in Ohio, medical products from <a href="https://www.flyzipline.com/newsroom/news/announcements/wellspan-health-will-bring-innovative-medical-drone-delivery-to-pennsylvania-with-global-logistics-leader-zipline" rel="noopener noreferrer" target="_blank">WellSpan Health</a> in Pennsylvania, tasty food from <a href="https://www.flyzipline.com/newsroom/news/announcements/better-food-delivery-drone-with-mendocino-farms" rel="noopener noreferrer" target="_blank">Mendocino Farms</a> in California, and a little bit of everything from <a href="https://corporate.walmart.com/news/2024/01/09/sky-high-ambitions-walmart-to-make-largest-drone-delivery-expansion-of-any-us-retailer" rel="noopener noreferrer" target="_blank">Walmart</a> starting in Dallas. Zipline’s plan is to kick things off with Platform 2 later this year.</p>]]></description><pubDate>Fri, 15 Mar 2024 20:08:27 +0000</pubDate><guid>https://spectrum.ieee.org/delivery-drone-zipline-design</guid><category>Zipline</category><category>Drones</category><category>Delivery drones</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-collage-of-five-images-show-iterations-of-the-system-from-sketch-to-early-prototype-to-final-design.png?id=51721232&amp;width=980"></media:content></item><item><title>The Heart and the Chip: What Could Go Wrong?</title><link>https://spectrum.ieee.org/the-heart-and-the-chip-what-could-go-wrong</link><description><![CDATA[
  323. <img src="https://spectrum.ieee.org/media-library/an-orange-book-cover-for-the-heart-and-the-chip-by-daniela-rus-and-gregory-mone.jpg?id=51715682&width=1200&height=400&coordinates=0%2C292%2C0%2C292"/><br/><br/><p>
  324. <strong>At several points</strong> in this book I’ve mentioned the fictional character Tony Stark, who uses technology to transform himself into the superhero Iron Man. To me this character is a tremendous inspiration, yet I often remind myself that in the story, he begins his career as an MIT-­trained weapons manufacturer and munitions developer. In the 2008 film
  325. <em>Iron Man</em>, he changes his ways because he learns that his company’s specialized weapons are being used by terrorists.
  326. </p><div class="ieee-sidebar-medium">
  327. <p>
  328. Legendary MIT roboticist
  329. <a href="https://danielarus.csail.mit.edu/" target="_blank">Daniela Rus</a> has published a new book called <em><a href="https://mitpressbookstore.mit.edu/book/9781324050230" rel="noopener noreferrer" target="_blank">The Heart and the Chip: Our Bright Future with Robots</a>. </em>“There is a robotics revolution underway,” Rus says in the book’s introduction, “one that is already causing massive changes in our society and in our lives.” She’s quite right, of course, and although some of us have been feeling that this is true for decades, it’s arguably more true right now than it ever has been. But robots are difficult and complicated, and the way that their progress is intertwined with the humans that make them and work with them means that these changes won’t come quickly or easily. Rus’ experience gives her a deep and nuanced perspective on robotics’ past and future, and we’re able to share a little bit of that with you here.
  330. </p>
  331. <p>
  332. The following excerpt is from Chapter 14, entitled “What Could Go Wrong?” Which, let’s be honest, is the right question to ask (and then attempt to conclusively answer) whenever you’re thinking about sending a robot out into the real world.
  333. <em>—Evan Ackerman</em>
  334. </p>
  335. <p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  336. <img alt="Portrait of a smiling woman with wavy brown hair and brown eyes." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="8b66c638de3ece153cd4ea69ad257d7f" data-rm-shortcode-name="rebelmouse-image" id="a5919" loading="lazy" src="https://spectrum.ieee.org/media-library/portrait-of-a-smiling-woman-with-wavy-brown-hair-and-brown-eyes.jpg?id=51715699&width=980" style="max-width: 100%"/>
  337. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-647539" placeholder="Add Photo Caption..." spellcheck="false" style="max-width: 100%;">Daniela Rus: Should roboticists consider subscribing to their own Hippocratic oath? </small>
  338. </p>
  339. </div><p>
  340. Remember, robots are tools. Inherently, they are neither good nor bad; it’s how we choose to use them that matters. In 2022, aerial drones were used as weapons on both sides of devastating wars. Anyone can purchase a drone, but there are regulations for using drones that vary between and within different countries. In the United States, the Federal Aviation Administration requires that all drones be registered, with a few exceptions, including toy models weighing less than 250 grams. The rules also depend on whether the drone is flown for fun or for business. Regardless of regulations, anyone could use a flying robot to inflict harm, just like anyone can swing a hammer to hurt someone instead of driving a nail into a board. Yet drones are also being used to deliver critical medical supplies in hard-­to-­reach areas, track the health of forests, and help scientists like Roger Payne monitor and advocate for at-­risk species. My group collaborated with the modern dance company Pilobolus to stage the first theatrical performance featuring a mix of humans and drones back in 2012, with a robot called Seraph. So, drones can be dancers, too. In Kim Stanley Robinson’s prescient science fiction novel
  341. <em>The Ministry for the Future</em>, a swarm of unmanned aerial vehicles is deployed to crash an airliner. I can imagine a flock of these mechanical birds being used in many good ways, too. At the start of its war against Ukraine, Russia limited its citizens’ access to unbiased news and information in hopes of controlling and shaping the narrative around the conflict. The true story of the invasion was stifled, and I wondered whether we could have dispatched a swarm of flying video screens capable of arranging themselves into one giant aerial monitor in the middle of popular city squares across Russia, showing real footage of the war, not merely clips approved by the government. Or, even simpler: swarms of flying digital projectors could have broadcasted the footage on the sides of buildings and walls for all to see. If we had deployed enough, there would have been too many of them to shut down.
  342. </p><p class="pull-quote">
  343. There may be variations of Tony Stark passing through my university or the labs of my colleagues around the world, and we need to do whatever we can to ensure these talented young individuals endeavor to have a positive impact on humanity.
  344. </p><p>
  345. The Tony Stark character is shaped by his experiences and steered toward having a positive impact on the world, but we cannot wait for all of our technologists to endure harrowing, life-­changing experiences. Nor can we expect everyone to use these intelligent machines for good once they are developed and moved out into circulation. Yet that doesn’t mean we should stop working on these technologies—­the potential benefits are too great. What we can do is think harder about the consequences and put in place the guardrails to ensure positive benefits. My contemporaries and I can’t necessarily control how these tools are used in the world, but we can do more to influence the people making them.
  346. </p><p>
  347. There may be variations of Tony Stark passing through my university or the labs of my colleagues around the world, and we need to do whatever we can to ensure these talented young individuals endeavor to have a positive impact on humanity. We absolutely must have diversity in our university labs and research centers, but we may be able to do more to shape the young people who study with us. For example, we could require study of the Manhattan Project and the moral and ethical quandaries associated with the phenomenal effort to build and use the atomic bomb. At this point, ethics courses are not a widespread requirement for an advanced degree in robotics or AI, but perhaps they should be. Or why not require graduates to swear to a robotics-­ and AI-­attuned variation on the Hippocratic oath?
  348. </p><p>
  349. The oath comes from an early Greek medical text, which may or may not have been written by the philosopher Hippocrates, and it has evolved over the centuries. Fundamentally, it represents a standard of medical ethics to which doctors are expected to adhere. The most famous of these is the promise to do no harm, or to avoid
  350. <em>intentional</em> wrongdoing. I also applaud the oath’s focus on committing to the community of doctors and the necessity of maintaining the sacred bond between teacher and pupils. The more we remain linked as a robotics community, the more we foster and maintain our relationships as our students move out into the world, the more we can do to steer the technology toward a positive future. Today the Hippocratic oath is not a universal requirement for certification as a doctor, and I do not see it functioning that way for roboticists, either. Nor am I the first roboticist or AI leader to suggest this possibility. But we should seriously consider making it standard practice.
  351. </p><p>
  352. In the aftermath of the development of the atomic bomb, when the potential of scientists to do harm was made suddenly and terribly evident, there was some discussion of a Hippocratic oath for scientific researchers. The idea has resurfaced from time to time and rarely gains traction. But science is fundamentally about the pursuit of knowledge; in that sense it is pure. In robotics and AI, we are building
  353. <em>things</em> that will have an impact on the world and its people and other forms of life. In this sense, our field is somewhat closer to medicine, as doctors are using their training to directly impact the lives of individuals. Asking technologists to formally recite a version of the Hippocratic oath could be a way to continue nudging our field in the right direction, and perhaps serve as a check on individuals who are later asked to develop robots or AI expressly for nefarious purposes.
  354. </p><p>
  355. Of course, the very idea of what is good or bad, in terms of how a robot is used, depends on where you sit. I am steadfastly opposed to giving armed or weaponized robots autonomy. We cannot and should not trust machine intelligences to make decisions about whether to inflict harm on a person or group of people on their own. Personally, I would prefer that robots never be used to do harm to anyone, but this is now unrealistic. Robots are being used as tools of war, and it is our responsibility to do whatever we can to shape their ethical use. So, I do not separate or divorce myself from reality and operate solely in some utopian universe of happy, helpful robots. In fact, I teach courses on artificial intelligence to national security officials and advise them on the strengths, weaknesses, and capabilities of the technology. I see this as a patriotic duty, and I’m honored to be helping our leaders understand the limitations, strengths, and possibilities of robots and other AI-­enhanced physical systems—­what they can and cannot do, what they should and should not do, and what I believe they must do.
  356. </p><p>
  357. Ultimately, no matter how much we teach and preach about the limitations of technology, the ethics of AI, or the potential dangers of developing such powerful tools, people will make their own choices, whether they are recently graduated students or senior national security leaders. What I hope and teach is that we should choose to do good. Despite the efforts of life extension companies, we all have a limited time on this planet, what the scientist Carl Sagan called our “pale blue dot,” and we should do whatever we can to make the most of that time and have a positive impact on our beautiful environment, and the many people and other species with which we share it. My decades-­long quest to build more intelligent and capable robots has only strengthened my appreciation for—­no, wonder at—­the marvelous creatures that crawl, walk, swim, run, slither, and soar across and around our planet, and the fantastic plants, too. We should not busy ourselves with the work of developing robots that can eliminate these cosmically rare creations. We should focus instead on building technologies to preserve them, and even help them thrive. That applies to all living entities, including the one species that is especially concerned about the rise of intelligent machines.
  358. </p><div class="horizontal-rule">
  359. </div><p>
  360. <em>Excerpted from “<a href="https://mitpressbookstore.mit.edu/book/9781324050230" target="_blank">The Heart and the Chip: Our Bright Future with Robots</a>”. Copyright 2024 by Daniela Rus, Gregory Mone. Used with permission of the publisher, W.W. Norton & Company. All rights reserved.</em>
  361. </p>]]></description><pubDate>Fri, 15 Mar 2024 19:30:02 +0000</pubDate><guid>https://spectrum.ieee.org/the-heart-and-the-chip-what-could-go-wrong</guid><category>Daniela rus</category><category>Book excerpt</category><category>Robotics</category><category>Artificial intelligence</category><category>Mit</category><dc:creator>Daniela Rus</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/an-orange-book-cover-for-the-heart-and-the-chip-by-daniela-rus-and-gregory-mone.jpg?id=51715682&amp;width=980"></media:content></item><item><title>Video Friday: Many Quadrupeds</title><link>https://spectrum.ieee.org/video-friday-many-quadrupeds</link><description><![CDATA[
  362. <img src="https://spectrum.ieee.org/media-library/image.gif?id=51743938&width=1200&height=400&coordinates=0%2C92%2C0%2C92"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://humanrobotinteraction.org/2024/">HRI 2024</a>: 11–15 March 2024, BOULDER, COLO.</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="dds-e1rxi9u">How many quadrupeds can you control with one single locomotion policy? Apparently, the answer is “all of the quadrupeds.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="83c2cea1bdbf4dacbdfff21bb8ef110b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dDS-E1rxI9U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Look for this at ICRA 2024 in a couple of months!</p><p>[ <a href="https://miladshafiee.github.io/ManyQuadrupeds/">EPFL</a> ]</p><p>Thanks, Milad!</p><div class="horizontal-rule"></div><p id="sq1qzb5banw">Very impressive performance from Figure 01, I think, although as is frequently the case, it’s hard to tell exactly how impressive without more information about exactly what’s going on here.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0e0ce1cbd2866e9a4767118fcee07af4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Sq1QZB5baNw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pjwvf90l4cg"><em>That awesome ANYmal Parkour research is now published, which means that there’s a new video, well worth watching all the way to the end.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="499783f299f66e32d2e9839d3a98640e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PjWvf90l4cg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.adi7566">Science</a> ] via [ <a href="https://sites.google.com/leggedrobotics.com/agile-navigation">ETHZ RSL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="480cauhxre0">Robotic vision can be pretty tricky when you’re cooking, because things can significantly change how they look over time, like with melting butter or an egg being fried. Some new research is tackling this, using a (now ancient?) PR2.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="93cb11f696b597d1dd620bd8cb0e6e94" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/480caUHXrE0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://haraduka.github.io/continuous-state-recognition/">JSK Lab</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ghm89iyyc2k"><em>Filmed in January of 2020, this video shows Atlas clearing debris and going through a doorway. Uses a combination of simple footstep planning, teleoperation, and autonomous behaviors through a single virtual reality operator interface. Robot built by Boston Dynamics for the DARPA Robotics Challenge in 2013.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3c8f47ea714ce5f8d5abc1f9df126958" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GHm89iYyc2k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robots.ihmc.us/">IHMC</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ghm89iyyc2k">Sustainable fashion enabled by smart textiles shaped by a robot and a heat gun. Multiple styles, multiple sizes, all in one garment!</p><p class="shortcode-media shortcode-media-vimeo">
  363. <iframe class="rm-shortcode" data-rm-shortcode-id="81d5d67515a206c6ccc2cb7b830eae26" frameborder="0" height="480" scrolling="no" src="https://player.vimeo.com/video/893959788" width="100%"></iframe>
  364. </p><p>[ <a href="https://news.mit.edu/2024/4d-knit-dress-future-of-fashion-0307">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rczhlfs_pdm">Video of Boston Dynamics’ Stretch from MODEX, with a little sneak peak at the end of what the robot’s next warehouse task might be.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5e22733e975a07b8d19fc4536f6c3d99" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rCZHlfS_pdM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/stretch/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g8ah0fuieai"><em>Pickle Robots autonomously unload trucks and import containers.  The system is in production use at customer warehoues handling floor-loaded freight at human scale or better.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5a664a474f6dc9cb6a3d77893ba038a5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G8AH0fUIeaI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://picklerobot.com/">Pickle Robot</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="o2rhnymfd54"><em>The ROBDEKON robotics competence center is dedicated to the development of robotic systems for hazardous environments that pose a potential risk to humans. As part of the consortium, the FZI Research Center for Information Technology developed robotic systems, technologies, and Artificial Intelligence (AI) methods that can be used to handle hazardous materials–for example, to sort potentially dangerous used batteries for recycling.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7f60731eb77cdd846254642bc184fa06" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/o2RhNyMFd54?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robdekon.de/en">FZI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nof6r7y4wde"><em>This research project with Ontario Power Generation involves adapting Boston Dynamics Spot’s localization system to longterm changes in the environment. During this testing, we mounted a GoPro camera on the back of Spot and took a video of each walk for a year from Spot’s point of view. We put the footage together as a moving time-lapse video where the day changes as Spot completes the Autowalk around the campus.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="252b8fd2bcad830086dacebaa3a7741c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Nof6R7y4wDE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://mars.engineering.uoit.ca/">MARS Lab</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 15 Mar 2024 15:45:48 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-many-quadrupeds</guid><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/image.gif?id=51743938&amp;width=980"></media:content></item><item><title>Covariant Announces a Universal AI Platform for Robots</title><link>https://spectrum.ieee.org/covariant-foundation-model</link><description><![CDATA[
  365. <img src="https://spectrum.ieee.org/media-library/one-robotics-foundation-model-can-control-many-different-robots-manipulating-many-different-objects.gif?id=51702916&width=1200&height=400&coordinates=0%2C92%2C0%2C92"/><br/><br/><p>When <em>IEEE Spectrum</em> <a href="https://spectrum.ieee.org/covariant-ai-gigantic-neural-network-to-automate-warehouse-picking" target="_self">first wrote about Covariant</a> in 2020, it was a <a href="https://spectrum.ieee.org/ai-startup-embodied-intelligence" target="_self">new-ish</a> robotics startup looking to apply robotics to warehouse picking at scale through the magic of a single end-to-end neural network. At the time, Covariant was focused on this picking use case, because it represents an application that could provide immediate value—warehouse companies pay Covariant for its robots to pick items in their warehouses. But for Covariant, the exciting part was that picking items in warehouses has, over the last four years, yielded a massive amount of real-world manipulation data<a rel="noopener noreferrer" target="_blank"></a>—and you can probably guess where this is going.<a href="#_msocom_1" rel="noopener noreferrer" target="_blank"></a></p><p>Today, Covariant is announcing <a href="https://covariant.ai/insights/introducing-rfm-1-giving-robots-human-like-reasoning-capabilities/" rel="noopener noreferrer" target="_blank">RFM-1</a>, which the company describes as a robotics foundation model that gives robots the “human-like ability to reason.” That’s from the press release, and while I wouldn’t necessarily read too much into “human-like” or “reason,” what Covariant has going on here is pretty cool.</p><p>“Foundation model” means that RFM-1 can be trained on more data to do more things—at the moment, it’s all about warehouse manipulation because that’s what it’s been trained on, but its capabilities can be expanded by feeding it more data. “Our existing system is already good enough to do very fast, very variable pick and place,” says Covariant co-founder <a href="https://www.linkedin.com/in/pieterabbeel/" rel="noopener noreferrer" target="_blank">Pieter Abbeel</a>. “But we’re now taking it quite a bit further. Any task, any embodiment—that’s the long-term vision. Robotics foundation models powering billions of robots across the world.” From the sound of things, Covariant’s business of deploying a large fleet of warehouse automation robots was the fastest way for them to collect the tens of millions of trajectories (how a robot moves during a task) that they needed to train the 8 billion parameter RFM-1 model.</p><p class="shortcode-media shortcode-media-youtube">
  366. <span class="rm-shortcode" data-rm-shortcode-id="5e0cdcd9d53892126d4f98b7e7142079" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/INp7I3Efspc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  367. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=INp7I3Efspc" target="_blank">C</a>ovariant</small>
  368. </p><p>“The only way you can do what we’re doing is by having robots deployed in the world collecting a ton of data,” says Abbeel. “Which is what allows us to train a robotics foundation model that’s uniquely capable.”</p><p>There have been other attempts at this sort of thing: The <a href="https://spectrum.ieee.org/global-robotic-brain" target="_self">RTX project</a> is one recent example. But while RT-X depends on research labs sharing what data they have to create a dataset that’s large enough to be useful, Covariant is doing it alone, thanks to its fleet of warehouse robots. “RT-X is about a million trajectories of data,” Abbeel says, “but we’re able to surpass it because we’re getting a million trajectories every few weeks.”</p><p class="pull-quote">
  369. “By building a valuable picking robot that’s deployed across 15 countries with dozens of customers, we essentially have a data collection machine.” <strong>—Pieter Abbeel, Covariant</strong>
  370. </p><p>You can think of the current execution of RFM-1 as a prediction engine for <a href="https://spectrum.ieee.org/covariant-ai-gigantic-neural-network-to-automate-warehouse-picking" target="_self">suction-based object manipulation</a> in warehouse environments. The model incorporates still images, video, joint angles, force reading, suction cup strength—everything involved in the kind of robotic manipulation that Covariant does. All of these things are interconnected within RFM-1, which means that you can put any of those things into one end of RFM-1, and out of the other end of the model will come a prediction. That prediction can be in the form of an image, a video, or a series of commands for a robot.</p><p>What’s important to understand about all of this is that RFM-1 isn’t restricted to picking only things it’s seen before, or only working on robots it has direct experience with. This is what’s nice about foundation models—they can generalize within the domain of their training data, and it’s how Covariant has been able to scale their business as successfully as they have, by not having to retrain for every new picking robot or every new item. What’s counter-intuitive about these large models  is that they’re actually better at dealing with new situations than models that are trained <em>specifically</em> for those situations.</p><p>For example, let’s say you want to train a model to drive a car on a highway. The question, Abbeel says, is whether it would be worth your time to train on other kinds of driving anyway. The answer is yes, because highway driving is sometimes <em>not</em> highway driving. There will be accidents or rush hour traffic that will require you to drive differently. If you’ve also trained on driving on city streets, you’re effectively training on highway edge cases, which will come in handy at some point and improve performance overall. With RFM-1, it’s the same idea: Training on lots of different kinds of manipulation—different robots, different objects, and so on—means that any single kind of manipulation will be that much more capable.</p><p>In the context of generalization, Covariant talks about RFM-1’s ability to “understand” its environment. This can be a tricky word with AI, but what’s relevant is to ground the meaning of “understand” in what RFM-1 is capable of. For example, you don’t need to <em>understand</em> physics to be able to catch a baseball, you just need to have a lot of experience catching baseballs, and that’s where RFM-1 is at. You could <em>also</em> reason out how to catch a baseball with no experience but an understanding of physics, and RFM-1 is <em>not</em> doing this, which is why I hesitate to use the word “understand” in this context.</p><p>But this brings us to another interesting capability of RFM-1: it operates as a very effective, if constrained, simulation tool. As a prediction engine that outputs video, you can ask it to generate what the next couple seconds of an action sequence will look like, and it’ll give you a result that’s both realistic and accurate, being grounded in all of its data. The key here is that RFM-1 can effectively simulate objects that are challenging to simulate traditionally, like floppy things.</p><p>Covariant’s Abbeel explains that the “world model” that RFM-1 bases its predictions on is effectively a learned physics engine. “Building physics engines turns out to be a very daunting task to really cover every possible thing that can happen in the world,” Abbeel says. “Once you get complicated scenarios, it becomes very inaccurate, very quickly, because people have to make all kinds of approximations to make the physics engine run on a computer. We’re just doing the large-scale data version of this with a world model, and it’s showing really good results.”</p><p>Abbeel gives an example of asking a robot to simulate (or predict) what would happen if a cylinder is placed vertically on a conveyor belt. The prediction accurately shows the cylinder falling over and rolling when the belt starts to move—not because the cylinder is being simulated, but because RFM-1 has seen a lot of things being placed on a lot of conveyor belts.<br/></p><p class="pull-quote">
  371. “Five years from now, it’s not unlikely that what we are building here will be the only type of simulator anyone will ever use.” <strong>—Pieter Abbeel, Covariant</strong>
  372. </p><p>
  373. This only works if there’s the right kind of data for RFM-1 to train on, so unlike most simulation environments, it can’t currently generalize to completely new objects or situations. But Abbeel believes that with enough data, useful world simulation will be possible. “Five years from now, it’s not unlikely that what we are building here will be the only type of simulator anyone will ever use. It’s a more capable simulator than one built from the ground up with collision checking and finite elements and all that stuff. All those things are so hard to build into your physics engine in any kind of way, not to mention the renderer to make things look like they look in the real world—in some sense, we’re taking a shortcut.”
  374. </p><p class="shortcode-media shortcode-media-youtube">
  375. <span class="rm-shortcode" data-rm-shortcode-id="f495707195f708b898b1b88a12c0e6c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1Go6HEC-bYU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  376. <small class="image-media media-caption" placeholder="Add Photo Caption...">RFM-1 also incorporates language data to be able to communicate more effectively with humans.</small>
  377. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Covariant</small>
  378. </p><p>For Covariant to expand the capabilities of RFM-1 towards that long-term vision of foundation models powering “billions of robots across the world,” the next step is to feed it more data from a wider variety of robots doing a wider variety of tasks. “We’ve built essentially a data ingestion engine,” Abbeel says. “If you’re willing to give us data of a different type, we’ll ingest that too.”<br/></p><p class="pull-quote">
  379. “We have a lot of confidence that this kind of model could power all kinds of robots—maybe with more data for the types of robots and types of situations it could be used in.” <strong>—Pieter Abbeel, Covariant</strong>
  380. </p><p>One way or another, that path is going to involve a heck of a lot of data, and it’s going to be data that Covariant is not currently collecting with its own fleet of warehouse manipulation robots. So if you’re, say, a humanoid robotics company, what’s your incentive to share all the data you’ve been collecting with Covariant? “The pitch is that we’ll help them get to the real world,” Covariant co-founder <a href="https://www.linkedin.com/in/peter-xi-chen/" rel="noopener noreferrer" target="_blank">Peter Chen</a> says. “I don’t think there are really that many companies that have AI to make their robots truly autonomous in a production environment. If they want AI that’s robust and powerful and can actually help them enter the real world, we are really their best bet.”</p><p>Covariant’s core argument here is that while it’s certainly possible for every robotics company to train up their own models individually, the performance—for anybody trying to do manipulation, at least—would be not nearly as good as using a model that incorporates all of the manipulation data that Covariant already has within RFM-1. “It has always been our long term plan to be a robotics foundation model company,” says Chen. “There was just not sufficient data and compute and algorithms to get to this point—but building a universal AI platform for robots, that’s what Covariant has been about from the very beginning.”</p>]]></description><pubDate>Mon, 11 Mar 2024 17:44:55 +0000</pubDate><guid>https://spectrum.ieee.org/covariant-foundation-model</guid><category>Ai robots</category><category>Autonomous robots</category><category>Covariant</category><category>Robotics</category><category>Robotics foundation model</category><category>Warehouse picking</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/one-robotics-foundation-model-can-control-many-different-robots-manipulating-many-different-objects.gif?id=51702916&amp;width=980"></media:content></item><item><title>Diminutive Deep Sea Drone Dives for Wrecks and Reefs</title><link>https://spectrum.ieee.org/underwater-drone</link><description><![CDATA[
  381. <img src="https://spectrum.ieee.org/media-library/a-white-box-looking-robot-floating-above-a-coral-reef.jpg?id=51687607&width=1200&height=400&coordinates=0%2C625%2C0%2C625"/><br/><br/><p>
  382. The global ocean is difficult to explore—the common refrain is that we know <a href="https://oceanexplorer.noaa.gov/facts/explored.html" target="_blank">less about the deep ocean</a> than we do about the surface of the moon. Australian company <u><a href="https://www.advancednavigation.com/" rel="noopener noreferrer" target="_blank">Advanced Navigation</a></u> wants to change that with a pint-sized <a data-linked-post="2657771923" href="https://spectrum.ieee.org/mapping-unchartered-waters" target="_blank">autonomous underwater vehicle</a> (AUV) that it hopes will become the maritime equivalent of a <a data-linked-post="2663254263" href="https://spectrum.ieee.org/skydio-drone-not-for-consumers" target="_blank">consumer drone</a>. And the new AUV is already getting to work mapping and monitoring Australia’s coral reefs and diving for shipwrecks.
  383. </p><p>
  384. The Sydney-based company has been developing
  385. <a data-linked-post="2650276931" href="https://spectrum.ieee.org/underwater-gps-inspired-by-shrimp-eyes" target="_blank">underwater navigation</a> technology for more than a decade. In 2022, Advanced Navigation unveiled its first in-house AUV, called <u><a href="https://www.advancednavigation.com/robotics/micro-auv/hydrus" rel="noopener noreferrer" target="_blank">Hydrus</a></u>. At less than half a meter long, the vehicle is considerably smaller than most alternatives. Even so, it’s fully autonomous and carries a 4k-resolution camera capable of 60 frames per second that can both capture high-definition video and construct detailed 3D photogrammetry models.
  386. </p><p>
  387. Advanced Navigation says Hydrus—with a depth rating of 3,000 meters, a range of 9 kilometers, and a battery that lasts up to three hours—is capable of a wide variety of missions. The company recently sold two units to the <u><a href="https://www.aims.gov.au/about-aims" rel="noopener noreferrer" target="_blank">Australian Institute of Marine Science</a></u> (AIMS), the country’s tropical marine science agency, which will use them to survey coral reefs in the <a href="https://en.wikipedia.org/wiki/North_West_Shelf" target="_blank">North West Shelf</a> region off the country’s west coast. Hydrus has also recently collaborated with the <a href="https://visit.museum.wa.gov.au/" rel="noopener noreferrer" target="_blank">Western Australian Museum</a> to produce a detailed 3D model of a shipwreck off the coast near Perth.
  388. </p><p class="pull-quote">“If people can go and throw one of these off the boat, just like they can throw a drone up in the air, that will obviously benefit the exploration of the sea.” <strong>—Ross Anderson, Western Australian Museum</strong></p><p>
  389. After many years of supplying components to other robotics companies,
  390. <u><a href="https://www.linkedin.com/in/peterbaker89/?originalSubdomain=au" rel="noopener noreferrer" target="_blank">Peter Baker</a></u>, subsea product manager at Advanced Navigation, says they company spotted a gap in the market. “We wanted to take the user experience that someone would have with an aerial drone and bring that underwater,” he says. “It’s very expensive to get images and data of the seabed. So by being able to miniaturize this system, and have it drastically simplified from the user’s point of view, it makes data a lot more accessible to people.”
  391. </p><p>
  392. But building a compact and low-cost AUV is not simple. The deep ocean is not a friendly place for electronics, says Baker, due to a combination of high pressure and corrosive seawater. The traditional way of dealing with this is to stick all the critical components in a sealed titanium tube that can maintain ambient pressure and keep moisture out. However, this requires you to add buoyancy to compensate for the extra weight, says Baker, which increases the bulk of the vehicle. That means bigger motors and bigger batteries. “The whole thing spirals up and up until you’ve got something the size of a minibus,” he says.
  393. </p><p>
  394. Advanced Navigation got around the spiral by designing bespoke pressure-tolerant electronics. They built all of their circuit boards from scratch, carefully selecting components that had been tested to destruction in a hydrostatic pressure chamber. These were then encapsulated in a water-proof composite shell, and to further reduce the risk of water ingress the drone operates completely wirelessly. Batteries are recharged using inductive charging and data transfer is either over
  395. <a href="https://spectrum.ieee.org/tag/wi-fi" target="_blank">Wi-Fi</a> when above water or via an optical modem when below the surface.
  396. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  397. <img alt="a white box looking robot with blue circles on side sitting on countertop of boat" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="de9d5ac500dc290de262235c94e4027e" data-rm-shortcode-name="rebelmouse-image" id="e23d8" loading="lazy" src="https://spectrum.ieee.org/media-library/a-white-box-looking-robot-with-blue-circles-on-side-sitting-on-countertop-of-boat.jpg?id=51704014&width=980" style="max-width: 100%"/>
  398. <small class="image-media media-caption" placeholder="Add Photo Caption...">Hydrus AUVs are charged using induction to keep corrosive seawater from leaking in through charging ports.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Advanced Navigation</small>
  399. </p><p>
  400. This has allowed the company to significantly miniaturize the system, says Baker, which has a drastic impact on the overall cost of operations. “You don’t need a crane or a winch or anything like that to recover the vehicle, you can pick it up with a fishing net,” he says. “You can get away with using a much smaller boat, and the rule of thumb in the industry is if you double the size of your boat, you quadruple the cost.”
  401. </p><p>
  402. Just as important, though, is the vehicle’s ease of use. Most underwater robotics systems still operate with a tether, says Baker, but Hydrus carries all the hardware required to support autonomous navigation on board. The company’s “bread and butter” is inertial navigation technology, which uses accelerometers and gyroscopes to track the vehicle from a known starting point. But the drone also features a sonar system that allows it to stay a set distance from the seabed and also judge its speed by measuring the Doppler shift on echoes as they bounce back.
  403. </p><p>
  404. This means that users can simply program in a set of way points on a map, toss the vehicle overboard and leave it to its own devices, says Baker. The Hydrus does have a low-bandwidth
  405. <a data-linked-post="2650277403" href="https://spectrum.ieee.org/mit-researchers-develop-seamless-underwatertoair-communication-system" target="_blank">acoustic communication</a> channel that allows the operator to send basic commands like “stop” or “come home,” he says, but Hydrus is designed to be a set-and-forget AUV. “That really lowers the thresholds of what a user needs to be able to operate it,” he says. “If you can fly a DJI drone you could fly a Hydrus.”
  406. </p><p>
  407. The company estimates for a typical seabed investigation in water shallow enough for human divers, the Hydrus could be 75 percent cheaper than alternatives. And the savings would go up significantly at greater depths. What’s more, says Baker, the drone’s precise navigation means it can produce much more consistent and repeatable data.
  408. </p><p>
  409. To demonstrate its capabilities, Hydrus’ designers went hunting for shipwrecks in the
  410. <a href="https://en.wikipedia.org/wiki/Rottnest_ship_graveyard" target="_blank">Rottnest ships graveyard</a> just off the coast near Perth, in Western Australia. The site was a designated spot for scuttling aging ships, says Ross Anderson, curator at Western Australian Museum, but has yet to be fully explored due to the depth of many of the wrecks.
  411. </p><p>
  412. The Advanced Navigation team used the Hydrus to create a detailed 3D model of a sunken “coal hulk”—one of a category of old iron sailing ships that were later converted to floating coal warehouses for steamships. The Western Australian Museum has been unable to identify the vessel so far, but Anderson says these kind of models can be hugely beneficial for carrying out maritime archaeology research, as well as educating people about what’s below the waves.
  413. </p><h3></h3><br/><iframe allow="autoplay; fullscreen; xr-spatial-tracking" allowfullscreen="" execution-while-not-rendered="" execution-while-out-of-viewport="" frameborder="0" height="500px" mozallowfullscreen="true" src="https://sketchfab.com/models/699a6f44936e455e8611c528bf4d68db/embed" title="Micro AUV Hydrus Captures Shipwreck" web-share="" webkitallowfullscreen="true" width="100%" xr-spatial-tracking="">
  414. </iframe><p class="caption">
  415. Advanced Navigation used its new Hydrus drone to create a detailed 3D image of an unidentified “coal hulk” ship in the Rottnest ships graveyard off the western coast of Australia.
  416. </p><p class="photo-credit">
  417.  
  418. Advanced Navigation
  419. </p><p>
  420. Any technology that can simplify the process is greatly welcomed, Anderson adds. “If people can go and throw one of these off the boat, just like they can throw a drone up in the air, that will obviously benefit the exploration of the sea,” he says.
  421. </p><p>
  422. Ease of use was also a big driver behind AIMS’s purchase of two Hydrus drones, says technology development program lead
  423. <a href="https://www.aims.gov.au/about/our-people/melanie-olsen" rel="noopener noreferrer" target="_blank">Melanie Olsen</a>, who is also an IEEE senior member. Most of the technology available for marine science is still research-grade and a long way from a polished, professional product.
  424. </p><p>
  425. “When you’re an operational agency like AIMS, you typically don’t have the luxury of spending a lot of time on the back of the boat getting equipment ready,” says Olsen. “You need something that users can turn on and go and it’s just working, as time is of the essence.”
  426. </p><p>
  427. Another benefit of the Hydrus for AIMS is that the drone can operate at greater depths than divers and in conditions that would be dangerous for humans. “Its enabling our researchers to see further down in the water and also operate in more dangerous situations such as at night, or in the presence of threats such as crocodiles or sharks, places where we just wouldn’t be able to collect that data,” says Olsen.
  428. </p><p>
  429. The agency will initially use the drones to survey reefs on Australia’s North West Shelf, including Scott Reef and Ashmore Reef. The goal is to collect regular data data on coral health to monitor the state of the reefs, investigate how they’re being effected by climate change, and hopefully get early warning of emerging problems. But Olsen says they expect that the Hydrus will become standard part of their ocean monitoring toolkit going forward.
  430. </p><p><em>This story was updated on 11 March 2024 to correct the year when Advanced Navigation unveiled Hydrus.</em><br/></p>]]></description><pubDate>Mon, 11 Mar 2024 15:47:50 +0000</pubDate><guid>https://spectrum.ieee.org/underwater-drone</guid><category>Underwater robots</category><category>Underwater autonomous vehicle</category><category>Underwater vehicles</category><category>Environmental monitoring</category><dc:creator>Edd Gent</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-white-box-looking-robot-floating-above-a-coral-reef.jpg?id=51687607&amp;width=980"></media:content></item><item><title>Video Friday: Human to Humanoid</title><link>https://spectrum.ieee.org/video-friday-human-to-humanoid</link><description><![CDATA[
  431. <img src="https://spectrum.ieee.org/media-library/image.gif?id=51683158&width=1200&height=400&coordinates=0%2C92%2C0%2C92"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://humanrobotinteraction.org/2024/">HRI 2024</a>: 11–15 March 2024, BOULDER, COLO.</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="0w4n2q7xtcq"><em>We present Human to Humanoid (H2O), a reinforcement learning (RL) based framework that enables real-time, whole-body teleoperation of a full-sized humanoid robot with only an RGB camera. We successfully achieve teleoperation of dynamic, whole-body motions in real-world scenarios, including walking, back jumping, kicking, turning, waving, pushing, boxing, etc. To the best of our knowledge, this is the first demonstration to achieve learning-based, real-time, whole-body humanoid teleoperation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="135dd2c5474c055de9d8b1cfb39c68a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0W4N2q7xtcQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://human2humanoid.com/">CMU</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qawbon55p9i"><em>Legged robots have the potential to traverse complex terrain and access confined spaces beyond the reach of traditional platforms thanks to their ability to carefully select footholds and flexibly adapt their body posture while walking. However, robust deployment in real-world applications is still an open challenge. In this paper, we present a method for legged locomotion control using reinforcement learning and 3D volumetric representations to enable robust and versatile locomotion in confined and unstructured environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b722de54686748c6e0e3e5a7790ccbfe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QAwBoN55p9I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://takahiromiki.com/publication-posts/learning-to-walk-in-confined-spaces-using-3d-representation/">Takahiro Miki</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="83shvgtyfag">Sure, 3.3 meters per second is fast for a humanoid, but I’m more impressed by the spinning around while walking downstairs.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cfd59f318a13cc83f20b3f358c5d0fa5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/83ShvgtyFAg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/h1/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qelugqmj-k8"><em>Improving the safety of collaborative manipulators necessitates the reduction of inertia in the moving part. We introduce a novel approach in the form of a passive, 3D wire aligner, serving as a lightweight and low-friction power transmission mechanism, thus achieving the desired low inertia in the manipulator’s operation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="47fd11b7af56fe6a71a7daee14bbdbb5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QEluGqmj-k8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://tenrobo18.github.io/saqiel-ral2023-webpage/">SAQIEL</a> ]</p><p>Thanks, Temma!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xj6mtlfo248"><em>Robot Era just launched Humanoid-Gym, an open-source reinforcement learning framework for bipedal humanoids. As you can see from the video, RL algorithms have given the robot, called Xiao Xing, or XBot, the ability to climb up and down haphazardly stacked boxes with relative stability and ease.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2ead764e848d14fd9ab11655b8cf98f8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xj6MtLfO248?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robotera.com/">Robot Era</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="1__fuhy3-qo">“Impact-Aware Bimanual Catching of Large-Momentum Objects.” Need I say more?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9ebf577006608a6566d447ff23bda34a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1__FuHY3-qo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://web.inf.ed.ac.uk/slmc">SLMC</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="3mgeeonuua0"><em>More than 80% of stroke survivors experience walking difficulty, significantly impacting their daily lives, independence, and overall quality of life. Now, new research from the University of Massachusetts Amherst pushes forward the bounds of stroke recovery with a unique robotic hip exoskeleton, designed as a training tool to improve walking function. This invites the possibility of new therapies that are more accessible and easier to translate from practice to daily life, compared to current rehabilitation methods.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4c62e6ad1cce7d476fea13eb928f7994" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3mgeEoNuua0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.umass.edu/news/article/robotic-hip-exoskeleton-shows-promise-helping-stroke-patients-regain-their-stride">UMass Amherst</a> ]</p><p>Thanks, Julia!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ahbgyciuie8">The manipulation here is pretty impressive, but it’s hard to know how impressive without also knowing how much the video was sped up.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0893b62c3f121471c39652459301af09" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/aHBGYciUie8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://getsomatic.com/">Somatic</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qpqvum_ecu4"><em>DJI drones work to make the world a better place and one of the ways that we do this is through conservation work. We partnered with Halo Robotics and the OFI Orangutan Foundation International to showcase just how these drones can make an impact.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2b4294798beee0872fdc6e8cbf9914cd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qPqvum_eCU4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://viewpoints.dji.com/blog/drones-help-counting-orangutans-in-borneo-a-technological-leap-for-conservation">DJI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w5ncndwkydq"><em>The aim of the test is to demonstrate the removal and replacement of satellite modules into a 27U CubeSat format using augmented reality control of a robot. In this use case, the “client” satellite is being upgraded and refueled using modular componentry.  The robot will then remove the failed computer module and place it in a fixture. It will then do the same with the propellant tank. The robot will then place these correctly back into the satellite.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="135751605bf29c925c9133e6aa1c68dd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/w5ncNDWKYdQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.extendrobotics.com/">Extend Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="5-qmfjgrwyk"><em>This video features some of the highlights and favorite moments from the CYBATHLON Challenges 2024 that took place on 2 February, showing so many diverse types of assistive technology taking on discipline tasks and displaying pilots’ tenacity and determination. The Challenges saw new teams, new tasks, and new formats for many of the CYBATHLON disciplines.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7982649d307b5b13a5c3a77ce8931558" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5-qmfjGRWyk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="2oqcome8wdu">It’s been a long road to electrically powered robots.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2c7689ae5ba8035f64a4cfd224cce4c5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2oqCOme8wDU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.abb/group/en/about/history">ABB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="a1s0zhqceou"><em>Small drones for catastrophic wildfires (ones covering more than [40,470 hectares]) are like bringing a flashlight to light up a football field. This short video describes the major uses for drones of all sizes and why and when they are used, or why not.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3a45d1bbcf774675ecccbc5749c1bbb1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/A1s0zhQcEOU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://crasar.org/">CRASAR</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xrda7jrl788">It probably will not surprise you that there are a lot of robots involved in building Rivian trucks and vans.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="19648c78502255b1cc2f0a29b4961b9e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xrDA7jRl788?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://kawasakirobotics.com/">Kawasaki Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="d8wbi_lcn4i"><em>DARPA’s Learning Introspective Control (LINC) program is developing machine learning methods that show promise in making that scenario closer to reality.  LINC aims to fundamentally improve the safety of mechanical systems—specifically in ground vehicles, ships, drone swarms, and robotics—using various methods that require minimal computing power. The result is an AI-powered controller the size of a cell phone.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="462e7c5d0ce6fb77d706ee0f98c96bb8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/D8Wbi_lcN4I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/program/learning-introspective-control">DARPA</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 08 Mar 2024 17:41:23 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-human-to-humanoid</guid><category>Video friday</category><category>Robotics</category><category>Robotic arm</category><category>Quadruped robots</category><category>Humanoid robots</category><category>Drones</category><category>Exoskeleton</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/image.gif?id=51683158&amp;width=980"></media:content></item><item><title>Anyware Robotics’ Pixmo Takes Unique Approach to Trailer Unloading</title><link>https://spectrum.ieee.org/anyware-robotics-pixmo</link><description><![CDATA[
  432. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=51631833&width=1200&height=400&coordinates=0%2C540%2C0%2C540"/><br/><br/><p>
  433. You’ve seen this before: a truck-unloading robot that’s made up of a mobile base with an arm on it that drives up into the back of a trailer and then uses suction to grab stacked boxes and put them onto a conveyor belt. We’ve written about a couple of the companies <a href="https://spectrum.ieee.org/stretch-is-boston-dynamics-take-on-a-practical-mobile-manipulator-for-warehouses" target="_self">doing</a> <a href="https://spectrum.ieee.org/no-human-can-match-this-highspeed-boxunloading-robot-named-after-a-pickle" target="_self">this</a>, and there are even more out there. It’s easy to understand why—trailer unloading involves a fairly structured and controlled environment with a very repetitive task, it’s a hard job that sucks for humans, and there’s an enormous amount of demand.
  434. </p><p>
  435. While it’s likely true that there’s enough room for a whole bunch of different robotics companies in the trailer-unloading space, a given customer is probably going to only pick one, and they’re going to pick the one that offers the right combination of safety, capability, and cost. <a href="https://anyware-robotics.com/" rel="noopener noreferrer" target="_blank">Anyware Robotics</a> thinks they have that mix, aided by a box-handling solution that is both very clever and so obvious that I’m wondering why I didn’t think of it myself.
  436. </p><hr/><p>
  437. The overall design of Pixmo itself is fairly standard as far as trailer-unloading robots go, but some of the details are interesting. We’re told that Pixmo is the only trailer-unloading system that integrates a heavy-payload collaborative arm, actually <a href="https://www.fanucamerica.com/products/robots/series/collaborative-robot/crx-25ia" rel="noopener noreferrer" target="_blank">a fairly new commercial arm from Fanuc</a>. This means that Anyware Robotics doesn’t have to faff about with their own hardware, and also that their robot is arguably safer, being ISO-certified safe to work directly with people. The base is custom, but Anyware is contracting it out to a big robotics original equipment manufacturer.
  438. </p><p>
  439. “We’ve put a lot of effort into making sure that most of the components of our robot are off-the-shelf,” cofounder and CEO <a href="https://www.linkedin.com/in/thomas-tang-6967b35b/" rel="noopener noreferrer" target="_blank">Thomas Tang</a> tells us. “There are already so many mature and cost-efficient suppliers that we want to offload the supply chain, the certification, the reliability testing onto someone else’s shoulders.” And while there are a selection of automated mobile robots (AMRs) out there that seem like they could get the job done, the problem is that they’re all designed for flat surfaces, and getting into and out of the back of the trailer often involves a short, steep ramp, hence the need for a design just for them. Even with the custom base, Tang says that Pixmo is very cost efficient, and the company predicts that it will be approximately one-third the cost of other solutions with a payback of about 24 months.
  440. </p><p>
  441. But here’s the really clever bit:
  442. </p><p class="shortcode-media shortcode-media-youtube">
  443. <span class="rm-shortcode" data-rm-shortcode-id="3a8779ea26ac05bd3b649156bde0afd5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bbgJg7OwfJ0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  444. <small class="image-media media-caption" placeholder="Add Photo Caption...">Anyware Robotics Pixmo Trailer Unloading</small>
  445. </p><p>
  446. That conveyor system in front of the boxes is an add-on that’s used in support of Pixmo. There are two benefits here: First, having the conveyor add-on aligned with the base of a box minimizes the amount of lifting that Pixmo has to do. This allows Pixmo to handle boxes of up to 65 pounds with a lift-and-slide technique, putting it at the top end of a trailer-unloading robot payload. And the second benefit is that the add-on system decreases the distance that Pixmo has to move the box to just about as small as it can possibly be, eliminating the need for the arm to rotate around to place a box on a conveyor next to or behind itself. Lowering this cycle time means that Pixmo can achieve a throughput of up to 1,000 boxes per hour—about one box every 4 seconds, which the Internet suggests is quite fast, even for a professional human. Anyware Robotics is introducing this add-on system at the MODEX manufacturing and supply-chain show next week, and the company has a patent pending on the idea.
  447. </p><p>
  448. This seems like such a simple, useful idea that I asked Tang why they were the first ones to come up with it. “In robotics startups, there tends to be a legacy mind-set issue,” Tang told me. “When people have been working on robot arms for so many years, we just think about how to use robot arms to solve everything. That’s maybe the reason why other companies didn’t come up with this solution.” Tang says that Anyware started with much more complicated add-on designs before finding this solution. “Usually it’s the most simple solution that has the most trial and error behind it.”
  449. </p><p>
  450. Anyware Robotics is focused on trailer unloading for now, but Pixmo could easily be adapted for palletizing and depalletizing or somewhat less easily for other warehouse tasks like order picking or machine tending. But why stop there? A mobile manipulator can (theoretically) do it all (almost), and that’s exactly what Tang wants:
  451. </p><blockquote>
  452. In our long-term vision, we believe that the future will have two different types of general-purpose robots. In one direction is the humanoid form, which is a really flexible solution for jobs where you want to replace a human. But there are so many jobs that are just not reasonable for a human body to do. So we believe there should be another form of general-purpose robot, which is designed for industrial tasks. Our design philosophy is in that direction—it’s also general purpose, but for industrial applications.
  453. </blockquote><p>
  454. At just over one year old, Anyware has already managed to complete a pilot program (and convert it to a purchase order). They’re currently in the middle of several other pilot programs with leading third-party logistics providers, and they expect to spend the next several months focusing on productization with the goal of releasing the first commercial version of Pixmo by July of this year.
  455. </p>]]></description><pubDate>Tue, 05 Mar 2024 18:00:18 +0000</pubDate><guid>https://spectrum.ieee.org/anyware-robotics-pixmo</guid><category>Robotics startup</category><category>Mobile manipulators</category><category>Warehouse robots</category><category>Robotics</category><category>Anyware robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=51631833&amp;width=980"></media:content></item><item><title>Video Friday: $2.6 Billion</title><link>https://spectrum.ieee.org/video-friday-2-6-billion</link><description><![CDATA[
  456. <img src="https://spectrum.ieee.org/media-library/image.gif?id=51607852&width=1200&height=400&coordinates=0%2C92%2C0%2C92"/><br/><br/><p>
  457. Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please
  458. <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/>
  459. </p><h5><a href="https://humanrobotinteraction.org/2024/">HRI 2024</a>: 11–15 March 2024, BOULDER, COLORADO, USA</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>
  460. Enjoy today’s videos!
  461. </p><div class="horizontal-rule">
  462. </div><div style="page-break-after: always">
  463. <span style="display:none"> </span>
  464. </div><p class="rm-anchors" id="gejxceu3bbw">
  465. Figure has <a href="https://spectrum.ieee.org/figure-robot-video" target="_blank">raised a US $675 million Series B</a>, valuing the company at $2.6 billion.
  466. </p><p class="shortcode-media shortcode-media-youtube">
  467. <span class="rm-shortcode" data-rm-shortcode-id="80ef76b25a5841cb91b4bc951e628ed5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gEjXcEU3Bbw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  468. </p><p>
  469. [
  470. <a href="https://www.figure.ai/">Figure</a> ]
  471. </p><div class="horizontal-rule">
  472. </div><p class="rm-anchors" id="s1t3qi2wnei">
  473. Meanwhile, here’s how things are going at Agility Robotics, <a href="https://agilityrobotics.com/news/2022/future-robotics" target="_blank">whose last raise was a $150 million Series B in April of 2022</a>.
  474. </p><p class="shortcode-media shortcode-media-youtube">
  475. <span class="rm-shortcode" data-rm-shortcode-id="834a5daa9a401c08d30f57208ce0ad9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/s1t3Qi2WnEI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  476. </p><p>
  477. [
  478. <a href="https://agilityrobotics.com/">Agility Robotics</a> ]
  479. </p><div class="horizontal-rule">
  480. </div><p class="rm-anchors" id="s1t3qi2wnei">
  481. Also meanwhile, here’s how things are going at Sanctuary AI, <a href="https://sanctuary.ai/resources/news/series-a-funding-another-step-toward-building-machines-that-can-think-like-people/" target="_blank">whose last raise was a $58.5 million Series A in March of 2022</a>.
  482. </p><div class="twitter-tweet twitter-tweet-rendered" style="display: flex; max-width: 550px; width: 100%; margin-top: 10px; margin-bottom: 10px;"><iframe allowfullscreen="true" allowtransparency="true" class="" data-tweet-id="1762938715134157077" frameborder="0" id="twitter-widget-1" scrolling="no" src="https://platform.twitter.com/embed/Tweet.html?dnt=true&embedId=twitter-widget-1&features=eyJ0ZndfdGltZWxpbmVfbGlzdCI6eyJidWNrZXQiOltdLCJ2ZXJzaW9uIjpudWxsfSwidGZ3X2ZvbGxvd2VyX2NvdW50X3N1bnNldCI6eyJidWNrZXQiOnRydWUsInZlcnNpb24iOm51bGx9LCJ0ZndfdHdlZXRfZWRpdF9iYWNrZW5kIjp7ImJ1Y2tldCI6Im9uIiwidmVyc2lvbiI6bnVsbH0sInRmd19yZWZzcmNfc2Vzc2lvbiI6eyJidWNrZXQiOiJvbiIsInZlcnNpb24iOm51bGx9LCJ0ZndfZm9zbnJfc29mdF9pbnRlcnZlbnRpb25zX2VuYWJsZWQiOnsiYnVja2V0Ijoib24iLCJ2ZXJzaW9uIjpudWxsfSwidGZ3X21peGVkX21lZGlhXzE1ODk3Ijp7ImJ1Y2tldCI6InRyZWF0bWVudCIsInZlcnNpb24iOm51bGx9LCJ0ZndfZXhwZXJpbWVudHNfY29va2llX2V4cGlyYXRpb24iOnsiYnVja2V0IjoxMjA5NjAwLCJ2ZXJzaW9uIjpudWxsfSwidGZ3X3Nob3dfYmlyZHdhdGNoX3Bpdm90c19lbmFibGVkIjp7ImJ1Y2tldCI6Im9uIiwidmVyc2lvbiI6bnVsbH0sInRmd19kdXBsaWNhdGVfc2NyaWJlc190b19zZXR0aW5ncyI6eyJidWNrZXQiOiJvbiIsInZlcnNpb24iOm51bGx9LCJ0ZndfdXNlX3Byb2ZpbGVfaW1hZ2Vfc2hhcGVfZW5hYmxlZCI6eyJidWNrZXQiOiJvbiIsInZlcnNpb24iOm51bGx9LCJ0ZndfdmlkZW9faGxzX2R5bmFtaWNfbWFuaWZlc3RzXzE1MDgyIjp7ImJ1Y2tldCI6InRydWVfYml0cmF0ZSIsInZlcnNpb24iOm51bGx9LCJ0ZndfbGVnYWN5X3RpbWVsaW5lX3N1bnNldCI6eyJidWNrZXQiOnRydWUsInZlcnNpb24iOm51bGx9LCJ0ZndfdHdlZXRfZWRpdF9mcm9udGVuZCI6eyJidWNrZXQiOiJvbiIsInZlcnNpb24iOm51bGx9fQ%3D%3D&frame=false&hideCard=false&hideThread=false&id=1762938715134157077&lang=en&origin=https%3A%2F%2Fspectrum.ieee.org%2Fr%2Fentryeditor%2F2667403366%23seo&sessionId=79f0c0cafb22720f5df48cce8cdd498f8f5b1a0b&theme=light&widgetsVersion=2615f7e52b7e0%3A1702314776716&width=550px" style="position: static; visibility: visible; width: 550px; height: 648px; display: block; flex-grow: 1;" title="X Post"></iframe></div><script async="" charset="utf-8" src="https://platform.twitter.com/widgets.js"></script><p>
  483. [
  484. <a href="https://sanctuary.ai/">Sanctuary AI</a> ]
  485. </p><div class="horizontal-rule">
  486. </div><blockquote class="rm-anchors" id="uct7qpptt-g">
  487. <em>The time has come for humanoid robots to enter industrial production lines and learn how to assist humans by undertaking repetitive, tedious, and potentially dangerous tasks for them. Recently, UBTECH’s humanoid robot Walker S was introduced into the assembly line of NIO’s advanced vehicle-manufacturing center, as an “intern” assisting in the car production. Walker S is the first bipedal humanoid robot to complete a specific workstation’s tasks on a mobile EV production line.</em>
  488. </blockquote><p class="shortcode-media shortcode-media-youtube">
  489. <span class="rm-shortcode" data-rm-shortcode-id="8aa3271d71a7ffcaa27dbf298b3634a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UCt7qPpTt-g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  490. </p><p>
  491. [
  492. <a href="https://www.ubtrobot.com/">UBTECH</a> ]
  493. </p><div class="horizontal-rule">
  494. </div><p class="rm-anchors" id="xuqkcfj3-v8">
  495. <a href="https://spectrum.ieee.org/stretch-assistive-robot" target="_blank">Henry Evans</a> keeps working hard to make robots better, this time with the assistance of researchers from Carnegie Mellon University.
  496. </p><p class="shortcode-media shortcode-media-youtube">
  497. <span class="rm-shortcode" data-rm-shortcode-id="9d5ed13b97b86e6b2d89f4681a52e4e7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XuQKCFJ3-V8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  498. </p><blockquote>
  499. <em>Henry said he preferred using head-worn assistive teleoperation (HAT) with a robot for certain tasks rather than depending on a caregiver. “Definitely scratching itches,” he said. “I would be happy to have it stand next to me all day, ready to do that or hold a towel to my mouth. Also, feeding me soft foods, operating the blinds, and doing odd jobs around the room.”</em><br/>
  500. <em>One innovation in particular, software called Driver Assistance that helps align the robot’s gripper with an object the user wants to pick up, was “awesome,” Henry said. Driver Assistance leaves the user in control while it makes the fine adjustments and corrections that can make controlling a robot both tedious and demanding. “That’s better than anything I have tried for grasping,” Henry said, adding that he would like to see Driver Assistance used for every interface that controls Stretch robots.</em>
  501. </blockquote><p>
  502. [
  503. <a href="https://sites.google.com/view/hat2-teleop">HAT2</a> ] via [ <a href="https://www.cs.cmu.edu/news/2024/in-home-hat-testing">CMU</a> ]
  504. </p><div class="horizontal-rule">
  505. </div><p class="rm-anchors" id="9azynihbjpk">
  506. Watch this video for the three glorious seconds at the end.
  507. </p><p class="shortcode-media shortcode-media-youtube">
  508. <span class="rm-shortcode" data-rm-shortcode-id="cc9c23a11dca7b7c7a8ba981b406f421" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9azYNihBJpk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  509. </p><p>
  510. [
  511. <a href="https://www.techunited.nl/?page_id=2135&lang=en">Tech United</a> ]
  512. </p><div class="horizontal-rule">
  513. </div><blockquote class="rm-anchors" id="lp3x4il4_ua">
  514. <em>Get ready to rip, shear, mow, and tear, as DOOM is back! This April, we’re making the legendary game playable on our robotic mowers as a tribute to 30 years of mowing down demons.</em>
  515. </blockquote><p class="shortcode-media shortcode-media-youtube">
  516. <span class="rm-shortcode" data-rm-shortcode-id="4104a31f54a64ff0b4b3af8b12f2f731" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lp3X4IL4_UA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  517. </p><p>
  518. Oh, it’s HOOSKvarna, not HUSKvarna.
  519. </p><p>
  520. [
  521. <a href="https://www.husqvarna.com/uk/learn-and-discover/news-and-media/doom-husqvarna-update/">Husqvarna </a> ] via [ <a href="https://www.engadget.com/can-your-robot-lawnmower-run-doom-this-one-can-162641979.html">Engadget</a> ]
  522. </p><div class="horizontal-rule">
  523. </div><blockquote class="rm-anchors" id="vxlpf3drvp0">
  524. <em>Latest developments demonstrated on the Ameca Desktop platform. Having fun with vision- and voice-cloning capabilities.</em>
  525. </blockquote><p class="shortcode-media shortcode-media-youtube">
  526. <span class="rm-shortcode" data-rm-shortcode-id="8c9e10d2615029fff4b82df1d4b09eac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VXlpF3DrVP0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  527. </p><p>
  528. [
  529. <a href="https://www.engineeredarts.co.uk/">Engineered Arts</a> ]
  530. </p><div class="horizontal-rule">
  531. </div><blockquote class="rm-anchors" id="s24edb59baw">
  532. <em>Could an artificial-intelligence system learn language from a child? New York University researchers supported by the National Science Foundation, using first-person video from a head-mounted camera, trained AI models to learn language through the eyes and ears of a child.</em>
  533. </blockquote><p class="shortcode-media shortcode-media-youtube">
  534. <span class="rm-shortcode" data-rm-shortcode-id="48a1af08fb8654d041fa744c9af52339" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/s24EDb59baw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  535. </p><p>
  536. [
  537. <a href="https://www.nyu.edu/about/news-publications/news/2024/february/ai-learns-through-the-eyes-and-ears-of-a-child.html">NYU</a> ]
  538. </p><div class="horizontal-rule">
  539. </div><blockquote class="rm-anchors" id="nfl02g-2ys8">
  540. <em>The world’s leaders in manufacturing, natural resources, power, and utilities are using our autonomous robots to gather data of higher quality and higher quantities of data than ever before. Thousands of Spots have been deployed around the world—more than any other walking robot—to tackle this challenge. This release helps maintenance teams tap into the power of AI with new software capabilities and Spot enhancements.</em>
  541. </blockquote><p class="shortcode-media shortcode-media-youtube">
  542. <span class="rm-shortcode" data-rm-shortcode-id="11f1b8e42301c5c9e6ed499bdceb373a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NFL02g-2ys8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  543. </p><p>
  544. [
  545. <a href="https://bostondynamics.com/products/orbit/">Boston Dynamics</a> ]
  546. </p><div class="horizontal-rule">
  547. </div><blockquote class="rm-anchors" id="uni9gqwehxe">
  548. <em>Modular self-reconfigurable robotic systems are more adaptive than conventional systems. This article proposes a novel free-form and truss-structured modular self-reconfigurable robot called FreeSN, containing node and strut modules. This article presents a novel configuration identification system for FreeSN, including connection point magnetic localization, module identification, module orientation fusion, and system-configuration fusion.</em>
  549. </blockquote><p class="shortcode-media shortcode-media-youtube">
  550. <span class="rm-shortcode" data-rm-shortcode-id="5e04b0fd3535b8d7ad092c59e1feba75" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uni9gQwehXE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  551. </p><p>
  552. [
  553. <a href="https://freeformrobotics.org/">Freeform Robotics</a> ]
  554. </p><div class="horizontal-rule">
  555. </div><blockquote class="rm-anchors" id="e-bb70qpuec">
  556. <em>The OOS-SIM (On-Orbit Servicing Simulator) is a simulator for on-orbit servicing tasks such as repair, maintenance and assembly that have to be carried out on satellites orbiting the earth. It simulates the operational conditions in orbit, such as the felt weightlessness and the harsh illumination.</em>
  557. </blockquote><p class="shortcode-media shortcode-media-youtube">
  558. <span class="rm-shortcode" data-rm-shortcode-id="b9a34503784cc55580a25c96fa432e5c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/E-bB70QPUec?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  559. </p><p>
  560. [
  561. <a href="https://www.dlr.de/rm/en/desktopdefault.aspx/tabid-11675/#gallery/30051">DLR</a> ]
  562. </p><div class="horizontal-rule">
  563. </div><blockquote class="rm-anchors" id="2uyy0bqbyoy">
  564. <em>The next CYBATHLON competition, which will take place again in 2024, breaks down barriers between the public, people with disabilities, researchers and technology developers. From 25 to 27 October 2024, the CYBATHLON will take place in a global format in the Arena Schluefweg in Kloten near Zurich and in local hubs all around the world.</em>
  565. </blockquote><p class="shortcode-media shortcode-media-youtube">
  566. <span class="rm-shortcode" data-rm-shortcode-id="04812793cc26556c003fe689c272be47" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2Uyy0BqbYoY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  567. </p><p>
  568. [
  569. <a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">CYBATHLON</a> ]
  570. </p><div class="horizontal-rule">
  571. </div><blockquote class="rm-anchors" id="7gfbaksyel8">
  572. <em>George’s story is a testament to the incredible journey that unfolds when passion, opportunity and community converge. His journey from a drone enthusiast to someone actively contributing to making a difference not only to his local community but also globally; serves as a beacon of hope for all who dare to dream and pursue their passions.</em>
  573. </blockquote><p class="shortcode-media shortcode-media-youtube">
  574. <span class="rm-shortcode" data-rm-shortcode-id="f27f08a4bfb49be5a668f09e9ed1f20d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7GfbaKSyEl8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  575. </p><p>
  576. [
  577. <a href="https://werobotics.org/blog/flight-of-purpose-from-a-drone-enthusiast-to-a-global-changemaker">WeRobotics</a> ]
  578. </p><div class="horizontal-rule">
  579. </div><p class="rm-anchors" id="kxnkfw0oxcm">
  580. In case you’d forgotten, Amazon has a lot of robots.
  581. </p><p class="shortcode-media shortcode-media-youtube">
  582. <span class="rm-shortcode" data-rm-shortcode-id="d2047abc9bdcab4790b444c84c147d82" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kxNkfW0OXcM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  583. </p><p>
  584. [
  585. <a href="https://www.aboutamazon.com/news/tag/robotics">Amazon Robotics</a> ]
  586. </p><div class="horizontal-rule">
  587. </div><blockquote class="rm-anchors" id="u1jj3xtcq0c">
  588. <em>ABB’s fifty-year story of robotic innovation that began in 1974 with the sale of the world’s first commercial all-electric robot, the IRB 6. Björn Weichbrodt was a key figure in the development of the IRB 6.</em>
  589. </blockquote><p class="shortcode-media shortcode-media-youtube">
  590. <span class="rm-shortcode" data-rm-shortcode-id="4da86aaf7b568d3aa2e2078c7b17e088" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/u1Jj3xTCQ0c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  591. </p><p>
  592. [
  593. <a href="https://new.abb.com/news/detail/1839/the-golden-age-of-robotics">ABB</a> ]
  594. </p><div class="horizontal-rule">
  595. </div><blockquote class="rm-anchors" id="adapm-vebv4">
  596. <em>Robotics Debate of the Ingenuity Labs Robotics and AI Symposium (RAIS2023) from October 12, 2023: Is robotics helping or hindering our progress on UN Sustainable Development Goals?</em>
  597. </blockquote><p class="shortcode-media shortcode-media-youtube">
  598. <span class="rm-shortcode" data-rm-shortcode-id="2ac7f427ee9e9abb4813e28208e5b6c6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AdAPm-vebV4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  599. </p><p>
  600. [
  601. <a href="https://ingenuitylabs.queensu.ca/rais2023/">Ingenuity Labs</a> ]
  602. </p><div class="horizontal-rule">
  603. </div>]]></description><pubDate>Fri, 01 Mar 2024 21:02:54 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-2-6-billion</guid><category>Agility robotics</category><category>Figure</category><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/image.gif?id=51607852&amp;width=980"></media:content></item><item><title>Figure Raises $675M for Its Humanoid Robot Development</title><link>https://spectrum.ieee.org/figure-robot-video</link><description><![CDATA[
  604. <img src="https://spectrum.ieee.org/media-library/a-close-up-shot-of-a-metal-humanoid-torso-in-a-white-room.png?id=51595601&width=1200&height=400&coordinates=0%2C230%2C0%2C231"/><br/><br/><p>Today, <a href="https://www.figure.ai/" rel="noopener noreferrer" target="_blank">Figure</a> is announcing an astonishing US $675 million Series B raise, which values the company at an even more astonishing $2.6 billion. <a href="https://spectrum.ieee.org/figure-humanoid-robot-2665982283" target="_blank">Figure</a> is <a href="https://spectrum.ieee.org/humanoid-robots" target="_self">one of the companies</a> working toward a multipurpose or general-purpose (depending on whom you ask) bipedal or humanoid (depending on whom you ask) robot. The astonishing thing about this valuation is that Figure’s robot is still very much in the development phase—although they’re making rapid progress, which they demonstrate in a new video posted this week.</p><hr/><p>This round of funding comes from Microsoft, OpenAI Startup Fund, Nvidia, Jeff Bezos (through Bezos Expeditions), Parkway Venture Capital, Intel Capital, Align Ventures, and ARK Invest. Figure says that they’re going to use this new capital “for scaling up AI training, robot manufacturing, expanding engineering head count, and advancing commercial deployment efforts.” In addition, Figure and OpenAI will be collaborating on the development of “next-generation AI models for humanoid robots” which will “help accelerate Figure’s commercial timeline by enhancing the capabilities of humanoid robots to process and reason from language.”</p><p>As far as that commercial timeline goes, here’s the most recent update:</p><p class="shortcode-media shortcode-media-youtube">
  605. <span class="rm-shortcode" data-rm-shortcode-id="0017062cd488a53edf97ac2940b7b2c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gEjXcEU3Bbw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  606. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Figure</small></p><p>And to understand everything that’s going on here, we sent a whole bunch of questions to <a href="https://www.linkedin.com/in/jennareher/" rel="noopener noreferrer" target="_blank">Jenna Reher</a>, senior robotics/AI engineer at Figure.</p><p><strong>What does “fully autonomous” mean, exactly?</strong></p><p><strong>Jenna Reher: </strong>In this case, we simply put the robot on the ground and hit go on the task with no other user input. What you see is using a learned vision model for bin detection that allows us to localize the robot relative to the target bin and get the bin pose. The robot can then navigate itself to within reach of the bin, determine grasp points based on the bin pose, and detect grasp success through the measured forces on the hands. Once the robot turns and sees the conveyor, the rest of the task rolls out in a similar manner. By doing things in this way we can move the bins and conveyor around in the test space or start the robot from a different position and still complete the task successfully.</p><p><strong>How many takes did it take to get this take?</strong></p><p><strong>Reher: </strong>We’ve been running this use case consistently for some time now as part of our work in the lab, so we didn’t really have to change much for the filming here. We did two or three practice runs in the morning and then three filming takes. All of the takes were successful, so the extras were to make sure we got the cleanest one to show.</p><p><strong>What’s back in the Advanced Actuator Lab?</strong></p><p><strong>Reher:</strong> We have an awesome team of folks working on some exciting custom actuator designs for our future robots, as well as supporting and characterizing the actuators that went into our current robots.</p><p><strong>That’s a very specific number for “speed vs. human.” Which human did you measure the robot’s speed against?</strong></p><p><strong>Reher: </strong>We timed <a href="https://www.linkedin.com/in/brettadcock/" target="_blank">Brett</a> [Adcock, founder of Figure] and a few poor engineers doing the task and took the average to get a rough baseline. If you are observant, that seemingly overspecific number is just saying we’re at 1/6 human speed. The main point that we’re trying to make here is that we are aware we are currently below human speed, and it’s an important metric to track as we improve.</p><p><strong>What’s the tether for?</strong></p><p><strong>Reher:</strong> For this task we currently process the camera data off-robot while all of the behavior planning and control happens on board in the computer that’s in the torso. Our robots should be fully tetherless in the near future as we finish packaging all of that on board. We’ve been developing behaviors quickly in the lab here at Figure in parallel to all of the other systems engineering and integration efforts happening, so hopefully folks notice all of these subtle parallel threads converging as we try to release regular updates.</p><p><strong>How the heck do you keep your robotics lab so clean?</strong></p><p><strong>Reher:</strong> Everything we’ve filmed so far is in our large robot test lab, so it’s a lot easier to keep the area clean when people’s desks aren’t intruding in the space. Definitely no guarantees on that level of cleanliness if the camera were pointed in the other direction!</p><p><strong>Is the robot in the background doing okay?</strong></p><p><strong>Reher: </strong>Yes! The other robot was patiently standing there in the background, waiting for the filming to finish up so that our manipulation team could get back to training it to do more manipulation tasks. We hope we can share some more developments with that robot as the main star in the near future.</p><p><strong>What would happen if I put a single bowling ball into that tote?</strong></p><p><strong>Reher: </strong>A bowling ball is particularly menacing to this task primarily due to the moving mass, in addition to the impact if you are throwing it in. The robot would in all likelihood end up dropping the tote, stay standing, and abort the task. With what you see here, we assume that the mass of the tote is known a priori so that our whole-body controller can compensate for the external forces while tracking the manipulation task. Reacting to and estimating larger unknown disturbances such as this is a challenging problem, but we’re definitely working on it.</p><p><strong>Tell me more about that very Zen arm and hand pose that the robot adopts after putting the tote on the conveyor.</strong></p><p><strong>Reher:</strong> It does look kind of Zen! If you rewatch our coffee video, you’ll notice the same pose after the robot gets things brewing. This is a reset pose that our controller will go into between manipulation tasks while the robot is awaiting commands to execute either an engineered behavior or a learned policy.</p><p><strong>Are the fingers less fragile than they look?</strong></p><p><strong>Reher: </strong>They are more robust than they look, but not impervious to damage by any means. The design is pretty modular, which is great, meaning that if we damage one or two fingers, there is a small number of parts to swap to get everything back up and running. The current fingers won’t necessarily survive a direct impact from a bad fall, but can pick up totes and do manipulation tasks all day without issues.</p><p><strong>Is the Figure logo footsteps?</strong></p><strong>Reher:</strong> One of the reasons I really like the Figure logo is that it has a bunch of different interpretations depending on how you look at it. In some cases it’s just an F that looks like a footstep plan rollout, while some of the logo animations we have look like active stepping. One other possible interpretation could be an occupancy grid.]]></description><pubDate>Thu, 29 Feb 2024 13:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/figure-robot-video</guid><category>Development</category><category>Figure</category><category>Humanoid robots</category><category>Microsoft</category><category>Nvidia</category><category>Openai</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-close-up-shot-of-a-metal-humanoid-torso-in-a-white-room.png?id=51595601&amp;width=980"></media:content></item><item><title>Video Friday: Pedipulate</title><link>https://spectrum.ieee.org/video-friday-pedipulate</link><description><![CDATA[
  607. <img src="https://spectrum.ieee.org/media-library/image.png?id=51541316&width=1200&height=400&coordinates=0%2C238%2C0%2C238"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://humanrobotinteraction.org/2024/">HRI 2024</a>: 11–15 March 2024, BOULDER, COLO.</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="gd4wyjpxqtu"><em>Legged robots have the potential to become vital in maintenance, home support, and exploration scenarios. In order to interact with and manipulate their environments, most legged robots are equipped with a dedicated robot arm, which means additional mass and mechanical complexity compared to standard legged robots. In this work, we explore pedipulation—using the legs of a legged robot for manipulation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b4b37546966229a3ffa3b016ace6d951" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GD4WyJPXQtU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>This work, by Philip Arm, Mayank Mittal, Hendrik Kolvenbach, and Marco Hutter from ETH Zurich’s Robotic Systems Lab, will be presented at the IEEE International Conference on Robotics and Automation (<a href="https://2024.ieee-icra.org/" target="_blank">ICRA 2024</a>) in May, in Japan (see events calendar above).</p><p>[ <a href="https://sites.google.com/leggedrobotics.com/pedipulate">Pedipulate</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vixalvxtqpg">I learned a new word today: “stigmergy.” Stigmergy is a kind of group coordination that’s based on environmental modification. Like, when insects leave pheromone trails, they’re not directly sending messages to other individuals. But as a group, ants are able to manifest surprisingly complex coordinated behaviors. Cool, right? Researchers at IRIDIA are exploring the possibilities for robots using stigmergy with a cool “artificial pheromone” system using a UV-sensitive surface.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="99633cf03f2c0ee56b5479e863305371" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vIxAlVXtqpg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>“Automatic Design of Stigmergy-Based Behaviors for Robot Swarms,” by Muhammad Salman, David Garzón Ramos, and Mauro Birattari, is published in the journal <em>Communications Engineering</em>.</p><p>[ <a href="https://www.nature.com/articles/s44172-024-00175-7"><em>Nature</em></a> ] via [ <a href="https://iridia.ulb.ac.be/~mbiro/habanero/">IRIDIA</a> ]</p><p>Thanks, David!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xfkfojbei-g"><em>Filmed in July 2017, this video shows Atlas walking through a “hatch” on a pitching surface. This skill uses autonomous behaviors, with the robot not knowing about the rocking world. Robot built by Boston Dynamics for the DARPA Robotics Challenge in 2013. Software by IHMC Robotics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="547e0dd98a406970f520eff907b80f6f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xFkFoJBei-g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robots.ihmc.us/">IHMC</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ek_yuhc_css">That IHMC video reminded me of the SAFFiR program for <a href="https://spectrum.ieee.org/virginia-techs-robots-will-save-you-from-disasters" target="_blank">Shipboard Autonomous Firefighting Robots</a>, which is responsible for a bunch of really cool research in partnership with the <a href="https://www.nrl.navy.mil/" target="_blank">U.S. Naval Research Laboratory</a>. NRL did some interesting stuff with <a href="https://spectrum.ieee.org/tag/nexi" target="_blank">Nexi robots</a> from MIT and made their own videos. That effort I think didn’t get nearly enough credit for being very entertaining while communicating important robotics research.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7f886d35f0715d46978ef5185f00b69f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EK_YUhc_css?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nrl.navy.mil/itd/aic/">NRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="oq2udtyd2jw">I want more robot videos with this energy.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="16dd3caa700f8f0c22497ce5e7294ea0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oq2Udtyd2jw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.csail.mit.edu/research/distributed-robotics-laboratory">MIT CSAIL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4az8qzfbjha"><em>Large industrial-asset operators increasingly use robotics to automate hazardous work at their facilities. This has led to soaring demand for autonomous inspection solutions like ANYmal. Series production by our partner Zollner enables ANYbotics to supply our customers with the required quantities of robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="604ef99a01906761e531aa3651bc81c6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4az8qzFBJhA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.anybotics.com/robotics/anymal/">ANYbotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xrzlbkshv_s">This week is <a href="https://mediahub.unl.edu/media/21920" target="_blank">Grain Bin Safety Week</a>, and Grain Weevil is here to help.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1c95a509777511e7ce3f933e958e3ced" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XrZlBkShv_s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.grainweevil.com/">Grain Weevil</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ql34bxzb-70">Oof, this is some heavy, heavy deep-time stuff.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4be0bb24a974cc32051bb4aa9b0a9382" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ql34bXZb-70?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.anybotics.com/news/autonomous-inspection-for-enhanced-nuclear-safety-at-onkalo/">Onkalo</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rw94qcrhe9q">And now, this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bc69add46a60770ef778097a79605a94" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RW94QcrHE9Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/@RozenZebet">RozenZebet</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pdamz3oyugc"><em>Hawkeye is a real-time multimodal conversation-and-interaction agent for the Boston Dynamics’ mobile robot Spot. Leveraging OpenAI’s experimental GPT-4 Turbo and Vision AI models, Hawkeye aims to empower everyone, from seniors to health care professionals in forming new and unique interactions with the world around them.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="abc2bbb53f4dece206bde5fad935931b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PDaMZ3OyuGc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>That moment at 1:07 is so relatable.</p><p>[ <a href="https://github.com/darryltanzil/spot-boston-dynamics">Hawkeye</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="sposwz_4zvo">Wing would really prefer that if you find one of their drones on the ground, you don’t run off with it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="becb1cddb2a522457b936308151f0505" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/spoSwZ_4Zvo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://wing.com/">Wing</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wprufur-zn8"><em>The rover Artemis, developed at the DFKI Robotics Innovation Center, has been equipped with a penetrometer that measures the soil’s penetration resistance to obtain precise information about soil strength. The video showcases an initial test run with the device mounted on the robot. During this test, the robot was remotely controlled, and the maximum penetration depth was limited to 15 millimeters.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c4bf852a9a49fc31bbd01f61322cac79" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WpRUFUR-ZN8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotik.dfki-bremen.de/en/research/robot-systems/artemis">DFKI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yr7xzgwidme"><em>To efficiently achieve complex humanoid loco-manipulation tasks in industrial contexts, we propose a combined vision-based tracker-localization interplay integrated as part of a task-space whole-body-optimization control. Our approach allows humanoid robots, targeted for industrial manufacturing, to manipulate and assemble large-scale objects while walking.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b4503bf01b4e4fb0dedd1504acddf4af" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YR7xZGwIdmE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hal.science/hal-04125159">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ctydxeegbbu"><em>We developed a novel multibody robot (called the Two-Body Bot) consisting of two small-footprint mobile bases connected by a four-bar linkage where handlebars are mounted. Each base measures only 29.2 centimeters wide, making the robot likely the slimmest ever developed for mobile postural assistance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="430da0f7164f2c697ef51b872e5a9157" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CTyDXEEGbbU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://darbelofflab.mit.edu/nri-eldercare/#handle-anywhere">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="5vnbbcm_zyq">Lex Fridman interviews Marc Raibert.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="944ec7a8c27ec901362fa20b6c1ac7ad" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5VnbBCm_ZyQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://lexfridman.com/marc-raibert/">Lex Fridman</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 23 Feb 2024 16:53:19 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-pedipulate</guid><category>Humanoid robots</category><category>Industrial robotics</category><category>Legged robots</category><category>Quadruped robots</category><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=51541316&amp;width=980"></media:content></item><item><title>Let Robots Do Your Lab Work</title><link>https://spectrum.ieee.org/air-force-research-ares-os</link><description><![CDATA[
  608. <img src="https://spectrum.ieee.org/media-library/image.webp?id=51522099&width=980"/><br/><br/><iframe frameborder="no" height="180" scrolling="no" seamless="" src="https://share.transistor.fm/e/a06d86c9" width="100%">
  609. </iframe><p><strong>Dina Genkina:</strong> Hi. I’m <a href="https://spectrum.ieee.org/u/dina_genkina" target="_self">Dina Genkina</a> for <em>IEEE Spectrum</em>‘s <em>Fixing the Future</em>. Before we start, I want to tell you that you can get the latest coverage from some of Spectrum’s most important beeps, including AI, Change, and Robotics, by signing up for one of our free newsletters. Just go to <a href="https://spectrum.ieee.org/newsletters/" target="_self">spectrum.ieee.org\newsletters</a> to subscribe. Today, a guest is Dr. <a href="https://www.linkedin.com/in/benji-maruyama-563b57b8/" rel="noopener noreferrer" target="_blank">Benji Maruyama,</a> a Principal Materials Research Engineer at the <a href="https://www.afrl.af.mil/" rel="noopener noreferrer" target="_blank">Air Force Research Laboratory</a>, or AFRL. Dr. Maruyama is a materials scientist, and his research focuses on carbon nanotubes and making research go faster. But he’s also a man with a dream, a dream of a world where science isn’t something done by a select few locked away in an ivory tower, but something most people can participate in. He hopes to start what he calls the billion scientist movement by building AI-enabled research robots that are accessible to all. Benji, thank you for coming on the show.</p><p>Benji Maruyama: Thanks, Dina. Great to be with you. I appreciate the invitation.</p><p>Genkina: Yeah. So let’s set the scene a little bit for our listeners. So you advocate for this billion scientist movement. If everything works amazingly, what would this look like? Paint us a picture of how AI will help us get there.</p><p>Maruyama: Right, great. Thanks. Yeah. So one of the things as you set the scene there is right now, to be a scientist, most people need to have access to a big lab with very expensive equipment. So I think top universities, government labs, industry folks, lots of equipment. It’s like a million dollars, right, to get one of them. And frankly, just not that many of us have access to those kinds of instruments. But at the same time, there’s probably a lot of us who want to do science, right? And so how do we make it so that anyone who wants to do science can try, can have access to instruments so that they can contribute to it. So that’s the basics behind citizen science or democratization of science so that everyone can do it. And one way to think of it is what happened with <a href="https://spectrum.ieee.org/tag/3d-printing" target="_self">3D printing</a>. It used to be that in order to make something, you had to have access to a machine shop or maybe get fancy tools and dyes that could cost tens of thousands of dollars a pop. Or if you wanted to do electronics, you had to have access to very expensive equipment or services. But when 3D printers came along and became very inexpensive, all of a sudden now, anyone with access to a 3D printer, so maybe in a school or a library or a makerspace could print something out. And it could be something fun, like a game piece, but it could also be something that got you to an invention, something that was maybe useful to the community, was either a prototype or an actual working device.</p><p>And so really, 3D printing democratized manufacturing, right? It made it so that many more of us could do things that before only a select few could. And so that’s where we’re trying to go with science now, is that instead of only those of us who have access to big labs, we’re building research robots. And when I say we, we’re doing it, but now there are a lot of others who are doing it as well, and I’ll get into that. But the example that we have is that we took a 3D printer that you can buy off the internet for less than $300. Plus a couple of extra parts, a webcam, a Raspberry Pi board, and a tripod really, so only four components. You can get them all for $300. Load them with open-source software that was developed by AFIT, the <a href="https://www.afit.edu/" rel="noopener noreferrer" target="_blank">Air Force Institute of Technology</a>. So <a href="https://www.afit.edu/BIOS/bio.cfm?facID=172" rel="noopener noreferrer" target="_blank">Burt Peterson</a> and Greg Captain [inaudible]. We worked together to build this fully autonomous 3D printing robot that taught itself how to print to better than manufacturer’s specifications. So that was a really fun advance for us, and now we’re trying to take that same idea and broaden it. So I’ll turn it back over to you.</p><p>Genkina: Yeah, okay. So maybe let’s talk a little bit about this automated research robot that you’ve made. So right now, it works with a 3D printer, but is the big picture that one day it’s going to give people access to that million dollar lab? How would that look like?</p><p>Maruyama: Right, so there are different models out there. One, we just did a workshop at the University of— sorry, North Carolina State University about that very problem, right? So there’s two models. One is to get low-cost scientific tools like the 3D printer. There’s a couple of different chemistry robots, one out of University of Maryland and NIST, one out of University of Washington that are in the sort of 300 to 1,000 dollars range that makes it accessible. The other part is kind of the user facility model. So in the US, the Department of Energy National Labs have many user facilities where you can apply to get time on very expensive instruments. Now we’re talking tens of millions. For example, Brookhaven has a synchrotron light source where you can sign up and it doesn’t cost you any money to use the facility. And you can get days on that facility. And so that’s already there, but now the advances are that by using this, autonomy, autonomous closed loop experimentation, that the work that you do will be much faster and much more productive. So, for example, on ARES, our Autonomous Research System at AFRL, we actually were able to do experiments so fast that a professor who came into my lab said, it just took me aside and said, “Hey, Benji, in a week’s worth of time, I did a dissertation’s worth of research.” So maybe five years worth of research in a week. So imagine if you keep doing that week after week after week, how fast research goes. So it’s very exciting.</p><p>Genkina: Yeah, so tell us a little bit about how that works. So what’s this system that has sped up five years of research into a week and made graduate students obsolete? Not yet, not yet. How does that work? Is that the 3D printer system or is that a—</p><p>Maruyama: So we started with our system to grow carbon nanotubes. And I’ll say, actually, when we first thought about it, your comment about graduate students being absolute— obsolete, sorry, is interesting and important because, when we first built our system that worked it 100 times faster than normal, I thought that might be the case. We called it sort of graduate student out of the loop. But when I started talking with people who specialize in autonomy, it’s actually the opposite, right? It’s actually empowering graduate students to go faster and also to do the work that they want to do, right? And so just to digress a little bit, if you think about farmers before the Industrial Revolution, what were they doing? They were plowing fields with oxen and beasts of burden and hand plows. And it was hard work. And now, of course, you wouldn’t ask a farmer today to give up their tractor or their combine harvester, right? They would say, of course not. So very soon, we expect it to be the same for researchers, that if you asked a graduate student to give up their autonomous research robot five years from now, they’ll say, “Are you crazy? This is how I get my work done.”</p><p>But for our original ARES system, it worked on the synthesis of <a href="https://spectrum.ieee.org/tag/carbon-nanotubes" target="_self">carbon nanotubes</a>. So that meant that what we’re doing is trying to take this system that’s been pretty well studied, but we haven’t figured out how to make it at scale. So at hundreds of millions of tons per year, sort of like polyethylene production. And part of that is because it’s slow, right? One experiment takes a day, but also because there are just so many different ways to do a reaction, so many different combinations of temperature and pressure and a dozen different gases and half the periodic table as far as the catalyst. It’s just too much to just brute force your way through. So even though we went from experiments where we could do 100 experiments a day instead of one experiment a day, just that combinatorial space was vastly overwhelmed our ability to do it, even with many research robots or many graduate students. So the idea of having artificial intelligence algorithms that drive the research is key. And so that ability to do an experiment, see what happened, and then analyze it, iterate, and constantly be able to choose the optimal next best experiment to do is where ARES really shines. And so that’s what we did. ARES taught itself how to grow carbon nanotubes at controlled rates. And we were the first ones to do that for material science in our 2016 publication.</p><p>Genkina: That’s very exciting. So maybe we can peer under the hood a little bit of this AI model. How does the magic work? How does it pick the next best point to take and why it’s better than you could do as a graduate student or researcher?</p><p>Maruyama: Yeah, and so I think it’s interesting, right? In science, a lot of times we’re taught to hold everything constant, change one variable at a time, search over that entire space, see what happened, and then go back and try something else, right? So we reduce it to one variable at a time. It’s a reductionist approach. And that’s worked really well, but a lot of the problems that we want to go after are simply too complex for that reductionist approach. And so the benefit of being able to use artificial intelligence is that high dimensionality is no problem, right? Tens of dimensions search over very complex high-dimensional parameter space, which is overwhelming to humans, right? Is just basically bread and butter for AI. The other part to it is the iterative part. The beauty of doing autonomous experimentation is that you’re constantly iterating. You’re constantly learning over what just happened. You might also say, well, not only do I know what happened experimentally, but I have other sources of prior knowledge, right? So for example, ideal gas law says that this should happen, right? Or Gibbs phase rule might say, this can happen or this can’t happen. So you can use that prior knowledge to say, “Okay, I’m not going to do those experiments because that’s not going to work. I’m going to try here because this has the best chance of working.”</p><p>And within that, there are many different machine learning or artificial intelligence algorithms. Bayesian optimization is a popular one to help you choose what experiment is best. There’s also new AI that people are trying to develop to get better search.</p><p>Genkina: Cool. And so the software part of this autonomous robot is available for anyone to download, which is also really exciting. So what would someone need to do to be able to use that? Do they need to get a 3D printer and a Raspberry Pi and set it up? And what would they be able to do with it? Can they just build carbon nanotubes or can they do more stuff?</p><p>Maruyama: Right. So what we did, we built ARES OS, which is our open source software, and we’ll make sure to get you <a href="https://github.com/AFRL-ARES/ARES_OS" rel="noopener noreferrer" target="_blank">the GitHub link</a> so that anyone can download it. And the idea behind ARES OS is that it provides a software framework for anyone to build their own autonomous research robot. And so the 3D printing example will be out there soon. But it’s the starting point. Of course, if you want to build your own new kind of robot, you still have to do the software development, for example, to link the ARES framework, the core, if you will, to your particular hardware, maybe your particular camera or 3D printer, or pipetting robot, or spectrometer, whatever that is. We have examples out there and we’re hoping to get to a point where it becomes much more user-friendly. So having direct Python connects so that you don’t— currently it’s programmed in C#. But to make it more accessible, we’d like it to be set up so that if you can do Python, you can probably have good success in building your own research robot.</p><p>Genkina: Cool. And you’re also working on a educational version of this, I understand. So what’s the status of that and what’s different about that version?</p><p>Maruyama: Yeah, right. So the educational version is going to be-- its sort of composition of a combination of hardware and software. So what we’re starting with is a low-cost 3D printer. And we’re collaborating now with the <a href="https://engineering.buffalo.edu/materials-design-innovation.html" rel="noopener noreferrer" target="_blank">University at Buffalo, Materials Design Innovation Department</a>. And we’re hoping to build up a robot based on a 3D printer. And we’ll see how it goes. It’s still evolving. But for example, it could be based on this very inexpensive $200 3D printer. It’s an <a href="https://store.creality.com/collections/ender-series-3d-printer" rel="noopener noreferrer" target="_blank">Ender 3D printer</a>. There’s another printer out there that’s based on <a href="https://jubilee3d.com/index.php?title=Main_Page" rel="noopener noreferrer" target="_blank">University of Washington’s Jubilee printer</a>. And that’s a very exciting development as well. So professors <a href="https://mse.washington.edu/facultyfinder/lilo-pozzo" rel="noopener noreferrer" target="_blank">Lilo Pozzo</a> and <a href="https://www.hcde.washington.edu/peek" rel="noopener noreferrer" target="_blank">Nadya Peek</a> at the University of Washington built this Jubilee robot with that idea of accessibility in mind. And so combining our ARES OS software with their Jubilee robot hardware is something that I’m very excited about and hope to be able to move forward on.</p><p>Genkina: What’s this Jubilee 3D printer? How is it different from a regular 3D printer?</p><p>Maruyama: It’s very open source. Not all 3D printers are open source and it’s based on a gantry system with interchangeable heads. So for example, you can get not just a 3D printing head, but other heads that might do things like do indentation, see how stiff something is, or maybe put a camera on there that can move around. And so it’s the flexibility of being able to pick different heads dynamically that I think makes it super useful. For the software, right, we have to have a good, accessible, user-friendly graphical user interface, a GUI. That takes time and effort, so we want to work on that. But again, that’s just the hardware software. Really to make ARES a good educational platform, we need to make it so that a teacher who’s interested can have the lowest activation barrier possible, right? We want she or he to be able to pull a lesson plan off of the internet, have supporting YouTube videos, and actually have the material that is a fully developed curriculum that’s mapped against state standards.</p><p>So that, right now, if you’re a teacher who— let’s face it, teachers are already overwhelmed with all that they have to do, putting something like this into their curriculum can be a lot of work, especially if you have to think about, well, I’m going to take all this time, but I also have to meet all of my teaching standards, all the state curriculum standards. And so if we build that out so that it’s a matter of just looking at the curriculum and just checking off the boxes of what state standards it maps to, then that makes it that much easier for the teacher to teach.</p><p>Genkina: Great. And what do you think is the timeline? Do you expect to be able to do this sometime in the coming year?</p><p>Maruyama: That’s right. These things always take longer than hoped for than expected, but we’re hoping to do it within this calendar year and very excited to get it going. And I would say for your listeners, if you’re interested in working together, please let me know. We’re very excited about trying to involve as many people as we can.</p><p>Genkina: Great. Okay, so you have the educational version, and you have the more research geared version, and you’re working on making this educational version more accessible. Is there something with the research version that you’re working on next, how you’re hoping to upgrade it, or is there something you’re using it for right now that you’re excited about?</p><p>There’s a number of things that we are very excited about the possibility of carbon nanotubes being produced at very large scale. So right now, people may remember carbon nanotubes as that great material that sort of never made it and was very overhyped. But there’s a core group of us who are still working on it because of the important promise of that material. So it’s material that is super strong, stiff, lightweight, electrically conductive. Much better than silicon as a digital electronics compute material. All of those great things, except we’re not making it at large enough scale. It’s actually used pretty significantly in lithium-ion batteries. It’s an important application. But other than that, it’s sort of like where’s my flying car? It’s never panned out. But there’s, as I said, a group of us who are working to really produce carbon nanotubes at much larger scale. So large scale for nanotubes now is sort of in the kilogram or ton scale. But what we need to get to is hundreds of millions of tons per year production rates. And why is that? Well, there’s a great effort that came out of <a href="https://spectrum.ieee.org/tag/Arpa-e" target="_self">ARPA-E</a>. So the Department of Energy Advanced Research Projects Agency and the E is for Energy in that case.</p><p>So they funded a collaboration between Shell Oil and Rice University to pyrolyze methane, so natural gas into hydrogen for the hydrogen economy. So now that’s a clean burning fuel plus carbon. And instead of burning the carbon to CO2, which is what we now do, right? We just take natural gas and feed it through a turbine and generate electric power instead of— and that, by the way, generates so much CO2 that it’s causing global climate change. So if we can do that pyrolysis at scale, at hundreds of millions of tons per year, it’s literally a save the world proposition, meaning that we can avoid so much CO2 emissions that we can reduce global CO2 emissions by 20 to 40 percent. And that is the save the world proposition. It’s a huge undertaking, right? That’s a big problem to tackle, starting with the science. We still don’t have the science to efficiently and effectively make carbon nanotubes at that scale. And then, of course, we have to take the material and turn it into useful products. So the batteries is the first example, but thinking about replacing copper for electrical wire, replacing steel for structural materials, aluminum, all those kinds of applications. But we can’t do it. We can’t even get to that kind of development because we haven’t been able to make the carbon nanotubes at sufficient scale.</p><p>So I would say that’s something that I’m working on now that I’m very excited about and trying to get there, but it’s going to take some good developments in our research robots and some very smart people to get us there.</p><p>Genkina: Yeah, it seems so counterintuitive that making everything out of carbon is good for lowering carbon emissions, but I guess that’s the break.</p><p>Maruyama: Yeah, it is interesting, right? So people talk about carbon emissions, but really, the molecule that’s causing global warming is carbon dioxide, CO2, which you get from burning carbon. And so if you take that methane and parallelize it to carbon nanotubes, that carbon is now sequestered, right? It’s not going off as CO2. It’s staying in solid state. And not only is it just not going up into the atmosphere, but now we’re using it to replace steel, for example, which, by the way, steel, aluminum, copper production, all of those things emit lots of CO2 in their production, right? They’re energy intensive as a material production. So it’s kind of ironic.</p><p>Genkina: Okay, and are there any other research robots that you’re excited about that you think are also contributing to this democratization of science process?</p><p>Maruyama: Yeah, so we talked about Jubilee, the NIST robot, which is from Professor Ichiro Takeuchi at Maryland and Gilad Kusne at NIST, National Institute of Standards and Technology. Theirs is fun too. It’s LEGO as. So it’s actually based on a LEGO robotics platform. So it’s an actual chemistry robot built out of Legos. So I think that’s fun as well. And you can imagine, just like we have LEGO robot competitions, we can have autonomous research robot competitions where we try and do research through these robots or competitions where everybody sort of starts with the same robot, just like with LEGO robotics. So that’s fun as well. But I would say there’s a growing number of people doing these kinds of, first of all, low-cost science, accessible science, but in particular low-cost autonomous experimentation.</p><p>Genkina: So how far are we from a world where a high school student has an idea and they can just go and carry it out on some autonomous research system at some high-end lab?</p><p>Maruyama: That’s a really good question. I hope that it’s going to be in 5 to 10 years, that it becomes reasonably commonplace. But it’s going to take still some significant investment to get this going. And so we’ll see how that goes. But I don’t think there are any scientific impediments to getting this done. There is a significant amount of engineering to be done. And sometimes we hear, oh, it’s just engineering. The engineering is a significant problem. And it’s work to get some of these things accessible, low cost. But there are lots of great efforts. There are people who have used CDs, compact discs to make spectrometers out of. There are lots of good examples of citizen science out there. But it’s, I think, at this point, going to take investment in software, in hardware to make it accessible, and then importantly, getting students really up to speed on what AI is and how it works and how it can help them. And so I think it’s actually really important. So again, that’s the democratization of science is if we can make it available to everyone and accessible, then that helps people, everyone contribute to science. And I do believe that there are important contributions to be made by ordinary citizens, by people who aren’t you know PhDs working in a lab.</p><p>And I think there’s a lot of science out there to be done. If you ask working scientists, almost no one has run out of ideas or things they want to work on. There’s many more scientific problems to work on than we have the time where people are funding to work on. And so if we make science cheaper to do, then all of a sudden, more people can do science. And so those questions start to be resolved. And so I think that’s super important. And now we have, instead of, just those of us who work in big labs, you have millions, tens of millions, up to a billion people, that’s the billion scientist idea, who are contributing to the scientific community. And that, to me, is so powerful that many more of us can contribute than just the few of us who do it right now.</p><p>Genkina: Okay, that’s a great place to end on, I think. So, today we spoke to Dr. Benji Maruyama, a material scientist at AFRL, about his efforts to democratize scientific discovery through automated research robots. For IEEE Spectrum, I’m Dina Genkina, and I hope you’ll join us next time on Fixing the Future.</p>]]></description><pubDate>Wed, 21 Feb 2024 17:33:43 +0000</pubDate><guid>https://spectrum.ieee.org/air-force-research-ares-os</guid><category>Fixing the future</category><category>Type:podcast</category><category>Robots</category><category>Afrl</category><category>Automation</category><category>Ares os</category><dc:creator>Dina Genkina</dc:creator><media:content medium="image" type="image/jpeg" url="https://assets.rbl.ms/51522099/origin.webp"></media:content></item><item><title>Video Friday: Acrobot Error</title><link>https://spectrum.ieee.org/video-friday-acrobot-error</link><description><![CDATA[
  610. <img src="https://spectrum.ieee.org/media-library/image.gif?id=51482626&width=1200&height=400&coordinates=0%2C415%2C0%2C416"/><br/><br/><p>
  611. Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/>
  612. </p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://humanrobotinteraction.org/2024/">HRI 2024</a>: 11–15 March 2024, BOULDER, COLO.</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>
  613. Enjoy today’s videos!
  614. </p><div class="horizontal-rule">
  615. </div><div style="page-break-after: always">
  616. <span style="display:none"> </span>
  617. </div><p class="rm-anchors" id="gt6olyyd3n4">
  618. Just like a real human, Acrobot will sometimes kick you in the face.
  619. </p><p class="shortcode-media shortcode-media-youtube">
  620. <span class="rm-shortcode" data-rm-shortcode-id="3334739d564ceaedf148f129005e8a0b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gT6oLYYd3n4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  621. </p><p>
  622. [ <a href="https://danielsimu.com/acrobotics/">Acrobotics</a> ]
  623. </p><p>
  624. Thanks, Elizabeth!
  625. </p><div class="horizontal-rule">
  626. </div><p class="rm-anchors" id="21f7iof9bms">
  627. You had me at “wormlike, limbless robots.”
  628. </p><p class="shortcode-media shortcode-media-youtube">
  629. <span class="rm-shortcode" data-rm-shortcode-id="04804c773eebe3ee6e0c340dd60f03e7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/21F7IOF9BMs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  630. </p><p>
  631. [ <a href="https://ty-wang.github.io/MechIntelligence/">GitHub</a> ] via [ <a href="https://theconversation.com/we-designed-wormlike-limbless-robots-that-navigate-obstacle-courses-they-could-be-used-for-search-and-rescue-one-day-220828">Georgia Tech</a> ]
  632. </p><div class="horizontal-rule">
  633. </div><blockquote class="rm-anchors" id="raprq2lyeze">
  634. <em>Filmed in July 2017, this video shows us using Atlas to put out a “fire” on our loading dock. This uses a combination of teleoperation and autonomous behaviors through a single, remote computer. Robot built by Boston Dynamics for the DARPA Robotics Challenge in 2013. Software by IHMC Robotics.</em>
  635. </blockquote><p class="shortcode-media shortcode-media-youtube">
  636. <span class="rm-shortcode" data-rm-shortcode-id="f8c7f495f9b7554dce027b26086c7489" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Raprq2LyEZE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  637. </p><p>
  638. I would say that in the middle of a rainstorm is probably the best time to start a fire that you expect to be extinguished by a robot.
  639. </p><p>
  640. [ <a href="https://robots.ihmc.us/">IHMC</a> ]
  641. </p><div class="horizontal-rule">
  642. </div><blockquote class="rm-anchors" id="mwn5kjwenas">
  643. <em>We’re hard at work, but Atlas still has time for a dance break.</em>
  644. </blockquote><p class="shortcode-media shortcode-media-youtube">
  645. <span class="rm-shortcode" data-rm-shortcode-id="5749d903e772bcfae0b530b6af31be9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mWn5KjWeNas?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  646. </p><p>
  647. [ <a href="https://bostondynamics.com/atlas/">Boston Dynamics</a> ]
  648. </p><div class="horizontal-rule">
  649. </div><p class="rm-anchors" id="0rrvoakwdgo">
  650. This is pretty cool: BruBotics is testing its self-healing robotics gripper technology on commercial grippers from Festo.
  651. </p><p class="shortcode-media shortcode-media-youtube">
  652. <span class="rm-shortcode" data-rm-shortcode-id="d8a949e5d9970c60ca13540515389cf8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0RrVoakWdgo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  653. </p><p>
  654. [ <a href="https://ieeexplore.ieee.org/document/10423895">Paper</a> ] via [ <a href="https://www.brubotics.eu/">BruBotics</a> ]
  655. </p><p>
  656. Thanks, Bram!
  657. </p><div class="horizontal-rule">
  658. </div><p class="rm-anchors" id="srzxqe-hlc0">
  659. You should read <a href="https://spectrum.ieee.org/hello-robot-stretch-3" target="_blank">our in-depth article on Stretch 3</a>, so if you haven’t yet, consider this as just a teaser.
  660. </p><p class="shortcode-media shortcode-media-youtube">
  661. <span class="rm-shortcode" data-rm-shortcode-id="d85e7e2468a6a24490a25b99b6217cc7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/srzXqE-hlc0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  662. </p><p>
  663. [ <a href="https://hello-robot.com/">Hello Robot</a> ]
  664. </p><div class="horizontal-rule">
  665. </div><blockquote class="rm-anchors" id="0_cjqzmzis4">
  666. <em>Inspired by caregiving experts, we proposed a bimanual interactive robotic dressing assistance scheme, which is unprecedented in previous research. In the scheme, an interactive robot joins hands with the human thus supporting/guiding the human in the dressing process, while the dressing robot performs the dressing task. This work represents a paradigm shift of thinking of the dressing assistance task from one-robot-to-one-arm to two-robot-to-one-arm.</em>
  667. </blockquote><p class="shortcode-media shortcode-media-youtube">
  668. <span class="rm-shortcode" data-rm-shortcode-id="07384f315b1bd00d9ef9652fdb1626cc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0_cJqZmZIS4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  669. </p><p>
  670. [ <a href="https://sites.google.com/view/bimanualassitdressing/home">Project</a> ]
  671. </p><p>
  672. Thanks, Jihong!
  673. </p><div class="horizontal-rule">
  674. </div><p class="rm-anchors" id="tne9nvodu0s">
  675. Tony Punnoose Valayil from the Bulgarian Academy of Sciences Institute of Robotics wrote in to share some very low-cost hand-rehabilitation robots for home use.
  676. </p><p class="shortcode-media shortcode-media-youtube">
  677. <span class="rm-shortcode" data-rm-shortcode-id="1b7158e26042e082bc1c7a30d2c76489" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Tne9NvoDU0s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  678. </p><blockquote>
  679. In this video, we present a robot-assisted rehabilitation of the wrist joint which can aid in restoring the strength that has been lost across the upper limb due to stroke. This robot is very cost-effective and can be used for home rehabilitation.
  680. </blockquote><p class="shortcode-media shortcode-media-youtube">
  681. <span class="rm-shortcode" data-rm-shortcode-id="449f853db93fd4f1b48345d2f029d44e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Flyx2x1G-cM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  682. </p><blockquote>
  683. In this video, we present an exoskeleton robot which can be used at home for rehabilitating the index and middle fingers of stroke-affected patients. This robot is built at a cost of 50 euros for patients who are not financially independent to get better treatment.</blockquote><p>
  684. [ <a href="http://www.ir.bas.bg/index_en.html">BAS</a> ]
  685. </p><div class="horizontal-rule">
  686. </div><p class="rm-anchors" id="0djr8fetpq0">
  687. Some very impressive work here from the Norwegian University of Science and Technology (<a href="https://en.wikipedia.org/wiki/Norwegian_University_of_Science_and_Technology" target="_blank">NTNU</a>), showing a drone tracking its position using radar and lidar-based odometry in some nightmare (for robots) environments, including a long tunnel that looks the same everywhere and a hallway full of smoke.
  688. </p><p class="shortcode-media shortcode-media-youtube">
  689. <span class="rm-shortcode" data-rm-shortcode-id="bc5a0356ca298331d7172e68486a35e3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0dJr8fETpQ0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  690. </p><p>
  691. [ <a href="https://arxiv.org/abs/2310.16658">Paper</a> ] via [ <a href="https://github.com/ntnu-arl/lidar_degeneracy_datasets">GitHub</a> ]
  692. </p><div class="horizontal-rule">
  693. </div><p class="rm-anchors" id="5u-bszrx-t4">
  694. I’m sorry, but people should really know better than to make videos like this for social robot crowdfunding by now.
  695. </p><p class="shortcode-media shortcode-media-youtube">
  696. <span class="rm-shortcode" data-rm-shortcode-id="9decd93a25ef88830252bedd935e4278" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5U-bSZRx-t4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  697. </p><p>
  698. It’s on Kickstarter for about $300, and the fact that it’s been funded so quickly tells me that people have already forgotten about the social robotpocalypse.
  699. </p><p>
  700. [ <a href="https://www.kickstarter.com/projects/doly/doly-more-than-a-robot">Kickstarter</a> ]
  701. </p><div class="horizontal-rule">
  702. </div><blockquote class="rm-anchors" id="vvpgsd9jsw0">
  703. <em>Introducing Orbit, your portal for managing asset-intensive facilities through real-time and predictive intelligence. Orbit brings a whole new suite of fleet management capabilities and will unify your ecosystem of Boston Dynamics robots, starting with Spot.</em>
  704. </blockquote><p class="shortcode-media shortcode-media-youtube">
  705. <span class="rm-shortcode" data-rm-shortcode-id="78ca765204cebb9f5913cacd8dd8f0e7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VVpgsd9Jsw0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  706. </p><p>
  707. [ <a href="https://bostondynamics.com/blog/robot-fleet-management-lifts-off-with-spot/">Boston Dynamics</a> ]
  708. </p><div class="horizontal-rule">
  709. </div>]]></description><pubDate>Fri, 16 Feb 2024 15:31:44 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-acrobot-error</guid><category>Atlas</category><category>Boston dynamics</category><category>Darpa robotics challenge</category><category>Georgia tech</category><category>Hello robot</category><category>Kickstarter</category><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/image.gif?id=51482626&amp;width=980"></media:content></item><item><title>Stretch 3 Brings Us Closer to Realistic Home Robots</title><link>https://spectrum.ieee.org/hello-robot-stretch-3</link><description><![CDATA[
  710. <img src="https://spectrum.ieee.org/media-library/a-picture-of-a-tall-skinny-robot-with-a-two-wheeled-base-and-a-single-arm-with-a-flexible-wrist-on-the-end.png?id=51473667&width=3001&height=2307&coordinates=417%2C0%2C422%2C93"/><br/><br/><p>A lot has happened in robotics over the last year. Everyone is wondering how AI will transform robotics, and everyone else is wondering whether humanoids are going to blow it or not, and the rest of us are busy trying not to get completely run over as things shake out however they’re going to shake out. </p><p>Meanwhile, over at <a href="https://spectrum.ieee.org/hello-robots-stretch-mobile-manipulator" target="_blank">Hello Robot</a>, they’ve been focused on making their Stretch robot do useful things while also being affordable and reliable and affordable and expandable and affordable and community-friendly and affordable. Which are some really hard and important problems that can sometimes get overwhelmed by flashier things.</p><p>Today, <a href="https://hello-robot.com/stretch-3-product" rel="noopener noreferrer" target="_blank">Hello Robot is announcing Stretch 3</a>, which provides a suite of upgrades to what they (quite accurately) call “the world’s only lightweight, capable, developer-friendly mobile manipulator.” And impressively, they’ve managed to do it without forgetting about that whole “affordable” part.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  711. <span class="rm-shortcode" data-rm-shortcode-id="314bde5772f929cd623891843c2b8c9c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/haauI2x2U1E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  712. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Stretch 3 looks about the same as the previous versions, but there are important upgrades that are worth highlighting. The most impactful: Stretch 3 now comes with the dexterous wrist kit that used to be an add-on, and it now also includes an <a href="https://www.intelrealsense.com/depth-camera-d405/" target="_blank">Intel Realsense D405 camera</a> mounted right behind the gripper, which is a huge help for both autonomy and remote teleoperation—a useful new feature shipping with Stretch 3 that’s based on research out of <a href="https://hcrlab.cs.washington.edu/" target="_blank">Maya Cakmak’s lab</a> at the University of Washington, in Seattle. This is an example of turning innovation from the community of Stretch users into product features, a product-development approach that seems to be working well for Hello Robot.<strong></strong></p><p>“We’ve really been learning from our community,” says Hello Robot cofounder and CEO Aaron Edsinger. “In the past year, we’ve seen a real uptick in publications, and it feels like we’re getting to this critical-mass moment with Stretch. So with Stretch 3, it’s about implementing features that our community has been asking us for.”<strong></strong></p><p>“When we launched, we didn’t have a dexterous wrist at the end as standard, because we were trying to start with truly the minimum viable product,” says Hello Robot cofounder and CTO Charlie Kemp. “And what we found is that almost every order was adding the dexterous wrist, and by actually having it come in standard, we’ve been able to devote more attention to it and make it a much more robust and capable system.”</p><p>Kemp says that having Stretch do everything right out of the box (with Hello Robot support) makes a big difference for their research customers. “Making it easier for people to try things—we’ve learned to really value that, because the more steps that people have to go through to experience it, the less likely they are to build on it.” In a research context, this is important because what you’re really talking about is time: The more time people spend just trying to make the robot function, the less time they’ll spend getting the robot to do useful things.</p><p class="shortcode-media shortcode-media-youtube">
  713. <span class="rm-shortcode" data-rm-shortcode-id="048a340e86bbefb9a6ea9581c8db4f9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Rxru8t1x1hg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  714. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>At this point, you may be thinking of Stretch as a research platform. Or you may be thinking of Stretch as a robot for people with disabilities, if you read our <a href="https://spectrum.ieee.org/stretch-assistive-robot" target="_self">November 2023 cover story about Stretch and Henry and Jane Evans</a>. And the robot is definitely both of those things. But Hello Robot stresses that these specific markets are not their end goal—they see Stretch as a generalist mobile manipulator with a future in the home, as suggested by this Stretch 3 promo video:</p><p class="shortcode-media shortcode-media-youtube">
  715. <span class="rm-shortcode" data-rm-shortcode-id="0a6194e086c360103ffb0e1e58aad9ff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ni4p8axgqHM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  716. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Dishes, laundry, bubble cannons: All of these are critical to the functionality of any normal household. “Stretch is an inclusive robot,” says Kemp. “It’s not just for older adults or people with disabilities. We want a robot that can be beneficial for everyone. Our vision, and what we believe will really happen, whether it’s us or someone else, is that there is going to be a versatile, general-purpose home robot. Right now, clearly, our market is not yet consumers in the home. But that’s where we want to go.”</p><p>Robots in the home have been promised for decades, and with the notable exception of the <a data-linked-post="2665973683" href="https://spectrum.ieee.org/south-pole-roombas" target="_blank">Roomba</a>, there has not been a lot of success. The idea of a robot that could handle dishes or laundry is tempting, but is it near-term or medium-term realistic? Edsinger, who has been at this whole robots thing for <a href="https://www.linkedin.com/in/aaron-edsinger/" rel="noopener noreferrer" target="_blank">a very long time</a>, is an optimist about this, and about the role that Stretch will play. “There are so many places where you can see the progress happening—in sensing, in manipulation,” Edsinger says. “I can imagine those things coming together now in a way that I could not have 5 to 10 years ago, when it seemed so incredibly hard.”</p><p class="pull-quote">“We’re very pragmatic about what is possible. And I think that we do believe that things are changing faster than we anticipated—10 years ago, I had a pretty clear linear path in mind for robotics, but it’s hard to really imagine where we’ll be in terms of robot capabilities 10 years from now.” <strong>—Aaron Edsinger, Hello Robot</strong></p><p>I’d say that it’s <em>still</em> incredibly hard, but Edsinger is right that a lot of the pieces do seem to be coming together. Arguably, the hardware is the biggest challenge here, because working in a home puts heavy constraints on what kind of hardware you’re able to use. You’re not likely to see a humanoid in a home anytime soon, because they’d actually be dangerous, and even a quadruped is likely to be more trouble than it’s worth in a home environment. Hello Robot is conscious of this, and that’s been one of the main drivers of the design of Stretch.</p><p>“I think the portability of Stretch is really worth highlighting because there’s just so much value in that which is maybe not obvious,” Edsinger tells us. Being able to just pick up and move a mobile manipulator is not normal. Stretch’s weight (24.5 kilograms) is almost trivial to work with, in sharp contrast with virtually every other mobile robot with an arm: Stretch fits into places that humans fit into, and manages to have a similar workspace as well, and its bottom-heavy design makes it safe for humans to be around. It can’t climb stairs, but it can be carried upstairs, which is a bigger deal than it may seem. It’ll fit in the back of a car, too. Stretch is built to explore the world—not just some facsimile of the world in a research lab.</p><p>“<a href="https://dobb-e.com/" rel="noopener noreferrer" target="_blank">NYU students</a> have been taking Stretch into tens of homes around New York,” says Edsinger. “They carried one up a four-story walk-up. This enables real in-home data collection. And this is where home robots will start to happen—when you can have hundreds of these out there in homes collecting data for machine learning.”</p><p>“That’s where the opportunity is,” adds Kemp. “It’s that engagement with the world about where to apply the technology beneficially. And if you’re in a lab, you’re not going to find it.”</p><p>We’ve seen some compelling examples of this recently, with <a href="https://mobile-aloha.github.io/" rel="noopener noreferrer" target="_blank">Mobile ALOHA</a>. These are robots learning to be autonomous by having humans teleoperate them through common household skills. But the system isn’t particularly portable, and <a href="https://mobile-aloha.github.io/" rel="noopener noreferrer" target="_blank">it costs nearly US $32,000 in parts alone</a>. Don’t get me wrong: I love the research. It’s just going to be difficult to scale, and <a href="https://spectrum.ieee.org/global-robotic-brain" target="_self">in order to collect enough data to effectively tackle the world, scale is critical</a>. Stretch is much easier to scale, because you can just straight up buy one.</p><p>Or two! You may have noticed that some of the Stretch 3 videos have two robots in them, collaborating with each other. This is not yet autonomous, but with two robots, a single human (or a pair of humans) can teleoperate them as if they were effectively a single two-armed robot:</p><p class="shortcode-media shortcode-media-youtube">
  717. <span class="rm-shortcode" data-rm-shortcode-id="f2c94e0759331e3ec652133f40b5f5c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QtG8nJ78x2M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  718. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Essentially, what you’ve got here is a two-armed robot that (very intentionally) has nothing to do with humanoids. As Kemp explains: “We’re trying to help our community and the world see that there is a different path from the human model. We humans tend to think of the preexisting solution: People have two arms, so we think, well, I’m going to need to have two arms on my robot or it’s going to have all these issues.” Kemp points out that robots like Stretch have shown that really quite a lot of things can be done with only one arm, but two arms can still be helpful for a substantial subset of common tasks. “The challenge for us, which I had just never been able to find a solution for, was how you get two arms into a portable, compact, affordable lightweight mobile manipulator. You can’t!”</p><p>But with two Stretches, you have not only two arms but also two shoulders that you can put wherever you want. Washing a dish? You’ll probably want two arms close together for collaborative manipulation. Making a bed? Put the two arms far apart to handle both sides of a sheet at once. It’s a sort of distributed on-demand bimanual manipulation, which certainly adds a little bit of complexity but also solves a bunch of problems when it comes to practical in-home manipulation. Oh—and if those teleop tools look like modified kitchen tongs, that’s because <a href="https://github.com/hello-robot/stretch_dex_teleop" target="_blank">they’re modified kitchen tongs</a>.</p><p>Of course, buying two Stretch robots is twice as expensive as buying a single Stretch robot, and even though Stretch 3’s cost of just under $25,000 is very inexpensive for a mobile manipulator and very affordable in a research or education context, we’re still pretty far from something that most people would be able to afford for themselves. Hello Robot says that producing robots at scale is the answer here, which I’m sure is true, but it can be a difficult thing for a small company to achieve. </p><p>Moving slowly toward scale is at least partly intentional, Kemp tells us. “We’re still in the process of discovering Stretch’s true form—what the robot really should be. If we tried to scale to make lots and lots of robots at a much lower cost before we fundamentally understood what the needs and challenges were going to be, I think it would be a mistake. And there are many gravestones out there for various home-robotics companies, some of which I truly loved. We don’t want to become one of those.”</p><p>This is not to say that Hello Robot isn’t actively trying to make Stretch more affordable, and Edsinger suggests that the next iteration of the robot will be more focused on that. But—and this is super important—Kemp tells us that Stretch has been, is, and will continue to be sustainable for Hello Robot: “We actually charge what we should be charging to be able to have a sustainable business.” In other words, Hello Robot is not relying on some nebulous scale-defined future to transition into a business model that can develop, sell, and support robots. They can do that right now while keeping the lights on. “Our sales have enough margin to make our business work,” says Kemp. “That’s part of our discipline.”</p><a href="https://hello-robot.com/stretch-3-product" rel="noopener noreferrer" target="_blank">Stretch 3 is available now for $24,950</a>, which is just about the same as the cost of Stretch 2 with the optional add-ons included. There are lots and lots of other new features that we couldn’t squeeze into this article, including FCC certification, a more durable arm, and off-board GPU support. You’ll find a handy list of all the upgrades <a href="https://hello-robot.com/stretch-3-whats-new" rel="noopener noreferrer" target="_blank">here</a>.]]></description><pubDate>Thu, 15 Feb 2024 17:28:45 +0000</pubDate><guid>https://spectrum.ieee.org/hello-robot-stretch-3</guid><category>Hello robot</category><category>Home robots</category><category>Mobile manipulator</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-picture-of-a-tall-skinny-robot-with-a-two-wheeled-base-and-a-single-arm-with-a-flexible-wrist-on-the-end.png?id=51473667&amp;width=980"></media:content></item><item><title>What It’s Like to Eat a Robot</title><link>https://spectrum.ieee.org/edible-robots-2667244222</link><description><![CDATA[
  719. <img src="https://spectrum.ieee.org/media-library/a-woman-puts-a-cup-holding-a-wiggly-beige-cylinder-into-her-mouth-and-bites-off-the-top-of-the-cylinder.gif?id=51445183&width=1200&height=400&coordinates=0%2C88%2C0%2C88"/><br/><br/><p><em>Odorigui</em> is a type of Japanese cuisine in which people consume live seafood while it’s still moving, making movement part of the experience. You may have some feelings about this (I definitely do), but from a research perspective, getting into what those feelings are and what they mean isn’t really practical. To do so in a controlled way would be both morally and technically complicated, which is why <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0296697" rel="noopener noreferrer" target="_blank">Japanese researchers have started developing robots that can be eaten as they move</a>, wriggling around in your mouth as you chomp down on them. Welcome to HERI: Human-Edible Robot Interaction.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  720. <span class="rm-shortcode" data-rm-shortcode-id="0bd98a907f7c1e664360a4b6c98c1f72" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OoAszrv5vy4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  721. </p><p>The happy little robot that got its head ripped off by a hungry human (who, we have to say, was exceptionally polite about it) is made primarily of gelatin, along with sugar and apple juice for taste. After all the ingredients were mixed, it was poured into a mold and refrigerated for 12 hours to set, with the resulting texture ending up like a chewy gummy candy. The mold incorporated a couple of air chambers into the structure of the robot, which were hooked up to pneumatics that got the robot to wiggle back and forth.</p><p>Sixteen students at Osaka University got the chance to eat one of these wiggly little robots. The process was to put your mouth around the robot, let the robot move around in there for 10 seconds for the full experience, and then bite it off, chew, and swallow. Japanese people were chosen partly because this research was done in Japan, but also because, according to the paper, “of the cultural influences on the use of onomatopoeic terms.” In Japanese, there are terms that are useful in communicating specific kinds of textures that can’t easily be quantified. </p><p>The participants were  asked a series of questions about their experience, including some heavy ones:</p><ul><li>Did you think what you just ate had animateness?</li><li>Did you feel an emotion in what you just ate?</li><li>Did you think what you just ate had intelligence?</li><li>Did you feel guilty about what you just ate?</li></ul><p>Oof.</p><p>Compared with a control group of students who ate the robot when it was <em>not</em> moving, the students who ate the <em>moving</em> robot were more likely to interpret it as having a “munya-munya” or “mumbly” texture, showing that movement can influence the eating experience. Analysis of question responses showed that the moving robot also caused people to perceive it as emotive and intelligent, and caused more feelings of guilt when it was consumed. The paper summarizes it pretty well: “In the stationary condition, participants perceived the robot as ‘food,’ whereas in the movement condition, they perceived it as a ‘creature.’”</p><p>The good news here is that since these robots are more like living things than nonrobots, they could potentially stand in for live critters eaten in a research context, say the researchers: “The utilization of edible robots in this study enabled us to examine the effects of subtle movement variations in human eating behavior under controlled conditions, a task that would be challenging to accomplish with real organisms.” There’s still more work to do to make the robots more like specific living things, but that’s the plan going forward:</p><blockquote>Our proposed edible robot design does not specifically mimic any particular biological form. To address these limitations, we will focus on the field by designing edible robots that imitate forms relevant to ongoing discussions on food shortages and cultural delicacies. Specifically, in future studies, we will emulate creatures consumed in contexts such as insect-based diets, which are being considered as a solution to food scarcity issues, and traditional Japanese dishes like “Odorigui” or “Ikizukuri (live fish sashimi).” These imitations are expected to provide deep insights into the psychological and cognitive responses elicited when consuming moving robots, merging technology with necessities and culinary traditions.</blockquote><p>“Exploring the Eating Experience of a Pneumatically Driven Edible Robot: Perception, Taste, and Texture,” by Yoshihiro NakataI, Midori Ban, Ren Yamaki, Kazuya Horibe, Hideyuki Takahashi, and Hiroshi Ishiguro from the University of Electro-Communications and Osaka University, was published in <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0296697" rel="noopener noreferrer" target="_blank"><em>PLOS One</em></a>.</p>]]></description><pubDate>Tue, 13 Feb 2024 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/edible-robots-2667244222</guid><category>Edible robots</category><category>Hiroshi ishiguro</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-woman-puts-a-cup-holding-a-wiggly-beige-cylinder-into-her-mouth-and-bites-off-the-top-of-the-cylinder.gif?id=51445183&amp;width=980"></media:content></item></channel></rss>

If you would like to create a banner that links to this page (i.e. this validation result), do the following:

  1. Download the "valid RSS" banner.

  2. Upload the image to your own server. (This step is important. Please do not link directly to the image on this server.)

  3. Add this HTML to your page (change the image src attribute if necessary):

If you would like to create a text link instead, here is the URL you can use:

http://www.feedvalidator.org/check.cgi?url=https%3A//feeds.feedburner.com/IeeeSpectrumRobotics%3Fformat%3Dxml

Copyright © 2002-9 Sam Ruby, Mark Pilgrim, Joseph Walton, and Phil Ringnalda