
‘AI Advances Robotics Use Cases’

by Marco van der Hoeven

Recent advancements in AI are making more complex robot systems a reality, such as humanoid robots. The challenge now is to find practical use cases. At CES Unveiled, held last week in Amsterdam, the Rocking Robots team sat down with Brian Comiskey to discuss the most significant trends in AI and robotics that will be on show in Las Vegas.

“I think we’re already starting to see the development of AI working really well alongside humans,” says Brian Comiskey, Senior Director, Innovation and Trends & Futurist at the Consumer Technology Association. “A good example of this is drones. Artificial intelligence is now being used in drones in a way that, if you go to an Ikea warehouse, for example, human workers are working alongside automated drones. The AI drone flies in, scans a pricing label, and the human checks it, working alongside those robotic co-workers. That’s only possible because of recent coding developments that have really pushed that forward.”

“We are also seeing breakthroughs in robotic digital twins now. And on the digital twin side, Nvidia, who’s going to keynote at CES, has unveiled a digital twin of the entire Earth. This is allowing climate forecasting and other modeling to really help build resilient ecosystems. And on the humanoid robot side, it’s about productivity and also advancing ideas like caregiving.”

Humanoid robots

Humanoid robots clearly benefit from some of the advances made in general AI. “In the past, most robots had an input A – output B structure. It was very binary, very linear. But with more advanced software systems on board, paired with more advanced chips, you get more advanced robots. So there is a lot of investment going into this space. Microsoft and Magna, for example, are two companies that might be surprising in this field. Microsoft has a lot of the AI infrastructure, while Magna brings a lot from the automotive side that can be applied to the mobility of a robot.”

“So now, it’s about getting the investment to advance the software on board and then identifying the use cases. Some of it is productivity within a warehouse—how do we meet productivity gaps? How do humanoid robots maybe fill jobs that are very dangerous or can put a lot of strain on the body, like moving heavy objects? But then there’s also thinking about novel solutions for humanoid robots, like caregiving, or addressing things like loneliness. How does a robot start to work alongside us, live alongside us? Both use cases are quite important.”

Tesla

He sees the widely publicized Tesla announcement as part of a larger trend in robots, from humanoids to robo-taxis. “We’re seeing more advancements in sensor development and self-driving capabilities, not just in cars. You see it in industrial vehicles and even boats. A big focus is not just the consumer side but also how taxi fleets can use automated robo-taxis to redefine what the mobility sector looks like. That’s what Tesla is really focusing on with their robo-taxis, but those developments will take a few years to fully realize.”

CES exhibitor Waymo, part of Alphabet, is already piloting robo-taxi options in three states. “They even just made a deal with Hyundai, also at CES, to supply their robo-taxi fleet using the Ioniq 5 electric vehicle. So, they’re really trying to merge the sustainability story of EVs with the automation story of robots and robo-taxis.”

CES

At CES, AI and robotics are clearly present as transformative technologies for every industry, every field, every sector. “In West Hall, where we have a lot of mobility and automotive showcases, you’ll see the robo-taxi story develop, but also a strong safety story. The sensors that go on board for a lot of these vehicles, from vision systems like Mobileye to LiDAR sensors from companies like Perceptive, are really about making vehicles safer, automating them, and integrating them into the robotics space. We’re even seeing some industrial applications there.”

“In North Hall, you’ll start to see more of an enterprise story emerge—how AI can help businesses, whether that’s building better solutions for marketing, advertising software, or client relationship management. On the robotic side in North Hall, there’s a focus on the service sector. For instance, RichTech Robotics will showcase how robotic waiters are evolving from being novelties to becoming practical tools that integrate into the service economy to help fill roles with staffing gaps.”

Sustainability

AI is also being applied to the sustainability question. “A company called Kobo develops onboard systems to manage battery output, helping smart grids become more energy efficient and aiding the energy transition, which is so critical right now. It is all about use cases. As a futurist, part of my job is to look at what’s going to happen—what could happen tomorrow. But what’s exciting now is that we’re already seeing proven use cases in generative AI. Whether it’s something like Co-Pilot from Microsoft or tools that help with efficiency, like chatbots assisting pharmacists, the practical applications are already here.”

And AI is already shaping the world. “I find that most people don’t realize how much AI is already part of their daily lives. For example, 90% of content recommendations on Netflix come from an AI algorithm. On any given day, 82% of Amazon’s e-commerce sales are driven by AI algorithms. If you’re using a wearable device or even just a smartphone, there are AI features integrated into those too. So, people are already interacting with AI on a daily basis.”

Innovation

“Generative AI is different because it has this creative element, and that’s where much of the concern comes from. I think the Consumer Technology Association (CTA), which organizes CES, has done a good job of bringing together industry leaders and policy makers to figure out how to build guidelines around generative AI. We all agree that regulation is necessary, but we need to ensure that it doesn’t stifle innovation, which is one of the biggest challenges.”

There are also concerns about bias in AI. “That’s a huge issue because only 25% of AI developers are women, and 22% are ethnic minorities. That’s a gap, and it’s something the industry is already addressing. The biases and hallucinations that people sometimes see in AI are not the fault of the AI itself; AI is a reflection of us. And as humans, we have inherent flaws. So, the challenge is recognizing those flaws in ourselves and working to ensure that AI becomes a tool to help us. AI isn’t here to replace us—it’s here to collaborate with us, to co-work with us. At the end of the day, it’s about coexistence.”