TECHnalysis Research president Bob O'Donnell publishes weekly commentary on current tech industry trends in the TECHnalysis Research Insights Newsletter on LinkedIn.com, and those blog entries are reposted here as well. Those columns are also reprinted on Techspot and SeekingAlpha.
He also writes a regular column in the Tech section of USAToday.com, and those columns are posted here. Some of the USAToday columns are also published on partner sites, such as MSN.
He writes occasional columns for Forbes that can be found here and are archived here.
In addition, he has written guest columns for various other publications, including RCR Wireless, Fast Company and Engadget. Those columns are reprinted here.
January 14, 2026
By Bob O'Donnell
According to most predictions, this year was going to be the breakout year for robotics at CES. And, in fact, that was the case. But while the physical products made the biggest initial splash at the show, it’s the news about robotics platforms and tools that will have the most long-term impact.
Sure, there was a huge number of robotic devices and demonstrations to see in Las Vegas, but almost all of them were not ready for prime time and certainly aren’t going to be commercial products anytime soon. The innovations introduced by chip makers, system builders, IP-focused companies and more, however, will start to make a meaningful difference this year. Architectures, reference platforms, software tools and related sub-components built and debuted by companies like Nvidia, AMD, Intel, Qualcomm, Arm, and others are going to enable the coming Physical AI revolution to get off the ground.
Qualcomm kicked things off with the latest addition to its robotics platform, the Dragonwing IQ10 SoC (System on Chip). Designed for the most advanced robotics applications, including the “brains” for humanoid robots, the IQ10 brings together many of the company’s most advanced component technologies. It incorporates up to 18 compute-grade Oryon CPU cores, an Adreno GPU, a Hexagon NPU, a Spectra ISP (Image Signal Processor), support for up to 20 cameras and more, with a total system-wide performance of 700 TOPS, according to Qualcomm. The IQ10 also incorporates a real-time safety subsystem to help enable support for important functional safety standards, which will be critical for most robotics applications.
In addition to the new chip, the company also touted its range of other Dragonwing industrial processors and software tools, noting that its recent acquisition of Arduino allows it to reach an extremely large range of potential robotics developers—from students and enthusiasts to industrial professionals—with a consistent tool chain.
At the Nvidia keynote, CEO Jensen Huang described the entire robotics platform that his company is working to create. From upgraded versions of its Jetson and Thor compute platforms—the new Jetson T4000 is powered by Nvidia’s latest Blackwell GPU architecture—to a whole range of models and training tools, Nvidia laid out an impressive vision for how it’s moving the robotics market forward. What was most noticeable was how much of the company’s efforts in robotics are now focused on software and AI models. The new Cosmos Transfer and Cosmos Predict 2.5 models are the latest iterations of its world models, designed for simulating and training robots and autonomous machines in real-world environments. Cosmos Reason 2 is Nvidia’s newest VLM (Vision Language Model) and Isaac GROOT N1.6 is a VLA (Vision Language Action) model—arguably the LLM equivalents of the robotics world. They’re both intended to let robots see, understand, and react to objects and actions in the real world, with GROOT specifically focused on full-body control for humanoid robots.
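For readers who want a concrete sense of what a VLA model actually does inside a robot’s control stack, here is a deliberately simplified sketch, in Python, of the basic perceive-decide-act loop. It is purely illustrative and does not use Nvidia’s actual APIs; the StubVLAPolicy class, the joint count, and the camera and actuator callbacks are hypothetical placeholders standing in for a real GROOT-style model and real robot hardware.

import numpy as np

class StubVLAPolicy:
    """Hypothetical stand-in for a VLA (Vision Language Action) model."""
    def __init__(self, num_joints):
        self.num_joints = num_joints

    def predict_action(self, image, instruction):
        # A real model would encode the camera image and the natural-language
        # instruction, then decode joint-level actions. This stub simply
        # returns a "hold still" command of the right size.
        return np.zeros(self.num_joints)

def control_loop(policy, get_camera_frame, send_joint_targets, instruction, steps=10):
    # The perceive-decide-act cycle a VLA-driven robot runs many times per second.
    for _ in range(steps):
        frame = get_camera_frame()                            # perception
        action = policy.predict_action(frame, instruction)    # decision
        send_joint_targets(action)                            # actuation

if __name__ == "__main__":
    policy = StubVLAPolicy(num_joints=28)  # humanoid-scale joint count (illustrative)
    control_loop(
        policy,
        get_camera_frame=lambda: np.zeros((224, 224, 3), dtype=np.uint8),  # fake camera
        send_joint_targets=lambda action: None,                            # fake actuators
        instruction="pick up the red cup",
    )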
Intel’s keynote also made brief mention of the robotics-related efforts the company is making with an Edge AI-focused version of its new Core Ultra Series 3 (codenamed “Panther Lake”) platform. Though many people don’t immediately think of Intel when it comes to robotics, the company has had a big presence in the industrial robotics market for a very long time. Now, with the dramatically improved GPU capabilities in Core Ultra Series 3—as well as an enhanced NPU—Intel clearly sees an opportunity to tap into the excitement around Physical AI and robotics. Intel’s Robotics AI Suite was recently updated to support Core Ultra Series 3, and the company introduced an industrial-grade Edge AI version of Panther Lake at CES. This marks the first time Intel has done a simultaneous launch of mainstream PC and edge-focused chips, emphasizing its desire to jump quickly into robotics and other Edge AI applications.
At the AMD keynote, CEO Dr. Lisa Su also made a point of emphasizing AMD’s interest in robotics as part of a sweeping narrative about how the company’s reach has grown. While there was no discussion of specific robotics-related products, the company did bring to the stage an impressive-looking humanoid robot called GENE.01, developed by Italian firm Generative Bionics (recently spun out of the Italian Institute of Technology (IIT)) and touted to be powered by AMD technology. Specifically, the robot leverages AMD’s FPGAs for embedded vision and sensor response, as well as AMD CPUs, GPUs and its open-source ROCm software stack. Initially promised by the end of this year, the GENE.01 robot incorporates a number of intriguing innovations, including whole-body tactile sensors—connected to the FPGAs—that give the robot a touch-sensitive, human-like “skin,” which is supposed to allow for safer interactions with humans.
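To illustrate why that kind of whole-body tactile sensing matters for safety, the sketch below shows, again in simplified Python, the sort of check a controller might run on skin data. None of it reflects Generative Bionics’ or AMD’s actual software; the sensor reader, the 15-newton threshold, and the command names are all invented for the example.

import numpy as np

CONTACT_THRESHOLD_N = 15.0  # per-cell contact force threshold in newtons (assumed value)

def read_skin(num_cells=1024):
    # Stand-in for tactile readings aggregated by the FPGAs; random data for the demo.
    return np.abs(np.random.normal(0.0, 5.0, num_cells))

def unexpected_contact(forces):
    # True if any "skin" cell reports force above the safety threshold.
    return bool(np.any(forces > CONTACT_THRESHOLD_N))

def control_step(send_command):
    forces = read_skin()
    if unexpected_contact(forces):
        # Yield to the contact rather than push through it.
        send_command("compliant_stop")
    else:
        send_command("continue_trajectory")

if __name__ == "__main__":
    control_step(print)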
The fact that the robot didn’t walk onto or off the stage was a bit disappointing—and suggests that the initial timeline could be a bit overeager. In a separate meeting I had with Generative Bionics CEO Daniele Pucci, however, I learned more about the company’s heritage from IIT and its 20+-year history of building robots capable of very precise movements. More importantly, I also started to really appreciate its partnership with AMD and how the experience of working with a firm with that kind of robotics history could have a very positive long-term impact on AMD’s future robotics initiatives.
Semiconductor design leader Arm did not make any specific robotics-related product announcements at CES, but it did reveal that it has recently reorganized, pulling together its automotive, IoT, and robotics efforts into a single group led by industry veteran Drew Henry. In my conversations with Drew, he emphasized Arm’s very long history in robotics, as well as in real-time systems, functional safety, microcontrollers, the software systems linking all these pieces, and many other elements that are essential to building robotics platforms. Needless to say, I’m expecting to hear a lot more about robotics-related initiatives from Arm over the next few years.
What also came up during that discussion was something I heard from other suppliers as well. Much of the work that many of them have done in the automotive market—particularly with regard to autonomous driving—has obvious relevance and application in robotics. Nvidia’s Jensen Huang has been talking about ADAS as a form of Physical AI for some time, so no surprise there. However, it also became clear that, just as smartphone-related technologies can influence PCs (and vice versa) without making the two markets interchangeable, the expectations and demands around robotics differ significantly from those around autonomous driving. Hence, finding success in one market does not ensure success in the other. As always, it boils down to having the right combination of hardware, software, platforms, architectures, and other elements in order to create products that the market is willing to accept.
Of course, with robotics, there are still other factors to consider. First, as mentioned earlier, one thing that became abundantly clear at this year’s CES is that the full, widespread commercialization of humanoid-style robots is still many years away—and initial price points are likely to be extremely high. But even beyond these technical and economic concerns, another issue is social acceptance of these devices. Unlike almost any other technology product category we’ve ever seen, I think humanoid-style robots are going to face a psychological barrier to adoption that goes well beyond even some of the concerns that have been raised regarding autonomous cars.
Robots in science fiction are one thing, but in everyday real life? Well, that’s another. (And it’s not just me. In numerous conversations around the show, I heard similar sentiments.) Plus, there are enormous physical safety issues around multi-hundred-pound metal devices traipsing through our home and work environments. Personally, I think incremental steps between a robotic vacuum cleaner and a real-life Rosie the Robot are going to be essential, but it will be fascinating to see how this market evolves.
Even with these concerns, however, it’s clear that the excitement and anticipation for a robust robotics market is going to drive many important tech innovations over the next few years. It’s also going to create a huge range of opportunities for existing players as well as for startups and other as-yet-unknown companies. As with most things, people will eventually get used to robotic devices, so the question won’t be whether the robotics market happens, but when.
Here's a link to the original column: https://www.linkedin.com/pulse/robotics-news-ces-all-platforms-bob-o-donnell-pzclc
Bob O’Donnell is the president and chief analyst of TECHnalysis Research, LLC, a market research firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on LinkedIn at Bob O’Donnell or on Twitter @bobodtech.