Robotics & Automation News

Where Innovation Meets Imagination


When wheels won’t do: Humanoid robots for human-centric spaces

June 17, 2025 by Sam Francis

By Nicolas Lehment, senior principal system architect, NXP Semiconductors

Automated guided vehicles (AGVs) and wheeled mobile robots currently dominate factories and warehouses, navigating fixed routes by following magnetic stripes or optical markers permanently embedded in the floor.

As requirements evolve, however, wheels are simply not enough. Demand is growing for machines capable of navigating more challenging, human-centric environments, such as hospitals, restaurants, homes, and even rugged outdoor terrain.

In such environments, perfectly flat floors, free from steps and stairs, along with conspicuous, clutter-free aisles, are luxuries.

Instead, robots must move over thresholds, skirt around unpredictable obstacles and adapt on the fly to a world that’s neither uniform nor pre-mapped.

Humanoid robots are the natural solution – literally following in our footsteps – and three foundational areas are shaping legged and humanoid robots for such complex settings: motion control, perception and navigation, and modularity and flexibility.

Together, these three are driving adoption beyond structured industrial floors toward truly versatile autonomous systems.

Controlling motion in unstructured spaces

Traditional factory robots execute pre-planned trajectories over known workspaces – think of three-axis arms that can move in three independent linear directions, for example, or gantry systems moving along fixed rails.

In contrast, legged and humanoid platforms demand real-time, closed-loop control across dozens of degrees of freedom.

Each footfall or joint adjustment must balance stability, torque, and body pose within milliseconds.

Modern embodied robots break the old, centralized motor-controller paradigm. Each joint or limb houses a microcontroller responsible for low-latency torque and position loops, while a central processor coordinates full-body motion plans.

This splitting of duties reduces communication delays and enables smoother, more robust responses to disturbances – crucial when a robot must, for example, steady itself after bumping into an object.
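As an illustrative sketch – with hypothetical gains, unit inertia and class names, not any vendor's actual firmware – the split between local joint loops and a central coordinator might look like this in simplified Python:

```python
class JointController:
    """Local loop, as would run on a per-joint microcontroller:
    a simple PD position controller (gains and unit inertia are illustrative)."""
    def __init__(self, kp=100.0, kd=20.0):
        self.kp, self.kd = kp, kd
        self.position = 0.0
        self.velocity = 0.0

    def step(self, target, dt=0.001):
        # PD law: drive toward the target, damped by current velocity
        torque = self.kp * (target - self.position) - self.kd * self.velocity
        self.velocity += torque * dt   # assume unit inertia for simplicity
        self.position += self.velocity * dt
        return torque


class BodyCoordinator:
    """Central processor: hands full-body pose targets to the joint loops."""
    def __init__(self, n_joints):
        self.joints = [JointController() for _ in range(n_joints)]

    def track(self, pose, steps=2000):
        for _ in range(steps):
            for joint, target in zip(self.joints, pose):
                joint.step(target)   # each joint closes its own loop locally
        return [j.position for j in self.joints]


coordinator = BodyCoordinator(3)
final = coordinator.track([0.2, -0.1, 0.4])   # joints settle on the pose
```

In a real robot the coordinator and joints would communicate over a field bus, and the local loops would run torque control against measured motor currents rather than this idealized model.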

Coordinated multi-motor actions rely on real-time field-bus communications protocols such as EtherCAT, which guarantee sub-millisecond synchronization across dozens of actuators.

Emerging standards include OPC UA FX over TSN, which utilizes time-sensitive networking (TSN) to further enhance the reliable, low-latency communication required for industrial automation and advanced robotics.
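The budgeting behind such cyclic buses can be sketched with a back-of-the-envelope check (the 1 ms cycle and per-actuator costs below are illustrative numbers, not figures from the EtherCAT or TSN specifications):

```python
CYCLE_US = 1000  # one 1 ms bus cycle, in microseconds (illustrative budget)

def fits_in_cycle(per_actuator_us, n_actuators, overhead_us=50):
    """Do all actuator updates, plus frame overhead, fit in one cycle?"""
    return overhead_us + per_actuator_us * n_actuators <= CYCLE_US

def max_actuators(per_actuator_us, overhead_us=50):
    """Largest actuator count one cycle budget can keep synchronized."""
    return (CYCLE_US - overhead_us) // per_actuator_us
```

If any actuator's update slips past the cycle boundary, its joint acts on stale commands – which is exactly the misstep tight synchronization is meant to prevent.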

In field trials of quadruped and biped robots traversing outdoor trails, tight timing prevented missteps when the terrain shifted underfoot.

Beyond pure control loops, AI-driven planners predict center-of-mass shifts and adjust joint targets on the fly. These systems blur the line between motion planning and motor control – embedding sensory feedback into every step.

Perception and navigation in complex environments

In a traditional warehouse, multiple 2D LiDAR units and QR-code-reading cameras will often suffice. But in unstructured, human-populated spaces, robots need richer, denser awareness.

Legged and humanoid robots combine 3D LiDAR, time-of-flight (ToF) depth cameras and stereo vision to build volumetric maps in real time.

Simultaneous Localization and Mapping (SLAM) algorithms fuse this data with Inertial Measurement Unit (IMU) readings to maintain accuracy even when visual features are sparse – under hospital curtains or in dim domestic lighting, for instance.
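A minimal flavor of that fusion is a complementary filter: integrate fast-but-drifting gyro rates, and correct with absolute visual fixes whenever they are available. The sketch below tracks a single heading angle rather than a full SLAM state, with hypothetical parameters:

```python
def complementary_filter(gyro_rates, visual_angles, dt=0.01, alpha=0.98):
    """Fuse gyro integration (fast, drifting) with visual fixes (absolute, sparse).
    Entries of visual_angles may be None when visual features are unavailable."""
    angle = 0.0
    estimates = []
    for rate, visual in zip(gyro_rates, visual_angles):
        angle += rate * dt                     # dead-reckon from the IMU
        if visual is not None:                 # correct drift when vision returns
            angle = alpha * angle + (1 - alpha) * visual
        estimates.append(angle)
    return estimates

# Gyro with a constant 0.1 rad/s bias; the true heading is fixed at 1.0 rad
estimates = complementary_filter([0.1] * 2000, [1.0] * 2000)
```

Even with a biased gyro the estimate settles close to the true heading, whereas pure integration drifts without bound; a full SLAM stack generalizes the same idea to poses and maps using Kalman or graph-based estimators.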

Traditional collision-avoidance treats all obstacles equally. Advanced systems use edge AI to distinguish static furniture from moving humans or pets.

A floor-cleaning robot might pause and reroute when it detects a child’s toy, for example, then resume cleaning once the path clears, minimizing interruption in dynamic settings.
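That kind of behavior reduces to a policy over classified obstacles. A deliberately simple, hypothetical version:

```python
def plan_action(obstacle_kind, distance_m):
    """Hypothetical policy: dynamic agents get patience, static clutter gets rerouted."""
    if obstacle_kind in ("human", "pet") and distance_m < 1.5:
        return "pause"      # wait: moving agents usually clear the path themselves
    if obstacle_kind == "static" and distance_m < 0.5:
        return "reroute"    # toys and furniture will not move; plan around them
    return "proceed"

plan_action("pet", 1.0)     # a dog crossing ahead: pause and wait
plan_action("static", 0.3)  # a toy on the floor: reroute around it
```

The hard part in practice is the classifier feeding this policy, not the policy itself – which is why the edge AI inference runs close to the cameras.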

In restaurants or care homes, robots must pick up cups or tools and hand them to people. Grasp planners leverage 3D point clouds and learned object models to identify stable gripping points.

In a simulated kitchen task, a humanoid robot correctly lifted varied vessels 92 percent of the time by combining vision-based detection with force-feedback compensation.
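One classical ingredient of such grasp planners is the antipodal test: find a pair of surface points, no wider apart than the gripper can open, whose normals oppose each other along the grasp axis. A brute-force sketch on toy data (not a production planner):

```python
import math

def best_grasp_pair(points, normals, max_width=0.08):
    """Score point pairs by how antipodal their surface normals are."""
    best, best_score = None, -1.0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            axis = [b - a for a, b in zip(points[i], points[j])]
            width = math.sqrt(sum(c * c for c in axis))
            if width == 0 or width > max_width:
                continue                  # wider than the gripper can open
            axis = [c / width for c in axis]
            # each normal should oppose the grasp axis from its own side
            a_i = -sum(n * c for n, c in zip(normals[i], axis))
            a_j = sum(n * c for n, c in zip(normals[j], axis))
            score = min(a_i, a_j)
            if score > best_score:
                best, best_score = (i, j), score
    return best, best_score

# Toy point cloud: two opposite sides of a 6 cm cup, plus one unreachable point
points = [(0.0, 0.0, 0.0), (0.06, 0.0, 0.0), (0.0, 0.2, 0.0)]
normals = [(-1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
pair, score = best_grasp_pair(points, normals)
```

Learned object models narrow the candidate set before a geometric check like this, and force feedback then compensates for the residual pose error during the lift.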

In mixed human-robot workspaces, safety standards demand redundant sensing. Zonal LiDAR scanners enforce protective zones around moving parts, shutting down motion if a person is too close.

While costly, these sensors remain essential in healthcare and hospitality applications.
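The zone logic itself is simple – the expense is in the certified sensing that feeds it. A toy two-tier version (radii are illustrative, not values from any safety standard):

```python
import math

def zone_state(person_xy, robot_xy, stop_radius=0.5, slow_radius=1.5):
    """Two protective tiers: emergency stop up close, reduced speed nearby."""
    distance = math.dist(person_xy, robot_xy)
    if distance < stop_radius:
        return "stop"   # person inside the protective zone: halt all motion
    if distance < slow_radius:
        return "slow"   # person nearby: cap joint speeds
    return "run"
```

Redundancy means two independent sensor chains must both agree the zones are clear before motion resumes.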

Modularity and flexibility for rapid deployment

Human-centric robots also need to integrate seamlessly into existing infrastructures, while supporting frequent updates and adapting to evolving tasks.

Autonomous Mobile Robots (AMRs) once relied on a central “brain” to handle all SLAM, vision and control workloads.

Today’s systems push compute to edge modules: Neural Processing Units (NPUs) co-located with cameras for object detection, real-time IMU processing on microcontrollers for stabilization, and multicore hosts for high-level planning.

This distributed approach slashes both overall power consumption and material costs, while delivering optimal performance.

Meanwhile, the Robot Operating System 2 (ROS 2) provides a hardware-agnostic framework for message passing, lifecycle management, and real-time control.

Its support for Data Distribution Service (DDS) transport and actions simplifies coordination among sensor, planning, and actuation nodes, accelerating prototyping and reducing integration risk.
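The decoupling ROS 2 provides can be illustrated with a toy in-process topic bus – this is not the rclpy API, just the publish/subscribe pattern that lets sensor, planning and actuation nodes evolve independently:

```python
from collections import defaultdict

class TopicBus:
    """Minimal publish/subscribe bus in the spirit of ROS 2 topics."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)


bus = TopicBus()
commands = []

# Planning node: stop when the scan shows something close ahead
bus.subscribe("scan", lambda msg: bus.publish(
    "cmd_vel", 0.0 if msg["min_range_m"] < 0.5 else 0.3))
# Actuation node: record the velocity commands it would send to the motors
bus.subscribe("cmd_vel", commands.append)

bus.publish("scan", {"min_range_m": 2.0})   # clear path: cruise
bus.publish("scan", {"min_range_m": 0.2})   # obstacle ahead: stop
```

Because the nodes share only topic names and message shapes, any one of them can be replaced – or moved to different hardware – without touching the others, which is the integration-risk reduction the middleware buys.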

From Wi-Fi in hospitals to LTE or 5G outdoors, robots need flexible networking stacks.

Matter-based protocols may soon unify smart home connectivity, while edge-to-cloud pipelines feed analytics engines for fleet management and predictive maintenance.

Battery management subsystems optimize runtime via dynamic power scaling and health diagnostics, maximizing uptime between charges.

On the actuation side, crossover microcontrollers handle closed-loop motor control and support field-bus interfaces per axle or limb – letting developers scale robot platforms from four to thirty actuators without redesigning core electronics.

Toward mainstream adoption

Legged and humanoid robots are no longer science fiction. Early trials in hospitals are already under way, delivering medication, guiding visitors and sanitizing rooms.

In restaurants, robotic waitstaff are taking orders and clearing tables, while domestic assistants navigate cluttered living rooms to provide errand-running services.

However, real-world deployments demand rigorous integration of control, perception, and modular architectures. As these systems venture outside perfectly controlled factories, three truths emerge.

First, bipedal and quadrupedal platforms excel where wheels slip, sink or stall, such as soft turf, uneven terrain or cluttered hallways.

Second, multi-modal sensing is increasingly essential. Only by fusing LiDAR, vision and inertial data can robots navigate safely and effectively in human-centric spaces.

Finally, modular compute and standardized middleware are reducing time-to-market and supporting continuous innovation.

In future, the convergence of advanced motion control, richer perception and plug-and-play hardware will usher in a new class of service robots that move seamlessly among people and adapt to the unpredictability of everyday environments.

As these legged platforms become more capable and affordable, we can expect to see them not only in industry, but also in our homes, hospitals, and public spaces, turning the science fiction of years gone by into an everyday reality.


