This is the first in a series of articles in which we ask – and try to answer – the most common questions in robotics.
Ask anyone what a robot is, and you’ll likely get a confident answer: “A machine that walks and talks like a human.”
Some people may offer something slightly more technical – perhaps “a programmable machine that can carry out tasks automatically”.
Both are right, to a point. But as we dig deeper, we find that the question “What is a robot?” quickly becomes more complicated than it first appears.
The word “robot” derives from the Czech “robota”, meaning “forced labour” or “drudgery”. Karel Čapek introduced the term in his 1920 science fiction play R.U.R. – Rossum’s Universal Robots – reportedly at the suggestion of his brother Josef.
In today’s world of AI chatbots, autonomous vehicles, and humanoid assistants, drawing a line around what counts as a robot is no longer so straightforward. And as robotics and artificial intelligence continue to merge, the boundaries blur even further.
A basic definition: Machines that sense, think, and act
Traditionally, a robot is defined as a machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer. The key features of a classic robot include:
- Sensors: to perceive its environment
- Processing unit: to interpret data and make decisions
- Actuators or effectors: to interact physically with the world
This definition neatly applies to industrial robots on factory floors – those arm-like machines that weld, screw, lift, and move objects with great speed and precision. They don’t look like humans, but they sense, think (in a limited way), and act. They are undeniably robots.
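The sense-think-act pattern above can be sketched in a few lines of code. This is a toy illustration only – the robot, its distance sensor, and the 20 cm turning threshold are all invented for the example:

```python
import random


class SimpleRobot:
    """A toy sense-think-act loop. All sensor and motor details are illustrative."""

    def sense(self):
        # Sensor: report the distance (in cm) to the nearest obstacle.
        # A real robot would read hardware; here we simulate a value.
        return random.uniform(0, 100)

    def think(self, distance_cm):
        # Processing unit: decide on an action based on the reading.
        return "turn" if distance_cm < 20 else "forward"

    def act(self, action):
        # Actuator: return the command that would be sent to the motors.
        return f"motor command: {action}"


robot = SimpleRobot()
for _ in range(3):
    reading = robot.sense()
    print(robot.act(robot.think(reading)))
```

However simple, the loop captures what separates a robot from an ordinary machine: its output depends on what it perceives, not on a fixed script.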
But what about a vacuum cleaner that maps your living room and avoids obstacles? Or a chatbot that books flights and answers your questions in natural language? Or a self-driving car?
Beyond the hardware: Is software a robot?
As robots have become more sophisticated, so too has our notion of what qualifies as one. The advent of AI has introduced purely digital agents – software with no moving parts – that can carry out tasks once thought to require human intelligence.
Consider ChatGPT or any other advanced conversational agent. It can hold a conversation, compose music, write code, and even generate new ideas. But it has no arms or legs. Is it still a robot?
Some argue that a robot must have a physical body. Others say that’s an outdated notion. After all, a bot that operates a stock-trading platform or automates customer service functions can displace a human worker just as effectively as a mechanical machine can.
This leads us to a broader definition: a robot is any autonomous system that senses its environment, processes information, and performs actions – whether in the physical or digital world.
Under this broader view, yes – chatbots are a type of robot. So are self-driving cars. So are robotic process automation (RPA) systems used in finance or healthcare. Some are physical, some are virtual, but all act independently in pursuit of a goal.
The Turing Test and the illusion of intelligence
Alan Turing, one of the founding figures of computer science, proposed a thought experiment in 1950: if a machine can hold a conversation indistinguishable from that of a human, should we say it is intelligent?
This became known as the Turing Test, and for decades it was a benchmark for machine intelligence. Though never a formal test for robotics, it highlights a subtle but crucial point: we tend to assign agency to machines that appear human-like in behavior.
A robot doesn’t need to be humanoid to be effective. But the more a system can mimic human abilities – especially in language – the more we relate to it as if it were alive.
That’s part of the reason AI chatbots provoke such strong reactions: curiosity, fascination, even fear. And as these bots become more “real”, we edge closer to what psychologists call the Uncanny Valley.
The Uncanny Valley: Almost human, but not quite
The term “Uncanny Valley”, coined by roboticist Masahiro Mori in 1970, describes the uneasy feeling people get when encountering a robot or avatar that looks almost – but not quite – human. The closer it gets to lifelike appearance and behavior without fully crossing the threshold, the more unsettling it becomes.
Humanoid robots like Hanson Robotics’ Sophia or Tesla’s Optimus push into this territory. So do digital avatars used in customer service or virtual influencers on social media. They are emotionally confusing because they force us to reckon with what it means to be human – and what it means to fake it.
The Uncanny Valley isn’t just a psychological oddity. It’s a real design challenge for robotics companies. Robots that are too human can backfire, while those that are just human enough – like Pixar’s WALL-E or Boston Dynamics’ Spot – tend to be more warmly received.
Moravec’s Paradox: Why walking is hard, but chess is easy
One of the great ironies of robotics is summed up by Moravec’s Paradox: tasks that are easy for humans, like walking, recognizing faces, or catching a ball, are extremely difficult for machines. Meanwhile, activities that seem intellectually difficult – like playing chess or calculating and predicting share price movements – are relatively easy for computers.
This paradox helps explain why we have AI that can outplay grandmasters or write poetry, yet we still struggle to build robots that can walk up stairs or fold laundry with human-level grace.
It also hints at something deeper: much of human intelligence is subconscious, intuitive, and tied to physical embodiment. This makes building robots that truly replicate human capabilities a massive engineering challenge – especially when those robots are expected to operate safely and reliably in the real world.
Are autonomous vehicles robots?
Autonomous cars have all the hallmarks of a robot:
- Sensors (cameras, radar, lidar)
- Processing units (onboard computers, AI chips)
- Actuators (motors, steering, brakes)
They navigate complex environments, make split-second decisions, and operate without direct human control. By most definitions, an autonomous vehicle is a robot – albeit one disguised as a car.
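The three hallmarks listed above map directly onto the same sense-think-act loop as any other robot. The sketch below makes that mapping explicit; the sensor readings, the braking rule, and the stopping-distance heuristic are all invented for illustration, not how any real vehicle works:

```python
class AutonomousCar:
    """Toy mapping of an autonomous car onto sense-think-act.
    Readings and thresholds are fabricated for the example."""

    def sense(self):
        # Sensors: a fused snapshot from camera, radar, and lidar (simulated).
        return {"obstacle_ahead_m": 4.5, "speed_kmh": 30.0}

    def think(self, readings):
        # Processing unit: brake if the obstacle is inside a crude
        # stopping distance estimated from current speed.
        stopping_distance_m = readings["speed_kmh"] * 0.2
        if readings["obstacle_ahead_m"] < stopping_distance_m:
            return "brake"
        return "cruise"

    def act(self, decision):
        # Actuators: steering, throttle, brakes – represented as a command.
        return f"actuate: {decision}"


car = AutonomousCar()
print(car.act(car.think(car.sense())))
```

Strip away the bodywork and an autonomous car is this loop running many times per second – which is precisely why, by most definitions, it qualifies as a robot.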
Yet public perception is often different. Most people don’t call Tesla’s Full Self-Driving system a robot, even though it fulfills robotic criteria. This illustrates how context, branding, and form factor influence our mental models. If it doesn’t look like a robot, is it still one?
A looming disruption: Robots and the future of work
Regardless of how we define them, robots – both physical and virtual – are poised to transform the global economy.
According to a 2023 report from Goldman Sachs, AI and automation could expose the equivalent of roughly 300 million full-time jobs worldwide to automation in the coming decades. White-collar roles like customer service, paralegal work, and data entry are at especially high risk, while blue-collar sectors will continue to see increased use of physical robots.
Warehouse workers are already sharing space with autonomous mobile robots. In agriculture, robotic harvesters are gaining ground. In healthcare, robotic surgery systems are becoming standard.
Meanwhile, AI-powered systems are replacing human labor in marketing, design, legal analysis, education, and software development.
Whether we call these systems robots, bots, or AI agents, the impact is the same: a fundamental shift in how work is done, who does it, and what it means to be a skilled worker in the age of intelligent machines.
So… what is a robot?
The question “What is a robot?” is not just technical; it’s philosophical. A robot can be a machine on an assembly line, a car that drives itself, a humanoid that smiles, or a disembodied voice in your phone. It can have limbs, wheels, or nothing but code.
At its core, a robot is any autonomous system – physical or digital – that senses the world, processes information, and acts upon it. The form may change, but the function remains: to extend human capabilities through machine agency.
As robots become more intelligent and more integrated into society, we may need to move beyond rigid definitions and accept that the age of machines thinking – and acting – on their own is no longer the stuff of science fiction. It’s here. And it’s only accelerating.