
Robotics & Automation News

Where Innovation Meets Imagination


Robotic piece-picking: Inside the quest for human-level dexterity

June 16, 2025 by Sam Francis

Exploring the convergence of advanced grippers, 3D vision, and sophisticated AI that is enabling robots to handle an ever-wider variety of items with near-human skill

The relentless surge of global e-commerce has transformed our expectations for speed and convenience. But behind the one-click purchase lies a vast, complex, and physically demanding logistics network.

At its heart is a critical bottleneck: the manual, often strenuous task of piece picking. For decades, the simple act of identifying a specific item from a diverse jumble in a bin, grasping it securely, and placing it in a shipping container has remained stubbornly in the human domain.

Humans’ innate combination of sight, touch, and judgment has been the one ingredient that automation couldn’t replicate. Until now, perhaps.

A new wave of robotic piece-picking systems is finally cresting. These are not the rigidly programmed, repetitive robots of the factory floor, but a sophisticated new class of machine, integrating advanced “hands”, “eyes”, and “brains”.

This article delves into the technological trifecta – advanced grippers, AI-powered vision, and machine learning – that is enabling robots to tackle the complexities of the modern warehouse.

We explore how these innovations are solving critical labor and efficiency problems and chart the future for investment and development in this dynamic field.

The dexterity dilemma: The warehouse bin as the ultimate test

To understand the scale of the challenge, picture not a structured puzzle, but a junk drawer or a child’s messy toy box. This is the reality of a modern fulfillment bin from a robot’s perspective.

It’s a chaotic, unstructured environment containing a near-infinite variety of stock keeping units (SKUs). A fragile lightbulb can be pressed against a deformable polybag, which in turn is draped over a rigid, shrink-wrapped box.

For a robot, this presents a cascade of problems: identifying the target item from its neighbors, calculating its precise 3D position and orientation, determining the best way to grasp it without crushing it or letting it slip, and then executing that grasp at a speed that delivers a tangible return on investment.
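The cascade of steps above can be sketched as a simple pipeline. This is a minimal illustration with stubbed stages and made-up values (the pose, threshold, and `PickPlan` type are all invented here), not any vendor's architecture:

```python
from dataclasses import dataclass

@dataclass
class PickPlan:
    sku: str
    position: tuple   # (x, y, z) in metres, bin frame
    strategy: str     # chosen grasp strategy

def plan_pick(depth_scan, target_sku):
    # 1. Identify: locate the target among its neighbors (stubbed detection).
    position = (0.12, 0.05, 0.30)   # placeholder pose from the vision stage
    # 2. Grasp selection: a stand-in rule; real systems learn this choice.
    strategy = "suction" if position[2] > 0.25 else "fingers"
    # 3. Hand off the plan to the motion controller for execution.
    return PickPlan(target_sku, position, strategy)
```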

The sheer variability has, until recently, made the task computationally and mechanically overwhelming. The solution has required a fundamental rethinking of how robots see, touch, and think.

The technological triumvirate: Deconstructing the modern picking robot

Success in robotic piece picking rests on the seamless integration of three distinct yet interdependent technologies.

1. The ‘eyes’: Seeing in three dimensions

A robot cannot pick what it cannot see. While 2D cameras were a starting point, the unstructured nature of a bin requires true depth perception.

This has led to the widespread adoption of 3D vision systems, with several key technologies leading the charge:

  • Stereo vision: Using two cameras offset from each other, these systems mimic human binocular vision. By comparing the two images, they calculate depth and create a detailed 3D map. This method excels at capturing the rich color and texture information needed to differentiate between similar-looking items.
  • Structured light: This technique involves projecting a known pattern of light, such as a grid or stripes, onto the contents of the bin. A camera observes how the pattern deforms over the surface of the objects, allowing for the precise calculation of their 3D shape and topography. It is highly effective for generating detailed models of static objects.
  • Time-of-flight (ToF): These cameras operate like a fast-acting sonar, emitting a pulse of infrared light and measuring the precise time it takes for the light to bounce off an object and return. This allows for rapid and accurate depth mapping across the entire scene and performs well even in variable ambient light.
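The depth calculations behind the first and third approaches reduce to simple geometry. A minimal sketch, with illustrative camera parameters:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Stereo vision: depth Z = f * B / d, where d is the pixel shift
    of the same feature between the left and right images."""
    if disparity_px <= 0:
        raise ValueError("feature must shift between the two views")
    return focal_px * baseline_m / disparity_px

def tof_depth(round_trip_s, c=299_792_458.0):
    """Time-of-flight: light travels out and back, so depth = c * t / 2."""
    return c * round_trip_s / 2

# Cameras 12.5 cm apart (f = 1,000 px) seeing a 50 px shift -> 2.5 m away.
print(stereo_depth(1000, 0.125, 50))   # 2.5
```

Note how stereo depth degrades as disparity shrinks: distant objects shift by only a pixel or two, which is one reason warehouse systems mount cameras close to the bin.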

The question naturally arises: could a technology like LiDAR (Light Detection and Ranging), famous for its use in autonomous vehicles, play a role?

The answer is increasingly yes. While traditionally used for large-scale mapping, more compact, high-resolution solid-state LiDAR systems are becoming viable for in-bin analysis.

By creating a dense “point cloud” of the bin’s contents, LiDAR can offer exceptional geometric accuracy and is highly resistant to lighting issues.

As the cost and size of these sensors continue to fall, we can speculate that LiDAR will become an increasingly important tool in the robot’s vision arsenal, likely used in combination with other sensors.

However, the sensor is only half the equation. The raw 3D data is useless without an AI that can interpret it. This is where deep learning, specifically convolutional neural networks (CNNs), becomes critical.

Trained on vast image libraries, these AI models can instantly analyze a 3D point cloud, identify the target SKU, ignore the surrounding objects, and calculate the item’s precise orientation for the grasp.
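As an illustration of what the vision stage ultimately produces, here is a toy grasp-point heuristic over a point cloud: take the centroid of the most exposed (highest) surface. Real systems use learned models, as described above; this is only a geometric stand-in with an invented 5 mm surface band:

```python
def top_surface_grasp(points):
    """Propose a grasp point on the highest, most exposed surface.

    points: list of (x, y, z) tuples from the depth sensor, z pointing up.
    Returns the centroid of all points within 5 mm of the top of the pile.
    """
    top_z = max(p[2] for p in points)
    top = [p for p in points if top_z - p[2] <= 0.005]
    n = len(top)
    return (sum(p[0] for p in top) / n,
            sum(p[1] for p in top) / n,
            sum(p[2] for p in top) / n)
```

A real grasp planner would also check the surface normal, nearby clutter, and the gripper's reach envelope before committing.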

2. The ‘hands’: A softer, smarter touch

The classic, rigid, two-fingered gripper is ill-suited for the variety in a fulfillment bin. The revolution in picking has been driven by a new generation of versatile grippers.

  • Soft robotics: Pioneered by companies like Soft Robotics Inc., these grippers use compliant materials like food-grade silicone. By pumping air into or out of flexible chambers, these “fingers” can conform to an object’s unique shape, providing a gentle yet secure hold on everything from a bottle of pills to a head of lettuce.
  • Granular jamming: Other soft grippers are filled with a granular material (like coffee grounds). In its normal state, the gripper is soft and compliant. When it has conformed around an object, a vacuum is applied, causing the granules to lock together, creating a solid, form-fitting grip.
  • Hybrid multi-modal systems: Recognizing that no single method is perfect, leading solutions from companies like RightHand Robotics employ a “multi-modal” approach. Their grippers often combine compliant fingers with a suction cup at the center. The robot’s AI can then decide in milliseconds whether to use the fingers, suction, or both, dramatically increasing the range of items it can successfully handle.
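The multi-modal decision can be caricatured as a rule over item properties. The property names and thresholds below are invented for illustration; in practice, as the list above notes, the choice is made by the robot's AI from vision and outcome data:

```python
def choose_grasp_mode(item):
    """Toy decision rule mirroring the multi-modal idea: fingers,
    suction, or both, based on (hypothetical) item properties."""
    if item.get("porous") or item.get("deformable"):
        # Suction cannot seal on mesh bags or loose polybags;
        # compliant fingers conform to them instead.
        return "fingers"
    if item.get("flat_top") and item["weight_kg"] < 1.0:
        # Light items with a flat face are fastest to pick by suction alone.
        return "suction"
    # Heavy or awkward items get both modes for a secure hold.
    return "fingers+suction"
```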

3. The ‘brain’: From programming to learning

The most significant leap forward lies in the AI that orchestrates the entire process. Instead of being programmed for every possible item and scenario, modern systems learn.

Platforms like the “Covariant Brain” from Covariant AI are prime examples. They use a form of machine learning called reinforcement learning. A robot attempts a pick, and whether it succeeds or fails, the result is used to refine its algorithm.

This trial-and-error process, often accelerated in millions of physics-based simulations, allows the system to build an intuitive understanding of how to handle different objects.
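The learn-from-outcome loop can be illustrated, at a vastly reduced scale, by a multi-armed bandit over grasp strategies. This is not Covariant's method, only the same feedback principle in miniature:

```python
import random

class GraspLearner:
    """Epsilon-greedy bandit: try strategies, record outcomes,
    and gradually favour whichever succeeds most often."""

    def __init__(self, strategies, epsilon=0.1):
        self.epsilon = epsilon
        self.successes = {s: 0 for s in strategies}
        self.attempts = {s: 0 for s in strategies}

    def choose(self):
        if random.random() < self.epsilon:          # explore occasionally
            return random.choice(list(self.successes))
        return max(self.successes,                  # exploit best rate so far
                   key=lambda s: self.successes[s] / max(1, self.attempts[s]))

    def record(self, strategy, succeeded):
        self.attempts[strategy] += 1
        self.successes[strategy] += int(succeeded)
```

Sharing the `successes` and `attempts` tallies across many robots is, in spirit, what fleet learning does: every pick anywhere updates the shared statistics.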

Crucially, this learning is shared across the entire fleet of robots. A lesson learned by a robot in a German warehouse can be instantly applied by a robot in Ohio, causing the entire network to become smarter and more capable over time.

From lab to logistics floor: Investment and commercialization

This technological convergence is already delivering significant value. Major logistics providers and retailers are deploying these systems to increase order accuracy, boost throughput, and operate 24/7, helping to mitigate persistent labor shortages.

Innovators like Berkshire Grey are providing full-stack robotic solutions to retail and grocery giants, while the aforementioned Covariant and RightHand Robotics focus on providing the core picking intelligence and hardware that can be integrated into various warehouse environments.

For the investment community, the appeal is clear. The global warehouse automation market is projected to be worth tens of billions of dollars within the next few years, and robotic piece picking is one of its fastest-growing segments.

Investors are backing companies that can demonstrate not just novel technology, but a clear and rapid return on investment (ROI). The key metrics are speed (picks per hour), reliability, and the breadth of SKUs a system can handle.
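Those metrics feed directly into the ROI calculation buyers run. A toy payback model, with purely illustrative numbers rather than vendor pricing:

```python
def payback_months(system_cost, picks_per_hour, hours_per_day,
                   value_per_pick, days_per_month=30):
    """Months until cumulative pick value covers the system cost.
    All inputs are illustrative placeholders, not real quotes."""
    monthly_value = picks_per_hour * hours_per_day * days_per_month * value_per_pick
    return system_cost / monthly_value

# A $120k system doing 500 picks/hour, 20 hours/day, at $0.25 of
# labor value per pick, pays for itself in under two months.
print(payback_months(120_000, 500, 20, 0.25))   # 1.6
```

The formula also shows why picks per hour dominates the pitch: payback time scales inversely with throughput.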

The most attractive solutions are those that are scalable and can be easily integrated with existing Warehouse Management Systems (WMS) without a complete operational overhaul.

The horizon: The future of dexterity

The quest for human-level dexterity is not over. The next frontier is tackling the “long tail” – the final 5 percent of exceptionally challenging items like complex tangled objects or highly reflective packaging.

Development is focused on even more advanced tactile sensing to give robots a human-like sense of “feel”.

Furthermore, the industry is moving towards mobility. The integration of these sophisticated arms onto autonomous mobile robots (AMRs) will untether piece-picking from a fixed station, creating a truly flexible robotic workforce that can move around the warehouse to where it’s needed most.

In conclusion, the convergence of intelligent 3D vision, adaptive gripping technology, and advanced AI has cracked the code of robotic piece picking.

What was once a concept confined to research labs is now a commercial reality, fundamentally reshaping the economics and operational realities of the logistics industry.

We are at the start of a new era of automation, where robots can finally handle the complexity and variety of the physical world, freeing human workers for safer, more complex, and more valuable roles.


