
Self-driving car companies face choice between active and passive sensors

Self-driving cars are close to becoming a reality. When this happens, the sensors they use will be particularly important, says Foresight Autonomous Holdings Ltd, parent company of Foresight Automotive.

Foresight is developing a sensor system that uses multiple visible-light and infrared cameras with stereoscopic technology to interpret its surroundings.

Google’s parent company Alphabet, Inc. is installing a wide range of sensors in its cars and combining their different inputs to create a fuller picture.

Ford is focusing on other aspects of the future of driving, with its app-controlled Chariot commuting system. General Motors is about to enter mass production of a car that uses LIDAR to scan its surroundings.

Meanwhile Tesla, Inc., the business most famous for work in this area, is using pattern recognition to help its cars interpret the input from their sensors.

Autonomous vehicles – more often referred to as self-driving cars and trucks – are close to becoming a regular feature of the world’s streets. Several companies, from established automakers to suppliers and tech innovators such as Valeo and Bosch, are working on the technology needed to make them a reality. From complex driving software to steering equipment, a whole industry is growing up around the cars of the future.

Among the most important features of these cars are the sensors they use to read the world around them. These fall into two groups – passive and active. Active sensors project energy into the world and then use the reflections they get back to understand what’s there.

Passive sensors use energy that’s already in the world, particularly light or heat. A variety of sensors of each type are being tested, and a variety of different technologies could come out on top. But the most important distinction may be whether the sensors that win out are active or passive.
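To make the distinction concrete, the short Python sketch below tags the sensor types discussed in this article as active or passive. It is purely illustrative: the list is an example taxonomy, not a catalogue of any particular company’s hardware.

```python
from dataclasses import dataclass
from enum import Enum


class SensingMode(Enum):
    ACTIVE = "active"    # emits energy (radio waves, laser light, sound) and reads the reflections
    PASSIVE = "passive"  # only absorbs energy already in the scene (visible light, heat)


@dataclass
class Sensor:
    name: str
    mode: SensingMode


# Illustrative taxonomy of the sensor types mentioned in the article
SENSORS = [
    Sensor("radar", SensingMode.ACTIVE),
    Sensor("lidar", SensingMode.ACTIVE),
    Sensor("sonar", SensingMode.ACTIVE),
    Sensor("visible-light camera", SensingMode.PASSIVE),
    Sensor("thermal (infrared) camera", SensingMode.PASSIVE),
]

if __name__ == "__main__":
    for sensor in SENSORS:
        print(f"{sensor.name:28s} -> {sensor.mode.value}")
```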

Specialist companies have started to emerge in this field, focusing entirely on car sensor technology. One of these is Foresight Autonomous Holdings Ltd.

Founded in 2015, Foresight is committed to designing, developing, and commercialising a range of sensor systems and associated technologies for use in autonomous vehicles.

This includes stereo and quad camera systems and the software that will allow a car to interpret the signals from those cameras. These can be used to help avoid accidents between cars and will eventually allow self-driving cars to see and act on objects in their surrounding environment.
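Stereoscopic cameras recover distance by comparing where the same point appears in the left and right images: the disparity between the two, together with the cameras’ focal length and separation, gives depth by triangulation. The sketch below is a generic illustration of that principle using OpenCV’s block-matching stereo matcher; it is not Foresight’s proprietary software, and the focal length, baseline and image filenames are assumed example values.

```python
# Generic stereo-depth sketch: depth = focal_length * baseline / disparity.
# Assumed example values throughout -- not Foresight's software.
import cv2
import numpy as np

FOCAL_LENGTH_PX = 700.0  # assumed focal length, in pixels
BASELINE_M = 0.3         # assumed distance between the two cameras, in metres


def depth_from_stereo(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Approximate per-pixel depth (metres) from a rectified grayscale stereo pair."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities with 4 fractional bits, hence the /16
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan               # pixels with no reliable match
    return FOCAL_LENGTH_PX * BASELINE_M / disparity  # triangulation


if __name__ == "__main__":
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # placeholder filenames
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    depth = depth_from_stereo(left, right)
    print(f"closest visible surface: {np.nanmin(depth):.1f} m")
```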

The company has already drawn attention with its advanced technology. It has attracted leading investors from the local automotive industry and reached a market cap of $100 million as of January 2018. Resources to continue its ambitious research and development programme were assured by the recent announcement of a merger agreement with Tamda Ltd.

Foresight’s leading product is the QuadSight system. One of the distinctive features of the QuadSight system is that it doesn’t rely on pattern recognition to identify when there is an obstacle in its way.

Any object can be detected, regardless of its material, colour or shape. This gives the system an advantage over competitors whose sensors have to be programmed or trained to identify specific hazards.
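One way to detect obstacles without pattern recognition is to work purely from geometry: anything the depth map shows within a chosen range is flagged, whatever its material, colour or shape, with no trained classifier involved. The sketch below illustrates that general idea on a toy depth map; it is an assumption-laden illustration of the principle, not Foresight’s actual algorithm.

```python
# Classification-free obstacle flagging from a depth map.
# Generic illustration only; the 20 m threshold and toy scene are arbitrary.
import numpy as np


def obstacle_mask(depth_m: np.ndarray, max_range_m: float = 20.0) -> np.ndarray:
    """Flag every pixel whose measured depth falls within max_range_m."""
    return np.isfinite(depth_m) & (depth_m < max_range_m)


if __name__ == "__main__":
    # Toy depth map: a surface 5 m away in the middle of an otherwise 50 m scene.
    depth = np.full((120, 160), 50.0)
    depth[40:80, 60:100] = 5.0
    mask = obstacle_mask(depth)
    print(f"{mask.sum()} pixels flagged as potential obstacles")
```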

QuadSight is based on passive rather than active sensors. Its cameras don’t project any sort of energy into the world, but instead absorb the light that’s already available, some of it invisible to the human eye. Passive sensors have two major benefits over active sensors.

One of the most serious problems with active sensors is interference. As a growing number of autonomous vehicles hit the streets, the number of sensors increases.

As long as they use active sensors, this means an increase in the amount of energy being put out into the world by those sensors. As a result, they can end up interfering with each other, and the problem will only grow as more active sensors are deployed. This could lead to objects with low radar cross sections going undetected.

Although active sensors are certified according to safety regulations (FCC / FDA / IEC etc.) and are thus safe, it is important to keep in mind that each device is certified as a separate unit.

At this stage, it is too early to measure the effects of energy exposure emitted by hundreds of vehicles and road infrastructure on road users. Active sensors (especially radars) might pose a health hazard.

QuadSight’s sensors don’t project any sort of energy. As such, they are unaffected by the interference problem and don’t contribute to it. This makes QuadSight a more reliable option than many others available.

“At Foresight, we believe that a car’s vision system should be nothing less than perfect,” said Haim Siboni, CEO of Foresight. “Vision is the foundation of passenger safety, and vision perfection under all weather and lighting conditions is clearly the breakthrough that vehicle makers need to build consumer confidence in order to accelerate autonomous vehicle adoption.”

As the potential of driverless cars grows, a number of companies are making advances in the sector.

Google’s parent company Alphabet, Inc. is exploring the potential of automated vehicles through its subsidiary Waymo. It is experimenting with a wide range of sensors on its vehicles, including active sensors such as sonar, lasers, lidar, and radar, and stereo cameras on the passive side.

One of the distinct features of Waymo’s cars is the way that these different sensors are used together, each contributing something different to the car’s understanding of the world around it.
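A common textbook way of combining readings from different sensors is inverse-variance weighting, in which the most precise sensor contributes the most to the fused estimate. The sketch below illustrates that general principle with invented numbers; it is not a description of Waymo’s actual fusion pipeline.

```python
# Inverse-variance fusion of independent range estimates (textbook sketch).
# The sensors, readings and variances below are invented for illustration.

def fuse(estimates: list[tuple[float, float]]) -> tuple[float, float]:
    """Fuse (value, variance) pairs into a single (value, variance) estimate."""
    weights = [1.0 / var for _, var in estimates]
    fused_value = sum(w * val for (val, _), w in zip(estimates, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance


if __name__ == "__main__":
    # Hypothetical range to the same obstacle, as (metres, variance):
    readings = [
        (31.0, 0.04),  # lidar: very precise
        (33.5, 1.00),  # radar: noisier, but works in fog
        (30.0, 4.00),  # stereo camera: least precise at this range
    ]
    value, variance = fuse(readings)
    print(f"fused range: {value:.2f} m (variance {variance:.3f})")
```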

Ford is approaching the future of driving from a different angle. Its acquisition of Chariot has put it in the business of providing transport to busy commuters, who can book rides in Chariot vehicles.

An app lets its users book a ride with Chariot and propose new routes for the vehicles. Such apps could eventually be used to provide access to driverless transport, with vehicles following pre-programmed routes to pick up travellers without cars of their own.

General Motors has announced that it will begin mass production of its first autonomous vehicle next year. The design of the Cruise AV was acquired by the company in 2016 when it absorbed startup Cruise Automation.

The car will have a dedicated production line at GM’s Orion Township facility, integrating LIDAR sensors produced at its Brownstown plant. When a powerhouse like GM starts mass production of self-driving cars, it’s a sure sign that these vehicles will soon be a major feature of our streets.

The company most recognised for its work in self-driving cars is Tesla, Inc. (NASDAQ: TSLA). Like other companies, it is using a range of different sensors, including visible light cameras. The information is processed using pattern recognition software, which looks for familiar shapes and colours to identify hazards. Though it has suffered some setbacks due to accidents, Tesla is still leading the way in both developing and publicising the potential of self-driving vehicles.
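Pattern recognition of this kind is usually implemented with a trained object detector that scans each camera frame for familiar object categories. The sketch below shows the general approach using an off-the-shelf pretrained detector from torchvision; it is a generic illustration of the technique, not Tesla’s software, and the image filename and confidence threshold are assumed.

```python
# Generic pattern-recognition sketch: a pretrained detector looks for familiar
# object categories in a camera frame. Illustrative only -- not Tesla's software.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Pretrained Faster R-CNN (COCO classes); requires torchvision >= 0.13 for the weights API
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("dashcam_frame.jpg").convert("RGB")  # placeholder filename
with torch.no_grad():
    predictions = model([to_tensor(image)])[0]

for label, score in zip(predictions["labels"], predictions["scores"]):
    if score > 0.8:  # keep only confident detections (threshold is arbitrary)
        print(f"detected COCO class {int(label)} with confidence {float(score):.2f}")
```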

Self-driving cars will soon be an important part of the transport landscape. As that happens, different types of sensors will hit the streets, giving people a chance to see which work best.
