How Sensor Fusion Enables AMRs to Maneuver Around Factory Floors Efficiently

Update: April 17, 2024

With increasing instances of people and autonomous mobile robots (AMRs), also called industrial mobile robots (IMRs), working in the same area, multiple inherent safety risks must be addressed. The safe and efficient operation of AMRs is too important to rely on a single sensor technology.

Multi-sensor fusion, or simply “sensor fusion,” combines technologies like light detection and ranging (LiDAR), cameras, ultrasonic sensors, laser obstacle sensors, and radio frequency identification (RFID) to support a range of AMR functions, including navigation, path planning, collision avoidance, inventory management, and logistics support. Sensor fusion also encompasses alerting nearby people to the presence of the AMR.

To address the need for the safe and efficient operation of AMRs, the American National Standards Institute (ANSI) and the Association for Advancing Automation (A3), formerly the Robotic Industries Association (RIA), are developing the ANSI/A3 R15.08 series of standards. R15.08-1 and R15.08-2 have been released, focusing on basic safety requirements and integrating AMRs into a site. R15.08-3 is currently under development and will expand the safety requirements for AMRs, including more detailed recommendations for using sensor fusion.

In anticipation of R15.08-3, this article reviews some of today’s best practices related to safety and sensor fusion in AMRs. It begins with a brief overview of functional safety requirements currently used with AMRs, including generic industrial safety standards like IEC 61508, ISO 13849, and IEC 62061, and human-presence sensing safety standards like IEC 61496 and IEC 62998. It then presents a typical AMR design, details the numerous sensor technologies and representative devices, and looks at how they support functions like navigation, path planning, localization, collision avoidance, and inventory management/logistics support.

Good, better, best

AMR designers have a range of safety standards to consider, starting with general-purpose functional safety standards like IEC 61508, ISO 13849, and IEC 62061. There are also more specific safety standards related to sensing human presence, such as IEC 61496, IEC 62998, and the ANSI/A3 R15.08 series of standards.

IEC 61496 offers guidance for several sensor types. It specifies requirements and makes recommendations for the design, integration, and validation of electrosensitive protective equipment (ESPE) for machines, and it refers to IEC 62061, which covers functional safety of machinery including safety integrity levels (SILs), and to ISO 13849, which covers safety of machinery and safety-related parts of control systems including safety performance levels (PLs) (Table 1).

Requirement: Safety performance in accordance with IEC 62061 and/or ISO 13849-1
  ESPE Type 1: N/A
  ESPE Type 2: SIL 1 and/or PL c
  ESPE Type 3: SIL 2 and/or PL d
  ESPE Type 4: SIL 3 and/or PL e
SIL = safety integrity level; PL = performance level

Table 1: Safety requirements for ESPE by type specified in IEC 61496. (Table source: Analog Devices)

IEC 62998 is newer and can often be a better choice since it includes guidance on implementing sensor fusion, using artificial intelligence (AI) in safety systems, and using sensors mounted on moving platforms outside the coverage of IEC 61496.

R15.08-3, when it’s released, may make the R15.08 series the best choice since it will add safety requirements for users of AMR systems and AMR applications. Likely topics include sensor fusion and more extensive AMR stability testing and validation.

Sensor fusion functions

Mapping the facility is an essential aspect of AMR commissioning. But it’s not a one-and-done activity. It’s also part of an ongoing process called simultaneous localization and mapping (SLAM), sometimes called synchronized localization and mapping. It is the process of continuously updating the map of an area for any changes while keeping track of the robot’s location.
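The core SLAM loop can be sketched in a few lines: advance the pose estimate from odometry, then mark the map cell observed by a range sensor as occupied. The sketch below is a minimal illustration using assumed names and a fixed 10 cm cell size; a production SLAM stack also corrects the pose against the map, which is omitted here.

```python
import math

class OccupancyGrid:
    """Minimal occupancy map: keys are (col, row) cells, values are hit counts."""
    def __init__(self, cell_size_m=0.1):
        self.cell_size_m = cell_size_m  # assumed 10 cm resolution
        self.cells = {}

    def mark_occupied(self, x_m, y_m):
        cell = (int(x_m // self.cell_size_m), int(y_m // self.cell_size_m))
        self.cells[cell] = self.cells.get(cell, 0) + 1

def dead_reckon(pose, distance_m, turn_rad):
    """Advance the (x, y, heading) estimate from wheel-encoder odometry."""
    x, y, heading = pose
    heading += turn_rad
    return (x + distance_m * math.cos(heading),
            y + distance_m * math.sin(heading),
            heading)

# One SLAM-style step: move, then map the obstacle seen by a range sensor.
grid = OccupancyGrid()
pose = (0.0, 0.0, 0.0)
pose = dead_reckon(pose, distance_m=0.5, turn_rad=0.1)   # odometry update
obstacle_range_m = 2.0                                   # e.g., a LiDAR return
ox = pose[0] + obstacle_range_m * math.cos(pose[2])
oy = pose[1] + obstacle_range_m * math.sin(pose[2])
grid.mark_occupied(ox, oy)                               # map update
```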

Sensor fusion is needed to support SLAM and enable the safe operation of AMRs. Not all sensors work equally well under all operating circumstances, and different sensor technologies produce different data types. AI can be used in sensor fusion systems to combine information about the local operating environment (e.g., haze, smoke, humidity, and ambient light levels) with the outputs of different sensor technologies, yielding a more meaningful result than any single sensor can provide.
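As a simple illustration of that idea, the sketch below fuses distance estimates from two sensor types using confidence weights that are derated when the environment degrades a given technology. The derating factors are invented for illustration and are not drawn from any standard or product.

```python
def fuse_distances(readings, smoky=False, noisy=False):
    """Confidence-weighted average of distance readings.
    readings: list of (distance_m, base_confidence, sensor_type) tuples.
    The derating factors below are illustrative assumptions only."""
    weighted_sum = 0.0
    weight_total = 0.0
    for distance_m, confidence, sensor_type in readings:
        if smoky and sensor_type == "lidar":
            confidence *= 0.3   # smoke scatters laser light
        if noisy and sensor_type == "ultrasonic":
            confidence *= 0.5   # acoustic noise disrupts echoes
        weighted_sum += distance_m * confidence
        weight_total += confidence
    return weighted_sum / weight_total if weight_total else None

# Example: LiDAR and ultrasonic roughly agree; smoke shifts trust to ultrasonic.
print(fuse_distances([(2.10, 0.9, "lidar"), (2.25, 0.6, "ultrasonic")],
                     smoky=True))
```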

Sensor elements can be categorized by function as well as technology. Examples of sensor fusion functions in AMRs include (Figure 1; a brief data-structure sketch grouping these categories follows the figure):

  • Distance sensors like encoders on wheels and inertial measurement units (IMUs) using gyroscopes and accelerometers help measure movement and determine the distance traveled between reference positions.
  • Image sensors like three-dimensional (3D) cameras and 3D LiDAR are used to identify and track nearby objects.
  • Communications links, compute processors, and logistics sensors like barcode scanners and radio frequency identification (RFID) devices link the AMR to facility-wide management systems and integrate information from external sensors into the AMR’s sensor fusion system for improved performance.
  • Proximity sensors like laser scanners and two-dimensional (2D) LiDAR detect and track objects near the AMR, including people’s movement.

Figure 1: Examples of common sensor types and related system elements used in AMR sensor fusion designs. (Image source: Qualcomm)
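One way to picture these functional categories in software is as a single fused-input record. The field names below are assumptions for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class FusedSensorInputs:
    """Illustrative grouping of AMR sensor data by function (fields assumed)."""
    wheel_odometry_m: float = 0.0                           # distance: wheel encoders
    imu_yaw_rate_rad_s: float = 0.0                         # distance: gyroscope/accelerometer
    point_cloud: list = field(default_factory=list)         # image: 3D LiDAR / depth camera
    proximity_ranges_m: list = field(default_factory=list)  # proximity: 2D LiDAR / laser scanner
    scanned_tags: list = field(default_factory=list)        # logistics: barcode / RFID reads

inputs = FusedSensorInputs(wheel_odometry_m=0.52, proximity_ranges_m=[1.8, 2.4])
```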

2D LiDAR, 3D LiDAR, and ultrasonics

2D and 3D LiDAR and ultrasonics are common sensor technologies that support SLAM and safety in AMRs. The differences between those technologies enable one sensor to compensate for the weaknesses of the others to improve performance and reliability.

2D LiDAR uses a single plane of laser illumination to identify objects based on X and Y coordinates. 3D LiDAR uses multiple laser beams to create a highly detailed 3D representation of the surroundings called a point cloud. Both types of LiDAR are relatively immune to ambient light conditions, but objects must have a minimum reflectivity at the wavelength emitted by the laser to be detected. In general, 3D LiDAR detects low-reflectivity objects more reliably than 2D LiDAR.
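The X/Y description corresponds to a simple polar-to-Cartesian conversion. The sketch below applies it to a few made-up returns from a single scan plane; a 3D LiDAR extends the same math with elevation angles:

```python
import math

def scan_to_xy(angles_deg, ranges_m):
    """Convert a 2D LiDAR scan (bearing, range) into X/Y points."""
    points = []
    for angle_deg, range_m in zip(angles_deg, ranges_m):
        theta = math.radians(angle_deg)
        points.append((range_m * math.cos(theta), range_m * math.sin(theta)))
    return points

# Three illustrative returns from one scan plane
print(scan_to_xy([0.0, 45.0, 90.0], [1.0, 1.5, 2.0]))
```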

The HPS-3D160 3D LiDAR sensor from Seeed Technology integrates high-power 850 nm infrared vertical-cavity surface-emitting laser (VCSEL) emitters and a highly photosensitive CMOS sensor. The embedded high-performance processor includes filtering and compensation algorithms and supports the simultaneous operation of multiple LiDAR units. The unit has a range of up to 12 meters with centimeter accuracy.

When a 2D LiDAR solution is needed, designers can turn to the TIM781S-2174104 from SICK. It features an aperture angle of 270 degrees with an angular resolution of 0.33 degrees and a scanning frequency of 15 Hz. It has a safety-related working range of 5 meters (Figure 2).

Figure 2: This 2D LiDAR sensor has an aperture angle of 270 degrees. (Image source: SICK)
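The TIM781S figures quoted above imply its per-scan point budget, as this short calculation shows (it simply applies the numbers cited; no additional specifications are assumed):

```python
aperture_deg = 270.0       # scan arc
resolution_deg = 0.33      # angular resolution
scan_rate_hz = 15.0        # scanning frequency

returns_per_scan = aperture_deg / resolution_deg + 1   # fence-post count
points_per_second = returns_per_scan * scan_rate_hz

print(f"~{returns_per_scan:.0f} returns per scan")      # ~819
print(f"~{points_per_second:.0f} points per second")    # ~12,288
```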

Ultrasonic sensors can accurately detect transmissive objects like glass and light-absorbing materials that LiDAR can’t always see. Ultrasonic sensors are also less susceptible to interference from heavy dust, smoke, humidity, and other conditions that can disrupt LiDAR. However, ultrasonic sensors are sensitive to interference from environmental noise, and their detection ranges are generally shorter than LiDAR’s.

Ultrasonic sensors like the TSPC-30S1-232 from Senix can complement LiDAR and other sensors for AMR SLAM and safety. It has an optimum range of 3 meters, compared to 5 meters for the 2D LiDAR and 12 meters for the 3D LiDAR detailed above. This temperature-compensated ultrasonic sensor is IP68-rated in an environmentally sealed stainless-steel enclosure (Figure 3).

Figure 3: Environmentally sealed ultrasonic sensor with an optimum range of 3 meters. (Image source: DigiKey)
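One simple way to exploit this complementarity for collision avoidance is to act on the closest valid return across technologies, so a glass pane missed by LiDAR still stops the AMR when the ultrasonic sensor echoes off it. The sketch below uses the 5-meter and 3-meter working ranges cited above; the "no echo" convention is an assumption for illustration:

```python
def nearest_obstacle_m(lidar_m, ultrasonic_m,
                       lidar_max_m=5.0, ultrasonic_max_m=3.0):
    """Return the closest valid range across both sensors, or None.
    None inputs model 'no echo', e.g., LiDAR looking through glass."""
    valid = []
    if lidar_m is not None and lidar_m <= lidar_max_m:
        valid.append(lidar_m)
    if ultrasonic_m is not None and ultrasonic_m <= ultrasonic_max_m:
        valid.append(ultrasonic_m)
    return min(valid) if valid else None

# Glass pane: LiDAR sees nothing, ultrasonic reports 1.2 m -> act on 1.2 m
print(nearest_obstacle_m(lidar_m=None, ultrasonic_m=1.2))
```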

Sensor fusion usually refers to using several discrete sensors. But in some cases, multiple sensors are co-packaged as a single unit.

Three sensors in one

Visual perception using a pair of cameras to produce stereoscopic images, plus image processing based on AI and machine learning (ML), can enable the AMR to see the background as well as identify nearby objects. Sensors are available that include stereo depth cameras, a separate color camera, and an IMU in one unit.

Stereo depth cameras like the Intel RealSense D455 use two cameras separated by a known baseline to sense depth and calculate the distance to an object. One key to precision is a sturdy steel framework that maintains an exact separation distance between the cameras, even in demanding industrial environments, since the accuracy of the depth perception algorithm depends on knowing the exact spacing between the two cameras.

For example, the model 82635DSD455MP depth camera has been optimized for AMRs and similar platforms and has extended the distance between the cameras to 95 mm (Figure 4). That enables the depth calculation algorithm to reduce the estimation error to less than 2% at 4 meters.

Figure 4: This module includes stereo depth cameras separated by 95 mm, a separate color camera, and an IMU. (Image source: DigiKey)
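The value of the wider baseline follows from the standard stereo relationship, depth = focal length × baseline ÷ disparity: for a fixed disparity error, depth error grows roughly with the square of depth and shrinks as the baseline widens. The focal length and disparity-error values below are assumptions for illustration, not Intel specifications:

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error_m(depth_m, focal_px, baseline_m, disparity_err_px):
    """First-order error model: dZ ~ Z^2 / (f * B) * dd."""
    return depth_m ** 2 / (focal_px * baseline_m) * disparity_err_px

focal_px = 640.0          # assumed focal length in pixels
disparity_err_px = 0.1    # assumed subpixel matching error

for baseline_m in (0.050, 0.095):   # a narrower baseline vs. the D455's 95 mm
    err = depth_error_m(4.0, focal_px, baseline_m, disparity_err_px)
    print(f"baseline {baseline_m * 1000:.0f} mm -> ~{err / 4.0:.1%} error at 4 m")
```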

D455 depth cameras also include a separate color (RGB) camera. A global shutter for up to 90 frames per second on the RGB camera, matched to the depth imager field of view (FOV), improves the correspondence between the color and depth images, enhancing the ability to understand the surroundings. D455 depth cameras integrate an IMU with six degrees of freedom that enables the depth calculation algorithm to include the rate of motion of the AMR and produce dynamic depth awareness estimates.

Lighting and sounding the way

Flashing lights and audible alerts for people near an AMR are important to AMR safety. The lights are usually in the form of a light tower or light strip on the sides of the AMR. They help the robot communicate its intended action(s) to people. They can also indicate status like battery charging, loading or unloading activities, intention to turn in a new direction (like the turn signals on a car), emergency conditions, and so on.

There are no standards for light colors, flashing speeds, or audible alarms. They can vary between AMR makers and are often developed to reflect the specific activities in the facility where the AMR operates. Light strips are available with and without built-in audible alert mechanisms. For example, the model TLF100PDLBGYRAQP from Banner Engineering includes a sealed audible element with 14 selectable tones and volume control (Figure 5).

Figure 5: This light bar annunciator includes a sealed audible element (top black circle). (Image source: DigiKey)
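Because colors, flash rates, and tones are facility-specific rather than standardized, they are often captured in a configuration table. The mapping below is purely hypothetical and is not a Banner Engineering interface:

```python
# Hypothetical facility-specific signal map: state -> (color, flash_hz, tone_id)
SIGNAL_MAP = {
    "charging":   ("blue",   0.0, None),  # steady, silent
    "loading":    ("green",  1.0, None),
    "turn_left":  ("yellow", 2.0, 3),     # tone IDs 1-14 per the light bar above
    "emergency":  ("red",    4.0, 7),
}

def annunciate(state):
    color, flash_hz, tone = SIGNAL_MAP.get(state, ("white", 0.0, None))
    print(f"{state}: {color} light at {flash_hz} Hz"
          + (f", tone {tone}" if tone else ", no tone"))

annunciate("turn_left")
```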

Logistics support

AMRs operate as part of larger operations and are often required to integrate with enterprise resource planning (ERP), manufacturing execution system (MES), or warehouse management system (WMS) software. The communications module on the AMR, coupled with sensors like barcode and RFID readers, enables AMRs to be tightly fused into enterprise systems.

When a barcode reader is needed, designers can turn to the V430-F000W12M-SRP from Omron, which can decode 1D and 2D barcodes on labels or Direct Part Mark (DPM) barcodes. It includes variable distance autofocus, a wide field of view lens, a 1.2-megapixel sensor, a built-in light, and high-speed processing.

The DLP-RFID2 from DLP Design is a low-cost, compact module for reading from and writing to high-frequency (HF) RFID transponder tags. It can also read the unique identifiers (UIDs) of up to 15 tags at once and can be configured to use an internal or external antenna. It has an operating temperature range of 0°C to +70°C, making it suitable for use in Industry 4.0 manufacturing and logistics facilities.
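A hedged sketch of the integration path described above: barcode decodes and RFID tag UIDs are normalized into a single event message for the WMS. The reader functions and message fields are hypothetical placeholders, not the Omron or DLP Design interfaces:

```python
import json
import time

def read_barcode():
    """Placeholder for a barcode reader's decode result."""
    return "0123456789012"

def read_rfid_uids():
    """Placeholder for an RFID module's multi-tag read (up to 15 tags)."""
    return ["E0040100", "E0040101"]

def build_wms_event(amr_id):
    """Normalize scans into one event for the warehouse management system."""
    return json.dumps({
        "amr_id": amr_id,
        "timestamp": time.time(),
        "barcode": read_barcode(),
        "rfid_uids": read_rfid_uids(),
    })

# The event would then be posted to a WMS endpoint (transport not shown).
print(build_wms_event("amr-07"))
```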

Conclusion

Sensor fusion is an important tool for supporting SLAM and safety in AMRs. In anticipation of R15.08-3, which may include references to sensor fusion and more extensive AMR stability testing and validation, this article reviewed some current standards and best practices for implementing sensor fusion in AMRs. This is the second article in a two-part series. Part one reviewed the safe and efficient integration of AMRs into Industry 4.0 operations for maximum benefit.