
Perception technology: Mobile robots in dynamic environments

Identifying objects in space is not a straightforward task for mobile robot developers. The robots require multiple perception technologies that allow them to operate safely in dynamic environments. 

Industrial facilities, from manufacturing and shipping to logistics and warehouse automation, are unforgiving environments. As a developer, you need a robust, comprehensive solution that performs reliably in all of them.

Using Indirect Time of Flight (IToF) technology, ifm's perception technology minimizes the need for human intervention while keeping the total cost of ownership in check. By enabling autonomous robots to make informed decisions, we significantly reduce downtime.

The result is significant added value for your end users: autonomous mobile robots and automated guided vehicles that work faster with fewer human interventions. Productivity, and profit, increase when robots complete more missions per day.

Multimodal: The future of the perception stack

Mobile robotics is not limited to 3D cameras. 2D cameras, 2D and 3D LiDAR, ultrasonic sensors, and more are all "standard" on most mobile robots. The ifm Perception Platform (O3R) allows for multimodal fusion, tackling complex perception challenges and paving the way for advanced mobile robot operations.
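
As a rough illustration of one step in fusing a 2D/3D modality pair, the sketch below back-projects a depth pixel into a 3D point in the camera frame using a simple pinhole model. The intrinsic parameters (fx, fy, cx, cy) are made-up placeholders for illustration, not values from an actual O3R calibration.

```python
import numpy as np

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with measured depth (in meters)
    into a 3D point in the camera frame via the pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Hypothetical intrinsics for a 224 x 172 ToF imager (placeholders)
fx = fy = 110.0
cx, cy = 112.0, 86.0

# A pixel at (150, 100) measured at 1.2 m becomes a 3D point
point = deproject(150, 100, 1.2, fx, fy, cx, cy)
```

Once both sensors' pixels live in a common 3D frame, 2D color and 3D depth can be associated per point, which is the basic building block of multimodal fusion.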

Ease of integration

The O3R platform by ifm is designed to reduce complexity. We synchronize diverse data sets and modalities, ensuring your robot operates with seamless precision.
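
One small piece of that synchronization problem is pairing frames from sensors that run at different rates. A minimal sketch, assuming timestamped frame streams and a nearest-timestamp pairing policy; this is an illustration of the concept, not ifm's actual implementation:

```python
import bisect

def nearest_frame(timestamps, query_ts):
    """Return the index of the frame whose timestamp is closest to
    query_ts. timestamps must be sorted ascending."""
    i = bisect.bisect_left(timestamps, query_ts)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    before, after = timestamps[i - 1], timestamps[i]
    return i if after - query_ts < query_ts - before else i - 1

# Pair a 2D frame with the nearest 3D frame from a 20 Hz stream
ts_3d = [0.00, 0.05, 0.10, 0.15, 0.20]
idx = nearest_frame(ts_3d, 0.117)  # closest 3D frame is at 0.10 s
```

In practice a maximum allowed time offset would also be enforced, so that frames too far apart are never fused.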

Indirect Time of Flight: A leap forward in sensing

Indirect Time of Flight (IToF) revolutionizes how mobile robots perceive depth. By measuring the phase shift of amplitude-modulated light, we offer a cost-effective alternative to traditional LiDAR. The result is detailed 3D imaging for precise obstacle detection.

The ifm IToF system is perfectly tailored for close-range (0–4 m) accuracy. It provides the detailed imaging necessary for mobile robots to navigate complex environments effectively.
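
The principle behind these numbers can be sketched with the standard IToF relations: distance is proportional to the measured phase shift of the amplitude-modulated light, and the modulation frequency sets the unambiguous range. The frequency value below is chosen purely to illustrate a roughly 4 m range; it is not taken from the O3R datasheet.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(phase_rad, f_mod_hz):
    """Distance from the measured phase shift of amplitude-modulated
    light: d = c * phase / (4 * pi * f_mod)."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz):
    """Maximum distance before the phase wraps: c / (2 * f_mod)."""
    return C / (2 * f_mod_hz)

# A ~37.5 MHz modulation frequency gives roughly a 4 m working range
f_mod = 37.5e6
r = unambiguous_range(f_mod)       # ~4.0 m
d = itof_distance(math.pi, f_mod)  # half the range, ~2.0 m
```

Beyond the unambiguous range the phase wraps around, which is one reason close-range operation is the natural fit for this technique.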

Edge computing for enhanced perception

By separating the perception stack from primary processing, we introduce an edge computing layer dedicated to perception. This approach prevents overloading the primary vehicle processor and ensures a smooth data flow, making it easier for robots to make real-time decisions.

Our processing platform leverages the industry-standard NVIDIA Jetson ecosystem. This provides the power of both GPU and CPU without the need to reinvent the wheel.

We prioritize simplicity and flexibility in software integration, making it easier for developers to create and implement their vision.

Hardware components

O3R 2D / 3D head specifications

  • 3D image sensor: pmd 3D ToF chip
  • 3D resolution: 224 x 172 pixels
  • 3D field of view: 105 x 78 º or 60 x 45 º
  • 3D light source: 940 nm infrared
  • 2D image sensor: RGB
  • 2D resolution: 1280 x 800 pixels

OVP vision processing unit specifications

  • Ethernet ports: 2x 1 GigE
  • Camera ports: 6x proprietary 2D/3D camera ports
  • Power: 24 VDC and CAN
  • USB ports: USB 3.0 + USB 2.0
 

Gain a competitive edge

Are you ready to take your mobile robot development to the next level? Fill out the form or contact Tim McCarver directly at tim.mccarver@ifm.com