Tracking and traceability have applications across many diverse sectors. This project focuses on an open and modular design that can be flexibly configured to benefit those sectors.

Project

This project centers on component tracking and traceability across the entire production line using sensor fusion, computer vision, robotics, and deep learning-based technologies, improving accuracy and lowering cost. The approach is to build a sensor fusion package that will scan the factory space and construct accurate images and a 3D volumetric map of the space and surfaces across multi-spectral modalities. Additionally, the technology will be designed for flexible integration with existing PLM and factory automation systems, allowing for widespread, seamless adoption.

Objective

Create a system that will improve component tracking and traceability using an open and modular design for flexible configuration within existing PLM and factory automation systems.

Technical Approach

The sensor fusion package will scan the factory floor to build accurate images and a 3D volumetric map of the space and surfaces. Deep learning-based vision and perception algorithms will extract the relevant parts from the information-rich sensor data. A mobile robotic sensory platform extends the sensors' range and coverage in complex factory environments. Tracking algorithms then merge the sensor data with the process model, improving accuracy and enabling traceability. Additionally, the project centers on an open and modular design for flexible configuration and seamless integration with existing PLM systems.
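One step of the pipeline above can be sketched in code: fusing 3D points from multiple sensor scans into a voxel occupancy map, then scoring how well a candidate part detection overlaps that map as a crude association signal for merging detections with the process model. This is a minimal illustrative sketch, not the project's implementation; the voxel size, function names, and overlap score are all assumptions made for the example.

```python
# Hypothetical sketch: fuse point clouds into a sparse voxel occupancy map
# of the factory floor, then score a part detection against the fused map.
import numpy as np

VOXEL_SIZE = 0.05  # assumed map resolution: metres per voxel edge


def points_to_voxels(points, voxel_size=VOXEL_SIZE):
    """Quantise an (N, 3) array of 3D points into unique voxel indices."""
    idx = np.floor(np.asarray(points) / voxel_size).astype(np.int64)
    return np.unique(idx, axis=0)


def occupancy_map(points, voxel_size=VOXEL_SIZE):
    """Build a sparse occupancy map as a set of occupied voxel index tuples."""
    return {tuple(v) for v in points_to_voxels(points, voxel_size)}


def part_overlap(part_points, occupied, voxel_size=VOXEL_SIZE):
    """Fraction of a part's voxels already present in the fused map --
    a simple association score for matching a detection to the map."""
    part_vox = {tuple(v) for v in points_to_voxels(part_points, voxel_size)}
    if not part_vox:
        return 0.0
    return len(part_vox & occupied) / len(part_vox)


# Usage: fuse two simulated sensor scans (e.g. from a mobile platform at two
# poses), then score a detection whose points were already observed.
scan_a = np.random.default_rng(0).uniform(0.0, 1.0, size=(500, 3))
scan_b = np.random.default_rng(1).uniform(0.0, 1.0, size=(500, 3))
fused = occupancy_map(scan_a) | occupancy_map(scan_b)
score = part_overlap(scan_a[:50], fused)  # previously seen points: full overlap
```

Set-union fusion keeps the map sparse and makes merging scans from a moving sensor platform a constant-time operation per voxel; a real system would additionally register scans into a common frame before fusing.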

Participants

HEBI Robotics, ITAMCO, Princeton University, University of Washington