This project was funded by the ARM Institute in collaboration with the JROBOT Working Group and is focused on aiding DoD efforts in sustainment.
In December 2019, ARM Institute Member organizations responded to a special Call for Proposals centered on DoD sustainment needs. From this Project Call, project teams were selected to pitch their proposals at an in-person Joint Summit: Robotics in Sustainment II event held at the ARM Institute’s headquarters in February 2020. From there, five projects were selected for funding with the ARM Institute awarding $2.35M across the projects. During the execution of the projects, the team members worked closely with DoD personnel to ensure that the projects were on track to generate impact for the DoD.
Technology developed by the Autonomous Coating with Realtime Control and Inspection team successfully applied a coating at the appropriate paint thickness on a test coupon, using a low-fidelity model that autonomously generates robot input parameters from the desired thickness. This demonstrates that developing a simplified, executable model to optimize a coating application is feasible. Based on these results, we conclude that further refinement and development of this technology can potentially produce considerable cost savings and improve quality and safety.
Second, using only simulated data, the team applied machine learning methods to train a computationally efficient defect inspection model. In the demonstration, the model identified approximately 70% of the defects. This demonstrates the feasibility of the approach, which can be improved and extended to other inspection techniques.
Coating operations play an important role in preventing the loss of DoD assets to damage, estimated to cost $20 billion annually. These operations have diverse use cases within the DoD, and logistics and maintenance divisions such as Army G4, NAVSEA 04, and the Air Force Logistics Management Agency have established dedicated coating programs. Coating operations also have significant commercial applications in the airline, shipping, construction, and manufacturing industries.
Applying coatings to vehicles, ships, aircraft, and facilities is currently a time- and labor-intensive operation. These operations are also hazardous, involving carcinogenic and flammable coating materials and work on elevated platforms, and can sometimes lead to tragic losses.
Coating quality and consistency also vary significantly from one operator to another. Insufficient coating leads to part damage (corrosion in metals, erosion in composites), while excess coating results in material waste and added weight, which is especially important in aviation.
The developed technology enables the flexible use of large robotic systems across different paint operations and, by extension, other manufacturing operations.
There are three key principles. First, take the automation to the part rather than the part to the automation. Robot motions are derived from the observed (i.e., scanned) geometry of the part, which is not known in advance. Conventional paint robots in manufacturing lines execute fixed, repetitive motions and cannot adjust when the part geometry changes. The Scan-N-Plan technology from ROS-Industrial (ROS-I) is adopted here. The same approach is applicable to operations such as sanding, polishing, deburring, paint stripping, and cold spray coating.
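To make the scan-and-plan idea concrete, the sketch below generates a back-and-forth raster toolpath over a scanned point cloud at a fixed spray standoff. This is a minimal illustration of the principle, not the project's actual planner; the function name, parameters, and the nearest-point surface lookup are all assumptions for the example.

```python
import numpy as np

def plan_raster_passes(points, standoff=0.2, pass_spacing=0.05, step=0.02):
    """Illustrative scan-and-plan step (hypothetical, simplified):
    given a scanned point cloud of a roughly planar part surface
    (N x 3 array, z up), generate raster waypoints for the spray gun
    held at a fixed standoff distance above the local surface."""
    xmin, ymin = points[:, :2].min(axis=0)
    xmax, ymax = points[:, :2].max(axis=0)
    waypoints = []
    y = ymin
    direction = 1
    while y <= ymax:
        xs = np.arange(xmin, xmax + step, step)[::direction]
        for x in xs:
            # nearest scanned point approximates the local surface height
            d2 = (points[:, 0] - x) ** 2 + (points[:, 1] - y) ** 2
            z = points[np.argmin(d2), 2]
            waypoints.append((x, y, z + standoff))
        y += pass_spacing
        direction *= -1  # boustrophedon (back-and-forth) raster pattern
    return np.array(waypoints)
```

Because the waypoints come from the scan rather than a fixed program, the same routine adapts automatically when the part geometry changes.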
Second, account for the physical properties of the process and their relation to environmental and robotic variables. Painting operations are complex and expensive for non-experts to perform. A model that captures paint deposition is useful for validating paint operations before they are physically performed.
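A deposition model of this kind can be sketched with a first-order mass balance: wet film thickness grows with flow rate and transfer efficiency, and shrinks with gun speed and pass width. This is an illustrative assumption, not the project's actual model, but it shows how such a model can be inverted to generate a robot input parameter (gun speed) from a desired thickness.

```python
def film_thickness(flow_rate, transfer_eff, speed, pass_width):
    """Simplified mass-balance deposition model (an illustrative
    assumption): wet film thickness [m] from one pass, given
    volumetric flow rate [m^3/s], transfer efficiency [0-1],
    gun travel speed [m/s], and effective pass width [m]."""
    return (flow_rate * transfer_eff) / (speed * pass_width)

def speed_for_thickness(target, flow_rate, transfer_eff, pass_width):
    """Invert the model: the gun speed needed to deposit a
    target wet film thickness [m] in a single pass."""
    return (flow_rate * transfer_eff) / (target * pass_width)
```

For example, a 100 µm target film at 1 mL/s flow, 60% transfer efficiency, and a 0.3 m pass width yields a required gun speed of 0.02 m/s under this model; validating such numbers in simulation before spraying is exactly the use case described above.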
Third, create a simulated defect-generation pipeline for training inspection models. Collecting data on real-world defects is expensive and time consuming. Simulation-based defect generation using image-processing techniques allows inexpensive synthetic data to be created for training deep neural network models.
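The general approach can be illustrated as follows: stamp randomized defect-like regions onto clean coating images and emit a paired binary mask, giving labeled training data for free. This sketch is an assumption of how such a pipeline might look (the blob-darkening defect model and all names are hypothetical), not the project's actual pipeline.

```python
import numpy as np

def add_synthetic_defects(image, n_defects=3, rng=None):
    """Illustrative synthetic-defect generator (hypothetical):
    darken random circular blister-like blobs on a grayscale
    coating image and return the image plus a binary defect
    mask usable as a supervised training label."""
    rng = np.random.default_rng(rng)
    img = image.astype(float).copy()
    mask = np.zeros(image.shape, dtype=bool)
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    for _ in range(n_defects):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        r = rng.integers(3, max(4, min(h, w) // 10))
        blob = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
        img[blob] *= rng.uniform(0.3, 0.7)  # darken the defect region
        mask |= blob
    return img, mask
```

Thousands of such image/mask pairs can be generated cheaply and used to train a segmentation or classification network before any real defect data exists.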