Progress in Work Package 6: Testbeds – Manufacturing Robots 

Work Package 6 focuses on enhancing the functions of autonomous manufacturing robots. The primary objective is to demonstrate the augmented flexibility and autonomy of robots within manufacturing cells, and to validate the CoreSense technology through practical, real-world applications. This will be achieved through two demonstrators:

  • The first demonstrator is in IMR’s pilot factory in Mullingar. It features a mobile manipulator that performs discrete assembly tasks, tends to various cells, and responds to machine maintenance calls. 
  • The second demonstrator will be developed at the TUD/SAM|XL facilities in Delft. This setup will focus on manufacturing large load-bearing composite structures, utilizing exchangeable end effectors for additive manufacturing, machining, and ultrasonic inspection. 

WP6, together with the other two testbeds in WP7 and WP8, will implement and evaluate the results from the technical work packages of CoreSense: 

  • WP2: Cognitive Architecture: Provides the architectural framework and cognitive models necessary for the autonomous and understanding capabilities of the robots used in WP6.
  • WP3: Cognitive modules and structures: Supplies the cognitive capabilities that enable robots to perform complex tasks and adapt to new scenarios autonomously.  
  • WP4: System lifecycle and toolchain: Ensures that the tools and processes developed are robust and can be integrated into the manufacturing robots.  
  • WP5: ROSification of software assets: Implements ROS2 platform capabilities, facilitating the integration of CoreSense technologies into the ROS ecosystem. 

Both manufacturing demonstrators have developed a set of scenarios to steer the technical research work packages and to evaluate CoreSense capabilities.

Here are some example scenarios:

1. Machine Tending and Part Intralogistics

In this set of activities in the agile manufacturing testbed at the IMR Mullingar facility, the mobile robot (a KUKA KMR equipped with an RGBD camera and F/T feedback) unloads manufactured parts from the production area in response to an on-demand call and transports them to the inspection station for quality control, as illustrated in Figure 1.

Fig 1(a) On-demand Machine Tending Call 
Fig 1(b) Part Unloading from 3D Plastic Printer 
Fig 1(c) Reactive Path Planning and Part Transportation 
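The on-demand call handling above can be sketched as a simple priority-based dispatcher. This is a minimal illustrative sketch only; the class and method names below are assumptions for the example and are not part of the IMR testbed software.

```python
from dataclasses import dataclass, field
from queue import PriorityQueue

@dataclass(order=True)
class TendingCall:
    priority: int                           # lower number = more urgent
    machine_id: str = field(compare=False)  # machine raising the call

class MachineTendingDispatcher:
    """Queues on-demand tending calls and dispatches the mobile robot
    to the most urgent machine first."""

    def __init__(self):
        self._calls = PriorityQueue()

    def request_tending(self, machine_id: str, priority: int = 5) -> None:
        # A machine (e.g. a 3D printer with a finished part) raises a call.
        self._calls.put(TendingCall(priority, machine_id))

    def next_task(self):
        """Return the next (machine, destination) pair, or None if idle."""
        if self._calls.empty():
            return None
        call = self._calls.get()
        # Unloaded parts are always transported to inspection for QC.
        return call.machine_id, "inspection_station"
```

A call with a lower priority number is served first, so an urgent machine-maintenance call can jump ahead of routine part unloading.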

2. Part Inspection and Human Robot Interaction  

In this scenario of the agile manufacturing testbed at the IMR Mullingar facility, the manufactured parts undergo inspection using model-based 3D reconstruction and CAD comparison to detect, localize, and assess irregularities. Parts that meet the predefined threshold are deemed to have passed, while the rest are either discarded or reprocessed. Additionally, during the inspection cycle, a human oversees the process and is involved in ground truth generation and validation to avoid false positives and negatives from the automated inspection pipeline, as illustrated in Figure 2.

Fig 2 Human Robot Collaboration 
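The threshold-based pass/fail decision and the human-in-the-loop override can be sketched as follows. The function names and the specific threshold values are illustrative assumptions, not values from the actual inspection pipeline.

```python
def classify_part(deviations_mm, pass_threshold=0.5, reprocess_threshold=2.0):
    """Classify a part from per-point deviations (mm) between the
    reconstructed 3D model and the CAD reference.

    Thresholds here are placeholders; real values depend on part tolerances.
    """
    worst = max(deviations_mm)
    if worst <= pass_threshold:
        return "pass"
    if worst <= reprocess_threshold:
        return "reprocess"
    return "discard"

def verified_result(deviations_mm, human_label=None, **thresholds):
    """Combine the automated verdict with human oversight.

    A human-provided ground-truth label overrides the automated result,
    catching false positives and negatives from the pipeline."""
    auto = classify_part(deviations_mm, **thresholds)
    return human_label if human_label is not None else auto
```

The key design point is that the automated classifier runs first, and the human validator only intervenes when the ground truth disagrees with the automated verdict.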

3. Digital Twinning and Product Assembly 

In this section of the agile manufacturing process, the digital twin is queried to perform part of the assembly task (precise picking and placement). The control policy is then transferred to the real setup for reproduction and generalization across a diverse set of candidate objects. The assembly process involves part localization, grasp planning, grasping and manipulation, parts sorting, and product assembly, as illustrated in Figure 3. 

Fig 3(a) Digital Twin of Assembly Station 
Fig 3(b) Real-world Setup of Assembly Station 
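The sim-to-real transfer described above can be expressed by running the same control pipeline against two interchangeable executors: one backed by the digital twin, one by the real robot. This is a hedged sketch; the step names and `executor` interface are assumptions for illustration.

```python
# Pipeline stages from the scenario: localization, grasp planning,
# grasping and manipulation, sorting, and final assembly.
ASSEMBLY_STEPS = [
    "localize_part",
    "plan_grasp",
    "grasp_and_manipulate",
    "sort_parts",
    "assemble_product",
]

def run_assembly(part, executor):
    """Run the assembly pipeline for one part.

    `executor` is a callable (step, part) -> None; passing the digital-twin
    simulator validates the policy in simulation, and passing the real-robot
    driver reproduces the same policy on the physical setup."""
    executed = []
    for step in ASSEMBLY_STEPS:
        executor(step, part)
        executed.append(step)
    return executed
```

Because the pipeline only depends on the executor interface, the control policy validated in the digital twin can be replayed unchanged on the real station and generalized across candidate objects.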

4. Inspection in the Presence of Humans

In practice, an assembly area cannot be completely cleared of people while the robot carries out its inspection. This means that the robot needs to be able to perform quality inspections in the presence of human operators. Although people will not directly interfere with the scanning operation, for safety the robot needs to be aware of where people are and make decisions accordingly. This scenario will take place at the SAM|XL facilities in Delft.

CoreSense technology will aid in: 

  • Detecting humans and determining if tasks can continue collision-free. 
  • Making decisions on subsequent steps should a collision risk arise, potentially halting or pausing the task until it is safe to resume. 
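The two bullet points above amount to a detect-then-decide loop; a minimal sketch follows. The function name and the safety radius are illustrative assumptions, not parameters of the SAM|XL demonstrator.

```python
from enum import Enum

class Action(Enum):
    CONTINUE = "continue"  # no collision risk: scanning proceeds
    PAUSE = "pause"        # collision risk: halt until it is safe to resume

def inspection_action(human_distances_m, safety_radius_m=1.5):
    """Decide whether the scanning task may continue, given the distances
    (in metres) of detected humans from the robot's tool path.

    `safety_radius_m` is a placeholder value for illustration only."""
    if any(d < safety_radius_m for d in human_distances_m):
        return Action.PAUSE
    return Action.CONTINUE
```

In the demonstrator this decision would be re-evaluated continuously as human detections update, so the task resumes automatically once the area around the tool path is clear.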