Charlotte Stonestreet
Managing Editor
Furniture assembly using robot vision
30 October 2018
As part of a research project, scientists at Nanyang Technological University, Singapore, have developed a robot that can independently assemble the individual components of an IKEA chair without interruption in less than ten minutes. The robot consists of an Ensenso N35 3D camera from IDS and two robot arms equipped with grippers to pick up objects.
The robot hardware is designed to mimic how people assemble objects: the "eyes" are replaced by a 3D camera and the "arms" by industrial robot arms capable of moving in six axes. Each arm is equipped with parallel grippers for picking up objects. Force sensors attached to the wrists determine how strongly the "fingers" grip and how firmly they bring objects into contact with each other. The robot starts the assembly process by taking 3D images of the parts lying on the ground to create a map of the estimated positions of the various components. This task is performed by an Ensenso 3D camera from IDS. The camera works according to the projected texture stereo vision principle, which imitates human vision: two cameras acquire images of the same scene from two different positions. Although both cameras see the same scene content, the objects appear at different image positions along each camera's projection rays. Special matching algorithms compare the two images, search for corresponding points and record all point displacements in a disparity map. From this, the Ensenso software can determine the 3D coordinates of each individual image pixel or object point, in this case the chair components.
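The link between disparity and depth can be illustrated with a few lines of code. This is a minimal sketch of the stereo principle, not the actual Ensenso SDK: for a calibrated, rectified stereo pair, the depth of a matched point follows from its disparity as Z = f · B / d, where f is the focal length in pixels and B the baseline between the cameras. The numbers below are hypothetical.

```python
def depth_from_disparity(disparity_px: float,
                         focal_px: float,
                         baseline_m: float) -> float:
    """Return depth (metres) for one matched pixel pair.

    disparity_px : horizontal shift of the point between the two images
    focal_px     : focal length expressed in pixels
    baseline_m   : distance between the two camera centres in metres
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Example with made-up values: f = 1400 px, baseline 0.1 m.
# A disparity of 70 px places the chair part 2 m from the camera.
print(depth_from_disparity(70.0, 1400.0, 0.1))  # → 2.0
```

The larger the disparity, the closer the point: nearby chair parts shift more between the two views than distant ones, which is exactly the cue the matching algorithms exploit.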
Robot vision complexity
The challenge is to locate the components as precisely, quickly and reliably as possible in a cluttered environment. This is where the Ensenso camera's high-power projector comes in: using a pattern mask, it projects a high-contrast texture onto the object surface, even under difficult lighting conditions. The projected texture supplements the weak or non-existent surface structure of the IKEA chair components.
Although not required for this application, the N35 model used here goes one step further: thanks to the integrated FlexView projector technology, the pattern projected onto the components' surfaces can be shifted to vary the texture. Acquiring multiple image pairs of the same scene with different textures yields many more image points, so the chair components are captured in 3D at a much higher resolution, making them easier for the robot to recognize. Another advantage is the robot hand-eye calibration function of the Ensenso software. Using a calibration plate, it determines the position of the camera coordinate system (in this case a stationary camera) relative to the robot's base coordinate system, so that component positions seen in the image can be converted into robot coordinates. This enables the robot's hand to react precisely to the image information and reach its destination exactly.
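What hand-eye calibration buys in practice can be sketched as a coordinate transform. The following is an illustrative example, not the Ensenso API: once the fixed pose of the camera in the robot-base frame is known (here a made-up matrix T_base_cam; the Ensenso software computes the real one from calibration-plate views), any 3D point the camera reports can be mapped into robot coordinates for grasping.

```python
import numpy as np

def to_base_frame(T_base_cam: np.ndarray, p_cam: np.ndarray) -> np.ndarray:
    """Map a 3D point from camera coordinates to robot-base coordinates
    using a 4x4 homogeneous transform."""
    p_h = np.append(p_cam, 1.0)          # homogeneous coordinates
    return (T_base_cam @ p_h)[:3]

# Hypothetical setup: camera mounted 1.5 m above the robot base,
# looking straight down, so the camera's y and z axes are flipped
# relative to the base frame.
T_base_cam = np.array([[1.0,  0.0,  0.0, 0.0],
                       [0.0, -1.0,  0.0, 0.0],
                       [0.0,  0.0, -1.0, 1.5],
                       [0.0,  0.0,  0.0, 1.0]])

part_in_cam = np.array([0.2, 0.1, 1.4])   # chair part seen 1.4 m away
print(to_base_frame(T_base_cam, part_in_cam))  # → [ 0.2 -0.1  0.1]
```

Without this calibrated transform, an accurate 3D measurement would still be useless to the arms, since "where the part is in the image" and "where the gripper must go" live in different coordinate systems.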
Assembly in less than ten minutes
"For a robot, putting together an IKEA chair with such precision is more complex than it looks," explains Professor Pham Quang Cuong of NTU. "The job of assembly, which may come naturally to humans, has to be broken down into different steps, such as identifying where the different chair parts are, the force required to grip the parts, and making sure the robotic arms move without colliding into each other. Through considerable engineering effort, we developed algorithms that will enable the robot to take the necessary steps to assemble the chair on its own." The result: the NTU robot installs the "Stefan" chair from Ikea in just 8 minutes and 55 seconds. According to Professor Pham Quang Cuong, artificial intelligence will make the application even more independent and promising in the future: "We are looking to integrate more artificial intelligence into this approach to make the robot more autonomous so it can learn the different steps of assembling a chair through human demonstration or by reading the instruction manual, or even from an image of the assembled product.”
The robot developed by the scientists at NTU Singapore is used for research into dexterous manipulation, an area of robotics that requires precise control of the forces and movements of specialised robot hands or fingers. This demands seamless interaction of all hardware and software components, and 3D image processing with Ensenso stereo 3D cameras is the key to the solution, delivering not only accuracy but also economy and speed. This marks real progress in furniture assembly – and not only there.