Charlotte Stonestreet
Managing Editor
Robots with vision: There’s an App for that!
03 July 2020
We all take downloading an App onto our smartphones for granted - but what about something as complex as configuring and commissioning vision guidance for robots: will there ever be a ready-made ‘App’ for that? Neil Sandhu, SICK’s UK Product Manager - Imaging, Measurement, Ranging & Systems, looks at the issues.
Vision, and especially 3D vision, has historically been approached with caution by all but the experienced few, especially when it comes to programming and set-up. Thankfully, that is coming to an end.
The foundations were laid several years ago, and it is already commonplace to download a ready-made “App” to program and configure a smart vision sensor as part of a robot or automated mobile vehicle application.
For SICK, our AppSpace software development environment, together with a growing range of programmable cameras, was the game-changer that demystified and democratised machine vision for our customers. SICK’s “AppPool” is already a go-to source of application-specific solutions for programmable devices.
In the past, vision, and especially 3D vision, required specialist programming expertise. Now even skilled vision engineers can save time and cost by customising an App designed to run on a SICK programmable vision camera or sensor. Meanwhile, a growing ‘basket’ of ready-made Apps for robot guidance applications is delivering “plug and play” systems for end-users with minimal set-up effort.
Among the latest is a Colour Inspection and Sorting SensorApp that provides an ‘out of the box’ solution when teamed with the ultra-compact SICK Pico- or midiCam 2D streaming cameras. The Colour Inspection and Sorting SensorApp can be used to check that goods, assemblies or packs on a conveyor are the right size or colour. It can count objects of different sizes and colours as well as validate the correct colour or colour gradations. Objects with anomalies, such as the wrong colour or size, can be identified for rejection, and the integrity and completeness of secondary packaging can be confirmed.
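For illustration only, the sketch below shows the general principle behind this kind of colour-and-size check using the open-source OpenCV library, rather than SICK’s own SensorApp code: a target colour is thresholded in HSV space and objects are counted only if their area falls within an accepted window. The colour limits, area window and image file name are all assumed values.

```python
# A generic colour-and-size check (not SICK's implementation): threshold a
# target colour in HSV space, then count blobs whose area is within limits.
import cv2
import numpy as np

def count_valid_objects(image_bgr, hsv_lo, hsv_hi, min_area, max_area):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo, np.uint8), np.array(hsv_hi, np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return sum(1 for c in contours if min_area <= cv2.contourArea(c) <= max_area)

# Example: count 'red' objects between 500 and 5000 px^2 in a captured frame
# ('conveyor_snapshot.png' is an assumed test image).
frame = cv2.imread("conveyor_snapshot.png")
count = count_valid_objects(frame, (0, 120, 70), (10, 255, 255), 500, 5000)
print(f"{count} objects matched the taught colour and size window")
```

A real inspection would also compare the count against an expected packing pattern before passing or rejecting the pack.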
Robot & cobot guidance
SICK’s stated mission is to simplify vision for a growing number of adopters, including vision guidance for collaborative robots or ‘cobots’. In a typical cobot application, a single camera running an App talks to the robot. It can be ‘trained’ to find the shape of a part or product, then tell the robot, very accurately, how to pick it up and where to place it. Critically, the camera talks directly to the robot and there is no control system in between.
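As a rough illustration of that direct exchange, the Python sketch below assumes a hypothetical trigger command and JSON reply over TCP/IP; the IP address, port, command and field names are invented for clarity and are not taken from any SICK protocol.

```python
# Hypothetical camera-to-robot-cell exchange: trigger the camera, read back a
# pick pose, and act on it. The command and reply format are assumptions.
import json
import socket

CAMERA_ADDR = ("192.168.0.10", 2114)   # assumed sensor IP address and port

def request_pick_pose() -> dict:
    """Ask the camera to locate the taught-in part and return a pick pose."""
    with socket.create_connection(CAMERA_ADDR, timeout=2.0) as conn:
        conn.sendall(b"LOCATE\n")                # hypothetical trigger command
        reply = conn.recv(1024).decode("utf-8")  # e.g. '{"x": 142.1, "y": 87.5, "angle": 12.3, "score": 0.97}'
    return json.loads(reply)

pose = request_pick_pose()
if pose["score"] > 0.9:                          # only act on confident matches
    print(f"Pick at x={pose['x']} mm, y={pose['y']} mm, rotated {pose['angle']} deg")
```

In a real cell the reply would go straight into the robot controller’s program rather than a PC script, but the principle - locate, report a pose, pick - is the same.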
Among the first to enable reliable, continuous in-line product detection to be customised for robotic belt picking was SICK’s Trispector P1000 programmable 3D vision camera. It offers the option of customised programming or use of the ready-made SICK Belt Pick Toolkit App, which supports robots from the leading manufacturers. The SICK Trispector P Beltpick comes ‘out of the box’ as a 3D vision-guided belt-picking solution for both industrial and collaborative robots. With the improved z-axis control available through 3D vision, even products with complex profiles can be picked from variable heights without risk of damage.
Following the success of the 3D beltpick application for its Trispector, SICK moved on to release vision solutions to facilitate two of the most common robot tasks: pick and place and bin picking. They provide opportunities for production teams to replace repetitive or physically-demanding manual tasks more easily and more affordably with robot solutions, are straightforward to set up and can be easily integrated with a cobot system.

The SICK PLOC2D is a 2D vision system for robot localisation of parts, products or packages, based on the SICK Inspector P vision sensor. Powered by a SICK-developed algorithm, it offers simple integration with pick and place robots, and can be used with static workstations, moving belts or bowl feeders to handle much smaller parts than was previously possible.
The SICK PLOC2D provides a powerful 2D imaging system which connects directly to the robot controller or PLC. An EasyTeach function matches taught-in shapes against the object down to a 0.5 pixel resolution, with a rotational measurement to 0.1°. The approved shape and its location in the field of view are output to the robot controller to guide picking.

SICK’s PLB 520 3D robot-guidance system has enabled bin picking of smaller objects in more confined containers than was previously possible. Typical tasks, such as picking specific small parts like bolts from a mixed bin and placing them on a conveyor, or selecting part-completed items and placing them on a press or machining centre, have now become more feasible.

SICK collaborated with Universal Robots to develop the SICK Inspector PIM60 URCap for pick and place, quality inspection and measurement. By integrating Inspector PIM60 2D vision sensors with Universal Robots’ UR3, UR5 or UR10 robots, an adaptable vision-guided cobot system was created that can be configured without the need for a separate PC or specialist software expertise. The in-camera software guides you through the set-up and calibration process, so even an engineer who is new to 2D vision robot guidance can develop an effective solution simply.
Meanwhile, experienced users still benefit from the directness and simplicity of the set-up and parameter change procedures, as well as the handling capabilities of the range of Universal Robots that the SICK Inspector PIM60 URCap supports.
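To give a flavour of what such a calibrated 2D guidance system does internally, the sketch below maps a sub-pixel image detection into robot coordinates using an affine calibration transform; the matrix values and detection figures are invented for illustration and are not taken from the PLOC2D or Inspector PIM60.

```python
# Sketch of the coordinate mapping a calibrated 2D guidance system performs:
# an image-plane detection (pixels) is converted into robot-frame millimetres.
import numpy as np

# Assumed affine calibration: maps homogeneous pixel coordinates [u, v, 1]
# to robot-frame [x, y] in mm (values here are purely illustrative).
CALIB = np.array([
    [0.25,  0.00, -160.0],   # mm per pixel in x, plus offset
    [0.00, -0.25,  120.0],   # mm per pixel in y (image y runs downwards), plus offset
])

def pixel_to_robot(u: float, v: float) -> tuple[float, float]:
    x, y = CALIB @ np.array([u, v, 1.0])
    return float(x), float(y)

# A detection reported at sub-pixel resolution with a rotation angle:
u, v, angle_deg = 812.4, 377.9, 23.7
x_mm, y_mm = pixel_to_robot(u, v)
print(f"Pick target: x={x_mm:.1f} mm, y={y_mm:.1f} mm, angle={angle_deg:.1f} deg")
```

The guided in-camera calibration routine mentioned above in effect establishes that pixel-to-robot transform, which is why no separate PC or specialist expertise is needed.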
AGV forklifts
The most recent Apps to be added into the AppPool are intended for use with SICK’s time-of-flight 3D snapshot camera, the Visionary T-AP. They promise to cut out delays associated with lining up both automated and manual forklifts to load pallets in high-bay warehouses, as well as positioning AGVs to collect dollies.
The SICK Pallet Pocket and SICK Dolly Positioning SensorApps are supplied already loaded onto the camera and ready for use. They work by positioning the camera in front of the pocket or dolly chassis within a range of 1.5m to 3m. Using a single shot of light, the SICK Visionary T-AP 3D camera captures a 3D image, then pre-processes and evaluates the co-ordinates of the pallet pocket or space under the dolly before outputting to the vehicle controller. The information can also be sent to a driver display to aid manual forklift operation, particularly useful in high-bay warehouses. Setting up the sensor is easy with SICK’s SOPASair configuration and diagnostics tool, and it can be easily adjusted to a wide range of pallet and dolly types.
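As a simple, hypothetical illustration of how a vehicle controller might consume such a result, the sketch below checks the stated 1.5m to 3m working range and derives the lateral and angular corrections needed to line the forks up; the field names and figures are assumptions, not the SensorApp’s actual output format.

```python
# Hypothetical handling of a pallet-pocket position report on the vehicle side:
# validate the working range, then compute alignment corrections.
from dataclasses import dataclass

@dataclass
class PocketResult:
    x: float    # distance from camera to the pallet pocket, metres
    y: float    # lateral offset of the pocket centre, metres (left positive)
    yaw: float  # pallet rotation relative to the camera axis, degrees

def alignment_correction(result: PocketResult) -> tuple[float, float]:
    """Return (lateral shift in m, heading change in deg) to centre the forks."""
    if not 1.5 <= result.x <= 3.0:
        raise ValueError("Pocket outside the 1.5-3.0 m detection range")
    return -result.y, -result.yaw

shift, turn = alignment_correction(PocketResult(x=2.1, y=0.08, yaw=-1.5))
print(f"Shift {shift:+.2f} m, rotate {turn:+.1f} deg before driving the forks in")
```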
Accessible to all
The appeal of machine vision and robotics lies in the ability to conduct relentless and repetitive tasks more reliably than a human operator ever could. But this aspiration to make automated systems ‘see’ objects as we can was always an expensive and long-term undertaking, requiring specialist programming, huge amounts of processing power and bulky equipment.
Now, with the advent of intelligent vision sensors and ready-made software, all that has changed. Vision technology is keeping pace with robot and cobot development to deliver automated solutions accessible to all.