
Elbow to elbow, into the cobot future

May 17, 2019
Collaborative robots are here, as predicted

I recently had an opportunity to attend ProFood Tech, Automate and ProMat trade shows at McCormick Place in Chicago with a colleague.

While we have attended these events in previous years, this year brought us to Chicago with a specific mission. Like many manufacturing entities, we find ourselves challenged to provide enough workforce to run the lines the way we want to.

While the improved economy means fewer people are out of work, the downside is that there aren't enough people to go around. Companies like ours are forced to look to improved automation to meet our obligations.

While the general subject matter of the shows was different, one common thread united them. As my colleague and I walked through the exhibits, the use of collaborative robots was the overwhelming theme. This trend makes perfect sense when faced with the prospect of a long drought in finding suitable candidates to perform the repetitive tasks of assembling and packaging goods. If you can’t find people, then find small, people-sized replacements.

The biggest drawback to using robots in close proximity to actual people is the people themselves. The robot doesn’t care about the objects within the workspace, but the people sure do.

Traditional robot applications involved erecting a secure physical barrier around the robot to ensure that no harm would come to people who happen to wander into range. These earlier applications would identify the extreme range of motion of the robot and then build a wall around the possible path to keep automation and people from coming together by accident.

As technology got more sophisticated, the work envelope began to shrink with the use of devices that limit the reach of the robot in particular directions. Today, a robot application can eliminate the physical barrier altogether by relying on proximity-sensing devices that let the robot avoid, slow down or stop completely in the presence of a person or other object.

The technology of presence sensing has improved dramatically in just a few years. Early versions involved physical contact devices, such as a safety mat that relies on pressure exerted by a weight applied to two conductive plates separated from each other by nonconducting layers.

When compressed, the conductive plates make contact, triggering an output. Another example of a presence-sensing device is a two-hand operator station. In this arrangement, operation of a device depends on physical contact on two electronic pads mounted far enough apart from one another so as to prevent incidental contact with one hand only. This ensures that a person is far enough away from a moving object so as to be deemed safe.
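To make the idea concrete, here is a rough sketch, in Python, of the permissive logic behind a two-hand station. The pad timestamps, timing window and example values are hypothetical, chosen only to illustrate the concept:

```python
SYNC_WINDOW = 0.5  # seconds; both pads must be pressed within this window (assumed value)

def two_hand_permissive(left_pressed_at, right_pressed_at):
    """True only while both pads are held and were pressed nearly together."""
    if left_pressed_at is None or right_pressed_at is None:
        return False            # one or both pads released -> no permissive
    return abs(left_pressed_at - right_pressed_at) <= SYNC_WINDOW

print(two_hand_permissive(10.0, 10.2))   # True: pads pressed 0.2 s apart
print(two_hand_permissive(10.0, 13.0))   # False: one pad tied down early
print(two_hand_permissive(10.0, None))   # False: right pad released
```

The point of the timing window is the anti-tie-down behavior: taping one pad down doesn't help, because the two presses have to arrive nearly simultaneously.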

Both of the above examples have loopholes, however. They are both looking for the presence of a person in a particular position. They don’t cover the scenario where someone is in a position that is unexpected.

To extend this line of thinking, two other types of safety devices came into play.

The first was a light curtain. The concept is simple enough. Situate a series of send/receive light beams across an opening, with every other means of entry into the danger zone completely enclosed. Any one beam broken would immediately disable the machine beyond the beam.

The beams are mounted in parallel and at a fixed gap so as to detect any object larger than the gap between successive beams.
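A light curtain reduces to a very simple rule: if any beam in the array is interrupted, kill the output. A minimal sketch, with hypothetical beam states, might look like this:

```python
def curtain_clear(beam_states):
    """Return True only if every send/receive beam arrives unbroken."""
    return all(beam_states)

beams = [True] * 32          # 32 parallel beams at a fixed gap (assumed count)
print(curtain_clear(beams))  # True: safe to run

beams[17] = False            # a hand interrupts one beam
print(curtain_clear(beams))  # False: trip the safety output
```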

The second approach broadens the scope of the protected area by the use of an area or safety zone scanner.

Earlier scanners were two-dimensional and looked out from the base in a fixed path that pulses back and forth across the width of the covered area. Think of it as a light curtain that rotates through an arc but with only one beam.

Scanners have expanded to the point of being three-dimensional and, in some cases, can see pretty much any direction around a central point, both up and down, much like radar would look for aircraft at varying altitudes.

Software manages the scanned area and allows for exceptions to permit the scanner to ignore inanimate objects within the defined area.
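As a rough illustration of how that exception logic might work, the sketch below checks two-dimensional scanner detections against a protected radius while ignoring a configured region for a known inanimate object. The geometry and values are hypothetical:

```python
PROTECTED_RADIUS = 2.0   # metres around the robot base (assumed)

# Exception regions the scanner should ignore, e.g. a fixed machine leg:
# (angle_min_deg, angle_max_deg, dist_min_m, dist_max_m)
EXCEPTIONS = [(80, 100, 1.4, 1.7)]

def in_exception(angle, dist):
    """True if a detection falls inside a configured exception region."""
    return any(a0 <= angle <= a1 and d0 <= dist <= d1
               for a0, a1, d0, d1 in EXCEPTIONS)

def zone_violated(detections):
    """True if any non-excepted detection is inside the protected radius."""
    return any(dist <= PROTECTED_RADIUS and not in_exception(angle, dist)
               for angle, dist in detections)

print(zone_violated([(90, 1.5)]))   # False: the known machine leg is ignored
print(zone_violated([(45, 1.2)]))   # True: something unexpected is too close
```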

At this point, the conventional thinking of a defined safety area with devices used to monitor any entry/egress points can be thrown out the window. The demand for the use of a collaborative robot changes the perspective from “keep the people out” to “I know they are there, so let’s avoid them.”

A pamphlet from the Automate 2017 show put it this way:

  • Past – Separate the human from the robot.

  • Present – Improved human access to the robot.

  • Future – Close human/robot interaction.

In 2019, the future is here, and the expectation is that robots and humans will be close enough to physically touch each other.

The challenge for technology developers was to come up with ways to alter the behavior limits of the robot depending on its proximity to the human counterpart. Area scanners are used to slow the robot’s movement as the human gets closer to the operating envelope, but there comes a point where stepping down through the zones, from 100% down to 0%, will still bring everything to a halt.
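A minimal sketch of that zone-based speed scaling, with made-up distances and override percentages, might look like this:

```python
ZONES = [          # (minimum distance in metres, speed override in %) - assumed values
    (3.0, 100),    # beyond 3 m: full speed
    (2.0, 50),     # 2-3 m: half speed
    (1.0, 20),     # 1-2 m: creep speed
    (0.0, 0),      # inside 1 m: stop
]

def speed_override(distance_to_person):
    """Return the speed override for the closest detected person."""
    for min_dist, override in ZONES:
        if distance_to_person >= min_dist:
            return override
    return 0

for d in (4.0, 2.5, 1.5, 0.4):
    print(f"person at {d} m -> {speed_override(d)}% speed")
```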

This type of interaction is called near-term, and the method of sensing used to allow activity within close confines is much like two humans working in the same space. The “skin” of the robot becomes the sensing device.

Popular means of sensing through the skin of the robot aren’t much different from our own human senses. Robots are designed using tactile (touch), capacitive, proximity, force, torque and ultrasonic sensors, as well as pads with compression sensors.

This idea of “force field” sensing has dramatically changed the deployment of robots and our conception of what humans can do with robots as a means to produce.

One of the more intriguing developments has been the force sensor. Force-sensing resistors are polymer thick-film devices that decrease in resistance as more force is applied to the active surface.

The robot reacts to the reduced resistance in much the same fashion that a person would react if the two of you were working in very close proximity and one of you made accidental contact. Instinctively, we would pull back from the contact.

The collaborative robot works in much the same manner. Once the sensation (presence sensor) has ceased to be triggered, the robot will resume activity at a slow pace until the normal work path can be ascertained.
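As a rough sketch, assuming the force-sensing resistor is wired into a simple voltage divider, the controller’s reading and response might look something like this. The supply voltage, fixed resistor and thresholds are hypothetical values, chosen only for illustration:

```python
V_SUPPLY = 5.0            # volts across the divider (assumed)
R_FIXED = 10_000.0        # ohms, fixed leg of the divider (assumed)
CONTACT_VOLTS = 2.5       # output voltage treated as contact (assumed)

def fsr_resistance(v_out):
    """Back-calculate FSR resistance from the divider output voltage.
    As force rises, resistance drops and v_out rises."""
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def robot_command(v_out, was_in_contact):
    """Pull back on contact; once contact clears, resume at a slow pace."""
    if v_out >= CONTACT_VOLTS:
        return "pull back and stop"
    if was_in_contact:
        return "resume at slow pace until the normal path is ascertained"
    return "run at programmed speed"

print(fsr_resistance(0.5))          # ~90,000 ohms: pad unloaded
print(fsr_resistance(4.0))          # ~2,500 ohms: firm contact
print(robot_command(4.0, False))    # pull back and stop
print(robot_command(0.5, True))     # resume at slow pace ...
```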

Another interesting use of technology is in torque sensors. When applied to a robot, these sensors monitor the torque used to carry out the normal work path and flag any torque that is contrary to the known operating parameters.

The contrary impulse could come from accidental contact with a human in the operating area. It could be as blunt as direct opposition to the travel path, or as subtle as an ever-so-slight brush against the robot appendage (axis) on its way to the destination.
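A minimal sketch of that comparison, using a hypothetical torque allowance, might look like this:

```python
TORQUE_ALLOWANCE = 2.0   # N*m of deviation tolerated as normal variation (assumed)

def contact_from_torque(expected_nm, measured_nm):
    """True if measured joint torque deviates from the expected profile."""
    return abs(measured_nm - expected_nm) > TORQUE_ALLOWANCE

print(contact_from_torque(12.0, 12.8))   # False: within normal variation
print(contact_from_torque(12.0, 18.5))   # True: blunt opposition to travel
print(contact_from_torque(12.0, 9.1))    # True: a slight brush on the arm
```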

Capacitive sensing comes in the form of microscopically thin “paint” skins applied to the robot itself. The skin of the robot acts as a capacitor, and a touch by any object results in a change in the charge of the skin.

By monitoring this ambient capacitance, the robot controller can respond to a change in that status quo and slow down or stop, according to the programmed algorithm.
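A minimal sketch of that monitoring, with hypothetical baseline and band values, might look like this:

```python
BASELINE_PF = 100.0      # picofarads, ambient capacitance of the skin (assumed)
SLOW_BAND_PF = 5.0       # change that triggers a slow-down (assumed)
STOP_BAND_PF = 15.0      # change that triggers a full stop (assumed)

def skin_response(reading_pf):
    """Compare a capacitance reading against the ambient baseline."""
    delta = abs(reading_pf - BASELINE_PF)
    if delta >= STOP_BAND_PF:
        return "stop"
    if delta >= SLOW_BAND_PF:
        return "slow"
    return "run"

for c in (101.0, 108.0, 123.0):
    print(f"{c} pF -> {skin_response(c)}")
```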

While at the show with my colleague, I was both impressed by the vast number of uses of collaborative robots and cautiously optimistic about their use.

For every collaborating exhibit, lurking nearby was a well-dressed, nondescript person whose job it was to quietly direct people out of harm’s way from these emerging technologies.

For me, the future is exciting, and the ability to enhance our workforce with robotic technology is encouraging, as the struggle to keep lines running in a booming economy is a burden we share with many other producers. However, perhaps it is old age talking, but I am still a little leery of the space around me and not yet completely confident in an inanimate object sharing the same workspace as me.

The technology is advancing at a rapid rate. However, whatever trepidation I might have, my recollection of that pamphlet from the Automate 2017 show is a good reminder that only two years ago the vendors were talking of a future where robots would be rubbing elbows with humans, and here we are in 2019, watching demonstrations of exactly that premonition. Elbow to elbow, forward into the battle we must go.

About the author

Rick Rice | Contributing Editor

Rick Rice is a controls engineer at Crest Foods, a dry-foods manufacturing and packaging company in Ashton, Illinois. With more than 30 years’ experience in the field of automation, Rice has designed and programmed everything from automotive assembly and robots to palletizing and depalletizing equipment, conveyors and forming machines for the plastics industry, but most of his career has focused on OEMs in the packaging machinery industry, with an emphasis on R&D for custom applications.
