Quarterly Bulletin
 
Vision control leads the way in future robotics
 
 

Industrial robots of today require objects to be fixed with absolute precision. Expert programming skills are needed to set up a production line. Robots are cost-efficient only at large production volumes, and the typical user is a large automotive company.

 

Now the time has come for robots in low-volume production, also in small and medium-sized companies, according to Lars Asplund, Professor at the School of Innovation, Design and Engineering at Mälardalen University in Västerås, Sweden.

 

He is working on a research project where image-processing algorithms run in parallel at high speed on custom hardware combining an FPGA with an Intel® Atom™ processor, in a way no one has done before.

– I wouldn’t go as far as saying that we are today’s world leaders in the field, but we have a great opportunity to conquer that position, he says.
 

The project is funded by the Swedish Knowledge Foundation (Swedish: KK-stiftelsen). Three companies contribute to the project. Sensor Control AB in Västerås, Sweden, is a supplier of vision control systems for industrial robots and automation processes. Meeq AB, also in Västerås, provides virtual reality systems as well as accurate measurement of position and orientation. Hectronic contributes with hands-on experience in embedded PC technology and has developed the hardware platform used in the project.
 

Opening up new application areas

The objectives are to increase the performance of vision control systems for robots and to lower their price. Lars Asplund gives an example of how the project’s results will help extend the range of uses for robots. A typical task is to pick objects scattered on a surface.
– Today robots start by scanning the objects with a laser beam. It takes ten seconds to process the visual information before the robot knows the objects’ positions and orientations and can start picking them up.

Through parallel processing in an FPGA, the vision control platform developed by the project will process up to 30 picture frames per second without delay. Integrated into a robot, this vision control system will enable continuous “seeing” and rapid movement towards the objects to make the pick. Faster robots mean shorter production cycles, and thus save money wherever they are used in industry.
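To illustrate the kind of computation involved, here is a minimal sketch in Python (not the project’s actual algorithm; all names and numbers are invented for the example) that estimates an object’s position and orientation in a single binary camera frame using image moments. At 30 frames per second the whole chain has a budget of roughly 33 ms per frame, and it is this kind of per-pixel work that an FPGA can parallelize.

import numpy as np

def object_pose(mask):
    """Return (x, y, angle) for the object pixels marked True in mask."""
    ys, xs = np.nonzero(mask)                # coordinates of object pixels
    cx, cy = xs.mean(), ys.mean()            # centroid = picking position
    # Second-order central moments give the principal axis (orientation).
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    angle = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return cx, cy, angle

# Example: a wide rectangular blob in a 480 x 640 frame.
frame = np.zeros((480, 640), dtype=bool)
frame[200:260, 300:420] = True
print(object_pose(frame))                    # roughly (359.5, 229.5, 0.0)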
 
 
 
Hectronic H6049 - Qseven module with Intel Atom
The hardware used in the project is a turnkey solution from Hectronic: a custom-designed carrier board combined with an Intel® Atom™ Qseven CPU module from Hectronic's standard product range.
 
The H6049 CPU module, in the new compact Qseven form factor, is based on the low-power Intel® Atom™ Z530 processor with the US15W embedded chipset. The H6049 CPU module is mounted on a carrier board with dual 5-megapixel CCD sensors and an FPGA with DSP macrocells. The FPGA is a Xilinx Spartan-3A featuring 16,640 logic cells, 84 DSP48A blocks and up to 1.5 Mbit of block RAM, making it ideal for high-end signal processing applications such as this.
 
The FPGA runs the picture-processing algorithms and advanced functions for accurate, dynamic determination of the position of, and distance to, objects. The FPGA can be remotely updated through the Qseven Ethernet connection via a USB-to-JTAG converter on the carrier board.
 
 
 
 
A robot using high-speed vision control has other advantages compared to the most common industrial robots used today. The precision with which the object is placed, in terms of position and orientation, is no longer as crucial to the quality of the work. A fairly accurate position of the object is enough; the vision control system feeds the robot enough information to adjust where to work on the object to achieve the required quality in the final result.
– Today an expensive fixture is needed to hold the object in a precise position, and furthermore a costly expert to program the robot. The profit is eaten up by these expenses. Manual welding is often the faster and cheaper option.

It will be possible to program the robot without expert knowledge using the information from the vision control system. Let’s say that an edge on an object needs to be welded. First, a three-dimensional picture of the object is built up, and the edge is marked on the computer screen. In principle, that is all the robot needs in order to weld along the correct edge of the object in practice.
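As a rough illustration of that workflow, the Python sketch below (with made-up names and numbers) maps points marked along an edge in the camera’s 3D image into the robot’s base frame, yielding the waypoints of a weld path. The transform T_base_camera is assumed to come from a prior calibration and is not something described by the project itself.

import numpy as np

# Homogeneous camera-to-robot-base transform (example values only, assumed
# to come from a prior hand-eye calibration).
T_base_camera = np.array([
    [1.0, 0.0, 0.0, 0.50],   # rotation part kept as identity for simplicity
    [0.0, 1.0, 0.0, 0.10],
    [0.0, 0.0, 1.0, 0.80],   # camera origin 0.8 m above the robot base
    [0.0, 0.0, 0.0, 1.0],
])

# Points (in metres, camera frame) marked along the edge on the screen.
edge_camera = np.array([
    [0.10, 0.00, 0.40],
    [0.12, 0.00, 0.40],
    [0.14, 0.00, 0.40],
])

# Convert to homogeneous coordinates and map into the robot base frame.
edge_h = np.hstack([edge_camera, np.ones((len(edge_camera), 1))])   # (N, 4)
weld_path = (T_base_camera @ edge_h.T).T[:, :3]                     # (N, 3)

for waypoint in weld_path:
    print("move welding torch to", waypoint)   # hand over to the robot controller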
 

Making use of existing technology

The work of a robot is divided into three categories: sense, plan and act. New and cheap technology to support all three categories of tasks has arrived on the market, which leads Lars Asplund to believe that the breakthrough for robots is imminent. Various types of sensors for acceleration, rotation and tactile sensing are available off the shelf. The development of mobile phones has produced cheap and small cameras. Electric motors weighing 25 kg and delivering 160 hp have seen the light of day.
– Now there’s a possibility to develop small, compact and yet powerful robots. It’s time to actually decide what kind of robots we want to develop and use.
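The sense-plan-act division can be pictured with a minimal Python sketch; the helper functions below are hypothetical and only illustrate how the three categories of work fit together in a control loop, not any particular robot’s software.

import time

def sense():
    """Read the cameras and sensors; return the current scene estimate."""
    return {"objects": [(0.42, 0.17, 1.05)]}        # e.g. (x, y, angle)

def plan(scene):
    """Turn the scene estimate into a short list of motion targets."""
    return [("move_to", obj) for obj in scene["objects"]]

def act(commands):
    """Send the planned commands to the motor controllers."""
    for command in commands:
        print("executing", command)

for _ in range(3):              # three iterations stand in for the endless loop
    act(plan(sense()))
    time.sleep(1 / 30)          # pace the loop to a 30 fps camera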

Lars Asplund is full of ideas.
– We may see robots that dig in the ground, road construction robots and robots that cut down trees.  

The research on intelligent sensor systems at Mälardalen University in Västerås contributes to the Robotdalen initiative. The purpose of the initiative is to raise the region around Lake Mälaren in Sweden to a world-leading position in research, development and manufacturing of robots. Lars Asplund would like to take the opportunity to invite more companies to take part in this initiative, to generate more ideas about areas of use for future robots.

The publication of the final results from the research project is planned for 2010.
 
 
 
 
 
 

Presenting our research company partners

 
 
 
 
 

The company Meeq AB adds vision to armoured vehicles from BAE Systems Hägglunds

BAE Systems plc is a British defence and aerospace company with global interests. BAE is the world's third largest defence contractor and the largest in Europe. BAE Systems is involved in several major defence projects including the F-35 Lightning II, Eurofighter Typhoon and the Queen Elizabeth class aircraft carriers.

 

Problem:

Current battles are often fought in cities and built-up areas, with weapons fired at close range. To protect soldiers from attacks, vehicles must be fully closed and armoured, meaning no windows and no periscopes. At the same time, the crew still needs full supervision of the environment, especially close to the vehicle, and the ability to drive, defend and attack with high accuracy and speed.

 

Solution:

MEEQ AB has delivered a PosEye®-technology based solution featuring few components:
information sources (cameras) mounted on the exterior and individual head-mounted Augmented Reality displays for the soldiers inside (vehicle commander, driver etc.). The PosEye®-based solution provides a high sense of presence in the surrounding area, virtually a “see-through armour”, allowing the vehicle crew to drive with full precision and full awareness while maintaining the required safety measures.
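One simple way such camera-to-display routing could be pictured is sketched below in Python: it merely selects the exterior camera best aligned with the wearer’s tracked head direction. The camera names and directions are invented for the example, and the sketch says nothing about how PosEye® actually works.

import numpy as np

# Unit viewing directions of the exterior cameras in the vehicle frame
# (names and directions invented for the example).
camera_directions = {
    "front": np.array([1.0, 0.0, 0.0]),
    "left":  np.array([0.0, 1.0, 0.0]),
    "right": np.array([0.0, -1.0, 0.0]),
    "rear":  np.array([-1.0, 0.0, 0.0]),
}

def select_camera(head_direction):
    """Return the camera best aligned with the tracked head direction."""
    head = head_direction / np.linalg.norm(head_direction)
    return max(camera_directions, key=lambda name: float(camera_directions[name] @ head))

# Example: the vehicle commander looks slightly left of straight ahead.
print(select_camera(np.array([0.9, 0.4, 0.0])))     # -> "front"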

 

Result:

The PosEye®-based head-worn augmented reality system is now being marketed by BAE Systems Hägglunds.

 

Company website: www.meeq.se

 
 
 
 
 
 
 
 

The company Sensor Control AB adds 3D localization and 3D curve correction to a Skoda robot

An application with cameras is mounted on two industrial robots that cover a car roof assembly with a filler. The splice between the joined parts, roof and side, is covered to create a smooth transition.

 

The position of the chassis/bodywork varies by ±50 mm in all directions, and the OptiMaster system measures the exact coordinates of the chassis in three dimensions. The cameras are then moved along the splice and, using additional measurements, the robot arm is corrected so that the filler is applied in an extremely precise fashion.
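As a simplified illustration of that correction step, the Python sketch below (with made-up numbers; this is not Sensor Control’s software) shifts a nominal filler path by the offset between the measured and nominal chassis positions.

import numpy as np

nominal_chassis = np.array([1200.0, 450.0, 900.0])     # mm, where the body should sit
measured_chassis = np.array([1237.0, 421.0, 912.0])    # mm, where it is actually found
offset = measured_chassis - nominal_chassis            # within the +/- 50 mm window

# Nominal points along the roof splice, taught once for a perfectly placed body.
nominal_path = np.array([
    [1000.0, 400.0, 1400.0],
    [1100.0, 400.0, 1401.0],
    [1200.0, 400.0, 1402.0],
])

corrected_path = nominal_path + offset                 # shift every waypoint by the offset
print(corrected_path)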

 

The final result is subjected to considerable scrutiny and must withstand the critical eyes of every potential car buyer. The OptiMaster 3D system comes equipped with functions for automatic calibration so that results can be verified, achieving repeatability in all steps. The robot combined with the OptiMaster system from Sensor Control achieves a positioning accuracy of better than 0.2 mm in 3D.

 

Company website: www.sensorcontrol.us

 
 

