
When Will The Robot Revolution Begin?

We will have to wait a few more years to find out whether the package delivery drones shown by Amazon and DHL are more than a marketing gimmick. Other autonomous drones have long been in service, as have self-driving cars, at least in pilot projects. But before AI robots can populate this world in huge numbers, they need to overcome some technical and legal hurdles – and convince us of their benefits. Package delivery drones are hardly suitable for large-scale deployment, but they are an interesting solution for special deliveries, for example in medical emergencies.


Self-flying drones like the md4-1000 from Microdrones inspect industrial
systems (such as windmills) for damage

This is evident from the pilot project Defi-Copter conducted in Germany, in which a drone carried a defibrillator, summoned via an app, to hard-to-reach areas. The prototype covered a radius of ten kilometres and dropped the defibrillator at the location using a parachute system. Its developer Height-Tech, like the companies Aibotix and Microdrones, has been making such drones for various purposes for a few years now. The small aircraft can record aerial films, inspect windmills, bridges or power poles, and observe crowds at major events. Prices range from about 300 Euros (~RM1,355) for a hobby drone from Parrot to more than 40,000 Euros (~RM180,420) for the md4-1000 from Microdrones that was used by DHL. The high cost of professional drones is primarily due to the additional devices used: the md4-1000 can be fitted with simple cameras as well as with a thermal imaging camera or a gas sensor. Once these attachments are removed, the propeller-driven drone models are hardly different from one another. The standard fittings include sensors (see infographics on the next page) such as a gyroscope that measures axial inclination or ultrasound sensors that measure distance. The signals from the sensors and the control unit are processed by a small on-board computer; the Parrot drone is equipped with a 1GHz ARM Cortex-A8 (also used in the first Samsung GALAXY S) with 1GB RAM.
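To give a flavour of what that on-board computer does with the sensor data, here is a minimal, illustrative sketch of a complementary filter – a common way (not necessarily the one these drones use) to fuse a gyroscope's rotation rate with an accelerometer's gravity reading into a single tilt angle. All values and the blending factor are hypothetical:

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into one tilt angle (degrees).
    gyro_rate is angular velocity in deg/s; accel_x/accel_z in m/s^2.
    The gyro is accurate short-term but drifts; gravity corrects it slowly."""
    gyro_angle = angle_prev + gyro_rate * dt                   # integrate the gyro
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))   # gravity reference
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Level hover: no rotation measured, gravity points straight down -> angle stays 0.
angle = complementary_filter(0.0, 0.0, accel_x=0.0, accel_z=9.81, dt=0.01)
```

In a real flight controller this runs hundreds of times per second, one instance per axis.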

Drones are flown by remote control (manually) or by software (automatically). The drones from Microdrones cover air routes that are defined either with the help of Google Earth satellite photos or on the basis of aerial photographs taken by the drone itself. With the additional module POI-Orbit, it is also possible to generate an air route as an orbit around an object (the Point of Interest, or POI). The drone can then circle a windmill, for instance, and take videos and images of it for inspection. While many countries have not yet passed concrete laws for drones, most require that although a drone may fly by itself, someone must keep it within visual range so that a crash can be prevented via remote control. Moreover, a drone may neither fly above 100 metres nor invade someone's privacy (for example, by taking pictures).
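An orbit route like POI-Orbit's can be reduced to simple geometry: place evenly spaced waypoints on a circle around the point of interest. The sketch below is purely illustrative (it is not Microdrones' implementation) and uses a flat-earth approximation, which is fine for orbits of a few hundred metres; the coordinates are hypothetical:

```python
import math

def poi_orbit(lat, lon, radius_m, n_points=12):
    """Generate n_points (lat, lon) waypoints on a circle of radius_m
    around a point of interest, using a flat-earth approximation."""
    m_per_deg_lat = 111_320.0                                  # metres per degree of latitude
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat))
    waypoints = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        d_north = radius_m * math.cos(theta)                   # offset north in metres
        d_east = radius_m * math.sin(theta)                    # offset east in metres
        waypoints.append((lat + d_north / m_per_deg_lat,
                          lon + d_east / m_per_deg_lon))
    return waypoints

# Hypothetical windmill site: 12 waypoints on a 50 m circle around it.
orbit = poi_orbit(52.52, 13.405, radius_m=50)
```

The flight software would then fly these waypoints in sequence while keeping the camera pointed at the centre.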

Private use is allowed as long as these rules are observed; commercial applications must be approved by the responsible aeronautical authority of the respective state. This would also apply to package delivery drones, but many questions need to be answered before these can be used. For example, does the receiver need a special landing place? And how can the drones be protected from vandalism and theft? Neither Amazon nor DHL has answered these questions; their package delivery drones are nothing more than a concept and case study at this point in time.

On the streets with a laser, radar & camera


The senses of a drone

Letting drones fly unmanned without being thrown off course by the wind or colliding with a tree is in itself a technological masterstroke. However, the level of difficulty for self-driving cars is even higher because they move in a very complex world of narrow lanes, traffic rules and signals – a world populated by creatures whose actions cannot be predicted and who are at times downright irrational: yes, humans. The vision for the deployment of autonomous cars therefore sounds like science fiction: if the complete traffic flow were autonomously controlled, there would be no jams, no accidents, no parking problems and, on top of that, less environmental pollution.

For human beings, this vision holds the promise of more mobility, stress-free travel and lower running and insurance costs. But we are ages away from this scenario because combining all the necessary intelligent systems is a long process – and it has been ongoing for decades. Since the 80s, ABS and traction control have been used for braking and accelerating. Then came lane departure warning systems and parking sensors, which emit warning signals when the car risks leaving its lane or hitting an object. Almost all major car manufacturers are working on further developing and combining these systems today. Some BMW and Audi models already drive themselves in a traffic jam.
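The parking sensors mentioned above typically work on an ultrasonic time-of-flight principle: the sensor emits a pulse and measures how long the echo takes to return. A minimal sketch of that calculation (illustrative only, with hypothetical threshold and timing values):

```python
def echo_to_distance(echo_time_s, speed_of_sound=343.0):
    """Convert an ultrasonic echo round-trip time into a distance in metres.
    Divide by two because the sound travels to the obstacle and back."""
    return echo_time_s * speed_of_sound / 2

def parking_warning(distance_m, threshold_m=0.5):
    """Warn when an obstacle is closer than the (hypothetical) threshold."""
    return distance_m < threshold_m

d = echo_to_distance(0.002)   # a 2 ms echo corresponds to roughly 0.34 m
```

Real systems combine several such sensors and vary the warning tone with distance, but the underlying physics is this simple.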

Volvo’s cars can park themselves while the driver stands next to the space he has found, controlling the manoeuvre via smartphone. And in Singapore, a shuttle autonomously transported eight people along a one-kilometre stretch at 20km/h. The Induct Navia used there orients itself with LiDAR, which Google has been testing for a long time, as has Ford. In principle, LiDAR is radar with light: with 64 rotating laser beams, it generates more than one million reading points per second and uses these to calculate a 3D model of the surroundings, which it combines with known map material. Radar, cameras and GPS support the system. Google has been testing such cars since 2009; they have travelled more than 500,000 test kilometres without a single documented accident. Google is aiming to launch these autonomous cars in 2018. Mercedes, too, has successfully demonstrated its own autonomous car: an S500 with the Intelligent Drive system drove autonomously along the 100-kilometre stretch between Mannheim and Pforzheim in Germany. Intelligent Drive is based on the Distronic Plus system with steering assistance and the Stop&Go pilot for autonomous control in jams, which is already built into the E and S Class. A 3D camera on the rear-view mirror, two normal cameras, various long-range radar devices, ultrasound sensors (see infographics on the right) and, optionally, infrared and thermal imaging cameras analyse the surroundings. About 300GB of data is generated per hour.
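Each of those million LiDAR readings per second is a range measured along a known beam direction; turning them into the 3D model means converting from spherical to Cartesian coordinates. A minimal, illustrative sketch of that conversion (not tied to any particular sensor's data format):

```python
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR range reading along a known beam direction
    into Cartesian (x, y, z) coordinates in metres.
    x points forward, y left, z up; angles are relative to the sensor."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A reading 10 m straight ahead lands on the forward axis.
p = lidar_point(10.0, 0.0, 0.0)
```

Repeating this for every beam on every rotation yields the point cloud that the car matches against its map.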

An on-board computer uses this data to decide in real time whether the car should dodge an object, for instance, or what it should do at an intersection. At its maximum speed of 100km/h, the computer synchronises the incoming signals with a map ten times per second. The map, accurate to ten centimetres, records all traffic lights and pedestrian crossings and was prepared especially for the test route because the existing satellite images were too inaccurate. Such precise map material is an important prerequisite for series deployment, but the maps would need to be updated quickly whenever the roads change structurally – this could prove to be the biggest hurdle.
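The figures above imply how far the car travels "blind" between two map synchronisations; the quick calculation below makes that explicit:

```python
def metres_per_update(speed_kmh, update_hz):
    """Distance covered between two map synchronisations:
    convert km/h to m/s, then divide by the update rate."""
    return speed_kmh / 3.6 / update_hz

# At 100 km/h with 10 updates per second, the car moves
# about 2.78 m between consecutive synchronisations.
step = metres_per_update(100, 10)
```

So even at this update rate the car covers nearly three metres per cycle, which is why the ten-centimetre map precision matters.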

According to Mercedes, there is another, very pragmatic hurdle: traffic light recognition. Here, technology faces the same problems as humans, because traffic lights are often located at an unfavourable angle or in direct sunlight. But recognising objects (like traffic lights) is only part of the challenge. “The big and yet unresolved problem is to place the detected objects in a context”, says Prof. Frank Kirchner, head of the Robotics Innovation Centre at the German Research Centre for Artificial Intelligence. His research objective, says Kirchner, is to construct a model-based reality that will serve as the basis for decision making by autonomous systems. Humans are very good at separating important information from unimportant information and reacting flexibly. “This is still quite difficult for computers, especially when they are confronted with irrational behaviour of other road users”, says Kirchner. Car-to-X communication, in which vehicles exchange information with each other as well as with the surroundings, could be one solution to this problem.

The future depends on a law from 1968

Kirchner states that self-driving cars could be on the roads five years from now, “at least on motorways, because this environment is easier and can therefore be formalised better”. His estimate corresponds to that of the automobile industry: Mercedes, too, anticipates a market launch at the end of this decade.

Meanwhile, in the British town of Milton Keynes, 20 self-driving cars are to drive passengers to the city centre as soon as 2015, with 100 of them on the roads from 2017 onwards. Actual day-to-day deployment also depends on certain legal issues. The Vienna Convention on Road Traffic from 1968, which states in Article 8 that “Each driver must be able to have constant control of the vehicle […]”, is still in force. This is why assistance systems such as park distance control or the Stop-and-Go assistant may only be used if it is ensured that the driver can take over control of the vehicle at any time.

In Europe, a directive on steering equipment in vehicles that states “Independent-power steering equipment is not permissible” also needs to be amended. Mercedes estimates that these laws will be revised within the coming three to four years. But even with new directives and a black box that records all camera and sensor data so that the cause of an accident can be reconstructed, it would remain unclear who is liable for accidents: the owner of the car – or the manufacturer?
