SoMa logo

Ocado, the world’s largest online-only supermarket, has been evaluating the feasibility of robotic picking and packing of shopping orders in its highly automated warehouses through SoMa, a research and innovation project funded under the European Union’s Horizon 2020 framework programme.

SoMa is a collaborative research project between the Technische Universität Berlin (TUB), Università di Pisa, Istituto Italiano di Tecnologia, Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR), the Institute of Science and Technology Austria, Ocado Technology, and Disney Research Zurich.

One of the main challenges of robotic manipulation is handling easily damaged, unpredictably shaped objects such as fruit and vegetables. These products have unique shapes and must be handled in a way that does not cause damage or bruising. To avoid damaging sensitive items, the project uses a compliant gripper (i.e. one that possesses spring-like properties) in conjunction with an industrial robot arm.

The variation in shape of the target objects imposes another set of constraints on the design of a suitable gripper. The gripper must be sufficiently versatile to pick a wide variety of products, including Ocado’s current range which includes over 48,000 hypermarket items.

How the RBO Hand 2 could help address these challenges

The SoMa project (EU Horizon 2020 GA 645599) aims to design compliant robotic hands that are suitable for handling fragile objects without much detailed knowledge of an item’s shape; in addition, the robotic arms should also be capable of exploiting environmental constraints (physical constraints imposed by the environment). The goal is to develop versatile, robust, cost-effective, and safe robotic grasping and manipulation capabilities.

An example of a compliant gripper is the RBO Hand 2, developed by the Technische Universität Berlin (TUB). The gripper uses flexible rubber materials and pressurised air to form passively adapting grasps, which allows safe, damage-free picking of objects. With seven individually controllable air chambers, the anthropomorphic design enables versatile grasping strategies.

Due to its compliant design, the robotic hand is highly under-actuated: only the air pressure is controlled, while the fingers, palm, and thumb adjust their shape to the given object geometry (morphological computation). This simplifies control and enables effective exploitation of the environment.
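To illustrate why this simplifies control, a grasp for such a hand can be specified as nothing more than a set of target chamber pressures; the fingers then conform to the object on their own. The sketch below is purely hypothetical — apart from the chamber count, none of these names or values come from the real RBO Hand 2 software:

```python
# Hypothetical sketch: controlling a pneumatic soft hand reduces to
# commanding one target pressure per air chamber. Values are illustrative.

NUM_CHAMBERS = 7  # the RBO Hand 2 has seven individually controllable chambers

def ramp_pressures(current, target, step=5.0):
    """Move each chamber pressure toward its target by at most `step` kPa."""
    return [
        c + max(-step, min(step, t - c))
        for c, t in zip(current, target)
    ]

# A "power grasp" is just a pressure profile; finger shape emerges
# passively from contact with the object (morphological computation).
power_grasp = [60.0, 60.0, 60.0, 60.0, 60.0, 40.0, 40.0]  # kPa, illustrative

pressures = [0.0] * NUM_CHAMBERS
while pressures != power_grasp:
    pressures = ramp_pressures(pressures, power_grasp)
```

Note there is no per-finger position feedback anywhere in the loop — that is the point of an under-actuated, compliant design.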

Integrating the RBO Hand 2 with an industrial manipulator and testing with a standard object set

The Ocado Technology robotics team replicated a production warehouse scenario in order to evaluate the performance of the RBO Hand 2 for Ocado’s use case. The team mounted the soft hand on two different robot arms: a Staubli RX160L and a KUKA LBR iiwa14. Both arms can operate in the standard position-controlled mode; in addition, the KUKA can provide a certain amount of software-controlled compliance in the arm.

KUKA robotic arm

We designed a set of experiments to evaluate grasping performance on an example set of artificial fruit stored in an IFCO (International Fruit Container) tray. The adopted strategies attempted to exploit environmental constraints (e.g. the walls and the bottom of the tray) to perform the gripping tasks successfully.

RBO robotic arm

The experiments started with the simple scenario of grasping a single object from the example set using only the bottom of the tray. Initial results showed that the hand can successfully grasp a variety of shapes, and suggested that the chance of success increases when environmental constraints are used effectively to restrict the movement of the object.

In the coming months, we plan to explore more complex scenarios, adding more objects in the IFCO, and introducing additional environmental constraints that could be exploited by a grasping strategy.

Graham Deacon, Robotics Research Team Leader

January 31st, 2017

Posted In: Blog


Alex speaking on stage

Earlier this month I took part in a panel discussion on robotics at Stuff Innovators 2015.

I shared the stage with Patrick Levy Rosenthal, CEO of EmoShape, which produces a microchip aimed at giving robots emotional responses, and Nicolas Boudot from Aldebaran, the company that developed the companion robot Pepper (seen in the picture above).

Initially I was asked about robotics at Ocado and why there was a need, so I talked about Ocado being much more than a webshop with some delivery vans. I described our CFCs (Customer Fulfilment Centres), the more than 700 technologists needed to develop such sophisticated systems, the part robotics plays in that work, the difficulty of picking the entire 45,000-SKU range, and the design, construction and maintenance of the automation.

The questions and discussion were really aimed at identifying the key challenge in developing robots that interact with people in an intuitive manner. The whole panel pinpointed the same thing: AI.

Acquiring knowledge through learning is currently beyond the state of the art. The physical elements of robots are improving all the time, but the AI required for robots to exhibit even a basic set of skills, with the degree of competence needed for truly intuitive human interaction, remains the most significant challenge.

We then discussed whether humans have anything to fear from the rise of robots. The panel agreed not for the foreseeable future, and not until there is some radical change in state-of-the-art AI.

The Ocado Technology view was that robots will be employed to assist the human workforce and to reduce the environmental challenges they face (for example lifting heavy, awkward SKUs like packs of bottled water or cat litter). Customers might also see many improvements as a result, such as better quality (pick-to-tessellation algorithms), shorter delivery windows, and reduced minimum order sizes.

At the end, the chair asked us how long it would be before the types of robots we see on screen (Ex Machina, Humans, etc.) will be available to buy. Again the panel was unanimous: not within the next ten years. However, Nicolas thought it would be commonplace to have less intelligent robots sooner than that.

While at the conference I listened to a panel discussing wearables, and while the majority of the discussion was very much focused on the consumer side (smartwatches and health/fitness apps), the representative from Intel alluded to some industrial wearables they were involved in that were really interesting to us. Take a look:

      • Vuzix produces smart glasses for commercial/warehouse use; Intel took a 30% stake in the company in 2015
      • Munich-based Intel partner Workaround UG (set up by ex-BMW employees) produces the ProGlove, an intelligent glove with a chip that powers a simple display on the wrist, telling the wearer whether they have completed an assembly task correctly. Here’s a company demo of it:


Alex Harvey, Head of Research and Project Management

November 11th, 2015



Chris Brett

My team researches, implements and analyses new automation mechanisms. We sit within the Simulation and Visualisation area and our focus is mainly on the company’s huge CFCs (Customer Fulfilment Centres – warehouses).

Essentially, we create simulations that we then use as sandboxes to test out different control algorithms.

This could be anything from really innovative new ways of automating, involving entirely new hardware that doesn’t even exist yet, to simple tweaks to existing systems. The impact this has on the business makes this really exciting.

In terms of scale, the KPIs produced by our simulations mean big money for the business.

They allow evidence based decisions to be made as to which algorithm to use, or even what to actually construct, based on how well we predict it will perform in production. We can make the company more profitable thanks to reduced hardware requirements, greater maximum throughput, higher reliability and resiliency etc.

The simulations that we create need to be highly true to life, to be confident that an algorithm that performs well in simulation will also perform well in the real world. This means modelling edge cases such as mechanical failures and time deviations, message loss and latency, and incorrect sensor reports.
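One way to model edge cases like message loss and latency is to wrap each simulated communication link in a small fault-injecting layer. The class below is a minimal, hypothetical sketch of that idea — parameter names and values are assumptions, not Ocado’s actual simulation code:

```python
import random

class UnreliableChannel:
    """Simulated message link with configurable loss and latency jitter.

    Illustrative sketch only; not a real simulation API.
    """

    def __init__(self, loss_prob=0.01, base_latency=0.05, jitter=0.02, seed=42):
        self.loss_prob = loss_prob        # probability a message is dropped
        self.base_latency = base_latency  # minimum delivery delay (seconds)
        self.jitter = jitter              # extra random delay (seconds)
        self.rng = random.Random(seed)    # seeded for reproducible runs

    def send(self, message, now):
        """Return (delivery_time, message), or None if the message is lost."""
        if self.rng.random() < self.loss_prob:
            return None  # modelled message loss
        delay = self.base_latency + self.rng.uniform(0, self.jitter)
        return (now + delay, message)
```

A control algorithm that only performs well when every message arrives instantly and intact would be caught by exactly this kind of wrapper before it ever reaches a warehouse.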

When developing high level control systems, the simulations also need to be fairly far reaching. It’s not just the mechanics that must be simulated. Other systems – or even people – that interact with the system we’re developing also have to be modelled. Any event that might happen in the real world needs to be considered.

A lot of the problem spaces that we work in are extremely complex, which makes designing and implementing a highly optimal algorithm within that space similarly complex, and a lot of fun. Even if you can design an algorithm that can derive an optimal decision, can you make it run fast enough to keep up?

The research and development nature of our work means that our goals often change rapidly. A new hardware idea may come along that we then simulate, prototype the control of, analyse, and discard based on the results, moving on to the next challenge. Or the bottleneck of some system may be overcome, revealing the next bottleneck that can be optimised. I really enjoy this variety of work. There’s always a new puzzle to solve.

I work with a great team – they’re good fun, really smart, and we learn a lot from each other.

Some are experts in particular areas, but mostly we try to spread the knowledge and skills throughout the team so that everyone gets variety, new challenges, and the chance to work with different people.

If you’d like to join us, we’re currently recruiting for these roles and would love to hear from you:

Senior Java Software Engineer – Simulation

Java Software Engineer (SE2) – Simulation

Chris Brett, Simulation Algorithm Development Team Leader

September 22nd, 2015



Matt Whelan

My team builds simulations of physical systems. Our work falls into three categories: experimental, tactical, and operational.

At the experimental end, we build simulations and design tools for new technologies and warehouse layouts, along with prototype control algorithms.

Tactically, we try out proposed changes to our warehouse topologies in silico and perform ROI analysis. We create and mine large data sets so we can spot and remove risk from our growth strategy.

Operationally, we pipe streams of production data into 3D visualisations, originally developed for playing back simulations, allowing real-time monitoring of our live control systems.

We get to work on some pretty bold conceptual projects because, when working at such a massive scale (last year our operation turned over £1 billion), even seemingly small percentage efficiency savings mean serious money to the business.

I read a lot about how the more theoretical aspects of computing – the things that drew me to the subject in the first place – aren’t as important in the ‘real world’ of enterprise software development. Yet there are big players in all kinds of industries being left behind because they shy away from AI, robotics, and large-scale automation. I think we’re really lucky to spend our time creating novel path searches, travelling salesman solvers, discrete optimisers and the like; it gives us an edge over our competitors in a fierce market.
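To give a flavour of the kind of routing heuristic mentioned above, here is a toy nearest-neighbour tour builder for a travelling-salesman-style problem. It is an illustrative sketch, not our production solver, which would layer improvement steps (such as 2-opt) and performance work on top of a starting tour like this:

```python
import math

def nearest_neighbour_tour(points):
    """Greedy TSP heuristic: from the current point, always visit the
    closest unvisited point next. Fast, but the tour is only a rough
    starting solution for further optimisation."""
    unvisited = set(range(1, len(points)))
    tour = [0]  # start at the first point
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour
```
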

The team is a real mixed bag of interests and hobbies. We have a physics doctor, a swing dancer, and a gaming software expert for starters. One thing we all have in common is that we’re unfazed by scale – an attitude which pervades Ocado Technology – and we’re all looking to be the person with the big idea.

The beauty of the environment we’re in is that we can prove how big that idea is before millions are spent on building it.

If that sounds like a team you want to be a part of, these are the positions we’re recruiting for now:

Full Stack Django/Celery Software Engineer

Java Software Engineer (SE2) – Simulation

Senior Java Software Engineer – Simulation

Matt Whelan, Simulation Research Team Leader

September 16th, 2015



SecondHands

Recently we kicked off an exciting project to develop an autonomous humanoid robot. It will use artificial intelligence, machine learning and advanced vision systems to understand what human workers want, in order to offer assistance.

For example, it will be able to hand tools to maintenance technicians, and manipulate objects like ladders, pneumatic cylinders and bolts.

The ultimate aim is for humans to come to rely on collaborative robots because the robots have become active participants in their daily tasks. In essence, the robot will know what to do, when to do it, and do it in a manner that a human can depend on.

The project is called SecondHands as it will literally provide a second pair of hands, and is part of the European Union’s Horizon 2020 Research and Innovation programme. We are leading the research, working with four other European institutions.

The tasks our robot will carry out will increase safety and efficiency, and require us to focus on key areas of robotics including:

Proactive assistance – the robot will have cognitive and perceptive ability to understand when and what help its operator needs, and then to provide it.

Artificial intelligence – to anticipate the needs of its operator and execute tasks without prompting, the robot will need to progressively acquire skills and knowledge.

3D perception – advanced 3D vision systems will allow the robot to estimate the 3D articulated pose of humans.

Humanoid form and flexibility – SecondHands will feature an active sensor head, two redundant torque controlled arms, two anthropomorphic hands, a bendable and extendable torso, and a wheeled mobile platform.

For more information, see the project’s website.

Dr Graham Deacon

Robotics Research Team Leader

UPDATE: If you think that sounds interesting, we’re looking for a talented Robotics Research Software Engineer to join the team. Take a look at the role now.

July 1st, 2015



Pisa/IIT SoftHand

I’m very excited about a new wave of robot development known as soft robotics (built around technologies such as variable stiffness actuators), so when I saw there was a training course happening in Rome in February, I knew I had to go.

Actuators, or motors, are the muscles of the robot – the mechanical parts that create movement. Traditionally, motors (and robots) have been large, heavy, precise and rigid. So rigid and strong that you can’t move them when the power is off. New hardware called ‘series elastic actuators’ is changing that.

The first commercial product using series elastic actuators is a dual manipulator called Baxter from Rethink Robotics. This robot is designed to be safe.

Sawyer robot

Sawyer robot courtesy of Rethink Robotics

It’s much lighter than industrial robot arms and moves slowly, so it won’t hurt you if you bump into it. More importantly, the motors themselves incorporate sensors to detect the contact. Robots designed for sensing contacts and to be safe are often called ‘collaborative robots’, and they promise to revolutionise the adoption of robotics in industry.

In industry, robot production cells are traditionally designed with a fixed robot and a fixed environment, and the cell layout is locked down and protected. This means nobody can move around the robot, place an unexpected object in its environment, or bump into it while it’s moving. Like here:

Tesla autobots

Tesla Autobots by Steve Jurvetson licensed under Creative Commons Attribution 2.0 Generic

By contrast, collaborative robots can share a space with people and keep moving – even when those people move around or the environment changes unexpectedly – thanks to their capability for sensing contact. They can even work together with people; you can physically move the robot arm to where you want it to be.

Back at the school, the focus was on the design and control of soft robotics – in particular, new types of biologically inspired variable stiffness actuators. They can work like our muscles do, i.e. they can be in a relaxed or a rigid mode, switching between the two by changing the tension of an internal spring mechanism.

The theoretical and practical consequences of putting soft components between the electric motors and the robot joint are amazing: we can make robots that sense contacts, are more robust, and are more energy efficient.
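The antagonistic arrangement behind these claims can be captured in a toy model: two motors pull on a joint through nonlinear springs, so moving the motors together shifts the joint’s equilibrium, while moving them apart (co-contraction) stiffens it without moving the equilibrium. The constants and the exponential-spring assumption below are illustrative only, not the qb-move’s actual parameters:

```python
import math

def vsa_output(q1, q2, k=10.0, a=1.0):
    """Simplified antagonistic variable-stiffness actuator model.

    q1, q2: motor positions (rad); k, a: illustrative spring constants.
    With springs whose torque grows exponentially with deflection, the
    joint stiffness rises with the preload (q1 - q2) / 2, while the
    equilibrium depends only on the average motor position.
    """
    equilibrium = (q1 + q2) / 2.0
    stiffness = 2 * k * a * math.cosh(a * (q1 - q2) / 2.0)
    return equilibrium, stiffness
```

For example, commanding the motors to (0.5, -0.1) instead of (0.2, 0.2) keeps the joint’s equilibrium at 0.2 rad but raises its stiffness — that switch from relaxed to rigid is exactly the muscle-like behaviour described above.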

The school had both theory lectures and hands-on sessions where we wrote software for modelling and controlling these types of motors. The hands-on sessions were exciting not only because we got experience with a real robot; we also ran a competition to see which team could write the best control system for a simple robot task.

Four variable stiffness actuators

qb-move motor

The internal springs of the qb-move motor with two electric motors and a biologically-inspired spring mechanism

Industry will still need fast, heavy-payload robot arms for many years to come, but soft robotics is opening up many applications – in terms of both technological capability and cost – that were impossible to achieve with traditional industrial robots.

I’m proud that Ocado Technology is at the cutting edge of robotics research, as we are participating in two EU-funded research projects on collaborative and soft robotics, alongside many academic and industrial partners. We’ll be working on the key capabilities our robots will need for advanced applications, such as sensing the environment and reacting to the unexpected. A soft robotics revolution is under way, and will bring us safer, more versatile robots.

Marco Paladini, Robotics Research Software Engineer

May 1st, 2015


