Now, as the article explains, watch what they are about to do with drones...
Bob Dormon reports in Ars Technica:
Domino’s application is artificial intelligence: it can communicate with natural language; it can order you a pizza. It’s really good at doing one thing and that is just ordering a pizza. That’s what we call "designed intelligence." There is machine-learning in the natural language capability, there’s machine-learning in the speech recognition capability, but ultimately the system can only do one thing and that is what it was purposely built for, which is to order pizza.
It wasn't so long ago that the idea of biped or quadruped robots delivering pizzas seemed like a real possibility. Such speculation was fuelled by Google's acquisition of Boston Dynamics—a company more associated with potential military applications, which earned notoriety for its scarily nimble Big Dog robots.
Google, and then its parent company Alphabet, had to work out how to bring this technology to civvy street—a problem that certainly gnawed at the development team. The vision was perhaps too grandiose even for Alphabet and, earlier this year, it was reported that Boston Dynamics had been off the leash so it could sniff around for potential buyers.
With that short-lived episode over, the roadmap for robotic deliveries is now back under debate, at least if you were expecting arms and legs. But how about an autonomous delivery box on wheels or an airborne drone instead? These are all at the prototype stage but they’re definitely coming; in fact, there's a wide range of innovative technology under investigation by retailers to achieve a competitive edge, from drones to chatbots to redesigned Mars rovers and slasher robots.
Exploring Amazon
When it comes to delivery, robotics are already in use but mostly out of sight, unless of course you’re working in a customer fulfilment centre—a distribution warehouse to you and me.
Amazon’s acquisition of Kiva Systems in 2012 was a rather less dramatic move than Alphabet buying Boston Dynamics. In 2015 it was rebranded as Amazon Robotics, and its operation has retreated in-house, much to the chagrin of those who had a business model built around its machines. As a consequence, Quiet Logistics, previously a poster child for Kiva robots, decided that the only way around the problem would be to design its own. It formed Locus Robotics in 2014 and has since developed its own LocusBot warehouse picking assistant. It’s a work in progress and demonstrates yet another approach to warehouse automation.
Amazon paid an eye-watering $775 million (£630 million) for its warehouse robotics company and was last month awarded patents for its “system and method for coordinating movement of mobile drive units,” which describes the cleverness that enables multiple bots to identify and select stacks of goods, and then move them around a warehouse floor without collision. The floor has a grid embedded into it with specified rotation areas that ensure the bots don’t stray.
What they actually transport are described as "magic shelves"—tall storage units that contain a range of goods. Compact orange robots slide underneath these structures and lift them from below. Then, by motoring along at a sober pace, they transport them to individual packing stations where human operatives pick items from the magic shelves to prepare an order. As the video shows around the 1:13 mark, the bots line up for a picker who never moves from her station.
Old news perhaps, but at least this is a robotics technology that is actually being deployed to deliver increased efficiency. Admittedly, these bots aren’t covering the last mile to your doorstep, but they’re covering thousands of miles a year in warehouses, overseen by some tricky optimisation tasks to maximise route efficiency.
Indeed, it’s the work of analytics and machine-learning algorithms that are determining just how effective robotics and automation can be at accelerating delivery, whether within a warehouse or out on the road.
Ocado: From warehouse to table
For online grocery retailer Ocado, the use of big data analytics infiltrates just about every part of the business from customer profiling on the website, to analysing product demand, delivery routes and, of course, providing the customer with delivery slots to choose from. To the user, clicking on a delivery time won’t involve a great deal of thought but it’s actually a fantastically complex offering, as Ocado's CTO Paul Clarke explains.
“In that click, a lot happens in the background. Our systems are looking at your geography and all around your geography. They’re looking at existing orders that have been placed. They are working out—between 6am and 10.30pm, in half-hour increments—which slot we could serve you that day. This is based on existing orders and what we think you will have in your order, if you haven’t already filled your basket. It’s also based on the capacity of vans. And that has to happen in about 500ms.
“If you ask that same question five seconds later, the answer will be different. Other customers will have placed an order and some people will have added more to their order. That takes up capacity in the van. It’s a real-time optimisation where the problem spec is changing under our feet all the time.”
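Clarke's half-second slot calculation can be illustrated with a deliberately simplified sketch. This is not Ocado's system: it assumes a toy model in which each half-hour slot between 6am and 10.30pm is served by a set of vans with a fixed item capacity, and a slot is offered only if some van still has room for the estimated order. All function and variable names here are hypothetical.

```python
from datetime import datetime, timedelta

# Toy model of slot feasibility (names and model are illustrative, not Ocado's).
# A slot is offered only if at least one van serving the customer's area still
# has spare capacity for the estimated order size.

def half_hour_slots(start="06:00", end="22:30"):
    """Generate the half-hour slot start times between start and end."""
    t = datetime.strptime(start, "%H:%M")
    stop = datetime.strptime(end, "%H:%M")
    slots = []
    while t < stop:
        slots.append(t.strftime("%H:%M"))
        t += timedelta(minutes=30)
    return slots

def available_slots(van_load, van_capacity, estimated_items):
    """Return the slots where at least one van has room for this order.

    van_load maps slot -> {van_id: items already committed} for the vans
    covering the customer's area; slots with no vans are not offered.
    """
    offered = []
    for slot in half_hour_slots():
        vans = van_load.get(slot, {})
        if any(load + estimated_items <= van_capacity for load in vans.values()):
            offered.append(slot)
    return offered
```

The real system, of course, also has to predict what an empty basket will eventually contain and re-answer the question as other customers' orders land, which is what makes the 500ms budget impressive.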
Fulfilling timed delivery slots certainly adds a layer of complexity that typical courier firms don’t experience. For them, the batch-optimisation problem is simply one of working out a delivery order and choosing the vans. If a customer is out, just put a card through the door.
Moreover, the courier won’t be putting the items in the boxes for distribution either. Let’s not forget the CFC warehouse, which has to manage deliveries from suppliers and, of course, the pick-and-pack needed to satisfy individual orders before the goods even set off in a van with an assigned time slot.
To deal with these variables Ocado looks to the cloud, with AWS and Google providing the muscle for its Ocado Smart Platform (OSP) offering. Besides marketing a delivery solution that other retailers can buy into, Morrisons being the latest example, the Ocado Smart Platform is being revamped with robots designed in-house. These are currently being tested at its Andover facility and have already processed full orders, although Clarke wants to run larger-scale tests before “pushing the button.”
The new robots operate on a grid, which Clarke concedes is not exactly a new concept; it dates back to container ports, where cranes on a frame roam over boxes and move them around. Companies such as Swisslog take a similar approach with their grid bots, with some success: Asda has deployed Swisslog’s AutoStore Small Parts Storage System at its warehouses. Ocado’s technology is another variation on the space-efficient grid theme.
Still, there was a fleeting glimpse of these new bots in action on Twitter recently, and the technology behind them suggests that this really could be an interesting development in terms of robotic efficiencies. The autonomy bottleneck, as discussed in a recent Ars article, is that robots are merely shuffling bins around that humans have to pick from, as we’re just a bit too clever for the pick-and-pack side of things to be replaced by automation just yet.

Just like Amazon’s Kiva bots, the Ocado system is designed to improve access to the goods in the warehouse, but this is being refined at every level. In essence, it’s a four-dimensional optimisation problem: the two-dimensional motion of the robots across the grid; the third, vertical dimension of the storage bins stacked beneath it; and the fourth dimension, time, in which these bins of goods must be shifted efficiently.
As orders continuously arrive from customers, the system not only has to consider the best place to store a bin back on the grid after picking but also has to look ahead to future orders that will be processed, which may have a bearing on that location. At the same time, the best route-optimisation at a 2D level needs to be calculated as the robots whizz around, which adds to the complexity.
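One way to picture the storage-location decision is as a scoring problem: bins holding goods in high demand should go back to shallow positions in the stack so the robots can dig them out again quickly, while slow movers can sink to the bottom. The sketch below is a hypothetical illustration of that idea, not Ocado's algorithm; the demand score, slot tuples, and function names are all invented.

```python
# Hypothetical sketch of demand-aware bin placement on a storage grid.
# A bin's "demand" is a 0..1 score for how fast its goods leave the warehouse;
# high-demand bins are placed near the top of a stack, low-demand ones deeper.

def placement_depth(demand, max_depth):
    """Map a 0..1 demand score to an ideal stack depth (0 = top of stack)."""
    demand = max(0.0, min(1.0, demand))
    return round((1.0 - demand) * (max_depth - 1))

def choose_slot(free_slots, demand, max_depth):
    """Pick the free (x, y, depth) slot closest to the ideal depth.

    free_slots is a list of (x, y, depth) tuples currently unoccupied.
    """
    target = placement_depth(demand, max_depth)
    return min(free_slots, key=lambda slot: abs(slot[2] - target))
```

The real lookahead Clarke describes is far harder: the ideal depth itself shifts as future orders are forecast, so the score would be recomputed continuously rather than fixed per bin.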
“There are a load of algorithms and machine-learning from doing route optimisation,” says Clarke, “but that’s slightly different from the analytics that sits on top, which is almost independent. Its aim is to detect things, like a modification to a piece of firmware that has changed the physics model of a bot, or if an individual bot has an issue that’s going out of spec. You’re trying to pick that up. Like everything else, our first port of call is to put everything into a data lake in the cloud, and then perform smart analytics on top of it.”
A further aspect to route optimisation is the underlying system that monitors more routine conditions, like when a particular bot isn’t accelerating as fast as expected or its battery doesn’t last as long. Such symptoms can impact the effectiveness of the whole grid, and so poorly performing bots need to be identified and called in for a service or battery replacement. Indeed, any robot can take over tasks from another and, as they’re battery powered, the setup as a whole is designed to be more tolerant of power loss or brownouts.
While customer orders influence the storage locations of the bins of goods zooming around in the grid, there are other factors that have a significant bearing on their “velocity.” In short: the speed that goods move out of the warehouse. Milk and bread shift quickly, but an exotic scented candle will be a slow burn. Demand alters with the weather, time of year, or a festive period—just watch the candles and pumpkins fly during Halloween. Even a TV chef can make a difference.
“We saw the speed of a vanilla pod—which typically has a spike only at Christmas time—as a result of some celebrity chef recipe,” says Clarke.
Yet rather than be taken by surprise, analytics and predictive algorithms for customer buying patterns are routinely run. The ideal is no empty shelves and no overstock either, as Clarke explains.
“We run multiple forecasting models that compete with each other for accuracy. Based on which one wins, a particular forecasting model is chosen for each product and if that changes over time, then we will pick a different forecasting model. That’s an important part of us predicting how we get goods into our warehouse just in time: minimise stock cover and maximise freshness, while at the same time, giving uniquely high availability to our customers.”
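The model-competition idea Clarke describes can be sketched in a few lines. This is a toy illustration, not Ocado's forecasting stack: two trivially simple models predict tomorrow's demand, each is backtested on a product's sales history, and the one with the lowest error wins for that product. The model and function names are invented.

```python
# Toy "competing forecasts": simple models predict demand for a product,
# and whichever has the lowest backtest error is chosen for that product.

def naive_last(history):
    """Tomorrow looks like today."""
    return history[-1]

def moving_average(history, n=3):
    """Tomorrow looks like the recent average (smooths out noise)."""
    window = history[-n:]
    return sum(window) / len(window)

def backtest_error(model, history):
    """Mean absolute error of one-step-ahead forecasts over the history."""
    errors = [abs(model(history[:i]) - history[i]) for i in range(1, len(history))]
    return sum(errors) / len(errors)

def pick_model(history, models):
    """Return the model that forecast this product's history most accurately."""
    return min(models, key=lambda m: backtest_error(m, history))
```

On steadily trending sales the naive model wins, while a noisy but flat product favours the average; rerunning the contest as new sales arrive is what lets the winning model change over time, as Clarke says it does.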
Next-day delivery is so yesterday
For some, next-day delivery is too slow. What about now? Say, within 30 minutes? The takeaway food guy on his bike will probably manage it but there’s a cost involved. Amazon Prime Now arrives within an hour in some areas, but don’t expect it to be delivering hot pizza, prawn dhansak, or chop suey. And of course, there’s a cost involved there too. As for its drones, how soon we’ll see them landing at our homes is up in the air at the moment.
Evidently, we’re not quite ready for a humanoid pizza-bot to appear on the doorstep, but the giant fast-food franchise Domino’s Pizza announced its Domino’s Robotic Unit (DRU) in March this year. Unlike its April Fool’s joke from 2015, this one is a real land robot, though not much more has been heard about DRU since the announcement.
While DRU is exclusive to Domino’s Pizza, Starship Technologies has an attractive alternative available to all businesses, and driving it forward is its goal of reducing delivery costs. To achieve this objective there was an obligatory journey across Mars involved—a trek that any self-respecting entrepreneur should undertake, surely?
It’s not as absurd as it sounds, either; in 2013-14 the company’s cofounder, Ahti Heinla, led an Estonian team that participated in NASA’s annual Centennial Challenge contest, building its own sample return robot. The task was to create an autonomous rover that could traverse unknown terrain on a planet such as Mars and gather samples. But rather than construct one big robot, the team decided to build a swarm of robots that could help each other. The knowledge from that experience led Heinla and Janus Friis (both Skype cofounders) to launch Starship Technologies in 2014.

You might be forgiven for mistaking the company’s delivery robot design for an elaborate wheelie coolbox, as its basic purpose is as a secure container. It just so happens to have six motorised wheels, nine cameras, a 360-degree ultrasonic sensor array, and an Nvidia Tegra K1 processor handling the machine vision and autonomous driving. It has GPS, but it’s the computer vision combined with the company’s proprietary mapping system that lies behind its two-centimetre navigational precision; even the best non-military GPS receivers are only accurate to within 3.5 metres. Starship Tech’s land robots are designed to roam our pavements, but they are purposely restricted to trundle along at 4mph (6km/h), although they’re capable of 10mph (16km/h). The idea is for the robot to follow pedestrian routes and match a typical walking pace.
The unit itself weighs nearly 16kg and can take a 10kg load within its 406 x 343 x 330mm cargo space. It measures 686 x 559 x 559mm overall, and power consumption is around 50W, with a battery life of two to 2.5 hours.
The design is for one-off deliveries within a three-mile radius that take less than 30 minutes to complete. So, unlike Ocado, there’s no massive undertaking to manage scheduled slots that are affected by what other customers need to have delivered in the area, or the size of those orders.
Admittedly, you can only fit up to three shopping bags inside one of these land robots, but if you’ve ordered a curry for a family of five and some drinks, there should be plenty of room. Starship Technologies is targeting three delivery scenarios: food orders, groceries, and parcels. In the UK, there are trials underway with takeaway food deliveries available through Just Eat in select areas of Greenwich in South London, which just so happens to be where the company’s UK headquarters is based. In Germany, it’s working with Metro Group (groceries) and Hermes (parcels), likewise with Swiss Post in Switzerland. Starship Technologies wants to get food or grocery delivery costs down to £1.
Autonomous, but not quite
First things first though, the robot has to learn its journey, and to do that it needs to map the neighbourhood. From its nine cameras it’s analysing thousands of straight lines every single second, building a 3D map of its environment. The map constructs lanes indicating where the robot can and can’t drive, and during the process of map-creation the robot is not driving autonomously but under human operation. Even so, it can still do deliveries, so these journeys aren’t wasted.
Autonomous driving is achieved by comparing the straight lines on the map with the straight lines it is seeing live while operating alone. By analysing what it sees and what it expects to see, it determines its location to within less than an inch.
If there are barriers, obstacles, pedestrians, dogs, and suchlike, it responds accordingly, either stopping to let things pass or navigating a more persistent obstruction, adapting and updating itself. As it encounters permanent changes it feeds back to every other robot in the area. There will be times when a robot simply doesn’t know how best to proceed, and in these instances it pings back to a human operator who can check the video feed and remotely control it, if necessary. By design, it’s not an entirely autonomous system, explains Henry Harris-Burland, the company's marketing and communications manager.

“We don’t want 100 percent autonomous driving. We always want 99 percent because we know and want an element of human oversight and operation at all times. We are estimating that, in the future, it will be one human operating 100 robots. There would be a bank of operators. Think of a call centre of human operators being pinged by robots in various neighbourhoods around the world, saying: ‘Ping, there’s a hole in the road, what do I do?’ or ‘Ping, there’s a new obstacle in my way, what do I do?’ A human can make that decision in a split second. There’s even a microphone and speaker on board, so nervous pedestrians can be reassured or perhaps even asked for assistance.”
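The "99 percent" model amounts to a confidence gate: situations the robot scores above a threshold are handled autonomously, everything else is pinged to an operator queue. The sketch below is a made-up illustration of that gate; the threshold value, situation strings, and function names are all hypothetical, not Starship's software.

```python
# Toy sketch of the escalation model described above: act autonomously when
# confident, otherwise ping a human operator. Threshold is illustrative.

CONFIDENCE_THRESHOLD = 0.9

def handle_situation(situation, confidence, operator_queue):
    """Return how the robot resolved this situation.

    confidence is the robot's 0..1 score that it knows what to do;
    operator_queue collects the pings awaiting a human decision.
    """
    if confidence >= CONFIDENCE_THRESHOLD:
        return "autonomous"
    operator_queue.append(situation)  # 'Ping, what do I do?'
    return "escalated"
```

Because one operator is expected to cover around a hundred robots, keeping the threshold high enough that escalations stay rare is what makes the economics work.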
The communications are managed through commercial 4G mobile networks and, during autonomous driving, a minuscule amount of data travels to the servers conveying robot health and telemetry. The bot only streams video when it's under operator control, and of the nine cameras, three are stereo pairs and the other three are time-of-flight (like the second-gen Kinect) that are used to determine distances.
Video is recorded onboard the robot and the top part of what a human operator would see in the live stream is apparently blurred to anonymise people and number plates. If there’s no video footage that can be used for analysis to improve mapping or autonomous operation, then the data for that day is wiped.
Key to the success of Starship Technologies’ robotic delivery method is sharing. Demand varies for different services throughout the day, so the bots won’t necessarily be exclusive to one business. Instead, it’s likely they’ll be busy with parcel or grocery deliveries for most of the day, switching over to food in the evening.
“Food delivery in general has a very specific peak time of, say, 5pm to 9pm, which means nobody’s going to ring for pizza at eight in the morning,” says Harris-Burland. “What would those robots be doing for the entire day up until 5pm? In order to get cost-efficient delivery, you have to be efficient in every single area of the business model, which means the robots will be shared. You’ll have multiple partners, same robots.”
Even though the bots can climb kerbs of 20cm and trudge through 5cm of snow, having them journey through London's packed Leicester Square or Shoreditch High Street isn’t the best use-case scenario. Besides the excessive pedestrian traffic in those areas, there’s also the need for a hub.

Ideally, a fleet of robots would operate in residential and suburban neighbourhoods. Think of rows of semis and terraced houses. The robots would all be based in a local hub near a restaurant, grocery store, or parcel shop—preferably all three. Parcel deliveries would arrive at the hub and robots would be given individual packages to deliver to the customer at the time when they want to receive their parcel.
A restaurant wouldn’t deliver a pizza to a hub though. Instead, a robot would journey from the hub to the restaurant, receive the food, deliver it to the customer and then return to the hub. A delivery triangle, hub and spoke. This also allows for battery charges or swaps, and preparations for shared-client shifts back at the hub.

The company has also been experimenting with Mercedes-Benz, whose prototype Robovan can store eight Starship Tech robots together on a racking system that holds delivery packages. The thinking behind the Robovan is to deliver more parcels in one shift than a conventional delivery round. The van arrives in a neighbourhood and either drops off robots loaded with parcels, or collects robots and fills them up with more. Apparently, 400 packages can be distributed this way over a nine-hour shift, minimising congestion and vehicle emissions. By contrast, traditional man-plus-van can only manage 180 parcels in that period, bearing in mind the extra time needed to abuse cyclists, block bus lanes, and idly press the wrong doorbell.
Talking of doorbells, there will be an app to notify customers of bot delivery. Obviously, not everybody will bother to download it, so there is a text-messaging alternative currently being used during the trials in Greenwich. Texts need to be acknowledged, and if they’re not, customers receive a follow-up phone call. The user’s mobile is also used to unlock the delivery bot.
Admittedly, collecting samples from the terrain of a distant planet does fire the imagination rather more than redistributing the output of a local pizza oven. Still, the research in learning to deal with hostile environments will no doubt be useful if and when Starship Technologies decides to test its bots beyond leafy Greenwich and ventures into neighbouring Woolwich.

Pizza, the action
It’s often been said that without the online adult entertainment industry driving innovation on the Internet, the e-commerce and video streaming platforms that we take for granted today would never have matured so rapidly.
In years to come, will we be saying the same about pizza’s role in accelerating retail delivery technologies? Oddly enough, it’s never far from advances in food and grocery deliveries, and has a role in purchasing innovations too. Ever keen to embrace new technology, Domino’s Pizza has seen sales boosted by Dom, its voice-controlled virtual assistant app that takes orders speedily and accurately.
Nuance, experts in speech recognition and natural language understanding, helped devise the app for the company using an implementation of Nina (Nuance Interactive Natural Assistant) to function as a mobile voice-enabled customer service assistant. Nina can be used to provide an intelligence layer on top of a mobile app so that it can communicate with you the same way you’d communicate with a person. So, instead of teaching the user how to operate the system, the system is taught how to communicate with the user.
“The Domino’s application is artificial intelligence: it can communicate with natural language; it can order you a pizza," says Nuance's Mark Hanson. "It’s really good at doing one thing and that is just ordering a pizza. And that’s what we call ‘designed intelligence.’ In order to create that system, what we had to do was curate that experience. So there is machine-learning in the natural language capability, there’s machine-learning in the speech recognition capability, but ultimately the system can only do one thing and that is what it was purposely built for, which is to order pizza. It doesn’t learn beyond that. If you want it to do more things you have to bring back those human experts and build onto it.”
So what if you do want your chatbot to do more? That all comes down to "intents." Hanson suggests that a typical customer service centre might have to handle around 200,000 intents—“the objectives of a customer, which map to a very specific answer or set of actions that are performed by an agent in order to resolve them.”
A virtual assistant couldn’t possibly deal with so many intents out of the box, and there are limits to what is practical to attempt, as each intent can take up to 100 hours to develop.
“Right now what we do is leverage designed intelligence to get to between 200 and maybe 1,000 intents," he says. "And that usually gives you a significant amount of value.”
When a virtual assistant doesn’t know an answer and, in effect, the problem is beyond the scope of its known intents, the customer query is escalated to a human adviser. This presents a new opportunity for the virtual assistant to learn by observation.
“When we introduce human-assisted learning we can actually monitor what is happening when those interactions go to an agent,” he says. “We can start to automatically learn all of those things beyond the 200 or 1,000 that you built up using designed intelligence.”
To deal with the less frequent intents that make up the long tail, Nuance has developed what it calls the human-assisted virtual assistant (Hava). With this approach, when the virtual assistant doesn’t know the answer, it seeks help from a hidden human agent. The hidden agent then performs the actions required, supplies an answer back to the virtual assistant, which in turn presents the answer back to the user.
“Over time, through reinforcement learning, we’re able to understand what that agent did. So, we’ve got the question that was asked and all of the actions and answers that the agent performed to learn what that agent did. The next time that question is asked, the virtual assistant knows the answer,” says Hanson.
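The Hava loop Hanson describes can be captured in a minimal sketch: questions that match a known intent are answered directly; unknown ones go to a hidden human agent, and the agent's answer is remembered so the assistant handles the same question on its own next time. This is an illustration of the escalate-and-learn pattern, not Nuance's implementation; the class and method names are invented, and real intent matching would use NLU rather than exact string lookup.

```python
# Minimal escalate-and-learn sketch of a human-assisted virtual assistant.
# Known intents answer immediately; unknown questions go to a hidden human
# agent whose answer is stored for next time.

class Assistant:
    def __init__(self, intents, human_agent):
        self.intents = dict(intents)    # question -> answer (toy intent map)
        self.human_agent = human_agent  # callable: question -> answer
        self.escalations = 0            # how often a human had to step in

    def ask(self, question):
        key = question.strip().lower()
        if key not in self.intents:
            self.escalations += 1
            # Hidden agent resolves it; the assistant learns by observation.
            self.intents[key] = self.human_agent(question)
        return self.intents[key]
```

Note that the second time an escalated question arrives, it is answered from the learned intent map without bothering the agent, which is exactly the long-tail economics Nuance is after.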
The level of sophistication required ultimately depends on the client’s ambitions. Apple’s Siri uses Nuance technology for general queries and actions; likewise, Viv Labs feeds its Dynamic Program Generation software with text instructions to identify the intents. And when it comes to intents, Viv does seem to have an extremely good grasp of them.

The brains behind Viv is Dag Kittlaus, who also created Siri at SRI. Back in May this year, when Kittlaus showed Viv off for the first time, he was coy about selling off his new mobile technology in the near future. However, this month Samsung enticed him with a deal that enables Viv Labs to operate independently. He cites the partnership as a means to “accelerate our vision” and bring scale to Viv. And of course, Samsung will always be first in the queue for any new tricks that Viv’s AI assistant comes up with. No doubt purchasing goods and services using spoken, natural language will be a high priority, but probably the first question that any self-respecting owner of a shiny new Samsung phone will ask of Viv's AI is: “Where's the nearest fire extinguisher?”
While AI is destined to insinuate itself into our daily buying habits with conversational apps that understand our purchasing requests and smart shopping baskets that predict what you’ve consumed to enable a weekly shop in just one click, there are forces at work that are busy analysing trends and making recommendations which aren't simply based on what you’ve bought previously.
Online fashion retailer Zalando is tailoring its recommendation engine to be aware of the individual in terms of taste. This is harder than it sounds, as the intention is not to get a user to buy more of the same, but to predict the fashion items coming on stream that they would be interested in purchasing.
Indeed, how will Zalando’s virtual Zoolander work out that you’ve only bought that fake leather outfit for a bondage fancy dress party and that it’s not a significant feature in your everyday wardrobe? Apparently, collaborative filtering is the answer, which involves analysing masses of data to get an accurate picture of user behaviour from their clicks and views. Zalando’s ultimate aim is to use machine-learning to recognise the visual vocabulary of fashion; identifying styles, patterns, and colour matches to inform its recommendations.
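The intuition behind collaborative filtering is that two items are similar when largely the same users interacted with both, so a one-off fancy-dress purchase contributes little to your profile. The sketch below is a toy item-to-item version using cosine similarity over sets of users, not Zalando's system; the item names and data are invented.

```python
from math import sqrt

# Toy item-item collaborative filtering: items are similar when largely the
# same users clicked or viewed them. Data and names are illustrative only.

def cosine(users_a, users_b):
    """Cosine similarity between two items' sets of interacting users."""
    overlap = len(users_a & users_b)
    if not overlap:
        return 0.0
    return overlap / sqrt(len(users_a) * len(users_b))

def recommend(item, interactions, top_n=2):
    """Rank other items by similarity to the given item.

    interactions maps item -> set of user ids who clicked/viewed it.
    """
    scores = {
        other: cosine(interactions[item], users)
        for other, users in interactions.items()
        if other != item
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

A single shared viewer between two niche items can still score highly here, which is why production systems damp out low-overlap pairs and, as the article notes, Zalando wants to fold in the visual vocabulary of fashion as well.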
For now though, the starting point for new users is content-based filtering that relies on algorithms that churn through Zalando’s product data resources to generate initial recommendations. Whether you actually like the brands featured on the site is another problem, but that’s all part of the fun of shopping, right?
Recently, Ars ran a story on Shipwallet, a company that wants to provide easy and convenient shipping options to rival Amazon’s logistics supremacy. Behind the scenes, Shipwallet’s machine-learning and analytics work out the best dispatch method derived from an army of different couriers and a knowledge of your local area. It provides simple choices based on your own preferences that appear at the online checkout. It’s designed to give a shot in the arm to online retailers who are desperate to remain competitive. Whether Starship Technologies will find its bots enrolled as part of this delivery offensive remains to be seen, but one area Shipwallet will need to investigate, if it really wants to match Amazon, is airborne drones.
Will we see DHL, FedEx, Parcelforce et al, join a swarm of independent couriers learning how to fly in the near future? Many delivery firms are working on drone projects, but DHL has been through three different designs, and its latest Parcelcopter is intended for remote areas that are equipped with a SkyPort. This is effectively an automated post office with a roof that opens up to allow the drone to land, manage the package, and recharge. The SkyPort is quite a hefty prerequisite, considering this tiltwing drone with a two-metre wingspan can only manage a 2kg payload, but that hasn’t stopped DHL from installing a couple up a mountain in Germany, with one at Reit im Winkl, and the other 500m further up at Winklmoosalm. The journey between the two SkyPorts takes eight or nine minutes to cover the 8.6km, with a typical flight speed of 70km/h and a maximum speed of 126km/h. The flight is fully automated, and as it's in a mountainous region, there’s little chance of encountering other aircraft.
Regulatory bodies normally place restrictions on drone test-flights, insisting on line-of-sight operation and prohibiting multiple drones from being controlled by one person. That said, Amazon struck a deal with the UK government in July 2016 allowing for such tests.
Amazon’s drone deliveries still seem some way off, and it's a design challenge that has its share of compromises. Performance speculation is rife given the paucity of information available on Amazon’s latest iteration, which surfaced in November 2015, although Phys.org has had a stab at assessing its airworthiness.
All Amazon will tell us is its latest Prime Air drone can travel for 15 miles at a speed of 55mph, flying at under 400 feet, and that it's equipped with onboard sense-and-avoid systems. It weighs 25kg and can manage a 2.3kg burden, but overall dimensions and hold capacity remain a mystery.
In a press release, Amazon wrote: “We have more than a dozen prototypes that we’ve developed in our research and development labs. The look and characteristics of the vehicles will evolve over time.”
If you want to take a peek at one, then the not-so-secret location of Fleam Dyke in Cambridge, England appears to be where the action is.
Google’s Project Wing is still flying, despite junking earlier designs in March 2015 in favour of a quadcopter design with a 1.5m wingspan. But as with Amazon, designs are subject to change. Google reckons the challenge of “moving stuff around” is too great for just one company, and its website dedicated to the project is now seeking collaborators in industry and government.
The site’s video suggests flying burritos will arrive one day, but the overall mood of Project Wing implies more noble causes: ferrying aid to disaster zones and drought-stricken regions of Australia.

In neighbouring New Zealand though, the emphasis is once again on pizza. Drone company Flirtey has teamed up with Domino’s, and in August 2016 let loose its DRU Drone in Auckland, which it hopes to use in customer trials later this year. As with everywhere else on the planet, there’s red tape involved but Domino’s CEO Don Meij is optimistic.
“We are planning a phased trial approach which is based on the CAA granting approval, as both Domino’s and Flirtey are learning what is possible with the drone delivery for our products but this isn’t a pie-in-the-sky idea," he said at the time. "It’s about working with the regulators and Flirtey to make this a reality for our customers. DRU drone is the next stage of the company’s expansion into the artificial intelligence space and gives us the ability to learn and adopt new technologies in the business.”
It might seem odd that low-flying drones that weigh no more than a sack of spuds struggle to get aerial approval when 18-tonne autonomous trucks are already on the road. As with autonomous cars, the proviso is that there has to be a human driver in situ, ready to take control if things go a bit wobbly, assuming they’re not too busy with Pokémon Go.
Last year Daimler announced a world-first when it tested a standard Mercedes-Benz Actros truck equipped with its intelligent Highway Pilot system, resulting in an autonomous operation on a public highway, the A8 autobahn in Germany. While a vehicle of this size won’t be rolling up your driveway with your weekly shop, it wouldn’t be out of place turning up at a warehouse for an online supermarket.
For now, these vehicles are driven in a semi-autonomous mode, with the driver intervening as required. However, Mercedes-Benz has a roadmap of sorts, embodied by its Future Truck 2025.
This concept vehicle shows a driver effectively taking a back seat for most of the journey as the chair swivels away from the steering wheel. It’s an arrangement with the potential for much longer journeys without the risk of driver fatigue and, if other trucks line up in platooning mode, fuel efficiencies from the cumulative aerodynamics. Quicker deliveries from distant locations promise fresher food, or a longer shelf-life at a CFC warehouse, which, along with the lower fuel bill, is bound to interest food retailers.
As part of the German motor-manufacturing consortium that bought Nokia’s Here mapping business last year, Daimler is certainly making strides in commercial vehicle autonomy that others such as Tesla and Uber will view with interest. Indeed, the company’s Freightliner Inspiration models, equipped with Highway Pilot, became the first autonomous trucks to be granted a state license in the US last year.
So far, meanwhile, all we have had from Uber’s acquisition of Otto, the autonomous truck rig developed by four former Google engineers, is a video of a retrofitted Volvo 18-wheeler in action.
Tesla will no doubt be applying its own autonomous-driving technology to its semi development. However, it has had a long relationship with Daimler and has even been poaching its engineers to work on the Tesla semi. The attraction for Tesla is Daimler’s own advanced electric vehicle research.
At the IAA Commercial Vehicles show in Hannover last month, Daimler premiered the Mercedes-Benz Urban eTruck, a vehicle with a gross weight of up to 26 tonnes designed for urban distribution. Its 212kWh battery pack gives it a range of around 200km (124 miles), and it should hit the streets in the early 2020s. There’s no word of any autonomous driving capability, but by the time it arrives, Uber might be ready to fit an Otto system onto it.

Both Tesla and Google are always quick to point out that their autonomous vehicles have covered hundreds of thousands of miles with a collision rate well below the human average. Yet the fact remains that accidents do happen, and when an autonomous robot is behind the wheel, who is responsible?
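The range and battery figures quoted for the Urban eTruck imply a rough energy consumption per kilometre. A quick back-of-the-envelope check (the per-km figure is our own derived estimate, not a Daimler specification):

```python
# Figures quoted in the article for the Mercedes-Benz Urban eTruck.
battery_kwh = 212   # battery pack capacity
range_km = 200      # approximate range

# Derived estimate: average energy used per kilometre.
consumption_kwh_per_km = battery_kwh / range_km
print(f"~{consumption_kwh_per_km:.2f} kWh/km")  # → ~1.06 kWh/km
```

Roughly 1 kWh per kilometre, an order of magnitude more than a typical electric car, which is what you would expect for a 26-tonne vehicle.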
In his book Humans Need Not Apply, Jerry Kaplan, the computer scientist, futurist, and fellow at Stanford University, highlights a series of arguments that are taxing the minds of programmers and legal bodies alike. As he sees it, part of the problem is how you punish an AI robot that has no way of knowing good from bad, especially as new situations present themselves. Does the owner get punished, or do you erase the memory of the AI agent?
What about the choices an autonomous vehicle might need to make? The decisions are ultimately issues of programming and have become hot topics. In the chapter “Officer, Arrest That Robot,” Kaplan describes some challenging autonomous driving scenarios that make for unsettling reading.
“Your self-driving car can run over a dog to save your life: pretty clear what you would want it to do. But what if it has to choose between running over an elderly couple or a bunch of kids crossing the street? How about the Sophie’s Choice of which one of your own children to kill, the one in the front seat or the one in the back?”
And for an autonomous vehicle system, these choices would merely be a calculation, not a conscious response. As this technology proliferates, humans will be more exposed to the decisions made by AI systems that have no sense of emotion, pain, or morality, and we’ll need some robust thinking to insure us, in every sense, against the consequences. There are too many strands to these scenarios to cover here, but if these looming questions interest you then Humans Need Not Apply is a fascinating exploration of how AI is likely to affect us all.
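To make the "merely a calculation" point concrete, here is a deliberately crude sketch: a planner that simply picks the manoeuvre with the lowest expected-harm score. The scores and option names are invented for illustration; no real autonomous-driving system publishes such a table, and real planners are vastly more complex.

```python
# Hypothetical illustration of Kaplan's point: to the machine, a
# "moral" dilemma reduces to minimising a numeric cost function.
def choose_manoeuvre(options):
    """Return the candidate manoeuvre with the lowest expected-harm score."""
    return min(options, key=lambda o: o["expected_harm"])

# Invented candidate actions with made-up harm estimates.
options = [
    {"action": "swerve_left",  "expected_harm": 8.0},
    {"action": "swerve_right", "expected_harm": 12.5},
    {"action": "brake_only",   "expected_harm": 9.3},
]

print(choose_manoeuvre(options)["action"])  # → swerve_left
```

The unsettling part is not the code, which is trivial, but the question of who assigns the numbers and who answers for them afterwards.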
Semi-autonomy
When it comes to doorstep grocery delivery, the man-with-a-van option remains the order of the day. However, one promising semi-autonomous contender was the VW eT!, a concept vehicle that was developed between 2010 and 2011 and unveiled in April 2012. This electric van could be controlled with an app and instructed to autonomously follow a delivery driver at walking pace while he made his dispatches on foot. The Volkswagen site still showcases this concept vehicle, but the idea appears to have been mothballed.
It’s a shame, as the VW eT! had the makings of a practical autonomous system that really could enhance delivery times—and might well suit the concerns of Ocado’s Paul Clarke, who wants to maintain a personal touch as part of the service.
“We could anticipate a range of last-mile services that autonomous vehicles will fit into, but it’s unlikely to be one-size-fits-all. We take our customers’ groceries to their kitchen table,” he says, conceding that a bot would probably be better suited to convenience orders in the middle of the night, assuming a customer was prepared to go to the kerbside to collect it.
For now though, the autonomous vehicles Clarke is focused on are the next-generation Ocado Smart Platform bots, and he's working on ways to optimise their performance within a private cloud.
“Some of the systems that run our facilities are very low latency, very realtime-control-orientated, and they really need to run on premises,” he says. “Whereas the rest we want to put into the [public] cloud in future. It’s a hybrid cloud strategy.”