Refraction AI plows through its first snowstorm as its pilot launches in Ann Arbor, Michigan
Neither snow nor rain nor heat nor gloom of night stays these couriers from the swift completion of their appointed rounds. The postal creed dates back to Ancient Greece and has never rung truer than it does now, as Refraction AI's robots take to the snowy streets of Ann Arbor, Michigan, to pilot the company's online food delivery service.
As their robot navigates dense urban passages to get grandma her hot bowl of soup, Refraction's founders, Matthew Johnson-Roberson and Ram Vasudevan, cheer it on from their teleoperation control room, keeping their eyes on the prize: solving the logistics problem of the soon-to-be $200 billion online food delivery market.
GrubHub, DoorDash, Postmates and Uber Eats have been losing money hand over fist. In a letter to shareholders regarding disappointing third-quarter earnings, GrubHub explained why the current model of food delivery might never be profitable. “Bottom line is that you need to pay someone enough money to drive to the restaurant, pick up food and drive it to a diner. That takes time and drivers need to be appropriately paid for their time or they will find another opportunity. At some point, delivery drones and robots may reduce the cost of fulfillment, but it will be a long time before the capital costs and ongoing operating expenses are less than the cost of paying someone for 30-45 minutes of their time.”
As the gauntlet has been thrown down to drive cost per delivery lower, the competitive landscape has filled with sidewalk delivery robots (Starship Technologies, Kiwibots), road delivery robots (Nuro, Udelv), drones (Amazon Prime Air), and scooter rideshares (OjO) hawking their solutions. Cutting through the noise, Refraction may have found a passage no one has addressed yet: the bike lane, traveled by a state-compliant, low-cost transport that can operate autonomously through the harshest weather conditions.
Refraction AI CEO and co-founder, Matthew Johnson-Roberson
Not one to shy away from a good challenge, Johnson-Roberson has traveled the toughest terrain on the planet to solve some of the world’s most complex problems with AI and robotics. As a Carnegie Mellon undergraduate, he competed in the 2004 DARPA Grand Challenge across the Mojave Desert under the tutelage of autonomous vehicle pioneers Red Whittaker and Chris Urmson. While getting his PhD from the University of Sydney, he ventured to the bottom of the sea with underwater robots to understand the effects of climate change on the Great Barrier Reef. Now he’s in the snowbelt of America’s Heartland, where he heads up the University of Michigan’s robotics department, DROP Lab and Ford Center for Autonomous Vehicles. In 2015, the National Science Foundation recognized his work with its prestigious CAREER Award, and he’s just getting started.
I had a chance to talk with him about his founder’s journey, what makes his last mile robot a product-market fit, and how startups can have social impact from launch. What follows is an edited transcript of our discussion:
How did you meet your co-founder, Ram Vasudevan? Were you college roommates?
We’re both robotics professors at University of Michigan. When he joined the faculty, I tried to get him to be my roommate. I was renting this big house and invited him over. He took one look, and said, “Nah, I’m good.” Haha, he did not want to live with me but we became fast friends and have been working together ever since.
In 2016, we founded the Ford Center for Autonomous Vehicles together and became frustrated with how long it was taking industry to bring robo taxis to market. By September 2017, we wanted to explore what could be useful right now and this was the genesis for founding Refraction AI.
How did you get funded?
The first few years we bootstrapped and built the robot in my garage. Cold and dark, it was a great place to do physical hardware. Then this past March, we raised $2.5 million from eLab Ventures and Trucks Venture Capital. Doug Neal, who is an eLab partner, has close ties to UM, having previously run the Center for Entrepreneurship. He joined our board and helped us spin out of the University. We met Trucks VC’s Reilly Brennan and Jeff Schox at mobility events, including the Das Tegernsee conference they produce with Rob Coneybeer of Shasta Ventures.
On December 12, you launched your first pilot in Ann Arbor. How’s it going?
It’s been very exciting. We now have over 250 people signed up for deliveries, five restaurant partners, 15 full-time employees, and tons of snow! Plus we’re growing. We have 5 vehicles right now and should be close to 12 or 15 by the end of February.
What are your plans for scaling?
We are in the midst of raising our Seed+ to expand our fleet to 70 vehicles – 30 to max out Ann Arbor, and then 20 each in other deployment cities like Boston and Palo Alto. Palo Alto because it’s a great place to be in front of a lot of capital, and Boston because we want to show we can operate where the weather is harsh, the terrain is challenging and the driving is difficult. Both are college towns with a density of restaurants, populations of hungry students who are quick adopters of new technology, and robust state AV and e-bike regulations that we can use in tandem to get statewide deployment.
Other areas high on the list to deploy to as we get to scalability over the next two years include Pittsburgh, Madison, and the Eastern seaboard down to Florida.
Is there any region you’re not planning to deploy to?
New York City just banned e-bikes for delivery so we’re not going there.
Starship Technologies sidewalk delivery robots make food deliveries on college campuses
It’s a gold rush right now chasing the last mile, what differentiates you from the competition?
We operate in the street, which is regulated by the state and has fewer regulatory and technical challenges than operating on the sidewalk, which is regulated by the city. Sidewalk delivery robots need to comply with regulations in every different city they want to operate in. That’s a big challenge with scaling that model.
We’re also not a car as we meet all of the classifications of a Class 2 E-Bike which is already legal in most states – our motor is under 500 watts, we weigh 100 pounds, and we travel at 15 miles per hour. This means vehicle and safety requirements are easier to comply with. That’s a huge win for us.
Nuro’s full-sized AV just launched a pilot with Walmart
The risk with full-size autonomous vehicles is that if you get hit by one traveling 35-40 mph, there could be a fatality (as in the case of Uber in Tempe). Our vehicle moves slowly and is very light – similar to a small child riding a bicycle – so the impact risk is very low. If something dashes in front of it, its sensor modalities – the cameras, ultrasound, radar, LIDAR – pull it to a stop, and because it’s so light and slow, our robust teleoperations can take over quickly and safely. This allays municipalities’ public safety concerns about having a human there if something goes wrong.
If you think about it, it makes no sense to use a 4,000-pound vehicle to deliver one hamburger that could travel much lighter and easier in the bike lane while reducing congestion and pollution on the road.
What is the ratio of your teleoperators to vehicles?
One operator manages three vehicles, and in time will be able to manage up to six. Our teleoperators are highly competent: they go through an extensive training program and are provided with comprehensive manuals and resources. They work in shifts with no fewer than two people on at any given time. As we expand, we will keep our teleoperations in Ann Arbor because of our skilled workforce and the lower cost of operating here.
Postmates Delivery Robot in teleoperations pilot powered by Phantom Auto
What does the journey of your robot look like?
An order is placed and the robot leaves the depot at our downtown Ann Arbor offices through big roller doors. It goes down an alley into the street, then proceeds to the restaurant. A Refraction tablet sits alongside DoorDash’s and GrubHub’s tablets at the restaurant. When the order comes in, the food gets prepared, and the tablet tells the restaurant when the robot has arrived. The restaurant host drops the food package into the robot, and the robot then goes to the customer’s house. As it approaches, it texts the customer, who can specify where the robot should drop off the food – for example, alongside a porch or in the driveway.
Complex crossings along the route have been micro-geofenced and signal a teleoperator to take over as the robot approaches. We use a multiple-modem system that ensures communication with the robot even when there is network congestion.
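As an illustration of how such a geofenced handoff trigger might work – all names, coordinates, and radii here are hypothetical, not Refraction’s actual implementation – each complex crossing can be modeled as a circle around an intersection, and the robot’s position checked against it:

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Illustrative micro-geofences around two tricky crossings (made-up values).
GEOFENCES = [
    {"name": "Main & Liberty", "lat": 42.2796, "lon": -83.7489, "radius_m": 30.0},
    {"name": "State & William", "lat": 42.2789, "lon": -83.7410, "radius_m": 25.0},
]

def takeover_needed(lat, lon):
    """Return the name of the geofence the robot is inside, or None."""
    for fence in GEOFENCES:
        if haversine_m(lat, lon, fence["lat"], fence["lon"]) <= fence["radius_m"]:
            return fence["name"]
    return None
```

In a real system the takeover request would be raised some distance before the fence boundary so the teleoperator is already watching when the robot enters the crossing.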
What is in your tech stack?
We’re focused on driving the unit economics of this by keeping the cost of the platform low. If you look at the roof of a Cruise car, it has a couple hundred thousand dollars of equipment. Ultimately, it’s very difficult to figure out any business model where you can deliver $12 of McDonald’s in a vehicle that costs more than a Bentley.
So we’re focused on having the sensors we need to drive while keeping them low cost. Thanks to cell phones, cameras have become very inexpensive, so we have 12 on the vehicle – some with a very wide field of view, around 200 degrees, and some narrower, at 90-100 degrees. These spread-out fields of view allow us to see 360 degrees around the vehicle and perform depth estimation with a high level of fidelity, which lets the vehicle position itself with respect to parked cars and lanes. We also have sensors that make sure we’re being safe, like ultrasound – the same backup sensor that beeps in a car when you get close to things. Ultrasound works really well over short distances for blind spot detection but isn’t used on a full-sized vehicle like Cruise’s or Waymo’s because you can only see 5 to 8 feet with it. That’s not useful if you’re going 30 miles per hour, but because we’re going 12 miles per hour, it’s an incredibly viable sensor for maintaining the safety of the vehicle. We also have millimeter-wave radar, and Livox makes a low-cost LIDAR system for under $500 that gives us another modality for optical detection to make sure we don’t run into things. We use both AWS and Google Cloud.
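The speed argument for ultrasound can be made concrete with simple reaction-time arithmetic, using only the ranges and speeds quoted above:

```python
# Why a 5-8 ft ultrasound range is viable at 12 mph but not at 30 mph:
# compare sensor range to the distance covered before reaching the obstacle.

MPH_TO_FTPS = 5280 / 3600  # 1 mph = 1.4667 ft/s

def warning_time_s(sensor_range_ft, speed_mph):
    """Seconds between first detection and impact, assuming no braking."""
    return sensor_range_ft / (speed_mph * MPH_TO_FTPS)

# At 12 mph, an object detected 8 ft away gives roughly 0.45 s to stop;
# at 30 mph, the same 8 ft gives only about 0.18 s - too little to react.
```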
At the crux of all of this is our AI deep learning system, which uses all 12 cameras together to provide rich depth estimation and a 360-degree point cloud for object detection and segmentation to identify people.
What data are you collecting? What do your cameras see?
We collect depth data. Telemetry and control information is processed locally on the vehicle and sent back for navigation. The cameras see a point cloud of a human outline similar to how body scanners see a depth map. Blobs, not uniquely identifiable images of people. We also collect valuable data for cities related to potholes, road damage and accidents. And we collect food data because that’s what’s being transported.
Do you have concerns about theft or vandalism, or snowball fights?
I grew up in New York City in the 1980s and remember the high-crime era. I know scooters have a life of three to four weeks due to theft and vandalism, but it’s really not a concern for us. Our vehicle is kind of bulky, so it’s not easy to pick up and throw like the scooters that were getting tossed into the Bay. It transports only food – not high-value items, not cash. It costs about $4,000 but is constantly in motion, in the street, with sensors, cameras, a GPS, and a speaker with a teleoperator who can make out what’s happening and say, “Hey.” It’s also not parked on the street overnight. When it’s done with its journey, it comes home to a locked depot.
What is your cost per delivery?
At steady state, we’ll be around $2-$2.50 per delivery. The estimate based on public companies’ figures is about $11-$12 per delivery using human beings.
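A quick back-of-the-envelope comparison of the two figures quoted, using the midpoints of each range:

```python
# Midpoints of the per-delivery cost ranges cited in the interview.
robot_cost = (2.00 + 2.50) / 2    # steady-state robot delivery: $2.25
human_cost = (11.00 + 12.00) / 2  # human courier delivery: $11.50

savings_per_delivery = human_cost - robot_cost  # about $9.25 per order
savings_ratio = human_cost / robot_cost         # roughly 5x cheaper
```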
Do you see your platform doing partnerships with DoorDash, Postmates, GrubHub and Uber Eats?
Ultimately we want to offer this as a service to anyone who wants to move goods safely and quickly around a city. At the end of the day, there are going to be a number of markets and verticals that we can go into. Food delivery is easy because it’s a well-established market: there is a ton of customer demand and current solutions are not great. In the long run, there are pharmaceuticals, small packages, and a host of things we want to deliver. The market potential is massive.
Right now, the food delivery apps are struggling with how to make the unit economics work. They charge restaurants upwards of 30% revenue share and, on the consumer side, $10-12 for delivery. It’s a difficult business to make work. If you talk to the drivers, these are not great jobs. Some don’t even make minimum wage and rely heavily on tips. There are big structural changes happening in the gig economy, and laws are starting to change (see California Assembly Bill 5 – The Gig Worker Bill).
People have become accustomed to getting things on demand and outpacing expectations on every front, but we’re exploiting human workers to do it, and that’s not right. There’s a lot we can do to think about the ways we’re servicing that demand. Our goal is to create a way that is not just economical but sustainable.
It sounds like you’re not a proponent of the gig economy. Will you remain an employee-only company and not use contractors?
I am not a proponent of the gig economy. I believe as we try to build a fair and more just economy, we need to employ people and give them healthcare and purpose. I’m fine with exploiting robots – you can work them 24 hours a day and cut costs to make them as cheap as possible, and at the end of the day it’s just a robot, not a person. We want to invest in our employees and build something we care about – it’s a better model than trying to squeeze people as much as you can to make your margins higher.
What do you think about autonomous vehicle companies who say they are not at parity with hiring women because they can’t find enough qualified candidates with the right experience?
The pipeline issue is a copout. Ultimately, those narratives have been revealed to be false as the unemployment rate has dipped below 4%.
If you’re trying to build a future that makes the world better for everybody, you need to include everybody in that process. If you don’t have representation, you’re only going to make the world better for a very small and limited segment of the population. I’m not sure you can even achieve the goals that you have for making the world a better, safer and more sustainable place if you don’t have a diverse workforce.
We think about the bias that goes into hiring, the way resumes are read, where interviews take place etc., to ensure that implicit biases don’t creep into the hiring process and we look broadly for candidates with different backgrounds. No one has the right experience building these robots because we’re building the first ones. You need to hire smart people who care about people, the planet, and building the technology and you can find them across every spectrum of the human experience. Once you hire them, you need to train them and invest in them early.
You’re heading off to CES, what are you hoping to see in terms of tech being announced?
I’m always looking for better depth cameras – Intel has a few – and looking forward to seeing faster purpose-built AI chips that can run deep learning networks on a lower power budget. I’m hoping to find one running on 10 to 15 watts as opposed to the 100 watts we’re running now. That would increase battery life from 12 to 24 hours and range from 50 to 100 miles, which would make a huge difference in unit economics.
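The battery math behind that hope can be sketched with illustrative numbers. The non-compute draw below is an assumption, chosen so the quoted figures – a 12-hour runtime with a ~100 W chip, roughly doubling with a 10-15 W chip – come out consistently:

```python
# Assumed non-compute draw (motors, sensors, comms) - illustrative only.
drive_power_w = 100.0
chip_today_w = 100.0   # current AI compute draw quoted in the interview
chip_future_w = 12.5   # midpoint of the hoped-for 10-15 W chip

# Battery sized for 12 hours at today's total draw: 2400 Wh.
battery_wh = (drive_power_w + chip_today_w) * 12

# Same battery with the low-power chip: runtime nearly doubles (~21 hours).
hours_future = battery_wh / (drive_power_w + chip_future_w)
```

If the compute draw is a smaller fraction of the total than assumed here, the runtime gain from a more efficient chip shrinks accordingly.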
Refraction will be showing off our REV-1 robot at CES in the Westgate Smart Cities Pavilion with Livox (Booth 833), January 7-10. I hope to see you there!