The Spoon

Daily news and analysis about the food tech revolution


Robotics, AI & Data

June 6, 2019

Alexa Will Soon Book You Dinner and a Movie. What Will Its Multi-Tasking Mean for Connected Kitchens?

At the company’s re:Mars conference yesterday, Amazon showed off Alexa’s new multi-tasking functionality, which allows the virtual assistant to engage in more contextual interactions and commands.

Normally Alexa is a mono-tasking machine, which means you have to break up your requests into single items, all of which must start with you saying “Alexa.”

“Alexa, what time is the movie playing?”

“Alexa, buy me tickets to the movie at 7:00.”

“Alexa, what restaurants are near the movie theater?”

But as discussed at re:Mars, soon Alexa will be able to collapse these multiple requests into a single, multi-part conversation. So when you buy your movie tickets, Alexa will then ask if you want to eat at a nearby restaurant, make your reservation, and even call you an Uber.

What immediately springs to my mind, however, is not having Alexa plan my date night, but how the virtual assistant’s new multitasking functionality could help in the kitchen. (Does that make me a bad husband?)

More than 100 million Alexa devices have been sold, and according to a January article on The Verge, there are “more than 150 products with Alexa built in, more than 28,000 smart home devices that work with Alexa made by more than 4,500 different manufacturers, and over 70,000 Alexa skills.”

Those products include kitchen devices big and small like LG appliances, GE microwaves, the June Oven, and even the Silo food storage system. Just as Alexa connects movie tickets, dining, and transportation, it should also be able to better connect the various parts of your connected kitchen.

For example, you could ask Alexa for a lasagna recipe. Alexa could then check the cameras in your connected fridge and your grocery purchase history to see if you have all the ingredients. If you’re missing something, it could order it for you with same-day delivery. Alexa could then ask what time you want to eat, set a reminder, pre-heat the oven at the appropriate time, and show you a video on how to make the lasagna. After the lasagna reaches a certain cook time, Alexa could also turn on your June to cook a side.

The point is that thanks to yesterday’s news, all of these actions could be coordinated via a single conversation, with Alexa pushing some suggestions and contextually reacting to other requests.
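A guided-cooking flow like this boils down to a simple planner that chains steps and fills gaps as it goes. The sketch below is purely illustrative; the function, the ingredient data, and the action strings are all hypothetical, not anything Amazon has shipped:

```python
# Hypothetical sketch of a multi-step guided-cooking conversation plan.
# Each step's outcome (e.g., missing ingredients) shapes the next suggestion.

def plan_dinner(recipe, fridge_contents, pantry_needed):
    """Return an ordered list of assistant actions for one cooking session."""
    actions = [f"fetch recipe: {recipe}"]
    # Compare what the recipe needs against what the fridge cameras saw.
    missing = [item for item in pantry_needed if item not in fridge_contents]
    if missing:
        actions.append(f"order same-day delivery: {', '.join(missing)}")
    actions += [
        "ask for dinner time and set a reminder",
        "preheat oven at the right time",
        f"play how-to video for {recipe}",
        "start side dish in the June Oven mid-cook",
    ]
    return actions

steps = plan_dinner(
    "lasagna",
    fridge_contents={"pasta", "beef", "tomato sauce"},
    pantry_needed=["pasta", "beef", "tomato sauce", "ricotta"],
)
print(steps[1])  # order same-day delivery: ricotta
```

The key design point is that the whole plan comes from one request, instead of six separate “Alexa, …” commands.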

The result of this multi-tasking Alexa would be a better guided-cooking system, one that makes Alexa more of a true assistant.

June 3, 2019

Welp. Robots Have Knives Now, and Know How to Use Them (to Slice Onions)

Well, fellow humans, we had a good run, but our time is over. Robots have their knives out — literally — and know how to use them.

Terminator-esque teasing aside, IEEE Spectrum has a video roundup of some of the cutting-edge (sorry) robotics research being done right now. Included among the videos is “Robotic Cutting: Mechanics and Control of Knife Motion,” by Xiaoqian Mu, Yuechuan Xue, and Yan-Bin Jia from Iowa State University in Ames, Iowa.

You may think that having a robot slice an onion mainly entails a big mechanical arm slamming a knife down, but you’d be wrong. The researchers created a program that combines and coordinates pressing, pushing, and slicing motions. From the research paper’s introduction:

Cutting skills such as chop, slice, and dice are mostly beyond the reach of today’s robots. Technical challenges come not just from manipulation of soft and irregularly-shaped objects, but more from doing so while fracture is happening. The latter requires planning and force control based on reliable modeling of an object’s deformation and fracture as it is being cut. The knife’s movement needs to be adjusted to progress in terms of material fracture. Its trajectory may need to be replanned in the case of an unforeseeable situation (e.g., appearance of a bone).

Robotic Cutting: Mechanics and Control of Knife Motion

As you can see from the video, this particular robot won’t be wowing crowds at a Benihana anytime soon, but it shows once again that robots are getting more proficient at higher-skilled tasks. Automation is coming for food sector jobs, and while we think of them right now in terms of flipping burgers and bussing tables, robots will be automating more and more tasks in restaurants, like prepping vegetables.

Dishcraft, for example, is still pretty tight-lipped about what it’s working on, but the company has talked about building robots to do specific tasks in restaurant kitchens like prep work. Miso Robotics’ Flippy was created in part to take over dangerous kitchen tasks like working the grill and deep fryer, and the company has already talked about Flippy eventually chopping vegetables.

While there are still many issues to work through with the rise of robots, having them handle knives in the kitchen (and saving countless fingertips from lacerations) is probably not such a bad thing.

May 30, 2019

Google Maps Adds Popular Dish Feature to Surface Favorite Meals at Restaurants

Google Maps has always helped navigate you to a nearby restaurant, but with a new feature launched today, Maps will help you navigate that restaurant’s menu by surfacing its most popular dishes.

The popular dishes feature uses machine learning to parse through photos and reviews of dishes posted by Google Maps users and identify a restaurant’s most popular meals. The new feature is available now on Android with an iOS version to follow later. From a Google blog post announcing the service:

Simply pull up a restaurant on Google Maps to find its popular dishes in the overview tab. Feeling extra peckish? Dive into the menu tab to scroll through all the most-talked about meals, and tap on a popular dish to explore reviews and photos. In a country where you can’t read the language? Maps will also translate the reviews for you too.
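At its core, the ranking step comes down to mining user reviews for dish mentions and sorting by frequency. Google’s production pipeline uses machine learning over photos and free-form text; the keyword-counting sketch below, with made-up menu and review data, only illustrates the idea:

```python
# Toy illustration of "popular dishes": count how often each known menu item
# is mentioned across user reviews, then rank. Menu and reviews are invented.
from collections import Counter

def popular_dishes(menu, reviews, top_n=3):
    """Rank menu items by how many reviews mention them."""
    counts = Counter()
    for review in reviews:
        text = review.lower()
        for dish in menu:
            if dish.lower() in text:
                counts[dish] += 1
    return [dish for dish, _ in counts.most_common(top_n)]

reviews = [
    "The pad thai was incredible!",
    "Get the pad thai, skip the spring rolls.",
    "Green curry and pad thai are both great.",
]
top = popular_dishes(["Pad Thai", "Green Curry", "Spring Rolls"], reviews)
print(top[0])  # Pad Thai
```

The real feature also has to match photos to dishes and translate reviews, which is where the machine learning earns its keep.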

Google has certainly been interested in feeding you lately. Last week, the company announced a feature that allows Google Maps, Search and Assistant users to order food for delivery directly through those apps. Earlier this month, the company revealed a new Google Lens feature that lets users point their phone cameras at a menu to bring up pictures of popular meals. And there’s also Google Duplex, the human-sounding AI assistant that can make restaurant reservations for you.

It’s not hard to connect the dots to see where all this is going. Knowing what restaurants are nearby, what type of cuisine they serve and what their most popular dishes are creates the foundation for an even more powerful AI assistant. Why should Duplex stop at making restaurant reservations when it could also order your food for home delivery? While this is useful on your phone, having this kind of functionality on a Google Nest Hub smart display would be equally powerful for families ordering dinner. Google can recommend the restaurant, suggest dishes and then automatically have it delivered to your door.

Getting your purchase history and surrounding data (when you ordered, etc.) would give Google even more data to power its algorithms, and the company is pretty upfront about wanting that data. From the Popular Dishes blog post today:

At the end of the day, this feature is made possible because of contributions from people around the world who want to help others using Google Maps. So if you want to pay it forward to the next dinner, simply take a photo of your meal (before you’ve scarfed it down!) and add a dish name so others can know what’s good on the menu.

Of course, in a world where we freely hand over so much information about ourselves, at some point you have to ask: who, exactly, is being served?

May 29, 2019

Will More Artificial Intelligence Lead to Fewer Foodborne Illnesses at Restaurants?

Chick-fil-A is now using AI to monitor social media feedback from customers in order to detect and prevent foodborne illness, QSR reports.

At last week’s ReWork Deep Learning Summit in Boston, Chick-fil-A’s senior principal IT leader of food safety and product quality, Davis Addy, explained how the tech works.

For most QSRs, gathering feedback from social media is key for getting insights into what’s working and what isn’t with the business. That includes spotting any mentions of food safety issues or potential foodborne illnesses like norovirus, which causes an average of 19–21 million cases of acute gastroenteritis annually in the U.S. Restaurants and catered events are among the most common settings for norovirus outbreaks, according to the CDC.

But to find these mentions, Chick-fil-A (or any QSR, for that matter) has to sift through hundreds if not thousands of customer reviews every day, many of them written with poor (or no) grammar and what Addy called “mixed sentiments and off-topic musings.” In other words, references to legitimate foodborne illness issues are often hidden amid hyperbole, misspellings, and other facets of modern communication.

Hence, more AI. Chick-fil-A has developed a custom AI platform, hosted on Amazon’s AWS, that uses Comprehend, a natural language processing service, to check data from Chick-fil-A’s social media channels every 10 minutes and filter for over 500 related keywords like “nausea” and “food poisoning.” The system then uses machine learning to gauge the sentiment of the feedback and determine its legitimacy. In the case of legitimate issues, the system notifies store managers.
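The two-stage pipeline (a keyword filter, then a sentiment check to weed out hyperbole) can be sketched in a few lines. The real system scores sentiment with Amazon Comprehend; the tiny word lists below are crude stand-ins, and the keywords and posts are invented for illustration:

```python
# Sketch of a keyword-then-sentiment filter for food-safety mentions.
# In production this would call a real NLP service; the word lists here
# are stand-ins and all example data is made up.

SAFETY_KEYWORDS = {"nausea", "food poisoning", "vomiting", "stomach ache"}
NEGATIVE_WORDS = {"sick", "awful", "terrible", "never again", "worst"}

def flag_posts(posts):
    """Return posts that mention a safety keyword AND read as a real complaint."""
    flagged = []
    for post in posts:
        text = post.lower()
        if any(kw in text for kw in SAFETY_KEYWORDS):
            # Stand-in sentiment check: keep only genuinely negative posts.
            if any(neg in text for neg in NEGATIVE_WORDS):
                flagged.append(post)
    return flagged

posts = [
    "Got terrible food poisoning after lunch here, never again.",
    "This sandwich is so good it should be illegal, food poisoning of joy lol",
    "Great fries, friendly staff!",
]
flagged = flag_posts(posts)  # only the first post survives both filters
```

The second post shows why the sentiment stage matters: it contains the keyword but is clearly not a real complaint.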

At the end of 2018, researchers at Google and Harvard unveiled a somewhat similar model called FINDER that uses machine learning to scan anonymous and aggregated data on searches that indicate food poisoning (e.g., “stomach bug”) from users who save location data onto their phones. It uses that information to pinpoint restaurants which are potential sources of foodborne illness, then notifies the city’s health department. FINDER is still in pilot mode; tests have so far taken place in Chicago and Las Vegas.

Chick-fil-A says its system currently operates at 78 percent accuracy and that the company is working with Amazon to improve that number. Since machine learning improves over time rather than needing to be explicitly reprogrammed, the hope is that the system will gradually get better at both detecting keywords and determining whether they come from legitimate complaints about foodborne illness.

Once it’s done that, AI’s job ends. Restaurant operators and managers have to take that information and decide how to act on it, at least for now. With humans and robots/AI starting to work side by side in an increasing number of restaurant settings, AI could not only track down foodborne illnesses, but also help to prevent them in the first place.

May 22, 2019

Ford Developing Bi-Pedal Robot to Carry Deliveries from Driverless Cars to Your Door

Plenty of companies are bringing robot-powered delivery of food and other household goods to the last mile, but most stop at the last few feet. Autonomous cars park at a curb and little rover bots typically can’t climb the front steps of a house.

Which is why Ford is working on a bi-pedal robot that literally walks deliveries from driverless cars right up to your front door (h/t Bloomberg). Dr. Ken Washington, Vice President of Ford Research and Advanced Engineering and Chief Technology Officer, published a post on Medium today outlining the program, writing:

Enter Digit, a two-legged robot designed and built by Agility Robotics to not only approximate the look of a human, but to walk like one, too. Built out of lightweight material and capable of lifting packages that weigh up to 40 pounds, Digit can go up and down stairs, walk naturally through uneven terrain, and even react to things like being bumped without losing its balance and falling over.

Like something straight out of an Asimov novel, Digit folds up and sits in the back of a driverless delivery van. When a package needs to be delivered, it emerges from the vehicle, stands up and carries the package to a person’s doorstep. Digit doesn’t have a ton of autonomy gear and processing power on it. Instead, the driverless car, which is packed with sensors and mapping equipment, sees the surrounding area and sends Digit the best path to the door. If Digit needs help, or encounters something unexpected, the problem can be sent up to the cloud where another system (perhaps even a human) can assist.

Though the Ford post didn’t mention groceries specifically, they are a good use case for this type of robot delivery. Groceries are heavy, and even if a driverless car brings them to a house, a person still needs to go out to the street to retrieve them and lug them back inside. The weight of groceries is one of the reasons self-driving delivery company AutoX moved more into (the much lighter) restaurant food. For most people, walking to the curb is a minor inconvenience, but for those who have trouble moving, a robot walking packages to the door would be a big help.

Ford isn’t alone in getting goods directly to your front door. Earlier this year, FedEx unveiled its own delivery robot that can climb stairs (though it uses wheels, not legs), and Amazon received a patent for an autonomous robot that would live in a home’s garage and venture out to fetch packages from delivery trucks.

There’s no word on when, or if, this particular version of Ford’s delivery vision will be coming to a neighborhood near you, and there are still a lot of regulatory hurdles for self-driving vehicles to clear. But at the current pace of innovation, robots are bound to be bounding up your walkway to deliver a package someday soon.

May 21, 2019

Newsletter: The Spoon’s Food Tech 25 Is Here. So Is the Battle for the Drive-Thru.

This is the post edition of our newsletter. To get the Weekly Spoon delivered to your inbox, subscribe here.

One of my favorite things about tech is that it starts a lot of debate. Even within our small team here at The Spoon, we’re constantly on different pages about what’s groundbreaking and what’s just hype, whether something’s progressive or just invasive, how to spell the phrase “food tech.”

So when it came time to put together our annual Food Tech 25 list, which dropped yesterday, you can bet it took a whole lot of discussion to whittle the entire food industry down to just 25 companies.

As we always do, though, the Spoon team — Mike Wolf, Chris Albrecht, Catherine Lamb, and myself — managed to compile a list of companies we, individually and collectively, believe are truly impacting the human relationship to food. That impact takes many forms, from the way Creator makes it possible for humans and robots to coexist in the kitchen to Yo-Kai’s vending machine of the future to Goodr’s efforts to use tech to keep food out of the trash and redistribute it to those in need.

I’m hoping readers enjoy this list, but I’m also hoping it sparks some healthy dispute, too. Who else should be on the list? For that matter, who shouldn’t, and why? We encourage you to email us with any additions, subtractions, rants and raves on the matter.

And, most important, congratulations to the companies who made it on this list!

Image via Unsplash.

Drive-Thru Tech Moves Into the Fast Lane

One area of food tech that’s going to raise many more questions over the next few years is the QSR drive-thru. Specifically, how AI is changing the drive-thru and what that means for both restaurant operators and customers.

We’ve been following closely the story behind McDonald’s acquisition of Dynamic Yield, a New York-based AI company whose tech has already been rolled out to almost 1,000 Mickey D’s drive-thru lanes. Then, this week, Clinc, best known for its work in the financial sector, announced a new funding round that will allow the company to expand into other markets, with QSR drive-thrus at the top of the list.

Clinc is using AI-powered voice controls to facilitate more natural conversation between the customer and the ordering system, in the hopes of making the drive-thru experience smoother and faster. Drive-thru order times are much longer than they used to be, and companies are betting AI will speed up the ordering process by making it more accurate and by making more personalized recommendations, like immediately suggesting a pastry to someone placing their morning coffee order. There are even companies working on making those recommendations not just in real time but also based on existing customer data. One such company is 5thru, which does away with voice altogether: it scans your license plate, matches it to a profile stored with the restaurant, and makes real-time recommendations based on the preferences and order history attached to that plate. Cue the progressive-versus-creepy debate.
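As a toy illustration of plate-keyed personalization: look up a returning customer’s order history and suggest their usual item for that time of day. 5thru’s actual system is not public, so every name, rule, and data point here is hypothetical:

```python
# Hypothetical plate-keyed drive-thru suggestion: recommend the item this
# customer most often orders around the current hour. All data is invented.
from collections import Counter

# plate -> list of (hour_of_day, item) from past visits
profiles = {
    "ABC123": [(8, "coffee"), (8, "coffee"), (8, "croissant"), (13, "burger")],
}

def suggest(plate, hour, window=2):
    """Most-ordered item within `window` hours of the current time, if any."""
    history = profiles.get(plate, [])
    nearby = Counter(item for h, item in history if abs(h - hour) <= window)
    return nearby.most_common(1)[0][0] if nearby else None

print(suggest("ABC123", 9))  # coffee
```

The pastry-with-morning-coffee upsell from the paragraph above is just this lookup plus a co-purchase table.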

Join the Conversation at The Spoon’s New Food Tech Fireside Event

As much as we value the sound of our own voices over here, though, we actually want to hear more from readers on their thoughts around tech. That’s why we started a new online event, The Spoon’s Food Tech Firesides. Every month, we’ll hold a virtual sit down with one or two food industry innovators and invite the audience to join in the talk via written questions.

First up will be Tessa Price of WeWork Food Labs and Peter Bodenheimer from Food-X talking about food accelerators: what they are, what they’re not, and which companies and entrepreneurs should consider them as a path towards growth.

The event takes place May 30 at 10:00 a.m. PDT/1:00 p.m. EDT. Catch the full details here, and be sure to register early, as there’s limited space available.

May your week be filled with lively debate.

Onwards,

Jenn

May 17, 2019

Bear Robotics Launches Second-Gen Restaurant Robot, Adds Swappable Tray System

Bear Robotics has officially launched the second-generation version of its Penny restaurant robot. The autonomous robot, which shuttles food and dishes between the front and back of house, now features a versatile tray system for carrying more and different types of items.

With its new design, Penny has lost its bowling-pin shape and single carrying surface. Instead, Penny 2.0 is more cylindrical and can sport up to three tiers of carrying surface. Not only can Penny carry more, but a new swappable tray system means it can be configured to carry any combination of food, drinks, or bus tubs.

On the inside, Bear updated the smarts of Penny, giving the robot enhanced obstacle-avoidance technology, and while the company didn’t go into specifics, a tablet can now be attached to Penny for expanded customer interaction capabilities.

Penny 2.0 is being shown at the National Restaurant Association trade show this weekend and is available now. While Bear doesn’t disclose actual pricing, Penny is offered on a monthly subscription, which includes the robot, setup and mapping of the restaurant, and technical support.

Penny is among a wave of robots coming to restaurants in the near future: Flippy makes burgers and fries up chicken tenders, Dishcraft is still stealthily working on automating tasks in the kitchen, and there are entire establishments like Creator and Spyce built around robotic cooking systems.

Any discussion of automation always involves the loss of human jobs. John Ha, CEO of Bear Robotics, actually owned a restaurant and built Penny after noticing how hard servers work, often for little pay. By automating the expediting of food and bussing, Bear aims to free up humans to provide higher levels of customer service (ideally earning those humans higher tips).

Ha and Dishcraft CEO Linda Pouliot spoke at our recent ArticulATE food robotics conference about the challenges restaurants face and how robotics can help. You can watch their session in full right here.

ArticulATE 2019: Robots in Restaurants

May 17, 2019

LG Develops Its Own AI Chip to Make Appliances like Refrigerators Smarter

LG announced today that it has developed its own artificial intelligence (AI) chip that mirrors the neural networks of the human brain for improved processing of deep-learning algorithms. The end result: chips powering appliances like smart fridges that are actually… smart.

From LG’s press announcement:

The AI Chip incorporates visual intelligence to better recognize and distinguish space, location, objects and users while voice intelligence accurately recognizes voice and noise characteristics while product intelligence enhances the capabilities of the device by detecting physical and chemical changes in the environment. The chip also makes it possible to implement customized AI services by processing and learning from video and audio data in order to enhance recognition of the user’s emotions and behaviors and the situational context.

The chip can also do all this advanced processing without an internet connection, so it’s not reliant on the cloud for these newfound smarts, and features a security engine to protect a user’s personal data.

I’m no hardware engineer, but if it works as promised, it’s not hard to see the immediate applications in something like a smart fridge. Sure, cameras inside current smart fridges can already help you see what’s inside them, and even suggest recipes, but better AI could make all of this smarter and more automatic.

A smarter smart fridge would automatically recognize whatever food you put inside it, and be able to detect when it’s going bad by the composition of the food (visual cues, chemical changes to the atmosphere inside the fridge), not just an expiration date. From the description in the press release, the fridge could even detect your mood by the tenor of your voice or facial expression and make suggestions based on that. Having a bad day? Have some ice cream!

Now, your mileage may vary on whether or not you think these features are helpful or creepy, but the broader point is that what we think of as a “smart” device now will seem dumb in the not too distant future.

May 15, 2019

Halla Raises $1.4M Seed Round, Pivots to Focus on AI-Powered Grocery Recommendations

Halla, a startup that uses machine learning and AI to power food recommendations for grocery shoppers, announced today that it has raised a $1.4 million seed round led by E&A Venture Capital with participation from SOSV. This brings the total amount of money Halla has raised to $1.9 million.

Halla has a B2B platform dubbed Halla I/O that helps recommend relevant food products to shoppers. As we wrote at the time of Halla’s launch last year, the “company created an entirely new model and a new taxonomy that doesn’t just look at what a food item is, but also the molecules that make it up, a map of attributes linked to other food as well as how people interact with that food.”

So if you are using a grocer’s app with Halla I/O built in, the app will serve up intelligent recommendations as you shop online. Buy salt, and it could recommend pepper. Buy salt, noodles, and beef, and it might guess that you are making a bolognese and recommend tomato sauce.
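A basket co-occurrence model captures exactly this “salt plus noodles plus beef suggests tomato sauce” behavior. Halla’s actual model also draws on a food taxonomy and molecular attributes, so the sketch below, with invented transaction data, is only a rough analogy:

```python
# Simplified basket co-occurrence recommender. Halla's real system is richer
# (taxonomy, molecular attributes); this and its data are illustrative only.
from collections import Counter
from itertools import combinations

def build_cooccurrence(transactions):
    """Count how often each pair of items appears in the same basket."""
    co = {}
    for basket in transactions:
        for a, b in combinations(basket, 2):
            co.setdefault(a, Counter())[b] += 1
            co.setdefault(b, Counter())[a] += 1
    return co

def recommend(cart, co, top_n=1):
    """Score every co-purchased item against the current cart contents."""
    scores = Counter()
    for item in cart:
        scores.update(co.get(item, Counter()))
    for item in cart:  # never re-recommend something already in the cart
        scores.pop(item, None)
    return [item for item, _ in scores.most_common(top_n)]

transactions = [
    ["salt", "pepper"],
    ["salt", "pepper", "noodles"],
    ["salt", "noodles", "beef", "tomato sauce"],
    ["noodles", "beef", "tomato sauce"],
]
co = build_cooccurrence(transactions)
print(recommend(["salt", "noodles", "beef"], co))  # ['tomato sauce']
```

Note that a model like this needs no personally identifiable information, which matches Price’s “customer X” point below: it scores items against items, not against people.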

If you read our coverage of Halla last year, you’d notice something different about the company now. Initially, its go-to market strategy included both grocery stores and restaurants. But in the ensuing year, Halla has abandoned its pitch to restaurants, choosing instead to focus exclusively on grocery retail.

“What we’ve found is that the market timing was screaming ‘where tech meets grocery,'” Halla Co-Founder and CEO Spencer Price told me by phone recently. “The restaurant space is more crowded for building recommendations.”

But all that work in the restaurant space didn’t go to waste. “The truth is we were able to keep all of our learnings from restaurant and made our grocery recommendations stronger,” Price said. “One core learning is that restaurant dishes and menu items, as long as they have descriptions, are just recipes without instructions.”

Halla now has more than 100,000 grocery items and 100 million unique grocery transactions from retailers across the country in its data set, informing its machine learning algorithms. Price is quick to point out that Halla does not have any personally identifiable information on people. “We can make recommendations to customer X without knowing who customer X is,” Price said.

Though a grocery chain can move a lot of product and provide a lot of data for better purchasing recommendations, grocery chains as a whole do not move quickly. To get them to adopt a new technology is like turning a battleship — they need a lot of time to execute. “They’re not looking for speed,” Price said, “but a reliable solution.”

To this end, the biggest thing Halla’s funding buys the company is time. “We’ve bought some runway,” said Price. The company now has some breathing room to conduct even more tests with slow-moving retailers. Halla is in tests with unnamed grocers right now and offers its recommendations via an API on a pay-per-call model.

AI-based B2B food recommendation is almost its own mini-industry. Spoonshot, Analytical Flavor Systems, and Tastewise all use vast data sets to make product predictions and recommendations to restaurants and CPG companies. Other companies like AWM Smart Shelf are using a combination of prediction and smart digital signage to make in-store grocery purchase recommendations.

With online grocery shopping reaching a tipping point, intelligent recommendations that get app shoppers to add one or two more items to their carts could mean a nice sales boost for the grocery industry.

May 10, 2019

Report: Uber Exploring Nuro Partnership for Self-Driving Restaurant Food Delivery

Just in time for Uber’s IPO today comes word from The Information (paywalled) that the mobility company has been chatting with autonomous vehicle company Nuro about a food delivery partnership. If such a partnership were to come to pass, using Nuro’s self-driving pods could be a way for Uber to lower driver costs and improve Uber Eats’ margins.

Nuro makes electric, low-speed vehicles that are about half the size of a regular car and top out at 25 miles per hour. They are built for cargo and don’t even have a space for a driver. According to documentation seen by The Information, the partnership with Uber would start later this year in Houston. This makes sense, as Nuro is already operating there as part of its expanded pilot with Kroger to do self-driving grocery delivery.

Nuro’s partnership with Uber would be different from its program with Kroger. Instead of Nuro vehicles carrying food from restaurants directly to a consumer, they would instead take food from restaurants to a central hub. Once at this central hub, a human driver would take it the last mile, delivering it to doorsteps. The Information writes:

The hope is that the centralized hub for orders would allow drivers to handle more food orders than they currently do and potentially make more money because they won’t have to spend time going to each restaurant to pick up the food.

There aren’t many details, so this Uber/Nuro partnership could manifest in different ways, but two things immediately spring to mind. First, it seems like more wear and tear on the food: restaurants place meals in a Nuro that travels to the hub; the food is removed from the Nuro and sits on a shelf until a human picks it up and puts it in their car; the human then drives the food to the consumer.

Then there’s the temperature of the food because, as we know, soggy food sucks. I’m sure Nuro’s vehicles can be outfitted with thermal zones to keep hot food hot and cold food cold, but bundling together multiple entrees, sodas and desserts, especially when they need to be packed and re-packed, seems like a recipe for disaster.

The second thought that springs to mind is whether Uber would employ a hub system similar to Zume Pizza’s. Zume sets up mobile distribution points in various neighborhoods where delivery drivers come and pick up orders. Rather than leasing a brick-and-mortar location, Uber could set up a customized van or food truck that could be parked in different locations. Based on the data Uber has about what types of food people are buying, when and how often, Uber could change and optimize the location of these mobile hubs on any given night.

Then again, all this speculation is moot if the deal never comes to pass. Neither Uber nor Nuro would confirm the story with The Information. Uber is understandably a little preoccupied with its IPO today, and thanks to the $940M investment from SoftBank this year, Nuro has some runway to experiment with programs like this.

Regardless of whether this particular deal comes to pass, it’s nice to see companies continuing to experiment with and iterate on the food delivery process. Whether it’s low-speed vehicles, full-on self-driving sedans, rover robots, or even drones, the way we get our meals delivered is going to drastically change over the next five years.

May 9, 2019

Ocado Leads $9M Seed Round in Food Robotics Company, Karakuri

U.K.-based online grocer Ocado announced today that it has acquired a minority stake in London-based food robotics company Karakuri. Ocado’s investment led a $9.1 million seed round in Karakuri, which also included Hoxton Ventures, firstminute Capital and Taylor Brothers.

Karakuri makes two different food robots: the DK-One, a more industrial robot that can assemble (not cook) up to 48 ingredients into ready-to-go meals at mass scale in commercial kitchens; and the Marley, a smaller-scale machine meant for applications like candy stores and frozen yogurt dispensing and topping.

Ocado is no stranger to robots: the company uses them to power its smart, automated warehouses, where totes on rails bundle up grocery orders for delivery. With the minority stake in Karakuri, Ocado appears to be setting itself up to expand this robot-powered automation into other forms of food delivery. From Ocado’s press announcement:

The [DK-One] can be used in the assembly of all boxed meals, using a configurable, modular design which can easily be installed in-store or in “dark kitchens”, and can aggregate up to 48 food items to create a wide range of food-to-go options.

Dark kitchens (restaurants that are delivery only) in particular are an interesting avenue for Ocado/Karakuri. Not only could a dark kitchen automate order assembly quickly, but the restaurant could then subscribe to Ocado’s logistics and delivery service to manage and optimize getting those orders to customers. This would mean more revenue for Ocado and also more data, giving the company insight into what, when and where people are ordering different restaurant meals.

Ocado also said it would tie Karakuri’s robots into its existing grocery service, which makes me wonder whether they will be used for something akin to customized meal kits, or even prepared food that customers could shop for as part of their daily or weekly shopping.

As we saw at our ArticulATE conference last month, automation is invading almost every part of the food stack. Here in the U.S., companies like Takeoff, Alert Innovation and Common Sense Robotics are creating robot-powered micro-fulfillment centers for grocery stores to speed up online order processing. Kroger, which is an investor in Ocado, is building out Ocado-powered smart fulfillment centers in the U.S. to speed up its own grocery fulfillment and delivery. Will that now include Karakuri robots?

Ocado said that it would take delivery of its first Karakuri robot in the second half of this year. For its part, Karakuri said it will use the new money to further develop its technology, “strengthen its IP base,” and expand its team.

May 9, 2019

Video: “Food is Very, Very Dumb Compared to Robots.” But Automation Can Still Help It Taste Better

“Food is very, very dumb compared to robots.” Ali Bouzari made his case onstage at our ArticulATE food robotics conference last month. “It’s incredibly complex, it’s cumbersome, unyielding, and frustratingly ornate in a way that does not lend itself to innovation.”

So do we just give up and assume there are no new ways to make food? Not exactly. Bouzari, who founded culinary consultancy Pilot R&D and packaged food company Render, actually thinks that automation can help give us better eating experiences… within reason.

It’s pretty obvious how robots can help deliver food or run orders in a restaurant. They never get sick, they don’t need to take breaks, and they won’t drop a dish or mess up your order. But when it comes to making food taste better, even automation has to work within the confines of food science. As Bouzari put it: “Starch is gonna starch.” And that makes food robotics unique.

That doesn’t mean there isn’t plenty of room for robots and automation to shake up the food we’re eating — it just means we have to be really strategic about how we do it. Because in the end, “food is going to call the shots.”

Watch the video below to see Bouzari’s full talk on how robots can work within the limitations of food science, and check back for more of our full sessions from ArticulATE 2019.

How to Make Good Experiences (and Food) with Robots
