The Spoon

Daily news and analysis about the food tech revolution


ArticulATE 2019

May 3, 2019

Video: Cafe X and Byte Technologies on Data Rabbit Holes and the One Thing Your Data Team Must Track

The beauty of running an automated or robot-powered business like the coffee-slinging Cafe X or Byte Technologies' smart fridges is that you generate a lot of data: what people are purchasing, when they purchase it, where, how often, and so on. All this data powers demand algorithms that help companies like Cafe X and Byte operate more efficiently by informing business-critical decisions such as which products to offer and how much inventory to carry.
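
To make that concrete, here is a minimal, purely illustrative sketch of how purchase logs might be rolled up into a simple stocking estimate. The items, quantities, and the 20% safety buffer below are assumptions for the example, not how Cafe X or Byte actually does it:

```python
from collections import defaultdict
from datetime import date
from statistics import mean

# Hypothetical purchase log: (day, item, quantity sold). Illustrative only.
transactions = [
    (date(2019, 4, 1), "turkey sandwich", 12),
    (date(2019, 4, 1), "cold brew", 20),
    (date(2019, 4, 2), "turkey sandwich", 9),
    (date(2019, 4, 2), "cold brew", 25),
    (date(2019, 4, 3), "turkey sandwich", 15),
    (date(2019, 4, 3), "cold brew", 18),
]

# Roll raw purchases up into per-item daily totals.
daily_demand = defaultdict(lambda: defaultdict(int))
for day, item, qty in transactions:
    daily_demand[item][day] += qty

# Naive stocking rule: average daily demand plus a 20% safety buffer.
for item, by_day in daily_demand.items():
    avg = mean(by_day.values())
    print(f"{item}: avg {avg:.1f}/day -> stock {round(avg * 1.2)}")
```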

The downside of running an automated or robot-powered business like Cafe X or Byte Technologies is that you generate a lot of data. Too much data, actually. You could easily spend most of your time diving down a data rabbit hole, trying to fine-tune demand algorithms and actually wind up making them less useful.

So how do you determine what data to capture and what data to pay attention to?

Glad you asked! Because the role of data was central to my discussion with Cafe X COO Cynthia Yeung and Byte Technologies Founder Lee Mokri during the Forecasting, Personalization & Customer Service Challenges for Robotic Retail panel at our ArticulATE food robotics conference last month.

It was fascinating to sit down and learn from Yeung and Mokri about how their respective companies tackle the data issue, and how they use that data to maximize product variety within the limited square footage of their automated storefronts.

Plus, for all you entrepreneurs out there, Yeung explains the one thing you absolutely need to do when building out your data team and plan.

Watch the full video below, and check back for more of our full sessions from ArticulATE 2019.

ArticulATE 2019: Forecasting, Personalization & Customer Service Challenges for Robotic Retail

April 25, 2019

Video: Albertsons Brings Labor into Automation Decisions on “Day One”

A recurring theme during our recent ArticulATE food robot and automation conference was the issue of human labor. What do you do when machines displace people in the workforce? It’s a tough question employers throughout the food world are grappling with.

On the one hand, automation is coming, and we need to figure out what combination of government and the private sector will be responsible for retraining people put out of work by robots. On the other hand, food-related companies like Albertsons (and restaurants) are having a tough time finding people to work for them.

Trung Nguyen, VP of eCommerce for Albertsons, told the audience at ArticulATE that the grocer is having a hard time filling jobs, especially truck driving jobs. It turns out people would rather set their own hours and work unsupervised driving for Uber than drive a delivery truck.

Even with a labor crunch, Albertsons is very methodical in its approach to automation. That’s because Albertsons is unionized, and, as Nguyen explained, before any kind of automation is implemented, the company brings in union reps and lawyers on “day one.” That way, Nguyen said, there is clear communication about what the company is doing.

Check out the full video from the conference to hear more about how Albertsons grapples with the labor issue, why scale is important to a chain like Albertsons, and how it, too, has started piloting autonomous vehicle delivery tests in the Bay Area.

ArticulATE 2019, Rethinking Grocery in the Age of Automation

Be sure to check back all through the week for more videos of the full sessions from ArticulATE!

March 18, 2019

ArticulATE Q&A: Why Starship’s Delivery Robots are as Wide as Your Shoulders

We are a little less than a month away from our food robotics summit, ArticulATE, happening on April 16 in San Francisco. The excitement around Spoon HQ is palpable because we have locked in a fantastic lineup of speakers, including today’s Q&A guest Ryan Tuohy, the Senior VP of Business Development at Starship.

Starship’s autonomous rover bots deliver snacks, groceries and packages to corporate and college campuses and even out to the general public in Milton Keynes in the UK. In advance of Tuohy’s talk on-stage at ArticulATE, we wanted to set the stage and learn a little more about, well, what Starship has learned from making all these deliveries (like how big to make its ‘bot!).

If you want to know how robots are impacting the present and setting up the future of food delivery, get your tickets to ArticulATE today; they are going quickly!

THE SPOON: Starship has been running deliveries for a while now on Intuit’s corporate campus, in Milton Keynes in the UK, and most recently you started at George Mason University. What have you learned about how people use and interact with robot delivery?

TUOHY: One of the things we’re proud of is how happy customers are when the robot arrives with their order – every interaction is overwhelmingly one of delight and we’ve received thank you notes and special drawings of robots from every location where we operate. At GMU specifically, the bots have become famous across social media, with many students posting pictures/videos of their deliveries.

For students at George Mason University, the bots provide a low-cost and convenient service that lets them spend time doing what they want rather than waiting in line for food or skipping a meal because they’re too busy. GMU students can have food delivered anywhere on campus and use that extra time to hang out with friends, study or take a break.

In some locations, like Milton Keynes, some of our most popular customers are parents who struggle to leave the house because of their children. It’s a lot easier to get a robot delivery than try and get two young kids in the car, find parking, walk round the grocery store and then drive back. It’s a lot more environmentally friendly as well.

The vast majority of people notice the robot on the sidewalk and simply pass by it. Initially, some people take a selfie or a photo of the robot. This effect diminishes over time as people in the particular location become familiar with seeing the robot in the area.

As you talk with campuses like George Mason, what are the concerns they have about robot delivery and how do you alleviate them?

At George Mason University, some people have asked whether the robots can handle rough weather conditions such as rain and snow. That is one of the very reasons we brought the robots to GMU. The robots have safely traveled over 150,000 miles around the world and completed thousands of deliveries regardless of weather conditions.

Additionally, everyone’s first test when they see the robot on the sidewalk is to check whether the robot will stop in time when [they’re] blocking its path. But each robot travels at 4 mph and has a ‘situational awareness bubble’ around it, with a range of sensors that can detect obstacles like dogs, pedestrians and bikers, and it is able to either maneuver around them or stop at a safe distance.
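
To give a rough, non-authoritative sense of why that low speed matters, here is a toy sketch of the stop-or-swerve decision a slow rover can afford to make. This is not Starship's actual control logic, and the braking and margin numbers are assumptions:

```python
# Toy obstacle-response sketch. All numbers are assumed for illustration.
MAX_SPEED_MPS = 1.8        # roughly 4 mph in meters per second
DECELERATION_MPS2 = 2.5    # assumed braking deceleration
SAFETY_MARGIN_M = 0.5      # stop at least this far from the obstacle

def stopping_distance(speed_mps: float) -> float:
    """Distance needed to brake to a halt from the given speed."""
    return speed_mps ** 2 / (2 * DECELERATION_MPS2)

def react(obstacle_distance_m: float, speed_mps: float = MAX_SPEED_MPS) -> str:
    """Decide whether to continue, steer around, or stop."""
    needed = stopping_distance(speed_mps) + SAFETY_MARGIN_M
    if obstacle_distance_m > 3 * needed:
        return "continue"
    if obstacle_distance_m > needed:
        return "maneuver around"
    return "stop"

for d in (5.0, 2.0, 0.8):
    print(f"obstacle at {d} m -> {react(d)}")
```

At walking pace the stopping distance works out to well under a meter, which is why a pedestrian stepping in front of the robot is an easy case rather than an edge case.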

Is there a sweet spot when designing a rover delivery robot, in terms of size and speed, and have you hit that, or is it more of a moving, evolving target?

Robotic delivery is affordable, convenient and environmentally friendly. Starship’s robots are intelligent and designed to seamlessly co-exist with humans in the community. They are purposefully about as wide as a person’s shoulders and travel at walking pace (4 mph max) on the sidewalk.

Starship has designed and built our robots with a vast amount of advanced proprietary technology, including a combination of computer vision, sensor fusion and machine learning for seamless navigational and situational awareness. The company’s proprietary mapping process enables the robots to understand their exact location to the nearest inch.

What do you think are the biggest misconceptions about robot delivery?

One of the biggest misconceptions surrounding autonomous delivery is around security. Many people believe that robots will be stolen, but the reality is very different. In tens of thousands of autonomous deliveries, we’ve not had a single robot stolen. The robots have many theft-prevention measures, including 10 cameras, sirens (like a car alarm), location tracking to the nearest inch and, of course, a lid that is locked at all times. It would be a big effort to steal a robot and get it home without being caught, only to find some milk and eggs in the basket!

What is your favorite fictional robot?

I enjoy the classic R2-D2 from Star Wars. Who doesn’t enjoy a cute robot that can help you on all your expeditions? We are already at the point where robots can deliver packages and food to your doorstep. It’s amazing to see that technology we once imagined has now become a reality.

March 13, 2019

ArticulATE Q&A: Miso’s CEO on How Flippy the Robot Will Move From Frying to Chopping

Ahh Flippy. It was the first food robot I ever wrote about, way back in…2018. Back then, it could only grill burgers. Now, a year later, it can fry tater tots and chicken tenders, and will reportedly soon get a job in a deli.

They grow up so fast. Soon Flippy will want the keys to the autonomous car (not that it will need to drive).

We’re going to get a full report on what Flippy is — and will be — up to when Dave Zito, CEO of Miso Robotics, sits down for an on-stage chat at our upcoming ArticulATE conference on April 16 in San Francisco. We were so excited to have him be a part of the show that we couldn’t wait and sent him some questions via email, which he was kind enough to answer.

This is but an autonomous amuse bouche — get your ticket today to see Zito and a host of truly amazing speakers at ArticulATE!

THE SPOON: Flippy started off grilling burgers and then moved on to frying up chicken tenders. What particular jobs in the kitchen are Flippy, and robots in general, really good at?

ZITO: We started Miso Robotics with the idea of giving eyes and a brain to a robotic arm so it could work in commercial kitchens with real-time situational awareness and real-time robotic controls. We designed and started building the system from Day One as a software platform that could automate the cooking of all manner of foods and recipes, with all equipment and restaurant brands, and all kitchen formats.

Our autonomous robotic Kitchen Assistants are focused on helping with the most repetitive, dangerous, and least desirable tasks in the kitchen. Flippy grilling burgers was our proof of concept. Flippy can now fry many different kinds of foods as well. These tasks can be improved and optimized for consistency, ensuring each meal is cooked to the perfect temperature with minimal food waste. Beyond frying, grilling, and other cooking, expect them over time to help with tasks like chopping onions, cutting other vegetables, and even cleaning.

The Kitchen Assistant improves and learns over time based on the data available. Ultimately, this frees up kitchen staff to spend more time with customers. We believe the future of food is on-demand, accessible, personalized, and scalable. We are building the technology platform leveraging automation, machine learning, and robotics advancements to deliver on this future.

What did you learn from Flippy’s time at Dodger Stadium?

Flippy’s deployment at Dodger Stadium emphasized how much one kitchen assistant can impact productivity and efficiency in a high-volume commercial kitchen. Dodger Stadium was the first time we deployed our frying capabilities, and we matched max productivity while producing consistently fried foods to the chef’s expectations. Cooking for extended periods of peak demand during baseball games was a key proof point for the reliability and sustained high throughput of our Kitchen Assistants.

But don’t just take our word for it; here is what our partner Levy had to say about the experience:

“The robotic kitchen assistant helps us more quickly and safely cook perfectly crispy chicken tenders and tater tots,” said Robin Rosenberg, Vice President and Chef de Cuisine for Levy. “It’s amazing to see the kitchen assistant and team members working together, and the consistency of product is incredible.”

“New technologies at large scale venues and events like this need to add value for both guests and team members,” said Jaime Faulkner, CEO of E15, Levy’s analytics subsidiary. “Working with Miso, we were able to create a process that both delivers high quality food more quickly, and gives kitchen team members a chance to hone sought-after skills working with robotics and automation.”

We are looking forward to resuming frying with Levy this baseball season.

What is the biggest misconception about food robots in the kitchens?

The biggest misconception about the use of technology in the kitchen is that it’s about job replacement. There is a growing labor crisis in the restaurant industry. Local workforces are shrinking, and wages are increasing, making commercial cooking uneconomical. Meanwhile, consumers have an increased desire for meals cooked for them, whether via delivery, take-out, dining out, or grocery deli meals, adding pressure on kitchen workers.

Restaurants already see 150% turnover today from a dissatisfied workforce. Pair this with an aging workforce that can’t handle some of the physical demands that come with the job, and commercial kitchens are struggling to recruit and retain talent. Intelligent automation not only creates an avenue for meaningful work for the next generation through the creation of new jobs like a Chef Tech (employees trained to manage the robot), but also takes the physical burden off of more mature employees who want to continue to contribute later in life.

The tasks that Miso’s technology can perform are some of the most dangerous in the kitchen, not to mention messy and menial. Taking them on ultimately improves the employee experience by freeing up time for staff to focus on more meaningful work, like the warm customer service that a robot simply can’t match.

What should restaurant owners know about food robots before implementing them?

Expect improvements across several aspects of the business: better food, better customer service, better inventory and cost management. A signature recipe can make a restaurant a success, but it can be hard to reproduce reliably at scale for every customer; robots like Flippy can deliver that consistency in flavor and help keep customers loyal. Furthermore, the value proposition of implementing robotics in the kitchen spans productivity and cost savings as well as one of the most pressing issues in our world today: sustainability. Food waste is a huge contributor to the climate crisis, with some $160 billion of food wasted each year. This technology has the potential to significantly reduce that number; restaurants can take a real step toward cutting food waste and ensure they are maximizing inventory as they begin to grow.

What is your favorite fictional robot?

As a kid I loved Johnny 5 from the film Short Circuit. I loved the idea that technology built for one purpose, in this case the military, once embedded with artificial intelligence shifted to more compassionate pursuits. In that way we are inspired at Miso to take industrial robotic arms, add our intelligence, and in so doing improve them for a broader and more impactful service — helping liberate commercial kitchens from repetitive tasks and mediocre menus, while empowering chefs to make delicious and nutritious meals accessible for all.

March 13, 2019

ArticulATE Summit Agenda Announced: Sony, Google, AutoX, Starship, Albertsons and More to Speak

We are excited to officially announce the full speaker (and robot!) lineup and agenda for our upcoming ArticulATE Food Robotics Summit. The one-day conference will be held on April 16 at General Assembly in downtown San Francisco, and it promises to be a great event packed with amazing talks about the future of food robots and automation.

A seriously stellar roster of speakers from up and down the robotic food stack will be there, including:

  • Linda Pouliot, CEO – Dishcraft
  • Narayan Iyengar, SVP, eCommerce & Digital – Albertsons
  • Jewel Li, COO – AutoX
  • Randall Wilkinson, CEO – Wilkinson Baking Company (BreadBot)
  • Vincent Vanhoucke, Principal Scientist, Google Brain team – Google
  • Masahiro Fujita, Chief Research Engineer, AI Collaboration Office of Sony – Sony
  • Ali Ahmed, CEO – RoboMart
  • Cynthia Yeung, COO – Cafe X
  • Ali Bouzari, Chief Scientist – Pilot R&D
  • Dave Zito, CEO – Miso Robotics
  • Deepak Sekar, CEO – Chowbotics
  • John Ha, CEO – Bear Robotics
  • Ryan Tuohy, Senior VP, Business Development – Starship
  • Alex Vardakostas, CEO – Creator
  • Brian Frank, General Partner – FTW Ventures
  • Mara Behrens, Chowbotics
  • Brita Rosenheim, Partner – Better Food Ventures
  • Chas Studor, Founder/CTO – Briggo
  • Lee Mokri, Founder – Byte
  • Shawn Lange, President – Lab2Fab (A Middleby Corporation)
  • Avidan Ross, Root VC
  • Matt Rolandson, Ammunition Group

But it’s not just humans! We couldn’t have a robotics conference without some robots, so we are equally pleased to announce that Kiwi, Penny, and Sally will all be in attendance as well!

We’ve been putting a ton of work into this show to make sure that it delivers an exciting day that gives every attendee better insight into the business, technological and societal implications of our impending move towards robots and automation in food tech.

As such, we have hand-crafted a solid agenda that brings out the expertise we’ve gathered on-stage to create a full day of insight and engaging conversation. Talks will include:

  • Grocery: Rethinking Grocery in the Era of Robotics
  • Society-Robot Relationship: How to Navigate Skepticism and Fear around Automation in Food
  • Restaurants: The Front and Back of House
  • Food Production: Hyperlocal Food Factories
  • Robots on Roads: Robotics For the Last Mile
  • 24 Hour Storefronts: Automation At Point of Sales
  • Overcoming The Big Technical Challenges With Food Robotics
  • Novelty, Niche or Next Big Thing? How To Make Good Experiences (and Food) With Robotics
  • Collaborate, Coexist, Cobot: Building Towards Integrated Robot, Human Work Environments
  • Investment Opportunities in Food Robotics

Check out the full agenda.

We hope to see you there! Tickets are limited, so get yours today!

February 24, 2019

ArticulATE Q&A: Google Brain’s Vanhoucke on Robots, AI and Programming vs. Learning

The hardware of a robot is only as good as the software brain that powers it. This is why we are so excited to have Vincent Vanhoucke, Principal Scientist with the Google Brain team, speak at our upcoming ArticulATE conference.

In addition to being the director of Google’s robotics research efforts, Vanhoucke has spent his career researching artificial intelligence and machine learning. Before we sit down with him at the show, we wanted to give you a little taste of what he’ll be talking about with a brief Q&A that we conducted over email.

If you want to see Vanhoucke in person and be a part of the discussion on the future of food robotics and automation, get your ticket to ArticulATE today!

What is Googley about robots and automation?

There is something new and exciting happening in the world of robotics: thanks to the advances in deep learning of the last few years, we now have vision systems that work amazingly well. It means that robots can see, understand, and interact with the complicated, often messy and forever changing human world. This opens up the possibility of a robot helper that understands its environment and physically assists you in your daily life. We are asking ourselves: what could happen if the devices you interact with every day could carry out physical tasks, moving and picking things up — if they could ask ‘How can I help you’ and do it directly?

What are some broad applications that AI and machine learning are really good at right now, and where is the biggest room for improvement?

Perception at large has made enormous progress: visual understanding, localization, sensing, speech and audio recognition. Much of the remaining challenge is to turn this progress into something actionable: connecting perception and action means understanding the impact of what a robot does on the world around it, and how that relates to what it sees. When a robot operates in a human environment, safety is paramount, but so is understanding social preferences and people’s goals and desires. I believe that enabling robots to learn, as opposed to being programmed, is how we can establish that tight degree of human connection.

Do you need massive amounts of data for training AI, or do you just need the right data?

Improving data efficiency has been a major focus in recent years. In the early days of deep learning, we explored what was possible when lots of data was available. Today, we’re probing whether we can do the same thing with a lot less data. In most cases, the answer is yes. In robotics, we’ve been able to leverage lots of simulated data for instance. We’re also finding new ways to improve systems on the fly, as they operate, by leveraging data-efficient techniques such as self-supervision and meta-learning.

Computer vision + AI is shaping up to be a versatile and powerful combination (spotting diseases on crops, assessing food quality, automating store checkout). Where is this technology headed and what are some untapped uses for it?

If you look at the perception systems that have evolved in animals and humans, they’re largely driven by function: for instance, our own visual system is very good at sensing motion, and at focusing its attention on the entities that we interact with in a scene. Computer vision systems don’t work like that today, because we haven’t closed the loop between sensing and acting. I think that one of the grand challenges of computer vision is how to optimize visual representations for the tasks we care about. Robotics will enable us to close this functional loop, and I expect that we will see the technology improve dramatically as a result.

What is your favorite fictional robot?

Japanese giant robots were a fixture of TV shows when I was a kid in ’80s France. Of the lot, I’m going to have to go with UFO Robot Grendizer out of sheer nostalgia. Today, I find inspiration watching Simone Giertz’s terrific robot contraptions on YouTube.
