The Spoon

Daily news and analysis about the food tech revolution


ArticulATE 2019

June 3, 2019

Video: Sony’s Masahiro Fujita on Bringing AI and Robotics to Food

We were thrilled when Masahiro Fujita, Chief Research Engineer of the AI Collaboration Office of Sony agreed to be a speaker at our ArticulATE food robotics conference in April. He started Sony’s Robot Entertainment project in 1993 and led the development of the famous AIBO robotic dog.

During his ArticulATE presentation, Fujita talked about how, historically, Sony has provided technology throughout the entertainment stack. Its studio, Sony Pictures, finances movies; filmmakers use Sony products to shoot and create them; and consumers can watch them on their Sony Blu-ray players or through the PlayStation Network.

Fujita said that Sony views the food world in much the same way. It wants to provide the underlying robotic and AI technology that can help creative types like chefs make their food, as well as the mechanisms for people at home to enjoy high-level cooking.

Because the technology is still so nascent and not ready for prime time with consumers, Sony is looking first at B2B applications. The company really wants its AI and robots to be able to make meals in a three-Michelin-star restaurant. Those of us waiting for robotic help at dinnertime at home are going to have to wait a while.

Check out Fujita’s full presentation (and a glimpse into the robo-chef future) from ArticulATE below:

ArticulATE 2019: Where Is It All Going? A Look Forward with Sony's Masahiro Fujita

May 3, 2019

Video: Cafe X and Byte Technologies on Data Rabbit Holes and the One Thing Your Data Team Must Track

The beauty of running an automated or robot-powered business like the coffee-slinging Cafe X or Byte Technologies’ smart fridges is that you generate a lot of data: what people are purchasing, when they purchase it, where, how often, etc. All this data powers demand algorithms that help companies like Cafe X and Byte be more efficient by accurately determining business-critical decisions such as which products to offer and how much inventory to carry.

The downside of running an automated or robot-powered business like Cafe X or Byte Technologies is that you generate a lot of data. Too much data, actually. You could easily spend most of your time diving down a data rabbit hole, trying to fine-tune demand algorithms and actually wind up making them less useful.
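To make the idea of a demand algorithm concrete, here is a minimal sketch of the kind of calculation a transaction log makes possible. This is our illustration, not Cafe X’s or Byte’s actual system; the data shape and smoothing factor are assumptions.

```python
# A minimal sketch of the demand signal an automated storefront could compute
# from its transaction log. Illustrative only -- not Cafe X's or Byte's system;
# the data shape and smoothing factor are assumptions.
from collections import defaultdict

def forecast_daily_demand(transactions, alpha=0.3):
    """Exponentially smooth daily unit sales per item to guide restocking.

    transactions: iterable of (day_index, item_name, quantity) tuples.
    Returns {item_name: smoothed daily demand}.
    """
    daily = defaultdict(lambda: defaultdict(int))
    for day, item, qty in transactions:
        daily[item][day] += qty

    forecast = {}
    for item, by_day in daily.items():
        level = None
        for day in sorted(by_day):            # walk the days in order
            sales = by_day[day]
            level = sales if level is None else alpha * sales + (1 - alpha) * level
        forecast[item] = level
    return forecast

# Example: stock roughly one day of smoothed demand per item, plus a buffer.
log = [(0, "cold brew", 42), (1, "cold brew", 35), (2, "cold brew", 51)]
print({k: round(v, 1) for k, v in forecast_daily_demand(log).items()})
```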

So how do you determine what data to capture and what data to pay attention to?

Glad you asked! Because the role of data was central to my discussion with Cafe X COO Cynthia Yeung and Byte Technologies Founder Lee Mokri during the Forecasting, Personalization & Customer Service Challenges for Robotic Retail panel at our ArticulATE food robotics conference last month.

It was fascinating to sit down and learn from Yeung and Mokri about how their respective companies tackle the data issue, and how they use that data to maximize product variety within the limited square footage of their automated storefronts.

Plus, for all you entrepreneurs out there, Yeung explains the one thing you absolutely need to do when building out your data team and plan.

Watch the full video below, and check back for more of our full sessions from ArticulATE 2019.

ArticulATE 2019: Forecasting, Personalization & Customer Service Challenges for Robotic Retail

April 28, 2019

Video: Creator’s Robo-Burger Joint Seems Like a Pretty Cool Place for a Human to Work

Out of all the burger restaurants you could work for, Creator in San Francisco seems like it would be a good choice. First, the burgers there are made by a robot, so duh, awesome. But while a robot may be the center of attention, it’s actually the humans that Creator CEO Alex Vardakostas seems to care the most about.

Vardakostas was on stage at our recent ArticulATE conference, where he explained that Creator’s mission is to create the most human-centric dining experience. Ironically, a giant robot helps them do just that. First, diners can enjoy a nice meal with friends at a fair price ($6 for a burger) in a well-appointed restaurant (with, admittedly, severely restricted open hours).

By using a robot to cook the burgers, Vardakostas also hopes to free up the creativity of his human employees. With the monotonous “burger flipping” done by the robot, employees can provide better customer service and spend time in more creative ways.

But Vardakostas and Co. take it one step further. Creator gives all of its employees 5% time. This paid time is for employees to learn about whatever they want. Some take the opportunity to take online classes, learn the console code used on the robot, or study flavors in the hopes of opening their own restaurant one day. Vardakostas doesn’t want Creator to be a dead-end job where you do the same thing for ten years; he understands that he’s equipping his employees to move on to other, better opportunities. Hopefully it’s something other restaurants implementing robots will learn from.

ArticulATE 2019: Building a Better Burger Joint with Creator CEO Alex Vardakostas

Be on the lookout for even more food robot-related videos from our ArticulATE conference here on The Spoon!

*An earlier version of this post said Creator staff were freed up to do social media campaigns. Creator reached out to say that restaurant staff do not engage in social media campaigns.

April 25, 2019

Video: Albertsons Brings Labor into Automation Decisions on “Day One”

A recurring theme during our recent ArticulATE food robot and automation conference was the issue of human labor. What do you do when machines displace people in the workforce? It’s a tough question employers throughout the food world are grappling with.

On the one hand, automation is coming, and we need to figure out what combination of government and the private sector will be responsible for helping retrain people put out of work by robots. On the other hand, food-related companies like Albertsons (and restaurants) are having a tough time finding people to work for them.

Trung Nguyen, VP of eCommerce for Albertsons, told the audience at ArticulATE that the grocer is having a hard time filling jobs, especially truck-driving jobs. Turns out people would rather set their own hours and work unsupervised driving for Uber than drive a delivery truck.

Even with a labor crunch, Albertsons is very methodical in its approach to automation. That’s because Albertsons is unionized, and, as Nguyen explained, before any kind of automation is implemented, the company brings in union reps and lawyers on “day one.” That way, Nguyen said, there is clear communication about what the company is doing.

Check out the full video from the conference to hear more about how Albertsons grapples with the labor issue, why scale is important to a chain like Albertsons, and how it, too, has started piloting autonomous vehicle delivery tests in the Bay Area.

ArticulATE 2019, Rethinking Grocery in the Age of Automation

Be sure to check back all through the week for more videos of the full sessions from ArticulATE!

April 24, 2019

Video: Google Brain Director on Creating Robots for a Messy World (Like Kitchens!)

Robots are really good at moving around. They can roam the aisles of grocery stores easily or manipulate an articulating arm to pour you a cup of coffee. But what robots aren’t good at is understanding the world around them.

“The human world is by definition messy and complicated,” Vincent Vanhoucke, Principal Scientist and Director of Perception and Robotics for the Google Brain Team, told the crowd during his presentation at our recent ArticulATE food robot conference in San Francisco.

Vanhoucke’s team is working on taking the things robots do well — moving around — and marrying that with advancements in computer vision and deep learning to make robots more useful in the messy and complicated real world. And it turns out that food in particular, with its different textures and properties, is quite messy and complicated.

The Google team does this by training robots to pick up objects of various sizes and shapes using deep learning. Through recognition, repetition and reinforcement, the robots develop their own strategies and behaviors for solving problems (like failing to pick up an object) without a human programming specific solutions.

The applications for this can be seen in something like a feeding robot for the disabled. Rather than having a “dumb” arm that only scoops food from a predefined area in a bowl and lifts that food to a predetermined height, a deep-learning-enabled robot can identify different food on a plate no matter where it is, pick it up and lift it directly to a person’s mouth.
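For a flavor of the “repetition and reinforcement” loop described above, here is a heavily simplified sketch. It is our illustration, not Google’s code: a bandit-style learner over a few discrete grasp angles stands in for the deep-learned grasping policies, but the trial-and-error principle is the same.

```python
# Toy illustration of learning to grasp by trial and error. Our simplification,
# not Google's system: a simple epsilon-greedy learner over discrete grasp
# angles; the simulated success model is invented.
import random

ANGLES = [0, 30, 60, 90]            # candidate grasp orientations (degrees)

def attempt_grasp(angle):
    """Simulated world: grasps near 60 degrees succeed most often."""
    p_success = max(0.05, 1.0 - abs(angle - 60) / 90)
    return random.random() < p_success

counts = {a: 0 for a in ANGLES}      # how often each angle was tried
successes = {a: 0 for a in ANGLES}   # how often it worked

def best_angle():
    return max(ANGLES, key=lambda a: successes[a] / counts[a] if counts[a] else 0)

for trial in range(2000):
    angle = random.choice(ANGLES) if random.random() < 0.1 else best_angle()
    counts[angle] += 1
    successes[angle] += attempt_grasp(angle)   # True counts as 1

print("learned preferred grasp angle:", best_angle())
```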

It’s really fascinating and cutting-edge science, and you should definitely watch the video of his full presentation.

ArticulATE 2019: Using Robotics in Messy Environments (Like Kitchens!)

Be on the lookout for more videos from ArticulATE 2019, coming throughout the week.

April 22, 2019

Video: Do Food Robot Startup Founders Need Restaurant Experience to Be Successful?

Say you’re a VC looking to invest in a company that makes strawberry-picking robots. There are three or four companies in the space, all vying for your capital to get off the ground.

How do you choose where to put your dollars?

That’s one of the questions that we tackled last week at ArticulATE, our inaugural food and automation summit. We closed out the day with a lively panel on the opportunities — and challenges — in the food robotics investment landscape. Our speakers were VCs from the foodtech, hard tech and IoT spaces: Brian Frank of FTW Ventures, Brita Rosenheim of Better Food Ventures, Rajat Bhageria of Prototype Capital, and Avidan Ross of Root VC.

The Spoon’s Michael Wolf moderated the conversation on what these investors are looking for in a food automation startup pitch specifically — and where they see significant opportunities in the fast-growing market. (Yes, both Frank and Ross have invested in strawberry-picking robots — and Ross claims he made the right choice.)

It’s certainly an exciting time in the food robotics space: there are tons of entrepreneurs out there with lofty plans to build the next robotic sushi restaurant, the next automated food delivery bot, or the next burrito-rolling robot arm (which, apparently, is really hard to do).

However, the panelists seemed to agree that food robotics is a trickier investment space than a lot of other tech areas. Sure, the basic building blocks of food robotics — AI, articulating arms, etc. — are pretty democratized. But Ross (who — fun fact — spent a former life building robots for the Food Network) said that “robotics feels special and different.” He pointed out that the food system is incredibly complex and that a whole host of players have to be involved to deliver even the most basic meal to the consumer. And that’s just logistics: getting a robotic system to reach parity with a basic human fast food experience in terms of taste or customer experience is really tricky.

Because there are so many complexities at play, food robotics can require more capital than some other tech investments. It can also take longer to bring food automation technology to market. Which isn’t a problem — unless, as Rosenheim pointed out, you’re working with investors who are looking for “the next shiny thing” and aren’t patient enough to be in it for the long haul.

Investors in food robotics have to be especially willing to take risks and play the long game. However, not all the VCs saw eye-to-eye on what it takes for a food automation startup to be successful. The panelists disagreed on whether or not startups need deep restaurant market knowledge to be successful, how high the capital investment has to be in food automation, and what sets one seemingly identical food robotics startup apart from another.

Check out the video below to see the whole conversation — it was a really fun one.

ArticulATE 2019: Investment Opportunities in Food Robotics

Look out for more ArticulATE 2019 videos rolling out on our YouTube channel over the next week! 

April 16, 2019

Here’s The Spoon’s 2019 Food Robotics Market Map

Today we head to San Francisco for The Spoon’s first-ever food-robotics event. ArticulATE kicks off at 9:05 a.m. sharp at the General Assembly venue in SF, and throughout the daylong event the talk will be about all things robots, from the technology itself to the business and regulatory issues surrounding it.

When you stop and look around the food industry, whether it’s new restaurants embracing automation or companies changing the way we get our groceries, it’s easy to see why the food robotics market is projected to be a $3.1 billion market by 2025.

But there’s no one way to make a robot, and so to give you a sense of who’s who in this space, and to celebrate the start of ArticulATE, The Spoon’s editors put together this market map of the food robotics landscape.

This is the first edition of the map, which we’ll improve and build upon as the market changes and grows. If you have suggestions for other companies you think should be in there, or see ones we missed, let us know by leaving a comment below or emailing us at tips@thespoon.tech.

The Food Robotics Market 2019 (market map)

March 18, 2019

ArticulATE Q&A: Why Starship’s Delivery Robots are as Wide as Your Shoulders

We are a little less than a month away from our food robotics summit, ArticulATE, happening on April 16 in San Francisco. The excitement around Spoon HQ is palpable because we have locked in a fantastic lineup of speakers, including today’s Q&A guest Ryan Tuohy, the Senior VP of Business Development at Starship.

Starship’s autonomous rover bots deliver snacks, groceries and packages to corporate and college campuses and even out to the general public in Milton Keynes in the UK. In advance of Tuohy’s talk on-stage at ArticulATE, we wanted to set the stage and learn a little more about, well, what Starship has learned from making all these deliveries (like how big to make its ‘bot!).

If you want to know how robots are impacting the present and setting up the future of food delivery, get your tickets to ArticulATE today; they are going quickly!

THE SPOON: Starship has been running deliveries for a while now on Intuit’s corporate campus, in Milton Keynes in the UK, and most recently you started at George Mason University. What have you learned about how people use and interact with robot delivery?

TUOHY: One of the things we’re proud of is how happy customers are when the robot arrives with their order – every interaction is overwhelmingly one of delight and we’ve received thank you notes and special drawings of robots from every location where we operate. At GMU specifically, the bots have become famous across social media, with many students posting pictures/videos of their deliveries.

For students at George Mason University, the bots provide a low-cost and convenient service that lets them spend time doing what they want rather than waiting in line for food or skipping a meal because they’re too busy. GMU students can have food delivered anywhere on campus and use that extra time to hang out with friends, study or take a break.

In some locations, like Milton Keynes, some of our most popular customers are parents who struggle to leave the house because of their children. It’s a lot easier to get a robot delivery than try and get two young kids in the car, find parking, walk round the grocery store and then drive back. It’s a lot more environmentally friendly as well.

The vast majority of people notice the robot on the sidewalk and simply pass by it. Initially, some people take a selfie or a photo of the robot. This effect diminishes over time as people in the particular location become familiar with seeing the robot in the area.

As you talk with campuses like George Mason, what are the concerns they have about robot delivery and how do you alleviate them?

At George Mason University, some people have asked whether the robots can handle rough weather conditions such as rain and snow; that was one of the reasons we brought the robots to GMU. The robots have safely traveled over 150,000 miles around the world and completed thousands of deliveries regardless of weather conditions.

Additionally, everyone’s first test when they see the robot on the sidewalk is to check whether it will stop in time when [they’re] blocking its path. But each robot travels at 4 mph and has a ‘situational awareness bubble’ around it, with a range of sensors that can detect obstacles like dogs, pedestrians and bikers, and it is able to either maneuver around them or stop at a safe distance.
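As a back-of-the-envelope illustration of the stop-or-steer decision such a bubble implies (our sketch, not Starship’s software; the braking and clearance figures are assumptions):

```python
# Sketch of the stop-or-steer decision a "situational awareness bubble"
# implies. Illustrative only -- not Starship's code; the deceleration figure,
# bubble radius, and clearance margin are assumptions.
MAX_SPEED_MPS = 1.8        # ~4 mph in meters per second
DECEL_MPS2 = 1.5           # assumed comfortable braking deceleration
BUBBLE_RADIUS_M = 3.0      # assumed sensing/planning radius

def stopping_distance(speed_mps, decel=DECEL_MPS2):
    """Distance needed to brake to a halt from the current speed."""
    return speed_mps ** 2 / (2 * decel)

def react_to_obstacle(distance_m, lateral_clearance_m, speed_mps=MAX_SPEED_MPS):
    """Decide what to do about an obstacle detected ahead."""
    if distance_m > BUBBLE_RADIUS_M:
        return "continue"                              # nothing in the bubble yet
    if lateral_clearance_m > 0.6:                      # assumed room to pass
        return "steer around"
    if distance_m > stopping_distance(speed_mps) + 0.5:
        return "slow and stop at a safe distance"
    return "stop now"

print(react_to_obstacle(distance_m=2.0, lateral_clearance_m=0.2))
```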

Is there a sweet spot when designing a rover delivery robot, in terms of size and speed, and have you hit that, or is it more of a moving, evolving target?

Robotic delivery is affordable, convenient and environmentally friendly. Starship’s robots are intelligent and designed to seamlessly co-exist with humans in the community. They are purposefully about as wide as a person’s shoulders and travel at walking pace (4 mph max) on the sidewalk.

Starship has designed and built our robots with a vast amount of advanced proprietary technology, including a combination of computer vision, sensor fusion and machine learning for seamless navigational and situational awareness. The company’s proprietary mapping process enables the robots to understand their exact location to the nearest inch.

What do you think are the biggest misconceptions about robot delivery?

One of the biggest misconceptions surrounding autonomous delivery is around security. Many people believe that robots will be stolen, but the reality is very different. In tens of thousands of autonomous deliveries, we’ve not had any robots stolen. The robot has many theft-prevention measures to stop this from happening, including 10 cameras, sirens (like a car alarm), tracking to the nearest inch, and a lid that is of course locked at all times. It would be a big effort to steal a robot and get it home without being caught, only to find some milk and eggs in the basket!

What is your favorite fictional robot?

I enjoy the classic R2D2 robots from Star Wars. Who doesn’t enjoy a cute robot that can help you on all your expeditions? We are already at the point where robots can deliver packages and food to your doorstep. It’s amazing to see technology we once imagined has now become a reality.

March 13, 2019

ArticulATE Q&A: Miso’s CEO on How Flippy the Robot Will Move From Frying to Chopping

Ahh Flippy. It was the first food robot I ever wrote about, way back in…2018. Back then, it could only grill burgers. Now, a year later, it can fry tater tots and chicken tenders, and will reportedly soon get a job in a deli.

They grow up so fast. Soon Flippy will want the keys to not drive the autonomous car.

We’re going to get a full report on what Flippy is — and will be — up to when Dave Zito, CEO of Miso Robotics, sits down for an on-stage chat at our upcoming ArticulATE conference on April 16 in San Francisco. We were so excited to have him be a part of the show that we couldn’t wait and sent him some questions via email, which he was kind enough to answer.

This is but an autonomous amuse bouche — get your ticket today to see Zito and a host of truly amazing speakers at ArticulATE!

THE SPOON: Flippy started off grilling burgers and then moved on to frying up chicken tenders. What particular jobs in the kitchen are Flippy, and robots in general, really good at?

ZITO: We started Miso Robotics with the idea of giving eyes and a brain to a robotic arm so it could work in commercial kitchens with real-time situational awareness and real-time robotic controls. We designed and started building the system from Day One as a software platform that could automate the cooking of all manner of foods and recipes, with all equipment and restaurant brands, and all kitchen formats.

Our autonomous robotic Kitchen Assistants are focused on helping with the most repetitive, dangerous, and least desirable tasks in the kitchen. Flippy grilling burgers was our proof of concept. Flippy can now fry many different kinds of foods as well. These tasks can be improved and optimized for consistency, ensuring each meal is cooked to the perfect temperature with minimal food waste. Beyond frying, grilling, and other cooking, expect them over time to help with tasks like chopping onions, cutting other vegetables, and even cleaning.

The Kitchen Assistant improves and learns over time based on the data available. Ultimately, this frees up kitchen staff to spend more time with customers. We believe the future of food is on-demand, accessible, personalized, and scalable. We are building the technology platform leveraging automation, machine learning, and robotics advancements to deliver on this future.

What did you learn from Flippy’s time at Dodger Stadium?

Flippy’s deployment at Dodger Stadium emphasized how much one kitchen assistant can impact productivity and efficiency in a high-volume commercial kitchen. Dodger Stadium was the first time we deployed our frying capabilities, and we matched max productivity while producing consistently fried foods to the chef’s expectations. Cooking for extended periods of peak demand during baseball games was a key proof point for the reliability and sustained high throughput of our Kitchen Assistants.

But don’t just take our word for it; here is what our partner Levy had to say about the experience:

“The robotic kitchen assistant helps us more quickly and safely cook perfectly crispy chicken tenders and tater tots,” said Robin Rosenberg, Vice President and Chef de Cuisine for Levy. “It’s amazing to see the kitchen assistant and team members working together, and the consistency of product is incredible.”

“New technologies at large scale venues and events like this need to add value for both guests and team members,” said Jaime Faulkner, CEO of E15, Levy’s analytics subsidiary. “Working with Miso, we were able to create a process that both delivers high quality food more quickly, and gives kitchen team members a chance to hone sought-after skills working with robotics and automation.”

We are looking forward to resuming frying with Levy this baseball season.

What is the biggest misconception about food robots in the kitchens?

The biggest misconception about the use of technology in the kitchen is that it’s about job replacement. There is a growing labor crisis in the restaurant industry. Local workforces are shrinking, and wages are increasing, making commercial cooking uneconomical. Meanwhile, consumers have an increased desire for meals cooked for them, whether via delivery, take-out, dining out, or grocery deli meals, adding pressure on kitchen workers.

Restaurants already see 150% turnover today from a dissatisfied workforce. Pair this with an aging workforce that can’t handle some of the physical demands that come with the job, and commercial kitchens are struggling to recruit and retain talent. Intelligent automation not only creates an avenue for meaningful work for the next generation through the creation of new jobs like a Chef Tech (employees trained to manage the robot), but also takes the physical burden off of more mature employees who want to continue to contribute later in life.

The tasks that Miso’s technology can perform are some of the most dangerous tasks in the kitchen, not to mention messy and menial, ultimately improving the employee experience by freeing up time for them to focus on more meaningful work, like warm customer service that a robot simply can’t match.

What should restaurant owners know about food robots before implementing them?

Expect improvements across several aspects of their business — better food, better customer service, better inventory and cost management. While a signature recipe can make a restaurant a success, it can be hard to reliably reproduce at scale for every customer; robots like Flippy can deliver consistency in flavor to help keep customers loyal. Furthermore, the value proposition of implementing robotics in the kitchen spans from productivity and cost savings to one of the most pressing issues in our world today: sustainability. Food waste is a huge contributor to the climate crisis, with $160 billion of food wasted a year. This technology has the potential to significantly reduce that number. Restaurants can take a real step in the right direction on food waste and ensure they are maximizing inventory as they grow.

What is your favorite fictional robot?

As a kid I loved Johnny 5 from the film Short Circuit. I loved the idea that technology built for one purpose, in this case the military, once embedded with artificial intelligence shifted to more compassionate pursuits. In that way we are inspired at Miso to take industrial robotic arms, add our intelligence, and in so doing improve them for a broader and more impactful service — helping liberate commercial kitchens from repetitive tasks and mediocre menus, while empowering chefs to make delicious and nutritious meals accessible for all.

March 13, 2019

ArticulATE Summit Agenda Announced: Sony, Google, AutoX, Starship, Albertsons and More to Speak

We are excited to officially announce the full speaker (and robot!) lineup and agenda for our upcoming ArticulATE Food Robotics Summit. The one-day conference will be held on April 16 at General Assembly in downtown San Francisco, and it promises to be a great event packed with amazing talks about the future of food robots and automation.

A seriously stellar roster of speakers from up and down the robotic food stack will be there, including:

  • Linda Pouliot, CEO – Dishcraft
  • Narayan Iyengar, SVP, eCommerce & Digital – Albertsons
  • Jewel Li, COO – AutoX
  • Randall Wilkinson, CEO – Wilkinson Baking Company (BreadBot)
  • Vincent Vanhoucke, Principal Scientist, Google Brain team – Google
  • Masahiro Fujita, Chief Research Engineer, AI Collaboration Office of Sony – Sony
  • Ali Ahmed, CEO – RoboMart
  • Cynthia Yeung, COO – Cafe X
  • Ali Bouzari, Chief Scientist – Pilot R&D
  • Dave Zito, CEO – Miso Robotics
  • Deepak Sekar, CEO – Chowbotics
  • John Ha, CEO – Bear Robotics
  • Ryan Tuohy, Senior VP, Business Development – Starship
  • Alex Vardakostas, CEO – Creator
  • Brian Frank, General Partner – FTW Ventures
  • Mara Behrens, Chowbotics
  • Brita Rosenheim, Partner – Better Food Ventures
  • Chas Studor, Founder/CTO – Briggo
  • Lee Mokri, Founder – Byte
  • Shawn Lange, President – Lab2Fab (A Middleby Corporation)
  • Avidan Ross, Root VC
  • Matt Rolandson, Ammunition Group

But it’s not just humans! We couldn’t have a robotics conference without some robots, so we are equally pleased to announce that Kiwi, Penny, and Sally will all be in attendance as well!

We’ve been putting a ton of work into this show to make sure that it delivers an exciting day that gives every attendee better insight into the business, technological and societal implications of our impending move towards robots and automation in food tech.

As such, we have hand crafted a solid agenda that brings out the expertise we’ve gathered on-stage to create a full day of insight and engaging conversation. Talks will include:

Grocery: Rethinking Grocery in the Era of Robotics

Society-Robot Relationship: How to Navigate Skepticism and Fear around Automation in Food

Restaurants: The Front and Back of House

Food Production: Hyperlocal Food Factories

Robots on Roads: Robotics For the Last Mile

24 Hour Storefronts: Automation At Point of Sales

Overcoming The Big Technical Challenges With Food Robotics

Novelty, Niche or Next Big Thing? How To Make Good Experiences (and Food) With Robotics

Collaborate, Coexist, Cobot: Building Towards Integrated Robot, Human Work Environments

Investment Opportunities in Food Robotics

Check out the full agenda.

We hope to see you there! Tickets are limited so get yours today!

March 8, 2019

ArticulATE Q&A: Briggo’s Coffee Haus has Served 470,000 Cups of Robot-Made Drinks

Look in any coffee shop in any city in America (or perhaps even the world) in the morning and chances are you’ll see people lined up to get their first cup of joe. Pretty soon, however, those people could be lined up in front of a robot that automatically and tirelessly slings custom-made lattes 24 hours a day, seven days a week.

Austin-based Briggo is one such company making always-open robo-baristas a reality. The company’s Coffee Haus is an automated high-tech vending machine designed to serve up morning (and afternoon, and evening, and middle of the night) fixes in high-volume areas like corporate campuses and airports.

But Briggo is more than a robotics company: it’s actually a full-stack coffee company, which also sources and roasts its own beans. This farm-to-cup approach to automated coffee was just one of the reasons we wanted to invite Chas Studor, Founder and CTO of Briggo, to speak at our upcoming ArticulATE food robot conference in San Francisco on April 16th (get your tickets today!).

But before he sits down at our show, Studor sat down for an emailed Q&A to tell us a little bit more about Briggo, the whole coffee process and why flight attendants love the Coffee Haus.

THE SPOON: What makes robots a good fit for coffee?

STUDOR: Precision and consistency are essential throughout the coffee supply chain. Some of the best coffees are already processed in highly automated dry mills at origin, and we employ sophisticated automation at the roaster. So why not create automated precision at the final step, when the coffee is made?

You also create your own coffee. Why did you decide to take this full-stack approach?

For the best results, the ingredients and process must be designed in concert. Working with our importer and roaster partner, I spent months developing the Briggo Blend simultaneously with complementary extraction parameters. I could vary the roast level and proportion of the various origins and I would modify dosing, pre-infusion levels, temperatures and several other variables to maximize the intent of each flavor component as revealed through cuppings.

I started the company with the intent of applying emerging technology in a way that could create opportunities from farm to cup. By employing a “direct sourcing” approach, we understand how each step in the supply chain adds value, and then we apply our technology to get the most out of those beans.

In the end, we use technology to reduce waste and inconsistencies in the process, which in turn supports our partners at origin and delivers an amazing customer experience.
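Studor’s tuning process amounts to a search over extraction parameters scored by cuppings. A toy version of that loop, with invented parameter ranges and a stand-in scoring function, might look like this:

```python
# Toy illustration of the tuning loop Studor describes: sweep a few extraction
# parameters and keep the combination that cups best. Our simplification, not
# Briggo's process; the parameter ranges and scoring function are invented.
from itertools import product

DOSES_G = [17, 18, 19]          # coffee dose in grams
PREINFUSION_S = [2, 4, 6]       # pre-infusion time in seconds
TEMPS_C = [91, 93, 95]          # brew water temperature

def cupping_score(dose, preinfusion, temp):
    """Stand-in for a human cupping panel; peaks at an assumed sweet spot."""
    return -((dose - 18) ** 2 + (preinfusion - 4) ** 2 + (temp - 93) ** 2)

best = max(product(DOSES_G, PREINFUSION_S, TEMPS_C),
           key=lambda params: cupping_score(*params))
print("best (dose g, pre-infusion s, temp C):", best)
```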

You recently launched at the Austin airport. What makes airports a good fit for Briggo?

There are few other locations where speed, quality, and on-demand service are at such a premium. Being a completely automated solution, we can serve customers long after other shops close and long before they open. As you can imagine, some of our greatest advocates are the TSA agents, flight crews, and road warriors who have early schedules and limited time for coffee lines. Next time you visit the Austin airport (or SFO later this spring), try using our Briggo app to order as you pass through security, swing past the Coffee Haus to pick up your beverage, and be on your way in minutes. At other venues, quality and convenience are often a trade-off; not so much at Briggo!

What kind of stats can you share about your customers? What are they buying, how often, does location impact what people purchase?

  • ~90k customers & 470k drinks served to date
  • 40% of our registered customers have bought 10 or more drinks and earned their first free loyalty beverage, taking just over 3 months to get there
  • 46% of our corporate customers have tried 3 or more unique menu items
  • 5 of our most loyal customers have purchased more than 1,000 drinks all time
  • 54% of all drinks are customized with either added flavors, dairy types, shots, sweeteners, strength, or even drink temperature

What is your favorite robot from fiction?

Rosie from The Jetsons. I always liked her sarcasm, and we often get the “Jetsons delivered” comment.

February 24, 2019

ArticulATE Q&A: Google Brain’s Vanhoucke on Robots, AI and Programming vs. Learning

The hardware of a robot is only as good as the software brain that powers it. This is why we are so excited to have Vincent Vanhoucke, Principal Scientist with the Google Brain team, speak at our upcoming ArticulATE conference.

In addition to being the director of Google’s robotics research efforts, Vanhoucke has spent his career researching artificial intelligence and machine learning. Before we sit down with him at the show, we wanted to give you a little taste of what he’ll be talking about with a brief Q&A that we conducted over email.

If you want to see Vanhoucke in person and be a part of the discussion on the future of food robotics and automation, get your ticket to ArticulATE today!

What is Googley about robots and automation?

There is something new and exciting happening in the world of robotics: thanks to the advances in deep learning of the last few years, we now have vision systems that work amazingly well. It means that robots can see, understand, and interact with the complicated, often messy and forever changing human world. This opens up the possibility of a robot helper that understands its environment and physically assists you in your daily life. We are asking ourselves: what could happen if the devices you interact with every day could carry out physical tasks, moving and picking things up — if they could ask ‘How can I help you’ and do it directly?

What are some broad applications that AI and machine learning are really good at right now, and where is the biggest room for improvement?

Perception at large has made enormous progress: visual understanding, localization, sensing, speech and audio recognition. Much of the remaining challenge is to turn this progress into something actionable: connecting perception and action means understanding the impact of what a robot does on the world around it, and how that relates to what it sees. When a robot operates in a human environment, safety is paramount, but also understanding social preferences, as well as people’s goals and desires. I believe that enabling robots to learn, as opposed to being programmed, is how we can establish that tight degree of human connection.

Do you need massive amounts of data for training AI, or do you just need the right data?

Improving data efficiency has been a major focus in recent years. In the early days of deep learning, we explored what was possible when lots of data was available. Today, we’re probing whether we can do the same thing with a lot less data. In most cases, the answer is yes. In robotics, we’ve been able to leverage lots of simulated data for instance. We’re also finding new ways to improve systems on the fly, as they operate, by leveraging data-efficient techniques such as self-supervision and meta-learning.

Computer vision + AI is shaping up to be a versatile and powerful combination (spotting diseases on crops, assessing food quality, automating store checkout). Where is this technology headed and what are some untapped uses for it?

If you look at the perception systems that have evolved in animals and humans, they’re largely driven by function: for instance, our own visual system is very good at sensing motion, and at focusing its attention on the entities that we interact with in a scene. Computer vision systems don’t work like that today, because we haven’t closed the loop between sensing and acting. I think that one of the grand challenges of computer vision is how to optimize visual representations for the tasks we care about. Robotics will enable us to close this functional loop, and I expect that we will see the technology improve dramatically as a result.

What is your favorite fictional robot?

Japanese giant robots were a fixture of TV shows when I was a kid in ’80s France. Of the lot, I’m going to have to go with UFO Robot Grendizer out of sheer nostalgia. Today, I find inspiration watching Simone Giertz’s terrific robot contraptions on YouTube.
