The Spoon

Daily news and analysis about the food tech revolution

Google Lens Could Make the Restaurant Experience Super Convenient — or Super Predictable

by Jennifer Marston
May 8, 2019 | Filed under:
  • Business of Food
  • Delivery & Commerce
  • Interfaces
  • Personalized Food
  • Restaurant Tech

Yesterday at its I/O conference, Google announced new features for Google Lens, its image-recognition app for Android, and they’re all about improving the restaurant experience for customers.

Google Lens uses machine learning, computer vision, and a whole lot of data to interact with the world around you and answer your questions. For example, when Lens launched in 2017, it was touted as a way to instantly translate another language.

According to a blog post from Google published yesterday, the new features for Google Lens “provide more visual answers to visual questions.”

Like what’s popular on a restaurant menu right now. Within the Google Lens app, when a user snaps a photo of the menu and taps on an item, Lens automatically pulls up relevant information, like a photo, an item description, and reviews — data Google already has thanks to its Google Maps-Yelp integration.

Lens is basically automating something most of us have done at least once while out at a restaurant: seeing another table’s food arrive and, intrigued, asking them what it is. But asking Drew at the next table what she ordered only offers so much information. After all, Drew hasn’t tasted the food yet, and her preferences could be the complete opposite of yours.

What Lens appears to be doing with this new feature is taking most of that guesswork out of the ordering process by not only matching a photo with its name and description, but also aggregating reviews, so a user can get a clearer sense of how the dish tastes. If nine people say the dish was super bland, those who prefer a little more kick to their meals might order something else.
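To make that flow concrete, here is a minimal, purely hypothetical sketch in Python. It is not Google’s implementation or API; it simply assumes you already have the OCR’d text of the tapped menu item plus a per-restaurant list of dishes with photos, descriptions, and reviews (the kind of data a Maps/Yelp-style integration could supply), and shows the two steps described above: matching the tapped text to a known dish, then aggregating that dish’s details and reviews for the diner.

# Conceptual sketch only — not Google Lens's actual code or API.
from dataclasses import dataclass, field
from difflib import SequenceMatcher

@dataclass
class Dish:
    name: str
    description: str
    photo_url: str
    reviews: list[str] = field(default_factory=list)

def match_menu_item(tapped_text: str, dishes: list[Dish]) -> Dish | None:
    """Fuzzy-match the text of the tapped menu item to a known dish."""
    best, best_score = None, 0.0
    for dish in dishes:
        score = SequenceMatcher(None, tapped_text.lower(), dish.name.lower()).ratio()
        if score > best_score:
            best, best_score = dish, score
    return best if best_score > 0.6 else None  # arbitrary similarity threshold

def summarize(dish: Dish) -> str:
    """Assemble what a diner would see: photo, description, and review info."""
    lines = [f"{dish.name}: {dish.description}", f"Photo: {dish.photo_url}"]
    if dish.reviews:
        lines.append(f'{len(dish.reviews)} reviews, e.g. "{dish.reviews[0]}"')
    return "\n".join(lines)

# Example usage with made-up data:
menu_db = [
    Dish("Filet Mignon", "8 oz, with herb butter", "https://example.com/filet.jpg",
         ["Perfectly cooked", "A little bland for my taste"]),
]
hit = match_menu_item("filet mignon", menu_db)
if hit:
    print(summarize(hit))

The interesting design question is the review-aggregation step: with enough reviews per dish, the app can surface a consensus (“super bland,” “perfectly cooked”) rather than a single stranger’s opinion, which is exactly the guesswork-reduction described above.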

It all sounds wonderfully convenient for us consumers. For example, it could be a valuable tool when you’re trying out a new type of cuisine and have no idea where to start.

What I wonder, though, is how this will affect menu planning for restaurants. On the one hand, it could provide valuable information for restaurants when it comes to figuring out what is and isn’t selling on the menu, so chefs and operators could better adjust their planning and inventory (potentially helping them avoid food waste and keep costs down).

But what will this do to the adventure of going out to eat? Part of the fun of the restaurant experience is the guesswork, which would be gone were we to rely too heavily on data-driven recommendations. That seems unlikely at higher-end restaurants and places designed for adventurous foodies with robust appetites for the unknown. For all the places in between, though, too much knowledge might make the restaurant experience just a little too predictable.




Comments

  1. Jonathan Trenn says

    May 10, 2019 at 10:02 am

    Jennifer, I’m confused here…

    I work both in digital marketing and in the restaurant industry. After I read this article, I pointed Google Lens to the words “Filet Mignon” on a restaurant menu and what popped up was not photos of that restaurant’s filet mignon or reviews of the dish. Instead, the response was that of a typical Google search. In fact, I think the Wikipedia entry was the top response.

    Do you know if (and if yes, how and when) this service plans to match the results to actual photos and reviews of that specific restaurant’s dishes? Where will they be pulling the info from…Google Maps and Yelp? Anywhere else? I think this could be a great tool for restaurants. Thanks!!
