The Spoon

Daily news and analysis about the food tech revolution


Voice Technology

October 1, 2019

How Sevenrooms Is Making Voice Tech the Centerpiece of Restaurant Operations

If you are a restaurant in 2019, one of your most valuable assets is your customer data: what they order, how much they spend, whether or not they hate parsley. There are numerous tech platforms nowadays to help restaurants access this mountain of information, but historically that’s meant handling a tablet or mobile device along with all the other items restaurant staff juggle.

Guest-management platform Sevenrooms wants to change that by making it possible to access vital customer information using your voice.

The NYC-based company’s software platform already lets restaurants track customer data points in real time and access that information quickly to provide guests with more personalized service. Now the company is doubling down on voice tech, which it believes will be the key tool for collecting and inputting customer data into the restaurant systems of the future.

The company, which has raised $21.5 million to date, received an investment of an undisclosed sum from the Amazon Alexa Fund in late 2018 and has been working on an Alexa skill ever since to help restaurants access customer data faster, more seamlessly, and without having to use their hands.

“That’s a thing that would have originally required a GM to be looking down at a tablet or some form of screen,” Allison Page, Sevenrooms’ cofounder and Chief Product Officer, says over the phone of getting customer data. “And Alexa’s going to make it so much easier to get [that information] hands free in the middle of service so they don’t have to interrupt that hospitality they’re providing.”

So long as a guest’s information is stored in the restaurant’s system (via, for example, a loyalty program), Alexa can access it with a simple voice command. For example, a GM could ask Alexa who is sitting at Table 5 and be told it’s a local customer who has spent a total of $5,000 at the restaurant over time and is celebrating an anniversary that night. The GM could then send over a gift card, dessert, or some other token of appreciation that would both personalize the guest’s experience that night and, hopefully, keep them coming back.
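Sevenrooms hasn’t published how its skill works under the hood, but the lookup described above boils down to resolving a table number to a stored guest profile and turning it into a spoken response. A minimal sketch, with all names and data shapes invented for illustration:

```python
# Hypothetical sketch of the "who's sitting at Table N" flow described
# above. Sevenrooms' actual data model isn't public; the store and
# field names here are invented.

# Toy guest-profile store keyed by table number for tonight's seating.
SEATING = {
    5: {
        "name": "Alex Rivera",
        "lifetime_spend": 5000,
        "visit_notes": ["local regular", "celebrating anniversary tonight"],
    },
}

def handle_table_lookup(table_number: int) -> str:
    """Build the spoken response for an Alexa-style table-lookup intent."""
    guest = SEATING.get(table_number)
    if guest is None:
        return f"I don't have a guest profile for table {table_number}."
    notes = ", ".join(guest["visit_notes"])
    return (
        f"Table {table_number} is {guest['name']}, who has spent "
        f"${guest['lifetime_spend']:,} here to date. Notes: {notes}."
    )

print(handle_table_lookup(5))
```

The real skill would sit behind Alexa’s intent-resolution layer rather than a direct function call, but the shape of the answer (name, lifetime spend, occasion notes) matches what Page describes.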

In certain settings, adding a voice layer to a system might seem superfluous. But restaurants are inherently chaotic places where multitasking reigns supreme and staff quite literally have their hands full most of the time, often with trays of food that could easily spill onto and damage a touchscreen device. Going hands-free with voice-enabled technology is potentially a far more seamless way of integrating guest management into a restaurant’s operations. Page says the skill can also tell a user things like how much revenue a restaurant has booked that night and how that number compares to previous nights, whether a guest has dietary restrictions, and even whether they’ve written any recent reviews of the restaurant.

The system also works the other way around. If a server or GM learns, for example, that a guest just moved to the neighborhood, they can tell Alexa to input that data into the guest’s profile to store as information for future visits.
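Again purely as an illustration (the article doesn’t show Sevenrooms’ actual profile schema), the write-back direction amounts to appending a free-text note to a guest’s record so it surfaces on future visits:

```python
# Hypothetical sketch of the voice write-back flow described above: a
# server tells the assistant a fact about a guest, and it lands on the
# guest's profile. All names are invented.

def add_guest_note(profiles: dict, guest_id: str, note: str) -> dict:
    """Append a note to a guest profile, creating the profile if new."""
    profile = profiles.setdefault(guest_id, {"visit_notes": []})
    profile["visit_notes"].append(note)
    return profile

profiles = {}
add_guest_note(profiles, "guest-42", "just moved to the neighborhood")
print(profiles["guest-42"]["visit_notes"])
```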

All of this can be done without the user ever having to log into the Sevenrooms system, and that’s at the heart of Sevenrooms’ Alexa integration: bringing tech into the restaurant without letting it take over, à la tablet hell.

Page demonstrated this at the 2019 NRN show by donning a pair of Alexa-enabled glasses and showing the audience how she could ask the skill questions about a restaurant guest and have the information appear right on the lens.

Whether it’s glasses, watches, or some other wearable device that’s the future of voice tech remains to be determined. While voice tech in the restaurant has gotten a lot of press lately thanks to McDonald’s acquisition of Apprente, it’s still early days for the technology’s place in restaurants, and there are still challenges to work through. For example, Page says one of the current hurdles for Sevenrooms is getting Alexa to properly understand voice commands and questions in the middle of a noisy dining room. The company is working with Amazon to solve this issue.

There’s also the question of whether restaurants will sign up for yet another piece of tech, and one they can’t even put their hands on. Page doesn’t seem terribly concerned about this, however. As she sees it, the benefits of “not having to take your eyes off the dining room and not having to take your eyes off the guest” will prove valuable enough to the customer to justify making voice tech a central part of a restaurant’s guest management system.

March 12, 2018

Hearst Unveils Visual Guided Recipe Skill for Amazon Echo

Alexa, let’s have Pancetta Chicken for dinner.

Last month, publisher Hearst expanded its Amazon Echo- and Spot-enabled Good Housekeeping skill to include connected recipes. Dubbed Good Housekeeping Test Kitchen, the skill provides simple “meal ideas” that can be thrown together in 30 minutes or less. The recipes will be curated by Susan Westmoreland, food director of Good Housekeeping, and, in addition to being speedy, are said to be easy to execute.

Previously, the Good Housekeeping skill only included step-by-step advice to remove stains. (Don’t worry—it can still help you get out that wine spill from your carpet.)

With the Good Housekeeping skill, users can select a recipe based on a photo and short description (or tell Alexa to do it for them). The smart display then provides a step-by-step guide through the recipe. Users can swipe around to see more recipes, skip ahead in the steps, and reference the ingredients. They can also use voice commands like “Alexa, tell Good Housekeeping to continue” if they want to move forward in the recipe but don’t want to touch the screen with, say, raw chicken hands.
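Hearst hasn’t published the skill’s implementation, but the hands-free navigation described above reduces to a cursor moving through an ordered list of recipe steps in response to voice commands. A minimal sketch, with all names invented:

```python
# Minimal sketch of the hands-free step navigation described above.
# The real Good Housekeeping skill's code isn't public; this just
# models a cursor over an ordered list of recipe steps.

class GuidedRecipe:
    def __init__(self, steps):
        self.steps = list(steps)
        self.index = 0  # step currently shown on the display

    def current(self) -> str:
        return self.steps[self.index]

    def handle_command(self, command: str) -> str:
        """Map a voice command ('continue', 'go back') to a step change."""
        if command == "continue" and self.index < len(self.steps) - 1:
            self.index += 1
        elif command == "go back" and self.index > 0:
            self.index -= 1
        return self.current()

recipe = GuidedRecipe([
    "Dice the pancetta.",
    "Brown the chicken.",
    "Simmer for 20 minutes.",
])
print(recipe.handle_command("continue"))  # advances to the second step
```

In the real skill, the same state change would also drive what the Echo’s screen renders, so the visual and spoken guidance stay in sync.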

Hearst’s expansion into recipes isn’t exactly surprising. At the end of 2016, the media company took a big leap into the realms of AI and AR by establishing the Native and Emerging Technologies (NET) group, which focuses heavily on voice-activated experiences for virtual assistants and smartphones.

This new skill speaks (literally) to the growing role of voice assistants in the connected household, and in the kitchen in particular. “We’re raising the stakes from what a user can expect [in terms of] information and utility from these devices,” Chris Papaleo, executive director of emerging technology at Hearst, told AdWeek. It’s something we’ve predicted but haven’t yet seen developed in as big a way as we’d thought.

Photo: AdWeek

It also brings us one step closer to the integration of recipes (and other food media) and AI-enabled voice technology.

We’ve seen a voice-enabled smart kitchen assistant before with Freshub, which lets users add items to their shopping carts using voice commands. Then, at last year’s Smart Kitchen Summit, Emma Persky, who runs point on the Google Assistant’s guided cooking team, talked about Google’s work combining recipe content with its voice-enabled AI platform by offering video aids for recipe steps (say, sautéing an onion). And Amazon’s 2016 partnership with AllRecipes allowed users to access voice-guided cooking instructions from the site’s 60,000-strong recipe database.

But by combining recipes on a visual display with voice-enabled controls, albeit simplistic ones like telling it to move to the next step, this new skill from Good Housekeeping marks the first time virtual assistants have really entered the hands-free recipe zone in a synced-up visual and auditory way. While the Google Assistant can show you a video of how to sauté an onion if you’re stuck, it doesn’t have a connected visual element that takes you through each step of the recipe; it relies almost entirely on voice guidance. That’s nice since you don’t have to add another piece of equipment to your virtual assistant lineup, but it’s less helpful when you’re wondering how small the recipe wants you to dice your pancetta.

With this new skill, Hearst is betting that more voice assistants will expand into smart displays, creating a corresponding need for more visual content in the space. As voice assistants grow in number and popularity and become a more commonplace part of consumers’ homes, I imagine we’ll see a lot more skills aimed at facilitating home cooking, from expanded shoppable recipe applications to visual cooking aids.

As of now, the Test Kitchen skill doesn’t have a sponsor. But with so many large companies trying to carve out a space in the trending food tech world, it seems only a matter of time before a big-name recipe site or even a CPG brand (a category that has been trying to get into food tech any way it can) snags the title.

The success (or lack thereof) of this skill could indicate where we are in that process.

 

© 2016–2025 The Spoon. All rights reserved.
