
The Spoon

Daily news and analysis about the food tech revolution


Voice interfaces

September 11, 2020

“Alexa, Look Into My Eyes”: New Prototype Combines Human Gaze with Voice Control to Help You Cook

There’s no doubt that voice control interfaces like Alexa and Google Assistant have had a huge impact on the way we search for recipes, control our appliances and add things to our grocery lists.

But what if that voice assistant had a contextual understanding of where you were looking while you cooked the evening meal?

That’s the big idea behind a new prototype from Synapse, a division of Cambridge Consultants. The new Hobgoblin technology concept utilizes machine vision to gather information about where a person is looking when issuing a voice command and applies that information to the cooking experience.

From the project page:

We have been exploring the use of computer-vision based sensing as context, and for this cooktop demonstration we augmented the VUI using gaze tracking to make what feels like a magical interaction. The cooktop infers which burner is being addressed in a voice command by using its camera tracking to know which burner you’re looking at. This way, when the system detects a person standing in front of it looking at a burner, commands can omit the burner designation, e.g. “turn that burner on,” or simply saying a level like “medium high.”

In the concept video, a user is cooking and says “Alexa, turn up the heat.” Using a camera that is built into the cooktop, Alexa is able to infer that the user is cooking because they are looking at the cooktop.
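The core of the interaction can be sketched as a simple fusion step: when the spoken command omits a burner, fall back to the burner the gaze tracker currently reports. A minimal Python sketch, with function and burner names that are illustrative assumptions rather than anything from the Hobgoblin system:

```python
def resolve_burner(command, gaze_target):
    """Pick which burner a voice command refers to.

    command: parsed utterance, e.g. {"action": "on", "burner": None}
    gaze_target: burner ID reported by the gaze tracker ("front_left",
                 etc.), or None if the user isn't looking at a burner.
    """
    # An explicit burner named in the utterance always wins.
    if command.get("burner"):
        return command["burner"]
    # Otherwise fall back to where the user is looking, so a command
    # like "turn that burner on" resolves without naming a burner.
    return gaze_target
```

With this, `resolve_burner({"action": "on", "burner": None}, "front_left")` returns `"front_left"`, matching the “turn that burner on” example from the project page.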

There’s no doubt that combining contextual understanding of what is happening in a room with commands given to digital assistants like Alexa could enable much more powerful “smart kitchen” user scenarios. One could easily imagine layering in other information to create more contextually relevant experiences, such as facial recognition that applies a personalized knowledge base of a person’s cooking skills or favorite recipes.

You can see the Hobgoblin demo in action below:

Hobgoblin Smart Appliance Interface | This New User-Interface Tech Isn't Just for the Kitchen

January 4, 2018

Amazon Brings Cooking Capabilities To Alexa Smart Home Skill API

While over 50% of Echos end up in the kitchen, a lack of cooking-specific commands and categories within the popular voice assistant’s smart home API has meant few people actually prepare food with Alexa today.

But that could soon change.

That’s because today Amazon introduced built-in controls for cooking appliances into the Alexa Smart Home Skill API. Initially rolling out in microwaves from Whirlpool and others, the new cooking capabilities will let users define time and temperature parameters and will eventually let the Alexa voice interface walk users through cooking a meal.

From the Alexa developer blog:

Customers are increasingly using voice user interfaces (VUIs) as a hands-free way to manage their lives, and hands-free control is especially valuable when cooking. With the built-in cooking device controls in the Smart Home Skill API, you will make it easier for your customers to control your cloud-connected microwave. Instead of pressing multiple buttons to enable advanced microwave features, your customers can now use their voices. For example, a customer can say “Alexa, defrost three pounds of chicken” or “Alexa, microwave for 50 seconds on high.”

Initially, there are four new capability interfaces in the Smart Home Skill API – Alexa.Cooking, Alexa.Cooking.TimeController, Alexa.TimeHoldController, and Alexa.CookingPresetController. You can leverage these interfaces today for microwaves and for appliances that support preset cooking. The interfaces are designed for future extensibility as support for more cooking devices becomes available.

The new Alexa cooking capability understands food categories (for example, Alexa will take a food term from the Echo user – such as “sockeye salmon” – and categorize it under “Fish”) and cooking modes. Appliance makers are able to define their own cooking modes that are discoverable within the Alexa app, which means users will be able to access modes such as “defrost” in products such as Whirlpool’s line of connected microwaves. The new cooking capability also allows appliance makers to make their preset libraries available through Alexa.
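The four interfaces arrive at a skill’s backend as JSON directives whose header names the capability interface. A minimal Python sketch of that dispatch step, where the directive name and payload shape are illustrative assumptions rather than copied from Amazon’s documentation:

```python
# Sketch of a skill backend dispatching Smart Home Skill directives by
# capability interface (the "namespace" field of the directive header).
COOKING_NAMESPACES = {
    "Alexa.Cooking",
    "Alexa.Cooking.TimeController",
    "Alexa.TimeHoldController",
    "Alexa.CookingPresetController",
}

def handle_directive(event):
    header = event["directive"]["header"]
    namespace, name = header["namespace"], header["name"]
    if namespace not in COOKING_NAMESPACES:
        return {"error": "unsupported interface"}
    # A real handler would translate the directive payload into a
    # command for the cloud-connected appliance here.
    return {"handled": namespace + "." + name}

# "Alexa, microwave for 50 seconds" might arrive as a timed-cooking
# directive (directive name and ISO-8601 duration are assumptions):
event = {"directive": {"header": {"namespace": "Alexa.Cooking.TimeController",
                                  "name": "CookByTime"},
                       "payload": {"cookTime": "PT50S"}}}
```

The point of the namespace check is that one skill endpoint receives every directive for the device, so routing on the capability interface is the first thing the handler does.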

While Whirlpool is expected to be the first to launch the new Alexa cooking capability for its connected microwaves (no exact date has been given), Amazon also announced that Samsung, GE, Kenmore and LG are all working to bring the new capability to market.

And finally, one last piece of news embedded in the announcement: Amazon has invested in June, the high-profile maker of the June connected oven, via the Alexa Fund. This means, of course, you can expect the June oven to work with Alexa’s cooking capabilities sometime in 2018.


December 10, 2017

“Alexa, How Can You Be Used in Restaurants?”

There’s a good chance that an Amazon Alexa or Google Home device is on a holiday wish list of someone you know. Consumer Intelligence Research Partners estimates that 15 million Amazon Echo units have been sold across the U.S. (Amazon does not disclose sales figures). As of now, Alexa’s use around dining out centers on at-home consumer experiences. A quick glance through the restaurant-related Alexa skills shows an emphasis on discovery, information and ordering. Find a nearby restaurant. Order a pizza. Etc.

But is there a bigger opportunity for Alexa and Google Home inside the restaurant?

According to the National Restaurant Association, there are more than 1 million restaurant locations in the U.S. generating $799 billion in sales. One million locations on its own isn’t huge, but with some creative thinking you could easily envision multiple devices deployed per restaurant, plus all the data captured from in-store interactions, and you can see restaurants becoming a front worth fighting over in the voice assistant battle.

As a fun thought experiment, I put together a few potential uses for Alexa inside restaurants:

An interactive table alert. Instead of a dumb, inert buzzer that flashes and vibrates when your table is ready, modify an Echo Dot to be the messenger. Instead of bugging the host, patrons could ask Alexa how much longer the wait is, and be alerted when they can be seated. If you wanted to get really adventurous, in the right setting you could even run ongoing interactive trivia games to keep people entertained.

Informed ordering. With its touch screen, an Echo Show would be an excellent way to show menu items, explain more about ingredients, and highlight popular dishes. You could also enable ordering and payment for a more streamlined experience. In a cruel, horrible world, one can imagine restaurants offering a cheaper meal if customers allow ads to be displayed while they’re eating (please don’t do that).

Back of house. Alexa could be used for quick ordering of ingredients, equipment or other sundries, which, of course, could be fulfilled by Amazon that day. It could also be used to alert employees about their break times and inform them of any news or specials.

Communication back home. By gathering real-time data inside a restaurant, Alexa at home could help diners make better, more informed decisions about where and when to eat at a particular establishment.

Having said all this, there are some real world limitations to this type of implementation:

There’s a pretty small needle to thread in terms of the types of restaurants where voice control will work. Too loud, and voice control is useless because you can’t hear Alexa talk. Too quiet, and voice control is annoying for everyone else.

It’s hard to imagine restaurants buying and modding Echo devices or writing their own skills. However, companies such as Toast or Square could weave Alexa into their platforms and embed it in customized devices sold into restaurants.

We are still in the early stages of voice assistants, and of learning how far into our lives they will go. But as they get better, restaurant owners may not want to wait until the holidays to get their own.

Have you seen Alexa or Google Home used in interesting ways inside restaurants? Or have a wild idea about how they could? Leave a comment below and let us know what you think.
