Back in April, Google’s Emma Persky wrote a post telling the world that Google has been working on a recipe guidance capability for Google Assistant.
At last month’s Smart Kitchen Summit, she gave a little more detail on what exactly the company has been up to.
Persky, who runs point for Google Assistant’s cooking guidance team, told The Spoon’s Allen Weiner that much of the focus has been on building contextual understanding of recipes.
“We can talk you through step by step how to cook a recipe and answer contextual questions about how to do that,” said Persky. “To do this, we have to build a deep model of what a recipe actually looks like. We have to take the text of a recipe and understand that text so we know what pasta is, we know what type of pasta you’re talking about, we know what the units are, we know the cooking temperature.”
“On the other side, we know what the user is saying. We have a whole bunch of machinery at Google that is able to understand what a user is saying and turn that into a machine question. We have a whole bunch of data about how different people ask these questions, which we use to build a model and understand these types of questions.”
When asked about how Google is utilizing company competencies like search and YouTube, Persky said that while there’s been significant work done here, there’s an opportunity to get better.
“We do a pretty good job now: when you ask on your phone or desktop ‘how do I sauté an onion?’, we show you a nice video of how to sauté an onion.”
But, she said, “there’s a lot of opportunity for the guided cooking feature to integrate more deeply with this, so when your recipe says sauté the onion, and you don’t know how to sauté the onion, we are able to return these types of video answers on the Google Home platform to help you become a better chef over time.”
Persky also discussed how she thought web content schemas could evolve to create a foundation for richer content through platforms like Google Assistant.
“When it comes to companies that have this kind of (recipe) data available to them, there is a lot of opportunity for finding ways to increase the fidelity of the data that we have access to. At the moment we have schema.org markup, which is a good first pass. We don’t have a lot of detail, and we use machine learning to extract a lot of the context from that. And I think there’s an opportunity for a lot of people working on this stuff to find ways to access more high-fidelity data that we could offer to users as an improved experience.”
When asked by Weiner how schema.org and other web markup languages could improve, she had this to say: “There’s a lot of work we can do to improve the quality of that markup. For example, right now the markup just has one block of text for all the instructions in the recipe, but if we could break that down into a step-by-step list, it would be easier for us to parse that out. Right now we have to apply machine learning across that to do that.”
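Persky’s point maps onto how schema.org’s Recipe type actually works: the `recipeInstructions` property can be either a single text blob or a list of `HowToStep` objects. A minimal sketch of why the structured form is easier to consume (the recipe itself is an invented example, and the sentence-splitting fallback is a deliberately naive stand-in for the machine-learning extraction she describes):

```python
import json

# Two ways a site might mark up the same recipe with schema.org's Recipe
# type. The flat form forces the consumer to split steps heuristically;
# the HowToStep list makes each step explicit.
flat = json.loads("""
{
  "@type": "Recipe",
  "name": "Simple Pasta",
  "recipeInstructions": "Boil the pasta. Saute the onion. Combine and serve."
}
""")

structured = json.loads("""
{
  "@type": "Recipe",
  "name": "Simple Pasta",
  "recipeInstructions": [
    {"@type": "HowToStep", "text": "Boil the pasta."},
    {"@type": "HowToStep", "text": "Saute the onion."},
    {"@type": "HowToStep", "text": "Combine and serve."}
  ]
}
""")

def extract_steps(recipe):
    """Return a list of instruction strings from a schema.org Recipe dict."""
    instructions = recipe.get("recipeInstructions", "")
    if isinstance(instructions, str):
        # Flat text: fall back to naive sentence splitting -- the kind of
        # guesswork Persky says currently requires machine learning.
        return [s.strip() + "." for s in instructions.split(".") if s.strip()]
    # Structured markup: each step is already delimited for us.
    return [step["text"] for step in instructions]

print(extract_steps(flat))
print(extract_steps(structured))
```

With the structured markup, a voice assistant can address “what’s the next step?” directly, instead of guessing where one instruction ends and the next begins.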
It’s a really good conversation for understanding what Google has been up to as it looks to combine recipe content with its voice AI platform. You can watch the full conversation between Allen Weiner and Emma Persky below:
Ed note: Answers in this interview have been edited slightly for brevity/clarity