The Spoon

Daily news and analysis about the food tech revolution

AR

August 10, 2020

As Menus Move to Mobile Phones, Research Shows AR Could Drive More Sales

Among the countless ways COVID is altering the meal journey is the humble menu. Gone are the germy, reusable laminated menus of the past, and while single-use paper menus are a cheap stopgap, the whole experience will move to our mobile phones.

There’s a problem with ordering through mobile menus, though: they aren’t very enticing. Unless you’re familiar with the restaurant you’re ordering from, you’re scrolling and swiping through a lengthy list of tiny 2D thumbnail images to find what you want.

But new research out of Oxford University shows that augmented reality (AR) could be the way to create appetizing menus on your mobile phone. Oxford’s study, conducted in Oxford, England last year in partnership with the AR company Qreal, a subsidiary of The Glimpse Group, gave some participants traditional menus and others AR-capable menus that presented the virtual food as it would look right in front of them on the table.

Highlights from the study, which were emailed to The Spoon, found that “Participants were significantly more likely to order a dessert if they viewed options in the AR menu (41.2%) versus the control menu (18%).” This was across age groups and sexes, as well as across familiarity with AR, so it wasn’t just tech-savvy folk indulging in a shiny new toy.

Participants using the AR menu also “spent significantly more on dessert than those in the control condition, $2.93 versus $1.38 (increase of 112%).”

As Mike Cadoux, General Manager of Qreal, summed it up with me over the phone last week, the addition of AR plays into the old adage that you “eat with your eyes first.”

“It’s like a test drive for a car,” said Cadoux. “Same way when you buy food, you want to think about what it’s like to eat it.”

If the results of this study had been released even six months ago, it probably would have been viewed as more of an interesting idea. A nice-to-have kind of thing, but definitely a can kicked down the road in favor of something more pressing.

The coronavirus, however, could accelerate AR’s adoption in the restaurant industry. First, as noted, even if you can (legally and emotionally) sit and dine in a restaurant, the menu is moving toward mobile, so restaurants need to rethink their digital strategy and how they present their food to customers to begin with.

But more pressing is the fact that the restaurant business was already moving towards off-premises eating before the pandemic and now relies on delivery and takeout to stay alive. This, in turn, means that more people will be selecting their meals from the comfort of their couch via mobile phone.

“Instead of a thumbnail of a picture,” Cadoux said, “You can view it in 3D and move it to an AR experience.” AR gives customers a better sense of what the food will look like, from all angles, when it’s on their own plates and tables.

In addition to restaurants, third-party delivery services, with their marketplaces and massive audiences, should also be looking closely at providing an AR option.

There are the economics of a shift to an AR menu for any restaurant or delivery service to consider. But thanks to Apple and Google, AR technology is easier than ever to implement. And while the Oxford study doesn’t prove outright that implementing AR menus will guarantee increased sales, it is a nice data point that suggests the idea is at least worth experimenting with.
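To give a sense of how approachable the tooling has become, here is a minimal sketch of an AR dish preview built on Apple's AR Quick Look. It is illustrative only: the "burger.usdz" asset name and the DishPreviewViewController class are assumptions for the example, not Qreal's or any restaurant's actual implementation.

import UIKit
import QuickLook
import ARKit

// Minimal sketch: present Apple's system AR viewer so a diner can place a
// 3D model of a dish on their own table and inspect it from every angle.
// "burger.usdz" is a placeholder USDZ asset bundled with the app.
final class DishPreviewViewController: UIViewController, QLPreviewControllerDataSource {

    func showDish() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "burger", withExtension: "usdz")!
        // ARQuickLookPreviewItem opts the preview into the AR presentation.
        let item = ARQuickLookPreviewItem(fileURL: url)
        item.allowsContentScaling = false // keep the dish at real-world size
        return item
    }
}

In practice, a menu app would simply swap in the USDZ model for whichever dish the diner taps.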

February 27, 2020

Coolhobo is Developing AR to Identify Dietary Restrictions in the Grocery Aisle

I’ve been on a diet since the beginning of the year. It’s nothing crazy, just trying to cut out a lot of sugar and carbs. This involves more time at the grocery store because I have to read the label and inspect the nutrition facts on every item before I buy it.

So I was intrigued when Chinese AR startup Coolhobo posted a video of a new technology it’s working on called Hobose that uses your phone’s camera and computer vision to quickly find foods that fit your particular dietary needs and restrictions. The Hobose technology can be integrated into a native app from a partner like a grocery retailer, and once opened, Hobose creates a personalized “Traffic Light” for shoppers. Users go through and select what they want to avoid, things like high calories, high carbs, or high fats. There’s also a ranking for how many additives a shopper will tolerate.

Once in the store, the shopper fires up the mobile app and points the camera at various products on the shelves. As each item is scanned, Hobose comes back with a green light (good!) or red light (too much of what you don’t want). You can see it in action in this demo video that Coolhobo posted.
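To make the flow concrete, here is a toy sketch of the kind of traffic-light check Hobose appears to run once a label has been read. The type names, fields, and thresholds are illustrative assumptions, not Coolhobo's actual logic.

// Toy sketch of a "traffic light" check: compare a scanned product's
// nutrition facts against the limits the shopper picked in the app.
// All names and thresholds here are illustrative assumptions.
struct NutritionFacts {
    let calories: Double
    let carbsGrams: Double
    let fatGrams: Double
    let additiveCount: Int
}

struct ShopperLimits {
    var maxCalories: Double = 250
    var maxCarbs: Double = 20
    var maxFat: Double = 15
    var maxAdditives: Int = 3
}

enum TrafficLight { case green, red }

func evaluate(_ item: NutritionFacts, against limits: ShopperLimits) -> TrafficLight {
    let withinLimits = item.calories <= limits.maxCalories
        && item.carbsGrams <= limits.maxCarbs
        && item.fatGrams <= limits.maxFat
        && item.additiveCount <= limits.maxAdditives
    return withinLimits ? .green : .red
}

In the real product, the nutrition values would come from the camera and computer-vision pipeline rather than being typed in by hand.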

Hobose in action, green is a go, red is a no!

When we first wrote about Coolhobo almost two years ago, the company was working on a much bigger virtual assistant for grocery shoppers. It would direct people to items in the store, provide lots of information on each product (like the story of an imported wine), and also include social features.

This is definitely a more stripped-down solution from the company. Coolhobo Co-Founder and CEO Loic Kobbes told me via email, “We’ve learned from many iterations that users don’t want to be overloaded with information, they just want to know what’s good for them.”

It’s easy to see the appeal of this type of fast, should-I/shouldn’t-I information at the grocery store, where there are a ton of products, some of which may make health claims that are, at the very least, misleading. Other companies are working on nutrition guidance solutions as well, such as DNA Nudge, which uses a combination of your DNA and a wearable that scans products to give you a thumbs up or thumbs down on particular items.

Right now, Kobbes said that Hobose is in the development stage and the company is looking for partners to test it out. While Coolhobo may be offering a more svelte AR application than its previous work, it is expanding its worldview for the first time and looking to bring in partners from countries outside of China.

I’m not sure if it will make it to my local market, but who knows? Perhaps I can replace reading each box I pick up with a quick scan of my phone.

August 12, 2019

Panasonic’s DishCanvas Uses AR to Put Moving Images on Your Dinner Plate

They say that before you even take a bite of your food, you eat first with your eyes. Panasonic seems to be really taking that idiom to heart with the new product from its innovation incubator GameChanger Catapult.

DishCanvas is a smart plate (no, not that kind of smart plate) equipped with a display which can project moving images. It’s controlled through a smartphone app, through which you can select your desired pattern, texture, and movement to be projected on the dinnerware, as well as any desired transition effects. The dynamic image loop is then “played” on the DishCanvas plate.

We got to check out DishCanvas’ prototype in Tokyo this week at SKS Japan. The plate is made up of a glass top, a display, and batteries. Eventually the display technology will be built into the plate itself, but for now it’s pretty low-tech — essentially just an iPad slid underneath a corresponding dish. The GameChanger Catapult team who showed off the DishCanvas told me that they’re also hoping to make the images interactive, so you could theoretically rearrange images or even play games while you eat.


As I noted, DishCanvas is currently just a prototype. But the team member I spoke with told me that GameChanger Catapult is already in early talks with Disney to use the plates in their parks for children.

Which, to me, is a pretty smart use case. What kid wouldn’t be more likely to eat their veggies if they were placed on a plate displaying life-like versions of their favorite Disney characters?

I could also imagine DishCanvas being used in fine dining. For example, a fancy Michelin-starred restaurant could create custom plate displays to communicate more information about the ingredients used in each dish. Maybe a rare steak would be served on a plate bearing a moving image of rustling grass in a field where the cow once grazed, or a grilled fish could be placed atop a display of ocean waves.

DishCanvas is part of a new movement of augmented dining which tackles not just the taste of food, but the entire eating experience. From scotch tastings enhanced with VR to leveraging sounds to change the flavor of food, companies are experimenting with ways in which technology can enhance our meals by appealing to all five of our senses.

Here at the Spoon, we cover lots of technology aiming to optimize the way your food tastes. But it’s good to remember that smell, sound, touch, and sight play a role in how we eat, as well.

September 20, 2018

Kabaq Wants to Create the World’s First Virtual Reality Food Court

There’s no denying that we live in an age of curated images, where every Instagram photo is cropped, edited, and put through a filter before it’s sent into the stratosphere.

Kabaq, one of the 13 companies pitching at the Startup Showcase for the Smart Kitchen Summit (SKS) this October, is leveraging virtual and augmented reality (VR and AR) to create a more immersive food experience for our image-obsessed society. Thus far, the company has worked with Magnolia Bakery to help engaged couples see their wedding cake before the big day, and also led an AR campaign to let Snapchat users play with 3D-visualized pizzas.

But Kabaq founder Alper Guler has much grander ambitions for his company. Read our Q&A with Guler to get a better picture (pun intended) of their vision for a future in which our food choices are guided by VR and AR.

This interview has been edited for clarity. 

The Spoon: First things first: give us your 15-second elevator pitch.
Kabaq: As Kabaq, we create the most lifelike 3D models of food in the world through AR/VR. Our main goal is to help customers to decide what to eat, while at the same time helping restaurants to push premium items and tell stories about their food.

What inspired you to start Kabaq?
The era of Instagram, Facebook and Snapchat has changed what and how we eat at restaurants. Today food isn’t just about taste; it’s also about the visual experience. Now social platforms and smartphone manufacturers have created this shift in food, investing and pushing heavily in immersive technologies like augmented and virtual reality. These two emerging trends inspired us to bring Kabaq to life.

What’s the most challenging part of getting a food tech startup off the ground?
Food and technology are connected to dining experiences more than ever. Technology is improving our experience of how we grow, source, discover and order food. But adoption of new technology has been slow, and we are experiencing a relatively slow response from the market.

How will Kabaq change the day-to-day life of its users?
In the future I believe smart glasses will replace smartphones. Everybody will use these smart glasses to engage with digital experiences around them. Imagine you are in this restaurant, using your augmented reality glasses: you can see the whole menu on your table virtually, and even order through your glasses. Don’t worry about the check — it is already paid through your glasses.

VR can also change how we order delivery. Think about how we used to connect to the internet through dial-up modems. We needed to disconnect from the internet to call and order food for pick-up. Then, companies like Seamless created platforms to order food online. With mobile phones and location-based services like UberEats, the experience became even smoother.

In the near future I believe when you are connected to VR, you will also order your food in VR. We will create the world’s first virtual food court for people to visit through VR and order directly through the same system.

What’s next for Kabaq?
We are creating beneficial use cases for AR in restaurants, delivery apps, marketing, catering and cookbooks. We’re working to bring AR to all aspects of food — and soon.

—

Thanks, Alper! Get your tickets to SKS to hear him pitch alongside 12 emerging food tech companies at our Startup Showcase and get a taste of how Kabaq applies VR to food.

July 16, 2018

How Will AR and VR Change the Way We Eat? Jenny Dorsey Has Some Thoughts

Part chef, part entrepreneur, all innovator, Jenny Dorsey has become the go-to expert at the intersection of augmented and virtual reality and food. When Smart Kitchen Summit founder Michael Wolf spoke with her on our podcast last year, he called her the “foremost authority on the nexus point between AR/VR and food.”

So of course we invited Dorsey to speak about it on stage at SKS. To whet your palate, we asked her a few questions to discover more about what exactly we have to look forward to in the culinary future — virtual and otherwise.

Want to learn more? Make sure to get your tickets to SKS on October 8-9th to see Jenny Dorsey talk about how augmented and virtual reality will change the way we eat.

This interview has been edited for clarity and content.

Q: What drew you to explore AR and VR through food, something seemingly very separate and disconnected?
A: It is the strangest story. I went to acupuncture in the spring of 2017 totally confused about what I wanted to do with my life and art. I had this random idea pop into my head at acupuncture that I should focus on AR and VR…which I literally knew nothing about. I went home to my husband and he just said, “Okay, I support you…but I don’t know what you’re talking about.”

Fast forward a year, and I’ve been experimenting with different ways to merge these various things together. I’ve learned a lot about what doesn’t work (eating with headsets on) and what makes people prone to distraction (AR apps), but I also found some pretty awesome ways to communicate and strengthen my food through AR/VR. For instance, I hosted a tasting event in Nicaragua where we profiled three different types of Nicaraguan agricultural staples using 360° video, then served guests both headsets and the final tasting menu after they watched — and learned — the seed-to-harvest process of these ingredients. It was really educational, fun (for many, it was their first time in VR!) and the process added some extra meaning to the food and drink we prepared.

Our next big thing is a series called “Asian in America”, which explores the Asian American identity through a symbolic meal, paired with a stroke-by-stroke Tilt Brush recreation of each dish for viewers to watch, while listening to the symbolic explanations, before eating. (You can see more about both of those events over at Studio ATAO.)

Q: Tell us about your experimental pop-up series, Wednesdays.
A: Wednesdays started in January 2014 as a personal creative outlet while I was working in a restaurant and feeling pretty burned out. At the time, my then-boyfriend, now-husband was still in business school (where we met) and I remember us commiserating on how hard it was to get to really know people around us. He was interested in making cocktails, and we thought: why don’t we host a dinner party? We wanted to create an environment where people would be comfortable enough to be themselves and be vulnerable around others.

We hosted a beta-series of dinners with friends for the first month, then we started getting strangers coming to the table to eat, which prompted us to say “Hey, maybe we are onto something”. Fast forward 4 ½ years and we’ve hosted hundreds of dinners for thousands of guests across New York City and San Francisco, been written up in many major food media outlets, and usually sell out in 30 minutes or less!

We aren’t your average dinner party — we do ask a lot from our guests. There are mandatory questions to answer before you even purchase your ticket (everything from “What’s your biggest failure and how has it motivated you?” to “Are you in the job you want? If not, how are you getting there?”), lots of bizarre things to eat and drink when you arrive (like bugs!) and direct, in-your-face realness from me, my husband and our team. There’s no small talk. It’s not for everyone, but for the people who follow us I think it’s really what they are looking for.

Q: What’s the coolest/craziest way you’ve seen technology changing the food system? Blow our mind!
A: I’m currently very interested in how blockchain could help the food system. Seeds & Chips just put out a call for blockchain influencing the egg supply chain, so I’m really excited to see what different companies come up with. I also spent some time at a winery last year and was amazed to see they have drones which tell them literally when and which plot of vineyards to pick for a certain Brix (sugar) count in that specific grape. That sort of detailed information would’ve taken constant field-walks to ascertain years ago.

There’s also technology that will calculate exactly how much food waste your restaurant generates in a week/month/year, AND a system that will turn that waste into compost. While technology has done a lot in terms of streamlining our food system, I’m still waiting for it to solve some of the biggest issues we face today: a living wage, worker rights, consistency and training, preventing food waste, educating consumers, etc. — pieces that require more politics and facetime. Overall, we still have lots of work to do!

Q: How do you see AR/VR — and technology in general — shaping the future of food?
A: I still stand by the major points in my TechCrunch article from late last year. I think the biggest areas of impact will be food products (CPG) and how they are marketed — both experientially (through VR), but also packaging (through AR).

In terms of restaurants, I just wrote a piece about VR training, which I do think will be a fantastic and hugely influential piece of the technology — but it really needs to come down in price point first.

Overall, I think artists and creators are still getting acclimated to how this technology works and what they can do with it. I hope to see AR/VR become almost an expected point of interaction or engagement between a food business (product, service or restaurant) and the customer as we continue finding artistry in it.

Q: What’s your desert island food or dish?
A: I feel I should say something cold, because I would be hot, but most likely I would be craving pho. LOL!

February 2, 2018

Podcast: A Chef’s Journey To The Intersection Of Virtual/Augmented Reality & Food

Ever since I saw Chewie and C-3PO playing hologram chess in Star Wars as a kid, I’ve been intrigued by the idea of creating virtual images and worlds.

A generation later, I’m more fascinated than ever by what we now call augmented and virtual reality. I’m especially intrigued about where these new technologies intersect with food, and hardly a week goes by without my reading about an innovator creating a new way to enhance the shopping, restaurant or cooking experience with AR or VR.

Another person excited about this fast-growing space is Jenny Dorsey. A year ago, the professional chef had an epiphany: she needed to become the foremost authority on the nexus point between AR/VR and food.

On the podcast, I catch up with Jenny to hear how her journey to become the go-to expert in this exciting area is going and learn about some new and interesting ways that augmented and virtual reality are changing food.

You can listen to the podcast below, download here or find it on Apple podcasts.

November 20, 2017

Campbell’s Evaluating Augmented Reality and Other Tech As It Navigates Digital Transformation

While cans of Campbell’s Soup may not look much different today from when you or I grew up, don’t be fooled. Executives at the storied food brand are spending lots of time nowadays figuring out how to navigate a fast-changing market increasingly disrupted by e-commerce, IoT and other digital technologies.

So at last month’s Smart Kitchen Summit, we decided to ask Campbell’s Matt Pritchard about the company’s digital transformation. As Campbell’s VP of Digital Marketing, Pritchard is one of the executives responsible for charting the course of the company as it prepares for the future.

One thing the soup company exec made clear is the need for Campbell’s to stay close to the consumer, no matter the technology.

“What we got to figure out is how we deepen our consumer connection and how do we become more meaningful in their life.”

That connection, according to Pritchard, comes across the entire meal journey, whether that’s during planning, shopping or making meals.

“Where we got to focus on is what is the desired consumer journey, where they are trying to get to, and how do we create integrated experiences to play after that.”

Some of the ways in which Campbell’s touches the consumer through digital are expected, such as recipe apps. But some ideas suggested by Pritchard hint at some interesting potential directions for the brand and CPG products more broadly.

“Some could be a communication challenge where we make clear we are in recipe collection apps. But it could quite easily be in the center store where we’ve got those rows and rows of cans of soup, how can we create an experience with augmented reality that brings the nutrition panel to life.”

We’ve seen a variety of retailers embrace augmented reality as of late as the technology matures, and interest from the brands themselves could extend the reach of the technology from store shelves into the home and around the cooking experience.

And it’s in the kitchen where Campbell’s sees much of their future technology evolution.

“Being in the heart of the connected kitchen is the purpose of today and one of those things we need to figure out,” said Pritchard.

What would that look like? One possible way is through smarter inventory management and replenishment.

“A consumer goes home, put some of our products in the pantry, then they use those products,” said Pritchard. “We’ve got to figure out how do we know they used those products so we can help with a replenishment scenario. So, I’m looking at things like sensor companies because they can figure out when a product has been in a fridge or a pantry.”

So while the soup may stay the same, the company behind the red and white cans is busy figuring out how to stay connected with consumers as the food journey becomes increasingly digital.

You can watch the entire interview with Matt Pritchard below:

September 25, 2017

Visualize it: Augmented Reality and the Future of Food

All around us, augmented reality technology is beginning to give us more information about our immediate environment than can be seen with the naked eye. There are AR apps for overlaying where nearby WiFi signals are centered, and apps that help surface unseen nearby locations and attractions to visit. Now, food production is set for transformative developments thanks to AR.

On this front, Huxley has developed what it bills as the world’s first “augmented operating system,” which mashes up augmented reality with artificial intelligence. “By combining vision, environment, and plant data, we can now grow more with less using AI,” Huxley reports.

Imagine a smart greenhouse of the future, where farmers with augmented reality glasses can surface information about what kinds of plants at various stages of growth surround them. The same greenhouse might have smart cameras that keep track of everything from watering status, to activity from pests and threats.

Huxley is being leveraged for these kinds of food production scenarios, and is even being used to optimize marijuana production. It is a hands-free system that combines AR, AI, and machine learning to optimize “plant vision.”

“Intelligent Automation for Controlled Environments is the future,” the Huxley team reports. “By collaborating with the most innovative companies and organizations we can provide anyone in any language the power of a master grower. Data just got dimensional.”

According to a recent Wall Street Journal story, augmented reality can also help optimize harvesting plants: “New cameras, sensors and smartphone apps help monitor plant growth. One company is even developing augmented-reality glasses that can show workers which plants to pick.”

The same story notes that companies are also developing new ways to grow vegetables in tiny, often urban spaces, including rooftops, balconies, and abandoned lots. From controlled lighting to augmented reality solutions for discerning when to harvest plants, these solutions were not found on grandpa’s old farm.

Meanwhile, Danish researchers are investigating ways to use augmented reality to optimize the trimming and boning process for pork bellies. “The AR technology has demonstrated lucrative applications in industrial QA procedures and even farm management applications appear to benefit from applying the technology,” the researchers note.

So how might augmented reality boost your food frontiers when you are at the table in a restaurant? A company called Kabaq is on top of that concept. It is developing 3D and augmented reality menu and visualization technologies so that you can see exactly what your order will look like in front of you, from every angle. Check out the technology in action in this video:

Watch how you can use Apple ARKit in Food ordering

The technology driving augmented reality devices and applications is rapidly advancing as well. Apple is one of many tech giants driving the technology forward, and the result is likely to be ever smarter AR-driven food applications. Stay tuned to this space.

June 6, 2017

Analysis: With HomePod, Apple May Finally Deliver On The Promise Of HomeKit

After months of rumors, Apple finally introduced a wireless streaming speaker called HomePod this week, their first new hardware product since the Apple Watch debuted in 2015. The Siri-enabled wireless speaker also doubles as a HomeKit-powered smart home hub, giving Apple a new fixed HomeKit control point beyond Apple TV. The HomePod will ship in December and is priced at $349.

There is a bunch to break down here, including how the HomePod compares to Amazon’s Echo, but let’s first look at what exactly Apple introduced.

The Hardware

The new HomePod is an impressive piece of hardware. The HomePod includes Apple’s A8 chip, a system-on-chip CPU/GPU that debuted in 2014 with the iPhone 6. It has a six-microphone array with advanced echo cancellation, which Apple says will enable “Siri to understand people whether they are near the device or standing across the room, even while loud music is playing.”  The Siri-powered speaker also features seven beam-forming tweeters, each with an amplifier, and it also includes what Apple calls room-sensing technology that allows the device to optimize its sound based on the specific spatial characteristics of where music is being played.

During the event, Apple made a string of other important announcements that led up to the climactic debut of the HomePod:

AirPlay 2

One of the most important foundational technology upgrades announced Monday was AirPlay 2, a much-needed update to Apple’s wireless streaming protocol. With AirPlay 2 we finally get multiroom audio support, a huge upgrade that will allow homes with Apple’s HomePod – as well as products from partners like Bose, Bang & Olufsen, Marantz and others – to stream audio wirelessly to different rooms and to multiple speakers. The upgrade puts Apple’s streaming music framework on par with Google’s Chromecast for audio, which already supports multiroom audio.

Apple’s AirPlay 2 Early Partners. Image credit: The Verge

A notable absence from the list of initial partners was Sonos, a company that almost single-handedly created the wireless multiroom audio category.

HomeKit

The release of AirPlay 2 will not only bring multiroom audio support, but it also adds speakers to the list of devices controllable with Apple’s smart home protocol, HomeKit. By adding the speaker category to HomeKit, consumers will be able to control their wireless speakers through the iOS Home app.

The arrival of the HomePod also brings a second fixed smart home hub device into the lineup. Like Apple TV, the HomePod allows for remote access to any HomeKit compatible device through the Home app. However, with far-field listening capabilities and integrated Siri, the HomePod instantly surpasses Apple TV to become Apple’s most capable smart home hub.
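For context on what “controllable through the iOS Home app” implies for developers, here is a minimal sketch that enumerates a home’s HomeKit accessories and prints their categories. The class name is an assumption, and any speaker-specific handling an app might add is left out.

import HomeKit

// Minimal sketch: load the user's primary HomeKit home and list its
// accessories by category, roughly the enumeration the Home app performs.
final class AccessoryLister: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // HomeKit calls this once the user's homes have loaded.
    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.primaryHome else { return }
        for accessory in home.accessories {
            print("\(accessory.name): \(accessory.category.localizedDescription)")
        }
    }
}

With AirPlay 2, compatible speakers simply show up in that accessory list alongside lights and thermostats, which is what makes them addressable from the Home app.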

What Does All This Mean?

The pricing, capabilities, industrial design and messaging gave us all we need to know to break down Apple’s strategy:

The HomePod Is, Above All, A Music Product: The HomePod is built to be a great wireless streaming speaker. With seven beam-forming tweeters – that’s one more sound driver than the Sonos Play 5 – it’s built to sound great. Sure, the HomePod has built-in Siri, but Apple messaged this as a revolutionary multi-room speaker first and a virtual assistant second.

This Is a Premium Product: The price of the HomePod, $349, may seem fairly affordable when compared to other Apple products, but at roughly double the price of the Amazon Echo and nearly triple that of Google Home, it is priced much higher than other smart speakers. It’s clear Apple has Sonos, probably more so than Amazon or Google, in its crosshairs.

Apple Is Finally Bringing An Upgraded Siri Home: One of the messages from Apple this week is that Siri has finally grown up. By adding anticipatory computing features, opening it up further to developers with a year-two SiriKit and creating a Siri face for Apple Watch, the company finally feels it has a virtual assistant on par with Google Assistant. And now with HomePod, Apple has a true voice assistant to bring into the home.

Apple Vs. Amazon

Where does this position Apple relative to Amazon and the Echo?

I think given the premium pricing strategy, Apple appears to be ceding the fixed smart speaker mass market to Amazon. By choosing a music-first, premium approach, Apple appears content to let Amazon win the numbers battle with its lower-cost smart speaker.

However, letting Amazon blanket the mass market with $49 Echo Dots does not mean Apple is ceding the virtual assistant market to Amazon. In fact, if we learned anything this week it’s that Apple plans to leverage the hundreds of millions of Siri-powered iPhones, iPads and Apple Watches in the market as it does battle with Alexa.

And not only is Apple leading with iOS, but they plan to make it a much richer and more robust platform with new efforts like ARKit, their new augmented reality developer platform. Imagine pairing a well-done augmented reality app with a voice assistant capability in the home, and you might have something pretty cool.

OK, so while it’s a bit of a risky strategy, it’s probably the right one for Apple. By ‘dancing with the one who brought them’ in iOS and augmenting their home strategy with a premium-priced smart speaker/virtual assistant in the HomePod, Apple now at least has a strategy to do battle with Echo, even if their new smart speaker is priced out of reach for some consumers.

Lastly, let’s not forget that the HomePod with HomeKit is a true smart home hub, with all the built-in intelligence to make a powerful Apple-powered smart home come to life. While the Amazon Echo has done a good job integrating with hundreds of various smart home devices through its skill platform, it’s limited in its ability to execute on things such as scenes. With HomeKit and a new rev of the Home app for iOS, I think Apple may finally have what it needs with the HomeKit-HomePod combo to deliver on the early promise that had so many excited about HomeKit.

Make sure to subscribe to the Spoon newsletter to get it in your inbox. And don’t forget to check out Smart Kitchen Summit, the only event on the future of food, cooking, and the kitchen.

November 26, 2016

Virtual Eating: How Virtual Reality Can Make You Think You’re Eating Pizza When You’re Not

The rise of virtual and augmented reality systems has only just begun; we’re almost positive we’ll see even more VR demos at CES this year, and the convergence of smart home technology and VR/AR is just getting started. But what about virtual eating? Virtual reality is designed to simulate the sounds and sights of an environment -- but could it simulate taste and smell too?

That’s the premise of a project from researchers in Japan and Singapore who have been testing out electrical and thermal probes that can stimulate muscles and trick the human brain into believing it is tasting food that isn’t really there. In one experiment, scientists focused on the neurons that are sensitive to hot and cold temperature changes and that also play a role in how we taste things. By rapidly heating or cooling a square of thermoelectric elements on the tip of someone’s tongue, the user experiences a sweet taste. The thermal experiment also produced some strange results, with some participants reporting a spicy flavor when the probes were heated up and a minty flavor when they were cooled down.

In another experiment, electrical currents were used instead of heat to enhance or create a salty, sour or bitter taste in someone’s mouth.

The last experiment used electrodes attached to the masseter muscle, one of four muscles in the jaw used for mastication (chewing), to simulate biting through actual food. The strength of the electric impulse controlled the texture, or hardness, of the simulated food, and the duration of the impulse controlled the elasticity sensation of the jaw opening and closing during chewing. By varying the strength and duration, researchers were able to more realistically produce the sensation of biting into real food.

Study on Control Method of Virtual Food Texture by Electrical Muscle Stimulation
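As a way to picture the parameter mapping described above, here is a toy sketch where impulse strength stands in for perceived hardness and impulse duration for elasticity. The thresholds and descriptions are illustrative inventions, not values from the study.

// Toy sketch of the masseter-stimulation mapping described above:
// stronger impulses read as harder "food", longer impulses as a more
// elastic chew. The numbers are illustrative, not from the study.
struct ChewImpulse {
    let strength: Double    // 0.0...1.0; higher = harder sensation
    let durationMs: Double  // longer = more elastic jaw rebound
}

func describe(_ impulse: ChewImpulse) -> String {
    let hardness = impulse.strength > 0.6 ? "hard" : "soft"
    let elasticity = impulse.durationMs > 120 ? "elastic" : "brittle"
    return "Feels \(hardness) and \(elasticity)"
}

print(describe(ChewImpulse(strength: 0.8, durationMs: 60)))   // e.g. cracker-like
print(describe(ChewImpulse(strength: 0.3, durationMs: 200)))  // e.g. gummy-like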

The role of heat as it relates to taste isn’t a new concept; it’s one chefs have long been using to transform dishes and create unique flavors. But using heat or electricity alone to mimic a specific taste or sensation is new territory. So it turns out your taste buds, and even your jaw muscles, can be hacked -- making it possible to have a virtual reality dining experience without having to suffer the calories.
