
The Spoon

Daily news and analysis about the food tech revolution


Osmo

November 6, 2024

The Idea of Food ‘Teleportation’ Isn’t New—But AI Is Finally Making Distributed Digital Food Replication a Reality

Over the past decade, there’s been no shortage of attempts to better understand, map, and recreate food and its properties with the click of a mouse.

From new data ontologies to a proliferation of digital noses, we’ve seen incremental steps toward an envisioned world where the fundamental building blocks of food can be better understood. However, in the past year, there has been a rapid acceleration in our collective ability to digitize various properties of food, largely driven by advances in AI.

The latest example comes from Osmo, which recently announced it has developed the ability to digitally “teleport” a scent, using AI to digitize and then re-materialize it.

Scent Teleportation

As the company’s CEO, Alex Wiltschko, explained:

“We select a scent to teleport and introduce it to a machine called the GCMS (Gas Chromatography-Mass Spectrometry). If it’s a liquid, we inject it directly; if it’s a physical sample, like a plum, we use headspace analysis, trapping the scent in the air around the object and absorbing it through a tube. The GCMS identifies the raw data, which can be interpreted as molecules, and uploads it to a cloud. There, it becomes a coordinate on our Principal Odor Map — a novel, advanced AI-driven tool that can predict what a particular combination of molecules smells like. This formula is sent to one of our Formulation Robots, which treats it as a recipe and mixes different scents to replicate our sample.”

In other words, Osmo breaks down the building blocks (molecules), creates a map, and then sends this digital map to an essence “printer” that re-creates it.
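
To make that flow concrete, here is a minimal, hypothetical Python sketch of the capture-map-formulate pipeline described above. Every name, number, and interface in it is invented for illustration: Osmo has not published code or an API, and the real Principal Odor Map is a trained model, not a hash.

```python
# Toy sketch of the pipeline described above: GC-MS peaks -> odor-map
# coordinate -> recipe for a mixing robot. All names and math here are
# illustrative stand-ins, not Osmo's actual system.
from dataclasses import dataclass

@dataclass
class Peak:
    molecule: str     # compound identified from the GC-MS readout
    abundance: float  # relative amount detected in the sample

def to_odor_coordinate(peaks, dims=4):
    """Stand-in for the Principal Odor Map: project a molecular mixture to
    a point in odor space. (The real map is learned; this toy just hashes
    molecule names into a fixed-length vector.)"""
    coord = [0.0] * dims
    for p in peaks:
        for i in range(dims):
            coord[i] += p.abundance * ((hash(p.molecule) >> (8 * i)) % 97) / 97
    return coord

def to_formula(target, palette):
    """Stand-in for formulation: pick the stock aroma chemical whose known
    coordinate sits nearest the target. A real system would optimize a
    weighted blend of many ingredients."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(palette, key=lambda name: dist(palette[name], target))
    return f"dispense 1.0 g of {best}"

# Usage: a plum's headspace reduced to two (made-up) peaks, then "reprinted".
sample = [Peak("gamma-decalactone", 0.7), Peak("hexyl acetate", 0.3)]
palette = {name: to_odor_coordinate([Peak(name, 1.0)])
           for name in ["gamma-decalactone", "hexyl acetate", "limonene"]}
print(to_formula(to_odor_coordinate(sample), palette))
```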

This announcement comes just weeks after leaders of NotCo’s scent and flavor AI team shared research on their new generative AI that creates scent and flavor formulations. Here’s how Aadit Patel, NotCo’s head of product, described the model:

“The system takes your prompt—such as ‘an ocean scent on a breezy summer day on a tropical island’—and creates a novel chemical formulation of that scent in one shot.” The model then generates a corresponding fragrance formula. According to Patel, the model is built on a “natural language to chemical composition” framework, tokenizing molecules to create a system capable of understanding and generating novel combinations.
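
As a rough illustration of what “tokenizing molecules” can mean in practice, here is a hypothetical Python sketch that flattens a fragrance formula into a token sequence a generative model could learn to emit, and recovers the formula from a generated sequence. The vocabulary and format are invented for illustration; NotCo has not published its tokenization scheme.

```python
# Hypothetical sketch of "natural language to chemical composition":
# a fragrance formula becomes a flat token sequence, so a decoder-style
# generative model can emit formulas the way an LLM emits text.
# The vocabulary and format here are invented, not NotCo's actual scheme.

def tokenize_formula(formula):
    """Flatten {ingredient: percent} into a sequence a model could learn."""
    tokens = ["<bos>"]
    for molecule, pct in formula.items():
        tokens += ["<mol>", molecule, "<pct>", f"{pct:.1f}"]
    return tokens + ["<eos>"]

def detokenize_formula(tokens):
    """Inverse mapping: recover a formula from a generated sequence."""
    formula, i = {}, 1
    while tokens[i] != "<eos>":
        assert tokens[i] == "<mol>" and tokens[i + 2] == "<pct>"
        formula[tokens[i + 1]] = float(tokens[i + 3])
        i += 4
    return formula

# Round trip: what a model prompted with "an ocean scent on a breezy summer
# day" might be trained to output (ingredients and percentages invented).
seq = tokenize_formula({"calone": 4.0, "dihydromyrcenol": 12.5, "linalool": 8.0})
assert detokenize_formula(seq) == {"calone": 4.0, "dihydromyrcenol": 12.5, "linalool": 8.0}
```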

After years of work focused on digitally understanding, quantifying, mapping, and reproducing scents, flavors, and the building blocks of food, what is allowing these latest efforts to make such significant leaps?

In a word (or two): AI. In the past, creating a facsimile of a flavor or scent took thousands of hours, with trained experts working in a lab or kitchen and drawing on years of expertise. Now, AI is compressing that process by orders of magnitude, often leaving the expert to provide a final sign-off: confirming the AI-created formula is close enough to the desired result and passes checks for safety, cost feasibility, and more.

For the record, Osmo isn’t the first company to discuss “teleporting” a formula for digital recreation. In 2018, Japanese startup Open Meals made headlines with its “sushi teleportation” demo, essentially sending 3D printing instructions to create a sushi-like meal. We also saw Cana’s ambitious attempt to make a Star Trek replicator (though, as it turns out, investors weren’t quite ready to enter the food teleportation age).

All of this follows years of efforts to quantify and understand food digitally, including the creation of ontologies for the Internet of Food and early attempts to use AI to analyze food. But over the past couple of years, there’s no doubt that parallel advances in AI (especially in large language models) and breakthroughs in food, olfactory, and chemical science are ushering in a world where true food “teleportation”—or, more accurately, the ability to understand and synthetically recreate food, flavors, and scents—has arrived.

I’m excited to see where this all goes. Manifesting the vision laid out in science fiction over the years, and in imaginative product concepts like that of Open Meals, requires a true digital understanding of the molecular building blocks of food. With AI, we are closer than ever to that understanding, and the products built in the coming decade will not only create some mind-blowing consumer experiences but may also fundamentally change how food and beverage products are made and distributed.

August 20, 2024

The Idea of Smell-O-Vision Has Been Around for Over a Century. AI May Finally Make It Work

Since the early 1900s, the entertainment industry has been attempting to pair the experience of smell with video entertainment.

In 1916, the Rivoli Theater in New York City introduced scents into the theater during a movie called The Story of Flowers. In 1933, the Rialto Theater installed an in-theater smell system. Hans Laube developed a technique called Scentovision, which was introduced at the 1939 World’s Fair. A decade ago, Japanese researchers were also exploring “Smell-O-Vision” for home TVs, working on a television that used vaporizing gel pellets and emitted air streams from each corner of the screen into the living room.

However, none of these efforts took off, primarily because they didn’t work very well. They failed because we’ve never been able to recreate the world’s smells in an accurate or scalable way, largely because we’ve never been able to digitally capture them.

This doesn’t mean the fragrance and scent industry hasn’t been robust and growing, but it’s a very different task to create a singular fragrance for a consumer product than to develop something akin to a “smell printer” that emits scents on command. The latter requires a comprehensive digital understanding of scent molecules, something that has only recently become possible.

The digital understanding of the world of smells has accelerated in recent years, and one company leading the way is Osmo, a startup that has raised $60 million in funding. Osmo is led by Alex Wiltschko, an ex-Googler who received his PhD in olfactory neuroscience from Harvard in 2016. Wiltschko, who led a group at Google that spent five years using machine learning to predict how different molecules smell, founded Osmo in early 2023 with the mission of “digitizing smell to improve the health and well-being of human life” by “building the foundational capabilities to enable computers to do everything our noses can do.”

Osmo employed AI to explore the connection between molecular structure and the perception of smell, demonstrating that a machine can predict scents with remarkable accuracy. They developed a machine-learning model using graph neural networks (GNNs), trained on a dataset of 5,000 known compounds, each labeled with descriptive smells like “fruity” or “floral.” This model was then tested on 400 novel compounds, selected to be structurally distinct from anything previously studied or used in the fragrance industry, to see how well it could predict their scents compared to human panelists.
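
For readers who want a concrete picture of what such a model looks like, below is a minimal sketch of a GNN odor predictor in this style, assuming PyTorch and PyTorch Geometric are installed. The architecture, feature sizes, and label count are illustrative; this is not Osmo’s published model.

```python
# Minimal sketch of a GNN odor predictor: atoms are graph nodes, bonds are
# edges, and the output is one logit per odor descriptor ("fruity",
# "floral", ...). Layer sizes and label count are illustrative, not Osmo's
# published architecture.
import torch
from torch import nn
from torch_geometric.nn import GCNConv, global_mean_pool

class OdorGNN(nn.Module):
    def __init__(self, atom_features: int, hidden: int, num_labels: int):
        super().__init__()
        self.conv1 = GCNConv(atom_features, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = nn.Linear(hidden, num_labels)

    def forward(self, x, edge_index, batch):
        h = self.conv1(x, edge_index).relu()  # message passing over bonds
        h = self.conv2(h, edge_index).relu()
        h = global_mean_pool(h, batch)        # pool atoms into one molecule embedding
        return self.head(h)                   # multi-label odor logits

# Multi-label training pairs a sigmoid per descriptor with binary targets:
model = OdorGNN(atom_features=16, hidden=64, num_labels=55)
loss_fn = nn.BCEWithLogitsLoss()
```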

The model’s capabilities were further challenged in an “adversarial” test, where it had to predict scents for molecules that were structurally similar but smelled different. Osmo’s model correctly predicted scents 50% of the time in this difficult scenario. Additionally, the model was able to generalize well beyond the original training data, assessing other olfactory properties like odor strength across a massive dataset of 500,000 potential scent molecules.

The Principal Odor Map (POM) created by Osmo’s model outperformed human panelists in predicting the consensus scent of molecules, marking a significant advancement in olfactory science and demonstrating that AI can predict smells based on molecular structure better than individual human experts in many cases.

We’ve been able to digitally capture and categorize other senses, such as vision, which has led to massive new industry value creation in robotics and autonomous vehicles. The biggest leaps have come from machine learning models, and now we’re seeing another massive step forward in capability and product innovation through the application of generative AI.

One potential application Wiltschko describes is “teleporting scent,” where we’ll be able to capture a smell from one part of the world and digitally transfer it to another. To do this, he envisions a world where a local AI-guided molecular sensor could instantly identify the molecular makeup of any scent. From there, his odor map can create what is essentially a formula ready for teleportation without significant manual intervention by scent experts.

This idea of using AI to quickly recreate scents from a digital representation could lay the foundation for what film and TV makers have long dreamed of: technology that recreates odors and smells at scale. In other words, we may finally enter a world where Smell-O-Vision becomes a reality. The potential in video entertainment, virtual reality, and other experiences across food service, travel, and more would no doubt lead to a multitude of new applications, much like we’ve seen over the past couple of decades with advances in computer and machine vision.
