The Spoon

Daily news and analysis about the food tech revolution

Food AI Weekly

August 7, 2024

Food AI Bulletin: Google’s Robot Breakthrough & Wendy’s Spanish-Speaking AI Drive-Thru Bot

It’s mid-summer, and while most of Europe (and a good chunk of the American workforce) is taking some well-deserved time off, the AI news hasn’t slowed down one bit.

This week’s Food AI bulletin has updates on a new Google breakthrough in contextual understanding of our homes (including our kitchens), how Gemini is powering new features in Google’s smart home products, Wendy’s release of a Spanish-language edition of its AI drive-thru assistant, Amazon’s AI refresh of Just Walk Out, NOURISH, a new AI-powered digital tool to help those living in food deserts make better food choices, a multiyear Danone and Microsoft deal to upskill employees on AI tools, and a survey showing that South Korean students prefer AI-developed healthy food options over more conventionally developed products.

Here we go:

Google’s New Robot Breakthrough Could Make It Easier to Train Your Robot Butler to Cook or Grab You a Cola

In the past, robots struggled to perform useful tasks autonomously, in part because they didn’t generally understand what they were seeing or how it related to a specific person’s living situation.

That’s begun to change in recent years, in part because of significant advances in robot navigation: new techniques such as Object Goal Navigation (ObjNav) and Vision Language Navigation (VLN) have allowed robots to understand open-ended commands such as “go to the kitchen.”

More recently, researchers have created systems called Multimodal Instruction Navigation (MIN), which enable robots to understand verbal and visual instructions simultaneously. For example, a person can show the robot a toothbrush and ask where it should be returned, and the robot draws on both the spoken request and the visual context.

Now, Google researchers have taken things a step further with what they call Mobility VLA, a hierarchical Vision-Language-Action (VLA) navigation policy that “combines the environment understanding and common sense reasoning power of long-context VLMs and a robust low-level navigation policy based on topological graphs.”

In other words, showing the robot an exploration video of a given environment allows it to understand how to navigate that area. According to the researchers, using a walkthrough video with Mobility VLA, they were able to give the robot previously infeasible instructions such as “I want to store something out of sight from the public eye. Where should I go?” They also write that they achieved significant advances in how easily users can interact with the robot, giving the example of a user recording a video walkthrough of a home environment with a smartphone and then asking, “Where did I leave my coaster?”

One of the biggest challenges in making robots useful in a food context is that cooking is complex, requiring multiple steps and contextual understanding of a specific cooking space. One could imagine using this type of training framework to enable more complex and useful cooking robots, or even personal butlers that can actually do something like fetch you a cold beverage.
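To make the two-layer idea concrete, here is a minimal sketch of the “topological graph” half of a Mobility VLA-style policy: frames from the walkthrough video become graph nodes, edges connect frames the robot can move between, and a goal frame is found by matching the user’s request. Note that `pick_goal_frame` is a toy stand-in for the long-context VLM (Google’s actual model reasons over raw video and language); the frame labels and graph here are invented for illustration.

```python
from collections import deque

def pick_goal_frame(request: str, frame_labels: dict[int, str]) -> int:
    """Toy stand-in for the VLM: match the spoken request to a labeled frame."""
    for frame_id, label in frame_labels.items():
        if label in request.lower():
            return frame_id
    raise ValueError("no frame matches the request")

def plan_route(graph: dict[int, list[int]], start: int, goal: int) -> list[int]:
    """Low-level policy: breadth-first search over the topological graph of frames."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return []

# Frames extracted from an (imaginary) walkthrough video, plus adjacency.
frames = {0: "hallway", 1: "kitchen", 2: "living room", 3: "pantry"}
graph = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}

goal = pick_goal_frame("go to the pantry", frames)
print(plan_route(graph, start=0, goal=goal))  # [0, 1, 3]
```

The appeal of the hierarchy is that the expensive VLM only has to answer “which frame is the goal?” once, while the cheap graph search handles the actual motion.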

You can watch a robot using this new Gemini-enabled navigation framework in the video below:

“Your Food Delivery Is Here”: Google Bringing Gemini Intelligence to Google Home

Speaking of Google, this week the company announced a new set of Gemini-powered features coming to its suite of smart home products. The new features were revealed as part of an announcement about a new version of the company’s smart thermostat and its TV streaming device. According to the company, it is adding Gemini-powered capabilities across a range of products, including its Nest security cameras and its smart voice assistant, Google Home.

By underpinning its Nest camera products with Gemini, the company says its Nest Cams will go from “understanding a narrow set of specific things (i.e., motion, people, packages, etc.) to being able to more broadly understand what it sees and hears, and then surface what’s most important.” Google says that this will mean that you can ask your Google Home app questions like “Did I leave my bikes in the driveway?” and “Is my food delivery at the front door?”

During a presentation to The Verge, Google Home head of product Anish Kattukaran showed an example video of a grocery delivery driver, accompanied by an alert powered by Gemini:

“A young person in casual clothing, standing next to a parked black SUV. They are carrying grocery bags. The car is partially in the garage and the area appears peaceful.”

After what’s been a somewhat moribund period of feature-set innovation for smart homes over the past couple of years, both Google and Amazon are now tapping into generative AI to create new capabilities that I’m actually looking forward to. By empowering their existing smart home products like cameras and their smart home assistants with generative AI models, we are finally starting to see leaps in useful functionality that are bringing the smart home closer to the futuristic promise we’ve been imagining for the last decade.

Wendy’s Pilots Spanish-Language Drive-Thru AI Voice Assistant

This week, Wendy’s showed off new Spanish-language capabilities for its Fresh AI drive-thru voice assistant, according to an announcement sent to The Spoon. The new assistant, which can be seen in the Wendy’s-provided b-roll below, features a conversant AI bot that seamlessly switches to Spanish, clarifies the order, and upsells the meal.

Wendy's Demos Fresh AI Drive-Thru in Espanol

According to Wendy’s, the company launched its Fresh AI in December of last year and has expanded it to 28 locations across two states.

This news comes just a week after Yum! Brands announced plans to expand Voice AI technology to hundreds of Taco Bell drive-thrus in the U.S. by the end of 2024, with future global implementation across KFC, Pizza Hut, and Taco Bell. The technology is currently in over 100 Taco Bell locations, and the company believes it will enhance operations, improve order accuracy, and reduce wait times.

Amazon Previews New Generative AI-Powered Just Walk Out

Last week, Amazon gave a sneak peek at the new AI model that powers its Just Walk Out platform.

In a post written by Jon Jenkins, the VP of Just Walk Out (and, as Spoon readers may remember, the former founder of Meld and head of engineering for the Hestan Cue), we get a peek at the new AI model from Amazon. Jenkins writes that the new “multi-modal foundation model for physical stores is a significant advancement in the evolution of checkout-free shopping.” He says the new model will increase the accuracy of Just Walk Out technology “even in complex shopping scenarios with variables such as camera obstructions, lighting conditions, and the behavior of other shoppers while allowing us to simplify the system.”

The new system differs from the previous system in that it analyzes data from multiple sources—cameras, weight sensors, and other data—simultaneously rather than sequentially. It also uses “continuous self-learning and transformer technology, a type of neural network architecture that transforms inputs (sensor data, in the case of Just Walk Out) into outputs (receipts for checkout-free shopping).”
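The simultaneous-versus-sequential distinction is easy to picture with a toy example. The sketch below is not Amazon’s model; it just illustrates why fusing camera and weight-sensor evidence at the same time is more robust than a pipeline where one stage can veto the rest. The confidence scores, threshold, and equal weighting are all invented for illustration.

```python
# Toy illustration of sequential vs. simultaneous sensor fusion in a
# checkout-free store. Each sensor reports a confidence (0-1) that a
# shopper picked up an item.

def sequential_decision(camera: float, weight: float, threshold: float = 0.5) -> bool:
    """Pipeline fusion: stage 1 (camera) must pass before stage 2 (weight) is consulted."""
    if camera < threshold:  # an occluded camera kills the detection outright
        return False
    return weight >= threshold

def joint_decision(camera: float, weight: float, threshold: float = 0.5) -> bool:
    """Joint fusion: all signals contribute at once, so strong weight-sensor
    evidence can compensate for a partially blocked camera (and vice versa)."""
    fused = 0.5 * camera + 0.5 * weight
    return fused >= threshold

# Camera partly obstructed (0.3) but the shelf clearly lost weight (0.9):
print(sequential_decision(0.3, 0.9))  # False -- vetoed by the camera stage
print(joint_decision(0.3, 0.9))       # True  -- 0.6 fused confidence
```

A transformer over all sensor streams is, loosely, a learned version of `joint_decision` with far more inputs and context-dependent weights.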

Academic Researchers Creating AI Tool to Help Americans Living in Food Deserts Access Better Food Options

A team of researchers led by the University of Kansas and the University of California-San Francisco is tackling the issue of food deserts in the U.S. with an AI-powered digital tool called the NOURISH platform. According to an announcement released this week about the initiative, the group is supported by a $5 million grant from the National Science Foundation’s Convergence Accelerator program and the U.S. Department of Agriculture. The project aims to provide fresh and nutritious food options to the estimated 24 million Americans living in areas with limited access to healthy food. The platform will utilize geospatial analyses and AI to identify optimal locations for new fresh food businesses, linking entrepreneurs with local providers and creating dynamic, interactive maps accessible via mobile devices in multiple languages.
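As a hypothetical sketch of the kind of geospatial scoring a platform like NOURISH might run under the hood, consider ranking candidate sites for a new fresh-food business by how much travel distance they would save nearby households. The coordinates and the scoring rule below are invented for illustration, not drawn from the project.

```python
from math import dist

def food_access_gap(candidate, households, stores):
    """Average distance saved per household if a fresh-food store opened at `candidate`."""
    saved = 0.0
    for home in households:
        current = min(dist(home, s) for s in stores)    # nearest existing store
        improved = min(current, dist(home, candidate))  # nearest after opening
        saved += current - improved
    return saved / len(households)

households = [(0, 0), (4, 0)]
stores = [(10, 0)]  # the only fresh-food store today
print(food_access_gap((2, 0), households, stores))  # 6.0
```

Running this over a grid of candidate sites and picking the highest scores is the simplest version of “identify optimal locations”; the real platform layers demographic and supply-chain data on top.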

Danone Announces Multiyear Partnership with Microsoft for AI

An interesting deal focused on bringing AI training to a large CPG brand’s workforce:

Danone has announced a multi-year collaboration with Microsoft to integrate artificial intelligence (AI) across its operations, including creating a ‘Danone Microsoft AI Academy.’ This initiative aims to upskill and reskill around 100,000 Danone employees, building on Danone’s existing ‘DanSkills’ program. Through the AI Academy, Danone plans to enhance AI literacy and expertise throughout the organization, offering tailored learning opportunities to ensure comprehensive training coverage. The partnership will initially focus on developing an AI-enabled supply chain to improve operational efficiency through predictive forecasting and real-time adjustments. Juergen Esser, Danone’s Deputy CEO, emphasized that collaboration is not just about technology but also about fostering a culture of continuous learning and innovation. Microsoft’s Hayete Gallot highlighted the significance of AI in transforming Danone’s operations and the broader industry, aiming to empower Danone’s workforce to thrive in an AI-driven economy.

My main critique of a deal like this is that the employee training and curriculum come from an AI platform provider with skin in the game. As someone who long ago weaned myself off most of Microsoft’s software products, I’d hate to sit through a curriculum that is largely training on Microsoft AI tools rather than broader AI training.

It is a good deal for Microsoft, with a smart focus on upskilling by Danone. Let’s hope Microsoft’s training brings a broad-based AI tool belt to the Danone workforce that is not entirely walled-gardened within Microsoft’s products.

Survey: Korean Students Prefer AI-Driven Health Foods

While some Americans are becoming more concerned about AI’s impact on our lives, it appears that at least some South Korean students are embracing AI in the development of healthier food options.

According to a recent survey conducted by Korea University Business School, young South Koreans are more likely to trust and purchase healthy functional foods (HFF) developed using artificial intelligence (AI) than those created through traditional methods. The study involved 300 participants and revealed that AI-developed HFFs scored higher in trustworthiness, perceived expertise, positive attitude, and purchase intention. The AI model, NaturaPredicta™, uses natural language processing to analyze botanical ingredients, significantly reducing the time and cost required for new product development. However, researchers noted the potential bias due to the relatively young demographic of the participants and suggested broader studies for more representative results.

July 11, 2024

Food AI Weekly Bulletin: Is AI-Washing a Problem For Food Tech?

Welcome to this week’s edition of the Food AI Weekly Bulletin, our weekly wrapup that highlights important happenings at the intersection of AI and food. If you’d like to sign up to get this bulletin delivered to your inbox, you can do so here.

Is AI-Washing a Problem for Food Tech? Some predict one-third of startups will feature AI as part of their core product by the end of this year. Is stretching the truth about how truly AI-powered your product is a problem for food tech?

Gatorade’s AI Hydration Coach. Is Gatorade’s AI-powered hydration coach a marketing trick or another sign that wellness copilots are beginning to pop up everywhere?

Researchers Build RhizoNet, Which Uses Next-Gen Neural Network to Analyze Plant Roots. A new tool called RhizoNet could provide a big leap in understanding root growth.

Number of Retailers Using AI Doubles In The Past Year. AI adoption is skyrocketing, and retail looks to be one of the fastest growing sectors.

CaperCart Continues Roll-Out of AI-Enabled Shopping Carts. Speaking of AI at retail, Instacart is hoping to do its part to spread the technology through its computer-vision enabled smart shopping carts.

Mineral AI Winds Down. Somewhat surprisingly, Mineral AI announced last week they would wind down and distribute their technology through license to partners such as Driscoll’s.

Mars Using AI to Develop 50 Product Concepts a Day. Big CPG is getting the hang of this generative AI thing.

eGrowcery Launches 70 Thousand AI-Generated Recipes. Are recipes set to become almost all AI-generated?

Is AI-Washing a Problem for Food Tech?

Anytime a new technology captures the public zeitgeist, brands invariably jump on the bandwagon. After all, that’s what brands do, and it’s incumbent on any good marketer to capitalize on any buzzy association possible.

Where it could be problematic is when a company claims to have a key competitive differentiator through a given technology and they’re stretching the truth. Being an early adopter of, say, cloud computing, Web3, or, yes, AI is worth mentioning or making the center of a marketing campaign (even if it can be eyeroll-inducing), but when it’s on your pitch deck and you are exaggerating just how core it is to your product, it can be potentially deceptive, at least according to the SEC.

And now the regulatory body has started to take a stand against what they’re calling ‘AI-washing’. While the focus of the SEC is on claims by investment firms claiming to deliver investor value through AI-powered decision-making, it’s clear they are policing the broader use of claims by companies looking to benefit by association.

According to European investment fund OpenOcean, by the end of the year, one-third of startups will feature AI in their pitch decks. My guess is many will have legitimate claims, but as we’ve seen over the past year in food tech, sometimes such claims seem a stretch. It’s a logical move as a founder, particularly in a market where raising capital has become exceedingly difficult. But as everyone jumps on the AI bandwagon, it’s worth it for startups to be cautious about their claims, and those looking to invest or partner with these companies should do their due diligence to see if there’s real substance behind the pitch deck.

Gatorade’s AI Hydration Coach

A couple of weeks ago, at Cannes Lions, the advertising industry’s biggest international confab and awards gala, Gatorade debuted its generative AI-powered app that acts as a hydration coach.

From Marketing Dive:

Gatorade’s AI Hydration coach app applied AI “to educate users about the best ways to stay hydrated through an assistant that draws on decades of historical data from the sport beverage brand’s research institute. The concept leans into the idea that AI has the power to democratize services that were once exclusive, giving everyday consumers the type of expert guidance usually reserved for elite athletes.”

It’s an interesting concept – after all, the future very well may be filled with AI-powered co-pilots, assistants that sit on our proverbial shoulders whispering in our ears to coach us through life – but I’m not sure how seriously consumers will take brand-activated assistants. Millennials and Gen Z are all for tooling up when it comes to their health, but they have great authenticity-sniffing capabilities, and brands aren’t always the most trusted advisors, in part because it’s easy to question their motivation.

Researchers Build RhizoNet, Which Uses Next-Gen Neural Network to Analyze Plant Roots

According to a story in Interesting Engineering, the Lawrence Berkeley National Laboratory’s Applied Mathematics and Computational Research (AMCR) and Environmental Genomics and Systems Biology (EGSB) Divisions have developed a new neural network tool called RhizoNet to analyze plant roots.

From IE:

It paved the way for scientists to accurately measure root growth and biomass, making it much easier and faster to study plant roots in the lab. 

In simple terms, RhizoNet automatically interprets images of plant roots which makes it more effortless and faster to comprehend the workings of root growth and how they respond to different conditions.

According to the report, the system will enable much faster and more accurate analysis of root biomass, especially compared to more manual (i.e. human-driven) analysis of plant roots.

The researchers noted they tested the system’s effectiveness by analyzing Brachypodium distachyon, a grass species, as they subjected it to nutrient deprivation over five weeks.

“We’ve made a lot of progress in reducing the manual work involved in plant cultivation experiments with the EcoBOT, and now RhizoNet is reducing the manual work involved in analyzing the data generated,” stated Peter Andeer, a research scientist in EGSB and a lead developer of EcoBOT.

“This increases our throughput and moves us toward the goal of self-driving labs,” he added.
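RhizoNet itself is a neural network well beyond a snippet, but the measurement step any such segmentation tool feeds into is simple: once each image pixel is classified as root or background, biomass can be proxied by root-pixel area and growth tracked between time points. A toy version of that last step, with invented masks:

```python
# Given a binary mask (1 = root pixel, 0 = background) predicted by a
# segmentation model, estimate root area and growth between time points.
# The tiny masks here are invented; real model outputs are full images.

def root_area(mask: list[list[int]]) -> int:
    """Count root pixels in a binary segmentation mask."""
    return sum(sum(row) for row in mask)

week1 = [[0, 1, 0],
         [0, 1, 0],
         [0, 1, 0]]
week5 = [[0, 1, 0],
         [1, 1, 1],
         [0, 1, 1]]

print(root_area(week5) - root_area(week1))  # 3
```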

Number of Retailers Using AI Doubles In The Past Year

According to the Food Industry Association’s just-published annual study on the state of food retail, The Food Retailing Industry Speaks 2024, retailers are increasingly turning to AI to optimize parts of their business. The report, which surveyed large and small retail chains, found that 41% of respondents are using AI in parts of their business, double the share from just a year ago.

A doubling of AI usage at retail is not surprising given the proliferation of various tools, from supply-chain optimization to in-store computer vision systems for real-time inventory and theft reduction analysis. My guess is that by next year, most retailers will be deploying AI in some parts of their operations.

CaperCart Continues Roll-Out of AI-Enabled Shopping Carts

Instacart has added another round of stores to the list of those using its CaperCart smart shopping carts. The company announced that Price Chopper and McKeever’s Market & Eatery have each added the “AI-powered grocery carts” to store locations in Missouri. This comes a week after Wakefern announced it was increasing the number of storefronts using the CaperCart.

As we’ve noted for the past couple years, the company is increasingly looking to diversify from its personal shopper business. It has been focused on creating technology platforms for the non-Amazon grocery retailers of the world to transform themselves in an increasingly digitized grocery shopping industry.

Mineral AI Winds Down

Whenever a company graduates from Google’s moonshot factory X to become an operating company under parent company Alphabet, most assume said company will be a success. But the reality is these graduates, for whatever reason, sometimes just don’t get the traction required as an independent company and eventually wind down.

The latest example of this is Mineral AI. Mineral, a ‘computational agriculture’ company that graduated from X with significant fanfare last year, announced last week it was winding down. In a post, Mineral CEO Elliott Grant says the company’s technology will live on through a license to Driscoll’s and other “leading agribusinesses where they can have maximum impact.”

It’s hard to decipher the variables around decision-making at a giant like Alphabet, but it’s been clear since the beginning of the year that the company has been paring back its moonshot initiatives, both through layoffs and in how much capital it expends to fund them. A decade ago, X was home to a bunch of out-there concepts, such as balloon-powered broadband, but as cheap capital has dried up and the company funnels resources toward keeping up in the LLM space race, those days look to be coming to a close.

Mars Using AI to Develop 50 Product Concepts a Day

While it’s easy to caricature big food brands as giant behemoths slow to adapt to new innovation and technology, it’s becoming increasingly clear that many are quickly building internal capabilities leveraging AI to accelerate core product development processes.

The latest indication is the news that Mars has developed its own generative AI-powered tool called Brahma to develop up to fifty product concepts a day. According to a report in Consumer Goods Technology, Brahma “uses data from consumer insights studies the CPG conducted last year involving 80,000 consumers and 800,000 consumption moments across 11 countries.” 

eGrowcery Launches 70 Thousand AI-Generated Recipes

eGrowcery, a white-label grocery e-commerce platform company, announced today it has launched an AI-powered personalized recipe offering for its SaaS customers. The company said the new feature tailors recipe suggestions based on regional preferences and store inventory, and it hopes to boost shopper satisfaction, sales, and market share for retailers with a suite of over 70 thousand personalized recipes.

The move by eGrowcery is an indication that shoppable recipes are an obvious early candidate of a category that will be consumed by generative AI. It will be interesting to watch how much consumers take to these offerings, given that so many home cooks take inspiration nowadays from other sources (such as social media) to get recipe and meal ideas. However, we’ve begun to hear that the influencer-recipe space is struggling to keep up with the rapidly changing landscape resulting from Google’s push towards using AI-powered summaries rather than link-outs to other sites.

In other words, AI is beginning to sink its hooks ever deeper into the food planning and inspiration space from seemingly every angle.

Our Favorite AI-Generated Food Images of The Week

Over on our Spoon community Slack, some of our community members dropped AI-generated artwork.

First, this tasty-looking food from Min Fan. You can see all the images from Min on our Slack.

And we also liked this futuristic food creation facility from Emma Forman:

If you would like your AI artwork featured on The Spoon, drop it into our Spoon Slack.

June 27, 2024

The Food AI Weekly Bulletin: Will AI Fakery Make Restaurant Reviews a Thing of the Past?

Welcome to the Food AI Weekly Bulletin, our new weekly wrapup that highlights important happenings at the intersection of AI and food.

Nowadays, there’s a lot of noise around how AI is changing food, so we decided to create a weekly brief to bring you what’s important, cut through all the noise, and deliver actionable insights. If you’d like to sign up for the Food AI Weekly, you can do so here.

Highlights

Is AI Ruining Restaurant Reviews? A new study shows people cannot distinguish between real and AI-generated reviews.

AI Food Art Is Everywhere (And It’s Not Great for Freelancers) Generative AI tools like Midjourney and DALL-E are revolutionizing food imagery, but what does this mean for freelancers and creatives who traditionally provided these services?

First, Al Michaels. Next, How About an AI-powered Anthony Bourdain? The news of Al Michaels allowing AI to replicate his voice has almost everyone freaking out, but what does it mean for the future of AI-generated avatars of famous food personalities?

Swallowing A Robot. Endiatx has developed the Pillbot, a tiny robot that can be swallowed to explore the gastrointestinal tract, potentially revolutionizing diagnostics and personalized nutrition.

Food & Nutrition Centric LLMs Could Be an Investible Opportunity. VCs see potential in industry-specific AI models, particularly in the domains of biology, chemistry, and materials, as these specialized LLMs could offer unique investment opportunities.

Brightseed’s Forager AI Finds Novel Bioactives. Cranberry giant Ocean Spray teams up with Brightseed to uncover new bioactive compounds in cranberries.

Our Favorite AI Food Art of the Week. We’ll be making this a regular feature. If you’d like your art featured, submit it on our Spoon Slack.

We’re going to be exploring all of this at our Food AI Summit in September. Join us, won’t you? Super Early Bird pricing expires at the end of this month.

Is AI Ruining Restaurant Reviews?

What happens when humans can’t tell real restaurant reviews from fake ones? The restaurant industry has begun asking itself this question as a tidal wave of fake AI reviews floods online sites.

According to Yale professor Balazs Kovacs, humans are already losing their ability to discern the real from the fake. Kovacs recently unveiled the results of a study demonstrating AI’s ability to mimic human-written restaurant reviews convincingly. For his test, Kovacs fed Yelp reviews into GPT-4 and then asked a panel of human test subjects whether they could tell the difference. At first, the results generated by GPT-4 were too perfect, so Kovacs then prompted GPT-4 to rough up its output with the kinds of typos and grammatical slips found in real reviews.

While this raises obvious concerns about the authenticity of online reviews and the trustworthiness of consumer-generated content, it shouldn’t be surprising. Figure 01’s human-like speech tics were creepy, but mostly because of how human its awkward conversation seemed. With typos and sub-par grammar—in other words, what we see every day on social media—it makes sense that AI-generated reviews seemed more human.

One potential workaround to the problem of AI-generated reviews is using AI to detect and flag fake content, but early tests show that even AI can’t tell what is real and what is fake. Another suggestion is to require reviewers to have purchased a product before reviewing it, similar to Amazon’s labels for verified purchasers, and apply the same idea to restaurants. My guess is that this will be the best (and potentially last) line of defense against the coming tidal wave of AI reviews.
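The verified-purchaser defense is mechanically simple, which is part of its appeal. Here is a minimal sketch, assuming a restaurant (or review platform) can match reviewers against its own transaction log; the data shapes below are invented for illustration.

```python
# Keep only reviews whose (user, restaurant) pair matches a recorded purchase.

def verified_reviews(reviews: list[dict], transactions: set[tuple[str, str]]) -> list[dict]:
    """Filter out reviews from users with no matching transaction."""
    return [r for r in reviews
            if (r["user"], r["restaurant"]) in transactions]

reviews = [
    {"user": "ana", "restaurant": "La Cocina", "text": "Great tacos!"},
    {"user": "bot42", "restaurant": "La Cocina", "text": "Amazing!!!"},
]
transactions = {("ana", "La Cocina")}

print([r["user"] for r in verified_reviews(reviews, transactions)])  # ['ana']
```

The hard part, of course, isn’t the filter but getting restaurants and platforms to share transaction data in the first place.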

AI Food Art Is Everywhere (And It’s Not Great for Freelancers)

One early application of generative AI, as it applies to food, is the creation of images. Midjourney, DALL-E, and other tools allow us to create instant realistic images with a few sentences. As a result, we’ve seen CPGs, food tech software companies, and restaurant tech startups jump on the generative art trend.

While that isn’t necessarily good news for actual artists (this WSJ article is a must-read about the impact of AI on freelancers and creatives), these tools have democratized professional-ish photos and art in the same way Canva made professional-style graphics and presentations available to anyone.

One company that’s benefitted significantly is Innit. The company, which in its early life hired celebrity chefs like Tyler Florence and spent tens of thousands of dollars on photo shoots for a single recipe, now whips up recipe images instantly with generative AI for its Innit FoodLM.

While most Internet-savvy marketing types at food brands, restaurants, and other food-related businesses have at least learned to dabble in generative AI prompt engineering, that hasn’t stopped some from trying to build a business out of it. Lunchbox created an AI food image generator built on DALL-E over a year ago (the website has since gone dark), and just this week I got pitched on a new AI-powered food image generator that wants to charge for its service, which is essentially a user interface for managing prompts to an underlying model (most likely Midjourney or GPT-4). There’s likely a short lifespan for these types of services; my guess is most marketing folks will learn to prompt popular image generators like Midjourney directly.

First, Al Michaels. Next, an AI-Powered Anthony Bourdain?

The Internet freaked out yesterday when news broke that Al Michaels has agreed to let an AI copy his voice, and rightly so. First off, it’s creepy. Second, this is exactly the kind of thing the Hollywood writers’ and actors’ guilds struck over for so long, so I’m guessing the Hollywood creative community isn’t exactly happy with Al. And finally, it goes to show that if you throw enough money at us humans, the temptation to cave to the bots will be too much.

My guess is we’ll eventually see AI-generated avatars of famous chefs. All it would take is for the estate of Julia Child or Anthony Bourdain to get a good enough offer and it won’t be long before we hear (and maybe see) their avatars.

Swallowing A Robot

According to VentureBeat, Endiatx has developed a tiny robot that can traverse your body, equipped with cameras, sensors, and wireless communication capabilities. The robot, called Pillbot, allows doctors to examine the gastrointestinal tract and can be used for both diagnostic and therapeutic purposes.

The company’s CEO, Torrey Smith, has swallowed 43 of these Pillbots, including one live on stage, which can be seen here. If this technology actually works (and those pills can be made smaller, because, holy cow, that’s literally a big pill to swallow), it’s not hard to imagine these being used to dial in and optimize personalized nutrition regimens.

Food & Nutrition Centric LLMs Could Be an Investible Opportunity

Business Insider asked some VCs what bores them about AI and what excites them. Not surprisingly, they talked a lot about how hard it will be for startups to break through in foundational large language models, where big players like OpenAI and Google dominate. And like any good VCs looking at an early market, they talked up picks and shovels.

Even as investors shift their focus to promising AI infrastructure startups, there may still be some opportunities for new LLM startups to win, especially when they’re trained for specific industries, explained Kahini Shah, a principal at Obvious Ventures.

“We’re excited about what we call Generative Science at Obvious, i.e, large, multi-modal models trained in domains such as biology, chemistry, materials,” she said.

Brightseed’s Forager AI Finds Novel Bioactives

Brightseed, a company that uses AI to accelerate bioactive and food compound discovery, announced that it has (in partnership with Ocean Spray) used its Forager AI to uncover novel bioactive compounds in cranberries. Forager identified multiple bioactives, such as terpenes, which Brightseed believes hold significant potential for human health. These findings, based on in silico analyses, will undergo further clinical validation and will be presented at the American Society of Nutrition’s NUTRITION 2024 conference.

This faster discovery of health-positive compounds is another example of the AI acceleration effect I wrote about yesterday. Things are beginning to move exponentially faster at every stage of the food value chain, which over time means the basic rules underpinning what we do (such as food product development) give way to entirely new rules rewritten in large part by AI.

Our Favorite AI Food Image of the Week: Hungry Monkey

We like looking at AI-generated food art and figured we’d show you some of our favorites on a weekly basis. 

If you’d like to submit your AI-created food art (or one you’ve found that you think we should feature), drop the image and the source/attribution (preferably a link) on our Spoon Slack.

© 2016–2025 The Spoon. All rights reserved.