
The Spoon

Daily news and analysis about the food tech revolution


machine learning

August 15, 2023

Strella Believes Its Machine Learning Tech Will Help Deliver The Perfectly Ripened Banana

Did you know that there’s a job in the banana industry called a ripener?

It makes sense, right? After all, anyone who eats bananas knows the time it takes to go from rock-hard green banana to brown mushy mess can be as short as a week. This means the banana industry has to work hard to ensure bananas ripen at the right time so they are peaking in bright, beautiful yellow by the time they show up on grocery store shelves.

Like many jobs, the ripener role relies heavily on judgment. Not that they don’t use some modern tools when monitoring and managing the ripening cycle of the banana, but from the looks of it, the ripener job seems ripe (sorry) for a Moneyball-style analytics and technology revolution.

Enter Strella. The company, which has gained traction in the apple industry for its IoT monitoring technology over the past few years, has gone bananas. According to CEO Katherine Sizov, the company’s new AI-powered model helps Strella (and those working as ripeners) better decipher the signals the bananas send.

“We’ve built a machine learning model that helps us get bananas from that green to that perfectly yellow color every single time,” Sizov told The Spoon. “And the way that we do that is we measure what the bananas are telling us.”

According to Sizov, the hardware they use for banana monitoring is the same as for apples. The difference is software.

“The hardware is the same, but the algorithms are different,” Sizov said.

Sizov says that whether it’s apples or pears (fruit with longer ripening cycles) or avocados or bananas (fruit with shorter ripening cycles), the key ripening signals are the ethylene and CO2 the produce emits. The Strella hardware module has eight different sensors, measuring ethylene, CO2, and other environmental factors such as heat and moisture.

And just as with apples, the Strella technology can help determine what exactly is needed to slow down or accelerate the ripening cycle of a banana. The only difference is that things move much more quickly with bananas or avocados, which is why there’s a job dedicated explicitly to managing the ripening process.

“Unlike bananas, apples are picked perfect off the tree,” Sizov said. “And they can last a whole year in gigantic storage rooms.”

With bananas or avocados, the ripening process is much more closely managed. They are picked before they are ripe and then stored cold to slow the ripening until they get near the point of consumption. From there, they go into ripening rooms, where the ripener introduces ethylene gas and CO2 and adjusts the temperature to kick the ripening process into gear. And now, according to Sizov, Strella’s new banana and avocado machine-learning algorithms can help determine precisely how much of each is needed to adjust the ripening cycle to get the desired output.
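In code, the kind of system Sizov describes maps sensor readings to a ripening estimate and a room adjustment. The following is a toy sketch only: the feature names, coefficients, and thresholds are invented for illustration and are not Strella’s actual algorithms.

```python
# Toy ripening model: all coefficients and thresholds are invented
# for illustration; this is not Strella's algorithm.
from dataclasses import dataclass

@dataclass
class Reading:
    ethylene_ppm: float   # ethylene emitted by the fruit
    co2_pct: float        # CO2 concentration in the ripening room
    temp_c: float         # room temperature

def days_to_peak(r: Reading) -> float:
    """Toy linear estimate: more ethylene and heat -> faster ripening."""
    base = 7.0                      # green-banana baseline, in days
    base -= 1.5 * r.ethylene_ppm    # ethylene accelerates ripening
    base -= 0.2 * (r.temp_c - 14)   # warmth accelerates it too
    base += 0.3 * r.co2_pct         # CO2 buildup slows it slightly
    return max(base, 0.0)

def action(r: Reading, target_days: float) -> str:
    """Recommend a ripening-room adjustment to hit a delivery window."""
    est = days_to_peak(r)
    if est > target_days + 0.5:
        return "add ethylene / raise temperature"
    if est < target_days - 0.5:
        return "cool room / ventilate CO2"
    return "hold current conditions"
```

In practice a model like Strella’s would be trained on historical sensor traces rather than hand-fit coefficients, but the control loop — estimate, compare to the delivery window, adjust — is the same shape.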

Should ripeners be worried about technology taking their jobs away? Sizov doesn’t think so.

“When people are very good at their jobs, they’re always looking for tools to do better,” Sizov said. “Ripeners have a ton on their plate, they’re working 12 to 14-hour shifts, so I think they’re always looking for ways to get a little more sleep. Our tool is one way to do that.”

According to Sizov, Strella has worked with suppliers covering 85% of the US apple and pear market, and she estimates the company has saved 20 million pounds of apples and pears from going to waste. Now, she hopes they can replicate that success in bananas and avocados.

“We’re growing pretty quickly, and we’re excited to get into bananas and avocados after having had our foray into apples for five years now.”

If you’d like to hear Katherine discuss how AI can perfect the ripening of bananas, she will be speaking at the Spoon’s Food AI Summit on October 25th in Alameda, CA! Get your early bird tickets today!

June 17, 2021

InnerPlant Raises $5.65M to Turn Plants Into “Living Sensors” and Mitigate Crop Loss

Agtech company InnerPlant, which is changing plant DNA to create “living sensors” that mitigate crop loss, has raised $5.65 million in pre-seed and seed funding, according to an official announcement sent to The Spoon. The round was led by MS&AD Ventures, the investment arm of Japan’s MS&AD Insurance Group. Bee Partners, Up West, and TAU Ventures also participated in the round. 

InnerPlant created its technology platform to spot threats to plant growth — pests, nutrient deficiencies, water stress, etc. — quicker than is possible via traditional farming methods. To do that, the company recodes plant DNA to include a fluorescent safe-for-human-consumption protein that lights up the leaves of a plant when there is a problem. Essentially, it is turning the entire plant into a living signal that can “talk” to the farmer when there is a problem. Different colored lights indicate different issues.  

Since these signals are invisible to the human eye, farmers can use InnerPlant’s augmented reality system to photograph their fields and view potential problems via an iPhone or iPad. The signals can also be detected via a drone flying overhead or even a satellite.

This handy explainer video goes into more detail:

According to the company, it only takes tens of these sensor plants to protect an entire field. Once the signal plants send off a distress signal, a farmer can address the impacted area before the problem spreads to the whole crop. For example, if a harmful fungus breaks out in one area of a field, a farmer can get rid of only the impacted plants instead of spraying the whole field with fungicide. Think of it as on-demand crop protection.

InnerPlant says its entire concept is merely piggy-backing off the natural signals plants send to one another when they are in distress. Recoding the DNA to include the protein is “amplifying” these natural signals, so that farmers can spot problems faster. It also frees them from what InnerPlant founder and CEO Shely Aronov calls “the pesticide treadmill,” which is our increasing use of chemicals and pesticides that harm waterways, impact microbial diversity in soil, and are linked to some cancers.

It remains to be seen how consumers will feel about eating produce with recoded DNA, or how that message will get effectively communicated. And since InnerPlant is a relatively new company (it released its first product, the InnerTomato, in 2020), it is too soon to have much data on how effective these living plant sensors are compared to other modes of crop protection. 

The technology does, however, show us yet-another possibility for improving crop yields and mitigating loss in the food system at a time when the world’s population is growing. 

InnerPlant says it is currently working on a new product, InnerSoy. Funds from the seed and pre-seed rounds will go towards developing other products in the future.

May 11, 2021

Apeel Acquires ImpactVision to Fight Food Waste With Hyperspectral Imaging

Apeel officially announced today it has acquired machine learning company ImpactVision for an undisclosed amount. The plan is to integrate ImpactVision’s hyperspectral imaging technology into Apeel’s application systems at produce packing houses and distribution centers in North America, South America, and Europe. This is Apeel’s first major acquisition, according to a press release sent to The Spoon. To date, ImpactVision has raised $2.8 million.

Apeel’s existing application systems involve coating different types of produce with what the company calls its “shelf-life extension technology” — an edible, plant-based coating that gets applied to produce after harvest. The coatings extend the shelf life of fruits and vegetables by keeping moisture trapped inside the produce and oxygen out. In doing so, the rate of decay significantly slows. 

With the ImpactVision acquisition, Apeel will be able to add further analysis of the produce to its operations. ImpactVision’s tech collects and processes hyperspectral images of each individual piece of produce. Through machine learning models, the system can identify cues in the produce around its freshness, degree of maturity, and phytonutrient content.

Based on those elements, suppliers and distributors can then decide where each piece of produce should go. Produce with a shorter ripening window can ship to retailers geographically closer to the supplier, for instance, to avoid excess food spoilage. By way of example, today’s news announcement gave the following scenario: “If a produce supplier sees that one avocado will ripen tomorrow while another will ripen in 4 days, they know that one has more time to travel and should be sent to the retailer that is further away.”
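The avocado scenario above amounts to a routing rule: match each item’s predicted ripening window to a retailer’s transit time. A minimal sketch, with invented item and retailer names (the ripening predictions themselves would come from the hyperspectral model):

```python
# Illustrative routing sketch: send fruit with long ripening windows
# to distant retailers; all names and numbers are invented examples.

def route(items, retailers):
    """Send each item to the farthest retailer it can still reach unripe.

    items: list of (item_id, days_until_ripe)
    retailers: list of (name, transit_days), in any order
    """
    # Farthest (longest transit) first, so long-window fruit travels far.
    by_distance = sorted(retailers, key=lambda r: r[1], reverse=True)
    plan = {}
    for item_id, window in sorted(items, key=lambda i: i[1], reverse=True):
        for name, transit in by_distance:
            if transit <= window:          # arrives before it ripens
                plan[item_id] = name
                break
        else:
            plan[item_id] = "local"        # too ripe to ship: sell nearby
    return plan
```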

Writing in a blog post today, Apeel CEO James Rogers noted that through the acquisition, “Apeel will now be able to integrate hyperspectral imaging technology into our supply chains, enabling us to provide new insights to our customers, both upstream and downstream, ranging from ripeness prediction to nutritional characteristics, even information on how the produce was grown; the very aspects that make every individual piece of fruit unique.”  

Rogers added that Apeel has already started the process of upgrading its application systems to include hyperspectral camera capabilities. The company says it has 30 supplier integrations on three continents with plans to double that number by the end of 2021.

January 8, 2020

CES 2020: Stratuscent’s Digital Nose Can “Smell” When Crops are Ripe or Food is Burning

Something good-smelling must be in the air at CES this week, because digital noses are becoming a bit of a thing at this year’s tech expo. Yesterday I dropped by the booth of Stratuscent, a Montreal, Quebec-based startup which is digitizing scents to detect freshness.

The company’s sensors, called eNoses, detect chemicals in the air to create a scent print — like a fingerprint for a smell. According to CEO David Wu, who gave me a tour, Stratuscent’s “secret sauce” is its superior AI and machine learning, which can quickly and accurately determine any number of complex scents, even ones too tricky for humans to smell. The company’s tech came from NASA, where it was originally used for leak detection.
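Matching a new reading against a library of known scent prints can be sketched as a nearest-neighbor comparison. The reference prints and four-sensor vectors below are invented stand-ins; Stratuscent’s real chemistry and models are far richer:

```python
# Illustrative scent-print matching via cosine similarity.
# The reference vectors below are invented examples.
import math

KNOWN = {
    "ethylene (spoilage)": [0.9, 0.1, 0.3, 0.0],
    "fresh milk":          [0.1, 0.8, 0.2, 0.4],
    "sour milk":           [0.2, 0.3, 0.9, 0.5],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(print_vec):
    """Return (best_label, similarity score) for a raw sensor reading."""
    scored = {label: cosine(print_vec, ref) for label, ref in KNOWN.items()}
    best = max(scored, key=scored.get)
    return best, round(scored[best], 2)
```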

The eNose is pretty simple to use. Just wave the product in question under the eNose and, in under thirty seconds, it will determine what it is along with a confidence percentage. You can see Wu demonstrating the technology below:

CES 2020: Stratuscent's eNose is a Digital Smelling Machine
Can Stratuscent determine this mystery smell? (Spoiler: Yes, yes it can.)

Wu told me that Stratuscent’s noses have a variety of applications, including sniffing ethylene, a chemical that indicates spoilage, in crop shipments. They’re also working with a dairy company to detect milk freshness. In the home, Wu told me that the eNose could also be integrated into smart kitchen appliances to identify cooking stages (your sauce is about to burn!) and alert users to food spoilage.

Stratuscent was founded in 2017 and has raised $4.3 million thus far. Wu said that in addition to its partnership with a dairy company, Stratuscent is pushing further into the food and agriculture space, and is also in conversations to work with indoor agriculture farmers.

Stratuscent isn’t the only player digitizing smell technology (what a world). Yesterday Chris wrote about Aryballe’s new Digital Nose 2.0, which also debuted at CES this week and also digitizes scent to detect freshness, cooking smells, etc.

Regardless, the digital scent landscape is just beginning to emerge. As food safety outbreaks grow — and consumers become more conscious about reducing home food waste — I think there will be a growing market for this sort of technology. Which means there’s ample opportunity for more than one player to nose its way into the digital smelling space.

August 23, 2019

Ai Palette Raises $1.1M in Seed Funding for its Food Trend Prediction Platform

Ai Palette, a Singapore-based startup that uses machine learning and artificial intelligence to predict trends for food companies, announced it has raised $1.45 million Singapore dollars (~$1.1M USD) in seed funding. The blog e27 reported the news first, writing that the funding round was led by Decacorn Capital with participation from the Singapore government’s SGInnovate and AgFunder as well as existing investor Entrepreneur First.

According to Ai Palette’s website, the company uses “Artificial Intelligence and Machine Learning to draw insights from millions of data points to identify consumer needs in real time and combine it with the brand personality to create winning food product concepts.”

Using machine learning and AI to help food and CPG companies capitalize on emerging trends is an already crowded field. Companies like Spoonshot, Analytical Flavor Systems, Tastewise and Halla all use various implementations of machine learning and AI in this capacity.

The goal for all of these companies is to speed up the R&D process for food producers. The sooner a food trend can be spotted, the faster the process can be to get it to market. Additionally, AI allows companies to suck up even more data to provide more granular predictions based on things like regional preferences, or even come up with new flavor combinations that might not ordinarily be tried.

Ai Palette says it will use its new funding to scale up development and grow out its customer base across multiple markets.

June 27, 2019

Will Punchh’s New ML-Based Customer Lifetime Value Predictor Create More Data Darwinism?

Punchh, a software startup that creates digital marketing tools for physical retailers like restaurants, announced the release of its new machine learning-based “Predictive Customer Lifetime Value” (PCLV) application this week. But will this new technology just be another avenue for data darwinism?

CLV is a metric companies use to predict how much money they will reasonably get from any one customer. The concept certainly makes you wonder whether restaurants are feeding you meat, or if you’re the meat feeding the restaurant. Regardless, it’s something restaurants are using more. Fellow restaurant app-maker, Toast, did an explainer post on CLV a while back.

From the Punchh press release:

From the moment a customer makes their first purchase, Punchh instantly predicts their CLV, then constantly refines that prediction as the relationship between the brand and customer deepens. Based on that PCLV, retail marketers can create target segments with this data to, for example, encourage high CLV segments to enroll in rewards programs while offering low CLV segments incentives through coupons.
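The segmentation step the release describes boils down to bucketing customers by a predicted dollar value. A toy sketch, with invented thresholds and campaign labels (Punchh’s actual model and cutoffs are not public):

```python
# Toy CLV segmentation: thresholds and campaign names are invented.

def segment(predicted_clv: float) -> str:
    """Map a predicted lifetime value (in dollars) to a campaign bucket."""
    if predicted_clv >= 500:
        return "high: push rewards enrollment"
    if predicted_clv >= 100:
        return "mid: standard campaigns"
    return "low: winback coupons"

def build_segments(customers):
    """customers: dict of customer id -> predicted CLV in dollars."""
    buckets = {}
    for cid, clv in customers.items():
        buckets.setdefault(segment(clv), []).append(cid)
    return buckets
```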

While this PCLV may be useful to a restaurant for marketing purposes, it also feels like more data darwinism, like my purchases will determine a company’s level of interest in me. If a restaurant predicts that I’m a low-ticket customer for them from my very first purchase, will they just ignore me? Or will I get worse service? I asked Punchh about this and Xin Heng, Punchh’s Senior Director of Data sent me the following response:

Xin: They won’t be ignored, they will just be put in a different bucket (or segmentation). In other words, low CLV customers will be continuously monitored and treated with winback campaigns. But those who are outside this segment can be subject to games, compression campaigns, a referral callout and more. It’s just about segmentation, but every customer is consistently monitored regardless of what segment they fall into.

Great(?), a restaurant will still be sending me emails no matter how much–or little–I spend! The company is billing the PCLV tool as a restaurant’s virtual data scientist, but it seems like moneyballing me could be just another way that data ruins dining out with too many predictions about my behavior.

We’ll soon see, as Punchh works with more than 160 brands including Pizza Hut, Del Taco, Denny’s and TGI Friday’s. The company has raised $31 million in funding and earlier this month opened up an engineering hub in Toronto.

May 15, 2019

Halla Raises $1.4M Seed Round, Pivots to Focus on AI-Powered Grocery Recommendations

Halla, a startup that uses machine learning and AI to power food recommendations for grocery shoppers, announced today that it has raised a $1.4 million seed round led by E&A Venture Capital with participation from SOSV. This brings the total amount of money Halla has raised to $1.9 million.

Halla has a B2B platform dubbed Halla I/O that helps recommend relevant food products to shoppers. As we wrote at the time of Halla’s launch last year, the “company created an entirely new model and a new taxonomy that doesn’t just look at what a food item is, but also the molecules that make it up, a map of attributes linked to other food as well as how people interact with that food.”

So if you are using a grocer’s app with Halla I/O built in, the app will serve up intelligent recommendations as you continue to shop online. Buy salt, and it could recommend pepper. Buy salt, noodles, and beef, and it might guess that you are making a bolognese and recommend tomato sauce.
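The salt-and-pepper behavior can be approximated with plain basket co-occurrence counts. A minimal sketch (Halla’s actual model, which folds in molecular attributes and its food taxonomy, is far richer than this):

```python
# Illustrative basket co-occurrence recommender; example data is invented.
from collections import Counter
from itertools import combinations

def train(baskets):
    """Count how often each pair of items is bought together."""
    pairs = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            pairs[(a, b)] += 1
    return pairs

def recommend(pairs, cart, k=1):
    """Score items by co-occurrence with anything already in the cart."""
    scores = Counter()
    for (a, b), n in pairs.items():
        if a in cart and b not in cart:
            scores[b] += n
        elif b in cart and a not in cart:
            scores[a] += n
    return [item for item, _ in scores.most_common(k)]
```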

If you read our coverage of Halla last year, you’d notice something different about the company now. Initially, its go-to market strategy included both grocery stores and restaurants. But in the ensuing year, Halla has abandoned its pitch to restaurants, choosing instead to focus exclusively on grocery retail.

“What we’ve found is that the market timing was screaming ‘where tech meets grocery,'” Halla Co-Founder and CEO Spencer Price told me by phone recently, “The restaurant space is more crowded for building recommendations.”

But all that work in the restaurant space didn’t go to waste. “The truth is we were able to keep all of our learnings from restaurant and made our grocery recommendations stronger,” Price said. “One core learning is that restaurant dishes and menu items, as long as they have descriptions, are just recipes without instructions.”

Halla now has more than 100,000 grocery items and one hundred million unique grocery transactions from retailers across the country in its data set, informing its machine learning algorithms. Price is quick to point out that Halla does not have any personally identifiable information on people. “We can make recommendations to customer X without knowing who customer X is,” Price said.

Though a grocery chain can move a lot of product and provide a lot of data for better purchasing recommendations, grocery chains as a whole do not move quickly. To get them to adopt a new technology is like turning a battleship — they need a lot of time to execute. “They’re not looking for speed,” Price said, “but a reliable solution.”

To this end, the biggest thing Halla’s funding buys them is time. “We’ve bought some runway,” said Price. The company now has some breathing room to take its time and conduct even more tests with slow-moving retailers. Halla is in tests with unnamed grocers right now, and offers its recommendations on a pay-per-call API model.

AI-based B2B food recommendation is almost its own mini-industry. Spoonshot, Analytical Flavor Systems, and Tastewise all use vast data sets to make product predictions and recommendations to restaurants and CPG companies. Other companies like AWM Smart Shelf are using a combination of prediction and smart digital signage to make in-store grocery purchase recommendations.

With online grocery shopping reaching a tipping point, shoppers adding one or two more items to their carts because of intelligent recommendations could mean a nice sales boost for the grocery industry.

February 4, 2019

McCormick and IBM are Using AI to Develop Better-Tasting Spices

Today spice giant McCormick announced that it is partnering with IBM to create a research coalition to explore how artificial intelligence (AI) can improve flavor and food product development.

According to the press release, McCormick will use IBM Research AI for Product Composition to “explore flavor territories more quickly and efficiently” and “predict new flavor combinations from hundreds of millions of data points across the areas of sensory science, consumer preference, and flavor palettes.” In short: McCormick is applying IBM’s AI/machine learning power to their own taste data in an effort to develop better-tasting products more quickly — and with fewer duds.

The first product platform, called “ONE,” will debut mid-2019, and will include a set of one-dish Recipe Mixes, which I’m assuming are spice packets. The mixes are meant to season both a protein and a vegetable and come in flavors like Tuscan Chicken and New Orleans Sausage. McCormick is aiming to have them on grocery shelves by spring of this year.

To learn more about the ONE platform, we spoke with McCormick’s Chief Science Officer Hamed Faridi. “This technology uses multiple machine learning algorithms that are trained on information, including hundreds of millions of data points across the areas of sensory science, decades of past McCormick product formulas and information related to consumer taste preferences and palettes,” he told the Spoon. “What distinguishes the new system is its ability to learn and improve every time [it] is being used by our product and flavor developers.”

Unlike McCormick’s consumer-facing Flavorprint, which drew on recipe search histories to recommend new flavors and recipes, the ONE platform is purely internal. However, Faridi made it clear that the ONE platform would not replace consumer taste testing. “AI can’t taste flavors in the same way a human can,” he said. However, it will seriously up the speed of new product development. Faridi said that the AI system would let McCormick create new flavors up to three times faster, giving the company more agility so it could quickly develop products to take advantage of new dining trends.

Anytime the term “AI” — or the even trendier “machine learning” — is used by a Big Food company or fast food chain, it’s wise to take it with a grain of salt. As buzzwords, AI and machine learning can sometimes be more of a marketing gimmick than a value add.

That’s not to say that there aren’t several companies successfully leveraging AI to improve flavor. In fact, last year I wrote a piece predicting that services combining flavor and AI would be a new food tech trend. Analytical Flavor Systems has an AI-powered flavor prediction platform to help companies develop new food products with less trial and error. Plantjammer uses AI to help home cooks make better plant-based dishes. And Foodpairing applies AI to its flavor database to help professional chefs develop more innovative recipes. But these are smaller, tech-driven startups that have built their service based around AI from the beginning. For Big Food, AI is sometimes as much of a promotional tool as an actual service.

Since McCormick is working with IBM, its new platform seems more like a serious effort than, say, Domino’s. But is McCormick, as it states in the aforementioned press release, “ushering in a new era of flavor innovation and changing the course of the industry”? Probably not. But then again, I haven’t tried that New Orleans Sausage recipe mix.

December 6, 2018

Amazon Go is On a Massive Hiring Spree, and Not Just in the U.S.

Amazon Go, the retail store that uses cashierless technology so you can walk in, choose your items, and walk out without stopping to pay, has 338 open listings on its job site (big h/t to Sean Butler).

There are a few takeaways from this, but most notable is the sheer amount of investment in engineers on both the software and hardware side. There are a whopping 130 positions in software development, and 44 in hardware development.

But that’s just the start. While the majority of the listings are for Software Engineers, listings also include everything from Data Collection Technician to Creative Director to Security Engineer to Senior Vision Research Scientist. There are even 7 listings for real estate and construction positions. Whew!

They’re also searching for a Specification Technologist to join the Amazon Meal Kits team and help out with product development. Meal kits are already some of the most popular items at Go stores, so it’s not surprising that Amazon is looking to amp up its offerings, especially as they expand into new cities.

Many of the jobs are quite recent, and were either posted or updated within the past month. Which means that Amazon is poised to make some serious Amazon Go expansion moves in the new year, and willing to invest some serious man (and woman) power to do it. Good thing too, since the company is considering a plan to open 3,000 Go stores by 2021.

It’s also worth noting where the Go jobs are located. While the locations don’t necessarily indicate where Amazon will set up future Go stores, it’s a good data point to learn where they will base R&D and development of their cashierless technology.

In the U.S. there are openings in Seattle (duh), Westborough, MA, San Francisco, and New York City. Abroad, there are listings in two cities in Israel: Tel Aviv and Haifa.

Perhaps most eye-catching on the list is Westborough, MA. That’s the home of Amazon Robotics, a subsidiary which works on Amazon’s mobile robotic fulfillment systems. According to job descriptions, that’s also where Amazon is building an Advanced Projects Group, which will develop “new technologies that go well beyond the current state of the art.”

The location is certainly strategic from a hiring standpoint: Westborough is less than an hour outside of Boston, making it easy to recruit tech-savvy post-grads from MIT and Harvard. I’m speculating here, but the Westborough job listings, with their proximity to Amazon Robotics, could also indicate plans on Amazon’s part to add more robots to its Go store experience.

Outside of the U.S., Amazon Go is hiring in Israel. This could simply be a way for Amazon to take advantage of Israel’s flourishing AI landscape and hire some top-notch computer scientists. But it could also indicate that Amazon is ready to expand its Go stores internationally.

It wouldn’t be the first company to bring cashierless tech to Israel. Trigo Vision recently partnered with Israel’s largest supermarket chain Shufersal to implement its checkout-free tech in all locations across Israel. However, Trigo Vision and Amazon aren’t direct competitors: Trigo licenses out its tech to existing retailers, while Amazon builds its Go stores from the ground up.

Of course, even outside of Israel Amazon still has plenty of competition in the cashierless tech space. Microsoft has been working on its own version and has reportedly been in partnership talks with Walmart. In San Francisco, Aipoly is developing its own walk-in-walk-out store solution and Standard Cognition recently opened up a store in San Francisco to show off its technology.

Which is all the more reason that Amazon needs to grow fast if it wants to keep up its unique value proposition in the food retail space. The high number of job listings, and their wide geographic reach, show that when it comes to Go stores (and most things grocery, in fact), Amazon isn’t slowing down anytime soon. Now we just have to wait and see when they launch a cashierless Whole Foods.

Thanks to Sean Butler, who posted on his Linkedin about Amazon Go’s massive open jobs list. Do you have a tip for us at the Spoon? We’d love to hear it. 

November 6, 2018

Taranis Harvests $20M for Aerial Imaging Tech that Detects Crop Diseases

Today crop threat detection company Taranis announced that they closed a $20 million Series B funding round led by Viola Ventures, with participation from existing investors Finistere Ventures, Vertex Ventures, and others. This latest round brings the company’s total funding to $30 million.

Founded in 2014 and based in Tel Aviv, Taranis uses aerial imaging to help farmers monitor their acreage for crop threats and irregularities, such as disease, weeds, soil nutrition, and harmful insects.

To do this, the startup combines multi-spectral imagery gathered from satellites, planes, and drones to keep constant tabs on farmers’ fields. The image resolution is so high, according to Taranis co-founder and CEO Ofir Schlam, that they can see a single beetle on a single leaf.

All images (they’ve captured 2 million in the past year alone) are uploaded into the Taranis database, which then analyzes them and creates a synthesized report of any potential threats/problems they “see”. Farmers can access said report through Taranis’ mobile app or via a web browser and decide how best to remedy any issues.

A soy plant with a potassium deficiency.

Pricing varies depending on the type of crop: high-value crops like sugar beets or potatoes, which require more scans, would cost farmers around $15 to $20 per acre per season. Lower-lift crops like soybeans, wheat, or cotton would cost them only $5 per acre per season.

Considering the average U.S. farmer has 444 acres, the price of Taranis’ service can really add up (at $15 per acre, that’s roughly $6,660 a season). However, Schlam was quick to emphasize that the return is 3 to 5 times the price, and that their technology can increase crop yields by up to 7.5 percent. As regulations around drones relax and open up in rural areas, he also expects that they’ll be able to reduce their price.

While I’m wary of any company that claims to do anything so drastic as increase crop yields across the board by 7.5 percent, Taranis seems to have the technology and team to back it up. Schlam has a background in tech and intelligence, and another cofounder came from aerospace engineering. He told me that the company has been awarded three patents and has 24 others pending. Earlier this year, Taranis also acquired Mavrx, a San Francisco-based agricultural aerial imaging platform.

As of now, the company has around 60 contracted agronomists who manually train the system to identify problems. However, once the technology has learned that a certain discoloration equates to, say, a potassium deficiency, it can make the connection on its own from then on. Bad for any future agronomists, good for farmers.

Taranis isn’t alone in using technology to make farmers’ lives easier. On the ground, companies like CropX (also based in Tel Aviv) and Teralytic use wireless sensors to help farmers manage things like moisture and fertilizer in their fields. Most similarly, Walmart has filed a patent for an application which uses machine learning to monitor pests, though to our knowledge they haven’t done any pilots so far.

The startup currently works with 19,000 farms across 30 states in the U.S., with an additional footprint in Canada, Argentina, Brazil, Russia, Ukraine and Australia. With their new funding, Schlam said Taranis would continue to scale up operations in their existing geographies. They’ll also invest in R&D so that their service can identify more types of pests and diseases.

July 30, 2018

Dishq Raises $400,000 Pre-Seed From Techstars and Arts Alliance

Dishq, the Bengaluru, India-based startup that uses artificial intelligence (AI) to deliver personalized food recommendations, announced today that it has raised a $400,000 “pre-seed” round of financing from several investors including the Techstars Farm to Fork accelerator and Arts Alliance. This brings the total amount of money raised by the company to $560,000.

Dishq combines a vast database of dishes and their attributes, along with anonymized customer behavior analytics and food science research, into a machine learning algorithm. The company’s AI platform plugs into any digital food business, such as Uber Eats or an in-restaurant ordering kiosk, to provide recommendations for the end user.

According to Vasani, dishq has 11 customers across six countries, spanning more than one thousand restaurants and foodservice locations. The software is powering 30 million recommendations per month, with more than 176,000 consumers receiving recommendations each month. For comparison, when I spoke with the company in January, it had six customers and was powering just two million recommendations a month.

In addition to Techstars and Arts Alliance, existing investors Zeroth and Artesian Venture Partners also participated in this round, as did The Syndicate Fund and angel investor Sven Hensen. Dishq says it’s going to use the new money to build out its engineering team and expand sales and marketing activities. In addition to the Bengaluru office, dishq will be growing its presence in London and just opened an office in St. Paul, MN, which is where the Techstars accelerator is located.

Now is a good time for dishq to bolster its coffers, as the AI-powered recommendation space is getting hot. Just yesterday, Halla announced its I/O platform for B2B food recommendations, and others such as FoodPairing and Plantjammer are also applying machine learning to predict what people want to eat.

This money will also presumably help dishq get closer to its long-term goal of creating real-time “taste analytics as a service,” which would help CPG brands react to food trends as they happen.

Vasani spoke at our recent Smart Kitchen Summit: Europe, where he was on the panel Personalized, Shoppable and Guided: Recipes Are Not Dead. You can watch it here.


May 23, 2018

Microsoft Gets Visual Food Logging Patent

Microsoft appears to be applying its computer vision and AI smarts to make watching what you eat easier. The Redmond giant was awarded a patent yesterday for “Food Logging From Images.” That basically means you can take a picture of your food and Microsoft will provide you with its nutritional information (calories, protein, vitamins, etc.).

Yesterday’s patent indicates that it is a continuation of a previous Microsoft patent from May 2017 for “Restaurant-Specific Food Logging From Images.” Restaurants are called out specifically in this new patent because the Microsoft Food Logger would use GPS to know when you’re at a restaurant. From there, the Food Logger could use information from text menus online, via Yelp or a restaurant’s own site, to assess nutritional information.

The technology would supposedly also work outside restaurants, using image recognition to identify home-cooked meals as well. There are also tools that let the user edit or correct anything the Logger misidentifies. So if you slathered butter on a piece of bread, you could specify the amount.
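The flow the patent describes, match a recognized dish against menu nutrition data, then let the user correct the estimate, can be sketched in a few lines. All names and numbers below are invented for illustration, not taken from the patent:

```python
# Hypothetical nutrition data pulled from a restaurant's online menu.
menu_nutrition = {
    "caesar salad": {"calories": 470, "protein_g": 24},
    "margherita pizza": {"calories": 850, "protein_g": 35},
}

def log_food(recognized_label, corrections=None):
    """Look up nutrition for the dish the image recognizer identified,
    then apply any user edits (e.g. butter added to bread)."""
    entry = dict(menu_nutrition[recognized_label])
    if corrections:
        for key, delta in corrections.items():
            entry[key] = entry.get(key, 0) + delta
    return entry

print(log_food("caesar salad", corrections={"calories": 100}))
# prints {'calories': 570, 'protein_g': 24}
```

The hard part, of course, is the image recognition that produces `recognized_label` in the first place; the lookup-and-correct step is comparatively simple bookkeeping.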

The obvious use case for this patent is a mobile phone app, which is listed. But Microsoft goes even further, saying the technology would work with camera-equipped glasses. The way the company describes it, if you walked into a restaurant wearing these hypothetical Food Logger glasses, you’d get almost Terminator-like vision, seeing the nutritional content of the various meals people around you were eating.

The idea of taking a picture of food and automatically getting its nutritional content isn’t new. Apps like Lose It and Calorie Mama AI say they offer the same type of functionality. Samsung even recently added Calorie Mama’s technology into its Bixby virtual assistant.

Google, of course, has also been working on food recognition for a while. And this week it came to light that Google is reportedly adding human-powered food identification capabilities to Google Maps. Human-labeled pictures of food, taken from different angles and in different conditions, would help train Google’s image recognition algorithms.

Right now, this is just a patent for Microsoft, so who knows if or when it will make it to market. But that market is huge, and it’s unlikely Microsoft will sit on the sidelines.

© 2016–2025 The Spoon. All rights reserved.