Retail and E-commerce
The AI deployments most consumers interact with every day without realising. Recommendations, pricing, inventory, computer vision, marketing — all of it has machine learning underneath.
Walk through your local Coles or Woolworths and almost every decision you can see has been shaped by AI. The brand of beans on the end-of-aisle promotion was selected by a model. The price of the steak was set by a model. The timing of the discount on the bread that goes off tomorrow was set by a model. The route the truck took to deliver the milk this morning was optimised by a model. The recommendations on the app on your phone are produced by yet another model. The supermarket's loyalty program is, more than anything else, a data-collection system to feed all of these models.
None of this is hidden, exactly. None of it is much advertised either. Retailers — like banks — have been doing this for long enough that they take it for granted, and they would prefer you do too.
Recommendation engines — the original consumer AI
The single most-deployed AI system in consumer life is the recommendation engine. Amazon's "customers who bought this also bought" launched in 1998. Netflix launched the famous Netflix Prize in 2006, offering US$1 million to anyone who could improve its recommendation algorithm by 10 per cent. Spotify's Discover Weekly, launched in 2015, was the first time a recommendation system felt eerily personal to a mass audience. By 2026 every major e-commerce site, streaming platform, social network and increasingly every supermarket app runs a recommendation system trained on your behaviour and the behaviour of millions of similar users.
The architectures have evolved. Early collaborative filtering ("people who liked X also liked Y") gave way to matrix factorisation, then to deep neural networks, then to transformer-based sequence models that treat your purchase or viewing history as a sentence and try to predict the next item the way an LLM predicts the next word. The metric they optimise has also evolved. Early systems optimised click-through; current systems optimise long-term engagement, repeat purchase, or some company-specific blend.
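The earliest of those stages is easy to sketch. Below is a toy item-item collaborative filter using cosine similarity over a made-up purchase matrix — all names and numbers are illustrative, not any retailer's real data or schema:

```python
from math import sqrt

# Toy purchase matrix: rows are users, columns are items (1 = bought).
items = ["socks", "boots", "laces", "umbrella"]
purchases = {
    "ana":   [1, 1, 1, 0],
    "ben":   [0, 1, 1, 0],
    "carol": [1, 0, 0, 1],
    "dev":   [1, 1, 0, 0],
}

def item_vector(col):
    """Column of the user-item matrix: which users bought this item."""
    return [row[col] for row in purchases.values()]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def similar_items(item, k=2):
    """'People who bought X also bought Y', as item-item cosine similarity."""
    i = items.index(item)
    scores = [
        (items[j], cosine(item_vector(i), item_vector(j)))
        for j in range(len(items)) if j != i
    ]
    return sorted(scores, key=lambda s: -s[1])[:k]

print(similar_items("boots"))
```

Matrix factorisation and the transformer-based sequence models replace this explicit similarity computation with learned embeddings, but the output — a ranked list of "people who bought X also bought Y" — is the same shape.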
The trade-off most readers will recognise is the filter bubble. A system that maximises your engagement is a system that surfaces what you have engaged with before. That is fine for buying socks. It is more concerning for news feeds, where the same dynamic is now well-documented to push users toward more extreme content. We will return to this on the When AI Goes Wrong page.
Demand forecasting and supply chain
The hardest invisible problem retailers solve is matching supply to demand. Order too much milk and it goes off. Order too little and you lose the sale and annoy the customer. Multiply that across 30,000 product lines, 1,000 stores, 365 days a year, with weather, public holidays, sporting events and viral TikTok trends all distorting demand, and you have a problem that has consumed enormous machine-learning effort for two decades.
Walmart, Amazon, Tesco and the major Australian supermarkets all run multi-level forecasting systems. The best of them combine traditional time-series models (ARIMA, Prophet) with modern deep-learning forecasters (Amazon's DeepAR, Google's TFT, increasingly transformer-based) and feature engineering for known events: school holidays, public holidays, sporting finals, Cup days, weather forecasts, competitor promotions. Coles publicly stated in 2023 that its AI forecasting had reduced its grocery waste rate; Woolworths' system has been described in similar terms at industry conferences. These claims are hard to independently verify, but the direction of travel is clear.
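The feature-engineering half of that can be sketched without any deep learning at all: a seasonal-naive baseline (sell what you sold the same weekday last week) adjusted by uplift multipliers for known events. The sales figures and multipliers below are invented for illustration; systems like DeepAR and TFT learn such effects from data rather than hard-coding them:

```python
# Last 7 days of unit sales for one product at one store (toy data).
DAILY_SALES = [120, 135, 110, 140, 150, 210, 190]

# Hypothetical event-uplift multipliers; real systems learn these.
EVENT_UPLIFT = {
    "public_holiday": 1.6,
    "heatwave": 1.25,
    "none": 1.0,
}

def forecast(day_of_week, event="none"):
    """Seasonal-naive baseline (same weekday last week) times event uplift."""
    baseline = DAILY_SALES[day_of_week]
    return round(baseline * EVENT_UPLIFT[event])

print(forecast(5))                    # an ordinary Saturday: 210 units
print(forecast(5, "public_holiday"))  # a long-weekend Saturday: 336 units
```

The whole difficulty of the real problem lives in what this sketch leaves out: estimating those multipliers from noisy data, for every product, at every store, with events interacting.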
Logistics builds on this. Once you know what is going to sell where, the question becomes how to get it there. Route optimisation for delivery fleets is a classical operations-research problem with a long history; the AI overlay in the last decade has been to incorporate live traffic, driver behaviour, weather, predicted demand at each stop, and dock capacity at the receiving end. Australia Post, Toll, Linfox and the supermarket-owned fleets all run versions of this.
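The classical core of the routing problem can be illustrated with the simplest heuristic there is: greedily visiting the nearest unvisited stop. Production solvers handle whole fleets, time windows, dock capacity and live traffic; the stop names and coordinates here are invented:

```python
# Toy delivery stops as (x, y) coordinates on a flat grid.
STOPS = {"depot": (0, 0), "storeA": (2, 1), "storeB": (5, 0), "storeC": (1, 5)}

def dist(a, b):
    """Straight-line distance between two named stops."""
    (ax, ay), (bx, by) = STOPS[a], STOPS[b]
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def greedy_route(start="depot"):
    """Nearest-neighbour heuristic: always drive to the closest unvisited stop."""
    route, remaining = [start], set(STOPS) - {start}
    while remaining:
        nxt = min(remaining, key=lambda s: dist(route[-1], s))
        route.append(nxt)
        remaining.remove(nxt)
    return route

print(greedy_route())
```

Nearest-neighbour is known to produce noticeably suboptimal routes; the point of the sketch is only the shape of the problem, which real operations-research solvers attack with far stronger methods.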
Dynamic pricing — the part that gets political
Some retailers change prices constantly based on demand, inventory, competitor pricing, time of day, and individual customer signals. This is the AI deployment that consumers most actively dislike when they notice it.
The clearest examples are in travel and hospitality. Airline ticket prices and hotel room rates have been algorithmically priced for decades; almost no human is involved in setting any specific price you see. Uber's surge pricing is a real-time algorithmic response to the ratio of riders to drivers in a given area. Amazon changes prices on millions of products multiple times a day.
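Surge pricing is conceptually simple even though the deployed models are proprietary. A toy version keyed to the rider-to-driver ratio, with an invented floor and cap (Uber's actual formula is not public):

```python
def surge_multiplier(riders, drivers, cap=3.0):
    """Toy surge rule: multiplier tracks the rider/driver ratio,
    floored at 1.0 (no discount) and capped (invented policy limit)."""
    if drivers == 0:
        return cap
    ratio = riders / drivers
    return round(min(cap, max(1.0, ratio)), 2)

print(surge_multiplier(30, 40))   # spare capacity: no surge
print(surge_multiplier(90, 40))   # demand spike: 2.25x
print(surge_multiplier(500, 40))  # stadium emptying out: capped
```

The real systems forecast the ratio ahead of time and per location, rather than reacting to it, which is where the machine learning comes in.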
The supermarket version is more contained but still real. Electronic shelf labels, now widely deployed in European retail and increasingly in Australia, allow stores to change prices through the day. The current public claim is that this is used mostly for clearance — marking down items approaching their use-by date — but the technology obviously enables more aggressive uses.
Personalised pricing — different prices for different customers based on what the algorithm thinks they will pay — is the version that crosses most consumers' moral threshold. There is evidence it happens in some online retail, particularly through "personalised discounts" that are economically equivalent to differential pricing. Australian Consumer Law, the EU's Digital Services Act, and the UK's Digital Markets, Competition and Consumers Act all touch on this, and none of them yet fully prohibits it.
Computer vision in physical stores
Walk through an Amazon Fresh store in the US or UK and the entire experience is built on computer vision. Cameras on the ceiling track every shopper. Sensors on the shelves track every product picked up or put back. The system attributes each pick to a shopper, totals their basket, and charges them when they walk out. There is no checkout, because there is no need for one.
That technology has not been fully rolled out elsewhere — Amazon retreated from "Just Walk Out" stores in 2024 amid reports that human reviewers in India were doing much of the actual classification — but the underlying components are now standard retail technology. Loss prevention cameras with computer vision flag suspected shoplifting in real time. Stock cameras on shelves report empty spaces to staff systems. Customer-flow analytics tell stores where to put end caps.
The more contentious example is facial recognition. Bunnings and Kmart in Australia were caught in 2022 deploying face recognition in stores without clear customer notice. The Office of the Australian Information Commissioner ruled this a breach of the Privacy Act in 2024. The case is one of the clearer signals that public tolerance for in-store surveillance has limits, even where the stated purpose (loss prevention) is legitimate.
Personalisation and marketing
The data that supermarket loyalty programs collect feeds personalisation systems. The Coles Flybuys card and Woolworths Everyday Rewards card know what you buy, when, in what combinations, at what stores, and how those patterns shift over time. The personalised offers you receive in the app, the catalogue items emphasised when you log in, the targeted ads served to you on Facebook and Instagram — all of these are downstream of the same underlying customer model.
This is not unique to supermarkets. It is the same machinery that runs across Amazon, the major Australian banks (open banking has accelerated this), every airline frequent flyer program, every streaming service, and most major retailers. The technical name is "customer lifetime value modelling" or "next-best-action". The colloquial description is that the company knows what you want before you do, and is gently pushing you toward it.
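Customer lifetime value has a compact textbook form: expected margin per period, discounted both for the time value of money and for the probability the customer is still around. A sketch with invented numbers — production models estimate margin and retention per customer from their history:

```python
def clv(margin_per_period, retention, discount_rate, periods=36):
    """Textbook CLV: sum of margin * retention^t / (1 + discount_rate)^t."""
    return sum(
        margin_per_period * (retention ** t) / ((1 + discount_rate) ** t)
        for t in range(periods)
    )

# Hypothetical monthly figures for two shoppers of equal current spend.
loyal = clv(margin_per_period=25.0, retention=0.95, discount_rate=0.01)
churny = clv(margin_per_period=25.0, retention=0.60, discount_rate=0.01)
print(round(loyal), round(churny))
```

The comparison is the point: a shopper the model expects to stay is worth several times one it expects to churn, and that difference is what decides who receives the generous personalised offer.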
Where it has gone wrong
Target's pregnancy prediction remains the canonical case. In 2012 the US retailer Target's analytics team famously built a model that could identify pregnant customers from their purchase patterns (a switch to unscented lotion, certain vitamin supplements, oversized bags of cotton balls). The model worked well enough that the company began sending targeted maternity advertising. A father in Minneapolis complained to a store manager about the maternity ads being sent to his teenage daughter — only to learn weeks later that she was, in fact, pregnant. The story made the point clearly: companies know things about you that you have not told anyone, and they will act on that knowledge.
Algorithmic price discrimination by location has been documented multiple times. Staples charged customers different prices on its online store based on inferred zip-code wealth. Orbitz served Mac users more expensive hotel options than Windows users. The pattern recurs.
Recommender-driven harms get treated more carefully on the When AI Goes Wrong page, but the principle applies across consumer technology: a system that maximises engagement does not necessarily maximise wellbeing. The systems are not malicious; they are working as designed. The question is whether what they were designed to do is what we want them to do.
The honest summary
Retail AI is the most thoroughly normalised AI deployment in everyday life. It is mostly working as advertised. It makes supermarkets more efficient, e-commerce more useful, and routine purchasing decisions easier. The cost is that the corporations you transact with know you in considerable detail, and the regulatory framework for governing what they can do with that knowledge has lagged a long way behind what the technology enables. Most readers' lived experience of "AI at scale" is in fact this layer, not the chatbots.