
Amazon's Lens Live AI lets you shop products with visual search: What is it

Amazon's new Lens Live feature lets iOS shoppers scan items in real time, see instant matches in a carousel, and use Rufus AI for quick product insights and answers

Amazon Lens Live


Aashish Kumar Shrivastava | New Delhi


Amazon has added a live visual-search feature to its Shopping app that scans items in real time and shows matching products as you point your phone camera at them. The new Lens Live feature displays swipeable product matches while you pan the scene, lets you tap to focus on a single item, allows you to add matches to your cart or wish list without leaving the camera view, and surfaces short product summaries and suggested questions using Amazon’s Rufus AI assistant. Amazon says Lens Live is now available to “tens of millions” of US iOS users and will reach more customers in the coming weeks.
 

What Amazon's Lens Live does

As soon as eligible customers open Amazon Lens, the feature begins detecting items and presents top matches in a carousel at the bottom of the screen. Users can tap any item in the live view to refine the match, add it to the cart with a plus icon, or save it to a wish list with a heart icon. Under the carousel, Amazon shows quick summaries of notable product details and suggested conversational prompts powered by Rufus, so shoppers can ask follow-up questions or get short explanations without switching screens. Traditional Lens actions — taking a photo, uploading an image, or scanning a barcode — remain available. 

How it works

Amazon says Lens Live combines on-device vision and cloud models. A small object-detection model runs on the phone to identify items in real time as you move the camera, which reduces the need for manual interaction. The system then uses a visual embedding model to match the camera view against Amazon’s catalogue of billions of listings and returns exact or similar results.
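Amazon has not published its implementation, but the flow described above can be illustrated with a minimal sketch: an on-device detector hands over a cropped frame region, a visual embedding model turns the crop into a vector, and a nearest-neighbour search over pre-computed catalogue embeddings returns the matches that populate the carousel. All function names, dimensions, and data below are hypothetical placeholders.

```python
# Minimal sketch of the matching flow described above. All names and
# dimensions are hypothetical; Amazon has not published its implementation.
import numpy as np

def embed(image_crop: np.ndarray) -> np.ndarray:
    """Stand-in for a visual embedding model that maps an image crop to a
    fixed-length, unit-norm vector. A real system would run a trained
    vision model here."""
    vec = image_crop.astype(np.float32).ravel()[:128]
    vec = np.pad(vec, (0, max(0, 128 - vec.size)))
    return vec / (np.linalg.norm(vec) + 1e-9)

def top_matches(query_vec: np.ndarray,
                catalogue_vecs: np.ndarray,
                catalogue_ids: list[str],
                k: int = 5) -> list[tuple[str, float]]:
    """Cosine-similarity nearest-neighbour search over pre-computed
    catalogue embeddings (one row per listing)."""
    norms = np.linalg.norm(catalogue_vecs, axis=1) + 1e-9
    scores = catalogue_vecs @ query_vec / norms
    best = np.argsort(scores)[::-1][:k]
    return [(catalogue_ids[i], float(scores[i])) for i in best]

# Usage: the on-device detector supplies a cropped frame, and the top-k
# catalogue items would populate the results carousel.
frame_crop = np.random.rand(64, 64, 3)         # placeholder camera crop
catalogue = np.random.rand(1_000, 128)         # placeholder catalogue embeddings
ids = [f"ASIN-{i:05d}" for i in range(1_000)]  # hypothetical listing IDs
print(top_matches(embed(frame_crop), catalogue, ids))
```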
 
The feature also calls Rufus, Amazon’s generative AI shopping assistant built on a large language model (LLM), to generate the summaries and suggested questions you see under the results. Behind the scenes, Amazon uses managed services such as Amazon OpenSearch Service and Amazon SageMaker to host and scale the machine-learning models.
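Since the article only states that OpenSearch helps host the matching at catalogue scale, the following is just an assumed illustration of how a vector lookup could be expressed as an approximate k-NN query with the opensearch-py client. The index name, vector field name, and endpoint are hypothetical.

```python
# Illustrative approximate k-NN lookup using the OpenSearch k-NN query DSL.
# The index name, field name, and endpoint are hypothetical assumptions.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

def search_similar(query_embedding: list[float], k: int = 5) -> list[dict]:
    """Return the k listings whose stored embeddings are closest to the
    query embedding, using an OpenSearch k-NN query."""
    body = {
        "size": k,
        "query": {
            "knn": {
                "item_embedding": {        # hypothetical vector field
                    "vector": query_embedding,
                    "k": k,
                }
            }
        },
    }
    response = client.search(index="catalogue-items", body=body)
    return [hit["_source"] for hit in response["hits"]["hits"]]
```

In a production setup the embedding and language models would run on managed inference endpoints rather than locally, which is consistent with the article's mention of SageMaker for hosting and scaling.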
 
Similar visual shopping tools already exist: Google Lens offers image-based product search and shopping suggestions, and Pinterest provides visual search and “Shop the Look” tools that let users find items from images. Amazon’s Lens Live differs by combining continuous, real-time camera scanning with direct shopping actions and an integrated conversational assistant.

Availability

Amazon says Lens Live is rolling out first to US customers in the iOS Shopping app, with availability expanding in the weeks ahead. A timeline for an India rollout has not yet been announced.


First Published: Sep 03 2025 | 2:13 PM IST
