Tailoring the AR Experience

 

With retail needing to be less reliant on touch, creating immersive ways to test products has become table stakes.

Memomi sits at the forefront of augmented reality, transforming the in-store and online experience by enabling customers to try products virtually and receive personalized recommendations based on their profile, style, and preferences. From skincare and make-up to eyewear and scarves, here's how the company is thinking about the possibilities of testing and sampling.

About the Guest

Salvador Nissi is the Founder and CEO of Memomi, the creator and patent holder of the Memory Mirror, an award-winning digital mirror platform that is revolutionizing the way people shop today. He is a serial entrepreneur and interaction design expert. Prior to founding Memomi, Salvador founded and exited three successful network systems integration companies. A recognized leader in experience-based design, Salvador has 15 years of international, executive-level management experience specializing in rapid product development, holds 74 patents, and can always be found at the crossroads of innovation.

With Memomi, customers can virtually “try on” products such as clothing, eyewear, make-up, hair color, footwear, and accessories in real time, on any device and operating system, through artificial intelligence, deep learning, and augmented reality, without any of the inconveniences of the actual try-on experience. There is also the added benefit of capturing try-on sessions that can be reviewed and shared later on. Among Memomi’s clients are world-renowned companies such as LVMH Group, L’Oreal Group, Estee Lauder Group, Neiman Marcus, Luxottica, and many more.

What Melissa Asked

  1. Tell us how Memomi works.
  2. Why do you think AR has proven to be intuitive for the beauty industry?
  3. How has COVID accelerated the adoption and need for technology like Memomi?
  4. How does your data capture help advance personalization?
  5. What is your vision for taking AR and retail to the next level as we see consumer behaviors quickly evolving?
  6. Who are some of your customers and what has been the formula for successful proof cases?

Listen To Previous Episodes of Retail Refined Right Here!

 

