- Pinterest Adds AR Makeup Try-On Feature to Its Mobile Lens Tool
Last week, social media giant Pinterest unveiled "Try On," a virtual makeup visualization feature built on its Lens visual search tool.
The feature is rolling out now to US users of the Pinterest apps for iOS and Android.
Try On is accessible through the camera tool in the search section of the app, where a "try on" button switches to the front-facing camera and displays a carousel of lipstick shades. Users can also access the tool via the "try it" button embedded in lipstick-related content on boards.
After arriving at a preferred look, users can swipe up to shop for the selection or capture a photo to pin to a board or share through other means. Participating brands include bareMinerals, Estée Lauder, Neutrogena, Sephora, and select L'Oréal brands, namely Lancôme, NYX Professional Makeup, Urban Decay, and YSL Beauté.
- Mozilla launched a new WebXR app called Hello WebXR, which is compatible with most VR headsets via their web browsers, such as the Oculus Browser or Google Chrome on PC VR headsets. The app acts as an introductory experience for those who are new to VR, showcasing the different types of content and interactions available on the platform.
The experience will work on any WebXR-compatible browser on a headset, including Mozilla's own VR browser, Firefox Reality. Other browsers, such as the Oculus Browser on Oculus Quest or Google Chrome on Oculus Rift, also support WebXR and should work with the Hello WebXR site.
Mozilla stated in a blog post that it expects the experience to grow over time, developing into "a sandbox that we could use to prototype new experiences and interactions." To try out Hello WebXR for yourself, just head to this page on your WebXR-compatible headset.
- Microsoft’s Project Tokyo helps visually impaired users ‘see’ with AI and AR
An LED strip affixed above the HoloLens' band of cameras tracks the person closest to the user and turns green once that person has been identified, letting communication partners or bystanders know they've been seen or cueing them to move out of the device's field of view. One computer vision model detects the pose of people in the environment, providing a sense of where and how far away they are. Another analyzes footage from the headset's camera to recognize people and determine whether they've opted to make their names known to the system.
Microsoft says it’s using a scaled-down version of the tech to help blind and low-vision children develop social interaction skills.
Project Tokyo, which is still ongoing, follows on the heels of efforts like Microsoft's Seeing AI, a mobile app designed to help blind and low-vision users navigate the world around them. More recently, the tech giant debuted Soundscape, a navigation app that uses binaural audio to help visually impaired users build mental maps and make personal route choices in unfamiliar spaces.