
New Ray-Ban Stories Will Reportedly Support Livestreaming

The second generation Ray-Ban Stories will reportedly have better cameras and support first-person livestreaming.

Meta and Ray-Ban owner Luxottica officially announced work on new smart glasses in October last year.

Journalist Janko Roettgers says he viewed internal Meta documents detailing the improvements and new features of the upcoming device.

The current Ray-Ban Stories are camera glasses for taking hands-free first-person photos and videos. They also have speakers and a microphone for music and phone calls, but there is no display of any sort. Snapchat has been selling successive generations of a similar product, Spectacles, since 2017.

Dataminer NyaVR found an apparent image of the new case for the glasses in the companion app.

Roettgers reports the second generation Ray-Bans won’t have a display either. But they will apparently have higher quality cameras, longer battery life, and an anti-tamper mechanism that disables capturing images or videos when the front LED is covered.

The glasses will also support livestreaming to Instagram and Facebook, with viewer comments read out by an assistant via the built-in speakers, according to the report.

Third Gen Smart Ray-Bans Reportedly Getting A HUD And Neural Wristband

The third generation Ray-Ban smart glasses, set for 2025, will reportedly get a HUD and a neural wristband, according to a Meta roadmap leaked to The Verge.

In March, The Verge reported that Meta’s VP of AR Alex Himel told staff the company planned to release third generation glasses in 2025 with a display and a neural input wristband.

Called the “viewfinder”, this heads-up display will reportedly be used to show notifications, scan QR codes, and translate real-world text in real time. To be clear: this wouldn’t be true AR, but rather a small floating contextual display.

The neural wristband is based on the tech from CTRL-Labs, a startup Facebook acquired in 2019, and Meta has openly discussed its development. It uses EMG (electromyography) to read the neural signals passing through your arm from your brain to your fingers. Such a device could sense even incredibly subtle finger movements not clearly perceptible to people nearby. Himel reportedly said it will let the wearer “control the glasses through hand movements, such as swiping fingers on an imaginary D-pad”.

Ray-Ban Stories Reportedly Only ‘Used Actively’ By 10% Of Owners


Earlier this month, The Wall Street Journal reported that less than 10% of the current first generation Ray-Ban Stories are being “used actively”.

Meta doesn’t seem to be giving up, though; the company likely hopes the new features and improvements will draw in a wider audience and retain active users.
