In a press release, Meta announced a developer preview for a toolkit that will let mobile apps access sensors on its Ray-Ban Meta smart glasses. The Meta Wearables Device Access Toolkit, slated for a preview release later this year with broader publishing expected in 2026, will give developers access to on-device camera, microphone, and open-ear audio data so they can build hands-free, point-of-view experiences that extend mobile apps into physical, wearable use cases.
Meta said the preview will include an SDK, documentation, testing tools, and a beta distribution path through a Wearables Developer Center. The company also clarified that access to Meta AI features, including voice commands, will not be available in the initial preview, though it is exploring the capability for future updates.
Meta presented the toolkit as a way to let mobile apps make use of hands-free sensor inputs from smart glasses.
What does the toolkit expose?
Meta says the first version of the toolkit will surface a “suite of on-device sensors,” specifically calling out the wearer’s camera, open-ear audio, and microphone. The company framed the functionality as enabling three types of capabilities: POV camera experiences, hands-free information retrieval and communication, and extensions of existing mobile apps into the physical world. Meta also stated the toolkit will include pre-built libraries and sample apps to accelerate development.
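The SDK itself has not shipped, so no concrete interfaces are public yet. Purely as an illustration of the kind of integration Meta describes, the Kotlin sketch below invents stand-in types (PovFrame, GlassesSession, PovCaptureController) to show how a mobile app might consume a POV camera stream and play a confirmation cue over the glasses' open-ear audio; none of these names or signatures come from Meta's toolkit.

```kotlin
// Hypothetical sketch only: the Wearables Device Access Toolkit is not yet
// published, so every type and function below is an illustrative stand-in,
// not Meta's actual API. It shows the general shape of a mobile app consuming
// a point-of-view camera stream and open-ear audio from paired glasses.

// Stand-in for a frame delivered from the glasses' on-device camera.
data class PovFrame(val timestampMs: Long, val jpegBytes: ByteArray)

// Stand-in for a connected-glasses session a toolkit might hand to the app.
interface GlassesSession {
    fun startPovCamera(onFrame: (PovFrame) -> Unit)
    fun stopPovCamera()
    fun playOpenEarAudio(pcm: ByteArray, sampleRateHz: Int)
}

// Example app-side logic: forward each POV frame to the app's own pipeline
// (streaming, capture, or on-phone inference) and confirm via an audio cue.
class PovCaptureController(private val session: GlassesSession) {
    private var frameCount = 0

    fun begin() {
        session.startPovCamera { frame ->
            frameCount++
            // App-specific handling would go here; just log frame metadata.
            println("Frame #$frameCount at ${frame.timestampMs} ms, ${frame.jpegBytes.size} bytes")
        }
    }

    fun end(confirmationTonePcm: ByteArray) {
        session.stopPovCamera()
        // Open-ear audio could serve as a hands-free confirmation cue.
        session.playOpenEarAudio(confirmationTonePcm, sampleRateHz = 16_000)
    }
}
```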
Developer resources and testing
Meta said it will supply documentation, API references, and dedicated testing environments during the preview to help developers integrate with the glasses. The release emphasised controlled testing and limited distribution in the preview phase, rather than an immediate, wide rollout.
What it means for users
For end users, the developer preview does not bring immediate new features, but it lays the groundwork for future applications on Meta-branded smart glasses. As developers begin experimenting with on-device camera and audio access, users could eventually see mobile apps extend into hands-free, wearable experiences such as real-time information overlays or POV content sharing. However, since publishing remains limited until at least 2026, most of these features will not reach general audiences in the near term.