Wearable computing devices such as head-mounted displays and smart glasses have advanced considerably in recent years, yet they have not achieved mainstream adoption. Devices like the Meta Quest, PlayStation VR headsets, and Ray-Ban Meta glasses are owned by millions, but none has reached the ubiquity of smartphones or smartwatches. Snap intends to change that with its upcoming Specs, scheduled to launch in 2026.
Snap pitches Specs as an “exceptionally powerful wearable computer” built into a lightweight pair of glasses with see-through lenses. The glasses use machine learning to understand the world around the wearer, deliver real-time AI assistance in 3D, and support shared experiences among users wearing Specs. According to Snap, the “small 2D rectangle” of a smartphone cannot fully exploit AI’s capabilities; the company sees today’s AI chatbots as a stepping stone toward immersive, AI-integrated 3D experiences.
Although Snap has not yet shown the lightweight AR glasses themselves, the company is preparing for a 2026 release. In the meantime, it is steadily building out Snap OS, the software platform that powers Spectacles. Recent updates include integrations with OpenAI and Google’s Gemini for building multimodal Lenses, a depth module API for translating 2D information into 3D space, real-time transcription in more than 40 languages, and tools for generating 3D objects on the fly.
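Snap has not published these interfaces in detail here, so as a rough illustration, the TypeScript sketch below models how a depth API and live transcription could combine to anchor spoken text in 3D space (Lenses are scripted in TypeScript/JavaScript). Every name in it (DepthModule, unproject, TranscriptionStream, placeLabel) is a hypothetical placeholder, not Snap’s actual API.

```typescript
// Hypothetical sketch only: these interfaces are assumptions,
// not the published Snap OS / Lens Studio API.

interface Vec3 { x: number; y: number; z: number; }

// Assumed depth interface: maps a normalized 2D screen point to a
// 3D world position using the device's depth estimate.
interface DepthModule {
  unproject(screenX: number, screenY: number): Vec3 | null;
}

// Assumed streaming-transcription interface.
interface TranscriptionStream {
  onResult(callback: (text: string, language: string) => void): void;
}

// Pin each transcription snippet at the point the wearer is looking
// at, turning flat 2D text into content anchored in 3D space.
function pinTranscriptsInWorld(
  depth: DepthModule,
  transcription: TranscriptionStream,
  placeLabel: (text: string, position: Vec3) => void,
): void {
  transcription.onResult((text) => {
    // Sample depth at the center of the display (the gaze point).
    const anchor = depth.unproject(0.5, 0.5);
    if (anchor !== null) {
      placeLabel(text, anchor);
    }
  });
}
```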
Snap OS also gives developers new tools for building location-based experiences: a Fleet Management app for monitoring multiple Specs devices at once, Guided Mode for launching a device directly into a specific Lens, and Guided Navigation for building AR-guided tours. These additions aim to broaden what Snap’s wearables can do, and they could help pave the way for wider adoption.