
The Complete Guide to Apple Glasses: AR Innovation or Tech Hype?

Augmented reality (AR) headsets are positioned as the next major computing platform, with the potential to change how we communicate, work and access information. All eyes are on Apple to see if its rumored "Apple Glasses" can popularize this nascent technology like the iPod did for music and the iPhone for smartphones. Here's everything we know so far about Apple's exploration into AR glasses and what we can expect.

What Are Apple Glasses?

Apple Glasses are Apple's upcoming augmented reality smart glasses, rumored to launch as early as 2023. They will connect wirelessly to the iPhone to display contextual information, notifications and visuals overlaid onto the real world.

Unlike virtual reality headsets that fully immerse you in a simulated environment, Apple Glasses use AR to enhance your natural surroundings. You can still see and interact with your actual physical environment.

Apple Glasses are not a standalone computing device. They will rely on the iPhone for processing power and internet connectivity. The glasses themselves will provide stereoscopic 3D visuals through transparent lenses along with speakers for audio.

Multiple reports indicate that Apple Glasses will sport sleek, curved frames with thick arms housing batteries and chips. The interface will likely leverage gestures, voice and a touch panel.

In essence, Apple Glasses aim to remove the need to constantly glance at your phone screen for information. Everything is immediately accessible just by wearing the glasses.

How Will Apple Glasses Work?

As an iPhone accessory, Apple Glasses will deeply integrate with iOS and Apple apps via wireless connectivity options like Bluetooth and Wi-Fi. The exact technical capabilities remain speculative and shrouded in secrecy.

But based on the breadth of Apple's AR patents, we can expect Apple Glasses to:

  • Display system alerts, notifications and app data in your field of view
  • Enable augmented video calls and communication
  • Layer contextual maps/directions onto real-time surroundings
  • Identify objects, faces and places in view
  • Translate text instantly in foreign languages
  • Monitor health stats like heart rate
  • Create immersive 3D virtual objects and environments
  • Pair with various Apple devices, not just iPhone

Apple's acquisition of AR lens maker Akonia Holographics indicates it has overcome major technical challenges with transparent displays. Reports suggest Apple has developed specialized 3D sensors for spatial mapping and advanced computer vision abilities.

Custom Silicon Brains

Apple designs custom silicon chips like their A-series processors to optimize performance in products like iPhones, iPads and Macs. This allows greater hardware and software integration critical for power-efficient operation.

Leaker Jon Prosser cites a custom Apple chip nicknamed Garta as the central processor likely powering Apple Glasses. Specs are unknown, but it likely packs an Apple-designed GPU, CPU and neural engine for handling graphics, AI computation and sensor fusion.

Extra power translates to better experiences, whether that's lag-free metaverse spaces or translating languages on the fly with instant visual context. Apple Glasses need to instantly render visually impressive 3D constructs stabilized in real-world coordinates without glitches or jitter. That demands serious number-crunching efficiency, only possible with custom Apple silicon suited for on-device AR rather than relying on cloud connections.

Powerful integrated chipsets will handle intensive on-device processing to minimize latency and network dependence. Intuitive inputs via eye-tracking, voice commands and motion sensors round out the package, though the exact control mechanisms are still unofficial.

Expected Apple Glasses Features and Specs

Here are some of the leaked specs and capabilities reported so far for what Apple may unveil with its debut AR glasses:

Displays

  • Dual micro-LED or OLED displays
  • Resolution up to 1280×960 pixels per eye
  • Enable stereoscopic 3D visuals

Micro-LED is widely regarded as the superior display technology for AR glasses, but mass-producing miniaturized panels still proves challenging. Apple acquired this expertise by licensing numerous patents from Liquavista, LuxVue and many display researchers.

OLED's high contrast and pixel brightness help virtual elements blend atop naturally lit real environments. Micro-LEDs perform even better, with purer blacks, lower power drain, faster refresh rates and greater durability.

If supply challenges persist, though, mass-producing millions of Apple Glasses units may dictate repurposing existing iPhone OLED manufacturing pipelines. Eventually micro-LED production will catch up, permitting improved sequels.

Camera

  • At least 5MP Outward-Facing Camera
  • First-Person Video
  • Environment Analysis
  • Spatial Mapping

Apple filed patents outlining "truncated ultra wide angle lenses" for AR glasses, which squeeze ultra-wide field-of-view imagery into a compact form factor while minimizing edge distortion artifacts.

Coupled with onboard depth sensors, the Apple Glasses cameras can generate occlusion masks so that virtual objects placed in 3D space are properly obscured by real-world obstacles in view. This occlusion effect is essential for convincing real/virtual environment integration.
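The occlusion idea itself is simple to illustrate. Here is a toy sketch in NumPy with made-up depth values (not Apple's actual pipeline): a virtual pixel is drawn only where it sits closer to the viewer than the real scene's depth map.

```python
import numpy as np

# Depth of the real scene at each pixel (meters), as a depth sensor might report.
# Hypothetical scene: a wall 3 m away with a pillar 1 m away in the middle columns.
real_depth = np.full((4, 6), 3.0)
real_depth[:, 2:4] = 1.0  # the pillar

# Depth at which a virtual object is rendered: 2 m away, spanning columns 1-4.
virtual_depth = np.full((4, 6), np.inf)  # inf = no virtual content at this pixel
virtual_depth[:, 1:5] = 2.0

# Occlusion mask: draw the virtual pixel only where it is closer than reality.
draw_mask = virtual_depth < real_depth

print(draw_mask.astype(int))
# Columns 2-3 stay 0: the 1 m pillar occludes the 2 m virtual object there.
```

The same comparison, done per frame at full resolution against live sensor data, is what makes a virtual object appear to pass convincingly behind real furniture.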

Jason Odom, Apple's Technology Development Manager and AR optics specialist, is listed on patents covering the optical engineering behind Apple Glasses' environment analysis capabilities.

Input

  • Touchpad on temple
  • Eye tracking sensors
  • Voice control via Siri
  • Head gestures
  • In-air typing

A video-based eye-tracking mechanism figures prominently in Apple Glasses interface patents, using IR imagery outside the normal human vision range. This enables precise, finger-like point selection atop AR constructs visible only through Apple Glasses, as well as user presence confirmation.

In-air typing detected by wrist-worn sensors may enable typing onto AR interfaces. Additionally, a newly uncovered patent depicts a finger-worn ring peripheral packing sensors and feedback modules, likely intended to increase input flexibility when interacting with AR elements generated by Apple Glasses.

Connectivity

  • Wi-Fi
  • Bluetooth
  • Wireless connectivity to iPhone and other Apple devices

Audio

  • Dual integrated speakers
  • Possible support for Spatial Audio

Spatial audio compatible with Apple Music content allows virtualized audio sources to be positioned freely across real-world space for more immersive experiences. Apple Glasses could simulate audio occurring at mapped coordinates using head-related transfer functions (HRTFs) personalized via integrated mics sampling ear geometry.

So instead of music originating from fixed left/right stereo sources pumped through tiny drivers, Apple Glasses have the potential to render localized emitters blooming with concert hall resonance and externalized bass.
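A genuine HRTF pipeline convolves the signal with measured per-ear impulse responses, but the core effect is simpler to show: each ear hears a source with a slightly different delay and level depending on its direction. The sketch below is a crude illustration only (the Woodworth delay model and the fixed 0.6 level drop are textbook simplifications, not Apple's implementation).

```python
import numpy as np

SAMPLE_RATE = 48_000
SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.09       # m, roughly half the inter-ear distance

def spatialize(mono, azimuth_deg):
    """Crude stand-in for an HRTF: apply interaural time and level
    differences for a source at the given azimuth (0 = straight ahead,
    +90 = hard right). Real HRTFs are measured impulse responses."""
    az = np.radians(azimuth_deg)
    # Extra acoustic path to the far ear (Woodworth model) -> sample delay.
    itd_seconds = HEAD_RADIUS * (abs(az) + np.sin(abs(az))) / SPEED_OF_SOUND
    delay = int(round(itd_seconds * SAMPLE_RATE))
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    far = far * 0.6  # crude head-shadow level drop at the far ear
    if azimuth_deg >= 0:   # source on the right: right ear is the near ear
        return np.stack([far, near], axis=1)  # columns: left, right
    return np.stack([near, far], axis=1)

# One second of a 440 Hz tone, placed 60 degrees to the right.
tone = np.sin(2 * np.pi * 440 * np.arange(SAMPLE_RATE) / SAMPLE_RATE)
stereo = spatialize(tone, azimuth_deg=60)
print(stereo.shape)  # (48000, 2): left channel delayed and attenuated
```

Head tracking then re-runs this per frame as you turn, which is what keeps the virtual emitter pinned to a fixed real-world coordinate rather than to your head.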

Sensors

  • Accelerometer
  • Gyroscope
  • Depth sensors for spatial mapping

Other

  • Prescription lens support
  • Adjustable, modular arms
  • Different price points from $499 to $749+
  • 4+ hour battery life

Apple Glasses Release Date

Credible rumors point to an Apple Glasses launch in 2023. Noted Apple analyst Ming-Chi Kuo predicts mass production ramping up by Q4 2022 for a 2023 release.

Though Apple has reportedly already produced a whole range of prototypes, the company is fine-tuning designs to solve issues like battery life, sizing and weight. The software also needs more refinement before launch.

Any official launch timeframe is speculation, but a 2023-2025 release aligns with past reports of Apple's 5-7 year roadmap for AR glasses. Supply chain sources cite 2023 as the target, but delays could push it back.

Once Apple Glasses reach consumers, Apple may iterate rapidly, with successors integrating faster chips and displays annually, often at lower prices. We saw this aggressive iteration in other Apple categories like the Apple Watch and AirPods.

So expect the initial Apple Glasses release to be an early, pricey first-gen stepping stone toward an eventual $200-300 product everyone can afford.

Battery Life Challenges

Reports indicate Apple is targeting 4+ hours of continuous battery life for Apple Glasses before they must recharge, reportedly by slotting into an iPhone. While no official metrics exist yet, we can extrapolate power requirements from existing products.

The Apple Watch Series 8 touts 18 hours of battery life from a 308 mAh battery. Apple Glasses packing ~1000 mAh split across both stem arms should deliver 4-5 hours comfortably. However, the advanced AR optics pipeline, 5G networking and sensor payload impact runtimes heavily. Integrating esoteric hardware like eye-tracking sensors adds overhead, and displays sitting inches from the eyes demand maximum brightness, swallowing more juice.
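That extrapolation is easy to sanity-check with back-of-the-envelope arithmetic. In this sketch, the 308 mAh / 18 hour figures come from the Apple Watch comparison above, the ~1000 mAh figure is the rumor, and the workload multipliers are purely my assumption about how much hungrier an AR workload is than a watch workload.

```python
# Apple Watch Series 8 reference point: 308 mAh lasting 18 hours.
WATCH_CAPACITY_MAH = 308
WATCH_RUNTIME_H = 18
watch_draw_ma = WATCH_CAPACITY_MAH / WATCH_RUNTIME_H  # ~17 mA average draw

# Rumored Apple Glasses capacity, split across both stem arms.
GLASSES_CAPACITY_MAH = 1000

# Displays, cameras, radios and continuous spatial tracking plausibly draw
# an order of magnitude more than a watch workload (assumed multipliers).
for workload_multiplier in (10, 15):
    draw_ma = watch_draw_ma * workload_multiplier
    runtime_h = GLASSES_CAPACITY_MAH / draw_ma
    print(f"{workload_multiplier}x watch draw -> {runtime_h:.1f} hours")
# 10x -> 5.8 hours, 15x -> 3.9 hours: consistent with the 4-5 hour target.
```

The spread of that assumed multiplier is exactly why the estimates below land between roughly 3 and 5 hours.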

If computational photography sufficiently minimizes the imagery transmitted for cloud processing, the onboard SoCs will sip power rather than gulp it. But continuous environment mapping, surface reconstruction and occlusion culling tax resources, and sustained 5G use pushes power draw higher still.

My estimate based on power profiles places worst-case endurance around 3 hours before an automatic low-power mode kicks in, limiting functionality to save power. Best case still caps out around 5 hours before charging. Innovations like solar harvesting or body-heat recycling could help, but the initial generation will make compromises favoring performance over longevity.

How Much Will Apple Glasses Cost?

Respected Apple analyst Ming-Chi Kuo estimates Apple Glasses will be priced around $500 at launch: far lower than Microsoft's $3,500 HoloLens 2 AR headset, though above the $299 Oculus Quest 2 VR headset from Facebook.

The high-end model with added capabilities could reach $749+. Prescription lenses may cost extra too.

This positions it firmly as a premium consumer device compared to enterprise-focused competitors. Production costs estimated between $400 and $600 leave room for Apple's margins.

Component                          Est. Unit Cost
Battery                            $50-60
Chipset                            $100-120
Waveguide optics                   $120-150
Cameras, sensors, input systems    $90-100
Frames, lenses, speakers           $30-50
Total                              $400-600

The battery, chipset and waveguide optics are currently the most expensive components. But over time, presuming Apple Glasses succeed in market, Apple can refine and optimize custom component designs to lower costs, as it did with the Apple Watch.

What consumers get for the price is core AR functionality fused with Apple's vaunted design and user experience: essentially a head-worn iPhone accessory matching their digital lifestyle.

Apple Glasses vs The Competition

Though innovative, Apple Glasses will enter an increasingly crowded arena at launch. Apple battles fierce rivals also offering next-gen wearables:

Product               Company     Description
Google Glass          Google      Pioneered the smart glasses concept but failed to catch on. A redesigned Enterprise version targets businesses.
Microsoft HoloLens 2  Microsoft   Impressive yet expensive at $3,500. Targets enterprises interested in ready-made AR solutions.
Oculus Quest 2        Facebook    Popular wireless VR headset that sold over 10 million units.
Project Cambria       Facebook    High-end mixed reality hardware rivaling Apple Glasses in sophistication.
Spectacles 3          Snap        Basic AR selfie glasses and a stepping stone to more advanced products.

Apple has waited long enough for the technology to feasibly miniaturize into a stylish glasses form factor. It can now apply its proven design expertise combining hardware, software and services to simplify and enhance mainstream AR utility.

The Apple ecosystem and huge iPhone installed base offer a synergy rivals cannot match. Apple Glasses extend iPhone capabilities instead of replacing mobile devices outright in disruptive ways most users would currently find unacceptable.

Real-World Apple Glasses Use Cases

Here are some examples of how Apple Glasses aim to be handy both at work and at home:

  • Navigate unfamiliar cities with turn-by-turn directions overlaid onto real streets
  • View multiple virtual displays and interface via keyboard/mouse/trackpad
  • Show product specs scanning supermarket items; project recipes onto kitchen counters
  • Translate signs/menus in foreign locales instantly
  • Get important notifications during meetings without checking phones
  • Multiplayer games against opponents in augmented locations
  • Eventually replace devices like phones, watches and tablets

Apple Glasses keep users present instead of buried in device screens. AR attaches digital content to real-world coordinates, freeing users from physical monitors.

Everything from communication and creativity to productivity and entertainment could refract through Apple Glasses' AR prism once virtual elements synchronize seamlessly with lived reality.

Apple COO Jeff Williams calls AR "critically important for Apple's future." CEO Tim Cook believes AR could elevate human ability similar to how the App Store democratized software innovation.

Privacy Concerns Around Face Data and Recording

Sharing life's moments publicly using AR technology raises fresh privacy issues, considering the cameras and sensors continuously tracking user behavior, especially facial imagery and recordings of other people.

Apple patent filings reveal designs accommodating privacy concerns, like removable camera modules and recording status indicators. Analyst Ming-Chi Kuo also asserts that Apple Glasses will lack photography and filming functionality, proactively easing regulatory pushback.

That may disappoint some consumers, but it underscores Apple's stress on privacy fundamentals rather than catering to all user demands initially.

Multiple Apple executives reiterate that privacy governs decisions on unreleased products. Other Apple wearables like the Apple Watch and AirPods reflect similar principles restricting third-party data gathering.

Augmented reality apps must likewise gain user approval before accessing sensitive data, just as iOS apps currently do. Users can toggle permissions for the camera, microphone and other sensors individually.

So Apple Glasses should offer robust privacy safeguards managing the personal data accessed by hardware and software, per Apple's established frameworks. But restrictions driven by regulatory factors may limit functionality short term versus rivals.

Closing Thoughts on Apple Glasses' Potential

If a 2023 launch happens as predicted, Apple Glasses will kick off the next epoch of spatial computing by combining Apple's user-focused design ethos with its vertical hardware/software expertise. Even in version one, Apple Glasses can drive AR mainstream.

But for potential buyers, an Apple Glasses model launching circa 2025 may prove the better milestone for seriously evaluating whether AR glasses improve daily life. Like the first Apple Watch in 2015, gen one will likely appeal more to early adopters than to general consumers.

Lacking obvious necessity, niche technical appeal will slow mainstream adoption until Apple refines the formula adequately. Interesting technology underwhelms compared to real-life utility until expectations sync with reality.

Still, with the iPod and iPhone, Apple chased relevance over profits, seeking to change consumer behavior by solving problems uniquely. Apple Glasses promise a reality made better: vital notifications and tools no longer interrupt the enjoyment of reality, but co-exist with it to elevate living.

For AR to succeed, Apple must make Apple Glasses compelling lifestyle enhancers akin to AirPods: not essential, but making routine tasks more convenient. Long term, though, Apple Glasses aim to supersede even the formidable iPhone.

Ultimately, it remains to be seen whether consumers will take Apple up on its augmented offer to digitally accessorize the adventure of daily life by wearing Apple Glasses.