Though many considered the recent WWDC a “quiet” event, from the perspective of AR we have really seen Apple’s vision of augmented reality, and the key role that Maps occupies within it, come into focus.
Out of the many components that will form the backbone of a viable “Apple Glass” product, LIDAR and ARKit 4’s “Location Anchors” are a direct downstream product of the massive undertaking of driving Apple Maps vehicles across major cities in the developed world. As noted by Justin O’Beirne and the developer video, ARKit 4 relies solely on LIDAR data to reference the iDevice’s position relative to the physical world. Unusually, even for an Apple product, the “Location Anchor” feature is limited to just five metropolitan areas at this time, implying that future AR applications for placing virtual assets will be wholly dependent on the scope and reach of Apple Maps’ mapping cars and pedestrian backpackers. It’s clear that Apple has learned the lessons of Google Glass and will not place any form of RGB camera on the forthcoming Apple Glass: it will be purely LIDAR-driven. Which brings me to my next point: App Clips and the Apple QR code.
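For anyone curious what this looks like in practice, here is a minimal sketch of the ARKit 4 Location Anchors workflow (the Apple Park coordinate is just a placeholder I picked; geotracking only resolves inside the supported metropolitan areas):

```swift
import ARKit
import CoreLocation

// Minimal sketch: anchor a virtual asset to a real-world coordinate with
// ARKit 4's geotracking. The coordinate below is a placeholder (Apple Park).
func placeLocationAnchor(in session: ARSession) {
    // Geotracking only works where Apple has collected the mapping data,
    // hence the availability check before running the configuration.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geotracking unavailable here: \(error?.localizedDescription ?? "unsupported area")")
            return
        }
        session.run(ARGeoTrackingConfiguration())

        // Anchor content to a latitude/longitude; when altitude is omitted,
        // ARKit resolves it from its own map data.
        let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
        let anchor = ARGeoAnchor(coordinate: coordinate)
        session.add(anchor: anchor)
    }
}
```

Note that the availability check is the API-level expression of exactly the dependency described above: if Apple’s cars haven’t mapped your city, the anchor simply never resolves.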
Remember the circular QR code from the keynote? It’s not just Apple being needlessly proprietary. Regular QR codes cannot be read by LIDAR, but according to AR evangelist Robert Scoble, Apple’s QR codes are in fact visible to LIDAR sensors. From this perspective, we can see how Apple is seeding the world with information “touchpoints” that reveal App Clips, Apple Pay interfaces, and other virtual assets whenever a device “sees” an Apple QR code. Scoble extends this into a broader argument about e-commerce opportunities to challenge Amazon, which, while compelling, I am not entirely convinced by at this point.
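Mechanically, scanning one of these codes hands the App Clip a URL, which the clip receives as a browsing-web user activity. A rough sketch of the receiving side (the domain and `product` parameter are hypothetical, just to show the shape of it):

```swift
import SwiftUI

// Sketch of an App Clip receiving its invocation URL after a code scan.
// The URL scheme (example.com, "product" query item) is hypothetical.
@main
struct CoffeeClip: App {
    @State private var productID: String?

    var body: some Scene {
        WindowGroup {
            Text(productID ?? "Scan a code")
                // Invocation URLs arrive as a browsing-web NSUserActivity.
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    // e.g. https://example.com/clip?product=latte
                    guard let url = activity.webpageURL,
                          let components = URLComponents(url: url, resolvingAgainstBaseURL: true)
                    else { return }
                    productID = components.queryItems?
                        .first(where: { $0.name == "product" })?.value
                }
        }
    }
}
```

So the code in the physical world is really just a pointer; everything interesting (the clip, the Apple Pay sheet) hangs off the URL it encodes.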
All in all, I find it impressive how methodically Apple has built out and tested these major technological assets, from the physical world (Apple Maps image collection) to the digital (arguably the most comprehensive and well-developed AR software stack), all without having shipped an explicitly AR product yet.
I have a few questions as a discussion starter:
- Will the future Apple Glasses, or other LIDAR-enabled products, enable some form of federated AR map-making? At Apple’s current image-collection rate, it would take a decade before they reach many parts of the developing world.
- How might Apple navigate this LIDAR scanning challenge when two of the biggest smartphone markets in the world — India and China — do not allow foreign companies to do any form of surveying?
Thanks guys, the future is exciting!