Apple Glass, LIDAR, GPS Inaccuracy and the Holy Grail of Location and Mapping Accuracy!

In the movie The Graduate, a young Dustin Hoffman plays a recent college graduate who doesn’t know what to do with his life. Some jerky older adult pulls Benjamin (Hoffman’s character) aside to tell him, “One word: plastics. There is a great future in plastics.” Well, it looks like this year the great future is in LIDAR. One word: LIDAR. Hear me out.

Prosser’s Apple Glass Leak

As I mentioned in a recent post, the YouTuber Jon Prosser recently reported a leak from an Apple employee that Apple is working on releasing glasses next year. Prosser claims he has seen the glasses Apple is making: the product will cost $499 (without prescription lenses), it will be called Apple Glass, Apple Glass1 will look like regular glasses, and Apple Glass will have LIDAR technology but no cameras. It’s the last part that I will address. Why would Apple not put cameras in Apple Glass? And is LIDAR technology enough to make customers really want to purchase the product?

First, why won’t there be cameras in Apple Glass? Prosser suggested that there would be no cameras because people freaked out when Google Glass had a camera in it. I think that makes sense. Apple doesn’t want people to be afraid to wear Apple Glass; Apple wants it to be easy for everyday people to purchase the glasses. Apple has also been telling everyone that it respects consumer privacy. If there were cameras on Apple Glass, people would be afraid that whoever is wearing Apple Glass is recording video of what they see.

A number of years ago, I purchased Snap’s glasses, Spectacles. You can see the video I made here. They were fun to wear, but the glasses were big and people could see the camera on one side. When you had the glasses take pictures or record video, a light would turn on on the side of the glasses, letting people know you were recording them. I wore them for a while, but I felt I looked kind of silly wearing them, and I have stopped using them.

So one reason Apple might not put cameras on Apple Glass is that people might be afraid to wear the glasses if everyone knew they had cameras. That makes sense.

Another reason not to use cameras on Apple Glass is that cameras that are always working, or on a lot, could eat up the battery. Prosser says Apple Glass will need to be connected to the iPhone to work, so it will probably connect wirelessly over Bluetooth or some other wireless technology like the W1 chip. That wireless connection will itself eat up some of the battery on Apple Glass. And if Apple Glass looks like regular glasses, there won’t be much room for batteries. So maybe Apple is leaving out cameras to save battery energy.

LIDAR

But how could Apple Glass be a great technology if it doesn’t have cameras? The answer is LIDAR, which Prosser mentioned is on Apple Glass. He also pointed out that Apple put LIDAR on the latest iPad Pro so Apple could test out LIDAR before Apple Glass comes out. I think that makes sense. But how could LIDAR make Apple Glass a great new technology?

First, what is LIDAR? Here is how Velodyne, a major manufacturer of LIDAR sensors and technology, describes it:

Lidar is an acronym for “light detection and ranging.” It is sometimes called “laser scanning” or “3D scanning.” The technology uses eye-safe laser beams to create a 3D representation of the surveyed environment. Lidar is used in many industries, including automotive, trucking, UAV/drones, industrial, mapping, and others.

Velodyne goes on to explain how LIDAR works:

A typical lidar sensor emits pulsed light waves from a laser into the environment. These pulses bounce off surrounding objects and return to the sensor. The sensor uses the time it took for each pulse to return to the sensor to calculate the distance it traveled. Repeating this process millions of times per second creates a real-time 3D map of the environment. An onboard computer can utilize this 3D map of the surrounding environment for navigation.
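
To make the time-of-flight math concrete, here is a minimal Swift sketch of the calculation Velodyne describes: the distance is the round-trip travel time of a pulse times the speed of light, divided by two. The numbers are illustrative, not from any real sensor.

```swift
import Foundation

/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Distance to an object from the round-trip time of a single laser pulse.
/// The pulse travels out and back, so the total path is divided by two.
func distance(fromRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// Example: a pulse that returns after 33 nanoseconds has traveled roughly
// 10 meters round trip, so the object is about 5 meters away.
let nanoseconds = 33.0
let meters = distance(fromRoundTripTime: nanoseconds * 1e-9)
print(String(format: "Object is about %.2f meters away", meters))
```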

LIDAR has been around for a long time in automotive technology, letting autonomous cars “see,” as The Verge pointed out in an article right after the Consumer Electronics Show this past January. That technology, as The Verge noted, has been very expensive, but it is now becoming much cheaper.

So cheap, in fact, that the companies leading the pack now predict LIDAR will become as commonplace in mass-market vehicles as cameras, radar, and other low-cost safety technology.

(The Verge).

So why would Apple want to put pulsed lasers rather than cameras in Apple Glass? Or why not both? As Velodyne explained, LIDAR has an advantage over cameras for mapping out what is around you because it immediately creates a 3D map, while cameras create 2D images that the computer then has to work out how to turn into 3D.

Cameras produce 2D images of the environment and companies are installing them around vehicles to use for navigation. However, there are serious problems with the accuracy of proposed camera-centric systems which have not been solved and will likely not be solved soon.

Lidar “sees” in 3D, a huge advantage when accuracy and precision is paramount. The laser-based technology produces real-time high-resolution 3D maps, or point clouds, of the surroundings, demonstrating a level of distance accuracy that is unmatched by cameras, even ones with stereo vision. Whereas cameras have to make assumptions about an object’s distance, lidar produces and provides exact measurements. For this reason, autonomous or highly automated systems require lidar for safe navigation. The ability to “see” in 3D can’t be underestimated. Lidar produces billions of data points at nearly the speed of light. Each point provides a precise measurement of the environment. Compared to camera systems, lidar’s ability to “see” by way of precise mathematical measurements, decreases the chance of feeding false information from the vision systems to the car’s computer.

(Velodyne).

So LIDAR appears to be more efficient than cameras at mapping the world. LIDAR may also use less energy than having cameras on all the time. And it looks like cameras require more processing power to determine what they are seeing.

Now, the LIDAR Apple puts into Apple Glass may be similar to the LIDAR sensor Apple put into the newest iPad Pro. This is how Apple discussed its LIDAR technology in its March 18, 2020 press release announcing the newest iPad Pros:

Breakthrough LiDAR Scanner

The breakthrough LiDAR Scanner enables capabilities never before possible on any mobile device. The LiDAR Scanner measures the distance to surrounding objects up to 5 meters away, works both indoors and outdoors, and operates at the photon level at nano-second speeds. New depth frameworks in iPadOS combine depth points measured by the LiDAR Scanner, data from both cameras and motion sensors, and is enhanced by computer vision algorithms on the A12Z Bionic for a more detailed understanding of a scene. The tight integration of these elements enables a whole new class of AR experiences on iPad Pro.
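
For what it’s worth, on the iPad Pro developers reach that LiDAR data through ARKit’s scene-reconstruction APIs. Here is a rough sketch assuming the ARKit 3.5 mesh-reconstruction API that shipped with the 2020 iPad Pro; whatever Apple Glass eventually exposes could look very different.

```swift
import ARKit
import UIKit

final class ScanViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Scene reconstruction is only available on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }
        session.run(configuration)
    }

    // ARKit delivers the reconstructed environment as mesh anchors that grow
    // and refine as the LiDAR scanner sweeps across the room.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            print("New mesh with \(meshAnchor.geometry.vertices.count) vertices")
        }
    }
}
```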

You figure Apple will use LIDAR sensors in Apple Glass that are at least as good as the LIDAR sensor in the iPad Pro. That means the LIDAR will be able to see anything in front of you within 5 meters, which is just short of 16 1/2 feet. So what could Apple do with 16 1/2 feet of LIDAR data on Apple Glass?

Some Ideas of What Apple Could Do!

What if Apple used the map and GPS data that the iPhone has together with the LIDAR data from Apple Glass? Let’s think about this. Your Apple Glass maps out the 16.5 feet in front of you and quickly sends that to the iPhone. Your iPhone knows where you are because of GPS, Bluetooth data, and Wi-Fi connections. Apple has spent the last few years catching up to Google Maps by sending out cars to map cities and towns in the U.S.A. and around the world. That is why in Apple Maps you can drill down on the map of the city you are in and see 3D renders of buildings and monuments. So when the LIDAR data comes back to the iPhone, the iPhone can quickly connect it to the GPS and Apple Maps data and match everything up. Even though there is no camera, the pictures and virtual 3D models of the buildings in Apple Maps could be sent to your Apple Glass and matched up with your LIDAR scan. That means your Apple Glass, and the iPhone it is connected to, will know exactly what you are looking at through your glasses and will be able to label it quickly for you. You will be able to look at a building or a neighborhood, and Apple will know exactly where you are, what you are looking toward, and what is mapped out in front of you.

This LIDAR information is really helpful for Apple because currently, GPS data is not very accurate.2 Significantly, GPS data today, according to the United States government, is accurate to within about 16 feet. That is just short of the range of the LIDAR sensor on the iPad. The LIDAR data could fill in the 16 feet of inaccuracy in the GPS data. So the LIDAR data married with the GPS data will help Apple pinpoint exactly where you are standing and where you are looking. Apple can then put augmented reality (AR) data on your Apple Glass as you look around.
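
Here is one way to picture that fusion, as a hedged sketch using Core Location: take the iPhone’s GPS fix and compass heading, then project the position forward by the LIDAR-measured distance to whatever surface the wearer is facing. The `coordinate(from:heading:distance:)` helper and all the numbers are purely illustrative assumptions, not anything Apple has described.

```swift
import Foundation
import CoreLocation

/// Project a coordinate forward by `distance` meters along `heading` degrees
/// (0 = true north). A small-offset approximation, fine for a few meters.
func coordinate(from start: CLLocationCoordinate2D,
                heading: CLLocationDirection,
                distance: CLLocationDistance) -> CLLocationCoordinate2D {
    let earthRadius = 6_371_000.0
    let bearing = heading * .pi / 180
    let latRadians = start.latitude * .pi / 180
    let deltaLat = distance * cos(bearing) / earthRadius
    let deltaLon = distance * sin(bearing) / (earthRadius * cos(latRadians))
    return CLLocationCoordinate2D(latitude: start.latitude + deltaLat * 180 / .pi,
                                  longitude: start.longitude + deltaLon * 180 / .pi)
}

// Illustrative values: a GPS fix that is only good to roughly 16 feet, the
// compass heading the wearer is facing, and a LiDAR range to the surface ahead.
let gpsFix = CLLocationCoordinate2D(latitude: 40.7484, longitude: -73.9857)
let headingDegrees: CLLocationDirection = 30      // from the phone's heading updates
let lidarRangeMeters: CLLocationDistance = 4.2    // from the Apple Glass scan

// The point the wearer is looking at, which an app could match against
// Apple Maps' 3D building data to pin down where the wearer is standing.
let lookedAtPoint = coordinate(from: gpsFix, heading: headingDegrees, distance: lidarRangeMeters)
print(lookedAtPoint.latitude, lookedAtPoint.longitude)
```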

Right now, if Apple just used GPS data, the accuracy of what you see would be really messy. But with LIDAR data being sent back to the iPhone and matched with GPS data, Apple will have much better accuracy about where you are. Apple couldn’t do this without first mapping out cities and towns on its own. Apple now has data on where buildings and structures are located. With LIDAR and other location data, Apple will be closer to the holy grail of location accuracy.

What does this mean? When I am driving and wearing my Apple Glass, I will be able to get map data on my Apple Glass rather than do what I currently do, i.e., either look at my Apple Watch for prompts or look at the iPhone next to me. With Apple Glass, Apple Maps data will be right in front of my eyes. I will look ahead, and data will come up telling me which way the route goes and where I have to turn.

Similarly, using Apple Maps to follow directions when you are walking is a pain. You have to look down at your phone or Apple Watch. And because GPS isn’t very accurate, the map is often not lined up with the direction I am facing. With Apple Glass, Apple Maps will be more accurate and very convenient. The LIDAR data from Apple Glass will tell Apple Maps exactly where I am looking, and the map data will be shown on my Apple Glass display. Walking directions will now be incredibly easy.

With Apple Maps supercharged and showing on your Apple Glass, developers will make great apps using the map data combined with the LIDAR data. Someone could make a walking tour app for tourists on Apple Glass: when you look at certain landmarks, information could be shown on your Apple Glass, or a video or audio clip could be played.
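
A walking tour app like that could be stitched together with plain Core Location geofences, with the Apple Glass display handling the overlay. A minimal sketch, with a made-up landmark and URL:

```swift
import CoreLocation

struct Landmark {
    let name: String
    let region: CLCircularRegion
    let audioGuideURL: URL
}

// Hypothetical tour stop; real data would come from the tour developer.
let flatiron = Landmark(
    name: "Flatiron Building",
    region: CLCircularRegion(center: CLLocationCoordinate2D(latitude: 40.7411, longitude: -73.9897),
                             radius: 50, identifier: "flatiron"),
    audioGuideURL: URL(string: "https://example.com/tours/flatiron.mp3")!)

final class TourManager: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    let stops: [Landmark]

    init(stops: [Landmark]) {
        self.stops = stops
        super.init()
        manager.delegate = self
        manager.requestAlwaysAuthorization()
        stops.forEach { manager.startMonitoring(for: $0.region) }
    }

    // When the wearer walks into a landmark's region, the glasses could
    // overlay its name or start the audio guide.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        if let stop = stops.first(where: { $0.region.identifier == region.identifier }) {
            print("Near \(stop.name): play \(stop.audioGuideURL)")
        }
    }
}
```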

As found by Josh Constine in leaked iOS 14 files, and building on what Benjamin Mayo at 9to5Mac reported in March 2020, Apple is working on Apple-branded QR codes that work with AR. Longtime tech maven Robert Scoble, the guy who first wore Google Glass in a shower, recently extolled how important Apple’s AR glasses with QR codes will be. Scoble believes that Apple’s QR codes will allow people to use Apple Pay by just looking at the QR code and telling Siri it is OK to pay. Scoble points out that Apple will also know your location to match up with the store and Apple Pay, because Apple knows exactly where the QR code for the store was placed. His post is on LinkedIn here. What is interesting about Apple’s QR codes working with LIDAR is that LIDAR is not a camera, yet Apple figured out how to create QR codes that LIDAR could read using software.

The Last Frontier of Mapping

But I think Apple will use Apple Glass for something even greater: one of the last frontiers that hasn’t been mapped. What is that? Is it Antarctica? Is it the deep ocean? No, it is your local gigantic box store, like Costco or Home Depot. GPS can only map what is outdoors and generally tell you roughly where you might be in a building, and Apple Maps does not have mapping information for the inside of large buildings. Apple previously tried to use technology to map the great indoors. Way back at its developers conference in 2013, Apple launched the iBeacon protocol, which uses small Bluetooth beacons that work with the iPhone. I played around with iBeacons, and they were cheap, like $10 each. You could plug them into the wall in a building, and when you walked by with an iPhone running an app that worked with them, the iBeacon could send a message to your screen. And presumably, if Apple knew where the iBeacons were placed, it could map your location indoors. But that technology never really took off. It didn’t work well with the iPhone. For it to work, stores had to place iBeacons around the store and map the location of each one, and then customers had to use the store’s app, which had to work well with the store’s data.3
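
For context, this is roughly what the developer side of iBeacon looks like: the store’s app ranges for its beacons through Core Location and reacts to the nearest one. The UUID and the “aisle” interpretation below are hypothetical.

```swift
import CoreLocation

final class BeaconListener: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    // A store would hard-code the UUID it programmed into its beacons.
    let storeBeaconUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        let constraint = CLBeaconIdentityConstraint(uuid: storeBeaconUUID)
        manager.startRangingBeacons(satisfying: constraint)
    }

    // Called roughly once a second with the beacons in range, nearest first.
    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        if let nearest = beacons.first {
            print("Aisle beacon major \(nearest.major), minor \(nearest.minor), proximity \(nearest.proximity.rawValue)")
        }
    }
}
```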

What I want as a consumer, and what iBeacons failed to deliver, is to walk into my local giant Home Depot store and, when I tell my iPhone to get me a certain screw, have the iPhone take me directly to the aisle and the exact spot where the screw I need is located. Finding products in stores without having to hunt for help or squint at a bad store map is what I want Apple to solve.

Apple Glass with LIDAR data could solve this problem. A developer working with Home Depot or other stores could use Apple Glass to quickly LIDAR-map the store. Then the Home Depot app would have an exact map of each store. And when someone came into the store with Apple Glass, the Home Depot app would know exactly where in the store the customer was located based on the LIDAR data it was getting. The Home Depot app would then guide the customer exactly to where the product is.
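
A minimal sketch of what the store side of that could look like: product locations keyed to positions in the store’s own floor coordinates, with the shopper’s LIDAR-derived position used to pick the nearest matching shelf. The `StoreMap` type, the products, and the coordinates are all hypothetical.

```swift
import Foundation
import simd

/// A position on the store floor, in meters from a fixed corner of the building.
typealias FloorPosition = SIMD2<Double>

struct ProductLocation {
    let sku: String
    let name: String
    let aisle: String
    let position: FloorPosition
}

struct StoreMap {
    let products: [ProductLocation]

    /// Find the closest shelf location for a product name, given where the
    /// shopper is standing (as estimated from the Apple Glass LiDAR scan).
    func nearestLocation(of productName: String, to shopper: FloorPosition) -> ProductLocation? {
        return products
            .filter { $0.name.localizedCaseInsensitiveContains(productName) }
            .min(by: { simd_distance($0.position, shopper) < simd_distance($1.position, shopper) })
    }
}

// Hypothetical data for a hardware store.
let store = StoreMap(products: [
    ProductLocation(sku: "100-221", name: "wood screw #8 x 1-1/4 in", aisle: "Aisle 14", position: [42.0, 7.5]),
    ProductLocation(sku: "100-305", name: "wood screw #10 x 2 in", aisle: "Aisle 14", position: [44.5, 7.5]),
])

let shopperPosition: FloorPosition = [3.0, 1.0]   // near the entrance
if let match = store.nearestLocation(of: "wood screw", to: shopperPosition) {
    print("Head to \(match.aisle), about \(Int(simd_distance(match.position, shopperPosition))) meters away")
}
```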

More importantly, Apple could use the LIDAR data from customers who walk into stores to map each store. Apple already crowdsources Siri data from millions of customers, anonymously so it doesn’t know the name of each customer, to improve Siri. Apple could do the same with the LIDAR data that customers wearing Apple Glass collect when they walk into stores. What if Apple married that LIDAR data with the Apple Pay data from when Apple customers purchase products in large stores and malls? Apple probably knows exactly which terminal in a store you used for Apple Pay. Although the stores don’t know your name and ID, Apple probably knows where the Apple Pay transaction occurred. Apple could marry that location data with the LIDAR data to map indoor stores and malls.

Don’t Forget the U1 Chip!

There is more that Apple could marry up with the LIDAR data: the U1 chip. Remember, last year Apple put a U1 chip in the iPhone 11 Pro. Back in September 2019, longtime Apple journalist Jason Snell was so impressed by the U1 chip and its potential that he penned an article entitled “The U1 chip in the iPhone 11 is the beginning of an Ultra Wideband revolution.” Snell pointed out that Apple released only this nugget of information about the U1 chip on its website:

The new Apple‑designed U1 chip uses Ultra Wideband technology for spatial awareness — allowing iPhone 11 Pro to understand its precise location relative to other nearby U1‑equipped Apple devices. It’s like adding another sense to iPhone, and it’s going to lead to amazing new capabilities.

It looks like for now the U1 chip is used for AirDrop transfers and maybe for the Apple Tags, which haven’t been released. The U1 chip is low powered and can pinpoint the location of other U1 chips.

Snell explained its potential after interviewing an industry person familiar with the technology:

But the possible applications of UWB go way beyond AirDrop and tracking tags. Decawave’s Viot says potential applications include smart home tech, augmented reality, mobile payments, the aforementioned keyless car entry, and even indoor navigation. (And it’s not a power hog, either—Viot says that Decawave’s latest UWB chip uses one-third of the power of a Bluetooth LE chip when in beacon mode, as a tracking tile would be.)

(SixColors).

So think about what this means. Apple is rolling out the U1 chip at roughly the same time it is rolling out LIDAR technology.4 Apple could have both technologies working together. For example, if eventually all Apple devices have the U1 chip, Apple could use the location of those devices together with the LIDAR data. So if you lose your device in a mall, Apple could pinpoint for you exactly where that device is.
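
None of this is public API, but as a thought experiment, a lost-device finder could combine a few UWB range readings (the U1 can measure its distance to another U1 chip) with a crowdsourced indoor map to narrow the search down to a spot in the mall. Every type and number below is hypothetical.

```swift
import Foundation
import simd

// Hypothetical readings: a UWB range to the lost device from the wearer's
// position, plus positions of known spots in the crowdsourced floor map.
struct UWBRangeReading {
    let wearerPosition: SIMD2<Double>   // meters, in the mall's floor map
    let distanceToDevice: Double        // meters, from the U1 chip
}

struct MappedPlace {
    let name: String
    let position: SIMD2<Double>
}

/// With several range readings taken as the wearer walks, the device must sit
/// near the intersection of the range circles. This toy version just scores
/// candidate places by how well they fit all the readings.
func likeliestPlace(for readings: [UWBRangeReading], among places: [MappedPlace]) -> MappedPlace? {
    return places.min(by: { a, b in
        let errorA = readings.reduce(0.0) { $0 + abs(simd_distance(a.position, $1.wearerPosition) - $1.distanceToDevice) }
        let errorB = readings.reduce(0.0) { $0 + abs(simd_distance(b.position, $1.wearerPosition) - $1.distanceToDevice) }
        return errorA < errorB
    })
}

let places = [
    MappedPlace(name: "food court table 12", position: [10, 4]),
    MappedPlace(name: "bench outside the bookstore", position: [35, 22]),
]
let readings = [
    UWBRangeReading(wearerPosition: [30, 20], distanceToDevice: 5.4),
    UWBRangeReading(wearerPosition: [25, 25], distanceToDevice: 10.4),
]
if let place = likeliestPlace(for: readings, among: places) {
    print("Your phone is probably near the \(place.name)")
}
```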

Conclusion

Developing these new sensors, Apple looks very serious about AR and about mapping indoors with LIDAR technology. Apple getting into the LIDAR game is a big deal. Remember, there have been rumors for many years that Apple is working on building a car, Project Titan. (See MacRumors.) LIDAR has long been used for building autonomous cars.
Who knows what new products and services Apple will bring to market in the coming years. Whatever they do will push us forward into the future.


  1. I refer to it as Apple Glass without “the” in front of it because that is Apple’s naming convention for products. For example, Apple always refers to “iPhone,” not “the iPhone”. ↩︎
  2. In addition to GPS, Apple’s latest iPhones also have GNSS (Global Navigation Satellite System) technology. GNSS is an umbrella term covering all the satellite location technologies. GPS relies on the old United States Department of Defense satellites, but Europe and other countries are creating new location technology using new satellites. Europe’s Galileo is precise to 1 meter, roughly three feet, which is much more precise than GPS. So eventually satellite location for iPhones and other devices will be more precise than GPS. ↩︎
  3. Apple hasn’t abandoned iBeacon technology and still touts it in the technology specifications for the iPhone and iPad devices. It refers to it as “iBeacon microlocation.” ↩︎
  4. Interestingly, it doesn’t look like the latest iPad Pros released this spring have the U1 chip. (See AppleInsider, speculating that supply issues, or the fact that the iPad Pro doesn’t have the latest processor, may be why the iPad Pros do not have the U1 chip.) ↩︎