Apple Glass, LIDAR, GPS Inaccuracy and the Holy Grail of Location and Mapping Accuracy!

In the movie The Graduate, a young Dustin Hoffman plays a recent college graduate who doesn’t know what to do with his life. Some jerky older adult pulls Benjamin (Hoffman’s character) aside to tell him: “One word: plastics. There is a great future in plastics.” Well, it looks like this year the great future is in LIDAR. One word: LIDAR. Hear me out.

Prosser’s Apple Glass Leak

As I mentioned in a recent post, the YouTuber Jon Prosser recently leaked information from an Apple employee that Apple is working on releasing glasses next year. Prosser claims he has seen the glasses Apple is making: the product will cost $499 (without prescription lenses), it will be called Apple Glass, Apple Glass1 will look like regular glasses, and Apple Glass will have LIDAR technology but no cameras. It’s that last part I want to address. Why would Apple not put cameras in Apple Glass? And is LIDAR technology enough to make customers really want to purchase the product?

First, why won’t there be cameras in Apple Glass? Prosser suggested there would be no cameras because people freaked out when Google Glass had a camera in it. I think that makes sense. Apple doesn’t want people to be afraid to wear Apple Glass; Apple wants to make it easy for everyday people to purchase the glasses. Apple has also been telling everyone that it respects consumer privacy. If there were cameras on Apple Glass, people would be afraid that whoever is wearing Apple Glass is recording video of what they see.

A number of years ago, I purchased Snap’s glasses, Spectacles. You can see the video I made here. They were fun to wear, but the glasses were big and people could see the camera on one side of the frame. When you had the glasses take pictures or record video, a light would turn on on the side of the glasses, letting people know you were recording them. I wore them for a while, but I felt I looked kind of silly wearing them and I have stopped using them.

So one reason Apple might not put cameras on Apple Glass is that people might be afraid to wear them if everyone knew they had cameras on them. That makes sense.

Another reason not to use cameras on Apple Glass is that cameras that are always on, or on a lot, could eat up the battery. Prosser says Apple Glass will need to be connected with the iPhone to work, so it will probably connect wirelessly via Bluetooth or some other wireless technology like the W1 chip. That wireless connection will itself eat up some of the battery. And if Apple Glass looks like regular glasses, there won’t be much room for batteries. So maybe saving battery energy is another reason Apple will not put cameras on Apple Glass.

LIDAR

But how could Apple Glass be a great technology if it doesn’t have cameras? The answer is LIDAR, which Prosser mentioned is on the Apple Glass. He also pointed out that Apple put LIDAR on the latest iPad Pro so Apple could test out LIDAR before Apple Glass comes out. I think that makes sense. But how could LIDAR make Apple Glass a great new technology?

First, what is LIDAR? Here is how Velodyne, a major manufacturer of LIDAR sensors and technology, describes it:

Lidar is an acronym for “light detection and ranging.” It is sometimes called “laser scanning” or “3D scanning.” The technology uses eye-safe laser beams to create a 3D representation of the surveyed environment. Lidar is used in many industries, including automotive, trucking, UAV/drones, industrial, mapping, and others.

Velodyne goes on to explain how LIDAR works:

A typical lidar sensor emits pulsed light waves from a laser into the environment. These pulses bounce off surrounding objects and return to the sensor. The sensor uses the time it took for each pulse to return to the sensor to calculate the distance it traveled. Repeating this process millions of times per second creates a real-time 3D map of the environment. An onboard computer can utilize this 3D map of the surrounding environment for navigation.
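
To make that concrete, the distance math in that quote is just the speed of light times the round-trip time, divided by two. Here is a tiny Swift sketch; the pulse timing is a made-up example value:

```swift
import Foundation

// Time-of-flight: the distance math from the Velodyne quote above.
// The round-trip time here is a made-up example value (33.3 nanoseconds).
let roundTripTime = 33.3e-9           // seconds
let speedOfLight  = 299_792_458.0     // meters per second

// The pulse travels out to the object and back, so halve the total path.
let distance = speedOfLight * roundTripTime / 2.0

print(String(format: "Object is about %.2f meters away", distance)) // ~4.99 m
```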

LIDAR has been around for a long time in automobile technology, letting autonomous cars “see,” as The Verge pointed out in an article right after the Consumer Electronics Show this past January. That technology, as The Verge noted, has been very expensive, but it is now becoming much cheaper.

So cheap, in fact, that the companies leading the pack now predict LIDAR will become as commonplace in mass-market vehicles as cameras, radar, and other low-cost safety technology.

(The Verge).

So why would Apple want to put pulsed lasers rather than cameras in Apple Glass? Or why not both? As Velodyne explained, LIDAR has an advantage over cameras for mapping what is around you because it immediately creates a 3D map, while cameras create 2D images that a computer then has to interpret to reconstruct the 3D scene.

Cameras produce 2D images of the environment and companies are installing them around vehicles to use for navigation. However, there are serious problems with the accuracy of proposed camera-centric systems which have not been solved and will likely not be solved soon.

Lidar “sees” in 3D, a huge advantage when accuracy and precision is paramount. The laser-based technology produces real-time high-resolution 3D maps, or point clouds, of the surroundings, demonstrating a level of distance accuracy that is unmatched by cameras, even ones with stereo vision. Whereas cameras have to make assumptions about an object’s distance, lidar produces and provides exact measurements. For this reason, autonomous or highly automated systems require lidar for safe navigation. The ability to “see” in 3D can’t be underestimated. Lidar produces billions of data points at nearly the speed of light. Each point provides a precise measurement of the environment. Compared to camera systems, lidar’s ability to “see” by way of precise mathematical measurements, decreases the chance of feeding false information from the vision systems to the car’s computer.

(Velodyne).

So LIDAR appears to be more efficient than cameras at mapping the world. LIDAR may also use less energy than having cameras on all the time. And it looks like cameras require more processor power to determine what they are seeing.

Now, the LIDAR Apple puts into Apple Glass may be similar to the LiDAR Scanner Apple put into the newest iPad Pro. This is how Apple described its LIDAR technology in its March 18, 2020 press release announcing the newest iPad Pros:

Breakthrough LiDAR Scanner

The breakthrough LiDAR Scanner enables capabilities never before possible on any mobile device. The LiDAR Scanner measures the distance to surrounding objects up to 5 meters away, works both indoors and outdoors, and operates at the photon level at nano-second speeds. New depth frameworks in iPadOS combine depth points measured by the LiDAR Scanner, data from both cameras and motion sensors, and is enhanced by computer vision algorithms on the A12Z Bionic for a more detailed understanding of a scene. The tight integration of these elements enables a whole new class of AR experiences on iPad Pro.
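
For a sense of how developers get at that scanner, here is a minimal Swift sketch using ARKit’s scene reconstruction API on a LiDAR-equipped device; the class name and the print statement are just mine for illustration:

```swift
import ARKit

// A minimal sketch of reading the LiDAR Scanner's output with ARKit on a
// LiDAR-equipped device like the 2020 iPad Pro.
class LidarViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        let config = ARWorldTrackingConfiguration()
        // Scene reconstruction (a live 3D mesh of the surroundings) is only
        // offered on devices that have the LiDAR Scanner.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }
        session.run(config)
    }

    // ARKit delivers the reconstructed environment as mesh anchors.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            print("New mesh chunk with \(meshAnchor.geometry.vertices.count) vertices")
        }
    }
}
```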

So you figure Apple will use LIDAR sensors in Apple Glass that are at least as good as the LiDAR Scanner in the iPad Pro. That means the LIDAR will be able to see anything in front of you within 5 meters, which is just short of 16 and 1/2 feet. So what could Apple do with 16 and 1/2 feet of LIDAR data on Apple Glass?

Some Ideas of What Apple Could Do!

What if Apple used the map and GPS data from the iPhone together with the LIDAR data from Apple Glass? Let’s think about this. Your Apple Glass maps out everything within 16.5 feet in front of you and quickly sends it to the iPhone. Your iPhone knows roughly where you are from GPS, Bluetooth data, and Wi-Fi connections. Apple has spent much of the last few years catching up to Google Maps by sending out cars to map cities and towns in the U.S.A. and around the world. That is why, in Apple Maps, you can drill down on the map in your city and see 3D renders of buildings and monuments. So when the LIDAR data comes back to the iPhone, the iPhone can quickly match it against the GPS and Apple Maps data. Even without a camera, the virtual 3D models of buildings in Apple Maps could be matched up with your LIDAR scan. That means your Apple Glass, and the iPhone connected to it, will know exactly what you are looking at and will be able to label it quickly for you. You will be able to look at a building or a neighborhood, and Apple will know exactly where you are, what you are looking toward, and what is mapped out in front of you.

This LIDAR information is really helpful for Apple because, currently, GPS data is not very accurate.2 Significantly, according to the United States government, GPS on a smartphone is typically accurate to about 16 feet. That is just short of the range of the LIDAR sensor on the iPad. The LIDAR data could fill in those 16 feet of GPS inaccuracy. So the LIDAR data, married with the GPS data, will help Apple pinpoint exactly where you are standing and where you are looking. Apple can then put Augmented Reality (AR) data on your Glass as you look around.

Right now, if Apple relied on GPS data alone, the accuracy of what you see would be really messy. But with LIDAR data being sent back to the iPhone and matched with GPS data, Apple will have much better accuracy about where you are. Apple couldn’t do this without first mapping out cities and towns on its own. Apple now has data on where buildings and structures are located. With LIDAR and other location data, Apple will be closer to the holy grail of location accuracy.
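
Nobody outside Apple knows how this fusion would actually be implemented, but the core idea is simple enough to sketch. Here is a toy Swift example, with entirely hypothetical names, numbers, and coordinates, of tightening a coarse GPS fix using the offsets a LIDAR scan-matcher reports between the Apple Maps 3D models and what the scanner actually sees:

```swift
import simd

// A toy sketch (not any real Apple API) of the idea above: take a coarse GPS
// fix, then correct it with the offset a LIDAR scan-matcher reports between
// where the map's 3D model predicts nearby facades and where LIDAR sees them.
func refinePosition(gpsEstimate: simd_double2,
                    observedOffsets: [simd_double2]) -> simd_double2 {
    guard !observedOffsets.isEmpty else { return gpsEstimate }
    // Average the per-feature offsets; if buildings appear shifted 3 m east,
    // it is really our position estimate that is off, so shift it back.
    let meanOffset = observedOffsets.reduce(simd_double2(0, 0), +)
        / Double(observedOffsets.count)
    return gpsEstimate - meanOffset
}

// GPS says we are here, give or take ~16 feet (hypothetical local coordinates):
let gpsFix = simd_double2(120.0, 80.0)
// The LIDAR scan shows facades consistently 3 m east / 1 m north of prediction:
let refined = refinePosition(gpsEstimate: gpsFix,
                             observedOffsets: [simd_double2(3.0, 1.0)])
print(refined) // (117.0, 79.0): a fix tightened toward LIDAR-level accuracy
```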

What does this mean? When I am driving and wearing my Apple Glass, I will be able to get map data right on my Apple Glass rather than doing what I currently do, i.e., either look at my Apple Watch for prompts or look at the iPhone next to me. With Apple Glass, Apple Maps data will be right in front of my eyes. I will look ahead, and data will appear telling me which way the route is going and where I have to turn.

Similarly, using Apple Maps while walking and following directions is a pain. You have to look down at your phone or Apple Watch. And because GPS isn’t very accurate, the map is often not lined up with the direction I am facing. With Apple Glass, Apple Maps will be more accurate and very convenient. The LIDAR data from Apple Glass will tell Apple Maps exactly where I am looking, and the map data will be shown on my Apple Glass screen. Walking directions will become incredibly easy.

With Apple Maps supercharged and showing on your Apple Glass, developers will make great apps combining the map data with the LIDAR data. Someone could make a walking tour app for tourists on Apple Glass: when you look at certain landmarks, information could be shown on your Apple Glass, or a video or audio clip played.

As found by Josh Constine in leaked iOS 14 files, and building on what Benjamin Mayo at 9to5Mac reported in March 2020, Apple is working on Apple-branded QR codes that work with AR. Longtime tech maven Robert Scoble, the guy who famously wore Google Glass in the shower, recently extolled how important Apple’s AR glasses with QR codes will be. Scoble believes that Apple’s QR codes will allow people to use Apple Pay by just looking at the QR code and telling Siri it is OK to pay. Scoble points out that Apple will further know your location to match up with the store and Apple Pay, because Apple knows exactly where the QR code for the store was placed. His post is on LinkedIn here. What is interesting about Apple’s QR codes working with LIDAR is that LIDAR is not a camera, yet Apple figured out how to create QR codes that LIDAR can read using software.

The Last Frontier of Mapping

But I think Apple will use Apple Glass for something even greater: one of the last frontiers that hasn’t been mapped. What is that? Is it Antarctica? Is it the deep oceans? No, it is your local gigantic box store, like Costco or Home Depot. GPS can only map what is outdoors and can generally only tell you roughly where you might be in a building. Apple Maps does not have mapping information for the insides of large buildings. Now, Apple previously tried to use technology to map the great indoors. Way back at the Developers Conference in 2013, Apple launched the iBeacon protocol, which uses small Bluetooth beacons that work with the iPhone. I played around with iBeacons, and they were cheap, like $10 each. You could plug one into the wall in a building, and when you walked by with an iPhone running an app that worked with it, the iBeacon could trigger a message on your screen. And presumably, if Apple knew where the iBeacons were placed, it could map your location indoors. But that technology never really took off. It didn’t work well with the iPhone. For it to work generally, stores had to place iBeacons around and map the location of each. Then the customer had to be using the store’s app, and the app had to work well with the data from the store.3
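
For the curious, the iBeacon flow I described looks roughly like this in code. A minimal Core Location sketch; the UUID and the region identifier are placeholders that a real store’s app would define:

```swift
import CoreLocation

// A minimal sketch of the iBeacon flow described above. The UUID and the
// region identifier are placeholders a real store's app would define.
class BeaconListener: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    let storeUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startMonitoring(for: CLBeaconRegion(uuid: storeUUID,
                                                    identifier: "my-store"))
        manager.startRangingBeacons(satisfying:
            CLBeaconIdentityConstraint(uuid: storeUUID))
    }

    // Called repeatedly with a rough distance estimate for each beacon heard.
    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        for beacon in beacons {
            print("Beacon \(beacon.major).\(beacon.minor): ~\(beacon.accuracy) meters")
        }
    }
}
```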

What I want as a consumer, and what iBeacons failed to deliver, is to walk into my local Home Depot, tell my iPhone to get me a certain screw, and have the iPhone take me directly to the aisle and the exact spot where the screw I need is located. Finding products in stores, without having to hunt for help or squint at a bad store map, is what I want Apple to solve.

Apple Glass with LIDAR data could solve this problem. A developer working with Home Depot or other stores could use Apple Glass to quickly LIDAR-map the store. Then the Home Depot app would have an exact map of each store. And when someone came into the store wearing Apple Glass, the Home Depot app would know exactly where in the store the customer was located based on the LIDAR data it was getting. The Home Depot app could then guide the customer exactly to where the product is.

More importantly, Apple could use the LIDAR data from customers who walk into stores to map each store. Apple already uses Siri data from customers, anonymized so it doesn’t know the name of each customer, to improve Siri; in effect, Apple is crowdsourcing Siri data from millions of customers. Apple could do the same with the LIDAR data that customers wearing Apple Glass collect when they walk into stores. What if Apple married the LIDAR data with the Apple Pay data from when customers purchase products in large stores and malls? Apple probably knows exactly which terminal in a store you used for Apple Pay. Although the stores don’t know your name and ID, Apple probably knows where the Apple Pay transaction occurred. Apple could marry that location data with the LIDAR data to map indoor stores and malls.

Don’t Forget the U1 Chip!

There is more that Apple could marry up with the LIDAR data: Apple’s U1 chip. Remember, last year Apple put a U1 chip in the iPhone 11 Pro. Back in September 2019, longtime Apple journalist Jason Snell was so impressed by the U1 chip and its potential that he penned an article entitled “The U1 Chip in the iPhone 11 Is the Beginning of an Ultra Wideband Revolution.” Snell pointed out that Apple released only this nugget of information about the U1 chip on its website:

The new Apple‑designed U1 chip uses Ultra Wideband technology for spatial awareness — allowing iPhone 11 Pro to understand its precise location relative to other nearby U1‑equipped Apple devices. It’s like adding another sense to iPhone, and it’s going to lead to amazing new capabilities.

For now, it looks like the U1 chip is used for AirDrop transfers and maybe the Apple Tags, which haven’t been released. The U1 chip is low-powered and can pinpoint other U1 chips.

Snell explained its potential after interviewing an industry person familiar with the technology:

But the possible applications of UWB go way beyond AirDrop and tracking tags. Decawave’s Viot says potential applications include smart home tech, augmented reality, mobile payments, the aforementioned keyless car entry, and even indoor navigation. (And it’s not a power hog, either—Viot says that Decawave’s latest UWB chip uses one-third of the power of a Bluetooth LE chip when in beacon mode, as a tracking tile would be.)

(SixColors).

So think about what this means. Apple is rolling out the U1 chip at roughly the same time it is rolling out the LIDAR technology.4 Apple could have both technologies working together. For example, if eventually all Apple devices have the U1 chip, Apple could use the location of those devices together with the LIDAR data. So if you lose your device in a mall, Apple could pinpoint for you exactly where that device is.
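
Apple’s NearbyInteraction framework is how developers get at U1-to-U1 ranging. Here is a minimal sketch, assuming two U1-equipped iPhones that have already exchanged discovery tokens over some other channel:

```swift
import NearbyInteraction

// A minimal sketch of U1-to-U1 ranging with Apple's NearbyInteraction
// framework. Assumes the two devices already exchanged discovery tokens
// over some other channel (the network, MultipeerConnectivity, etc.).
class U1Ranger: NSObject, NISessionDelegate {
    let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Fires as the U1 chips update their distance and direction estimates.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let distance = object.distance {
                print("Peer is \(distance) meters away")
            }
            if let direction = object.direction {
                print("Direction vector: \(direction)")
            }
        }
    }
}
```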

Conclusion

With all these new sensors, it looks like Apple is very serious about AR and about mapping the indoors with LIDAR technology. And remember, there have been rumors for many years that Apple is working on building a car, Project Titan. (See MacRumors.) LIDAR is a core technology for autonomous cars.

Who knows what new products and services Apple will bring to the market in the coming years. Whatever they do will push us forward into the future.


  1. I refer to it as Apple Glass without “the” in front of it because that is Apple’s naming convention for products. For example, Apple always refers to “iPhone,” not “the iPhone”. ↩︎
  2. In addition to GPS, Apple’s latest iPhone also has GNSS (Global Navigation Satellite System) technology. GNSS is an umbrella term for all satellite location technology. GPS relies on the older United States Department of Defense satellites, but other countries and Europe are building new location systems on new satellites. Europe’s Galileo is precise to 1 meter, roughly three feet, much more precise than GPS. So eventually, satellite location for iPhones and other devices will be more precise than GPS alone. ↩︎
  3. Apple hasn’t abandoned iBeacon technology and still touts it in the technology specifications for the iPhone and iPad devices. It refers to it as “iBeacon microlocation.” ↩︎
  4. Interestingly, it doesn’t look like the latest iPad Pros released this spring have the U1 chip. (See AppleInsider, speculating that the iPad Pro lacks the U1 chip because of supply issues or because it doesn’t have the latest processor.) ↩︎

iPad Pro or MacBook Air For My Friend? Which To Get?

I have a really good and close friend who is not a tech geek like I am. Recently, his 2012 13-inch MacBook Pro broke down. I told him it was time to get a new device and suggested either an iPad Pro or a MacBook Air. I gave him the pros and cons of both: on one hand the 12.9-inch iPad Pro, and on the other the 13-inch MacBook Air.

Now, mind you, my friend didn’t do much with his prior 13-inch MacBook Pro. He is a writer, so mostly he wrote on it. He also used the Mail app to send and receive emails, and the Safari and Chrome browsers to go on the internet. But that was about all he did with it. He didn’t keep photos on that MacBook Pro. He never edited videos. As for sending messages or texts? He sent and received those on his iPhone, not on his MacBook Pro. He did watch videos from the internet on the MacBook Pro, and also listened to some music. He doesn’t have a TV, so the MacBook Pro was his only screen for movies and audio.

In the end, he went with a 13-inch MacBook Air. He is very happy with his purchase, and it is a great upgrade from his 2012 MacBook Pro. But I think he made a mistake. I believe he should have purchased the 12.9-inch iPad Pro instead. Here is why:

Technology-wise, he gets much more from an iPad Pro than from a MacBook Air.

Cameras. The iPad Pro has much better cameras than the MacBook Air, and it has them on both the front and the back. The MacBook Air has a 720p FaceTime camera. The iPad Pro’s TrueDepth camera, on the screen side, is a 1080p camera that shoots 7MP photos. In addition, it has two great camera lenses on the back that can shoot 4K video and photos at 12MP and 10MP. Now, why should he care about these differences? Well, during the shutdown, like all of us, he is making many Zoom video calls. And if you have a better camera, the video calls you make look better. For that reason alone, the iPad Pro would be better for him than the MacBook Air. But the cameras on the back also matter. He doesn’t take many pictures, and he has an iPhone. But the back cameras would let him do more with the iPad Pro than with the MacBook Air. For one, he could scan documents with the iPad. You can’t scan papers with a MacBook Air unless you have a good scanner connected to it, and a good scanner can cost an additional $500. I do all my scanning now with my iPhone 11 Pro or my old iPad Pro.
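
As an aside for any developers reading: the scanning he would get is the same system-level document camera that Notes uses, which apps invoke through Apple’s VisionKit framework. A minimal sketch:

```swift
import UIKit
import VisionKit

// A minimal sketch of the system document scanner (VisionKit, iOS 13+),
// the same scanning UI the Notes app uses on iPhone and iPad.
class ScannerPresenter: NSObject, VNDocumentCameraViewControllerDelegate {
    func present(from host: UIViewController) {
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        host.present(scanner, animated: true)
    }

    // Each scanned page comes back as a cropped, perspective-corrected image.
    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        for page in 0..<scan.pageCount {
            let image = scan.imageOfPage(at: page)
            print("Scanned page \(page + 1): \(image.size)")
        }
        controller.dismiss(animated: true)
    }
}
```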

Also, the back of the iPad Pro has new camera technology: a LIDAR sensor. Here is how NOAA’s National Ocean Service describes LIDAR:

Lidar, which stands for Light Detection and Ranging, is a remote sensing method that uses light in the form of a pulsed laser to measure ranges (variable distances) to the Earth. These light pulses—combined with other data recorded by the airborne system— generate precise, three-dimensional information about the shape of the Earth and its surface characteristics.

A lidar instrument principally consists of a laser, a scanner, and a specialized GPS receiver. Airplanes and helicopters are the most commonly used platforms for acquiring lidar data over broad areas. Two types of lidar are topographic and bathymetric. Topographic lidar typically uses a near-infrared laser to map the land, while bathymetric lidar uses water-penetrating green light to also measure seafloor and riverbed elevations.

Now, my friend isn’t going to be interested in LIDAR technology today because he doesn’t know how it will help him. But the thing is, he will probably keep his new device for at least 7 years. And clearly Apple plans to do something with LIDAR on the iPad, and third-party developers will create programs that use it. Right now, even without LIDAR, my iPhone 11 Pro has the Measure app, which allows me to measure things with the iPhone. With LIDAR, the Measure app will work even better. By purchasing the MacBook Air rather than the iPad Pro, my friend is missing out on all the possible apps that could use LIDAR.

In addition to LIDAR, the iPad Pro has other sensors that the MacBook Air doesn’t have: a three-axis gyroscope, an accelerometer, and a barometer. Plus, it has Face ID security rather than the MacBook Air’s fingerprint security.

Screens. How about the screen? The iPad Pro’s screen is superior to the MacBook Air’s. The 12.9-inch iPad Pro has a 2732-by-2048 resolution at 264 pixels per inch (PPI). The MacBook Air has a 2560-by-1600 native resolution at 227 pixels per inch.
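
Those PPI figures fall straight out of the resolutions: PPI is the diagonal pixel count divided by the diagonal screen size. A quick Swift check (assuming the Air’s panel measures 13.3 inches diagonally):

```swift
import Foundation

// PPI = diagonal pixel count / diagonal screen size in inches.
func ppi(width: Double, height: Double, diagonalInches: Double) -> Double {
    (width * width + height * height).squareRoot() / diagonalInches
}

print(ppi(width: 2732, height: 2048, diagonalInches: 12.9)) // ~264.7 (iPad Pro)
print(ppi(width: 2560, height: 1600, diagonalInches: 13.3)) // ~227.0 (MacBook Air)
```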

Weight. The 12.9-inch iPad Pro is lighter than the 13-inch MacBook Air: it weighs 643 grams, while the MacBook Air weighs 1.29 kg (1,290 grams). If you add the Magic Keyboard to the iPad Pro, the combined weight is about the same as the MacBook Air. But the point is that you can use the iPad Pro naked, with no cover, and easily read it in bed, on the sofa, or on public transportation because it’s so light.

Power. The MacBook Air got a single-core Geekbench score of 1110 and a multi-core score of 2862. The newest iPad Pro got Geekbench scores of 1118 single-core and 4704 multi-core. So the iPad is notably more powerful in multi-core processing.

Other Differences. There are a few other differences between the iPad Pro and all the Macs.

  • First, you can get the iPad Pro with a cellular antenna if you pay extra. You can’t get that with any Macintosh. With a cellular iPad Pro, you can always connect to the internet, even when you are not at home and don’t have access to Wi-Fi.
  • Second, the iPad Pro has access to many more apps in the iOS App Store than the Mac has in the Mac App Store. In addition, iPad apps are generally cheaper than Mac programs.
  • Third, and most obvious, the iPad screen has touch, so you can navigate using your finger. This is a big deal because some apps are better used with your finger than with a mouse, trackpad, and keyboard. Say you are watching a video. Isn’t it better to hold the iPad Pro and use your finger to start the video or to swipe around for other videos? Same with looking at photos. Isn’t touch better for that? I think so.
  • Fourth, the iPad Pro works with the latest Apple Pencil. My friend writes. Sometimes you want to write with a pencil and take notes. With the Macintosh, you don’t have a choice: the screen doesn’t support touch or a pencil. With the iPad Pro, you have a choice: touch, keyboard, trackpad, mouse, or pencil. And voice control, too. The Mac has voice control, but I don’t think it is as useful or as ingrained in the system as voice control on iOS devices.
  • Fifth, the Mac has some major advantages, but mainly for pros. For example, on the Mac you can use Xcode to create apps for the iPhone, iPad, Mac, Apple TV, and Apple Watch. You can’t do that on the iPad Pro. But my friend is not a developer, so he shouldn’t care about that. The Mac also lets you install apps that don’t come from the Mac App Store. That is a big deal for a power user like myself. For example, I have long used the Audio Hijack program from Rogue Amoeba, which lets you record any audio that passes through your Macintosh. There is nothing like that on iOS, as Apple doesn’t let third-party developers fool around with the system underneath.1 Similarly, because you have a Terminal app on the Mac, and because macOS is built on UNIX, you can run UNIX commands and do crazy stuff on your Mac. You can install Homebrew from the Terminal and then do even more crazy stuff. For example, there are programs that can turn your Mac into a HomeKit server, so devices that are not HomeKit-compatible can work with Apple’s HomeKit on your iPhone and iPad. And of course scripting: on the Mac, you can run AppleScript and other scripts to automate your Mac. On the other hand, iOS has the Shortcuts app, where you can essentially create your own scripts and automations. And iOS now has the Files app, so you can navigate into each app’s folders to find the files you need. So iOS is getting closer to having a Finder like the Mac’s.
  • Sixth, security on iOS is greater than on the Mac because you can’t install non-Apple-approved programs on iOS. Also, because iOS apps are sandboxed when they run, unlike on the Mac, where programs get closer to the underlying system, your iPhone and iPad are much less likely to get hacked or hijacked. On the Mac, you can still get locked out and have your computer hijacked if you click on the wrong thing from the internet.

What about cost? That shouldn’t be the deciding factor between the MacBook Air and the iPad Pro. The MacBook Air ranges in price from $999 to $2,249 if you max out the RAM and storage to 16GB of RAM and 2TB of SSD storage and bump up the processor speed slightly. The 12.9-inch iPad Pro starts at $999 and ranges up to $1,649 if you max out storage to 1TB and get the cellular antenna built in. If you add the Magic Keyboard for the iPad Pro, add another $349; the second-generation Apple Pencil is another $129. So the iPad Pro and MacBook Air are comparable in price range. Given that my friend is likely to own this new device for at least 7 years, he could splurge on the top-price iPad Pro or MacBook Air and it would not cost much per month of use. For example, take $2,000 and divide it by 84 months (7 years times 12 months): that comes to about $23.80 per month.

What about storage? That also shouldn’t make the difference for my friend. The MacBook Air gives you more storage than the iPad Pro if you pay for it: up to 2TB for the MacBook Air versus up to 1TB for the iPad Pro. But my friend told me he used only 10% of the storage on his 2012 MacBook Pro. That MacBook Pro had 4GB of RAM and a 500GB hard drive, so my friend used approximately 50GB of storage. The entry-level iPad Pro comes with 128GB of storage, more than double what he actually used on his 2012 MacBook Pro. The entry model would probably be sufficient storage for my friend.

So why did my friend pick the MacBook Air instead of the iPad Pro when all of the above shows that the iPad Pro has more flexibility and much better technology?

I think he was prejudiced against the iPad Pro by a prior experience with an iPad that he purchased many years ago and quickly returned to Apple. Back in 2012, when the iPad was in its nascent years, the iPad and its operating system couldn’t do as much compared to a Mac. There were no keyboard covers; you connected a keyboard by Bluetooth or a cable. Significantly, the operating system wasn’t geared toward a physical keyboard back then, and there was no support for a mouse or trackpad. So if you used a physical keyboard a lot, you had to reach out and touch the screen to switch apps or edit. For someone used to writing a lot on a Mac, the 2012 iPad and its operating system could not really compete. I remember my friend purchased that iPad on my recommendation and promptly returned it to Apple after trying to use it. He doesn’t trust that the iPad experience has changed so much since 2012 that it would now be easy for him to write on and use an iPad Pro.

Second, I don’t think he can appreciate how new things he has never tried could enhance his life. I think most people in the world are like that. It is normal. We live in the present; if things are working today, we are happy with that. Something new might not work, and it takes additional effort to learn. This is why Apple is careful about how and when it releases new products. Apple knows it has to bring people to new technology by seducing them with design that is friendly to people and technology so good that people will quickly adopt it. Look at the original iPhone. Most people didn’t think it was a big deal, or that anyone would adopt it given its high price. Remember Steve Ballmer, then the CEO of Microsoft? He said no one would purchase the iPhone. RIM, maker of the then very popular BlackBerry, said no one would purchase the iPhone because it didn’t have a physical keyboard. iPhones and iPhone copiers (i.e., Android phones) now completely control the cell phone market. No one purchases cellphones with physical keys or keyboards. I think most people, once they start using an iPad regularly, will fully embrace it as their main computing device. My mom, who is 90, started using an iPad about 6 years ago. She has access to regular computers, but when I see her, she is always on the iPad.

Eventually my friend will have an iPad as his main device. Eventually.


  1. Apple won’t approve Audio Hijack for the Mac App Store because it hooks into the Mac’s system to capture audio. Instead, you download and install it directly from Rogue Amoeba. ↩︎

Are the iPhone 11 Pro and iPhone 11 Really an Upgrade from Last Year? — Yes!

Apple’s September announcement event is like a holiday for me. It is tech Christmas. For many years now, I have looked forward to September because that is when Apple announces a new iPhone. And many people in the media, and my friends, always say Apple’s new iPhone isn’t that much better than last year’s iPhone, so why should anyone upgrade? But I point out to my friends that Apple’s iPhone is meaningfully upgraded every year. Compare it with PCs. Back in the 1990s, everyone would get excited every year when companies like Dell, HP, IBM, and Compaq would ship machines with a better processor from Intel. You would get a more powerful PC, and the processors would be maybe 20% more powerful. But then it leveled off.

But Apple has been killing it every year upgrading the iPhone. Most people don’t understand that. They look at the design and say Apple’s iPhone 11 Pro and iPhone 11 look just like the iPhone XS or XR from last year. Big deal. But the real magic is the technology inside that rectangular slab. With the iPhone 11 Pro and iPhone 11, Apple has updated 4 things that are much better than last year’s phones: 1) the cameras, 2) the processors, 3) the battery, and 4) the screen.

The Camera (Cameras)

The cameras are upgraded from last year. The iPhone 11 is priced at the XR’s old level (actually lower than the iPhone XR was), but it now has 2 cameras on the back compared to 1 last year. And the iPhone 11 Pro and Pro Max now have 3 cameras on the back compared to 2 on the XS and XS Max.

The front-facing camera is now 12MP rather than 7MP, and it now records video at higher quality, like 4K at 60 frames per second.

But don’t forget the processors that work with the cameras. I won’t go into the details about the camera processors, but the bottom line is they have been greatly updated from the XS and XR iPhones. And those new processors in all the iPhone 11s do things the iPhone XS and XR cannot do.

Recording 2 Videos at the Same Time!

Among the things they can do is record video from 2 cameras at the same time. FiLMiC, which makes a great professional video recording app for the iPhone, showed on stage during Apple’s event that its app will be able to display what all 4 cameras on the iPhone 11 Pro see and, on the fly, record 2 of the video streams at the same time. This is a big deal. Before, you could record two videos at the same time with two devices, but then you had to sync them. Now the syncing is automatic.

So this will be great for sports, event recording, and vlogging. I record my kids playing tennis, but it’s hard to record the entire court. If you go to one end, you get the back of one player while the other player is far away. Now you could stand right at the net, record both halves of the court, and show it split-screen.

Similarly, vloggers will love this. They can record themselves talking while also recording what they are looking at, then quickly edit the back-and-forth view or do picture-in-picture.
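
For developers, the capability behind FiLMiC’s demo is Apple’s AVCaptureMultiCamSession API from iOS 13. Here is a simplified sketch of wiring up the front and back cameras at once; a real app would add recording outputs, error handling, and format checks:

```swift
import AVFoundation

// A minimal sketch of simultaneous two-camera capture with
// AVCaptureMultiCamSession (iOS 13+), the kind of API behind FiLMiC's demo.
func makeDualCameraSession() -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }
    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()

    for position in [AVCaptureDevice.Position.back, .front] {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: position),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { continue }

        // Wire each camera to its own output with an explicit connection,
        // as Apple recommends for multi-cam sessions.
        session.addInputWithNoConnections(input)
        let output = AVCaptureVideoDataOutput()
        session.addOutputWithNoConnections(output)

        guard let port = input.ports(for: .video,
                                     sourceDeviceType: camera.deviceType,
                                     sourceDevicePosition: position).first
        else { continue }
        let connection = AVCaptureConnection(inputPorts: [port], output: output)
        if session.canAddConnection(connection) {
            session.addConnection(connection)
        }
    }

    session.commitConfiguration()
    return session // call startRunning() to begin streaming both cameras
}
```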

Night Mode

The new processors and extra cameras allow the iPhone 11s to take great pictures in low light. Apple calls this tech “Night Mode.” This is what Apple says about it on its site:

[Screenshot: Apple’s Night Mode description from apple.com, captured September 11, 2019]

Major Upgrade to the Screen on the iPhone 11 Pro

Another major reason to upgrade, if you use your phone a lot, is that the screen on the iPhone 11 Pro is now much, much better. Without going into the technology, the screen can get much brighter than the iPhone XS’s. Why should you care? Because when you are outdoors in bright sun, the screen will be easier to see. Also, when you watch movies on your iPhone 11 Pro, they will look much better.

Battery

Another key reason to upgrade to an iPhone 11 from last year’s phone is the battery. All three iPhone 11s have much better batteries. In particular, the iPhone 11, the upgrade from the iPhone XR, now has a battery that lasts 1 hour longer than the XR’s. The iPhone 11 Pro, the upgrade from the iPhone XS, has a battery that lasts 4 hours longer than the XS’s. And the iPhone 11 Pro Max? That phone now lasts 5 hours longer than last year’s iPhone XS Max. If you use your phone a lot during the day, you might want to upgrade for the better battery alone.

Processors and other technology

But another reason to upgrade is the processors. The processors are more powerful, and when they are more powerful, you can do more with them. You can edit videos quicker. You can edit photos quicker.

And it’s not just the processors. Apple has upgraded the Wi-Fi technology so you can get faster data over Wi-Fi. So if you use Wi-Fi at home or at work to stream video or transfer large files, it will go faster.

There is other new technology, too: the U1 chip. Apple added a chip that makes the iPhone 11 more location-aware. It can be used when you AirDrop to another phone. One problem now with AirDrop is that when there are a lot of iPhones around, you see many iPhones on your AirDrop screen. But with the U1 chip, if you point your iPhone at the iPhone you want to AirDrop to, it will bring that iPhone to the top of the list on your screen. This technology is also what Apple is expected to use when it comes out with its rumored tracking tags: it will allow your iPhone to quickly find those tags in a room.

Conclusion

Overall, if you are a heavy iPhone user, it is worth upgrading to the iPhone 11, and in particular the iPhone 11 Pro. You get a better camera for taking pictures and video, a faster processor for all the work you do, a better screen, and a much better battery. That makes the upgrade worth it.