Christmas In The Summer – 2020 WWDC

We are just a day away from the virtual WWDC starting on June 22, 2020. Apple's annual developers conference usually takes place during the first week of June, but this year it is online only and begins later in the month.

It's always felt like Christmas in the summer. When I was a kid, Christmas was a time of surprise and new toys. For many years now, WWDC has been the same thing: a time of surprise and new toys. The difference is that the new toys are new software and hardware technology.

When Steve Jobs was alive, he ruled over WWDC. He ran the show at the main keynote, which always takes place at 10 am Pacific time on Monday, the first day. He was like the magical uncle in The Nutcracker bringing wondrous toys to the children. He captivated us with his showmanship: gather closely and I will pull something from my pocket that will dazzle you. Steve died on October 5, 2011, and his successors, led by Tim Cook, try to continue that summertime magic at WWDC.

I look forward to Monday, June 22, 2020 and the coming WWDC keynote. There are rumors and leaks of what is to come. I welcome the rumors and leaks. They provide hope of new and interesting things. At this time of difficulty in our country and world, we need the magic and positive excitement of WWDC.

Here are things that I am looking forward to:

New iMacs

The rumors are that Apple will announce new iMacs and release them at WWDC. The key part of that rumor is that Apple has finally redesigned the iMacs. As you may know, for about eight years Apple has not changed the design language of the iMacs. The rumor is that Apple has actually created a new design, in the style of the latest iPad Pros and the expensive Pro Display XDR monitor that Apple released in 2019 along with the new Mac Pro. My old iMac finally died earlier this year, so I have been looking to get a new Mac. I have been lusting after the Mac Pro, but the price of that along with a monitor is fiscally irresponsible for me to pursue. A new iMac with a fantastic screen makes more financial sense. So if new ones come out, I will probably pull the trigger and get one.

Transition of Macs to Apple’s Own Processors

Apple is expected to announce to developers that it is transitioning the Macs away from Intel's processors to Apple's own ARM processors, which Apple developed and uses in its iOS devices. I am all for this. I remember when Apple switched the Macs from PowerPC processors to Intel processors in the mid-2000s, about 15 years ago. It was a big deal back then. The Intel processors were faster and better performers. Throughout the 1990s and early 2000s, one knock against the Macs was that they weren't as powerful as the Intel-based PCs. Also, back then, there was less software available for Macs, because the software on Intel-based PCs couldn't run on Macs without virtualization software. It was a miserable experience running Windows software on those non-Intel Macs. But then Apple switched to Intel, and Apple created the Boot Camp software so you could boot the Intel Macs as a Windows computer. It was a big deal and made the Macs more useful.

So why am I looking forward to ARM-based Macs? Because Apple has shown it is an amazing processor maker. Its processors for the iPhones and iPads are amazing. I am writing this on a 2020 iPad Pro, and the power of this device is remarkable. Every year Apple comes out with newer processors that greatly increase performance over the prior year, and Apple optimizes its software for its processors to make them more efficient. Intel, meanwhile, has been lagging and rapidly going downhill as a processor company. It has let AMD lead the market in high-end processors for PCs. Intel missed the boat in developing processors for smartphones; Apple and Qualcomm now dominate smartphone processors. Apple building Macs on its own processors will result in super powerful Macs that will eventually dwarf Intel-based PCs. I don't think the new iMacs that will be announced next week will be running Apple processors, at least not as the main workhorse processors. But it looks like Apple will announce the transition to developers. Most likely, the first ARM-based Macs will be the lightest MacBooks. Then Apple will migrate its own processors up the line to the Mac Pro. Apple released the Mac Pro just last year, so it wouldn't make sense to move it to Apple processors right away. No one who spent $6k to $50k on a Mac Pro prior to WWDC wants to feel like their Mac Pro is obsolete. My guess is that Apple's transition to all ARM-based Macs will take at least three years, and the Mac Pros will be the last computers that Apple upgrades to Apple processors.

New iOS and macOS

One thing is certain: Apple will announce new features in the operating systems for iOS devices and Macs. That has been the case at every WWDC I can remember. What are some of the features I am expecting and hoping for?

Better Multitasking. Apple has made iPadOS much more useful for power users who use the iPad as their main or only computing device. In particular, multitasking and access to files have greatly improved. We can bring up at least two apps side by side and quickly get access to other apps. We can quickly share data and files among apps through the share sheet, the Files app, and Shortcuts actions. I expect Apple will go further in that direction with more multitasking features in both iPadOS and iOS for the iPhone. Apple may bring some iPadOS features to iOS, such as allowing its larger iPhones to run apps side by side like on the iPad. I expect that Apple will increase the number of apps you can bring up side by side on the iPad and make doing so more elegant.

Shortcuts App. I think the Shortcuts app and its integration into iOS and iPadOS is one of the most amazing and powerful features on iPhones and iPads. I believe Apple will make that program and its system integration more powerful. Among the things I would like to see:

  1. Allow Shortcuts to be organized into folders.
  2. Allow all shortcuts to be run and executed in the background.
  3. Improve the scripting actions, making if-then actions easier and more intuitive to create.
  4. Add more actions tied into the system. For example, iOS has a built-in password manager that works when you log into accounts in the browser and in apps. Tie that into Shortcuts.

SpringBoard Redesign

I desperately want Apple to redo the SpringBoard on iOS. One huge problem with iOS is that when you have many apps on the iPhone or iPad, it's hard to organize them and move them around. In addition, it's hard to find what page an app is on when you want to move it. I would like Apple to make this much easier. For example, when you edit a folder on the SpringBoard, you could click somewhere to get a list of all the apps on your device and pick which ones to move into that folder. Similarly, I would like to click anywhere on the SpringBoard, or in Control Center, and see a list of all my apps so I can delete them, or pick apps from the list and choose to create a folder with them or move them into existing folders. This needs to be fixed. Please, please, Apple, fix this!

AirTags

There have been rumors for more than a year that Apple is developing AirTags. These would be cheap plastic tags that you can attach to anything and then find using your iPhone or Apple Watch. The key technology is that Apple will crowdsource all the iPhones that people own to help you find where your lost AirTag is located. The idea is that if I lost my AirTag, even if my iPhone were out of Bluetooth or other wireless range of the AirTag, another person's iPhone passing by the AirTag would tell Apple where it is, and Apple would automatically tell me. I guess some people will worry about privacy, because one's iPhone could be used to help someone else find their lost AirTag, but Apple could do this while keeping everyone's personal data anonymous. A number of years ago I tried a third party's tag product, which tried to do the same thing, but it wasn't that good. The tags were expensive, the batteries didn't last long, and the tags were too big and heavy. Significantly, the ability to find the tags depended on how many people bought that tag, because the third party didn't control everyone's iPhones or devices; it only controlled the devices of owners who purchased its tag. Here, Apple could use all the iPhones that are out there, regardless of whether their owners purchased an AirTag, to help find lost AirTags.
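The crowd-sourced relay described by the rumor can be sketched as a toy model (all names here are hypothetical; this is an illustration of the concept, not Apple's actual protocol):

```python
from typing import Dict, Optional, Tuple

# Toy model of the rumored crowd-sourced "find my tag" relay (hypothetical
# names; not Apple's actual protocol). Any passerby's iPhone that hears a
# lost tag's Bluetooth beacon reports the sighting to a central service,
# and the owner later queries that service by tag ID.
sightings: Dict[str, Tuple[float, float]] = {}

def report_sighting(tag_id: str, phone_location: Tuple[float, float]) -> None:
    """A stranger's phone relays where it last heard the tag's beacon."""
    sightings[tag_id] = phone_location

def locate(tag_id: str) -> Optional[Tuple[float, float]]:
    """The owner asks where their tag was last seen."""
    return sightings.get(tag_id)

# A passerby's iPhone walks past my lost tag in a park...
report_sighting("my-lost-tag", (40.7812, -73.9665))
print(locate("my-lost-tag"))  # ...and I see the last reported location
```

The point of the design is that the reporting phone never needs to belong to the tag's owner, which is exactly why Apple's installed base would beat any third-party tag network.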

CarKey

Apple is supposedly going to let the iPhone replace your car key. This makes sense. Just as Apple has worked to replace transit cards so you can use your iPhone to get on a public bus or subway, Apple is working on making your iPhone your car key. More and more, Apple will make your iPhone the control for everything you do. Apple Wallet and Apple Pay are replacing my physical credit cards. Many buildings let you use NFC on your iPhone to get into your dorm or office. Soon governments will realize that credentials like passports and driver's licenses should be digital and live in Apple Wallet. CarKey will just be one more thing you don't have to carry because it's on your iPhone.

Augmented Reality Technology and Apps

Hey, my 2020 iPad Pro has a LIDAR sensor in it. Apple should release more augmented reality apps to work with it, and build more AR technology into iOS and macOS. Bring it on. AR glasses are likely coming soon, so let customers get used to AR now.

These are just some of the things I hope they announce on Monday. I am sure some will not be announced, some will be, and hopefully some things I never thought of will be announced and surprise me.

Apple Glass, LIDAR, GPS Inaccuracy and the Holy Grail of Location and Mapping Accuracy!

In the movie The Graduate, a young Dustin Hoffman plays a recent college graduate who doesn't know what to do with his life. A jerky older adult pulls Benjamin (Hoffman's character) aside to tell him: "one word, Plastics, there is a great future in Plastics." Well, it looks like this year the great future is in LIDAR. One word, LIDAR. Hear me out.

Prosser’s Apple Glass Leak

As I mentioned in a recent post, the YouTuber Jon Prosser recently leaked, from an Apple employee, that Apple is working on releasing glasses next year. Prosser claims he has seen the glasses Apple is making: the product will cost $499 (without prescription lenses), it will be called Apple Glass, Apple Glass1 looks like regular glasses, and Apple Glass will have LIDAR technology but no cameras. It's that last part I will address. Why would Apple not put cameras in Apple Glass? And is LIDAR technology enough to make customers really want to purchase the product?

First, why won't there be cameras in Apple Glass? Prosser suggested there would be no cameras because people freaked out when Google Glass had cameras. I think that makes sense. Apple doesn't want people to be afraid to wear Apple Glass; Apple wants it to be easy for everyday people to purchase the glasses. Apple has also been telling everyone that it respects consumers' privacy. If Apple Glass had cameras, people would be afraid that whoever is wearing it is recording what they see.

A number of years ago, I purchased Snap's glasses, Spectacles. You can see the video I made here. They were fun to wear, but the glasses were big and people could see the camera on one side. When you had the glasses take pictures or record video, a light on the side of the glasses let people know you were recording them. I wore them for a while, but I felt I looked kind of silly wearing them and I have stopped using them.

So one reason Apple might not put cameras on Apple Glass is that people might be afraid to wear them if everyone knew they had cameras. That makes sense.

Another reason not to use cameras on Apple Glass is that cameras that are always on, or on a lot, could eat up the battery. Prosser says Apple Glass will need to connect to an iPhone to work, probably wirelessly over Bluetooth or some other technology like the W1 chip, and that wireless connection will itself consume some of the battery. And if Apple Glass looks like regular glasses, there won't be much room for batteries. So saving battery energy may be another reason Apple will not put cameras on Apple Glass.

LIDAR

But how could Apple Glass be a great technology without cameras? The answer is LIDAR, which Prosser mentioned is on Apple Glass. He also pointed out that Apple put LIDAR on the latest iPad Pro so it could test the technology before Apple Glass comes out. I think that makes sense. But how could LIDAR make Apple Glass a great new technology?

First, what is LIDAR? Here is how Velodyne, a major manufacturer of LIDAR sensors and technology, describes it:

Lidar is an acronym for “light detection and ranging.” It is sometimes called “laser scanning” or “3D scanning.” The technology uses eye-safe laser beams to create a 3D representation of the surveyed environment. Lidar is used in many industries, including automotive, trucking, UAV/drones, industrial, mapping, and others.

Velodyne goes on to explain how LIDAR works:

A typical lidar sensor emits pulsed light waves from a laser into the environment. These pulses bounce off surrounding objects and return to the sensor. The sensor uses the time it took for each pulse to return to the sensor to calculate the distance it traveled. Repeating this process millions of times per second creates a real-time 3D map of the environment. An onboard computer can utilize this 3D map of the surrounding environment for navigation.
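The time-of-flight arithmetic Velodyne describes is straightforward: the pulse travels out and back, so the distance is half the round trip at the speed of light. A minimal sketch:

```python
# Lidar time-of-flight: distance = (speed of light x round-trip time) / 2,
# because the pulse travels to the object and back.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to an object given a lidar pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2

# A pulse that returns after roughly 33 nanoseconds hit something ~5 m away.
print(round(tof_distance_m(33.36e-9), 1))  # -> 5.0
```

A real sensor repeats this calculation millions of times per second across many beam angles, which is what turns single distances into a 3D point cloud.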

LIDAR has been around for a long time in automobile technology, letting autonomous cars "see," as The Verge pointed out in an article right after the Consumer Electronics Show this past January. That technology, as The Verge noted, has been very expensive, but it is becoming much cheaper.

So cheap, in fact, that the companies leading the pack now predict LIDAR will become as commonplace in mass-market vehicles as cameras, radar, and other low-cost safety technology.

(The Verge).

So why would Apple put pulsed lasers rather than cameras in Apple Glass? Or why not both? As Velodyne explains, LIDAR has an advantage over cameras for mapping your surroundings because it immediately creates a 3D map, while cameras create 2D images that the computer then has to process to infer 3D.

Cameras produce 2D images of the environment and companies are installing them around vehicles to use for navigation. However, there are serious problems with the accuracy of proposed camera-centric systems which have not been solved and will likely not be solved soon.

Lidar “sees” in 3D, a huge advantage when accuracy and precision is paramount. The laser-based technology produces real-time high-resolution 3D maps, or point clouds, of the surroundings, demonstrating a level of distance accuracy that is unmatched by cameras, even ones with stereo vision. Whereas cameras have to make assumptions about an object’s distance, lidar produces and provides exact measurements. For this reason, autonomous or highly automated systems require lidar for safe navigation. The ability to “see” in 3D can’t be underestimated. Lidar produces billions of data points at nearly the speed of light. Each point provides a precise measurement of the environment. Compared to camera systems, lidar’s ability to “see” by way of precise mathematical measurements, decreases the chance of feeding false information from the vision systems to the car’s computer.

(Velodyne).

So LIDAR appears to be more efficient than cameras at mapping the world. LIDAR may also use less energy than keeping cameras on all the time, and cameras require more processing power to determine what they are seeing.

Now the LIDAR Apple puts into Apple Glass may be similar to the LIDAR sensor that Apple put into the newest iPad Pro. This is how Apple discussed its LIDAR technology in its March 18, 2020 Press Release when it announced the newest iPad Pros:

Breakthrough LiDAR Scanner

The breakthrough LiDAR Scanner enables capabilities never before possible on any mobile device. The LiDAR Scanner measures the distance to surrounding objects up to 5 meters away, works both indoors and outdoors, and operates at the photon level at nano-second speeds. New depth frameworks in iPadOS combine depth points measured by the LiDAR Scanner, data from both cameras and motion sensors, and is enhanced by computer vision algorithms on the A12Z Bionic for a more detailed understanding of a scene. The tight integration of these elements enables a whole new class of AR experiences on iPad Pro.

You would figure Apple will use LIDAR sensors in Apple Glass that are as good as the LIDAR sensor in the iPad Pro. That means the LIDAR will be able to see anything in front of you within 5 meters, which is just short of 16 and 1/2 feet. So what could Apple do with 16 and 1/2 feet of LIDAR data for Apple Glass?

Some Ideas of What Apple Could Do!

What if Apple used the map and GPS data that the iPhone has together with the LIDAR data from Apple Glass? Let's think about this. Your Apple Glass maps out what is within 16.5 feet in front of you and quickly sends it to the iPhone. Your iPhone knows where you are because of GPS, Bluetooth data, and Wi-Fi connections. Apple has spent much of the last few years catching up to Google Maps by sending cars to map cities and towns in the U.S.A. and around the world. That is why in Apple Maps you can drill down on the map of the city you are in and see 3D renders of buildings and monuments. So when the LIDAR data comes back to the iPhone, the iPhone can quickly match it against the GPS and Apple Maps data. Even though there is no camera, the pictures and virtual 3D models of the buildings in Apple Maps could be sent to your Apple Glass and matched with your LIDAR scan. That means your Apple Glass, and the iPhone connected to it, will know exactly what you are looking at and will be able to label it quickly for you. So you will be able to look at a building or a neighborhood, and Apple will know exactly where you are, what you are looking toward, and what is mapped out in front of you.

This LIDAR information is really helpful for Apple because currently, GPS data is not very accurate.2 Significantly, the current state of GPS data, according to the United States Government, is accurate to 16 feet. That is just short of the range of the LIDAR sensor on the iPad. The LIDAR data could fill in the 16 feet of inaccuracy from the GPS data. So the LIDAR data married with the GPS data will help Apple pinpoint exactly where you are standing and where you are looking. Apple will then put Augmented Reality (AR) data on your Glass as you are looking.
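The arithmetic behind this pairing is simple: the iPad Pro's 5-meter lidar range works out to just over the roughly 16-foot civilian GPS error radius the government cites:

```python
# Compare civilian GPS accuracy (~16 ft per the US government) with the
# iPad Pro lidar's 5-meter range.
FEET_PER_METER = 3.28084

gps_error_ft = 16.0
lidar_range_ft = 5 * FEET_PER_METER

# The lidar's reach slightly exceeds the GPS error radius, so a scan can
# cover the whole zone of uncertainty around the reported position.
print(round(lidar_range_ft, 1), lidar_range_ft > gps_error_ft)  # -> 16.4 True
```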

Right now, if Apple used GPS data alone, what you see would be really messy. But with LIDAR data sent back to the iPhone and matched with GPS data, Apple will have much better accuracy regarding where you are. Apple couldn't do this without first mapping out cities and towns on its own. Apple now has data on where buildings and structures are located. With LIDAR and other location data, Apple will be closer to the holy grail of location accuracy.

What does this mean? When I am driving and wearing my Apple Glass, I will get map data on my Apple Glass rather than doing what I currently do, i.e., either looking at my Apple Watch for prompts or at the iPhone next to me. With Apple Glass, Apple Maps data will be right in front of my eyes. I will look ahead, and data will appear telling me which way the route goes and where I have to turn.

Similarly, using Apple Maps while walking and following directions is a pain. You have to look down at your phone or Apple Watch, and because GPS isn't very accurate, the map is often not lined up with the direction you are facing. With Apple Glass, Apple Maps will be more accurate and very convenient. The LIDAR data from Apple Glass will tell Apple Maps exactly where I am looking, and the map data will be shown on my Apple Glass screen. Walking directions will become incredibly easy.

With Apple Maps supercharged and showing on your Apple Glass, developers will make great apps combining the map data with the LIDAR data. Someone could make a walking tour app for tourists on Apple Glass: when you look at certain landmarks, information could be shown on your Apple Glass, or a video or audio clip played.

As found by Josh Constine in leaked iOS 14 files, and building on what Benjamin Mayo at 9to5Mac reported in March 2020, Apple is working on Apple-branded QR codes that work with AR. Robert Scoble, longtime tech maven and the guy who famously wore Google Glass in the shower, recently extolled how important Apple's AR glasses with QR codes will be. Scoble believes that Apple's QR codes will allow people to use Apple Pay by just looking at the QR code and telling Siri it is OK to pay. Scoble points out that Apple will also know your location to match up with the store and Apple Pay, because Apple knows exactly where the QR code for the store was placed. His post is on LinkedIn here. What is interesting about Apple's QR codes working with LIDAR is that LIDAR is not a camera, yet Apple figured out in software how to create QR codes that LIDAR can read.

The Last Frontier of Mapping

But I think Apple will use Apple Glass for something even greater: one of the last frontiers that hasn't been mapped. What is that? Antarctica? The deep oceans? No, it is your local gigantic box store, like Costco or Home Depot. GPS can only map what is outdoors and generally tell you roughly where you might be in a building, and Apple Maps does not have mapping information for large buildings. Apple has tried before to map the great indoors. Way back at the developers conference in 2013, Apple launched the iBeacon protocol, which uses small Bluetooth beacons that work with the iPhone. I played around with iBeacons; they were cheap, like $10 each. You could plug one into the wall in a building, and when you walked by with an iPhone running an app that worked with it, the iBeacon could send a message to your screen. And presumably, if Apple knew where the iBeacons were placed, it could map your location indoors. But that technology never really took off. It didn't work well with the iPhone. For it to work generally, stores had to place iBeacons around and map the location of each one, and then customers had to use the store's app, which had to work well with the store's data.3

What I want as a consumer, and what iBeacons failed to deliver, is to walk into my local Home Depot, tell my iPhone to find me a certain screw, and have the iPhone take me directly to the aisle and the exact spot where that screw is located. Finding products in stores without having to search for help or squint at a bad store map is what I want Apple to solve.

Apple Glass with LIDAR data could solve this problem. A developer working with Home Depot or other stores could use Apple Glass to quickly LIDAR-map the store. Then the Home Depot app would have an exact map of each store. And when someone came into the store wearing Apple Glass, the Home Depot app would know exactly where in the store the customer was located based on the LIDAR data it was getting, and could guide the customer exactly to the product.

More importantly, Apple could use the LIDAR data from customers who walk into stores to map each store. Apple already uses Siri data from customers, anonymized so it doesn't know each customer's name, to improve Siri; it is crowdsourcing Siri data from millions of customers. Apple could do the same with the LIDAR data that customers wearing Apple Glass collect when they walk into stores. What if Apple married the LIDAR data with the Apple Pay data from purchases in large stores and malls? Apple probably knows exactly which terminal in a store you used for Apple Pay. Although the stores don't know your name and ID, Apple probably knows where the Apple Pay transaction occurred. Apple could marry that location data with the LIDAR data to map indoor stores and malls.

Don’t Forget the U1 Chip!

There is more that Apple could marry with the LIDAR data. Apple has a U1 chip. Remember, last year Apple put a U1 chip in the iPhone 11 Pro. Back in September 2019, longtime Apple journalist Jason Snell was so impressed by the U1 chip and its potential that he penned an article entitled "The U1 chip in the iPhone 11 is the beginning of an Ultra Wideband revolution." Snell pointed out that Apple released only this nugget of information about the U1 chip on its website:

The new Apple‑designed U1 chip uses Ultra Wideband technology for spatial awareness — allowing iPhone 11 Pro to understand its precise location relative to other nearby U1‑equipped Apple devices. It’s like adding another sense to iPhone, and it’s going to lead to amazing new capabilities.

It looks like for now the U1 chip is used for AirDrop transfers and maybe the Apple tags, which haven't been released. The U1 chip is low-powered and can pinpoint other U1 chips.

Snell explained its potential after interviewing an industry person familiar with the technology:

But the possible applications of UWB go way beyond AirDrop and tracking tags. Decawave’s Viot says potential applications include smart home tech, augmented reality, mobile payments, the aforementioned keyless car entry, and even indoor navigation. (And it’s not a power hog, either—Viot says that Decawave’s latest UWB chip uses one-third of the power of a Bluetooth LE chip when in beacon mode, as a tracking tile would be.)

(SixColors).

So think about what this means. Apple is rolling out the U1 chip at roughly the same time it is rolling out LIDAR technology.4 Apple could have both technologies working together. For example, if eventually all Apple devices have the U1 chip, Apple could use the location of those devices together with the LIDAR data. So if you lose your device in a mall, Apple could pinpoint for you exactly where that device is.

Conclusion

With these new sensors, it looks like Apple is very serious about AR and about mapping indoors with LIDAR technology. And Apple getting into the LIDAR game matters beyond glasses. Remember, there have been rumors for many years that Apple is working on building a car, Project Titan. (See MacRumors.) LIDAR has long been used in building autonomous cars.

Who knows what new products and services Apple will bring to the market in the coming years. Whatever they do will push us forward into the future.


  1. I refer to it as Apple Glass without “the” in front of it because that is Apple’s naming convention for products. For example, Apple always refers to “iPhone,” not “the iPhone”. ↩︎
  2. In addition to GPS, Apple's latest iPhone also has GNSS (Global Navigation Satellite System) technology. GNSS is an umbrella term for all satellite location technology. GPS relies on the old United States Department of Defense satellites, but other countries and Europe are building new location systems using new satellites. Europe's Galileo is precise to 1 meter, roughly three feet, much more precise than GPS. So eventually satellite location for iPhones and other devices will be more precise than GPS. ↩︎
  3. Apple hasn’t abandoned iBeacon technology and still touts it in the technology specifications for the iPhone and iPad devices. It refers to it as “iBeacon microlocation.” ↩︎
  4. Interestingly, it doesn't look like the latest iPad Pros released this spring have the U1 chip. (See AppleInsider, speculating that the iPad Pros lack the U1 chip because of supply issues or because they don't have the latest processor.) ↩︎

iPad Pro or MacBook Air For My Friend? Which To Get?

I have a really good and close friend who is not a tech geek like I am. Recently, his 2012 13-inch MacBook Pro broke down. I told him it was time to get a new device and suggested either an iPad Pro or a MacBook Air. I gave him the pros and cons of both devices: on one hand the 12.9-inch iPad Pro, and on the other the 13-inch MacBook Air.

Now mind you, my friend didn't do much with his prior 13-inch MacBook Pro. He is a writer, so mostly he wrote with it. He used the Mail app to send and receive emails, and the Safari and Chrome browsers to go on the internet. But that was about all. He didn't keep photos on that MacBook Pro. He never edited videos. As for messages and texts, he sent and received those on his iPhone, not on his MacBook Pro. He did watch videos on the MacBook Pro from the internet, and also listened to some music. He doesn't have a TV, so his only movies and music were on the MacBook Pro.

In the end, he went with a 13-inch MacBook Air. He is very happy with his purchase. It is a great upgrade from his 2012 MacBook Pro. But I think he made a mistake. I believe he should have purchased instead the 12.9-inch iPad Pro. Here is why:

Technology-wise, he gets much more from an iPad Pro than from a MacBook Air.

Cameras. The iPad Pro has much better cameras than the MacBook Air, on the front facing you and also on the back. The MacBook Air has a 720p FaceTime camera. The iPad Pro's TrueDepth camera, on the screen side, is a 1080p camera that shoots 7MP photos. In addition, the iPad Pro has two great camera lenses on the back that shoot 4K video and photos at 12MP and 10MP. Now why should he care about these differences? Well, during the shutdown, like all of us, he is making many Zoom video calls, and if you have a better camera, the video calls you make are better. So for that reason alone the iPad Pro would be better for him than the MacBook Air. But the cameras on the back matter too. He doesn't take many pictures, and he has an iPhone, but the rear cameras would let him do more with the iPad Pro than with the MacBook Air. For one, he could scan documents with the iPad. You can't scan papers with a MacBook Air unless you have a good scanner connected to it, and a good scanner can cost an additional $500. I do all my scanning now with my iPhone 11 Pro or my old iPad Pro.

Also, the cameras on the back of the iPad Pro have new technology: a LIDAR sensor. Here is how the National Ocean Service describes LIDAR:

Lidar, which stands for Light Detection and Ranging, is a remote sensing method that uses light in the form of a pulsed laser to measure ranges (variable distances) to the Earth. These light pulses—combined with other data recorded by the airborne system— generate precise, three-dimensional information about the shape of the Earth and its surface characteristics.

A lidar instrument principally consists of a laser, a scanner, and a specialized GPS receiver. Airplanes and helicopters are the most commonly used platforms for acquiring lidar data over broad areas. Two types of lidar are topographic and bathymetric. Topographic lidar typically uses a near-infrared laser to map the land, while bathymetric lidar uses water-penetrating green light to also measure seafloor and riverbed elevations.

Now, my friend isn’t going to be interested in LIDAR technology, because he doesn’t know how it will help him. But the thing is, he will probably keep his new device for at least 7 years. And clearly Apple plans to do something with LIDAR on the iPad, and third-party developers will create apps that use it. Right now, even without LIDAR, my iPhone 11 Pro has the Measure app, which lets me measure things with the iPhone. With LIDAR, that kind of measurement app will work even better. By purchasing the MacBook Air rather than the iPad Pro, my friend is missing out on all the possible apps that could use LIDAR.

In addition to LIDAR, the iPad Pro has other sensors that the MacBook Air doesn’t have: a three-axis gyro, an accelerometer, and a barometer. Plus it has Face ID security rather than fingerprint security.

Screens. How about the screen? The iPad Pro’s screen is superior to the MacBook Air’s. The 12.9-inch iPad Pro has a 2732-by-2048-pixel resolution at 264 pixels per inch (PPI); the MacBook Air has a 2560-by-1600 native resolution at 227 PPI.

Weight. The 12.9-inch iPad Pro is lighter than the 13-inch MacBook Air: it weighs 643 grams, while the MacBook Air weighs 1.29kg (1,290 grams). If you add the Magic Keyboard to the iPad Pro, the combined weight is about the same as the MacBook Air. But the point is that you can use the iPad Pro naked, with no cover, and easily read it in bed, on the sofa, or on public transportation because it is so light.

Power. The MacBook Air got a single-core Geekbench score of 1110 and a multi-core score of 2862. The newer iPad Pros got Geekbench scores of 1118 single-core and 4704 multi-core. So the iPad is notably more powerful in multi-core processing.

Other Differences. There are a few other things that are different between the iPad Pros and the Macs.

  • First, you can get the iPad Pro with built-in cellular connectivity if you pay extra. You can’t get that with any Macintosh. So with a cellular iPad Pro you can always connect to the internet, even when you are not at home and have no Wi-Fi connection.
  • Second, the iPad Pro has access to many more apps in the iOS App Store than the Macs have in the Mac App Store. In addition, iPad apps are generally cheaper than programs for the Mac.
  • Third, and most obvious, the iPad has a touch screen, so you can navigate using your finger. This is a big deal because some apps are better used with your finger than with a mouse, trackpad, and keyboard. Say you are watching a video. Isn’t it better to hold the iPad Pro and use your finger to start the video or swipe to other videos? Same with looking at photos. Isn’t touch better for browsing them? I think so.
  • Fourth, the iPad Pro works with the latest Apple Pencil. My friend writes. Sometimes you want to write with a pencil and take notes. With the Macintosh you don’t have that choice: the screen has no touch input and no pencil support. With the iPad Pro you have a choice: touch, keyboard, trackpad, mouse, or pencil. There is also voice control. The Mac has voice control too, but I don’t think it is as useful or as ingrained in the system as voice control on iOS devices.
  • Fifth, the Mac has some major advantages, but mainly for pros. For example, on the Mac you can use Xcode to create apps for the iPhone, iPad, Mac, Apple TV, and Apple Watch. You can’t do that on the iPad Pro. But my friend is not a developer, so he should not care about that. The Mac also lets you install apps that don’t come from the Apple App Store. That is a big deal for a power user like myself. For example, I have long used the Audio Hijack program from Rogue Amoeba, which allows you to record any audio that passes through your Macintosh. There is nothing like that on iOS, as Apple doesn’t let third-party developers fool around with the system underneath.1 Similarly, because you have a Terminal app on the Mac, and because macOS was built on UNIX, you can run UNIX commands and do crazy stuff on your Mac. You can install Homebrew from the Terminal and then do additional crazy stuff. For example, there are programs that can turn your Mac into a HomeKit server, so devices that aren’t HomeKit-compatible can work with Apple’s HomeKit on your iPhone and iPad. And of course scripting: on the Mac, you can run AppleScript and other scripts to automate your machine. On the other hand, iOS has the Shortcuts app, where you can essentially create your own scripts and automations. And iOS now has the Files app, so you can navigate into each app’s folders to find the files you need. So iOS is getting closer to having a Finder like the Mac’s.
  • Sixth, security on iOS is greater than on the Mac because you can’t install programs Apple hasn’t approved on iOS. Also, because iOS apps are sandboxed when they run, unlike on the Mac where programs get closer to the underlying system, your iPhones and iPads are much less likely to be hacked or hijacked. On the Mac, you can still get locked out and have your computer hijacked if you click on the wrong thing from the internet.

What about cost? That shouldn’t be the deciding factor between the MacBook Air and the iPad Pro. The MacBook Air ranges in price from $999 to $2,249 if you max out the RAM and storage to 16GB of RAM and 2TB of SSD storage and bump up the processor slightly. The 12.9-inch iPad Pro starts at $999 and ranges up to $1,649 if you max out storage to 1TB and get built-in cellular. If you add the Magic Keyboard for the iPad Pro, add another $349. If you get the second-generation Apple Pencil, add another $129. So the iPad Pro and MacBook Air are comparable in price range. Given that my friend is likely to own this new device for at least 7 years, he could splurge on the top price of either the iPad Pro or the MacBook Air, and it would not be very expensive for each month he uses it. For example, take $2,000 and divide it by 7 years times 12 months: it comes to about $23.81 per month.
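If you want to check the math yourself, here is a quick sketch of the amortization above. The $2,000 price and the 7-year lifespan are the assumptions from this article, not Apple’s figures:

```python
# Rough cost-per-month of a device, amortized over its expected lifespan.
# The $2,000 price and 7-year lifespan are this article's own assumptions.
price_dollars = 2000
years_of_use = 7
months = years_of_use * 12  # 84 months of ownership

cost_per_month = price_dollars / months
print(f"${cost_per_month:.2f} per month")  # → $23.81 per month
```

Plug in any configuration’s price and your own expected years of use to compare devices on a per-month basis.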

What about storage? That also shouldn’t make the difference for my friend. The MacBook Air gives you more storage than the iPad Pro if you pay for it: up to 2TB for the MacBook Air versus up to 1TB for the iPad Pro. But my friend told me he only used about 10% of the storage on his 2012 MacBook Pro. That MacBook Pro had 4GB of RAM and a 500GB hard drive, so my friend used approximately 50GB of storage. The entry-level iPad Pro comes with 128GB of storage, more than two and a half times what he used on his 2012 MacBook Pro. That entry model would probably have sufficient storage for my friend.
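The same kind of back-of-the-envelope check works for storage. The 500GB drive, the 10% usage, and the 128GB entry model are the figures from this article:

```python
# How the entry-level iPad Pro's storage compares to what my friend actually used.
# Figures are from this article: a 500GB drive of which he used about 10%.
old_drive_gb = 500              # 2012 MacBook Pro hard drive
used_gb = old_drive_gb * 0.10   # roughly 10% used, i.e. 50GB
ipad_entry_gb = 128             # entry-level 12.9-inch iPad Pro storage

print(f"Used: {used_gb:.0f}GB")                     # → Used: 50GB
print(f"Headroom: {ipad_entry_gb / used_gb:.2f}x")  # → Headroom: 2.56x
```

So even the cheapest iPad Pro holds over two and a half times what he actually stored on his old Mac.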

So why did my friend pick the MacBook Air instead of the iPad Pro, when the above shows that the iPad Pro has more flexibility and much better technology?

I think he was prejudiced against the iPad Pro by a prior experience with an iPad that he purchased many years ago and quickly returned to Apple. Back in 2012, when the iPad was still in its infancy, the iPad and its operating system couldn’t do as much compared to a Mac. There were no keyboard covers then; you connected a keyboard by Bluetooth or Lightning. Significantly, the operating system wasn’t geared for physical keyboards back then, and there was no support for a mouse or trackpad. So if you used a physical keyboard a lot, you had to reach out and touch the screen to change an app or edit. For someone used to writing a lot on a MacBook Air, the 2012 iPad and its operating system could not really compete with a Mac. I remember my friend purchased the iPad on my recommendation back then, and he promptly returned it to Apple after trying to use it. He doesn’t trust that the iPad experience has changed so much since 2012 that it would now be easy for him to write on and use the iPad Pro.

Second, I don’t think he can appreciate how new things he has never tried could enhance his life. I think most people in the world are like that. It is normal. We live in today; if things are working today, we are happy with that. Something new might not work, and it takes additional effort to learn. This is why Apple is careful about how and when it releases new products. Apple knows it has to bring people to new technology by seducing them with great design that is friendly to people and technology so good that people will quickly adopt it. Look at the original iPhone. Most people didn’t think it was a big deal, or that anyone would adopt it given its high price. Remember Steve Ballmer, then the CEO of Microsoft? He said no one would purchase the iPhone. RIM, the maker of the then very popular BlackBerry, said no one would purchase the iPhone because it didn’t have a physical keyboard. iPhones and iPhone copiers (i.e., Android phones) now completely control the cell phone market. No one purchases cellphones with physical keys or keyboards. I think most people, when they start using an iPad regularly, will fully embrace it as their main computing device. My mom, who is 90, started using an iPad about 6 years ago. She has access to regular computers. But when I see her, she is always on the iPad.

Eventually my friend will have an iPad as his main device. Eventually.


  1. Apple isn’t approving Audio Hijack to get into the Mac’s system to hijack audio. Audio Hijack is not sold in the Mac App Store. Instead, you download and install it directly from Rogue Amoeba. ↩︎