Brett Burney was in California when we recorded this week’s episode of the In the News podcast. Was he meeting with Tim Cook? I forgot to ask him during the podcast, so I suppose that means that the answer could be yes. Instead, I started by following up with Brett’s previous report on the Starbucks app and the Live Activities feature that tells you when your order is ready. After trying the feature a few times, I have some tips. But most of the episode is devoted to follow-up on the Apple Vision Pro, which is by far the primary topic of news in the Apple world right now. We also bid farewell to iTunes on Windows and discuss a few more topics.
My tip of the week is based on my initial very favorable experiences with the Apple Magic Trackpad ($109.99 on Amazon) as an input device for the Apple Vision Pro.
The Apple Vision Pro remains by far the big topic in the world of Apple. By now, I have seen just about all of the reviews. Many of them conclude, as I did, that this is such a new device that it is premature to give it a full review. But others have done their best anyway, and I think that the best review that I have seen so far is this one by Jason Snell of Six Colors. Not only does he do a great job of describing the product, but he also puts it in historical context and sets the stage for what the future could be for this device. I definitely recommend that you check out that article. And now, the other news of note from the past week:
Although we know that iOS 17.4 will be released in March, yesterday Apple released iOS 17.3.1 to fix a problem where text may unexpectedly duplicate or overlap while typing.
A while back, I went on a quest to improve the quality of the video for all of the Zoom and Teams teleconferences that I was on. For a while, I thought that I needed a better camera, and for a while, I even used software called Camo so that I could use the excellent lens on my iPhone to serve as my webcam. But ultimately, I discovered that the best solution was just to have better lighting, which made a world of difference. The item that I selected was a 2-pack of the Lume Cube Edge 2.0 LED Desk Lamp, which costs $239.99 on Amazon. I have those two lights set up on both sides of my monitor in my office, and whether I am participating in a videoconference with a court, a meeting with partners at my firm, or recording the In the News podcast, the video quality is so much better thanks to the lights. I mention all of this because I saw an article this week called You Don’t Need A New Webcam, You Need a Light by Corbin Davenport of How-To Geek, and it reminded me of how happy I have been with those lights that I purchased a year ago. A big advantage of the two-light setup that I have is that it avoids weird reflections off of my glasses and keeps half of my face from looking much better lit than the other half. If you want to upgrade how you look on camera, I have been extremely happy with the Lume Cube lights for the last year.
On May 30, 2007, when Steve Jobs was being interviewed at the D conference, he mentioned how popular iTunes was on Windows computers by saying “We’ve got cards and letters from lots of people that say that iTunes is their favorite app on Windows. It’s like giving a glass of ice water to somebody in Hell.” I thought about that quip this week when I saw Juli Clover of MacRumors announce that Apple has officially launched its new Apple Music, Apple TV, and Apple Devices apps for the PC, which means that just like iTunes is long dead on the Mac, it is now dead on the PC as well. Hopefully, PC users will be happy that they now get three small glasses of ice water instead of the one big Stanley water bottle that was iTunes.
If you are in the market for AirTags, you can get them at a big discount on Amazon right now. A few days ago, they were down to $79.99 for a 4-pack (list price is $99.99), so I bought a set. As I type this, they are even cheaper at $78.99.
Speaking of finding things, Jason Cross of Macworld provides some initial thoughts on the Apple Vision Pro including a good question: why doesn’t this device support Find My like so many other Apple devices? Strange.
The folks at iFixIt have been taking apart a Vision Pro and posting some amazing pictures along with an explanation of what everything does. This first post explains the EyeSight feature. The second post discusses the dual displays, noting for example that you can “fit more than 50 Vision Pro pixels into the space of a single iPhone 15 Pro pixel. Yes, you read that right.” Wow.
If you are curious about the security of using your eye to unlock an Apple Vision Pro using Optic ID, Apple has posted this page to describe the process. Apple says: “The probability that a random person in the population could unlock your Apple Vision Pro using Optic ID is less than 1 in 1,000,000.” That is similar to what Apple says about Face ID.
How much storage space do you need on an Apple Vision Pro? It depends upon how much you want downloaded to the device—the same thing that you need to think about for an iPhone or iPad. But to give you an example, I downloaded five 3D movies from Disney+ so that I can watch them when I am traveling, and they take up a total of just over 50 GB. Specifically: Guardians of the Galaxy Vol. 3 (6.86 GB), Avatar: The Way of Water (23.63 GB), Elemental (9.91 GB), Ant-Man and the Wasp: Quantumania (7.3 GB), and The Marvels (4.92 GB).
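For the curious, here is the quick arithmetic, using the download sizes listed above:

```python
# Download sizes (in GB) for the five 3D movies listed above.
movies = {
    "Guardians of the Galaxy Vol. 3": 6.86,
    "Avatar: The Way of Water": 23.63,
    "Elemental": 9.91,
    "Ant-Man and the Wasp: Quantumania": 7.3,
    "The Marvels": 4.92,
}
total_gb = sum(movies.values())
print(f"Total: {total_gb:.2f} GB")  # prints: Total: 52.62 GB
```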
We don’t yet have a YouTube app for the Apple Vision Pro (although Google is working on one), but in the meantime, John Voorhees of MacStories recommends Juno, a Vision Pro native app that provides a front end for YouTube.
The day after I published my “initial review” of the Apple Vision Pro, I went back and updated the first paragraph to note that Zac Hall of 9to5Mac reports that MDM support is coming in the next software update. I hope that this means that the specific MDM software that my law firm uses is updated quickly, which should mean that I can get my email, contacts, and calendar information on my Vision Pro.
As expected, different companies are starting to come out with cases for the Apple Vision Pro as an alternative to the $199 case sold by Apple. For example, WaterField Designs has a good reputation for building nice products, and they recently released the Vision Pro Shield Case. It’s a little cheaper at $159 (or $179 for a model that includes leather) and it looks like it is much more compact than Apple’s case.
As you can tell from last week’s video episode of the In the News podcast, the current version of digital Personas, used in conference calls on the Vision Pro, leaves a lot to be desired. But Apple is already making significant improvements. Dan Barbera of MacRumors shows off his current Version 1.0.2 Persona versus his Version 1.1 Persona (1.1 is still in beta) and it really is noticeably better. If Apple can improve Personas this much in a small update like this, perhaps they will get substantially better when Apple has even more time.
Trevor Sochocki of the NBC affiliate in Sarasota, Florida, reports that an Apple Watch alert helped to save the life of a deputy sheriff when it warned him of a heart issue that the hospital determined was dangerous blood clots in his lungs.
And finally, Apple Music is sponsoring the Super Bowl Halftime Show this weekend, with a performance by Usher. As a teaser for that performance, Apple released a video starring rappers Ludacris and Lil Jon, actress Taraji P. Henson, and Apple CEO Tim Cook, wearing a T-shirt that I’ve never seen him wear during an Apple product announcement. I guess we will get the conclusion to the story set up in this video on Sunday:
I’ve been using an Apple Vision Pro for just a few days now, and I want to tell you all about it. I want to, but to be fair, I can’t. I don’t just mean that I am speechless, although there is some of that for sure. The issue is that there is so much new about this device that I am going to need to use it for at least a few weeks before I really get the hang of it. Also, the thing that I am most curious to try out and review—using the Vision Pro to get real work done—is still way too preliminary right now. I cannot currently use my work email, contacts, or calendar because Apple has not yet added Mobile Device Management support. [UPDATE 2/6/2024: Zac Hall of 9to5Mac reports that MDM support is coming in the next software update.] I cannot directly use Dropbox the way that I can on an iPad (or even iPhone) because there is no Dropbox app yet. And even for apps that are there, I am still trying to figure out how to make them work well. (For example, only a few minutes ago did I learn how to be much more efficient with the 1Password app by having it confirm my identity with Optic ID instead of me typing my password. Tip: tell 1Password to let you log in using “Face ID” even though it isn’t really Face ID.) But even though I cannot possibly write a full review yet, I can provide lots of initial thoughts. And at the risk of spoiling the ending: this is perhaps the most amazing item of technology that I have owned. I am blown away.
Entertainment
I’m definitely ready to start talking about the Apple Vision Pro as an entertainment device. Just one weekend was more than enough time for me to discover that it is the best way to watch a show if you are by yourself.
I presume that you have seen a 3D movie in a movie theater before. The 3D effects can be impressive, but the movies are less bright and the color is less vivid because 3D glasses act like sunglasses. As a result, I have avoided seeing a 3D movie in the theater for many years. Watching a 3D movie in the Vision Pro removes those limitations. And the end result is simply amazing.
Saturday night, I watched the first half of the 3D movie Avatar: The Way of Water in the Disney+ app, one of many Disney 3D movies available to subscribers of the service. It was beautiful. The colors, the dynamic range, and the realism were all beyond anything I have ever seen before. I started by watching the movie using one of four special Disney environments: I selected the movie theater. When the movie started, the lights around me dimmed and I felt like I was watching the movie in a beautiful theater where I was sitting in the center. But because the movie looked so incredible, I decided that I wanted an even closer view. When I exited the special environment and just played the movie in a normal window, I was able to look at a corner of the window and change its size—something that you can do in almost any app on the Vision Pro. That let me make the video screen just a little bit larger, which for me was even better because it was even more immersive.
As great as that movie was, I was even more impressed by content that fills even more of your field of view. In the Apple TV app, there are a few Immersive Videos. They are fantastic, and Apple needs to add many more of them. This is the future. You’ve probably heard a little about them because Apple includes small clips of these videos in the much-discussed demo reel shown to early reviewers. One video places you in a music studio with Alicia Keys. Another, a show called Apple Immersive Adventure, so far has only one episode available, featuring a woman walking across a high wire stretched between mountains in Norway. These Immersive Videos fill your entire field of view, and as a result, they do the best job of making you feel like you are actually there. The video quality is crisp, the colors are vibrant, and these experiences make a regular 3D movie on a 16×9 screen feel almost boring in comparison.
Your own photos and videos
And then there is the entertainment value that comes from looking at your own photos and videos. Because you can make the Photos app as big as a wall, your photos and videos are huge and beautiful. If you take a Spatial Video using either an iPhone 15 Pro or the Apple Vision Pro itself, you get a 3D home video that is so much more realistic that it can actually provoke an emotional response. I do wish that you could expand the size of the 3D home videos even more so that they were close to the size of the Apple-produced Immersive Videos. Instead, you can tap a button at the top right to change to Immersive mode, which makes the video even bigger but also produces clouds around the edges and through the rest of your field of vision. Don’t get me wrong, this is pretty cool too, but not as impressive as a video that fills the entire field of view.
If you take a Spatial Video using the Apple Vision Pro itself, the quality doesn’t seem to be any better than what you get with an iPhone 15 Pro, and in fact, based on the tech specs, it may actually be worse. However, the videos that you take with the Apple Vision Pro take up a little bit more of your field of view than the iPhone Spatial Videos because they are square, so you see more at the top and bottom. Still, Spatial Videos that you create using the Apple Vision Pro do not come close to taking up the entire field of view like the Immersive Videos offered by Apple TV+.
Also, Spatial Videos that you take yourself are not as bright or as crisp. My guess is that this all comes down to the quality of the cameras. If a future iPhone (or Vision Pro) could take spatial videos and pictures with a camera system that supports 4K and HDR, then I suspect that we would get closer to the quality (if not the size) of what Apple is offering. That would be something special, so it is now on my bucket list for a future model of the iPhone: the ability to take 4K HDR Spatial Videos.
Having said that, don’t misjudge what I am saying about Spatial Videos that you take yourself. They are truly magical and represent a major leap in home video technology. When my son was a baby, my wife and I purchased an HD video camera that recorded to tape. It was expensive, but I figured that in the future I would appreciate having HD videos of those early years instead of standard definition videos. Now that the family TV supports 4K HDR, I’m so happy that I don’t have to look at standard definition videos of my kids when they were young and cute. So at some point in the future, when using who-knows-what generation of an Apple Vision device, I’m sure that I’ll be happy that I took as many Spatial Videos as I could back in 2024.
Augmented reality
The core feature of the Apple Vision Pro is that you can see things all around you. And it is amazing, a new revolution in computing. We are all used to window management on a computer, where you can have multiple windows that overlap to a certain extent—less so if you have a big enough monitor. But when all of the space around you that you can see, even above and behind your head, can be filled with windows, it really is something to experience.
Moreover, if I stand up from the desk where I am typing—and yes, I’m typing virtually all of this review using an Apple Vision Pro—I can walk to another room and then look back, and all of these windows are still just sitting where I placed them around the table where I was working. The backs of the windows are just gray boxes, but for many of them I can still see some of what is on the screen, and it is just mind-blowing to walk around and even through all of these virtual screens.
You can also open a different window in a different part of the room or even a different part of the house. You can leave a timer app in the kitchen, the Notes app in a different room, etc. There is something incredible about standing in a doorway and seeing different windows floating in different rooms of your house. But they don’t have to stay there forever. You can simply tap the Digital Crown button to move your apps to your current room.
In the future, I’d love to see some setting or third-party app that you could use to perfectly arrange a bunch of windows from a bunch of different apps and then save that setting so that you could restore it in the future. For now, if you turn off the Apple Vision Pro, you need to arrange the windows anew.
Other augmented reality features are also very cool. I’m using a Bluetooth keyboard, and just above my keyboard are autocorrect buttons that I can select to automatically type the next word, activate Siri, or bring up a full virtual keyboard. If I move my physical keyboard, the window floating above it moves too. It helps to create the illusion that this virtual additional display is a part of my non-virtual keyboard, blurring the distinction between the real world and the virtual world.
Given that this device can recognize a keyboard, I wish that it could also recognize and work well with an iPad or iPhone. You can easily see either one of those devices when you are using a Vision Pro, but Face ID does not work because your eyes are blocked by the Vision Pro device. I’m surprised that Apple did not already create a solution for this. For example, once the Vision Pro authenticates who I am by checking my eyes, it should communicate that to my iPhone or iPad if I am clearly looking at one of them with the Vision Pro. This is just one of many examples of situations in which it is clear that we are using an early version of the system software.
For now, there are only a limited number of ways in which the Vision Pro truly augments reality, blending the real world and the digital world. But it is really cool whenever it happens, and I look forward to seeing more of it.
Controls
You control the Apple Vision Pro by looking around and then, when you want to select something, you pinch your thumb and pointer finger together. To scroll a window, you put your thumb and pointer finger together and then move up or down or side to side as if there was an invisible string between your fingertips.
For the most part, it works quite well for me. However, I have noticed two issues, which are a result of problems that I have with my eyes. I mention them in case you have similar eye issues.
First, I have pretty severe nearsightedness. My prescription addresses this as best it can (my Sphere is -5.25 for my right eye and -7.00 for my left eye), but because my right eye is much stronger than my left eye, I found that I get better results when I tell the Vision Pro to track just my right eye. You can do this in the Settings app under Accessibility -> Eye Input, where your choices are the default of Both Eyes, Left Eye Only, or Right Eye Only. If one of your eyes is better than the other, this setting is made for you.
Second, I have a condition called Nystagmus, which means that my eyes shake, especially when they get tired. It’s one reason that even a good prescription cannot get me to 20/20 vision. When your eye is your pointing device and your eye shakes, it is more difficult to select specific items. You can see whether this is an issue for you by opening the Settings app and going to Accessibility -> Interaction (under Physical and Motor) -> Pointer Control, and then flipping the switch for the first option, Pointer Control. This makes a semi-translucent dot appear in the virtual world wherever you are looking, very similar to what you see when you use a trackpad or mouse with an iPad. If you find that the dot cursor follows your eye very precisely, then I suspect you can be very precise in controlling the Vision Pro experience. For me, the Vision Pro just gets a little harder to control at the end of a long day.
There are a few times when you can use your hand to “touch” something in virtual space. For example, when the on-screen keyboard comes up, you can use one finger to touch keys on the keyboard, as if you are typing hunt-and-peck style. But I find that it is easier to use the on-screen keyboard by looking at a letter and then pinching my fingers on that letter. And for most of your Apple Vision Pro use, you will just be looking with your eyes to simulate a mouse and pinching with your fingers to simulate a click.
Getting work done
This is the part of this review that I had really looked forward to writing. But for now, it will have to wait. As I mentioned at the outset, everything is still just too new. Most of the apps and services that I need to get my work done just don’t exist yet. And even the ones that do exist need more work.
But I can already tell that at some point, hopefully soon, this is going to be an amazing device for getting work done. I’ve already mentioned how interesting it is to place windows anywhere. If you are working with a document and want to set it aside as you type in Microsoft Word, it is great to have so much space. You can just put it up on the ceiling if you want, or to the top right. I enjoyed reading legal briefs using the PDF Expert app on the Vision Pro. PDF Expert doesn’t (yet) have an app specifically for the Vision Pro, but the iPad app works fine.
I’ve seen devices that you can buy to hold an iPad on an arm so that you can position it over you in a bed or right in front of you as you are sitting on the couch. This is what you get with the Apple Vision Pro—plus the ability to pinch the corner of your window and change it from 11" to 70", which of course you cannot do with an iPad.
And the fact that each window can be so big is truly amazing. I love being able to work with multiple huge virtual monitors—with screens larger and more crisp than anything that I could ever afford in the real world—no matter where I am located. I can easily envision sitting at a small desk in a hotel room and taking full advantage of this setup.
But getting work done requires more than just a screen. If you are typing, you will want to use some sort of external keyboard. The Vision Pro’s virtual keyboard is fine for typing a few letters or words, but for anything more than that you will want an external keyboard so that you can type quickly.
My opinion that you need an external Bluetooth keyboard to get real work done should not really be surprising. I would say the same thing about getting work done on an iPad. Or even an iPhone, if you are typing something long.
One surprise for the keyboard: I had expected Command-Tab to switch between apps, or at least to bring up some sort of app switcher. That is missing, and it seems like something that Apple should add. Quickly switching apps is a core productivity tool on any computing device.
I also suspect that a physical pointing device could be very useful. Unlike an iPad, you cannot use a Bluetooth mouse with an Apple Vision Pro. Right now, I believe that the only device that works with the Vision Pro is the second generation of the Apple Magic Trackpad, which is currently discounted 15% on Amazon for $109.99. I ordered one so that I can see if it will help me to work more quickly. (I suspect that it will also help with my Nystagmus.)
As the system software improves and the third-party app situation starts to become more complete, I strongly believe that a day will come when I will find that I will prefer to get work done with an Apple Vision Pro (plus a keyboard and trackpad) instead of an iPad (plus a keyboard and pointer device). It’s not there today, but I hope that it comes soon.
Conclusion
As I begin my journey with the Apple Vision Pro, I am incredibly excited. As an entertainment device, it is already amazing. Moreover, it is clear that Apple did a great job at coming up with the 1.0 version of a spatial computing environment. As I type these words, I am looking at these incredible, large windows all around me, floating in virtual space, and I almost cannot believe what I am looking at. The “wow” factor is definitely there. As for using an Apple Vision Pro to get work done, I did write this blog post, so the ability to write and edit is already there right now. But you can tell that it is still early stages. It took the iPhone two years to add the cut, copy, and paste commands, and I cannot imagine using an iPhone or iPad today without that basic feature. I expect to see similarly important improvements over the next year or two with the Apple Vision Pro.
Because there is clearly so much more to come, it makes perfect sense that many folks will wait until a second-generation device comes out and/or wait for the system software to mature. But if you decide to be an early adopter and get a device now, there is certainly a lot to love.
Yesterday morning, I was first in line at my local Apple Store to pick up my pre-order of the Apple Vision Pro. I used it basically all day long yesterday to explore how the device works, do some legal work including reading and annotating some briefs, explore the amazing photo and video offerings including the most amazing 3D that I have ever seen in my life, etc. And then at the end of the day, I used the Zoom app in the Vision Pro to connect with Brett Burney and record this week’s episode of the In the News podcast. If you want to hear all of my initial thoughts, you’ll definitely want to check out this episode.
Better yet, I encourage you to “see” and not just “hear.” If you are like me, you listen to audio versions of your podcasts. But for this episode, you should at least watch a little bit of the video, which is embedded and linked below, to see what the avatar version of me looks like using Apple’s very-much-in-beta Persona feature. Just take a deep breath before you start watching my avatar; that’s what I had to do.
This episode is all about the Apple Vision Pro, and my one tip of the week is for folks who are lucky enough to have one but may want to recalibrate the eye tracking once they have more of a sense of how everything works.
The Apple Vision Pro officially goes on sale today. I’m scheduled to pick up my pre-order at my local Apple Store this morning. Also, starting at 8am today, you can sign up for a Vision Pro demo appointment at an Apple Store if you want to try it out for about 20 minutes. Even if you don’t buy one right away, I think that this will be something that you will want to try out. The importance of this device in the history of Apple and computing in general will be judged by future historians, but right now, it does seem like we are witnessing the launch of something revolutionary. This was made clear in an article about the Vision Pro released yesterday that I highly recommend: the cover story by Nick Bilton for the most recent issue of Vanity Fair (available in Apple News+). As a 1.0 product, I’m sure that we will one day look back on this device and laugh at some of its specifications—perhaps it will one day seem too big or too heavy or underpowered. But just like Apple introduced a graphical user interface to the masses with the Mac in 1984 and Apple introduced a touch smartphone interface to all of us with the iPhone in 2007, today is the day that Apple brings spatial computing to the world. As people start using it, we will likely see interesting apps from developers unveiled over the course of this year. And in the upcoming years, Apple will tweak the device to provide even more of the features that people like. Whether you are one of the few who is an early adopter or you are just watching from the sidelines to find out whether this technology is useful or a bust, there is sure to be lots to talk about. If you want to hear my initial reactions, Brett Burney and I will be recording an episode of the In the News podcast at the end of the day today, so look out for that episode tomorrow. And now, the news of note from the past week:
Apple announced yesterday that there will be over 600 new apps for Vision Pro available at launch, plus over a million iPad apps that are compatible.
In a post on the Microsoft 365 blog, Gabriel Valdez Malpartida reveals that the Microsoft apps Teams, Word, Excel, PowerPoint, Outlook, OneNote, and Loop will be available for the Apple Vision Pro today. Apple had previously demonstrated the Word app, so we knew that one was coming, but it is nice to see the others as well.
If you work in a corporate environment that uses Mobile Device Management (MDM) to manage devices like the iPhone, as I do, I believe we will need to have MDM enabled on the Apple Vision Pro to access certain services such as Mail and Calendar. And so far, I haven’t heard anything about MDM being available on the Vision Pro. For example, Microsoft has a product called Intune that is used by many law firms and other businesses to manage devices, and I don’t see any reference to Intune in that post from Microsoft that I just mentioned. A few weeks ago, Jonny Evans explained in his Appleholic column in Computerworld why it is important for Apple to support MDM with the Vision Pro, but that article is the only reference I have seen on this topic. Without MDM support, I suspect that I won’t be able to access my work email on an Apple Vision Pro even though I can do so on my iPhone or iPad. This seems like one of countless issues that Apple will need to address in future software updates.
The Zoom app will be available for the Apple Vision Pro starting today, as reported by Chance Miller of 9to5Mac. I’ll try it out and, if it works, I’ll try to use it for part of the time when we record today’s episode of the In the News podcast. Once it is posted, check out the YouTube version of Episode 133 of the podcast to see what you think.
Shortcut Button is an interesting Apple Vision Pro app developed by Finn Voorhees and reviewed by Matthew Cassinelli. The app lets you put floating automation buttons in space around you, such as a button next to your stove that starts a timer. (If the name “Voorhees” sounds familiar, Finn’s father is John Voorhees, a former practicing attorney who now works at MacStories and is often cited here.)
Salvador Rodriguez of the Wall Street Journal reports that Meta, the parent company of Facebook that sells the Meta Quest VR headset, hopes that the Quest will serve as a competitor to the Vision Pro in the same way that Android is a competitor to the iPhone.
Emma Roth of The Verge reports that Meta will update the Quest 2 and Quest 3 headsets so that they can also play 3D spatial videos recorded for Apple’s Vision Pro. Folks who have used a Vision Pro have reported that watching spatial videos can be an incredibly moving experience. With the lower resolution of the Meta devices, I’m curious whether that emotional aspect will still be there.
How many Apple Vision Pro units has Apple pre-sold? Only Apple knows for sure, but Juli Clover of MacRumors claims that the number is around 200,000. At one point, I heard a rumor that Apple could only get 1 million of the tiny screens used in the Vision Pro during the first year, and with two screens per device, that means only 500,000 units. If that number is even close to accurate, it is hard to believe that Apple has already sold 40% of the units available in Year 1, so some of those guesstimates must be wrong.
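If you want to check that back-of-the-envelope math, here it is in a quick sketch. Every input is a rumor or estimate, not a confirmed Apple figure:

```python
# All inputs are rumors/estimates, not confirmed Apple figures.
rumored_display_supply = 1_000_000   # micro-OLED panels rumored for Year 1
displays_per_headset = 2             # one per eye
estimated_preorders = 200_000        # MacRumors estimate

max_units_year_one = rumored_display_supply // displays_per_headset  # 500,000
fraction_presold = estimated_preorders / max_units_year_one          # 0.40
print(f"{max_units_year_one:,} possible units; {fraction_presold:.0%} pre-sold")
```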
One number that Apple has revealed for the first time is the number of users of the Apple Card credit card. As reported by Benjamin Mayo of 9to5Mac, there are over 12 million people using the Apple Card, with over $1 billion in cash back paid last year. I own an Apple Card, and while I don’t use it for all of my purchases, I definitely use it for all of my Apple purchases thanks to the 3% cash back.
The iPhone has been able to take panoramic photos for a while now, and I understand that those photos will look great in an Apple Vision Pro—so much so that I’m sure that I will soon wish that I had taken even more of them in the past. But you can also use the panoramic photo feature to perform some tricks, as Michael Potuck of 9to5Mac explains. Fun stuff.
Ed Hardy of Cult of Mac reviews NexDock, a product that looks like a laptop but connects to an iPad to essentially turn the iPad into a laptop. It’s a fascinating product, even though I don’t think that I would want to ever use it myself.
Dan Moren of Six Colors reviews the Level Lock+, a smart lock that supports Apple’s home key standard so that you can just hold up your Apple Watch near the lock to unlock the door.
If you want a web browser for the iPhone that is easier to use with just one hand, Federico Viticci of MacStories recommends the Arc Search app.
One of the shows that I thoroughly enjoyed watching on Apple TV+ last year was Hijack starring the amazing Idris Elba. Ed Hardy of Cult of Mac reveals that Apple is picking up that show for a second season. This is great news, and I look forward to seeing the next installment of this exciting series.
William Gallagher of AppleInsider reports that the victims of a horrible accident in California, in which a car ran into a palm tree, were rescued because one of the occupants was wearing an Apple Watch, which called for help.
There isn’t an iPhone connection for this one, but as an appellate attorney who is often thinking about the right font to use, this video from Elle Cordova that personifies fonts made me laugh, as did the follow-up video.
And finally, here is the advertisement called Hello Apple Vision Pro that Apple is running on TV to promote sales of this new device.
Apple gave select members of the press an Apple Vision Pro so that they could use this new device in the real world for about a week, and yesterday the embargo lifted and their initial reviews were released. It is clear from these reviews that Apple has come up with something really incredible. There was tons of praise … and also some critiques, but they are what you would expect for any first-generation product and not unlike the drawbacks of the very first iPhone, Mac, Apple Watch, car, airplane, and just about any other major new technology. Nevertheless, the consensus seems to be that what the Apple Vision Pro does now, and what it has the potential to do with future updates, is groundbreaking. Mark Spoonauer of Tom’s Guide calls it “the most innovative Apple product since the original iPhone.” And Todd Haselton of CNBC says: “This is an entirely new type of computing, providing a whole new world of experiences. It feels like the future.”
Perhaps the biggest praise came for the Apple Vision Pro as an entertainment device: a way to watch movies, TV shows, and especially 3D content in an immersive environment. As John Gruber of Daring Fireball noted: “There are 3D immersive experiences in Vision Pro that are more compelling than Disney World attractions that people wait in line for hours to see.” And Gruber notes that he is normally someone who is not a fan of 3D movies, “but in Vision Pro the depth feels natural, not distracting, and you don’t suffer any loss of brightness from wearing what are effectively sunglasses.”
The reviewers were also impressed by the quality of the tiny screens that are in front of your eyes. Text on a website is actually readable. And thanks to HDR, you can look out a window in your room or at an iPhone screen and the bright areas are not completely blown out.
Reviewers seemed to agree that the device is heavy enough that you notice the weight, although the two different straps that Apple provides seem to do a decent job of accounting for that weight on your head, with different reviewers preferring different straps. Keep in mind, though, that if you have hair with a lot of volume, the over-the-top strap (the Dual Loop Band) might mess up your hairdo, in which case I hope that you like the Solo Knit Band. One issue that had not occurred to me is that for people who wear makeup, the light seal cushion can mess up your makeup and collect makeup on the cushion. All of these issues, it seems, are inherent to this being a 1.0 product. Of course future generations will be smaller and lighter and have better battery life etc. If you don’t mind waiting a few years, I have no doubt that future Apple Vision products will address these issues, and at some point we will look back and be amazed at how big the first generation Apple Vision Pro was. During the recent 40th anniversary celebration of the first Mac, I sometimes found it hard to believe that the original Mac’s tiny black-and-white screen with a single floppy drive seemed so revolutionary at the time—except that I remember how fascinated I was by that original Mac when it debuted in 1984, not only compared to what else was on the market at the time but also because it seemed to provide a glimpse of what the future would bring.
Using the Apple Vision Pro as a productivity device is something that I’m particularly interested in, and the reviews were mixed on this front. Everyone seemed to be impressed by the ability to place different windows in different spaces all around you. You can even leave a window in a specific place if you walk around—for example, leave a timer right on top of a pot on the stove. But it is clear that to get any real typing done, you’ll want to use an external keyboard instead of the virtual keyboard. No big surprise; I’m typing these words on my iPad right now, but I’m using a Bluetooth keyboard to do so because nobody likes to type multiple paragraphs on the iPad’s on-screen keyboard.
If you own a Mac, the Vision Pro can give your Mac an amazing, huge external screen that would otherwise be too expensive to buy. But remember that you are working in isolation. If someone else is in the room with you, you cannot simply turn a monitor to show them what you are seeing.
But you might be able to learn more by seeing the Vision Pro in action, albeit in a 2D video. There are lots of them out there, but I particularly enjoyed these three:
iOS 17.3 was released this week, so Brett and I begin this week’s episode of the In the News podcast by discussing what is new, including Stolen Device Protection, shared playlists, and more. We then look forward to iOS 17.4, which could be coming in March, with even more features including transcripts in the Podcasts app and new Emoji. We then celebrate the 40th anniversary of the Mac, discuss the latest information on the Apple Vision Pro, explain why you need to watch what you type when you are using public Wi-Fi in an airport, and more.
In our In the Know segment, Brett discusses the new live tracking feature in the Starbucks app. I discuss why you might be better off paying for AppleCare monthly or yearly instead of all at once.
When I was growing up in the 1980s, I had a chance to use lots of different computers including my own personal computers—the Sinclair ZX81 followed by a Commodore 64—plus various Apple II models that were in the computer lab at my school. But when the Macintosh was introduced 40 years ago this week in 1984, that was the computer that I really wanted, so much so that I even tried to imitate it by running a graphical user interface on my Commodore 64 called GEOS made by Berkeley Softworks. In 1988, using the money that I had earned working over the summer and thanks to the generous student discount at my college bookstore, I was finally able to afford my own Mac: a Mac Plus with an external 20MB (!) hard drive. Over the last four decades, we’ve seen many other influential technology products come and go—some of them from Apple itself, such as the iPod—but through it all, the Mac has remained and is actually doing better than ever right now. And I still use a Mac every day, including right now to type these words. A lot has been written about the Mac this week thanks to the 40th anniversary, but I don’t think anyone has done a better job than Jason Snell, the former editor of Macworld magazine. Here is his excellent article on the 40th anniversary in The Verge, and he also wrote great articles on his Six Colors website about the four eras of the Mac and the Mac’s enduring appeal. I’ll admit that the iPhone, iPad, and other products like the upcoming Apple Vision Pro are often more exciting to talk about and use, but the Mac continues to get things done. Bravo for that. As we look forward to the next forty years of the Mac, here are the other items of note from the past week:
iOS 17.3 was released this week, and the major new feature is Stolen Device Protection, which I discussed earlier this week. But there are other notable changes as well, and Justin Meyers of Gadget Hacks does his typical good job of describing all of the new features.
Dan Moren notes on Mastodon that with the release of iOS 17.3, Apple has now rolled out all of the iOS 17 features that were promised when iOS 17 was revealed in 2023.
…but that doesn’t mean that we are done with iOS 17. Apple has already started beta testing iOS 17.4, and there are lots of interesting new features planned. For example, as someone who loves to listen to podcasts and who records one every week, I’m particularly interested in a new automatic transcript feature in Apple’s Podcasts app. It sounds like Apple is using AI to create the transcripts and then they sync with the audio, just like lyrics sync with a song in Apple Music. This will make it so much easier to relocate information that you heard in a prior podcast, and will have so many other uses as well. Jason Snell of Six Colors has information on this new feature, as does John Voorhees of MacStories.
Federico Viticci of MacStories notes on Mastodon that Apple has been working on this transcripts feature for over four years.
iOS 17.4 will also include new Emoji, and Roman Loyola of Macworld has a preview.
Lucas Ropek of Gizmodo has an interesting warning about being careful what you say even when you think you are saying it in private. A teenager at an airport sent a message to a friend via Snapchat and made the (bad) joke that he was going to blow up the plane as he boarded. Because he was using the airport’s public Wi-Fi, British security intercepted the message and took it seriously. Spain then scrambled two fighter jets to escort the plane to its destination, and the teenager was arrested. Spain is now seeking to make him pay $120,000 for the fighter jet deployment as a result of the teenager making what he thought was a joke in a private message to a friend.
We are now only a week away from the public release of the Apple Vision Pro. I’m sure that Apple gave some reporters and others pre-release units, so my guess is that we will see the first reviews around Tuesday of next week. David Pierce of The Verge explains that Safari may be one of the most important apps to be included with the Vision Pro because you can do just about anything with a web browser, even if a developer has not released a specific app for the Apple Vision Pro. For example, without a Netflix or YouTube app (for now at least), you’ll use Safari to watch videos from those services on the Apple Vision Pro.
Apple released a really cool video that shows the Apple Vision Pro being made. But if you are like me, you may not understand what you are seeing in every shot of the video. Product designer Marcus Kane wrote a great thread on X in which he explains, with pictures, exactly what Apple is showing off.
Chip Loder of AppleInsider explains the different modes of the Apple Vision Pro and why developers need to keep them in mind when designing spatial computing apps for this device.
As we think about the Apple Vision Pro, the next big thing from Apple, what will be the next big thing after that? Perhaps it will have something to do with a car. It is widely known that Apple has been working on car technology, and there were rumors this week from Mark Gurman of Bloomberg that we might see a product in four years. But as Zac Hall of 9to5Mac correctly points out, people have been saying that Apple’s car project is four years away for nearly nine years now.
The European Union is requiring Apple to change the way that its App Store works to comply with the Digital Markets Act. As John Voorhees of MacStories reports, Apple revealed this week how it will do so, and it includes allowing third parties to run their own app stores. It will be fascinating to see how this works in practice and whether it ends up being better for end users or just exposes them to more dangerous or unsavory apps.
And finally, one of the new features in iOS 17.3 is the ability to share an Apple Music playlist with others so that a group of people can add to and edit a playlist. Ebro Darden, Nadeska Alexis, and Lowkey, who host the weekly video series called Rap Life Review, demonstrate how this new feature works in a video released this week by Apple Music:
Yesterday, Apple released updates for the iPhone and most of its other platforms. Once you update your iPhone to iOS 17.3, I recommend that you turn on Stolen Device Protection. That way, if someone steals your iPhone and also has learned your device passcode, there is a limit to how much damage they can cause to you. Here is how you turn on Stolen Device Protection and what it does.
To turn on Stolen Device Protection, go to your Settings app and tap on Face ID & Passcode. Scroll down until you see Stolen Device Protection. Simply tap the blue words Turn On Protection to turn it on. That’s it.
Once you have this feature turned on, your iPhone will turn on some additional security requirements in certain situations. If your iPhone senses that you are away from a familiar location like your home or your work, the iPhone will require you to use Face ID or Touch ID to do certain functions such as accessing stored passwords and using credit cards.
Isn’t this already true? Yes, but only to a certain extent. Without Stolen Device Protection, if Face ID doesn’t work, you can instead enter your iPhone’s passcode. But this means that if a thief has your iPhone and knows your passcode, it won’t matter that Face ID and Touch ID will fail for the thief because he can instead use your code. Turning on Stolen Device Protection prevents the thief from using your passcode as a way to circumvent biometric authentication.
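If it helps to see the change in pseudocode form, here is my rough model of the behavior described above (a sketch only, not Apple’s actual implementation):

```python
def sensitive_action_allowed(biometric_ok: bool, passcode_ok: bool,
                             at_familiar_location: bool,
                             sdp_enabled: bool) -> bool:
    """Rough model of the iOS 17.3 behavior described above;
    not Apple's actual implementation."""
    if not sdp_enabled or at_familiar_location:
        # Without Stolen Device Protection (or at a familiar location),
        # the passcode remains a fallback when Face ID/Touch ID fails.
        return biometric_ok or passcode_ok
    # With Stolen Device Protection, away from familiar locations,
    # only biometrics will do; a stolen passcode is not enough.
    return biometric_ok
```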
Additionally, with Stolen Device Protection turned on, there is a delay before your iPhone will let you perform certain actions, such as changing your Apple ID password. Currently, some thieves know that if they immediately change your password once they steal your iPhone, they can prevent you from using your Apple ID on another device to mark your iPhone as lost, which remotely locks it with a passcode to keep your device and information secure. Stolen Device Protection imposes a delay (I believe it is one hour) on those password changes, giving you some time to mark your iPhone as lost—assuming that you realize soon enough that your iPhone was stolen.
There are some downsides to turning on Stolen Device Protection, but they seem pretty minor to me. For example, if you plan on using Apple Pay on your own iPhone but you are wearing a mask that isn’t correctly detected by Face ID, you will no longer have the backup of entering your passcode. And it will take a little longer on those (rare) occasions when you really do want to change your Apple ID password.
A clever and determined criminal might be able to find a way to circumvent some of these measures, so turning on Stolen Device Protection is not a complete cure for stolen iPhone woes.
For example, Stolen Device Protection doesn’t kick in when you are at what the iPhone considers a trusted location (your home or some other location where you are frequently located). And you cannot manually change trusted locations. So if a determined thief steals your iPhone at your own home and then uses your passcode to change your Apple ID password and run up charges on your credit card, Stolen Device Protection won’t help you. But in many circumstances, such as when you are at a bar or restaurant, having this feature enabled can be the difference between you just losing your iPhone and you also losing funds from your bank account, getting charges on your credit card, and permanently losing all of your photos and videos because they are deleted from iCloud by the criminal. To me, these protections seem well worth the minor inconvenience of turning on Stolen Device Protection.
The buzzword in all of technology right now—including legal technology—is AI, especially generative AI, which can take one set of information and use it to create new information. When ChatGPT was introduced last year, people were amazed at how quickly it could provide complex answers to all sorts of different types of questions, albeit sometimes with mistakes. The new AI feature introduced a few weeks ago in Westlaw is impressive and has already made a noticeable difference in my law practice. As Microsoft, Google, Apple, and so many others are looking at what can be done with generative AI, some iPhone and iPad apps are starting to incorporate AI in interesting ways that show a lot of promise. Today, I am discussing recent updates to two of the most useful iPad apps in my law practice: Readdle’s PDF Expert and GoodNotes. The recent addition of AI to these apps has made them much more useful to attorneys and other professionals for whom it is helpful to quickly get information from a document.
AI Chat in PDF Expert
The PDF Expert app from Readdle is a fantastic app for storing, reading, and annotating PDF documents. The new AI Chat feature in PDF Expert is still in beta, and it is only available if you have a premium subscription. To use the feature, in the PDF Expert app on the iPad (or iPhone or Mac), open up a PDF document that has readable text. (If your PDF document only contains an image of the document, you’ll first need to OCR the document, which is a feature available in PDF Expert on the Mac but not yet on the iPad/iPhone.) At the top right of the screen, next to the search icon (a magnifying glass), you will see an icon containing a word bubble with the letters AI in it. Tap that button, and the app will analyze your document.
The app next suggests some quick actions that it can do using AI: Generate Summary, List Main Points, Generate Keywords, and Generate Hashtags. You can tap Generate Summary to find out what a document is about without taking the time to read the document.
In some of my tests, the Generate Summary feature was quite useful and accurate. Other times, it was accurate but not very useful. For example, when I asked it to summarize an opponent’s appellate brief in one of my cases, this is what I saw:
The summary, provided in the right column, says: “The document is an original brief of appellee [name] in a civil action case. It provides a summary of the argument and statement of the case, as well as the legal arguments and standards of review. It also includes a table of contents and table of authorities. The document discusses the abuse of discretion standard of review and argues that the district court did not abuse its discretion in sanctioning [party] for its discovery failures and repeated misrepresentations. It cites several cases to support its arguments and concludes by stating that the district court’s choice of sanctions was reasonable and appropriate.” Technically, that summary is accurate, but only a single sentence really tells me anything about the substance of the brief. Being told that a legal brief contains a table of contents and table of authorities and discusses the standard of review is not helpful. But again, for other documents, the Generate Summary and related List Main Points options gave me something useful without me needing to take the time to read the document.
More interesting is the ability to Ask Anything. Imagine that you gave a document to another person and that person studied and memorized the document. Next, imagine asking that person a specific question about the document. This is useful because that person could immediately tell you the answer, saving you the time of reading through the entire document yourself.
For example, in preparing for one of my appellate oral arguments a few weeks ago, I had reviewed lots of different prior opinions in other cases. When I looked at the Jones case again, was that the one with fact pattern X or the one with fact pattern Y? With this new AI feature in PDF Expert, I can open up one of the opinions that I downloaded from Westlaw and just ask a question. For example, in a case in which the Louisiana Supreme Court held that a trial court erred in not granting a continuance, I asked: “What was the basis for seeking a continuance?” In just a second, the app told me the specific facts set forth by the plaintiff to support a continuance (his physical and emotional condition) and the evidentiary support provided (a doctor’s note saying that the plaintiff needed another three months). This instantly reminded me what this case was all about.
Using AI to ask a question about a document is somewhat similar to using the search feature to find a keyword in a document. But it is much more powerful because you can use natural language to explain what you are looking for instead of using a precise term in a traditional search that might not be found because the document used a synonym or the word in a different tense or something like that. And sometimes, you are not exactly sure what you are looking for, so you may not know what term to search for even though you know the concept that you are interested in.
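To see concretely why a literal keyword search can fall short, consider this toy example (the sentence is made up):

```python
# A made-up sentence of the kind you might find in a court opinion.
text = "The plaintiff asked the court to postpone the trial for three months."

# A traditional keyword search for the term of art comes up empty...
print("continuance" in text.lower())  # prints: False

# ...even though the sentence is plainly about a continuance; it just
# uses the synonym "postpone." Asking the AI "What was the basis for
# seeking a continuance?" lets it match the concept, not the exact word.
```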
One of the big dangers of relying upon generative AI is hallucinations. An AI bot might sometimes give you an answer with the complete confidence of someone who knows what he is talking about, and yet the AI is completely wrong. Thus, whenever you get any sort of answer using AI, it is critical to confirm the answer that was provided. PDF Expert helps you do this by telling you the page(s) of the document from which the answer was generated. For example, in the example above where I asked a question about a Louisiana Supreme Court opinion, I can see that the answer came from page 5. By jumping to page 5, I can see (1) the actual text of the opinion that resulted in the answer and (2) that the AI actually pulled the answer from a dissent, not the majority opinion. For this type of question, none of that was a problem for me. The text of the document supported the AI’s answer, and even though the AI was relying on the dissent, both the dissent and the majority were looking at the same facts. But you can certainly imagine a situation in which you ask a question about a legal holding in a case and, if the AI pulls its answer from the dissent, the answer might be 100% wrong.
My experience using the beta version of this AI feature in PDF Expert parallels my experience with AI in general. The AI is not always correct, so after you read the answer provided by the AI it is critical that you next double-check the answer in the text. But because the AI is usually correct, and because it is so fast and can answer such complex questions, I still tend to be better off after using AI than I would have been without it because the AI helps me to quickly get to the relevant portion of the document. As the developer of PDF Expert says on its website, this new AI feature can be very useful for some legal professionals because “[y]ou can use AI Chat to locate specific clauses or reference points with lengthy legal documents without the need to read them thoroughly.” I agree with that completely.
Ask GoodNotes
GoodNotes is the app that I use to take handwritten notes. Over time, some of my digital notebooks can become dozens of pages long, sometimes over 100 pages. Fortunately, the app can read handwriting, so if I want to find a specific word that I previously used in a note, I can search for the word and see the pages containing that specific word.
But as noted above, sometimes a word search is too precise. With the new Ask GoodNotes feature, which the developer says is still in an experimental stage, you have new AI tools for getting information from your documents.
First, you can summarize an entire document or (for longer documents) a range of pages. In my tests, the summaries have been pretty accurate and helpful. If you have 10 pages of notes from a meeting, GoodNotes can quickly give you six bullet points that sum up your meeting. If you had taken the time to do the same thing, perhaps you would have added a seventh or eighth bullet point, but it would take you significant time to put that together. GoodNotes does a decent job in just seconds.
Second, the Q&A feature lets you ask a question about your notes. Just like in PDF Expert, it can be far more helpful than a simple word search, and you are provided with the specific page numbers of your notes that were used to come up with the answer so that you can go back and double-check everything.
In the following example, I ran a search of notes that I had previously taken in meetings of my law firm’s technology committee, a committee that I chair. I was curious about the companies that we had used in the past to run a penetration test, a way to check the security of a network. So I took my handwritten notes from our meetings and searched for terms like “pen test” and “penetration test,” and I was able to find some of my notes about companies that we had used in the past. Then I asked the AI to do a similar search, and it came up with its own answers. Moreover, after giving me the answer, the AI suggested some follow-up questions that I may want to ask.
Note that in addition to working with handwritten notes, GoodNotes can also work with any PDF file. Thus, I sent the same Louisiana Supreme Court opinion that I had analyzed in PDF Expert to GoodNotes. Then, I used the AI Q&A feature to ask the same question: “What was the basis for seeking a continuance?” In just a second, GoodNotes responded: “The basis for seeking a continuance was the illness of one of the plaintiffs who had undergone an arterial transplant and whose testimony was considered critical to the case.” The cite provided was to page 1 of the case, so unlike PDF Expert, which gave me a more complete answer by citing the dissent, GoodNotes gave me a shorter but still precise answer by pulling information from the Westlaw headnote on the first page of the opinion. GoodNotes also gave me some good suggestions for follow-up questions, such as “Was any previous continuance requested in the case?”
That was a smart question to ask because the holding in that case was not only based upon the reason for the continuance request but also the fact that this would have been the first requested continuance by the plaintiff in the case. One of the things about generative AI (in general) that I particularly like is that it can often help you to think of important concepts that might not have occurred to you at first. By suggesting additional questions, GoodNotes can help you to think about important, related concepts. Sometimes, that can be incredibly helpful.
The AI Q&A feature of GoodNotes is yet another way to get information out of your own notes. Just like a simple word search, it may not give you all of the answers, but it can be a useful tool.
Information privacy
About a year ago, when ChatGPT and similar generative AI products first rose to prominence, people noticed that if one person provided certain information in forming a question, ChatGPT might remember that information and use the same information when providing an answer to someone else’s question. This led many legal ethics professionals (here is an example) to note that lawyers need to protect the confidentiality of certain information and to warn that providing confidential information to a generative AI system could be a breach of that duty.
I reached out to the developers of both of these apps to ask whether using generative AI on my documents or notes means that the contents are shared with third parties and become part of the generative AI engine. In both cases, I was told that this does not happen. Here is what Readdle said about PDF Expert:
[W]e process each of your requests in the AI Chat in 3 steps. First, we receive and pseudonymize your request.
Second, we send it to OpenAI, L.L.C. (OpenAI) infrastructure for processing.
Third, we receive the output from OpenAI and provide the result back to you. Please note that we do not control or influence the data included in your requests, their results, or the associated files. OpenAI processes your requests according to their Business Terms, Privacy Commitments, and Data Processing Addendum. Among other things, they commit not to use the data for training and retain API inputs and outputs for up to 30 days to identify abuse only.
Additionally, both your request and its results are encrypted in alignment with our standard privacy and information security practices. We employ the latest commercially reasonable technical, administrative, and organizational measures to safeguard your personal data, both online and offline, against loss, misuse, unauthorized access, disclosure, alteration, or destruction. For further insights into your privacy within PDF Expert, please refer to our Privacy Notice.
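To make that three-step description concrete, here is a minimal sketch of what such a pipeline could look like using OpenAI’s Python library. This is purely illustrative: the pseudonymize helper is hypothetical, the model choice is arbitrary, and Readdle has not published its actual implementation.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def pseudonymize(text: str) -> str:
    # Step 1 (hypothetical): mask identifying details before the
    # request leaves the app. Readdle's real approach is not public.
    return text.replace("Jane Doe", "[PARTY A]")

def ask_about_document(document_text: str, question: str) -> str:
    # Step 2: send the pseudonymized request to OpenAI for processing.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "Answer questions using only the provided document."},
            {"role": "user",
             "content": f"{pseudonymize(document_text)}\n\nQuestion: {question}"},
        ],
    )
    # Step 3: return the model's output to the user.
    return response.choices[0].message.content
```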
Here is what the developer of GoodNotes told me:
Goodnotes assures users that it does not access or collect data from their notebooks to train its AI features. However, users of Goodnotes can voluntarily submit their data through ‘Feature Feedback’ under section 3.7.1. Goodnotes may use this information to improve its services, but only if it is submitted voluntarily. … Goodnotes takes its customers’ data seriously. We want to reassure our users that GoodNotes does not have access to their notes or the information, whether they are saved in local or iCloud storage or third-party cloud services (i.e. OneDrive). Your personal notes remain private and are not shared with third parties.
As generative AI continues to get more sophisticated, I suspect that privacy will become even more of a concern, so this is something to always consider when you use any of the AI systems coming on to the market.
Limited use
Running generative AI searches can be costly for app developers, so both PDF Expert and GoodNotes limit how much you can use these new generative AI features. It looks like PDF Expert currently limits you to 200 searches a month. A number at the top left of the Chat pane tells you how many searches you have left. GoodNotes currently limits you to 30 credits (30 questions) each day.
Conclusion
The developers of PDF Expert and GoodNotes say right up front that these generative AI features are currently in their preliminary stages. Nevertheless, I have already found them to be useful in my law practice. Considering how useful they are at this early point, I’m incredibly excited about the possibilities as this technology is further refined in the future.
As someone who works with lots of different types of documents every single day, I love the idea of my iPad or iPhone helping me to get information from my documents faster than ever, not to mention helping me to see things in my documents that I otherwise might have missed. Don’t forget the crucial step of verifying any answers that are provided by the AI. But even with that caveat, generative AI assistance can be incredibly useful, and as more lawyers and other professionals learn to take advantage of this emerging technology, you won’t want to be left behind.