Apple gave select members of the press an Apple Vision Pro so that they could use this new device in the real world for about a week, and yesterday the embargo lifted and their initial reviews were released. It is clear from these reviews that Apple has come up with something really incredible. There was tons of praise … and also some critiques, but they are what you would expect for any first-generation product and not unlike the drawbacks of the very first iPhone, Mac, Apple Watch, car, airplane, and just about any other major new technology. Nevertheless, the consensus seems to be that what the Apple Vision Pro does now, and what it has the potential to do with future updates, is groundbreaking. Mark Spoonauer of Tom’s Guide calls it “the most innovative Apple product since the original iPhone.” And Todd Haselton of CNBC says: “This is an entirely new type of computing, providing a whole new world of experiences. It feels like the future.”
Perhaps the biggest praise came for the Apple Vision Pro as an entertainment device: a way to watch movies, TV shows, and especially 3D content in an immersive environment. As John Gruber of Daring Fireball noted: “There are 3D immersive experiences in Vision Pro that are more compelling than Disney World attractions that people wait in line for hours to see.” And Gruber notes that he is normally someone who is not a fan of 3D movies, “but in Vision Pro the depth feels natural, not distracting, and you don’t suffer any loss of brightness from wearing what are effectively sunglasses.”
The reviewers were also impressed by the quality of the tiny screens that are in front of your eyes. Text on a website is actually readable. And thanks to HDR, you can look out a window in your room or at an iPhone screen and the bright areas are not completely blown out.
Reviewers seemed to agree that the device is heavy enough that you notice the weight, although the two different straps that Apple provides seem to do a decent job of accounting for that weight on your head, with different reviewers preferring different straps. Keep in mind, though, that if you have hair with a lot of volume, the over-the-top strap (the Dual Loop Band) might mess up your hairdo, in which case I hope that you like the Solo Knit Band. One issue that had not occurred to me is that for people who wear makeup, the light seal cushion can mess up your makeup and collect makeup on the cushion. All of these issues, it seems, are inherent to this being a 1.0 product. Of course future generations will be smaller and lighter and have better battery life etc. If you don’t mind waiting a few years, I have no doubt that future Apple Vision products will address these issues, and at some point we will look back and be amazed at how big the first generation Apple Vision Pro was. During the recent 40th anniversary celebration of the first Mac, I sometimes found it hard to believe that the original Mac’s tiny black-and-white screen with a single floppy drive seemed so revolutionary at the time—except that I remember how fascinated I was by that original Mac when it debuted in 1984, not only compared to what else was on the market at the time but also because it seemed to provide a glimpse of what the future would bring.
Using the Apple Vision Pro as a productivity device is something that I’m particularly interested in, and the reviews were mixed on this front. Everyone seemed to be impressed by the ability to place different windows in different spaces all around you. You can even leave a window in a specific place if you walk around—for example, leave a timer right on top of a pot on the stove. But it is clear that to get any real typing done, you’ll want to use an external keyboard instead of the virtual keyboard. No big surprise; I’m typing these words on my iPad right now, but I’m using a Bluetooth keyboard to do so because nobody likes to type multiple paragraphs on the iPad’s on-screen keyboard.
If you own a Mac, the Vision Pro provides your Mac with an amazing huge external screen that would be too expensive to ever buy. But remember that you are working in isolation. If someone else is in the room with you, you cannot simply turn a monitor to show them what you are seeing.
But you might be able to learn more by seeing the Vision Pro in action, albeit in a 2D video. There are lots of them out there, but I particularly enjoyed these three:
iOS 17.3 was released this week, so Brett and I begin this week’s episode of the In the News podcast by discussing what is new, including Stolen Device Protection, shared playlists, and more. We then look forward to iOS 17.4, which could be coming in March, with even more features including transcripts in the podcast app and new Emoji. We then celebrate the 40th anniversary of the Mac, discuss the latest information on the Apple Vision Pro, explain why you need to watch what you type when you are using public Wi-Fi in an airport, and more.
In our In the Know segment, Brett discusses the new live tracking feature in the Starbucks app. I discuss why you might be better off paying for AppleCare monthly or yearly instead of all at once.
When I was growing up in the 1980s, I had a chance to use lots of different computers including my own personal computers—the Sinclair ZX81 followed by a Commodore 64—plus various Apple II models that were in the computer lab at my school. But when the Macintosh was introduced 40 years ago this week in 1984, that was the computer that I really wanted, so much so that I even tried to imitate it by running a graphical user interface on my Commodore 64 called GEOS made by Berkeley Softworks. In 1988, using the money that I had earned working over the summer and thanks to the generous student discount at my college bookstore, I was finally able to afford my own Mac: a Mac Plus with an external 20MB (!) hard drive. Over the last four decades, we’ve seen many other influential technology products come and go—some of them from Apple itself, such as the iPod—but through it all, the Mac has remained and is actually doing better than ever right now. And I still use a Mac every day, including right now to type these words. A lot has been written about the Mac this week thanks to the 40th anniversary, but I don’t think anyone has done a better job than Jason Snell, the former editor of Macworld magazine. Here is his excellent article on the 40th anniversary in The Verge, and he also wrote great articles on his Six Colors website about the four eras of the Mac and the Mac’s enduring appeal. I’ll admit that the iPhone, iPad, and other products like the upcoming Apple Vision Pro are often more exciting to talk about and use, but the Mac continues to get things done. Bravo for that. As we look forward to the next forty years of the Mac, here are the other items of note from the past week:
iOS 17.3 was released this week, and the major new feature is Stolen Device Protection, which I discussed earlier this week. But there are other notable changes as well, and Justin Meyers of Gadget Hacks does his typical good job of describing all of the new features.
Dan Moren notes on Mastodon that with the release of iOS 17.3, Apple has now rolled out all of the iOS 17 features that were promised when iOS 17 was revealed in 2023.
…but that doesn’t mean that we are done with iOS 17. Apple has already started beta testing iOS 17.4, and there are lots of interesting new features planned. For example, as someone who loves to listen to podcasts and who records one every week, I’m particularly interested in a new automatic transcript feature in Apple’s Podcasts app. It sounds like Apple is using AI to create the transcripts, which then sync with the audio, just like lyrics sync with a song in Apple Music. This will make it so much easier to locate information that you heard in a prior podcast, and it will have so many other uses as well. Jason Snell of Six Colors has information on this new feature, as does John Voorhees of MacStories.
Federico Viticci of MacStories notes on Mastodon that Apple has been working on this transcripts feature for over four years.
iOS 17.4 will also include new Emoji, and Roman Loyola of Macworld has a preview.
Lucas Ropek of Gizmodo has an interesting warning about being careful what you say even when you think you are saying it in private. A teenager at an airport sent a message to a friend via Snapchat and made the (bad) joke that he was going to blow up the plane as he boarded. Because he was using the airport’s public Wi-Fi, British security intercepted the message and took it seriously. Spain then scrambled two fighter jets to escort the plane to its destination, and the teenager was arrested. Spain is now seeking to make him pay $120,000 for the fighter jet deployment as a result of the teenager making what he thought was a joke in a private message to a friend.
We are now only a week away from the public release of the Apple Vision Pro. I’m sure that Apple gave some reporters and others pre-release units, so my guess is that we will see the first reviews around Tuesday of next week. David Pierce of The Verge explains that Safari may be one of the most important apps to be included with the Vision Pro because you can do just about anything with a web browser, even if a developer has not released a specific app for the Apple Vision Pro. For example, without a Netflix or YouTube app (for now at least), you’ll use Safari to watch videos from those services on the Apple Vision Pro.
Apple released a really cool video that shows the Apple Vision Pro being made. But if you are like me, you may not understand what you are seeing in every shot of the video. Product designer Marcus Kane wrote a great thread on X in which he explains, with pictures, exactly what Apple is showing off.
Chip Loder of AppleInsider explains the different modes of the Apple Vision Pro and why developers need to keep these modes in mind when designing spatial computing apps for this device.
As we think about the Apple Vision Pro, the next big thing from Apple, what will be the next big thing after that? Perhaps it will have something to do with a car. It is widely known that Apple has been working on car technology, and there were rumors this week from Mark Gurman of Bloomberg that we might see a product in four years. But as Zac Hall of 9to5Mac correctly points out, people have been saying that Apple’s car project is four years away for nearly nine years now.
The European Union is requiring Apple to change the way that its App Store works to comply with the Digital Markets Act. As John Voorhees of MacStories reports, Apple revealed this week how it will do so, and it includes allowing third-parties to run their own App Stores. It will be fascinating to see how this works in practice and whether it ends up being better for end users or just exposes them to more dangerous or unsavory apps.
And finally, one of the new features in iOS 17.3 is the ability to share an Apple Music playlist with others so that a group of people can add to and edit a playlist. Ebro Darden, Nadeska Alexis, and Lowkey, who host the weekly video series called Rap Life Review, demonstrate how this new feature works in a video released this week by Apple Music:
Yesterday, Apple released updates for the iPhone and most of its other platforms. Once you update your iPhone to iOS 17.3, I recommend that you turn on Stolen Device Protection. That way, if someone steals your iPhone and also has learned your device passcode, there is a limit to how much damage they can cause to you. Here is how you turn on Stolen Device Protection and what it does.
To turn on Stolen Device Protection, go to your Settings app and tap on Face ID & Passcode. Scroll down until you see Stolen Device Protection. Simply tap the blue words Turn On Protection to turn it on. That’s it.
Once you have this feature turned on, your iPhone will turn on some additional security requirements in certain situations. If your iPhone senses that you are away from a familiar location like your home or your work, the iPhone will require you to use Face ID or Touch ID to do certain functions such as accessing stored passwords and using credit cards.
Isn’t this already true? Yes, but only to a certain extent. Without Stolen Device Protection, if Face ID doesn’t work, you can instead enter your iPhone’s passcode. But this means that if a thief has your iPhone and knows your passcode, it won’t matter that Face ID or Touch ID fails for the thief, because the thief can use your passcode instead. Turning on Stolen Device Protection prevents the thief from using your passcode as a way to circumvent biometric authentication.
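The logic described above can be sketched as a toy decision function. To be clear, this is an invented illustration of the behavior as I understand it, not Apple’s actual implementation: with the feature on and the device away from a familiar location, the passcode no longer substitutes for biometrics on sensitive actions.

```python
# Invented toy model of the access decision described above -- NOT Apple's
# actual code. It only illustrates how Stolen Device Protection removes
# the passcode fallback for sensitive actions away from familiar places.

def sensitive_action_allowed(biometric_ok: bool, passcode_ok: bool,
                             protection_on: bool,
                             at_familiar_place: bool) -> bool:
    if biometric_ok:
        return True                 # Face ID / Touch ID always works
    if protection_on and not at_familiar_place:
        return False                # passcode fallback is disabled
    return passcode_ok              # normal behavior: passcode works

# Thief knows the passcode but fails Face ID, away from your home:
print(sensitive_action_allowed(False, True, True, False))   # False
# Same thief if the feature is left off:
print(sensitive_action_allowed(False, True, False, False))  # True
```

The point of the sketch is the middle branch: biometrics still work everywhere, but a known passcode alone stops being enough once the phone leaves a familiar location.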
Additionally, with Stolen Device Protection turned on, there is a delay before your iPhone will let anyone perform certain actions, such as changing your Apple ID password. Currently, some thieves know that if they immediately change your password once they steal your iPhone, they can prevent you from using your Apple ID on another device to mark your iPhone as lost, which remotely locks it with a passcode to keep your device and information secure. With this delay (I believe it is one hour), Stolen Device Protection gives you some time to mark your iPhone as lost before the thief can change your password—assuming that you realize soon enough that your iPhone was stolen.
There are some downsides to turning on Stolen Device Protection, but they seem pretty minor to me. For example, if you plan on using Apple Pay on your own iPhone but you are wearing a mask that isn’t correctly detected by Face ID, you will no longer have the backup of entering your passcode. And it will take a little longer on those (rare) occasions when you really do want to change your Apple ID password.
A clever and determined criminal might be able to find a way to circumvent some of these measures, so turning on Stolen Device Protection is not a complete cure for stolen iPhone woes.
For example, Stolen Device Protection doesn’t kick in when you are at what the iPhone considers a trusted location (your home or some other location where you are frequently located). And you cannot manually change trusted locations. So if a determined thief steals your iPhone at your own home and then uses your passcode to change your Apple ID password and run up charges on your credit card, Stolen Device Protection won’t help you. But in many circumstances, such as when you are at a bar or restaurant, having this feature enabled can be the difference between you just losing your iPhone and you also losing funds from your bank account, getting charges on your credit card, and permanently losing all of your photos and videos because they are deleted from iCloud by the criminal. To me, these protections seem well worth the minor inconvenience of turning on Stolen Device Protection.
The buzzword in all of technology right now—including legal technology—is AI, especially generative AI, which can take one set of information and use it to create new information. When ChatGPT was introduced last year, people were amazed at how quickly it could provide complex answers to all sorts of different types of questions, albeit sometimes with mistakes. The new AI feature introduced a few weeks ago in Westlaw is impressive and has already made a noticeable difference in my law practice. As Microsoft, Google, Apple, and so many others are looking at what can be done with generative AI, some iPhone and iPad apps are starting to incorporate AI in interesting ways that show a lot of promise. Today, I am discussing recent updates to two of the most useful iPad apps in my law practice: Readdle’s PDF Expert and GoodNotes. The recent addition of AI to these apps has made them much more useful to attorneys and other professionals for whom it is helpful to quickly get information from a document.
AI Chat in PDF Expert
The PDF Expert app from Readdle is a fantastic app for storing, reading, and annotating PDF documents. The new AI Chat feature in PDF Expert is still in beta, and it is only available if you have a premium subscription. To use the feature, in the PDF Expert app on the iPad (or iPhone or Mac), open up a PDF document that has readable text. (If your PDF document only contains an image of the document, you’ll first need to OCR the document, which is a feature available in PDF Expert on the Mac but not yet on the iPad/iPhone.) At the top right of the screen, next to the search icon (a magnifying glass), you will see an icon containing a word bubble with the letters AI in it. Tap that button, and the app will analyze your document.
The app next suggests some quick actions that it can do using AI: Generate Summary, List Main Points, Generate Keywords, and Generate Hashtags. You can tap Generate Summary to find out what a document is about without taking the time to read the document.
In some of my tests, the Generate Summary feature was quite useful and accurate. Other times, it was accurate but not very useful. For example, when I asked it to summarize an opponent’s appellate brief in one of my cases, this is what I saw:
The summary, provided in the right column, says: “The document is an original brief of appellee [name] in a civil action case. It provides a summary of the argument and statement of the case, as well as the legal arguments and standards of review. It also includes a table of contents and table of authorities. The document discusses the abuse of discretion standard of review and argues that the district court did not abuse its discretion in sanctioning [party] for its discovery failures and repeated misrepresentations. It cites several cases to support its arguments and concludes by stating that the district court’s choice of sanctions was reasonable and appropriate.” Technically, that summary is accurate, but only a single sentence really tells me anything about the substance of the brief. Being told that a legal brief contains a table of contents and table of authorities and discusses the standard of review is not helpful. But again, for other documents, the Generate Summary and related List Main Points options gave me something useful without me needing to take the time to read the document.
More interesting is the ability to Ask Anything. Imagine that you gave a document to another person and that person studied and memorized the document. Next, imagine asking that person a specific question about the document. This is useful because that person could immediately tell you the answer, saving you the time of reading through the entire document yourself.
For example, in preparing for one of my appellate oral arguments a few weeks ago, I had reviewed lots of different prior opinions in other cases. When I look at the Jones case again, was that the one with fact pattern X or the one with fact pattern Y? With this new AI feature in PDF Expert, I can open up one of the opinions that I downloaded from Westlaw and just ask a question. For example, in a case in which the Louisiana Supreme Court held that a trial court erred in not granting a continuance, I asked: “What was the basis for seeking a continuance.” In just a second, the app told me specific facts set forth by the plaintiff to support a continuance (his physical and emotional condition) and the evidentiary support provided (a doctor’s note saying that the plaintiff needed another three months). This instantly reminded me what this case was all about.
Using AI to ask a question about a document is somewhat similar to using the search feature to find a keyword in a document. But it is much more powerful because you can use natural language to explain what you are looking for instead of using a precise term in a traditional search that might not be found because the document used a synonym or the word in a different tense or something like that. And sometimes, you are not exactly sure what you are looking for, so you may not know what term to search for even though you know the concept that you are interested in.
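A toy example makes the synonym problem concrete. This is not how PDF Expert works internally—real AI Q&A uses language models, not lookup tables—and the synonym table and sample pages below are invented for the demo; it only shows why an exact keyword search can miss a passage that a looser, concept-level match finds.

```python
# Toy illustration of exact search vs. concept search. The synonym table
# is hand-made for this demo; real AI features use language models.

SYNONYMS = {
    "continuance": {"continuance", "postponement", "delay"},
}

def keyword_search(pages: list[str], term: str) -> list[int]:
    """Return 1-based page numbers whose text contains the exact term."""
    return [i + 1 for i, text in enumerate(pages) if term in text.lower()]

def concept_search(pages: list[str], term: str) -> list[int]:
    """Return pages containing the term or any listed synonym of it."""
    words = SYNONYMS.get(term, {term})
    return [i + 1 for i, text in enumerate(pages)
            if any(w in text.lower() for w in words)]

pages = [
    "The plaintiff moved for a postponement of the trial date.",
    "The court discussed the standard of review for sanctions.",
]

print(keyword_search(pages, "continuance"))  # [] -- exact term absent
print(concept_search(pages, "continuance"))  # [1] -- synonym matched
```

Searching for “continuance” finds nothing, because the opinion happened to say “postponement”; a search that understands the concept still lands on the right page.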
One of the big dangers of relying upon generative AI is hallucinations. An AI bot might sometimes give you an answer with the complete confidence of someone who knows what he is talking about, and yet the AI is completely wrong. Thus, whenever you get any sort of answer using AI, it is critical to confirm the answer that was provided. PDF Expert helps you do this by telling you the page(s) of the document from which the answer was generated. For example, in the example above where I asked a question about a Louisiana Supreme Court opinion, I can see that the answer came from page 5. By jumping to page 5, I can see (1) the actual text of the opinion that resulted in the answer and (2) that the AI actually pulled the answer from a dissent, not the majority opinion. For this type of question, none of that was a problem for me. The text of the document supported the AI’s answer, and even though the AI was relying on the dissent, both the dissent and the majority were looking at the same facts. But you can certainly imagine a situation in which you ask a question about a legal holding in a case and if the AI pulls its answer from the dissent, the answer might be 100% wrong.
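The verification habit described above can even be sketched mechanically. The function, page text, and answer below are all invented for illustration: a crude check confirms that the key words of an AI answer actually appear on the cited page, but it deliberately cannot tell you whether that page is the majority opinion or a dissent, which is exactly why a human still has to read the page.

```python
# Hypothetical verification step (names and sample text are invented):
# before trusting an AI answer, check that the answer's content words
# actually appear on the cited page. This catches fabricated answers,
# but NOT context problems (e.g. text pulled from a dissent).

def answer_is_supported(answer: str, cited_page_text: str,
                        min_overlap: int = 2) -> bool:
    stopwords = {"the", "a", "an", "of", "for", "was", "and", "to", "in"}
    answer_words = {w.strip(".,").lower() for w in answer.split()} - stopwords
    page_words = {w.strip(".,").lower() for w in cited_page_text.split()}
    return len(answer_words & page_words) >= min_overlap

page5 = ("The dissent noted the plaintiff's physical and emotional "
         "condition and a doctor's note requesting three more months.")
answer = "The basis was the plaintiff's physical and emotional condition."

print(answer_is_supported(answer, page5))  # True -- but read page 5 anyway
```

Even when a check like this passes, the cited text here comes from a dissent; only reading the page reveals that.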
My experience using the beta version of this AI feature in PDF Expert parallels my experience with AI in general. The AI is not always correct, so after you read the answer provided by the AI it is critical that you next double-check the answer in the text. But because the AI is usually correct, and because it is so fast and can answer such complex questions, I still tend to be better off after using AI than I would have been without it because the AI helps me to quickly get to the relevant portion of the document. As the developer of PDF Expert says on its website, this new AI feature can be very useful for some legal professionals because “[y]ou can use AI Chat to locate specific clauses or reference points with lengthy legal documents without the need to read them thoroughly.” I agree with that completely.
Ask GoodNotes
GoodNotes is the app that I use to take handwritten notes. Over time, some of my digital notebooks can become dozens of pages long, sometimes over 100 pages. Fortunately, the app can read handwriting, so if I want to find a specific word that I previously used in a note, I can search for the word and see the pages containing that specific word.
But as noted above, sometimes a word search is too precise. With the new Ask GoodNotes feature, which the developer says is still in an experimental stage, you have new AI tools for getting information from your documents.
First, you can summarize an entire document or (for longer documents) a range of pages. In my tests, the summaries have been pretty accurate and helpful. If you have 10 pages of notes from a meeting, GoodNotes can quickly give you six bullet points that sum up your meeting. If you had taken the time to do the same thing, perhaps you would have added a seventh or eighth bullet point, but it would take you significant time to put that together. GoodNotes does a decent job in just seconds.
Second, the Q&A feature lets you ask a question about your notes. Just like in PDF Expert, it can be far more helpful than a simple word search, and you are provided with the specific page numbers of your notes that were used to come up with the answer so that you can go back and double-check everything.
In the following example, I ran a search of notes that I had previously taken in meetings of my law firm’s technology committee, a committee that I chair. I was curious about the companies that we had used in the past to run a penetration test, a way to check the security of a network. So I took my handwritten notes from our meetings and searched for terms like “pen test” and “penetration test” and I was able to find some of my notes about companies that we had used in the past. Then I asked the AI to do a similar search and it came up with its own answers. Moreover, after giving me the answer, the AI suggested some follow-up questions that I may want to ask.
Note that in addition to working with handwritten notes, GoodNotes can also work with any PDF file. Thus, I sent the same Louisiana Supreme Court opinion that I had analyzed in PDF Expert to GoodNotes. Then, I used the AI Q&A feature to ask the same question: “What was the basis for seeking a continuance.” In just a second, GoodNotes responded: “The basis for seeking a continuance was the illness of one of the plaintiffs who had undergone an arterial transplant and whose testimony was considered critical to the case.” The cite provided was to page 1 of the case, so unlike PDF Expert which gave me a more complete answer by citing the dissent, GoodNotes gave me a shorter but still precise answer by pulling information from the Westlaw headnote on the first page of the opinion. GoodNotes also gave me some good suggestions for follow-up questions, such as “Was any previous continuance requested in the case?”
That was a smart question to ask because the holding in that case was not only based upon the reason for the continuance request but also the fact that this would have been the first requested continuance by the plaintiff in the case. One of the things about generative AI (in general) that I particularly like is that it can often help you to think of important concepts that might not have occurred to you at first. By suggesting additional questions, GoodNotes can help you to think about important, related concepts. Sometimes, that can be incredibly helpful.
The AI Q&A feature of GoodNotes is yet another way to get information out of your own notes. Just like a simple word search, it may not give you all of the answers, but it can be a useful tool.
Information privacy
About a year ago, when ChatGPT and similar generative AI products first rose to prominence, people noticed that if one person provided certain information in forming a question, ChatGPT might remember that information and use the same information when providing an answer to someone else’s question. This led many legal ethics professionals (here is an example) to note that lawyers need to protect the confidentiality of certain information and to warn that providing confidential information to a generative AI system could be a breach of that duty.
I reached out to the developers of both of these apps to ask about whether using generative AI on my documents or notes means that the contents are shared with third-parties and become a part of the generative AI engine. In both cases, I was told that this does not happen. Here is what Readdle said about PDF Expert:
[W]e process each of your requests in the AI Chat in 3 steps. First, we receive and pseudonymize your request.
Second, we send it to OpenAI, L.L.C. (OpenAI) infrastructure for processing.
Third, we receive the output from OpenAI and provide the result back to you. Please note that we do not control or influence the data included in your requests, their results, or the associated files. OpenAI processes your requests according to their Business Terms, Privacy Commitments, and Data Processing Addendum. Among other things, they commit not to use the data for training and retain API inputs and outputs for up to 30 days to identify abuse only.
Additionally, both your request and its results are encrypted in alignment with our standard privacy and information security practices. We employ the latest commercially reasonable technical, administrative, and organizational measures to safeguard your personal data, both online and offline, against loss, misuse, unauthorized access, disclosure, alteration, or destruction. For further insights into your privacy within PDF Expert, please refer to our Privacy Notice.
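The general idea behind the pseudonymization step that Readdle describes can be illustrated with a small sketch. Readdle has not published how its pipeline actually works, and the pattern, tag format, and sample text below are all invented: obvious identifiers are replaced with tags before the request leaves the device, a local map is kept, and the real values are restored when the response comes back.

```python
import re

# Invented sketch of "pseudonymizing" a request before sending it to a
# third-party AI service. Real systems use named-entity recognition, not
# a toy regex like this; nothing here reflects Readdle's actual pipeline.

def pseudonymize(text: str) -> tuple[str, dict[str, str]]:
    mapping: dict[str, str] = {}
    def repl(match: re.Match) -> str:
        tag = f"[PERSON_{len(mapping) + 1}]"
        mapping[tag] = match.group(0)   # remember the real name locally
        return tag
    # Toy pattern: "Mr./Ms. Lastname"
    masked = re.sub(r"\b(Mr\.|Ms\.)\s+[A-Z][a-z]+", repl, text)
    return masked, mapping

def restore(text: str, mapping: dict[str, str]) -> str:
    for tag, original in mapping.items():
        text = text.replace(tag, original)
    return text

masked, names = pseudonymize("Mr. Jones moved for a continuance.")
print(masked)                  # [PERSON_1] moved for a continuance.
print(restore(masked, names))  # Mr. Jones moved for a continuance.
```

The appeal of this design is that the remote service only ever sees the tagged version; the mapping back to real names never leaves your device.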
Here is what the developer of GoodNotes told me:
Goodnotes assures users that it does not access or collect data from their notebooks to train its AI features. However, users of Goodnotes can voluntarily submit their data through ‘Feature Feedback’ under section 3.7.1. Goodnotes may use this information to improve its services, but only if it is submitted voluntarily. … Goodnotes takes its customers’ data seriously. We want to reassure our users that GoodNotes does not have access to their notes or the information, whether they are saved in local or iCloud storage or third-party cloud services (i.e. OneDrive). Your personal notes remain private and are not shared with third parties.
As generative AI continues to get more sophisticated, I suspect that privacy will become even more of a concern, so this is something to always consider when you use any of the AI systems coming on to the market.
Limited use
Running generative AI searches can be costly for app developers, so both PDF Expert and GoodNotes limit how much you can use these new generative AI features. It looks like PDF Expert currently limits you to 200 searches a month. A number at the top left of the Chat pane tells you how many searches you have left. GoodNotes currently limits you to 30 credits (30 questions) each day.
Conclusion
The developers of PDF Expert and GoodNotes say right up front that these generative AI features are currently in their preliminary stages. Nevertheless, I have already found them to be useful in my law practice. Considering how useful they are at this early point, I’m incredibly excited about the possibilities as this technology is further refined in the future.
As someone who works with lots of different types of documents every single day, I love the idea of my iPad or iPhone helping me to get information from my documents faster than ever, not to mention helping me to see things in my documents that I otherwise might have missed. Don’t forget the crucial step of verifying any answers that are provided by the AI. But even with that caveat, generative AI assistance can be incredibly useful, and as more lawyers and other professionals learn to take advantage of this emerging technology, you won’t want to be left behind.
Ordering an Apple Vision Pro is unlike ordering any other device that Apple has ever sold. What size bands do you need? What accessories should you buy? How do you order prescription lenses and what are the limitations? Brett Burney and I begin this week’s episode of the In the News podcast by describing everything that you need to know about the buying process for Apple’s newest product, the first new platform since Apple started selling the Apple Watch. We also talk about the new security features coming in a few days when iOS 17.3 is released, the difference between MagSafe and Qi2 products, changes on the Apple Board of Directors, a new milestone for iPhone sales, and much more.
In our Where Y’At? segment, Brett and I discuss the amazing story of a man in Toronto who tracked his stolen vehicle across the ocean to a used car lot in Dubai.
In our In the Know segment, Brett discusses the new Ember Smart Travel Mug 2+. I recommend a new video from Apple that shows you what it is like to use an Apple Vision Pro as well as a new technical specs page with tons of details on this upcoming device.
There is a lot going on in the world of Apple right now, such as the pre-orders for the Apple Vision Pro that started this morning. (I placed an order, and have an appointment at my local Apple Store at 8:30 on February 2 to pick it up.) But one news item that I don’t want you to miss is a change that is coming in iOS 17.3, which many expect Apple to release next week. This new feature will address a security issue that was exposed in early 2023 by Joanna Stern of the Wall Street Journal. I discussed it in this post. In short, if a criminal learns of your iPhone passcode (perhaps by having someone look over your shoulder as you type it in) and then steals your iPhone, they can then change the passcode and cause you a world of problems such as draining your bank account, forever deleting all of your photos, improperly using confidential information, etc. As Joanna Stern and Nicole Nguyen reported last month, and as discussed more extensively in this post by John Gruber of Daring Fireball, Apple has a number of solutions coming in iOS 17.3 that seem to be quite clever. But importantly, to take advantage of these solutions, you need to enable a new feature called Stolen Device Protection. So whenever you install iOS 17.3, please remember to consider enabling this new feature to give you additional protection against someone stealing your iPhone and taking over your Apple account. I’m especially sensitive to these issues because—like many readers of this website—I am an attorney with confidential information on my devices. But because of the chaos and potentially severe consequences that could come to anyone who falls victim to this scheme, I’m glad that Apple is doing something substantial to address this situation. I wish that Apple had acted more quickly, but I’m glad to have more protection soon. And now, the news of note from the past week:
If you plan to get an Apple Vision Pro, one thing that you can look forward to is a great way to enjoy 3D movies. In the past, I’ve been unimpressed by most 3D movies because in a theater or using home theater glasses, you typically need to sacrifice brightness for the added dimension. But it looks like the 3D movie experience will be substantially better thanks to the numerous features of the Apple Vision Pro. Apple issued a press release this week to describe some of the new entertainment features that will become possible with an Apple Vision Pro, and it is worth reading that release to learn about it all.
One entertainment feature that you will not have in the Apple Vision Pro, at least for now, is a Netflix app. Netflix provided a statement to Bloomberg to say that you can watch Netflix content in a webpage with the Apple Vision Pro (the same way that you can already view Netflix content on a Mac or PC), but Netflix does not currently plan to update its iPhone/iPad/Apple TV app to work with the Vision Pro. This means that you cannot download a show to a Netflix app on the Apple Vision Pro, so you won’t be able to watch Netflix on a plane unless that plane has Internet service with sufficient bandwidth for streaming. My hope is that at some point in the future, Netflix will expand its 3D streaming video offerings and at the same time bring an app to the Apple Vision platform for both 3D and non-3D movies.
Apple invited a number of journalists who had previously tried out an Apple Vision Pro to do so again this week. One resulting report that I found particularly interesting is this one by Chance Miller of 9to5Mac because he provides some new information on the optional Dual Loop Band, the keyboard, spatial videos and Immersive Videos, the impressive Disney+ app, and the EyeSight feature that displays a representation of your eyes and other information on the outside of the device to improve interaction with people who are around you.
Samantha Wiley of iLounge reports that the HBO Max app will fully support the Apple Vision Pro.
John-Anthony Disotto of iMore lists the more than 150 3D movies that you will be able to watch on the Apple Vision Pro when it launches.
Many years ago, Epic picked a fight with Apple because Epic wanted to be able to offer its own App Store on the iPhone and iPad. In the resulting antitrust lawsuit, Epic won some minor battles, but Apple was mostly successful. This week, the United States Supreme Court denied further review. John Voorhees of MacStories, who used to be a practicing attorney, explains what is next in this article. For one, Apple will start allowing app developers to provide users with a link to pay for items on an external website instead of using an in-app purchase, but because of various restrictions on the process, many developers may decide that it isn’t worth the hassle. Additionally, Epic will have to pay Apple $73 million in legal fees. And perhaps worst of all for Epic, as John Gruber of Daring Fireball notes in this post, I doubt that we will ever see Fortnite return to Apple platforms: “iOS Fortnite players are like the children in an ugly divorce.”
Speaking of Apple’s legal battles, while Apple pursues its federal court appeal of the ITC import ban, Apple announced this week that it will disable the blood oxygen detection feature in all new models of the Apple Watch Series 9 and Apple Watch Ultra 2 (because the appellate court denied Apple’s request for a stay pending the appeal). If Apple wins the appeal, I’m sure that it will issue a software update to restore the feature on those devices. If Apple loses the appeal, perhaps Apple will reach an agreement with Masimo pursuant to which Apple can re-enable the feature. If you already own an Apple Watch with a working blood oxygen detection feature, then this change will not affect you. Wesley Hilliard of Apple Insider provides further details.
I’m a big fan of 3-in-1 wireless chargers that work with StandBy mode and charge your devices at the maximum possible speed in the least amount of space. That’s why I gave the Anker 3-in-1 Cube with MagSafe such a glowing review a few months ago. Rikka Altland of 9to5Toys has tried out the new Anker MagGo 3-in-1 Charging Station and is a big fan. At $110 on Amazon, this product is a little cheaper than the Anker Cube (currently $135) but works much the same way. Instead of being an official Apple MagSafe product, this is a Qi2 product, which means that it charges at the same speed as MagSafe. I hope to see even more Qi2 products soon so that creative product designers can come up with even more useful devices like this one.
Apple announced this week that two members of its Board of Directors are retiring now that they have reached the age of 75. One of those people is former U.S. Vice President Al Gore, and John Gruber wrote a nice article about Gore and Apple.
For a long time now, Apple has received the most revenue in the smartphone market, but that is because it focuses on more profitable models and lets other companies sell less expensive smartphones. But according to a report discussed by Ed Hardy of Cult of Mac, 2023 was the first time ever that Apple sold more smartphones than any other manufacturer. With a 20.1% market share, Apple just barely beat out Samsung’s 19.4% market share.
Thomas Daigle of CBC reports that a man in Toronto had his GMC Yukon SUV stolen—the second time that this had happened to him this year. This time, he had an AirTag hidden in the vehicle. He was able to track his vehicle to a rail yard, then the Port of Montreal, then the United Arab Emirates, and finally to a used car lot near Dubai. The fascinating article has tons of details on how the man tracked his stolen vehicle and tried, without success so far, to get the police and even Interpol to help him.
If you watch the Apple TV+ show For All Mankind—and if you don’t, you should!—then you know that last week, Apple aired the exciting final episode of Season 4. For some great behind-the-scenes information about the show, I recommend that you listen to the latest episode of the NASA Vending Machine podcast by Jason Snell and Dan Moren because they interview the co-creators and showrunners Ben Nedivi and Matt Wolpert.
If you are a fan of musicals, I recommend the first two seasons of the Apple TV+ show Schmigadoon! However, if you are a fan of Schmigadoon!, I’m sorry to share this report from Zac Hall at 9to5Mac. Even though Season 3 has been written (with 25 new songs), Apple decided not to renew the show for a third season. That’s a shame because it was a good show, and I have no doubt that I would have enjoyed another season of it.
And finally, speaking of great looks, here is a preview of a new TV series coming to Apple TV+ called The New Look. Inspired by true events, it tells the stories of Christian Dior, Coco Chanel, and others in the Paris fashion scene near the end of World War II when the Nazis occupied Paris. The cast and (of course) the wardrobe look amazing. I hadn’t heard anything about this show before this week, and based on this preview, it could be really great:
The Apple Vision Pro now has a release date, and so Brett Burney and I spend the first segment of this week’s episode of the In the News podcast discussing this interesting new product. We then discuss an initial review of the Clicks keyboard for iPhone, a new resource from David Sparks to help you to be more productive, why you should consider shuffling the photos on your iPhone’s lock screen, Find My devices that are slim enough to fit in a wallet, and more.
In our Where Y’At? segment, Brett and I discuss the iPhone that fell 16,000 feet from Alaska Airlines 1282 and survived to tell the tale.
In our In the Know segment, Brett recommends using Command-Tab on an external keyboard with an iPad, and I share a tip for playing a video in the background even when the app developer doesn’t make it easy for you to do so.
One week ago, on Friday, January 5, 2024, an Alaska Airlines plane left the airport in Portland, Oregon, and shortly thereafter a panel blew off the aircraft, causing the cabin to depressurize. It is amazing that the pilot was able to get the plane back to the airport quickly and nobody died, but according to Lauren Rosenblatt of the Seattle Times, many of the 165 passengers are claiming injuries and have filed a class action lawsuit. Minyvonne Burke and Jay Blackman of NBC News reported that the FAA issued a statement yesterday saying: “This incident should have never happened and it cannot happen again.” No argument there. I mention all of this because there is actually an interesting and surprising iPhone angle to this story. As reported by John Gruber of Daring Fireball (and many others), game designer Sean Bates found an iPhone on the side of the road that had apparently fallen from the plane, and it was still working after a 16,000-foot drop. He says it was “[s]till in airplane mode with half a battery and open to a baggage claim for #AlaskaAirlines ASA1282.” Note that he was able to see the baggage claim information because the iPhone did not have a passcode on it. So I suppose the real lesson here is that you may not be able to do anything about the safety of the planes that you fly, but you can control whether your iPhone has a passcode so that it remains private no matter what happens—even things that should have never happened. And now, the other news of note from the past week:
I’ve frequently mentioned the fantastic Field Guides created by former attorney and current guru of all things Apple David Sparks. These online guides contain high-quality videos that teach you everything that you need to know about a topic: typically an app or a service. His newest Field Guide is called the Productivity Field Guide (affiliate link), and it is full of information on how to be more productive no matter what it is that you are trying to do. I’ve heard David talk about productivity topics for more years than I can count, so I know that this is something that he cares deeply about and has thought a lot about. Congrats to David for finishing this new guide, and I hope that it helps tons of people to be more productive.
This isn’t an iPhone topic per se, but it is a legal technology topic that is of interest to me. Many courts have been adopting, or considering adopting, rules that require disclosures when you use AI to help create a brief. I am opposed to these rules, and rather than list my reasons here, I’m instead going to just link to this fantastic post on LinkedIn by New Orleans appellate attorney Andrew Lee, where he links to the letter that he sent to the U.S. Fifth Circuit. Kudos to Andy for researching and writing that detailed letter and for sharing it with all of us. I’ll also note that I’ve been using the new AI legal research feature on Westlaw, and it is incredibly useful.
Last week, I mentioned the introduction of Clicks, an iPhone case that adds a hardware keyboard. Nick Wolny of CNet was at CES in Las Vegas this week and had a chance to try it out for about 20 minutes, and his preliminary review was that the product has some real potential.
Rikka Altland of 9to5Toys reports on lots of new chargers, power banks, and multi-device docks that were announced by Anker at CES this week.
Although we know that Apple Vision Pro pre-sales begin in one week, there is still tons that we don’t know about purchasing and using this new product. For example, what is the process if you wear glasses? As Dan Moren notes in an article for Macworld, this is a different kind of product, with a different kind of launch.
I’ve noted that I’m a fan of the MagSafe Battery Pack from Apple, but I guess that Apple itself doesn’t share my love since it no longer sells the product. However, Michael Potuck of 9to5Mac reports that Mophie revealed at CES that it is bringing back its Juice Pack, a somewhat similar product that attaches with magnets to the back of your iPhone and provides additional battery life.
Jason Snell of Six Colors writes about using the Photo Shuffle feature to make pictures of friends and family—even our furry friends—appear on the iPhone lock screen. He notes: “One of the magical things about Photo Shuffle is that those obscure photos also keep floating to the top. They’re not necessarily the best or most polished, but they’re surprising and delightful.” I agree 100%, and I love this feature.
Apple announced this week that former Vice President Al Gore and former CEO of Boeing James Bell are retiring from the Apple Board of Directors now that both are turning 75, as reported by Chance Miller of 9to5Mac. Dr. Wanda Austin, former CEO of The Aerospace Corporation, has been nominated for election to Apple’s board of directors. If you are watching the current season of For All Mankind on Apple TV+, then you know why Al Gore has been on my mind for the last few weeks.
With more and more ways to use the Find My app to track items—such as the Eufy SmartTrack Card and Rolling Square AirCard devices I reviewed earlier this week—some folks may find themselves tracking many AirTags and similar products. Wesley Hilliard of Apple Insider reports that Apple recently raised the limit on items that you can track from 16 to 32. I didn’t even know that there was a limit.
If you use an Apple Magic Keyboard—I’ve been using one with my iPhone and iPad since 2010—Apple has released what it calls a critical firmware update to fix a Bluetooth security flaw, as reported by Oliver Haslam of iMore.
And finally, here is the opening title sequence for the upcoming Apple TV+ show Masters of the Air, a nine-part series that premieres on January 26. I was a big fan of Band of Brothers way back when, and I hope that this one is even better.
The next big thing from Apple is only days away. The Apple Vision Pro, first announced by Apple last year, will be available for pre-order starting at 8am Eastern on Friday, January 19, 2024. Two weeks later, on Friday, February 2, 2024, you will be able to schedule a demo of the Apple Vision Pro in an Apple Store to try out the product and purchase it if you like it. This is all in the United States; we don’t yet have a date for other countries.
Last year, we learned that the starting price would be $3,499. That remains true, and it is for a unit with 256GB of storage. I presume that Apple will also sell versions with more storage for a few hundred dollars more, but those specifics have not yet been announced.
If you have good vision, all you need is the device. If you wear glasses, you will need to purchase optical inserts made by ZEISS. If you just need something similar to reading glasses, that will cost $99. If you want inserts that match a prescription, you will need to have a valid prescription, and those lenses will cost $149. Apple has not yet provided details on how the process works for getting the prescription inserts. I believe that an eyeglasses prescription in the United States is valid for one year, so if your prescription is older than that, Apple may require you to get an updated prescription to order the lenses, but that is unclear.
I was a little surprised to see Apple provide all of this information yesterday in a mere press release. I thought that we might see another product video with more details on how it works. Perhaps Apple is saving that for February 2. The rumor is that initial supplies are limited, and those folks who are likely to be early adopters probably don’t need to get even more information from Apple to decide that they want to buy one. As you might have guessed, I fall in that category.
As we learn many more details over the coming weeks, it will be exciting to see this news unfold. If the early reports are accurate, this will be a fascinating device to use.
Here is a fun video called Get Ready that Apple released last night to build excitement. It reminds me of the fantastic Hello video that Apple released before the iPhone first went on sale: