Here’s How the Exposure Notification System from Apple and Google Protects Your Privacy – Los Angeles IT Consultants

In iOS 13.5, Apple incorporated a new Exposure Notification API in response to the global COVID-19 pandemic. We’ve seen a few people freak out about this, but seriously, calm down, folks. At best, the Exposure Notification API could lower contact tracing costs, reduce the spread of COVID-19, prevent life-changing health consequences, and save lives. At worst, it won’t prove particularly effective. In neither case does it pose any threat to personal privacy.

Why have Apple and Google—two companies that normally compete tooth and nail—formed this unprecedented partnership? Contact tracing is one of the key techniques employed by public health authorities in slowing the spread of COVID-19. It involves gathering information from an infected person about those they’ve been in contact with, enabling authorities to learn who might have been the source of the infection and who they may have infected. It’s a slow, laborious, and error-prone process—do you know or even remember all the people you’ve come in contact with over the past few weeks?—but it’s helpful nonetheless.

To speed up this process and make it more accurate, Apple and Google are building exposure notification capabilities into their respective smartphone operating systems. A large percentage of the population carries a smartphone running either iOS or Android, and since these phones have the capability to detect when other phones are in their vicinity via Bluetooth, Apple and Google realized they could use technology to alert people when they had been exposed to a person who later tests positive for COVID-19.

Their solution comes in two phases. In the first phase, Apple and Google released the Exposure Notification API, and that’s what happened with iOS 13.5. This API, or application programming interface, allows apps written by public health authorities to work across both iOS and Android devices, something that’s never been possible before. The first key fact to understand is that only public health authorities will be allowed to write apps that leverage the Exposure Notification API. It cannot be incorporated into sketchy social media apps.

Unfortunately, it seems likely that many people will never learn about or download those apps. So in the second phase, Apple and Google will build the exposure notification technology directly into iOS and Android, so it can work without a public health authority app being installed.

The second key fact to understand is that the entire system is opt-in. You must explicitly consent to the terms and conditions of the program before it becomes active on your phone. That’s true whether you get an app in the first phase or rely on the integration in the second phase. And, of course, if you change your mind, you can always turn it off in the app or the operating system settings.

How does it work? Apple and Google have developed an ingenious approach that ensures that those who opt in to the technology can use it without worrying about privacy violations.

Your phone creates a Bluetooth beacon with a unique ID derived from a randomly generated diagnosis encryption key. The system generates a fresh diagnosis key every 24 hours and stores it on your phone for 14 days, deleting all older keys. Plus, the unique Bluetooth beacon ID that your phone broadcasts to other phones in your vicinity changes every 15 minutes. Similarly, your phone reads the unique IDs from nearby phones and stores them locally. This approach ensures privacy in three important ways:

  1. No personal information is shared. The ID is based on a random encryption key and changes constantly, so there’s no way it could be traced back to your phone, much less to you personally.
  2. No location information is stored. The only data that’s generated and transferred between the phones are these unique IDs. The system does not record or share location information, and Apple and Google have said they won’t approve any public health authority app that uses this system and also records location separately.
  3. No data is uploaded unless you test positive. As long as you remain uninfected by COVID-19, no data from your phone is uploaded to the Apple- and Google-controlled servers.
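The rolling-key scheme described above can be sketched in a few lines of Python. This is an illustrative simplification, not the actual specification: the real protocol derives beacon IDs using HKDF and AES, whereas HMAC-SHA256 stands in here, and the function names are ours. The point it demonstrates is the one-way derivation: each 15-minute beacon ID reveals nothing about the daily key it came from.

```python
import os
import hmac
import hashlib

DAY_SECONDS = 24 * 60 * 60
BEACON_INTERVAL_SECONDS = 15 * 60  # the beacon ID rotates every 15 minutes

def new_daily_key() -> bytes:
    """Generate a fresh random diagnosis key; a new one is made every 24 hours."""
    return os.urandom(16)

def beacon_id(daily_key: bytes, interval_index: int) -> bytes:
    """Derive the rotating Bluetooth beacon ID for one 15-minute interval.

    HMAC-SHA256 is a stand-in for the spec's HKDF/AES derivation; either way,
    the derivation is one-way, so an observed ID can't be traced to the key.
    """
    msg = interval_index.to_bytes(4, "little")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

# A day contains 96 fifteen-minute intervals, so one key yields 96 distinct IDs.
key = new_daily_key()
ids = [beacon_id(key, i) for i in range(DAY_SECONDS // BEACON_INTERVAL_SECONDS)]
assert len(ids) == 96
assert len(set(ids)) == 96  # every interval broadcasts a different ID
```

Because only the daily keys are ever uploaded, anyone holding a key can re-derive its 96 IDs, but nobody can go the other direction from an ID back to a key or a phone.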

What happens if you test positive for COVID-19? (Sorry!) In that case, you would need to use a public health authority app to report your test results. You’ll likely have to enter a code or other piece of information to validate the diagnosis—a requirement necessary to prevent fake reporting.

When the app confirms your diagnosis, it triggers your phone to upload up to the last 14 days of diagnosis encryption keys—remember, these are just the keys from which the IDs are derived, not the IDs themselves—to the servers. Fewer days might be uploaded depending on when the exposure could have occurred.

All the phones enrolled in the system regularly download from the servers the diagnosis keys that infected people have uploaded. Each phone then performs cryptographic operations to see whether the IDs derived from those keys match any of the locally stored Bluetooth IDs captured during the period each key covers. If there’s a match, that means you were in proximity to an infected person, and the system generates a notification with information about the day the exposure happened, how long it lasted, and the Bluetooth signal strength (which can indicate how close you were). A public health authority app will provide detailed instructions on how to proceed; if someone doesn’t have the app yet, the smartphone operating system will explain how to get it. Additional privacy protections are built into these steps:

  1. No one is forced to report a positive diagnosis. Just as you have to opt in to the proximity ID sharing, you must explicitly choose to share your positive diagnosis. Not sharing puts others, including your loved ones, at risk, but that’s your decision to make.
  2. Shared diagnosis keys cannot identify you. The information that your phone uploads in the case of a positive diagnosis is limited to—at most—14 encryption keys. Those keys, which are then shared with others’ phones, contain no personal or location information.
  3. The matching process takes place only on users’ phones. Since the diagnosis keys and the derived IDs only meet on individual phones, there’s no way Apple, Google, or any government agency could match them up to establish a relationship.
  4. The notification information is too general to identify individuals. In most cases, there will be no way to connect an exposure notification back to an individual. Obviously, if you were in contact with only one or two people on a relevant day, that’s less true, but in such a situation, they’re likely known to you anyway.
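The on-device matching step can be sketched as well. As with the earlier sketch, this is a simplification with names of our own choosing, using HMAC-SHA256 in place of the spec's HKDF/AES derivation. What it shows is why the server learns nothing about contacts: it only distributes keys from people who reported a positive test, and the comparison against IDs heard over Bluetooth happens entirely on each phone.

```python
import os
import hmac
import hashlib

def beacon_id(daily_key: bytes, interval_index: int) -> bytes:
    # Same illustrative one-way derivation as on the broadcasting side:
    # a phone can only recognize an ID once it holds the key it came from.
    msg = interval_index.to_bytes(4, "little")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

def exposure_matches(downloaded_keys, observed_ids, intervals_per_day=96):
    """Check downloaded diagnosis keys against IDs this phone heard nearby.

    Re-derives every beacon ID from each key and intersects with the local
    log of observed IDs; any hit records which key and which 15-minute
    interval matched, so the notification can say when exposure occurred.
    """
    observed = set(observed_ids)
    return [
        (key, interval)
        for key in downloaded_keys
        for interval in range(intervals_per_day)
        if beacon_id(key, interval) in observed
    ]

# Simulate: an infected person's key, whose interval-42 ID we overheard,
# plus one random ID as background noise.
infected_key = os.urandom(16)
heard = [beacon_id(infected_key, 42), os.urandom(16)]
hits = exposure_matches([infected_key], heard)
assert hits == [(infected_key, 42)]
```

Note that the keys and the observed IDs only ever meet inside this local loop, which is the technical basis for point 3 above.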

Finally, Apple and Google have said they’ll disable the exposure notification system on a regional basis when it is no longer needed.

We apologize if that sounds complicated. It is, and necessarily so, because Apple and Google have put a tremendous amount of thought and technical and cryptographic experience into developing this exposure notification system. They are the preeminent technology companies on the planet, and their knowledge, skills, and expertise are as good as it gets. A simpler system—and, unfortunately, we’ll probably see plenty of other apps that won’t be as well designed—would likely have loopholes or could be exploited in unanticipated ways.

You can read more about the system from Apple and Google, including a FAQ and the technical specifications.

Our take? We are participating in this exposure notification system. It’s the least we can do to help keep our loved ones and others in our communities safe. In a pandemic, we all have to work to help others.

(Featured image based on an original by Dennis Kummer on Unsplash)


Having Mac Troubles? Running Apple Diagnostics May Help Identify the Problem – LA IT Consultants

If your Mac is acting up and you suspect a hardware problem, there’s an easy first step that you can—and should—try before calling for tech support: Apple Diagnostics. (On Mac models released before June 2013, Apple instead included a similar set of diagnostics called Apple Hardware Test.)

Apple Diagnostics is a set of hardware test routines that Apple bakes into every Mac. It tests numerous internal subsystems in your Mac, including the CPU, memory, and firmware; displays and graphics adapters; connectivity via USB, Bluetooth, Wi-Fi, and Thunderbolt; batteries and power adapters on laptops; and more.

Before you run Apple Diagnostics, prepare your Mac with these steps:

  1. If you have a firmware password enabled, turn it off before proceeding.
  2. If possible, pick a situation when the Mac is most likely to experience the problem (such as right after turning it on for the day, or when it’s unusually warm).
  3. Disconnect all external devices with the following exceptions: the keyboard and mouse or trackpad, display, Ethernet cable if you use it, and power adapter for laptops.
  4. If you’re testing a laptop, make sure it’s on a flat, well-ventilated surface.
  5. Shut down your Mac.

Once you’re ready, turn your Mac on while holding down the D key. (If that doesn’t invoke Apple Diagnostics, try again, holding down Option-D to attempt to start Apple Diagnostics over the Internet.) Keep holding the key down until you see a screen asking you to choose your language. Once you’ve done that, you’ll see a bar showing the progress of the diagnostic tests, which should take only a few minutes.

What to do if Apple Diagnostics reports an issue

If Apple Diagnostics finds any issues, it suggests solutions and provides reference codes. Write the reference codes down so you can share them with tech support later, if necessary. Apple publishes a full list of reference codes, but the list generally doesn’t tell you much beyond what the Apple Diagnostics report explains.

After you’ve read about the issues and solutions, you have four options.

  1. For a second opinion, click the “Run the test again” link. It’s not a bad idea to make sure that multiple tests come up with the same results. If they don’t, that’s useful information for tech support too.
  2. To get more information, including details about service and support options from Apple, click the “Get started” link. Doing so causes your Mac to start up in macOS Recovery, open Safari, and display a Web page for Apple Support. It asks for your location along with permission to read your Mac’s serial number and reference codes before providing additional details. If your Mac can’t access the Internet at this time, none of this will work.
  3. To restart your Mac normally, click the Restart button.
  4. To shut your Mac down normally, click the Shut Down button.

Most problems identified by Apple Diagnostics require service from an Apple Authorized Service Provider or from Apple itself, but there are a few exceptions:

  1. If you get a note about USB or Thunderbolt hardware, make sure you’ve disconnected any devices other than the keyboard and pointing device and test again. If you have another wired keyboard or pointing device, swap those in and test again.
  2. If Apple Diagnostics complains about your laptop’s power adapter, disconnect it from both the wall and the computer, reconnect it to both, and rerun the test.
  3. One of the battery errors (PPT004) may require updated diagnostic information. To confirm the problem, run Apple Diagnostics over the Internet: shut down the Mac and start it up again while holding Option-D.

What to do if Apple Diagnostics doesn’t find any problems

With any luck, you’ll see the coveted “No issues found” message. While that doesn’t mean you’re imagining any problems, it does suggest that they’re probably related to software and won’t require a hardware repair. However, some infuriating problems are intermittent due to solder connections being warm or cold, which is why it’s important to test when they’re most likely to occur.

One final note: If you want to see the results of the last run of Apple Diagnostics, open the System Information app and click Diagnostics under the Hardware section.

(Featured image by Adam Engst)


How to Find the Snaps You Want in the Mac’s Photos App: IT Support Los Angeles

Digital cameras have been around long enough that people have stopped making snarky comments about how hard it is to find anything in a shoebox filled with hundreds of unorganized photos. But given the tens of thousands of photos many of us now have, it’s hard to be smug about the ease of finding any given image. Luckily, Apple has provided us with numerous tools in the Photos app to help. Some of these organization systems you have to set up and maintain, but others work silently for you in the background. Let’s start with the automatic methods.

Date

It’s impossible to miss how Photos automatically organizes your photo library by date, particularly in macOS 10.15 Catalina, where the Photos view lets you drill down by Year, Month, and Day. One tip: Day view doesn’t necessarily show you all the pictures taken on a particular day; to see them, click All Photos.

If you don’t want to browse, you can also search (choose Edit > Find) on things like “2015” or “January 2015.” The utility of such searches is that they filter the displayed images to just those taken in that year or month. You can even search on “January” to find all photos taken in January of any year.

People

With a little training of its facial recognition algorithms, Photos can automatically create and maintain collections of photos of particular people. Click People in the sidebar to see the faces that Photos has identified automatically, and if any of them currently lack names, click the Name button for a photo you want to identify, enter a name, and either press Return or select from the suggestions. Although it may not happen immediately, Photos will scan all photos for other pictures of each person and add them; if you get a banner in the toolbar asking you to review additional photos, click Review and then deselect any photos that aren’t that person in the next dialog.

Whenever you’re looking for a photo of a particular person, the fastest way may be to focus on just those photos that contain their face. Click People in the sidebar and double-click the desired person’s box to see their photos. Make sure to click Show More to see all the matched photos, rather than just those Photos deems the best.

Places

By default, the Camera app tags every iPhone or iPad photo with the location where you took the picture. That enables you to search for images on a map. Click Places in the sidebar, and then pan and zoom the map to find the desired location. Click any photo thumbnail to show just the photos taken in that spot. If you know the name of the location, you can also search for it directly—Photos knows the names of all geotagged locations.

Location-based searching could be a godsend for real-estate agents, builders, and others who need to collect images by address. No need to use keywords or other metadata, since the geotagging provides all the necessary information. 

AI Object Search

In the last few releases of Photos, Apple has added object searching, which finds photos based on their contents. Looking for photos of cows, or beaches, or oak trees? Just type what you want to find into the Photos search field, and Photos might find it.

Although it’s magic when this approach works, don’t put too much stock in it. Searching for “cow” also brought up images of pigs, goats, and horses for us. Close, in that they’re all four-legged farm animals, but no cigar.

Media Types

Sometimes, what you want to find is already categorized by its media type. If you want to find a selfie, for instance, or a panorama, look no further than the Media Types collection in the Photos sidebar. It includes dedicated albums that automatically update themselves to contain videos, selfies, Live Photos, Portrait-mode photos, panoramas, time-lapse movies, slo-mo movies, bursts, screenshots, and animated GIFs.

Albums and Smart Albums

With the categorization techniques so far, you don’t have to do much, if anything. With albums, however, all organization is entirely manual. Creating a new album is easy—select some photos and then choose File > New Album with Selection. After the fact, you can add more photos to the album by dragging them from the main window to the album in the sidebar. And, of course, clicking the album in the sidebar displays all the photos.

Smart albums are entirely different from albums—they are essentially saved searches. To create one, choose File > New Smart Album and then define the matching criteria. Photos provides oodles of options, making it easy to create a smart album that, for instance, holds photos of a particular person taken with one specific camera over a certain time frame.

An aspect of working with albums and smart albums that can be confusing is how to delete photos. When you remove a photo from a regular album, you’re just taking it out of that album, not deleting it from your library. (To actually delete a photo from your library, click Photos in the sidebar before selecting the photo and pressing the Delete key.) The only way to remove a photo from a smart album is to ensure that it no longer matches the smart album’s criteria, either by changing the conditions or by modifying the photo’s metadata, which isn’t always possible.

Keywords

If you want to tag individual images in a way that makes them easy to find later, keywords are an excellent option. Choose Window > Keyword Manager to display the floating Keywords window, and click Edit Keywords to open the editing view where you can click + to add a keyword (complete with a one-letter shortcut, which also puts it at the top of the Keywords window). Click – to remove a keyword (from the list and from any photos to which it’s assigned). Click OK to switch back to the main keyword view.

To assign a keyword, select a set of photos or just focus on the current one. Either click the keyword in the Keywords window or press its associated letter shortcut. Clicking or pressing the shortcut again removes the keyword.

You can see what keywords are attached to an image by making sure View > Metadata > Keywords is chosen and then clicking the badge that Photos adds to keyworded images. To find everything with a particular keyword, though, you’ll have to do a search and, if necessary, look at the Keywords collection at the bottom of the search results.

Titles and Descriptions

Another way to find photos manually is to give them titles or descriptions and then search for words in those bits of metadata. Applying consistent titles and descriptions manually would be onerous, but you can do multiple selected images as easily as one. Select some pictures, choose Window > Info, and in the Info window, enter a title or description. Close the Info window to save.

To see (and edit) the title under each image, make sure View > Metadata > Titles is chosen. To find included words, you need to do a search, just like with keywords.

Choosing the Best Approach for Your Needs

So many choices! Here’s our advice about which you should use:

  • When possible, stick with the approaches (date, People, Places, object search, media types) that require little or no additional tagging work. People and Places are particularly useful that way.
  • If you can construct a smart album that finds all the images you want, do it. However, it may not be useful (or possible) unless you’re looking for a subset of photos that are already in an album, have a keyword, or are attached to a person.
  • Use albums for quick, ad-hoc collections or for collections of related photos. They’re easy to make and use, and to delete if you no longer need them. An album would be good for collecting all the photos from your summer vacation.
  • Use keywords to identify general aspects of images throughout your entire photo library that you’re happy to access only by searching or via a smart album. Keywords would be useful for tagging all the photos you take of lecture slides, or that relate to your hobby.
  • Avoid relying on titles and descriptions if you can. It’s too easy to make mistakes such that later you can’t find items you’ve titled or described. Albums and keywords are better for organization. Leave the titles and descriptions for actually titling and describing individual images.

Next time you think, “I wish I could find all my photos that…,” take a minute and think through these options to decide which will best serve your needs.

(Featured image by Simon Steinberger from Pixabay)