When I first heard about Recall, I immediately buried my face in my hands. I never thought I'd see such a glaring privacy risk created by Microsoft, much less marketed as a feature.
If you haven't read about it yet, Recall is an AI feature coming to Windows 11 Copilot+ PCs. It's designed to let you go back in time on your computer by “taking pictures of your active screen every few seconds” and analyzing them with AI, according to Microsoft's Recall FAQ. If someone other than you gets access to that Recall data, it could be disastrous.
Satya Nadella says Windows PCs will have a photographic memory feature called Recall that will remember and understand everything you do on your computer by taking constant screenshots. pic.twitter.com/Gubi4DGHcs (May 20, 2024)
This may sound familiar, and that's because it's remarkably similar to the failed and now-retired Timeline feature in Windows 10. Unlike Timeline, however, Recall doesn't just restore a version of your desktop files; it uses AI to take you back to that exact moment, even reopening the relevant applications.
What is the problem with Windows Recall?
On the surface, this sounds like a cool feature, but the paranoid privacy purist in the back of my mind is burying their face in a pillow and screaming. Imagine if almost everything you've done over the past three months was recorded so that anyone with access to your computer could see it. Well, if you use Recall, you won't have to imagine it.
It may seem like an overreaction, but let me explain: Recall takes screenshots every few seconds and stores them on your device. Even with encryption in the mix, that's a huge trove of visual data showing most of what you've been doing on your computer during that period.
As Microsoft explains, “The default allocation for Recall on a device with 256 GB will be 25 GB, which can store approximately 3 months of snapshots. You can increase the storage allocation for Recall in your PC settings. Old snapshots will be deleted once you use your allocated storage, allowing new ones to be stored.”
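To put those numbers in perspective, here's a rough back-of-the-envelope estimate of what a 25 GB, three-month allocation implies. The snapshot interval and daily screen time below are my own assumptions, not figures from Microsoft, so treat the output as an illustration of scale rather than a measurement.

```python
# Back-of-the-envelope estimate of Recall's capture volume, based on
# Microsoft's stated default of 25 GB for roughly 3 months of snapshots.
# SNAPSHOT_INTERVAL_S and ACTIVE_HOURS_PER_DAY are assumptions.

ALLOCATION_GB = 25          # Microsoft's default allocation on a 256 GB device
RETENTION_DAYS = 90         # "approximately 3 months"
SNAPSHOT_INTERVAL_S = 5     # assumed: one snapshot "every few seconds"
ACTIVE_HOURS_PER_DAY = 8    # assumed: a typical day of screen time

snapshots_per_day = (ACTIVE_HOURS_PER_DAY * 3600) // SNAPSHOT_INTERVAL_S
total_snapshots = snapshots_per_day * RETENTION_DAYS
avg_kb_per_snapshot = (ALLOCATION_GB * 1024 * 1024) / total_snapshots

print(f"Snapshots per day:         {snapshots_per_day:,}")    # ~5,760
print(f"Snapshots over 3 months:   {total_snapshots:,}")      # ~518,400
print(f"Implied size per snapshot: {avg_kb_per_snapshot:.0f} KB")
```

Under those assumptions, that's roughly half a million screenshots of your activity sitting on the disk at any given time.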
This is worse than keylogging! Recall doesn't just record what you type; it records everything you do, with photographic evidence, every few seconds.
I say almost everything because Microsoft claims that “Recall also does not take snapshots of certain types of content, including InPrivate web browsing sessions in Microsoft Edge. It treats material protected with digital rights management (DRM) similarly; like other Windows apps such as the Snipping Tool, Recall will not store DRM content.” This is reassuring on the surface, but it's still too vague for anyone to put real faith in it.
Will that exemption only work in Microsoft Edge, or will it also cover Chrome and Firefox? If it only works with Edge, it feels like an egregious privacy penalty for not using Microsoft's unpopular web browser.
But that's just the tip of the iceberg. Microsoft openly admits that Recall will take screenshots of your passwords and private data:
“Please note that Recall does not perform content moderation. It will not hide information such as passwords or financial account numbers. That data may be in snapshots stored on your device, especially when sites do not follow standard internet protocols like cloaking password entry.”
So what you have here is something that stores your passwords, your personal information, your account details, and so on, and makes them visible to anyone with access to your profile. If you only have one profile on your device, that means everyone who uses that PC will be able to see your Recall data.
Arguably the worst part is that, according to Microsoft, Recall will be on by default on Snapdragon®-powered Copilot+ PCs from the moment you set up your device.
I think this is a bad idea. That decision should be made by the individual, not by Windows. Having it active from the start just means that people who don't know about it never get the chance to opt out. In my opinion, it's similar to cookie tracking, and it can be every bit as invasive. All of this makes me wonder whether Microsoft could have a consent problem under the GDPR.
Is Microsoft making Recall safe?
In Microsoft's defense, I would like it to be known that there was an attempt to make Recall safe. I don't think it was a very good attempt, but there was one.
Microsoft states that “Recall snapshots are saved on the Copilot+ PCs themselves, on the local hard drive, and are protected by data encryption on your device and (if you have Windows 11 Pro or a Windows 11 enterprise SKU) BitLocker.” Based on the wording here, it looks like your snapshots will only be protected by BitLocker if you have Windows 11 Pro or an enterprise Windows SKU.
The omission of Windows Home users is horrifying. If that is indeed the case, it leaves ordinary people vulnerable if their devices are compromised. People shouldn't have to pay a premium to upgrade just to protect their privacy on an operating system that takes snapshots of their screen every few seconds.
The big question, however, is what type of encryption is being used. I've been working with virtual private network (VPN) encryption for a while now, and just because something is “encrypted” doesn't mean it's secure. In fact, with advances in quantum computing, encryption is under threat, and even the best VPN services are having to adopt quantum-resistant encryption methods. We have already seen that BitLocker can be bypassed.
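If you want to know where you stand before trusting that claim, you can at least check whether your system drive is BitLocker-protected at all. Below is a minimal sketch that shells out to manage-bde, the BitLocker command-line tool built into Windows; it assumes your system drive is C: and needs to be run from an elevated prompt.

```python
# Minimal sketch: check whether a drive is BitLocker-protected by calling
# manage-bde, the BitLocker command-line tool that ships with Windows.
# Run from an elevated (administrator) prompt; assumes the system drive is C:.
import subprocess

def bitlocker_status(drive: str = "C:") -> str:
    """Return the raw 'manage-bde -status' output for the given drive."""
    result = subprocess.run(
        ["manage-bde", "-status", drive],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(bitlocker_status("C:"))
    # Look for "Protection On" in the output. "Protection Off" means the
    # local data on that drive (Recall snapshots included) is not
    # protected by BitLocker at rest.
```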
Another note in Microsoft's favor is that the data is stored locally and encrypted, rather than uploaded to a cloud server for Microsoft to access.
“Recall screenshots are only linked to a specific user profile and Recall does not share them with other users, make them available to Microsoft to view, or use them to target ads.”
This means that, for now, Microsoft isn't peeking behind the curtain. But there's no guarantee it will stay that way forever. If Microsoft can find a legal way to make money from this tool, I suspect it will try. For now, the goal seems to be to persuade people to upgrade their operating system.
If yours is one of those households with a different profile for each person on the family PC, you can claw back a little privacy.
“Screenshots are only available to the person whose profile was used to log in to the device. If two people share a device with different profiles, they will not be able to access each other's screenshots. If they use the same profile to log in to the device, they will share a screenshot history. Otherwise, Recall screenshots will not be available to other users or accessed by other applications or services.”
The problem is that this is only useful if you password-protect your profile, and if someone sets up parental controls on your profile, that could give them a back door.
What are the security risks with Recall?
You're probably thinking “so what?” Let me give you some scenarios where this could be a problem:
- You are using a public computer: Let's say you do some online shopping or banking on a library computer. You didn't realize Recall was active, and now the person using the computer after you has just opened your Recall history and grabbed your banking details, your address, and your passwords. It's like handing your house keys to a burglar and then telling them you're going on vacation that week.
- You are using a work laptop: We've all used a company computer for personal reasons, whether it's checking social media during lunch or running some errands because you don't have your own laptop. Now your boss, your IT team, and anyone else with access to your device can see, every few seconds, how you are using their equipment. They could use this to track your performance and see how productive you are, and they could even read the private messages you send to people.
- You are using a family PC: If you've been using your home computer and don't have a password-protected profile, anyone could log in and open your Recall history. If you've been up to something you'd rather keep private, it's about to become obvious, even if you deleted that search history.
- Your laptop is hacked or stolen: This is pretty obvious, but if someone manages to hack your device, the encryption won't matter. Similarly, if someone simply steals your laptop and you don't have a strong password locking it, then a criminal (cyber or otherwise) can use Recall to pull your entire world out from under you.
There are so many problems that can arise simply from someone accessing your Recall data. Using a password manager becomes pointless if someone can watch you type your master password, your private messages will be anything but private, and there's no point deleting your search history because Recall keeps the receipts.
How to protect your privacy with Windows Recall
There are a few ways to protect your privacy from Windows Recall, but the most obvious and effective is to disable it completely. As the saying goes, prevention is better than cure: it's best not to have these snapshots stored on your device in the first place.
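If you'd rather script that than dig through menus, here's a minimal sketch that writes the per-user registry policy reported to turn off Recall's snapshot saving. The key path and the DisableAIDataAnalysis value name are based on publicly reported preview-build policies and may change between Windows releases, so treat this as an assumption; the toggle in Windows Settings is the safer route if the key doesn't exist on your build.

```python
# Sketch: disable Recall snapshot saving for the current user by writing
# the reported policy value. ASSUMPTION: the key path and value name below
# come from publicly reported preview builds and may change; prefer the
# Windows Settings toggle if this policy doesn't exist on your machine.
# Windows-only (uses the winreg module from the standard library).
import winreg

POLICY_KEY = r"Software\Policies\Microsoft\Windows\WindowsAI"
POLICY_VALUE = "DisableAIDataAnalysis"

def disable_recall_snapshots() -> None:
    """Create the policy key if needed and set the disable flag to 1."""
    with winreg.CreateKeyEx(
        winreg.HKEY_CURRENT_USER, POLICY_KEY, 0, winreg.KEY_SET_VALUE
    ) as key:
        winreg.SetValueEx(key, POLICY_VALUE, 0, winreg.REG_DWORD, 1)

if __name__ == "__main__":
    disable_recall_snapshots()
    print("Policy value written; sign out and back in for it to take effect.")
```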
However, if you want to use Recall, you will need to do the following:
- Create an individual profile on your PC: This will prevent people from having shared access to your Recall data as long as you follow my next tip.
- Password-protect your profile: not just your device, but your profile too. And don't use a weak password; take it seriously. Use three memorable words with numbers and symbols, and no, don't set your password to “3-Memorable-worD5.”
- Encrypt your Recall data: You may have to upgrade your operating system or pay for BitLocker, but encryption is non-negotiable. If someone gets past your password, you don't want them to have immediate, unfettered access to everything you've been doing for the past three months.
- Don't access sensitive data while Recall is active: If you're going to enter personal passwords or view NSFW content, simply turn Recall off first. Obviously this is annoying and time-consuming, but it's far better than the alternative of having everything screenshotted.
In short: Recall gives me the creeps
Look, I've been a researcher and privacy advocate for years. I don't like the idea of anything tracking what we do. But this… this is something else. The risk that Recall entails, the sheer devastation it could cause if your device is hacked, the idea that Microsoft may be locking privacy behind what I can only describe as a paywall. It makes me sick.
There are too many ways this feature could be misused. The security risks cannot be overstated. Privacy cannot be assured. Taking screenshots of my device from the moment I set it up should not be a default option. Let the user be in control of their privacy and leave the decision in their hands.
All of this just pushes me further into the arms of privacy-loving Linux.