Microsoft is once again postponing the rollout of its controversial AI feature, Recall for Windows, over concerns about user privacy and data security. The company, which had planned to start testing the tool in October after an initial delay, now says Recall will be ready for Windows Insiders on Copilot+ PCs by December, pending further internal review.

In May, Microsoft introduced the Recall tool for Windows 11, describing it as a personal “time machine” that lets you instantly retrieve anything previously displayed on your screen, from documents and images to websites. Microsoft says the tool captures screenshots of the user’s screen, stores them securely on the device, and uses AI to organize and make this data searchable.


But the tool soon drew scrutiny from privacy advocates, forcing Microsoft to delay the rollout for additional review. Recall arrives as Microsoft and other tech companies race to roll out new AI-powered features in a bid to stand out in a crowded marketplace. Yet many of those companies, Microsoft included, are still navigating the security and privacy challenges that generative AI raises.

The latest remarks follow a September Windows blog post announcing updates to Recall's security and privacy architecture.

Brandon LeBlanc, senior product manager for Windows, said that to ensure Microsoft delivers on security and privacy, it is taking “additional time to refine the experience before previewing it with Windows Insiders.”

He added that the company is “committed to delivering a secure and trusted experience with Recall.”

The news was first reported by The Verge.

In September, the company said Recall will be opt-in only and can be removed entirely from Copilot+ PCs. It also noted that sensitive data in Recall is always encrypted and that the encryption keys are protected.



