Microsoft delays broad release of Recall AI feature due to security concerns

Microsoft announced today that it is delaying the broad release of its Recall artificial intelligence feature for Copilot+ PCs. Recall, which was originally slated to be widely available to Copilot+ PC users on June 18, will now first be released as a preview to members of the Windows Insider Program in the coming weeks.

The decision to push back the general availability of Recall stems from Microsoft’s desire to gather additional feedback and ensure the feature meets the company’s stringent security and quality standards before rolling it out to all users. The move underscores the growing scrutiny and caution surrounding the deployment of AI capabilities, as companies grapple with balancing innovation and responsible stewardship of the technology.

Balancing productivity and privacy: Recall’s on-device AI and security measures

Recall leverages on-device AI to periodically capture snapshots of a user’s screen, creating a searchable visual timeline to help users quickly find previously viewed content across apps, websites, images, and documents. While Microsoft has touted the feature as a productivity booster akin to a “photographic memory,” concerns have been raised about the privacy and security implications of storing and analyzing such sensitive data.
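
To make the pattern concrete, here is a minimal, hypothetical Python sketch of the general "periodic snapshot plus searchable timeline" idea described above. It is not Microsoft's implementation: the Snapshot and Timeline names are invented for illustration, and where Recall uses on-device vision models to extract content from screenshots, this sketch simply stores plain strings.

```python
import time
from dataclasses import dataclass, field

# Conceptual sketch only: illustrates the "snapshot + searchable timeline"
# pattern in general terms. It is NOT Microsoft's Recall code; the plain
# strings below stand in for text extracted by on-device AI models.

@dataclass
class Snapshot:
    timestamp: float
    app_name: str
    extracted_text: str

@dataclass
class Timeline:
    snapshots: list = field(default_factory=list)

    def add(self, app_name: str, extracted_text: str) -> None:
        # Each periodic capture becomes one entry in the visual timeline.
        self.snapshots.append(Snapshot(time.time(), app_name, extracted_text))

    def search(self, query: str) -> list:
        # Simple substring search; a real system would query a richer
        # semantic index built from the captured content.
        q = query.lower()
        return [s for s in self.snapshots if q in s.extracted_text.lower()]

if __name__ == "__main__":
    timeline = Timeline()
    timeline.add("Browser", "Flight itinerary: Seattle to Tokyo, June 18")
    timeline.add("Word", "Q3 budget draft for the marketing team")
    for hit in timeline.search("budget"):
        print(hit.app_name, "-", hit.extracted_text)
```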

To address these concerns, Microsoft is implementing additional security measures for Recall. These include "just in time" decryption protected by Windows Hello Enhanced Sign-in Security, ensuring that Recall snapshots can only be accessed once the user has authenticated. The company is also encrypting the search index database associated with Recall.
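
Purely as an illustration of that design, the following hypothetical Python sketch shows the shape of a "just in time" decryption flow: the index stays encrypted at rest and is decrypted only after an authentication check succeeds. It uses the third-party cryptography package's Fernet cipher as a stand-in, and the EncryptedIndexStore class and PIN check are invented; none of this reflects how Windows Hello Enhanced Sign-in Security actually works under the hood.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Illustrative sketch of a "just in time" decryption pattern: the search
# index is stored encrypted and only decrypted after authentication.
# This is NOT Microsoft's implementation; Fernet and the PIN check stand
# in for hardware-backed keys and Windows Hello verification.

class EncryptedIndexStore:
    def __init__(self) -> None:
        self._key = Fernet.generate_key()   # in practice, hardware-protected
        self._cipher = Fernet(self._key)
        self._blob = None                    # encrypted search index at rest

    def save_index(self, plaintext_index: str) -> None:
        # The search index is always written in encrypted form.
        self._blob = self._cipher.encrypt(plaintext_index.encode())

    def read_index(self, user_authenticated: bool) -> str:
        # "Just in time" decryption: refuse unless authentication succeeded.
        if not user_authenticated:
            raise PermissionError("Authenticate (e.g. Windows Hello) first")
        if self._blob is None:
            raise ValueError("No index stored")
        return self._cipher.decrypt(self._blob).decode()

def authenticate(pin: str) -> bool:
    # Placeholder standing in for biometric or PIN verification.
    return pin == "1234"

if __name__ == "__main__":
    store = EncryptedIndexStore()
    store.save_index("snapshot 42: flight itinerary; snapshot 43: budget draft")
    print(store.read_index(user_authenticated=authenticate("1234")))
```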


Responsible AI deployment: Microsoft’s commitment to its Secure Future Initiative

By first releasing Recall to the Windows Insider community, Microsoft aims to gather real-world feedback and usage data to refine the feature before making it more widely available. This staged rollout approach has become increasingly common for companies developing cutting-edge AI capabilities, as they seek to balance the potential benefits with the need for responsible deployment.

The delay also reflects Microsoft’s commitment to its Secure Future Initiative, which prioritizes security and privacy in the development of AI and other advanced technologies. As part of this initiative, Microsoft has implemented a range of security enhancements for Copilot+ PCs, including making them Secured-core PCs, enabling Microsoft Pluton security processors by default, and shipping them with Windows Hello Enhanced Sign-in Security.

While the postponement of Recall’s broad release may disappoint some early adopters eager to try out the feature, industry experts believe it is a necessary step to ensure the long-term success and trustworthiness of AI-powered tools like Recall. As enterprises increasingly look to leverage AI to boost productivity and gain competitive advantages, the responsible development and deployment of these technologies will be critical.

Microsoft has not provided a specific timeline for when Recall will be made available to all Copilot+ PC users, stating only that it will happen “soon” after gathering feedback from the Windows Insider Program. The company plans to publish a blog post with details on how to access the Recall preview once it becomes available to Windows Insiders.


