Microsoft’s new artificial intelligence feature, Recall, is sparking significant concern among cybersecurity experts because of its ability to take screenshots of a user’s activity every five seconds. Dr. Kris Shrishak, an AI and privacy adviser, told the British Broadcasting Corporation (BBC), “This could be a privacy nightmare.” He added, “The mere fact that screenshots will be taken during use of the device could have a chilling effect on people.”
Recall is a component of Microsoft’s broader Copilot AI interface, designed to help users “retrace their steps” across everything they have viewed on the device. The feature reads key terms and words in its screen captures: when users enter photos, phrases, or links to search their history, Recall scans for matches among the relevant screenshots.
Microsoft describes the functionality: “Trying to remember the name of the Korean restaurant your friend Alice mentioned? Just ask Recall and it retrieves both text and visual matches for your search, automatically sorted by how closely the results match your search.” The AI can even navigate back to the exact location of the item you saw.
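Microsoft has not published how Recall is built, but the workflow it describes can be illustrated in rough terms: capture the screen at intervals, extract the visible text, and keep it in a local, searchable index. The Python sketch below is purely conceptual, using the Pillow and pytesseract libraries and hypothetical names (recall_sketch.db, capture_once, search) that do not come from Microsoft.

```python
# Conceptual sketch of a capture-and-search loop; not Microsoft's implementation.
# Assumes Pillow and pytesseract are installed, plus a local Tesseract OCR binary.
import sqlite3
import time

import pytesseract
from PIL import ImageGrab

DB = sqlite3.connect("recall_sketch.db")
DB.execute("CREATE TABLE IF NOT EXISTS snaps (ts REAL, path TEXT, text TEXT)")


def capture_once(out_dir: str = ".") -> None:
    """Grab the screen, OCR it, and index the extracted text locally."""
    ts = time.time()
    path = f"{out_dir}/snap_{int(ts)}.png"
    image = ImageGrab.grab()                    # full-screen screenshot
    image.save(path)
    text = pytesseract.image_to_string(image)   # pull out the visible words
    DB.execute("INSERT INTO snaps VALUES (?, ?, ?)", (ts, path, text))
    DB.commit()


def search(term: str):
    """Return stored screenshots whose OCR text contains the search term."""
    cur = DB.execute(
        "SELECT ts, path FROM snaps WHERE text LIKE ? ORDER BY ts DESC",
        (f"%{term}%",),
    )
    return cur.fetchall()


if __name__ == "__main__":
    capture_once()                              # Recall repeats this every few seconds
    print(search("Korean restaurant"))
```

Even this toy version makes the privacy stakes concrete: whatever appears on screen, from a restaurant name to a password, ends up as searchable text on the device.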
Privacy and security concerns
The screenshots are stored locally on a person’s device and are not accessible by outside sources, including Microsoft, the company told the BBC in a statement. However, this assurance has not alleviated all concerns.
Daniel Tozer, a data and privacy expert, drew a dystopian comparison to the show “Black Mirror.” He noted, “There may well be information on the screen which is proprietary or confidential to the user’s employer; will the business be happy for Microsoft to be recording this?” Tozer also raised concerns about screenshots of video chats, questioning whether the other people on screen would be given the choice to consent.
Governments are already paying attention. A spokesperson for the United Kingdom’s Information Commissioner’s Office said the regulator is making inquiries with Microsoft to understand the safeguards in place to protect user privacy.
Potential for misuse and policy changes
Experts such as Jen Caltrider of Mozilla have expressed additional concerns, particularly about the easy access Recall could give to sensitive information. She highlighted risks such as law enforcement obtaining the data through court orders, as well as the possibility that Microsoft could later change its policies and use the content for targeted advertising or for training its AI models.
Caltrider also pointed out that Recall captures passwords whenever a site displays them without masking, putting users at risk. Moreover, reports indicate that Recall has already been cracked to run on unsupported hardware, raising further security issues. “I wouldn’t want to use a computer running Recall to do anything I wouldn’t do in front of a busload of strangers,” Caltrider cautioned.
Despite the concerns, Microsoft maintains that “you are in control with Recall,” noting that users can pause the feature. “You can select which apps and websites you want to exclude, such as banking apps and websites,” the company stated. As debates around privacy and security continue to swirl, it remains to be seen how Microsoft will address the numerous concerns raised by experts and regulatory bodies.