Metaverse poses serious privacy risks for users, report warns

Date: unknown


The immersive internet experience known as the metaverse will erode users’ privacy unless significant steps are taken to improve and regulate how the technology captures and stores personal data, a new report from New York University argues.

The metaverse relies on extended reality (XR) technologies — the umbrella term referring to augmented reality, virtual reality and mixed reality. The report from NYU’s Stern Center for Business and Human Rights warns that because the technology doesn't work without gathering and processing vast quantities of personal and bodily data, it presents a major privacy risk.

According to the report, the bodily data alone can be used to deduce behavioral and psychological information about individuals.

“Conventional XR hardware is equipped with sensors that continuously track at least three types of user data: head movements, eye movements, and spatial maps of physical surroundings,” the report says.

When compiled over time, the report argues, this data can reveal “highly sensitive information” about users, including their physical and mental condition, which can be leveraged for commercial or political gain.

Companies with a deep interest in metaverse technology include tech giants like Meta and Microsoft, hardware makers like Nvidia, game developers such as Epic Games and software platforms like Unity.

Companies should establish “known best practices” on privacy, safety by design, and cybersecurity before unveiling their products and should be transparent with the public about how the technology could affect their privacy, the report says.

Additionally, companies should erase all “raw and derived bodily data” as soon as it is no longer needed for the product to operate, and give users options to control how much risk they are exposed to. Given the potential dangers embedded in XR technology, the report argues that Congress should pass a comprehensive privacy law with language that protects against body-based data being used to profile users and that bolsters user consent models.

It notes that the version of the American Data Privacy and Protection Act (ADPPA) that made it out of the House Energy and Commerce Committee in July 2022, but ultimately failed to reach a floor vote, offered a “good foundation upon which to build” by barring companies from collecting geolocation and health data.

The House Energy and Commerce Committee is now negotiating an updated version of the ADPPA, which has not yet been unveiled.

The report says the revised bill needs to account for the harm that potential uses of body-based data can pose; improve notice and consent standards; and bar the use of consumers’ bodily data for psychographic profiling.

