
Harvard students develop app to identify everyone using Meta Smart Glasses

Two Harvard engineering students used Ray-Ban Meta smart glasses to build an app that can reveal sensitive information about people without their knowledge. The students posted a demo video on X (formerly known as Twitter) showing off the app’s capabilities. Notably, the app has not been made publicly available; it was created to highlight the dangers of AI-powered wearable devices with discreet cameras that can photograph people unnoticed.

The app, called I-Xray, uses artificial intelligence (AI) for facial recognition and then uses the processed visual data to doxx individuals. Doxxing, Internet slang derived from “dropping dox” (an informal term for documents or records), is the act of revealing personal information about someone without their consent.

It was integrated with the Ray-Ban Meta smart glasses, but the developers said it would work with any smart glasses equipped with a discreet camera. It uses an AI model similar to PimEyes and FaceCheck for reverse facial recognition: the technology matches a person’s face against publicly available images of them online and surfaces the URLs where those images appear.
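Reverse face search of this kind can be sketched as nearest-neighbor matching over face embeddings. The article does not describe I-Xray's or PimEyes's internals, so the embedding vectors, the index of (embedding, source URL) pairs, and the similarity threshold below are all hypothetical stand-ins for illustration:

```python
import math
from dataclasses import dataclass

@dataclass
class FaceMatch:
    url: str      # page where a matching photo appears
    score: float  # cosine similarity in [0, 1] for non-negative embeddings

def reverse_face_search(query, index, threshold=0.8):
    """Match a query face embedding against an index of (embedding, URL)
    pairs by cosine similarity; return matches above the threshold,
    best first. A toy stand-in for a real reverse face search engine."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    matches = [FaceMatch(url, cosine(query, emb)) for emb, url in index]
    return sorted((m for m in matches if m.score >= threshold),
                  key=lambda m: m.score, reverse=True)

# Toy index: two indexed faces and the pages they came from.
index = [
    ([1.0, 0.0, 0.2], "https://example.com/profile/alice"),
    ([0.0, 1.0, 0.0], "https://example.com/profile/bob"),
]
matches = reverse_face_search([0.9, 0.1, 0.2], index)
print([m.url for m in matches])  # only the close match survives the threshold
```

Real systems index billions of scraped photos and use learned embedding models, but the matching step reduces to this kind of similarity search.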

A large language model (LLM) is then fed these URLs along with an automatically generated prompt to extract the person’s name, occupation, address and other similar details. The AI model also looks through publicly available government data, such as voter registration databases, and the developers additionally used an online tool called FastPeopleSearch for this purpose.
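The URL-to-LLM step amounts to assembling an extraction prompt over the pages that the reverse face search returned. Below is a minimal sketch of that prompt assembly; the field list (name, occupation, address) comes from the article, while the function name and prompt wording are invented for illustration and are not I-Xray's actual prompt:

```python
def build_extraction_prompt(urls):
    """Assemble an automatic prompt asking an LLM to pull structured
    personal details from a list of source pages (hypothetical wording)."""
    sources = "\n".join(f"- {u}" for u in urls)
    return (
        "From the content of the following pages, extract the person's "
        "name, occupation, and address, citing the source page for each field.\n"
        "Sources:\n" + sources
    )

prompt = build_extraction_prompt([
    "https://example.com/profile/alice",
    "https://example.com/news/alice-interview",
])
print(prompt)
```

In a full pipeline this string would be sent to an LLM API together with the fetched page contents; the point of the sketch is that the prompt is generated automatically, with no human in the loop.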

In a short video demonstration, Harvard students AnhPhu Nguyen and Caine Ardayfio showed how the app works. With the glasses’ camera already recording, they could approach strangers and ask their names, and the AI-powered app could take over from there to find personal data about the individual.

In a Google Docs file, the developers said: “This synergy between LLMs and reverse face search enables fully automatic and comprehensive data extraction that was previously not possible using traditional methods alone.”

The students have stated that they have no intention of making the app publicly available and developed it only to highlight the risks of AI-enabled wearable devices that can discreetly record people. However, that doesn’t mean bad actors couldn’t build a similar app using the same methodology.
