Harvard Students Develop App for Ray-Ban Meta Smart Glasses That Reveals People's Sensitive Information, Share Demo

Two Harvard engineering students have used Ray-Ban Meta smart glasses to create an app that can expose sensitive information about people without their knowledge. The students posted a demo video on the X platform showing off the app's capabilities. The app is not being made publicly available; instead, they created it to highlight the dangers of AI-powered wearables that can photograph people with their built-in cameras.

The app, called I-XRAY, uses artificial intelligence (AI) for facial recognition and then uses the processed visual data to dox individuals. Doxxing, popular Internet slang derived from "dropping docs", is the act of exposing personal information about someone without their consent.

It was demonstrated with Ray-Ban Meta smart glasses, but the developers said it will work with any smart glasses that have a camera. It relies on the same kind of AI model used by PimEyes and FaceCheck for reverse facial recognition: the technology matches a person's face against publicly available images online and returns the source URLs.

These URLs are then fed to a Large Language Model (LLM), and a prompt is generated automatically to extract the person's name, business, address, and other similar data. The AI model also mines publicly available government data, such as voter registration databases. In addition, an online people-search tool called FastPeopleSearch was used for this step.
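The pipeline described above can be sketched roughly as follows. This is a minimal illustration of the described data flow only: every function name is hypothetical, the bodies are stand-ins with placeholder data, and no real face-search service, LLM API, or people-search site is actually called.

```python
# Hedged sketch of the I-XRAY-style pipeline described in the article.
# All names and return values here are illustrative stand-ins.

def reverse_face_search(image_bytes):
    """Stand-in for a PimEyes/FaceCheck-style reverse face search.
    A real implementation would return URLs of web pages where the
    captured face appears."""
    return ["https://example.com/profile/jane-doe"]

def extract_identity(urls):
    """Stand-in for the LLM step: build a prompt from the source URLs
    asking a model to infer the person's name, business, and so on."""
    prompt = (
        "Given these pages, infer the person's name, business, "
        "and address:\n" + "\n".join(urls)
    )
    # A real pipeline would send `prompt` to an LLM API here;
    # this sketch returns placeholder data instead.
    return {"name": "Jane Doe", "business": "Example Corp"}

def lookup_public_records(name):
    """Stand-in for querying public sources such as voter registration
    databases or a people-search site like FastPeopleSearch."""
    return {"address": "123 Example St"}

def dox_pipeline(image_bytes):
    """Chain the three stages: camera frame -> URLs -> identity -> records."""
    urls = reverse_face_search(image_bytes)
    identity = extract_identity(urls)
    identity.update(lookup_public_records(identity["name"]))
    return identity

print(dox_pipeline(b"<camera frame>"))
```

The key design point the students highlight is the chaining itself: each stage's output (URLs, then a name) becomes the next stage's input, so the whole lookup runs with no human in the loop.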

Harvard students AnhPhu Nguyen and Caine Ardayfio also demonstrated how the app works in a short demo video. They were able to approach strangers with the camera already recording, after which the AI-powered app did the work of identifying each person and finding personal data about them.

"This synergy between LLMs and reverse face search allows for fully automated and comprehensive data extraction that was not previously possible with traditional methods alone," the developers wrote in a Google Docs file.

The students have said that they do not intend to make the app publicly available and that they developed it only to highlight the risks of AI-enabled wearables. However, nothing prevents people with bad intentions from building similar apps using the same methods.