This scary AI tool can guess your location from a single photo – and that’s a privacy nightmare

Key Takeaways:

– PIGEON is an AI project created by Stanford University students to pinpoint the location of any photo, including personal photos, with a high degree of accuracy.
– The privacy implications of this technology are significant, including government surveillance, corporate tracking, and stalking.
– The creators of PIGEON have decided not to release the technology to the public, but concerns remain about what could be done by organizations like Google.
– While there are potential positive uses for the technology, such as identifying areas in need of maintenance or planning holidays, it also poses risks to personal privacy.
– Regulation is needed to prevent abuses, and companies developing AI tech should take responsibility for preventing harm caused by their products.


There’s no question that artificial intelligence (AI) is in the process of upending society, with ChatGPT and its rivals already changing the way we live our lives. But a new AI project has just emerged that can pinpoint where almost any photo was taken – and it has the potential to become a privacy nightmare.

The project, dubbed Predicting Image Geolocations (or PIGEON for short), was created by three students at Stanford University and was designed to identify where Google Street View images were taken. But when fed personal photos it had never seen before, it was able to locate those too, usually with a high degree of accuracy.


AI Eclipse TLDR:

A new AI project called Predicting Image Geolocations (PIGEON) has been developed by three students at Stanford University. Originally designed to locate images from Google Street View, PIGEON can accurately pinpoint the location of almost any photo, including personal ones. While the technology has positive applications, such as identifying areas in need of maintenance or helping with holiday planning, it also raises serious privacy concerns. Jay Stanley of the American Civil Liberties Union warns that in the wrong hands, PIGEON could be used for government surveillance, corporate tracking, or stalking. The student creators have decided not to release the technology, but the potential for misuse remains a concern. Regulation is needed to prevent abuse, and companies must take responsibility for the potential harm caused by their AI products.