– PIGEON is an AI project created by Stanford University students to pinpoint the location of any photo, including personal photos, with a high degree of accuracy.
– The privacy implications of this technology are significant, including government surveillance, corporate tracking, and stalking.
– The creators of PIGEON have decided not to release the technology to the public, but concerns remain about what could be done by organizations like Google.
– While there are potential positive uses for the technology, such as identifying areas in need of maintenance or planning holidays, it also poses risks to personal privacy.
– Regulation is needed to prevent abuses, and companies developing AI tech should take responsibility for preventing harm caused by their products.
There’s no question that artificial intelligence (AI) is in the process of upending society, with ChatGPT and its rivals already changing the way we live our lives. But a new AI project has just emerged that can pinpoint where almost any photo was taken – and it has the potential to become a privacy nightmare.
The project, dubbed Predicting Image Geolocations (or PIGEON for short), was created by three students at Stanford University and was designed to find where images from Google Street View were taken. But when fed personal photos it had never seen before, it was usually able to identify their locations with a high degree of accuracy.
Jay Stanley of the American Civil Liberties Union says that has serious privacy implications, including government surveillance, corporate tracking and stalking, according to NPR. For instance, a government could use PIGEON to find dissidents or see whether you have visited places it disapproves of. Or a stalker could employ it to work out where a potential victim lives. In the wrong hands, this kind of tech could wreak havoc.
Motivated by those concerns, the student creators have decided against releasing the tech to the wider world. But as Stanley points out, that might not be the end of the matter: “The fact that this was done as a student project makes you wonder what could be done by, for example, Google.”
A double-edged sword
Before we start getting the pitchforks ready, it’s worth remembering that this technology might also have a range of positive uses, if deployed responsibly. For instance, it could be used to identify places in need of roadworks or other maintenance. Or it could help you plan a holiday: where in the world could you go to see landscapes like those in your photos? There are other uses, too, from education to monitoring biodiversity.
Like many recent advances in AI, it’s a double-edged sword. Generative AI can be used to help a programmer debug code to great effect, but could also be used by a hacker to refine their malware. It could help you drum up ideas for a novel, but might assist someone who wants to cheat on their college coursework.
But anything that helps identify a person’s location in this way could be extremely problematic in terms of personal privacy – and have big ramifications for social media. As Stanley argued, it’s long been possible to remove geolocation data from photos before you upload them. Now, that might not matter anymore.
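To illustrate the kind of stripping Stanley refers to: in a JPEG file, EXIF metadata – including GPS coordinates – lives in one or more APP1 segments near the start of the file. A minimal, hedged sketch (not any specific tool, just the standard JPEG segment layout) of dropping those segments with only the Python standard library might look like this:

```python
import struct

def strip_exif(jpeg: bytes) -> bytes:
    """Remove EXIF (APP1) segments from a JPEG byte stream.

    EXIF metadata, including GPS coordinates, sits in APP1 (0xFFE1)
    segments near the start of the file. Dropping those segments leaves
    the compressed image data untouched -- no re-encoding, no quality loss.
    """
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(jpeg[:2])          # keep the SOI marker
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            # Not a marker byte: we've hit raw data; copy the rest as-is.
            out += jpeg[i:]
            break
        marker = jpeg[i + 1]
        if marker == 0xDA:             # SOS: image data follows, copy verbatim
            out += jpeg[i:]
            break
        # Segment length is big-endian and includes its own two bytes.
        length = struct.unpack(">H", jpeg[i + 2:i + 4])[0]
        segment = jpeg[i:i + 2 + length]
        payload = jpeg[i + 4:i + 2 + length]
        # Drop APP1 segments carrying EXIF; keep everything else.
        if not (marker == 0xE1 and payload.startswith(b"Exif\x00\x00")):
            out += segment
        i += 2 + length
    return bytes(out)
```

This is exactly why Stanley's point stings: stripping the metadata is easy and lossless, yet PIGEON-style models infer location from the pixels themselves, which no metadata scrubber can touch.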
What’s clear is that some sort of regulation is desperately needed to prevent wider abuses, while the companies making AI tech must work to prevent damage caused by their products. Until that happens, it’s likely we’ll continue to see concerns raised over AI and its abilities.