Ever began as an ordinary cloud storage company in 2013 before pivoting into the AI business and renaming itself Ever AI, without informing its millions of users. The company then used the photos uploaded to its platform to develop facial recognition tools.
A quick check of its website shows that the company still presents itself as a photo storage platform. There is no mention of its AI business apart from a privacy policy, added only last month, which states that photos shared on the platform are used to train its facial recognition system.
Ever then sells this technology to private companies and government agencies, including law enforcement and the military.
This is worrying: Ever pivoted from a cloud storage platform, to which tens of millions of users in 95 countries have uploaded some 13 billion photos and videos, into an AI company without telling those users about the change in business, violating their privacy in the process.
This repurposing of users' photos and videos without their knowledge has privacy experts and civil rights advocates concerned too, since people's personal memories can end up powering surveillance technology.
As usual when revelations of privacy violations surface, the company's CEO rushed to defend the practice. Doug Aley, Ever's CEO, told NBC that Ever AI does not share users' photos or any identifying information with its facial recognition customers, and that the images are only used to teach an algorithm how to identify faces.
When NBC reached out to Ever users, most of them had no idea the company was using their data in a way they would never expect: their photos were being used to train facial recognition technology. They found this creepy and invasive, and most of them deleted the app.
Alarms raised
Facial recognition technology has come under fire in recent years as tech companies tap giant databases of photos to improve the accuracy of their matching algorithms. Some rely on publicly available datasets, while others scrape people's photos without consent, violating their privacy.
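To give a sense of what "matching" means in practice: modern systems typically reduce each face to a numerical embedding and treat two photos as the same person when their embeddings are close enough. The sketch below uses the open-source face_recognition Python library with hypothetical filenames; it illustrates the general technique only, not Ever AI's actual system.

```python
# Minimal sketch of embedding-based face matching using the open-source
# face_recognition library. Filenames are hypothetical placeholders;
# this is NOT Ever AI's pipeline, just the general idea.
import face_recognition

# Load a "known" photo and a new photo to compare against it.
known_image = face_recognition.load_image_file("known_person.jpg")
unknown_image = face_recognition.load_image_file("new_upload.jpg")

# Each detected face is converted into a 128-dimensional embedding vector.
known_encodings = face_recognition.face_encodings(known_image)
unknown_encodings = face_recognition.face_encodings(unknown_image)

if known_encodings and unknown_encodings:
    # Faces "match" when their embeddings are within a distance threshold.
    distance = face_recognition.face_distance([known_encodings[0]], unknown_encodings[0])[0]
    is_match = face_recognition.compare_faces([known_encodings[0]], unknown_encodings[0])[0]
    print(f"distance={distance:.3f}, match={is_match}")
else:
    print("No face found in one of the images.")
```

The accuracy of this kind of matching improves with the amount and variety of face data used to train the underlying model, which is why large photo collections like Ever's are so valuable to facial recognition developers.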
Facial recognition has also been used to oppress minorities, especially when deployed for surveillance. Chinese start-ups have built algorithms that the government uses to track members of a largely Muslim minority group, a practice that has drawn wide international condemnation.
It is now imperative to be cautious about any service you sign up for or currently use, especially one that involves uploading and sharing photos, since those photos could be used to train facial recognition technology and build surveillance products without your knowledge.
Even with civil liberties groups calling for new laws and tighter regulation to protect users, little is stopping the facial recognition industry from moving forward.