Getting ads for something you just searched for is no coincidence. Tech giants track your every online move: the shows you watch, the sites where you shop, the latest gadget you might be thinking of buying. Everything is monitored.
The collected data is fed into machine-learning algorithms that serve ads and recommendations tailored to your preferences. Companies like Google cash in on your data to the tune of over $120 billion a year in ad revenue, reports MIT Technology Review.
Even if you configure Google Chrome to purge all website cookies and site data when you close the browser, it still keeps data for Google's own sites and YouTube, according to Mac programmer Jeff Johnson, who detailed the behavior in a blog post.
To force changes in these practices, researchers have suggested three ways:
Data strikes
Users install privacy tools or leave a platform altogether, so that tech firms such as Google and Facebook can no longer track and store their data.
Data poisoning
Data poisoning involves corrupting the dataset a company's models are trained on. For example, AdNauseam is a browser extension that clicks on every advertisement served to you, which confuses Google's ad-targeting algorithms.
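To see why indiscriminate clicking works, consider a toy simulation (this is not AdNauseam's actual code; the "model" here is a deliberately naive click-counting profiler invented for illustration). A user's genuine clicks concentrate on a few interests, so the profiler identifies them easily; flooding the log with random clicks on every ad category flattens the inferred profile until it says little about the real person.

```python
import random
from collections import Counter

def infer_interests(click_log, top_n=3):
    """Naive ad-targeting profile: rank ad categories by click share."""
    counts = Counter(click_log)
    total = sum(counts.values())
    return [(cat, counts[cat] / total) for cat, _ in counts.most_common(top_n)]

# Hypothetical ad categories for the simulation.
CATEGORIES = ["fitness", "travel", "gadgets", "fashion", "finance", "food"]

# A genuine click history concentrates on a couple of real interests.
genuine = ["gadgets"] * 40 + ["travel"] * 10

# An AdNauseam-style poisoner clicks every ad shown, regardless of topic.
random.seed(0)
poison = [random.choice(CATEGORIES) for _ in range(300)]

profile_before = infer_interests(genuine)
profile_after = infer_interests(genuine + poison)

print("Before poisoning:", profile_before)
print("After poisoning: ", profile_after)
```

Before poisoning, the profile is dominated by one category; afterwards the click shares are much flatter, so targeting based on them is close to guessing.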
Conscious data contribution
To register your protest against a platform, you can contribute your content to a competitor instead. For example, rather than uploading your photos to Facebook, you can post them on Tumblr.
With a collective effort, companies can be forced to change their data collection practices. For example, when WhatsApp announced new terms of service that would let Facebook and its subsidiaries access WhatsApp user data, millions of users deleted their accounts and moved to competitors like Signal and Telegram.
As a result, Facebook had to delay its policy changes.
Google, too, has announced that it will stop tracking individuals across the web to target ads at them, though its real motive remains unclear. It is possible that the growing use of ad blockers and tools like AdNauseam contributed to the decision.
“It’s exciting to see this kind of work,” says Ali Alkhatib, a research fellow at the University of San Francisco’s Center for Applied Data Ethics, who was not involved in the research.
“It was really interesting to see them thinking about the collective or holistic view: we can mess with the well and make demands with that threat because it is our data and it all goes into this well together.”