I Am a Product, and So Are You
I am a product, even though I wasn’t aware.
Every day, we spend countless hours staring at social media on our screens. What we see as harmless fun and convenience turns out to be much more. News of Facebook’s misuse of our personal data has become a recurring headline. Ironically, we see the news about Facebook’s breach of user trust on our Facebook newsfeeds as well; the company can’t even hide its sins on its own platform. 2.2 billion users have been informed on how to protect their information, while at least 87 million users may have had their data compromised through Cambridge Analytica. That data was used to target the ads and products we see in our feeds, and because Facebook’s psychological profiling is so accurate, the likelihood that we click on these ads is quite high. Because of this, our data is valuable, and social media companies like Facebook are worth billions of dollars not because of their interactive functions, but because of the information they have on their users. In theory, there’s nothing wrong with this business model. What’s unacceptable is that not all of these users consented to have their personal data used by other companies; it’s like giving away your psychological profile to complete strangers.
I am a product, even though I don’t want to be.
Facebook argues that it already gives users the option to protect their data from certain apps. As someone who is quite conscious of her online presence, I am well aware of these features on all my social media platforms. I would always click “do not allow” or “block” whenever prompted that an app was asking permission to use my data. Upon reflection, however, I am not even sure that was enough. While following Mark Zuckerberg’s Senate hearing, I grew skeptical of any trust I had in social media companies. I can choose to protect my data from other users and maybe some external apps, but what happens behind the scenes is beyond my control. It’s not just what you have posted on social media that stays on the Internet forever, but also your behaviour online: what you search, where you usually browse, what you tend to buy online, what music you listen to, what you like watching, and so on. With every click, tap, or scroll, an algorithm records another piece of you in a profile. Artificial intelligence is becoming so advanced and sophisticated that there are certain things it can figure out without user disclosure. For example, a Stanford University study found that certain algorithms can identify a person’s sexual orientation from a profile photo. If AI continues to progress, then my permission doesn’t even matter; my profile can be assembled just by observing my online behaviour. This is something I cannot control, even if I wish to.
I am a product, and my environment is keeping me that way.
Beyond the fact that we can choose to live in an information bubble where only the news we want exists, the algorithm is constantly keeping us in a certain online environment. In a time when any information can spread easily, our timelines are filled with whatever the AI believes will capture our attention, because that is what makes the most money. If our feeds are continuously filled with things we are likely to enjoy, then we will stay longer on the platform. And because we enjoy our time on social media, we don’t leave, even when we know our data is being put on the market. I’m basically cattle being force-fed until my sale. This is dangerous in another way as well: we can be made into the product that companies or organizations want us to be. An AI can tailor the news we receive to our preferences, which means our perspectives can be altered or reinforced. Especially in an era of fake news and easily accessible biased sources, the truth can be diluted or, even worse, lost; we live in whatever truth we want to see, and not what truly is.
I am a product, and so are you.
After this grim reflection, you may expect me to conclude that social media is horrible and that we should refuse to be cattle by boycotting it. The truth, however, is that we need these platforms. It’s socially expected of us to communicate with each other online. We get our products and information through the websites and apps we choose to spend our time on, and we can even find employment through the Internet.
This doesn’t mean, however, that we should just sit in front of our screens and accept our fate. The Facebook hearing could be the first step towards the regulation of these giant data companies, but governments are still far from having solid legislation to protect us from them. Social media and technology are not inherently evil; they are simply tools that have been exploited. The truth is, whether one chooses to believe it or not, we can’t completely unplug from our digital world, so we should become smarter about how we navigate the online space. After all, we choose to give up certain freedoms for the sake of convenience and pleasure; your phone, for instance, has taken your fingerprint, a part of your identity, in exchange for security and saved time. In the meantime, we can do things to protect ourselves: seek out and use the security controls provided to you, be wary of the articles that appear on your feed, and be more aware of your presence online. If you treat the online space like any public space, your attitude online will change. Remember that someone (or something) is always watching, so be vigilant about how you navigate the online world. If we keep up these habits, we will be able to create safer spaces for everyone, even when Big Brother is no longer watching.
Edited by Alec Regino