Playing mind games in the fight for online privacy
<hl>Regulation is great, but the road to protected online privacy starts in our minds.</hl>
When the General Data Protection Regulation (GDPR) was introduced a year ago, it was seen as the biggest promise yet for online privacy. Internet users in the EU and beyond expected a level playing field in which they would regain control over their personal data and how it is used online.
Not long after the law took effect, the high expectations of GDPR collided with reality, with surveys suggesting that users find it mainly frustrating: what they end up with is a tsunami of privacy-policy spam, cookie-consent pop-ups and long, tedious legal terms. Instead of enjoying newfound control and peace of mind, many felt more harassed and blindsided than ever.
There’s very little doubt about the need for privacy protection, or for regulation to define and enforce it. Clear boundaries must be set. But the process of changing how online privacy is approached cannot be one-sided: privacy protection should be created for the people and adopted by the people. Just as we invest resources in forming regulation and following its implementation closely, we should make an equal effort to understand how individuals perceive this issue, and to address the specific obstacles that prevent us all from tackling it in our everyday lives. The road to a public solution runs through a series of personal mind games that we must acknowledge and win. Here are three mental obstacles to overcome on our path to creating a safer world wide web.
Now you see me, now you don’t
In 2016, Mark Zuckerberg published a Facebook post intended to promote Instagram, and instead promoted the idea of taping over one’s webcam and microphone. Two years later, during Zuckerberg’s Congressional hearing following the Cambridge Analytica data scandal, the CEO struggled to explain basic notions about Facebook’s advertising procedures and privacy policy. These examples show the gap between the tangible side of privacy, which is easy to understand and discuss, and the abstract concepts we can’t quite grasp.
The true nature of our online identity and digital footprint can be difficult to understand, so we tend to focus on things that are visible and close to our daily routine, such as browsing history and passwords. When it comes to a website’s cookie policy, for example, we don’t fully understand how it can protect us, or from what, but we are well aware of the annoyance of hitting “accept” over and over again; so much so that browser extensions that automatically dismiss these warnings are now available.
This approach leaves major parts of our personal data exposed and vulnerable because we don’t really consider protecting them. Every now and then, we’ll notice something strange and point out the connection between a conversation we’ve had and the ads we come across immediately after, but for the most part, we have a very limited idea of what online privacy actually means.
Ignorance is bliss
Burying our heads in the sand instead of investigating what is being done with our online data seems dangerous, but trying to figure it all out has costs of its own. In a perfect world, we would all be masters of our pension and insurance plans, as well as our online privacy. That would take quite a bit of time and could get awfully boring, but there’s another reason we tend to avoid studying certain corners of our lives: the overwhelming anxiety we feel when approaching these complex topics causes our minds to shut down, as avoidance can both drive anxiety and stem from it.
Ironically, recognizing just how important these issues are makes it harder and more stressful to approach and understand them, and the more we neglect to do so, the stronger our negative feelings towards addressing these topics grow.
Just because you're paranoid doesn't mean they aren't after you
Is it all in our heads? Of course not. The reason we have legislation such as GDPR and the California Consumer Privacy Act (CCPA), which comes into effect in 2020, is that corporations rarely volunteer to make this information public. And they know why: public understanding of the massive amount of personal data being gathered would damage the already-fragile trust in many corporations, and providing users with the tools to limit that data collection would harm companies’ ability to harness the information for profit.
The legislation in the field aimed at, and succeeded in, providing users with more information, but that’s not necessarily a good thing. In fact, it created a heavy information overload. An old internet joke claims that the best place to hide a body is the second page of Google’s search results, but a privacy policy can provide a hideout just as solid.
Today’s internet users are drowning in legalese that makes the core facts harder to detect and understand. Companies are compelled to share their policies, but they don’t have to make them comprehensible to the average Joe. No one is thinking of the individual when drafting company policies, and in the age of short attention spans and TL;DR, privacy policies are practically designed never to be read. Legal, financial and professional jargon makes researching our own exposure nearly impossible. What is truly needed now is a way to filter the crucial information and make it easily digestible.
If the real purpose of privacy laws is to protect people, it’s imperative that these laws account for how our minds realistically work, and form new paths accordingly. If we create solutions that only amplify and exploit these mind games instead of resolving them, we are not giving people back control over their online identity, but a mere mirage of privacy.