The Privacy Concern With Using Third-Party (Email) Apps
The Wall Street Journal's (paywall) story that third-party email apps like Return Path and Edison scan users' emails to improve their products and services highlights the general risks of granting access to third-party apps. In a post-Cambridge Analytica era, it also serves as a cautionary tale about how wrong it is to blindly trust them to do their job without taking the time to read their terms of service.
The fact that neither company's privacy policy explicitly mentions allowing humans to see user information is troubling, but what exactly is new here? (Remember the Unroll.me scandal?) That these developers are reading emails to unearth usage patterns? That Google is lax in vetting which apps can read your emails in the first place? Or that there is a general lack of clarity when it comes to privacy policies?
What most fail to realise is that with Google, third-party developers can get hold of personal data only with users' affirmative consent. Facebook faced a similar issue with the Cambridge Analytica data-sharing fiasco, but there it was compounded by a loss of transitive user privacy, leaking personal information of friends who may never have taken the personality quiz themselves. The search giant's handy support page should be your go-to guide (Step 2 in particular) when it comes to granting permissions to any third-party app.
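To make that consent mechanism concrete, here is a minimal sketch of how a third-party app requests read access to a Gmail inbox. It assumes the google-auth-oauthlib Python package and a hypothetical client_secret.json credential file downloaded from the Google Cloud console; the consent screen you click through is driven entirely by the scopes the app asks for.

```python
# Minimal sketch of the OAuth consent flow a third-party email app goes through.
# Assumes: google-auth-oauthlib is installed and "client_secret.json" (hypothetical
# filename) contains the app's OAuth client credentials.
from google_auth_oauthlib.flow import InstalledAppFlow

# "gmail.readonly" grants read access to every message in the mailbox.
# This is exactly the permission users wave through when they click "Allow".
SCOPES = ["https://www.googleapis.com/auth/gmail.readonly"]

flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)

# Opens the Google consent prompt in a browser; nothing is granted until
# the user affirmatively approves the listed scopes.
credentials = flow.run_local_server(port=0)

print("Granted scopes:", credentials.scopes)
```

The point of the example is that the permission screen lists only the scopes requested up front; once you approve a broad scope like full read access, what the developer does with that data afterwards is governed solely by its privacy policy, not by Google.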
Google, while no champion of privacy, said a year ago that it would stop scanning users' Gmail inboxes for targeted advertising. The question we should be asking, then, isn't whether these apps can read your emails after your explicit consent, but whether they state their intentions behind this sort of data collection (and retention) in black and white.
As the Electronic Frontier Foundation rightly notes, the correct way to protect data privacy would entail a four-pronged approach:
- Getting opt-in consent to online data gathering through a request that's "easy to understand and clearly advise the user what data the operator seeks to gather, how the operator will use it, how long the operator will keep it, and with whom the operator will share it."
- Creating an affirmative "right to know," so users can learn what data online services have gathered from and about them, and what they are doing with it.
- Creating an affirmative "right to data extraction," so users can get a complete copy of their data from an online service.
- Establishing proper mechanisms for users to hold companies accountable for data breaches and other privacy failures.
Whether it be through simplifying privacy policies or rolling out a comprehensive data protection framework like the GDPR, what's required is an urgent update to our privacy laws so that companies and online services that wilfully fail to respect user privacy are held responsible for their actions.