Why Transitive User Privacy is a Must

You may be the most privacy-conscious user, meticulously controlling how, where, and what pieces of information you share online. Yet even all the safeguards against pernicious digital tracking cannot keep your identity from being pieced together. Perhaps not the real "you" per se, but a profile that is still uniquely identifiable as "you."

TrueCaller's entire business model rests on this surreptitious modus operandi: building a database of people from the contact lists its users upload. So does Facebook's, along with that of just about every other platform that scoops up contact information to target ads and offer other "benefits." While it's debatable whether this sort of convenience is really necessary, the route these services take to obtain information about you from other sources is far sneakier and, to say the least, alarming.

Shadow profiles are so called for a reason. With Facebook, the practice raises a host of unanswered questions: what data is and isn't covered, whether user consent is involved at all, and whether you, as the user, can exercise any real control over the data being shared.

That's because Facebook maintains a profile not only of every user who has signed up for the service, but also of every non-user. These shadow profiles are filled with personal data gathered from uploaded contact lists (harvested from your friends via "Find Friends"), photos, and other sources. The result is a reserve of information stored indefinitely, on the premise that if a non-user ever does sign up, Facebook's algorithms will know exactly whom to recommend as friends. The same data also feeds ad targeting through a tool called Custom Audiences.
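
As a thought experiment, the data flow might look something like the minimal Python sketch below. Everything in it (ShadowProfileStore, ingest_contact_upload, the sample data) is hypothetical and purely illustrative of how a non-user's profile can accumulate transitively from other people's address books; it is not Facebook's actual code or API.

```python
from collections import defaultdict

class ShadowProfileStore:
    """Illustrative only: aggregates contact-list uploads keyed by phone number."""

    def __init__(self):
        # phone number -> facts contributed by *other* users' uploads
        self.profiles = defaultdict(
            lambda: {"names": set(), "emails": set(), "uploaded_by": set()}
        )

    def ingest_contact_upload(self, uploader_id, contacts):
        """Merge one user's uploaded address book into the store."""
        for entry in contacts:
            profile = self.profiles[entry["phone"]]
            if entry.get("name"):
                profile["names"].add(entry["name"])
            if entry.get("email"):
                profile["emails"].add(entry["email"])
            profile["uploaded_by"].add(uploader_id)

    def friend_candidates(self, phone):
        """If the person behind `phone` ever signs up, everyone who uploaded
        their number is an obvious friend-recommendation candidate."""
        return self.profiles[phone]["uploaded_by"]


store = ShadowProfileStore()
# Anna shares her address book, which includes Ben, who has no account.
store.ingest_contact_upload("anna", [{"phone": "+15551234567",
                                      "name": "Ben",
                                      "email": "ben@example.com"}])
print(store.friend_candidates("+15551234567"))  # {'anna'}
```

The point of the sketch is the asymmetry: Ben never interacted with the system, yet it already holds his name, his email, and a slice of his social graph.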

All of this is deeply troubling, and not just because the information is quietly siphoned from your family and friends. As Kashmir Hill rightly notes in her new story for Gizmodo, you have no control over the contact information someone else uploads about you, and no option to delete it, whether you are a Facebook user or not.

The researchers … found that if User A, whom we'll call Anna, shares her contacts with Facebook, including a previously unknown phone number for User B, whom we'll call Ben, advertisers will be able to target Ben with an ad using that phone number, which I call "shadow contact information," about a month later. Ben can't access his shadow contact information, because that would violate Anna's privacy, according to Facebook, so he can't see it or delete it, and he can't keep advertisers from using it either.
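
Mechanically, the targeting Hill describes comes down to a matching step: advertisers upload hashed contact details (Custom Audiences requires identifiers such as phone numbers to be SHA-256 hashed before upload), and Facebook matches them against every identifier it associates with an account, shadow or not. The Python sketch below is a simplified illustration under those assumptions; the helper names and phone numbers are made up.

```python
import hashlib

def normalize_phone(phone: str) -> str:
    """Keep digits only, so '+1 (555) 123-4567' and '1-555-123-4567' match."""
    return "".join(ch for ch in phone if ch.isdigit())

def hash_identifier(value: str) -> str:
    """SHA-256 hash of a normalized identifier, as Custom Audiences expects."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

# Identifiers associated with Ben's account. He supplied none of them himself:
# the number arrived via Anna's uploaded address book (shadow contact info).
ben_identifiers = {hash_identifier(normalize_phone("+1 (555) 123-4567"))}

# An advertiser's customer list, hashed client-side before upload.
advertiser_audience = {
    hash_identifier(normalize_phone("1-555-123-4567")),
    hash_identifier(normalize_phone("1-555-000-9999")),
}

# Audience matching reduces to set intersection on the hashes.
if ben_identifiers & advertiser_audience:
    print("Ben matches the Custom Audience and can be shown the ad.")
```

Because the match happens on Facebook's side against everything it knows about an account, Ben has no way to see, let alone remove, the identifier that put him in the audience.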

What's more concerning is that Facebook even uses the phone numbers users provide for two-factor authentication to target ads at them. In response, Facebook not only confirmed the findings but nonchalantly added: "We outline the information we receive and use for ads in our data policy, and give people control over their ads experience including custom audiences, via their ad preferences. For more information about how to manage your preferences and the type of data we use to show people ads see this post."

While it is true that Facebook's ad preferences page lets you see a list of advertisers who can target you using your (shadow) contact information, it isn't upfront about which piece of that information was used to reach you in the first place. This lack of networked (transitive) privacy, an illusion of control that in fact provides none, is one of the many dark patterns at the heart of the privacy control paradox. As Hill writes:

Facebook has claimed that users already have extensive control over what information is made available to advertisers, but that’s not entirely true. When I asked the company last year about whether it used shadow contact information for ads, it gave me inaccurate information, and it hadn't made the practice clear in its extensive messaging to users about ads. It took academic researchers performing tests for months to unearth the truth. People are increasingly paranoid about the creepy accuracy of the ads they see online and don’t understand where the information is coming from that leads to that accuracy. It seems that, when it came to this particular practice, Facebook wanted to keep its users in the dark.
