Bloomberg reports that, after four years of negotiations, Google purchased a trove of credit card transaction data from Mastercard, allegedly for “millions of dollars.” Google then reportedly used that data to provide select advertisers with a tool called “store sales measurement,” which the company quietly announced in a blog post last year, though it failed to mention the inclusion of Mastercard data in the workflow. The tool can track how online ads lead to real-world purchases, and that extra data is designed to make Google’s ad products more appealing to advertisers. (Read: everybody makes more money this way.) The public was not informed of the reported Mastercard deal, though advertisers have had access to the transaction data for at least a year, according to Bloomberg.
While this appears to be a serious breach of privacy, this article digs a bit more into the technical and cryptographic aspects of what is going on. The basic summary is that this appears to be an early practical application of something called homomorphic encryption. I’m not super familiar with it, or with the math behind cryptography in general (just some programming applications), but the gist, as the article explains, is that it allows research to be conducted on sensitive data without revealing anything sensitive to whoever consumes that data. Basically, you can run computations on encrypted data as if it were the plaintext, while the encryption prevents serious breaches of privacy.
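To make the idea concrete, here’s a toy sketch (my own, not from the article) using textbook, unpadded RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts gives you a ciphertext of the product of the plaintexts. The primes and exponents below are the classic tiny textbook values, wildly insecure, just for intuition:

```python
# Toy illustration of a homomorphic property using textbook
# (unpadded) RSA with tiny primes -- insecure, for intuition only.
p, q = 61, 53          # small demo primes (never use in practice)
n = p * q              # public modulus: 3233
e = 17                 # public exponent
d = 2753               # private exponent (e * d = 1 mod lcm(p-1, q-1))

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

m1, m2 = 7, 6
c1, c2 = encrypt(m1), encrypt(m2)

# An untrusted party can multiply the ciphertexts without ever
# seeing 7 or 6 -- it computes on encrypted data.
c_product = (c1 * c2) % n

# Only the key holder decrypts, and gets the product of the plaintexts.
assert decrypt(c_product) == m1 * m2   # 42
```

This only supports multiplication; the schemes the article is talking about (fully homomorphic encryption) aim to support arbitrary computation on encrypted data, which is much harder and much slower, but the underlying idea is the same.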
As the article notes, this is huge! It provides a way to do research on data that might have been hard to work with in the past due to its sensitive nature. Things like student or medical data could open up to greater research opportunities. Beyond what the article mentions, I’m also wondering whether this could help certain projects move forward despite regulations like GDPR, which restrict how data can be shared. And before anyone worries about trusting sensitive data to encryption algorithms: yes, mistakes in an algorithm can be exploited, but most of us already trust sensitive data to such algorithms daily, despite past failures, so I’m not sure that concern is enough of a reason to hold back.
Ultimately, what appears to be a massive scandal on Google’s part might actually be a major win for security and research.