
FLoC is a recent Google proposal that would have your browser share your browsing behavior and interests by default with every site and advertiser with which you interact. Brave opposes FLoC, along with any other feature designed to share information about you and your interests without your fully informed consent. To protect Brave users, Brave has removed FLoC in the Nightly version of both Brave for desktop and Android. The privacy-affecting aspects of FLoC have never been enabled in Brave releases; the additional implementation details of FLoC will be removed from all Brave releases with this week's stable release. Brave is also disabling FLoC on our websites, to protect Chrome users learning about Brave.

Companies are finally being forced to respect user privacy (even if only minimally), pushed by trends such as increased user education, the success of privacy-first tools (Brave among others), and the growth of legislation including the CCPA and GDPR. In the face of these trends, it is disappointing to see Google, instead of taking the present opportunity to help design and build a user-first, privacy-first Web, proposing and immediately shipping in Chrome a set of smaller, ad-tech-conserving changes that explicitly prioritize maintaining the structure of the Web advertising ecosystem as Google sees it. For the Web to be trusted and to flourish, we hold that much more is needed than the complex yet conservative chair-shuffling embodied by FLoC and Privacy Sandbox. Deeper changes to how creators pay their bills via ads are not only possible, but necessary. The success of Brave's privacy-respecting, performance-maintaining, and site-supporting advertising system shows that more radical approaches work. We invite Google to join us in fixing the fundamentals, undoing the harm that ad-tech has caused, and building a Web that serves users first. The rest of this post explains why we believe FLoC is bad for Web users, bad for sites, and a bad direction for the Web in general.

FLoC harms privacy directly and by design: FLoC shares information about your browsing behavior with sites and advertisers that otherwise wouldn't have access to that information. Unambiguously, FLoC tells sites about your browsing history in a new way that browsers categorically do not today.
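To make concrete what this new channel looks like from a site's point of view: as FLoC was trialed in Chrome, any page (and any third-party script a page embeds) could ask the browser for the visitor's interest-cohort label, and a site that wants no part of the scheme, as Brave's own sites now do, could opt out with a response header. The sketch below is only illustrative; it assumes the origin-trial API shape (document.interestCohort()) and the interest-cohort Permissions-Policy directive, and the example values are made up.

// Illustrative TypeScript sketch (API shape as in the Chrome origin trial; not a stable standard).
async function readFlocCohort(): Promise<void> {
  const doc = document as any; // interestCohort() is not part of standard DOM typings
  if (typeof doc.interestCohort !== "function") {
    console.log("FLoC is not available in this browser (Brave, for example, removes it).");
    return;
  }
  try {
    // Resolves to an object along the lines of { id: "14159", version: "chrome.2.1" }:
    // a label derived from the user's recent browsing history, readable by the page
    // and by any third-party script the page chooses to embed.
    const cohort = await doc.interestCohort();
    console.log("FLoC cohort:", cohort.id, cohort.version);
  } catch {
    console.log("Cohort unavailable (disabled, opted out, or blocked).");
  }
}

// One documented way for a site to opt its pages out of cohort computation is to
// send this header on every response:
//   Permissions-Policy: interest-cohort=()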
Google claims that FLoC is privacy-improving, despite intentionally telling sites more about you, for broadly two reasons, each of which conflates unrelated topics.

First, Google says FLoC is privacy-preserving compared to sending third-party cookies. But this is a misleading baseline to compare against. Many browsers don't send third-party cookies at all; Brave never has. Saying a new Chrome feature is privacy-improving only when compared to status-quo Chrome (the most privacy-harming popular browser on the market) is misleading, self-serving, and a further reason for users to run away from Chrome.

Second, Google defends FLoC as not privacy-harming because interest cohorts are designed not to be unique to a user, relying on k-anonymity protections. This reflects a mistaken idea of what privacy is. Many things about a person are (i) not unique, yet (ii) personal and important, and shouldn't be shared without consent. Whether I prefer to wear "men's" or "women's" clothes, whether I live according to my professed religion, whether I believe vaccines are a scam, whether I am a gun owner, a Brony fan, or a million other things: these are all aspects of our lives that we might like to share with some people but not others, on our own terms and under our control.

FLoC adds an enormous amount of fingerprinting surface to the browser, as the whole point of the feature is for sites to be able to distinguish between user interest-group cohorts. This undermines the work Brave is doing to protect users against browser fingerprinting and the statistically inferred cohort tracking enabled by fingerprinting attack surface. Google's proposed solution to the increased fingerprinting risk from FLoC is both untestable and unlikely to work. Google proposes a "privacy budget" approach to prevent FLoC from being used to track users. First, Brave has previously detailed why we do not think a "budget" approach is workable to prevent fingerprinting-based tracking. We stand by those concerns, and have not received any response from Google, despite having raised them over a year ago. Second, Google has yet to specify how their "privacy budget" approach will work; the approach is still in the "feasibility-testing" stage.

Google is aware of some of these concerns, but gives them shallow treatment in their proposal. For example, Google notes that some categories (sexual orientation, medical issues, political party, etc.) will be exempt from FLoC, and that they are looking into other ways of preventing "sensitive" categories from being used in FLoC. Google's approach here is fundamentally wrong. First, Google's approach to determining whether a FLoC cohort is sensitive requires (in most cases) Google to record and collect that sensitive cohort in the first place! A system that determines whether a cohort is "sensitive" by recording how many people are in that sensitive cohort doesn't pass the laugh test.

Second, and more fundamentally, the idea of creating a global list of "sensitive categories" is illogical and immoral. Whether a behavior is "sensitive" varies wildly across people. One's mom may not consider her interest in "women's clothes" a private part of her identity, but one's dad might (or might not; plainly, Google isn't the appropriate party to make that choice). Similarly, an adult happily expecting a child might not find their interest in "baby goods" particularly sensitive, but a scared and nervous teenager might. More broadly, interests that are banal to one person might be sensitive, private, or even dangerous to another. The point isn't that Google's list of "sensitive cohorts" will be missing important items. The point, rather, is that a "privacy-preserving system" that relies on a single, global determination of which behaviors are "privacy-sensitive" fundamentally doesn't protect privacy, or even understand why privacy is important. Visit OUR FORUM for more.