Feature #495

Remove the Sec-Fetch headers

Added by privacybrowser user over 5 years ago. Updated over 5 years ago.

Status: New
Priority: 4.x
Start date: 09/06/2019
Due date:
% Done: 0%
Estimated time:

Description

Grant users the ability to modify or remove any unwanted headers from any or all network requests. Such headers are potential tracking vectors and can conflict with a user-specified user agent.

https://chrome.google.com/webstore/detail/modify-headers-for-google/innpjfdalfhpcoinfnehdnbkglpmogdi
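As a rough sketch of what this could look like with Android's stock WebViewClient (this is not Privacy Browser's code; the class name, header list, and fallback behavior are assumptions made for illustration), the client can re-issue simple GET requests itself and copy across only the headers the user wants to keep:

    import android.webkit.WebResourceRequest
    import android.webkit.WebResourceResponse
    import android.webkit.WebView
    import android.webkit.WebViewClient
    import java.net.HttpURLConnection
    import java.net.URL

    // Hypothetical client that strips a user-chosen set of headers from GET requests.
    class StripHeadersWebViewClient : WebViewClient() {
        private val unwantedHeaders = setOf("sec-fetch-mode", "sec-fetch-user", "sec-fetch-site")

        override fun shouldInterceptRequest(view: WebView, request: WebResourceRequest): WebResourceResponse? {
            // Only handle simple GET requests; returning null lets the WebView load everything else normally.
            if (!request.method.equals("GET", ignoreCase = true)) return null

            return try {
                val connection = URL(request.url.toString()).openConnection() as HttpURLConnection

                // Copy every original header except the ones the user wants removed.
                for ((name, value) in request.requestHeaders) {
                    if (name.lowercase() !in unwantedHeaders) connection.setRequestProperty(name, value)
                }

                val mimeType = connection.contentType?.substringBefore(';')
                val encoding = connection.contentEncoding ?: "utf-8"
                WebResourceResponse(mimeType, encoding, connection.inputStream)
            } catch (exception: Exception) {
                null  // On any failure, fall back to the WebView's default loading path.
            }
        }
    }

This sketch ignores cookies, redirects, and POST bodies, which is part of why doing it properly would need support at the WebView level rather than in the app.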

Actions #1

Updated by Soren Stoutner over 5 years ago

  • Status changed from New to Closed
  • Assignee set to Soren Stoutner

Modifying these headers will require Privacy WebView. As such, it will be part of the 4.x release.
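For context, a minimal sketch of what the stock WebView API offers today (the header value below is only an example): loadUrl() can add extra headers to the main page request, but it provides no way to delete a header the engine generates itself, and the additions are not applied to subresource or XHR requests.

    import android.webkit.WebView

    // Extra headers can be added to the main page load, but engine-generated headers
    // (such as the Sec-Fetch family) cannot be removed through this API, and the
    // additions do not apply to subresource or XHR requests.
    fun loadWithExtraHeaders(webView: WebView, url: String) {
        val extraHeaders = mapOf("DNT" to "1")  // example value only
        webView.loadUrl(url, extraHeaders)
    }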

There are already a number of feature requests for individual headers. For example:

https://redmine.stoutner.com/issues/337
https://redmine.stoutner.com/issues/246
https://redmine.stoutner.com/issues/235

If you see anything missing from the feature requests for the 4.x series, feel free to add it (one request per header type, please).

Actions #2

Updated by privacybrowser user over 5 years ago

Granting users the ability to remove or, if they prefer, customize certain headers, starting with Accept, Accept-Language, and Referer, is valuable. Given that Privacy WebView will be based on WebView/Chromium, you may want to include "Sec-Fetch-Mode", "Sec-Fetch-User", and "Sec-Fetch-Site" (I'll add separate issues if you like), since Chromium has been sending these headers since at least version 76. Being able to remove or customize the Cache-Control and Pragma headers could be equally valuable.

However, the reason I suggested a custom solution that dynamically lets users remove headers of their choosing is that sneaky websites which aren't quite as sophisticated as Google (who usually embed fingerprinting data and required functionality requests in a single encrypted parameter, typically in image requests but sometimes also in XHRs) often embed fingerprinting information as headers in JavaScript XHRs. Being able to strip or edit site-specific headers from these XHRs, so that only what is truly required for functionality is sent, would be valuable. The workflow would be: 1) check which XHRs are being blocked, 2) check their headers, 3) allow specific XHRs and only specific headers. This is how I usually prevent this kind of fingerprint leaking on desktop Chromium/Firefox or Fennec (Fennec is just so broken and slow...).
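A rough sketch of the "inspect first, then decide" part of that workflow, assuming a stock WebViewClient (the class and log tag names are made up for illustration): log every subresource request and its headers so the user can see what an XHR is actually sending before deciding what to allow or strip.

    import android.util.Log
    import android.webkit.WebResourceRequest
    import android.webkit.WebResourceResponse
    import android.webkit.WebView
    import android.webkit.WebViewClient

    class RequestLoggingWebViewClient : WebViewClient() {
        override fun shouldInterceptRequest(view: WebView, request: WebResourceRequest): WebResourceResponse? {
            // Skip main-frame loads so the log only shows subresources (images, scripts, XHRs).
            if (!request.isForMainFrame) {
                Log.d("RequestLog", "${request.method} ${request.url}")
                for ((name, value) in request.requestHeaders) {
                    Log.d("RequestLog", "    $name: $value")
                }
            }
            return null  // Returning null lets every request proceed unchanged; this sketch only observes.
        }
    }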

Actions #3

Updated by Soren Stoutner over 5 years ago

I looked over a bit of the documentation of the Sec-Fetch headers and XMLHttpRequests. I can't think of any way that they can expose information about the user to a server that isn't already being exposed by other means. For example, if you have JavaScript enabled, and you block access to one or more of these technologies, there is nothing they are exposing that can't be exposed with other JavaScript commands. And if JavaScript is disabled, there is nothing that these commands can expose that isn't already being exposed through the existing HTTP commands that are being sent to the server. As such, I can't think of any privacy reason to modify the behavior of these commands.

However, I would be very interested to hear a description of a scenario where these commands represent a privacy vulnerability in and of themselves. One of the things I have learned as I have been working on Privacy Browser is that things that I didn't originally think were privacy problems can be abused to compromise privacy. So, I am always open to the wisdom of other people regarding why certain commands need to be blocked.

Actions #4

Updated by privacybrowser user over 5 years ago

Well, I was discussing two different tracking vectors in one issue, which I know I shouldn't do, but the general theme is the same...

Regarding the Sec-Fetch headers: the issues you cited previously about the Accept and Accept-Language headers mention the ability to customize those headers so that users can avoid Privacy Browser sticking out too much. At least, that was the logic you accepted in the issues you referred to, and the same logic would seem to apply here.

XHRs are a primary way (alongside WebRTC, WebSockets, and sometimes pixels with tracking parameters or other requests) that fingerprints get sent back to servers when cookies, WebSockets, WebRTC, etc. are disabled. Just examine most websites: almost all of them send back JS-generated fingerprints in analytics or other unnecessary XHRs, or they embed the fingerprints in specific headers of legitimate XHRs required for site functionality. As I said, some more sophisticated actors just concatenate legitimate request parameters with fingerprinting IDs, encrypt them, and send them as either headers or query parameters in image or other XHR requests. However, I am trying to point out that the majority fall into the first two categories, not the last one, and there is real tracking that can be prevented if you can strip unwanted headers and only allow specific whitelisted requests through, as I described in other tickets I have created.

Actions #5

Updated by Soren Stoutner over 5 years ago

The Sec-Fetch headers operate differently from the Accept or Accept-Language headers. Accept, Accept-Language, and the other request headers listed in View Source are sent with every HTTP request, beginning with the very first request, to every server. As such, they can be used by any server you connect to for fingerprinting the user. My goal is to eventually disable completely, or create an option to disable, all of the default headers that are sent with every HTTP request, with the exception of Host, and possibly Connection and Upgrade-Insecure-Requests.

The Sec-Fetch headers are not sent on every connection. They are created by JavaScript commands. As you pointed out, they can be used to exfiltrate fingerprinting information generated by JavaScript back to a server. However, they are only one of many possible exfiltration mechanisms. For example, a script could use an HTTP POST command, or an HTTP GET command with the information encoded in the URL.

XMLHttpRequests function similarly. They can be used to exfiltrate fingerprinting information. But there are many other ways that information could be exfiltrated.

If blocking either of these techniques would in any way improve privacy, I would do so. But neither of them is the problem. The problem is JavaScript. If JavaScript is enabled and both of these mechanisms are disabled, nothing improves, because the JavaScript would just use a different method, one that cannot be blocked, to transfer the information. And if JavaScript is disabled, neither of these can do any harm.

The key, as I see it, is to disable any JavaScript command that can be used to generate fingerprinting information. I have written a little more about the extent of the information I think a browser should provide to a server at https://www.stoutner.com/privacy-browser/core-privacy-principles/.
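As a hedged sketch of the "disable or spoof" approach (illustrative only; a real implementation would have to run before any page script and cover far more surfaces than this single property), a small script could be injected that overrides one fingerprintable value:

    import android.graphics.Bitmap
    import android.webkit.WebView
    import android.webkit.WebViewClient

    class SpoofingWebViewClient : WebViewClient() {
        override fun onPageStarted(view: WebView, url: String, favicon: Bitmap?) {
            super.onPageStarted(view, url, favicon)
            // Report a fixed core count instead of the device's real value.
            view.evaluateJavascript(
                "Object.defineProperty(navigator, 'hardwareConcurrency', { get: function () { return 4; } });",
                null
            )
        }
    }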

Actions #6

Updated by privacybrowser user over 5 years ago

The Sec-Fetch headers are sent on all requests as of Chromium 76, and this is the case for the WebView on my devices as well.

Actions #7

Updated by privacybrowser user over 5 years ago

I understand and respect your efforts, but Google and Mozilla (less and less) essentially control web standards, and you can see the increasing complexity and fingerprintability being introduced by new TLS, HTTP, and other standards. As it is, QUIC will likely replace TCP for web browsing at some point, and QUIC itself has yet more privacy issues, with yet more session and unique identifiers, all of which are being swept under the carpet. If you look at how rapidly both Mozilla and Google add new APIs and functionality to their JS engines, you see that the direction right now is toward more JavaScript, not less, and the only option for interacting with the web in the future will be to create implementations of that functionality which avoid privacy issues or spoof information in a way that protects privacy.

We live in a world where apps are gaining more and more power, and some services are disabling their web portals and requiring the use of closed-source apps that have many times more fingerprinting, data collection, and exploitation avenues than are available to JavaScript in a web browser. If browsers don't end up allowing data collection like this, they will likely be abandoned, so the only solution I see is something that spoofs or serves incorrect information, or simply intercepts API requests, as long as core web app functionality is not broken. If the web today worked with a simple curl-based setup with HTML and image rendering, I would use that, but it just doesn't... it isn't 1995 anymore... and even by the late 90s, JavaScript and Flash were a buggy, terribly exploitable mess.

I confess I'm not sure I understand how you can't see that XHRs are a major fingerprinting vector. Sure, they are one of many, but arguing against combating them because they are one of many doesn't make sense to me. Just open an Instagram feed, click on a picture, and check out the XHR request parameters, JSON payloads, and headers in developer tools. Almost all websites today use XHRs in this fashion, and simply blocking all XHRs and whitelisting as appropriate has often been a useful part of preventing fingerprints from leaking for me.

As it is, Chromium (which in some ways has actually become more useful than Firefox, given the mess that is WebExtensions, the maiming of about:config, and Mozilla's love of bloatware) is going to become annoying soon, given the Manifest V3 proposal and rumblings that Google may at some point make it very difficult to block requests at all without significant source changes, in an effort to curb ad blocking it hasn't approved (since Google is creating its own ad blocking "solution"). There are indications that Google may even close-source Chromium and Android, or an increasing portion of them, in the long term, so who knows what will happen.

Sorry for writing all this, and I understand if you lock this ticket now :)

Actions #8

Updated by Soren Stoutner over 5 years ago

  • Subject changed from Modify/Remove HTTP Headers to Remove the Sec-Fetch headers
  • Status changed from Closed to New
  • Priority changed from 3.x to 4.x

You are correct. Google has recently added Sec-Fetch-Mode, Sec-Fetch-User, and Sec-Fetch-Site to their list of standard headers in WebView. As such, I have renamed this feature request and will remove them in the 4.x series once Privacy WebView is implemented.

I normally check which headers are sent every few months, but I hadn't caught this change yet. Those who are interested in seeing which headers a browser sends can visit https://browserleaks.com/ip.

Actions #9

Updated by Soren Stoutner over 5 years ago

Regarding XMLHttpRequests: they don't generate any fingerprinting information. If they did, I would block them. They are just one of many possible TRANSPORT mechanisms for fingerprinting information once it has been generated by something else. If I could block XMLHttpRequests and it would make one iota of difference to fingerprinting, I would do it in a heartbeat. But if I did, within five seconds all the major fingerprinting scripts would add a routine to detect whether XMLHttpRequests were blocked and would just use one of the many other transport mechanisms at their disposal to send the fingerprinting information to the server. So it would end up accomplishing nothing.

The key to this is that I can't block all the transport mechanisms. As I say in the Privacy Policy at https://www.stoutner.com/privacy-browser/permissions/, if I didn't request the android.permission.INTERNET permission, "Privacy Browser would be 'No Browser: Protecting Your Privacy by Staying Completely Off the Internet.'" Similarly, blocking all the transport mechanisms would just keep you entirely off the internet, and blocking only some of them would have no effect.

However, I can block the systems that generate fingerprinting information, which is what I fully intend to do, even if it breaks some websites.

Note that the blocklists already block individual known-bad XMLHttpRequests. And in the future, it will be possible for users to add their own lists; see https://redmine.stoutner.com/issues/180 and https://redmine.stoutner.com/issues/181. However, this is a whack-a-mole approach that will never fully resolve the issue, because all a server has to do to bypass the block is rename the request. The true solution is to disable or spoof the JavaScript and other commands that are used to generate the fingerprint.
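For illustration, a simplified sketch of how a blocklist entry stops an individual request (the substring matching here is a stand-in; the real blocklists use much more elaborate rules): when the URL matches, the client answers locally with an empty response instead of letting the WebView fetch it.

    import android.webkit.WebResourceRequest
    import android.webkit.WebResourceResponse
    import android.webkit.WebView
    import android.webkit.WebViewClient
    import java.io.ByteArrayInputStream

    class BlocklistWebViewClient(private val blockedSubstrings: List<String>) : WebViewClient() {
        override fun shouldInterceptRequest(view: WebView, request: WebResourceRequest): WebResourceResponse? {
            val url = request.url.toString()
            if (blockedSubstrings.any { url.contains(it) }) {
                // Answer the request locally with an empty body so it never reaches the network.
                return WebResourceResponse("text/plain", "utf-8", ByteArrayInputStream(ByteArray(0)))
            }
            return null  // Not on the blocklist: let the WebView handle it normally.
        }
    }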
