eulaurarien

This program generates a list of every hostname that is a DNS redirection to a given set of DNS zones and IP networks.

It is primarily used to generate Geoffrey Frogeye's block list of first-party trackers (learn about first-party trackers by following this link).

If you want to contribute but don't want to create an account on this forge, contact me however you like: https://geoffrey.frogeye.fr

How does this work

This program takes as input:

  • Lists of hostnames to match
  • Lists of DNS zones to match (a domain and its subdomains)
  • Lists of IP addresses / IP networks to match
  • Lists of Autonomous System numbers to match
  • An enormous quantity of DNS records

It outputs every hostname that is a DNS redirection to any item in the provided lists.
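
For instance (the names below are hypothetical), if the zone tracker.example is in the rules and the DNS dataset contains the record metrics.somebrand.example CNAME somebrand.tracker.example, then metrics.somebrand.example ends up in the output. Such chains can be inspected by hand:

# Hypothetical example: show the CNAME chain behind a hostname
dig +short metrics.somebrand.example CNAME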

DNS records can be locally resolved from a list of subdomains using MassDNS.

Those subdomains can be provided as is, come from the Cisco Umbrella Popularity List, come from your browsing history, or come from analyzing the traffic a web browser generates when opening a URL (the program provides utilities to do all that).

Usage

Remember that you can get an already generated and up-to-date list of first-party trackers from https://hostfiles.frogeye.fr.

The following is for people who want to build their own list.

Requirements

Depending on the sources you'll be using to generate the list, you'll need to install some of the following: the Python dependencies listed in requirements.txt, and MassDNS if you plan to resolve DNS records locally.
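
A minimal setup might look like this (a sketch assuming Python 3 with pip, plus git and make to build MassDNS from source):

# Python dependencies for the scripts in this repository
pip3 install -r requirements.txt
# MassDNS, only needed for local DNS resolution
git clone https://github.com/blechschmidt/massdns.git
(cd massdns && make)   # then make the resulting binary available in your PATH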

Create a new database

The so-called database (the blocking.p file) stores all the matching entities (ASNs, IPs, hostnames, zones…) along with every entity leading to them. It exists because the list cannot be generated in one pass: the links of a DNS redirection chain are not necessarily input in order.

You can purge old records from the database by running ./prune.sh. When you remove a source of data, also remove its corresponding file in last_updates so the pruning process stays correct.
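
For example (the timestamp file name below is hypothetical; check last_updates for the actual names):

# After dropping a data source, remove its timestamp file, then prune
rm last_updates/my-old-source.txt
./prune.sh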

Gather external sources

External sources are not stored in this repository. You'll need to fetch them by running ./fetch_resources.sh. Those include:

  • Third-party trackers lists
  • TLD lists (used to test the validity of hostnames)
  • List of public DNS resolvers (for DNS resolving from subdomains)
  • Top 1M subdomains

Import rules into the database

You need to put the lists of rules for matching in the different subfolders:

  • rules: Lists of DNS zones
  • rules_ip: Lists of IP networks (for IP addresses append /32)
  • rules_asn: Lists of Autonomous System numbers (IP ranges will be deduced from them)
  • rules_adblock: Lists of DNS zones, but in the form of AdBlock lists (only the ones concerning domains will be extracted)
  • rules_hosts: Lists of DNS zones, but in the form of hosts lists

See the provided examples for syntax.
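
As a rough sketch of what those formats might look like (illustrative content only; the example files shipped in each folder are authoritative):

# rules: one DNS zone per line
tracker.example

# rules_ip: one entry per line, in CIDR notation
203.0.113.0/24
198.51.100.7/32

# rules_asn: one Autonomous System number per line
AS64496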

In each folder:

  • first-party.ext files are the only ones considered for the first-party variant of the list
  • *.cache.ext files come from external sources, and thus might be deleted / overwritten
  • *.custom.ext files are for sources that you don't want committed

Then, run ./import_rules.sh.

If you removed rules and want to immediately drop every record depending on them, run the following command:

./db.py --prune --prune-before "$(cat "last_updates/rules.txt")" --prune-base

Add subdomains

If you plan to resolve DNS records yourself (as the DNS records datasets are not exhaustive), the top 1M subdomains provided might not be enough.

You can add them in the subdomains folder. It follows the same conventions as the rules folders for *.cache.ext and *.custom.ext files.
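
For instance (hypothetical file name; the format is one hostname per line, as in the browsing-history examples below):

printf 'cdn.example.com\nstats.example.net\n' > subdomains/mine.custom.list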

Add personal sources

Adding your own browsing history will help create a better-suited subdomains list. Here are reference commands for possible sources:

  • Pi-hole: sqlite3 /etc/pihole-FTL.db "select distinct domain from queries" > /path/to/eulaurarien/subdomains/my-pihole.custom.list
  • Firefox: cp ~/.mozilla/firefox/<your_profile>.default/places.sqlite temp; sqlite3 temp "select distinct rev_host from moz_places" | rev | sed 's|^\.||' > /path/to/eulaurarien/subdomains/my-firefox.custom.list; rm temp
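
A similar recipe should work for Chromium-based browsers; the profile path and the URL-to-hostname parsing below are assumptions, and the browser should be closed first so the database isn't locked:

  • Chromium: cp ~/.config/chromium/Default/History temp; sqlite3 temp "select distinct url from urls" | sed -E 's|^[a-z]+://([^:/]+).*|\1|' | sort -u > /path/to/eulaurarien/subdomains/my-chromium.custom.list; rm temp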

Collect subdomains from websites

You can add website URLs in the websites folder. It follows the same conventions as the rules folders for *.cache.ext and *.custom.ext files.
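
For instance (hypothetical file name; assuming one URL per line):

echo 'https://www.example.com/' > websites/mine.custom.list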

Then, run ./collect_subdomains.sh. This is a long step, and might be memory-intensive from time to time.

Note: For first-party tracking, a list of subdomains issued from the websites in the repository is available here: https://hostfiles.frogeye.fr/from_websites.cache.list

Resolve DNS records

Once you've added subdomains, you'll need to resolve them to get their DNS records. The program will use a list of public nameservers to do that, but you can add your own in the nameservers directory.
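
For instance (hypothetical file name; assuming one resolver IP per line, as is usual for MassDNS resolver lists):

printf '9.9.9.9\n149.112.112.112\n' > nameservers/mine.custom.list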

Then, run ./resolve_subdomains.sh. Note that this is a network-intensive process, not in terms of bandwidth, but in terms of packet count.

Note: Some VPS providers might detect this as a DDoS attack and cut off network access. Some Wi-Fi connections can be rendered unusable for other purposes, and some routers might stop working. Since massdns does not yet support rate limiting, my best bet was a Raspberry Pi with a slow Ethernet link (a Raspberry Pi older than the 4).

The DNS records will automatically be imported into the database. If you want to re-import the records without redoing the resolution, just run the last line of the ./resolve_subdomains.sh script.

Export the lists

For the tracking list, use ./export_lists.sh; the output will be in the dist folder (please change the links before distributing them). For other purposes, tinker with the ./export.py program.
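
A typical run might simply be (output file names vary; inspect the dist folder afterwards):

./export_lists.sh
ls dist/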

Explanations

Note that if you created an explanations folder at the root of the project, a file with a timestamp will be created in it. It contains every rule in the database and the reason for its presence (i.e. its dependency). This might be useful to track changes between runs.

Every rule has an associated tag with four components:

  1. A number: the level of the rule (1 if it is a rule present in the rules* folders)
  2. A letter: F if first-party, M if multi-party.
  3. A letter: D if a duplicate (e.g. foo.bar.com if *.bar.com is already a rule), _ if not.
  4. A number: the number of rules relying on this one
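
For example, a tag reading something like 2F_0 would denote a level-2 rule (presumably reached through one redirection from a level-1 rule), first-party, not a duplicate, with no other rule relying on it. (The exact rendering here is illustrative; see a generated explanations file for the real format.)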

Generate the index webpage

This is the one served on https://hostfiles.frogeye.fr. Just run ./generate_index.py.

Everything

Once you've made sure every step runs fine, you can use ./eulaurarien.sh to run every step consecutively.