Geoffrey Frogeye
5023b85d7c
It's just CSV. The DNS records from the datasets are not ordered consistently, so we need to parse them completely. Converting to an IR before sending data to ./feed_dns.py through a pipe seems faster than decoding the JSON in ./feed_dns.py itself. This also reduces the storage of the resolved subdomains by about 15% (compressed).
23 lines
468 B
Bash
Executable file
#!/usr/bin/env bash

# Print a status message in yellow
function log() {
    echo -e "\033[33m$@\033[0m"
}

./fetch_resources.sh
./import_rules.sh

# TODO Fetch 'em

# Decompress each dataset, convert the JSON records to the CSV IR,
# then feed them into the database
log "Reading PTR records…"
pv ptr.json.gz | gunzip | ./json_to_csv.py | ./feed_dns.py
log "Reading A records…"
pv a.json.gz | gunzip | ./json_to_csv.py | ./feed_dns.py
log "Reading CNAME records…"
pv cname.json.gz | gunzip | ./json_to_csv.py | ./feed_dns.py

log "Pruning old data…"
./database.py --prune

./filter_subdomains.sh