Compare commits

...

67 Commits

Author SHA1 Message Date
Adam Warner
a787e29aad Merge pull request #1245 from pi-hole/dev
Dev -> master
2022-11-27 21:46:10 +00:00
Adam Warner
8aa6623844 Merge pull request #1242 from th0r88/patch-1
Added the right public port for pihole config
2022-11-21 07:12:18 +00:00
Jan Ferme
91a174f976 Added the right public port for pihole config
Otherwise it doesn't work.

Signed-off-by: Jan Ferme <10115360+th0r88@users.noreply.github.com>
2022-11-21 01:00:18 +01:00
Adam Warner
41aa699aab Merge pull request #1234 from pi-hole/updatechecktiming
Ensure FTL is running before attempting to run functions/lines that require DNS resolution to be working
2022-11-09 18:03:53 +00:00
Adam Warner
f73c92d03b rename _gravityonboot service/script to _postFTL and move updatecheck/version output to this script
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-11-09 17:51:07 +00:00
Adam Warner
1f69a1ab83 Merge pull request #1232 from davidgatti/patch-1
Working docker compose command.
2022-11-08 12:17:02 +00:00
David Gatti
7f8862d73f Added -d option for detach. 2022-11-05 14:39:27 +01:00
David Gatti
069a88b7d1 Update README.md
Added the working command for Docker 2

Signed-off-by: David Gatti <dawid@gatti.io>
2022-11-05 08:50:04 +01:00
Adam Warner
b9f3aada94 Merge pull request #1225 from pi-hole/dev
Dev -> master before release
2022-10-10 22:02:18 +01:00
Adam Warner
dc4071f9a4 Merge pull request #1219 from juliohurtado/patch-1
Delete duplicate "that"
2022-10-06 19:11:45 +01:00
Julio Hurtado Gómez
1f3ced8d31 Merge branch 'dev' into patch-1 2022-10-06 16:23:47 +02:00
Julio Hurtado Gómez
47fe743548 Delete duplicate "that"
Signed-off-by: Julio Hurtado Gómez <kloner88@gmail.com>
2022-10-04 10:14:30 +02:00
Adam Warner
e4a7a11b88 Merge pull request #1216 from pi-hole/ftl-port-file-removal
Remove references to deprecated `pihole-FTL.port` file
2022-09-24 15:18:33 +01:00
Adam Warner
658f6de774 FTL.port file is deprecated (see https://github.com/pi-hole/FTL/pull/1445)
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-09-24 15:11:27 +01:00
Adam Warner
fc8a679521 Merge pull request #1215 from pi-hole/dependabot/github_actions/dev/actions/stale-6
Bump actions/stale from 5 to 6
2022-09-24 13:10:02 +01:00
dependabot[bot]
4f9b854546 Bump actions/stale from 5 to 6
Bumps [actions/stale](https://github.com/actions/stale) from 5 to 6.
- [Release notes](https://github.com/actions/stale/releases)
- [Changelog](https://github.com/actions/stale/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/stale/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/stale
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-09-24 10:06:36 +00:00
Adam Warner
1eff43b6b7 Merge pull request #1214 from danitorregrosa/dev
S6_ARCH fix for rpi1
2022-09-22 22:19:31 +01:00
danitorregrosa
913f11beb5 S6_ARCH fix for rpi1
Resolves #1201

Signed-off-by: danitorregrosa <danitorregrosa@gmail.com>
2022-09-22 22:55:21 +02:00
Adam Warner
71d77b5fe8 Merge pull request #1212 from pi-hole/refactor-some-more
Change some variables so that they cannot be overridden by users
2022-09-20 09:16:07 +01:00
Adam Warner
f94fb54a18 Some of these should not be user overridable
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-09-19 21:46:46 +01:00
Adam Warner
a9ecd4e7a2 Merge pull request #1210 from lightswitch05/feature/cleanup-tests-and-test-dependencies
Remove old test dependencies, updated ones still being used. Introduce Black formatter
2022-09-18 11:58:12 +01:00
Daniel
4da66313f4 Remove old test dependencies, updated ones still being used. Introduce Black formatter 2022-09-17 04:28:01 +00:00
Adam Warner
2164220c69 Merge pull request #1204 from willman42/dev
init commit of Pi-hole behind Caddy example
2022-09-14 21:39:00 +01:00
William Trelawny
798c0e606f init commit of Pi-hole behind Caddy example
- Single docker compose file to spin up a Caddy and a Pi-hole container
- I put plenty of comments in compose file for user ref. Looks kinda
  messy but I figured the more info the better and they can cull out
what they don't need.
- DHCP server is disabled by default in Pi-hole so I disabled the port
  binding and NET_ADMIN capability by default as well, with brief
instructions and refs included.
- Caddy docker compose example ref: https://hub.docker.com/_/caddy
- Pi-hole Caddy config ref: https://docs.pi-hole.net/guides/webserver/caddy/

Signed-off-by: William Trelawny <william@trelawny.family>
2022-09-05 09:57:54 -04:00
Adam Warner
4ddb2f817d Merge pull request #1203 from pi-hole/master
sync: master to dev
2022-09-05 12:47:15 +01:00
Adam Warner
988c39581e Merge pull request #1202 from pi-hole/dev
Dev
2022-09-05 12:28:05 +01:00
Adam Warner
2f2395e5c0 Merge pull request #1200 from willman42/dev
fixed broken link to TESTING.md
2022-09-05 12:27:46 +01:00
William Trelawny (willman42)
0b9e9a5af6 fixed broken link to TESTING.md
Signed-off-by: William Trelawny (willman42) <william@trelawny.family>
2022-09-03 18:35:43 -04:00
Adam Warner
4a636fb7ba Merge pull request #1198 from labodj/master
Remove the email function
2022-09-03 08:43:57 +01:00
LaboDJ
540ca1e31f Remove the email function
It was removed from pi-hole in 5.12 by this PR https://github.com/pi-hole/pi-hole/pull/4870

Signed-off-by: LaboDJ <2527836+labodj@users.noreply.github.com>
2022-09-03 05:31:11 +02:00
Adam Warner
f044e58b5c Clarifications to experimental uid/gid changer variables
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-09-02 23:58:53 +01:00
Adam Warner
0d5a001916 also run remote update check on container startup
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-09-02 21:37:10 +01:00
Adam Warner
9d17bd9871 Update documentation per comments on #1155
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-09-02 18:14:10 +01:00
Adam Warner
de80425c87 Merge pull request #1168 from freddydumont/patch-1
Add systemd-resolved note for Fedora
2022-09-02 18:09:56 +01:00
Adam Warner
b18d9bd419 Merge pull request #1193 from pi-hole/dnsmasq-listening
Set FTL/DNSMASQ listening behaviour per variable if it is passed
2022-08-29 17:16:32 +01:00
Adam Warner
0bbdd15073 Set FTL/DNSMASQ listening behaviour per variable if it is passed. Fixes #1188
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 17:12:58 +01:00
Adam Warner
a4c931f115 Merge pull request #1192 from pi-hole/s6-wait-maxtime
Prevent S6 timeouts on container start by setting S6_CMD_WAIT_FOR_SERVICES_MAXTIME to 0
2022-08-29 16:52:00 +01:00
Adam Warner
efd587bdd1 Read CAP_STR into an array and output one cap per line in startup
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 16:49:16 +01:00
Adam Warner
27980ed9cf Prevent S6 timeouts on container start by setting S6_CMD_WAIT_FOR_SERVICES_MAXTIME to 0, per https://github.com/just-containers/s6-overlay/issues/452
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 16:33:41 +01:00
Adam Warner
10c33ed871 Add note to readme about installing on Dokku, closes #1190
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 14:32:14 +01:00
Adam Warner
14c67ed729 capabilites ==> capabilities
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 13:22:50 +01:00
Adam Warner
69f64b963e Merge pull request #1191 from pi-hole/s6-tweaks
Tweaks to s6 service startup order to clarify logged output
2022-08-29 13:18:49 +01:00
Adam Warner
97f81bae21 Shuffle some files around, change dependencies to make output clearer, change ::: to [i] to further clarify output.
Allow PH_VERBOSE to set -x on all scripts where it would be useful

Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 13:17:22 +01:00
Adam Warner
2b60df6d2b Move uid/gid changer from legacy to new style.
Move deprecated dependencies file to dependencies.d for all services. Make cron depend on base

Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 12:20:10 +01:00
Adam Warner
473117e8a8 Rename start and gravity scripts to match their service names
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 11:50:19 +01:00
Adam Warner
e6d4c3091f Move all scripts to /usr/local/bin to simplify dockerfile
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 11:48:07 +01:00
Adam Warner
55f4f89a0c remove some no-longer-referenced files
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 11:37:42 +01:00
Adam Warner
faffe6430f Remove the 01-resolver-resolv file - can't track down what it does from looking at commit history - functionality is deprecated
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 11:21:29 +01:00
Adam Warner
48c6192617 reintroduce ability to set verbose output
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-29 10:38:04 +01:00
Adam Warner
9d0162ebcc Merge pull request #1185 from pi-hole/tweak/PIHOLE_DNS_ARR
Test for empty values in the PIHOLE_DNS_ array, skip if it's empty
2022-08-27 08:16:08 +01:00
Adam Warner
939a69b895 Test for empty values in the PIHOLE_DNS_ array, skip if it's empty
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-26 22:46:10 +01:00
Adam Warner
1daeb117cb Merge pull request #1183 from pi-hole/master
sync: master to dev
2022-08-26 22:45:52 +01:00
Adam Warner
9039a73272 Merge pull request #1182 from pi-hole/dev
Dev -> Master
2022-08-26 00:23:17 +01:00
Adam Warner
cbd86caa5b Merge pull request #1170 from pi-hole/dependabot/pip/urllib3-1.26.5
Bump urllib3 from 1.25.9 to 1.26.5
2022-08-25 21:50:26 +01:00
Adam Warner
471e0425c6 Merge pull request #1181 from pi-hole/fix-podman-issues
Fix issues with Podman and S6 overlay
2022-08-25 21:50:09 +01:00
Adam Warner
8619de0031 Revert change that removed the moving of /init to /s6-init. Add comments to make sure it does not get removed again
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-25 19:31:21 +01:00
Adam Warner
a7c7be01c1 Merge pull request #1180 from pi-hole/master
sync: master to dev
2022-08-25 16:13:51 +01:00
Adam Warner
4241e50d4f Merge pull request #1179 from pi-hole/dev
Dev -> Master
2022-08-25 13:41:39 +01:00
Adam Warner
6baa4f9f08 Merge pull request #1178 from pi-hole/master
sync: master to dev
2022-08-25 13:40:09 +01:00
Adam Warner
c89a55c72e Update dependabot.yml
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-24 21:41:56 +01:00
Adam Warner
776bac7b90 [Experimental] Move the capability setting back to bash_functions from the pihole-FTL service
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-24 21:27:38 +01:00
Adam Warner
7ac0e8f0e3 Merge pull request #1173 from pi-hole/update-docs
FTLCONF_REPLY_ADDR4/6 are now deprecated. Use FTLCONF_LOCAL_IPV4/6 instead
2022-08-23 19:02:28 +01:00
Adam Warner
a1f9601c89 Merge branch 'dev' into update-docs 2022-08-22 21:46:12 +01:00
Adam Warner
cb38190b50 FTLCONF_REPLY_ADDR4/6 are now deprecated. Use FTLCONF_LOCAL_IPV4/6 instead
Signed-off-by: Adam Warner <me@adamwarner.co.uk>
2022-08-22 21:42:29 +01:00
Adam Warner
8273649b5e Merge pull request #1172 from pi-hole/master
sync: master to dev
2022-08-22 17:25:26 +01:00
dependabot[bot]
33966f8eb2 Bump urllib3 from 1.25.9 to 1.26.5
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.25.9 to 1.26.5.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.25.9...1.26.5)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-08-21 19:32:22 +00:00
Frédérick Morin
31cd4fbc47 Add systemd-resolved note for Fedora
Fedora 33+ enables [systemd-resolved by default](https://fedoraproject.org/wiki/Changes/systemd-resolved), so the "Installing on Ubuntu" section becomes relevant for Fedora users as well.

This small change makes it possible to search the Readme for "Fedora" and find the solution instantly.

Signed-off-by: Frédérick Morin <freddydumont@users.noreply.github.com>
2022-08-15 16:46:25 -04:00
48 changed files with 728 additions and 1050 deletions

View File

@@ -5,3 +5,8 @@ updates:
directory: "/"
schedule:
interval: "weekly"
day: saturday
time: "10:00"
target-branch: dev
reviewers:
- "pi-hole/docker-maintainers"

View File

@@ -13,7 +13,7 @@ jobs:
issues: write
steps:
- uses: actions/stale@v5
- uses: actions/stale@v6
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
days-before-stale: 30

View File

@@ -6,5 +6,5 @@ Please review the following before opening a pull request (PR) to help your PR g
* To ensure proper testing and quality control, target any code change pull requests against `dev` branch.
* Make sure the tests pass
* Take a look at [TESTING.md](TESTING.md) to see how to run tests locally so you do not have to push all your code to a PR and have GitHub Actions run it.
* Take a look at [TESTING.md](test/TESTING.md) to see how to run tests locally so you do not have to push all your code to a PR and have GitHub Actions run it.
* Your tests will probably run faster locally and you get a faster feedback loop.

View File

@@ -15,7 +15,7 @@
_If you absolutely cannot do this, some users [have reported](https://github.com/pi-hole/docker-pi-hole/issues/1042#issuecomment-1086728157) success in updating `libseccomp2` via backports on debian, or similar via updates on Ubuntu. You can try this workaround at your own risk_ (Note, you may also find that you need the latest `docker.io` (more details [here](https://blog.samcater.com/fix-workaround-rpi4-docker-libseccomp2-docker-20/))
- Some users [have reported issues](https://github.com/pi-hole/docker-pi-hole/issues/963#issuecomment-1095602502) with using the `--privileged` flag on `2022.04` and above. TL;DR, don't use that that mode, and be [explicit with the permitted caps](https://github.com/pi-hole/docker-pi-hole#note-on-capabilities) (if needed) instead
- Some users [have reported issues](https://github.com/pi-hole/docker-pi-hole/issues/963#issuecomment-1095602502) with using the `--privileged` flag on `2022.04` and above. TL;DR, don't use that mode, and be [explicit with the permitted caps](https://github.com/pi-hole/docker-pi-hole#note-on-capabilities) (if needed) instead
- As of `2022.04.01`, setting `CAP_NET_ADMIN` is only required if you are using Pi-hole as your DHCP server. The container will only try to set caps that are explicitly granted (or natively available)
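The container only applies capabilities it has actually been granted, so a hedged illustration of granting the optional ones explicitly at run time, rather than falling back to `--privileged`, might look like this (port mappings are placeholders):

# Grant only the extra capabilities FTL can use; NET_ADMIN is needed solely for the DHCP-server role
docker run -d \
  --name pihole \
  --cap-add=NET_ADMIN \
  --cap-add=SYS_NICE \
  -p 53:53/tcp -p 53:53/udp -p 80:80/tcp \
  pihole/pihole:latest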
@@ -53,8 +53,8 @@ services:
- NET_ADMIN # Required if you are using Pi-hole as your DHCP server, else not needed
restart: unless-stopped
```
2. Run `docker-compose up -d` to build and start pi-hole
3. Use the Pi-hole web UI to change the DNS settings *Interface listening behavior* to "Listen on all interfaces, permit all origins", if using Docker's default `bridge` network setting
2. Run `docker compose -f docker-compose.yml up -d` to build and start pi-hole
3. Use the Pi-hole web UI to change the DNS settings *Interface listening behavior* to "Listen on all interfaces, permit all origins", if using Docker's default `bridge` network setting. (This can also be achieved by setting the environment variable `DNSMASQ_LISTENING` to `all`)
[Here is an equivalent docker run script](https://github.com/pi-hole/docker-pi-hole/blob/master/examples/docker_run.sh).
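Step 3 can also be handled at container creation time through the `DNSMASQ_LISTENING` variable mentioned above; a minimal hedged `docker run` sketch (timezone and volume paths are examples only):

# Sets the listening behaviour via environment variable instead of the web UI
docker run -d \
  --name pihole \
  -p 53:53/tcp -p 53:53/udp -p 80:80/tcp \
  -e TZ='Europe/London' \
  -e DNSMASQ_LISTENING='all' \
  -v "$(pwd)/etc-pihole:/etc/pihole" \
  -v "$(pwd)/etc-dnsmasq.d:/etc/dnsmasq.d" \
  --restart=unless-stopped \
  pihole/pihole:latest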
@@ -99,13 +99,12 @@ There are other environment variables if you want to customize various things in
| -------- | ------- | ----- | ---------- |
| `TZ` | UTC | `<Timezone>` | Set your [timezone](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) to make sure logs rotate at local midnight instead of at UTC midnight.
| `WEBPASSWORD` | random | `<Admin password>` | http://pi.hole/admin password. Run `docker logs pihole \| grep random` to find your random pass.
| `FTLCONF_REPLY_ADDR4` | unset | `<Host's IP>` | Set to your server's LAN IP, used by web block modes and lighttpd bind address.
| `FTLCONF_LOCAL_IPV4` | unset | `<Host's IP>` | Set to your server's LAN IP, used by web block modes and lighttpd bind address.
### Optional Variables
| Variable | Default | Value | Description |
| -------- | ------- | ----- | ---------- |
| `ADMIN_EMAIL` | unset | email address | Set an administrative contact address for the Block Page |
| `PIHOLE_DNS_` | `8.8.8.8;8.8.4.4` | IPs delimited by `;` | Upstream DNS server(s) for Pi-hole to forward queries to, separated by a semicolon <br/> (supports non-standard ports with `#[port number]`) e.g `127.0.0.1#5053;8.8.8.8;8.8.4.4` <br/> (supports [Docker service names and links](https://docs.docker.com/compose/networking/) instead of IPs) e.g `upstream0;upstream1` where `upstream0` and `upstream1` are the service names of or links to docker services <br/> Note: The existence of this environment variable assumes this as the _sole_ management of upstream DNS. Upstream DNS added via the web interface will be overwritten on container restart/recreation |
| `DNSSEC` | `false` | `<"true"\|"false">` | Enable DNSSEC support |
| `DNS_BOGUS_PRIV` | `true` |`<"true"\|"false">`| Never forward reverse lookups for private ranges |
@@ -122,7 +121,7 @@ There are other environment variables if you want to customize various things in
| `PIHOLE_DOMAIN` | `lan` | `<domain>` | Domain name sent by the DHCP server.
| `DHCP_IPv6` | `false` | `<"true"\|"false">` | Enable DHCP server IPv6 support (SLAAC + RA).
| `DHCP_rapid_commit` | `false` | `<"true"\|"false">` | Enable DHCPv4 rapid commit (fast address assignment).
| `VIRTUAL_HOST` | `$FTLCONF_REPLY_ADDR4` | `<Custom Hostname>` | What your web server 'virtual host' is, accessing admin through this Hostname/IP allows you to make changes to the whitelist / blacklists in addition to the default 'http://pi.hole/admin/' address
| `VIRTUAL_HOST` | `$FTLCONF_LOCAL_IPV4` | `<Custom Hostname>` | What your web server 'virtual host' is, accessing admin through this Hostname/IP allows you to make changes to the whitelist / blacklists in addition to the default 'http://pi.hole/admin/' address
| `IPv6` | `true` | `<"true"\|"false">` | For unraid compatibility, strips out all the IPv6 configuration from DNS/Web services when false.
| `TEMPERATUREUNIT` | `c` | `<c\|k\|f>` | Set preferred temperature unit to `c`: Celsius, `k`: Kelvin, or `f` Fahrenheit units.
| `WEBUIBOXEDLAYOUT` | `boxed` | `<boxed\|traditional>` | Use boxed layout (helpful when working on large screens)
@@ -140,16 +139,16 @@ There are other environment variables if you want to customize various things in
| `CORS_HOSTS` | unset | `<FQDNs delimited by ,>` | List of domains/subdomains on which CORS is allowed. Wildcards are not supported. Eg: `CORS_HOSTS: domain.com,home.domain.com,www.domain.com`.
| `CUSTOM_CACHE_SIZE` | `10000` | Number | Set the cache size for dnsmasq. Useful for increasing the default cache size or to set it to 0. Note that when `DNSSEC` is "true", then this setting is ignored.
| `FTL_CMD` | `no-daemon` | `no-daemon -- <dnsmasq option>` | Customize the options with which dnsmasq gets started. e.g. `no-daemon -- --dns-forward-max 300` to increase max. number of concurrent dns queries on high load setups. |
| `FTLCONF_[SETTING]` | unset | As per documentation | Customize pihole-FTL.conf with settings described in the [FTLDNS Configuration page](https://docs.pi-hole.net/ftldns/configfile/). For example, to customize REPLY_ADDR6, ensure you have the `FTLCONF_REPLY_ADDR6` environment variable set.
| `FTLCONF_[SETTING]` | unset | As per documentation | Customize pihole-FTL.conf with settings described in the [FTLDNS Configuration page](https://docs.pi-hole.net/ftldns/configfile/). For example, to customize LOCAL_IPV4, ensure you have the `FTLCONF_LOCAL_IPV4` environment variable set.
### Experimental Variables
| Variable | Default | Value | Description |
| -------- | ------- | ----- | ---------- |
| `DNSMASQ_USER` | unset | `<pihole\|root>` | Allows changing the user that FTLDNS runs as. Default: `pihole`|
| `PIHOLE_UID` | debian system value | Number | Overrides image's default pihole user id to match a host user id |
| `PIHOLE_GID` | debian system value | Number | Overrides image's default pihole group id to match a host group id |
| `WEB_UID` | debian system value | Number | Overrides image's default www-data user id to match a host user id |
| `WEB_GID` | debian system value | Number | Overrides image's default www-data group id to match a host group id |
| `PIHOLE_UID` | `999` | Number | Overrides image's default pihole user id to match a host user id<br/>**IMPORTANT**: id must not already be in use inside the container! |
| `PIHOLE_GID` | `999` | Number | Overrides image's default pihole group id to match a host group id<br/>**IMPORTANT**: id must not already be in use inside the container!|
| `WEB_UID` | `33` | Number | Overrides image's default www-data user id to match a host user id<br/>**IMPORTANT**: id must not already be in use inside the container! (Make sure it is different to `PIHOLE_UID` if you are using that, also)|
| `WEB_GID` | `33` | Number | Overrides image's default www-data group id to match a host group id<br/>**IMPORTANT**: id must not already be in use inside the container! (Make sure it is different to `PIHOLE_GID` if you are using that, also)|
| `WEBLOGS_STDOUT` | 0 | 0&vert;1 | 0 logs to defined files, 1 redirect access and error logs to stdout |
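As a hedged example of how several of the variables above combine in practice (the IP, upstream servers and password are placeholders; any other `FTLCONF_[SETTING]` follows the same `-e FTLCONF_<NAME>=<value>` pattern):

# Illustrative only; substitute your own LAN IP, upstreams and password
docker run -d \
  --name pihole \
  -e TZ='UTC' \
  -e WEBPASSWORD='correct-horse-battery' \
  -e FTLCONF_LOCAL_IPV4='192.168.1.2' \
  -e PIHOLE_DNS_='127.0.0.1#5053;9.9.9.9' \
  -e DNSSEC='false' \
  pihole/pihole:latest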
## Deprecated environment variables:
@@ -165,6 +164,8 @@ While these may still work, they are likely to be removed in a future version. W
| `DNS2` | Secondary upstream DNS provider, default is google DNS, `no` if only one DNS should used | `PIHOLE_DNS_` |
| `ServerIP` | Set to your server's LAN IP, used by web block modes and lighttpd bind address | `FTLCONF_REPLY_ADDR4` |
| `ServerIPv6` | **If you have a v6 network** set to your server's LAN IPv6 to block IPv6 ads fully | `FTLCONF_REPLY_ADDR6` |
| `FTLCONF_REPLY_ADDR4` | Set to your server's LAN IP, used by web block modes and lighttpd bind address | `FTLCONF_LOCAL_IPV4` |
| `FTLCONF_REPLY_ADDR6` | **If you have a v6 network** set to your server's LAN IPv6 to block IPv6 ads fully | `FTLCONF_LOCAL_IPV6` |
To use these env vars in docker run format style them like: `-e DNS1=1.1.1.1`
@@ -194,8 +195,8 @@ Here is a rundown of other arguments for your docker-compose / docker run.
* [Here is an example of running with nginxproxy/nginx-proxy](https://github.com/pi-hole/docker-pi-hole/blob/master/examples/docker-compose-nginx-proxy.yml) (an nginx auto-configuring docker reverse proxy for docker) on my port 80 with Pi-hole on another port. Pi-hole needs to be `DEFAULT_HOST` env in nginxproxy/nginx-proxy and you need to set the matching `VIRTUAL_HOST` for the Pi-hole's container. Please read nginxproxy/nginx-proxy readme for more info if you have trouble.
* Docker's default network mode `bridge` isolates the container from the host's network. This is a more secure setting, but requires setting the Pi-hole DNS option for *Interface listening behavior* to "Listen on all interfaces, permit all origins".
### Installing on Ubuntu
Modern releases of Ubuntu (17.10+) include [`systemd-resolved`](http://manpages.ubuntu.com/manpages/bionic/man8/systemd-resolved.service.8.html) which is configured by default to implement a caching DNS stub resolver. This will prevent pi-hole from listening on port 53.
### Installing on Ubuntu or Fedora
Modern releases of Ubuntu (17.10+) and Fedora (33+) include [`systemd-resolved`](http://manpages.ubuntu.com/manpages/bionic/man8/systemd-resolved.service.8.html) which is configured by default to implement a caching DNS stub resolver. This will prevent pi-hole from listening on port 53.
The stub resolver should be disabled with: `sudo sed -r -i.orig 's/#?DNSStubListener=yes/DNSStubListener=no/g' /etc/systemd/resolved.conf`
This will not change the nameserver settings, which point to the stub resolver thus preventing DNS resolution. Change the `/etc/resolv.conf` symlink to point to `/run/systemd/resolve/resolv.conf`, which is automatically updated to follow the system's [`netplan`](https://netplan.io/):
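A hedged sketch of that sequence, combining the stub-listener edit above with the symlink change and the service restart it typically needs (the exact commands in the full README may differ):

# Disable the stub listener, repoint resolv.conf, then restart systemd-resolved
sudo sed -r -i.orig 's/#?DNSStubListener=yes/DNSStubListener=no/g' /etc/systemd/resolved.conf
sudo ln -sf /run/systemd/resolve/resolv.conf /etc/resolv.conf
sudo systemctl restart systemd-resolved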
@@ -220,6 +221,9 @@ Note that it is also possible to disable `systemd-resolved` entirely. However, t
Users of older Ubuntu releases (circa 17.04) will need to disable dnsmasq.
## Installing on Dokku
@Rikj000 has produced a guide to assist users [installing Pi-hole on Dokku](https://github.com/Rikj000/Pihole-Dokku-Installation)
## Docker tags and versioning
The primary docker tags are explained in the following table. [Click here to see the full list of tags](https://store.docker.com/community/images/pihole/pihole/tags). See [GitHub Release notes](https://github.com/pi-hole/docker-pi-hole/releases) to see the specific version of Pi-hole Core, Web, and FTL included in the release.

examples/Caddyfile (new file, 5 lines)
View File

@@ -0,0 +1,5 @@
pihole-dev.lab {
tls internal
redir / /admin
reverse_proxy pihole:8081
}

View File

@@ -0,0 +1,66 @@
version: "3"
services:
# Caddy example derived from Caddy's own example at https://hub.docker.com/_/caddy
caddy:
container_name: caddy
image: caddy:latest
networks:
- caddy-net # Network exclusively for Caddy-proxied containers
restart: unless-stopped
ports:
- "80:80"
- "443:443"
- "443:443/udp" # QUIC protocol support: https://www.chromium.org/quic/
volumes:
- ./Caddyfile:/etc/caddy/Caddyfile # config file on host in same directory as docker-compose.yml for easy editing.
#- $PWD/site:/srv # Only use if you are serving a website behind caddy
- caddy_data:/data # Use docker volumes here bc no need to access these files from host
- caddy_config:/config # Use docker volumes here bc no need to access these files from host
# More info at https://github.com/pi-hole/docker-pi-hole/ and https://docs.pi-hole.net/
pihole:
depends_on:
- caddy
container_name: pihole
#dns: # Optional. Specify desired upstream DNS servers here.
# - 127.0.0.1
# - 9.9.9.9
# - 149.112.112.112
image: pihole/pihole:latest
networks:
- caddy-net # Need to plug into caddy net to access proxy
ports:
- "8081:80/tcp" # Pi-hole web admin interface, proxied through Caddy (configure port in Caddyfile)
# Following are NOT proxied through Caddy, bound to host net instead:
- "53:53/udp"
- "53:53/tcp"
- "853:853/tcp" # DNS-over-TLS
#- "67:67/udp" # DHCP, if desired. If not bound to host net you need an mDNS proxy service configured somewhere on host net.
# ref: https://docs.pi-hole.net/docker/DHCP/
environment:
TZ: 'America/New_York' # Supported TZ database names: https://en.wikipedia.org/wiki/List_of_tz_database_time_zones#Time_Zone_abbreviations
WEBPASSWORD: 'password' # Only used on first boot, change with pihole cli then comment out here.
volumes:
- './etc-pihole:/etc/pihole'
- './etc-dnsmasq.d:/etc/dnsmasq.d'
- './etc-lighttpd/external.conf:/etc/lighttpd/external.conf' # Recommend leave as bind mount for easier editing.
# ref for why you may need to change this file: https://docs.pi-hole.net/guides/webserver/caddy/#modifying-lighttpd-configuration
#cap_add: # Uncomment if using Pi-hole as DHCP server
# https://github.com/pi-hole/docker-pi-hole#note-on-capabilities
#- NET_ADMIN # ONLY required if you are using Pi-hole as your DHCP server, else remove for better security
restart: unless-stopped
# ref: https://hub.docker.com/_/caddy
networks:
caddy-net:
driver: bridge
name: caddy-net
# ref: https://hub.docker.com/_/caddy
volumes:
caddy_data:
external: true # May need to create volume with 'docker volume create caddy_data'
caddy_config:

View File

@@ -5,7 +5,7 @@
PIHOLE_BASE="${PIHOLE_BASE:-$(pwd)}"
[[ -d "$PIHOLE_BASE" ]] || mkdir -p "$PIHOLE_BASE" || { echo "Couldn't create storage directory: $PIHOLE_BASE"; exit 1; }
# Note: FTLCONF_REPLY_ADDR4 should be replaced with your external ip.
# Note: FTLCONF_LOCAL_IPV4 should be replaced with your external ip.
docker run -d \
--name pihole \
-p 53:53/tcp -p 53:53/udp \
@@ -18,7 +18,7 @@ docker run -d \
--hostname pi.hole \
-e VIRTUAL_HOST="pi.hole" \
-e PROXY_LOCATION="pi.hole" \
-e FTLCONF_REPLY_ADDR4="127.0.0.1" \
-e FTLCONF_LOCAL_IPV4="127.0.0.1" \
pihole/pihole:latest
printf 'Starting up pihole container '

View File

@@ -2,14 +2,9 @@ ARG PIHOLE_BASE
FROM "${PIHOLE_BASE:-ghcr.io/pi-hole/docker-pi-hole-base:bullseye-slim}"
ARG PIHOLE_DOCKER_TAG
ENV PIHOLE_DOCKER_TAG "${PIHOLE_DOCKER_TAG}"
RUN echo "${PIHOLE_DOCKER_TAG}" > /pihole.docker.tag
ENV S6_OVERLAY_VERSION v3.1.1.2
COPY ./scripts/install.sh /usr/local/bin/install.sh
ENV PIHOLE_INSTALL /etc/.pihole/automated\ install/basic-install.sh
ENTRYPOINT [ "/init" ]
ENTRYPOINT [ "/s6-init" ]
COPY s6/debian-root /
COPY s6/service /usr/local/bin/service
@@ -22,9 +17,6 @@ ARG PHP_ENV_CONFIG
ENV PHP_ENV_CONFIG /etc/lighttpd/conf-enabled/15-fastcgi-php.conf
ARG PHP_ERROR_LOG
ENV PHP_ERROR_LOG /var/log/lighttpd/error-pihole.log
COPY ./scripts/start.sh /
COPY ./scripts/bash_functions.sh /
COPY ./scripts/gravityonboot.sh /
# IPv6 disable flag for networks/devices that do not support it
ENV IPv6 True
@@ -33,11 +25,11 @@ EXPOSE 53 53/udp
EXPOSE 67/udp
EXPOSE 80
ENV S6_LOGGING 0
ENV S6_KEEP_ENV 1
ENV S6_BEHAVIOUR_IF_STAGE2_FAILS 2
ENV S6_CMD_WAIT_FOR_SERVICES_MAXTIME 0
ENV FTLCONF_REPLY_ADDR4 0.0.0.0
ENV FTLCONF_LOCAL_IPV4 0.0.0.0
ENV FTL_CMD no-daemon
ENV DNSMASQ_USER pihole

View File

@@ -1 +0,0 @@
/etc/resolv.conf false doesntexist,0:1000 0664 0664

View File

@@ -1,2 +0,0 @@
#!/command/execlineb
background { bash -e /gravityonboot.sh }

View File

@@ -0,0 +1,2 @@
#!/command/execlineb
background { bash -e /usr/local/bin/_postFTL.sh }

View File

@@ -1,2 +1,2 @@
#!/command/execlineb
foreground { bash -e /start.sh }
foreground { bash -e /usr/local/bin/_startup.sh }

View File

@@ -0,0 +1 @@
oneshot

View File

@@ -0,0 +1,2 @@
#!/command/execlineb
foreground { bash -e /usr/local/bin/_uid-gid-changer.sh }

View File

@@ -1,5 +1,3 @@
#!/command/with-contenv bash
s6-echo "Starting crond"
exec -c
fdmove -c 2 1 /usr/sbin/cron -f

View File

@@ -1,6 +1,8 @@
#!/command/with-contenv bash
s6-echo "Starting lighttpd"
if [ "${PH_VERBOSE:-0}" -gt 0 ] ; then
set -x ;
fi
if [[ 1 -eq ${WEBLOGS_STDOUT:-0} ]]; then
#lighthttpd cannot use /dev/stdout https://redmine.lighttpd.net/issues/2731

View File

@@ -1,39 +1,9 @@
#!/command/with-contenv bash
# Testing on Docker 20.10.14 with no caps set shows the following caps available to the container:
# Current: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_net_raw,cap_sys_chroot,cap_mknod,cap_audit_write,cap_setfcap=ep
# FTL can also use CAP_NET_ADMIN and CAP_SYS_NICE. If we try to set them when they haven't been explicitly enabled, FTL will not start. Test for them first:
/sbin/capsh --has-p=cap_chown 2>/dev/null && CAP_STR+=',CAP_CHOWN'
/sbin/capsh --has-p=cap_net_bind_service 2>/dev/null && CAP_STR+=',CAP_NET_BIND_SERVICE'
/sbin/capsh --has-p=cap_net_raw 2>/dev/null && CAP_STR+=',CAP_NET_RAW'
/sbin/capsh --has-p=cap_net_admin 2>/dev/null && CAP_STR+=',CAP_NET_ADMIN' || DHCP_READY='false'
/sbin/capsh --has-p=cap_sys_nice 2>/dev/null && CAP_STR+=',CAP_SYS_NICE'
if [[ ${CAP_STR} ]]; then
# We have the (some of) the above caps available to us - apply them to pihole-FTL
setcap ${CAP_STR:1}+ep "$(which pihole-FTL)" || ret=$?
if [[ $DHCP_READY == false ]] && [[ $DHCP_ACTIVE == true ]]; then
# DHCP is requested but NET_ADMIN is not available.
echo "ERROR: DHCP requested but NET_ADMIN is not available. DHCP will not be started."
echo " Please add cap_net_admin to the container's capabilities or disable DHCP."
DHCP_ACTIVE='false'
change_setting "DHCP_ACTIVE" "false"
fi
if [[ $ret -ne 0 && "${DNSMASQ_USER:-pihole}" != "root" ]]; then
echo "ERROR: Unable to set capabilities for pihole-FTL. Cannot run as non-root."
echo " If you are seeing this error, please set the environment variable 'DNSMASQ_USER' to the value 'root'"
exit 1
fi
else
echo "WARNING: Unable to set capabilities for pihole-FTL."
echo " Please ensure that the container has the required capabilities."
exit 1
if [ "${PH_VERBOSE:-0}" -gt 0 ] ; then
set -x ;
fi
s6-echo "Starting pihole-FTL ($FTL_CMD) as ${DNSMASQ_USER}"
# Remove possible leftovers from previous pihole-FTL processes
rm -f /dev/shm/FTL-* 2> /dev/null
rm /run/pihole/FTL.sock 2> /dev/null
@@ -41,14 +11,13 @@ rm /run/pihole/FTL.sock 2> /dev/null
# install /dev/null files to ensure they exist (create if non-existing, preserve if existing)
mkdir -pm 0755 /run/pihole /var/log/pihole
[[ ! -f /run/pihole-FTL.pid ]] && install /dev/null /run/pihole-FTL.pid
[[ ! -f /run/pihole-FTL.port ]] && install /dev/null /run/pihole-FTL.port
[[ ! -f /var/log/pihole/FTL.log ]] && install /dev/null /var/log/pihole/FTL.log
[[ ! -f /var/log/pihole/pihole.log ]] && install /dev/null /var/log/pihole/pihole.log
[[ ! -f /etc/pihole/dhcp.leases ]] && install /dev/null /etc/pihole/dhcp.leases
# Ensure that permissions are set so that pihole-FTL can edit all necessary files
chown pihole:pihole /run/pihole-FTL.pid /run/pihole-FTL.port /var/log/pihole/FTL.log /var/log/pihole/pihole.log /etc/pihole/dhcp.leases /run/pihole /etc/pihole
chmod 0644 /run/pihole-FTL.pid /run/pihole-FTL.port /var/log/pihole/FTL.log /var/log/pihole/pihole.log /etc/pihole/dhcp.leases
chown pihole:pihole /run/pihole-FTL.pid /var/log/pihole/FTL.log /var/log/pihole/pihole.log /etc/pihole/dhcp.leases /run/pihole /etc/pihole
chmod 0644 /run/pihole-FTL.pid /var/log/pihole/FTL.log /var/log/pihole/pihole.log /etc/pihole/dhcp.leases
# Ensure that permissions are set so that pihole-FTL can edit the files. We ignore errors as the file may not (yet) exist
chmod -f 0644 /etc/pihole/macvendor.db

View File

@@ -1,10 +0,0 @@
#!/usr/bin/with-contenv sh
#
# This script will determine the network IP of the container.
#
# Return format should be a single IP address.
#
# Default to using the value of the $HOSTNAME ENV variable.
getent hosts ${1:-$HOSTNAME} | awk '{print $1}'

View File

@@ -1,12 +0,0 @@
#!/usr/bin/execlineb -S0
if { s6-test $# -eq 2 }
backtick -in FILENAME {
pipeline { s6-echo "${1}" }
tr "a-z" "A-Z"
}
import -u FILENAME
redirfd -w 1 /var/run/s6/container_environment/${FILENAME}
s6-echo -n -- ${2}

View File

@@ -1,4 +1,10 @@
#!/bin/bash
# This script contains function calls and lines that may rely on pihole-FTL to be running, it is run as part of a oneshot service on container startup
if [ "${PH_VERBOSE:-0}" -gt 0 ] ; then
set -x ;
fi
gravityDBfile="/etc/pihole/gravity.db"
config_file="/etc/pihole/pihole-FTL.conf"
# make a point to mention which config file we're checking, as breadcrumb to revisit if/when pihole-FTL.conf is succeeded by TOML
@@ -15,4 +21,13 @@ if [ -z "$SKIPGRAVITYONBOOT" ] || [ ! -f "${gravityDBfile}" ]; then
pihole -g
else
echo " Skipping Gravity Database Update."
fi
fi
# Run update checker to check for newer container, and display version output
echo ""
pihole updatechecker
pihole -v
DOCKER_TAG=$(cat /pihole.docker.tag)
echo " Container tag is: ${DOCKER_TAG}"
echo ""

View File

@@ -1,23 +1,28 @@
#!/bin/bash -e
if [ "${PH_VERBOSE:-0}" -gt 0 ] ; then
set -x ;
fi
# The below functions are all contained in bash_functions.sh
# shellcheck source=/dev/null
. /bash_functions.sh
. /usr/local/bin/bash_functions.sh
# shellcheck source=/dev/null
SKIP_INSTALL=true . "${PIHOLE_INSTALL}"
SKIP_INSTALL=true . /etc/.pihole/automated\ install/basic-install.sh
echo " ::: Starting docker specific checks & setup for docker pihole/pihole"
echo " [i] Starting docker specific checks & setup for docker pihole/pihole"
# TODO:
#if [ ! -f /.piholeFirstBoot ] ; then
# echo " ::: Not first container startup so not running docker's setup, re-create container to run setup again"
# echo " [i] Not first container startup so not running docker's setup, re-create container to run setup again"
#else
# regular_setup_functions
#fi
# Initial checks
# ===========================
fix_capabilities
validate_env || exit 1
ensure_basic_configuration
@@ -38,7 +43,6 @@ setup_lighttpd_bind
# Misc Setup
# ===========================
setup_admin_email
setup_blocklists
# FTL setup
@@ -48,6 +52,7 @@ setup_FTL_upstream_DNS
apply_FTL_Configs_From_Env
setup_FTL_User
setup_FTL_Interface
setup_FTL_ListeningBehaviour
setup_FTL_CacheSize
setup_FTL_query_logging
setup_FTL_server || true
@@ -61,8 +66,9 @@ test_configs
[ -f /.piholeFirstBoot ] && rm /.piholeFirstBoot
echo "::: Docker start setup complete"
echo " [i] Docker start setup complete"
echo ""
pihole -v
echo " Container tag is: ${PIHOLE_DOCKER_TAG}"
echo " [i] pihole-FTL ($FTL_CMD) will be started as ${DNSMASQ_USER}"
echo ""

View File

@@ -1,4 +1,4 @@
#!/command/with-contenv bash
#!/bin/bash
set -e
if [ "${PH_VERBOSE:-0}" -gt 0 ] ; then
@@ -13,7 +13,7 @@ modifyUser()
local currentId=$(id -u ${username})
[[ ${currentId} -eq ${newId} ]] && return
echo "Changing ID for user: ${username} (${currentId} => ${newId})"
echo " [i] Changing ID for user: ${username} (${currentId} => ${newId})"
usermod -o -u ${newId} ${username}
}
@@ -25,7 +25,7 @@ modifyGroup()
local currentId=$(id -g ${groupname})
[[ ${currentId} -eq ${newId} ]] && return
echo "Changing ID for group: ${groupname} (${currentId} => ${newId})"
echo " [i] Changing ID for group: ${groupname} (${currentId} => ${newId})"
groupmod -o -g ${newId} ${groupname}
}

View File

@@ -5,8 +5,12 @@
[ -n "${QUERY_LOGGING}" ] && export QUERY_LOGGING_OVERRIDE="${QUERY_LOGGING}"
# Legacy Env Vars preserved for backwards compatibility - convert them to FTLCONF_ equivalents
[ -n "${ServerIP}" ] && echo "ServerIP is deprecated. Converting to FTLCONF_REPLY_ADDR4" && export "FTLCONF_REPLY_ADDR4"="$ServerIP"
[ -n "${ServerIPv6}" ] && echo "ServerIPv6 is deprecated. Converting to FTLCONF_REPLY_ADDR6" && export "FTLCONF_REPLY_ADDR6"="$ServerIPv6"
[ -n "${ServerIP}" ] && echo "ServerIP is deprecated. Converting to FTLCONF_LOCAL_IPV4" && export "FTLCONF_LOCAL_IPV4"="$ServerIP"
[ -n "${ServerIPv6}" ] && echo "ServerIPv6 is deprecated. Converting to FTLCONF_LOCAL_IPV6" && export "FTLCONF_LOCAL_IPV6"="$ServerIPv6"
# Previously used FTLCONF_ equivalent has since been deprecated, also convert this one
[ -n "${FTLCONF_REPLY_ADDR4}" ] && echo "FTLCONF_REPLY_ADDR4 is deprecated. Converting to FTLCONF_LOCAL_IPV4" && export "FTLCONF_LOCAL_IPV4"="$FTLCONF_REPLY_ADDR4"
[ -n "${FTLCONF_REPLY_ADDR6}" ] && echo "FTLCONF_REPLY_ADDR6 is deprecated. Converting to FTLCONF_LOCAL_IPV6" && export "FTLCONF_LOCAL_IPV6"="$FTLCONF_REPLY_ADDR6"
# Some of the bash_functions use utilities from Pi-hole's utils.sh
# shellcheck disable=SC2154
@@ -26,9 +30,51 @@ changeFTLsetting() {
addOrEditKeyValPair "${FTLconf}" "${1}" "${2}"
}
fix_capabilities() {
# Testing on Docker 20.10.14 with no caps set shows the following caps available to the container:
# Current: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_net_raw,cap_sys_chroot,cap_mknod,cap_audit_write,cap_setfcap=ep
# FTL can also use CAP_NET_ADMIN and CAP_SYS_NICE. If we try to set them when they haven't been explicitly enabled, FTL will not start. Test for them first:
echo " [i] Setting capabilities on pihole-FTL where possible"
/sbin/capsh --has-p=cap_chown 2>/dev/null && CAP_STR+=',CAP_CHOWN'
/sbin/capsh --has-p=cap_net_bind_service 2>/dev/null && CAP_STR+=',CAP_NET_BIND_SERVICE'
/sbin/capsh --has-p=cap_net_raw 2>/dev/null && CAP_STR+=',CAP_NET_RAW'
/sbin/capsh --has-p=cap_net_admin 2>/dev/null && CAP_STR+=',CAP_NET_ADMIN' || DHCP_READY='false'
/sbin/capsh --has-p=cap_sys_nice 2>/dev/null && CAP_STR+=',CAP_SYS_NICE'
if [[ ${CAP_STR} ]]; then
# We have the (some of) the above caps available to us - apply them to pihole-FTL
echo " [i] Applying the following caps to pihole-FTL:"
IFS=',' read -ra CAPS <<< "${CAP_STR:1}"
for i in "${CAPS[@]}"; do
echo " * ${i}"
done
setcap ${CAP_STR:1}+ep "$(which pihole-FTL)" || ret=$?
if [[ $DHCP_READY == false ]] && [[ $DHCP_ACTIVE == true ]]; then
# DHCP is requested but NET_ADMIN is not available.
echo "ERROR: DHCP requested but NET_ADMIN is not available. DHCP will not be started."
echo " Please add cap_net_admin to the container's capabilities or disable DHCP."
DHCP_ACTIVE='false'
change_setting "DHCP_ACTIVE" "false"
fi
if [[ $ret -ne 0 && "${DNSMASQ_USER:-pihole}" != "root" ]]; then
echo " [!] ERROR: Unable to set capabilities for pihole-FTL. Cannot run as non-root."
echo " If you are seeing this error, please set the environment variable 'DNSMASQ_USER' to the value 'root'"
exit 1
fi
else
echo " [!] WARNING: Unable to set capabilities for pihole-FTL."
echo " Please ensure that the container has the required capabilities."
exit 1
fi
}
# shellcheck disable=SC2034
ensure_basic_configuration() {
echo " [i] Ensuring basic configuration by re-running select functions from basic-install.sh"
# Set Debian webserver variables for installConfigs
LIGHTTPD_USER="www-data"
LIGHTTPD_GROUP="www-data"
@@ -38,7 +84,7 @@ ensure_basic_configuration() {
if [ ! -f "${setupVars}" ]; then
install -m 644 /dev/null "${setupVars}"
echo "Creating empty ${setupVars} file."
echo " [i] Creating empty ${setupVars} file."
# The following setting needs to exist else the web interface version won't show in pihole -v
change_setting "INSTALL_WEB_INTERFACE" "true"
fi
@@ -54,8 +100,6 @@ ensure_basic_configuration() {
chown pihole:root "${PI_HOLE_BIN_DIR}/pihole"
set -e
# Update version numbers
pihole updatechecker
# Re-write all of the setupVars to ensure required ones are present (like QUERY_LOGGING)
# If the setup variable file exists,
@@ -78,24 +122,24 @@ ensure_basic_configuration() {
}
validate_env() {
# Optional FTLCONF_REPLY_ADDR4 is a valid IP
# Optional FTLCONF_LOCAL_IPV4 is a valid IP
# nc won't throw any text based errors when it times out connecting to a valid IP, otherwise it complains about the DNS name being garbage
# if nc doesn't behave as we expect on a valid IP the routing table should be able to look it up and return a 0 retcode
if [[ "$(nc -4 -w1 -z "$FTLCONF_REPLY_ADDR4" 53 2>&1)" != "" ]] && ! ip route get "$FTLCONF_REPLY_ADDR4" > /dev/null ; then
echo "ERROR: FTLCONF_REPLY_ADDR4 Environment variable ($FTLCONF_REPLY_ADDR4) doesn't appear to be a valid IPv4 address"
if [[ "$(nc -4 -w1 -z "$FTLCONF_LOCAL_IPV4" 53 2>&1)" != "" ]] && ! ip route get "$FTLCONF_LOCAL_IPV4" > /dev/null ; then
echo "ERROR: FTLCONF_LOCAL_IPV4 Environment variable ($FTLCONF_LOCAL_IPV4) doesn't appear to be a valid IPv4 address"
exit 1
fi
# Optional IPv6 is a valid address
if [[ -n "$FTLCONF_REPLY_ADDR6" ]] ; then
if [[ "$FTLCONF_REPLY_ADDR6" == 'kernel' ]] ; then
echo "ERROR: You passed in IPv6 with a value of 'kernel', this maybe because you do not have IPv6 enabled on your network"
unset FTLCONF_REPLY_ADDR6
if [[ -n "$FTLCONF_LOCAL_IPV6" ]] ; then
if [[ "$FTLCONF_LOCAL_IPV6" == 'kernel' ]] ; then
echo " [!] ERROR: You passed in IPv6 with a value of 'kernel', this maybe because you do not have IPv6 enabled on your network"
unset FTLCONF_LOCAL_IPV6
exit 1
fi
if [[ "$(nc -6 -w1 -z "$FTLCONF_REPLY_ADDR6" 53 2>&1)" != "" ]] && ! ip route get "$FTLCONF_REPLY_ADDR6" > /dev/null ; then
echo "ERROR: FTLCONF_REPLY_ADDR6 Environment variable ($FTLCONF_REPLY_ADDR6) doesn't appear to be a valid IPv6 address"
echo " TIP: If your server is not IPv6 enabled just remove '-e FTLCONF_REPLY_ADDR6' from your docker container"
if [[ "$(nc -6 -w1 -z "$FTLCONF_LOCAL_IPV6" 53 2>&1)" != "" ]] && ! ip route get "$FTLCONF_LOCAL_IPV6" > /dev/null ; then
echo " [!] ERROR: FTLCONF_LOCAL_IPV6 Environment variable ($FTLCONF_LOCAL_IPV6) doesn't appear to be a valid IPv6 address"
echo " TIP: If your server is not IPv6 enabled just remove '-e FTLCONF_LOCAL_IPV6' from your docker container"
exit 1
fi
fi;
@@ -121,12 +165,18 @@ setup_FTL_Interface(){
if [ "$interface" != 'eth0' ] ; then
interfaceType='custom'
fi;
echo "FTL binding to $interfaceType interface: $interface"
echo " [i] FTL binding to $interfaceType interface: $interface"
change_setting "PIHOLE_INTERFACE" "${interface}"
}
setup_FTL_ListeningBehaviour(){
if [ -n "$DNSMASQ_LISTENING" ]; then
change_setting "DNSMASQ_LISTENING" "${DNSMASQ_LISTENING}"
fi;
}
setup_FTL_CacheSize() {
local warning="WARNING: CUSTOM_CACHE_SIZE not used"
local warning=" [i] WARNING: CUSTOM_CACHE_SIZE not used"
local dnsmasq_pihole_01_location="/etc/dnsmasq.d/01-pihole.conf"
# Quietly exit early for empty or default
if [[ -z "${CUSTOM_CACHE_SIZE}" || "${CUSTOM_CACHE_SIZE}" == '10000' ]] ; then return ; fi
@@ -146,7 +196,7 @@ setup_FTL_CacheSize() {
echo "$warning - $custom_cache_size is not a positive integer or zero"
return
fi
echo "Custom CUSTOM_CACHE_SIZE set to $custom_cache_size"
echo " [i] Custom CUSTOM_CACHE_SIZE set to $custom_cache_size"
change_setting "CACHE_SIZE" "$custom_cache_size"
sed -i "s/^cache-size=\s*[0-9]*/cache-size=$custom_cache_size/" ${dnsmasq_pihole_01_location}
@@ -158,14 +208,14 @@ apply_FTL_Configs_From_Env(){
# setting defined here: https://docs.pi-hole.net/ftldns/configfile/
declare -px | grep FTLCONF_ | sed -E 's/declare -x FTLCONF_([^=]+)=\"(.+)\"/\1 \2/' | while read -r name value
do
echo "Applying pihole-FTL.conf setting $name=$value"
echo " [i] Applying pihole-FTL.conf setting $name=$value"
changeFTLsetting "$name" "$value"
done
}
setup_FTL_dhcp() {
if [ -z "${DHCP_START}" ] || [ -z "${DHCP_END}" ] || [ -z "${DHCP_ROUTER}" ]; then
echo "ERROR: Won't enable DHCP server because mandatory Environment variables are missing: DHCP_START, DHCP_END and/or DHCP_ROUTER"
echo " [!] ERROR: Won't enable DHCP server because mandatory Environment variables are missing: DHCP_START, DHCP_END and/or DHCP_ROUTER"
change_setting "DHCP_ACTIVE" "false"
else
change_setting "DHCP_ACTIVE" "${DHCP_ACTIVE}"
@@ -181,14 +231,14 @@ setup_FTL_dhcp() {
setup_FTL_query_logging(){
if [ "${QUERY_LOGGING_OVERRIDE}" == "false" ]; then
echo "::: Disabling Query Logging"
echo " [i] Disabling Query Logging"
change_setting "QUERY_LOGGING" "$QUERY_LOGGING_OVERRIDE"
removeKey "${dnsmasqconfig}" log-queries
else
# If it is anything other than false, set it to true
change_setting "QUERY_LOGGING" "true"
# Set pihole logging on for good measure
echo "::: Enabling Query Logging"
echo " [i] Enabling Query Logging"
addKey "${dnsmasqconfig}" log-queries
fi
@@ -215,51 +265,53 @@ setup_FTL_upstream_DNS(){
# For backward compatibility, if DNS1 and/or DNS2 are set, but PIHOLE_DNS_ is not, convert them to
# a semi-colon delimited string and store in PIHOLE_DNS_
# They are not used anywhere if PIHOLE_DNS_ is set already
[ -n "${DNS1}" ] && echo "Converting DNS1 to PIHOLE_DNS_" && PIHOLE_DNS_="$DNS1"
[[ -n "${DNS2}" && "${DNS2}" != "no" ]] && echo "Converting DNS2 to PIHOLE_DNS_" && PIHOLE_DNS_="$PIHOLE_DNS_;$DNS2"
[ -n "${DNS1}" ] && echo " [i] Converting DNS1 to PIHOLE_DNS_" && PIHOLE_DNS_="$DNS1"
[[ -n "${DNS2}" && "${DNS2}" != "no" ]] && echo " [i] Converting DNS2 to PIHOLE_DNS_" && PIHOLE_DNS_="$PIHOLE_DNS_;$DNS2"
fi
# Parse the PIHOLE_DNS variable, if it exists, and apply upstream servers to Pi-hole config
if [ -n "${PIHOLE_DNS_}" ]; then
echo "Setting DNS servers based on PIHOLE_DNS_ variable"
echo " [i] Setting DNS servers based on PIHOLE_DNS_ variable"
# Remove any PIHOLE_DNS_ entries from setupVars.conf, if they exist
sed -i '/PIHOLE_DNS_/d' /etc/pihole/setupVars.conf
# Split into an array (delimited by ;)
# Loop through and add them one by one to setupVars.conf
IFS=";" read -r -a PIHOLE_DNS_ARR <<< "${PIHOLE_DNS_}"
# PIHOLE_DNS_ARR=(${PIHOLE_DNS_//;/ })
count=1
valid_entries=0
for i in "${PIHOLE_DNS_ARR[@]}"; do
if valid_ip "$i" || valid_ip6 "$i" ; then
change_setting "PIHOLE_DNS_$count" "$i"
((count=count+1))
((valid_entries=valid_entries+1))
continue
fi
# shellcheck disable=SC2086
if [ -n "$(dig +short ${i//#*/})" ]; then
# If the "address" is a domain (for example a docker link) then try to resolve it and add
# the result as a DNS server in setupVars.conf.
resolved_ip="$(dig +short ${i//#*/} | head -n 1)"
if [ -n "${i//*#/}" ] && [ "${i//*#/}" != "${i//#*/}" ]; then
resolved_ip="${resolved_ip}#${i//*#/}"
fi
echo "Resolved ${i} from PIHOLE_DNS_ as: ${resolved_ip}"
if valid_ip "$resolved_ip" || valid_ip6 "$resolved_ip" ; then
change_setting "PIHOLE_DNS_$count" "$resolved_ip"
# Ensure we don't have an empty value first (see https://github.com/pi-hole/docker-pi-hole/issues/1174#issuecomment-1228763422 )
if [ -n "$i" ]; then
if valid_ip "$i" || valid_ip6 "$i" ; then
change_setting "PIHOLE_DNS_$count" "$i"
((count=count+1))
((valid_entries=valid_entries+1))
continue
fi
# shellcheck disable=SC2086
if [ -n "$(dig +short ${i//#*/})" ]; then
# If the "address" is a domain (for example a docker link) then try to resolve it and add
# the result as a DNS server in setupVars.conf.
resolved_ip="$(dig +short ${i//#*/} | head -n 1)"
if [ -n "${i//*#/}" ] && [ "${i//*#/}" != "${i//#*/}" ]; then
resolved_ip="${resolved_ip}#${i//*#/}"
fi
echo "Resolved ${i} from PIHOLE_DNS_ as: ${resolved_ip}"
if valid_ip "$resolved_ip" || valid_ip6 "$resolved_ip" ; then
change_setting "PIHOLE_DNS_$count" "$resolved_ip"
((count=count+1))
((valid_entries=valid_entries+1))
continue
fi
fi
# If the above tests fail then this is an invalid DNS server
echo " [!] Invalid entry detected in PIHOLE_DNS_: ${i}"
fi
fi
# If the above tests fail then this is an invalid DNS server
echo "Invalid entry detected in PIHOLE_DNS_: ${i}"
done
if [ $valid_entries -eq 0 ]; then
echo "No Valid entries detected in PIHOLE_DNS_. Aborting"
exit 1
echo " [!] No Valid entries detected in PIHOLE_DNS_. Aborting"
exit 1
fi
else
# Environment variable has not been set, but there may be existing values in an existing setupVars.conf
@@ -268,11 +320,11 @@ setup_FTL_upstream_DNS(){
setupVarsDNS="$(grep 'PIHOLE_DNS_' /etc/pihole/setupVars.conf || true)"
if [ -z "${setupVarsDNS}" ]; then
echo "Configuring default DNS servers: 8.8.8.8, 8.8.4.4"
echo " [i] Configuring default DNS servers: 8.8.8.8, 8.8.4.4"
change_setting "PIHOLE_DNS_1" "8.8.8.8"
change_setting "PIHOLE_DNS_2" "8.8.4.4"
else
echo "Existing DNS servers detected in setupVars.conf. Leaving them alone"
echo " [i] Existing DNS servers detected in setupVars.conf. Leaving them alone"
fi
fi
}
@@ -286,8 +338,8 @@ setup_FTL_ProcessDNSSettings(){
}
setup_lighttpd_bind() {
local serverip="${FTLCONF_REPLY_ADDR4}"
# if using '--net=host' only bind lighttpd on $FTLCONF_REPLY_ADDR6 and localhost
local serverip="${FTLCONF_LOCAL_IPV4}"
# if using '--net=host' only bind lighttpd on $FTLCONF_LOCAL_IPV4 and localhost
if grep -q "docker" /proc/net/dev && [[ $serverip != 0.0.0.0 ]]; then #docker (docker0 by default) should only be present on the host system
if ! grep -q "server.bind" /etc/lighttpd/lighttpd.conf ; then # if the declaration is already there, don't add it again
sed -i -E "s/server\.port\s+\=\s+([0-9]+)/server.bind\t\t = \"${serverip}\"\nserver.port\t\t = \1\n"\$SERVER"\[\"socket\"\] == \"127\.0\.0\.1:\1\" \{\}/" /etc/lighttpd/lighttpd.conf
@@ -297,7 +349,7 @@ setup_lighttpd_bind() {
setup_web_php_env() {
if [ -z "$VIRTUAL_HOST" ] ; then
VIRTUAL_HOST="$FTLCONF_REPLY_ADDR4"
VIRTUAL_HOST="$FTLCONF_LOCAL_IPV4"
fi;
for config_var in "VIRTUAL_HOST" "CORS_HOSTS" "PHP_ERROR_LOG" "PIHOLE_DOCKER_TAG" "TZ"; do
@@ -311,12 +363,12 @@ setup_web_php_env() {
fi
done
echo "Added ENV to php:"
echo " [i] Added ENV to php:"
grep -E '(VIRTUAL_HOST|CORS_HOSTS|PHP_ERROR_LOG|PIHOLE_DOCKER_TAG|TZ)' "$PHP_ENV_CONFIG"
}
setup_web_port() {
local warning="WARNING: Custom WEB_PORT not used"
local warning=" [!] WARNING: Custom WEB_PORT not used"
# Quietly exit early for empty or default
if [[ -z "${WEB_PORT}" || "${WEB_PORT}" == '80' ]] ; then return ; fi
@@ -330,8 +382,8 @@ setup_web_port() {
echo "$warning - $web_port is not within valid port range of 1-65535"
return
fi
echo "Custom WEB_PORT set to $web_port"
echo "INFO: Without proper router DNAT forwarding to $FTLCONF_REPLY_ADDR4:$web_port, you may not get any blocked websites on ads"
echo " [i] Custom WEB_PORT set to $web_port"
echo " [i] Without proper router DNAT forwarding to $FTLCONF_LOCAL_IPV4:$web_port, you may not get any blocked websites on ads"
# Update lighttpd's port
sed -i '/server.port\s*=\s*80\s*$/ s/80/'"${WEB_PORT}"'/g' /etc/lighttpd/lighttpd.conf
@@ -344,11 +396,11 @@ setup_web_theme(){
if [ -n "${WEBTHEME}" ]; then
case "${WEBTHEME}" in
"default-dark" | "default-darker" | "default-light" | "default-auto" | "lcars")
echo "Setting Web Theme based on WEBTHEME variable, using value ${WEBTHEME}"
echo " [i] Setting Web Theme based on WEBTHEME variable, using value ${WEBTHEME}"
change_setting "WEBTHEME" "${WEBTHEME}"
;;
*)
echo "Invalid theme name supplied: ${WEBTHEME}, falling back to default-light."
echo " [!] Invalid theme name supplied: ${WEBTHEME}, falling back to default-light."
change_setting "WEBTHEME" "default-light"
;;
esac
@@ -371,10 +423,10 @@ setup_web_password() {
setup_var_exists "WEBPASSWORD" && return
# Generate new random password
WEBPASSWORD=$(tr -dc _A-Z-a-z-0-9 < /dev/urandom | head -c 8)
echo "Assigning random password: $WEBPASSWORD"
echo " [i] Assigning random password: $WEBPASSWORD"
else
# ENV WEBPASSWORD_OVERRIDE is set and will be used
echo "::: Assigning password defined by Environment Variable"
echo " [i] Assigning password defined by Environment Variable"
# WEBPASSWORD="$WEBPASSWORD"
fi
@@ -400,15 +452,15 @@ setup_ipv4_ipv6() {
ip_versions="IPv4"
sed -i '/use-ipv6.pl/ d' /etc/lighttpd/lighttpd.conf
fi;
echo "Using $ip_versions"
echo " [i] Using $ip_versions"
}
test_configs() {
set -e
echo -n '::: Testing lighttpd config: '
echo -n ' [i] Testing lighttpd config: '
lighttpd -t -f /etc/lighttpd/lighttpd.conf || exit 1
set +e
echo "::: All config checks passed, cleared for startup ..."
echo " [i] All config checks passed, cleared for startup ..."
}
setup_blocklists() {
@@ -417,22 +469,21 @@ setup_blocklists() {
exit_string="(exiting ${FUNCNAME[0]} early)"
if [ -n "${skip_setup_blocklists}" ]; then
echo "::: skip_setup_blocklists requested ($exit_string)"
echo " [i] skip_setup_blocklists requested $exit_string"
return
fi
# 2. The adlist file exists already (restarted container or volume mounted list)
if [ -f "${adlistFile}" ]; then
echo "::: Preexisting ad list ${adlistFile} detected ($exit_string)"
cat "${adlistFile}"
echo " [i] Preexisting ad list ${adlistFile} detected $exit_string"
return
fi
echo "::: ${FUNCNAME[0]} now setting default blocklists up: "
echo "::: TIP: Use a docker volume for ${adlistFile} if you want to customize for first boot"
echo " [i] ${FUNCNAME[0]} now setting default blocklists up: "
echo " [i] TIP: Use a docker volume for ${adlistFile} if you want to customize for first boot"
installDefaultBlocklists
echo "::: Blocklists (${adlistFile}) now set to:"
echo " [i] Blocklists (${adlistFile}) now set to:"
cat "${adlistFile}"
}
@@ -442,7 +493,7 @@ setup_var_exists() {
local REQUIRED_VALUE="[^\n]+"
fi
if grep -Pq "^${KEY}=${REQUIRED_VALUE}" "$setupVars"; then
echo "::: Pre existing ${KEY} found"
echo " [i] Pre existing ${KEY} found"
true
else
false
@@ -470,11 +521,3 @@ setup_web_layout() {
fi
fi
}
setup_admin_email() {
local EMAIL="${ADMIN_EMAIL}"
# check if var is empty
if [[ "$EMAIL" != "" ]] ; then
pihole -a -e "$EMAIL"
fi
}

View File

@@ -16,7 +16,7 @@ detect_arch() {
amd64)
S6_ARCH="x86_64";;
armel)
S6_ARCH="arm";;
S6_ARCH="armhf";;
armhf)
S6_ARCH="armhf";;
arm64)
@@ -27,8 +27,9 @@ esac
}
DOCKER_TAG=$(cat /pihole.docker.tag)
# Helps to have some additional tools in the dev image when debugging
if [[ "${PIHOLE_DOCKER_TAG}" = 'nightly' || "${PIHOLE_DOCKER_TAG}" = 'dev' ]]; then
if [[ "${DOCKER_TAG}" = 'nightly' || "${DOCKER_TAG}" = 'dev' ]]; then
apt-get update
apt-get install --no-install-recommends -y nano less
rm -rf /var/lib/apt/lists/*
@@ -36,9 +37,17 @@ fi
detect_arch
S6_OVERLAY_VERSION=v3.1.1.2
curl -L -s "https://github.com/just-containers/s6-overlay/releases/download/${S6_OVERLAY_VERSION}/s6-overlay-noarch.tar.xz" | tar Jxpf - -C /
curl -L -s "https://github.com/just-containers/s6-overlay/releases/download/${S6_OVERLAY_VERSION}/s6-overlay-${S6_ARCH}.tar.xz" | tar Jxpf - -C /
# IMPORTANT: #########################################################################
# Move /init somewhere else to prevent issues with podman/RHEL #
# See: https://github.com/pi-hole/docker-pi-hole/issues/1176#issuecomment-1227587045 #
mv /init /s6-init #
######################################################################################
# Preseed variables to assist with using --unattended install
{
echo "PIHOLE_INTERFACE=eth0"
@@ -60,7 +69,7 @@ export PIHOLE_SKIP_OS_CHECK=true
curl -sSL https://install.pi-hole.net | bash -sex -- --unattended
# At this stage, if we are building a :nightly tag, then switch the Pi-hole install to dev versions
if [[ "${PIHOLE_DOCKER_TAG}" = 'nightly' ]]; then
if [[ "${DOCKER_TAG}" = 'nightly' ]]; then
yes | pihole checkout dev
fi
@@ -86,4 +95,4 @@ ln -s /macvendor.db /etc/pihole/macvendor.db
if [ ! -f /.piholeFirstBoot ]; then
touch /.piholeFirstBoot
fi
echo 'Docker install successful'
echo 'Docker install successful'

View File

@@ -1,15 +0,0 @@
#!/bin/bash
# A shim to make busybox timeout take in debian style args
# v1 only need support for this style: `timeout 1 getent hosts github.com`
# Busybox args:
# Usage: timeout [-t SECS] [-s SIG] PROG ARGS
# Debian args:
# Usage: timeout [OPTION] DURATION COMMAND [ARG]...
# or: timeout [OPTION]
TIMEOUT=/usr/bin/timeout
SECS="${1}"
ARGS="${@:2}"
$TIMEOUT -t $SECS $ARGS
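The removed shim did nothing more than reorder arguments, so a Debian-style call was translated into the BusyBox form like this (values taken from the comment above):
# Debian-style invocation used by the scripts
timeout 1 getent hosts github.com
# ...which the shim rewrote into the BusyBox form
/usr/bin/timeout -t 1 getent hosts github.com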

View File

@@ -1,4 +1,4 @@
FROM python:3.8-slim-bullseye
FROM python:3.10-slim-bullseye
# Only works for docker CLIENT (bind mounted socket)
COPY --from=docker:20.10.17 /usr/local/bin/docker /usr/local/bin/
@@ -10,7 +10,7 @@ RUN apt-get update && \
&& rm -rf /var/lib/apt/lists/* \
&& pip3 install --no-cache-dir -U pip pipenv
RUN curl -L https://github.com/docker/compose/releases/download/1.25.5/docker-compose-`uname -s`-`uname -m` > /usr/local/bin/docker-compose && \
RUN curl -L https://github.com/docker/compose/releases/download/2.10.2/docker-compose-`uname -s`-`uname -m` > /usr/local/bin/docker-compose && \
chmod +x /usr/local/bin/docker-compose
COPY ./cmd.sh /usr/local/bin/
@@ -18,7 +18,7 @@ COPY Pipfile* /root/
WORKDIR /root
RUN pipenv install --system \
&& sed -i 's|/bin/sh|/bin/bash|g' /usr/local/lib/python3.8/site-packages/testinfra/backend/docker.py
&& sed -i 's|/bin/sh|/bin/bash|g' /usr/local/lib/python3.10/site-packages/testinfra/backend/docker.py
RUN echo "set -ex && cmd.sh && \$@" > /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh

View File

@@ -6,59 +6,10 @@ verify_ssl = true
[dev-packages]
[packages]
apipkg = "==1.5"
atomicwrites = "==1.4.1"
attrs = "==19.3.0"
bcrypt = "==3.1.7"
cached-property = "==1.5.1"
certifi = "==2019.11.28"
cffi = "==1.13.2"
chardet = "==3.0.4"
configparser = "==4.0.2"
contextlib2 = "==0.6.0.post1"
coverage = "==5.0.1"
cryptography = "==3.3.2"
docker = "==4.1.0"
dockerpty = "==0.4.1"
docopt = "==0.6.2"
enum34 = "==1.1.6"
execnet = "==1.7.1"
filelock = "==3.0.12"
funcsigs = "==1.0.2"
idna = "==2.8"
importlib-metadata = "==1.3.0"
ipaddress = "==1.0.23"
jsonschema = "==3.2.0"
more-itertools = "==5.0.0"
pathlib2 = "==2.3.5"
pluggy = "==0.13.1"
py = "==1.10.0"
pycparser = "==2.19"
pyparsing = "==2.4.6"
pyrsistent = "==0.15.6"
pytest = "==4.6.8"
pytest-cov = "==2.8.1"
pytest-forked = "==1.1.3"
pytest-xdist = "==1.31.0"
requests = "==2.28.1"
scandir = "==1.10.0"
six = "==1.13.0"
subprocess32 = "==3.5.4"
testinfra = "==3.3.0"
texttable = "==1.6.2"
toml = "==0.10.0"
tox = "==3.14.3"
urllib3 = "==1.26.5"
virtualenv = "==16.7.9"
wcwidth = "==0.1.7"
zipp = "==0.6.0"
"backports.shutil_get_terminal_size" = "==1.0.0"
"backports.ssl_match_hostname" = "==3.7.0.1"
Jinja2 = "==2.11.3"
MarkupSafe = "==1.1.1"
PyYAML = "==5.4"
websocket_client = "==0.57.0"
python-dotenv = "==0.17.1"
pytest = "==7.1.3"
pytest-xdist = "==2.5.0"
pytest-testinfra = "==6.8.0"
black = "==22.8.0"
[requires]
python_version = "3.8"
python_version = "3"

test/Pipfile.lock (generated)
View File

@@ -1,11 +1,11 @@
{
"_meta": {
"hash": {
"sha256": "c679c3eaa7a38959fa47159a01b66d7f7dd1e1667c188c9437a52302ee5a9290"
"sha256": "7cd0c8140d8505e7613e9bf2a9853aaf7cb08a1f100f49db0bec1b94b93be3e1"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.8"
"python_version": "3"
},
"sources": [
{
@@ -16,619 +16,160 @@
]
},
"default": {
"apipkg": {
"hashes": [
"sha256:37228cda29411948b422fae072f57e31d3396d2ee1c9783775980ee9c9990af6",
"sha256:58587dd4dc3daefad0487f6d9ae32b4542b185e1c36db6993290e7c41ca2b47c"
],
"index": "pypi",
"version": "==1.5"
},
"atomicwrites": {
"hashes": [
"sha256:81b2c9071a49367a7f770170e5eec8cb66567cfbbc8c73d20ce5ca4a8d71cf11"
],
"index": "pypi",
"version": "==1.4.1"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
"sha256:29adc2665447e5191d0e7c568fde78b21f9672d344281d0c6e1ab085429b22b6",
"sha256:86efa402f67bf2df34f51a335487cf46b1ec130d02b8d39fd248abfd30da551c"
],
"markers": "python_version >= '3.5'",
"version": "==22.1.0"
},
"black": {
"hashes": [
"sha256:0a12e4e1353819af41df998b02c6742643cfef58282915f781d0e4dd7a200411",
"sha256:0ad827325a3a634bae88ae7747db1a395d5ee02cf05d9aa7a9bd77dfb10e940c",
"sha256:32a4b17f644fc288c6ee2bafdf5e3b045f4eff84693ac069d87b1a347d861497",
"sha256:3b2c25f8dea5e8444bdc6788a2f543e1fb01494e144480bc17f806178378005e",
"sha256:4a098a69a02596e1f2a58a2a1c8d5a05d5a74461af552b371e82f9fa4ada8342",
"sha256:5107ea36b2b61917956d018bd25129baf9ad1125e39324a9b18248d362156a27",
"sha256:53198e28a1fb865e9fe97f88220da2e44df6da82b18833b588b1883b16bb5d41",
"sha256:5594efbdc35426e35a7defa1ea1a1cb97c7dbd34c0e49af7fb593a36bd45edab",
"sha256:5b879eb439094751185d1cfdca43023bc6786bd3c60372462b6f051efa6281a5",
"sha256:78dd85caaab7c3153054756b9fe8c611efa63d9e7aecfa33e533060cb14b6d16",
"sha256:792f7eb540ba9a17e8656538701d3eb1afcb134e3b45b71f20b25c77a8db7e6e",
"sha256:8ce13ffed7e66dda0da3e0b2eb1bdfc83f5812f66e09aca2b0978593ed636b6c",
"sha256:a05da0430bd5ced89176db098567973be52ce175a55677436a271102d7eaa3fe",
"sha256:a983526af1bea1e4cf6768e649990f28ee4f4137266921c2c3cee8116ae42ec3",
"sha256:bc4d4123830a2d190e9cc42a2e43570f82ace35c3aeb26a512a2102bce5af7ec",
"sha256:c3a73f66b6d5ba7288cd5d6dad9b4c9b43f4e8a4b789a94bf5abfb878c663eb3",
"sha256:ce957f1d6b78a8a231b18e0dd2d94a33d2ba738cd88a7fe64f53f659eea49fdd",
"sha256:cea1b2542d4e2c02c332e83150e41e3ca80dc0fb8de20df3c5e98e242156222c",
"sha256:d2c21d439b2baf7aa80d6dd4e3659259be64c6f49dfd0f32091063db0e006db4",
"sha256:d839150f61d09e7217f52917259831fe2b689f5c8e5e32611736351b89bb2a90",
"sha256:dd82842bb272297503cbec1a2600b6bfb338dae017186f8f215c8958f8acf869",
"sha256:e8166b7bfe5dcb56d325385bd1d1e0f635f24aae14b3ae437102dedc0c186747",
"sha256:e981e20ec152dfb3e77418fb616077937378b322d7b26aa1ff87717fb18b4875"
],
"index": "pypi",
"version": "==19.3.0"
"version": "==22.8.0"
},
"backports.shutil-get-terminal-size": {
"click": {
"hashes": [
"sha256:0975ba55054c15e346944b38956a4c9cbee9009391e41b86c68990effb8c1f64",
"sha256:713e7a8228ae80341c70586d1cc0a8caa5207346927e23d09dcbcaf18eadec80"
"sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e",
"sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"
],
"index": "pypi",
"version": "==1.0.0"
},
"backports.ssl-match-hostname": {
"hashes": [
"sha256:bb82e60f9fbf4c080eabd957c39f0641f0fc247d9a16e31e26d594d8f42b9fd2"
],
"index": "pypi",
"version": "==3.7.0.1"
},
"bcrypt": {
"hashes": [
"sha256:0258f143f3de96b7c14f762c770f5fc56ccd72f8a1857a451c1cd9a655d9ac89",
"sha256:0b0069c752ec14172c5f78208f1863d7ad6755a6fae6fe76ec2c80d13be41e42",
"sha256:19a4b72a6ae5bb467fea018b825f0a7d917789bcfe893e53f15c92805d187294",
"sha256:436a487dec749bca7e6e72498a75a5fa2433bda13bac91d023e18df9089ae0b8",
"sha256:5432dd7b34107ae8ed6c10a71b4397f1c853bd39a4d6ffa7e35f40584cffd161",
"sha256:6305557019906466fc42dbc53b46da004e72fd7a551c044a827e572c82191752",
"sha256:69361315039878c0680be456640f8705d76cb4a3a3fe1e057e0f261b74be4b31",
"sha256:6fe49a60b25b584e2f4ef175b29d3a83ba63b3a4df1b4c0605b826668d1b6be5",
"sha256:74a015102e877d0ccd02cdeaa18b32aa7273746914a6c5d0456dd442cb65b99c",
"sha256:763669a367869786bb4c8fcf731f4175775a5b43f070f50f46f0b59da45375d0",
"sha256:8b10acde4e1919d6015e1df86d4c217d3b5b01bb7744c36113ea43d529e1c3de",
"sha256:9fe92406c857409b70a38729dbdf6578caf9228de0aef5bc44f859ffe971a39e",
"sha256:a190f2a5dbbdbff4b74e3103cef44344bc30e61255beb27310e2aec407766052",
"sha256:a595c12c618119255c90deb4b046e1ca3bcfad64667c43d1166f2b04bc72db09",
"sha256:c9457fa5c121e94a58d6505cadca8bed1c64444b83b3204928a866ca2e599105",
"sha256:cb93f6b2ab0f6853550b74e051d297c27a638719753eb9ff66d1e4072be67133",
"sha256:ce4e4f0deb51d38b1611a27f330426154f2980e66582dc5f438aad38b5f24fc1",
"sha256:d7bdc26475679dd073ba0ed2766445bb5b20ca4793ca0db32b399dccc6bc84b7",
"sha256:ff032765bb8716d9387fd5376d987a937254b0619eff0972779515b5c98820bc"
],
"index": "pypi",
"version": "==3.1.7"
},
"cached-property": {
"hashes": [
"sha256:3a026f1a54135677e7da5ce819b0c690f156f37976f3e30c5430740725203d7f",
"sha256:9217a59f14a5682da7c4b8829deadbfc194ac22e9908ccf7c8820234e80a1504"
],
"index": "pypi",
"version": "==1.5.1"
},
"certifi": {
"hashes": [
"sha256:017c25db2a153ce562900032d5bc68e9f191e44e9a0f762f373977de9df1fbb3",
"sha256:25b64c7da4cd7479594d035c08c2d809eb4aab3a26e5a990ea98cc450c320f1f"
],
"index": "pypi",
"version": "==2019.11.28"
},
"cffi": {
"hashes": [
"sha256:0b49274afc941c626b605fb59b59c3485c17dc776dc3cc7cc14aca74cc19cc42",
"sha256:0e3ea92942cb1168e38c05c1d56b0527ce31f1a370f6117f1d490b8dcd6b3a04",
"sha256:135f69aecbf4517d5b3d6429207b2dff49c876be724ac0c8bf8e1ea99df3d7e5",
"sha256:19db0cdd6e516f13329cba4903368bff9bb5a9331d3410b1b448daaadc495e54",
"sha256:2781e9ad0e9d47173c0093321bb5435a9dfae0ed6a762aabafa13108f5f7b2ba",
"sha256:291f7c42e21d72144bb1c1b2e825ec60f46d0a7468f5346841860454c7aa8f57",
"sha256:2c5e309ec482556397cb21ede0350c5e82f0eb2621de04b2633588d118da4396",
"sha256:2e9c80a8c3344a92cb04661115898a9129c074f7ab82011ef4b612f645939f12",
"sha256:32a262e2b90ffcfdd97c7a5e24a6012a43c61f1f5a57789ad80af1d26c6acd97",
"sha256:3c9fff570f13480b201e9ab69453108f6d98244a7f495e91b6c654a47486ba43",
"sha256:415bdc7ca8c1c634a6d7163d43fb0ea885a07e9618a64bda407e04b04333b7db",
"sha256:42194f54c11abc8583417a7cf4eaff544ce0de8187abaf5d29029c91b1725ad3",
"sha256:4424e42199e86b21fc4db83bd76909a6fc2a2aefb352cb5414833c030f6ed71b",
"sha256:4a43c91840bda5f55249413037b7a9b79c90b1184ed504883b72c4df70778579",
"sha256:599a1e8ff057ac530c9ad1778293c665cb81a791421f46922d80a86473c13346",
"sha256:5c4fae4e9cdd18c82ba3a134be256e98dc0596af1e7285a3d2602c97dcfa5159",
"sha256:5ecfa867dea6fabe2a58f03ac9186ea64da1386af2159196da51c4904e11d652",
"sha256:62f2578358d3a92e4ab2d830cd1c2049c9c0d0e6d3c58322993cc341bdeac22e",
"sha256:6471a82d5abea994e38d2c2abc77164b4f7fbaaf80261cb98394d5793f11b12a",
"sha256:6d4f18483d040e18546108eb13b1dfa1000a089bcf8529e30346116ea6240506",
"sha256:71a608532ab3bd26223c8d841dde43f3516aa5d2bf37b50ac410bb5e99053e8f",
"sha256:74a1d8c85fb6ff0b30fbfa8ad0ac23cd601a138f7509dc617ebc65ef305bb98d",
"sha256:7b93a885bb13073afb0aa73ad82059a4c41f4b7d8eb8368980448b52d4c7dc2c",
"sha256:7d4751da932caaec419d514eaa4215eaf14b612cff66398dd51129ac22680b20",
"sha256:7f627141a26b551bdebbc4855c1157feeef18241b4b8366ed22a5c7d672ef858",
"sha256:8169cf44dd8f9071b2b9248c35fc35e8677451c52f795daa2bb4643f32a540bc",
"sha256:aa00d66c0fab27373ae44ae26a66a9e43ff2a678bf63a9c7c1a9a4d61172827a",
"sha256:ccb032fda0873254380aa2bfad2582aedc2959186cce61e3a17abc1a55ff89c3",
"sha256:d754f39e0d1603b5b24a7f8484b22d2904fa551fe865fd0d4c3332f078d20d4e",
"sha256:d75c461e20e29afc0aee7172a0950157c704ff0dd51613506bd7d82b718e7410",
"sha256:dcd65317dd15bc0451f3e01c80da2216a31916bdcffd6221ca1202d96584aa25",
"sha256:e570d3ab32e2c2861c4ebe6ffcad6a8abf9347432a37608fe1fbd157b3f0036b",
"sha256:fd43a88e045cf992ed09fa724b5315b790525f2676883a6ea64e3263bae6549d"
],
"index": "pypi",
"version": "==1.13.2"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"index": "pypi",
"version": "==3.0.4"
},
"charset-normalizer": {
"hashes": [
"sha256:5189b6f22b01957427f35b6a08d9a0bc45b46d3788ef5a92e978433c7a35f8a5",
"sha256:575e708016ff3a5e3681541cb9d79312c416835686d054a23accb873b254f413"
],
"markers": "python_full_version >= '3.6.0'",
"version": "==2.1.0"
},
"configparser": {
"hashes": [
"sha256:254c1d9c79f60c45dfde850850883d5aaa7f19a23f13561243a050d5a7c3fe4c",
"sha256:c7d282687a5308319bf3d2e7706e575c635b0a470342641c93bea0ea3b5331df"
],
"index": "pypi",
"version": "==4.0.2"
},
"contextlib2": {
"hashes": [
"sha256:01f490098c18b19d2bd5bb5dc445b2054d2fa97f09a4280ba2c5f3c394c8162e",
"sha256:3355078a159fbb44ee60ea80abd0d87b80b78c248643b49aa6d94673b413609b"
],
"index": "pypi",
"version": "==0.6.0.post1"
},
"coverage": {
"hashes": [
"sha256:0101888bd1592a20ccadae081ba10e8b204d20235d18d05c6f7d5e904a38fc10",
"sha256:04b961862334687549eb91cd5178a6fbe977ad365bddc7c60f2227f2f9880cf4",
"sha256:1ca43dbd739c0fc30b0a3637a003a0d2c7edc1dd618359d58cc1e211742f8bd1",
"sha256:1cbb88b34187bdb841f2599770b7e6ff8e259dc3bb64fc7893acf44998acf5f8",
"sha256:232f0b52a5b978288f0bbc282a6c03fe48cd19a04202df44309919c142b3bb9c",
"sha256:24bcfa86fd9ce86b73a8368383c39d919c497a06eebb888b6f0c12f13e920b1a",
"sha256:25b8f60b5c7da71e64c18888f3067d5b6f1334b9681876b2fb41eea26de881ae",
"sha256:2714160a63da18aed9340c70ed514973971ee7e665e6b336917ff4cca81a25b1",
"sha256:2ca2cd5264e84b2cafc73f0045437f70c6378c0d7dbcddc9ee3fe192c1e29e5d",
"sha256:2cc707fc9aad2592fc686d63ef72dc0031fc98b6fb921d2f5395d9ab84fbc3ef",
"sha256:348630edea485f4228233c2f310a598abf8afa5f8c716c02a9698089687b6085",
"sha256:40fbfd6b044c9db13aeec1daf5887d322c710d811f944011757526ef6e323fd9",
"sha256:46c9c6a1d1190c0b75ec7c0f339088309952b82ae8d67a79ff1319eb4e749b96",
"sha256:591506e088901bdc25620c37aec885e82cc896528f28c57e113751e3471fc314",
"sha256:5ac71bba1e07eab403b082c4428f868c1c9e26a21041436b4905c4c3d4e49b08",
"sha256:5f622f19abda4e934938e24f1d67599249abc201844933a6f01aaa8663094489",
"sha256:65bead1ac8c8930cf92a1ccaedcce19a57298547d5d1db5c9d4d068a0675c38b",
"sha256:7362a7f829feda10c7265b553455de596b83d1623b3d436b6d3c51c688c57bf6",
"sha256:7f2675750c50151f806070ec11258edf4c328340916c53bac0adbc465abd6b1e",
"sha256:960d7f42277391e8b1c0b0ae427a214e1b31a1278de6b73f8807b20c2e913bba",
"sha256:a50b0888d8a021a3342d36a6086501e30de7d840ab68fca44913e97d14487dc1",
"sha256:b7dbc5e8c39ea3ad3db22715f1b5401cd698a621218680c6daf42c2f9d36e205",
"sha256:bb3d29df5d07d5399d58a394d0ef50adf303ab4fbf66dfd25b9ef258effcb692",
"sha256:c0fff2733f7c2950f58a4fd09b5db257b00c6fec57bf3f68c5bae004d804b407",
"sha256:c792d3707a86c01c02607ae74364854220fb3e82735f631cd0a345dea6b4cee5",
"sha256:c90bda74e16bcd03861b09b1d37c0a4158feda5d5a036bb2d6e58de6ff65793e",
"sha256:cfce79ce41cc1a1dc7fc85bb41eeeb32d34a4cf39a645c717c0550287e30ff06",
"sha256:eeafb646f374988c22c8e6da5ab9fb81367ecfe81c70c292623373d2a021b1a1",
"sha256:f425f50a6dd807cb9043d15a4fcfba3b5874a54d9587ccbb748899f70dc18c47",
"sha256:fcd4459fe35a400b8f416bc57906862693c9f88b66dc925e7f2a933e77f6b18b",
"sha256:ff3936dd5feaefb4f91c8c1f50a06c588b5dc69fba4f7d9c79a6617ad80bb7df"
],
"index": "pypi",
"version": "==5.0.1"
},
"cryptography": {
"hashes": [
"sha256:0d7b69674b738068fa6ffade5c962ecd14969690585aaca0a1b1fc9058938a72",
"sha256:1bd0ccb0a1ed775cd7e2144fe46df9dc03eefd722bbcf587b3e0616ea4a81eff",
"sha256:3c284fc1e504e88e51c428db9c9274f2da9f73fdf5d7e13a36b8ecb039af6e6c",
"sha256:49570438e60f19243e7e0d504527dd5fe9b4b967b5a1ff21cc12b57602dd85d3",
"sha256:541dd758ad49b45920dda3b5b48c968f8b2533d8981bcdb43002798d8f7a89ed",
"sha256:5a60d3780149e13b7a6ff7ad6526b38846354d11a15e21068e57073e29e19bed",
"sha256:7951a966613c4211b6612b0352f5bf29989955ee592c4a885d8c7d0f830d0433",
"sha256:922f9602d67c15ade470c11d616f2b2364950602e370c76f0c94c94ae672742e",
"sha256:a0f0b96c572fc9f25c3f4ddbf4688b9b38c69836713fb255f4a2715d93cbaf44",
"sha256:a777c096a49d80f9d2979695b835b0f9c9edab73b59e4ceb51f19724dda887ed",
"sha256:a9a4ac9648d39ce71c2f63fe7dc6db144b9fa567ddfc48b9fde1b54483d26042",
"sha256:aa4969f24d536ae2268c902b2c3d62ab464b5a66bcb247630d208a79a8098e9b",
"sha256:c7390f9b2119b2b43160abb34f63277a638504ef8df99f11cb52c1fda66a2e6f",
"sha256:e18e6ab84dfb0ab997faf8cca25a86ff15dfea4027b986322026cc99e0a892da"
],
"index": "pypi",
"version": "==3.3.2"
},
"docker": {
"hashes": [
"sha256:6e06c5e70ba4fad73e35f00c55a895a448398f3ada7faae072e2bb01348bafc1",
"sha256:8f93775b8bdae3a2df6bc9a5312cce564cade58d6555f2c2570165a1270cd8a7"
],
"index": "pypi",
"version": "==4.1.0"
},
"dockerpty": {
"hashes": [
"sha256:69a9d69d573a0daa31bcd1c0774eeed5c15c295fe719c61aca550ed1393156ce"
],
"index": "pypi",
"version": "==0.4.1"
},
"docopt": {
"hashes": [
"sha256:49b3a825280bd66b3aa83585ef59c4a8c82f2c8a522dbe754a8bc8d08c85c491"
],
"index": "pypi",
"version": "==0.6.2"
},
"enum34": {
"hashes": [
"sha256:2d81cbbe0e73112bdfe6ef8576f2238f2ba27dd0d55752a776c41d38b7da2850",
"sha256:644837f692e5f550741432dd3f223bbb9852018674981b1664e5dc339387588a",
"sha256:6bd0f6ad48ec2aa117d3d141940d484deccda84d4fcd884f5c3d93c23ecd8c79",
"sha256:8ad8c4783bf61ded74527bffb48ed9b54166685e4230386a9ed9b1279e2df5b1"
],
"index": "pypi",
"version": "==1.1.6"
"markers": "python_version >= '3.7'",
"version": "==8.1.3"
},
"execnet": {
"hashes": [
"sha256:cacb9df31c9680ec5f95553976c4da484d407e85e41c83cb812aa014f0eddc50",
"sha256:d4efd397930c46415f62f8a31388d6be4f27a91d7550eb79bc64a756e0056547"
"sha256:8f694f3ba9cc92cab508b152dcfe322153975c29bda272e2fd7f3f00f36e47c5",
"sha256:a295f7cc774947aac58dde7fdc85f4aa00c42adf5d8f5468fc630c1acf30a142"
],
"index": "pypi",
"version": "==1.7.1"
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==1.9.0"
},
"filelock": {
"iniconfig": {
"hashes": [
"sha256:18d82244ee114f543149c66a6e0c14e9c4f8a1044b5cdaadd0f82159d6a6ff59",
"sha256:929b7d63ec5b7d6b71b0fa5ac14e030b3f70b75747cef1b10da9b879fef15836"
"sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3",
"sha256:bc3af051d7d14b2ee5ef9969666def0cd1a000e121eaea580d4a313df4b37f32"
],
"index": "pypi",
"version": "==3.0.12"
},
"funcsigs": {
"hashes": [
"sha256:330cc27ccbf7f1e992e69fef78261dc7c6569012cf397db8d3de0234e6c937ca",
"sha256:a7bb0f2cf3a3fd1ab2732cb49eba4252c2af4240442415b4abce3b87022a8f50"
],
"index": "pypi",
"version": "==1.0.2"
},
"idna": {
"hashes": [
"sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407",
"sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c"
],
"index": "pypi",
"version": "==2.8"
},
"importlib-metadata": {
"hashes": [
"sha256:073a852570f92da5f744a3472af1b61e28e9f78ccf0c9117658dc32b15de7b45",
"sha256:d95141fbfa7ef2ec65cfd945e2af7e5a6ddbd7c8d9a25e66ff3be8e3daf9f60f"
],
"index": "pypi",
"version": "==1.3.0"
},
"ipaddress": {
"hashes": [
"sha256:6e0f4a39e66cb5bb9a137b00276a2eff74f93b71dcbdad6f10ff7df9d3557fcc",
"sha256:b7f8e0369580bb4a24d5ba1d7cc29660a4a6987763faf1d8a8046830e020e7e2"
],
"index": "pypi",
"version": "==1.0.23"
},
"jinja2": {
"hashes": [
"sha256:03e47ad063331dd6a3f04a43eddca8a966a26ba0c5b7207a9a9e4e08f1b29419",
"sha256:a6d58433de0ae800347cab1fa3043cebbabe8baa9d29e668f1c768cb87a333c6"
],
"index": "pypi",
"version": "==2.11.3"
},
"jsonschema": {
"hashes": [
"sha256:4e5b3cf8216f577bee9ce139cbe72eca3ea4f292ec60928ff24758ce626cd163",
"sha256:c8a85b28d377cc7737e46e2d9f2b4f44ee3c0e1deac6bf46ddefc7187d30797a"
],
"index": "pypi",
"version": "==3.2.0"
},
"markupsafe": {
"hashes": [
"sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473",
"sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
"sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
"sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
"sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42",
"sha256:195d7d2c4fbb0ee8139a6cf67194f3973a6b3042d742ebe0a9ed36d8b6f0c07f",
"sha256:22c178a091fc6630d0d045bdb5992d2dfe14e3259760e713c490da5323866c39",
"sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
"sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
"sha256:2beec1e0de6924ea551859edb9e7679da6e4870d32cb766240ce17e0a0ba2014",
"sha256:3b8a6499709d29c2e2399569d96719a1b21dcd94410a586a18526b143ec8470f",
"sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
"sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
"sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
"sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
"sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b",
"sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
"sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15",
"sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
"sha256:6f1e273a344928347c1290119b493a1f0303c52f5a5eae5f16d74f48c15d4a85",
"sha256:6fffc775d90dcc9aed1b89219549b329a9250d918fd0b8fa8d93d154918422e1",
"sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
"sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
"sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905",
"sha256:7fed13866cf14bba33e7176717346713881f56d9d2bcebab207f7a036f41b850",
"sha256:84dee80c15f1b560d55bcfe6d47b27d070b4681c699c572af2e3c7cc90a3b8e0",
"sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735",
"sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d",
"sha256:98bae9582248d6cf62321dcb52aaf5d9adf0bad3b40582925ef7c7f0ed85fceb",
"sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e",
"sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d",
"sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c",
"sha256:a6a744282b7718a2a62d2ed9d993cad6f5f585605ad352c11de459f4108df0a1",
"sha256:acf08ac40292838b3cbbb06cfe9b2cb9ec78fce8baca31ddb87aaac2e2dc3bc2",
"sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21",
"sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2",
"sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5",
"sha256:b1dba4527182c95a0db8b6060cc98ac49b9e2f5e64320e2b56e47cb2831978c7",
"sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b",
"sha256:b7d644ddb4dbd407d31ffb699f1d140bc35478da613b441c582aeb7c43838dd8",
"sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
"sha256:bf5aa3cbcfdf57fa2ee9cd1822c862ef23037f5c832ad09cfea57fa846dec193",
"sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
"sha256:caabedc8323f1e93231b52fc32bdcde6db817623d33e100708d9a68e1f53b26b",
"sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
"sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2",
"sha256:d53bc011414228441014aa71dbec320c66468c1030aae3a6e29778a3382d96e5",
"sha256:d73a845f227b0bfe8a7455ee623525ee656a9e2e749e4742706d80a6065d5e2c",
"sha256:d9be0ba6c527163cbed5e0857c451fcd092ce83947944d6c14bc95441203f032",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7",
"sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be",
"sha256:feb7b34d6325451ef96bc0e36e1a6c0c1c64bc1fbec4b854f4529e51887b1621"
],
"index": "pypi",
"version": "==1.1.1"
},
"more-itertools": {
"mypy-extensions": {
"hashes": [
"sha256:38a936c0a6d98a38bcc2d03fdaaedaba9f412879461dd2ceff8d37564d6522e4",
"sha256:c0a5785b1109a6bd7fac76d6837fd1feca158e54e521ccd2ae8bfe393cc9d4fc",
"sha256:fe7a7cae1ccb57d33952113ff4fa1bc5f879963600ed74918f1236e212ee50b9"
"sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d",
"sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"
],
"index": "pypi",
"version": "==5.0.0"
"version": "==0.4.3"
},
"packaging": {
"hashes": [
"sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb",
"sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522"
],
"markers": "python_full_version >= '3.6.0'",
"markers": "python_version >= '3.6'",
"version": "==21.3"
},
"pathlib2": {
"pathspec": {
"hashes": [
"sha256:0ec8205a157c80d7acc301c0b18fbd5d44fe655968f5d947b6ecef5290fc35db",
"sha256:6cd9a47b597b37cc57de1c05e56fb1a1c9cc9fab04fe78c29acd090418529868"
"sha256:46846318467efc4556ccfd27816e004270a9eeeeb4d062ce5e6fc7a87c573f93",
"sha256:7ace6161b621d31e7902eb6b5ae148d12cfd23f4a249b9ffb6b9fee12084323d"
],
"index": "pypi",
"version": "==2.3.5"
"markers": "python_version >= '3.7'",
"version": "==0.10.1"
},
"platformdirs": {
"hashes": [
"sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788",
"sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"
],
"markers": "python_version >= '3.7'",
"version": "==2.5.2"
},
"pluggy": {
"hashes": [
"sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0",
"sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"
"sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159",
"sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3"
],
"index": "pypi",
"version": "==0.13.1"
"markers": "python_version >= '3.6'",
"version": "==1.0.0"
},
"py": {
"hashes": [
"sha256:21b81bda15b66ef5e1a777a21c4dcd9c20ad3efd0b3f817e7a809035269e1bd3",
"sha256:3b80836aa6d1feeaa108e046da6423ab8f6ceda6468545ae8d02d9d58d18818a"
"sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719",
"sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"
],
"index": "pypi",
"version": "==1.10.0"
},
"pycparser": {
"hashes": [
"sha256:a988718abfad80b6b157acce7bf130a30876d27603738ac39f140993246b25b3"
],
"index": "pypi",
"version": "==2.19"
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==1.11.0"
},
"pyparsing": {
"hashes": [
"sha256:4c830582a84fb022400b85429791bc551f1f4871c33f23e44f353119e92f969f",
"sha256:c342dccb5250c08d45fd6f8b4a559613ca603b57498511740e65cd11a2e7dcec"
"sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb",
"sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc"
],
"index": "pypi",
"version": "==2.4.6"
},
"pyrsistent": {
"hashes": [
"sha256:f3b280d030afb652f79d67c5586157c5c1355c9a58dfc7940566e28d28f3df1b"
],
"index": "pypi",
"version": "==0.15.6"
"markers": "python_full_version >= '3.6.8'",
"version": "==3.0.9"
},
"pytest": {
"hashes": [
"sha256:6192875be8af57b694b7c4904e909680102befcb99e610ef3d9f786952f795aa",
"sha256:f8447ebf8fd3d362868a5d3f43a9df786dfdfe9608843bd9002a2d47a104808f"
"sha256:1377bda3466d70b55e3f5cecfa55bb7cfcf219c7964629b967c37cf0bda818b7",
"sha256:4f365fec2dff9c1162f834d9f18af1ba13062db0c708bf7b946f8a5c76180c39"
],
"index": "pypi",
"version": "==4.6.8"
},
"pytest-cov": {
"hashes": [
"sha256:cc6742d8bac45070217169f5f72ceee1e0e55b0221f54bcf24845972d3a47f2b",
"sha256:cdbdef4f870408ebdbfeb44e63e07eb18bb4619fae852f6e760645fa36172626"
],
"index": "pypi",
"version": "==2.8.1"
"version": "==7.1.3"
},
"pytest-forked": {
"hashes": [
"sha256:1805699ed9c9e60cb7a8179b8d4fa2b8898098e82d229b0825d8095f0f261100",
"sha256:1ae25dba8ee2e56fb47311c9638f9e58552691da87e82d25b0ce0e4bf52b7d87"
"sha256:8b67587c8f98cbbadfdd804539ed5455b6ed03802203485dd2f53c1422d7440e",
"sha256:bbbb6717efc886b9d64537b41fb1497cfaf3c9601276be8da2cccfea5a3c8ad8"
],
"markers": "python_version >= '3.6'",
"version": "==1.4.0"
},
"pytest-testinfra": {
"hashes": [
"sha256:07c8c2c472aca7d83099ebc5f850d383721cd654b66c60ffbb145e45e584ff99",
"sha256:56ac1dfc61342632a1189091473e253db1a3cdcecce0d49d6a769f33cd264814"
],
"index": "pypi",
"version": "==1.1.3"
"version": "==6.8.0"
},
"pytest-xdist": {
"hashes": [
"sha256:0f46020d3d9619e6d17a65b5b989c1ebbb58fc7b1da8fb126d70f4bac4dfeed1",
"sha256:7dc0d027d258cd0defc618fb97055fbd1002735ca7a6d17037018cf870e24011"
"sha256:4580deca3ff04ddb2ac53eba39d76cb5dd5edeac050cb6fbc768b0dd712b4edf",
"sha256:6fe5c74fec98906deb8f2d2b616b5c782022744978e7bd4695d39c8f42d0ce65"
],
"index": "pypi",
"version": "==1.31.0"
"version": "==2.5.0"
},
"python-dotenv": {
"tomli": {
"hashes": [
"sha256:00aa34e92d992e9f8383730816359647f358f4a3be1ba45e5a5cefd27ee91544",
"sha256:b1ae5e9643d5ed987fc57cc2583021e38db531946518130777734f9589b3141f"
"sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc",
"sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"
],
"index": "pypi",
"version": "==0.17.1"
},
"pyyaml": {
"hashes": [
"sha256:02c78d77281d8f8d07a255e57abdbf43b02257f59f50cc6b636937d68efa5dd0",
"sha256:0dc9f2eb2e3c97640928dec63fd8dc1dd91e6b6ed236bd5ac00332b99b5c2ff9",
"sha256:124fd7c7bc1e95b1eafc60825f2daf67c73ce7b33f1194731240d24b0d1bf628",
"sha256:26fcb33776857f4072601502d93e1a619f166c9c00befb52826e7b774efaa9db",
"sha256:31ba07c54ef4a897758563e3a0fcc60077698df10180abe4b8165d9895c00ebf",
"sha256:3c49e39ac034fd64fd576d63bb4db53cda89b362768a67f07749d55f128ac18a",
"sha256:52bf0930903818e600ae6c2901f748bc4869c0c406056f679ab9614e5d21a166",
"sha256:5a3f345acff76cad4aa9cb171ee76c590f37394186325d53d1aa25318b0d4a09",
"sha256:5e7ac4e0e79a53451dc2814f6876c2fa6f71452de1498bbe29c0b54b69a986f4",
"sha256:7242790ab6c20316b8e7bb545be48d7ed36e26bbe279fd56f2c4a12510e60b4b",
"sha256:737bd70e454a284d456aa1fa71a0b429dd527bcbf52c5c33f7c8eee81ac16b89",
"sha256:8635d53223b1f561b081ff4adecb828fd484b8efffe542edcfdff471997f7c39",
"sha256:8b818b6c5a920cbe4203b5a6b14256f0e5244338244560da89b7b0f1313ea4b6",
"sha256:8bf38641b4713d77da19e91f8b5296b832e4db87338d6aeffe422d42f1ca896d",
"sha256:a36a48a51e5471513a5aea920cdad84cbd56d70a5057cca3499a637496ea379c",
"sha256:b2243dd033fd02c01212ad5c601dafb44fbb293065f430b0d3dbf03f3254d615",
"sha256:cc547d3ead3754712223abb7b403f0a184e4c3eae18c9bb7fd15adef1597cc4b",
"sha256:cc552b6434b90d9dbed6a4f13339625dc466fd82597119897e9489c953acbc22",
"sha256:f3790156c606299ff499ec44db422f66f05a7363b39eb9d5b064f17bd7d7c47b",
"sha256:f7a21e3d99aa3095ef0553e7ceba36fb693998fbb1226f1392ce33681047465f",
"sha256:fdc6b2cb4b19e431994f25a9160695cc59a4e861710cc6fc97161c5e845fc579"
],
"index": "pypi",
"version": "==5.4"
},
"requests": {
"hashes": [
"sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983",
"sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349"
],
"index": "pypi",
"version": "==2.28.1"
},
"scandir": {
"hashes": [
"sha256:2586c94e907d99617887daed6c1d102b5ca28f1085f90446554abf1faf73123e",
"sha256:2ae41f43797ca0c11591c0c35f2f5875fa99f8797cb1a1fd440497ec0ae4b022",
"sha256:2b8e3888b11abb2217a32af0766bc06b65cc4a928d8727828ee68af5a967fa6f",
"sha256:2c712840c2e2ee8dfaf36034080108d30060d759c7b73a01a52251cc8989f11f",
"sha256:4d4631f6062e658e9007ab3149a9b914f3548cb38bfb021c64f39a025ce578ae",
"sha256:67f15b6f83e6507fdc6fca22fedf6ef8b334b399ca27c6b568cbfaa82a364173",
"sha256:7d2d7a06a252764061a020407b997dd036f7bd6a175a5ba2b345f0a357f0b3f4",
"sha256:8c5922863e44ffc00c5c693190648daa6d15e7c1207ed02d6f46a8dcc2869d32",
"sha256:92c85ac42f41ffdc35b6da57ed991575bdbe69db895507af88b9f499b701c188",
"sha256:b24086f2375c4a094a6b51e78b4cf7ca16c721dcee2eddd7aa6494b42d6d519d",
"sha256:cb925555f43060a1745d0a321cca94bcea927c50114b623d73179189a4e100ac"
],
"index": "pypi",
"version": "==1.10.0"
},
"setuptools": {
"hashes": [
"sha256:16923d366ced322712c71ccb97164d07472abeecd13f3a6c283f6d5d26722793",
"sha256:db3b8e2f922b2a910a29804776c643ea609badb6a32c4bcc226fd4fd902cce65"
],
"markers": "python_version >= '3.7'",
"version": "==63.1.0"
},
"six": {
"hashes": [
"sha256:1f1b7d42e254082a9db6279deae68afb421ceba6158efa6131de7b3003ee93fd",
"sha256:30f610279e8b2578cab6db20741130331735c781b56053c59c4076da27f06b66"
],
"index": "pypi",
"version": "==1.13.0"
},
"subprocess32": {
"hashes": [
"sha256:88e37c1aac5388df41cc8a8456bb49ebffd321a3ad4d70358e3518176de3a56b",
"sha256:e45d985aef903c5b7444d34350b05da91a9e0ea015415ab45a21212786c649d0",
"sha256:eb2937c80497978d181efa1b839ec2d9622cf9600a039a79d0e108d1f9aec79d"
],
"index": "pypi",
"version": "==3.5.4"
},
"testinfra": {
"hashes": [
"sha256:780e6c2ab392ea93c26cee1777c968a144c2189a56b3e239a3a66e6d256925b5",
"sha256:c3492b39c8d2c98d8419ce1a91d7fe348213f9b98b91198d2e7e88b3954b050b"
],
"index": "pypi",
"version": "==3.3.0"
},
"texttable": {
"hashes": [
"sha256:7dc282a5b22564fe0fdc1c771382d5dd9a54742047c61558e071c8cd595add86",
"sha256:eff3703781fbc7750125f50e10f001195174f13825a92a45e9403037d539b4f4"
],
"index": "pypi",
"version": "==1.6.2"
},
"toml": {
"hashes": [
"sha256:229f81c57791a41d65e399fc06bf0848bab550a9dfd5ed66df18ce5f05e73d5c",
"sha256:235682dd292d5899d361a811df37e04a8828a5b1da3115886b73cf81ebc9100e",
"sha256:f1db651f9657708513243e61e6cc67d101a39bad662eaa9b5546f789338e07a3"
],
"index": "pypi",
"version": "==0.10.0"
},
"tox": {
"hashes": [
"sha256:06ba73b149bf838d5cd25dc30c2dd2671ae5b2757cf98e5c41a35fe449f131b3",
"sha256:806d0a9217584558cc93747a945a9d9bff10b141a5287f0c8429a08828a22192"
],
"index": "pypi",
"version": "==3.14.3"
},
"urllib3": {
"hashes": [
"sha256:753a0374df26658f99d826cfe40394a686d05985786d946fbe4165b5148f5a7c",
"sha256:a7acd0977125325f516bda9735fa7142b909a8d01e8b2e4c8108d0984e6e0098"
],
"index": "pypi",
"version": "==1.26.5"
},
"virtualenv": {
"hashes": [
"sha256:0d62c70883c0342d59c11d0ddac0d954d0431321a41ab20851facf2b222598f3",
"sha256:55059a7a676e4e19498f1aad09b8313a38fcc0cdbe4fdddc0e9b06946d21b4bb"
],
"index": "pypi",
"version": "==16.7.9"
},
"wcwidth": {
"hashes": [
"sha256:3df37372226d6e63e1b1e1eda15c594bca98a22d33a23832a90998faa96bc65e",
"sha256:f4ebe71925af7b40a864553f761ed559b43544f8f71746c2d756c7fe788ade7c"
],
"index": "pypi",
"version": "==0.1.7"
},
"websocket-client": {
"hashes": [
"sha256:0fc45c961324d79c781bab301359d5a1b00b13ad1b10415a4780229ef71a5549",
"sha256:d735b91d6d1692a6a181f2a8c9e0238e5f6373356f561bb9dc4c7af36f452010"
],
"index": "pypi",
"version": "==0.57.0"
},
"zipp": {
"hashes": [
"sha256:3718b1cbcd963c7d4c5511a8240812904164b7f381b647143a89d3b98f9bcd8e",
"sha256:f06903e9f1f43b12d371004b4ac7b06ab39a44adc747266928ae6debfa7b3335"
],
"index": "pypi",
"version": "==0.6.0"
"markers": "python_full_version < '3.11.0a7'",
"version": "==2.0.1"
}
},
"develop": {}

View File

@@ -12,3 +12,8 @@ Should result in:
- An image named `pihole:[branch-name]` being built
- Tests being run to confirm the image doesn't have any regressions
# Modify Pipfile
You can enter into the test docker image using `./build-and-test.sh enter`.
From there, you can `cd test` and execute any needed pipenv commands.
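For example, refreshing the lock file after editing the Pipfile could look like this (a sketch, assuming pipenv is available inside the test image as its Dockerfile above suggests):
./build-and-test.sh enter    # drop into the test image
cd test
pipenv install --dev         # resolve the edited Pipfile
pipenv lock                  # regenerate Pipfile.lock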

View File

@@ -4,6 +4,9 @@ set -eux
docker build ./src --tag pihole:${GIT_TAG} --no-cache
docker images
# auto-format the pytest code
python -m black ./test/tests/
# TODO: Add junitxml output and have something consume it
# 2 parallel max b/c race condition with docker fixture (I think?)
py.test -vv -n 2 ./test/tests/
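Black can also be run in check-only mode to verify formatting without rewriting anything, using its standard flags:
# report files that would be reformatted, but leave them untouched
python -m black --check --diff ./test/tests/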

View File

@@ -1,55 +0,0 @@
-i https://pypi.org/simple/
apipkg==1.5
atomicwrites==1.3.0
attrs==19.3.0
backports.shutil-get-terminal-size==1.0.0
backports.ssl-match-hostname==3.7.0.1
bcrypt==3.1.7
cached-property==1.5.1
certifi==2019.11.28
cffi==1.13.2
chardet==3.0.4
configparser==4.0.2
contextlib2==0.6.0.post1
coverage==5.0.1
cryptography==3.3.2
docker==4.1.0
dockerpty==0.4.1
docopt==0.6.2
enum34==1.1.6
execnet==1.7.1
filelock==3.0.12
funcsigs==1.0.2
idna==2.8
importlib-metadata==1.3.0
ipaddress==1.0.23
jinja2==2.11.3
jsonschema==3.2.0
markupsafe==1.1.1
more-itertools==5.0.0
packaging==20.9
pathlib2==2.3.5
pluggy==0.13.1
py==1.10.0
pycparser==2.19
pyparsing==2.4.6
pyrsistent==0.15.6
pytest-cov==2.8.1
pytest-forked==1.1.3
pytest-xdist==1.31.0
pytest==4.6.8
pyyaml==5.4
requests==2.22.0
scandir==1.10.0
six==1.13.0
subprocess32==3.5.4
testinfra==3.3.0
texttable==1.6.2
toml==0.10.0
tox==3.14.3
urllib3==1.25.9
virtualenv==16.7.9
wcwidth==0.1.7
websocket-client==0.57.0
zipp==0.6.0
python-dotenv==0.17.1

View File

@@ -3,63 +3,79 @@ import pytest
import subprocess
import testinfra
local_host = testinfra.get_host('local://')
local_host = testinfra.get_host("local://")
check_output = local_host.check_output
TAIL_DEV_NULL='tail -f /dev/null'
TAIL_DEV_NULL = "tail -f /dev/null"
@pytest.fixture()
def run_and_stream_command_output():
def run_and_stream_command_output_inner(command, verbose=False):
print("Running", command)
build_env = os.environ.copy()
build_env['PIHOLE_DOCKER_TAG'] = version
build_result = subprocess.Popen(command.split(), env=build_env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
bufsize=1, universal_newlines=True)
build_env["PIHOLE_DOCKER_TAG"] = version
build_result = subprocess.Popen(
command.split(),
env=build_env,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
bufsize=1,
universal_newlines=True,
)
if verbose:
while build_result.poll() is None:
for line in build_result.stdout:
print(line, end='')
print(line, end="")
build_result.wait()
if build_result.returncode != 0:
print(f' ::: Error running: {command}')
print(f" [i] Error running: {command}")
print(build_result.stderr)
return run_and_stream_command_output_inner
@pytest.fixture()
def args_volumes():
return '-v /dev/null:/etc/pihole/adlists.list'
return "-v /dev/null:/etc/pihole/adlists.list"
@pytest.fixture()
def args_env():
return '-e FTLCONF_REPLY_ADDR4="127.0.0.1"'
return '-e FTLCONF_LOCAL_IPV4="127.0.0.1"'
@pytest.fixture()
def args(args_volumes, args_env):
return "{} {}".format(args_volumes, args_env)
@pytest.fixture()
def test_args():
''' test override fixture to provide arguments separate from our core args '''
return ''
"""test override fixture to provide arguments separate from our core args"""
return ""
def docker_generic(request, _test_args, _args, _image, _cmd, _entrypoint):
#assert 'docker' in check_output('id'), "Are you in the docker group?"
# assert 'docker' in check_output('id'), "Are you in the docker group?"
# Always appended PYTEST arg to tell pihole we're testing
if 'pihole' in _image and 'PYTEST=1' not in _args:
_args = '{} -e PYTEST=1'.format(_args)
docker_run = 'docker run -d -t {args} {test_args} {entry} {image} {cmd}'\
.format(args=_args, test_args=_test_args, entry=_entrypoint, image=_image, cmd=_cmd)
if "pihole" in _image and "PYTEST=1" not in _args:
_args = "{} -e PYTEST=1".format(_args)
docker_run = "docker run -d -t {args} {test_args} {entry} {image} {cmd}".format(
args=_args, test_args=_test_args, entry=_entrypoint, image=_image, cmd=_cmd
)
# Print a human-runnable version of the container run command for faster debugging
print(docker_run.replace('-d -t', '--rm -it').replace(TAIL_DEV_NULL, 'bash'))
print(docker_run.replace("-d -t", "--rm -it").replace(TAIL_DEV_NULL, "bash"))
docker_id = check_output(docker_run)
def teardown():
check_output("docker logs {}".format(docker_id))
check_output("docker rm -f {}".format(docker_id))
request.addfinalizer(teardown)
docker_container = testinfra.backend.get_backend("docker://" + docker_id, sudo=False)
request.addfinalizer(teardown)
docker_container = testinfra.backend.get_backend(
"docker://" + docker_id, sudo=False
)
docker_container.id = docker_id
return docker_container
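With the default fixtures in this file, the human-runnable command printed by docker_generic comes out roughly as follows (the tag is illustrative; PYTEST=1 is appended automatically for pihole images):
docker run --rm -it -v /dev/null:/etc/pihole/adlists.list \
    -e FTLCONF_LOCAL_IPV4="127.0.0.1" -e PYTEST=1 pihole:mytag bash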
@@ -67,90 +83,126 @@ def docker_generic(request, _test_args, _args, _image, _cmd, _entrypoint):
@pytest.fixture
def docker(request, test_args, args, image, cmd, entrypoint):
''' One-off Docker container run '''
"""One-off Docker container run"""
return docker_generic(request, test_args, args, image, cmd, entrypoint)
@pytest.fixture(scope='module')
def docker_persist(request, persist_test_args, persist_args, persist_image, persist_cmd, persist_entrypoint, dig):
''' Persistent Docker container for multiple tests, instead of stopping container after one test '''
''' Uses DUP'd module scoped fixtures because smaller scoped fixtures won't mix with module scope '''
persistent_container = docker_generic(request, persist_test_args, persist_args, persist_image, persist_cmd, persist_entrypoint)
''' attach a dig container for lookups '''
@pytest.fixture(scope="module")
def docker_persist(
request,
persist_test_args,
persist_args,
persist_image,
persist_cmd,
persist_entrypoint,
dig,
):
"""
Persistent Docker container for multiple tests, instead of stopping container after one test
Uses DUP'd module scoped fixtures because smaller scoped fixtures won't mix with module scope
"""
persistent_container = docker_generic(
request,
persist_test_args,
persist_args,
persist_image,
persist_cmd,
persist_entrypoint,
)
""" attach a dig container for lookups """
persistent_container.dig = dig(persistent_container.id)
return persistent_container
@pytest.fixture
def entrypoint():
return ''
return ""
@pytest.fixture()
def version():
return os.environ.get('GIT_TAG', None)
return os.environ.get("GIT_TAG", None)
@pytest.fixture()
def tag(version):
return '{}'.format(version)
return "{}".format(version)
@pytest.fixture
def webserver(tag):
''' TODO: this is obvious without alpine+nginx as the alternative, remove fixture, hard code lighttpd in tests? '''
return 'lighttpd'
"""TODO: this is obvious without alpine+nginx as the alternative, remove fixture, hard code lighttpd in tests?"""
return "lighttpd"
@pytest.fixture()
def image(tag):
image = 'pihole'
return '{}:{}'.format(image, tag)
image = "pihole"
return "{}:{}".format(image, tag)
@pytest.fixture()
def cmd():
return TAIL_DEV_NULL
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_version():
return version
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_args_dns():
return '--dns 127.0.0.1 --dns 1.1.1.1'
return "--dns 127.0.0.1 --dns 1.1.1.1"
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_args_volumes():
return '-v /dev/null:/etc/pihole/adlists.list'
return "-v /dev/null:/etc/pihole/adlists.list"
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_args_env():
return '-e ServerIP="127.0.0.1"'
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_args(persist_args_volumes, persist_args_env):
return "{} {}".format(persist_args_volumes, persist_args_env)
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_test_args():
''' test override fixture to provide arguments separate from our core args '''
return ''
"""test override fixture to provide arguments separate from our core args"""
return ""
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_tag(persist_version):
return '{}'.format(persist_version)
return "{}".format(persist_version)
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_webserver(persist_tag):
''' TODO: this is obvious without alpine+nginx as the alternative, remove fixture, hard code lighttpd in tests? '''
return 'lighttpd'
"""TODO: this is obvious without alpine+nginx as the alternative, remove fixture, hard code lighttpd in tests?"""
return "lighttpd"
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_image(persist_tag):
image = 'pihole'
return '{}:{}'.format(image, persist_tag)
image = "pihole"
return "{}:{}".format(image, persist_tag)
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_cmd():
return TAIL_DEV_NULL
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def persist_entrypoint():
return ''
return ""
@pytest.fixture
def slow():
@@ -158,6 +210,7 @@ def slow():
Run a slow check, check if the state is correct for `timeout` seconds.
"""
import time
def _slow(check, timeout=20):
timeout_at = time.time() + timeout
while True:
@@ -170,26 +223,28 @@ def slow():
raise e
else:
return
return _slow
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def dig():
''' separate container to link to pi-hole and perform lookups '''
''' a docker pull is faster than running an install of dnsutils '''
"""separate container to link to pi-hole and perform lookups"""
""" a docker pull is faster than running an install of dnsutils """
def _dig(docker_id):
args = '--link {}:test_pihole'.format(docker_id)
image = 'azukiapp/dig'
cmd = TAIL_DEV_NULL
dig_container = docker_generic(request, '', args, image, cmd, '')
args = "--link {}:test_pihole".format(docker_id)
image = "azukiapp/dig"
cmd = TAIL_DEV_NULL
dig_container = docker_generic(request, "", args, image, cmd, "")
return dig_container
return _dig
'''
Persistent Docker container for testing service post start.sh
'''
@pytest.fixture
def running_pihole(docker_persist, slow, persist_webserver):
''' Persist a fully started docker-pi-hole to help speed up subsequent tests '''
slow(lambda: docker_persist.run('pgrep pihole-FTL').rc == 0)
slow(lambda: docker_persist.run('pgrep lighttpd').rc == 0)
return docker_persist
"""Persist a fully started docker-pi-hole to help speed up subsequent tests"""
slow(lambda: docker_persist.run("pgrep pihole-FTL").rc == 0)
slow(lambda: docker_persist.run("pgrep lighttpd").rc == 0)
return docker_persist

View File

@@ -1,165 +1,249 @@
import os
import pytest
import re
SETUPVARS_LOC='/etc/pihole/setupVars.conf'
DNSMASQ_CONFIG_LOC = '/etc/dnsmasq.d/01-pihole.conf'
CMD_SETUP_FTL_CACHESIZE='. bash_functions.sh ; setup_FTL_CacheSize'
CMD_SETUP_FTL_INTERFACE='. bash_functions.sh ; setup_FTL_Interface'
CMD_SETUP_WEB_PASSWORD='. bash_functions.sh ; setup_web_password'
SETUPVARS_LOC = "/etc/pihole/setupVars.conf"
DNSMASQ_CONFIG_LOC = "/etc/dnsmasq.d/01-pihole.conf"
CMD_SETUP_FTL_CACHESIZE = ". bash_functions.sh ; setup_FTL_CacheSize"
CMD_SETUP_FTL_INTERFACE = ". bash_functions.sh ; setup_FTL_Interface"
CMD_SETUP_WEB_PASSWORD = ". bash_functions.sh ; setup_web_password"
def _cat(file):
return 'cat {}'.format(file)
return "cat {}".format(file)
def _grep(string, file):
return 'grep -q \'{}\' {}'.format(string,file)
return "grep -q '{}' {}".format(string, file)
@pytest.mark.parametrize('test_args,expected_ipv6,expected_stdout', [
('', True, 'IPv4 and IPv6'),
('-e "IPv6=True"', True, 'IPv4 and IPv6'),
('-e "IPv6=False"', False, 'IPv4'),
('-e "IPv6=foobar"', False, 'IPv4'),
])
def test_ipv6_not_true_removes_ipv6(docker, slow, test_args, expected_ipv6, expected_stdout):
''' When a user overrides IPv6=True they only get IPv4 listening webservers '''
IPV6_LINE = 'use-ipv6.pl'
WEB_CONFIG = '/etc/lighttpd/lighttpd.conf'
function = docker.run('. /bash_functions.sh ; setup_ipv4_ipv6')
@pytest.mark.parametrize(
"test_args,expected_ipv6,expected_stdout",
[
("", True, "IPv4 and IPv6"),
('-e "IPv6=True"', True, "IPv4 and IPv6"),
('-e "IPv6=False"', False, "IPv4"),
('-e "IPv6=foobar"', False, "IPv4"),
],
)
def test_ipv6_not_true_removes_ipv6(
docker, slow, test_args, expected_ipv6, expected_stdout
):
"""When a user overrides IPv6=True they only get IPv4 listening webservers"""
IPV6_LINE = "use-ipv6.pl"
WEB_CONFIG = "/etc/lighttpd/lighttpd.conf"
function = docker.run(". /usr/local/bin/bash_functions.sh ; setup_ipv4_ipv6")
assert "Using {}".format(expected_stdout) in function.stdout
if expected_stdout == 'IPv4':
assert 'IPv6' not in function.stdout
if expected_stdout == "IPv4":
assert "IPv6" not in function.stdout
# On overlay2(?), docker writes to disk are sometimes slow enough to break some tests...
expected_ipv6_check = lambda: (\
IPV6_LINE in docker.run('grep \'use-ipv6.pl\' {}'.format(WEB_CONFIG)).stdout
) == expected_ipv6
expected_ipv6_check = (
lambda: (
IPV6_LINE in docker.run("grep 'use-ipv6.pl' {}".format(WEB_CONFIG)).stdout
)
== expected_ipv6
)
slow(expected_ipv6_check)
@pytest.mark.parametrize('test_args', ['-e "WEB_PORT=999"'])
@pytest.mark.parametrize("test_args", ['-e "WEB_PORT=999"'])
def test_overrides_default_web_port(docker, slow, test_args):
''' When a --net=host user sets WEB_PORT to avoid synology's 80 default IPv4 and or IPv6 ports are updated'''
CONFIG_LINE = r'server.port\s*=\s*999'
WEB_CONFIG = '/etc/lighttpd/lighttpd.conf'
"""When a --net=host user sets WEB_PORT to avoid synology's 80 default IPv4 and or IPv6 ports are updated"""
CONFIG_LINE = r"server.port\s*=\s*999"
WEB_CONFIG = "/etc/lighttpd/lighttpd.conf"
function = docker.run('. /bash_functions.sh ; eval `grep setup_web_port /start.sh`')
assert "Custom WEB_PORT set to 999" in function.stdout
assert "INFO: Without proper router DNAT forwarding to 127.0.0.1:999, you may not get any blocked websites on ads" in function.stdout
slow(lambda: re.search(CONFIG_LINE, docker.run(_cat(WEB_CONFIG)).stdout) != None)
function = docker.run(
". /usr/local/bin/bash_functions.sh ; eval `grep setup_web_port /usr/local/bin/_startup.sh`"
)
assert " [i] Custom WEB_PORT set to 999" in function.stdout
assert (
" [i] Without proper router DNAT forwarding to 127.0.0.1:999, you may not get any blocked websites on ads"
in function.stdout
)
slow(
lambda: re.search(CONFIG_LINE, docker.run(_cat(WEB_CONFIG)).stdout) is not None
)
@pytest.mark.parametrize('test_args,expected_error', [
('-e WEB_PORT="LXXX"', 'WARNING: Custom WEB_PORT not used - LXXX is not an integer'),
('-e WEB_PORT="1,000"', 'WARNING: Custom WEB_PORT not used - 1,000 is not an integer'),
('-e WEB_PORT="99999"', 'WARNING: Custom WEB_PORT not used - 99999 is not within valid port range of 1-65535'),
])
@pytest.mark.parametrize(
"test_args,expected_error",
[
(
'-e WEB_PORT="LXXX"',
"WARNING: Custom WEB_PORT not used - LXXX is not an integer",
),
(
'-e WEB_PORT="1,000"',
"WARNING: Custom WEB_PORT not used - 1,000 is not an integer",
),
(
'-e WEB_PORT="99999"',
"WARNING: Custom WEB_PORT not used - 99999 is not within valid port range of 1-65535",
),
],
)
def test_bad_input_to_web_port(docker, test_args, expected_error):
function = docker.run('. /bash_functions.sh ; eval `grep setup_web_port /start.sh`')
function = docker.run(
". /usr/local/bin/bash_functions.sh ; eval `grep setup_web_port /usr/local/bin/_startup.sh`"
)
assert expected_error in function.stdout
@pytest.mark.parametrize('test_args,cache_size', [('-e CUSTOM_CACHE_SIZE="0"', '0'), ('-e CUSTOM_CACHE_SIZE="20000"', '20000')])
@pytest.mark.parametrize(
"test_args,cache_size",
[('-e CUSTOM_CACHE_SIZE="0"', "0"), ('-e CUSTOM_CACHE_SIZE="20000"', "20000")],
)
def test_overrides_default_custom_cache_size(docker, slow, test_args, cache_size):
''' Changes the cache_size setting to increase or decrease the cache size for dnsmasq'''
CONFIG_LINE = r'cache-size\s*=\s*{}'.format(cache_size)
"""Changes the cache_size setting to increase or decrease the cache size for dnsmasq"""
CONFIG_LINE = r"cache-size\s*=\s*{}".format(cache_size)
function = docker.run('echo ${CUSTOM_CACHE_SIZE};. ./bash_functions.sh; echo ${CUSTOM_CACHE_SIZE}; eval `grep setup_FTL_CacheSize /start.sh`')
function = docker.run(
"echo ${CUSTOM_CACHE_SIZE};. ./usr/local/bin/bash_functions.sh; echo ${CUSTOM_CACHE_SIZE}; eval `grep setup_FTL_CacheSize /usr/local/bin/_startup.sh`"
)
assert "Custom CUSTOM_CACHE_SIZE set to {}".format(cache_size) in function.stdout
slow(lambda: re.search(CONFIG_LINE, docker.run(_cat(DNSMASQ_CONFIG_LOC)).stdout) != None)
slow(
lambda: re.search(CONFIG_LINE, docker.run(_cat(DNSMASQ_CONFIG_LOC)).stdout)
is not None
)
@pytest.mark.parametrize('test_args', [
'-e CUSTOM_CACHE_SIZE="-1"',
'-e CUSTOM_CACHE_SIZE="1,000"',
])
@pytest.mark.parametrize(
"test_args",
[
'-e CUSTOM_CACHE_SIZE="-1"',
'-e CUSTOM_CACHE_SIZE="1,000"',
],
)
def test_bad_input_to_custom_cache_size(docker, slow, test_args):
CONFIG_LINE = r'cache-size\s*=\s*10000'
CONFIG_LINE = r"cache-size\s*=\s*10000"
docker.run(CMD_SETUP_FTL_CACHESIZE)
slow(lambda: re.search(CONFIG_LINE, docker.run(_cat(DNSMASQ_CONFIG_LOC)).stdout) != None)
slow(
lambda: re.search(CONFIG_LINE, docker.run(_cat(DNSMASQ_CONFIG_LOC)).stdout)
is not None
)
@pytest.mark.parametrize('test_args', [
'-e DNSSEC="true" -e CUSTOM_CACHE_SIZE="0"',
])
@pytest.mark.parametrize(
"test_args",
[
'-e DNSSEC="true" -e CUSTOM_CACHE_SIZE="0"',
],
)
def test_dnssec_enabled_with_custom_cache_size(docker, slow, test_args):
CONFIG_LINE = r'cache-size\s*=\s*10000'
CONFIG_LINE = r"cache-size\s*=\s*10000"
docker.run(CMD_SETUP_FTL_CACHESIZE)
slow(lambda: re.search(CONFIG_LINE, docker.run(_cat(DNSMASQ_CONFIG_LOC)).stdout) != None)
slow(
lambda: re.search(CONFIG_LINE, docker.run(_cat(DNSMASQ_CONFIG_LOC)).stdout)
is not None
)
@pytest.mark.parametrize('args_env, expected_stdout, expected_config_line', [
('', 'binding to default interface: eth0', 'PIHOLE_INTERFACE=eth0'),
('-e INTERFACE="br0"', 'binding to custom interface: br0', 'PIHOLE_INTERFACE=br0'),
])
def test_dns_interface_override_defaults(docker, slow, args_env, expected_stdout, expected_config_line):
''' When INTERFACE environment var is passed in, overwrite dnsmasq interface '''
@pytest.mark.parametrize(
"args_env, expected_stdout, expected_config_line",
[
("", "binding to default interface: eth0", "PIHOLE_INTERFACE=eth0"),
(
'-e INTERFACE="br0"',
"binding to custom interface: br0",
"PIHOLE_INTERFACE=br0",
),
],
)
def test_dns_interface_override_defaults(
docker, slow, args_env, expected_stdout, expected_config_line
):
"""When INTERFACE environment var is passed in, overwrite dnsmasq interface"""
function = docker.run(CMD_SETUP_FTL_INTERFACE)
assert expected_stdout in function.stdout
slow(lambda: expected_config_line + '\n' == docker.run('grep "^PIHOLE_INTERFACE" {}'.format(SETUPVARS_LOC)).stdout)
slow(
lambda: expected_config_line + "\n"
== docker.run('grep "^PIHOLE_INTERFACE" {}'.format(SETUPVARS_LOC)).stdout
)
expected_debian_lines = [
'"VIRTUAL_HOST" => "127.0.0.1"',
'"PHP_ERROR_LOG" => "/var/log/lighttpd/error-pihole.log"'
'"PHP_ERROR_LOG" => "/var/log/lighttpd/error-pihole.log"',
]
@pytest.mark.parametrize('expected_lines,repeat_function', [
(expected_debian_lines, 1),
(expected_debian_lines, 2)
])
@pytest.mark.parametrize(
"expected_lines,repeat_function",
[(expected_debian_lines, 1), (expected_debian_lines, 2)],
)
def test_debian_setup_php_env(docker, expected_lines, repeat_function):
''' confirm all expected output is there and nothing else '''
"""confirm all expected output is there and nothing else"""
for _ in range(repeat_function):
docker.run('. /bash_functions.sh ; eval `grep setup_php_env /start.sh`').stdout
docker.run(
". /usr/local/bin/bash_functions.sh ; eval `grep setup_php_env /usr/local/bin/_startup.sh`"
)
for expected_line in expected_lines:
search_config_cmd = "grep -c '{}' /etc/lighttpd/conf-enabled/15-fastcgi-php.conf".format(expected_line)
search_config_cmd = (
"grep -c '{}' /etc/lighttpd/conf-enabled/15-fastcgi-php.conf".format(
expected_line
)
)
search_config_count = docker.run(search_config_cmd)
found_lines = int(search_config_count.stdout.rstrip('\n'))
found_lines = int(search_config_count.stdout.rstrip("\n"))
if found_lines > 1:
assert False, f'Found line {expected_line} times (more than once): {found_lines}'
assert (
False
), f"Found line {expected_line} times (more than once): {found_lines}"
def test_webpassword_random_generation(docker):
''' When a user sets webPassword env the admin password gets set to that '''
"""When a user sets webPassword env the admin password gets set to that"""
function = docker.run(CMD_SETUP_WEB_PASSWORD)
assert 'assigning random password' in function.stdout.lower()
assert "assigning random password" in function.stdout.lower()
@pytest.mark.parametrize('entrypoint,cmd', [('--entrypoint=tail','-f /dev/null')])
@pytest.mark.parametrize('args_env,secure,setupvars_hash', [
('-e WEBPASSWORD=login', True, 'WEBPASSWORD=6060d59351e8c2f48140f01b2c3f3b61652f396c53a5300ae239ebfbe7d5ff08'),
('-e WEBPASSWORD=""', False, ''),
])
def test_webpassword_env_assigns_password_to_file_or_removes_if_empty(docker, args_env, secure, setupvars_hash):
''' When a user sets webPassword env the admin password gets set or removed if empty '''
@pytest.mark.parametrize("entrypoint,cmd", [("--entrypoint=tail", "-f /dev/null")])
@pytest.mark.parametrize(
"args_env,secure,setupvars_hash",
[
(
"-e WEBPASSWORD=login",
True,
"WEBPASSWORD=6060d59351e8c2f48140f01b2c3f3b61652f396c53a5300ae239ebfbe7d5ff08",
),
('-e WEBPASSWORD=""', False, ""),
],
)
def test_webpassword_env_assigns_password_to_file_or_removes_if_empty(
docker, args_env, secure, setupvars_hash
):
"""When a user sets webPassword env the admin password gets set or removed if empty"""
function = docker.run(CMD_SETUP_WEB_PASSWORD)
if secure:
assert 'new password set' in function.stdout.lower()
assert "new password set" in function.stdout.lower()
assert docker.run(_grep(setupvars_hash, SETUPVARS_LOC)).rc == 0
else:
assert 'password removed' in function.stdout.lower()
assert docker.run(_grep('^WEBPASSWORD=$', SETUPVARS_LOC)).rc == 0
assert "password removed" in function.stdout.lower()
assert docker.run(_grep("^WEBPASSWORD=$", SETUPVARS_LOC)).rc == 0
@pytest.mark.parametrize('entrypoint,cmd', [('--entrypoint=tail','-f /dev/null')])
@pytest.mark.parametrize('test_args', ['-e WEBPASSWORD=login', '-e WEBPASSWORD=""'])
@pytest.mark.parametrize("entrypoint,cmd", [("--entrypoint=tail", "-f /dev/null")])
@pytest.mark.parametrize("test_args", ["-e WEBPASSWORD=login", '-e WEBPASSWORD=""'])
def test_env_always_updates_password(docker, args_env, test_args):
'''When a user sets the WEBPASSWORD environment variable, ensure it always sets the password'''
"""When a user sets the WEBPASSWORD environment variable, ensure it always sets the password"""
function = docker.run(CMD_SETUP_WEB_PASSWORD)
assert '::: Assigning password defined by Environment Variable' in function.stdout
assert " [i] Assigning password defined by Environment Variable" in function.stdout
@pytest.mark.parametrize('entrypoint,cmd', [('--entrypoint=tail','-f /dev/null')])
@pytest.mark.parametrize("entrypoint,cmd", [("--entrypoint=tail", "-f /dev/null")])
def test_setupvars_trumps_random_password_if_set(docker, args_env, test_args):
'''If a password is already set in setupvars, and no password is set in the environment variable, do not generate a random password'''
docker.run('. /opt/pihole/utils.sh ; addOrEditKeyValPair {} WEBPASSWORD volumepass'.format(SETUPVARS_LOC))
"""If a password is already set in setupvars, and no password is set in the environment variable, do not generate a random password"""
docker.run(
". /opt/pihole/utils.sh ; addOrEditKeyValPair {} WEBPASSWORD volumepass".format(
SETUPVARS_LOC
)
)
function = docker.run(CMD_SETUP_WEB_PASSWORD)
assert 'Pre existing WEBPASSWORD found' in function.stdout
assert docker.run(_grep('WEBPASSWORD=volumepass', SETUPVARS_LOC)).rc == 0
assert "Pre existing WEBPASSWORD found" in function.stdout
assert docker.run(_grep("WEBPASSWORD=volumepass", SETUPVARS_LOC)).rc == 0

View File

@@ -1,19 +1,37 @@
import pytest
import time
''' conftest.py provides the defaults through fixtures '''
''' Note, testinfra builtins don't seem fully compatible with
docker containers (esp. musl based OSs) stripped down nature '''
""" conftest.py provides the defaults through fixtures """
""" Note, testinfra builtins don't seem fully compatible with
docker containers (esp. musl based OSs) stripped down nature """
# If the test runs /start.sh, do not let s6 run it too! Kill entrypoint to avoid race condition/duplicated execution
@pytest.mark.parametrize('entrypoint,cmd', [('--entrypoint=tail','-f /dev/null')])
@pytest.mark.parametrize('args,error_msg,expect_rc', [
('-e FTLCONF_REPLY_ADDR4="1.2.3.z"', "FTLCONF_REPLY_ADDR4 Environment variable (1.2.3.z) doesn't appear to be a valid IPv4 address",1),
('-e FTLCONF_REPLY_ADDR4="1.2.3.4" -e FTLCONF_REPLY_ADDR6="1234:1234:1234:ZZZZ"', "Environment variable (1234:1234:1234:ZZZZ) doesn't appear to be a valid IPv6 address",1),
('-e FTLCONF_REPLY_ADDR4="1.2.3.4" -e FTLCONF_REPLY_ADDR6="kernel"', "ERROR: You passed in IPv6 with a value of 'kernel'",1),
])
def test_ftlconf_reply_addr_invalid_ips_triggers_exit_error(docker, error_msg, expect_rc):
start = docker.run('/start.sh')
# If the test runs /usr/local/bin/_startup.sh, do not let s6 run it too! Kill entrypoint to avoid race condition/duplicated execution
@pytest.mark.parametrize("entrypoint,cmd", [("--entrypoint=tail", "-f /dev/null")])
@pytest.mark.parametrize(
"args,error_msg,expect_rc",
[
(
'-e FTLCONF_LOCAL_IPV4="1.2.3.z"',
"FTLCONF_LOCAL_IPV4 Environment variable (1.2.3.z) doesn't appear to be a valid IPv4 address",
1,
),
(
'-e FTLCONF_LOCAL_IPV4="1.2.3.4" -e FTLCONF_LOCAL_IPV6="1234:1234:1234:ZZZZ"',
"Environment variable (1234:1234:1234:ZZZZ) doesn't appear to be a valid IPv6 address",
1,
),
(
'-e FTLCONF_LOCAL_IPV4="1.2.3.4" -e FTLCONF_LOCAL_IPV6="kernel"',
"ERROR: You passed in IPv6 with a value of 'kernel'",
1,
),
],
)
def test_ftlconf_local_addr_invalid_ips_triggers_exit_error(
docker, error_msg, expect_rc
):
start = docker.run("/usr/local/bin/_startup.sh")
assert start.rc == expect_rc
assert 'ERROR' in start.stdout
assert "ERROR" in start.stdout
assert error_msg in start.stdout