Describe the problem you’re having:
Fetching feeds from sites using Let’s Encrypt certs doesn’t work. Debugging such a feed produces the following output:
[19:05:42/13871] start
[19:05:42/13871] running HOOK_FETCH_FEED handlers...
[19:05:42/13871] ... Af_Comics
[19:05:42/13871] === 0.0002 (sec)
[19:05:42/13871] feed data has not been modified by a plugin.
[19:05:42/13871] local cache will not be used for this feed
[19:05:42/13871] last unconditional update request: 2020-04-08 08:30:55
[19:05:42/13871] maximum allowed interval for conditional requests exceeded, forcing refetch
[19:05:42/13871] fetching [https://www.laufzeit.de/feed/] (force_refetch: 1)...
[19:05:42/13871] fetch done.
[19:05:42/13871] source last modified:
[19:05:42/13871] unable to fetch: ; 60 SSL certificate problem: unable to get local issuer certificate [0]
If possible include steps to reproduce the problem:
Subscribe to an RSS feed where the site uses Let’s Encrypt certs for HTTPS connections.
tt-rss version (including git commit id):
latest origin master
Certificate validation occurs within the curl module, which uses the operating system’s list of known and trusted certificate authorities. Try making the request manually with curl over SSH; you’ll probably get the same error. From there you can update your operating system or manually add the necessary root CA certificates from Let’s Encrypt.
Having had a play on two different systems, both wget and curl have problems with the feed URL.
$ curl -LI https://www.laufzeit.de/feed
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.haxx.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
$ wget https://www.laufzeit.de/feed -O /dev/null
--2020-04-11 22:48:28-- https://www.laufzeit.de/feed
Resolving www.laufzeit.de (www.laufzeit.de)... 45.146.172.25
Connecting to www.laufzeit.de (www.laufzeit.de)|45.146.172.25|:443... connected.
ERROR: cannot verify www.laufzeit.de's certificate, issued by ‘CN=Let's Encrypt Authority X3,O=Let's Encrypt,C=US’:
Unable to locally verify the issuer's authority.
To connect to www.laufzeit.de insecurely, use `--no-check-certificate'.
Firefox is quite happy with it however.
Had a quick look at the certificates, and I don’t think it’s a root CA problem - one of my own sites uses the same chain as theirs* - so I can only conclude there’s something wrong with the site certificate.
* root[DST Root CA X3] + intermediary[Let’s Encrypt Authority X3]
tl;dr the server needs to provide the intermediate CA cert in addition to the server cert.
the problem is that the server only provides its cert and doesn’t bundle the Let's Encrypt Authority X3 intermediate CA cert. this can be discerned from the openssl output: 1. only the server cert is displayed, ie -----BEGIN CERTIFICATE-----, and 2. openssl complains that the server cert’s issuer cert is missing (because it was neither provided by the server nor found locally, ie unable to get local issuer certificate). compare this to openssl s_client -showcerts -connect www.google.com:443 </dev/null where two different certs are printed: 1. the server’s cert and 2. the/an intermediate CA cert.
best practice is for the server to bundle the server and intermediate CA certs together and require the client to have the root CA, eg /etc/ssl/certs/DST_Root_CA_X3.pem on Debian/Ubuntu, which the client uses to verify the certificate chain.
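a rough sketch of the server-side fix, using dummy stand-in PEM files (with certbot a ready-made `fullchain.pem` already exists under `/etc/letsencrypt/live/<domain>/`; the file names here are placeholders):

```shell
# Dummy stand-ins for the real PEM files, just to illustrate the bundling.
printf -- '-----BEGIN CERTIFICATE-----\n(server cert)\n-----END CERTIFICATE-----\n' > server.crt
printf -- '-----BEGIN CERTIFICATE-----\n(intermediate cert)\n-----END CERTIFICATE-----\n' > intermediate.crt

# Bundle: leaf cert first, then the intermediate(s). This bundle is what
# the web server should serve to clients (nginx: ssl_certificate,
# apache 2.4.8+: SSLCertificateFile).
cat server.crt intermediate.crt > fullchain.pem
grep -c 'BEGIN CERTIFICATE' fullchain.pem
```

the client then only needs the root CA (e.g. `/etc/ssl/certs/DST_Root_CA_X3.pem`) locally to complete the chain.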
cert verification works for firefox because firefox includes the Let's Encrypt Authority X3 intermediate cert in its own certificate store which it uses to verify the server’s lone cert: Preferences → Privacy & Security → View Certificates → Authorities → Digital Signature Trust Co. → Let’s Encrypt Authority X3.
Sigh. It’s the eternal dilemma of a Firefox user. Do I continue to support[1] Mozilla even though they keep doing that kind of thing? Or do I switch to a Chromium-based browser and doom the web to be under Google’s will forever[2]?
[1] Yes, I donate yearly.
[2] Chromium may be open source, but even a company the size of Microsoft can’t practically fork it.
Yeah… I read that and was pretty taken aback, especially coming from an organization that touts privacy as one of its marquee features.
I proxied Microsoft’s new Chromium Edge browser and with the appropriate settings, it basically reports nothing that Windows itself doesn’t already report. (Not to mention it’s faster than Firefox.)
mozilla is actively trying to appear as a privacy-first non-profit while literally not being either. fuck them. at least google is open about what it is, i.e. a cyberpunk-tier evil megacorp.
i’m afraid it already is.
it has a rather questionable implementation of safe browsing:
at the end of the day all browsers - and the content you access via them - are spying platforms. might as well choose one that works best for you.