TLDR: If you run into 503 errors when trying to git pull, try pulling less often.
I’ve noticed an unusually large number of git-related HTTP requests in the nginx logs, which caused a noticeable increase in total traffic sent per month. Each request is only ~16 KB, but at a rate of one every 3-4 seconds per IP it stacks up quickly: going by webalizer, this single URL endpoint accounted for 88% of monthly hits and 36% of overall bytes sent in April. Some of this could be legitimate git traffic, at least partially.
Only three IP addresses originate the vast majority of the above traffic, one of which is registered to DigitalOcean. Maybe they NAT all outbound traffic for their hosted sites behind one address? Maybe some special person out there decided to git pull every second? Who knows, really.
Anyway, for the time being I’m implementing a rate limit that applies when both of the following conditions match: the request URI contains “service=git-upload-pack” and the User-Agent contains “git”.
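For the curious, a minimal sketch of what such a rule can look like in nginx — the location path, zone size, and rate are illustrative assumptions, not my exact values:

```nginx
# Combine URI and User-Agent into one string, and only emit a non-empty
# key when both contain the patterns of interest; requests with an empty
# key are not counted against the limit at all.
map "$request_uri|$http_user_agent" $git_fetch_client {
    default "";
    # Matches "service=git-upload-pack" in the URI part and "git"
    # somewhere after it (in practice, in the User-Agent part).
    "~service=git-upload-pack.*git" $binary_remote_addr;
}

# Allow roughly one matching request per 10 seconds per client IP.
limit_req_zone $git_fetch_client zone=gitfetch:1m rate=6r/m;

server {
    location /git/ {
        limit_req zone=gitfetch burst=5 nodelay;
        limit_req_status 503;
        # ... hand off to git-http-backend or similar here
    }
}
```

Requests over the limit get the 503 mentioned in the TLDR; everything else (browsers, tarball downloads) passes through untouched.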
It seems that on DigitalOcean all downloads over IPv6 hit the limit. If you disable IPv6 connectivity, you’re able to interact with the repository, though.
I don’t know if the rate limiting still applies, but I was getting 50 kbps download speeds when cloning, while my connection should support up to 50 Mbps.
This is my first checkout since the last tarball, so it took several minutes to download the whole 75 MB repo.
Consider running git gc --aggressive on the bare repo on the server. It reduced the size locally to 29 MB, which would make initial checkouts faster even when rate limited.
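On the server that would look roughly like the snippet below. The real path to the bare repo is whatever you host, so the demo builds a throwaway repo in a temp directory to have something concrete to run against:

```shell
set -e
tmp=$(mktemp -d)

# Build a small throwaway repo standing in for the real one.
git init -q "$tmp/work"
git -C "$tmp/work" -c user.email=demo@example.com -c user.name=demo \
    commit --allow-empty -qm "initial commit"

# A bare clone, the way a server would host it.
git clone -q --bare "$tmp/work" "$tmp/project.git"

# Aggressively repack and drop unreachable loose objects.
# On a 75 MB repo this takes a while, but it's a one-time cost on the server.
git -C "$tmp/project.git" gc --aggressive --prune=now

# Show the resulting on-disk size.
du -sh "$tmp/project.git"
```

Since clones fetch the packfile, a smaller pack directly translates into a faster initial checkout, rate limit or not.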