
cache.nixos.org is down #29389

Closed
YorikSar opened this issue Sep 14, 2017 · 10 comments

Comments

@YorikSar
Contributor

It seems the us-east-1 region is experiencing some issues right now, and cache.nixos.org consistently returns a "503 SlowDown" error.

Is cache.nixos.org hosted in one region only? Would it be possible to duplicate it in a different region (or a different cloud) for better availability?
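Independently of how the cache itself is hosted, a client-side mitigation is to list more than one substituter so Nix can fall back when one is unreachable. A minimal sketch of the relevant nix.conf setting, assuming a hypothetical mirror at https://mirror.example.org (not a real endpoint); at the time this issue was filed the option was spelled `binary-caches`:

```
# /etc/nix/nix.conf — the mirror URL below is hypothetical
binary-caches = https://cache.nixos.org https://mirror.example.org
```

Running Nix with the `--fallback` flag additionally lets it build from source when no configured cache responds.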

@vcunat
Member

vcunat commented Sep 14, 2017

The cache is in multiple regions. Europe seems OK to me, though I'm not sure if it will help to try cross-region downloads. FWIW:

$ dig cache.nixos.org +short
d3m36hgdyp4koz.cloudfront.net.
13.32.121.5
13.32.121.15
13.32.121.41
13.32.121.49
13.32.121.89
13.32.121.95
13.32.121.175
13.32.121.233

$ dig cache.nixos.org +short AAAA
d3m36hgdyp4koz.cloudfront.net.
2600:9000:20ac:1e00:10:dfa4:b0c0:93a1
2600:9000:20ac:2800:10:dfa4:b0c0:93a1
2600:9000:20ac:2c00:10:dfa4:b0c0:93a1
2600:9000:20ac:6c00:10:dfa4:b0c0:93a1
2600:9000:20ac:7400:10:dfa4:b0c0:93a1
2600:9000:20ac:ba00:10:dfa4:b0c0:93a1
2600:9000:20ac:ea00:10:dfa4:b0c0:93a1
2600:9000:20ac:fc00:10:dfa4:b0c0:93a1

@bbarker
Contributor

bbarker commented Sep 14, 2017

I was having a lot of trouble (I'm actually in the eastern US, assuming that matters):

$ dig cache.nixos.org +short
d3m36hgdyp4koz.cloudfront.net.
54.230.161.241
54.230.161.105
54.230.161.41
54.230.161.32
54.230.161.242
54.230.161.57
54.230.161.157
54.230.161.76

$ dig cache.nixos.org +short AAAA
d3m36hgdyp4koz.cloudfront.net.
2600:9000:2019:a800:10:dfa4:b0c0:93a1
2600:9000:2019:cc00:10:dfa4:b0c0:93a1
2600:9000:2019:4800:10:dfa4:b0c0:93a1
2600:9000:2019:ee00:10:dfa4:b0c0:93a1
2600:9000:2019:b400:10:dfa4:b0c0:93a1
2600:9000:2019:0:10:dfa4:b0c0:93a1
2600:9000:2019:4c00:10:dfa4:b0c0:93a1
2600:9000:2019:a00:10:dfa4:b0c0:93a1

@YorikSar
Contributor Author

"Europe seems OK to me"

Huh, I've been getting US servers from DNS because of an active VPN all this time :)

I wonder if CloudFront can be configured to skip the problem region.

@vcunat
Member

vcunat commented Sep 15, 2017

AWS status shows OK now. I don't think there's much we can do about such rare outages, except that in the long term we might manage to provide some redundancy, e.g. via NixOS/nix#1167 (comment)

@vcunat vcunat closed this as completed Sep 15, 2017
@YorikSar
Contributor Author

@vcunat Oh, IPFS is already in the works, huh? Cool, thanks!

@vcunat
Member

vcunat commented Sep 15, 2017

No one serves nixpkgs binaries via IPFS currently, AFAIK, but I believe we will work it out in time. (There are just too many things to do.)

@YorikSar
Contributor Author

Is there a final conclusion on how to serve the cache over IPFS? I've seen lots of discussion but couldn't find a concrete proposal that is being implemented.

In my head it could be implemented as a simple IPFS "directory" with pretty much the same structure as is currently served over HTTP.

Can you share current stats of the cache, like overall size, number of files, average request rate, and network load?
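For reference, the HTTP layout being discussed is simple: the cache root serves a `nix-cache-info` file, one `<hash>.narinfo` metadata file per store path (keyed by the hash prefix of the store path name), and the compressed NAR archives under `nar/`. A minimal sketch in Python of mapping a store path to its `.narinfo` URL (the example hash and paths below are made up for illustration):

```python
def narinfo_url(cache_root: str, store_path: str) -> str:
    """Map a Nix store path to the .narinfo URL a binary cache serves for it.

    A store path looks like /nix/store/<hash>-<name>; the binary cache
    keys its per-path metadata file by that <hash> prefix alone.
    """
    base = store_path.rstrip("/").rsplit("/", 1)[-1]  # "<hash>-<name>"
    store_hash = base.split("-", 1)[0]                # "<hash>"
    return f"{cache_root.rstrip('/')}/{store_hash}.narinfo"

# Example with a made-up store hash:
print(narinfo_url("https://cache.nixos.org",
                  "/nix/store/wjngvjjybsfbc7kxmxx1clcdqmny87a1-hello-2.10"))
# → https://cache.nixos.org/wjngvjjybsfbc7kxmxx1clcdqmny87a1.narinfo
```

An IPFS mirror could publish exactly this tree (nix-cache-info, *.narinfo, nar/*) as a single IPFS directory, which is essentially the "same structure as HTTP" idea.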

@vcunat
Member

vcunat commented Sep 16, 2017

The link I posted is a complete implementation that has been tested, I believe, though it's not merged into the official nix repository. Here we discussed approximate sizes of the channels (I have no first-hand numbers).

@YorikSar
Contributor Author

@vcunat thanks. Diving into all these threads, I feel totally lost. I guess I'm a complete newbie. The first thing that surprised me was 60G for the total size of packages (really? just 60G? other systems require way more space for mirrors...), and it all went haywire from there. I guess learning this stuff just shows me how much I don't know :)

I think this discussion would be better suited to an email thread than a random GitHub issue, so I'll start a thread there.

@bbarker
Contributor

bbarker commented Sep 16, 2017 via email

Labels: none yet
Projects: none yet
Development: no branches or pull requests
3 participants