Use the local computer index if possible #131
Comments
Well... In the case of apt metadata, it is not that simple. We'd have to extract the base signed file, which stores all the checksums, then correlate it with the unpacked data, which carry different names, and then see whether anything is worth reusing. It is a very unusual use case.
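To illustrate the mismatch described here (the repository paths below are examples only): the signed InRelease file lists index checksums by repository-relative path, while the unpacked copies under /var/lib/apt/lists are stored under flattened, URL-derived names.

```sh
# Example only: compare how the signed metadata names its index files
# with how the unpacked copies are named on disk.
RELEASE=/var/lib/apt/lists/deb.debian.org_debian_dists_stable_InRelease   # example path

# Checksum entries reference repository-relative paths,
# e.g. "main/binary-amd64/Packages.xz".
grep -E '^ [0-9a-f]{64} ' "$RELEASE" | head -n 3

# The unpacked index files use flattened, URL-derived names instead.
ls /var/lib/apt/lists/ | head -n 3
```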
Or the other way would be to just pick everything as is and then, on the no-network machine, simply run it. If the files are tampered with, the signatures would be invalid, so this wouldn't really be a problem.

Sounds like a decent feature, but I won't have any near-term time to get this done. If anyone else would like to propose this as a PR, that's welcome.
I have a script to apply apt-offline on multiple machines, and it happens that the "host" machine (the one connected to the internet) has the same (or almost the same) archives list as the "no-network" machines. What I do: I run […]. Of course, the ideal would be to implement #29.
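The exact commands of that script are not shown; purely as an illustration, a loop of this shape can reuse one host-generated bundle across machines that genuinely share the same sources list and architecture (host names and paths are placeholders):

```sh
#!/bin/sh
# Illustration only: generate one signature and bundle on the connected host,
# then push the same bundle to several near-identical offline machines.
# "node1 node2" and all paths are placeholders.
set -e

sudo apt-offline set /tmp/host.sig --update --upgrade
apt-offline get /tmp/host.sig --bundle /tmp/host-bundle.zip

for node in node1 node2; do
    scp /tmp/host-bundle.zip "$node":/tmp/
    ssh "$node" "sudo apt-offline install /tmp/host-bundle.zip && sudo apt-get -y upgrade"
done
```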
This would be a great feature. I am running Ubuntu 20.04 and run several VMs of the same. It is becoming wasteful to re-download updates for all my VMs, which are just one update behind the host OS. Can apt-offline take a folder (for example […])?
I'm going to mark this closed. But I'll give you my thoughts on the best way to achieve a workflow where you have a heterogeneous setup.

Machine 1: Ubuntu Blah
[…]

Given that this workflow will fulfill the use case, I see no point in over-engineering this.
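For reference, the usual apt-offline round trip that this kind of per-machine workflow builds on looks roughly like the following; it is only a sketch, and the file names are placeholders.

```sh
# 1. On the disconnected machine: record what that machine needs.
sudo apt-offline set /tmp/machine1.sig --update --upgrade

# 2. On the connected machine: download exactly what was requested,
#    bundled into a single archive.
apt-offline get /tmp/machine1.sig --bundle /tmp/machine1-bundle.zip

# 3. Back on the disconnected machine: feed the data to APT and upgrade.
sudo apt-offline install /tmp/machine1-bundle.zip
sudo apt-get upgrade
```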
I think you missed a point, @rickysarraf. The workflow you provided does save internet data, but there's still one part left. Can apt-offline check what is already present locally and download only what is missing, instead of fetching everything again?
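As an illustration only, and not existing apt-offline behaviour, the requested check could look something like this; the pool directory and package list are placeholders:

```sh
# Hypothetical sketch: before downloading, check whether a previously fetched
# copy of each wanted .deb already exists in a shared local folder.
POOL=/srv/apt-offline-pool          # placeholder: folder shared between runs
WANTED=/tmp/wanted-packages.txt     # placeholder: one .deb filename per line

while read -r deb; do
    if [ -e "$POOL/$deb" ]; then
        echo "reuse  $POOL/$deb"    # already on disk, no network needed
    else
        echo "fetch  $deb"          # only this one would hit the network
    fi
done < "$WANTED"
```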
This would significantly reduce internet consumption, especially for a more homogeneous setup. It'd be great if you could provide a workflow (if this doesn't seem like a worthy feature).
No. That […]

You are picking a very, very corner case. But I do have the same scenario, and I'll document what I do to make the best use of every bit of the internet traffic I have.
On the network machine, set up a caching proxy ([…]).

Now, when you invoke apt-offline get, route its downloads through that proxy; anything fetched once is then served from the proxy's cache instead of being downloaded again.
That should cover your last request about optimum utilization of network resources. I use a setup derived from the same idea. Keep in mind that the mentioned proxies do have some issues (on the remote proxy side) when used in combination with […]. And if you come across any other proxies that work in this setup, please do share them with me too. I'm always on the lookout for new proxy servers for this use case.
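As a concrete illustration of the proxy approach: the comment does not name a specific proxy, so apt-cacher-ng is assumed here as one common choice, and it is also assumed that apt-offline honours the standard http_proxy environment variable (the host and port below are the apt-cacher-ng defaults):

```sh
# On the network machine: install and start a caching proxy.
sudo apt-get install apt-cacher-ng          # listens on port 3142 by default

# Route apt-offline's downloads through it.
export http_proxy=http://127.0.0.1:3142
apt-offline get /tmp/machine1.sig --bundle /tmp/machine1-bundle.zip

# A later run for another, similar machine is then largely served from the
# proxy's cache rather than from the internet.
apt-offline get /tmp/machine2.sig --bundle /tmp/machine2-bundle.zip
```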
Original issue description: As it is already possible for .deb files with --cache-dir, it would be nice to have the possibility to use the index files in /var/lib/apt/lists to avoid re-downloading them. This would allow benefiting from the patch/delta update algorithm of apt (see #29) without implementing it, provided the two computers share some source lists.
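For context, this is roughly how the existing .deb reuse via --cache-dir is invoked, alongside a purely hypothetical flag that illustrates the requested index-file counterpart (the commented-out option does not exist in apt-offline):

```sh
# Existing behaviour: .debs already present in the cache directory are reused
# instead of being downloaded again.
apt-offline get /tmp/machine1.sig --bundle /tmp/bundle.zip \
    --cache-dir /var/cache/apt/archives

# Requested (hypothetical, not a real option): reuse index files the same way.
# apt-offline get /tmp/machine1.sig --bundle /tmp/bundle.zip \
#     --lists-dir /var/lib/apt/lists
```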