Sometimes the script can take an hour to complete because the network is painfully slow and the test files are over 200MB.
Adding a --max-time option to the curl command on line 28 would greatly decrease the chances of the script hanging.
According to the curl manual:
-m, --max-time <seconds>
Maximum time in seconds that you allow the whole operation to take. This is useful for preventing your batch jobs from hanging for hours due to slow networks or links going down.
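For anyone unfamiliar with the option, here is a minimal sketch of what hitting the limit looks like; the 30-second value and the URL are placeholders, not taken from the script:

# Hypothetical example: cap the whole transfer at 30 seconds.
curl -4 -s -L --max-time 30 -o /dev/null "https://example.com/testfile"
# curl aborts the transfer and exits with code 28 when the time limit is exceeded.
if [ $? -eq 28 ]; then
    echo "transfer timed out" >&2
fi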
I second (or third) this change as well. I use a local copy of the script that includes such changes, and I have mine set to spend no more than 30 seconds per route test. At least for my connection, 30 seconds is enough time to observe a slow connection.
I think 30 seconds is too aggressive for general distribution, to be honest. Some poor connections need several minutes to actually average out a real speed. I would say 2-3 minutes minimum would be safe.
Currently, the line in the script is:
messyspeed=$(echo -n "scale=2; " && curl -4 -s -L ${test_files[$count]} -w "%{speed_download}" -o /dev/null | sed "s/\,/\./g")
The changed line would read like this:
messyspeed=$(echo -n "scale=2; " && curl -4 -s -m 180 -L ${test_files[$count]} -w "%{speed_download}" -o /dev/null | sed "s/\,/\./g")
The option -m 180 specifies a timeout of 180 seconds, or 3 minutes.
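Since opinions differ on a good default (30 seconds versus 2-3 minutes), the value could also be made configurable instead of hard-coded. A sketch of that variant, assuming a hypothetical MAX_TIME environment variable that is not part of the current script:

# Use MAX_TIME from the environment if it is set, otherwise fall back to 180 seconds.
max_time="${MAX_TIME:-180}"
messyspeed=$(echo -n "scale=2; " && curl -4 -s -m "$max_time" -L ${test_files[$count]} -w "%{speed_download}" -o /dev/null | sed "s/\,/\./g")

Users on very slow links could then raise the limit by setting MAX_TIME before running the script, without editing the file.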