I am using wget to download website content, but wget downloads the files one by one.

How can I make wget download using 4 simultaneous connections?


Current answer

Another program that can do this is axel.

axel -n <NUMBER_OF_CONNECTIONS> URL

For basic HTTP authentication:

axel -n <NUMBER_OF_CONNECTIONS> "https://user:password@domain.tld/path/file.ext"

See the Ubuntu man page for details.
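For instance, to fetch one file over 4 connections (the URL here is just a placeholder):

axel -n 4 https://example.com/path/file.iso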

Other answers

Call wget once for each link and set it to run in the background.

I tried this Python code:

with open('links.txt', 'r') as f1:     # Open links.txt for reading
  list_1 = f1.read().splitlines()      # Get every line in links.txt
for i in list_1:                       # Iterate over each link
  !wget "$i" -bq                       # Call wget in background mode (the "!" prefix is IPython/Jupyter shell syntax)

Parameters:

      -b - run in the background
      -q - quiet mode (no output)
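If you are not in an IPython session, the same idea works straight from the shell (a minimal sketch, assuming links.txt holds one URL per line):

while read -r url; do
  wget -bq "$url"        # -b makes wget fork into the background immediately
done < links.txt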

Wget cannot download over multiple connections; instead, you can try another program such as aria2.
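For example, aria2 can split a single download across several connections to the same server (a sketch; the URL is a placeholder and 4 matches the connection count in the question):

aria2c -x 4 -s 4 http://example.com/path/file.iso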

I found (what may be) a solution:

In the process of downloading a few thousand log files from one server to the next I suddenly had the need to do some serious multithreaded downloading in BSD, preferably with Wget as that was the simplest way I could think of handling this. A little looking around led me to this little nugget:

wget -r -np -N [url] & wget -r -np -N [url] & wget -r -np -N [url] & wget -r -np -N [url]

Just repeat the wget -r -np -N [url] for as many threads as you need... Now given this isn't pretty and there are surely better ways to do this but if you want something quick and dirty it should do the trick...

Note: the -N option makes wget download only "newer" files, which means it will not overwrite or re-download files unless their timestamp on the server has changed.
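The same quick-and-dirty idea as a one-liner, with a wait so the shell blocks until all four recursive runs have finished ([url] stays whatever you are mirroring):

for i in 1 2 3 4; do wget -r -np -N [url] & done; wait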

A new (but not yet released) tool is Mget. It already shares many options with Wget and comes with a library that lets you easily embed (recursive) downloading into your own application.

To answer your question:

mget --num-threads=4 [url]

Update

Mget is now developed as Wget2, with many bugs fixed and more features added (e.g. HTTP/2 support).

--num-threads is now --max-threads.
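With Wget2 that becomes (assuming the binary is installed as wget2):

wget2 --max-threads=4 [url]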

make can be parallelized easily (e.g. make -j 4). For example, here is a simple Makefile I am using to download files in parallel with wget:

BASE=http://www.somewhere.com/path/to
# Build one "<name>.ext" target per line from the first column of filelist.txt
FILES=$(shell awk '{printf "%s.ext\n", $$1}' filelist.txt)
LOG=download.log

all: $(FILES)
	echo $(FILES)

# Each file is its own target, so "make -j 4" runs up to four wget processes at once
# (recipe lines must be indented with a tab)
%.ext:
	wget -N -a $(LOG) $(BASE)/$@

.PHONY: all
default: all
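For reference, this assumes filelist.txt holds one base name (without the .ext suffix) in its first column, e.g. the hypothetical contents:

file1
file2
file3

and the whole thing is driven with:

make -j 4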