七仔's Blog

GitHub Pages mirror of 七仔's Blog

Adding a User-Agent Pool and a Dynamic IP Proxy Pool to a Scrapy Spider

When scraping, your spider may get blocked. You can reduce the risk by slowing down the crawl rate, rotating the User-Agent, or rotating the IP address. This post records my own scraping experience with the latter two.

Adding a User-Agent Pool

Add the following to settings.py:

USER_AGENT_LIST = [
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1",
"Mozilla/5.0 (X11; CrOS i686 2268.111.0) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1092.0 Safari/536.6",
"Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1090.0 Safari/536.6",
"Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/19.77.34.5 Safari/537.1",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.9 Safari/536.5",
"Mozilla/5.0 (Windows NT 6.0) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.36 Safari/536.5",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
"Mozilla/5.0 (Windows NT 5.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_0) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
"Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
"Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
"Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
"Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.0 Safari/536.3",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24",
"Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24"
]

Then, also in settings.py, uncomment DOWNLOADER_MIDDLEWARES (replace * with your own project name):

DOWNLOADER_MIDDLEWARES = {
    # Only the downloader middleware goes here; spider middlewares
    # belong in SPIDER_MIDDLEWARES, not DOWNLOADER_MIDDLEWARES.
    '*.middlewares.Job51DownloaderMiddleware': 543,
}

Then, in middlewares.py, add the following to the process_request method of the *DownloaderMiddleware class:

# requires `import random` at the top of middlewares.py
user_agent = random.choice(spider.settings['USER_AGENT_LIST'])
print(user_agent)
request.headers['User-Agent'] = user_agent

Run the spider and check whether a user-agent is printed; if it is, the middleware is working.
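Putting the pieces together, a minimal stand-alone version of such a middleware might look like the sketch below. The class name RandomUserAgentMiddleware is my own choice for illustration, not the name Scrapy generates for your project:

```python
import random


class RandomUserAgentMiddleware:
    """Downloader middleware that rotates the User-Agent on every request.

    Minimal sketch; assumes USER_AGENT_LIST is defined in settings.py
    as shown above.
    """

    def process_request(self, request, spider):
        # Pick a random UA from the pool and attach it to the outgoing request.
        user_agent = random.choice(spider.settings['USER_AGENT_LIST'])
        request.headers['User-Agent'] = user_agent
        # Returning None lets Scrapy continue handling the request normally.
        return None
```

Once registered in DOWNLOADER_MIDDLEWARES, every outgoing request carries a randomly chosen User-Agent from the pool.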

Adding a Dynamic IP Proxy Pool

An IP proxy pool goes a long way toward preventing bans, but free proxies are hard to find and rarely last (in fact, I never found a usable one). Then I discovered Abuyun (阿布云), which offers dynamic IP proxies: registration comes with three free hours at 5 concurrent requests per second, which really delighted me. After that, the same 5 requests/second tier costs about one yuan per hour, which is plenty if you don't need huge data volumes; each additional 5 requests/second costs roughly 0.4 to 0.5 yuan. This is not an ad. ( ̄▽ ̄)

Here are the concrete steps:

Add the following at the top of middlewares.py:

# Abuyun dynamic IP proxy
import base64
# Proxy server
proxyServer = "http://http-dyn.abuyun.com:9020"
# Proxy tunnel credentials
proxyUser = "fill in the username you got from Abuyun"
proxyPass = "fill in the password you got from Abuyun"
proxyAuth = "Basic " + base64.urlsafe_b64encode(bytes((proxyUser + ":" + proxyPass), "ascii")).decode("utf8")
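As a sanity check, here is what that header value comes out to for a pair of made-up placeholder credentials (demo/secret, purely illustrative):

```python
import base64

# Placeholder credentials, purely illustrative.
user, password = "demo", "secret"
# Same construction as above: base64-encode "user:pass" and prefix "Basic ".
auth = "Basic " + base64.urlsafe_b64encode(
    bytes((user + ":" + password), "ascii")).decode("utf8")
print(auth)  # → Basic ZGVtbzpzZWNyZXQ=
```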

Then add the following to the process_request method of the *DownloaderMiddleware class (note that spider middlewares have no process_request method; this belongs in the downloader middleware):

request.meta["proxy"] = proxyServer
request.headers["Proxy-Authorization"] = proxyAuth
print(proxyServer)
print(proxyAuth)

Run the spider and check whether the proxy server and authorization header are printed.
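For reference, the proxy logic can also be collected into a self-contained downloader middleware of its own. The class name AbuyunProxyMiddleware and the placeholder credentials below are mine, for illustration only:

```python
import base64


class AbuyunProxyMiddleware:
    """Downloader middleware that routes every request through the
    Abuyun dynamic proxy tunnel. Sketch only; use your real credentials."""

    proxy_server = "http://http-dyn.abuyun.com:9020"

    def __init__(self, user="your-abuyun-user", password="your-abuyun-pass"):
        # Build the Proxy-Authorization header value once, up front.
        self.proxy_auth = "Basic " + base64.urlsafe_b64encode(
            bytes((user + ":" + password), "ascii")).decode("utf8")

    def process_request(self, request, spider):
        # Point the request at the proxy and attach the tunnel credentials.
        request.meta["proxy"] = self.proxy_server
        request.headers["Proxy-Authorization"] = self.proxy_auth
```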

This is the author's secondary blog; please leave comments on the main blog. When reposting, please credit the source: https://www.baby7blog.com/myBlog/63.html

Feel free to follow my other publishing channels.