Abuyun

What you need is more than just a good proxy.

(7) How to Run Scrapy Crawlers in a Distributed Setup: On Disguise and Avoiding Bans

Posted by Abuyun

Having a finished crawler get banned once it goes live is a frustrating experience. Most sites now deploy anti-crawling measures, and faking the request headers is usually the first countermeasure people reach for. But is mimicking the headers alone enough? The answer is no. At a minimum, you should open a site manually and use Fiddler or Chrome's F12 developer tools to see exactly what a single real request carries.
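As a concrete illustration of what you will find in those tools, a real browser request carries several headers beyond the User-Agent. Scrapy lets you set project-wide defaults via its DEFAULT_REQUEST_HEADERS setting; the values below are examples of what a browser typically sends, not required values, so copy the real ones from the Network tab of DevTools or from Fiddler:

```python
# settings.py -- illustrative sketch; copy the actual values your target
# browser sends (visible in Chrome DevTools F12 -> Network, or Fiddler).
DEFAULT_REQUEST_HEADERS = {
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Language': 'en-US,en;q=0.5',
    'Accept-Encoding': 'gzip, deflate',
}
```

Scrapy merges these defaults into every request, so the more closely they match a real browser session, the less your traffic stands out.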

 

I won't go into detail on using Chrome or Fiddler; plenty of tutorials cover them. What you will generally see is that a real request carries all sorts of parameters, and if we fill those parameters in, the odds of being banned drop considerably. How to supply these parameters in a Scrapy project is the topic of this article. Before tackling that, let's look back at Scrapy's architecture.

 

From the architecture diagram it is clear that the place to manipulate these parameters is the Downloader Middlewares. You may object: doesn't the Spider have a start_requests method where this code could go?

 

Scrapy's architecture resembles Django's in one respect: the principle of loose coupling. Each component and file should stick to its own job, so request preprocessing belongs in middleware, not in the spider.

Reference: http://doc.scrapy.org/en/latest/topics/downloader-middleware.html

Now let's implement the following feature: on each request, randomly swap in a different request header (User-Agent).

 

 

Add the following User-Agent list to settings.py:

 

 


# User-Agent list to pick from at random
USER_AGENT_LIST = ['zspider/0.9-dev http://feedback.redkolibri.com/',
                   'Xaldon_WebSpider/2.0.b1',
                   'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) Speedy Spider (http://www.entireweb.com/about/search_tech/speedy_spider/)',
                   'Mozilla/5.0 (compatible; Speedy Spider; http://www.entireweb.com/about/search_tech/speedy_spider/)',
                   'Speedy Spider (Entireweb; Beta/1.3; http://www.entireweb.com/about/search_tech/speedyspider/)',
                   'Speedy Spider (Entireweb; Beta/1.2; http://www.entireweb.com/about/search_tech/speedyspider/)',
                   'Speedy Spider (Entireweb; Beta/1.1; http://www.entireweb.com/about/search_tech/speedyspider/)',
                   'Speedy Spider (Entireweb; Beta/1.0; http://www.entireweb.com/about/search_tech/speedyspider/)',
                   'Speedy Spider (Beta/1.0; www.entireweb.com)',
                   'Speedy Spider (http://www.entireweb.com/about/search_tech/speedy_spider/)',
                   'Speedy Spider (http://www.entireweb.com/about/search_tech/speedyspider/)',
                   'Speedy Spider (http://www.entireweb.com)',
                   'Sosospider+(+http://help.soso.com/webspider.htm)',
                   'sogou spider',
                   'Nusearch Spider (www.nusearch.com)',
                   'nuSearch Spider (compatible; MSIE 4.01; Windows NT)',
                   'lmspider (lmspider@scansoft.com)',
                   'lmspider lmspider@scansoft.com',
                   'ldspider (http://code.google.com/p/ldspider/wiki/Robots)',
                   'iaskspider/2.0(+http://iask.com/help/help_index.html)',
                   'iaskspider',
                   'hl_ftien_spider_v1.1',
                   'hl_ftien_spider',
                   'FyberSpider (+http://www.fybersearch.com/fyberspider.php)',
                   'FyberSpider',
                   'everyfeed-spider/2.0 (http://www.everyfeed.com)',
                   'envolk[ITS]spider/1.6 (+http://www.envolk.com/envolkspider.html)',
                   'envolk[ITS]spider/1.6 ( http://www.envolk.com/envolkspider.html)',
                   'Baiduspider+(+http://www.baidu.com/search/spider_jp.html)',
                   'Baiduspider+(+http://www.baidu.com/search/spider.htm)',
                   'BaiDuSpider',
                   'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0) AddSugarSpiderBot www.idealobserver.com',]

 

 

Create the downloader middleware file

 

 

 

Write the code

 

A downloader middleware provides three methods:

 process_request(request, spider)

 process_response(request, response, spider)

 process_exception(request, exception, spider)

As the names suggest, the first preprocesses outgoing requests, the second handles returned responses, and the third is called when an exception is raised while processing a request. The first one, process_request, is the method we need to implement.

Add the following code:

 

Python


__author__ = 'bruce'

from scrapy.utils.project import get_project_settings
import random

settings = get_project_settings()


class ProcessHeaderMidware(object):
    """Process each outgoing request and add request info."""

    def process_request(self, request, spider):
        ua = random.choice(settings.get('USER_AGENT_LIST'))
        spider.logger.info(msg='now entering download midware')
        if ua:
            request.headers['User-Agent'] = ua
            # Log the User-Agent that was actually set on the request.
            spider.logger.info(
                u'User-Agent is : {} {}'.format(request.headers.get('User-Agent'), request)
            )

 

 

We pull a random request header out of the settings and assign it to the request object.

 

Enable the middleware
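Custom downloader middleware is switched on through the DOWNLOADER_MIDDLEWARES setting in settings.py. A minimal sketch follows; the module path 'myproject.middlewares' is an assumption and must be adjusted to wherever you actually saved the ProcessHeaderMidware class:

```python
# settings.py -- 'myproject.middlewares' is a placeholder module path;
# point it at the file containing ProcessHeaderMidware.
DOWNLOADER_MIDDLEWARES = {
    'myproject.middlewares.ProcessHeaderMidware': 543,
}
```

The number is the middleware's position in the pipeline: lower values run closer to the engine, higher values closer to the downloader. 543 is simply a mid-range value that keeps the middleware ordered among Scrapy's built-ins.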

 

 

Running it

 

 

Summary

 

This article analyzed only one common strategy: rotating the request header. Other strategies include attaching cookie information, enabling a random download delay, and rotating proxies, all of which can be implemented in downloader middleware. So use your imagination and build a more durable crawler.
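The proxy-rotation idea mentioned above follows the same pattern as the User-Agent middleware. Here is a minimal sketch; the proxy addresses are placeholders you would replace with your own pool, and it relies on Scrapy's built-in HttpProxyMiddleware honoring request.meta['proxy']:

```python
import random

# Hypothetical proxy pool -- substitute real proxy endpoints here.
PROXY_LIST = [
    'http://127.0.0.1:8118',
    'http://127.0.0.1:8119',
]


class RandomProxyMidware(object):
    """Assign a random proxy to each outgoing request."""

    def process_request(self, request, spider):
        # Scrapy's HttpProxyMiddleware reads this meta key and routes
        # the request through the given proxy.
        request.meta['proxy'] = random.choice(PROXY_LIST)
```

Enable it in DOWNLOADER_MIDDLEWARES just like the header middleware, and each request will go out through a randomly chosen proxy.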

Source: http://brucedone.com/archives/88#comment-177

Abuyun Proxy provides massive, fast, highly anonymous HTTPS proxy IPs, along with professional dynamic proxy IP services such as HTTP proxies, HTTPS proxies, SOCKS proxies, dynamic proxies, and crawler proxies. Abuyun: professional, and therefore simple.
https://www.abuyun.com