Scrapy twisted.web.error.SchemeNotSupported: Unsupported scheme: b'' (error and solution)
阿新 • Published: 2019-01-08
Problem description:
When using an IP proxy in a downloader middleware, the following error is raised:
2019-01-05 21:16:15 [scrapy.core.scraper] ERROR: Error downloading <GET http://httpbin.org/ip>
Traceback (most recent call last):
  File "e:\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1416, in _inlineCallbacks
    result = result.throwExceptionIntoGenerator(g)
  File "e:\anaconda3\lib\site-packages\twisted\python\failure.py", line 491, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request, spider=spider)))
  File "e:\anaconda3\lib\site-packages\scrapy\utils\defer.py", line 45, in mustbe_deferred
    result = f(*args, **kw)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 65, in download_request
    return handler.download_request(request, spider)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 67, in download_request
    return agent.download_request(request)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 331, in download_request
    method, to_bytes(url, encoding='ascii'), headers, bodyproducer)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 252, in request
    proxyEndpoint = self._getEndpoint(self._proxyURI)
  File "e:\anaconda3\lib\site-packages\twisted\web\client.py", line 1635, in _getEndpoint
    return self._endpointFactory.endpointForURI(uri)
  File "e:\anaconda3\lib\site-packages\twisted\web\client.py", line 1513, in endpointForURI
    raise SchemeNotSupported("Unsupported scheme: %r" % (uri.scheme,))
twisted.web.error.SchemeNotSupported: Unsupported scheme: b''
The middleware code is as follows:
class IpProxyDownloaderMiddleware(object):
    def __init__(self):
        self.proxy_list = [
            "39.137.107.98:80",
            "111.177.190.124:9999",
            "175.25.26.184:10800"
        ]

    def process_request(self, request, spider):
        proxy = random.choice(self.proxy_list)
        # print("proxy:", proxy)
        request.meta["proxy"] = proxy
The settings file:
# Enable or disable downloader middlewares
# See https://doc.scrapy.org/en/latest/topics/downloader-middleware.html
DOWNLOADER_MIDDLEWARES = {
    'Useragent.middlewares.UseragentDownloaderMiddleware': 543,
    'Useragent.middlewares.IpProxyDownloaderMiddleware': 300,
}
Cause analysis:
The proxy addresses set in the middleware are missing a scheme prefix such as "https://". Scrapy's HTTP download handler parses request.meta["proxy"] as a URI, and with no scheme present Twisted sees an empty scheme (b'') and raises SchemeNotSupported. Since the proxies I use are all high-anonymity HTTPS proxies, they need the "https://" prefix. The corrected code:
import random


class IpProxyDownloaderMiddleware(object):
    def __init__(self):
        self.proxy_list = [
            "39.137.107.98:80",
            "111.177.190.124:9999",
            "175.25.26.184:10800"
        ]

    def process_request(self, request, spider):
        proxy = random.choice(self.proxy_list)
        # print("proxy:", proxy)
        # The scheme prefix is what fixes the SchemeNotSupported error
        request.meta["proxy"] = "https://" + proxy
Re-run the spider, and it succeeds!
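A slightly more defensive variant is also possible. This is my own sketch, not from the original post: the `normalize_proxy` helper and the mixed-format proxy list are hypothetical. The idea is to prepend a scheme only when an entry lacks one, so proxies listed with or without a scheme both end up as full URLs, and entries that already carry "http://" or "https://" are not double-prefixed:

```python
import random


def normalize_proxy(proxy, default_scheme="https"):
    # Hypothetical helper: prepend a scheme only when the entry lacks one,
    # so "1.2.3.4:80" and "http://1.2.3.4:80" both become full proxy URLs
    # that Twisted's URI parser can handle.
    if "://" not in proxy:
        return "%s://%s" % (default_scheme, proxy)
    return proxy


class IpProxyDownloaderMiddleware(object):
    def __init__(self):
        # Entries may or may not include a scheme; normalize_proxy handles both.
        self.proxy_list = [
            "39.137.107.98:80",
            "http://111.177.190.124:9999",
            "175.25.26.184:10800",
        ]

    def process_request(self, request, spider):
        proxy = normalize_proxy(random.choice(self.proxy_list))
        request.meta["proxy"] = proxy
```

With this, adding new proxies to the list is less error-prone, because forgetting the scheme no longer reproduces the SchemeNotSupported crash.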