
Setting an IP Proxy for a Python Crawler, Explained

1. GET requests: how to add an IP proxy to a crawler and set the Request headers

import urllib.request
import urllib.parse
import random
import time
from fake_useragent import UserAgent
ua = UserAgent()
url = "http://www.baidu.com"
########################################################
'''
Set an IP proxy
iplist = ['127.0.0.1:80']  # find some proxies online yourself
proxy_support = urllib.request.ProxyHandler({'http': random.choice(iplist)})  # 'https' also works, depending on whether your proxy supports it
opener = urllib.request.build_opener(proxy_support)
'''
########################################################
'''No IP proxy'''
opener = urllib.request.build_opener()

'''Check the request headers in F12 (browser DevTools) and add them here; not every header is strictly required ↓↓↓'''
opener.addheaders = [
  ('Host', 'newtab.firefoxchina.cn'),
  ('User-Agent', ua.random),
  ('Accept-Encoding', 'deflate,br'),
  ('Accept', 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8'),
  ('Accept-Language', 'zh-CN,zh;q=0.8,zh-TW;q=0.7,zh-HK;q=0.5,en-US;q=0.3,en;q=0.2'),
  ('Connection', 'keep-alive'),
  ('Upgrade-Insecure-Requests', '1'),  # header values must be strings, not ints
  ('Cookie', '__gads=ID=138080209be66bf8:T=1592037395:S=ALNI_Ma-g9wHmfxFL4GCy9veAjJrJRsNmg; Hm_lvt_dd4738b5fb302cb062ef19107df5d2e4=1592449208,1592471447,1592471736,1594001802; uid=rBADnV7m04mi8wRJK3xYAg=='),
]
urllib.request.install_opener(opener)
while True:
  try:
    response = urllib.request.urlopen(url)
    break
  except Exception as e:
    print("錯誤資訊:" + str(e))
    time.sleep(3)
html = response.read().decode("utf-8")
print(html)
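
A quick way to confirm the proxy is really in effect is to request a service that echoes back the IP address it sees. The sketch below reuses the commented-out ProxyHandler setup from above; the proxy address is only a placeholder, and httpbin.org/ip is used here simply because it returns the caller's IP as JSON.

import json
import random
import urllib.request

iplist = ['127.0.0.1:8080']  # placeholder; substitute proxies you have actually found
proxy_support = urllib.request.ProxyHandler({'http': random.choice(iplist)})
opener = urllib.request.build_opener(proxy_support)
urllib.request.install_opener(opener)

# http://httpbin.org/ip echoes back the origin IP it sees as JSON,
# so the printed address should be the proxy's, not your own
response = urllib.request.urlopen("http://httpbin.org/ip", timeout=10)
print(json.loads(response.read().decode("utf-8")))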

2. POST requests: adding a payload (just an example here). Only the code below urllib.request.install_opener(opener) needs to change.

urllib.request.install_opener(opener)
# data = {}    # when the page submits a payload whose content is empty, you must still pass data = {}, otherwise the page data cannot be retrieved
data = {'_csrf': 'load the payload', 'collection-name': 'fields shown in F12',
    'description': 'in this form'}
data = urllib.parse.urlencode(data).encode('utf-8')
req = urllib.request.Request(url,data)
while True:
  try:
    response = urllib.request.urlopen(req)
    break
  except Exception as e:
    print("錯誤資訊:" + str(e))
    time.sleep(3)
html = response.read().decode("utf-8")
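
If the payload you see in F12 is JSON rather than form data (the request's Content-Type is application/json), urlencode is the wrong encoding; the dict has to be serialized with json.dumps and the header set explicitly. A minimal sketch, with the URL and field names as placeholders:

import json
import urllib.request

url = "http://example.com/api"  # placeholder; use the real endpoint shown in F12
payload = {'collection-name': 'value from F12', 'description': 'value from F12'}  # placeholder fields
data = json.dumps(payload).encode('utf-8')
req = urllib.request.Request(url, data, headers={'Content-Type': 'application/json'})
response = urllib.request.urlopen(req)
html = response.read().decode("utf-8")
print(html)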

That's all for this article. I hope it helps with your studies, and thank you for your continued support.