
HTTPConnectionPool(host: XX): Max retries exceeded with url


After a crawler has hit the same site repeatedly for a while, the following error appears: HTTPConnectionPool(host: XX): Max retries exceeded with url: '<requests.packages.urllib3.connection.HTTPConnection object at XXXX>: Failed to establish a new connection: [Errno 99] Cannot assign requested address'

The cause: before each transfer the client has to establish a TCP connection to the server. To save connection overhead, the default is keep-alive, i.e. connect once and transfer many times. But after many requests the connections are never closed and returned to the pool, so the client eventually runs out of local ports and cannot create a new connection.

The Connection entry in headers defaults to keep-alive; setting it to close fixes this:

import requests

# Force the server/client to close the connection after each request
headers = {
    'Connection': 'close',
}
r = requests.get(url, headers=headers)

With this change, the problem is resolved.
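An alternative worth considering (a sketch, not from the original post): instead of closing every connection, reuse a single requests.Session. A Session keeps one connection pool for the whole crawl, so repeated requests to the same host reuse one TCP connection rather than opening, and leaking, a new one each time. The `fetch` helper and the timeout value below are illustrative assumptions.

```python
import requests

# One Session for the whole crawl: its internal connection pool
# reuses keep-alive connections instead of opening a new socket
# per request, which avoids exhausting local ports.
session = requests.Session()

def fetch(url):
    # Illustrative helper (an assumption): all calls go through the
    # shared Session, so requests to the same host share a connection.
    resp = session.get(url, timeout=10)
    resp.raise_for_status()
    return resp.text
```

Which approach fits depends on the workload: `Connection: close` is the simplest fix for occasional requests, while a shared Session is usually faster for a crawler that makes many requests to the same host.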
---------------------
Author: 張濤
Source: CSDN
Original: https://blog.csdn.net/ZTCooper/article/details/80220063
Copyright notice: This is the author's original article; please include a link to the original post when reposting.
