1. urllib2 code
As the code below shows, we set a custom 'Connection': 'keep-alive' request header to ask the server to keep the connection open after the exchange, i.e. a persistent ("long") connection.
# Test 8: use urllib2 to test Connection: keep-alive
import urllib2

httpHandler = urllib2.HTTPHandler(debuglevel=1)
httpsHandler = urllib2.HTTPSHandler(debuglevel=1)
opener = urllib2.build_opener(httpHandler, httpsHandler)
urllib2.install_opener(opener)

loginHeaders = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Maxthon/4.0 Chrome/30.0.1599.101 Safari/537.36',
    'Referer': '',
    'Connection': 'keep-alive'
}
request = urllib2.Request('', headers=loginHeaders)
response = urllib2.urlopen(request)
page = response.read()
print response.info()
print page
Output:
Note the marked lines in the log below: the other request headers were applied (User-Agent, for example, was replaced successfully), but Connection was still sent as close:

Connection: close
header: Connection: close
send: 'GET / HTTP/1.1\r\nAccept-Encoding: identity\r\nHost: \r\nReferer: \r\nConnection: close\r\nUser-Agent: Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Maxthon/4.0 Chrome/30.0.1599.101 Safari/537.36\r\n\r\n'
reply: 'HTTP/1.1 200 OK\r\n'
header: Connection: close
header: Transfer-Encoding: chunked
header: Expires: Thu, 19 Nov 1981 08:52:00 GMT
header: Date: Sun, 15 May 2016 02:51:33 GMT
header: Content-Type: text/html; charset=utf-8
header: Server: nginx/1.2.9
header: Vary: Accept-Encoding
header: X-Powered-By: ThinkPHP
header: Set-Cookie: PHPSESSID=q0pobulph34f8sum6akhpovkg1; path=/
header: Cache-Control: private
header: Pragma: no-cache
Connection: close
Transfer-Encoding: chunked
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Date: Sun, 15 May 2016 02:51:33 GMT
Content-Type: text/html; charset=utf-8
Server: nginx/1.2.9
Vary: Accept-Encoding
X-Powered-By: ThinkPHP
Set-Cookie: PHPSESSID=q0pobulph34f8sum6akhpovkg1; path=/
Cache-Control: private
Pragma: no-cache
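For comparison, the limitation sits in urllib2's handler layer, not in the lower-level HTTP machinery. A minimal sketch in Python 3's http.client (which keeps HTTP/1.1 connections open by default) shows two requests reusing one socket; the local test server and port here are purely illustrative:

```python
# Sketch: http.client reuses a single socket across HTTP/1.1 requests.
# A throwaway local server is started so the example needs no network.
import http.server
import threading
import http.client

class Handler(http.server.BaseHTTPRequestHandler):
    protocol_version = 'HTTP/1.1'    # enables keep-alive on the server side

    def do_GET(self):
        body = b'ok'
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):    # silence per-request logging
        pass

server = http.server.HTTPServer(('127.0.0.1', 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection('127.0.0.1', server.server_port)
conn.request('GET', '/')
first = conn.sock                    # socket used by the first request
conn.getresponse().read()
conn.request('GET', '/')             # second request on the same connection
second = conn.sock
conn.getresponse().read()
print(first is second)               # True: the socket was kept alive
conn.close()
server.shutdown()
```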
2. httplib2 code
The same test rewritten with httplib2. Switching libraries is one way to work around urllib2's lack of keep-alive support; using the Requests library is another.
# Test 8: use httplib2 to test Connection: keep-alive
import httplib2

ghttp = httplib2.Http()
httplib2.debuglevel = 1

loginHeaders = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Maxthon/4.0 Chrome/30.0.1599.101 Safari/537.36',
    'Connection': 'keep-alive'
}

response, page = ghttp.request('', headers=loginHeaders)
print page.decode('utf-8')
In the output below, the keep-alive setting took effect:

header: Connection: Keep-Alive
connect: (www.suning.com.cn, 80) ************
send: 'GET / HTTP/1.1\r\nHost: \r\nconnection: keep-alive\r\naccept-encoding: gzip, deflate\r\nuser-agent: Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Maxthon/4.0 Chrome/30.0.1599.101 Safari/537.36\r\n\r\n'
reply: 'HTTP/1.1 200 OK\r\n'
header: Connection: Keep-Alive
header: Transfer-Encoding: chunked
header: Expires: Thu, 19 Nov 1981 08:52:00 GMT
header: Date: Sun, 15 May 2016 02:59:50 GMT
header: Content-Type: text/html; charset=utf-8
header: Server: nginx/1.2.9
header: Vary: Accept-Encoding
header: X-Powered-By: ThinkPHP
header: Set-Cookie: PHPSESSID=egs5ef9dja68ti8u72v0hg5066; path=/
header: Cache-Control: private
header: Pragma: no-cache
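The other workaround mentioned above is the third-party Requests library, which sends Connection: keep-alive by default and pools connections behind a Session. A minimal sketch (no request is actually made; we only inspect the default headers a Session would send):

```python
# Requests (third-party, Python 3) defaults to keep-alive; a Session
# pools and reuses connections across requests to the same host.
import requests

session = requests.Session()
print(session.headers['Connection'])        # keep-alive
# session.get('http://example.com')         # would reuse the pooled connection
```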
3. Root cause
The answer is in the urllib2 source. In the core do_open method, Connection is hard-coded to close, and the comment block above that line explains why: the addinfourl class isn't prepared to deal with a persistent connection. If one were kept open, it would try to read all remaining data from the socket and block while the server waits for the next request, so urllib2 forcibly overwrites Connection with close.
def do_open(self, http_class, req, **http_conn_args):
    ...
    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = dict((name.title(), val) for name, val in headers.items())

    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        h.request(req.get_method(), req.get_selector(), req.data, headers)
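Python 3's urllib.request inherits the same behavior: AbstractHTTPHandler.do_open still hard-codes the Connection header. This can be confirmed from the installed source without any network access (the exact string match assumes CPython's current wording of that line):

```python
# Inspect the installed urllib.request source to confirm that do_open
# still overwrites the Connection header with "close".
import inspect
import urllib.request

src = inspect.getsource(urllib.request.AbstractHTTPHandler.do_open)
print('headers["Connection"] = "close"' in src)   # True on CPython
```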