Merge with stable
Matt Mackall, r11051:bf7e63fe (merge, branch default)
@@ -1,150 +1,151 @@
1 1 Mercurial allows you to customize output of commands through
2 2 templates. You can either pass in a template from the command
3 3 line, via the --template option, or select an existing
4 4 template-style (--style).
5 5
6 6 You can customize output for any "log-like" command: log,
7 7 outgoing, incoming, tip, parents, heads and glog.
8 8
9 Three styles are packaged with Mercurial: default (the style used
10 when no explicit preference is passed), compact and changelog.
9 Four styles are packaged with Mercurial: default (the style used
10 when no explicit preference is passed), compact, changelog,
11 and xml.
11 12 Usage::
12 13
13 14 $ hg log -r1 --style changelog
14 15
15 16 A template is a piece of text, with markup to invoke variable
16 17 expansion::
17 18
18 19 $ hg log -r1 --template "{node}\n"
19 20 b56ce7b07c52de7d5fd79fb89701ea538af65746
20 21
21 22 Strings in curly braces are called keywords. The availability of
22 23 keywords depends on the exact context of the templater. These
23 24 keywords are usually available for templating a log-like command:
24 25
25 26 :author: String. The unmodified author of the changeset.
26 27
27 28 :branches: String. The name of the branch on which the changeset was
28 29 committed. Will be empty if the branch name was default.
29 30
30 31 :date: Date information. The date when the changeset was committed.
31 32
32 33 :desc: String. The text of the changeset description.
33 34
34 35 :diffstat: String. Statistics of changes with the following format:
35 36 "modified files: +added/-removed lines"
36 37
37 38 :files: List of strings. All files modified, added, or removed by this
38 39 changeset.
39 40
40 41 :file_adds: List of strings. Files added by this changeset.
41 42
42 43 :file_copies: List of strings. Files copied in this changeset with
43 44 their sources.
44 45
45 46 :file_copies_switch: List of strings. Like "file_copies" but displayed
46 47 only if the --copied switch is set.
47 48
48 49 :file_mods: List of strings. Files modified by this changeset.
49 50
50 51 :file_dels: List of strings. Files removed by this changeset.
51 52
52 53 :node: String. The changeset identification hash, as a 40-character
53 54 hexadecimal string.
54 55
55 56 :parents: List of strings. The parents of the changeset.
56 57
57 58 :rev: Integer. The repository-local changeset revision number.
58 59
59 60 :tags: List of strings. Any tags associated with the changeset.
60 61
61 62 :latesttag: String. Most recent global tag in the ancestors of this
62 63 changeset.
63 64
64 65 :latesttagdistance: Integer. Longest path to the latest tag.
65 66
66 67 The "date" keyword does not produce human-readable output. If you
67 68 want to use a date in your output, you can use a filter to process
68 69 it. Filters are functions which return a string based on the input
69 70 variable. Be sure to use the stringify filter first when you're
70 71 applying a string-input filter to a list-like input variable.
71 72 You can also use a chain of filters to get the desired output::
72 73
73 74 $ hg tip --template "{date|isodate}\n"
74 75 2008-08-21 18:22 +0000
75 76
76 77 List of filters:
77 78
78 79 :addbreaks: Any text. Add an XHTML "<br />" tag before the end of
79 80 every line except the last.
80 81
81 82 :age: Date. Returns a human-readable date/time difference between the
82 83 given date/time and the current date/time.
83 84
84 85 :basename: Any text. Treats the text as a path, and returns the last
85 86 component of the path after splitting by the path separator
86 87 (ignoring trailing separators). For example, "foo/bar/baz" becomes
87 88 "baz" and "foo/bar//" becomes "bar".
88 89
89 90 :stripdir: Treats the text as a path and strips a directory level, if
90 91 possible. For example, "foo" and "foo/bar" both become "foo".
91 92
92 93 :date: Date. Returns a date in a Unix date format, including the
93 94 timezone: "Mon Sep 04 15:13:13 2006 0700".
94 95
95 96 :domain: Any text. Finds the first string that looks like an email
96 97 address, and extracts just the domain component. Example: ``User
97 98 <user@example.com>`` becomes ``example.com``.
98 99
99 100 :email: Any text. Extracts the first string that looks like an email
100 101 address. Example: ``User <user@example.com>`` becomes
101 102 ``user@example.com``.
102 103
103 104 :escape: Any text. Replaces the special XML/XHTML characters "&", "<"
104 105 and ">" with XML entities.
105 106
106 107 :fill68: Any text. Wraps the text to fit in 68 columns.
107 108
108 109 :fill76: Any text. Wraps the text to fit in 76 columns.
109 110
110 111 :firstline: Any text. Returns the first line of text.
111 112
112 113 :nonempty: Any text. Returns '(none)' if the string is empty.
113 114
114 115 :hgdate: Date. Returns the date as a pair of numbers: "1157407993
115 116 25200" (Unix timestamp, timezone offset).
116 117
117 118 :isodate: Date. Returns the date in ISO 8601 format: "2009-08-18 13:00
118 119 +0200".
119 120
120 121 :isodatesec: Date. Returns the date in ISO 8601 format, including
121 122 seconds: "2009-08-18 13:00:13 +0200". See also the rfc3339date
122 123 filter.
123 124
124 125 :localdate: Date. Converts a date to local date.
125 126
126 127 :obfuscate: Any text. Returns the input text rendered as a sequence of
127 128 XML entities.
128 129
129 130 :person: Any text. Returns the text before an email address.
130 131
131 132 :rfc822date: Date. Returns a date using the same format used in email
132 133 headers: "Tue, 18 Aug 2009 13:00:13 +0200".
133 134
134 135 :rfc3339date: Date. Returns a date using the Internet date format
135 136 specified in RFC 3339: "2009-08-18T13:00:13+02:00".
136 137
137 138 :short: Changeset hash. Returns the short form of a changeset hash,
138 139 i.e. a 12-character hexadecimal string.
139 140
140 141 :shortdate: Date. Returns a date like "2006-09-18".
141 142
142 143 :strip: Any text. Strips all leading and trailing whitespace.
143 144
144 145 :tabindent: Any text. Returns the text, with every line except the
145 146 first starting with a tab character.
146 147
147 148 :urlescape: Any text. Escapes all "special" characters. For example,
148 149 "foo bar" becomes "foo%20bar".
149 150
150 151 :user: Any text. Returns the user portion of an email address.
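
The keywords and filters above can be chained freely. As an illustrative aside (not part of the packaged help text), a template that prints a short hash, an ISO 8601 date, the author's name and the first line of the description uses only items documented above::

  $ hg log -r1 --template "{node|short} {date|isodate} {author|person}: {desc|firstline}\n"

Each changeset is then rendered as one line of the form "<short hash> <date> <author>: <first line of the description>".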
@@ -1,616 +1,627 @@
1 1 # url.py - HTTP handling for mercurial
2 2 #
3 3 # Copyright 2005, 2006, 2007, 2008 Matt Mackall <mpm@selenic.com>
4 4 # Copyright 2006, 2007 Alexis S. L. Carvalho <alexis@cecm.usp.br>
5 5 # Copyright 2006 Vadim Gelfer <vadim.gelfer@gmail.com>
6 6 #
7 7 # This software may be used and distributed according to the terms of the
8 8 # GNU General Public License version 2 or any later version.
9 9
10 10 import urllib, urllib2, urlparse, httplib, os, re, socket, cStringIO
11 11 from i18n import _
12 12 import keepalive, util
13 13
14 def _urlunparse(scheme, netloc, path, params, query, fragment, url):
15 '''Handle cases where urlunparse(urlparse(x://)) doesn't preserve the "//"'''
16 result = urlparse.urlunparse((scheme, netloc, path, params, query, fragment))
17 if (scheme and
18 result.startswith(scheme + ':') and
19 not result.startswith(scheme + '://') and
20 url.startswith(scheme + '://')
21 ):
22 result = scheme + '://' + result[len(scheme + ':'):]
23 return result
24
14 25 def hidepassword(url):
15 26 '''hide user credential in a url string'''
16 27 scheme, netloc, path, params, query, fragment = urlparse.urlparse(url)
17 28 netloc = re.sub('([^:]*):([^@]*)@(.*)', r'\1:***@\3', netloc)
18 return urlparse.urlunparse((scheme, netloc, path, params, query, fragment))
29 return _urlunparse(scheme, netloc, path, params, query, fragment, url)
19 30
20 31 def removeauth(url):
21 32 '''remove all authentication information from a url string'''
22 33 scheme, netloc, path, params, query, fragment = urlparse.urlparse(url)
23 34 netloc = netloc[netloc.find('@')+1:]
24 return urlparse.urlunparse((scheme, netloc, path, params, query, fragment))
35 return _urlunparse(scheme, netloc, path, params, query, fragment, url)
25 36
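
As an illustrative sketch (not part of the changeset), the round trip that the new _urlunparse helper guards against can be reproduced with Python 2's urlparse module directly: a URL with an empty netloc, such as a local file:// URL (the path below is hypothetical), loses its "//" when re-assembled, and the helper restores it.

    import urlparse

    parts = urlparse.urlparse('file:///tmp/repo')
    print urlparse.urlunparse(parts)
    # prints 'file:/tmp/repo' -- the '//' is dropped because the netloc is
    # empty; _urlunparse() above detects this case and returns
    # 'file:///tmp/repo' instead, so hidepassword() and removeauth() no
    # longer mangle such URLs.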
26 37 def netlocsplit(netloc):
27 38 '''split [user[:passwd]@]host[:port] into 4-tuple.'''
28 39
29 40 a = netloc.find('@')
30 41 if a == -1:
31 42 user, passwd = None, None
32 43 else:
33 44 userpass, netloc = netloc[:a], netloc[a + 1:]
34 45 c = userpass.find(':')
35 46 if c == -1:
36 47 user, passwd = urllib.unquote(userpass), None
37 48 else:
38 49 user = urllib.unquote(userpass[:c])
39 50 passwd = urllib.unquote(userpass[c + 1:])
40 51 c = netloc.find(':')
41 52 if c == -1:
42 53 host, port = netloc, None
43 54 else:
44 55 host, port = netloc[:c], netloc[c + 1:]
45 56 return host, port, user, passwd
46 57
47 58 def netlocunsplit(host, port, user=None, passwd=None):
48 59 '''turn host, port, user, passwd into [user[:passwd]@]host[:port].'''
49 60 if port:
50 61 hostport = host + ':' + port
51 62 else:
52 63 hostport = host
53 64 if user:
54 65 quote = lambda s: urllib.quote(s, safe='')
55 66 if passwd:
56 67 userpass = quote(user) + ':' + quote(passwd)
57 68 else:
58 69 userpass = quote(user)
59 70 return userpass + '@' + hostport
60 71 return hostport
61 72
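
As an illustrative sketch (not part of the changeset), netlocsplit and netlocunsplit are intended to be inverses for the [user[:passwd]@]host[:port] form described in their docstrings; the host, credentials and port below are hypothetical.

    # splitting a netloc with credentials and an explicit port
    netlocsplit('bob:secret@example.com:8080')
    # -> ('example.com', '8080', 'bob', 'secret')

    # re-assembling it (credentials are URL-quoted on the way back)
    netlocunsplit('example.com', '8080', 'bob', 'secret')
    # -> 'bob:secret@example.com:8080'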
62 73 _safe = ('abcdefghijklmnopqrstuvwxyz'
63 74 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
64 75 '0123456789' '_.-/')
65 76 _safeset = None
66 77 _hex = None
67 78 def quotepath(path):
68 79 '''quote the path part of a URL
69 80
70 81 This is similar to urllib.quote, but it also tries to avoid
71 82 quoting things twice (inspired by wget):
72 83
73 84 >>> quotepath('abc def')
74 85 'abc%20def'
75 86 >>> quotepath('abc%20def')
76 87 'abc%20def'
77 88 >>> quotepath('abc%20 def')
78 89 'abc%20%20def'
79 90 >>> quotepath('abc def%20')
80 91 'abc%20def%20'
81 92 >>> quotepath('abc def%2')
82 93 'abc%20def%252'
83 94 >>> quotepath('abc def%')
84 95 'abc%20def%25'
85 96 '''
86 97 global _safeset, _hex
87 98 if _safeset is None:
88 99 _safeset = set(_safe)
89 100 _hex = set('abcdefABCDEF0123456789')
90 101 l = list(path)
91 102 for i in xrange(len(l)):
92 103 c = l[i]
93 104 if (c == '%' and i + 2 < len(l) and
94 105 l[i + 1] in _hex and l[i + 2] in _hex):
95 106 pass
96 107 elif c not in _safeset:
97 108 l[i] = '%%%02X' % ord(c)
98 109 return ''.join(l)
99 110
100 111 class passwordmgr(urllib2.HTTPPasswordMgrWithDefaultRealm):
101 112 def __init__(self, ui):
102 113 urllib2.HTTPPasswordMgrWithDefaultRealm.__init__(self)
103 114 self.ui = ui
104 115
105 116 def find_user_password(self, realm, authuri):
106 117 authinfo = urllib2.HTTPPasswordMgrWithDefaultRealm.find_user_password(
107 118 self, realm, authuri)
108 119 user, passwd = authinfo
109 120 if user and passwd:
110 121 self._writedebug(user, passwd)
111 122 return (user, passwd)
112 123
113 124 if not user:
114 125 auth = self.readauthtoken(authuri)
115 126 if auth:
116 127 user, passwd = auth.get('username'), auth.get('password')
117 128 if not user or not passwd:
118 129 if not self.ui.interactive():
119 130 raise util.Abort(_('http authorization required'))
120 131
121 132 self.ui.write(_("http authorization required\n"))
122 133 self.ui.status(_("realm: %s\n") % realm)
123 134 if user:
124 135 self.ui.status(_("user: %s\n") % user)
125 136 else:
126 137 user = self.ui.prompt(_("user:"), default=None)
127 138
128 139 if not passwd:
129 140 passwd = self.ui.getpass()
130 141
131 142 self.add_password(realm, authuri, user, passwd)
132 143 self._writedebug(user, passwd)
133 144 return (user, passwd)
134 145
135 146 def _writedebug(self, user, passwd):
136 147 msg = _('http auth: user %s, password %s\n')
137 148 self.ui.debug(msg % (user, passwd and '*' * len(passwd) or 'not set'))
138 149
139 150 def readauthtoken(self, uri):
140 151 # Read configuration
141 152 config = dict()
142 153 for key, val in self.ui.configitems('auth'):
143 154 if '.' not in key:
144 155 self.ui.warn(_("ignoring invalid [auth] key '%s'\n") % key)
145 156 continue
146 157 group, setting = key.split('.', 1)
147 158 gdict = config.setdefault(group, dict())
148 159 if setting in ('cert', 'key'):
149 160 val = util.expandpath(val)
150 161 gdict[setting] = val
151 162
152 163 # Find the best match
153 164 scheme, hostpath = uri.split('://', 1)
154 165 bestlen = 0
155 166 bestauth = None
156 167 for auth in config.itervalues():
157 168 prefix = auth.get('prefix')
158 169 if not prefix:
159 170 continue
160 171 p = prefix.split('://', 1)
161 172 if len(p) > 1:
162 173 schemes, prefix = [p[0]], p[1]
163 174 else:
164 175 schemes = (auth.get('schemes') or 'https').split()
165 176 if (prefix == '*' or hostpath.startswith(prefix)) and \
166 177 len(prefix) > bestlen and scheme in schemes:
167 178 bestlen = len(prefix)
168 179 bestauth = auth
169 180 return bestauth
170 181
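
As an illustrative sketch (not part of the changeset), readauthtoken works on flat key/value pairs from the [auth] section, grouped by the text before the first dot; the group name and values below are hypothetical.

    # what ui.configitems('auth') might return for one configured group
    items = [
        ('example.prefix', 'hg.example.com/private'),
        ('example.schemes', 'https'),
        ('example.username', 'alice'),
        ('example.password', 'hunter2'),
    ]
    # readauthtoken() folds these into
    #   {'example': {'prefix': ..., 'schemes': ..., 'username': ..., 'password': ...}}
    # and returns the 'example' dict for any https URI whose host/path
    # starts with 'hg.example.com/private' (the longest matching prefix wins).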
171 182 class proxyhandler(urllib2.ProxyHandler):
172 183 def __init__(self, ui):
173 184 proxyurl = ui.config("http_proxy", "host") or os.getenv('http_proxy')
174 185 # XXX proxyauthinfo = None
175 186
176 187 if proxyurl:
177 188 # proxy can be proper url or host[:port]
178 189 if not (proxyurl.startswith('http:') or
179 190 proxyurl.startswith('https:')):
180 191 proxyurl = 'http://' + proxyurl + '/'
181 192 snpqf = urlparse.urlsplit(proxyurl)
182 193 proxyscheme, proxynetloc, proxypath, proxyquery, proxyfrag = snpqf
183 194 hpup = netlocsplit(proxynetloc)
184 195
185 196 proxyhost, proxyport, proxyuser, proxypasswd = hpup
186 197 if not proxyuser:
187 198 proxyuser = ui.config("http_proxy", "user")
188 199 proxypasswd = ui.config("http_proxy", "passwd")
189 200
190 201 # see if we should use a proxy for this url
191 202 no_list = ["localhost", "127.0.0.1"]
192 203 no_list.extend([p.lower() for
193 204 p in ui.configlist("http_proxy", "no")])
194 205 no_list.extend([p.strip().lower() for
195 206 p in os.getenv("no_proxy", '').split(',')
196 207 if p.strip()])
197 208 # "http_proxy.always" config is for running tests on localhost
198 209 if ui.configbool("http_proxy", "always"):
199 210 self.no_list = []
200 211 else:
201 212 self.no_list = no_list
202 213
203 214 proxyurl = urlparse.urlunsplit((
204 215 proxyscheme, netlocunsplit(proxyhost, proxyport,
205 216 proxyuser, proxypasswd or ''),
206 217 proxypath, proxyquery, proxyfrag))
207 218 proxies = {'http': proxyurl, 'https': proxyurl}
208 219 ui.debug('proxying through http://%s:%s\n' %
209 220 (proxyhost, proxyport))
210 221 else:
211 222 proxies = {}
212 223
213 224 # urllib2 takes proxy values from the environment and those
214 225 # will take precedence if found, so drop them
215 226 for env in ["HTTP_PROXY", "http_proxy", "no_proxy"]:
216 227 try:
217 228 if env in os.environ:
218 229 del os.environ[env]
219 230 except OSError:
220 231 pass
221 232
222 233 urllib2.ProxyHandler.__init__(self, proxies)
223 234 self.ui = ui
224 235
225 236 def proxy_open(self, req, proxy, type_):
226 237 host = req.get_host().split(':')[0]
227 238 if host in self.no_list:
228 239 return None
229 240
230 241 # work around a bug in Python < 2.4.2
231 242 # (it leaves a "\n" at the end of Proxy-authorization headers)
232 243 baseclass = req.__class__
233 244 class _request(baseclass):
234 245 def add_header(self, key, val):
235 246 if key.lower() == 'proxy-authorization':
236 247 val = val.strip()
237 248 return baseclass.add_header(self, key, val)
238 249 req.__class__ = _request
239 250
240 251 return urllib2.ProxyHandler.proxy_open(self, req, proxy, type_)
241 252
242 253 class httpsendfile(file):
243 254 def __len__(self):
244 255 return os.fstat(self.fileno()).st_size
245 256
246 257 def _gen_sendfile(connection):
247 258 def _sendfile(self, data):
248 259 # send a file
249 260 if isinstance(data, httpsendfile):
250 261 # if auth required, some data sent twice, so rewind here
251 262 data.seek(0)
252 263 for chunk in util.filechunkiter(data):
253 264 connection.send(self, chunk)
254 265 else:
255 266 connection.send(self, data)
256 267 return _sendfile
257 268
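
As an illustrative sketch (not part of the changeset), httpsendfile exists so that a large bundle can be both measured and streamed: it is a plain file subclass, so it opens normally, its __len__ lets urllib2 derive a Content-Length header, and the send method generated by _gen_sendfile above (and installed on the connection classes below) pushes it out in chunks instead of reading it into memory. The file name here is hypothetical.

    data = httpsendfile('bundle.hg', 'rb')
    print len(data)     # total size in bytes, via os.fstat()
    # when handed to a request, the generated send() rewinds the file and
    # streams it chunk by chunk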
258 269 has_https = hasattr(urllib2, 'HTTPSHandler')
259 270 if has_https:
260 271 try:
261 272 # avoid using deprecated/broken FakeSocket in python 2.6
262 273 import ssl
263 274 _ssl_wrap_socket = ssl.wrap_socket
264 275 CERT_REQUIRED = ssl.CERT_REQUIRED
265 276 except ImportError:
266 277 CERT_REQUIRED = 2
267 278
268 279 def _ssl_wrap_socket(sock, key_file, cert_file,
269 280 cert_reqs=CERT_REQUIRED, ca_certs=None):
270 281 if ca_certs:
271 282 raise util.Abort(_(
272 283 'certificate checking requires Python 2.6'))
273 284
274 285 ssl = socket.ssl(sock, key_file, cert_file)
275 286 return httplib.FakeSocket(sock, ssl)
276 287
277 288 try:
278 289 _create_connection = socket.create_connection
279 290 except AttributeError:
280 291 _GLOBAL_DEFAULT_TIMEOUT = object()
281 292
282 293 def _create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
283 294 source_address=None):
284 295 # lifted from Python 2.6
285 296
286 297 msg = "getaddrinfo returns an empty list"
287 298 host, port = address
288 299 for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
289 300 af, socktype, proto, canonname, sa = res
290 301 sock = None
291 302 try:
292 303 sock = socket.socket(af, socktype, proto)
293 304 if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
294 305 sock.settimeout(timeout)
295 306 if source_address:
296 307 sock.bind(source_address)
297 308 sock.connect(sa)
298 309 return sock
299 310
300 311 except socket.error, msg:
301 312 if sock is not None:
302 313 sock.close()
303 314
304 315 raise socket.error, msg
305 316
306 317 class httpconnection(keepalive.HTTPConnection):
307 318 # must be able to send big bundle as stream.
308 319 send = _gen_sendfile(keepalive.HTTPConnection)
309 320
310 321 def connect(self):
311 322 if has_https and self.realhostport: # use CONNECT proxy
312 323 self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
313 324 self.sock.connect((self.host, self.port))
314 325 if _generic_proxytunnel(self):
315 326 # we do not support client x509 certificates
316 327 self.sock = _ssl_wrap_socket(self.sock, None, None)
317 328 else:
318 329 keepalive.HTTPConnection.connect(self)
319 330
320 331 def getresponse(self):
321 332 proxyres = getattr(self, 'proxyres', None)
322 333 if proxyres:
323 334 if proxyres.will_close:
324 335 self.close()
325 336 self.proxyres = None
326 337 return proxyres
327 338 return keepalive.HTTPConnection.getresponse(self)
328 339
329 340 # general transaction handler to support different ways to handle
330 341 # HTTPS proxying before and after Python 2.6.3.
331 342 def _generic_start_transaction(handler, h, req):
332 343 if hasattr(req, '_tunnel_host') and req._tunnel_host:
333 344 tunnel_host = req._tunnel_host
334 345 if tunnel_host[:7] not in ['http://', 'https:/']:
335 346 tunnel_host = 'https://' + tunnel_host
336 347 new_tunnel = True
337 348 else:
338 349 tunnel_host = req.get_selector()
339 350 new_tunnel = False
340 351
341 352 if new_tunnel or tunnel_host == req.get_full_url(): # has proxy
342 353 urlparts = urlparse.urlparse(tunnel_host)
343 354 if new_tunnel or urlparts[0] == 'https': # only use CONNECT for HTTPS
344 355 realhostport = urlparts[1]
345 356 if realhostport[-1] == ']' or ':' not in realhostport:
346 357 realhostport += ':443'
347 358
348 359 h.realhostport = realhostport
349 360 h.headers = req.headers.copy()
350 361 h.headers.update(handler.parent.addheaders)
351 362 return
352 363
353 364 h.realhostport = None
354 365 h.headers = None
355 366
356 367 def _generic_proxytunnel(self):
357 368 proxyheaders = dict(
358 369 [(x, self.headers[x]) for x in self.headers
359 370 if x.lower().startswith('proxy-')])
360 371 self._set_hostport(self.host, self.port)
361 372 self.send('CONNECT %s HTTP/1.0\r\n' % self.realhostport)
362 373 for header in proxyheaders.iteritems():
363 374 self.send('%s: %s\r\n' % header)
364 375 self.send('\r\n')
365 376
366 377 # majority of the following code is duplicated from
367 378 # httplib.HTTPConnection as there are no adequate places to
368 379 # override functions to provide the needed functionality
369 380 res = self.response_class(self.sock,
370 381 strict=self.strict,
371 382 method=self._method)
372 383
373 384 while True:
374 385 version, status, reason = res._read_status()
375 386 if status != httplib.CONTINUE:
376 387 break
377 388 while True:
378 389 skip = res.fp.readline().strip()
379 390 if not skip:
380 391 break
381 392 res.status = status
382 393 res.reason = reason.strip()
383 394
384 395 if res.status == 200:
385 396 while True:
386 397 line = res.fp.readline()
387 398 if line == '\r\n':
388 399 break
389 400 return True
390 401
391 402 if version == 'HTTP/1.0':
392 403 res.version = 10
393 404 elif version.startswith('HTTP/1.'):
394 405 res.version = 11
395 406 elif version == 'HTTP/0.9':
396 407 res.version = 9
397 408 else:
398 409 raise httplib.UnknownProtocol(version)
399 410
400 411 if res.version == 9:
401 412 res.length = None
402 413 res.chunked = 0
403 414 res.will_close = 1
404 415 res.msg = httplib.HTTPMessage(cStringIO.StringIO())
405 416 return False
406 417
407 418 res.msg = httplib.HTTPMessage(res.fp)
408 419 res.msg.fp = None
409 420
410 421 # are we using the chunked-style of transfer encoding?
411 422 trenc = res.msg.getheader('transfer-encoding')
412 423 if trenc and trenc.lower() == "chunked":
413 424 res.chunked = 1
414 425 res.chunk_left = None
415 426 else:
416 427 res.chunked = 0
417 428
418 429 # will the connection close at the end of the response?
419 430 res.will_close = res._check_close()
420 431
421 432 # do we have a Content-Length?
422 433 # NOTE: RFC 2616, S4.4, #3 says we ignore this if tr_enc is "chunked"
423 434 length = res.msg.getheader('content-length')
424 435 if length and not res.chunked:
425 436 try:
426 437 res.length = int(length)
427 438 except ValueError:
428 439 res.length = None
429 440 else:
430 441 if res.length < 0: # ignore nonsensical negative lengths
431 442 res.length = None
432 443 else:
433 444 res.length = None
434 445
435 446 # does the body have a fixed length? (of zero)
436 447 if (status == httplib.NO_CONTENT or status == httplib.NOT_MODIFIED or
437 448 100 <= status < 200 or # 1xx codes
438 449 res._method == 'HEAD'):
439 450 res.length = 0
440 451
441 452 # if the connection remains open, and we aren't using chunked, and
442 453 # a content-length was not provided, then assume that the connection
443 454 # WILL close.
444 455 if (not res.will_close and
445 456 not res.chunked and
446 457 res.length is None):
447 458 res.will_close = 1
448 459
449 460 self.proxyres = res
450 461
451 462 return False
452 463
453 464 class httphandler(keepalive.HTTPHandler):
454 465 def http_open(self, req):
455 466 return self.do_open(httpconnection, req)
456 467
457 468 def _start_transaction(self, h, req):
458 469 _generic_start_transaction(self, h, req)
459 470 return keepalive.HTTPHandler._start_transaction(self, h, req)
460 471
461 472 def __del__(self):
462 473 self.close_all()
463 474
464 475 if has_https:
465 476 class BetterHTTPS(httplib.HTTPSConnection):
466 477 send = keepalive.safesend
467 478
468 479 def connect(self):
469 480 if hasattr(self, 'ui'):
470 481 cacerts = self.ui.config('web', 'cacerts')
471 482 else:
472 483 cacerts = None
473 484
474 485 if cacerts:
475 486 sock = _create_connection((self.host, self.port))
476 487 self.sock = _ssl_wrap_socket(sock, self.key_file,
477 488 self.cert_file, cert_reqs=CERT_REQUIRED,
478 489 ca_certs=cacerts)
479 490 self.ui.debug(_('server identity verification succeeded\n'))
480 491 else:
481 492 httplib.HTTPSConnection.connect(self)
482 493
483 494 class httpsconnection(BetterHTTPS):
484 495 response_class = keepalive.HTTPResponse
485 496 # must be able to send big bundle as stream.
486 497 send = _gen_sendfile(BetterHTTPS)
487 498 getresponse = keepalive.wrapgetresponse(httplib.HTTPSConnection)
488 499
489 500 def connect(self):
490 501 if self.realhostport: # use CONNECT proxy
491 502 self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
492 503 self.sock.connect((self.host, self.port))
493 504 if _generic_proxytunnel(self):
494 505 self.sock = _ssl_wrap_socket(self.sock, self.cert_file,
495 506 self.key_file)
496 507 else:
497 508 BetterHTTPS.connect(self)
498 509
499 510 class httpshandler(keepalive.KeepAliveHandler, urllib2.HTTPSHandler):
500 511 def __init__(self, ui):
501 512 keepalive.KeepAliveHandler.__init__(self)
502 513 urllib2.HTTPSHandler.__init__(self)
503 514 self.ui = ui
504 515 self.pwmgr = passwordmgr(self.ui)
505 516
506 517 def _start_transaction(self, h, req):
507 518 _generic_start_transaction(self, h, req)
508 519 return keepalive.KeepAliveHandler._start_transaction(self, h, req)
509 520
510 521 def https_open(self, req):
511 522 self.auth = self.pwmgr.readauthtoken(req.get_full_url())
512 523 return self.do_open(self._makeconnection, req)
513 524
514 525 def _makeconnection(self, host, port=None, *args, **kwargs):
515 526 keyfile = None
516 527 certfile = None
517 528
518 529 if len(args) >= 1: # key_file
519 530 keyfile = args[0]
520 531 if len(args) >= 2: # cert_file
521 532 certfile = args[1]
522 533 args = args[2:]
523 534
524 535 # if the user has specified different key/cert files in
525 536 # hgrc, we prefer these
526 537 if self.auth and 'key' in self.auth and 'cert' in self.auth:
527 538 keyfile = self.auth['key']
528 539 certfile = self.auth['cert']
529 540
530 541 conn = httpsconnection(host, port, keyfile, certfile, *args, **kwargs)
531 542 conn.ui = self.ui
532 543 return conn
533 544
534 545 # In python < 2.5 AbstractDigestAuthHandler raises a ValueError if
535 546 # it doesn't know about the auth type requested. This can happen if
536 547 # somebody is using BasicAuth and types a bad password.
537 548 class httpdigestauthhandler(urllib2.HTTPDigestAuthHandler):
538 549 def http_error_auth_reqed(self, auth_header, host, req, headers):
539 550 try:
540 551 return urllib2.HTTPDigestAuthHandler.http_error_auth_reqed(
541 552 self, auth_header, host, req, headers)
542 553 except ValueError, inst:
543 554 arg = inst.args[0]
544 555 if arg.startswith("AbstractDigestAuthHandler doesn't know "):
545 556 return
546 557 raise
547 558
548 559 def getauthinfo(path):
549 560 scheme, netloc, urlpath, query, frag = urlparse.urlsplit(path)
550 561 if not urlpath:
551 562 urlpath = '/'
552 563 if scheme != 'file':
553 564 # XXX: why are we quoting the path again with some smart
554 565 # heuristic here? Anyway, it cannot be done with file://
555 566 # urls since path encoding is os/fs dependent (see
556 567 # urllib.pathname2url() for details).
557 568 urlpath = quotepath(urlpath)
558 569 host, port, user, passwd = netlocsplit(netloc)
559 570
560 571 # urllib cannot handle URLs with embedded user or passwd
561 572 url = urlparse.urlunsplit((scheme, netlocunsplit(host, port),
562 573 urlpath, query, frag))
563 574 if user:
564 575 netloc = host
565 576 if port:
566 577 netloc += ':' + port
567 578 # Python < 2.4.3 uses only the netloc to search for a password
568 579 authinfo = (None, (url, netloc), user, passwd or '')
569 580 else:
570 581 authinfo = None
571 582 return url, authinfo
572 583
573 584 handlerfuncs = []
574 585
575 586 def opener(ui, authinfo=None):
576 587 '''
577 588 construct an opener suitable for urllib2
578 589 authinfo will be added to the password manager
579 590 '''
580 591 handlers = [httphandler()]
581 592 if has_https:
582 593 handlers.append(httpshandler(ui))
583 594
584 595 handlers.append(proxyhandler(ui))
585 596
586 597 passmgr = passwordmgr(ui)
587 598 if authinfo is not None:
588 599 passmgr.add_password(*authinfo)
589 600 user, passwd = authinfo[2:4]
590 601 ui.debug('http auth: user %s, password %s\n' %
591 602 (user, passwd and '*' * len(passwd) or 'not set'))
592 603
593 604 handlers.extend((urllib2.HTTPBasicAuthHandler(passmgr),
594 605 httpdigestauthhandler(passmgr)))
595 606 handlers.extend([h(ui, passmgr) for h in handlerfuncs])
596 607 opener = urllib2.build_opener(*handlers)
597 608
598 609 # 1.0 here is the _protocol_ version
599 610 opener.addheaders = [('User-agent', 'mercurial/proto-1.0')]
600 611 opener.addheaders.append(('Accept', 'application/mercurial-0.1'))
601 612 return opener
602 613
603 614 scheme_re = re.compile(r'^([a-zA-Z0-9+-.]+)://')
604 615
605 616 def open(ui, url, data=None):
606 617 scheme = None
607 618 m = scheme_re.search(url)
608 619 if m:
609 620 scheme = m.group(1).lower()
610 621 if not scheme:
611 622 path = util.normpath(os.path.abspath(url))
612 623 url = 'file://' + urllib.pathname2url(path)
613 624 authinfo = None
614 625 else:
615 626 url, authinfo = getauthinfo(url)
616 627 return opener(ui, authinfo).open(url, data)
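
As a closing illustrative sketch (not part of the changeset), the module-level open() above can be driven with a default ui object; this assumes the file is importable as mercurial.url and that mercurial.ui.ui() gives a plain ui, as other Mercurial code of this vintage does. The URL is hypothetical.

    from mercurial import ui as uimod
    from mercurial import url as urlmod

    u = uimod.ui()
    # scheme-less arguments are turned into file:// URLs; anything else goes
    # through opener(), which wires up proxy, auth and keepalive handlers
    fp = urlmod.open(u, 'http://example.com/')
    print fp.read()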