@@ -1,150 +1,151 @@
Mercurial allows you to customize output of commands through
templates. You can either pass in a template from the command
line, via the --template option, or select an existing
template-style (--style).

You can customize output for any "log-like" command: log,
outgoing, incoming, tip, parents, heads and glog.

Four styles are packaged with Mercurial: default (the style used
when no explicit preference is passed), compact, changelog,
and xml.
Usage::

    $ hg log -r1 --style changelog

A template is a piece of text, with markup to invoke variable
expansion::

    $ hg log -r1 --template "{node}\n"
    b56ce7b07c52de7d5fd79fb89701ea538af65746

Strings in curly braces are called keywords. The availability of
keywords depends on the exact context of the templater. These
keywords are usually available for templating a log-like command:

:author: String. The unmodified author of the changeset.

:branches: String. The name of the branch on which the changeset was
    committed. Will be empty if the branch name was default.

:date: Date information. The date when the changeset was committed.

:desc: String. The text of the changeset description.

:diffstat: String. Statistics of changes with the following format:
    "modified files: +added/-removed lines"

:files: List of strings. All files modified, added, or removed by this
    changeset.

:file_adds: List of strings. Files added by this changeset.

:file_copies: List of strings. Files copied in this changeset with
    their sources.

:file_copies_switch: List of strings. Like "file_copies" but displayed
    only if the --copied switch is set.

:file_mods: List of strings. Files modified by this changeset.

:file_dels: List of strings. Files removed by this changeset.

:node: String. The changeset identification hash, as a 40-character
    hexadecimal string.

:parents: List of strings. The parents of the changeset.

:rev: Integer. The repository-local changeset revision number.

:tags: List of strings. Any tags associated with the changeset.

:latesttag: String. Most recent global tag in the ancestors of this
    changeset.

:latesttagdistance: Integer. Longest path to the latest tag.

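Any of these keywords can be combined in a single template. For
instance, the following illustrative example (the output naturally
varies with the repository at hand) prints the revision number, the
author and the description of a changeset::

    $ hg log -r1 --template "{rev} {author} {desc}\n"
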
The "date" keyword does not produce human-readable output. If you
want to use a date in your output, you can use a filter to process
it. Filters are functions which return a string based on the input
variable. Be sure to use the stringify filter first when you're
applying a string-input filter to a list-like input variable.
You can also use a chain of filters to get the desired output::

    $ hg tip --template "{date|isodate}\n"
    2008-08-21 18:22 +0000

List of filters:

:addbreaks: Any text. Add an XHTML "<br />" tag before the end of
    every line except the last.

:age: Date. Returns a human-readable date/time difference between the
    given date/time and the current date/time.

:basename: Any text. Treats the text as a path, and returns the last
    component of the path after splitting by the path separator
    (ignoring trailing separators). For example, "foo/bar/baz" becomes
    "baz" and "foo/bar//" becomes "bar".

:stripdir: Treat the text as a path and strip a directory level, if
    possible. For example, "foo" and "foo/bar" become "foo".

:date: Date. Returns a date in a Unix date format, including the
    timezone: "Mon Sep 04 15:13:13 2006 0700".

:domain: Any text. Finds the first string that looks like an email
    address, and extracts just the domain component. Example: ``User
    <user@example.com>`` becomes ``example.com``.

:email: Any text. Extracts the first string that looks like an email
    address. Example: ``User <user@example.com>`` becomes
    ``user@example.com``.

:escape: Any text. Replaces the special XML/XHTML characters "&", "<"
    and ">" with XML entities.

:fill68: Any text. Wraps the text to fit in 68 columns.

:fill76: Any text. Wraps the text to fit in 76 columns.

:firstline: Any text. Returns the first line of text.

:nonempty: Any text. Returns '(none)' if the string is empty.

:hgdate: Date. Returns the date as a pair of numbers: "1157407993
    25200" (Unix timestamp, timezone offset).

:isodate: Date. Returns the date in ISO 8601 format: "2009-08-18 13:00
    +0200".

:isodatesec: Date. Returns the date in ISO 8601 format, including
    seconds: "2009-08-18 13:00:13 +0200". See also the rfc3339date
    filter.

:localdate: Date. Converts a date to local date.

:obfuscate: Any text. Returns the input text rendered as a sequence of
    XML entities.

:person: Any text. Returns the text before an email address.

:rfc822date: Date. Returns a date using the same format used in email
    headers: "Tue, 18 Aug 2009 13:00:13 +0200".

:rfc3339date: Date. Returns a date using the Internet date format
    specified in RFC 3339: "2009-08-18T13:00:13+02:00".

:short: Changeset hash. Returns the short form of a changeset hash,
    i.e. a 12-character hexadecimal string.

:shortdate: Date. Returns a date like "2006-09-18".

:strip: Any text. Strips all leading and trailing whitespace.

:tabindent: Any text. Returns the text, with every line except the
    first starting with a tab character.

:urlescape: Any text. Escapes all "special" characters. For example,
    "foo bar" becomes "foo%20bar".

:user: Any text. Returns the user portion of an email address.

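Filters can be applied to any keyword of a matching type and chained,
as shown earlier for "isodate". As an illustrative sketch (not an
excerpt from a packaged style), the following template combines the
"person", "shortdate" and "firstline" filters::

    $ hg log -r1 --template "{author|person} {date|shortdate}: {desc|firstline}\n"
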
@@ -1,616 +1,627 @@
# url.py - HTTP handling for mercurial
#
# Copyright 2005, 2006, 2007, 2008 Matt Mackall <mpm@selenic.com>
# Copyright 2006, 2007 Alexis S. L. Carvalho <alexis@cecm.usp.br>
# Copyright 2006 Vadim Gelfer <vadim.gelfer@gmail.com>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

import urllib, urllib2, urlparse, httplib, os, re, socket, cStringIO
from i18n import _
import keepalive, util

def _urlunparse(scheme, netloc, path, params, query, fragment, url):
    '''Handle cases where urlunparse(urlparse(x://)) doesn't preserve the "//"'''
    result = urlparse.urlunparse((scheme, netloc, path, params, query, fragment))
    if (scheme and
        result.startswith(scheme + ':') and
        not result.startswith(scheme + '://') and
        url.startswith(scheme + '://')
        ):
        result = scheme + '://' + result[len(scheme + ':'):]
    return result

def hidepassword(url):
    '''hide user credential in a url string'''
    scheme, netloc, path, params, query, fragment = urlparse.urlparse(url)
    netloc = re.sub('([^:]*):([^@]*)@(.*)', r'\1:***@\3', netloc)
    return _urlunparse(scheme, netloc, path, params, query, fragment, url)

def removeauth(url):
    '''remove all authentication information from a url string'''
    scheme, netloc, path, params, query, fragment = urlparse.urlparse(url)
    netloc = netloc[netloc.find('@')+1:]
    return _urlunparse(scheme, netloc, path, params, query, fragment, url)

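# Illustrative sketch only (not part of the change above): with the
# _urlunparse() helper, both functions preserve the "//" after the
# scheme. For a hypothetical URL such as
# 'http://user:secret@example.com/repo':
#
#   hidepassword('http://user:secret@example.com/repo')
#       -> 'http://user:***@example.com/repo'
#   removeauth('http://user:secret@example.com/repo')
#       -> 'http://example.com/repo'
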
def netlocsplit(netloc):
    '''split [user[:passwd]@]host[:port] into 4-tuple.'''

    a = netloc.find('@')
    if a == -1:
        user, passwd = None, None
    else:
        userpass, netloc = netloc[:a], netloc[a + 1:]
        c = userpass.find(':')
        if c == -1:
            user, passwd = urllib.unquote(userpass), None
        else:
            user = urllib.unquote(userpass[:c])
            passwd = urllib.unquote(userpass[c + 1:])
    c = netloc.find(':')
    if c == -1:
        host, port = netloc, None
    else:
        host, port = netloc[:c], netloc[c + 1:]
    return host, port, user, passwd

def netlocunsplit(host, port, user=None, passwd=None):
    '''turn host, port, user, passwd into [user[:passwd]@]host[:port].'''
    if port:
        hostport = host + ':' + port
    else:
        hostport = host
    if user:
        quote = lambda s: urllib.quote(s, safe='')
        if passwd:
            userpass = quote(user) + ':' + quote(passwd)
        else:
            userpass = quote(user)
        return userpass + '@' + hostport
    return hostport

_safe = ('abcdefghijklmnopqrstuvwxyz'
         'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
         '0123456789' '_.-/')
_safeset = None
_hex = None
def quotepath(path):
    '''quote the path part of a URL

    This is similar to urllib.quote, but it also tries to avoid
    quoting things twice (inspired by wget):

    >>> quotepath('abc def')
    'abc%20def'
    >>> quotepath('abc%20def')
    'abc%20def'
    >>> quotepath('abc%20 def')
    'abc%20%20def'
    >>> quotepath('abc def%20')
    'abc%20def%20'
    >>> quotepath('abc def%2')
    'abc%20def%252'
    >>> quotepath('abc def%')
    'abc%20def%25'
    '''
    global _safeset, _hex
    if _safeset is None:
        _safeset = set(_safe)
        _hex = set('abcdefABCDEF0123456789')
    l = list(path)
    for i in xrange(len(l)):
        c = l[i]
        if (c == '%' and i + 2 < len(l) and
            l[i + 1] in _hex and l[i + 2] in _hex):
            pass
        elif c not in _safeset:
            l[i] = '%%%02X' % ord(c)
    return ''.join(l)

class passwordmgr(urllib2.HTTPPasswordMgrWithDefaultRealm):
    def __init__(self, ui):
        urllib2.HTTPPasswordMgrWithDefaultRealm.__init__(self)
        self.ui = ui

    def find_user_password(self, realm, authuri):
        authinfo = urllib2.HTTPPasswordMgrWithDefaultRealm.find_user_password(
            self, realm, authuri)
        user, passwd = authinfo
        if user and passwd:
            self._writedebug(user, passwd)
            return (user, passwd)

        if not user:
            auth = self.readauthtoken(authuri)
            if auth:
                user, passwd = auth.get('username'), auth.get('password')
        if not user or not passwd:
            if not self.ui.interactive():
                raise util.Abort(_('http authorization required'))

            self.ui.write(_("http authorization required\n"))
            self.ui.status(_("realm: %s\n") % realm)
            if user:
                self.ui.status(_("user: %s\n") % user)
            else:
                user = self.ui.prompt(_("user:"), default=None)

            if not passwd:
                passwd = self.ui.getpass()

        self.add_password(realm, authuri, user, passwd)
        self._writedebug(user, passwd)
        return (user, passwd)

    def _writedebug(self, user, passwd):
        msg = _('http auth: user %s, password %s\n')
        self.ui.debug(msg % (user, passwd and '*' * len(passwd) or 'not set'))

    def readauthtoken(self, uri):
        # Read configuration
        config = dict()
        for key, val in self.ui.configitems('auth'):
            if '.' not in key:
                self.ui.warn(_("ignoring invalid [auth] key '%s'\n") % key)
                continue
            group, setting = key.split('.', 1)
            gdict = config.setdefault(group, dict())
            if setting in ('cert', 'key'):
                val = util.expandpath(val)
            gdict[setting] = val

        # Find the best match
        scheme, hostpath = uri.split('://', 1)
        bestlen = 0
        bestauth = None
        for auth in config.itervalues():
            prefix = auth.get('prefix')
            if not prefix:
                continue
            p = prefix.split('://', 1)
            if len(p) > 1:
                schemes, prefix = [p[0]], p[1]
            else:
                schemes = (auth.get('schemes') or 'https').split()
            if (prefix == '*' or hostpath.startswith(prefix)) and \
                len(prefix) > bestlen and scheme in schemes:
                bestlen = len(prefix)
                bestauth = auth
        return bestauth

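# Illustrative sketch only (not part of the change above):
# readauthtoken() matches the request URI against [auth] entries from
# the user's hgrc, such as the hypothetical group below, and returns
# the group with the longest matching prefix and a matching scheme:
#
#   [auth]
#   example.prefix = hg.example.com/repo
#   example.username = alice
#   example.password = secret
#   example.schemes = https
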
class proxyhandler(urllib2.ProxyHandler):
    def __init__(self, ui):
        proxyurl = ui.config("http_proxy", "host") or os.getenv('http_proxy')
        # XXX proxyauthinfo = None

        if proxyurl:
            # proxy can be proper url or host[:port]
            if not (proxyurl.startswith('http:') or
                    proxyurl.startswith('https:')):
                proxyurl = 'http://' + proxyurl + '/'
            snpqf = urlparse.urlsplit(proxyurl)
            proxyscheme, proxynetloc, proxypath, proxyquery, proxyfrag = snpqf
            hpup = netlocsplit(proxynetloc)

            proxyhost, proxyport, proxyuser, proxypasswd = hpup
            if not proxyuser:
                proxyuser = ui.config("http_proxy", "user")
                proxypasswd = ui.config("http_proxy", "passwd")

            # see if we should use a proxy for this url
            no_list = ["localhost", "127.0.0.1"]
            no_list.extend([p.lower() for
                            p in ui.configlist("http_proxy", "no")])
            no_list.extend([p.strip().lower() for
                            p in os.getenv("no_proxy", '').split(',')
                            if p.strip()])
            # "http_proxy.always" config is for running tests on localhost
            if ui.configbool("http_proxy", "always"):
                self.no_list = []
            else:
                self.no_list = no_list

            proxyurl = urlparse.urlunsplit((
                proxyscheme, netlocunsplit(proxyhost, proxyport,
                                           proxyuser, proxypasswd or ''),
                proxypath, proxyquery, proxyfrag))
            proxies = {'http': proxyurl, 'https': proxyurl}
            ui.debug('proxying through http://%s:%s\n' %
                     (proxyhost, proxyport))
        else:
            proxies = {}

        # urllib2 takes proxy values from the environment and those
        # will take precedence if found, so drop them
        for env in ["HTTP_PROXY", "http_proxy", "no_proxy"]:
            try:
                if env in os.environ:
                    del os.environ[env]
            except OSError:
                pass

        urllib2.ProxyHandler.__init__(self, proxies)
        self.ui = ui

    def proxy_open(self, req, proxy, type_):
        host = req.get_host().split(':')[0]
        if host in self.no_list:
            return None

        # work around a bug in Python < 2.4.2
        # (it leaves a "\n" at the end of Proxy-authorization headers)
        baseclass = req.__class__
        class _request(baseclass):
            def add_header(self, key, val):
                if key.lower() == 'proxy-authorization':
                    val = val.strip()
                return baseclass.add_header(self, key, val)
        req.__class__ = _request

        return urllib2.ProxyHandler.proxy_open(self, req, proxy, type_)

class httpsendfile(file):
    def __len__(self):
        return os.fstat(self.fileno()).st_size

def _gen_sendfile(connection):
    def _sendfile(self, data):
        # send a file
        if isinstance(data, httpsendfile):
            # if auth required, some data sent twice, so rewind here
            data.seek(0)
            for chunk in util.filechunkiter(data):
                connection.send(self, chunk)
        else:
            connection.send(self, data)
    return _sendfile

has_https = hasattr(urllib2, 'HTTPSHandler')
if has_https:
    try:
        # avoid using deprecated/broken FakeSocket in python 2.6
        import ssl
        _ssl_wrap_socket = ssl.wrap_socket
        CERT_REQUIRED = ssl.CERT_REQUIRED
    except ImportError:
        CERT_REQUIRED = 2

        def _ssl_wrap_socket(sock, key_file, cert_file,
                             cert_reqs=CERT_REQUIRED, ca_certs=None):
            if ca_certs:
                raise util.Abort(_(
                    'certificate checking requires Python 2.6'))

            ssl = socket.ssl(sock, key_file, cert_file)
            return httplib.FakeSocket(sock, ssl)

    try:
        _create_connection = socket.create_connection
    except AttributeError:
        _GLOBAL_DEFAULT_TIMEOUT = object()

        def _create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                               source_address=None):
            # lifted from Python 2.6

            msg = "getaddrinfo returns an empty list"
            host, port = address
            for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
                af, socktype, proto, canonname, sa = res
                sock = None
                try:
                    sock = socket.socket(af, socktype, proto)
                    if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                        sock.settimeout(timeout)
                    if source_address:
                        sock.bind(source_address)
                    sock.connect(sa)
                    return sock

                except socket.error, msg:
                    if sock is not None:
                        sock.close()

            raise socket.error, msg

class httpconnection(keepalive.HTTPConnection):
    # must be able to send big bundle as stream.
    send = _gen_sendfile(keepalive.HTTPConnection)

    def connect(self):
        if has_https and self.realhostport: # use CONNECT proxy
            self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            self.sock.connect((self.host, self.port))
            if _generic_proxytunnel(self):
                # we do not support client x509 certificates
                self.sock = _ssl_wrap_socket(self.sock, None, None)
        else:
            keepalive.HTTPConnection.connect(self)

    def getresponse(self):
        proxyres = getattr(self, 'proxyres', None)
        if proxyres:
            if proxyres.will_close:
                self.close()
            self.proxyres = None
            return proxyres
        return keepalive.HTTPConnection.getresponse(self)

# general transaction handler to support different ways to handle
# HTTPS proxying before and after Python 2.6.3.
def _generic_start_transaction(handler, h, req):
    if hasattr(req, '_tunnel_host') and req._tunnel_host:
        tunnel_host = req._tunnel_host
        if tunnel_host[:7] not in ['http://', 'https:/']:
            tunnel_host = 'https://' + tunnel_host
        new_tunnel = True
    else:
        tunnel_host = req.get_selector()
        new_tunnel = False

    if new_tunnel or tunnel_host == req.get_full_url(): # has proxy
        urlparts = urlparse.urlparse(tunnel_host)
        if new_tunnel or urlparts[0] == 'https': # only use CONNECT for HTTPS
            realhostport = urlparts[1]
            if realhostport[-1] == ']' or ':' not in realhostport:
                realhostport += ':443'

            h.realhostport = realhostport
            h.headers = req.headers.copy()
            h.headers.update(handler.parent.addheaders)
            return

    h.realhostport = None
    h.headers = None

def _generic_proxytunnel(self):
    proxyheaders = dict(
        [(x, self.headers[x]) for x in self.headers
         if x.lower().startswith('proxy-')])
    self._set_hostport(self.host, self.port)
    self.send('CONNECT %s HTTP/1.0\r\n' % self.realhostport)
    for header in proxyheaders.iteritems():
        self.send('%s: %s\r\n' % header)
    self.send('\r\n')

    # majority of the following code is duplicated from
    # httplib.HTTPConnection as there are no adequate places to
    # override functions to provide the needed functionality
    res = self.response_class(self.sock,
                              strict=self.strict,
                              method=self._method)

    while True:
        version, status, reason = res._read_status()
        if status != httplib.CONTINUE:
            break
        while True:
            skip = res.fp.readline().strip()
            if not skip:
                break
    res.status = status
    res.reason = reason.strip()

    if res.status == 200:
        while True:
            line = res.fp.readline()
            if line == '\r\n':
                break
        return True

    if version == 'HTTP/1.0':
        res.version = 10
    elif version.startswith('HTTP/1.'):
        res.version = 11
    elif version == 'HTTP/0.9':
        res.version = 9
    else:
        raise httplib.UnknownProtocol(version)

    if res.version == 9:
        res.length = None
        res.chunked = 0
        res.will_close = 1
        res.msg = httplib.HTTPMessage(cStringIO.StringIO())
        return False

    res.msg = httplib.HTTPMessage(res.fp)
    res.msg.fp = None

    # are we using the chunked-style of transfer encoding?
    trenc = res.msg.getheader('transfer-encoding')
    if trenc and trenc.lower() == "chunked":
        res.chunked = 1
        res.chunk_left = None
    else:
        res.chunked = 0

    # will the connection close at the end of the response?
    res.will_close = res._check_close()

    # do we have a Content-Length?
    # NOTE: RFC 2616, S4.4, #3 says we ignore this if tr_enc is "chunked"
    length = res.msg.getheader('content-length')
    if length and not res.chunked:
        try:
            res.length = int(length)
        except ValueError:
            res.length = None
        else:
            if res.length < 0: # ignore nonsensical negative lengths
                res.length = None
    else:
        res.length = None

    # does the body have a fixed length? (of zero)
    if (status == httplib.NO_CONTENT or status == httplib.NOT_MODIFIED or
        100 <= status < 200 or # 1xx codes
        res._method == 'HEAD'):
        res.length = 0

    # if the connection remains open, and we aren't using chunked, and
    # a content-length was not provided, then assume that the connection
    # WILL close.
    if (not res.will_close and
        not res.chunked and
        res.length is None):
        res.will_close = 1

    self.proxyres = res

    return False

class httphandler(keepalive.HTTPHandler):
    def http_open(self, req):
        return self.do_open(httpconnection, req)

    def _start_transaction(self, h, req):
        _generic_start_transaction(self, h, req)
        return keepalive.HTTPHandler._start_transaction(self, h, req)

    def __del__(self):
        self.close_all()

if has_https:
    class BetterHTTPS(httplib.HTTPSConnection):
        send = keepalive.safesend

        def connect(self):
            if hasattr(self, 'ui'):
                cacerts = self.ui.config('web', 'cacerts')
            else:
                cacerts = None

            if cacerts:
                sock = _create_connection((self.host, self.port))
                self.sock = _ssl_wrap_socket(sock, self.key_file,
                        self.cert_file, cert_reqs=CERT_REQUIRED,
                        ca_certs=cacerts)
                self.ui.debug(_('server identity verification succeeded\n'))
            else:
                httplib.HTTPSConnection.connect(self)

    class httpsconnection(BetterHTTPS):
        response_class = keepalive.HTTPResponse
        # must be able to send big bundle as stream.
        send = _gen_sendfile(BetterHTTPS)
        getresponse = keepalive.wrapgetresponse(httplib.HTTPSConnection)

        def connect(self):
            if self.realhostport: # use CONNECT proxy
                self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                self.sock.connect((self.host, self.port))
                if _generic_proxytunnel(self):
                    self.sock = _ssl_wrap_socket(self.sock, self.cert_file,
                                                 self.key_file)
            else:
                BetterHTTPS.connect(self)

    class httpshandler(keepalive.KeepAliveHandler, urllib2.HTTPSHandler):
        def __init__(self, ui):
            keepalive.KeepAliveHandler.__init__(self)
            urllib2.HTTPSHandler.__init__(self)
            self.ui = ui
            self.pwmgr = passwordmgr(self.ui)

        def _start_transaction(self, h, req):
            _generic_start_transaction(self, h, req)
            return keepalive.KeepAliveHandler._start_transaction(self, h, req)

        def https_open(self, req):
            self.auth = self.pwmgr.readauthtoken(req.get_full_url())
            return self.do_open(self._makeconnection, req)

        def _makeconnection(self, host, port=None, *args, **kwargs):
            keyfile = None
            certfile = None

            if len(args) >= 1: # key_file
                keyfile = args[0]
            if len(args) >= 2: # cert_file
                certfile = args[1]
            args = args[2:]

            # if the user has specified different key/cert files in
            # hgrc, we prefer these
            if self.auth and 'key' in self.auth and 'cert' in self.auth:
                keyfile = self.auth['key']
                certfile = self.auth['cert']

            conn = httpsconnection(host, port, keyfile, certfile, *args, **kwargs)
            conn.ui = self.ui
            return conn

# In python < 2.5 AbstractDigestAuthHandler raises a ValueError if
# it doesn't know about the auth type requested. This can happen if
# somebody is using BasicAuth and types a bad password.
class httpdigestauthhandler(urllib2.HTTPDigestAuthHandler):
    def http_error_auth_reqed(self, auth_header, host, req, headers):
        try:
            return urllib2.HTTPDigestAuthHandler.http_error_auth_reqed(
                self, auth_header, host, req, headers)
        except ValueError, inst:
            arg = inst.args[0]
            if arg.startswith("AbstractDigestAuthHandler doesn't know "):
                return
            raise

def getauthinfo(path):
    scheme, netloc, urlpath, query, frag = urlparse.urlsplit(path)
    if not urlpath:
        urlpath = '/'
    if scheme != 'file':
        # XXX: why are we quoting the path again with some smart
        # heuristic here? Anyway, it cannot be done with file://
        # urls since path encoding is os/fs dependent (see
        # urllib.pathname2url() for details).
        urlpath = quotepath(urlpath)
    host, port, user, passwd = netlocsplit(netloc)

    # urllib cannot handle URLs with embedded user or passwd
    url = urlparse.urlunsplit((scheme, netlocunsplit(host, port),
                               urlpath, query, frag))
    if user:
        netloc = host
        if port:
            netloc += ':' + port
        # Python < 2.4.3 uses only the netloc to search for a password
        authinfo = (None, (url, netloc), user, passwd or '')
    else:
        authinfo = None
    return url, authinfo

handlerfuncs = []

def opener(ui, authinfo=None):
    '''
    construct an opener suitable for urllib2
    authinfo will be added to the password manager
    '''
    handlers = [httphandler()]
    if has_https:
        handlers.append(httpshandler(ui))

    handlers.append(proxyhandler(ui))

    passmgr = passwordmgr(ui)
    if authinfo is not None:
        passmgr.add_password(*authinfo)
        user, passwd = authinfo[2:4]
        ui.debug('http auth: user %s, password %s\n' %
                 (user, passwd and '*' * len(passwd) or 'not set'))

    handlers.extend((urllib2.HTTPBasicAuthHandler(passmgr),
                     httpdigestauthhandler(passmgr)))
    handlers.extend([h(ui, passmgr) for h in handlerfuncs])
    opener = urllib2.build_opener(*handlers)

    # 1.0 here is the _protocol_ version
    opener.addheaders = [('User-agent', 'mercurial/proto-1.0')]
    opener.addheaders.append(('Accept', 'application/mercurial-0.1'))
    return opener

scheme_re = re.compile(r'^([a-zA-Z0-9+-.]+)://')

def open(ui, url, data=None):
    scheme = None
    m = scheme_re.search(url)
    if m:
        scheme = m.group(1).lower()
    if not scheme:
        path = util.normpath(os.path.abspath(url))
        url = 'file://' + urllib.pathname2url(path)
        authinfo = None
    else:
        url, authinfo = getauthinfo(url)
    return opener(ui, authinfo).open(url, data)