bundle2: rename error exception class for unsupported feature...
Pierre-Yves David
r26393:cff70549 default
@@ -1,1422 +1,1422 @@
# bundle2.py - generic container format to transmit arbitrary data.
#
# Copyright 2013 Facebook, Inc.
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
"""Handling of the new bundle2 format

The goal of bundle2 is to act as an atomic packet to transmit a set of
payloads in an application agnostic way. It consists of a sequence of "parts"
that will be handed to and processed by the application layer.


General format architecture
===========================

The format is structured as follows:

- magic string
- stream level parameters
- payload parts (any number)
- end of stream marker.

The binary format
============================

All numbers are unsigned and big-endian.

stream level parameters
------------------------

The binary format is as follows:

:params size: int32

The total number of Bytes used by the parameters.

:params value: arbitrary number of Bytes

A blob of `params size` containing the serialized version of all stream level
parameters.

The blob contains a space separated list of parameters. Parameters with a value
are stored in the form `<name>=<value>`. Both name and value are urlquoted.

Empty names are forbidden.

Names MUST start with a letter. If this first letter is lower case, the
parameter is advisory and can be safely ignored. However, when the first
letter is capital, the parameter is mandatory and the bundling process MUST
stop if it is not able to process it.

Stream parameters use a simple textual format for two main reasons:

- Stream level parameters should remain simple and we want to discourage any
  crazy usage.
- Textual data allow easy human inspection of a bundle2 header in case of
  trouble.

Any applicative level options MUST go into a bundle2 part instead.

Payload part
------------------------

The binary format is as follows:

:header size: int32

The total number of Bytes used by the part header. When the header is empty
(size = 0) this is interpreted as the end of stream marker.

:header:

The header defines how to interpret the part. It contains two pieces of
data: the part type, and the part parameters.

The part type is used to route to an application level handler that can
interpret the payload.

Part parameters are passed to the application level handler. They are
meant to convey information that will help the application level object to
interpret the part payload.

The binary format of the header is as follows:

:typesize: (one byte)

:parttype: alphanumerical part name (restricted to [a-zA-Z0-9_:-]*)

:partid: A 32-bit integer (unique in the bundle) that can be used to refer
to this part.

:parameters:

A part's parameters may have arbitrary content; the binary structure is::

    <mandatory-count><advisory-count><param-sizes><param-data>

:mandatory-count: 1 byte, number of mandatory parameters

:advisory-count: 1 byte, number of advisory parameters

:param-sizes:

N couples of bytes, where N is the total number of parameters. Each
couple contains (<size-of-key>, <size-of-value>) for one parameter.

:param-data:

A blob of bytes from which each parameter key and value can be
retrieved using the list of size couples stored in the previous
field.

Mandatory parameters come first, then the advisory ones.

Each parameter's key MUST be unique within the part.

:payload:

The payload is a series of `<chunksize><chunkdata>`.

`chunksize` is an int32, `chunkdata` are plain bytes (as much as
`chunksize` says). The payload part is concluded by a zero size chunk.

The current implementation always produces either zero or one chunk.
This is an implementation limitation that will ultimately be lifted.

`chunksize` can be negative to trigger special case processing. No such
processing is in place yet.

Bundle processing
============================

Each part is processed in order using a "part handler". Handlers are registered
for a certain part type.

The matching of a part to its handler is case insensitive. The case of the
part type is used to know if a part is mandatory or advisory. If the part type
contains any uppercase char, it is considered mandatory. When no handler is
known for a mandatory part, the process is aborted and an exception is raised.
If the part is advisory and no handler is known, the part is ignored. When the
process is aborted, the full bundle is still read from the stream to keep the
channel usable. But none of the parts read after an abort are processed. In the
future, dropping the stream may become an option for channels we do not care to
preserve.
"""

from __future__ import absolute_import

import errno
import re
import string
import struct
import sys
import urllib

from .i18n import _
from . import (
    changegroup,
    error,
    obsolete,
    pushkey,
    tags,
    url,
    util,
)

_pack = struct.pack
_unpack = struct.unpack

_fstreamparamsize = '>i'
_fpartheadersize = '>i'
_fparttypesize = '>B'
_fpartid = '>I'
_fpayloadsize = '>i'
_fpartparamcount = '>BB'

preferedchunksize = 4096

_parttypeforbidden = re.compile('[^a-zA-Z0-9_:-]')
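
# Illustrative sketch, not from the original bundle2 module: the smallest
# valid bundle2 stream is the magic string, an int32 of zero meaning "no
# stream level parameters", and an int32 of zero acting as the end of stream
# marker (an empty part header).
def _demo_emptybundle():
    chunks = ['HG20',                       # magic string
              _pack(_fstreamparamsize, 0),  # no stream level parameters
              _pack(_fpartheadersize, 0)]   # empty part header == end of stream
    return ''.join(chunks)                  # 12 bytes in total
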

def outdebug(ui, message):
    """debug regarding output stream (bundling)"""
    if ui.configbool('devel', 'bundle2.debug', False):
        ui.debug('bundle2-output: %s\n' % message)

def indebug(ui, message):
    """debug on input stream (unbundling)"""
    if ui.configbool('devel', 'bundle2.debug', False):
        ui.debug('bundle2-input: %s\n' % message)

def validateparttype(parttype):
    """raise ValueError if a parttype contains invalid characters"""
    if _parttypeforbidden.search(parttype):
        raise ValueError(parttype)

def _makefpartparamsizes(nbparams):
    """return a struct format to read part parameter sizes

    The number of parameters is variable so we need to build that format
    dynamically.
    """
    return '>' + ('BB' * nbparams)
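
# Illustrative sketch, not from the original bundle2 module: byte layout of a
# part header carrying one mandatory parameter, mirroring the framing
# described in the module docstring and produced by bundlepart.getchunks.
def _demo_partheader():
    parttype = 'OUTPUT'                    # upper case == mandatory part
    key, value = 'key', 'value'
    header = ''.join([
        _pack(_fparttypesize, len(parttype)),
        parttype,
        _pack(_fpartid, 0),                # part id, unique in the bundle
        _pack(_fpartparamcount, 1, 0),     # 1 mandatory, 0 advisory params
        _pack(_makefpartparamsizes(1), len(key), len(value)),
        key, value,
    ])
    # on the wire, the header is preceded by its int32 size
    return _pack(_fpartheadersize, len(header)) + header
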

parthandlermapping = {}

def parthandler(parttype, params=()):
    """decorator that registers a function as a bundle2 part handler

    eg::

        @parthandler('myparttype', ('mandatory', 'param', 'handled'))
        def myparttypehandler(...):
            '''process a part of type "my part".'''
            ...
    """
    validateparttype(parttype)
    def _decorator(func):
        lparttype = parttype.lower() # enforce lower case matching.
        assert lparttype not in parthandlermapping
        parthandlermapping[lparttype] = func
        func.params = frozenset(params)
        return func
    return _decorator

class unbundlerecords(object):
    """keep record of what happens during an unbundle

    New records are added using `records.add('cat', obj)`, where 'cat' is a
    category of record and obj is an arbitrary object.

    `records['cat']` will return all entries of this category 'cat'.

    Iterating on the object itself will yield `('category', obj)` tuples
    for all entries.

    All iterations happen in chronological order.
    """

    def __init__(self):
        self._categories = {}
        self._sequences = []
        self._replies = {}

    def add(self, category, entry, inreplyto=None):
        """add a new record of a given category.

        The entry can then be retrieved in the list returned by
        self['category']."""
        self._categories.setdefault(category, []).append(entry)
        self._sequences.append((category, entry))
        if inreplyto is not None:
            self.getreplies(inreplyto).add(category, entry)

    def getreplies(self, partid):
        """get the records that are replies to a specific part"""
        return self._replies.setdefault(partid, unbundlerecords())

    def __getitem__(self, cat):
        return tuple(self._categories.get(cat, ()))

    def __iter__(self):
        return iter(self._sequences)

    def __len__(self):
        return len(self._sequences)

    def __nonzero__(self):
        return bool(self._sequences)
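
# Illustrative sketch, not from the original bundle2 module: how unbundle
# records accumulate results per category and per originating part.
def _demo_records():
    records = unbundlerecords()
    records.add('changegroup', {'return': 1})
    records.add('output', 'remote: ok\n', inreplyto=0)
    assert records['changegroup'] == ({'return': 1},)
    assert len(records.getreplies(0)) == 1
    return list(records)   # [('changegroup', ...), ('output', ...)]
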

class bundleoperation(object):
    """an object that represents a single bundling process

    Its purpose is to carry unbundle-related objects and states.

    A new object should be created at the beginning of each bundle processing.
    The object is to be returned by the processing function.

    The object has very little content now; it will ultimately contain:
    * an access to the repo the bundle is applied to,
    * a ui object,
    * a way to retrieve a transaction to add changes to the repo,
    * a way to record the result of processing each part,
    * a way to construct a bundle response when applicable.
    """

    def __init__(self, repo, transactiongetter, captureoutput=True):
        self.repo = repo
        self.ui = repo.ui
        self.records = unbundlerecords()
        self.gettransaction = transactiongetter
        self.reply = None
        self.captureoutput = captureoutput

class TransactionUnavailable(RuntimeError):
    pass

def _notransaction():
    """default method to get a transaction while processing a bundle

    Raise an exception to highlight the fact that no transaction was expected
    to be created"""
    raise TransactionUnavailable()

def processbundle(repo, unbundler, transactiongetter=None, op=None):
    """This function processes a bundle, applying effects to/from a repo

    It iterates over each part then searches for and uses the proper handling
    code to process the part. Parts are processed in order.

    This is a very early version of this function that will be strongly
    reworked before final usage.

    An unknown mandatory part will abort the process.

    It is temporarily possible to provide a prebuilt bundleoperation to the
    function. This is used to ensure output is properly propagated in case of
    an error during the unbundling. This output capturing part will likely be
    reworked and this ability will probably go away in the process.
    """
    if op is None:
        if transactiongetter is None:
            transactiongetter = _notransaction
        op = bundleoperation(repo, transactiongetter)
    # todo:
    # - replace this with an init function soon.
    # - exception catching
    unbundler.params
    if repo.ui.debugflag:
        msg = ['bundle2-input-bundle:']
        if unbundler.params:
            msg.append(' %i params' % len(unbundler.params))
        if op.gettransaction is None:
            msg.append(' no-transaction')
        else:
            msg.append(' with-transaction')
        msg.append('\n')
        repo.ui.debug(''.join(msg))
    iterparts = enumerate(unbundler.iterparts())
    part = None
    nbpart = 0
    try:
        for nbpart, part in iterparts:
            _processpart(op, part)
    except BaseException as exc:
        for nbpart, part in iterparts:
            # consume the bundle content
            part.seek(0, 2)
        # Small hack to let caller code distinguish exceptions from bundle2
        # processing from exceptions raised while processing the old format.
        # This is mostly needed to handle different return codes to unbundle
        # according to the type of bundle. We should probably clean up or drop
        # this return code craziness in a future version.
        exc.duringunbundle2 = True
        salvaged = []
        replycaps = None
        if op.reply is not None:
            salvaged = op.reply.salvageoutput()
            replycaps = op.reply.capabilities
        exc._replycaps = replycaps
        exc._bundle2salvagedoutput = salvaged
        raise
    finally:
        repo.ui.debug('bundle2-input-bundle: %i parts total\n' % nbpart)

    return op
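
# Illustrative sketch, not from the original bundle2 module: apply a bundle2
# stream to a repo, assuming ``repo`` is a local Mercurial repository object
# and ``fp`` an open file-like object positioned at the start of the stream.
# With no transactiongetter, _notransaction is used, so any part that really
# needs a transaction raises TransactionUnavailable; real callers pass a
# transaction factory instead.
def _demo_processbundle(repo, fp):
    unbundler = getunbundler(repo.ui, fp)
    op = processbundle(repo, unbundler)
    return op.records
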

def _processpart(op, part):
    """process a single part from a bundle

    The part is guaranteed to have been fully consumed when the function exits
    (even if an exception is raised)."""
    status = 'unknown' # used by debug output
    try:
        try:
            handler = parthandlermapping.get(part.type)
            if handler is None:
                status = 'unsupported-type'
-               raise error.UnsupportedPartError(parttype=part.type)
+               raise error.BundleUnknownFeatureError(parttype=part.type)
            indebug(op.ui, 'found a handler for part %r' % part.type)
            unknownparams = part.mandatorykeys - handler.params
            if unknownparams:
                unknownparams = list(unknownparams)
                unknownparams.sort()
                status = 'unsupported-params (%s)' % unknownparams
-               raise error.UnsupportedPartError(parttype=part.type,
-                                                params=unknownparams)
+               raise error.BundleUnknownFeatureError(parttype=part.type,
+                                                     params=unknownparams)
            status = 'supported'
-       except error.UnsupportedPartError as exc:
+       except error.BundleUnknownFeatureError as exc:
            if part.mandatory: # mandatory parts
                raise
            indebug(op.ui, 'ignoring unsupported advisory part %s' % exc)
            return # skip to part processing
        finally:
            if op.ui.debugflag:
                msg = ['bundle2-input-part: "%s"' % part.type]
                if not part.mandatory:
                    msg.append(' (advisory)')
                nbmp = len(part.mandatorykeys)
                nbap = len(part.params) - nbmp
                if nbmp or nbap:
                    msg.append(' (params:')
                    if nbmp:
                        msg.append(' %i mandatory' % nbmp)
                    if nbap:
                        msg.append(' %i advisory' % nbap)
                    msg.append(')')
                msg.append(' %s\n' % status)
                op.ui.debug(''.join(msg))

        # handler is called outside the above try block so that we don't
        # risk catching KeyErrors from anything other than the
        # parthandlermapping lookup (any KeyError raised by handler()
        # itself represents a defect of a different variety).
        output = None
        if op.captureoutput and op.reply is not None:
            op.ui.pushbuffer(error=True, subproc=True)
            output = ''
        try:
            handler(op, part)
        finally:
            if output is not None:
                output = op.ui.popbuffer()
            if output:
                outpart = op.reply.newpart('output', data=output,
                                           mandatory=False)
                outpart.addparam('in-reply-to', str(part.id), mandatory=False)
    finally:
        # consume the part content to not corrupt the stream.
        part.seek(0, 2)


def decodecaps(blob):
    """decode a bundle2 caps bytes blob into a dictionary

    The blob is a list of capabilities (one per line).
    Capabilities may have values using a line of the form::

        capability=value1,value2,value3

    The values are always a list."""
    caps = {}
    for line in blob.splitlines():
        if not line:
            continue
        if '=' not in line:
            key, vals = line, ()
        else:
            key, vals = line.split('=', 1)
            vals = vals.split(',')
        key = urllib.unquote(key)
        vals = [urllib.unquote(v) for v in vals]
        caps[key] = vals
    return caps

def encodecaps(caps):
    """encode a bundle2 caps dictionary into a bytes blob"""
    chunks = []
    for ca in sorted(caps):
        vals = caps[ca]
        ca = urllib.quote(ca)
        vals = [urllib.quote(v) for v in vals]
        if vals:
            ca = "%s=%s" % (ca, ','.join(vals))
        chunks.append(ca)
    return '\n'.join(chunks)
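
# Illustrative sketch, not from the original bundle2 module: capabilities
# serialize to one line per capability, with values comma separated.
def _demo_caps_roundtrip():
    caps = {'HG20': [], 'changegroup': ['01', '02']}
    blob = encodecaps(caps)           # 'HG20\nchangegroup=01,02'
    return decodecaps(blob) == caps   # True
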

class bundle20(object):
    """represent an outgoing bundle2 container

    Use the `addparam` method to add stream level parameters and `newpart` to
    populate it with parts. Then call `getchunks` to retrieve all the binary
    chunks of data that compose the bundle2 container."""

    _magicstring = 'HG20'

    def __init__(self, ui, capabilities=()):
        self.ui = ui
        self._params = []
        self._parts = []
        self.capabilities = dict(capabilities)

    @property
    def nbparts(self):
        """total number of parts added to the bundler"""
        return len(self._parts)

    # methods used to define the bundle2 content
    def addparam(self, name, value=None):
        """add a stream level parameter"""
        if not name:
            raise ValueError('empty parameter name')
        if name[0] not in string.letters:
            raise ValueError('non letter first character: %r' % name)
        self._params.append((name, value))

    def addpart(self, part):
        """add a new part to the bundle2 container

        Parts contain the actual applicative payload."""
        assert part.id is None
        part.id = len(self._parts) # very cheap counter
        self._parts.append(part)

    def newpart(self, typeid, *args, **kwargs):
        """create a new part and add it to the container

        The part is directly added to the container. For now, this means
        that any failure to properly initialize the part after calling
        ``newpart`` should result in a failure of the whole bundling process.

        You can still fall back to manually creating and adding the part if
        you need better control."""
        part = bundlepart(typeid, *args, **kwargs)
        self.addpart(part)
        return part

    # methods used to generate the bundle2 stream
    def getchunks(self):
        if self.ui.debugflag:
            msg = ['bundle2-output-bundle: "%s",' % self._magicstring]
            if self._params:
                msg.append(' (%i params)' % len(self._params))
            msg.append(' %i parts total\n' % len(self._parts))
            self.ui.debug(''.join(msg))
        outdebug(self.ui, 'start emission of %s stream' % self._magicstring)
        yield self._magicstring
        param = self._paramchunk()
        outdebug(self.ui, 'bundle parameter: %s' % param)
        yield _pack(_fstreamparamsize, len(param))
        if param:
            yield param

        outdebug(self.ui, 'start of parts')
        for part in self._parts:
            outdebug(self.ui, 'bundle part: "%s"' % part.type)
            for chunk in part.getchunks(ui=self.ui):
                yield chunk
        outdebug(self.ui, 'end of bundle')
        yield _pack(_fpartheadersize, 0)

    def _paramchunk(self):
        """return an encoded version of all stream parameters"""
        blocks = []
        for par, value in self._params:
            par = urllib.quote(par)
            if value is not None:
                value = urllib.quote(value)
                par = '%s=%s' % (par, value)
            blocks.append(par)
        return ' '.join(blocks)

    def salvageoutput(self):
        """return a list with a copy of all output parts in the bundle

        This is meant to be used during error handling to make sure we
        preserve server output"""
        salvaged = []
        for part in self._parts:
            if part.type.startswith('output'):
                salvaged.append(part.copy())
        return salvaged
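
# Illustrative sketch, not from the original bundle2 module: assemble a
# bundle2 container with one advisory part and write it to ``path``,
# assuming ``ui`` is a Mercurial ui object.
def _demo_writebundle(ui, path):
    bundler = bundle20(ui)
    bundler.addparam('demoparam', 'value')     # lower case == advisory
    part = bundler.newpart('output', data='hello', mandatory=False)
    part.addparam('note', 'illustration', mandatory=False)
    fp = open(path, 'wb')
    try:
        for chunk in bundler.getchunks():
            fp.write(chunk)
    finally:
        fp.close()
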


class unpackermixin(object):
    """A mixin to extract bytes and struct data from a stream"""

    def __init__(self, fp):
        self._fp = fp
        self._seekable = (util.safehasattr(fp, 'seek') and
                          util.safehasattr(fp, 'tell'))

    def _unpack(self, format):
        """unpack this struct format from the stream"""
        data = self._readexact(struct.calcsize(format))
        return _unpack(format, data)

    def _readexact(self, size):
        """read exactly <size> bytes from the stream"""
        return changegroup.readexactly(self._fp, size)

    def seek(self, offset, whence=0):
        """move the underlying file pointer"""
        if self._seekable:
            return self._fp.seek(offset, whence)
        else:
            raise NotImplementedError(_('File pointer is not seekable'))

    def tell(self):
        """return the file offset, or None if file is not seekable"""
        if self._seekable:
            try:
                return self._fp.tell()
            except IOError as e:
                if e.errno == errno.ESPIPE:
                    self._seekable = False
                else:
                    raise
        return None

    def close(self):
        """close underlying file"""
        if util.safehasattr(self._fp, 'close'):
            return self._fp.close()
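
# Illustrative sketch, not from the original bundle2 module: unpackermixin
# reads fixed-size big-endian fields from any file-like object.
def _demo_unpack():
    import io
    stream = io.BytesIO(_pack(_fpartheadersize, 42))
    reader = unpackermixin(stream)
    return reader._unpack(_fpartheadersize)   # (42,)
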

def getunbundler(ui, fp, magicstring=None):
    """return a valid unbundler object for a given magicstring"""
    if magicstring is None:
        magicstring = changegroup.readexactly(fp, 4)
    magic, version = magicstring[0:2], magicstring[2:4]
    if magic != 'HG':
        raise util.Abort(_('not a Mercurial bundle'))
    unbundlerclass = formatmap.get(version)
    if unbundlerclass is None:
        raise util.Abort(_('unknown bundle version %s') % version)
    unbundler = unbundlerclass(ui, fp)
    indebug(ui, 'start processing of %s stream' % magicstring)
    return unbundler
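
# Illustrative sketch, not from the original bundle2 module: list the parts
# of an uncompressed HG20 bundle file without processing them, assuming
# ``ui`` is a Mercurial ui object.
def _demo_readbundle(ui, path):
    fp = open(path, 'rb')
    try:
        unbundler = getunbundler(ui, fp)
        seen = []
        for part in unbundler.iterparts():
            seen.append((part.type, part.mandatory))
        return seen
    finally:
        fp.close()
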

class unbundle20(unpackermixin):
    """interpret a bundle2 stream

    This class is fed with a binary stream and yields parts through its
    `iterparts` method."""

    def __init__(self, ui, fp):
        """If header is specified, we do not read it out of the stream."""
        self.ui = ui
        super(unbundle20, self).__init__(fp)

    @util.propertycache
    def params(self):
        """dictionary of stream level parameters"""
        indebug(self.ui, 'reading bundle2 stream parameters')
        params = {}
        paramssize = self._unpack(_fstreamparamsize)[0]
        if paramssize < 0:
            raise error.BundleValueError('negative bundle param size: %i'
                                         % paramssize)
        if paramssize:
            for p in self._readexact(paramssize).split(' '):
                p = p.split('=', 1)
                p = [urllib.unquote(i) for i in p]
                if len(p) < 2:
                    p.append(None)
                self._processparam(*p)
                params[p[0]] = p[1]
        return params

    def _processparam(self, name, value):
        """process a parameter, applying its effect if needed

        Parameters starting with a lower case letter are advisory and will be
        ignored when unknown. Those starting with an upper case letter are
        mandatory, and this function will raise a KeyError when unknown.

        Note: no options are currently supported. Any input will either be
        ignored or fail.
        """
        if not name:
            raise ValueError('empty parameter name')
        if name[0] not in string.letters:
            raise ValueError('non letter first character: %r' % name)
        # Some logic will be added here later to try to process the option
        # against a dict of known parameters.
        if name[0].islower():
            indebug(self.ui, "ignoring unknown parameter %r" % name)
        else:
-           raise error.UnsupportedPartError(params=(name,))
+           raise error.BundleUnknownFeatureError(params=(name,))


    def iterparts(self):
        """yield all parts contained in the stream"""
        # make sure params have been loaded
        self.params
        indebug(self.ui, 'start extraction of bundle2 parts')
        headerblock = self._readpartheader()
        while headerblock is not None:
            part = unbundlepart(self.ui, headerblock, self._fp)
            yield part
            part.seek(0, 2)
            headerblock = self._readpartheader()
        indebug(self.ui, 'end of bundle2 stream')

    def _readpartheader(self):
        """reads a part header size and returns the bytes blob

        returns None if empty"""
        headersize = self._unpack(_fpartheadersize)[0]
        if headersize < 0:
            raise error.BundleValueError('negative part header size: %i'
                                         % headersize)
        indebug(self.ui, 'part header size: %i' % headersize)
        if headersize:
            return self._readexact(headersize)
        return None

    def compressed(self):
        return False

formatmap = {'20': unbundle20}
702
702
703 class bundlepart(object):
703 class bundlepart(object):
704 """A bundle2 part contains application level payload
704 """A bundle2 part contains application level payload
705
705
706 The part `type` is used to route the part to the application level
706 The part `type` is used to route the part to the application level
707 handler.
707 handler.
708
708
709 The part payload is contained in ``part.data``. It could be raw bytes or a
709 The part payload is contained in ``part.data``. It could be raw bytes or a
710 generator of byte chunks.
710 generator of byte chunks.
711
711
712 You can add parameters to the part using the ``addparam`` method.
712 You can add parameters to the part using the ``addparam`` method.
713 Parameters can be either mandatory (default) or advisory. Remote side
713 Parameters can be either mandatory (default) or advisory. Remote side
714 should be able to safely ignore the advisory ones.
714 should be able to safely ignore the advisory ones.
715
715
716 Both data and parameters cannot be modified after the generation has begun.
716 Both data and parameters cannot be modified after the generation has begun.
717 """
717 """
718
718
719 def __init__(self, parttype, mandatoryparams=(), advisoryparams=(),
719 def __init__(self, parttype, mandatoryparams=(), advisoryparams=(),
720 data='', mandatory=True):
720 data='', mandatory=True):
721 validateparttype(parttype)
721 validateparttype(parttype)
722 self.id = None
722 self.id = None
723 self.type = parttype
723 self.type = parttype
724 self._data = data
724 self._data = data
725 self._mandatoryparams = list(mandatoryparams)
725 self._mandatoryparams = list(mandatoryparams)
726 self._advisoryparams = list(advisoryparams)
726 self._advisoryparams = list(advisoryparams)
727 # checking for duplicated entries
727 # checking for duplicated entries
728 self._seenparams = set()
728 self._seenparams = set()
729 for pname, __ in self._mandatoryparams + self._advisoryparams:
729 for pname, __ in self._mandatoryparams + self._advisoryparams:
730 if pname in self._seenparams:
730 if pname in self._seenparams:
731 raise RuntimeError('duplicated params: %s' % pname)
731 raise RuntimeError('duplicated params: %s' % pname)
732 self._seenparams.add(pname)
732 self._seenparams.add(pname)
733 # status of the part's generation:
733 # status of the part's generation:
734 # - None: not started,
734 # - None: not started,
735 # - False: currently generated,
735 # - False: currently generated,
736 # - True: generation done.
736 # - True: generation done.
737 self._generated = None
737 self._generated = None
738 self.mandatory = mandatory
738 self.mandatory = mandatory
739
739
740 def copy(self):
740 def copy(self):
741 """return a copy of the part
741 """return a copy of the part
742
742
743 The new part have the very same content but no partid assigned yet.
743 The new part have the very same content but no partid assigned yet.
744 Parts with generated data cannot be copied."""
744 Parts with generated data cannot be copied."""
745 assert not util.safehasattr(self.data, 'next')
745 assert not util.safehasattr(self.data, 'next')
746 return self.__class__(self.type, self._mandatoryparams,
746 return self.__class__(self.type, self._mandatoryparams,
747 self._advisoryparams, self._data, self.mandatory)
747 self._advisoryparams, self._data, self.mandatory)
748
748
749 # methods used to defines the part content
749 # methods used to defines the part content
750 def __setdata(self, data):
750 def __setdata(self, data):
751 if self._generated is not None:
751 if self._generated is not None:
752 raise error.ReadOnlyPartError('part is being generated')
752 raise error.ReadOnlyPartError('part is being generated')
753 self._data = data
753 self._data = data
754 def __getdata(self):
754 def __getdata(self):
755 return self._data
755 return self._data
756 data = property(__getdata, __setdata)
756 data = property(__getdata, __setdata)
757
757
758 @property
758 @property
759 def mandatoryparams(self):
759 def mandatoryparams(self):
760 # make it an immutable tuple to force people through ``addparam``
760 # make it an immutable tuple to force people through ``addparam``
761 return tuple(self._mandatoryparams)
761 return tuple(self._mandatoryparams)
762
762
763 @property
763 @property
764 def advisoryparams(self):
764 def advisoryparams(self):
765 # make it an immutable tuple to force people through ``addparam``
765 # make it an immutable tuple to force people through ``addparam``
766 return tuple(self._advisoryparams)
766 return tuple(self._advisoryparams)
767
767
768 def addparam(self, name, value='', mandatory=True):
768 def addparam(self, name, value='', mandatory=True):
769 if self._generated is not None:
769 if self._generated is not None:
770 raise error.ReadOnlyPartError('part is being generated')
770 raise error.ReadOnlyPartError('part is being generated')
771 if name in self._seenparams:
771 if name in self._seenparams:
772 raise ValueError('duplicated params: %s' % name)
772 raise ValueError('duplicated params: %s' % name)
773 self._seenparams.add(name)
773 self._seenparams.add(name)
774 params = self._advisoryparams
774 params = self._advisoryparams
775 if mandatory:
775 if mandatory:
776 params = self._mandatoryparams
776 params = self._mandatoryparams
777 params.append((name, value))
777 params.append((name, value))
778
778
779 # methods used to generates the bundle2 stream
779 # methods used to generates the bundle2 stream
780 def getchunks(self, ui):
780 def getchunks(self, ui):
781 if self._generated is not None:
781 if self._generated is not None:
782 raise RuntimeError('part can only be consumed once')
782 raise RuntimeError('part can only be consumed once')
783 self._generated = False
783 self._generated = False
784
784
785 if ui.debugflag:
785 if ui.debugflag:
786 msg = ['bundle2-output-part: "%s"' % self.type]
786 msg = ['bundle2-output-part: "%s"' % self.type]
787 if not self.mandatory:
787 if not self.mandatory:
788 msg.append(' (advisory)')
788 msg.append(' (advisory)')
789 nbmp = len(self.mandatoryparams)
789 nbmp = len(self.mandatoryparams)
790 nbap = len(self.advisoryparams)
790 nbap = len(self.advisoryparams)
791 if nbmp or nbap:
791 if nbmp or nbap:
792 msg.append(' (params:')
792 msg.append(' (params:')
793 if nbmp:
793 if nbmp:
794 msg.append(' %i mandatory' % nbmp)
794 msg.append(' %i mandatory' % nbmp)
795 if nbap:
795 if nbap:
796 msg.append(' %i advisory' % nbmp)
796 msg.append(' %i advisory' % nbmp)
797 msg.append(')')
797 msg.append(')')
798 if not self.data:
798 if not self.data:
799 msg.append(' empty payload')
799 msg.append(' empty payload')
800 elif util.safehasattr(self.data, 'next'):
800 elif util.safehasattr(self.data, 'next'):
801 msg.append(' streamed payload')
801 msg.append(' streamed payload')
802 else:
802 else:
803 msg.append(' %i bytes payload' % len(self.data))
803 msg.append(' %i bytes payload' % len(self.data))
804 msg.append('\n')
804 msg.append('\n')
805 ui.debug(''.join(msg))
805 ui.debug(''.join(msg))
806
806
807 #### header
807 #### header
808 if self.mandatory:
808 if self.mandatory:
809 parttype = self.type.upper()
809 parttype = self.type.upper()
810 else:
810 else:
811 parttype = self.type.lower()
811 parttype = self.type.lower()
812 outdebug(ui, 'part %s: "%s"' % (self.id, parttype))
812 outdebug(ui, 'part %s: "%s"' % (self.id, parttype))
813 ## parttype
813 ## parttype
814 header = [_pack(_fparttypesize, len(parttype)),
814 header = [_pack(_fparttypesize, len(parttype)),
815 parttype, _pack(_fpartid, self.id),
815 parttype, _pack(_fpartid, self.id),
816 ]
816 ]
817 ## parameters
817 ## parameters
818 # count
818 # count
819 manpar = self.mandatoryparams
819 manpar = self.mandatoryparams
820 advpar = self.advisoryparams
820 advpar = self.advisoryparams
821 header.append(_pack(_fpartparamcount, len(manpar), len(advpar)))
821 header.append(_pack(_fpartparamcount, len(manpar), len(advpar)))
822 # size
822 # size
823 parsizes = []
824 for key, value in manpar:
825 parsizes.append(len(key))
826 parsizes.append(len(value))
827 for key, value in advpar:
828 parsizes.append(len(key))
829 parsizes.append(len(value))
830 paramsizes = _pack(_makefpartparamsizes(len(parsizes) / 2), *parsizes)
831 header.append(paramsizes)
832 # key, value
833 for key, value in manpar:
834 header.append(key)
835 header.append(value)
836 for key, value in advpar:
837 header.append(key)
838 header.append(value)
839 ## finalize header
840 headerchunk = ''.join(header)
841 outdebug(ui, 'header chunk size: %i' % len(headerchunk))
842 yield _pack(_fpartheadersize, len(headerchunk))
843 yield headerchunk
844 ## payload
845 try:
846 for chunk in self._payloadchunks():
847 outdebug(ui, 'payload chunk size: %i' % len(chunk))
848 yield _pack(_fpayloadsize, len(chunk))
849 yield chunk
850 except GeneratorExit:
851 # GeneratorExit means that nobody is listening for our
852 # results anyway, so just bail quickly rather than trying
853 # to produce an error part.
854 ui.debug('bundle2-generatorexit\n')
855 raise
856 except BaseException as exc:
857 # backup exception data for later
858 ui.debug('bundle2-input-stream-interrupt: encoding exception %s'
859 % exc)
860 exc_info = sys.exc_info()
861 msg = 'unexpected error: %s' % exc
862 interpart = bundlepart('error:abort', [('message', msg)],
863 mandatory=False)
864 interpart.id = 0
865 yield _pack(_fpayloadsize, -1)
866 for chunk in interpart.getchunks(ui=ui):
867 yield chunk
868 outdebug(ui, 'closing payload chunk')
869 # abort current part payload
870 yield _pack(_fpayloadsize, 0)
871 raise exc_info[0], exc_info[1], exc_info[2]
872 # end of payload
873 outdebug(ui, 'closing payload chunk')
874 yield _pack(_fpayloadsize, 0)
875 self._generated = True
876
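For orientation, getchunks() above frames a part as: an int32 header size, the header chunk, then a sequence of payload chunks each prefixed by an int32 length, terminated by a zero-length chunk (-1 is reserved as the interruption flag handled further below). A minimal reader sketch, assuming the big-endian '>i' struct formats that _fpartheadersize and _fpayloadsize name earlier in the module:

    import struct

    def read_one_part(fp):
        # header: int32 size followed by that many bytes
        headersize = struct.unpack('>i', fp.read(4))[0]
        header = fp.read(headersize)
        # payload: length-prefixed chunks until a 0 terminator
        chunks = []
        size = struct.unpack('>i', fp.read(4))[0]
        while size:
            if size == -1:
                # interruption flag: a nested out-of-band part follows;
                # the real logic lives in interrupthandler below
                raise NotImplementedError('interrupt handling elided')
            chunks.append(fp.read(size))
            size = struct.unpack('>i', fp.read(4))[0]
        return header, ''.join(chunks)

This is an illustrative sketch of the framing only, not code from the changeset.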
877 def _payloadchunks(self):
878 """yield chunks of the part payload
879
880 Exists to handle the different methods to provide data to a part."""
881 # we only support fixed size data now.
882 # This will be improved in the future.
883 if util.safehasattr(self.data, 'next'):
884 buff = util.chunkbuffer(self.data)
885 chunk = buff.read(preferedchunksize)
886 while chunk:
887 yield chunk
888 chunk = buff.read(preferedchunksize)
889 elif len(self.data):
890 yield self.data
891
892
893 flaginterrupt = -1
894
895 class interrupthandler(unpackermixin):
896 """read one part and process it with restricted capability
897
898 This allows transmitting an exception raised on the producer side during
899 part iteration while the consumer is reading a part.
900
901 Parts processed in this manner only have access to a ui object."""
902
903 def __init__(self, ui, fp):
904 super(interrupthandler, self).__init__(fp)
905 self.ui = ui
906
907 def _readpartheader(self):
908 """read a part header size and return the bytes blob
909
910 returns None if empty"""
911 headersize = self._unpack(_fpartheadersize)[0]
912 if headersize < 0:
913 raise error.BundleValueError('negative part header size: %i'
914 % headersize)
915 indebug(self.ui, 'part header size: %i\n' % headersize)
916 if headersize:
917 return self._readexact(headersize)
918 return None
919
920 def __call__(self):
921
922 self.ui.debug('bundle2-input-stream-interrupt:'
923 ' opening out of band context\n')
924 indebug(self.ui, 'bundle2 stream interruption, looking for a part.')
925 headerblock = self._readpartheader()
926 if headerblock is None:
927 indebug(self.ui, 'no part found during interruption.')
928 return
929 part = unbundlepart(self.ui, headerblock, self._fp)
930 op = interruptoperation(self.ui)
931 _processpart(op, part)
932 self.ui.debug('bundle2-input-stream-interrupt:'
933 ' closing out of band context\n')
934
935 class interruptoperation(object):
936 """A limited operation to be used by part handlers during interruption
937
938 It only has access to a ui object.
939 """
940
941 def __init__(self, ui):
942 self.ui = ui
943 self.reply = None
944 self.captureoutput = False
945
946 @property
947 def repo(self):
948 raise RuntimeError('no repo access from stream interruption')
949
950 def gettransaction(self):
951 raise TransactionUnavailable('no repo access from stream interruption')
952
953 class unbundlepart(unpackermixin):
954 """a bundle part read from a bundle"""
955
956 def __init__(self, ui, header, fp):
957 super(unbundlepart, self).__init__(fp)
958 self.ui = ui
959 # unbundle state attr
960 self._headerdata = header
961 self._headeroffset = 0
962 self._initialized = False
963 self.consumed = False
964 # part data
965 self.id = None
966 self.type = None
967 self.mandatoryparams = None
968 self.advisoryparams = None
969 self.params = None
970 self.mandatorykeys = ()
971 self._payloadstream = None
972 self._readheader()
973 self._mandatory = None
974 self._chunkindex = [] #(payload, file) position tuples for chunk starts
975 self._pos = 0
976
977 def _fromheader(self, size):
978 """return the next <size> bytes from the header"""
979 offset = self._headeroffset
980 data = self._headerdata[offset:(offset + size)]
981 self._headeroffset = offset + size
982 return data
983
984 def _unpackheader(self, format):
985 """read given format from header
986
987 This automatically computes the size of the format to read."""
988 data = self._fromheader(struct.calcsize(format))
989 return _unpack(format, data)
990
991 def _initparams(self, mandatoryparams, advisoryparams):
992 """internal function to set up all logic-related parameters"""
993 # make it read only to prevent people touching it by mistake.
994 self.mandatoryparams = tuple(mandatoryparams)
995 self.advisoryparams = tuple(advisoryparams)
996 # user friendly UI
997 self.params = dict(self.mandatoryparams)
998 self.params.update(dict(self.advisoryparams))
999 self.mandatorykeys = frozenset(p[0] for p in mandatoryparams)
1000
1001 def _payloadchunks(self, chunknum=0):
1002 '''seek to specified chunk and start yielding data'''
1003 if len(self._chunkindex) == 0:
1004 assert chunknum == 0, 'Must start with chunk 0'
1005 self._chunkindex.append((0, super(unbundlepart, self).tell()))
1006 else:
1007 assert chunknum < len(self._chunkindex), \
1008 'Unknown chunk %d' % chunknum
1009 super(unbundlepart, self).seek(self._chunkindex[chunknum][1])
1010
1011 pos = self._chunkindex[chunknum][0]
1012 payloadsize = self._unpack(_fpayloadsize)[0]
1013 indebug(self.ui, 'payload chunk size: %i' % payloadsize)
1014 while payloadsize:
1015 if payloadsize == flaginterrupt:
1016 # interruption detection, the handler will now read a
1017 # single part and process it.
1018 interrupthandler(self.ui, self._fp)()
1019 elif payloadsize < 0:
1020 msg = 'negative payload chunk size: %i' % payloadsize
1021 raise error.BundleValueError(msg)
1022 else:
1023 result = self._readexact(payloadsize)
1024 chunknum += 1
1025 pos += payloadsize
1026 if chunknum == len(self._chunkindex):
1027 self._chunkindex.append((pos,
1028 super(unbundlepart, self).tell()))
1029 yield result
1030 payloadsize = self._unpack(_fpayloadsize)[0]
1031 indebug(self.ui, 'payload chunk size: %i' % payloadsize)
1032
1033 def _findchunk(self, pos):
1034 '''for a given payload position, return a chunk number and offset'''
1035 for chunk, (ppos, fpos) in enumerate(self._chunkindex):
1036 if ppos == pos:
1037 return chunk, 0
1038 elif ppos > pos:
1039 return chunk - 1, pos - self._chunkindex[chunk - 1][0]
1040 raise ValueError('Unknown chunk')
1041
1042 def _readheader(self):
1043 """read the header and setup the object"""
1044 typesize = self._unpackheader(_fparttypesize)[0]
1045 self.type = self._fromheader(typesize)
1046 indebug(self.ui, 'part type: "%s"' % self.type)
1047 self.id = self._unpackheader(_fpartid)[0]
1048 indebug(self.ui, 'part id: "%s"' % self.id)
1049 # extract mandatory bit from type
1050 self.mandatory = (self.type != self.type.lower())
1051 self.type = self.type.lower()
1052 ## reading parameters
1053 # param count
1054 mancount, advcount = self._unpackheader(_fpartparamcount)
1055 indebug(self.ui, 'part parameters: %i' % (mancount + advcount))
1056 # param size
1057 fparamsizes = _makefpartparamsizes(mancount + advcount)
1058 paramsizes = self._unpackheader(fparamsizes)
1059 # make it a list of pairs again
1060 paramsizes = zip(paramsizes[::2], paramsizes[1::2])
1061 # split mandatory from advisory
1062 mansizes = paramsizes[:mancount]
1063 advsizes = paramsizes[mancount:]
1064 # retrieve param value
1065 manparams = []
1066 for key, value in mansizes:
1067 manparams.append((self._fromheader(key), self._fromheader(value)))
1068 advparams = []
1069 for key, value in advsizes:
1070 advparams.append((self._fromheader(key), self._fromheader(value)))
1071 self._initparams(manparams, advparams)
1072 ## part payload
1073 self._payloadstream = util.chunkbuffer(self._payloadchunks())
1074 # we read the data, tell it
1075 self._initialized = True
1076
1077 def read(self, size=None):
1078 """read payload data"""
1079 if not self._initialized:
1080 self._readheader()
1081 if size is None:
1082 data = self._payloadstream.read()
1083 else:
1084 data = self._payloadstream.read(size)
1085 self._pos += len(data)
1086 if size is None or len(data) < size:
1087 if not self.consumed and self._pos:
1088 self.ui.debug('bundle2-input-part: total payload size %i\n'
1089 % self._pos)
1090 self.consumed = True
1091 return data
1092
1093 def tell(self):
1094 return self._pos
1095
1096 def seek(self, offset, whence=0):
1097 if whence == 0:
1098 newpos = offset
1099 elif whence == 1:
1100 newpos = self._pos + offset
1101 elif whence == 2:
1102 if not self.consumed:
1103 self.read()
1104 newpos = self._chunkindex[-1][0] - offset
1105 else:
1106 raise ValueError('Unknown whence value: %r' % (whence,))
1107
1108 if newpos > self._chunkindex[-1][0] and not self.consumed:
1109 self.read()
1110 if not 0 <= newpos <= self._chunkindex[-1][0]:
1111 raise ValueError('Offset out of range')
1112
1113 if self._pos != newpos:
1114 chunk, internaloffset = self._findchunk(newpos)
1115 self._payloadstream = util.chunkbuffer(self._payloadchunks(chunk))
1116 adjust = self.read(internaloffset)
1117 if len(adjust) != internaloffset:
1118 raise util.Abort(_('Seek failed\n'))
1119 self._pos = newpos
1120
1121 # These are only the static capabilities.
1122 # Check the 'getrepocaps' function for the rest.
1123 capabilities = {'HG20': (),
1124 'error': ('abort', 'unsupportedcontent', 'pushraced',
1125 'pushkey'),
1126 'listkeys': (),
1127 'pushkey': (),
1128 'digests': tuple(sorted(util.DIGESTS.keys())),
1129 'remote-changegroup': ('http', 'https'),
1130 'hgtagsfnodes': (),
1131 }
1132
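These static capabilities are advertised to peers as a text blob, presumably via the encodecaps/decodecaps helpers this module defines: one capability per line, values joined by commas after '=', with the whole blob URL-quoted inside the peer's 'bundle2' capability (see bundle2caps below). Under that encoding, the table above plus the 'changegroup' entry added by getrepocaps would look roughly like:

    HG20
    changegroup=01,02
    digests=md5,sha1,sha512
    error=abort,unsupportedcontent,pushraced,pushkey
    hgtagsfnodes
    listkeys
    pushkey
    remote-changegroup=http,https

The exact value lists depend on the running version; this is an illustration, not captured output.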
1133 def getrepocaps(repo, allowpushback=False):
1134 """return the bundle2 capabilities for a given repo
1135
1136 Exists to allow extensions (like evolution) to mutate the capabilities.
1137 """
1138 caps = capabilities.copy()
1139 caps['changegroup'] = tuple(sorted(changegroup.packermap.keys()))
1140 if obsolete.isenabled(repo, obsolete.exchangeopt):
1141 supportedformat = tuple('V%i' % v for v in obsolete.formats)
1142 caps['obsmarkers'] = supportedformat
1143 if allowpushback:
1144 caps['pushback'] = ()
1145 return caps
1146
1147 def bundle2caps(remote):
1148 """return the bundle capabilities of a peer as dict"""
1149 raw = remote.capable('bundle2')
1150 if not raw and raw != '':
1151 return {}
1152 capsblob = urllib.unquote(remote.capable('bundle2'))
1153 return decodecaps(capsblob)
1154
1155 def obsmarkersversion(caps):
1156 """extract the list of supported obsmarkers versions from a bundle2caps dict
1157 """
1158 obscaps = caps.get('obsmarkers', ())
1159 return [int(c[1:]) for c in obscaps if c.startswith('V')]
1160
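For example, obsmarkersversion() simply strips the 'V' prefix from each advertised format and returns the numeric versions:

    >>> obsmarkersversion({'obsmarkers': ('V0', 'V1')})
    [0, 1]
    >>> obsmarkersversion({})
    []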
1161 @parthandler('changegroup', ('version', 'nbchanges'))
1162 def handlechangegroup(op, inpart):
1163 """apply a changegroup part on the repo
1164
1165 This is a very early implementation that will be massively reworked
1166 before being inflicted on any end-user.
1167 """
1168 # Make sure we trigger a transaction creation
1169 #
1170 # The addchangegroup function will get a transaction object by itself, but
1171 # we need to make sure we trigger the creation of a transaction object used
1172 # for the whole processing scope.
1173 op.gettransaction()
1174 unpackerversion = inpart.params.get('version', '01')
1175 # We should raise an appropriate exception here
1176 unpacker = changegroup.packermap[unpackerversion][1]
1177 cg = unpacker(inpart, None)
1178 # the source and url passed here are overwritten by the one contained in
1179 # the transaction.hookargs argument. So 'bundle2' is a placeholder
1180 nbchangesets = None
1181 if 'nbchanges' in inpart.params:
1182 nbchangesets = int(inpart.params.get('nbchanges'))
1183 ret = changegroup.addchangegroup(op.repo, cg, 'bundle2', 'bundle2',
1184 expectedtotal=nbchangesets)
1185 op.records.add('changegroup', {'return': ret})
1186 if op.reply is not None:
1187 # This is definitely not the final form of this
1188 # return. But one needs to start somewhere.
1189 part = op.reply.newpart('reply:changegroup', mandatory=False)
1190 part.addparam('in-reply-to', str(inpart.id), mandatory=False)
1191 part.addparam('return', '%i' % ret, mandatory=False)
1192 assert not inpart.read()
1193
1194 _remotechangegroupparams = tuple(['url', 'size', 'digests'] +
1195 ['digest:%s' % k for k in util.DIGESTS.keys()])
1196 @parthandler('remote-changegroup', _remotechangegroupparams)
1197 def handleremotechangegroup(op, inpart):
1198 """apply a bundle10 on the repo, given a url and validation information
1199
1200 All the information about the remote bundle to import is given as
1201 parameters. The parameters include:
1202 - url: the url to the bundle10.
1203 - size: the bundle10 file size. It is used to validate that what was
1204 retrieved by the client matches the server's knowledge about the bundle.
1205 - digests: a space separated list of the digest types provided as
1206 parameters.
1207 - digest:<digest-type>: the hexadecimal representation of the digest with
1208 that name. Like the size, it is used to validate that what was retrieved
1209 by the client matches what the server knows about the bundle.
1210
1211 When multiple digest types are given, all of them are checked.
1212 """
1213 try:
1214 raw_url = inpart.params['url']
1215 except KeyError:
1216 raise util.Abort(_('remote-changegroup: missing "%s" param') % 'url')
1217 parsed_url = util.url(raw_url)
1218 if parsed_url.scheme not in capabilities['remote-changegroup']:
1219 raise util.Abort(_('remote-changegroup does not support %s urls') %
1220 parsed_url.scheme)
1221
1222 try:
1223 size = int(inpart.params['size'])
1224 except ValueError:
1225 raise util.Abort(_('remote-changegroup: invalid value for param "%s"')
1226 % 'size')
1227 except KeyError:
1228 raise util.Abort(_('remote-changegroup: missing "%s" param') % 'size')
1229
1230 digests = {}
1231 for typ in inpart.params.get('digests', '').split():
1232 param = 'digest:%s' % typ
1233 try:
1234 value = inpart.params[param]
1235 except KeyError:
1236 raise util.Abort(_('remote-changegroup: missing "%s" param') %
1237 param)
1238 digests[typ] = value
1239
1240 real_part = util.digestchecker(url.open(op.ui, raw_url), size, digests)
1241
1242 # Make sure we trigger a transaction creation
1243 #
1244 # The addchangegroup function will get a transaction object by itself, but
1245 # we need to make sure we trigger the creation of a transaction object used
1246 # for the whole processing scope.
1247 op.gettransaction()
1248 from . import exchange
1249 cg = exchange.readbundle(op.repo.ui, real_part, raw_url)
1250 if not isinstance(cg, changegroup.cg1unpacker):
1251 raise util.Abort(_('%s: not a bundle version 1.0') %
1252 util.hidepassword(raw_url))
1253 ret = changegroup.addchangegroup(op.repo, cg, 'bundle2', 'bundle2')
1254 op.records.add('changegroup', {'return': ret})
1255 if op.reply is not None:
1256 # This is definitely not the final form of this
1257 # return. But one needs to start somewhere.
1258 part = op.reply.newpart('reply:changegroup')
1259 part.addparam('in-reply-to', str(inpart.id), mandatory=False)
1260 part.addparam('return', '%i' % ret, mandatory=False)
1261 try:
1262 real_part.validate()
1263 except util.Abort as e:
1264 raise util.Abort(_('bundle at %s is corrupted:\n%s') %
1265 (util.hidepassword(raw_url), str(e)))
1266 assert not inpart.read()
1267
1268 @parthandler('reply:changegroup', ('return', 'in-reply-to'))
1269 def handlereplychangegroup(op, inpart):
1270 ret = int(inpart.params['return'])
1271 replyto = int(inpart.params['in-reply-to'])
1272 op.records.add('changegroup', {'return': ret}, replyto)
1273
1274 @parthandler('check:heads')
1275 def handlecheckheads(op, inpart):
1276 """check that the heads of the repo did not change
1277
1278 This is used to detect a push race when using unbundle.
1279 This replaces the "heads" argument of unbundle."""
1280 h = inpart.read(20)
1281 heads = []
1282 while len(h) == 20:
1283 heads.append(h)
1284 h = inpart.read(20)
1285 assert not h
1286 if heads != op.repo.heads():
1287 raise error.PushRaced('repository changed while pushing - '
1288 'please try again')
1289
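The handler above reads the payload as a plain concatenation of 20-byte binary head nodes, so the pushing side only needs to attach the heads it based its push on. A hedged producer-side sketch, using the newpart helper on a bundle20 bundler:

    # each entry of remoteheads is assumed to be a 20-byte binary node
    bundler.newpart('check:heads', data=iter(remoteheads))

If the receiving repository's heads moved in the meantime, PushRaced is raised and the client is asked to retry.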
1290 @parthandler('output')
1291 def handleoutput(op, inpart):
1292 """forward output captured on the server to the client"""
1293 for line in inpart.read().splitlines():
1294 op.ui.status(('remote: %s\n' % line))
1295
1296 @parthandler('replycaps')
1297 def handlereplycaps(op, inpart):
1298 """Notify that a reply bundle should be created
1299
1300 The payload contains the capabilities information for the reply"""
1301 caps = decodecaps(inpart.read())
1302 if op.reply is None:
1303 op.reply = bundle20(op.ui, caps)
1304
1305 @parthandler('error:abort', ('message', 'hint'))
1306 def handleerrorabort(op, inpart):
1307 """Used to transmit abort error over the wire"""
1308 raise util.Abort(inpart.params['message'], hint=inpart.params.get('hint'))
1309
1310 @parthandler('error:pushkey', ('namespace', 'key', 'new', 'old', 'ret',
1311 'in-reply-to'))
1312 def handleerrorpushkey(op, inpart):
1313 """Used to transmit failure of a mandatory pushkey over the wire"""
1314 kwargs = {}
1315 for name in ('namespace', 'key', 'new', 'old', 'ret'):
1316 value = inpart.params.get(name)
1317 if value is not None:
1318 kwargs[name] = value
1319 raise error.PushkeyFailed(inpart.params['in-reply-to'], **kwargs)
1320
1321 @parthandler('error:unsupportedcontent', ('parttype', 'params'))
1322 def handleerrorunsupportedcontent(op, inpart):
1323 """Used to transmit unknown content error over the wire"""
1324 kwargs = {}
1325 parttype = inpart.params.get('parttype')
1326 if parttype is not None:
1327 kwargs['parttype'] = parttype
1328 params = inpart.params.get('params')
1329 if params is not None:
1330 kwargs['params'] = params.split('\0')
1331
1332 raise error.UnsupportedPartError(**kwargs)
1332 raise error.BundleUnknownFeatureError(**kwargs)
1333
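This handler is where the exception class renamed by this changeset surfaces on the receiving side: the part carries an optional 'parttype' plus a NUL-separated 'params' list, which are turned back into a BundleUnknownFeatureError. A hedged sketch of the emitting side, reusing the newpart/addparam helpers shown earlier (the 'obsmarkers'/'version' values and the reply variable are illustrative only):

    errpart = reply.newpart('error:unsupportedcontent', mandatory=False)
    errpart.addparam('parttype', 'obsmarkers')
    errpart.addparam('params', '\0'.join(['version']))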
1334 @parthandler('error:pushraced', ('message',))
1335 def handleerrorpushraced(op, inpart):
1336 """Used to transmit push race error over the wire"""
1337 raise error.ResponseError(_('push failed:'), inpart.params['message'])
1338
1339 @parthandler('listkeys', ('namespace',))
1340 def handlelistkeys(op, inpart):
1341 """retrieve pushkey namespace content stored in a bundle2"""
1342 namespace = inpart.params['namespace']
1343 r = pushkey.decodekeys(inpart.read())
1344 op.records.add('listkeys', (namespace, r))
1345
1346 @parthandler('pushkey', ('namespace', 'key', 'old', 'new'))
1347 def handlepushkey(op, inpart):
1348 """process a pushkey request"""
1349 dec = pushkey.decode
1350 namespace = dec(inpart.params['namespace'])
1351 key = dec(inpart.params['key'])
1352 old = dec(inpart.params['old'])
1353 new = dec(inpart.params['new'])
1354 ret = op.repo.pushkey(namespace, key, old, new)
1355 record = {'namespace': namespace,
1356 'key': key,
1357 'old': old,
1358 'new': new}
1359 op.records.add('pushkey', record)
1360 if op.reply is not None:
1361 rpart = op.reply.newpart('reply:pushkey')
1362 rpart.addparam('in-reply-to', str(inpart.id), mandatory=False)
1363 rpart.addparam('return', '%i' % ret, mandatory=False)
1364 if inpart.mandatory and not ret:
1365 kwargs = {}
1366 for key in ('namespace', 'key', 'new', 'old', 'ret'):
1367 if key in inpart.params:
1368 kwargs[key] = inpart.params[key]
1369 raise error.PushkeyFailed(partid=str(inpart.id), **kwargs)
1370
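Mirroring the pushkey.decode() calls above, the sending side is expected to encode every field before attaching it. A hedged producer-side sketch, assuming a matching pushkey.encode helper and the newpart/addparam part API (the namespace, key and value names are placeholders):

    enc = pushkey.encode
    part = bundler.newpart('pushkey')
    part.addparam('namespace', enc('bookmarks'))
    part.addparam('key', enc('feature-x'))
    part.addparam('old', enc(oldvalue))
    part.addparam('new', enc(newvalue))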
1371 @parthandler('reply:pushkey', ('return', 'in-reply-to'))
1372 def handlepushkeyreply(op, inpart):
1373 """retrieve the result of a pushkey request"""
1374 ret = int(inpart.params['return'])
1375 partid = int(inpart.params['in-reply-to'])
1376 op.records.add('pushkey', {'return': ret}, partid)
1377
1378 @parthandler('obsmarkers')
1379 def handleobsmarker(op, inpart):
1380 """add a stream of obsmarkers to the repo"""
1381 tr = op.gettransaction()
1382 markerdata = inpart.read()
1383 if op.ui.config('experimental', 'obsmarkers-exchange-debug', False):
1384 op.ui.write(('obsmarker-exchange: %i bytes received\n')
1385 % len(markerdata))
1386 new = op.repo.obsstore.mergemarkers(tr, markerdata)
1387 if new:
1388 op.repo.ui.status(_('%i new obsolescence markers\n') % new)
1389 op.records.add('obsmarkers', {'new': new})
1390 if op.reply is not None:
1391 rpart = op.reply.newpart('reply:obsmarkers')
1392 rpart.addparam('in-reply-to', str(inpart.id), mandatory=False)
1393 rpart.addparam('new', '%i' % new, mandatory=False)
1394
1395
1396 @parthandler('reply:obsmarkers', ('new', 'in-reply-to'))
1397 def handleobsmarkerreply(op, inpart):
1398 """retrieve the result of an obsmarkers part"""
1399 ret = int(inpart.params['new'])
1400 partid = int(inpart.params['in-reply-to'])
1401 op.records.add('obsmarkers', {'new': ret}, partid)
1402
1403 @parthandler('hgtagsfnodes')
1404 def handlehgtagsfnodes(op, inpart):
1405 """Applies .hgtags fnodes cache entries to the local repo.
1406
1407 Payload is pairs of 20 byte changeset nodes and filenodes.
1408 """
1409 cache = tags.hgtagsfnodescache(op.repo.unfiltered())
1410
1411 count = 0
1412 while True:
1413 node = inpart.read(20)
1414 fnode = inpart.read(20)
1415 if len(node) < 20 or len(fnode) < 20:
1416 op.ui.debug('ignoring incomplete received .hgtags fnodes data\n')
1417 break
1418 cache.setfnode(node, fnode)
1419 count += 1
1420
1421 cache.write()
1422 op.ui.debug('applied %i hgtags fnodes cache entries\n' % count)
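As the docstring says, the payload is nothing more than back-to-back pairs of 20-byte binary values, the changeset node followed by the .hgtags filenode it maps to, with no count prefix: the handler simply stops at the first short read. A hedged producer-side sketch (cachedfnodes and bundler are illustrative names):

    chunks = []
    for node, fnode in cachedfnodes:   # both 20-byte binary values
        chunks.append(node)
        chunks.append(fnode)
    bundler.newpart('hgtagsfnodes', data=''.join(chunks))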
@@ -1,192 +1,192
1 # error.py - Mercurial exceptions
2 #
3 # Copyright 2005-2008 Matt Mackall <mpm@selenic.com>
4 #
5 # This software may be used and distributed according to the terms of the
6 # GNU General Public License version 2 or any later version.
7
8 """Mercurial exceptions.
9
10 This allows us to catch exceptions at higher levels without forcing
11 imports.
12 """
13
14 from __future__ import absolute_import
15
16 # Do not import anything here, please
17
18 class HintException(Exception):
19 def __init__(self, *args, **kw):
20 Exception.__init__(self, *args)
21 self.hint = kw.get('hint')
22
23 class RevlogError(HintException):
24 pass
25
26 class FilteredIndexError(IndexError):
27 pass
28
29 class LookupError(RevlogError, KeyError):
30 def __init__(self, name, index, message):
31 self.name = name
32 self.index = index
33 # this can't be called 'message' because at least some installs of
34 # Python 2.6+ complain about the 'message' property being deprecated
35 self.lookupmessage = message
36 if isinstance(name, str) and len(name) == 20:
37 from .node import short
38 name = short(name)
39 RevlogError.__init__(self, '%s@%s: %s' % (index, name, message))
40
41 def __str__(self):
42 return RevlogError.__str__(self)
43
44 class FilteredLookupError(LookupError):
45 pass
46
47 class ManifestLookupError(LookupError):
48 pass
49
50 class CommandError(Exception):
51 """Exception raised on errors in parsing the command line."""
52
53 class InterventionRequired(Exception):
54 """Exception raised when a command requires human intervention."""
55
56 class Abort(HintException):
57 """Raised if a command needs to print an error and exit."""
58 pass
59
60 class HookAbort(Abort):
61 """raised when a validation hook fails, aborting an operation
62
63 Exists to allow more specialized catching."""
64 pass
65
66 class ConfigError(Abort):
67 """Exception raised when parsing config files"""
68
69 class OutOfBandError(Exception):
70 """Exception raised when a remote repo reports failure"""
71
72 def __init__(self, *args, **kw):
73 Exception.__init__(self, *args)
74 self.hint = kw.get('hint')
75
76 class ParseError(Exception):
77 """Raised when parsing config files and {rev,file}sets (msg[, pos])"""
78
79 class UnknownIdentifier(ParseError):
80 """Exception raised when a {rev,file}set references an unknown identifier"""
81
82 def __init__(self, function, symbols):
83 from .i18n import _
84 ParseError.__init__(self, _("unknown identifier: %s") % function)
85 self.function = function
86 self.symbols = symbols
87
88 class RepoError(HintException):
89 pass
90
91 class RepoLookupError(RepoError):
92 pass
93
94 class FilteredRepoLookupError(RepoLookupError):
95 pass
96
97 class CapabilityError(RepoError):
98 pass
99
100 class RequirementError(RepoError):
101 """Exception raised if .hg/requires has an unknown entry."""
102 pass
103
104 class LockError(IOError):
105 def __init__(self, errno, strerror, filename, desc):
106 IOError.__init__(self, errno, strerror, filename)
107 self.desc = desc
108
109 class LockHeld(LockError):
110 def __init__(self, errno, filename, desc, locker):
111 LockError.__init__(self, errno, 'Lock held', filename, desc)
112 self.locker = locker
113
114 class LockUnavailable(LockError):
115 pass
116
117 # LockError is for errors while acquiring the lock -- this is unrelated
118 class LockInheritanceContractViolation(AssertionError):
119 pass
120
121 class ResponseError(Exception):
122 """Raised to print an error with part of output and exit."""
123
124 class UnknownCommand(Exception):
125 """Exception raised if command is not in the command table."""
126
127 class AmbiguousCommand(Exception):
128 """Exception raised if command shortcut matches more than one command."""
129
130 # derived from KeyboardInterrupt to simplify some breakout code
131 class SignalInterrupt(KeyboardInterrupt):
132 """Exception raised on SIGTERM and SIGHUP."""
133
134 class SignatureError(Exception):
135 pass
136
137 class PushRaced(RuntimeError):
138 """An exception raised during unbundling that indicates a push race"""
139
140 # bundle2 related errors
141 class BundleValueError(ValueError):
142 """error raised when bundle2 cannot be processed"""
143
144 class UnsupportedPartError(BundleValueError):
144 class BundleUnknownFeatureError(BundleValueError):
145 def __init__(self, parttype=None, params=()):
146 self.parttype = parttype
147 self.params = params
148 if self.parttype is None:
149 msg = 'Stream Parameter'
150 else:
151 msg = parttype
152 if self.params:
153 msg = '%s - %s' % (msg, ', '.join(self.params))
154 ValueError.__init__(self, msg)
155
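Given the constructor above, the message is assembled from whichever pieces are known, which keeps the bundle2 wire handler trivial:

    >>> str(BundleUnknownFeatureError(parttype='obsmarkers', params=('version',)))
    'obsmarkers - version'
    >>> str(BundleUnknownFeatureError())
    'Stream Parameter'

Because it still derives from BundleValueError, existing except clauses that catch BundleValueError keep working across the rename.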
156 class ReadOnlyPartError(RuntimeError):
157 """error raised when code tries to alter a part being generated"""
158 pass
159
160 class PushkeyFailed(Abort):
161 """error raised when a pushkey part failed to update a value"""
162
163 def __init__(self, partid, namespace=None, key=None, new=None, old=None,
164 ret=None):
165 self.partid = partid
166 self.namespace = namespace
167 self.key = key
168 self.new = new
169 self.old = old
170 self.ret = ret
171 # no i18n expected to be processed into a better message
172 Abort.__init__(self, 'failed to update value for "%s/%s"'
173 % (namespace, key))
174
175 class CensoredNodeError(RevlogError):
176 """error raised when content verification fails on a censored node
177
178 Also contains the tombstone data substituted for the uncensored data.
179 """
180
181 def __init__(self, filename, node, tombstone):
182 from .node import short
183 RevlogError.__init__(self, '%s:%s' % (filename, short(node)))
184 self.tombstone = tombstone
185
186 class CensoredBaseError(RevlogError):
187 """error raised when a delta is rejected because its base is censored
188
189 A delta based on a censored revision must be formed as a single patch
190 operation which replaces the entire base with new content. This ensures
191 the delta may be applied by clones which have not censored the base.
192 """