bundle2: rename part to bundlepart...
Pierre-Yves David
r21005:3d38ebb5 default
@@ -1,610 +1,610 @@
# bundle2.py - generic container format to transmit arbitrary data.
#
# Copyright 2013 Facebook, Inc.
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
"""Handling of the new bundle2 format

The goal of bundle2 is to act as an atomic packet to transmit a set of
payloads in an application-agnostic way. It consists of a sequence of "parts"
that will be handed to and processed by the application layer.


General format architecture
===========================

The format is architectured as follows:

 - magic string
 - stream level parameters
 - payload parts (any number)
 - end of stream marker.

The binary format
============================

All numbers are unsigned and big endian.

stream level parameters
------------------------

The binary format is as follows:

:params size: (16 bits integer)

  The total number of bytes used by the parameters

:params value: arbitrary number of bytes

  A blob of `params size` containing the serialized version of all stream
  level parameters.

  The blob contains a space separated list of parameters. Parameters with a
  value are stored in the form `<name>=<value>`. Both name and value are
  urlquoted.

  Empty names are obviously forbidden.

  A name MUST start with a letter. If this first letter is lower case, the
  parameter is advisory and can be safely ignored. However, when the first
  letter is capital, the parameter is mandatory and the bundling process MUST
  stop if it is not able to process it.

  Stream parameters use a simple textual format for two main reasons:

  - Stream level parameters should remain simple and we want to discourage
    any crazy usage.
  - Textual data allow easy human inspection of the bundle2 header in case of
    trouble.

  Any application-level options MUST go into a bundle2 part instead.

Payload part
------------------------

The binary format is as follows:

:header size: (16 bits integer)

  The total number of bytes used by the part headers. When the header is
  empty (size = 0) this is interpreted as the end of stream marker.

:header:

    The header defines how to interpret the part. It contains two pieces of
    data: the part type, and the part parameters.

    The part type is used to route to an application level handler, that can
    interpret the payload.

    Part parameters are passed to the application level handler. They are
    meant to convey information that will help the application level object
    to interpret the part payload.

    The binary format of the header is as follows:

    :typesize: (one byte)

    :typename: alphanumerical part name

    :partid: A 32 bits integer (unique in the bundle) that can be used to
             refer to this part.

    :parameters:

        Part's parameters may have arbitrary content; the binary structure
        is::

            <mandatory-count><advisory-count><param-sizes><param-data>

        :mandatory-count: 1 byte, number of mandatory parameters

        :advisory-count:  1 byte, number of advisory parameters

        :param-sizes:

            N couples of bytes, where N is the total number of parameters.
            Each couple contains (<size-of-key>, <size-of-value>) for one
            parameter.

        :param-data:

            A blob of bytes from which each parameter key and value can be
            retrieved using the list of size couples stored in the previous
            field.

            Mandatory parameters come first, then the advisory ones.

:payload:

    The payload is a series of `<chunksize><chunkdata>`.

    `chunksize` is a 32 bits integer, `chunkdata` are plain bytes (as many as
    `chunksize` says). The payload part is concluded by a zero size chunk.

    The current implementation always produces either zero or one chunk.
    This is an implementation limitation that will ultimately be lifted.

Bundle processing
============================

Each part is processed in order using a "part handler". Handlers are
registered for a certain part type.

The matching of a part to its handler is case insensitive. The case of the
part type is used to know if a part is mandatory or advisory. If the part
type contains any uppercase char it is considered mandatory. When no handler
is known for a mandatory part, the process is aborted and an exception is
raised. If the part is advisory and no handler is known, the part is ignored.
When the process is aborted, the full bundle is still read from the stream to
keep the channel usable. But none of the parts read after an abort are
processed. In the future, dropping the stream may become an option for
channels we do not care to preserve.
"""
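The stream-level parameter encoding described above is simple enough to sketch independently. The following is an illustration only, not part of the module: it uses Python 3's `urllib.parse` (the module itself targets Python 2's `urllib`), and the helper names `encodeparams`/`decodeparams` are invented for this sketch.

```python
import struct
import urllib.parse

def encodeparams(params):
    """Serialize stream level parameters as described above: a space
    separated list of urlquoted `<name>=<value>` (or bare `<name>`) entries,
    prefixed with a 16 bits big endian size field."""
    blocks = []
    for name, value in params:
        entry = urllib.parse.quote(name)
        if value is not None:
            entry = '%s=%s' % (entry, urllib.parse.quote(value))
        blocks.append(entry)
    blob = ' '.join(blocks).encode('ascii')
    return struct.pack('>H', len(blob)) + blob

def decodeparams(data):
    """Inverse of encodeparams; return a dict of parameters."""
    size = struct.unpack('>H', data[:2])[0]
    blob = data[2:2 + size].decode('ascii')
    params = {}
    if blob:
        for entry in blob.split(' '):
            parts = entry.split('=', 1)
            name = urllib.parse.unquote(parts[0])
            value = urllib.parse.unquote(parts[1]) if len(parts) > 1 else None
            params[name] = value
    return params
```

A value-less parameter round-trips as `None`, mirroring how `unbundle20.params` stores bare names.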

import util
import struct
import urllib
import string
import StringIO

import changegroup
from i18n import _

_pack = struct.pack
_unpack = struct.unpack

_magicstring = 'HG20'

_fstreamparamsize = '>H'
_fpartheadersize = '>H'
_fparttypesize = '>B'
_fpartid = '>I'
_fpayloadsize = '>I'
_fpartparamcount = '>BB'

preferedchunksize = 4096

def _makefpartparamsizes(nbparams):
    """return a struct format to read part parameter sizes

    The number of parameters is variable so we need to build that format
    dynamically.
    """
    return '>' + ('BB' * nbparams)
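As a usage sketch (illustrative, standalone): for two parameters the helper produces the format `'>BBBB'`, which reads both (key size, value size) couples in a single `struct` call; the flat tuple is then re-paired the same way `_readpart` does below.

```python
import struct

def _makefpartparamsizes(nbparams):
    # same construction as above: one 'BB' couple per parameter
    return '>' + ('BB' * nbparams)

fmt = _makefpartparamsizes(2)
# a hypothetical sizes block: key sizes 3 and 4, value sizes 2 and 0
sizes = struct.unpack(fmt, bytes([3, 2, 4, 0]))
# re-pair the flat tuple into (key size, value size) couples
couples = list(zip(sizes[::2], sizes[1::2]))
```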

parthandlermapping = {}

def parthandler(parttype):
    """decorator that registers a function as a bundle2 part handler

    eg::

        @parthandler('myparttype')
        def myparttypehandler(...):
            '''process a part of type "my part".'''
            ...
    """
    def _decorator(func):
        lparttype = parttype.lower() # enforce lower case matching.
        assert lparttype not in parthandlermapping
        parthandlermapping[lparttype] = func
        return func
    return _decorator
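A minimal, standalone re-implementation of the registration mechanism, to show how the decorator stores handlers under the lower-cased type (the part type `'TEST:PING'` is hypothetical, for illustration only):

```python
parthandlermapping = {}

def parthandler(parttype):
    def _decorator(func):
        lparttype = parttype.lower() # enforce lower case matching
        assert lparttype not in parthandlermapping
        parthandlermapping[lparttype] = func
        return func
    return _decorator

@parthandler('TEST:PING')  # hypothetical part type
def handleping(op, part):
    return 'pong'
```

Because registration is case insensitive, the uppercase spelling only matters later, when deciding whether an *incoming* part is mandatory.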

class unbundlerecords(object):
    """keep a record of what happens during an unbundle

    New records are added using `records.add('cat', obj)`. Where 'cat' is a
    category of record and obj is an arbitrary object.

    `records['cat']` will return all entries of this category 'cat'.

    Iterating on the object itself will yield `('category', obj)` tuples
    for all entries.

    All iterations happen in chronological order.
    """

    def __init__(self):
        self._categories = {}
        self._sequences = []
        self._replies = {}

    def add(self, category, entry, inreplyto=None):
        """add a new record of a given category.

        The entry can then be retrieved in the list returned by
        self['category']."""
        self._categories.setdefault(category, []).append(entry)
        self._sequences.append((category, entry))
        if inreplyto is not None:
            self.getreplies(inreplyto).add(category, entry)

    def getreplies(self, partid):
        """get the subrecords that reply to a specific part"""
        return self._replies.setdefault(partid, unbundlerecords())

    def __getitem__(self, cat):
        return tuple(self._categories.get(cat, ()))

    def __iter__(self):
        return iter(self._sequences)

    def __len__(self):
        return len(self._sequences)

    def __nonzero__(self):
        return bool(self._sequences)
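The recording API can be exercised in isolation; this standalone copy of the class (minus the Python 2 `__nonzero__`/`__len__` plumbing) shows category lookup versus chronological iteration. The record payloads are invented for the example:

```python
class unbundlerecords(object):
    def __init__(self):
        self._categories = {}
        self._sequences = []
        self._replies = {}

    def add(self, category, entry, inreplyto=None):
        self._categories.setdefault(category, []).append(entry)
        self._sequences.append((category, entry))
        if inreplyto is not None:
            self.getreplies(inreplyto).add(category, entry)

    def getreplies(self, partid):
        # one sub-recorder per part id, created on demand
        return self._replies.setdefault(partid, unbundlerecords())

    def __getitem__(self, cat):
        return tuple(self._categories.get(cat, ()))

    def __iter__(self):
        return iter(self._sequences)

records = unbundlerecords()
records.add('changegroup', {'return': 1})
records.add('obsmarkers', {'new': 0})
```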

class bundleoperation(object):
    """an object that represents a single bundling process

    Its purpose is to carry unbundle-related objects and states.

    A new object should be created at the beginning of each bundle
    processing. The object is to be returned by the processing function.

    The object has very little content now; it will ultimately contain:
    * an access to the repo the bundle is applied to,
    * a ui object,
    * a way to retrieve a transaction to add changes to the repo,
    * a way to record the result of processing each part,
    * a way to construct a bundle response when applicable.
    """

    def __init__(self, repo, transactiongetter):
        self.repo = repo
        self.ui = repo.ui
        self.records = unbundlerecords()
        self.gettransaction = transactiongetter
        self.reply = None

class TransactionUnavailable(RuntimeError):
    pass

def _notransaction():
    """default method to get a transaction while processing a bundle

    Raise an exception to highlight the fact that no transaction was expected
    to be created"""
    raise TransactionUnavailable()

def processbundle(repo, unbundler, transactiongetter=_notransaction):
    """This function processes a bundle, applying effects to/from a repo

    It iterates over each part then searches for and uses the proper handling
    code to process the part. Parts are processed in order.

    This is a very early version of this function that will be strongly
    reworked before final usage.

    An unknown mandatory part will abort the process.
    """
    op = bundleoperation(repo, transactiongetter)
    # todo:
    # - only create reply bundle if requested.
    op.reply = bundle20(op.ui)
    # todo:
    # - replace this with an init function soon.
    # - exception catching
    unbundler.params
    iterparts = iter(unbundler)
    try:
        for part in iterparts:
            parttype = part.type
            # part keys are matched lower case
            key = parttype.lower()
            try:
                handler = parthandlermapping[key]
                op.ui.debug('found a handler for part %r\n' % parttype)
            except KeyError:
                if key != parttype: # mandatory parts
                    # todo:
                    # - use a more precise exception
                    raise
                op.ui.debug('ignoring unknown advisory part %r\n' % key)
                # todo:
                # - consume the part once we use streaming
                continue

            # handler is called outside the above try block so that we don't
            # risk catching KeyErrors from anything other than the
            # parthandlermapping lookup (any KeyError raised by handler()
            # itself represents a defect of a different variety).
            handler(op, part)
    except Exception:
        for part in iterparts:
            pass # consume the bundle content
        raise
    return op

class bundle20(object):
    """represent an outgoing bundle2 container

    Use the `addparam` method to add stream level parameters and `addpart` to
    populate it. Then call `getchunks` to retrieve all the binary chunks of
    data that compose the bundle2 container."""

    def __init__(self, ui):
        self.ui = ui
        self._params = []
        self._parts = []

    def addparam(self, name, value=None):
        """add a stream level parameter"""
        if not name:
            raise ValueError('empty parameter name')
        if name[0] not in string.letters:
            raise ValueError('non letter first character: %r' % name)
        self._params.append((name, value))

    def addpart(self, part):
        """add a new part to the bundle2 container

        Parts contain the actual applicative payload."""
        assert part.id is None
        part.id = len(self._parts) # very cheap counter
        self._parts.append(part)

    def getchunks(self):
        self.ui.debug('start emission of %s stream\n' % _magicstring)
        yield _magicstring
        param = self._paramchunk()
        self.ui.debug('bundle parameter: %s\n' % param)
        yield _pack(_fstreamparamsize, len(param))
        if param:
            yield param

        self.ui.debug('start of parts\n')
        for part in self._parts:
            self.ui.debug('bundle part: "%s"\n' % part.type)
            for chunk in part.getchunks():
                yield chunk
        self.ui.debug('end of bundle\n')
        yield '\0\0'

    def _paramchunk(self):
        """return an encoded version of all stream parameters"""
        blocks = []
        for par, value in self._params:
            par = urllib.quote(par)
            if value is not None:
                value = urllib.quote(value)
                par = '%s=%s' % (par, value)
            blocks.append(par)
        return ' '.join(blocks)
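Putting the stream pieces together: an empty bundle (no parameters, no parts), such as `getchunks` would emit for a freshly created `bundle20`, is just the magic string, a zero 16 bits parameter size, and the empty-header end of stream marker. A byte-level sketch in Python 3:

```python
import struct

magic = b'HG20'
paramblob = b''  # no stream level parameters
stream = (magic
          + struct.pack('>H', len(paramblob))  # 16 bits param block size
          + paramblob
          + b'\x00\x00')                       # empty part header = end of stream
```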

class unbundle20(object):
    """interpret a bundle2 stream

    (this will eventually yield parts)"""

    def __init__(self, ui, fp):
        self.ui = ui
        self._fp = fp
        header = self._readexact(4)
        magic, version = header[0:2], header[2:4]
        if magic != 'HG':
            raise util.Abort(_('not a Mercurial bundle'))
        if version != '20':
            raise util.Abort(_('unknown bundle version %s') % version)
        self.ui.debug('start processing of %s stream\n' % header)

    def _unpack(self, format):
        """unpack this struct format from the stream"""
        data = self._readexact(struct.calcsize(format))
        return _unpack(format, data)

    def _readexact(self, size):
        """read exactly <size> bytes from the stream"""
        return changegroup.readexactly(self._fp, size)

    @util.propertycache
    def params(self):
        """dictionary of stream level parameters"""
        self.ui.debug('reading bundle2 stream parameters\n')
        params = {}
        paramssize = self._unpack(_fstreamparamsize)[0]
        if paramssize:
            for p in self._readexact(paramssize).split(' '):
                p = p.split('=', 1)
                p = [urllib.unquote(i) for i in p]
                if len(p) < 2:
                    p.append(None)
                self._processparam(*p)
                params[p[0]] = p[1]
        return params

    def _processparam(self, name, value):
        """process a parameter, applying its effect if needed

        Parameters starting with a lower case letter are advisory and will be
        ignored when unknown. Those starting with an upper case letter are
        mandatory and this function will raise a KeyError when unknown.

        Note: no options are currently supported. Any input will either be
        ignored or fail.
        """
        if not name:
            raise ValueError('empty parameter name')
        if name[0] not in string.letters:
            raise ValueError('non letter first character: %r' % name)
        # Some logic will be added here later to try to process the option
        # against a dict of known parameters.
        if name[0].islower():
            self.ui.debug("ignoring unknown parameter %r\n" % name)
        else:
            raise KeyError(name)

    def __iter__(self):
        """yield all parts contained in the stream"""
        # make sure params have been loaded
        self.params
        self.ui.debug('start extraction of bundle2 parts\n')
        part = self._readpart()
        while part is not None:
            yield part
            part = self._readpart()
        self.ui.debug('end of bundle2 stream\n')

    def _readpart(self):
        """return None when the end of stream marker is reached"""

        headersize = self._unpack(_fpartheadersize)[0]
        self.ui.debug('part header size: %i\n' % headersize)
        if not headersize:
            return None
        headerblock = self._readexact(headersize)
        # some utilities to help reading from the header block
        self._offset = 0 # layer violation to have something easy to understand
        def fromheader(size):
            """return the next <size> bytes from the header"""
            offset = self._offset
            data = headerblock[offset:(offset + size)]
            self._offset = offset + size
            return data
        def unpackheader(format):
            """read the given format from the header

            This automatically computes the size of the format to read."""
            data = fromheader(struct.calcsize(format))
            return _unpack(format, data)

        typesize = unpackheader(_fparttypesize)[0]
        parttype = fromheader(typesize)
        self.ui.debug('part type: "%s"\n' % parttype)
        partid = unpackheader(_fpartid)[0]
        self.ui.debug('part id: "%s"\n' % partid)
        ## reading parameters
        # param count
        mancount, advcount = unpackheader(_fpartparamcount)
        self.ui.debug('part parameters: %i\n' % (mancount + advcount))
        # param sizes
        paramsizes = unpackheader(_makefpartparamsizes(mancount + advcount))
        # make it a list of couples again
        paramsizes = zip(paramsizes[::2], paramsizes[1::2])
        # split mandatory from advisory
        mansizes = paramsizes[:mancount]
        advsizes = paramsizes[mancount:]
        # retrieve param values
        manparams = []
        for key, value in mansizes:
            manparams.append((fromheader(key), fromheader(value)))
        advparams = []
        for key, value in advsizes:
            advparams.append((fromheader(key), fromheader(value)))
        del self._offset # clean up layer, nobody saw anything.
        ## part payload
        payload = []
        payloadsize = self._unpack(_fpayloadsize)[0]
        self.ui.debug('payload chunk size: %i\n' % payloadsize)
        while payloadsize:
            payload.append(self._readexact(payloadsize))
            payloadsize = self._unpack(_fpayloadsize)[0]
            self.ui.debug('payload chunk size: %i\n' % payloadsize)
        payload = ''.join(payload)
        current = bundlepart(parttype, manparams, advparams, data=payload)
        current.id = partid
        return current
509
509
510
510
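The header layout read back above (type size, type string, then part id) can be sketched in isolation. This is a standalone Python 3 illustration, not the module's code; the format strings below are assumptions modeled on the `_fparttypesize`/`_fpartid` constants and may differ from the real bundle2.py values.

```python
import struct

# Assumed format strings (hypothetical; modeled on the module's constants):
_fparttypesize = '>B'   # one byte: length of the part type string
_fpartid = '>I'         # four bytes, big endian: part id

def packparthead(parttype, partid):
    """Pack the leading fields of a part header: type size, type, id."""
    return (struct.pack(_fparttypesize, len(parttype))
            + parttype.encode('ascii')
            + struct.pack(_fpartid, partid))

def unpackparthead(data):
    """Inverse of packparthead: recover (parttype, partid)."""
    offset = struct.calcsize(_fparttypesize)
    (typesize,) = struct.unpack(_fparttypesize, data[:offset])
    parttype = data[offset:offset + typesize].decode('ascii')
    offset += typesize
    idwidth = struct.calcsize(_fpartid)
    (partid,) = struct.unpack(_fpartid, data[offset:offset + idwidth])
    return parttype, partid
```

The round trip mirrors how `_readpart` first reads the fixed-size fields, then the variable-length type string.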
class bundlepart(object):
    """A bundle2 part contains application level payload

    The part `type` is used to route the part to the application level
    handler.
    """

    def __init__(self, parttype, mandatoryparams=(), advisoryparams=(),
                 data=''):
        self.id = None
        self.type = parttype
        self.data = data
        self.mandatoryparams = mandatoryparams
        self.advisoryparams = advisoryparams

    def getchunks(self):
        #### header
        ## parttype
        header = [_pack(_fparttypesize, len(self.type)),
                  self.type, _pack(_fpartid, self.id),
                 ]
        ## parameters
        # count
        manpar = self.mandatoryparams
        advpar = self.advisoryparams
        header.append(_pack(_fpartparamcount, len(manpar), len(advpar)))
        # size
        parsizes = []
        for key, value in manpar:
            parsizes.append(len(key))
            parsizes.append(len(value))
        for key, value in advpar:
            parsizes.append(len(key))
            parsizes.append(len(value))
        paramsizes = _pack(_makefpartparamsizes(len(parsizes) / 2), *parsizes)
        header.append(paramsizes)
        # key, value
        for key, value in manpar:
            header.append(key)
            header.append(value)
        for key, value in advpar:
            header.append(key)
            header.append(value)
        ## finalize header
        headerchunk = ''.join(header)
        yield _pack(_fpartheadersize, len(headerchunk))
        yield headerchunk
        ## payload
        for chunk in self._payloadchunks():
            yield _pack(_fpayloadsize, len(chunk))
            yield chunk
        # end of payload
        yield _pack(_fpayloadsize, 0)

    def _payloadchunks(self):
        """yield chunks of the part payload

        Exists to handle the different methods used to provide data to a
        part."""
        # we only support fixed size data now.
        # This will be improved in the future.
        if util.safehasattr(self.data, 'next'):
            buff = util.chunkbuffer(self.data)
            chunk = buff.read(preferedchunksize)
            while chunk:
                yield chunk
                chunk = buff.read(preferedchunksize)
        elif len(self.data):
            yield self.data

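The payload framing used by `getchunks` and consumed by `_readpart` (length-prefixed chunks terminated by a zero-size chunk) can be demonstrated on its own. A standalone Python 3 sketch, with `_fpayloadsize` assumed to be a four-byte big-endian length as above:

```python
import struct

_fpayloadsize = '>I'  # assumed: four-byte big-endian chunk length

def framepayload(data, chunksize=4096):
    """Yield length-prefixed chunks of `data`, ending with a zero-size
    marker, mirroring the framing emitted by getchunks()."""
    for start in range(0, len(data), chunksize):
        chunk = data[start:start + chunksize]
        yield struct.pack(_fpayloadsize, len(chunk))
        yield chunk
    # zero-size chunk marks the end of the payload
    yield struct.pack(_fpayloadsize, 0)

def readpayload(stream):
    """Reassemble a payload framed by framepayload, as _readpart's
    `while payloadsize:` loop does."""
    parts = []
    offset = 0
    width = struct.calcsize(_fpayloadsize)
    while True:
        (size,) = struct.unpack(_fpayloadsize, stream[offset:offset + width])
        offset += width
        if not size:
            break
        parts.append(stream[offset:offset + size])
        offset += size
    return b''.join(parts)
```

The zero-size terminator is what lets a reader consume a part without knowing the total payload size up front.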
@parthandler('changegroup')
def handlechangegroup(op, inpart):
    """apply a changegroup part on the repo

    This is a very early implementation that will see massive rework before
    being inflicted on any end-user.
    """
    # Make sure we trigger a transaction creation
    #
    # The addchangegroup function will get a transaction object by itself, but
    # we need to make sure we trigger the creation of a transaction object used
    # for the whole processing scope.
    op.gettransaction()
    data = StringIO.StringIO(inpart.data)
    data.seek(0)
    cg = changegroup.readbundle(data, 'bundle2part')
    ret = changegroup.addchangegroup(op.repo, cg, 'bundle2', 'bundle2')
    op.records.add('changegroup', {'return': ret})
    if op.reply is not None:
        # This is definitely not the final form of this
        # return. But one needs to start somewhere.
        op.reply.addpart(bundlepart('reply:changegroup', (),
                                    [('in-reply-to', str(inpart.id)),
                                     ('return', '%i' % ret)]))

@parthandler('reply:changegroup')
def handlechangegroup(op, inpart):
    p = dict(inpart.advisoryparams)
    ret = int(p['return'])
    op.records.add('changegroup', {'return': ret}, int(p['in-reply-to']))

@@ -1,644 +1,644 b''
# exchange.py - utility to exchange data between repos.
#
# Copyright 2005-2007 Matt Mackall <mpm@selenic.com>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

from i18n import _
from node import hex, nullid
import errno
import util, scmutil, changegroup, base85
import discovery, phases, obsolete, bookmarks, bundle2


class pushoperation(object):
    """An object that represents a single push operation

    Its purpose is to carry push-related state and very common operations.

    A new one should be created at the beginning of each push and discarded
    afterward.
    """

    def __init__(self, repo, remote, force=False, revs=None, newbranch=False):
        # repo we push from
        self.repo = repo
        self.ui = repo.ui
        # repo we push to
        self.remote = remote
        # force option provided
        self.force = force
        # revs to be pushed (None is "all")
        self.revs = revs
        # allow push of new branch
        self.newbranch = newbranch
        # did a local lock get acquired?
        self.locallocked = None
        # Integer version of the push result
        # - None means nothing to push
        # - 0 means HTTP error
        # - 1 means we pushed and remote head count is unchanged *or*
        #   we have outgoing changesets but refused to push
        # - other values as described by addchangegroup()
        self.ret = None
        # discovery.outgoing object (contains common and outgoing data)
        self.outgoing = None
        # all remote heads before the push
        self.remoteheads = None
        # testable as a boolean indicating if any nodes are missing locally.
        self.incoming = None
        # set of all heads common after changeset bundle push
        self.commonheads = None

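The `ret` convention documented in the class above (and again in `push()` below) can be made concrete. A small illustrative helper, not part of Mercurial's API:

```python
def describepushresult(ret):
    """Map a pushoperation.ret value to the outcome it encodes."""
    if ret is None:
        return 'nothing to push'
    if ret == 0:
        return 'HTTP error'
    if ret == 1:
        return 'pushed; remote head count unchanged, or push refused'
    # any other integer is the remote head count change reported
    # by addchangegroup()
    return 'remote head count changed'
```
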
def push(repo, remote, force=False, revs=None, newbranch=False):
    '''Push outgoing changesets (limited by revs) from a local
    repository to remote. Return an integer:
    - None means nothing to push
    - 0 means HTTP error
    - 1 means we pushed and remote head count is unchanged *or*
      we have outgoing changesets but refused to push
    - other values as described by addchangegroup()
    '''
    pushop = pushoperation(repo, remote, force, revs, newbranch)
    if pushop.remote.local():
        missing = (set(pushop.repo.requirements)
                   - pushop.remote.local().supported)
        if missing:
            msg = _("required features are not"
                    " supported in the destination:"
                    " %s") % (', '.join(sorted(missing)))
            raise util.Abort(msg)

    # there are two ways to push to remote repo:
    #
    # addchangegroup assumes local user can lock remote
    # repo (local filesystem, old ssh servers).
    #
    # unbundle assumes local user cannot lock remote repo (new ssh
    # servers, http servers).

    if not pushop.remote.canpush():
        raise util.Abort(_("destination does not support push"))
    # get local lock as we might write phase data
    locallock = None
    try:
        locallock = pushop.repo.lock()
        pushop.locallocked = True
    except IOError, err:
        pushop.locallocked = False
        if err.errno != errno.EACCES:
            raise
        # source repo cannot be locked.
        # We do not abort the push, but just disable the local phase
        # synchronisation.
        msg = 'cannot lock source repository: %s\n' % err
        pushop.ui.debug(msg)
    try:
        pushop.repo.checkpush(pushop)
        lock = None
        unbundle = pushop.remote.capable('unbundle')
        if not unbundle:
            lock = pushop.remote.lock()
        try:
            _pushdiscovery(pushop)
            if _pushcheckoutgoing(pushop):
                _pushchangeset(pushop)
            _pushcomputecommonheads(pushop)
            _pushsyncphase(pushop)
            _pushobsolete(pushop)
        finally:
            if lock is not None:
                lock.release()
    finally:
        if locallock is not None:
            locallock.release()

    _pushbookmark(pushop)
    return pushop.ret

119
119
120 def _pushdiscovery(pushop):
120 def _pushdiscovery(pushop):
121 # discovery
121 # discovery
122 unfi = pushop.repo.unfiltered()
122 unfi = pushop.repo.unfiltered()
123 fci = discovery.findcommonincoming
123 fci = discovery.findcommonincoming
124 commoninc = fci(unfi, pushop.remote, force=pushop.force)
124 commoninc = fci(unfi, pushop.remote, force=pushop.force)
125 common, inc, remoteheads = commoninc
125 common, inc, remoteheads = commoninc
126 fco = discovery.findcommonoutgoing
126 fco = discovery.findcommonoutgoing
127 outgoing = fco(unfi, pushop.remote, onlyheads=pushop.revs,
127 outgoing = fco(unfi, pushop.remote, onlyheads=pushop.revs,
128 commoninc=commoninc, force=pushop.force)
128 commoninc=commoninc, force=pushop.force)
129 pushop.outgoing = outgoing
129 pushop.outgoing = outgoing
130 pushop.remoteheads = remoteheads
130 pushop.remoteheads = remoteheads
131 pushop.incoming = inc
131 pushop.incoming = inc
132
132
def _pushcheckoutgoing(pushop):
    outgoing = pushop.outgoing
    unfi = pushop.repo.unfiltered()
    if not outgoing.missing:
        # nothing to push
        scmutil.nochangesfound(unfi.ui, unfi, outgoing.excluded)
        return False
    # something to push
    if not pushop.force:
        # if repo.obsstore == False --> no obsolete
        # then, save the iteration
        if unfi.obsstore:
            # these messages are here for the 80-char limit's sake
            mso = _("push includes obsolete changeset: %s!")
            mst = "push includes %s changeset: %s!"
            # plain versions for the i18n tool to detect them
            _("push includes unstable changeset: %s!")
            _("push includes bumped changeset: %s!")
            _("push includes divergent changeset: %s!")
            # If there is at least one obsolete or unstable
            # changeset in missing, at least one of the missing
            # heads will be obsolete or unstable. So checking
            # heads only is ok.
            for node in outgoing.missingheads:
                ctx = unfi[node]
                if ctx.obsolete():
                    raise util.Abort(mso % ctx)
                elif ctx.troubled():
                    raise util.Abort(_(mst)
                                     % (ctx.troubles()[0],
                                        ctx))
    newbm = pushop.ui.configlist('bookmarks', 'pushing')
    discovery.checkheads(unfi, pushop.remote, outgoing,
                         pushop.remoteheads,
                         pushop.newbranch,
                         bool(pushop.incoming),
                         newbm)
    return True

def _pushchangeset(pushop):
    """Make the actual push of changeset bundle to remote repo"""
    outgoing = pushop.outgoing
    unbundle = pushop.remote.capable('unbundle')
    # TODO: get bundlecaps from remote
    bundlecaps = None
    # create a changegroup from local
    if pushop.revs is None and not (outgoing.excluded
                                    or pushop.repo.changelog.filteredrevs):
        # push everything,
        # use the fast path, no race possible on push
        bundler = changegroup.bundle10(pushop.repo, bundlecaps)
        cg = changegroup.getsubset(pushop.repo,
                                   outgoing,
                                   bundler,
                                   'push',
                                   fastpath=True)
    else:
        cg = changegroup.getlocalbundle(pushop.repo, 'push', outgoing,
                                        bundlecaps)

    # apply changegroup to remote
    if unbundle:
        # local repo finds heads on server, finds out what
        # revs it must push. once revs transferred, if server
        # finds it has different heads (someone else won
        # commit/push race), server aborts.
        if pushop.force:
            remoteheads = ['force']
        else:
            remoteheads = pushop.remoteheads
        # ssh: return remote's addchangegroup()
        # http: return remote's addchangegroup() or 0 for error
        pushop.ret = pushop.remote.unbundle(cg, remoteheads,
                                            'push')
    else:
        # we return an integer indicating remote head count
        # change
        pushop.ret = pushop.remote.addchangegroup(cg, 'push', pushop.repo.url())

def _pushcomputecommonheads(pushop):
    unfi = pushop.repo.unfiltered()
    if pushop.ret:
        # push succeeded, synchronize target of the push
        cheads = pushop.outgoing.missingheads
    elif pushop.revs is None:
        # All-out push failed. synchronize all common
        cheads = pushop.outgoing.commonheads
    else:
        # I want cheads = heads(::missingheads and ::commonheads)
        # (missingheads is revs with secret changeset filtered out)
        #
        # This can be expressed as:
        #     cheads = ( (missingheads and ::commonheads)
        #              + (commonheads and ::missingheads))
        #
        # while trying to push we already computed the following:
        #     common = (::commonheads)
        #     missing = ((commonheads::missingheads) - commonheads)
        #
        # We can pick:
        # * missingheads part of common (::commonheads)
        common = set(pushop.outgoing.common)
        nm = pushop.repo.changelog.nodemap
        cheads = [node for node in pushop.revs if nm[node] in common]
        # and
        # * commonheads parents on missing
        revset = unfi.set('%ln and parents(roots(%ln))',
                          pushop.outgoing.commonheads,
                          pushop.outgoing.missing)
        cheads.extend(c.node() for c in revset)
    pushop.commonheads = cheads

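The comment above defines the target as `heads(::missingheads and ::commonheads)`. On a toy DAG this can be computed directly, which makes the revset algebra easier to check. A standalone Python 3 sketch over a `{node: [parents]}` mapping (not Mercurial's revlog API):

```python
def ancestorsof(parents, nodes):
    """All ancestors of `nodes` (inclusive), i.e. `::nodes` on a toy
    {node: [parents]} DAG."""
    seen = set()
    stack = list(nodes)
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(parents.get(n, []))
    return seen

def commonpushheads(parents, commonheads, missingheads):
    """heads(::missingheads and ::commonheads): nodes in both ancestor
    sets that are not a parent of another node in the intersection."""
    both = (ancestorsof(parents, commonheads)
            & ancestorsof(parents, missingheads))
    notheads = set()
    for n in both:
        # a node with a child inside the set cannot be a head of it
        notheads.update(p for p in parents.get(n, []) if p in both)
    return both - notheads
```

On a linear chain a <- b <- c <- d with commonheads = [b] and missingheads = [d], the intersection is {a, b} and its only head is b, matching the "missingheads part of common" case in the code above.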
def _pushsyncphase(pushop):
    """synchronise phase information locally and remotely"""
    unfi = pushop.repo.unfiltered()
    cheads = pushop.commonheads
    if pushop.ret:
        # push succeeded, synchronize target of the push
        cheads = pushop.outgoing.missingheads
    elif pushop.revs is None:
        # All-out push failed. synchronize all common
        cheads = pushop.outgoing.commonheads
    else:
        # I want cheads = heads(::missingheads and ::commonheads)
        # (missingheads is revs with secret changeset filtered out)
        #
        # This can be expressed as:
        #     cheads = ( (missingheads and ::commonheads)
        #              + (commonheads and ::missingheads))
        #
        # while trying to push we already computed the following:
        #     common = (::commonheads)
        #     missing = ((commonheads::missingheads) - commonheads)
        #
        # We can pick:
        # * missingheads part of common (::commonheads)
        common = set(pushop.outgoing.common)
        nm = pushop.repo.changelog.nodemap
        cheads = [node for node in pushop.revs if nm[node] in common]
        # and
        # * commonheads parents on missing
        revset = unfi.set('%ln and parents(roots(%ln))',
                          pushop.outgoing.commonheads,
                          pushop.outgoing.missing)
        cheads.extend(c.node() for c in revset)
    pushop.commonheads = cheads
    # even when we don't push, exchanging phase data is useful
    remotephases = pushop.remote.listkeys('phases')
    if (pushop.ui.configbool('ui', '_usedassubrepo', False)
        and remotephases    # server supports phases
        and pushop.ret is None # nothing was pushed
        and remotephases.get('publishing', False)):
        # When:
        # - this is a subrepo push
        # - and remote supports phases
        # - and no changeset was pushed
        # - and remote is publishing
        # We may be in issue 3871 case!
        # We drop the possible phase synchronisation done by
        # courtesy to publish changesets possibly locally draft
        # on the remote.
        remotephases = {'publishing': 'True'}
    if not remotephases: # old server or public only repo
        _localphasemove(pushop, cheads)
        # don't push any phase data as there is nothing to push
    else:
        ana = phases.analyzeremotephases(pushop.repo, cheads,
                                         remotephases)
        pheads, droots = ana
        ### Apply remote phase on local
        if remotephases.get('publishing', False):
            _localphasemove(pushop, cheads)
        else: # publish = False
            _localphasemove(pushop, pheads)
            _localphasemove(pushop, cheads, phases.draft)
        ### Apply local phase on remote

        # Get the list of all revs draft on remote by public here.
        # XXX Beware that the revset breaks if droots is not strictly
        # XXX roots; we may want to ensure it is, but that is costly
        outdated = unfi.set('heads((%ln::%ln) and public())',
                            droots, cheads)
        for newremotehead in outdated:
            r = pushop.remote.pushkey('phases',
                                      newremotehead.hex(),
                                      str(phases.draft),
                                      str(phases.public))
            if not r:
                pushop.ui.warn(_('updating %s to public failed!\n')
                               % newremotehead)

def _localphasemove(pushop, nodes, phase=phases.public):
    """move <nodes> to <phase> in the local source repo"""
    if pushop.locallocked:
        phases.advanceboundary(pushop.repo, phase, nodes)
    else:
        # repo is not locked, do not change any phases!
        # Informs the user that phases should have been moved when
        # applicable.
        actualmoves = [n for n in nodes if phase < pushop.repo[n].phase()]
        phasestr = phases.phasenames[phase]
        if actualmoves:
            pushop.ui.status(_('cannot lock source repo, skipping '
                               'local %s phase update\n') % phasestr)

def _pushobsolete(pushop):
    """utility function to push obsolete markers to a remote"""
    pushop.ui.debug('try to push obsolete markers to remote\n')
    repo = pushop.repo
    remote = pushop.remote
    if (obsolete._enabled and repo.obsstore and
        'obsolete' in remote.listkeys('namespaces')):
        rslts = []
        remotedata = repo.listkeys('obsolete')
        for key in sorted(remotedata, reverse=True):
            # reverse sort to ensure we end with dump0
            data = remotedata[key]
            rslts.append(remote.pushkey('obsolete', key, '', data))
        if [r for r in rslts if not r]:
            msg = _('failed to push some obsolete markers!\n')
            repo.ui.warn(msg)

357 def _pushbookmark(pushop):
357 def _pushbookmark(pushop):
358 """Update bookmark position on remote"""
358 """Update bookmark position on remote"""
359 ui = pushop.ui
359 ui = pushop.ui
360 repo = pushop.repo.unfiltered()
360 repo = pushop.repo.unfiltered()
361 remote = pushop.remote
361 remote = pushop.remote
362 ui.debug("checking for updated bookmarks\n")
362 ui.debug("checking for updated bookmarks\n")
363 revnums = map(repo.changelog.rev, pushop.revs or [])
363 revnums = map(repo.changelog.rev, pushop.revs or [])
364 ancestors = [a for a in repo.changelog.ancestors(revnums, inclusive=True)]
364 ancestors = [a for a in repo.changelog.ancestors(revnums, inclusive=True)]
365 (addsrc, adddst, advsrc, advdst, diverge, differ, invalid
365 (addsrc, adddst, advsrc, advdst, diverge, differ, invalid
366 ) = bookmarks.compare(repo, repo._bookmarks, remote.listkeys('bookmarks'),
366 ) = bookmarks.compare(repo, repo._bookmarks, remote.listkeys('bookmarks'),
367 srchex=hex)
367 srchex=hex)
368
368
369 for b, scid, dcid in advsrc:
369 for b, scid, dcid in advsrc:
370 if ancestors and repo[scid].rev() not in ancestors:
370 if ancestors and repo[scid].rev() not in ancestors:
371 continue
371 continue
372 if remote.pushkey('bookmarks', b, dcid, scid):
372 if remote.pushkey('bookmarks', b, dcid, scid):
373 ui.status(_("updating bookmark %s\n") % b)
373 ui.status(_("updating bookmark %s\n") % b)
374 else:
374 else:
375 ui.warn(_('updating bookmark %s failed!\n') % b)
375 ui.warn(_('updating bookmark %s failed!\n') % b)
376
376
class pulloperation(object):
    """An object that represents a single pull operation

    Its purpose is to carry pull related state and very common operations.

    A new instance should be created at the beginning of each pull and
    discarded afterward.
    """

    def __init__(self, repo, remote, heads=None, force=False):
        # repo we pull into
        self.repo = repo
        # repo we pull from
        self.remote = remote
        # revision we try to pull (None is "all")
        self.heads = heads
        # do we force pull?
        self.force = force
        # the name of the pull transaction
        self._trname = 'pull\n' + util.hidepassword(remote.url())
        # hold the transaction once created
        self._tr = None
        # set of common changesets between local and remote before pull
        self.common = None
        # set of pulled heads
        self.rheads = None
        # list of missing changesets to fetch remotely
        self.fetch = None
        # result of changegroup pulling (used as return code by pull)
        self.cgresult = None
        # list of steps remaining todo (related to future bundle2 usage)
        self.todosteps = set(['changegroup', 'phases', 'obsmarkers'])

    @util.propertycache
    def pulledsubset(self):
        """heads of the set of changesets targeted by the pull"""
        # compute target subset
        if self.heads is None:
            # We pulled everything possible
            # sync on everything common
            c = set(self.common)
            ret = list(self.common)
            for n in self.rheads:
                if n not in c:
                    ret.append(n)
            return ret
        else:
            # We pulled a specific subset
            # sync on this subset
            return self.heads

    def gettransaction(self):
        """get appropriate pull transaction, creating it if needed"""
        if self._tr is None:
            self._tr = self.repo.transaction(self._trname)
        return self._tr

    def closetransaction(self):
        """close transaction if created"""
        if self._tr is not None:
            self._tr.close()

    def releasetransaction(self):
        """release transaction if created"""
        if self._tr is not None:
            self._tr.release()

def pull(repo, remote, heads=None, force=False):
    pullop = pulloperation(repo, remote, heads, force)
    if pullop.remote.local():
        missing = set(pullop.remote.requirements) - pullop.repo.supported
        if missing:
            msg = _("required features are not"
                    " supported in the destination:"
                    " %s") % (', '.join(sorted(missing)))
            raise util.Abort(msg)

    lock = pullop.repo.lock()
    try:
        _pulldiscovery(pullop)
        if pullop.remote.capable('bundle2'):
            _pullbundle2(pullop)
        if 'changegroup' in pullop.todosteps:
            _pullchangeset(pullop)
        if 'phases' in pullop.todosteps:
            _pullphase(pullop)
        if 'obsmarkers' in pullop.todosteps:
            _pullobsolete(pullop)
        pullop.closetransaction()
    finally:
        pullop.releasetransaction()
        lock.release()

    return pullop.cgresult
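The try/finally shape in `pull` above commits the transaction only when every step succeeded (`closetransaction`), while `releasetransaction` always runs and rolls back anything left open. A minimal sketch of that lifecycle with a stand-in transaction object (all names hypothetical; the real transaction journals repository writes):

```python
class DemoTransaction:
    """Hypothetical stand-in for a repo transaction."""
    def __init__(self):
        self.state = 'open'

    def close(self):
        # commit: only reached when every step succeeded
        self.state = 'committed'

    def release(self):
        # always called from the finally block; aborts unless close() ran
        if self.state == 'open':
            self.state = 'aborted'

def failing(tr):
    raise RuntimeError('boom')

def run(steps):
    # mirror pull(): run each step, close on success, release no matter what
    tr = DemoTransaction()
    try:
        try:
            for step in steps:
                step(tr)
            tr.close()
        finally:
            tr.release()
    except RuntimeError:
        pass
    return tr.state

assert run([]) == 'committed'
assert run([failing]) == 'aborted'
```

The same pattern is why `releasetransaction` checks `self._tr is not None`: a pull with nothing to write never opens a transaction at all.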

def _pulldiscovery(pullop):
    """discovery phase for the pull

    Currently handles changeset discovery only; will handle all discovery
    at some point."""
    tmp = discovery.findcommonincoming(pullop.repo.unfiltered(),
                                       pullop.remote,
                                       heads=pullop.heads,
                                       force=pullop.force)
    pullop.common, pullop.fetch, pullop.rheads = tmp

def _pullbundle2(pullop):
    """pull data using bundle2

    For now, the only supported data are changegroup."""
    kwargs = {'bundlecaps': set(['HG20'])}
    # pulling changegroup
    pullop.todosteps.remove('changegroup')
    if not pullop.fetch:
        pullop.repo.ui.status(_("no changes found\n"))
        pullop.cgresult = 0
    else:
        kwargs['common'] = pullop.common
        kwargs['heads'] = pullop.heads or pullop.rheads
        if pullop.heads is None and list(pullop.common) == [nullid]:
            pullop.repo.ui.status(_("requesting all changes\n"))
    if kwargs.keys() == ['format']:
        return # nothing to pull
    bundle = pullop.remote.getbundle('pull', **kwargs)
    try:
        op = bundle2.processbundle(pullop.repo, bundle, pullop.gettransaction)
    except KeyError, exc:
        raise util.Abort('missing support for %s' % exc)
    assert len(op.records['changegroup']) == 1
    pullop.cgresult = op.records['changegroup'][0]['return']

def _pullchangeset(pullop):
    """pull changeset from unbundle into the local repo"""
    # We delay the open of the transaction as late as possible so we
    # don't open transaction for nothing or you break future useful
    # rollback call
    pullop.todosteps.remove('changegroup')
    if not pullop.fetch:
        pullop.repo.ui.status(_("no changes found\n"))
        pullop.cgresult = 0
        return
    pullop.gettransaction()
    if pullop.heads is None and list(pullop.common) == [nullid]:
        pullop.repo.ui.status(_("requesting all changes\n"))
    elif pullop.heads is None and pullop.remote.capable('changegroupsubset'):
        # issue1320, avoid a race if remote changed after discovery
        pullop.heads = pullop.rheads

    if pullop.remote.capable('getbundle'):
        # TODO: get bundlecaps from remote
        cg = pullop.remote.getbundle('pull', common=pullop.common,
                                     heads=pullop.heads or pullop.rheads)
    elif pullop.heads is None:
        cg = pullop.remote.changegroup(pullop.fetch, 'pull')
    elif not pullop.remote.capable('changegroupsubset'):
        raise util.Abort(_("partial pull cannot be done because "
                           "other repository doesn't support "
                           "changegroupsubset."))
    else:
        cg = pullop.remote.changegroupsubset(pullop.fetch, pullop.heads, 'pull')
    pullop.cgresult = changegroup.addchangegroup(pullop.repo, cg, 'pull',
                                                 pullop.remote.url())

def _pullphase(pullop):
    # Get remote phases data from remote
    pullop.todosteps.remove('phases')
    remotephases = pullop.remote.listkeys('phases')
    publishing = bool(remotephases.get('publishing', False))
    if remotephases and not publishing:
        # remote is new and unpublishing
        pheads, _dr = phases.analyzeremotephases(pullop.repo,
                                                 pullop.pulledsubset,
                                                 remotephases)
        phases.advanceboundary(pullop.repo, phases.public, pheads)
        phases.advanceboundary(pullop.repo, phases.draft,
                               pullop.pulledsubset)
    else:
        # Remote is old or publishing all common changesets
        # should be seen as public
        phases.advanceboundary(pullop.repo, phases.public,
                               pullop.pulledsubset)

def _pullobsolete(pullop):
    """utility function to pull obsolete markers from a remote

    The `gettransaction` function returns the pull transaction, creating
    one if necessary. We return the transaction to inform the calling code
    that a new transaction has been created (when applicable).

    Exists mostly to allow overriding for experimentation purposes"""
    pullop.todosteps.remove('obsmarkers')
    tr = None
    if obsolete._enabled:
        pullop.repo.ui.debug('fetching remote obsolete markers\n')
        remoteobs = pullop.remote.listkeys('obsolete')
        if 'dump0' in remoteobs:
            tr = pullop.gettransaction()
            for key in sorted(remoteobs, reverse=True):
                if key.startswith('dump'):
                    data = base85.b85decode(remoteobs[key])
                    pullop.repo.obsstore.mergemarkers(tr, data)
            pullop.repo.invalidatevolatilesets()
    return tr

def getbundle(repo, source, heads=None, common=None, bundlecaps=None):
    """return a full bundle (with potentially multiple kinds of parts)

    Could be a bundle HG10 or a bundle HG20 depending on bundlecaps
    passed. For now, the bundle can contain only changegroup, but this
    will change when more part types become available for bundle2.

    This is different from changegroup.getbundle, which only returns an
    HG10 changegroup bundle. They may eventually get reunited in the future
    when we have a clearer idea of the API we want to use to query
    different data.

    The implementation is at a very early stage and will get massive rework
    when the API of bundle is refined.
    """
    # build bundle here.
    cg = changegroup.getbundle(repo, source, heads=heads,
                               common=common, bundlecaps=bundlecaps)
    if bundlecaps is None or 'HG20' not in bundlecaps:
        return cg
    # very crude first implementation,
    # the bundle API will change and the generation will be done lazily.
    bundler = bundle2.bundle20(repo.ui)
    def cgchunks(cg=cg):
        yield 'HG10UN'
        for c in cg.getchunks():
            yield c
    part = bundle2.bundlepart('changegroup', data=cgchunks())
    bundler.addpart(part)
    return bundle2.unbundle20(repo.ui, util.chunkbuffer(bundler.getchunks()))

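The `cgchunks` helper above prefixes a chunk stream with a magic string by chaining generators, so the payload stays lazy. The generic shape of that trick, sketched standalone (`with_header` is a hypothetical name):

```python
def with_header(header, chunks):
    # Yield a fixed header first, then pass every chunk through untouched,
    # the same shape cgchunks uses to prepend the 'HG10UN' magic above.
    yield header
    for c in chunks:
        yield c

assert list(with_header('HG10UN', iter(['ab', 'cd']))) == ['HG10UN', 'ab', 'cd']
```

Because nothing is buffered, the underlying changegroup chunks are still produced on demand when the bundle is streamed out.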
class PushRaced(RuntimeError):
    """An exception raised during unbundling that indicates a push race"""

def check_heads(repo, their_heads, context):
    """check if the heads of a repo have been modified

    Used by peer for unbundling.
    """
    heads = repo.heads()
    heads_hash = util.sha1(''.join(sorted(heads))).digest()
    if not (their_heads == ['force'] or their_heads == heads or
            their_heads == ['hashed', heads_hash]):
        # someone else committed/pushed/unbundled while we
        # were transferring data
        raise PushRaced('repository changed while %s - '
                        'please try again' % context)

def unbundle(repo, cg, heads, source, url):
    """Apply a bundle to a repo.

    This function makes sure the repo is locked during the application and
    has a mechanism to check that no push race occurred between the
    creation of the bundle and its application.

    If the push was raced, a PushRaced exception is raised."""
    r = 0
    lock = repo.lock()
    try:
        check_heads(repo, heads, 'uploading changes')
        # push can proceed
        r = changegroup.addchangegroup(repo, cg, source, url)
    finally:
        lock.release()
    return r
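The race check in `check_heads` above compares a SHA-1 fingerprint of the sorted head node ids, so the comparison is order-independent but sensitive to any head appearing or disappearing. A standalone sketch of that fingerprint (`heads_fingerprint` is a hypothetical helper; node ids are assumed to be raw 20-byte hashes):

```python
import hashlib

def heads_fingerprint(heads):
    # Hash the sorted concatenation of binary node ids, mirroring
    # util.sha1(''.join(sorted(heads))).digest() in check_heads above.
    return hashlib.sha1(b''.join(sorted(heads))).digest()

before = [b'\x11' * 20, b'\x22' * 20]
after = before + [b'\x33' * 20]   # someone pushed while data was in flight

# same heads in any order -> same fingerprint; a new head -> mismatch
assert heads_fingerprint(before) == heads_fingerprint(list(reversed(before)))
assert heads_fingerprint(before) != heads_fingerprint(after)
```

Sorting before hashing is what lets the client and server compute the fingerprint independently without agreeing on a head order.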
@@ -1,676 +1,677 @@

Create an extension to test bundle2 API

  $ cat > bundle2.py << EOF
  > """A small extension to test bundle2 implementation
  >
  > Current bundle2 implementation is far too limited to be used in any core
  > code. We still need to be able to test it while it grows up.
  > """
  >
  > import sys
  > from mercurial import cmdutil
  > from mercurial import util
  > from mercurial import bundle2
  > from mercurial import scmutil
  > from mercurial import discovery
  > from mercurial import changegroup
  > cmdtable = {}
  > command = cmdutil.command(cmdtable)
  >
  > ELEPHANTSSONG = """Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko
  > Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
  > Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko."""
  > assert len(ELEPHANTSSONG) == 178 # future tests say 178 bytes, trust it.
  >
  > @bundle2.parthandler('test:song')
  > def songhandler(op, part):
  >     """handle a "test:song" bundle2 part, printing the lyrics on stdin"""
  >     op.ui.write('The choir starts singing:\n')
  >     verses = 0
  >     for line in part.data.split('\n'):
  >         op.ui.write('    %s\n' % line)
  >         verses += 1
  >     op.records.add('song', {'verses': verses})
  >
  > @bundle2.parthandler('test:ping')
  > def pinghandler(op, part):
  >     op.ui.write('received ping request (id %i)\n' % part.id)
  >     if op.reply is not None:
  >         rpart = bundle2.bundlepart('test:pong',
  >                                    [('in-reply-to', str(part.id))])
  >         op.reply.addpart(rpart)
  >
  > @command('bundle2',
  >          [('', 'param', [], 'stream level parameter'),
  >           ('', 'unknown', False, 'include an unknown mandatory part in the bundle'),
  >           ('', 'parts', False, 'include some arbitrary parts to the bundle'),
  >           ('r', 'rev', [], 'includes those changeset in the bundle'),],
  >          '[OUTPUTFILE]')
  > def cmdbundle2(ui, repo, path=None, **opts):
  >     """write a bundle2 container on standard output"""
  >     bundler = bundle2.bundle20(ui)
  >     for p in opts['param']:
  >         p = p.split('=', 1)
  >         try:
  >             bundler.addparam(*p)
  >         except ValueError, exc:
  >             raise util.Abort('%s' % exc)
  >
  >     revs = opts['rev']
  >     if 'rev' in opts:
  >         revs = scmutil.revrange(repo, opts['rev'])
  >     if revs:
  >         # very crude version of a changegroup part creation
  >         bundled = repo.revs('%ld::%ld', revs, revs)
  >         headmissing = [c.node() for c in repo.set('heads(%ld)', revs)]
  >         headcommon = [c.node() for c in repo.set('parents(%ld) - %ld', revs, revs)]
  >         outgoing = discovery.outgoing(repo.changelog, headcommon, headmissing)
  >         cg = changegroup.getlocalbundle(repo, 'test:bundle2', outgoing, None)
  >         def cgchunks(cg=cg):
  >             yield 'HG10UN'
  >             for c in cg.getchunks():
  >                 yield c
  >         part = bundle2.bundlepart('changegroup', data=cgchunks())
  >         bundler.addpart(part)
  >
  >     if opts['parts']:
  >         part = bundle2.bundlepart('test:empty')
  >         bundler.addpart(part)
  >         # add a second one to make sure we handle multiple parts
  >         part = bundle2.bundlepart('test:empty')
  >         bundler.addpart(part)
  >         part = bundle2.bundlepart('test:song', data=ELEPHANTSSONG)
  >         bundler.addpart(part)
  >         part = bundle2.bundlepart('test:math',
  >                                   [('pi', '3.14'), ('e', '2.72')],
  >                                   [('cooking', 'raw')],
  >                                   '42')
  >         bundler.addpart(part)
  >     if opts['unknown']:
  >         part = bundle2.bundlepart('test:UNKNOWN',
  >                                   data='some random content')
  >         bundler.addpart(part)
  >     if opts['parts']:
  >         part = bundle2.bundlepart('test:ping')
  >         bundler.addpart(part)
  >
  >     if path is None:
  >         file = sys.stdout
  >     else:
  >         file = open(path, 'w')
  >
  >     for chunk in bundler.getchunks():
  >         file.write(chunk)
  >
  > @command('unbundle2', [], '')
  > def cmdunbundle2(ui, repo, replypath=None):
  >     """process a bundle2 stream from stdin on the current repo"""
  >     try:
  >         tr = None
  >         lock = repo.lock()
  >         tr = repo.transaction('processbundle')
  >         try:
  >             unbundler = bundle2.unbundle20(ui, sys.stdin)
  >             op = bundle2.processbundle(repo, unbundler, lambda: tr)
  >             tr.close()
  >         except KeyError, exc:
  >             raise util.Abort('missing support for %s' % exc)
  >     finally:
  >         if tr is not None:
  >             tr.release()
  >         lock.release()
  >     remains = sys.stdin.read()
  >     ui.write('%i unread bytes\n' % len(remains))
  >     if op.records['song']:
  >         totalverses = sum(r['verses'] for r in op.records['song'])
  >         ui.write('%i total verses sung\n' % totalverses)
  >     for rec in op.records['changegroup']:
  >         ui.write('addchangegroup return: %i\n' % rec['return'])
  >     if op.reply is not None and replypath is not None:
  >         file = open(replypath, 'w')
  >         for chunk in op.reply.getchunks():
  >             file.write(chunk)
  >
  > @command('statbundle2', [], '')
  > def cmdstatbundle2(ui, repo):
  >     """print statistics on the bundle2 container read from stdin"""
  >     unbundler = bundle2.unbundle20(ui, sys.stdin)
  >     try:
  >         params = unbundler.params
  >     except KeyError, exc:
  >         raise util.Abort('unknown parameters: %s' % exc)
  >     ui.write('options count: %i\n' % len(params))
  >     for key in sorted(params):
  >         ui.write('- %s\n' % key)
  >         value = params[key]
  >         if value is not None:
  >             ui.write('    %s\n' % value)
  >     parts = list(unbundler)
  >     ui.write('parts count: %i\n' % len(parts))
  >     for p in parts:
  >         ui.write('  :%s:\n' % p.type)
  >         ui.write('    mandatory: %i\n' % len(p.mandatoryparams))
  >         ui.write('    advisory: %i\n' % len(p.advisoryparams))
  >         ui.write('    payload: %i bytes\n' % len(p.data))
  > EOF
156 $ cat >> $HGRCPATH << EOF
157 $ cat >> $HGRCPATH << EOF
157 > [extensions]
158 > [extensions]
158 > bundle2=$TESTTMP/bundle2.py
159 > bundle2=$TESTTMP/bundle2.py
159 > [server]
160 > [server]
160 > bundle2=True
161 > bundle2=True
161 > EOF
162 > EOF
162
163
163 The extension requires a repo (currently unused)
164 The extension requires a repo (currently unused)
164
165
165 $ hg init main
166 $ hg init main
166 $ cd main
167 $ cd main
167 $ touch a
168 $ touch a
168 $ hg add a
169 $ hg add a
169 $ hg commit -m 'a'
170 $ hg commit -m 'a'
170
171
171
172
172 Empty bundle
173 Empty bundle
173 =================
174 =================
174
175
175 - no option
176 - no option
176 - no parts
177 - no parts
177
178
178 Test bundling
179 Test bundling
179
180
180 $ hg bundle2
181 $ hg bundle2
181 HG20\x00\x00\x00\x00 (no-eol) (esc)
182 HG20\x00\x00\x00\x00 (no-eol) (esc)
182
183
183 Test unbundling
184 Test unbundling
184
185
185 $ hg bundle2 | hg statbundle2
186 $ hg bundle2 | hg statbundle2
186 options count: 0
187 options count: 0
187 parts count: 0
188 parts count: 0
188
189
189 Test old style bundle are detected and refused
190 Test old style bundle are detected and refused
190
191
191 $ hg bundle --all ../bundle.hg
192 $ hg bundle --all ../bundle.hg
192 1 changesets found
193 1 changesets found
193 $ hg statbundle2 < ../bundle.hg
194 $ hg statbundle2 < ../bundle.hg
194 abort: unknown bundle version 10
195 abort: unknown bundle version 10
195 [255]
196 [255]
196
197
197 Test parameters
198 Test parameters
198 =================
199 =================
199
200
200 - some options
201 - some options
201 - no parts
202 - no parts
202
203
203 advisory parameters, no value
204 advisory parameters, no value
204 -------------------------------
205 -------------------------------
205
206
206 Simplest possible parameters form
207 Simplest possible parameters form
207
208
208 Test generation simple option
209 Test generation simple option
209
210
210 $ hg bundle2 --param 'caution'
211 $ hg bundle2 --param 'caution'
211 HG20\x00\x07caution\x00\x00 (no-eol) (esc)
212 HG20\x00\x07caution\x00\x00 (no-eol) (esc)
212
213
213 Test unbundling
214 Test unbundling
214
215
215 $ hg bundle2 --param 'caution' | hg statbundle2
216 $ hg bundle2 --param 'caution' | hg statbundle2
216 options count: 1
217 options count: 1
217 - caution
218 - caution
218 parts count: 0
219 parts count: 0
219
220
220 Test generation multiple option
221 Test generation multiple option
221
222
222 $ hg bundle2 --param 'caution' --param 'meal'
223 $ hg bundle2 --param 'caution' --param 'meal'
223 HG20\x00\x0ccaution meal\x00\x00 (no-eol) (esc)
224 HG20\x00\x0ccaution meal\x00\x00 (no-eol) (esc)
224
225
225 Test unbundling
226 Test unbundling
226
227
227 $ hg bundle2 --param 'caution' --param 'meal' | hg statbundle2
228 $ hg bundle2 --param 'caution' --param 'meal' | hg statbundle2
228 options count: 2
229 options count: 2
229 - caution
230 - caution
230 - meal
231 - meal
231 parts count: 0
232 parts count: 0
232
233
233 advisory parameters, with value
234 advisory parameters, with value
234 -------------------------------
235 -------------------------------
235
236
236 Test generation
237 Test generation
237
238
238 $ hg bundle2 --param 'caution' --param 'meal=vegan' --param 'elephants'
239 $ hg bundle2 --param 'caution' --param 'meal=vegan' --param 'elephants'
239 HG20\x00\x1ccaution meal=vegan elephants\x00\x00 (no-eol) (esc)
240 HG20\x00\x1ccaution meal=vegan elephants\x00\x00 (no-eol) (esc)
240
241
241 Test unbundling
242 Test unbundling
242
243
243 $ hg bundle2 --param 'caution' --param 'meal=vegan' --param 'elephants' | hg statbundle2
244 $ hg bundle2 --param 'caution' --param 'meal=vegan' --param 'elephants' | hg statbundle2
244 options count: 3
245 options count: 3
245 - caution
246 - caution
246 - elephants
247 - elephants
247 - meal
248 - meal
248 vegan
249 vegan
249 parts count: 0
250 parts count: 0
250
251
251 parameter with special char in value
252 parameter with special char in value
252 ---------------------------------------------------
253 ---------------------------------------------------
253
254
254 Test generation
255 Test generation
255
256
256 $ hg bundle2 --param 'e|! 7/=babar%#==tutu' --param simple
257 $ hg bundle2 --param 'e|! 7/=babar%#==tutu' --param simple
257 HG20\x00)e%7C%21%207/=babar%25%23%3D%3Dtutu simple\x00\x00 (no-eol) (esc)
258 HG20\x00)e%7C%21%207/=babar%25%23%3D%3Dtutu simple\x00\x00 (no-eol) (esc)
258
259
259 Test unbundling
260 Test unbundling
260
261
261 $ hg bundle2 --param 'e|! 7/=babar%#==tutu' --param simple | hg statbundle2
262 $ hg bundle2 --param 'e|! 7/=babar%#==tutu' --param simple | hg statbundle2
262 options count: 2
263 options count: 2
263 - e|! 7/
264 - e|! 7/
264 babar%#==tutu
265 babar%#==tutu
265 - simple
266 - simple
266 parts count: 0
267 parts count: 0
267
268
Test unknown mandatory option
---------------------------------------------------

  $ hg bundle2 --param 'Gravity' | hg statbundle2
  abort: unknown parameters: 'Gravity'
  [255]

Test debug output
---------------------------------------------------

bundling debug

  $ hg bundle2 --debug --param 'e|! 7/=babar%#==tutu' --param simple ../out.hg2
  start emission of HG20 stream
  bundle parameter: e%7C%21%207/=babar%25%23%3D%3Dtutu simple
  start of parts
  end of bundle

file content is ok

  $ cat ../out.hg2
  HG20\x00)e%7C%21%207/=babar%25%23%3D%3Dtutu simple\x00\x00 (no-eol) (esc)

unbundling debug

  $ hg statbundle2 --debug < ../out.hg2
  start processing of HG20 stream
  reading bundle2 stream parameters
  ignoring unknown parameter 'e|! 7/'
  ignoring unknown parameter 'simple'
  options count: 2
  - e|! 7/
      babar%#==tutu
  - simple
  start extraction of bundle2 parts
  part header size: 0
  end of bundle2 stream
  parts count: 0


Test buggy input
---------------------------------------------------

empty parameter name

  $ hg bundle2 --param '' --quiet
  abort: empty parameter name
  [255]

bad parameter name

  $ hg bundle2 --param 42babar
  abort: non letter first character: '42babar'
  [255]


Test part
=================

  $ hg bundle2 --parts ../parts.hg2 --debug
  start emission of HG20 stream
  bundle parameter:
  start of parts
  bundle part: "test:empty"
  bundle part: "test:empty"
  bundle part: "test:song"
  bundle part: "test:math"
  bundle part: "test:ping"
  end of bundle

  $ cat ../parts.hg2
  HG20\x00\x00\x00\x11 (esc)
  test:empty\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x11 (esc)
  test:empty\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x10 test:song\x00\x00\x00\x02\x00\x00\x00\x00\x00\xb2Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko (esc)
  Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
  Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.\x00\x00\x00\x00\x00+ test:math\x00\x00\x00\x03\x02\x01\x02\x04\x01\x04\x07\x03pi3.14e2.72cookingraw\x00\x00\x00\x0242\x00\x00\x00\x00\x00\x10 test:ping\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00 (no-eol) (esc)


  $ hg statbundle2 < ../parts.hg2
  options count: 0
  parts count: 5
      :test:empty:
        mandatory: 0
        advisory: 0
        payload: 0 bytes
      :test:empty:
        mandatory: 0
        advisory: 0
        payload: 0 bytes
      :test:song:
        mandatory: 0
        advisory: 0
        payload: 178 bytes
      :test:math:
        mandatory: 2
        advisory: 1
        payload: 2 bytes
      :test:ping:
        mandatory: 0
        advisory: 0
        payload: 0 bytes

  $ hg statbundle2 --debug < ../parts.hg2
  start processing of HG20 stream
  reading bundle2 stream parameters
  options count: 0
  start extraction of bundle2 parts
  part header size: 17
  part type: "test:empty"
  part id: "0"
  part parameters: 0
  payload chunk size: 0
  part header size: 17
  part type: "test:empty"
  part id: "1"
  part parameters: 0
  payload chunk size: 0
  part header size: 16
  part type: "test:song"
  part id: "2"
  part parameters: 0
  payload chunk size: 178
  payload chunk size: 0
  part header size: 43
  part type: "test:math"
  part id: "3"
  part parameters: 3
  payload chunk size: 2
  payload chunk size: 0
  part header size: 16
  part type: "test:ping"
  part id: "4"
  part parameters: 0
  payload chunk size: 0
  part header size: 0
  end of bundle2 stream
  parts count: 5
      :test:empty:
        mandatory: 0
        advisory: 0
        payload: 0 bytes
      :test:empty:
        mandatory: 0
        advisory: 0
        payload: 0 bytes
      :test:song:
        mandatory: 0
        advisory: 0
        payload: 178 bytes
      :test:math:
        mandatory: 2
        advisory: 1
        payload: 2 bytes
      :test:ping:
        mandatory: 0
        advisory: 0
        payload: 0 bytes

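The 43-byte `test:math` header in the dump above decodes as: a 1-byte type length, the type string, a 4-byte part id, 1-byte counts of mandatory and advisory parameters, a (name size, value size) byte pair per parameter, then the concatenated names and values. A sketch that rebuilds such a size-prefixed header from those fields (an illustration inferred from the byte dumps, assuming Python 3; `build_part_header` is a hypothetical name, not Mercurial API):

```python
import struct

def build_part_header(parttype, partid, mandatory=(), advisory=()):
    """Sketch: serialize one bundle2 part header with its 2-byte size prefix."""
    header = struct.pack('>B', len(parttype)) + parttype.encode('ascii')
    header += struct.pack('>I', partid)          # 4-byte part id
    header += struct.pack('>BB', len(mandatory), len(advisory))
    allparams = list(mandatory) + list(advisory)
    for name, value in allparams:                # one size pair per parameter
        header += struct.pack('>BB', len(name), len(value))
    for name, value in allparams:                # then names and values
        header += name.encode('ascii') + value.encode('ascii')
    return struct.pack('>H', len(header)) + header
```

For the `test:math` part (id 3, mandatory `pi=3.14`, `e=2.72`, advisory `cooking=raw`) this reproduces the `\x00+` (43) prefix and the `\x02\x01\x02\x04\x01\x04\x07\x03pi3.14e2.72cookingraw` tail seen in `parts.hg2`.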
426 Test actual unbundling of test part
427 Test actual unbundling of test part
427 =======================================
428 =======================================
428
429
429 Process the bundle
430 Process the bundle
430
431
431 $ hg unbundle2 --debug < ../parts.hg2
432 $ hg unbundle2 --debug < ../parts.hg2
432 start processing of HG20 stream
433 start processing of HG20 stream
433 reading bundle2 stream parameters
434 reading bundle2 stream parameters
434 start extraction of bundle2 parts
435 start extraction of bundle2 parts
435 part header size: 17
436 part header size: 17
436 part type: "test:empty"
437 part type: "test:empty"
437 part id: "0"
438 part id: "0"
438 part parameters: 0
439 part parameters: 0
439 payload chunk size: 0
440 payload chunk size: 0
440 ignoring unknown advisory part 'test:empty'
441 ignoring unknown advisory part 'test:empty'
441 part header size: 17
442 part header size: 17
442 part type: "test:empty"
443 part type: "test:empty"
443 part id: "1"
444 part id: "1"
444 part parameters: 0
445 part parameters: 0
445 payload chunk size: 0
446 payload chunk size: 0
446 ignoring unknown advisory part 'test:empty'
447 ignoring unknown advisory part 'test:empty'
447 part header size: 16
448 part header size: 16
448 part type: "test:song"
449 part type: "test:song"
449 part id: "2"
450 part id: "2"
450 part parameters: 0
451 part parameters: 0
451 payload chunk size: 178
452 payload chunk size: 178
452 payload chunk size: 0
453 payload chunk size: 0
453 found a handler for part 'test:song'
454 found a handler for part 'test:song'
454 The choir starts singing:
455 The choir starts singing:
455 Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko
456 Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko
456 Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
457 Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
457 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.
458 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.
458 part header size: 43
459 part header size: 43
459 part type: "test:math"
460 part type: "test:math"
460 part id: "3"
461 part id: "3"
461 part parameters: 3
462 part parameters: 3
462 payload chunk size: 2
463 payload chunk size: 2
463 payload chunk size: 0
464 payload chunk size: 0
464 ignoring unknown advisory part 'test:math'
465 ignoring unknown advisory part 'test:math'
465 part header size: 16
466 part header size: 16
466 part type: "test:ping"
467 part type: "test:ping"
467 part id: "4"
468 part id: "4"
468 part parameters: 0
469 part parameters: 0
469 payload chunk size: 0
470 payload chunk size: 0
470 found a handler for part 'test:ping'
471 found a handler for part 'test:ping'
471 received ping request (id 4)
472 received ping request (id 4)
472 part header size: 0
473 part header size: 0
473 end of bundle2 stream
474 end of bundle2 stream
474 0 unread bytes
475 0 unread bytes
475 3 total verses sung
476 3 total verses sung
476
477
477 Unbundle with an unknown mandatory part
478 Unbundle with an unknown mandatory part
478 (should abort)
479 (should abort)
479
480
480 $ hg bundle2 --parts --unknown ../unknown.hg2
481 $ hg bundle2 --parts --unknown ../unknown.hg2
481
482
482 $ hg unbundle2 < ../unknown.hg2
483 $ hg unbundle2 < ../unknown.hg2
483 The choir starts singing:
484 The choir starts singing:
484 Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko
485 Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko
485 Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
486 Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
486 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.
487 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.
487 0 unread bytes
488 0 unread bytes
488 abort: missing support for 'test:unknown'
489 abort: missing support for 'test:unknown'
489 [255]
490 [255]
490
491
491 unbundle with a reply
492 unbundle with a reply
492
493
493 $ hg unbundle2 ../reply.hg2 < ../parts.hg2
494 $ hg unbundle2 ../reply.hg2 < ../parts.hg2
494 The choir starts singing:
495 The choir starts singing:
495 Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko
496 Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko
496 Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
497 Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
497 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.
498 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.
498 received ping request (id 4)
499 received ping request (id 4)
499 0 unread bytes
500 0 unread bytes
500 3 total verses sung
501 3 total verses sung
501
502
502 The reply is a bundle
503 The reply is a bundle
503
504
504 $ cat ../reply.hg2
505 $ cat ../reply.hg2
505 HG20\x00\x00\x00\x1e test:pong\x00\x00\x00\x00\x01\x00\x0b\x01in-reply-to4\x00\x00\x00\x00\x00\x00 (no-eol) (esc)
506 HG20\x00\x00\x00\x1e test:pong\x00\x00\x00\x00\x01\x00\x0b\x01in-reply-to4\x00\x00\x00\x00\x00\x00 (no-eol) (esc)
506
507
507 The reply is valid
508 The reply is valid
508
509
509 $ hg statbundle2 < ../reply.hg2
510 $ hg statbundle2 < ../reply.hg2
510 options count: 0
511 options count: 0
511 parts count: 1
512 parts count: 1
512 :test:pong:
513 :test:pong:
513 mandatory: 1
514 mandatory: 1
514 advisory: 0
515 advisory: 0
515 payload: 0 bytes
516 payload: 0 bytes
516
517
517 Support for changegroup
518 Support for changegroup
518 ===================================
519 ===================================
519
520
520 $ hg unbundle $TESTDIR/bundles/rebase.hg
521 $ hg unbundle $TESTDIR/bundles/rebase.hg
521 adding changesets
522 adding changesets
522 adding manifests
523 adding manifests
523 adding file changes
524 adding file changes
524 added 8 changesets with 7 changes to 7 files (+3 heads)
525 added 8 changesets with 7 changes to 7 files (+3 heads)
525 (run 'hg heads' to see heads, 'hg merge' to merge)
526 (run 'hg heads' to see heads, 'hg merge' to merge)
526
527
527 $ hg log -G
528 $ hg log -G
528 o changeset: 8:02de42196ebe
529 o changeset: 8:02de42196ebe
529 | tag: tip
530 | tag: tip
530 | parent: 6:24b6387c8c8c
531 | parent: 6:24b6387c8c8c
531 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
532 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
532 | date: Sat Apr 30 15:24:48 2011 +0200
533 | date: Sat Apr 30 15:24:48 2011 +0200
533 | summary: H
534 | summary: H
534 |
535 |
535 | o changeset: 7:eea13746799a
536 | o changeset: 7:eea13746799a
536 |/| parent: 6:24b6387c8c8c
537 |/| parent: 6:24b6387c8c8c
537 | | parent: 5:9520eea781bc
538 | | parent: 5:9520eea781bc
538 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
539 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
539 | | date: Sat Apr 30 15:24:48 2011 +0200
540 | | date: Sat Apr 30 15:24:48 2011 +0200
540 | | summary: G
541 | | summary: G
541 | |
542 | |
542 o | changeset: 6:24b6387c8c8c
543 o | changeset: 6:24b6387c8c8c
543 | | parent: 1:cd010b8cd998
544 | | parent: 1:cd010b8cd998
544 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
545 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
545 | | date: Sat Apr 30 15:24:48 2011 +0200
546 | | date: Sat Apr 30 15:24:48 2011 +0200
546 | | summary: F
547 | | summary: F
547 | |
548 | |
548 | o changeset: 5:9520eea781bc
549 | o changeset: 5:9520eea781bc
549 |/ parent: 1:cd010b8cd998
550 |/ parent: 1:cd010b8cd998
550 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
551 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
551 | date: Sat Apr 30 15:24:48 2011 +0200
552 | date: Sat Apr 30 15:24:48 2011 +0200
552 | summary: E
553 | summary: E
553 |
554 |
554 | o changeset: 4:32af7686d403
555 | o changeset: 4:32af7686d403
555 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
556 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
556 | | date: Sat Apr 30 15:24:48 2011 +0200
557 | | date: Sat Apr 30 15:24:48 2011 +0200
557 | | summary: D
558 | | summary: D
558 | |
559 | |
559 | o changeset: 3:5fddd98957c8
560 | o changeset: 3:5fddd98957c8
560 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
561 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
561 | | date: Sat Apr 30 15:24:48 2011 +0200
562 | | date: Sat Apr 30 15:24:48 2011 +0200
562 | | summary: C
563 | | summary: C
563 | |
564 | |
564 | o changeset: 2:42ccdea3bb16
565 | o changeset: 2:42ccdea3bb16
565 |/ user: Nicolas Dumazet <nicdumz.commits@gmail.com>
566 |/ user: Nicolas Dumazet <nicdumz.commits@gmail.com>
566 | date: Sat Apr 30 15:24:48 2011 +0200
567 | date: Sat Apr 30 15:24:48 2011 +0200
567 | summary: B
568 | summary: B
568 |
569 |
569 o changeset: 1:cd010b8cd998
570 o changeset: 1:cd010b8cd998
570 parent: -1:000000000000
571 parent: -1:000000000000
571 user: Nicolas Dumazet <nicdumz.commits@gmail.com>
572 user: Nicolas Dumazet <nicdumz.commits@gmail.com>
572 date: Sat Apr 30 15:24:48 2011 +0200
573 date: Sat Apr 30 15:24:48 2011 +0200
573 summary: A
574 summary: A
574
575
575 @ changeset: 0:3903775176ed
576 @ changeset: 0:3903775176ed
576 user: test
577 user: test
577 date: Thu Jan 01 00:00:00 1970 +0000
578 date: Thu Jan 01 00:00:00 1970 +0000
578 summary: a
579 summary: a
579
580
580
581
581 $ hg bundle2 --debug --rev '8+7+5+4' ../rev.hg2
582 $ hg bundle2 --debug --rev '8+7+5+4' ../rev.hg2
582 4 changesets found
583 4 changesets found
583 list of changesets:
584 list of changesets:
584 32af7686d403cf45b5d95f2d70cebea587ac806a
585 32af7686d403cf45b5d95f2d70cebea587ac806a
585 9520eea781bcca16c1e15acc0ba14335a0e8e5ba
586 9520eea781bcca16c1e15acc0ba14335a0e8e5ba
586 eea13746799a9e0bfd88f29d3c2e9dc9389f524f
587 eea13746799a9e0bfd88f29d3c2e9dc9389f524f
587 02de42196ebee42ef284b6780a87cdc96e8eaab6
588 02de42196ebee42ef284b6780a87cdc96e8eaab6
588 start emission of HG20 stream
589 start emission of HG20 stream
589 bundle parameter:
590 bundle parameter:
590 start of parts
591 start of parts
591 bundle part: "changegroup"
592 bundle part: "changegroup"
592 bundling: 1/4 changesets (25.00%)
593 bundling: 1/4 changesets (25.00%)
593 bundling: 2/4 changesets (50.00%)
594 bundling: 2/4 changesets (50.00%)
594 bundling: 3/4 changesets (75.00%)
595 bundling: 3/4 changesets (75.00%)
595 bundling: 4/4 changesets (100.00%)
596 bundling: 4/4 changesets (100.00%)
596 bundling: 1/4 manifests (25.00%)
597 bundling: 1/4 manifests (25.00%)
597 bundling: 2/4 manifests (50.00%)
598 bundling: 2/4 manifests (50.00%)
598 bundling: 3/4 manifests (75.00%)
599 bundling: 3/4 manifests (75.00%)
599 bundling: 4/4 manifests (100.00%)
600 bundling: 4/4 manifests (100.00%)
600 bundling: D 1/3 files (33.33%)
601 bundling: D 1/3 files (33.33%)
601 bundling: E 2/3 files (66.67%)
602 bundling: E 2/3 files (66.67%)
602 bundling: H 3/3 files (100.00%)
603 bundling: H 3/3 files (100.00%)
603 end of bundle
604 end of bundle
604
605
605 $ cat ../rev.hg2
606 $ cat ../rev.hg2
606 HG20\x00\x00\x00\x12\x0bchangegroup\x00\x00\x00\x00\x00\x00\x00\x00\x06\x19HG10UN\x00\x00\x00\xa42\xafv\x86\xd4\x03\xcfE\xb5\xd9_-p\xce\xbe\xa5\x87\xac\x80j_\xdd\xd9\x89W\xc8\xa5JMCm\xfe\x1d\xa9\xd8\x7f!\xa1\xb9{\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x002\xafv\x86\xd4\x03\xcfE\xb5\xd9_-p\xce\xbe\xa5\x87\xac\x80j\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00)6e1f4c47ecb533ffd0c8e52cdc88afb6cd39e20c (esc)
607 HG20\x00\x00\x00\x12\x0bchangegroup\x00\x00\x00\x00\x00\x00\x00\x00\x06\x19HG10UN\x00\x00\x00\xa42\xafv\x86\xd4\x03\xcfE\xb5\xd9_-p\xce\xbe\xa5\x87\xac\x80j_\xdd\xd9\x89W\xc8\xa5JMCm\xfe\x1d\xa9\xd8\x7f!\xa1\xb9{\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x002\xafv\x86\xd4\x03\xcfE\xb5\xd9_-p\xce\xbe\xa5\x87\xac\x80j\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00)6e1f4c47ecb533ffd0c8e52cdc88afb6cd39e20c (esc)
607 \x00\x00\x00f\x00\x00\x00h\x00\x00\x00\x02D (esc)
608 \x00\x00\x00f\x00\x00\x00h\x00\x00\x00\x02D (esc)
608 \x00\x00\x00i\x00\x00\x00j\x00\x00\x00\x01D\x00\x00\x00\xa4\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\xcd\x01\x0b\x8c\xd9\x98\xf3\x98\x1aZ\x81\x15\xf9O\x8d\xa4\xabP`\x89\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00)4dece9c826f69490507b98c6383a3009b295837d (esc)
609 \x00\x00\x00i\x00\x00\x00j\x00\x00\x00\x01D\x00\x00\x00\xa4\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\xcd\x01\x0b\x8c\xd9\x98\xf3\x98\x1aZ\x81\x15\xf9O\x8d\xa4\xabP`\x89\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00)4dece9c826f69490507b98c6383a3009b295837d (esc)
609 \x00\x00\x00f\x00\x00\x00h\x00\x00\x00\x02E (esc)
610 \x00\x00\x00f\x00\x00\x00h\x00\x00\x00\x02E (esc)
610 \x00\x00\x00i\x00\x00\x00j\x00\x00\x00\x01E\x00\x00\x00\xa2\xee\xa17Fy\x9a\x9e\x0b\xfd\x88\xf2\x9d<.\x9d\xc98\x9fRO$\xb68|\x8c\x8c\xae7\x17\x88\x80\xf3\xfa\x95\xde\xd3\xcb\x1c\xf7\x85\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\xee\xa17Fy\x9a\x9e\x0b\xfd\x88\xf2\x9d<.\x9d\xc98\x9fRO\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00)365b93d57fdf4814e2b5911d6bacff2b12014441 (esc)
611 \x00\x00\x00i\x00\x00\x00j\x00\x00\x00\x01E\x00\x00\x00\xa2\xee\xa17Fy\x9a\x9e\x0b\xfd\x88\xf2\x9d<.\x9d\xc98\x9fRO$\xb68|\x8c\x8c\xae7\x17\x88\x80\xf3\xfa\x95\xde\xd3\xcb\x1c\xf7\x85\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\xee\xa17Fy\x9a\x9e\x0b\xfd\x88\xf2\x9d<.\x9d\xc98\x9fRO\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00)365b93d57fdf4814e2b5911d6bacff2b12014441 (esc)
611 \x00\x00\x00f\x00\x00\x00h\x00\x00\x00\x00\x00\x00\x00i\x00\x00\x00j\x00\x00\x00\x01G\x00\x00\x00\xa4\x02\xdeB\x19n\xbe\xe4.\xf2\x84\xb6x (esc)
612 \x00\x00\x00f\x00\x00\x00h\x00\x00\x00\x00\x00\x00\x00i\x00\x00\x00j\x00\x00\x00\x01G\x00\x00\x00\xa4\x02\xdeB\x19n\xbe\xe4.\xf2\x84\xb6x (esc)
612 \x87\xcd\xc9n\x8e\xaa\xb6$\xb68|\x8c\x8c\xae7\x17\x88\x80\xf3\xfa\x95\xde\xd3\xcb\x1c\xf7\x85\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\xdeB\x19n\xbe\xe4.\xf2\x84\xb6x (esc)
613 \x87\xcd\xc9n\x8e\xaa\xb6$\xb68|\x8c\x8c\xae7\x17\x88\x80\xf3\xfa\x95\xde\xd3\xcb\x1c\xf7\x85\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\xdeB\x19n\xbe\xe4.\xf2\x84\xb6x (esc)
613 \x87\xcd\xc9n\x8e\xaa\xb6\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00)8bee48edc7318541fc0013ee41b089276a8c24bf (esc)
614 \x87\xcd\xc9n\x8e\xaa\xb6\x00\x00\x00\x00\x00\x00\x00)\x00\x00\x00)8bee48edc7318541fc0013ee41b089276a8c24bf (esc)
614 \x00\x00\x00f\x00\x00\x00f\x00\x00\x00\x02H (esc)
615 \x00\x00\x00f\x00\x00\x00f\x00\x00\x00\x02H (esc)
615 \x00\x00\x00g\x00\x00\x00h\x00\x00\x00\x01H\x00\x00\x00\x00\x00\x00\x00\x8bn\x1fLG\xec\xb53\xff\xd0\xc8\xe5,\xdc\x88\xaf\xb6\xcd9\xe2\x0cf\xa5\xa0\x18\x17\xfd\xf5#\x9c'8\x02\xb5\xb7a\x8d\x05\x1c\x89\xe4\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x002\xafv\x86\xd4\x03\xcfE\xb5\xd9_-p\xce\xbe\xa5\x87\xac\x80j\x00\x00\x00\x81\x00\x00\x00\x81\x00\x00\x00+D\x00c3f1ca2924c16a19b0656a84900e504e5b0aec2d (esc)
616 \x00\x00\x00g\x00\x00\x00h\x00\x00\x00\x01H\x00\x00\x00\x00\x00\x00\x00\x8bn\x1fLG\xec\xb53\xff\xd0\xc8\xe5,\xdc\x88\xaf\xb6\xcd9\xe2\x0cf\xa5\xa0\x18\x17\xfd\xf5#\x9c'8\x02\xb5\xb7a\x8d\x05\x1c\x89\xe4\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x002\xafv\x86\xd4\x03\xcfE\xb5\xd9_-p\xce\xbe\xa5\x87\xac\x80j\x00\x00\x00\x81\x00\x00\x00\x81\x00\x00\x00+D\x00c3f1ca2924c16a19b0656a84900e504e5b0aec2d (esc)
616 \x00\x00\x00\x8bM\xec\xe9\xc8&\xf6\x94\x90P{\x98\xc68:0 \xb2\x95\x83}\x00}\x8c\x9d\x88\x84\x13%\xf5\xc6\xb0cq\xb3[N\x8a+\x1a\x83\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\x00\x00\x00+\x00\x00\x00\xac\x00\x00\x00+E\x009c6fd0350a6c0d0c49d4a9c5017cf07043f54e58 (esc)
617 \x00\x00\x00\x8bM\xec\xe9\xc8&\xf6\x94\x90P{\x98\xc68:0 \xb2\x95\x83}\x00}\x8c\x9d\x88\x84\x13%\xf5\xc6\xb0cq\xb3[N\x8a+\x1a\x83\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\x00\x00\x00+\x00\x00\x00\xac\x00\x00\x00+E\x009c6fd0350a6c0d0c49d4a9c5017cf07043f54e58 (esc)
  \x00\x00\x00\x8b6[\x93\xd5\x7f\xdfH\x14\xe2\xb5\x91\x1dk\xac\xff+\x12\x01DA(\xa5\x84\xc6^\xf1!\xf8\x9e\xb6j\xb7\xd0\xbc\x15=\x80\x99\xe7\xceM\xec\xe9\xc8&\xf6\x94\x90P{\x98\xc68:0 \xb2\x95\x83}\xee\xa17Fy\x9a\x9e\x0b\xfd\x88\xf2\x9d<.\x9d\xc98\x9fRO\x00\x00\x00V\x00\x00\x00V\x00\x00\x00+F\x0022bfcfd62a21a3287edbd4d656218d0f525ed76a (esc)
  \x00\x00\x00\x97\x8b\xeeH\xed\xc71\x85A\xfc\x00\x13\xeeA\xb0\x89'j\x8c$\xbf(\xa5\x84\xc6^\xf1!\xf8\x9e\xb6j\xb7\xd0\xbc\x15=\x80\x99\xe7\xce\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\xdeB\x19n\xbe\xe4.\xf2\x84\xb6x (esc)
  \x87\xcd\xc9n\x8e\xaa\xb6\x00\x00\x00+\x00\x00\x00V\x00\x00\x00\x00\x00\x00\x00\x81\x00\x00\x00\x81\x00\x00\x00+H\x008500189e74a9e0475e822093bc7db0d631aeb0b4 (esc)
  \x00\x00\x00\x00\x00\x00\x00\x05D\x00\x00\x00b\xc3\xf1\xca)$\xc1j\x19\xb0ej\x84\x90\x0ePN[ (esc)
  \xec-\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x002\xafv\x86\xd4\x03\xcfE\xb5\xd9_-p\xce\xbe\xa5\x87\xac\x80j\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02D (esc)
  \x00\x00\x00\x00\x00\x00\x00\x05E\x00\x00\x00b\x9co\xd05 (esc)
  l\r (no-eol) (esc)
  \x0cI\xd4\xa9\xc5\x01|\xf0pC\xf5NX\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x95 \xee\xa7\x81\xbc\xca\x16\xc1\xe1Z\xcc\x0b\xa1C5\xa0\xe8\xe5\xba\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02E (esc)
  \x00\x00\x00\x00\x00\x00\x00\x05H\x00\x00\x00b\x85\x00\x18\x9et\xa9\xe0G^\x82 \x93\xbc}\xb0\xd61\xae\xb0\xb4\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\xdeB\x19n\xbe\xe4.\xf2\x84\xb6x (esc)
  \x87\xcd\xc9n\x8e\xaa\xb6\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02H (esc)
  \x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00 (no-eol) (esc)
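The escaped dump above is raw bundle framing. Per the format description at the top of bundle2.py, all integers in the stream are unsigned and big endian, so the recurring 4-byte runs such as `\x00\x00\x00\x8b` at the start of a chunk decode as plain lengths. A minimal sketch of decoding one such prefix (exactly what the length covers beyond the prefix is not shown here):

```python
import struct

# First four bytes of the dump above: a big-endian unsigned 32-bit length.
prefix = b'\x00\x00\x00\x8b'
(length,) = struct.unpack('>I', prefix)
print(length)  # 139
```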

  $ hg unbundle2 ../rev-replay.hg2 < ../rev.hg2
  adding changesets
  adding manifests
  adding file changes
  added 0 changesets with 0 changes to 3 files
  0 unread bytes
  addchangegroup return: 1

  $ cat ../rev-replay.hg2
  HG20\x00\x00\x00/\x11reply:changegroup\x00\x00\x00\x00\x00\x02\x0b\x01\x06\x01in-reply-to0return1\x00\x00\x00\x00\x00\x00 (no-eol) (esc)
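The reply bundle printed by `cat` is small enough to decode by hand. The sketch below walks it with `struct`, under the assumption (consistent with these bytes, but not guaranteed for every revision of the format) that the stream-parameter blob carries a 16-bit length prefix, each part header a 16-bit length prefix, and each payload chunk a 32-bit length prefix, all big endian, with an empty header marking end of stream; `parse_parts` is a hypothetical helper name, not Mercurial API:

```python
import struct

# Bytes printed by `cat ../rev-replay.hg2` above, with escapes expanded.
data = (b'HG20\x00\x00\x00/\x11reply:changegroup'
        b'\x00\x00\x00\x00\x00\x02\x0b\x01\x06\x01in-reply-to0return1'
        b'\x00\x00\x00\x00\x00\x00')

def parse_parts(data):
    """Return the part types of a bundle2 stream (field widths assumed)."""
    assert data[:4] == b'HG20', 'unexpected magic string'
    pos = 4
    (paramssize,) = struct.unpack_from('>H', data, pos)  # stream params
    pos += 2 + paramssize                                # (empty here)
    parts = []
    while True:
        (headersize,) = struct.unpack_from('>H', data, pos)
        pos += 2
        if headersize == 0:          # empty header: end-of-stream marker
            break
        header = data[pos:pos + headersize]
        pos += headersize
        typelen = header[0]
        parts.append(header[1:1 + typelen].decode('ascii'))
        while True:                  # skip length-prefixed payload chunks
            (chunksize,) = struct.unpack_from('>I', data, pos)
            pos += 4 + chunksize
            if chunksize == 0:       # empty chunk ends the payload
                break
    return parts

print(parse_parts(data))  # ['reply:changegroup']
```

Under these assumptions the header's two advisory parameters decode to `in-reply-to=0` and `return=1`, matching the `addchangegroup return: 1` line above.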

Real world exchange
=====================


clone --pull

  $ cd ..
  $ hg clone main other --pull --rev 9520eea781bc
  adding changesets
  adding manifests
  adding file changes
  added 2 changesets with 2 changes to 2 files
  updating to branch default
  2 files updated, 0 files merged, 0 files removed, 0 files unresolved
  $ hg -R other log -G
  @  changeset:   1:9520eea781bc
  |  tag:         tip
  |  user:        Nicolas Dumazet <nicdumz.commits@gmail.com>
  |  date:        Sat Apr 30 15:24:48 2011 +0200
  |  summary:     E
  |
  o  changeset:   0:cd010b8cd998
     user:        Nicolas Dumazet <nicdumz.commits@gmail.com>
     date:        Sat Apr 30 15:24:48 2011 +0200
     summary:     A


pull

  $ hg -R other pull
  pulling from $TESTTMP/main (glob)
  searching for changes
  adding changesets
  adding manifests
  adding file changes
  added 7 changesets with 6 changes to 6 files (+3 heads)
  (run 'hg heads' to see heads, 'hg merge' to merge)