obsolete: experimental flag to get debug about obsmarkers exchange
Pierre-Yves David, r24733:c00e4338 default
# bundle2.py - generic container format to transmit arbitrary data.
#
# Copyright 2013 Facebook, Inc.
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
"""Handling of the new bundle2 format

The goal of bundle2 is to act as an atomic container used to transmit a set of
payloads in an application agnostic way. It consists of a sequence of "parts"
that will be handed to and processed by the application layer.


General format architecture
===========================

The format is structured as follows:

- magic string
- stream level parameters
- payload parts (any number)
- end of stream marker.

The binary format
============================

All numbers are unsigned and big-endian.

stream level parameters
------------------------

The binary format is as follows:

:params size: int32

  The total number of bytes used by the parameters.

:params value: arbitrary number of bytes

  A blob of `params size` containing the serialized version of all stream
  level parameters.

  The blob contains a space separated list of parameters. Parameters with a
  value are stored in the form `<name>=<value>`. Both name and value are
  urlquoted.

  Empty names are forbidden.

  A name MUST start with a letter. If this first letter is lower case, the
  parameter is advisory and can be safely ignored. However, when the first
  letter is capital, the parameter is mandatory and the bundling process MUST
  stop if it is unable to process it.

Stream parameters use a simple textual format for two main reasons:

- Stream level parameters should remain simple and we want to discourage any
  crazy usage.
- Textual data allows easy human inspection of a bundle2 header in case of
  trouble.

Any application level options MUST go into a bundle2 part instead.

Payload part
------------------------

The binary format is as follows:

:header size: int32

  The total number of bytes used by the part headers. When the header is empty
  (size = 0) this is interpreted as the end of stream marker.

:header:

  The header defines how to interpret the part. It contains two pieces of
  data: the part type, and the part parameters.

  The part type is used to route the part to an application level handler
  that can interpret the payload.

  Part parameters are passed to the application level handler. They are
  meant to convey information that will help the application level object
  interpret the part payload.

  The binary format of the header is as follows:

  :typesize: (one byte)

  :parttype: alphanumerical part name (restricted to [a-zA-Z0-9_:-]*)

  :partid: A 32 bit integer (unique in the bundle) that can be used to refer
           to this part.

  :parameters:

    A part's parameters may have arbitrary content; the binary structure is::

        <mandatory-count><advisory-count><param-sizes><param-data>

    :mandatory-count: 1 byte, number of mandatory parameters

    :advisory-count: 1 byte, number of advisory parameters

    :param-sizes:

        N couples of bytes, where N is the total number of parameters. Each
        couple contains (<size-of-key>, <size-of-value>) for one parameter.

    :param-data:

        A blob of bytes from which each parameter key and value can be
        retrieved using the list of size couples stored in the previous
        field.

        Mandatory parameters come first, then the advisory ones.

        Each parameter's key MUST be unique within the part.

:payload:

    The payload is a series of `<chunksize><chunkdata>`.

    `chunksize` is an int32, `chunkdata` are plain bytes (as many as
    `chunksize` says). The payload part is concluded by a zero size chunk.

    The current implementation always produces either zero or one chunk.
    This is an implementation limitation that will ultimately be lifted.

    `chunksize` can be negative to trigger special case processing. No such
    processing is in place yet.

Bundle processing
============================

Each part is processed in order using a "part handler". Handlers are
registered for a certain part type.

The matching of a part to its handler is case insensitive. The case of the
part type is used to know if a part is mandatory or advisory. If the part type
contains any uppercase char it is considered mandatory. When no handler is
known for a mandatory part, the process is aborted and an exception is raised.
If the part is advisory and no handler is known, the part is ignored. When the
process is aborted, the full bundle is still read from the stream to keep the
channel usable. But none of the parts read after an abort are processed. In
the future, dropping the stream may become an option for channels we do not
care to preserve.
"""
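The stream-level parameter encoding described above (a space separated, urlquoted list of `<name>` or `<name>=<value>` entries, with the case of the first letter marking a parameter as mandatory or advisory) can be sketched as follows. This is an illustrative Python 3 re-implementation under stated assumptions, not the module's own Python 2 code; the function name is hypothetical:

```python
import urllib.parse

def decode_stream_params(blob):
    """Parse a bundle2 stream-level parameter blob (illustrative sketch).

    The blob is a space separated list of urlquoted parameters, either
    bare names or <name>=<value> pairs. Names must start with a letter
    (the real module checks against string.letters; isalpha() is used
    here as an approximation).
    """
    params = {}
    if not blob:
        return params
    for entry in blob.split(' '):
        if '=' in entry:
            name, value = entry.split('=', 1)
            value = urllib.parse.unquote(value)
        else:
            name, value = entry, None
        name = urllib.parse.unquote(name)
        if not name:
            raise ValueError('empty parameter name')
        if not name[0].isalpha():
            raise ValueError('non letter first character: %r' % name)
        params[name] = value
    return params
```

Whether each parsed parameter is mandatory or advisory is then decided by whether `name[0]` is upper case, per the rules above.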

import errno
import sys
import util
import struct
import urllib
import string
import obsolete
import pushkey
import url
import re

import changegroup, error
from i18n import _

_pack = struct.pack
_unpack = struct.unpack

_fstreamparamsize = '>i'
_fpartheadersize = '>i'
_fparttypesize = '>B'
_fpartid = '>I'
_fpayloadsize = '>i'
_fpartparamcount = '>BB'

preferedchunksize = 4096

_parttypeforbidden = re.compile('[^a-zA-Z0-9_:-]')

def validateparttype(parttype):
    """raise ValueError if a parttype contains an invalid character"""
    if _parttypeforbidden.search(parttype):
        raise ValueError(parttype)

def _makefpartparamsizes(nbparams):
    """return a struct format to read part parameter sizes

    The number of parameters is variable so we need to build that format
    dynamically.
    """
    return '>' + ('BB' * nbparams)
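As a quick illustration of the dynamic format this helper builds, here is a standalone sketch (the name `make_param_sizes_format` is hypothetical; the real helper is `_makefpartparamsizes` above):

```python
import struct

def make_param_sizes_format(nbparams):
    # One (key-size, value-size) byte couple per parameter, big-endian,
    # mirroring the '>' + 'BB' * nbparams construction above.
    return '>' + ('BB' * nbparams)

# Two parameters -> four unsigned bytes to read.
fmt = make_param_sizes_format(2)
sizes = struct.unpack(fmt, bytes([3, 5, 4, 0]))
# Regroup the flat tuple into (key size, value size) couples.
pairs = list(zip(sizes[::2], sizes[1::2]))
```

The couples then index into the `param-data` blob described in the module docstring.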

parthandlermapping = {}

def parthandler(parttype, params=()):
    """decorator that registers a function as a bundle2 part handler

    eg::

        @parthandler('myparttype', ('mandatory', 'param', 'handled'))
        def myparttypehandler(...):
            '''process a part of type "my part".'''
            ...
    """
    validateparttype(parttype)
    def _decorator(func):
        lparttype = parttype.lower() # enforce lower case matching.
        assert lparttype not in parthandlermapping
        parthandlermapping[lparttype] = func
        func.params = frozenset(params)
        return func
    return _decorator

class unbundlerecords(object):
    """keep records of what happens during an unbundle

    New records are added using `records.add('cat', obj)`, where 'cat' is a
    category of record and obj is an arbitrary object.

    `records['cat']` will return all entries of this category 'cat'.

    Iterating on the object itself will yield `('category', obj)` tuples
    for all entries.

    All iterations happen in chronological order.
    """

    def __init__(self):
        self._categories = {}
        self._sequences = []
        self._replies = {}

    def add(self, category, entry, inreplyto=None):
        """add a new record of a given category.

        The entry can then be retrieved in the list returned by
        self['category']."""
        self._categories.setdefault(category, []).append(entry)
        self._sequences.append((category, entry))
        if inreplyto is not None:
            self.getreplies(inreplyto).add(category, entry)

    def getreplies(self, partid):
        """get the records that are replies to a specific part"""
        return self._replies.setdefault(partid, unbundlerecords())

    def __getitem__(self, cat):
        return tuple(self._categories.get(cat, ()))

    def __iter__(self):
        return iter(self._sequences)

    def __len__(self):
        return len(self._sequences)

    def __nonzero__(self):
        return bool(self._sequences)
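The record-keeping behaviour described in the class docstring can be illustrated with a minimal standalone re-implementation (a Python 3 sketch; `MiniRecords` is a hypothetical name and omits the in-reply-to tracking):

```python
class MiniRecords:
    """Minimal sketch of unbundlerecords: per-category storage plus a
    chronological sequence of (category, entry) tuples."""

    def __init__(self):
        self._categories = {}
        self._sequences = []

    def add(self, category, entry):
        # Store under the category and remember global insertion order.
        self._categories.setdefault(category, []).append(entry)
        self._sequences.append((category, entry))

    def __getitem__(self, cat):
        return tuple(self._categories.get(cat, ()))

    def __iter__(self):
        return iter(self._sequences)

records = MiniRecords()
records.add('changegroup', 'result-1')
records.add('pushkey', 'ok')
records.add('changegroup', 'result-2')
```

Indexing by category collects only that category's entries, while iteration replays every record in chronological order, matching the docstring's contract.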

class bundleoperation(object):
    """an object that represents a single bundling process

    Its purpose is to carry unbundle-related objects and states.

    A new object should be created at the beginning of each bundle processing.
    The object is to be returned by the processing function.

    The object has very little content now; it will ultimately contain:
    * an access to the repo the bundle is applied to,
    * a ui object,
    * a way to retrieve a transaction to add changes to the repo,
    * a way to record the result of processing each part,
    * a way to construct a bundle response when applicable.
    """

    def __init__(self, repo, transactiongetter):
        self.repo = repo
        self.ui = repo.ui
        self.records = unbundlerecords()
        self.gettransaction = transactiongetter
        self.reply = None

class TransactionUnavailable(RuntimeError):
    pass

def _notransaction():
    """default method to get a transaction while processing a bundle

    Raise an exception to highlight the fact that no transaction was expected
    to be created"""
    raise TransactionUnavailable()

def processbundle(repo, unbundler, transactiongetter=None):
    """This function processes a bundle, applying effects to/from a repo

    It iterates over each part then searches for and uses the proper handling
    code to process the part. Parts are processed in order.

    This is a very early version of this function that will be strongly
    reworked before final usage.

    An unknown mandatory part will abort the process.
    """
    if transactiongetter is None:
        transactiongetter = _notransaction
    op = bundleoperation(repo, transactiongetter)
    # todo:
    # - replace this with an init function soon.
    # - exception catching
    unbundler.params
    iterparts = unbundler.iterparts()
    part = None
    try:
        for part in iterparts:
            _processpart(op, part)
    except Exception, exc:
        for part in iterparts:
            # consume the bundle content
            part.seek(0, 2)
        # Small hack to let caller code distinguish exceptions from bundle2
        # processing from processing the old format. This is mostly needed
        # to handle different return codes to unbundle according to the type
        # of bundle. We should probably clean up or drop this return code
        # craziness in a future version.
        exc.duringunbundle2 = True
        raise
    return op

def _processpart(op, part):
    """process a single part from a bundle

    The part is guaranteed to have been fully consumed when the function exits
    (even if an exception is raised)."""
    try:
        try:
            handler = parthandlermapping.get(part.type)
            if handler is None:
                raise error.UnsupportedPartError(parttype=part.type)
            op.ui.debug('found a handler for part %r\n' % part.type)
            unknownparams = part.mandatorykeys - handler.params
            if unknownparams:
                unknownparams = list(unknownparams)
                unknownparams.sort()
                raise error.UnsupportedPartError(parttype=part.type,
                                                 params=unknownparams)
        except error.UnsupportedPartError, exc:
            if part.mandatory: # mandatory parts
                raise
            op.ui.debug('ignoring unsupported advisory part %s\n' % exc)
            return # skip to part processing

        # handler is called outside the above try block so that we don't
        # risk catching KeyErrors from anything other than the
        # parthandlermapping lookup (any KeyError raised by handler()
        # itself represents a defect of a different variety).
        output = None
        if op.reply is not None:
            op.ui.pushbuffer(error=True)
            output = ''
        try:
            handler(op, part)
        finally:
            if output is not None:
                output = op.ui.popbuffer()
            if output:
                outpart = op.reply.newpart('output', data=output,
                                           mandatory=False)
                outpart.addparam('in-reply-to', str(part.id), mandatory=False)
    finally:
        # consume the part content to not corrupt the stream.
        part.seek(0, 2)


def decodecaps(blob):
    """decode a bundle2 caps bytes blob into a dictionary

    The blob is a list of capabilities (one per line).
    Capabilities may have values using a line of the form::

        capability=value1,value2,value3

    The values are always a list."""
    caps = {}
    for line in blob.splitlines():
        if not line:
            continue
        if '=' not in line:
            key, vals = line, ()
        else:
            key, vals = line.split('=', 1)
            vals = vals.split(',')
        key = urllib.unquote(key)
        vals = [urllib.unquote(v) for v in vals]
        caps[key] = vals
    return caps

def encodecaps(caps):
    """encode a bundle2 caps dictionary into a bytes blob"""
    chunks = []
    for ca in sorted(caps):
        vals = caps[ca]
        ca = urllib.quote(ca)
        vals = [urllib.quote(v) for v in vals]
        if vals:
            ca = "%s=%s" % (ca, ','.join(vals))
        chunks.append(ca)
    return '\n'.join(chunks)
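The caps blob format handled by the two functions above (one capability per line, optional comma separated values, everything urlquoted) round-trips as sketched below. This is a hedged Python 3 re-implementation for illustration; the function names are hypothetical and it uses lists where the module's decoder may use tuples:

```python
import urllib.parse

def encode_caps(caps):
    # One line per capability, sorted; values joined with commas.
    chunks = []
    for ca in sorted(caps):
        vals = [urllib.parse.quote(v) for v in caps[ca]]
        ca = urllib.parse.quote(ca)
        if vals:
            ca = '%s=%s' % (ca, ','.join(vals))
        chunks.append(ca)
    return '\n'.join(chunks)

def decode_caps(blob):
    caps = {}
    for line in blob.splitlines():
        if not line:
            continue
        if '=' not in line:
            key, vals = line, []
        else:
            key, vals = line.split('=', 1)
            vals = vals.split(',')
        caps[urllib.parse.unquote(key)] = [urllib.parse.unquote(v)
                                           for v in vals]
    return caps
```
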

class bundle20(object):
    """represent an outgoing bundle2 container

    Use the `addparam` method to add stream level parameters and `newpart` to
    populate it. Then call `getchunks` to retrieve all the binary chunks of
    data that compose the bundle2 container."""

    _magicstring = 'HG20'

    def __init__(self, ui, capabilities=()):
        self.ui = ui
        self._params = []
        self._parts = []
        self.capabilities = dict(capabilities)

    @property
    def nbparts(self):
        """total number of parts added to the bundler"""
        return len(self._parts)

    # methods used to define the bundle2 content
    def addparam(self, name, value=None):
        """add a stream level parameter"""
        if not name:
            raise ValueError('empty parameter name')
        if name[0] not in string.letters:
            raise ValueError('non letter first character: %r' % name)
        self._params.append((name, value))

    def addpart(self, part):
        """add a new part to the bundle2 container

        Parts contain the actual application payload."""
        assert part.id is None
        part.id = len(self._parts) # very cheap counter
        self._parts.append(part)

    def newpart(self, typeid, *args, **kwargs):
        """create a new part and add it to the container

        The part is directly added to the container. For now, this means
        that any failure to properly initialize the part after calling
        ``newpart`` should result in a failure of the whole bundling process.

        You can still fall back to manually creating and adding a part if you
        need better control."""
        part = bundlepart(typeid, *args, **kwargs)
        self.addpart(part)
        return part

    # methods used to generate the bundle2 stream
    def getchunks(self):
        self.ui.debug('start emission of %s stream\n' % self._magicstring)
        yield self._magicstring
        param = self._paramchunk()
        self.ui.debug('bundle parameter: %s\n' % param)
        yield _pack(_fstreamparamsize, len(param))
        if param:
            yield param

        self.ui.debug('start of parts\n')
        for part in self._parts:
            self.ui.debug('bundle part: "%s"\n' % part.type)
            for chunk in part.getchunks():
                yield chunk
        self.ui.debug('end of bundle\n')
        yield _pack(_fpartheadersize, 0)

    def _paramchunk(self):
        """return an encoded version of all stream parameters"""
        blocks = []
        for par, value in self._params:
            par = urllib.quote(par)
            if value is not None:
                value = urllib.quote(value)
                par = '%s=%s' % (par, value)
            blocks.append(par)
        return ' '.join(blocks)
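Following the stream layout produced by `getchunks`, the smallest valid HG20 stream is the magic string, an int32 zero for the stream parameter block size, and an int32 zero header size acting as the end of stream marker. A minimal standalone sketch (the helper name is hypothetical):

```python
import struct

def empty_bundle2_stream():
    # Magic string + empty stream parameter block + end of stream marker,
    # using the same '>i' big-endian int32 formats as the module.
    chunks = [b'HG20',
              struct.pack('>i', 0),  # stream parameter block size
              struct.pack('>i', 0)]  # zero-size part header = end of stream
    return b''.join(chunks)
```

A container with actual parts would interleave part header/payload chunks between the parameter block and the final zero header.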

class unpackermixin(object):
    """A mixin to extract bytes and struct data from a stream"""

    def __init__(self, fp):
        self._fp = fp
        self._seekable = (util.safehasattr(fp, 'seek') and
                          util.safehasattr(fp, 'tell'))

    def _unpack(self, format):
        """unpack this struct format from the stream"""
        data = self._readexact(struct.calcsize(format))
        return _unpack(format, data)

    def _readexact(self, size):
        """read exactly <size> bytes from the stream"""
        return changegroup.readexactly(self._fp, size)

    def seek(self, offset, whence=0):
        """move the underlying file pointer"""
        if self._seekable:
            return self._fp.seek(offset, whence)
        else:
            raise NotImplementedError(_('File pointer is not seekable'))

    def tell(self):
        """return the file offset, or None if the file is not seekable"""
        if self._seekable:
            try:
                return self._fp.tell()
            except IOError, e:
                if e.errno == errno.ESPIPE:
                    self._seekable = False
                else:
                    raise
        return None

    def close(self):
        """close the underlying file"""
        if util.safehasattr(self._fp, 'close'):
            return self._fp.close()
523
523
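The mixin above simply couples `struct.calcsize` with an exact-length read so callers can name a format instead of a byte count. A minimal standalone sketch of the same idea in modern Python 3 (the helper names here are illustrative, not Mercurial's):

```python
import struct
from io import BytesIO

def read_exact(fp, size):
    """Read exactly `size` bytes or raise, mirroring changegroup.readexactly."""
    data = fp.read(size)
    if len(data) != size:
        raise ValueError('stream ended unexpectedly')
    return data

def unpack_from_stream(fp, fmt):
    """Unpack one struct `fmt` from the stream, sizing the read automatically."""
    return struct.unpack(fmt, read_exact(fp, struct.calcsize(fmt)))

fp = BytesIO(struct.pack('>i', 42) + b'payload')
assert unpack_from_stream(fp, '>i') == (42,)
assert fp.read() == b'payload'  # the stream is left exactly past the struct
```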
def getunbundler(ui, fp, header=None):
    """return a valid unbundler object for a given header"""
    if header is None:
        header = changegroup.readexactly(fp, 4)
    magic, version = header[0:2], header[2:4]
    if magic != 'HG':
        raise util.Abort(_('not a Mercurial bundle'))
    unbundlerclass = formatmap.get(version)
    if unbundlerclass is None:
        raise util.Abort(_('unknown bundle version %s') % version)
    unbundler = unbundlerclass(ui, fp)
    ui.debug('start processing of %s stream\n' % header)
    return unbundler

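`getunbundler` sniffs a fixed 4-byte header: a 2-byte magic followed by a 2-byte version that selects an entry in `formatmap`. A small sketch of that split (function name and bytes-vs-str handling are illustrative; the Python 2 original works on str):

```python
def parse_bundle_header(header: bytes):
    """Split a 4-byte bundle2 header into (magic, version), e.g. b'HG20'."""
    magic, version = header[0:2], header[2:4]
    if magic != b'HG':
        raise ValueError('not a Mercurial bundle')
    return magic, version

assert parse_bundle_header(b'HG20') == (b'HG', b'20')
```

Keeping the version out of the magic lets the same entry point dispatch to future unbundler classes without re-reading the stream.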
class unbundle20(unpackermixin):
    """interpret a bundle2 stream

    This class is fed with a binary stream and yields parts through its
    `iterparts` method."""

    def __init__(self, ui, fp):
        self.ui = ui
        super(unbundle20, self).__init__(fp)

    @util.propertycache
    def params(self):
        """dictionary of stream level parameters"""
        self.ui.debug('reading bundle2 stream parameters\n')
        params = {}
        paramssize = self._unpack(_fstreamparamsize)[0]
        if paramssize < 0:
            raise error.BundleValueError('negative bundle param size: %i'
                                         % paramssize)
        if paramssize:
            for p in self._readexact(paramssize).split(' '):
                p = p.split('=', 1)
                p = [urllib.unquote(i) for i in p]
                if len(p) < 2:
                    p.append(None)
                self._processparam(*p)
                params[p[0]] = p[1]
        return params

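The `params` property is the inverse of `_paramchunk` above: parameters travel as a space-separated run of percent-quoted `name=value` blocks, where a bare name decodes to a value of `None`. A round-trip sketch of that encoding in Python 3 (the original uses Python 2's `urllib.quote`/`unquote`; function names here are illustrative):

```python
from urllib.parse import quote, unquote

def encode_params(params):
    """Percent-quote and space-join (name, value) pairs; None means flag-only."""
    blocks = []
    for name, value in params:
        name = quote(name, safe='')
        if value is not None:
            name = '%s=%s' % (name, quote(value, safe=''))
        blocks.append(name)
    return ' '.join(blocks)

def decode_params(blob):
    """Inverse of encode_params, matching the parsing in unbundle20.params."""
    params = {}
    for p in blob.split(' '):
        parts = [unquote(i) for i in p.split('=', 1)]
        if len(parts) < 2:
            parts.append(None)
        params[parts[0]] = parts[1]
    return params

blob = encode_params([('e|volution', None), ('comp', 'zlib x')])
assert decode_params(blob) == {'e|volution': None, 'comp': 'zlib x'}
```

Quoting before joining is what makes the bare `split(' ')`/`split('=', 1)` parse safe: neither separator can appear inside a quoted name or value.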
    def _processparam(self, name, value):
        """process a parameter, applying its effect if needed

        Parameters starting with a lower-case letter are advisory and will be
        ignored when unknown. Those starting with an upper-case letter are
        mandatory; this function will raise an error when they are unknown.

        Note: no options are currently supported. Any input will either be
        ignored or fail.
        """
        if not name:
            raise ValueError('empty parameter name')
        if name[0] not in string.letters:
            raise ValueError('non letter first character: %r' % name)
        # Some logic will be added here later to try to process the option
        # against a dict of known parameters.
        if name[0].islower():
            self.ui.debug("ignoring unknown parameter %r\n" % name)
        else:
            raise error.UnsupportedPartError(params=(name,))


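The case of the first letter is the whole advisory/mandatory protocol for stream parameters. A tiny sketch of that rule (Python 3 uses `string.ascii_letters` where the Python 2 source uses `string.letters`; the function name is illustrative):

```python
import string

def is_mandatory_param(name):
    """Bundle2 convention: upper-case first letter = mandatory, lower = advisory."""
    if not name:
        raise ValueError('empty parameter name')
    if name[0] not in string.ascii_letters:
        raise ValueError('non letter first character: %r' % name)
    return not name[0].islower()

assert is_mandatory_param('Obsmarkers') is True   # unknown -> must abort
assert is_mandatory_param('evolution') is False   # unknown -> safe to ignore
```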
    def iterparts(self):
        """yield all parts contained in the stream"""
        # make sure params have been loaded
        self.params
        self.ui.debug('start extraction of bundle2 parts\n')
        headerblock = self._readpartheader()
        while headerblock is not None:
            part = unbundlepart(self.ui, headerblock, self._fp)
            yield part
            part.seek(0, 2)
            headerblock = self._readpartheader()
        self.ui.debug('end of bundle2 stream\n')

    def _readpartheader(self):
        """read a part header size and return the bytes blob

        returns None if empty"""
        headersize = self._unpack(_fpartheadersize)[0]
        if headersize < 0:
            raise error.BundleValueError('negative part header size: %i'
                                         % headersize)
        self.ui.debug('part header size: %i\n' % headersize)
        if headersize:
            return self._readexact(headersize)
        return None

    def compressed(self):
        return False

formatmap = {'20': unbundle20}

class bundlepart(object):
    """A bundle2 part contains application level payload

    The part `type` is used to route the part to the application level
    handler.

    The part payload is contained in ``part.data``. It could be raw bytes or a
    generator of byte chunks.

    You can add parameters to the part using the ``addparam`` method.
    Parameters can be either mandatory (default) or advisory. The remote side
    should be able to safely ignore the advisory ones.

    Neither data nor parameters may be modified after generation has begun.
    """

    def __init__(self, parttype, mandatoryparams=(), advisoryparams=(),
                 data='', mandatory=True):
        validateparttype(parttype)
        self.id = None
        self.type = parttype
        self._data = data
        self._mandatoryparams = list(mandatoryparams)
        self._advisoryparams = list(advisoryparams)
        # checking for duplicated entries
        self._seenparams = set()
        for pname, __ in self._mandatoryparams + self._advisoryparams:
            if pname in self._seenparams:
                raise RuntimeError('duplicated params: %s' % pname)
            self._seenparams.add(pname)
        # status of the part's generation:
        # - None: not started,
        # - False: currently generated,
        # - True: generation done.
        self._generated = None
        self.mandatory = mandatory

    # methods used to define the part content
    def __setdata(self, data):
        if self._generated is not None:
            raise error.ReadOnlyPartError('part is being generated')
        self._data = data
    def __getdata(self):
        return self._data
    data = property(__getdata, __setdata)

    @property
    def mandatoryparams(self):
        # make it an immutable tuple to force people through ``addparam``
        return tuple(self._mandatoryparams)

    @property
    def advisoryparams(self):
        # make it an immutable tuple to force people through ``addparam``
        return tuple(self._advisoryparams)

    def addparam(self, name, value='', mandatory=True):
        if self._generated is not None:
            raise error.ReadOnlyPartError('part is being generated')
        if name in self._seenparams:
            raise ValueError('duplicated params: %s' % name)
        self._seenparams.add(name)
        params = self._advisoryparams
        if mandatory:
            params = self._mandatoryparams
        params.append((name, value))

    # methods used to generate the bundle2 stream
    def getchunks(self):
        if self._generated is not None:
            raise RuntimeError('part can only be consumed once')
        self._generated = False
        #### header
        if self.mandatory:
            parttype = self.type.upper()
        else:
            parttype = self.type.lower()
        ## parttype
        header = [_pack(_fparttypesize, len(parttype)),
                  parttype, _pack(_fpartid, self.id),
                  ]
        ## parameters
        # count
        manpar = self.mandatoryparams
        advpar = self.advisoryparams
        header.append(_pack(_fpartparamcount, len(manpar), len(advpar)))
        # size
        parsizes = []
        for key, value in manpar:
            parsizes.append(len(key))
            parsizes.append(len(value))
        for key, value in advpar:
            parsizes.append(len(key))
            parsizes.append(len(value))
        paramsizes = _pack(_makefpartparamsizes(len(parsizes) / 2), *parsizes)
        header.append(paramsizes)
        # key, value
        for key, value in manpar:
            header.append(key)
            header.append(value)
        for key, value in advpar:
            header.append(key)
            header.append(value)
        ## finalize header
        headerchunk = ''.join(header)
        yield _pack(_fpartheadersize, len(headerchunk))
        yield headerchunk
        ## payload
        try:
            for chunk in self._payloadchunks():
                yield _pack(_fpayloadsize, len(chunk))
                yield chunk
        except Exception, exc:
            # backup exception data for later
            exc_info = sys.exc_info()
            msg = 'unexpected error: %s' % exc
            interpart = bundlepart('error:abort', [('message', msg)],
                                   mandatory=False)
            interpart.id = 0
            yield _pack(_fpayloadsize, -1)
            for chunk in interpart.getchunks():
                yield chunk
            # abort current part payload
            yield _pack(_fpayloadsize, 0)
            raise exc_info[0], exc_info[1], exc_info[2]
        # end of payload
        yield _pack(_fpayloadsize, 0)
        self._generated = True

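The payload section of `getchunks` frames data as a sequence of length-prefixed chunks: each chunk is announced with its size, a size of 0 terminates the payload, and -1 (`flaginterrupt`) is reserved to splice in an interrupt part. A minimal sketch of that framing, assuming `_fpayloadsize` is a big-endian int32 as its use with negative sizes suggests (the constant is defined earlier in the file, outside this excerpt):

```python
import struct

_fpayloadsize = '>i'  # assumption: big-endian int32, matching the file's usage

def frame_payload(chunks):
    """Length-prefix each chunk and emit a zero-size terminator.

    A size of -1 is reserved as the interrupt flag, so real chunks are
    always announced with a positive size."""
    for chunk in chunks:
        yield struct.pack(_fpayloadsize, len(chunk))
        yield chunk
    yield struct.pack(_fpayloadsize, 0)

frames = b''.join(frame_payload([b'abc', b'de']))
assert frames == b'\x00\x00\x00\x03abc\x00\x00\x00\x02de\x00\x00\x00\x00'
```

This is why the error path above can yield `-1` followed by a complete `error:abort` part and then a `0`: the consumer reads sizes, never byte counts it guessed in advance.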
    def _payloadchunks(self):
        """yield chunks of the part payload

        Exists to handle the different methods to provide data to a part."""
        # we only support fixed size data now.
        # This will be improved in the future.
        if util.safehasattr(self.data, 'next'):
            buff = util.chunkbuffer(self.data)
            chunk = buff.read(preferedchunksize)
            while chunk:
                yield chunk
                chunk = buff.read(preferedchunksize)
        elif len(self.data):
            yield self.data


flaginterrupt = -1

class interrupthandler(unpackermixin):
    """read one part and process it with restricted capability

    This allows transmitting an exception raised on the producer side during
    part iteration while the consumer is reading a part.

    Parts processed in this manner only have access to a ui object."""

    def __init__(self, ui, fp):
        super(interrupthandler, self).__init__(fp)
        self.ui = ui

    def _readpartheader(self):
        """read a part header size and return the bytes blob

        returns None if empty"""
        headersize = self._unpack(_fpartheadersize)[0]
        if headersize < 0:
            raise error.BundleValueError('negative part header size: %i'
                                         % headersize)
        self.ui.debug('part header size: %i\n' % headersize)
        if headersize:
            return self._readexact(headersize)
        return None

    def __call__(self):
        self.ui.debug('bundle2 stream interruption, looking for a part.\n')
        headerblock = self._readpartheader()
        if headerblock is None:
            self.ui.debug('no part found during interruption.\n')
            return
        part = unbundlepart(self.ui, headerblock, self._fp)
        op = interruptoperation(self.ui)
        _processpart(op, part)

class interruptoperation(object):
    """A limited operation to be used by part handlers during interruption

    It only has access to a ui object.
    """

    def __init__(self, ui):
        self.ui = ui
        self.reply = None

    @property
    def repo(self):
        raise RuntimeError('no repo access from stream interruption')

    def gettransaction(self):
        raise TransactionUnavailable('no repo access from stream interruption')

class unbundlepart(unpackermixin):
    """a bundle part read from a bundle"""

    def __init__(self, ui, header, fp):
        super(unbundlepart, self).__init__(fp)
        self.ui = ui
        # unbundle state attr
        self._headerdata = header
        self._headeroffset = 0
        self._initialized = False
        self.consumed = False
        # part data
        self.id = None
        self.type = None
        self.mandatoryparams = None
        self.advisoryparams = None
        self.params = None
        self.mandatorykeys = ()
        self._payloadstream = None
        self._readheader()
        self._mandatory = None
        self._chunkindex = [] #(payload, file) position tuples for chunk starts
        self._pos = 0

    def _fromheader(self, size):
        """return the next <size> bytes from the header"""
        offset = self._headeroffset
        data = self._headerdata[offset:(offset + size)]
        self._headeroffset = offset + size
        return data

    def _unpackheader(self, format):
        """read given format from header

        This automatically computes the size of the format to read."""
        data = self._fromheader(struct.calcsize(format))
        return _unpack(format, data)

    def _initparams(self, mandatoryparams, advisoryparams):
        """internal function to setup all logic related parameters"""
        # make it read only to prevent people touching it by mistake.
        self.mandatoryparams = tuple(mandatoryparams)
        self.advisoryparams = tuple(advisoryparams)
        # user friendly UI
        self.params = dict(self.mandatoryparams)
        self.params.update(dict(self.advisoryparams))
        self.mandatorykeys = frozenset(p[0] for p in mandatoryparams)

    def _payloadchunks(self, chunknum=0):
        '''seek to specified chunk and start yielding data'''
        if len(self._chunkindex) == 0:
            assert chunknum == 0, 'Must start with chunk 0'
            self._chunkindex.append((0, super(unbundlepart, self).tell()))
        else:
            assert chunknum < len(self._chunkindex), \
                   'Unknown chunk %d' % chunknum
            super(unbundlepart, self).seek(self._chunkindex[chunknum][1])

        pos = self._chunkindex[chunknum][0]
        payloadsize = self._unpack(_fpayloadsize)[0]
        self.ui.debug('payload chunk size: %i\n' % payloadsize)
        while payloadsize:
            if payloadsize == flaginterrupt:
                # interruption detection, the handler will now read a
                # single part and process it.
                interrupthandler(self.ui, self._fp)()
            elif payloadsize < 0:
                msg = 'negative payload chunk size: %i' % payloadsize
                raise error.BundleValueError(msg)
            else:
                result = self._readexact(payloadsize)
                chunknum += 1
                pos += payloadsize
                if chunknum == len(self._chunkindex):
                    self._chunkindex.append((pos,
                                             super(unbundlepart, self).tell()))
                yield result
            payloadsize = self._unpack(_fpayloadsize)[0]
            self.ui.debug('payload chunk size: %i\n' % payloadsize)

    def _findchunk(self, pos):
        '''for a given payload position, return a chunk number and offset'''
        for chunk, (ppos, fpos) in enumerate(self._chunkindex):
            if ppos == pos:
                return chunk, 0
            elif ppos > pos:
                return chunk - 1, pos - self._chunkindex[chunk - 1][0]
        raise ValueError('Unknown chunk')

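`_findchunk` maps a payload offset back to a chunk via the `(payload offset, file offset)` index that `_payloadchunks` builds lazily: a linear scan finds the first index entry at or past the target and returns the chunk plus the remaining offset inside it. A standalone sketch of that lookup (the function name and sample index values are illustrative):

```python
def find_chunk(chunkindex, pos):
    """Map a payload offset to (chunk number, offset inside that chunk).

    `chunkindex` holds (payload offset, file offset) pairs for chunk
    starts, like unbundlepart._chunkindex."""
    for chunk, (ppos, fpos) in enumerate(chunkindex):
        if ppos == pos:
            return chunk, 0
        elif ppos > pos:
            return chunk - 1, pos - chunkindex[chunk - 1][0]
    raise ValueError('Unknown chunk')

index = [(0, 100), (4096, 4204), (8192, 8308)]
assert find_chunk(index, 0) == (0, 0)
assert find_chunk(index, 5000) == (1, 904)
assert find_chunk(index, 8192) == (2, 0)
```

Note the same quirk as the original: a position strictly past the last recorded chunk start raises, which is why `seek` below forces a full `read()` (populating the index) before resolving an out-of-range target.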
    def _readheader(self):
        """read the header and setup the object"""
        typesize = self._unpackheader(_fparttypesize)[0]
        self.type = self._fromheader(typesize)
        self.ui.debug('part type: "%s"\n' % self.type)
        self.id = self._unpackheader(_fpartid)[0]
        self.ui.debug('part id: "%s"\n' % self.id)
        # extract mandatory bit from type
        self.mandatory = (self.type != self.type.lower())
        self.type = self.type.lower()
        ## reading parameters
        # param count
        mancount, advcount = self._unpackheader(_fpartparamcount)
        self.ui.debug('part parameters: %i\n' % (mancount + advcount))
        # param size
        fparamsizes = _makefpartparamsizes(mancount + advcount)
        paramsizes = self._unpackheader(fparamsizes)
        # make it a list of couples again
        paramsizes = zip(paramsizes[::2], paramsizes[1::2])
        # split mandatory from advisory
        mansizes = paramsizes[:mancount]
        advsizes = paramsizes[mancount:]
        # retrieve param value
        manparams = []
        for key, value in mansizes:
            manparams.append((self._fromheader(key), self._fromheader(value)))
        advparams = []
        for key, value in advsizes:
            advparams.append((self._fromheader(key), self._fromheader(value)))
        self._initparams(manparams, advparams)
        ## part payload
        self._payloadstream = util.chunkbuffer(self._payloadchunks())
        # we read the data, tell it
        self._initialized = True

    def read(self, size=None):
        """read payload data"""
        if not self._initialized:
            self._readheader()
        if size is None:
            data = self._payloadstream.read()
        else:
            data = self._payloadstream.read(size)
        if size is None or len(data) < size:
            self.consumed = True
        self._pos += len(data)
        return data

    def tell(self):
        return self._pos

    def seek(self, offset, whence=0):
        if whence == 0:
            newpos = offset
        elif whence == 1:
            newpos = self._pos + offset
        elif whence == 2:
            if not self.consumed:
                self.read()
            newpos = self._chunkindex[-1][0] - offset
        else:
            raise ValueError('Unknown whence value: %r' % (whence,))

        if newpos > self._chunkindex[-1][0] and not self.consumed:
            self.read()
        if not 0 <= newpos <= self._chunkindex[-1][0]:
            raise ValueError('Offset out of range')

        if self._pos != newpos:
            chunk, internaloffset = self._findchunk(newpos)
            self._payloadstream = util.chunkbuffer(self._payloadchunks(chunk))
            adjust = self.read(internaloffset)
            if len(adjust) != internaloffset:
                raise util.Abort(_('Seek failed\n'))
            self._pos = newpos

capabilities = {'HG20': (),
                'listkeys': (),
                'pushkey': (),
                'digests': tuple(sorted(util.DIGESTS.keys())),
                'remote-changegroup': ('http', 'https'),
               }

def getrepocaps(repo, allowpushback=False):
    """return the bundle2 capabilities for a given repo

    Exists to allow extensions (like evolution) to mutate the capabilities.
    """
    caps = capabilities.copy()
    caps['changegroup'] = tuple(sorted(changegroup.packermap.keys()))
    if obsolete.isenabled(repo, obsolete.exchangeopt):
        supportedformat = tuple('V%i' % v for v in obsolete.formats)
        caps['obsmarkers'] = supportedformat
    if allowpushback:
        caps['pushback'] = ()
    return caps

def bundle2caps(remote):
    """return the bundle capabilities of a peer as a dict"""
    raw = remote.capable('bundle2')
    if not raw and raw != '':
        return {}
    capsblob = urllib.unquote(remote.capable('bundle2'))
    return decodecaps(capsblob)

def obsmarkersversion(caps):
    """extract the list of supported obsmarkers versions from a bundle2caps dict
    """
    obscaps = caps.get('obsmarkers', ())
    return [int(c[1:]) for c in obscaps if c.startswith('V')]

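As a standalone illustration, the obsmarkers capability round-trip can be sketched in plain Python. This is a hypothetical sketch, not Mercurial's module layout: `formats` stands in for `obsolete.formats`, and the `'V%i'` strings mirror the encoding produced by `getrepocaps` and decoded by `obsmarkersversion` above.

```python
# Hypothetical standalone sketch of the obsmarkers capability round-trip.
# 'formats' stands in for obsolete.formats.
formats = (0, 1)  # assumed supported obsmarker format versions

def advertisecaps():
    """server side: encode supported versions as capability strings"""
    return tuple('V%i' % v for v in formats)

def obsmarkersversion(caps):
    """client side: decode capability strings back to integer versions"""
    obscaps = caps.get('obsmarkers', ())
    return [int(c[1:]) for c in obscaps if c.startswith('V')]

print(obsmarkersversion({'obsmarkers': advertisecaps()}))  # -> [0, 1]
```

Unknown capability entries (anything not starting with `'V'`) are silently skipped, which keeps the decoder forward-compatible with formats a newer peer might advertise.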
@parthandler('changegroup', ('version',))
def handlechangegroup(op, inpart):
    """apply a changegroup part on the repo

    This is a very early implementation that will see massive rework before
    being inflicted on any end-user.
    """
    # Make sure we trigger a transaction creation
    #
    # The addchangegroup function will get a transaction object by itself, but
    # we need to make sure we trigger the creation of a transaction object used
    # for the whole processing scope.
    op.gettransaction()
    unpackerversion = inpart.params.get('version', '01')
    # We should raise an appropriate exception here
    unpacker = changegroup.packermap[unpackerversion][1]
    cg = unpacker(inpart, 'UN')
    # the source and url passed here are overwritten by the one contained in
    # the transaction.hookargs argument. So 'bundle2' is a placeholder
    ret = changegroup.addchangegroup(op.repo, cg, 'bundle2', 'bundle2')
    op.records.add('changegroup', {'return': ret})
    if op.reply is not None:
        # This is definitely not the final form of this
        # return. But one needs to start somewhere.
        part = op.reply.newpart('reply:changegroup', mandatory=False)
        part.addparam('in-reply-to', str(inpart.id), mandatory=False)
        part.addparam('return', '%i' % ret, mandatory=False)
    assert not inpart.read()

_remotechangegroupparams = tuple(['url', 'size', 'digests'] +
    ['digest:%s' % k for k in util.DIGESTS.keys()])
@parthandler('remote-changegroup', _remotechangegroupparams)
def handleremotechangegroup(op, inpart):
    """apply a bundle10 on the repo, given an url and validation information

    All the information about the remote bundle to import is given as
    parameters. The parameters include:
      - url: the url to the bundle10.
      - size: the bundle10 file size. It is used to validate that what was
        retrieved by the client matches the server's knowledge about the
        bundle.
      - digests: a space separated list of the digest types provided as
        parameters.
      - digest:<digest-type>: the hexadecimal representation of the digest
        with that name. Like the size, it is used to validate that what was
        retrieved by the client matches what the server knows about the
        bundle.

    When multiple digest types are given, all of them are checked.
    """
    try:
        raw_url = inpart.params['url']
    except KeyError:
        raise util.Abort(_('remote-changegroup: missing "%s" param') % 'url')
    parsed_url = util.url(raw_url)
    if parsed_url.scheme not in capabilities['remote-changegroup']:
        raise util.Abort(_('remote-changegroup does not support %s urls') %
                         parsed_url.scheme)

    try:
        size = int(inpart.params['size'])
    except ValueError:
        raise util.Abort(_('remote-changegroup: invalid value for param "%s"')
                         % 'size')
    except KeyError:
        raise util.Abort(_('remote-changegroup: missing "%s" param') % 'size')

    digests = {}
    for typ in inpart.params.get('digests', '').split():
        param = 'digest:%s' % typ
        try:
            value = inpart.params[param]
        except KeyError:
            raise util.Abort(_('remote-changegroup: missing "%s" param') %
                             param)
        digests[typ] = value

    real_part = util.digestchecker(url.open(op.ui, raw_url), size, digests)

    # Make sure we trigger a transaction creation
    #
    # The addchangegroup function will get a transaction object by itself, but
    # we need to make sure we trigger the creation of a transaction object used
    # for the whole processing scope.
    op.gettransaction()
    import exchange
    cg = exchange.readbundle(op.repo.ui, real_part, raw_url)
    if not isinstance(cg, changegroup.cg1unpacker):
        raise util.Abort(_('%s: not a bundle version 1.0') %
                         util.hidepassword(raw_url))
    ret = changegroup.addchangegroup(op.repo, cg, 'bundle2', 'bundle2')
    op.records.add('changegroup', {'return': ret})
    if op.reply is not None:
        # This is definitely not the final form of this
        # return. But one needs to start somewhere.
        part = op.reply.newpart('reply:changegroup')
        part.addparam('in-reply-to', str(inpart.id), mandatory=False)
        part.addparam('return', '%i' % ret, mandatory=False)
    try:
        real_part.validate()
    except util.Abort, e:
        raise util.Abort(_('bundle at %s is corrupted:\n%s') %
                         (util.hidepassword(raw_url), str(e)))
    assert not inpart.read()

@parthandler('reply:changegroup', ('return', 'in-reply-to'))
def handlereplychangegroup(op, inpart):
    ret = int(inpart.params['return'])
    replyto = int(inpart.params['in-reply-to'])
    op.records.add('changegroup', {'return': ret}, replyto)

@parthandler('check:heads')
def handlecheckheads(op, inpart):
    """check that the heads of the repo did not change

    This is used to detect a push race when using unbundle.
    This replaces the "heads" argument of unbundle."""
    h = inpart.read(20)
    heads = []
    while len(h) == 20:
        heads.append(h)
        h = inpart.read(20)
    assert not h
    if heads != op.repo.heads():
        raise error.PushRaced('repository changed while pushing - '
                              'please try again')

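The payload convention above (a bare concatenation of 20-byte binary node ids, with no count or framing) can be exercised in isolation. This is a sketch against an in-memory stream, not Mercurial's part object:

```python
# Sketch of the check:heads payload format: 20-byte node ids concatenated
# with no framing, read until the stream is exhausted.
import io

def readheads(stream):
    heads = []
    h = stream.read(20)
    while len(h) == 20:
        heads.append(h)
        h = stream.read(20)
    # any leftover bytes mean the payload was not a multiple of 20
    assert not h, 'malformed check:heads payload'
    return heads

payload = io.BytesIO(b'\x11' * 20 + b'\x22' * 20)
print(len(readheads(payload)))  # -> 2
```

Because the part carries no length prefix, a truncated or over-long payload is only detectable by the final short read, which is what the `assert not h` guard catches.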
@parthandler('output')
def handleoutput(op, inpart):
    """forward output captured on the server to the client"""
    for line in inpart.read().splitlines():
        op.ui.write(('remote: %s\n' % line))

@parthandler('replycaps')
def handlereplycaps(op, inpart):
    """Notify that a reply bundle should be created

    The payload contains the capabilities information for the reply"""
    caps = decodecaps(inpart.read())
    if op.reply is None:
        op.reply = bundle20(op.ui, caps)

@parthandler('error:abort', ('message', 'hint'))
def handleerrorabort(op, inpart):
    """Used to transmit abort error over the wire"""
    raise util.Abort(inpart.params['message'], hint=inpart.params.get('hint'))

@parthandler('error:unsupportedcontent', ('parttype', 'params'))
def handleerrorunsupportedcontent(op, inpart):
    """Used to transmit unknown content error over the wire"""
    kwargs = {}
    parttype = inpart.params.get('parttype')
    if parttype is not None:
        kwargs['parttype'] = parttype
    params = inpart.params.get('params')
    if params is not None:
        kwargs['params'] = params.split('\0')

    raise error.UnsupportedPartError(**kwargs)

@parthandler('error:pushraced', ('message',))
def handleerrorpushraced(op, inpart):
    """Used to transmit push race error over the wire"""
    raise error.ResponseError(_('push failed:'), inpart.params['message'])

@parthandler('listkeys', ('namespace',))
def handlelistkeys(op, inpart):
    """retrieve pushkey namespace content stored in a bundle2"""
    namespace = inpart.params['namespace']
    r = pushkey.decodekeys(inpart.read())
    op.records.add('listkeys', (namespace, r))

@parthandler('pushkey', ('namespace', 'key', 'old', 'new'))
def handlepushkey(op, inpart):
    """process a pushkey request"""
    dec = pushkey.decode
    namespace = dec(inpart.params['namespace'])
    key = dec(inpart.params['key'])
    old = dec(inpart.params['old'])
    new = dec(inpart.params['new'])
    ret = op.repo.pushkey(namespace, key, old, new)
    record = {'namespace': namespace,
              'key': key,
              'old': old,
              'new': new}
    op.records.add('pushkey', record)
    if op.reply is not None:
        rpart = op.reply.newpart('reply:pushkey')
        rpart.addparam('in-reply-to', str(inpart.id), mandatory=False)
        rpart.addparam('return', '%i' % ret, mandatory=False)

@parthandler('reply:pushkey', ('return', 'in-reply-to'))
def handlepushkeyreply(op, inpart):
    """retrieve the result of a pushkey request"""
    ret = int(inpart.params['return'])
    partid = int(inpart.params['in-reply-to'])
    op.records.add('pushkey', {'return': ret}, partid)

@parthandler('obsmarkers')
def handleobsmarker(op, inpart):
    """add a stream of obsmarkers to the repo"""
    tr = op.gettransaction()
    markerdata = inpart.read()
    if op.ui.config('experimental', 'obsmarkers-exchange-debug', False):
        op.ui.write(('obsmarker-exchange: %i bytes received\n')
                    % len(markerdata))
    new = op.repo.obsstore.mergemarkers(tr, markerdata)
    if new:
        op.repo.ui.status(_('%i new obsolescence markers\n') % new)
    op.records.add('obsmarkers', {'new': new})
    if op.reply is not None:
        rpart = op.reply.newpart('reply:obsmarkers')
        rpart.addparam('in-reply-to', str(inpart.id), mandatory=False)
        rpart.addparam('new', '%i' % new, mandatory=False)


@parthandler('reply:obsmarkers', ('new', 'in-reply-to'))
def handleobsmarkerreply(op, inpart):
    """retrieve the result of an obsmarkers request"""
    ret = int(inpart.params['new'])
    partid = int(inpart.params['in-reply-to'])
    op.records.add('obsmarkers', {'new': ret}, partid)
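The `ui.config` gate added in `handleobsmarker` above can be illustrated in isolation. `fakeui` below is a hypothetical stand-in for Mercurial's `ui` object, mimicking only the `config()` and `write()` calls the handler uses:

```python
# Sketch of the experimental debug gate: the message is emitted only when
# experimental.obsmarkers-exchange-debug is set. 'fakeui' is a stand-in.
class fakeui(object):
    def __init__(self, flags):
        self._flags = flags  # {(section, name): value}

    def config(self, section, name, default=None):
        return self._flags.get((section, name), default)

    def write(self, msg):
        print(msg, end='')

def maybedebug(ui, markerdata):
    # mirrors the gate in handleobsmarker()
    if ui.config('experimental', 'obsmarkers-exchange-debug', False):
        ui.write('obsmarker-exchange: %i bytes received\n' % len(markerdata))

maybedebug(fakeui({('experimental', 'obsmarkers-exchange-debug'): True}),
           b'\x00' * 42)  # -> obsmarker-exchange: 42 bytes received
```

With the flag unset, `config()` falls through to the `False` default and nothing is written, so the feature is invisible unless explicitly enabled.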
@@ -1,821 +1,830 b''
  $ cat >> $HGRCPATH << EOF
  > [phases]
  > # public changesets are not obsolete
  > publish=false
  > [ui]
  > logtemplate="{rev}:{node|short} ({phase}) [{tags} {bookmarks}] {desc|firstline}\n"
  > EOF
  $ mkcommit() {
  >    echo "$1" > "$1"
  >    hg add "$1"
  >    hg ci -m "add $1"
  > }
  $ getid() {
  >    hg log -T "{node}\n" --hidden -r "desc('$1')"
  > }

  $ cat > debugkeys.py <<EOF
  > def reposetup(ui, repo):
  >     class debugkeysrepo(repo.__class__):
  >         def listkeys(self, namespace):
  >             ui.write('listkeys %s\n' % (namespace,))
  >             return super(debugkeysrepo, self).listkeys(namespace)
  >
  >     if repo.local():
  >         repo.__class__ = debugkeysrepo
  > EOF

  $ hg init tmpa
  $ cd tmpa
  $ mkcommit kill_me

Checking that the feature is properly disabled

  $ hg debugobsolete -d '0 0' `getid kill_me` -u babar
  abort: creating obsolete markers is not enabled on this repo
  [255]

Enabling it

  $ cat >> $HGRCPATH << EOF
  > [experimental]
  > evolution=createmarkers,exchange
  > EOF

Killing a single changeset without replacement

  $ hg debugobsolete 0
  abort: changeset references must be full hexadecimal node identifiers
  [255]
  $ hg debugobsolete '00'
  abort: changeset references must be full hexadecimal node identifiers
  [255]
  $ hg debugobsolete -d '0 0' `getid kill_me` -u babar
  $ hg debugobsolete
  97b7c2d76b1845ed3eb988cd612611e72406cef0 0 (Thu Jan 01 00:00:00 1970 +0000) {'user': 'babar'}

(test that mercurial is not confused)

  $ hg up null --quiet # having 0 as parent prevents it from being hidden
  $ hg tip
  -1:000000000000 (public) [tip ]
  $ hg up --hidden tip --quiet

Killing a single changeset with itself should fail
(simple local safeguard)

  $ hg debugobsolete `getid kill_me` `getid kill_me`
  abort: bad obsmarker input: in-marker cycle with 97b7c2d76b1845ed3eb988cd612611e72406cef0
  [255]

  $ cd ..

Killing a single changeset with replacement
(and testing the format option)

  $ hg init tmpb
  $ cd tmpb
  $ mkcommit a
  $ mkcommit b
  $ mkcommit original_c
  $ hg up "desc('b')"
  0 files updated, 0 files merged, 1 files removed, 0 files unresolved
  $ mkcommit new_c
  created new head
  $ hg log -r 'hidden()' --template '{rev}:{node|short} {desc}\n' --hidden
  $ hg debugobsolete --config format.obsstore-version=0 --flag 12 `getid original_c` `getid new_c` -d '121 120'
  $ hg log -r 'hidden()' --template '{rev}:{node|short} {desc}\n' --hidden
  2:245bde4270cd add original_c
  $ hg debugrevlog -cd
  # rev p1rev p2rev start   end deltastart base   p1   p2 rawsize totalsize compression heads chainlen
      0    -1    -1     0    59          0    0    0    0      58        58           0     1        0
      1     0    -1    59   118         59   59    0    0      58       116           0     1        0
      2     1    -1   118   193        118  118   59    0      76       192           0     1        0
      3     1    -1   193   260        193  193   59    0      66       258           0     2        0
  $ hg debugobsolete
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}

(check for version number of the obsstore)

  $ dd bs=1 count=1 if=.hg/store/obsstore 2>/dev/null
  \x00 (no-eol) (esc)

do it again (it reads the obsstore before adding a new changeset)

  $ hg up '.^'
  0 files updated, 0 files merged, 1 files removed, 0 files unresolved
  $ mkcommit new_2_c
  created new head
  $ hg debugobsolete -d '1337 0' `getid new_c` `getid new_2_c`
  $ hg debugobsolete
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}

Register two markers with a missing node

  $ hg up '.^'
  0 files updated, 0 files merged, 1 files removed, 0 files unresolved
  $ mkcommit new_3_c
  created new head
  $ hg debugobsolete -d '1338 0' `getid new_2_c` 1337133713371337133713371337133713371337
  $ hg debugobsolete -d '1339 0' 1337133713371337133713371337133713371337 `getid new_3_c`
  $ hg debugobsolete
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}

Refuse pathological nullid successors
  $ hg debugobsolete -d '9001 0' 1337133713371337133713371337133713371337 0000000000000000000000000000000000000000
  transaction abort!
  rollback completed
  abort: bad obsolescence marker detected: invalid successors nullid
  [255]

Check that graphlog detects that a changeset is obsolete:

  $ hg log -G
  @  5:5601fb93a350 (draft) [tip ] add new_3_c
  |
  o  1:7c3bad9141dc (draft) [ ] add b
  |
  o  0:1f0dee641bb7 (draft) [ ] add a


check that heads does not report them

  $ hg heads
  5:5601fb93a350 (draft) [tip ] add new_3_c
  $ hg heads --hidden
  5:5601fb93a350 (draft) [tip ] add new_3_c
  4:ca819180edb9 (draft) [ ] add new_2_c
  3:cdbce2fbb163 (draft) [ ] add new_c
  2:245bde4270cd (draft) [ ] add original_c


check that summary does not report them

  $ hg init ../sink
  $ echo '[paths]' >> .hg/hgrc
  $ echo 'default=../sink' >> .hg/hgrc
  $ hg summary --remote
  parent: 5:5601fb93a350 tip
   add new_3_c
  branch: default
  commit: (clean)
  update: (current)
  remote: 3 outgoing

  $ hg summary --remote --hidden
  parent: 5:5601fb93a350 tip
   add new_3_c
  branch: default
  commit: (clean)
  update: 3 new changesets, 4 branch heads (merge)
  remote: 3 outgoing

177 check that various commands work well with filtering
177 check that various commands work well with filtering
178
178
179 $ hg tip
179 $ hg tip
180 5:5601fb93a350 (draft) [tip ] add new_3_c
180 5:5601fb93a350 (draft) [tip ] add new_3_c
181 $ hg log -r 6
181 $ hg log -r 6
182 abort: unknown revision '6'!
182 abort: unknown revision '6'!
183 [255]
183 [255]
184 $ hg log -r 4
184 $ hg log -r 4
185 abort: hidden revision '4'!
185 abort: hidden revision '4'!
186 (use --hidden to access hidden revisions)
186 (use --hidden to access hidden revisions)
187 [255]
187 [255]
188 $ hg debugrevspec 'rev(6)'
188 $ hg debugrevspec 'rev(6)'
189 $ hg debugrevspec 'rev(4)'
189 $ hg debugrevspec 'rev(4)'
190 $ hg debugrevspec 'null'
190 $ hg debugrevspec 'null'
191 -1
191 -1
192
192
193 Check that public changeset are not accounted as obsolete:
193 Check that public changeset are not accounted as obsolete:
194
194
195 $ hg --hidden phase --public 2
195 $ hg --hidden phase --public 2
196 $ hg log -G
196 $ hg log -G
197 @ 5:5601fb93a350 (draft) [tip ] add new_3_c
197 @ 5:5601fb93a350 (draft) [tip ] add new_3_c
198 |
198 |
199 | o 2:245bde4270cd (public) [ ] add original_c
199 | o 2:245bde4270cd (public) [ ] add original_c
200 |/
200 |/
201 o 1:7c3bad9141dc (public) [ ] add b
201 o 1:7c3bad9141dc (public) [ ] add b
202 |
202 |
203 o 0:1f0dee641bb7 (public) [ ] add a
203 o 0:1f0dee641bb7 (public) [ ] add a
204
204
205
205
206 And that bumped changeset are detected
206 And that bumped changeset are detected
207 --------------------------------------
207 --------------------------------------
208
208
209 If we didn't filtered obsolete changesets out, 3 and 4 would show up too. Also
209 If we didn't filtered obsolete changesets out, 3 and 4 would show up too. Also
210 note that the bumped changeset (5:5601fb93a350) is not a direct successor of
210 note that the bumped changeset (5:5601fb93a350) is not a direct successor of
211 the public changeset
211 the public changeset
212
212
213 $ hg log --hidden -r 'bumped()'
213 $ hg log --hidden -r 'bumped()'
214 5:5601fb93a350 (draft) [tip ] add new_3_c
214 5:5601fb93a350 (draft) [tip ] add new_3_c
215
215
216 And that we can't push bumped changeset
216 And that we can't push bumped changeset
217
217
218 $ hg push ../tmpa -r 0 --force #(make repo related)
218 $ hg push ../tmpa -r 0 --force #(make repo related)
219 pushing to ../tmpa
219 pushing to ../tmpa
220 searching for changes
220 searching for changes
221 warning: repository is unrelated
221 warning: repository is unrelated
222 adding changesets
222 adding changesets
223 adding manifests
223 adding manifests
224 adding file changes
224 adding file changes
225 added 1 changesets with 1 changes to 1 files (+1 heads)
225 added 1 changesets with 1 changes to 1 files (+1 heads)
226 $ hg push ../tmpa
226 $ hg push ../tmpa
227 pushing to ../tmpa
227 pushing to ../tmpa
228 searching for changes
228 searching for changes
229 abort: push includes bumped changeset: 5601fb93a350!
229 abort: push includes bumped changeset: 5601fb93a350!
230 [255]
230 [255]
231
231
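The abort above can be illustrated with a short Python sketch. It is a
deliberate simplification, not Mercurial's actual phase/obsolescence code, and
the hashes are abbreviations of the ones in this test: a draft changeset
reached by walking successor chains from a public changeset is "bumped",
because it tries to rewrite immutable history.

```python
# Simplified bumped detection: walk successor chains from every public
# changeset; any draft changeset reached that way is flagged as "bumped".
successors = {
    "245bde4270cd": ["cdbce2fbb163"],  # original_c -> new_c
    "cdbce2fbb163": ["ca819180edb9"],  # new_c -> new_2_c
    "ca819180edb9": ["133713371337"],  # new_2_c -> unknown successor
    "133713371337": ["5601fb93a350"],  # unknown -> new_3_c
}
public = {"245bde4270cd"}
drafts = {"5601fb93a350"}

def bumped():
    found = set()
    for node in public:
        stack = list(successors.get(node, []))
        while stack:
            succ = stack.pop()
            if succ in drafts:
                found.add(succ)
            stack.extend(successors.get(succ, []))
    return found

print(bumped())  # {'5601fb93a350'}
```

This is why 5601fb93a350 is bumped even though it is not a direct successor of
the public changeset: the relation is transitive along the marker chain.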
Fixing the "bumped" situation
We need to create a clone of 5 and add a special marker with a flag

  $ hg up '5^'
  0 files updated, 0 files merged, 1 files removed, 0 files unresolved
  $ hg revert -ar 5
  adding new_3_c
  $ hg ci -m 'add n3w_3_c'
  created new head
  $ hg debugobsolete -d '1338 0' --flags 1 `getid new_3_c` `getid n3w_3_c`
  $ hg log -r 'bumped()'
  $ hg log -G
  @ 6:6f9641995072 (draft) [tip ] add n3w_3_c
  |
  | o 2:245bde4270cd (public) [ ] add original_c
  |/
  o 1:7c3bad9141dc (public) [ ] add b
  |
  o 0:1f0dee641bb7 (public) [ ] add a


  $ cd ..

Revision 0 is hidden
--------------------

  $ hg init rev0hidden
  $ cd rev0hidden

  $ mkcommit kill0
  $ hg up -q null
  $ hg debugobsolete `getid kill0`
  $ mkcommit a
  $ mkcommit b

Should pick the first visible revision as "repo" node

  $ hg archive ../archive-null
  $ cat ../archive-null/.hg_archival.txt
  repo: 1f0dee641bb7258c56bd60e93edfa2405381c41e
  node: 7c3bad9141dcb46ff89abf5f61856facd56e476c
  branch: default
  latesttag: null
  latesttagdistance: 2
  changessincelatesttag: 2


  $ cd ..

Exchange Test
============================

Destination repo does not have any data
---------------------------------------

Simple incoming test

  $ hg init tmpc
  $ cd tmpc
  $ hg incoming ../tmpb
  comparing with ../tmpb
  0:1f0dee641bb7 (public) [ ] add a
  1:7c3bad9141dc (public) [ ] add b
  2:245bde4270cd (public) [ ] add original_c
  6:6f9641995072 (draft) [tip ] add n3w_3_c

Try to pull markers
(extinct changesets are excluded but markers are pulled)

  $ hg pull ../tmpb
  pulling from ../tmpb
  requesting all changes
  adding changesets
  adding manifests
  adding file changes
  added 4 changesets with 4 changes to 4 files (+1 heads)
  (run 'hg heads' to see heads, 'hg merge' to merge)
  $ hg debugobsolete
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}

Rollback/Transaction support

  $ hg debugobsolete -d '1340 0' aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
  $ hg debugobsolete
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb 0 (Thu Jan 01 00:22:20 1970 +0000) {'user': 'test'}
  $ hg rollback -n
  repository tip rolled back to revision 3 (undo debugobsolete)
  $ hg rollback
  repository tip rolled back to revision 3 (undo debugobsolete)
  $ hg debugobsolete
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}

  $ cd ..

Try to push markers

  $ hg init tmpd
  $ hg -R tmpb push tmpd
  pushing to tmpd
  searching for changes
  adding changesets
  adding manifests
  adding file changes
  added 4 changesets with 4 changes to 4 files (+1 heads)
  $ hg -R tmpd debugobsolete | sort
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}

Check that obsolete keys are exchanged only if the source has an obsolete store

  $ hg init empty
  $ hg --config extensions.debugkeys=debugkeys.py -R empty push tmpd
  pushing to tmpd
  listkeys phases
  listkeys bookmarks
  no changes found
  listkeys phases
  [1]

Clone support
(markers are copied and extinct changesets are included to allow hardlinks)

  $ hg clone tmpb clone-dest
  updating to branch default
  3 files updated, 0 files merged, 0 files removed, 0 files unresolved
  $ hg -R clone-dest log -G --hidden
  @ 6:6f9641995072 (draft) [tip ] add n3w_3_c
  |
  | x 5:5601fb93a350 (draft) [ ] add new_3_c
  |/
  | x 4:ca819180edb9 (draft) [ ] add new_2_c
  |/
  | x 3:cdbce2fbb163 (draft) [ ] add new_c
  |/
  | o 2:245bde4270cd (public) [ ] add original_c
  |/
  o 1:7c3bad9141dc (public) [ ] add b
  |
  o 0:1f0dee641bb7 (public) [ ] add a

  $ hg -R clone-dest debugobsolete
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}


Destination repo has existing data
---------------------------------------

On pull

  $ hg init tmpe
  $ cd tmpe
  $ hg debugobsolete -d '1339 0' 1339133913391339133913391339133913391339 ca819180edb99ed25ceafb3e9584ac287e240b00
  $ hg pull ../tmpb
  pulling from ../tmpb
  requesting all changes
  adding changesets
  adding manifests
  adding file changes
  added 4 changesets with 4 changes to 4 files (+1 heads)
  (run 'hg heads' to see heads, 'hg merge' to merge)
  $ hg debugobsolete
  1339133913391339133913391339133913391339 ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}


On push

  $ hg push ../tmpc
  pushing to ../tmpc
  searching for changes
  no changes found
  [1]
  $ hg -R ../tmpc debugobsolete
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1339133913391339133913391339133913391339 ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}

Detect outgoing obsolete and unstable
---------------------------------------


  $ hg log -G
  o 3:6f9641995072 (draft) [tip ] add n3w_3_c
  |
  | o 2:245bde4270cd (public) [ ] add original_c
  |/
  o 1:7c3bad9141dc (public) [ ] add b
  |
  o 0:1f0dee641bb7 (public) [ ] add a

  $ hg up 'desc("n3w_3_c")'
  3 files updated, 0 files merged, 0 files removed, 0 files unresolved
  $ mkcommit original_d
  $ mkcommit original_e
  $ hg debugobsolete --record-parents `getid original_d` -d '0 0'
  $ hg debugobsolete | grep `getid original_d`
  94b33453f93bdb8d457ef9b770851a618bf413e1 0 {6f96419950729f3671185b847352890f074f7557} (Thu Jan 01 00:00:00 1970 +0000) {'user': 'test'}
  $ hg log -r 'obsolete()'
  4:94b33453f93b (draft) [ ] add original_d
  $ hg log -G -r '::unstable()'
  @ 5:cda648ca50f5 (draft) [tip ] add original_e
  |
  x 4:94b33453f93b (draft) [ ] add original_d
  |
  o 3:6f9641995072 (draft) [ ] add n3w_3_c
  |
  o 1:7c3bad9141dc (public) [ ] add b
  |
  o 0:1f0dee641bb7 (public) [ ] add a


Refuse to push an obsolete changeset

  $ hg push ../tmpc/ -r 'desc("original_d")'
  pushing to ../tmpc/
  searching for changes
  abort: push includes obsolete changeset: 94b33453f93b!
  [255]

Refuse to push an unstable changeset

  $ hg push ../tmpc/
  pushing to ../tmpc/
  searching for changes
  abort: push includes unstable changeset: cda648ca50f5!
  [255]

Test that extinct changesets are properly detected

  $ hg log -r 'extinct()'

Don't try to push extinct changesets

  $ hg init ../tmpf
  $ hg out ../tmpf
  comparing with ../tmpf
  searching for changes
  0:1f0dee641bb7 (public) [ ] add a
  1:7c3bad9141dc (public) [ ] add b
  2:245bde4270cd (public) [ ] add original_c
  3:6f9641995072 (draft) [ ] add n3w_3_c
  4:94b33453f93b (draft) [ ] add original_d
  5:cda648ca50f5 (draft) [tip ] add original_e
  $ hg push ../tmpf -f # -f because we push unstable too
  pushing to ../tmpf
  searching for changes
  adding changesets
  adding manifests
  adding file changes
  added 6 changesets with 6 changes to 6 files (+1 heads)

No warning displayed

  $ hg push ../tmpf
  pushing to ../tmpf
  searching for changes
  no changes found
  [1]

Do not warn about a new head when the new head is a successor of a remote one

  $ hg log -G
  @ 5:cda648ca50f5 (draft) [tip ] add original_e
  |
  x 4:94b33453f93b (draft) [ ] add original_d
  |
  o 3:6f9641995072 (draft) [ ] add n3w_3_c
  |
  | o 2:245bde4270cd (public) [ ] add original_c
  |/
  o 1:7c3bad9141dc (public) [ ] add b
  |
  o 0:1f0dee641bb7 (public) [ ] add a

  $ hg up -q 'desc(n3w_3_c)'
  $ mkcommit obsolete_e
  created new head
  $ hg debugobsolete `getid 'original_e'` `getid 'obsolete_e'`
  $ hg outgoing ../tmpf # parasite hg outgoing testing
  comparing with ../tmpf
  searching for changes
  6:3de5eca88c00 (draft) [tip ] add obsolete_e
  $ hg push ../tmpf
  pushing to ../tmpf
  searching for changes
  adding changesets
  adding manifests
  adding file changes
  added 1 changesets with 1 changes to 1 files (+1 heads)

Test relevance computation
---------------------------------------

Checking a simple case of "marker relevance".


Reminder of the repo situation

  $ hg log --hidden --graph
  @ 6:3de5eca88c00 (draft) [tip ] add obsolete_e
  |
  | x 5:cda648ca50f5 (draft) [ ] add original_e
  | |
  | x 4:94b33453f93b (draft) [ ] add original_d
  |/
  o 3:6f9641995072 (draft) [ ] add n3w_3_c
  |
  | o 2:245bde4270cd (public) [ ] add original_c
  |/
  o 1:7c3bad9141dc (public) [ ] add b
  |
  o 0:1f0dee641bb7 (public) [ ] add a


List of all markers

  $ hg debugobsolete
  1339133913391339133913391339133913391339 ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
  cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
  ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
  5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
  94b33453f93bdb8d457ef9b770851a618bf413e1 0 {6f96419950729f3671185b847352890f074f7557} (Thu Jan 01 00:00:00 1970 +0000) {'user': 'test'}
  cda648ca50f50482b7055c0b0c4c117bba6733d9 3de5eca88c00aa039da7399a220f4a5221faa585 0 (*) {'user': 'test'} (glob)

List of changesets with no chain

  $ hg debugobsolete --hidden --rev ::2

List of changesets that are included in a marker chain

  $ hg debugobsolete --hidden --rev 6
  cda648ca50f50482b7055c0b0c4c117bba6733d9 3de5eca88c00aa039da7399a220f4a5221faa585 0 (*) {'user': 'test'} (glob)

594 List of changesets with a longer chain, (including a pruned children)
594 List of changesets with a longer chain, (including a pruned children)
595
595
596 $ hg debugobsolete --hidden --rev 3
596 $ hg debugobsolete --hidden --rev 3
597 1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
597 1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
598 1339133913391339133913391339133913391339 ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
598 1339133913391339133913391339133913391339 ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
599 245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
599 245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
600 5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
600 5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
601 94b33453f93bdb8d457ef9b770851a618bf413e1 0 {6f96419950729f3671185b847352890f074f7557} (Thu Jan 01 00:00:00 1970 +0000) {'user': 'test'}
601 94b33453f93bdb8d457ef9b770851a618bf413e1 0 {6f96419950729f3671185b847352890f074f7557} (Thu Jan 01 00:00:00 1970 +0000) {'user': 'test'}
602 ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
602 ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
603 cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
603 cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
604
604
605 List of both
605 List of both
606
606
607 $ hg debugobsolete --hidden --rev 3::6
607 $ hg debugobsolete --hidden --rev 3::6
608 1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
608 1337133713371337133713371337133713371337 5601fb93a350734d935195fee37f4054c529ff39 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
609 1339133913391339133913391339133913391339 ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
609 1339133913391339133913391339133913391339 ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:19 1970 +0000) {'user': 'test'}
610 245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
610 245bde4270cd1072a27757984f9cda8ba26f08ca cdbce2fbb16313928851e97e0d85413f3f7eb77f C (Thu Jan 01 00:00:01 1970 -0002) {'user': 'test'}
611 5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
611 5601fb93a350734d935195fee37f4054c529ff39 6f96419950729f3671185b847352890f074f7557 1 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
612 94b33453f93bdb8d457ef9b770851a618bf413e1 0 {6f96419950729f3671185b847352890f074f7557} (Thu Jan 01 00:00:00 1970 +0000) {'user': 'test'}
612 94b33453f93bdb8d457ef9b770851a618bf413e1 0 {6f96419950729f3671185b847352890f074f7557} (Thu Jan 01 00:00:00 1970 +0000) {'user': 'test'}
613 ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
613 ca819180edb99ed25ceafb3e9584ac287e240b00 1337133713371337133713371337133713371337 0 (Thu Jan 01 00:22:18 1970 +0000) {'user': 'test'}
614 cda648ca50f50482b7055c0b0c4c117bba6733d9 3de5eca88c00aa039da7399a220f4a5221faa585 0 (*) {'user': 'test'} (glob)
614 cda648ca50f50482b7055c0b0c4c117bba6733d9 3de5eca88c00aa039da7399a220f4a5221faa585 0 (*) {'user': 'test'} (glob)
615 cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
615 cdbce2fbb16313928851e97e0d85413f3f7eb77f ca819180edb99ed25ceafb3e9584ac287e240b00 0 (Thu Jan 01 00:22:17 1970 +0000) {'user': 'test'}
616
616
617 #if serve
617 #if serve
618
618
Test the debug output for exchange
----------------------------------

$ hg pull ../tmpb --config 'experimental.obsmarkers-exchange-debug=True' --config 'experimental.bundle2-exp=True'
pulling from ../tmpb
searching for changes
no changes found
obsmarker-exchange: 346 bytes received

check hgweb does not explode
====================================

$ hg unbundle $TESTDIR/bundles/hgweb+obs.hg
adding changesets
adding manifests
adding file changes
added 62 changesets with 63 changes to 9 files (+60 heads)
(run 'hg heads .' to see heads, 'hg merge' to merge)
$ for node in `hg log -r 'desc(babar_)' --template '{node}\n'`;
> do
> hg debugobsolete $node
> done
$ hg up tip
2 files updated, 0 files merged, 0 files removed, 0 files unresolved

$ hg serve -n test -p $HGPORT -d --pid-file=hg.pid -A access.log -E errors.log
$ cat hg.pid >> $DAEMON_PIDS

check changelog view

$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'shortlog/'
200 Script output follows

check graph view

$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'graph'
200 Script output follows

check filelog view

$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'log/'`hg log -r . -T "{node}"`/'babar'
200 Script output follows

$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'rev/68'
200 Script output follows
$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'rev/67'
404 Not Found
[1]

check the web.view config option:

$ "$TESTDIR/killdaemons.py" hg.pid
$ cat >> .hg/hgrc << EOF
> [web]
> view=all
> EOF
$ wait
$ hg serve -n test -p $HGPORT -d --pid-file=hg.pid -A access.log -E errors.log
$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'rev/67'
200 Script output follows
$ "$TESTDIR/killdaemons.py" hg.pid

Checking the _enable=False warning if obsolete markers exist

$ echo '[experimental]' >> $HGRCPATH
$ echo "evolution=" >> $HGRCPATH
$ hg log -r tip
obsolete feature not enabled but 68 markers found!
68:c15e9edfca13 (draft) [tip ] add celestine

reenable for later tests

$ echo '[experimental]' >> $HGRCPATH
$ echo "evolution=createmarkers,exchange" >> $HGRCPATH

#endif

Test incoming/outgoing with changesets obsoleted remotely, known locally
===============================================================================

This tests issue 3805

$ hg init repo-issue3805
$ cd repo-issue3805
$ echo "foo" > foo
$ hg ci -Am "A"
adding foo
$ hg clone . ../other-issue3805
updating to branch default
1 files updated, 0 files merged, 0 files removed, 0 files unresolved
$ echo "bar" >> foo
$ hg ci --amend
$ cd ../other-issue3805
$ hg log -G
@ 0:193e9254ce7e (draft) [tip ] A

$ hg log -G -R ../repo-issue3805
@ 2:3816541e5485 (draft) [tip ] A

$ hg incoming
comparing with $TESTTMP/tmpe/repo-issue3805 (glob)
searching for changes
2:3816541e5485 (draft) [tip ] A
$ hg incoming --bundle ../issue3805.hg
comparing with $TESTTMP/tmpe/repo-issue3805 (glob)
searching for changes
2:3816541e5485 (draft) [tip ] A
$ hg outgoing
comparing with $TESTTMP/tmpe/repo-issue3805 (glob)
searching for changes
no changes found
[1]

#if serve

$ hg serve -R ../repo-issue3805 -n test -p $HGPORT -d --pid-file=hg.pid -A access.log -E errors.log
$ cat hg.pid >> $DAEMON_PIDS

$ hg incoming http://localhost:$HGPORT
comparing with http://localhost:$HGPORT/
searching for changes
1:3816541e5485 (draft) [tip ] A
$ hg outgoing http://localhost:$HGPORT
comparing with http://localhost:$HGPORT/
searching for changes
no changes found
[1]

$ "$TESTDIR/killdaemons.py" $DAEMON_PIDS

#endif

This tests issue 3814

(nothing to push but a locally hidden changeset)

$ cd ..
$ hg init repo-issue3814
$ cd repo-issue3805
$ hg push -r 3816541e5485 ../repo-issue3814
pushing to ../repo-issue3814
searching for changes
adding changesets
adding manifests
adding file changes
added 1 changesets with 1 changes to 1 files
$ hg out ../repo-issue3814
comparing with ../repo-issue3814
searching for changes
no changes found
[1]

Test that a local tag blocks a changeset from being hidden

$ hg tag -l visible -r 0 --hidden
$ hg log -G
@ 2:3816541e5485 (draft) [tip ] A

x 0:193e9254ce7e (draft) [visible ] A

Test that removing a local tag does not cause some commands to fail

$ hg tag -l -r tip tiptag
$ hg tags
tiptag 2:3816541e5485
tip 2:3816541e5485
visible 0:193e9254ce7e
$ hg --config extensions.strip= strip -r tip --no-backup
0 files updated, 0 files merged, 1 files removed, 0 files unresolved
$ hg tags
visible 0:193e9254ce7e
tip 0:193e9254ce7e

#if serve

Test issue 4506

$ cd ..
$ hg init repo-issue4506
$ cd repo-issue4506
$ echo "0" > foo
$ hg add foo
$ hg ci -m "content-0"

$ hg up null
0 files updated, 0 files merged, 1 files removed, 0 files unresolved
$ echo "1" > bar
$ hg add bar
$ hg ci -m "content-1"
created new head
$ hg up 0
1 files updated, 0 files merged, 1 files removed, 0 files unresolved
$ hg graft 1
grafting 1:1c9eddb02162 "content-1" (tip)

$ hg debugobsolete `hg log -r1 -T'{node}'` `hg log -r2 -T'{node}'`

$ hg serve -n test -p $HGPORT -d --pid-file=hg.pid -A access.log -E errors.log
$ cat hg.pid >> $DAEMON_PIDS

$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'rev/1'
404 Not Found
[1]
$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'file/tip/bar'
200 Script output follows
$ "$TESTDIR/get-with-headers.py" --headeronly localhost:$HGPORT 'annotate/tip/bar'
200 Script output follows

$ "$TESTDIR/killdaemons.py" $DAEMON_PIDS

#endif
