@@ -438,7 +438,7 @@ def debugcolor(ui, repo, **opts):
     _styles = {}
     for effect in _effects.keys():
         _styles[effect] = effect
-    ui.write(('colormode: %s\n') % ui._colormode)
+    ui.write(('color mode: %s\n') % ui._colormode)
     ui.write(_('available colors:\n'))
     for label, colors in _styles.items():
         ui.write(('%s\n') % colors, label=label)
@@ -146,7 +146,7 @@ def convert(ui, src, dest=None, revmapfi
     you want to close a branch. Each entry contains a revision or hash
     separated by white space.

-    The tag
+    The tagmap is a file that exactly analogous to the branchmap. This will
    rename tags on the fly and prevent the 'update tags' commit usually found
    at the end of a convert process.

@@ -24,7 +24,7 @@ The format is architectured as follow
 the Binary format
 ============================

-All numbers are unsigned and big
+All numbers are unsigned and big-endian.

 stream level parameters
 ------------------------
@@ -40,21 +40,21 @@ Binary format is as follow
 A blob of `params size` containing the serialized version of all stream level
 parameters.

-The blob contains a space separated list of parameters.
+The blob contains a space separated list of parameters. Parameters with value
 are stored in the form `<name>=<value>`. Both name and value are urlquoted.

 Empty name are obviously forbidden.

 Name MUST start with a letter. If this first letter is lower case, the
-parameter is advisory and can be safe
+parameter is advisory and can be safely ignored. However when the first
 letter is capital, the parameter is mandatory and the bundling process MUST
 stop if he is not able to proceed it.

 Stream parameters use a simple textual format for two main reasons:

-- Stream level parameters should remain
+- Stream level parameters should remain simple and we want to discourage any
   crazy usage.
-- Textual data allow easy human inspection of a
+- Textual data allow easy human inspection of a bundle2 header in case of
   troubles.

 Any Applicative level options MUST go into a bundle2 part instead.
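The parameter rules quoted above (space-separated entries, urlquoted `<name>=<value>`, lower-case first letter = advisory, capital first letter = mandatory) can be sketched as a small parser. This is a hypothetical helper, not Mercurial's implementation, and the `Compression`/`evolution` parameter names are made-up examples:

```python
import urllib.parse

def parsestreamparams(blob):
    """Split a stream-parameter blob into (mandatory, advisory) dicts.

    The blob is a space-separated list of urlquoted `<name>=<value>` or
    bare `<name>` entries; a capitalized first letter marks the
    parameter as mandatory.
    """
    mandatory, advisory = {}, {}
    for entry in blob.split(' '):
        if not entry:
            continue
        name, sep, value = entry.partition('=')
        name = urllib.parse.unquote(name)
        value = urllib.parse.unquote(value) if sep else None
        if not name or not name[0].isalpha():
            raise ValueError('stream parameter name must start with a letter')
        # lower-case first letter: advisory; capital: mandatory
        target = mandatory if name[0].isupper() else advisory
        target[name] = value
    return mandatory, advisory
```

A reader may ignore anything in the advisory dict, but per the text above it MUST abort on a mandatory parameter it cannot process.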
@@ -85,14 +85,14 @@ Binary format is as follow

 :typesize: (one byte)

-:typ
+:parttype: alphanumerical part name

 :partid: A 32bits integer (unique in the bundle) that can be used to refer
 to this part.

 :parameters:

-Part's parameter may have arbitra
+Part's parameter may have arbitrary content, the binary structure is::

     <mandatory-count><advisory-count><param-sizes><param-data>

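The `<mandatory-count><advisory-count><param-sizes><param-data>` layout lends itself to a short packing sketch. The field widths used below (one byte per count and per key/value size) are assumptions for illustration only, not a claim about the real wire format:

```python
import struct

def packpartparams(mandatory, advisory):
    """Pack part parameters as
    <mandatory-count><advisory-count><param-sizes><param-data>.

    `mandatory` and `advisory` are lists of (key, value) byte-string
    pairs. Mandatory parameters come first so a reader can tell, from
    the two counts alone, which entries it must understand.
    """
    allparams = mandatory + advisory
    counts = struct.pack('>BB', len(mandatory), len(advisory))
    # one size pair per parameter: key length, then value length
    sizes = b''.join(struct.pack('>BB', len(k), len(v)) for k, v in allparams)
    data = b''.join(k + v for k, v in allparams)
    return counts + sizes + data
```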
@@ -121,7 +121,7 @@ Binary format is as follow
 `chunksize` says)` The payload part is concluded by a zero size chunk.

 The current implementation always produces either zero or one chunk.
-This is an implementation limitation that will ultimatly be lifted.
+This is an implementation limitation that will ultimately be lifted.

 Bundle processing
 ============================
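The chunked payload described above (size-prefixed chunks, concluded by a zero-size chunk) can be read with a small generator. The 4-byte big-endian size prefix is an assumption of this sketch, not a statement about the real format:

```python
import io
import struct

def iterchunks(stream):
    """Yield payload chunks until the zero-size chunk that ends the part."""
    while True:
        size = struct.unpack('>I', stream.read(4))[0]
        if size == 0:  # the empty chunk concludes the payload
            return
        yield stream.read(size)

# Build a two-chunk payload followed by the zero-size terminator.
payload = (struct.pack('>I', 5) + b'hello' +
           struct.pack('>I', 6) + b' world' +
           struct.pack('>I', 0))
chunks = list(iterchunks(io.BytesIO(payload)))
```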
@@ -193,7 +193,7 @@ class unbundlerecords(object):
    """keep record of what happens during and unbundle

    New records are added using `records.add('cat', obj)`. Where 'cat' is a
-    category of record and obj is an arbitra
+    category of record and obj is an arbitrary object.

    `records['cat']` will return all entries of this category 'cat'.

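The record-keeping API quoted in that docstring (`records.add('cat', obj)`, indexing by category) can be sketched minimally; this is an illustrative stand-in, not the class from the patch:

```python
class unbundlerecords:
    """Minimal sketch of the record-keeping API described above.

    `add('cat', obj)` files `obj` under category 'cat'; indexing with
    `records['cat']` returns every entry of that category, in insertion
    order.
    """
    def __init__(self):
        self._categories = {}

    def add(self, category, entry):
        self._categories.setdefault(category, []).append(entry)

    def __getitem__(self, category):
        # return a copy so callers cannot mutate the internal list
        return list(self._categories.get(category, ()))

records = unbundlerecords()
records.add('changegroup', {'return': 1})
records.add('changegroup', {'return': 0})
```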
@@ -327,7 +327,7 @@ class bundle20(object):

    Use the `addparam` method to add stream level parameter. and `addpart` to
    populate it. Then call `getchunks` to retrieve all the binary chunks of
-    datathat compose the bundle2 container."""
+    data that compose the bundle2 container."""

    def __init__(self, ui):
        self.ui = ui
@@ -345,7 +345,7 @@ class bundle20(object):
    def addpart(self, part):
        """add a new part to the bundle2 container

-        Parts contains the actual
+        Parts contains the actual applicative payload."""
        assert part.id is None
        part.id = len(self._parts) # very cheap counter
        self._parts.append(part)
@@ -412,7 +412,7 @@ class unbundle20(unpackermixin):

    @util.propertycache
    def params(self):
-        """diction
+        """dictionary of stream level parameters"""
        self.ui.debug('reading bundle2 stream parameters\n')
        params = {}
        paramssize = self._unpack(_fstreamparamsize)[0]
@@ -2258,7 +2258,7 @@ def _performrevert(repo, parents, ctx, r
    This is an independent function to let extension to plug in and react to
    the imminent revert.

-    Make sure you have the working directory locked when caling this function.
+    Make sure you have the working directory locked when calling this function.
    """
    parent, p2 = parents
    node = ctx.node()
@@ -1,4 +1,4 @@
-# exchange.py - utily to exchange data between repo.
+# exchange.py - utility to exchange data between repos.
 #
 # Copyright 2005-2007 Matt Mackall <mpm@selenic.com>
 #
@@ -17,7 +17,7 @@ class pushoperation(object):

    It purpose is to carry push related state and very common operation.

-    A new should be created at the begining of each push and discarded
+    A new should be created at the beginning of each push and discarded
    afterward.
    """

@@ -42,7 +42,7 @@ class pushoperation(object):
        # we have outgoing changesets but refused to push
        # - other values as described by addchangegroup()
        self.ret = None
-        # discover.outgoing object (contains common and outgoin data)
+        # discover.outgoing object (contains common and outgoing data)
        self.outgoing = None
        # all remote heads before the push
        self.remoteheads = None
@@ -244,7 +244,7 @@ def _pushcomputecommonheads(pushop):
    pushop.commonheads = cheads

 def _pushsyncphase(pushop):
-    """synchronise phase information locally and remotly"""
+    """synchronise phase information locally and remotely"""
    unfi = pushop.repo.unfiltered()
    cheads = pushop.commonheads
    if pushop.ret:
@@ -379,7 +379,7 @@ class pulloperation(object):

    It purpose is to carry push related state and very common operation.

-    A new should be created at the begining of each pull and discarded
+    A new should be created at the beginning of each pull and discarded
    afterward.
    """

@@ -400,9 +400,9 @@ class pulloperation(object):
        self.common = None
        # set of pulled head
        self.rheads = None
-        # list of missing changeset to fetch remotly
+        # list of missing changeset to fetch remotely
        self.fetch = None
-        # result of changegroup pulling (used as return
+        # result of changegroup pulling (used as return code by pull)
        self.cgresult = None
        # list of step remaining todo (related to future bundle2 usage)
        self.todosteps = set(['changegroup', 'phases', 'obsmarkers'])
@@ -609,7 +609,7 @@ def getbundle(repo, source, heads=None, 
    return bundle2.unbundle20(repo.ui, util.chunkbuffer(bundler.getchunks()))

 class PushRaced(RuntimeError):
-    """An exception raised during unbunding that indicate a push race"""
+    """An exception raised during unbundling that indicate a push race"""

 def check_heads(repo, their_heads, context):
    """check if the heads of a repo have been modified
@@ -629,7 +629,7 @@ def unbundle(repo, cg, heads, source, ur
    """Apply a bundle to a repo.

    this function makes sure the repo is locked during the application and have
-    mechanism to check that no push race occured between the creation of the
+    mechanism to check that no push race occurred between the creation of the
    bundle and its application.

    If the push was raced as PushRaced exception is raised."""
@@ -85,14 +85,14 @@ class mergestate(object):
    def _readrecords(self):
        """Read merge state from disk and return a list of record (TYPE, data)

-        We read data from both
+        We read data from both v1 and v2 files and decide which one to use.

-        V1 ha
-        v2. We read both version and check if no data in v2 contradict
+        V1 has been used by version prior to 2.9.1 and contains less data than
+        v2. We read both versions and check if no data in v2 contradicts
        v1. If there is not contradiction we can safely assume that both v1
        and v2 were written at the same time and use the extract data in v2. If
        there is contradiction we ignore v2 content as we assume an old version
-        of Mercurial ha
+        of Mercurial has overwritten the mergestate file and left an old v2
        file around.

        returns list of record [(TYPE, data), ...]"""
@@ -176,7 +176,7 @@ def encodemeta(meta):
        if ':' in key or '\0' in key:
            raise ValueError("':' and '\0' are forbidden in metadata key'")
        if '\0' in value:
-            raise ValueError("':'
+            raise ValueError("':' is forbidden in metadata value'")
    return '\0'.join(['%s:%s' % (k, meta[k]) for k in sorted(meta)])

 def decodemeta(data):
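The NUL-joined `key:value` encoding visible in this hunk round-trips easily. Here is a hedged sketch of both directions, mirroring the shape of the quoted code rather than reproducing Mercurial's exact functions:

```python
def encodemeta(meta):
    """Join metadata as \0-separated `key:value` pairs, sorted by key."""
    for key, value in meta.items():
        if ':' in key or '\0' in key:
            raise ValueError("':' and '\\0' are forbidden in metadata key")
        if '\0' in value:
            raise ValueError("'\\0' is forbidden in metadata value")
    return '\0'.join('%s:%s' % (k, meta[k]) for k in sorted(meta))

def decodemeta(data):
    """Inverse of encodemeta: split on \0, then on the first ':'."""
    meta = {}
    for item in data.split('\0'):
        if item:
            key, _, value = item.partition(':')
            meta[key] = value
    return meta
```

Splitting on the first `:` only is what makes values containing `:` safe, which is why only keys forbid that character.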
@@ -355,8 +355,8 @@ def _encodeonemarker(marker):
 def _pushkeyescape(markers):
    """encode markers into a dict suitable for pushkey exchange

-    - binary data is base8
-    - split
+    - binary data is base85 encoded
+    - split in chunks smaller than 5300 bytes"""
    keys = {}
    parts = []
    currentlen = _maxpayload * 2 # ensure we create a new part
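The two constraints in that docstring — base85 encoding and chunks under 5300 bytes — can be illustrated with the stdlib's b85 codec. Mercurial ships its own base85 module, so the exact byte output differs; the chunking pattern below mirrors the `_maxpayload * 2` trick visible in the hunk:

```python
import base64

_maxpayload = 5300  # chunk size limit quoted in the docstring above

def pushkeychunks(markers):
    """Group binary marker blobs into base85-encoded chunks < 5300 bytes.

    `markers` is a list of byte strings (already-encoded markers).
    """
    chunks = []
    current, currentlen = [], _maxpayload * 2  # force a first chunk
    for data in markers:
        if currentlen + len(data) > _maxpayload:
            # start a new chunk rather than exceed the payload limit
            current = []
            chunks.append(current)
            currentlen = 0
        current.append(data)
        currentlen += len(data)
    return [base64.b85encode(b''.join(chunk)) for chunk in chunks]
```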
@@ -652,7 +652,7 @@ def successorssets(repo, initialnode, ca
    # Within a marker, a successor may have divergent successors
    # sets. In such a case, the marker will contribute multiple
    # divergent successors sets. If multiple successors have
-    # divergent successors sets, a
+    # divergent successors sets, a Cartesian product is used.
    #
    # At the end we post-process successors sets to remove
    # duplicated entry and successors set that are strict subset of
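The Cartesian-product step in that comment can be illustrated with `itertools.product`: pick one successors set per successor, then merge each combination. The merge-and-deduplicate details here are simplified guesses, not the real algorithm:

```python
from itertools import product

def combinesuccessors(persuccessor):
    """Combine per-successor choices of successors sets.

    `persuccessor` maps each successor to its list of successors sets;
    the node's own sets are the Cartesian product of those choices, one
    set chosen per successor, merged and deduplicated.
    """
    combined = []
    for choice in product(*persuccessor.values()):
        merged = tuple(sorted(set().union(*choice)))
        if merged not in combined:  # drop duplicated entries
            combined.append(merged)
    return combined
```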
@@ -779,7 +779,7 @@ def _computeextinctset(repo):
 def _computebumpedset(repo):
    """the set of revs trying to obsolete public revisions"""
    bumped = set()
-    # util
+    # util function (avoid attribute lookup in the loop)
    phase = repo._phasecache.phase # would be faster to grab the full list
    public = phases.public
    cl = repo.changelog
@@ -825,7 +825,7 @@ def createmarkers(repo, relations, flag=
    """Add obsolete markers between changesets in a repo

    <relations> must be an iterable of (<old>, (<new>, ...)[,{metadata}])
-    tuple. `old` and `news` are changectx. metadata is an optional diction
+    tuple. `old` and `news` are changectx. metadata is an optional dictionary
    containing metadata for this marker only. It is merged with the global
    metadata specified through the `metadata` argument of this function,

@@ -2459,7 +2459,7 @@ class _addset(_orderedsetmixin):

    If the ascending attribute is set, that means the two structures are
    ordered in either an ascending or descending way. Therefore, we can add
-    them mantaining the order by iterating over both at the same time
+    them maintaining the order by iterating over both at the same time

    This class does not duck-type baseset and it's only supposed to be used
    internally
@@ -393,7 +393,7 @@ class wirepeer(peer.peerrepository):

        The command is expected to return a stream.

-        The stream may have been compressed in some implementa
+        The stream may have been compressed in some implementations. This
        function takes care of the decompression. This is the only difference
        with _callstream.

@@ -475,7 +475,7 @@ def options(cmd, keys, others):
 commands = {}

 def wireprotocommand(name, args=''):
-    """decorator for wireprotocol command"""
+    """decorator for wire protocol command"""
    def register(func):
        commands[name] = (func, args)
        return func
@@ -551,7 +551,7 @@ def _capabilities(repo, proto):

    - returns a lists: easy to alter
    - change done here will be propagated to both `capabilities` and `hello`
-      command without any other
+      command without any other action needed.
    """
    # copy to prevent modification of the global list
    caps = list(wireprotocaps)
@@ -569,7 +569,7 @@ def _capabilities(repo, proto):
        caps.append('httpheader=1024')
    return caps

-# If you are writ
+# If you are writing an extension and consider wrapping this function. Wrap
 # `_capabilities` instead.
 @wireprotocommand('capabilities')
 def capabilities(repo, proto):
@@ -692,7 +692,7 @@ def stream(repo, proto):
    The format is simple: the server writes out a line with the amount
    of files, then the total amount of bytes to be transferred (separated
    by a space). Then, for each file, the server first writes the filename
-    and filesize (separated by the null character), then the file contents.
+    and file size (separated by the null character), then the file contents.
    '''

    if not _allowstream(repo.ui):
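The stream format in that docstring — a `<filecount> <totalbytes>` header line, then each file's name and size separated by NUL, followed by its raw contents — can be sketched as a generator. The trailing newlines after the header and each name/size pair are an assumption of this sketch:

```python
def streamout(files):
    """Yield a byte stream in the format described above.

    `files` is a list of (name, data) pairs of byte strings.
    """
    total = sum(len(data) for _, data in files)
    # header: file count and total byte count, separated by a space
    yield b'%d %d\n' % (len(files), total)
    for name, data in files:
        # filename and file size separated by the null character
        yield b'%s\x00%d\n' % (name, len(data))
        yield data

stream = b''.join(streamout([(b'00changelog.i', b'\x00' * 8)]))
```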
@@ -776,7 +776,7 @@ def unbundle(repo, proto, heads):
            os.unlink(tempname)
    except util.Abort, inst:
        # The old code we moved used sys.stderr directly.
-        # We did not change
+        # We did not change it to minimise code change.
        # This need to be moved to something proper.
        # Feel free to do it.
        sys.stderr.write("abort: %s\n" % inst)
@@ -149,7 +149,7 @@ def getparser():
        help="print a test coverage report")
    parser.add_option("-d", "--debug", action="store_true",
        help="debug mode: write output of test scripts to console"
-             " rather than capturing and diff
+             " rather than capturing and diffing it (disables timeout)")
    parser.add_option("-f", "--first", action="store_true",
        help="exit on the first test failure")
    parser.add_option("-H", "--htmlcov", action="store_true",
@@ -606,7 +606,7 @@ def rematch(el, l):

 def globmatch(el, l):
    # The only supported special characters are * and ? plus / which also
-    # matches \ on windows. Escaping of these caracters is supported.
+    # matches \ on windows. Escaping of these characters is supported.
    if el + '\n' == l:
        if os.altsep:
            # matching on "/" is not needed for this line
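The limited glob dialect in that comment (`*` and `?`, `/` also matching `\` on Windows, backslash escaping) translates to a regex along these lines. This is an illustrative translation, not run-tests.py's actual code:

```python
import re

def globtore(pat):
    """Compile the limited glob syntax described above into a regex.

    `*` matches any run of characters, `?` exactly one, `/` also
    matches `\\` (Windows paths), and a backslash escapes the next
    special character.
    """
    out, i = [], 0
    while i < len(pat):
        c = pat[i]
        if c == '\\' and i + 1 < len(pat) and pat[i + 1] in '*?/\\':
            out.append(re.escape(pat[i + 1]))  # escaped special character
            i += 2
            continue
        if c == '*':
            out.append('.*')
        elif c == '?':
            out.append('.')
        elif c == '/':
            out.append(r'[/\\]')  # / also matches backslash
        else:
            out.append(re.escape(c))
        i += 1
    return re.compile(''.join(out) + '$')
```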
@@ -664,7 +664,7 @@ def tsttest(test, wd, options, replaceme
    after = {}
    pos = prepos = -1

-    # Expected shellscript output
+    # Expected shell script output
    expected = {}

    # We keep track of whether or not we're in a Python block so we
@@ -103,7 +103,7 @@ def test_lazyancestors():


 # The C gca algorithm requires a real repo. These are textual descriptions of
-#
+# DAGs that have been known to be problematic.
 dagtests = [
     '+2*2*2/*3/2',
     '+3*3/*2*2/*4*4/*4/2*4/2*2',
@@ -183,7 +183,7 @@ set and then unset it
   [1]

 when a bookmark is active, hg up -r . is
-analogus to hg book -i <active bookmark>
+analogous to hg book -i <active bookmark>

   $ hg up -q X
   $ hg up -q .
@@ -546,7 +546,7 @@ test for http://mercurial.selenic.com/bt

 test that verify bundle does not traceback

-partial history bundle, fails w/ unkown parent
+partial history bundle, fails w/ unknown parent

   $ hg -R bundle.hg verify
   abort: 00changelog.i@bbd179dfa0a7: unknown parent!
@@ -1486,7 +1486,7 @@ No tag set:
   1: null+2
   0: null+1

-One common tag: long
+One common tag: longest path wins:

   $ hg tag -r 1 -m t1 -d '6 0' t1
   $ hg log --template '{rev}: {latesttag}+{latesttagdistance}\n'
@@ -1709,7 +1709,7 @@ Test string escaping:
   $ hg log -R a -r 8 --template '{files % r"{file}\n"}\n'
   fourth\nsecond\nthird\n

-Test string escap
+Test string escaping in nested expression:

   $ hg log -R a -r 8 --template '{ifeq(r"\x6e", if("1", "\x5c\x786e"), join(files, "\x5c\x786e"))}\n'
   fourth\x6esecond\x6ethird
@@ -126,7 +126,7 @@
     you want to close a branch. Each entry contains a revision or hash
     separated by white space.

-    The tag
+    The tagmap is a file that exactly analogous to the branchmap. This will
    rename tags on the fly and prevent the 'update tags' commit usually found
    at the end of a convert process.

@@ -238,7 +238,7 @@ define commands to display help text

 use "hg -v help show_ambig_ru" to show the global options

-(2-2-4) display Russian ambiguous-width charactes in utf-8
+(2-2-4) display Russian ambiguous-width characters in utf-8

   $ COLUMNS=60 HGENCODINGAMBIGUOUS=wide hg --encoding utf-8 --config extensions.show=./show.py help show_ambig_ru
   hg show_ambig_ru
@@ -118,7 +118,7 @@ match in last "line" without newline

   $ cd ..

-Issue685: trac
+Issue685: traceback in grep -r after rename

 Got a traceback when using grep on a single
 revision with renamed files.
@@ -1,4 +1,4 @@
-Test histedit exten
+Test histedit extension: Fold commands
 ======================================

 This test file is dedicated to testing the fold command in non conflicting
@@ -173,7 +173,7 @@ check saving last-message.txt
 folding and creating no new change doesn't break:
 -------------------------------------------------

-folded content is dropped during a merge. The folded commit should properly disapear.
+folded content is dropped during a merge. The folded commit should properly disappear.

   $ mkdir fold-to-empty-test
   $ cd fold-to-empty-test
@@ -13,7 +13,7 @@ using the "replace" error handler:
   abortado: n?o foi encontrado um reposit?rio em '$TESTTMP' (.hg n?o encontrado)!
   [255]

-Using a more accomodating encoding:
+Using a more accommodating encoding:

   $ HGENCODING=UTF-8 LANGUAGE=pt_BR hg tip
   abortado: n\xc3\xa3o foi encontrado um reposit\xc3\xb3rio em '$TESTTMP' (.hg n\xc3\xa3o encontrado)! (esc)
@@ -321,7 +321,7 @@ Multiple binary files:
   a874b471193996e7cb034bb301cac7bdaf3e3f46 644 mbinary2

 Binary file and delta hunk (we build the patch using this sed hack to
-avoid an unquoted ^, which check-code says breaks sh on
+avoid an unquoted ^, which check-code says breaks sh on Solaris):

   $ sed 's/ caret /^/g;s/ dollarparen /$(/g' > quote-hack.patch <<'EOF'
   > diff --git a/delta b/delta
@@ -27,7 +27,7 b' Discard all cached largefiles in USERCAC' | |||||
27 |
|
27 | |||
28 | Create mirror repo, and pull from source without largefile: |
|
28 | Create mirror repo, and pull from source without largefile: | |
29 | "pull" is used instead of "clone" for suppression of (1) updating to |
|
29 | "pull" is used instead of "clone" for suppression of (1) updating to | |
30 | tip (= ca |
|
30 | tip (= caching largefile from source repo), and (2) recording source | |
31 | repo as "default" path in .hg/hgrc. |
|
31 | repo as "default" path in .hg/hgrc. | |
32 |
|
32 | |||
33 | $ hg init mirror |
|
33 | $ hg init mirror |
@@ -489,8 +489,8 b' dir after a purge.' | |||||
489 | $ cat sub2/large7 |
|
489 | $ cat sub2/large7 | |
490 | large7 |
|
490 | large7 | |
491 |
|
491 | |||
492 | Test addremove: verify that files that should be added as largfiles are added as |
|
492 | Test addremove: verify that files that should be added as largefiles are added as | |
493 | such and that already-existing largfiles are not added as normal files by |
|
493 | such and that already-existing largefiles are not added as normal files by | |
494 | accident. |
|
494 | accident. | |
495 |
|
495 | |||
496 | $ rm normal3 |
|
496 | $ rm normal3 |
@@ -441,7 +441,7 b' hg qseries -m with color' | |||||
441 | \x1b[0;31;1mb.patch\x1b[0m (esc) |
|
441 | \x1b[0;31;1mb.patch\x1b[0m (esc) | |
442 |
|
442 | |||
443 |
|
443 | |||
444 | excercise cornercases in "qselect --reapply" |
|
444 | excercise corner cases in "qselect --reapply" | |
445 |
|
445 | |||
446 | $ hg qpop -a |
|
446 | $ hg qpop -a | |
447 | popping c.patch |
|
447 | popping c.patch |
@@ -210,7 +210,7 b' Update with no arguments: tipmost revisi' | |||||
210 | abort: branch foobar not found |
|
210 | abort: branch foobar not found | |
211 | [255] |
|
211 | [255] | |
212 |
|
212 | |||
213 | Fastforward merge: |
|
213 | Fast-forward merge: | |
214 |
|
214 | |||
215 | $ hg branch ff |
|
215 | $ hg branch ff | |
216 | marked working directory as branch ff |
|
216 | marked working directory as branch ff |
@@ -247,7 +247,7 b' split is not divergences' | |||||
247 | 392fd25390da |
|
247 | 392fd25390da | |
248 | $ hg log -r 'divergent()' |
|
248 | $ hg log -r 'divergent()' | |
249 |
|
249 | |||
250 | Even when subsequent |
|
250 | Even when subsequent rewriting happen | |
251 |
|
251 | |||
252 | $ mkcommit A_3 |
|
252 | $ mkcommit A_3 | |
253 | created new head |
|
253 | created new head |
@@ -908,10 +908,9 b' Discovery locally secret changeset on a ' | |||||
908 | o 0 public a-A - 054250a37db4 |
|
908 | o 0 public a-A - 054250a37db4 | |
909 |
|
909 | |||
910 |
|
910 | |||
911 | pushing a locally public and draft changesets remotly secret should make them |
|
911 | pushing a locally public and draft changesets remotely secret should make them | |
912 | appear on the remote side. |
|
912 | appear on the remote side. | |
913 |
|
913 | |||
914 |
|
||||
915 | $ hg -R ../mu phase --secret --force 967b449fbc94 |
|
914 | $ hg -R ../mu phase --secret --force 967b449fbc94 | |
916 | $ hg push -r 435b5d83910c ../mu |
|
915 | $ hg push -r 435b5d83910c ../mu | |
917 | pushing to ../mu |
|
916 | pushing to ../mu |
@@ -148,7 +148,7 b' Test secret changeset are not pushed' | |||||
148 | (Issue3303) |
|
148 | (Issue3303) | |
149 | Check that remote secret changeset are ignore when checking creation of remote heads |
|
149 | Check that remote secret changeset are ignore when checking creation of remote heads | |
150 |
|
150 | |||
151 | We add a secret head into the push destination. |
|
151 | We add a secret head into the push destination. This secret head shadows a | |
152 | visible shared between the initial repo and the push destination. |
|
152 | visible shared between the initial repo and the push destination. | |
153 |
|
153 | |||
154 | $ hg up -q 4 # B' |
|
154 | $ hg up -q 4 # B' | |
@@ -156,8 +156,8 b' visible shared between the initial repo ' | |||||
156 | $ hg phase . |
|
156 | $ hg phase . | |
157 | 5: secret |
|
157 | 5: secret | |
158 |
|
158 | |||
159 | |
|
159 | We now try to push a new public changeset that descend from the common public | |
160 | |
|
160 | head shadowed by the remote secret head. | |
161 |
|
161 | ||
162 | $ cd ../initialrepo |
|
162 | $ cd ../initialrepo | |
163 | $ hg up -q 6 #B' |
|
163 | $ hg up -q 6 #B' |
@@ -39,8 +39,8 b' mercurial.localrepo.localrepository.test' | |||||
39 | mercurial.localrepo.localrepository.testcachedunfifoobar = testcachedunfifoobar |
|
39 | mercurial.localrepo.localrepository.testcachedunfifoobar = testcachedunfifoobar | |
40 |
|
40 | |||
41 |
|
41 | |||
42 | # |
|
42 | # Create an empty repo and instantiate it. It is important to run | |
43 | # th |
|
43 | # these tests on the real object to detect regression. | |
44 | repopath = os.path.join(os.environ['TESTTMP'], 'repo') |
|
44 | repopath = os.path.join(os.environ['TESTTMP'], 'repo') | |
45 | assert subprocess.call(['hg', 'init', repopath]) == 0 |
|
45 | assert subprocess.call(['hg', 'init', repopath]) == 0 | |
46 | ui = uimod.ui() |
|
46 | ui = uimod.ui() |
@@ -353,7 +353,7 b' Using --new-branch to push new named bra' | |||||
353 | adding file changes |
|
353 | adding file changes | |
354 | added 1 changesets with 1 changes to 1 files |
|
354 | added 1 changesets with 1 changes to 1 files | |
355 |
|
355 | |||
356 | Pushing muli |
|
356 | Pushing multi headed new branch: | |
357 |
|
357 | |||
358 | $ echo 14 > foo |
|
358 | $ echo 14 > foo | |
359 | $ hg -q branch f |
|
359 | $ hg -q branch f |
@@ -650,7 +650,7 b' each root have a different common ancest' | |||||
650 | o 0: 'A' |
|
650 | o 0: 'A' | |
651 |
|
651 | |||
652 |
|
652 | |||
653 | Test that rebase is not confused by $CWD disapearing during rebase (issue 4121) |
|
653 | Test that rebase is not confused by $CWD disappearing during rebase (issue 4121) | |
654 |
|
654 | |||
655 | $ cd .. |
|
655 | $ cd .. | |
656 | $ hg init cwd-vanish |
|
656 | $ hg init cwd-vanish |
@@ -774,7 +774,7 b' m "nm a b" "um x a" " " "22 get a, ' | |||||
774 |
|
774 | |||
775 | Systematic and terse testing of merge merges and ancestor calculation: |
|
775 | Systematic and terse testing of merge merges and ancestor calculation: | |
776 |
|
776 | |||
777 | Ex |
|
777 | Expected result: | |
778 |
|
778 | |||
779 | \ a m1 m2 dst |
|
779 | \ a m1 m2 dst | |
780 | 0 - f f f "versions differ" |
|
780 | 0 - f f f "versions differ" |
@@ -548,7 +548,7 b' test sorting two sorted collections in d' | |||||
548 | 6 |
|
548 | 6 | |
549 | 2 |
|
549 | 2 | |
550 |
|
550 | |||
551 | test sub |
|
551 | test subtracting something from an addset | |
552 |
|
552 | |||
553 | $ log '(outgoing() or removes(a)) - removes(a)' |
|
553 | $ log '(outgoing() or removes(a)) - removes(a)' | |
554 | 8 |
|
554 | 8 |
@@ -452,7 +452,7 b' Sticky subrepositories, no changes' | |||||
452 | da5f5b1d8ffcf62fb8327bcd3c89a4367a6018e7 |
|
452 | da5f5b1d8ffcf62fb8327bcd3c89a4367a6018e7 | |
453 | $ cd .. |
|
453 | $ cd .. | |
454 |
|
454 | |||
455 | Sticky subrepositor |
|
455 | Sticky subrepositories, file changes | |
456 | $ touch s/f1 |
|
456 | $ touch s/f1 | |
457 | $ cd s |
|
457 | $ cd s | |
458 | $ git add f1 |
|
458 | $ git add f1 |
@@ -306,7 +306,7 b' Sticky subrepositories, no changes' | |||||
306 | 2 |
|
306 | 2 | |
307 | $ cd .. |
|
307 | $ cd .. | |
308 |
|
308 | |||
309 | Sticky subrepositor |
|
309 | Sticky subrepositories, file changes | |
310 | $ touch s/f1 |
|
310 | $ touch s/f1 | |
311 | $ cd s |
|
311 | $ cd s | |
312 | $ svn add f1 |
|
312 | $ svn add f1 |
@@ -948,7 +948,7 b' Sticky subrepositories, no changes' | |||||
948 | $ hg -R t id |
|
948 | $ hg -R t id | |
949 | e95bcfa18a35 |
|
949 | e95bcfa18a35 | |
950 |
|
950 | |||
951 | Sticky subrepositor |
|
951 | Sticky subrepositories, file changes | |
952 | $ touch s/f1 |
|
952 | $ touch s/f1 | |
953 | $ touch t/f1 |
|
953 | $ touch t/f1 | |
954 | $ hg add -S s/f1 |
|
954 | $ hg add -S s/f1 | |
@@ -1333,7 +1333,7 b' configuration' | |||||
1333 | $ cd .. |
|
1333 | $ cd .. | |
1334 |
|
1334 | |||
1335 |
|
1335 | |||
1336 | Test that comit --secret works on both repo and subrepo (issue4182) |
|
1336 | Test that commit --secret works on both repo and subrepo (issue4182) | |
1337 |
|
1337 | |||
1338 | $ cd main |
|
1338 | $ cd main | |
1339 | $ echo secret >> b |
|
1339 | $ echo secret >> b |
@@ -71,7 +71,7 b' test transplanted revset' | |||||
71 | "transplanted([set])" |
|
71 | "transplanted([set])" | |
72 | Transplanted changesets in set, or all transplanted changesets. |
|
72 | Transplanted changesets in set, or all transplanted changesets. | |
73 |
|
73 | |||
74 | test tranplanted keyword |
|
74 | test transplanted keyword | |
75 |
|
75 | |||
76 | $ hg log --template '{rev} {transplanted}\n' |
|
76 | $ hg log --template '{rev} {transplanted}\n' | |
77 | 7 a53251cdf717679d1907b289f991534be05c997a |
|
77 | 7 a53251cdf717679d1907b289f991534be05c997a | |
@@ -414,7 +414,7 b' Issue1111: Test transplant --merge' | |||||
414 | $ hg ci -m appendd |
|
414 | $ hg ci -m appendd | |
415 | created new head |
|
415 | created new head | |
416 |
|
416 | |||
417 | tranplant |
|
417 | transplant | |
418 |
|
418 | |||
419 | $ hg transplant -m 1 |
|
419 | $ hg transplant -m 1 | |
420 | applying 42dc4432fd35 |
|
420 | applying 42dc4432fd35 |