merge default into stable for 3.1 code freeze
Matt Mackall
r21926:6c36dc6c merge 3.1-rc stable

The requested changes are too big and content was truncated.

@@ -0,0 +1,7 b''
1 FROM centos
2 RUN yum install -y gcc
3 RUN yum install -y python-devel python-docutils
4 RUN yum install -y make
5 RUN yum install -y rpm-build
6 RUN yum install -y gettext
7 RUN yum install -y tar
@@ -0,0 +1,6 b''
1 FROM fedora
2 RUN yum install -y gcc
3 RUN yum install -y python-devel python-docutils
4 RUN yum install -y make
5 RUN yum install -y rpm-build
6 RUN yum install -y gettext
@@ -0,0 +1,14 b''
1 #!/bin/bash
2
3 BUILDDIR=$(dirname $0)
4 ROOTDIR=$(cd $BUILDDIR/..; pwd)
5
6 if which docker.io >> /dev/null ; then
7 DOCKER=docker.io
8 elif which docker >> /dev/null ; then
9 DOCKER=docker
10 fi
11
12 $DOCKER build --tag "hg-dockerrpm-$1" - < $BUILDDIR/docker/$1
13 $DOCKER run --rm -v $ROOTDIR:/hg "hg-dockerrpm-$1" bash -c \
14 "cp -a hg hg-build; cd hg-build; make clean local $1; cp packages/$1/* /hg/packages/$1/"
@@ -0,0 +1,4 b''
1 Run Mercurial tests with Vagrant:
2
3 $ vagrant up
4 $ vagrant ssh -c ./run-tests.sh
@@ -0,0 +1,14 b''
1 # -*- mode: ruby -*-
2
3 Vagrant.configure('2') do |config|
4 # Debian 7.4 32-bit i386 without configuration management software
5 config.vm.box = "puppetlabs/debian-7.4-32-nocm"
6 #config.vm.box = "pnd/debian-wheezy32-basebox"
7 config.vm.hostname = "tests"
8
9 config.vm.define "tests" do |conf|
10 conf.vm.provision :file, source: "run-tests.sh", destination:"run-tests.sh"
11 conf.vm.provision :shell, path: "provision.sh"
12 conf.vm.synced_folder "../..", "/hgshared"
13 end
14 end
@@ -0,0 +1,10 b''
1 #!/bin/sh
2 # This script is used to install dependencies for
3 # testing Mercurial. Mainly used by Vagrant (see
4 # Vagrantfile for details).
5
6 export DEBIAN_FRONTEND=noninteractive
7 apt-get update
8 apt-get install -y -q python-dev unzip
9 # run-tests.sh is added by Vagrantfile
10 chmod +x run-tests.sh
@@ -0,0 +1,13 b''
1 #!/bin/sh
2 # This script is used to set up a temp directory in memory
3 # for running Mercurial tests in a virtual machine managed
4 # by Vagrant (see Vagrantfile for details).
5
6 cd /hgshared
7 make local
8 cd tests
9 mkdir /tmp/ram
10 sudo mount -t tmpfs -o size=100M tmpfs /tmp/ram
11 export TMPDIR=/tmp/ram
12 ./run-tests.py -l --time
13
@@ -0,0 +1,265 b''
1 # tagmerge.py - merge .hgtags files
2 #
3 # Copyright 2014 Angel Ezquerra <angel.ezquerra@gmail.com>
4 #
5 # This software may be used and distributed according to the terms of the
6 # GNU General Public License version 2 or any later version.
7
8 # This module implements an automatic merge algorithm for mercurial's tag files
9 #
10 # The tagmerge algorithm implemented in this module is able to resolve most
11 # merge conflicts that currently would trigger a .hgtags merge conflict. The
12 # only case that it does not (and cannot) handle is that in which two tags point
13 # to different revisions on each merge parent _and_ their corresponding tag
14 # histories have the same rank (i.e. the same length). In all other cases the
15 # merge algorithm will choose the revision belonging to the parent with the
16 # highest ranked tag history. The merged tag history is the combination of both
17 # tag histories (special care is taken to try to combine common tag histories
18 # where possible).
19 #
20 # In addition to actually merging the tags from two parents, taking into
21 # account the base, the algorithm also tries to minimize the difference
22 # between the merged tag file and the first parent's tag file (i.e. it tries to
23 # make the merged tag order as similar as possible to the first parent's tag
24 # file order).
25 #
26 # The algorithm works as follows:
27 # 1. read the tags from p1, p2 and the base
28 # - when reading the p1 tags, also get the line numbers associated to each
29 # tag node (these will be used to sort the merged tags in a way that
30 # minimizes the diff to p1). Ignore the line numbers when reading p2 and
31 # the base
32 # 2. recover the "lost tags" (i.e. those that are found in the base but not on
33 # p1 or p2) and add them back to p1 and/or p2
34 # - at this point the only tags that are on p1 but not on p2 are those new
35 # tags that were introduced in p1. Same thing for the tags that are on p2
36 # but not on p1
37 # 3. take all tags that are only on p1 or only on p2 (but not on the base)
38 # - Note that these are the tags that were introduced between base and p1
39 # and between base and p2, possibly on separate clones
40 # 4. for each tag found both on p1 and p2 perform the following merge algorithm:
41 # - the tags conflict if their tag "histories" have the same "rank" (i.e.
42 # length) _AND_ the last (current) tag is _NOT_ the same
43 # - for non conflicting tags:
44 # - choose which are the high and the low ranking nodes
45 # - the high ranking list of nodes is the one that is longer.
46 # In case of a draw, favor p1
47 # - the merged node list is made of 3 parts:
48 # - first the nodes that are common to the beginning of both
49 # the low and the high ranking nodes
50 # - second the non common low ranking nodes
51 # - finally the non common high ranking nodes (with the last
52 # one being the merged tag node)
53 # - note that this is equivalent to putting the whole low ranking
54 # node list first, followed by the non common high ranking nodes
55 # - note that during the merge we keep the "node line numbers", which will
56 # be used when writing the merged tags to the tag file
57 # 5. write the merged tags taking into account their positions in the first
58 # parent (i.e. try to keep the relative ordering of the nodes that come
59 # from p1). This minimizes the diff between the merged and the p1 tag files
60 # This is done by using the following algorithm:
61 # - group the nodes for a given tag that must be written next to each other
62 # - A: nodes that come from consecutive lines on p1
63 # - B: nodes that come from p2 (i.e. whose associated line number is
64 # None) and are next to one of the nodes in A
65 # - each group is associated with a line number coming from p1
66 # - generate a "tag block" for each of the groups
67 # - a tag block is a set of consecutive "node tag" lines belonging to
68 # the same tag and which will be written next to each other on the
69 # merged tags file
70 # - sort the "tag blocks" according to their associated line number
71 # - put blocks whose nodes come all from p2 first
72 # - write the tag blocks in the sorted order
73
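The rank rule in step 4 above can be sketched in a few lines of standalone Python. The node IDs here are made up for illustration (real Mercurial nodes are 40-character hex hashes), and `pickwinner` is a hypothetical helper name; the actual implementation lives in `singletagmerge` in this module:

```python
# Each tag history is the ordered list of nodes a tag pointed to over
# time; its "rank" is simply its length.
def pickwinner(h1, h2):
    # Conflict only when the ranks are equal AND the current (last)
    # nodes differ; otherwise the longer history wins, h1 (p1)
    # winning a draw.
    if len(h1) == len(h2) and h1[-1] != h2[-1]:
        return None  # unresolvable: report a merge conflict
    return h1 if len(h1) >= len(h2) else h2

p1hist = ['aaa', 'bbb']           # rank 2
p2hist = ['aaa', 'bbb', 'ccc']    # rank 3
print(pickwinner(p1hist, p2hist))  # the rank-3 history wins
```

The real code additionally merges the common prefix of both histories so that common nodes are written only once, as step 4 describes.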
74 import tags
75 import util
76 from node import nullid, hex
77 from i18n import _
78 import operator
79 hexnullid = hex(nullid)
80
81 def readtagsformerge(ui, repo, lines, fn='', keeplinenums=False):
82 '''read the .hgtags file into a structure that is suitable for merging
83
84 Unless the keeplinenums flag is set, clear the line numbers associated
85 with each tag. This is done because only the line numbers of the first
86 parent are useful for merging
87 '''
88 filetags = tags._readtaghist(ui, repo, lines, fn=fn, recode=None,
89 calcnodelines=True)[1]
90 for tagname, taginfo in filetags.items():
91 if not keeplinenums:
92 for el in taginfo:
93 el[1] = None
94 return filetags
95
96 def grouptagnodesbyline(tagnodes):
97 '''
98 Group nearby nodes (i.e. those that must be written next to each other)
99
100 The input is a list of [node, position] pairs, corresponding to a given tag
101 The position is the line number where the node was found on the first parent
102 .hgtags file, or None for those nodes that came from the base or the second
103 parent .hgtags files.
104
105 This function groups those [node, position] pairs, returning a list of
106 groups of nodes that must be written next to each other because their
107 positions are consecutive or have no position preference (because their
108 position is None).
109
110 The result is a list of [position, [consecutive node list]]
111 '''
112 firstlinenum = None
113 for hexnode, linenum in tagnodes:
114 firstlinenum = linenum
115 if firstlinenum is not None:
116 break
117 if firstlinenum is None:
118 return [[None, [el[0] for el in tagnodes]]]
119 tagnodes[0][1] = firstlinenum
120 groupednodes = [[firstlinenum, []]]
121 prevlinenum = firstlinenum
122 for hexnode, linenum in tagnodes:
123 if linenum is not None and linenum - prevlinenum > 1:
124 groupednodes.append([linenum, []])
125 groupednodes[-1][1].append(hexnode)
126 if linenum is not None:
127 prevlinenum = linenum
128 return groupednodes
129
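The grouping logic above can be exercised outside Mercurial; this is a self-contained copy of the function with short made-up node IDs in place of real 40-character hashes:

```python
def grouptagnodesbyline(tagnodes):
    # tagnodes is a list of [hexnode, linenum] pairs; linenum is the
    # node's line in the first parent's .hgtags file, or None for nodes
    # that came from the base or the second parent.
    firstlinenum = None
    for hexnode, linenum in tagnodes:
        firstlinenum = linenum
        if firstlinenum is not None:
            break
    if firstlinenum is None:
        # no node has a position preference: a single unanchored group
        return [[None, [el[0] for el in tagnodes]]]
    tagnodes[0][1] = firstlinenum
    groupednodes = [[firstlinenum, []]]
    prevlinenum = firstlinenum
    for hexnode, linenum in tagnodes:
        if linenum is not None and linenum - prevlinenum > 1:
            # a gap in the p1 line numbers starts a new group
            groupednodes.append([linenum, []])
        groupednodes[-1][1].append(hexnode)
        if linenum is not None:
            prevlinenum = linenum
    return groupednodes

# nodes on p1 lines 1 and 5, plus one node from p2 (no position):
print(grouptagnodesbyline([['aaa', 1], ['bbb', None], ['ccc', 5]]))
# -> [[1, ['aaa', 'bbb']], [5, ['ccc']]]
```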
130 def writemergedtags(repo, mergedtags):
131 '''
132 write the merged tags while trying to minimize the diff to the first parent
133
134 This function uses the ordering info stored on the merged tags dict to
135 generate an .hgtags file which is correct (in the sense that its contents
136 correspond to the result of the tag merge) while also being as close as
137 possible to the first parent's .hgtags file.
138 '''
139 # group the node-tag pairs that must be written next to each other
140 for tname, taglist in mergedtags.items():
141 mergedtags[tname] = grouptagnodesbyline(taglist)
142
143 # convert the grouped merged tags dict into a format that resembles the
144 # final .hgtags file (i.e. a list of blocks of 'node tag' pairs)
145 def taglist2string(tlist, tname):
146 return '\n'.join(['%s %s' % (hexnode, tname) for hexnode in tlist])
147
148 finaltags = []
149 for tname, tags in mergedtags.items():
150 for block in tags:
151 block[1] = taglist2string(block[1], tname)
152 finaltags += tags
153
154 # the tag groups are linked to a "position" that can be used to sort them
155 # before writing them
156 # the position is calculated to ensure that the diff of the merged .hgtags
157 # file to the first parent's .hgtags file is as small as possible
158 finaltags.sort(key=operator.itemgetter(0))
159
160 # finally we can join the sorted groups to get the final contents of the
161 # merged .hgtags file, and then write it to disk
162 mergedtagstring = '\n'.join([tags for rank, tags in finaltags if tags])
163 fp = repo.wfile('.hgtags', 'wb')
164 fp.write(mergedtagstring + '\n')
165 fp.close()
166
167 def singletagmerge(p1nodes, p2nodes):
168 '''
169 merge the nodes corresponding to a single tag
170
171 Note that the inputs are lists of node-linenum pairs (i.e. not just lists
172 of nodes)
173 '''
174 if not p2nodes:
175 return p1nodes
176 if not p1nodes:
177 return p2nodes
178
179 # there is no conflict unless both tags point to different revisions
180 # and have a non identical tag history
181 p1currentnode = p1nodes[-1][0]
182 p2currentnode = p2nodes[-1][0]
183 if p1currentnode != p2currentnode and len(p1nodes) == len(p2nodes):
184 # cannot merge two tags with same rank pointing to different nodes
185 return None
186
187 # which are the highest ranking (hr) / lowest ranking (lr) nodes?
188 if len(p1nodes) >= len(p2nodes):
189 hrnodes, lrnodes = p1nodes, p2nodes
190 else:
191 hrnodes, lrnodes = p2nodes, p1nodes
192
193 # the lowest ranking nodes will be written first, followed by the highest
194 # ranking nodes
195 # to avoid unwanted tag rank explosion we try to see if there are some
196 # common nodes that can be written only once
197 commonidx = len(lrnodes)
198 for n in range(len(lrnodes)):
199 if hrnodes[n][0] != lrnodes[n][0]:
200 commonidx = n
201 break
202 lrnodes[n][1] = p1nodes[n][1]
203
204 # the merged node list has 3 parts:
205 # - common nodes
206 # - non common lowest ranking nodes
207 # - non common highest ranking nodes
208 # note that the common nodes plus the non common lowest ranking nodes is the
209 # whole list of lr nodes
210 return lrnodes + hrnodes[commonidx:]
211
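Extracted as a standalone sketch, the function above can be run directly. Node IDs are again made up; line numbers either come from p1 or are None:

```python
def singletagmerge(p1nodes, p2nodes):
    # each argument is a list of [hexnode, linenum] pairs for one tag
    if not p2nodes:
        return p1nodes
    if not p1nodes:
        return p2nodes
    # conflict only when the tips differ AND the ranks are identical
    if p1nodes[-1][0] != p2nodes[-1][0] and len(p1nodes) == len(p2nodes):
        return None
    # the longer (higher ranking) history wins; p1 wins a draw
    if len(p1nodes) >= len(p2nodes):
        hrnodes, lrnodes = p1nodes, p2nodes
    else:
        hrnodes, lrnodes = p2nodes, p1nodes
    # write common nodes only once to avoid tag rank explosion
    commonidx = len(lrnodes)
    for n in range(len(lrnodes)):
        if hrnodes[n][0] != lrnodes[n][0]:
            commonidx = n
            break
        lrnodes[n][1] = p1nodes[n][1]
    return lrnodes + hrnodes[commonidx:]

# p2 extended the tag history by one node: no conflict, p2's tip wins
print(singletagmerge([['aaa', 1], ['bbb', 2]],
                     [['aaa', None], ['bbb', None], ['ccc', None]]))
# -> [['aaa', 1], ['bbb', 2], ['ccc', None]]
```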
212 def merge(repo, fcd, fco, fca):
213 '''
214 Merge the tags of two revisions, taking into account the base tags
215 Try to minimize the diff between the merged tags and the first parent tags
216 '''
217 ui = repo.ui
218 # read the p1, p2 and base tags
219 # only keep the line numbers for the p1 tags
220 p1tags = readtagsformerge(
221 ui, repo, fcd.data().splitlines(), fn="p1 tags",
222 keeplinenums=True)
223 p2tags = readtagsformerge(
224 ui, repo, fco.data().splitlines(), fn="p2 tags",
225 keeplinenums=False)
226 basetags = readtagsformerge(
227 ui, repo, fca.data().splitlines(), fn="base tags",
228 keeplinenums=False)
229
230 # recover the list of "lost tags" (i.e. those that were found on the base
231 # revision but not on one of the revisions being merged)
232 basetagset = set(basetags)
233 for n, pntags in enumerate((p1tags, p2tags)):
234 pntagset = set(pntags)
235 pnlosttagset = basetagset - pntagset
236 for t in pnlosttagset:
237 pntags[t] = basetags[t]
238 if pntags[t][-1][0] != hexnullid:
239 pntags[t].append([hexnullid, None])
240
241 conflictedtags = [] # for reporting purposes
242 mergedtags = util.sortdict(p1tags)
243 # sortdict does not implement iteritems()
244 for tname, p2nodes in p2tags.items():
245 if tname not in mergedtags:
246 mergedtags[tname] = p2nodes
247 continue
248 p1nodes = mergedtags[tname]
249 mergednodes = singletagmerge(p1nodes, p2nodes)
250 if mergednodes is None:
251 conflictedtags.append(tname)
252 continue
253 mergedtags[tname] = mergednodes
254
255 if conflictedtags:
256 numconflicts = len(conflictedtags)
257 ui.warn(_('automatic .hgtags merge failed\n'
258 'the following %d tags are in conflict: %s\n')
259 % (numconflicts, ', '.join(sorted(conflictedtags))))
260 return True, 1
261
262 writemergedtags(repo, mergedtags)
263 ui.note(_('.hgtags merged successfully\n'))
264 return False, 0
265
NO CONTENT: new file 100644
@@ -25,7 +25,9 b' tests/*.err'
 tests/htmlcov
 build
 contrib/hgsh/hgsh
+contrib/vagrant/.vagrant
 dist
+packages
 doc/common.txt
 doc/*.[0-9]
 doc/*.[0-9].txt
@@ -132,6 +132,37 b' i18n/hg.pot: $(PYFILES) $(DOCFILES) i18n'
 	msgmerge --no-location --update $@.tmp $^
 	mv -f $@.tmp $@
 
+# Packaging targets
+
+osx:
+	@which -s bdist_mpkg || \
+	   (echo "Missing bdist_mpkg (easy_install bdist_mpkg)"; false)
+	bdist_mpkg setup.py
+	mkdir -p packages/osx
+	rm -rf dist/mercurial-*.mpkg
+	mv dist/mercurial*macosx*.zip packages/osx
+
+fedora:
+	mkdir -p packages/fedora
+	contrib/buildrpm
+	cp rpmbuild/RPMS/*/* packages/fedora
+	cp rpmbuild/SRPMS/* packages/fedora
+	rm -rf rpmbuild
+
+docker-fedora:
+	mkdir -p packages/fedora
+	contrib/dockerrpm fedora
+
+centos6:
+	mkdir -p packages/centos6
+	contrib/buildrpm
+	cp rpmbuild/RPMS/*/* packages/centos6
+	cp rpmbuild/SRPMS/* packages/centos6
+
+docker-centos6:
+	mkdir -p packages/centos6
+	contrib/dockerrpm centos6
+
 .PHONY: help all local build doc clean install install-bin install-doc \
 	install-home install-home-bin install-home-doc dist dist-notests tests \
-	update-pot
+	update-pot fedora docker-fedora
@@ -629,7 +629,7 b' complete -o bashdefault -o default -o no'
 
 _hg_cmd_shelve()
 {
-    if [[ "$prev" = @(-d|--delete) ]]; then
+    if [[ "$prev" = @(-d|--delete|-l|--list) ]]; then
         _hg_shelves
     else
         _hg_status "mard"
@@ -1,16 +1,13 b''
-#!/bin/sh
+#!/bin/sh -e
 #
-# Build a Mercurial RPM in place.
+# Build a Mercurial RPM from the current repo
 #
 # Tested on
-# - Fedora 8 (with docutils 0.5)
-# - Fedora 11
-# - OpenSuse 11.2
+# - Fedora 20
+# - CentOS 5
+# - centOS 6
 
 cd "`dirname $0`/.."
-HG="$PWD/hg"
-PYTHONPATH="$PWD/mercurial/pure"
-export PYTHONPATH
 
 specfile=contrib/mercurial.spec
 if [ ! -f $specfile ]; then
@@ -23,21 +20,17 b' if [ ! -d .hg ]; then'
     exit 1
 fi
 
-if $HG id -i | grep '+$' > /dev/null 2>&1; then
-    echo -n "Your local changes will NOT be in the RPM. Continue [y/n] ? "
-    read answer
-    if echo $answer | grep -iv '^y'; then
-        exit
-    fi
-fi
+# build local hg and use it
+python setup.py build_py -c -d .
+HG="$PWD/hg"
+PYTHONPATH="$PWD/mercurial/pure"
+export PYTHONPATH
 
 rpmdir="$PWD/rpmbuild"
 
 rm -rf $rpmdir
 mkdir -p $rpmdir/SOURCES $rpmdir/SPECS $rpmdir/RPMS $rpmdir/SRPMS $rpmdir/BUILD
 
-# make setup.py build the version string
-python setup.py build_py -c -d .
 hgversion=`$HG version | sed -ne 's/.*(version \(.*\))$/\1/p'`
 
 if echo $hgversion | grep -- '-' > /dev/null 2>&1; then
@@ -50,8 +43,8 b' else'
     release='0'
 fi
 
-$HG archive -t tgz $rpmdir/SOURCES/mercurial-$version.tar.gz
-rpmspec=$rpmdir/SPECS/mercurial-$version.spec
+$HG archive -t tgz $rpmdir/SOURCES/mercurial-$version-$release.tar.gz
+rpmspec=$rpmdir/SPECS/mercurial.spec
 
 sed -e "s,^Version:.*,Version: $version," \
     -e "s,^Release:.*,Release: $release," \
@@ -247,8 +247,6 b' pypats = ['
    (r'^\s*os\.path\.relpath', "relpath not available in Python 2.4"),
    (r'(?<!def)\s+(any|all|format)\(',
     "any/all/format not available in Python 2.4", 'no-py24'),
-    (r'(?<!def)\s+(callable)\(',
-     "callable not available in Python 3, use getattr(f, '__call__', None)"),
    (r'if\s.*\selse', "if ... else form not available in Python 2.4"),
    (r'^\s*(%s)\s\s' % '|'.join(keyword.kwlist),
     "gratuitous whitespace after Python keyword"),
@@ -367,16 +365,28 b' inrevlogpats = ['
   []
 ]
 
+webtemplatefilters = []
+
+webtemplatepats = [
+  [],
+  [
+    (r'{desc(\|(?!websub|firstline)[^\|]*)+}',
+     'follow desc keyword with either firstline or websub'),
+  ]
+]
+
 checks = [
-  ('python', r'.*\.(py|cgi)$', pyfilters, pypats),
-  ('test script', r'(.*/)?test-[^.~]*$', testfilters, testpats),
-  ('c', r'.*\.[ch]$', cfilters, cpats),
-  ('unified test', r'.*\.t$', utestfilters, utestpats),
-  ('layering violation repo in revlog', r'mercurial/revlog\.py', pyfilters,
-   inrevlogpats),
-  ('layering violation ui in util', r'mercurial/util\.py', pyfilters,
-   inutilpats),
-  ('txt', r'.*\.txt$', txtfilters, txtpats),
+  ('python', r'.*\.(py|cgi)$', r'^#!.*python', pyfilters, pypats),
+  ('test script', r'(.*/)?test-[^.~]*$', '', testfilters, testpats),
+  ('c', r'.*\.[ch]$', '', cfilters, cpats),
+  ('unified test', r'.*\.t$', '', utestfilters, utestpats),
+  ('layering violation repo in revlog', r'mercurial/revlog\.py', '',
+   pyfilters, inrevlogpats),
+  ('layering violation ui in util', r'mercurial/util\.py', '', pyfilters,
+   inutilpats),
+  ('txt', r'.*\.txt$', '', txtfilters, txtpats),
+  ('web template', r'mercurial/templates/.*\.tmpl', '',
+   webtemplatefilters, webtemplatepats),
 ]
 
 def _preparepats():
@@ -392,7 +402,7 b' def _preparepats():'
            p = re.sub(r'(?<!\\)\[\^', r'[^\\n', p)
 
            pats[i] = (re.compile(p, re.MULTILINE),) + pseq[1:]
-        filters = c[2]
+        filters = c[3]
        for i, flt in enumerate(filters):
            filters[i] = re.compile(flt[0]), flt[1]
 _preparepats()
@@ -446,22 +456,24 b' def checkfile(f, logfunc=_defaultlogger.'
     """
     blamecache = None
     result = True
-    for name, match, filters, pats in checks:
+
+    try:
+        fp = open(f)
+    except IOError, e:
+        print "Skipping %s, %s" % (f, str(e).split(':', 1)[0])
+        return result
+    pre = post = fp.read()
+    fp.close()
+
+    for name, match, magic, filters, pats in checks:
         if debug:
             print name, f
         fc = 0
-        if not re.match(match, f):
+        if not (re.match(match, f) or (magic and re.search(magic, f))):
             if debug:
                 print "Skipping %s for %s it doesn't match %s" % (
                        name, match, f)
             continue
-        try:
-            fp = open(f)
-        except IOError, e:
-            print "Skipping %s, %s" % (f, str(e).split(':', 1)[0])
-            continue
-        pre = post = fp.read()
-        fp.close()
         if "no-" "check-code" in pre:
             print "Skipping %s it has no-" "check-code" % f
             return "Skip" # skip checking this file
@@ -4,6 +4,10 b''
 import sys
 import mercurial
 import code
+from mercurial import cmdutil
+
+cmdtable = {}
+command = cmdutil.command(cmdtable)
 
 def pdb(ui, repo, msg, **opts):
     objects = {
@@ -24,6 +28,7 b' def ipdb(ui, repo, msg, **opts):'
 
     IPython.embed()
 
+@command('debugshell|dbsh', [])
 def debugshell(ui, repo, **opts):
     bannermsg = "loaded repo : %s\n" \
                 "using source: %s" % (repo.root,
@@ -47,7 +52,3 b' def debugshell(ui, repo, **opts):'
         debugger = 'pdb'
 
     getattr(sys.modules[__name__], debugger)(ui, repo, bannermsg, **opts)
-
-cmdtable = {
-    "debugshell|dbsh": (debugshell, [])
-}
@@ -12,10 +12,10 b' from lib2to3.pygram import python_symbol'
 # XXX: Implementing a blacklist in 2to3 turned out to be more troublesome than
 # blacklisting some modules inside the fixers. So, this is what I came with.
 
-blacklist = ['mercurial/demandimport.py',
+blacklist = ('mercurial/demandimport.py',
              'mercurial/py3kcompat.py', # valid python 3 already
              'mercurial/i18n.py',
-             ]
+             )
 
 def isdocstring(node):
     def isclassorfunction(ancestor):
@@ -83,7 +83,8 b' class FixBytes(fixer_base.BaseFix):'
     PATTERN = 'STRING'
 
     def transform(self, node, results):
-        if self.filename in blacklist:
+        # The filename may be prefixed with a build directory.
+        if self.filename.endswith(blacklist):
             return
         if node.type == token.STRING:
             if _re.match(node.value):
@@ -1,3 +1,6 b''
+%global emacs_lispdir %{_datadir}/emacs/site-lisp
+%global pythonver %(python -c 'import sys;print ".".join(map(str, sys.version_info[:2]))')
+
 Summary: A fast, lightweight Source Control Management system
 Name: mercurial
 Version: snapshot
@@ -5,33 +8,21 b' Release: 0'
 License: GPLv2+
 Group: Development/Tools
 URL: http://mercurial.selenic.com/
-Source0: http://mercurial.selenic.com/release/%{name}-%{version}.tar.gz
+Source0: %{name}-%{version}-%{release}.tar.gz
 BuildRoot: %{_tmppath}/%{name}-%{version}-%{release}-root
 
-# From the README:
-#
-# Note: some distributions fails to include bits of distutils by
-# default, you'll need python-dev to install. You'll also need a C
-# compiler and a 3-way merge tool like merge, tkdiff, or kdiff3.
-#
-# python-devel provides an adequate python-dev. The merge tool is a
-# run-time dependency.
-#
 BuildRequires: python >= 2.4, python-devel, make, gcc, python-docutils >= 0.5, gettext
 Provides: hg = %{version}-%{release}
 Requires: python >= 2.4
 # The hgk extension uses the wish tcl interpreter, but we don't enforce it
 #Requires: tk
 
-%define pythonver %(python -c 'import sys;print ".".join(map(str, sys.version_info[:2]))')
-%define emacs_lispdir %{_datadir}/emacs/site-lisp
-
 %description
 Mercurial is a fast, lightweight source control management system designed
 for efficient handling of very large distributed projects.
 
 %prep
-%setup -q
+%setup -q -n mercurial-%{version}-%{release}
 
 %build
 make all
@@ -40,8 +31,8 b' make all'
 rm -rf $RPM_BUILD_ROOT
 make install DESTDIR=$RPM_BUILD_ROOT PREFIX=%{_prefix} MANDIR=%{_mandir}
 
-install -m 755 contrib/hgk $RPM_BUILD_ROOT%{_bindir}
-install -m 755 contrib/hg-ssh $RPM_BUILD_ROOT%{_bindir}
+install -m 755 contrib/hgk $RPM_BUILD_ROOT%{_bindir}/
+install -m 755 contrib/hg-ssh $RPM_BUILD_ROOT%{_bindir}/
 
 bash_completion_dir=$RPM_BUILD_ROOT%{_sysconfdir}/bash_completion.d
 mkdir -p $bash_completion_dir
@@ -52,8 +43,8 b' mkdir -p $zsh_completion_dir'
 install -m 644 contrib/zsh_completion $zsh_completion_dir/_mercurial
 
 mkdir -p $RPM_BUILD_ROOT%{emacs_lispdir}
-install -m 644 contrib/mercurial.el $RPM_BUILD_ROOT%{emacs_lispdir}
-install -m 644 contrib/mq.el $RPM_BUILD_ROOT%{emacs_lispdir}
+install -m 644 contrib/mercurial.el $RPM_BUILD_ROOT%{emacs_lispdir}/
+install -m 644 contrib/mq.el $RPM_BUILD_ROOT%{emacs_lispdir}/
 
 mkdir -p $RPM_BUILD_ROOT/%{_sysconfdir}/mercurial/hgrc.d
 install -m 644 contrib/mergetools.hgrc $RPM_BUILD_ROOT%{_sysconfdir}/mercurial/hgrc.d/mergetools.rc
@@ -14,6 +14,7 b' gvimdiff.regkey=Software\\Vim\\GVim'
 gvimdiff.regkeyalt=Software\Wow6432Node\Vim\GVim
 gvimdiff.regname=path
 gvimdiff.priority=-9
+gvimdiff.diffargs=--nofork -d -g -O $parent $child
 
 vimdiff.args=$local $other $base -c 'redraw | echomsg "hg merge conflict, type \":cq\" to abort vimdiff"'
 vimdiff.check=changed
@@ -71,6 +72,12 b' ecmerge.regkeyalt=Software\\Wow6432Node\\E'
 ecmerge.gui=True
 ecmerge.diffargs=$parent $child --mode=diff2 --title1='$plabel1' --title2='$clabel'
 
+# editmerge is a small script shipped in contrib.
+# It needs this config otherwise it behaves the same as internal:local
+editmerge.args=$output
+editmerge.check=changed
+editmerge.premerge=keep
+
 filemerge.executable=/Developer/Applications/Utilities/FileMerge.app/Contents/MacOS/FileMerge
 filemerge.args=-left $other -right $local -ancestor $base -merge $output
 filemerge.gui=True
@@ -14,7 +14,12 b''
 # to compare performance.
 
 import sys
+import os
 from subprocess import check_call, Popen, CalledProcessError, STDOUT, PIPE
+# cannot use argparse, python 2.7 only
+from optparse import OptionParser
+
+
 
 def check_output(*args, **kwargs):
     kwargs.setdefault('stderr', PIPE)
@@ -33,15 +38,19 b' def update(rev):'
     print >> sys.stderr, 'update to revision %s failed, aborting' % rev
     sys.exit(exc.returncode)
 
-def perf(revset):
+def perf(revset, target=None):
     """run benchmark for this very revset"""
     try:
-        output = check_output(['./hg',
-                               '--config',
-                               'extensions.perf=contrib/perf.py',
-                               'perfrevset',
-                               revset],
-                              stderr=STDOUT)
+        cmd = ['./hg',
+               '--config',
+               'extensions.perf='
+               + os.path.join(contribdir, 'perf.py'),
+               'perfrevset',
+               revset]
+        if target is not None:
+            cmd.append('-R')
+            cmd.append(target)
+        output = check_output(cmd, stderr=STDOUT)
         output = output.lstrip('!') # remove useless ! in this context
         return output.strip()
     except CalledProcessError, exc:
@@ -65,12 +74,26 b' def getrevs(spec):'
     return [r for r in out.split() if r]
 
 
+parser = OptionParser(usage="usage: %prog [options] <revs>")
+parser.add_option("-f", "--file",
+                  help="read revset from FILE", metavar="FILE")
+parser.add_option("-R", "--repo",
+                  help="run benchmark on REPO", metavar="REPO")
 
-target_rev = sys.argv[1]
+(options, args) = parser.parse_args()
+
+if len(sys.argv) < 2:
+    parser.print_help()
+    sys.exit(255)
+
+# the directory where both this script and the perf.py extension live.
+contribdir = os.path.dirname(__file__)
+
+target_rev = args[0]
 
 revsetsfile = sys.stdin
-if len(sys.argv) > 2:
-    revsetsfile = open(sys.argv[2])
+if options.file:
+    revsetsfile = open(options.file)
 
 revsets = [l.strip() for l in revsetsfile]
 
@@ -95,7 +118,7 b' for r in revs:'
95 res = []
118 res = []
96 results.append(res)
119 results.append(res)
97 for idx, rset in enumerate(revsets):
120 for idx, rset in enumerate(revsets):
98 data = perf(rset)
121 data = perf(rset, target=options.repo)
99 res.append(data)
122 res.append(data)
100 print "%i)" % idx, data
123 print "%i)" % idx, data
101 sys.stdout.flush()
124 sys.stdout.flush()
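The hunks above migrate contrib/revsetbenchmarks.py from positional `sys.argv` parsing to `optparse`, adding `-f/--file` and `-R/--repo`. A minimal sketch of that option-handling pattern; the option names match the diff, but the sample argv is made up for illustration:

```python
from optparse import OptionParser

# Illustrative re-creation of the option handling added above; parsing an
# explicit argv list makes the behaviour easy to demonstrate.
parser = OptionParser(usage="usage: %prog [options] <revs>")
parser.add_option("-f", "--file",
                  help="read revset from FILE", metavar="FILE")
parser.add_option("-R", "--repo",
                  help="run benchmark on REPO", metavar="REPO")

# hypothetical invocation: benchmark revsets against /tmp/repo
options, args = parser.parse_args(["-R", "/tmp/repo", "all()"])
```

Unrecognized positional arguments land in `args`, which is why the script can keep taking the target revision positionally while options move to flags.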
@@ -2,11 +2,13 b' all()'
2 draft()
2 draft()
3 ::tip
3 ::tip
4 draft() and ::tip
4 draft() and ::tip
5 ::tip and draft()
5 0::tip
6 0::tip
6 roots(0::tip)
7 roots(0::tip)
7 author(lmoscovicz)
8 author(lmoscovicz)
8 author(mpm)
9 author(mpm)
9 author(lmoscovicz) or author(mpm)
10 author(lmoscovicz) or author(mpm)
11 author(mpm) or author(lmoscovicz)
10 tip:0
12 tip:0
11 max(tip:0)
13 max(tip:0)
12 min(0:tip)
14 min(0:tip)
@@ -18,3 +20,4 b' public()'
18 :10000 and public()
20 :10000 and public()
19 draft()
21 draft()
20 :10000 and draft()
22 :10000 and draft()
23 max(::(tip~20) - obsolete())
@@ -334,7 +334,7 b' def synthesize(ui, repo, descpath, **opt'
334 for __ in xrange(add):
334 for __ in xrange(add):
335 lines.insert(random.randint(0, len(lines)), makeline())
335 lines.insert(random.randint(0, len(lines)), makeline())
336 path = fctx.path()
336 path = fctx.path()
337 changes[path] = context.memfilectx(path,
337 changes[path] = context.memfilectx(repo, path,
338 '\n'.join(lines) + '\n')
338 '\n'.join(lines) + '\n')
339 for __ in xrange(pick(filesremoved)):
339 for __ in xrange(pick(filesremoved)):
340 path = random.choice(mfk)
340 path = random.choice(mfk)
@@ -354,7 +354,7 b' def synthesize(ui, repo, descpath, **opt'
354 path = '/'.join(filter(None, path))
354 path = '/'.join(filter(None, path))
355 data = '\n'.join(makeline()
355 data = '\n'.join(makeline()
356 for __ in xrange(pick(linesinfilesadded))) + '\n'
356 for __ in xrange(pick(linesinfilesadded))) + '\n'
357 changes[path] = context.memfilectx(path, data)
357 changes[path] = context.memfilectx(repo, path, data)
358 def filectxfn(repo, memctx, path):
358 def filectxfn(repo, memctx, path):
359 data = changes[path]
359 data = changes[path]
360 if data is None:
360 if data is None:
@@ -105,7 +105,7 b' function! s:HGChangeToCurrentFileDir(fil'
105 let fileName=<SID>HGResolveLink(a:fileName)
105 let fileName=<SID>HGResolveLink(a:fileName)
106 let newCwd=fnamemodify(fileName, ':h')
106 let newCwd=fnamemodify(fileName, ':h')
107 if strlen(newCwd) > 0
107 if strlen(newCwd) > 0
108 execute 'cd' escape(newCwd, ' ')
108 try | execute 'cd' escape(newCwd, ' ') | catch | | endtry
109 endif
109 endif
110 return oldCwd
110 return oldCwd
111 endfunction
111 endfunction
@@ -396,7 +396,7 b' function! s:HGGetStatusVars(revisionVar,'
396
396
397 return returnExpression
397 return returnExpression
398 finally
398 finally
399 execute 'cd' escape(oldCwd, ' ')
399 try | execute 'cd' escape(oldCwd, ' ') | catch | | endtry
400 endtry
400 endtry
401 endfunction
401 endfunction
402
402
@@ -14,7 +14,6 b' from mercurial.commands import table'
14 from mercurial.help import helptable
14 from mercurial.help import helptable
15 from mercurial import extensions
15 from mercurial import extensions
16 from mercurial import minirst
16 from mercurial import minirst
17 from mercurial import util
18
17
19 _verbose = False
18 _verbose = False
20
19
@@ -87,7 +86,7 b' def checkcmdtable(cmdtable, namefmt, ini'
87 def checkhghelps():
86 def checkhghelps():
88 errorcnt = 0
87 errorcnt = 0
89 for names, sec, doc in helptable:
88 for names, sec, doc in helptable:
90 if util.safehasattr(doc, '__call__'):
89 if callable(doc):
91 doc = doc()
90 doc = doc()
92 errorcnt += checkseclevel(doc,
91 errorcnt += checkseclevel(doc,
93 '%s help topic' % names[0],
92 '%s help topic' % names[0],
@@ -14,7 +14,6 b' from mercurial.commands import table, gl'
14 from mercurial.i18n import gettext, _
14 from mercurial.i18n import gettext, _
15 from mercurial.help import helptable, loaddoc
15 from mercurial.help import helptable, loaddoc
16 from mercurial import extensions
16 from mercurial import extensions
17 from mercurial import util
18
17
19 def get_desc(docstr):
18 def get_desc(docstr):
20 if not docstr:
19 if not docstr:
@@ -137,7 +136,7 b' def helpprinter(ui, helptable, sectionfu'
137 ui.write("\n")
136 ui.write("\n")
138 if sectionfunc:
137 if sectionfunc:
139 ui.write(sectionfunc(sec))
138 ui.write(sectionfunc(sec))
140 if util.safehasattr(doc, '__call__'):
139 if callable(doc):
141 doc = doc()
140 doc = doc()
142 ui.write(doc)
141 ui.write(doc)
143 ui.write("\n")
142 ui.write("\n")
@@ -10,6 +10,11 b''
10 import os
10 import os
11 import sys
11 import sys
12
12
13 if os.environ.get('HGUNICODEPEDANTRY', False):
14 reload(sys)
15 sys.setdefaultencoding("undefined")
16
17
13 libdir = '@LIBDIR@'
18 libdir = '@LIBDIR@'
14
19
15 if libdir != '@' 'LIBDIR' '@':
20 if libdir != '@' 'LIBDIR' '@':
@@ -14,12 +14,20 b' This extension is deprecated. You should'
14 "children(REV)"` instead.
14 "children(REV)"` instead.
15 '''
15 '''
16
16
17 from mercurial import cmdutil, commands
17 from mercurial import cmdutil
18 from mercurial.commands import templateopts
18 from mercurial.commands import templateopts
19 from mercurial.i18n import _
19 from mercurial.i18n import _
20
20
21 cmdtable = {}
22 command = cmdutil.command(cmdtable)
21 testedwith = 'internal'
23 testedwith = 'internal'
22
24
25 @command('children',
26 [('r', 'rev', '',
27 _('show children of the specified revision'), _('REV')),
28 ] + templateopts,
29 _('hg children [-r REV] [FILE]'),
30 inferrepo=True)
23 def children(ui, repo, file_=None, **opts):
31 def children(ui, repo, file_=None, **opts):
24 """show the children of the given or working directory revision
32 """show the children of the given or working directory revision
25
33
@@ -39,14 +47,3 b' def children(ui, repo, file_=None, **opt'
39 for cctx in ctx.children():
47 for cctx in ctx.children():
40 displayer.show(cctx)
48 displayer.show(cctx)
41 displayer.close()
49 displayer.close()
42
43 cmdtable = {
44 "children":
45 (children,
46 [('r', 'rev', '',
47 _('show children of the specified revision'), _('REV')),
48 ] + templateopts,
49 _('hg children [-r REV] [FILE]')),
50 }
51
52 commands.inferrepo += " children"
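Many hunks in this changeset replace hand-built `cmdtable` dicts (plus `commands.inferrepo`/`commands.norepo` string munging) with the `cmdutil.command` decorator. A simplified stand-in for that registration pattern, not Mercurial's actual implementation; the real decorator also records the `norepo`/`inferrepo` flags that are merely swallowed here:

```python
def commandfactory(cmdtable):
    """Simplified stand-in for mercurial.cmdutil.command: returns a
    decorator that registers a function into the given command table."""
    def command(name, options=(), synopsis=None, **flags):
        # the real implementation stores flags like inferrepo=True;
        # this sketch ignores them
        def decorator(func):
            cmdtable[name] = (func, list(options), synopsis)
            return func
        return decorator
    return command

cmdtable = {}
command = commandfactory(cmdtable)

@command('children',
         [('r', 'rev', '', 'show children of the specified revision', 'REV')],
         'hg children [-r REV] [FILE]',
         inferrepo=True)
def children(ui, repo, file_=None, **opts):
    """show the children of the given or working directory revision"""
```

The net effect is the same table entry as the old dict literal, but declared next to the function it registers.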
@@ -14,6 +14,8 b' from mercurial import encoding'
14 import os
14 import os
15 import time, datetime
15 import time, datetime
16
16
17 cmdtable = {}
18 command = cmdutil.command(cmdtable)
17 testedwith = 'internal'
19 testedwith = 'internal'
18
20
19 def maketemplater(ui, repo, tmpl):
21 def maketemplater(ui, repo, tmpl):
@@ -88,6 +90,22 b' def countrate(ui, repo, amap, *pats, **o'
88 return rate
90 return rate
89
91
90
92
93 @command('churn',
94 [('r', 'rev', [],
95 _('count rate for the specified revision or range'), _('REV')),
96 ('d', 'date', '',
97 _('count rate for revisions matching date spec'), _('DATE')),
98 ('t', 'template', '{author|email}',
99 _('template to group changesets'), _('TEMPLATE')),
100 ('f', 'dateformat', '',
101 _('strftime-compatible format for grouping by date'), _('FORMAT')),
102 ('c', 'changesets', False, _('count rate by number of changesets')),
103 ('s', 'sort', False, _('sort by key (default: sort by count)')),
104 ('', 'diffstat', False, _('display added/removed lines separately')),
105 ('', 'aliases', '', _('file with email aliases'), _('FILE')),
106 ] + commands.walkopts,
107 _("hg churn [-d DATE] [-r REV] [--aliases FILE] [FILE]"),
108 inferrepo=True)
91 def churn(ui, repo, *pats, **opts):
109 def churn(ui, repo, *pats, **opts):
92 '''histogram of changes to the repository
110 '''histogram of changes to the repository
93
111
@@ -180,26 +198,3 b' def churn(ui, repo, *pats, **opts):'
180
198
181 for name, count in rate:
199 for name, count in rate:
182 ui.write(format(name, count))
200 ui.write(format(name, count))
183
184
185 cmdtable = {
186 "churn":
187 (churn,
188 [('r', 'rev', [],
189 _('count rate for the specified revision or range'), _('REV')),
190 ('d', 'date', '',
191 _('count rate for revisions matching date spec'), _('DATE')),
192 ('t', 'template', '{author|email}',
193 _('template to group changesets'), _('TEMPLATE')),
194 ('f', 'dateformat', '',
195 _('strftime-compatible format for grouping by date'), _('FORMAT')),
196 ('c', 'changesets', False, _('count rate by number of changesets')),
197 ('s', 'sort', False, _('sort by key (default: sort by count)')),
198 ('', 'diffstat', False, _('display added/removed lines separately')),
199 ('', 'aliases', '',
200 _('file with email aliases'), _('FILE')),
201 ] + commands.walkopts,
202 _("hg churn [-d DATE] [-r REV] [--aliases FILE] [FILE]")),
203 }
204
205 commands.inferrepo += " churn"
@@ -111,10 +111,12 b' disable color.'
111
111
112 import os
112 import os
113
113
114 from mercurial import commands, dispatch, extensions, ui as uimod, util
114 from mercurial import cmdutil, commands, dispatch, extensions, ui as uimod, util
115 from mercurial import templater, error
115 from mercurial import templater, error
116 from mercurial.i18n import _
116 from mercurial.i18n import _
117
117
118 cmdtable = {}
119 command = cmdutil.command(cmdtable)
118 testedwith = 'internal'
120 testedwith = 'internal'
119
121
120 # start and stop parameters for effects
122 # start and stop parameters for effects
@@ -166,7 +168,7 b' def _terminfosetup(ui, mode):'
166 def _modesetup(ui, coloropt):
168 def _modesetup(ui, coloropt):
167 global _terminfo_params
169 global _terminfo_params
168
170
169 auto = coloropt == 'auto'
171 auto = (coloropt == 'auto')
170 always = not auto and util.parsebool(coloropt)
172 always = not auto and util.parsebool(coloropt)
171 if not always and not auto:
173 if not always and not auto:
172 return None
174 return None
@@ -440,6 +442,7 b' def extsetup(ui):'
440 _("when to colorize (boolean, always, auto, or never)"),
442 _("when to colorize (boolean, always, auto, or never)"),
441 _('TYPE')))
443 _('TYPE')))
442
444
445 @command('debugcolor', [], 'hg debugcolor')
443 def debugcolor(ui, repo, **opts):
446 def debugcolor(ui, repo, **opts):
444 global _styles
447 global _styles
445 _styles = {}
448 _styles = {}
@@ -579,8 +582,3 b' else:'
579 finally:
582 finally:
580 # Explicitly reset original attributes
583 # Explicitly reset original attributes
581 _kernel32.SetConsoleTextAttribute(stdout, origattr)
584 _kernel32.SetConsoleTextAttribute(stdout, origattr)
582
583 cmdtable = {
584 'debugcolor':
585 (debugcolor, [], ('hg debugcolor'))
586 }
@@ -10,13 +10,35 b''
10 import convcmd
10 import convcmd
11 import cvsps
11 import cvsps
12 import subversion
12 import subversion
13 from mercurial import commands, templatekw
13 from mercurial import cmdutil, templatekw
14 from mercurial.i18n import _
14 from mercurial.i18n import _
15
15
16 cmdtable = {}
17 command = cmdutil.command(cmdtable)
16 testedwith = 'internal'
18 testedwith = 'internal'
17
19
18 # Commands definition was moved elsewhere to ease demandload job.
20 # Commands definition was moved elsewhere to ease demandload job.
19
21
22 @command('convert',
23 [('', 'authors', '',
24 _('username mapping filename (DEPRECATED, use --authormap instead)'),
25 _('FILE')),
26 ('s', 'source-type', '', _('source repository type'), _('TYPE')),
27 ('d', 'dest-type', '', _('destination repository type'), _('TYPE')),
28 ('r', 'rev', '', _('import up to source revision REV'), _('REV')),
29 ('A', 'authormap', '', _('remap usernames using this file'), _('FILE')),
30 ('', 'filemap', '', _('remap file names using contents of file'),
31 _('FILE')),
32 ('', 'splicemap', '', _('splice synthesized history into place'),
33 _('FILE')),
34 ('', 'branchmap', '', _('change branch names while converting'),
35 _('FILE')),
36 ('', 'branchsort', None, _('try to sort changesets by branches')),
37 ('', 'datesort', None, _('try to sort changesets by date')),
38 ('', 'sourcesort', None, _('preserve source changesets order')),
39 ('', 'closesort', None, _('try to reorder closed revisions'))],
40 _('hg convert [OPTION]... SOURCE [DEST [REVMAP]]'),
41 norepo=True)
20 def convert(ui, src, dest=None, revmapfile=None, **opts):
42 def convert(ui, src, dest=None, revmapfile=None, **opts):
21 """convert a foreign SCM repository to a Mercurial one.
43 """convert a foreign SCM repository to a Mercurial one.
22
44
@@ -282,9 +304,29 b' def convert(ui, src, dest=None, revmapfi'
282 """
304 """
283 return convcmd.convert(ui, src, dest, revmapfile, **opts)
305 return convcmd.convert(ui, src, dest, revmapfile, **opts)
284
306
307 @command('debugsvnlog', [], 'hg debugsvnlog', norepo=True)
285 def debugsvnlog(ui, **opts):
308 def debugsvnlog(ui, **opts):
286 return subversion.debugsvnlog(ui, **opts)
309 return subversion.debugsvnlog(ui, **opts)
287
310
311 @command('debugcvsps',
312 [
313 # Main options shared with cvsps-2.1
314 ('b', 'branches', [], _('only return changes on specified branches')),
315 ('p', 'prefix', '', _('prefix to remove from file names')),
316 ('r', 'revisions', [],
317 _('only return changes after or between specified tags')),
318 ('u', 'update-cache', None, _("update cvs log cache")),
319 ('x', 'new-cache', None, _("create new cvs log cache")),
320 ('z', 'fuzz', 60, _('set commit time fuzz in seconds')),
321 ('', 'root', '', _('specify cvsroot')),
322 # Options specific to builtin cvsps
323 ('', 'parents', '', _('show parent changesets')),
324 ('', 'ancestors', '', _('show current changeset in ancestor branches')),
325 # Options that are ignored for compatibility with cvsps-2.1
326 ('A', 'cvs-direct', None, _('ignored for compatibility')),
327 ],
328 _('hg debugcvsps [OPTION]... [PATH]...'),
329 norepo=True)
288 def debugcvsps(ui, *args, **opts):
330 def debugcvsps(ui, *args, **opts):
289 '''create changeset information from CVS
331 '''create changeset information from CVS
290
332
@@ -298,59 +340,6 b' def debugcvsps(ui, *args, **opts):'
298 dates.'''
340 dates.'''
299 return cvsps.debugcvsps(ui, *args, **opts)
341 return cvsps.debugcvsps(ui, *args, **opts)
300
342
301 commands.norepo += " convert debugsvnlog debugcvsps"
302
303 cmdtable = {
304 "convert":
305 (convert,
306 [('', 'authors', '',
307 _('username mapping filename (DEPRECATED, use --authormap instead)'),
308 _('FILE')),
309 ('s', 'source-type', '',
310 _('source repository type'), _('TYPE')),
311 ('d', 'dest-type', '',
312 _('destination repository type'), _('TYPE')),
313 ('r', 'rev', '',
314 _('import up to source revision REV'), _('REV')),
315 ('A', 'authormap', '',
316 _('remap usernames using this file'), _('FILE')),
317 ('', 'filemap', '',
318 _('remap file names using contents of file'), _('FILE')),
319 ('', 'splicemap', '',
320 _('splice synthesized history into place'), _('FILE')),
321 ('', 'branchmap', '',
322 _('change branch names while converting'), _('FILE')),
323 ('', 'branchsort', None, _('try to sort changesets by branches')),
324 ('', 'datesort', None, _('try to sort changesets by date')),
325 ('', 'sourcesort', None, _('preserve source changesets order')),
326 ('', 'closesort', None, _('try to reorder closed revisions'))],
327 _('hg convert [OPTION]... SOURCE [DEST [REVMAP]]')),
328 "debugsvnlog":
329 (debugsvnlog,
330 [],
331 'hg debugsvnlog'),
332 "debugcvsps":
333 (debugcvsps,
334 [
335 # Main options shared with cvsps-2.1
336 ('b', 'branches', [], _('only return changes on specified branches')),
337 ('p', 'prefix', '', _('prefix to remove from file names')),
338 ('r', 'revisions', [],
339 _('only return changes after or between specified tags')),
340 ('u', 'update-cache', None, _("update cvs log cache")),
341 ('x', 'new-cache', None, _("create new cvs log cache")),
342 ('z', 'fuzz', 60, _('set commit time fuzz in seconds')),
343 ('', 'root', '', _('specify cvsroot')),
344 # Options specific to builtin cvsps
345 ('', 'parents', '', _('show parent changesets')),
346 ('', 'ancestors', '',
347 _('show current changeset in ancestor branches')),
348 # Options that are ignored for compatibility with cvsps-2.1
349 ('A', 'cvs-direct', None, _('ignored for compatibility')),
350 ],
351 _('hg debugcvsps [OPTION]... [PATH]...')),
352 }
353
354 def kwconverted(ctx, name):
343 def kwconverted(ctx, name):
355 rev = ctx.extra().get('convert_revision', '')
344 rev = ctx.extra().get('convert_revision', '')
356 if rev.startswith('svn:'):
345 if rev.startswith('svn:'):
@@ -260,8 +260,15 b' class converter_sink(object):'
260 """
260 """
261 pass
261 pass
262
262
263 def hascommit(self, rev):
263 def hascommitfrommap(self, rev):
264 """Return True if the sink contains rev"""
264 """Return False if a rev mentioned in a filemap is known to not be
265 present."""
266 raise NotImplementedError
267
268 def hascommitforsplicemap(self, rev):
269 """This method is for the special needs for splicemap handling and not
270 for general use. Returns True if the sink contains rev, aborts on some
271 special cases."""
265 raise NotImplementedError
272 raise NotImplementedError
266
273
267 class commandline(object):
274 class commandline(object):
@@ -173,8 +173,12 b' class converter(object):'
173 parents = {}
173 parents = {}
174 while visit:
174 while visit:
175 n = visit.pop(0)
175 n = visit.pop(0)
176 if n in known or n in self.map:
176 if n in known:
177 continue
177 continue
178 if n in self.map:
179 m = self.map[n]
180 if m == SKIPREV or self.dest.hascommitfrommap(m):
181 continue
178 known.add(n)
182 known.add(n)
179 self.ui.progress(_('scanning'), len(known), unit=_('revisions'))
183 self.ui.progress(_('scanning'), len(known), unit=_('revisions'))
180 commit = self.cachecommit(n)
184 commit = self.cachecommit(n)
@@ -193,7 +197,7 b' class converter(object):'
193 """
197 """
194 for c in sorted(splicemap):
198 for c in sorted(splicemap):
195 if c not in parents:
199 if c not in parents:
196 if not self.dest.hascommit(self.map.get(c, c)):
200 if not self.dest.hascommitforsplicemap(self.map.get(c, c)):
197 # Could be in source but not converted during this run
201 # Could be in source but not converted during this run
198 self.ui.warn(_('splice map revision %s is not being '
202 self.ui.warn(_('splice map revision %s is not being '
199 'converted, ignoring\n') % c)
203 'converted, ignoring\n') % c)
@@ -201,7 +205,7 b' class converter(object):'
201 pc = []
205 pc = []
202 for p in splicemap[c]:
206 for p in splicemap[c]:
203 # We do not have to wait for nodes already in dest.
207 # We do not have to wait for nodes already in dest.
204 if self.dest.hascommit(self.map.get(p, p)):
208 if self.dest.hascommitforsplicemap(self.map.get(p, p)):
205 continue
209 continue
206 # Parent is not in dest and not being converted, not good
210 # Parent is not in dest and not being converted, not good
207 if p not in parents:
211 if p not in parents:
@@ -46,6 +46,18 b' class convert_git(converter_source):'
46 del os.environ['GIT_DIR']
46 del os.environ['GIT_DIR']
47 else:
47 else:
48 os.environ['GIT_DIR'] = prevgitdir
48 os.environ['GIT_DIR'] = prevgitdir
49
50 def gitpipe(self, s):
51 prevgitdir = os.environ.get('GIT_DIR')
52 os.environ['GIT_DIR'] = self.path
53 try:
54 return util.popen3(s)
55 finally:
56 if prevgitdir is None:
57 del os.environ['GIT_DIR']
58 else:
59 os.environ['GIT_DIR'] = prevgitdir
60
49 else:
61 else:
50 def gitopen(self, s, err=None):
62 def gitopen(self, s, err=None):
51 if err == subprocess.PIPE:
63 if err == subprocess.PIPE:
@@ -56,6 +68,9 b' class convert_git(converter_source):'
56 else:
68 else:
57 return util.popen('GIT_DIR=%s %s' % (self.path, s), 'rb')
69 return util.popen('GIT_DIR=%s %s' % (self.path, s), 'rb')
58
70
71 def gitpipe(self, s):
72 return util.popen3('GIT_DIR=%s %s' % (self.path, s))
73
59 def popen_with_stderr(self, s):
74 def popen_with_stderr(self, s):
60 p = subprocess.Popen(s, shell=True, bufsize=-1,
75 p = subprocess.Popen(s, shell=True, bufsize=-1,
61 close_fds=util.closefds,
76 close_fds=util.closefds,
@@ -84,6 +99,12 b' class convert_git(converter_source):'
84 self.path = path
99 self.path = path
85 self.submodules = []
100 self.submodules = []
86
101
102 self.catfilepipe = self.gitpipe('git cat-file --batch')
103
104 def after(self):
105 for f in self.catfilepipe:
106 f.close()
107
87 def getheads(self):
108 def getheads(self):
88 if not self.rev:
109 if not self.rev:
89 heads, ret = self.gitread('git rev-parse --branches --remotes')
110 heads, ret = self.gitread('git rev-parse --branches --remotes')
@@ -98,9 +119,17 b' class convert_git(converter_source):'
98 def catfile(self, rev, type):
119 def catfile(self, rev, type):
99 if rev == hex(nullid):
120 if rev == hex(nullid):
100 raise IOError
121 raise IOError
101 data, ret = self.gitread("git cat-file %s %s" % (type, rev))
122 self.catfilepipe[0].write(rev+'\n')
102 if ret:
123 self.catfilepipe[0].flush()
124 info = self.catfilepipe[1].readline().split()
125 if info[1] != type:
103 raise util.Abort(_('cannot read %r object at %s') % (type, rev))
126 raise util.Abort(_('cannot read %r object at %s') % (type, rev))
127 size = int(info[2])
128 data = self.catfilepipe[1].read(size)
129 if len(data) < size:
130 raise util.Abort(_('cannot read %r object at %s') % (type, rev))
131 # read the trailing newline
132 self.catfilepipe[1].read(1)
104 return data
133 return data
105
134
106 def getfile(self, name, rev):
135 def getfile(self, name, rev):
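The rewritten `catfile` above talks to a persistent `git cat-file --batch` process instead of spawning `git cat-file` per object: each request writes an object id, and the reply is a header line of the form `<sha> <type> <size>`, followed by the raw object bytes and a trailing newline. A sketch of parsing one such reply from an in-memory stream (the sha and payload are illustrative, not real git output):

```python
import io

def read_batch_object(stream, expected_type):
    """Parse one `git cat-file --batch` reply: header, payload, newline."""
    info = stream.readline().split()  # b'<sha> <type> <size>\n'
    if info[1] != expected_type:
        raise IOError('unexpected object type %r' % info[1])
    size = int(info[2])
    data = stream.read(size)
    if len(data) < size:
        raise IOError('truncated object: got %d of %d bytes'
                      % (len(data), size))
    stream.read(1)  # consume the trailing newline after the payload
    return data

# illustrative reply for a 5-byte blob containing "hello"
reply = io.BytesIO(b'deadbeef blob 5\nhello\n')
data = read_batch_object(reply, b'blob')
```

Reading exactly `size` bytes and then the separator newline is what keeps the long-lived pipe in sync between requests.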
@@ -136,8 +136,8 b' class mercurial_sink(converter_sink):'
136 data, mode = source.getfile(f, v)
136 data, mode = source.getfile(f, v)
137 if f == '.hgtags':
137 if f == '.hgtags':
138 data = self._rewritetags(source, revmap, data)
138 data = self._rewritetags(source, revmap, data)
139 return context.memfilectx(f, data, 'l' in mode, 'x' in mode,
139 return context.memfilectx(self.repo, f, data, 'l' in mode,
140 copies.get(f))
140 'x' in mode, copies.get(f))
141
141
142 pl = []
142 pl = []
143 for p in parents:
143 for p in parents:
@@ -165,6 +165,24 b' class mercurial_sink(converter_sink):'
165 text = text.replace(sha1, newrev[:len(sha1)])
165 text = text.replace(sha1, newrev[:len(sha1)])
166
166
167 extra = commit.extra.copy()
167 extra = commit.extra.copy()
168
169 for label in ('source', 'transplant_source', 'rebase_source'):
170 node = extra.get(label)
171
172 if node is None:
173 continue
174
175 # Only transplant stores its reference in binary
176 if label == 'transplant_source':
177 node = hex(node)
178
179 newrev = revmap.get(node)
180 if newrev is not None:
181 if label == 'transplant_source':
182 newrev = bin(newrev)
183
184 extra[label] = newrev
185
168 if self.branchnames and commit.branch:
186 if self.branchnames and commit.branch:
169 extra['branch'] = commit.branch
187 extra['branch'] = commit.branch
170 if commit.rev:
188 if commit.rev:
@@ -229,7 +247,7 b' class mercurial_sink(converter_sink):'
229
247
230 data = "".join(newlines)
248 data = "".join(newlines)
231 def getfilectx(repo, memctx, f):
249 def getfilectx(repo, memctx, f):
232 return context.memfilectx(f, data, False, False, None)
250 return context.memfilectx(repo, f, data, False, False, None)
233
251
234 self.ui.status(_("updating tags\n"))
252 self.ui.status(_("updating tags\n"))
235 date = "%s 0" % int(time.mktime(time.gmtime()))
253 date = "%s 0" % int(time.mktime(time.gmtime()))
@@ -253,7 +271,11 b' class mercurial_sink(converter_sink):'
253 destmarks[bookmark] = bin(updatedbookmark[bookmark])
271 destmarks[bookmark] = bin(updatedbookmark[bookmark])
254 destmarks.write()
272 destmarks.write()
255
273
256 def hascommit(self, rev):
274 def hascommitfrommap(self, rev):
275 # the exact semantics of clonebranches is unclear so we can't say no
276 return rev in self.repo or self.clonebranches
277
278 def hascommitforsplicemap(self, rev):
257 if rev not in self.repo and self.clonebranches:
279 if rev not in self.repo and self.clonebranches:
258 raise util.Abort(_('revision %s not found in destination '
280 raise util.Abort(_('revision %s not found in destination '
259 'repository (lookups with clonebranches=true '
281 'repository (lookups with clonebranches=true '
@@ -394,7 +416,9 b' class mercurial_source(converter_source)'
394 sortkey=ctx.rev())
416 sortkey=ctx.rev())
395
417
396 def gettags(self):
418 def gettags(self):
397 tags = [t for t in self.repo.tagslist() if t[0] != 'tip']
419 # This will get written to .hgtags, filter non global tags out.
420 tags = [t for t in self.repo.tagslist()
421 if self.repo.tagtype(t[0]) == 'global']
398 return dict([(name, hex(node)) for name, node in tags
422 return dict([(name, hex(node)) for name, node in tags
399 if self.keep(node)])
423 if self.keep(node)])
400
424
@@ -1300,7 +1300,12 b' class svn_sink(converter_sink, commandli'
1300 self.ui.warn(_('writing Subversion tags is not yet implemented\n'))
1300 self.ui.warn(_('writing Subversion tags is not yet implemented\n'))
1301 return None, None
1301 return None, None
1302
1302
1303 def hascommit(self, rev):
1303 def hascommitfrommap(self, rev):
1304 # We trust that revisions referenced in a map still is present
1305 # TODO: implement something better if necessary and feasible
1306 return True
1307
1308 def hascommitforsplicemap(self, rev):
1304 # This is not correct as one can convert to an existing subversion
1309 # This is not correct as one can convert to an existing subversion
1305 # repository and childmap would not list all revisions. Too bad.
1310 # repository and childmap would not list all revisions. Too bad.
1306 if rev in self.childmap:
1311 if rev in self.childmap:
@@ -63,9 +63,11 b' pretty fast (at least faster than having'
63
63
64 from mercurial.i18n import _
64 from mercurial.i18n import _
65 from mercurial.node import short, nullid
65 from mercurial.node import short, nullid
66 from mercurial import scmutil, scmutil, util, commands, encoding
66 from mercurial import cmdutil, scmutil, scmutil, util, commands, encoding
67 import os, shlex, shutil, tempfile, re
67 import os, shlex, shutil, tempfile, re
68
68
69 cmdtable = {}
70 command = cmdutil.command(cmdtable)
69 testedwith = 'internal'
71 testedwith = 'internal'
70
72
71 def snapshot(ui, repo, files, node, tmproot):
73 def snapshot(ui, repo, files, node, tmproot):
@@ -238,6 +240,16 b' def dodiff(ui, repo, diffcmd, diffopts, '
238 ui.note(_('cleaning up temp directory\n'))
240 ui.note(_('cleaning up temp directory\n'))
239 shutil.rmtree(tmproot)
241 shutil.rmtree(tmproot)
240
242
243 @command('extdiff',
244 [('p', 'program', '',
245 _('comparison program to run'), _('CMD')),
246 ('o', 'option', [],
247 _('pass option to comparison program'), _('OPT')),
248 ('r', 'rev', [], _('revision'), _('REV')),
249 ('c', 'change', '', _('change made by revision'), _('REV')),
250 ] + commands.walkopts,
251 _('hg extdiff [OPT]... [FILE]...'),
252 inferrepo=True)
241 def extdiff(ui, repo, *pats, **opts):
253 def extdiff(ui, repo, *pats, **opts):
242 '''use external program to diff repository (or selected files)
254 '''use external program to diff repository (or selected files)
243
255
@@ -262,21 +274,6 b' def extdiff(ui, repo, *pats, **opts):'
262 option = option or ['-Npru']
274 option = option or ['-Npru']
263 return dodiff(ui, repo, program, option, pats, opts)
275 return dodiff(ui, repo, program, option, pats, opts)
264
276
265 cmdtable = {
266 "extdiff":
267 (extdiff,
268 [('p', 'program', '',
269 _('comparison program to run'), _('CMD')),
270 ('o', 'option', [],
271 _('pass option to comparison program'), _('OPT')),
272 ('r', 'rev', [],
273 _('revision'), _('REV')),
274 ('c', 'change', '',
275 _('change made by revision'), _('REV')),
276 ] + commands.walkopts,
277 _('hg extdiff [OPT]... [FILE]...')),
278 }
279
280 def uisetup(ui):
277 def uisetup(ui):
281 for cmd, path in ui.configitems('extdiff'):
278 for cmd, path in ui.configitems('extdiff'):
282 if cmd.startswith('cmd.'):
279 if cmd.startswith('cmd.'):
@@ -329,5 +326,3 b' use %(path)s to diff repository (or sele'
329 cmdtable[cmd] = (save(cmd, path, diffopts),
326 cmdtable[cmd] = (save(cmd, path, diffopts),
330 cmdtable['extdiff'][1][1:],
327 cmdtable['extdiff'][1][1:],
331 _('hg %s [OPTION]... [FILE]...') % cmd)
328 _('hg %s [OPTION]... [FILE]...') % cmd)
332
333 commands.inferrepo += " extdiff"
@@ -52,6 +52,8 b' import os, urllib2'
52
52
53 ERRMAX = 128
53 ERRMAX = 128
54
54
55 _executable = _mountpoint = _service = None
56
55 def auth_getkey(self, params):
57 def auth_getkey(self, params):
56 if not self.ui.interactive():
58 if not self.ui.interactive():
57 raise util.Abort(_('factotum not interactive'))
59 raise util.Abort(_('factotum not interactive'))
@@ -12,8 +12,18 b' from mercurial.node import nullid, short'
12 from mercurial import commands, cmdutil, hg, util, error
12 from mercurial import commands, cmdutil, hg, util, error
13 from mercurial.lock import release
13 from mercurial.lock import release
14
14
15 cmdtable = {}
16 command = cmdutil.command(cmdtable)
15 testedwith = 'internal'
17 testedwith = 'internal'
16
18
19 @command('fetch',
20 [('r', 'rev', [],
21 _('a specific revision you would like to pull'), _('REV')),
22 ('e', 'edit', None, _('edit commit message')),
23 ('', 'force-editor', None, _('edit commit message (DEPRECATED)')),
24 ('', 'switch-parent', None, _('switch parents when merging')),
25 ] + commands.commitopts + commands.commitopts2 + commands.remoteopts,
26 _('hg fetch [SOURCE]'))
17 def fetch(ui, repo, source='default', **opts):
27 def fetch(ui, repo, source='default', **opts):
18 '''pull changes from a remote repository, merge new changes if needed.
28 '''pull changes from a remote repository, merge new changes if needed.
19
29
@@ -132,10 +142,9 b" def fetch(ui, repo, source='default', **"
132 message = (cmdutil.logmessage(ui, opts) or
142 message = (cmdutil.logmessage(ui, opts) or
133 ('Automated merge with %s' %
143 ('Automated merge with %s' %
134 util.removeauth(other.url())))
144 util.removeauth(other.url())))
135 editor = cmdutil.commiteditor
145 editopt = opts.get('edit') or opts.get('force_editor')
136 if opts.get('force_editor') or opts.get('edit'):
146 n = repo.commit(message, opts['user'], opts['date'],
137 editor = cmdutil.commitforceeditor
147 editor=cmdutil.getcommiteditor(edit=editopt))
138 n = repo.commit(message, opts['user'], opts['date'], editor=editor)
139 ui.status(_('new changeset %d:%s merges remote changes '
148 ui.status(_('new changeset %d:%s merges remote changes '
140 'with local\n') % (repo.changelog.rev(n),
149 'with local\n') % (repo.changelog.rev(n),
141 short(n)))
150 short(n)))
@@ -144,15 +153,3 b" def fetch(ui, repo, source='default', **"
144
153
145 finally:
154 finally:
146 release(lock, wlock)
155 release(lock, wlock)
147
148 cmdtable = {
149 'fetch':
150 (fetch,
151 [('r', 'rev', [],
152 _('a specific revision you would like to pull'), _('REV')),
153 ('e', 'edit', None, _('edit commit message')),
154 ('', 'force-editor', None, _('edit commit message (DEPRECATED)')),
155 ('', 'switch-parent', None, _('switch parents when merging')),
156 ] + commands.commitopts + commands.commitopts2 + commands.remoteopts,
157 _('hg fetch [SOURCE]')),
158 }
@@ -204,6 +204,7 b' def keystr(ui, key):'
204 _('the key id to sign with'), _('ID')),
204 _('the key id to sign with'), _('ID')),
205 ('m', 'message', '',
205 ('m', 'message', '',
206 _('commit message'), _('TEXT')),
206 _('commit message'), _('TEXT')),
207 ('e', 'edit', False, _('invoke editor on commit messages')),
207 ] + commands.commitopts2,
208 ] + commands.commitopts2,
208 _('hg sign [OPTION]... [REV]...'))
209 _('hg sign [OPTION]... [REV]...'))
209 def sign(ui, repo, *revs, **opts):
210 def sign(ui, repo, *revs, **opts):
@@ -276,7 +277,8 b' def sign(ui, repo, *revs, **opts):'
276 % hgnode.short(n)
277 % hgnode.short(n)
277 for n in nodes])
278 for n in nodes])
278 try:
279 try:
279 repo.commit(message, opts['user'], opts['date'], match=msigs)
280 repo.commit(message, opts['user'], opts['date'], match=msigs,
281 editor=cmdutil.getcommiteditor(**opts))
280 except ValueError, inst:
282 except ValueError, inst:
281 raise util.Abort(str(inst))
283 raise util.Abort(str(inst))
282
284
@@ -43,7 +43,8 b" testedwith = 'internal'"
43 ('P', 'prune', [],
43 ('P', 'prune', [],
44 _('do not display revision or any of its ancestors'), _('REV')),
44 _('do not display revision or any of its ancestors'), _('REV')),
45 ] + commands.logopts + commands.walkopts,
45 ] + commands.logopts + commands.walkopts,
46 _('[OPTION]... [FILE]'))
46 _('[OPTION]... [FILE]'),
47 inferrepo=True)
47 def graphlog(ui, repo, *pats, **opts):
48 def graphlog(ui, repo, *pats, **opts):
48 """show revision history alongside an ASCII revision graph
49 """show revision history alongside an ASCII revision graph
49
50
@@ -54,5 +55,3 b' def graphlog(ui, repo, *pats, **opts):'
54 directory.
55 directory.
55 """
56 """
56 return cmdutil.graphlog(ui, repo, *pats, **opts)
57 return cmdutil.graphlog(ui, repo, *pats, **opts)
57
58 commands.inferrepo += " glog"
@@ -35,12 +35,23 b' vdiff on hovered and selected revisions.'
35 '''
35 '''
36
36
37 import os
37 import os
38 from mercurial import commands, util, patch, revlog, scmutil
38 from mercurial import cmdutil, commands, util, patch, revlog, scmutil
39 from mercurial.node import nullid, nullrev, short
39 from mercurial.node import nullid, nullrev, short
40 from mercurial.i18n import _
40 from mercurial.i18n import _
41
41
42 cmdtable = {}
43 command = cmdutil.command(cmdtable)
42 testedwith = 'internal'
44 testedwith = 'internal'
43
45
46 @command('debug-diff-tree',
47 [('p', 'patch', None, _('generate patch')),
48 ('r', 'recursive', None, _('recursive')),
49 ('P', 'pretty', None, _('pretty')),
50 ('s', 'stdin', None, _('stdin')),
51 ('C', 'copy', None, _('detect copies')),
52 ('S', 'search', "", _('search'))],
53 ('hg git-diff-tree [OPTION]... NODE1 NODE2 [FILE]...'),
54 inferrepo=True)
44 def difftree(ui, repo, node1=None, node2=None, *files, **opts):
55 def difftree(ui, repo, node1=None, node2=None, *files, **opts):
45 """diff trees from two commits"""
56 """diff trees from two commits"""
46 def __difftree(repo, node1, node2, files=[]):
57 def __difftree(repo, node1, node2, files=[]):
@@ -125,6 +136,7 b' def catcommit(ui, repo, n, prefix, ctx=N'
125 if prefix:
136 if prefix:
126 ui.write('\0')
137 ui.write('\0')
127
138
139 @command('debug-merge-base', [], _('hg debug-merge-base REV REV'))
128 def base(ui, repo, node1, node2):
140 def base(ui, repo, node1, node2):
129 """output common ancestor information"""
141 """output common ancestor information"""
130 node1 = repo.lookup(node1)
142 node1 = repo.lookup(node1)
@@ -132,6 +144,10 b' def base(ui, repo, node1, node2):'
132 n = repo.changelog.ancestor(node1, node2)
144 n = repo.changelog.ancestor(node1, node2)
133 ui.write(short(n) + "\n")
145 ui.write(short(n) + "\n")
134
146
147 @command('debug-cat-file',
148 [('s', 'stdin', None, _('stdin'))],
149 _('hg debug-cat-file [OPTION]... TYPE FILE'),
150 inferrepo=True)
135 def catfile(ui, repo, type=None, r=None, **opts):
151 def catfile(ui, repo, type=None, r=None, **opts):
136 """cat a specific revision"""
152 """cat a specific revision"""
137 # in stdin mode, every line except the commit is prefixed with two
153 # in stdin mode, every line except the commit is prefixed with two
@@ -276,6 +292,9 b' def revtree(ui, args, repo, full="tree",'
276 break
292 break
277 count += 1
293 count += 1
278
294
295 @command('debug-rev-parse',
296 [('', 'default', '', _('ignored'))],
297 _('hg debug-rev-parse REV'))
279 def revparse(ui, repo, *revs, **opts):
298 def revparse(ui, repo, *revs, **opts):
280 """parse given revisions"""
299 """parse given revisions"""
281 def revstr(rev):
300 def revstr(rev):
@@ -292,6 +311,12 b' def revparse(ui, repo, *revs, **opts):'
292 # git rev-list tries to order things by date, and has the ability to stop
311 # git rev-list tries to order things by date, and has the ability to stop
293 # at a given commit without walking the whole repo. TODO add the stop
312 # at a given commit without walking the whole repo. TODO add the stop
294 # parameter
313 # parameter
314 @command('debug-rev-list',
315 [('H', 'header', None, _('header')),
316 ('t', 'topo-order', None, _('topo-order')),
317 ('p', 'parents', None, _('parents')),
318 ('n', 'max-count', 0, _('max-count'))],
319 ('hg debug-rev-list [OPTION]... REV...'))
295 def revlist(ui, repo, *revs, **opts):
320 def revlist(ui, repo, *revs, **opts):
296 """print revisions"""
321 """print revisions"""
297 if opts['header']:
322 if opts['header']:
@@ -301,6 +326,7 b' def revlist(ui, repo, *revs, **opts):'
301 copy = [x for x in revs]
326 copy = [x for x in revs]
302 revtree(ui, copy, repo, full, opts['max_count'], opts['parents'])
327 revtree(ui, copy, repo, full, opts['max_count'], opts['parents'])
303
328
329 @command('debug-config', [], _('hg debug-config'))
304 def config(ui, repo, **opts):
330 def config(ui, repo, **opts):
305 """print extension options"""
331 """print extension options"""
306 def writeopt(name, value):
332 def writeopt(name, value):
@@ -309,6 +335,10 b' def config(ui, repo, **opts):'
309 writeopt('vdiff', ui.config('hgk', 'vdiff', ''))
335 writeopt('vdiff', ui.config('hgk', 'vdiff', ''))
310
336
311
337
338 @command('view',
339 [('l', 'limit', '',
340 _('limit number of changes displayed'), _('NUM'))],
341 _('hg view [-l LIMIT] [REVRANGE]'))
312 def view(ui, repo, *etc, **opts):
342 def view(ui, repo, *etc, **opts):
313 "start interactive history viewer"
343 "start interactive history viewer"
314 os.chdir(repo.root)
344 os.chdir(repo.root)
@@ -316,41 +346,3 b' def view(ui, repo, *etc, **opts):'
316 cmd = ui.config("hgk", "path", "hgk") + " %s %s" % (optstr, " ".join(etc))
346 cmd = ui.config("hgk", "path", "hgk") + " %s %s" % (optstr, " ".join(etc))
317 ui.debug("running %s\n" % cmd)
347 ui.debug("running %s\n" % cmd)
318 util.system(cmd)
348 util.system(cmd)
319
320 cmdtable = {
321 "^view":
322 (view,
323 [('l', 'limit', '',
324 _('limit number of changes displayed'), _('NUM'))],
325 _('hg view [-l LIMIT] [REVRANGE]')),
326 "debug-diff-tree":
327 (difftree,
328 [('p', 'patch', None, _('generate patch')),
329 ('r', 'recursive', None, _('recursive')),
330 ('P', 'pretty', None, _('pretty')),
331 ('s', 'stdin', None, _('stdin')),
332 ('C', 'copy', None, _('detect copies')),
333 ('S', 'search', "", _('search'))],
334 _('hg git-diff-tree [OPTION]... NODE1 NODE2 [FILE]...')),
335 "debug-cat-file":
336 (catfile,
337 [('s', 'stdin', None, _('stdin'))],
338 _('hg debug-cat-file [OPTION]... TYPE FILE')),
339 "debug-config":
340 (config, [], _('hg debug-config')),
341 "debug-merge-base":
342 (base, [], _('hg debug-merge-base REV REV')),
343 "debug-rev-parse":
344 (revparse,
345 [('', 'default', '', _('ignored'))],
346 _('hg debug-rev-parse REV')),
347 "debug-rev-list":
348 (revlist,
349 [('H', 'header', None, _('header')),
350 ('t', 'topo-order', None, _('topo-order')),
351 ('p', 'parents', None, _('parents')),
352 ('n', 'max-count', 0, _('max-count'))],
353 _('hg debug-rev-list [OPTION]... REV...')),
354 }
355
356 commands.inferrepo += " debug-diff-tree debug-cat-file"
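The hgk hunks above replace a hand-maintained `cmdtable` dict (and string appends to `commands.norepo`/`commands.inferrepo`) with `@command` decorator registrations. A self-contained sketch of that registration pattern — the names mirror `mercurial.cmdutil.command`, but this is an illustration, not the real helper:

```python
# Minimal stand-in for the decorator-based registration this series
# migrates to: the decorator records (func, options, synopsis) in
# cmdtable and attaches per-command flags to the function itself.
cmdtable = {}

def command(name, options=(), synopsis=None, norepo=False, inferrepo=False):
    def decorator(func):
        # flags travel with the registration instead of being appended
        # to module-global strings like commands.inferrepo
        func.norepo = norepo
        func.inferrepo = inferrepo
        cmdtable[name] = (func, list(options), synopsis)
        return func
    return decorator

@command('debug-merge-base', [], 'hg debug-merge-base REV REV')
def base(ui, repo, node1, node2):
    """output common ancestor information"""
```

Keeping a command's options, synopsis, and repo-handling flags next to its function body is what lets the diff delete the large trailing `cmdtable` block wholesale.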
@@ -275,7 +275,8 b' def collapse(repo, first, last, commitop'
275 if path in headmf:
275 if path in headmf:
276 fctx = last[path]
276 fctx = last[path]
277 flags = fctx.flags()
277 flags = fctx.flags()
278 mctx = context.memfilectx(fctx.path(), fctx.data(),
278 mctx = context.memfilectx(repo,
279 fctx.path(), fctx.data(),
279 islink='l' in flags,
280 islink='l' in flags,
280 isexec='x' in flags,
281 isexec='x' in flags,
281 copied=copied.get(path))
282 copied=copied.get(path))
@@ -298,9 +299,8 b' def collapse(repo, first, last, commitop'
298 filectxfn=filectxfn,
299 filectxfn=filectxfn,
299 user=user,
300 user=user,
300 date=date,
301 date=date,
301 extra=extra)
302 extra=extra,
302 new._text = cmdutil.commitforceeditor(repo, new, [])
303 editor=cmdutil.getcommiteditor(edit=True))
303 repo.savecommitmessage(new.description())
304 return repo.commitctx(new)
304 return repo.commitctx(new)
305
305
306 def pick(ui, repo, ctx, ha, opts):
306 def pick(ui, repo, ctx, ha, opts):
@@ -402,12 +402,11 b' def message(ui, repo, ctx, ha, opts):'
402 if stats and stats[3] > 0:
402 if stats and stats[3] > 0:
403 raise error.InterventionRequired(
403 raise error.InterventionRequired(
404 _('Fix up the change and run hg histedit --continue'))
404 _('Fix up the change and run hg histedit --continue'))
405 message = oldctx.description() + '\n'
405 message = oldctx.description()
406 message = ui.edit(message, ui.username())
407 repo.savecommitmessage(message)
408 commit = commitfuncfor(repo, oldctx)
406 commit = commitfuncfor(repo, oldctx)
409 new = commit(text=message, user=oldctx.user(), date=oldctx.date(),
407 new = commit(text=message, user=oldctx.user(), date=oldctx.date(),
410 extra=oldctx.extra())
408 extra=oldctx.extra(),
409 editor=cmdutil.getcommiteditor(edit=True))
411 newctx = repo[new]
410 newctx = repo[new]
412 if oldctx.node() != newctx.node():
411 if oldctx.node() != newctx.node():
413 return newctx, [(oldctx.node(), (new,))]
412 return newctx, [(oldctx.node(), (new,))]
@@ -682,11 +681,9 b' def bootstrapcontinue(ui, repo, parentct'
682 if action in ('f', 'fold'):
681 if action in ('f', 'fold'):
683 message = 'fold-temp-revision %s' % currentnode
682 message = 'fold-temp-revision %s' % currentnode
684 else:
683 else:
685 message = ctx.description() + '\n'
684 message = ctx.description()
686 if action in ('e', 'edit', 'm', 'mess'):
685 editopt = action in ('e', 'edit', 'm', 'mess')
687 editor = cmdutil.commitforceeditor
686 editor = cmdutil.getcommiteditor(edit=editopt)
688 else:
689 editor = False
690 commit = commitfuncfor(repo, ctx)
687 commit = commitfuncfor(repo, ctx)
691 new = commit(text=message, user=ctx.user(),
688 new = commit(text=message, user=ctx.user(),
692 date=ctx.date(), extra=ctx.extra(),
689 date=ctx.date(), extra=ctx.extra(),
@@ -763,7 +760,8 b' def makedesc(c):'
763 if c.description():
760 if c.description():
764 summary = c.description().splitlines()[0]
761 summary = c.description().splitlines()[0]
765 line = 'pick %s %d %s' % (c, c.rev(), summary)
762 line = 'pick %s %d %s' % (c, c.rev(), summary)
766 return line[:80] # trim to 80 chars so it's not stupidly wide in my editor
763 # trim to 80 columns so it's not stupidly wide in my editor
764 return util.ellipsis(line, 80)
767
765
768 def verifyrules(rules, repo, ctxs):
766 def verifyrules(rules, repo, ctxs):
769 """Verify that there exists exactly one edit rule per given changeset.
767 """Verify that there exists exactly one edit rule per given changeset.
@@ -89,9 +89,6 b' from mercurial.hgweb import webcommands'
89 from mercurial.i18n import _
89 from mercurial.i18n import _
90 import os, re, shutil, tempfile
90 import os, re, shutil, tempfile
91
91
92 commands.optionalrepo += ' kwdemo'
93 commands.inferrepo += ' kwexpand kwfiles kwshrink'
94
95 cmdtable = {}
92 cmdtable = {}
96 command = cmdutil.command(cmdtable)
93 command = cmdutil.command(cmdtable)
97 testedwith = 'internal'
94 testedwith = 'internal'
@@ -363,7 +360,8 b' def _kwfwrite(ui, repo, expand, *pats, *'
363 [('d', 'default', None, _('show default keyword template maps')),
360 [('d', 'default', None, _('show default keyword template maps')),
364 ('f', 'rcfile', '',
361 ('f', 'rcfile', '',
365 _('read maps from rcfile'), _('FILE'))],
362 _('read maps from rcfile'), _('FILE'))],
366 _('hg kwdemo [-d] [-f RCFILE] [TEMPLATEMAP]...'))
363 _('hg kwdemo [-d] [-f RCFILE] [TEMPLATEMAP]...'),
364 optionalrepo=True)
367 def demo(ui, repo, *args, **opts):
365 def demo(ui, repo, *args, **opts):
368 '''print [keywordmaps] configuration and an expansion example
366 '''print [keywordmaps] configuration and an expansion example
369
367
@@ -454,7 +452,10 b' def demo(ui, repo, *args, **opts):'
454 ui.write(repo.wread(fn))
452 ui.write(repo.wread(fn))
455 shutil.rmtree(tmpdir, ignore_errors=True)
453 shutil.rmtree(tmpdir, ignore_errors=True)
456
454
457 @command('kwexpand', commands.walkopts, _('hg kwexpand [OPTION]... [FILE]...'))
455 @command('kwexpand',
456 commands.walkopts,
457 _('hg kwexpand [OPTION]... [FILE]...'),
458 inferrepo=True)
458 def expand(ui, repo, *pats, **opts):
459 def expand(ui, repo, *pats, **opts):
459 '''expand keywords in the working directory
460 '''expand keywords in the working directory
460
461
@@ -470,7 +471,8 b' def expand(ui, repo, *pats, **opts):'
470 ('i', 'ignore', None, _('show files excluded from expansion')),
471 ('i', 'ignore', None, _('show files excluded from expansion')),
471 ('u', 'unknown', None, _('only show unknown (not tracked) files')),
472 ('u', 'unknown', None, _('only show unknown (not tracked) files')),
472 ] + commands.walkopts,
473 ] + commands.walkopts,
473 _('hg kwfiles [OPTION]... [FILE]...'))
474 _('hg kwfiles [OPTION]... [FILE]...'),
475 inferrepo=True)
474 def files(ui, repo, *pats, **opts):
476 def files(ui, repo, *pats, **opts):
475 '''show files configured for keyword expansion
477 '''show files configured for keyword expansion
476
478
@@ -524,7 +526,10 b' def files(ui, repo, *pats, **opts):'
524 repo.pathto(f, cwd), label=label)
526 repo.pathto(f, cwd), label=label)
525 fm.end()
527 fm.end()
526
528
527 @command('kwshrink', commands.walkopts, _('hg kwshrink [OPTION]... [FILE]...'))
529 @command('kwshrink',
530 commands.walkopts,
531 _('hg kwshrink [OPTION]... [FILE]...'),
532 inferrepo=True)
528 def shrink(ui, repo, *pats, **opts):
533 def shrink(ui, repo, *pats, **opts):
529 '''revert expanded keywords in the working directory
534 '''revert expanded keywords in the working directory
530
535
@@ -105,7 +105,7 b' explicitly do so with the --large flag p'
105 command.
105 command.
106 '''
106 '''
107
107
108 from mercurial import commands, hg, localrepo
108 from mercurial import hg, localrepo
109
109
110 import lfcommands
110 import lfcommands
111 import proto
111 import proto
@@ -125,6 +125,4 b' def uisetup(ui):'
125 hg.wirepeersetupfuncs.append(proto.wirereposetup)
125 hg.wirepeersetupfuncs.append(proto.wirereposetup)
126 uisetupmod.uisetup(ui)
126 uisetupmod.uisetup(ui)
127
127
128 commands.norepo += " lfconvert"
129
130 cmdtable = lfcommands.cmdtable
128 cmdtable = lfcommands.cmdtable
@@ -21,6 +21,18 b' import basestore'
21
21
22 # -- Commands ----------------------------------------------------------
22 # -- Commands ----------------------------------------------------------
23
23
24 cmdtable = {}
25 command = cmdutil.command(cmdtable)
26
27 @command('lfconvert',
28 [('s', 'size', '',
29 _('minimum size (MB) for files to be converted as largefiles'), 'SIZE'),
30 ('', 'to-normal', False,
31 _('convert from a largefiles repo to a normal repo')),
32 ],
33 _('hg lfconvert SOURCE DEST [FILE ...]'),
34 norepo=True,
35 inferrepo=True)
24 def lfconvert(ui, src, dest, *pats, **opts):
36 def lfconvert(ui, src, dest, *pats, **opts):
25 '''convert a normal repository to a largefiles repository
37 '''convert a normal repository to a largefiles repository
26
38
@@ -160,10 +172,10 b' def _addchangeset(ui, rsrc, rdst, ctx, r'
160 finally:
172 finally:
161 if fd:
173 if fd:
162 fd.close()
174 fd.close()
163 return context.memfilectx(f, data, 'l' in fctx.flags(),
175 return context.memfilectx(repo, f, data, 'l' in fctx.flags(),
164 'x' in fctx.flags(), renamed)
176 'x' in fctx.flags(), renamed)
165 else:
177 else:
166 return _getnormalcontext(repo.ui, ctx, f, revmap)
178 return _getnormalcontext(repo, ctx, f, revmap)
167
179
168 dstfiles = []
180 dstfiles = []
169 for file in files:
181 for file in files:
@@ -243,10 +255,11 b' def _lfconvert_addchangeset(rsrc, rdst, '
243 # doesn't change after rename or copy
255 # doesn't change after rename or copy
244 renamed = lfutil.standin(renamed[0])
256 renamed = lfutil.standin(renamed[0])
245
257
246 return context.memfilectx(f, lfiletohash[srcfname] + '\n', 'l' in
258 return context.memfilectx(repo, f, lfiletohash[srcfname] + '\n',
247 fctx.flags(), 'x' in fctx.flags(), renamed)
259 'l' in fctx.flags(), 'x' in fctx.flags(),
260 renamed)
248 else:
261 else:
249 return _getnormalcontext(repo.ui, ctx, f, revmap)
262 return _getnormalcontext(repo, ctx, f, revmap)
250
263
251 # Commit
264 # Commit
252 _commitcontext(rdst, parents, ctx, dstfiles, getfilectx, revmap)
265 _commitcontext(rdst, parents, ctx, dstfiles, getfilectx, revmap)
@@ -281,7 +294,7 b' def _convertparents(ctx, revmap):'
281 return parents
294 return parents
282
295
283 # Get memfilectx for a normal file
296 # Get memfilectx for a normal file
284 def _getnormalcontext(ui, ctx, f, revmap):
297 def _getnormalcontext(repo, ctx, f, revmap):
285 try:
298 try:
286 fctx = ctx.filectx(f)
299 fctx = ctx.filectx(f)
287 except error.LookupError:
300 except error.LookupError:
@@ -292,8 +305,8 b' def _getnormalcontext(ui, ctx, f, revmap'
292
305
293 data = fctx.data()
306 data = fctx.data()
294 if f == '.hgtags':
307 if f == '.hgtags':
295 data = _converttags (ui, revmap, data)
308 data = _converttags (repo.ui, revmap, data)
296 return context.memfilectx(f, data, 'l' in fctx.flags(),
309 return context.memfilectx(repo, f, data, 'l' in fctx.flags(),
297 'x' in fctx.flags(), renamed)
310 'x' in fctx.flags(), renamed)
298
311
299 # Remap tag data using a revision map
312 # Remap tag data using a revision map
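The lfcommands hunks above track a signature change: `context.memfilectx` now takes the repository as its first argument, so every call site (and the `_getnormalcontext` helper) is updated to thread `repo` through. A compatibility sketch — not code from this changeset — showing how an extension could support both signatures:

```python
import inspect

def callmemfilectx(memfilectx, repo, path, data,
                   islink=False, isexec=False, copied=None):
    """Call memfilectx with or without the new leading 'repo' argument,
    depending on the signature of the installed Mercurial. A hedged
    compatibility sketch, not part of this changeset."""
    args = inspect.getfullargspec(memfilectx).args
    if args and args[0] == 'repo':
        return memfilectx(repo, path, data, islink, isexec, copied)
    return memfilectx(path, data, islink, isexec, copied)

# fake signatures standing in for pre-3.1 and 3.1 Mercurial:
def old_api(path, data, islink, isexec, copied):
    return ('old', path)

def new_api(repo, path, data, islink, isexec, copied):
    return ('new', repo, path)
```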
@@ -519,6 +532,10 b' def updatelfiles(ui, repo, filelist=None'
519 finally:
532 finally:
520 wlock.release()
533 wlock.release()
521
534
535 @command('lfpull',
536 [('r', 'rev', [], _('pull largefiles for these revisions'))
537 ] + commands.remoteopts,
538 _('-r REV... [-e CMD] [--remotecmd CMD] [SOURCE]'))
522 def lfpull(ui, repo, source="default", **opts):
539 def lfpull(ui, repo, source="default", **opts):
523 """pull largefiles for the specified revisions from the specified source
540 """pull largefiles for the specified revisions from the specified source
524
541
@@ -553,24 +570,3 b' def lfpull(ui, repo, source="default", *'
553 (cached, missing) = cachelfiles(ui, repo, rev)
570 (cached, missing) = cachelfiles(ui, repo, rev)
554 numcached += len(cached)
571 numcached += len(cached)
555 ui.status(_("%d largefiles cached\n") % numcached)
572 ui.status(_("%d largefiles cached\n") % numcached)
556
557 # -- hg commands declarations ------------------------------------------------
558
559 cmdtable = {
560 'lfconvert': (lfconvert,
561 [('s', 'size', '',
562 _('minimum size (MB) for files to be converted '
563 'as largefiles'),
564 'SIZE'),
565 ('', 'to-normal', False,
566 _('convert from a largefiles repo to a normal repo')),
567 ],
568 _('hg lfconvert SOURCE DEST [FILE ...]')),
569 'lfpull': (lfpull,
570 [('r', 'rev', [], _('pull largefiles for these revisions'))
571 ] + commands.remoteopts,
572 _('-r REV... [-e CMD] [--remotecmd CMD] [SOURCE]')
573 ),
574 }
575
576 commands.inferrepo += " lfconvert"
@@ -123,9 +123,13 b' def openlfdirstate(ui, repo, create=True'
123 # it. This ensures that we create it on the first meaningful
123 # it. This ensures that we create it on the first meaningful
124 # largefiles operation in a new clone.
124 # largefiles operation in a new clone.
125 if create and not os.path.exists(os.path.join(lfstoredir, 'dirstate')):
125 if create and not os.path.exists(os.path.join(lfstoredir, 'dirstate')):
126 util.makedirs(lfstoredir)
127 matcher = getstandinmatcher(repo)
126 matcher = getstandinmatcher(repo)
128 for standin in repo.dirstate.walk(matcher, [], False, False):
127 standins = repo.dirstate.walk(matcher, [], False, False)
128
129 if len(standins) > 0:
130 util.makedirs(lfstoredir)
131
132 for standin in standins:
129 lfile = splitstandin(standin)
133 lfile = splitstandin(standin)
130 lfdirstate.normallookup(lfile)
134 lfdirstate.normallookup(lfile)
131 return lfdirstate
135 return lfdirstate
@@ -282,6 +282,8 b' def overridelog(orig, ui, repo, *pats, *'
282 standin = lfutil.standin(m._files[i])
282 standin = lfutil.standin(m._files[i])
283 if standin in repo[ctx.node()]:
283 if standin in repo[ctx.node()]:
284 m._files[i] = standin
284 m._files[i] = standin
285 elif m._files[i] not in repo[ctx.node()]:
286 m._files.append(standin)
285 pats.add(standin)
287 pats.add(standin)
286
288
287 m._fmap = set(m._files)
289 m._fmap = set(m._files)
@@ -408,14 +410,13 b' def overridecalculateupdates(origfn, rep'
408 if overwrite:
410 if overwrite:
409 return actions
411 return actions
410
412
411 removes = set(a[0] for a in actions if a[1] == 'r')
413 removes = set(a[0] for a in actions['r'])
412 processed = []
413
414
414 for action in actions:
415 newglist = []
415 f, m, args, msg = action
416 for action in actions['g']:
416
417 f, args, msg = action
417 splitstandin = f and lfutil.splitstandin(f)
418 splitstandin = f and lfutil.splitstandin(f)
418 if (m == "g" and splitstandin is not None and
419 if (splitstandin is not None and
419 splitstandin in p1 and splitstandin not in removes):
420 splitstandin in p1 and splitstandin not in removes):
420 # Case 1: normal file in the working copy, largefile in
421 # Case 1: normal file in the working copy, largefile in
421 # the second parent
422 # the second parent
@@ -425,12 +426,11 b' def overridecalculateupdates(origfn, rep'
425 'use (l)argefile or keep (n)ormal file?'
426 'use (l)argefile or keep (n)ormal file?'
426 '$$ &Largefile $$ &Normal file') % lfile
427 '$$ &Largefile $$ &Normal file') % lfile
427 if repo.ui.promptchoice(msg, 0) == 0:
428 if repo.ui.promptchoice(msg, 0) == 0:
428 processed.append((lfile, "r", None, msg))
429 actions['r'].append((lfile, None, msg))
429 processed.append((standin, "g", (p2.flags(standin),), msg))
430 newglist.append((standin, (p2.flags(standin),), msg))
430 else:
431 else:
431 processed.append((standin, "r", None, msg))
432 actions['r'].append((standin, None, msg))
432 elif (m == "g" and
433 elif lfutil.standin(f) in p1 and lfutil.standin(f) not in removes:
433 lfutil.standin(f) in p1 and lfutil.standin(f) not in removes):
434 # Case 2: largefile in the working copy, normal file in
434 # Case 2: largefile in the working copy, normal file in
435 # the second parent
435 # the second parent
436 standin = lfutil.standin(f)
436 standin = lfutil.standin(f)
@@ -439,20 +439,23 b' def overridecalculateupdates(origfn, rep'
439 'keep (l)argefile or use (n)ormal file?'
439 'keep (l)argefile or use (n)ormal file?'
440 '$$ &Largefile $$ &Normal file') % lfile
440 '$$ &Largefile $$ &Normal file') % lfile
441 if repo.ui.promptchoice(msg, 0) == 0:
441 if repo.ui.promptchoice(msg, 0) == 0:
442 processed.append((lfile, "r", None, msg))
442 actions['r'].append((lfile, None, msg))
443 else:
443 else:
444 processed.append((standin, "r", None, msg))
444 actions['r'].append((standin, None, msg))
445 processed.append((lfile, "g", (p2.flags(lfile),), msg))
445 newglist.append((lfile, (p2.flags(lfile),), msg))
446 else:
446 else:
447 processed.append(action)
447 newglist.append(action)
448
448
449 return processed
449 newglist.sort()
450 actions['g'] = newglist
451
452 return actions
450
453
451 # Override filemerge to prompt the user about how they wish to merge
454 # Override filemerge to prompt the user about how they wish to merge
452 # largefiles. This will handle identical edits without prompting the user.
455 # largefiles. This will handle identical edits without prompting the user.
453 def overridefilemerge(origfn, repo, mynode, orig, fcd, fco, fca):
456 def overridefilemerge(origfn, repo, mynode, orig, fcd, fco, fca, labels=None):
454 if not lfutil.isstandin(orig):
457 if not lfutil.isstandin(orig):
455 return origfn(repo, mynode, orig, fcd, fco, fca)
458 return origfn(repo, mynode, orig, fcd, fco, fca, labels=labels)
456
459
457 ahash = fca.data().strip().lower()
460 ahash = fca.data().strip().lower()
458 dhash = fcd.data().strip().lower()
461 dhash = fcd.data().strip().lower()
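The `overridecalculateupdates` hunk above adapts largefiles to a new merge-actions shape: the flat list of `(file, type, args, msg)` tuples becomes a dict keyed by action type, so the code indexes `actions['r']` and `actions['g']` directly instead of filtering on the type field. A small sketch of that conversion (helper name is mine, for illustration):

```python
def byaction(flat):
    """Convert the old flat action list of (file, type, args, msg)
    tuples into the new dict-of-lists form, dropping the now-implicit
    type field from each entry. Illustration only."""
    actions = {'r': [], 'g': []}
    for f, m, args, msg in flat:
        actions.setdefault(m, []).append((f, args, msg))
    return actions
```

Grouping by type also makes the final `newglist.sort()` in the hunk cheap, since only the 'g' bucket needs reordering.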
@@ -989,17 +992,59 b' def overrideforget(orig, ui, repo, *pats'
989
992
990 return result
993 return result
991
994
995 def _getoutgoings(repo, other, missing, addfunc):
996 """get pairs of filename and largefile hash in outgoing revisions
997 in 'missing'.
998
999 largefiles already existing on 'other' repository are ignored.
1000
1001 'addfunc' is invoked with each unique pairs of filename and
1002 largefile hash value.
1003 """
1004 knowns = set()
1005 lfhashes = set()
1006 def dedup(fn, lfhash):
1007 k = (fn, lfhash)
1008 if k not in knowns:
1009 knowns.add(k)
1010 lfhashes.add(lfhash)
1011 lfutil.getlfilestoupload(repo, missing, dedup)
1012 if lfhashes:
1013 lfexists = basestore._openstore(repo, other).exists(lfhashes)
1014 for fn, lfhash in knowns:
1015 if not lfexists[lfhash]: # lfhash doesn't exist on "other"
1016 addfunc(fn, lfhash)
1017
992 def outgoinghook(ui, repo, other, opts, missing):
1018 def outgoinghook(ui, repo, other, opts, missing):
993 if opts.pop('large', None):
1019 if opts.pop('large', None):
994 toupload = set()
1020 lfhashes = set()
995 lfutil.getlfilestoupload(repo, missing,
1021 if ui.debugflag:
996 lambda fn, lfhash: toupload.add(fn))
1022 toupload = {}
1023 def addfunc(fn, lfhash):
1024 if fn not in toupload:
1025 toupload[fn] = []
1026 toupload[fn].append(lfhash)
1027 lfhashes.add(lfhash)
1028 def showhashes(fn):
1029 for lfhash in sorted(toupload[fn]):
1030 ui.debug(' %s\n' % (lfhash))
1031 else:
1032 toupload = set()
1033 def addfunc(fn, lfhash):
1034 toupload.add(fn)
1035 lfhashes.add(lfhash)
1036 def showhashes(fn):
1037 pass
1038 _getoutgoings(repo, other, missing, addfunc)
1039
997 if not toupload:
1040 if not toupload:
998 ui.status(_('largefiles: no files to upload\n'))
1041 ui.status(_('largefiles: no files to upload\n'))
999 else:
1042 else:
1000 ui.status(_('largefiles to upload:\n'))
1043 ui.status(_('largefiles to upload (%d entities):\n')
1044 % (len(lfhashes)))
1001 for file in sorted(toupload):
1045 for file in sorted(toupload):
1002 ui.status(lfutil.splitstandin(file) + '\n')
1046 ui.status(lfutil.splitstandin(file) + '\n')
1047 showhashes(file)
1003 ui.status('\n')
1048 ui.status('\n')
1004
1049
1005 def summaryremotehook(ui, repo, opts, changes):
1050 def summaryremotehook(ui, repo, opts, changes):
@@ -1017,14 +1062,19 b' def summaryremotehook(ui, repo, opts, ch'
1017 return
1062 return
1018
1063
1019 toupload = set()
1064 toupload = set()
1020 lfutil.getlfilestoupload(repo, outgoing.missing,
1065 lfhashes = set()
1021 lambda fn, lfhash: toupload.add(fn))
1066 def addfunc(fn, lfhash):
1067 toupload.add(fn)
1068 lfhashes.add(lfhash)
1069 _getoutgoings(repo, peer, outgoing.missing, addfunc)
1070
1022 if not toupload:
1071 if not toupload:
1023 # i18n: column positioning for "hg summary"
1072 # i18n: column positioning for "hg summary"
1024 ui.status(_('largefiles: (no files to upload)\n'))
1073 ui.status(_('largefiles: (no files to upload)\n'))
1025 else:
1074 else:
1026 # i18n: column positioning for "hg summary"
1075 # i18n: column positioning for "hg summary"
1027 ui.status(_('largefiles: %d to upload\n') % len(toupload))
1076 ui.status(_('largefiles: %d entities for %d files to upload\n')
1077 % (len(lfhashes), len(toupload)))
1028
1078
1029 def overridesummary(orig, ui, repo, *pats, **opts):
1079 def overridesummary(orig, ui, repo, *pats, **opts):
1030 try:
1080 try:
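The new `_getoutgoings` helper above dedups `(filename, hash)` pairs before asking the remote store which hashes exist, so the store is queried once per distinct hash rather than once per file. Its dedup closure can be sketched standalone like this:

```python
def collectunique(pairs):
    """Mirror of the dedup closure in _getoutgoings: remember each
    unique (filename, hash) pair once, plus the set of distinct
    hashes. Sketch for illustration."""
    knowns = set()
    lfhashes = set()
    for fn, lfhash in pairs:
        if (fn, lfhash) not in knowns:
            knowns.add((fn, lfhash))
            lfhashes.add(lfhash)
    return knowns, lfhashes
```

The two sets serve different consumers: `lfhashes` drives the single `exists()` batch query, while `knowns` lets the caller report per-file results afterwards (as the `%d entities` status messages in the hunk do).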
@@ -72,8 +72,6 b' from mercurial import localrepo'
72 from mercurial import subrepo
72 from mercurial import subrepo
73 import os, re, errno, shutil
73 import os, re, errno, shutil
74
74
75 commands.norepo += " qclone"
76
77 seriesopts = [('s', 'summary', None, _('print first line of patch header'))]
75 seriesopts = [('s', 'summary', None, _('print first line of patch header'))]
78
76
79 cmdtable = {}
77 cmdtable = {}
@@ -1026,6 +1024,7 b' class queue(object):'
1026 msg: a string or a no-argument function returning a string
1024 msg: a string or a no-argument function returning a string
1027 """
1025 """
1028 msg = opts.get('msg')
1026 msg = opts.get('msg')
1027 edit = opts.get('edit')
1029 user = opts.get('user')
1028 user = opts.get('user')
1030 date = opts.get('date')
1029 date = opts.get('date')
1031 if date:
1030 if date:
@@ -1078,12 +1077,25 b' class queue(object):'
1078 p.write("# User " + user + "\n")
1077 p.write("# User " + user + "\n")
1079 if date:
1078 if date:
1080 p.write("# Date %s %s\n\n" % date)
1079 p.write("# Date %s %s\n\n" % date)
1081 if util.safehasattr(msg, '__call__'):
1080
1082 msg = msg()
1081 defaultmsg = "[mq]: %s" % patchfn
1083 repo.savecommitmessage(msg)
1082 editor = cmdutil.getcommiteditor()
1084 commitmsg = msg and msg or ("[mq]: %s" % patchfn)
1083 if edit:
1084 def finishdesc(desc):
1085 if desc.rstrip():
1086 return desc
1087 else:
1088 return defaultmsg
1089 # i18n: this message is shown in editor with "HG: " prefix
1090 extramsg = _('Leave message empty to use default message.')
1091 editor = cmdutil.getcommiteditor(finishdesc=finishdesc,
1092 extramsg=extramsg)
1093 commitmsg = msg
1094 else:
1095 commitmsg = msg or defaultmsg
1096
1085 n = newcommit(repo, None, commitmsg, user, date, match=match,
1097 n = newcommit(repo, None, commitmsg, user, date, match=match,
1086 force=True)
1098 force=True, editor=editor)
1087 if n is None:
1099 if n is None:
1088 raise util.Abort(_("repo commit failed"))
1100 raise util.Abort(_("repo commit failed"))
1089 try:
1101 try:
@@ -1092,8 +1104,9 b' class queue(object):'
1092 self.parseseries()
1104 self.parseseries()
1093 self.seriesdirty = True
1105 self.seriesdirty = True
1094 self.applieddirty = True
1106 self.applieddirty = True
1095 if msg:
1107 nctx = repo[n]
1096 msg = msg + "\n\n"
1108 if nctx.description() != defaultmsg.rstrip():
1109 msg = nctx.description() + "\n\n"
1097 p.write(msg)
1110 p.write(msg)
1098 if commitfiles:
1111 if commitfiles:
1099 parent = self.qparents(repo, n)
1112 parent = self.qparents(repo, n)
@@ -1471,6 +1484,7 b' class queue(object):'
1471 self.ui.write(_("no patches applied\n"))
1484 self.ui.write(_("no patches applied\n"))
1472 return 1
1485 return 1
1473 msg = opts.get('msg', '').rstrip()
1486 msg = opts.get('msg', '').rstrip()
1487 edit = opts.get('edit')
1474 newuser = opts.get('user')
1488 newuser = opts.get('user')
1475 newdate = opts.get('date')
1489 newdate = opts.get('date')
1476 if newdate:
1490 if newdate:
@@ -1495,8 +1509,6 b' class queue(object):'
1495
1509
1496 ph = patchheader(self.join(patchfn), self.plainmode)
1510 ph = patchheader(self.join(patchfn), self.plainmode)
1497 diffopts = self.diffopts({'git': opts.get('git')}, patchfn)
1511 diffopts = self.diffopts({'git': opts.get('git')}, patchfn)
1498 if msg:
1499 ph.setmessage(msg)
1500 if newuser:
1512 if newuser:
1501 ph.setuser(newuser)
1513 ph.setuser(newuser)
1502 if newdate:
1514 if newdate:
@@ -1506,10 +1518,6 b' class queue(object):'
1506 # only commit new patch when write is complete
1518 # only commit new patch when write is complete
1507 patchf = self.opener(patchfn, 'w', atomictemp=True)
1519 patchf = self.opener(patchfn, 'w', atomictemp=True)
1508
1520
1509 comments = str(ph)
1510 if comments:
1511 patchf.write(comments)
1512
1513 # update the dirstate in place, strip off the qtip commit
1521 # update the dirstate in place, strip off the qtip commit
1514 # and then commit.
1522 # and then commit.
1515 #
1523 #
@@ -1629,14 +1637,6 b' class queue(object):'
1629 for f in forget:
1637 for f in forget:
1630 repo.dirstate.drop(f)
1638 repo.dirstate.drop(f)
1631
1639
1632 if not msg:
1633 if not ph.message:
1634 message = "[mq]: %s\n" % patchfn
1635 else:
1636 message = "\n".join(ph.message)
1637 else:
1638 message = msg
1639
1640 user = ph.user or changes[1]
1640 user = ph.user or changes[1]
1641
1641
1642 oldphase = repo[top].phase()
1642 oldphase = repo[top].phase()
@@ -1653,16 +1653,41 b' class queue(object):'
1653 try:
1653 try:
1654 # might be nice to attempt to roll back strip after this
1654 # might be nice to attempt to roll back strip after this
1655
1655
1656 defaultmsg = "[mq]: %s" % patchfn
1657 editor = cmdutil.getcommiteditor()
1658 if edit:
1659 def finishdesc(desc):
1660 if desc.rstrip():
1661 ph.setmessage(desc)
1662 return desc
1663 return defaultmsg
1664 # i18n: this message is shown in editor with "HG: " prefix
1665 extramsg = _('Leave message empty to use default message.')
1666 editor = cmdutil.getcommiteditor(finishdesc=finishdesc,
1667 extramsg=extramsg)
1668 message = msg or "\n".join(ph.message)
1669 elif not msg:
1670 if not ph.message:
1671 message = defaultmsg
1672 else:
1673 message = "\n".join(ph.message)
1674 else:
1675 message = msg
1676 ph.setmessage(msg)
1677
1656 # Ensure we create a new changeset in the same phase as
1678 # Ensure we create a new changeset in the same phase as
1657 # the old one.
1679 # the old one.
1658 n = newcommit(repo, oldphase, message, user, ph.date,
1680 n = newcommit(repo, oldphase, message, user, ph.date,
1659 match=match, force=True)
1681 match=match, force=True, editor=editor)
1660 # only write patch after a successful commit
1682 # only write patch after a successful commit
1661 c = [list(x) for x in refreshchanges]
1683 c = [list(x) for x in refreshchanges]
1662 if inclsubs:
1684 if inclsubs:
1663 self.putsubstate2changes(substatestate, c)
1685 self.putsubstate2changes(substatestate, c)
1664 chunks = patchmod.diff(repo, patchparent,
1686 chunks = patchmod.diff(repo, patchparent,
1665 changes=c, opts=diffopts)
1687 changes=c, opts=diffopts)
1688 comments = str(ph)
1689 if comments:
1690 patchf.write(comments)
1666 for chunk in chunks:
1691 for chunk in chunks:
1667 patchf.write(chunk)
1692 patchf.write(chunk)
1668 patchf.close()
1693 patchf.close()
@@ -2228,7 +2253,8 b' def init(ui, repo, **opts):'
2228 ('p', 'patches', '',
2253 ('p', 'patches', '',
2229 _('location of source patch repository'), _('REPO')),
2254 _('location of source patch repository'), _('REPO')),
2230 ] + commands.remoteopts,
2255 ] + commands.remoteopts,
2231 _('hg qclone [OPTION]... SOURCE [DEST]'))
2256 _('hg qclone [OPTION]... SOURCE [DEST]'),
2257 norepo=True)
2232 def clone(ui, source, dest=None, **opts):
2258 def clone(ui, source, dest=None, **opts):
2233 '''clone main and patch repository at same time
2259 '''clone main and patch repository at same time
2234
2260
@@ -2307,7 +2333,8 b' def clone(ui, source, dest=None, **opts)'
2307
2333
2308 @command("qcommit|qci",
2334 @command("qcommit|qci",
2309 commands.table["^commit|ci"][1],
2335 commands.table["^commit|ci"][1],
2310 _('hg qcommit [OPTION]... [FILE]...'))
2336 _('hg qcommit [OPTION]... [FILE]...'),
2337 inferrepo=True)
2311 def commit(ui, repo, *pats, **opts):
2338 def commit(ui, repo, *pats, **opts):
2312 """commit changes in the queue repository (DEPRECATED)
2339 """commit changes in the queue repository (DEPRECATED)
2313
2340
@@ -2390,7 +2417,8 b' def setupheaderopts(ui, opts):'
2390 ('d', 'date', '',
2417 ('d', 'date', '',
2391 _('add "Date: <DATE>" to patch'), _('DATE'))
2418 _('add "Date: <DATE>" to patch'), _('DATE'))
2392 ] + commands.walkopts + commands.commitopts,
2419 ] + commands.walkopts + commands.commitopts,
2393 _('hg qnew [-e] [-m TEXT] [-l FILE] PATCH [FILE]...'))
2420 _('hg qnew [-e] [-m TEXT] [-l FILE] PATCH [FILE]...'),
2421 inferrepo=True)
2394 def new(ui, repo, patch, *args, **opts):
2422 def new(ui, repo, patch, *args, **opts):
2395 """create a new patch
2423 """create a new patch
2396
2424
@@ -2417,14 +2445,8 b' def new(ui, repo, patch, *args, **opts):'
2417 Returns 0 on successful creation of a new patch.
2445 Returns 0 on successful creation of a new patch.
2418 """
2446 """
2419 msg = cmdutil.logmessage(ui, opts)
2447 msg = cmdutil.logmessage(ui, opts)
2420 def getmsg():
2421 return ui.edit(msg, opts.get('user') or ui.username())
2422 q = repo.mq
2448 q = repo.mq
2423 opts['msg'] = msg
2449 opts['msg'] = msg
2424 if opts.get('edit'):
2425 opts['msg'] = getmsg
2426 else:
2427 opts['msg'] = msg
2428 setupheaderopts(ui, opts)
2450 setupheaderopts(ui, opts)
2429 q.new(repo, patch, *args, **opts)
2451 q.new(repo, patch, *args, **opts)
2430 q.savedirty()
2452 q.savedirty()
@@ -2444,7 +2466,8 b' def new(ui, repo, patch, *args, **opts):'
2444 ('d', 'date', '',
2466 ('d', 'date', '',
2445 _('add/update date field in patch with given date'), _('DATE'))
2467 _('add/update date field in patch with given date'), _('DATE'))
2446 ] + commands.walkopts + commands.commitopts,
2468 ] + commands.walkopts + commands.commitopts,
2447 _('hg qrefresh [-I] [-X] [-e] [-m TEXT] [-l FILE] [-s] [FILE]...'))
2469 _('hg qrefresh [-I] [-X] [-e] [-m TEXT] [-l FILE] [-s] [FILE]...'),
2470 inferrepo=True)
2448 def refresh(ui, repo, *pats, **opts):
2471 def refresh(ui, repo, *pats, **opts):
2449 """update the current patch
2472 """update the current patch
2450
2473
@@ -2468,17 +2491,6 b' def refresh(ui, repo, *pats, **opts):'
2468 """
2491 """
2469 q = repo.mq
2492 q = repo.mq
2470 message = cmdutil.logmessage(ui, opts)
2493 message = cmdutil.logmessage(ui, opts)
2471 if opts.get('edit'):
2472 if not q.applied:
2473 ui.write(_("no patches applied\n"))
2474 return 1
2475 if message:
2476 raise util.Abort(_('option "-e" incompatible with "-m" or "-l"'))
2477 patch = q.applied[-1].name
2478 ph = patchheader(q.join(patch), q.plainmode)
2479 message = ui.edit('\n'.join(ph.message), ph.user or ui.username())
2480 # We don't want to lose the patch message if qrefresh fails (issue2062)
2481 repo.savecommitmessage(message)
2482 setupheaderopts(ui, opts)
2494 setupheaderopts(ui, opts)
2483 wlock = repo.wlock()
2495 wlock = repo.wlock()
2484 try:
2496 try:
@@ -2490,7 +2502,8 b' def refresh(ui, repo, *pats, **opts):'
2490
2502
2491 @command("^qdiff",
2503 @command("^qdiff",
2492 commands.diffopts + commands.diffopts2 + commands.walkopts,
2504 commands.diffopts + commands.diffopts2 + commands.walkopts,
2493 _('hg qdiff [OPTION]... [FILE]...'))
2505 _('hg qdiff [OPTION]... [FILE]...'),
2506 inferrepo=True)
2494 def diff(ui, repo, *pats, **opts):
2507 def diff(ui, repo, *pats, **opts):
2495 """diff of the current patch and subsequent modifications
2508 """diff of the current patch and subsequent modifications
2496
2509
@@ -2536,9 +2549,6 b' def fold(ui, repo, *files, **opts):'
2536 q.checklocalchanges(repo)
2549 q.checklocalchanges(repo)
2537
2550
2538 message = cmdutil.logmessage(ui, opts)
2551 message = cmdutil.logmessage(ui, opts)
2539 if opts.get('edit'):
2540 if message:
2541 raise util.Abort(_('option "-e" incompatible with "-m" or "-l"'))
2542
2552
2543 parent = q.lookup('qtip')
2553 parent = q.lookup('qtip')
2544 patches = []
2554 patches = []
@@ -2564,7 +2574,7 b' def fold(ui, repo, *files, **opts):'
2564
2574
2565 if not message:
2575 if not message:
2566 ph = patchheader(q.join(parent), q.plainmode)
2576 ph = patchheader(q.join(parent), q.plainmode)
2567 message, user = ph.message, ph.user
2577 message = ph.message
2568 for msg in messages:
2578 for msg in messages:
2569 if msg:
2579 if msg:
2570 if message:
2580 if message:
@@ -2572,14 +2582,10 b' def fold(ui, repo, *files, **opts):'
2572 message.extend(msg)
2582 message.extend(msg)
2573 message = '\n'.join(message)
2583 message = '\n'.join(message)
2574
2584
2575 if opts.get('edit'):
2576 message = ui.edit(message, user or ui.username())
2577 repo.savecommitmessage(message)
2578
2579 diffopts = q.patchopts(q.diffopts(), *patches)
2585 diffopts = q.patchopts(q.diffopts(), *patches)
2580 wlock = repo.wlock()
2586 wlock = repo.wlock()
2581 try:
2587 try:
2582 q.refresh(repo, msg=message, git=diffopts.git)
2588 q.refresh(repo, msg=message, git=diffopts.git, edit=opts.get('edit'))
2583 q.delete(repo, patches, opts)
2589 q.delete(repo, patches, opts)
2584 q.savedirty()
2590 q.savedirty()
2585 finally:
2591 finally:
@@ -3454,5 +3460,3 b" colortable = {'qguard.negative': 'red',"
3454 'qseries.guarded': 'black bold',
3460 'qseries.guarded': 'black bold',
3455 'qseries.missing': 'red bold',
3461 'qseries.missing': 'red bold',
3456 'qseries.unapplied': 'black bold'}
3462 'qseries.unapplied': 'black bold'}
3457
3458 commands.inferrepo += " qnew qrefresh qdiff qcommit"
@@ -39,12 +39,20 b' paged.'
39
39
40 If pager.attend is present, pager.ignore will be ignored.
40 If pager.attend is present, pager.ignore will be ignored.
41
41
42 Lastly, you can enable and disable paging for individual commands with
43 the attend-<command> option. This setting takes precedence over
44 existing attend and ignore options and defaults::
45
46 [pager]
47 attend-cat = false
48
42 To ignore global commands like :hg:`version` or :hg:`help`, you have
49 To ignore global commands like :hg:`version` or :hg:`help`, you have
43 to specify them in your user configuration file.
50 to specify them in your user configuration file.
44
51
45 The --pager=... option can also be used to control when the pager is
52 The --pager=... option can also be used to control when the pager is
46 used. Use a boolean value like yes, no, on, off, or use auto for
53 used. Use a boolean value like yes, no, on, off, or use auto for
47 normal behavior.
54 normal behavior.
55
48 '''
56 '''
49
57
50 import atexit, sys, os, signal, subprocess, errno, shlex
58 import atexit, sys, os, signal, subprocess, errno, shlex
@@ -116,25 +124,37 b' def uisetup(ui):'
116
124
117 def pagecmd(orig, ui, options, cmd, cmdfunc):
125 def pagecmd(orig, ui, options, cmd, cmdfunc):
118 p = ui.config("pager", "pager", os.environ.get("PAGER"))
126 p = ui.config("pager", "pager", os.environ.get("PAGER"))
127 usepager = False
128 always = util.parsebool(options['pager'])
129 auto = options['pager'] == 'auto'
119
130
120 if p:
131 if not p:
132 pass
133 elif always:
134 usepager = True
135 elif not auto:
136 usepager = False
137 else:
121 attend = ui.configlist('pager', 'attend', attended)
138 attend = ui.configlist('pager', 'attend', attended)
122 auto = options['pager'] == 'auto'
139 ignore = ui.configlist('pager', 'ignore')
123 always = util.parsebool(options['pager'])
124
125 cmds, _ = cmdutil.findcmd(cmd, commands.table)
140 cmds, _ = cmdutil.findcmd(cmd, commands.table)
126
141
127 ignore = ui.configlist('pager', 'ignore')
128 for cmd in cmds:
142 for cmd in cmds:
129 if (always or auto and
143 var = 'attend-%s' % cmd
130 (cmd in attend or
144 if ui.config('pager', var):
131 (cmd not in ignore and not attend))):
145 usepager = ui.configbool('pager', var)
132 ui.setconfig('ui', 'formatted', ui.formatted(), 'pager')
146 break
133 ui.setconfig('ui', 'interactive', False, 'pager')
147 if (cmd in attend or
134 if util.safehasattr(signal, "SIGPIPE"):
148 (cmd not in ignore and not attend)):
135 signal.signal(signal.SIGPIPE, signal.SIG_DFL)
149 usepager = True
136 _runpager(ui, p)
137 break
150 break
151
152 if usepager:
153 ui.setconfig('ui', 'formatted', ui.formatted(), 'pager')
154 ui.setconfig('ui', 'interactive', False, 'pager')
155 if util.safehasattr(signal, "SIGPIPE"):
156 signal.signal(signal.SIGPIPE, signal.SIG_DFL)
157 _runpager(ui, p)
138 return orig(ui, options, cmd, cmdfunc)
158 return orig(ui, options, cmd, cmdfunc)
139
159
140 extensions.wrapfunction(dispatch, '_runcommand', pagecmd)
160 extensions.wrapfunction(dispatch, '_runcommand', pagecmd)
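The rewritten `pagecmd` above separates the pager decision from its side effects: an explicit `--pager` value wins, then a per-command `pager.attend-<cmd>` setting, then the `attend`/`ignore` lists. A standalone sketch of that precedence order; the function name and the dict standing in for the `[pager]` config section are illustrative, not Mercurial's API:

```python
def should_use_pager(pageropt, cmd, config):
    """Decide whether to page `cmd`'s output.

    pageropt: value of --pager ('auto', 'yes', 'no', ...);
    config: dict standing in for the [pager] config section.
    """
    # an explicit --pager=yes/no overrides everything
    if pageropt in ('yes', 'on', 'true', '1'):
        return True
    if pageropt in ('no', 'off', 'false', '0'):
        return False
    # --pager=auto: a per-command attend-<cmd> setting comes next
    percmd = config.get('attend-%s' % cmd)
    if percmd is not None:
        return bool(percmd)
    # finally fall back to the attend/ignore lists
    attend = config.get('attend', [])
    ignore = config.get('ignore', [])
    return cmd in attend or (cmd not in ignore and not attend)
```

With `[pager] attend-cat = false` as in the docstring example, `cat` is never paged even under `--pager=auto`, while other commands keep the default behavior.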
@@ -149,6 +149,8 b' def makepatch(ui, repo, patchlines, opts'
149 subj = '[PATCH %0*d of %d%s] %s' % (tlen, idx, total, flag, subj)
149 subj = '[PATCH %0*d of %d%s] %s' % (tlen, idx, total, flag, subj)
150 msg['Subject'] = mail.headencode(ui, subj, _charsets, opts.get('test'))
150 msg['Subject'] = mail.headencode(ui, subj, _charsets, opts.get('test'))
151 msg['X-Mercurial-Node'] = node
151 msg['X-Mercurial-Node'] = node
152 msg['X-Mercurial-Series-Index'] = '%i' % idx
153 msg['X-Mercurial-Series-Total'] = '%i' % total
152 return msg, subj, ds
154 return msg, subj, ds
153
155
154 emailopts = [
156 emailopts = [
@@ -507,9 +509,13 b' def patchbomb(ui, repo, *revs, **opts):'
507 sender_addr = email.Utils.parseaddr(sender)[1]
509 sender_addr = email.Utils.parseaddr(sender)[1]
508 sender = mail.addressencode(ui, sender, _charsets, opts.get('test'))
510 sender = mail.addressencode(ui, sender, _charsets, opts.get('test'))
509 sendmail = None
511 sendmail = None
512 firstpatch = None
510 for i, (m, subj, ds) in enumerate(msgs):
513 for i, (m, subj, ds) in enumerate(msgs):
511 try:
514 try:
512 m['Message-Id'] = genmsgid(m['X-Mercurial-Node'])
515 m['Message-Id'] = genmsgid(m['X-Mercurial-Node'])
516 if not firstpatch:
517 firstpatch = m['Message-Id']
518 m['X-Mercurial-Series-Id'] = firstpatch
513 except TypeError:
519 except TypeError:
514 m['Message-Id'] = genmsgid('patchbomb')
520 m['Message-Id'] = genmsgid('patchbomb')
515 if parent:
521 if parent:
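Taken together, the two patchbomb hunks above make a patch series machine-identifiable: every message carries its index and the series total, and all messages reuse the first patch's Message-Id as `X-Mercurial-Series-Id`. A self-contained sketch with the stdlib `email` module (`tagseries` is a hypothetical helper, not patchbomb's actual function):

```python
from email.message import Message

def tagseries(msgs, msgids):
    """Stamp series headers onto a list of patch messages.

    msgs: list of email.message.Message, one per patch;
    msgids: a precomputed Message-Id per message.
    """
    total = len(msgs)
    firstpatch = None
    for idx, (m, mid) in enumerate(zip(msgs, msgids), 1):
        m['Message-Id'] = mid
        m['X-Mercurial-Series-Index'] = '%i' % idx
        m['X-Mercurial-Series-Total'] = '%i' % total
        # every message in the series points at the first patch's id
        if firstpatch is None:
            firstpatch = mid
        m['X-Mercurial-Series-Id'] = firstpatch
    return msgs
```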
@@ -41,6 +41,8 b' import time'
41 from mercurial.i18n import _
41 from mercurial.i18n import _
42 testedwith = 'internal'
42 testedwith = 'internal'
43
43
44 from mercurial import encoding
45
44 def spacejoin(*args):
46 def spacejoin(*args):
45 return ' '.join(s for s in args if s)
47 return ' '.join(s for s in args if s)
46
48
@@ -137,10 +139,10 b' class progbar(object):'
137 else:
139 else:
138 wid = 20
140 wid = 20
139 if slice == 'end':
141 if slice == 'end':
140 add = item[-wid:]
142 add = encoding.trim(item, wid, leftside=True)
141 else:
143 else:
142 add = item[:wid]
144 add = encoding.trim(item, wid)
143 add += (wid - len(add)) * ' '
145 add += (wid - encoding.colwidth(add)) * ' '
144 elif indicator == 'bar':
146 elif indicator == 'bar':
145 add = ''
147 add = ''
146 needprogress = True
148 needprogress = True
@@ -157,9 +159,9 b' class progbar(object):'
157 if needprogress:
159 if needprogress:
158 used = 0
160 used = 0
159 if head:
161 if head:
160 used += len(head) + 1
162 used += encoding.colwidth(head) + 1
161 if tail:
163 if tail:
162 used += len(tail) + 1
164 used += encoding.colwidth(tail) + 1
163 progwidth = termwidth - used - 3
165 progwidth = termwidth - used - 3
164 if total and pos <= total:
166 if total and pos <= total:
165 amt = pos * progwidth // total
167 amt = pos * progwidth // total
@@ -180,7 +182,7 b' class progbar(object):'
180 out = spacejoin(head, prog, tail)
182 out = spacejoin(head, prog, tail)
181 else:
183 else:
182 out = spacejoin(head, tail)
184 out = spacejoin(head, tail)
183 sys.stderr.write('\r' + out[:termwidth])
185 sys.stderr.write('\r' + encoding.trim(out, termwidth))
184 self.lasttopic = topic
186 self.lasttopic = topic
185 sys.stderr.flush()
187 sys.stderr.flush()
186
188
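The progress hunks above replace byte slicing (`item[:wid]`, `len(...)`) with `encoding.trim` and `encoding.colwidth`, because East Asian wide characters occupy two terminal columns and naive slicing misaligns the bar. A rough stdlib-only stand-in for those helpers, assuming only that wide/fullwidth characters count as two columns (the real implementations also deal with local encodings and ellipsis):

```python
import unicodedata

def colwidth(s):
    # wide ('W') and fullwidth ('F') characters take two terminal columns
    return sum(2 if unicodedata.east_asian_width(c) in ('W', 'F') else 1
               for c in s)

def trim(s, width, leftside=False):
    # drop characters from one end until the string fits in `width` columns
    chars = list(s)
    while chars and colwidth(''.join(chars)) > width:
        del chars[0 if leftside else -1]
    return ''.join(chars)
```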
@@ -35,6 +35,8 b" testedwith = 'internal'"
35 @command('purge|clean',
35 @command('purge|clean',
36 [('a', 'abort-on-err', None, _('abort if an error occurs')),
36 [('a', 'abort-on-err', None, _('abort if an error occurs')),
37 ('', 'all', None, _('purge ignored files too')),
37 ('', 'all', None, _('purge ignored files too')),
38 ('', 'dirs', None, _('purge empty directories')),
39 ('', 'files', None, _('purge files')),
38 ('p', 'print', None, _('print filenames instead of deleting them')),
40 ('p', 'print', None, _('print filenames instead of deleting them')),
39 ('0', 'print0', None, _('end filenames with NUL, for use with xargs'
41 ('0', 'print0', None, _('end filenames with NUL, for use with xargs'
40 ' (implies -p/--print)')),
42 ' (implies -p/--print)')),
@@ -46,7 +48,7 b' def purge(ui, repo, *dirs, **opts):'
46 Delete files not known to Mercurial. This is useful to test local
48 Delete files not known to Mercurial. This is useful to test local
47 and uncommitted changes in an otherwise-clean source tree.
49 and uncommitted changes in an otherwise-clean source tree.
48
50
49 This means that purge will delete:
51 This means that purge will delete the following by default:
50
52
51 - Unknown files: files marked with "?" by :hg:`status`
53 - Unknown files: files marked with "?" by :hg:`status`
52 - Empty directories: in fact Mercurial ignores directories unless
54 - Empty directories: in fact Mercurial ignores directories unless
@@ -58,6 +60,10 b' def purge(ui, repo, *dirs, **opts):'
58 - Ignored files (unless --all is specified)
60 - Ignored files (unless --all is specified)
59 - New files added to the repository (with :hg:`add`)
61 - New files added to the repository (with :hg:`add`)
60
62
63 The --files and --dirs options can be used to direct purge to delete
64 only files, only directories, or both. If neither option is given,
65 both will be deleted.
66
61 If directories are given on the command line, only files in these
67 If directories are given on the command line, only files in these
62 directories are considered.
68 directories are considered.
63
69
@@ -71,6 +77,11 b' def purge(ui, repo, *dirs, **opts):'
71 if opts['print0']:
77 if opts['print0']:
72 eol = '\0'
78 eol = '\0'
73 act = False # --print0 implies --print
79 act = False # --print0 implies --print
80 removefiles = opts['files']
81 removedirs = opts['dirs']
82 if not removefiles and not removedirs:
83 removefiles = True
84 removedirs = True
74
85
75 def remove(remove_func, name):
86 def remove(remove_func, name):
76 if act:
87 if act:
@@ -100,13 +111,15 b' def purge(ui, repo, *dirs, **opts):'
100 match.explicitdir = match.traversedir = directories.append
111 match.explicitdir = match.traversedir = directories.append
101 status = repo.status(match=match, ignored=opts['all'], unknown=True)
112 status = repo.status(match=match, ignored=opts['all'], unknown=True)
102
113
103 for f in sorted(status[4] + status[5]):
114 if removefiles:
104 if act:
115 for f in sorted(status[4] + status[5]):
105 ui.note(_('removing file %s\n') % f)
116 if act:
106 remove(removefile, f)
117 ui.note(_('removing file %s\n') % f)
118 remove(removefile, f)
107
119
108 for f in sorted(directories, reverse=True):
120 if removedirs:
109 if match(f) and not os.listdir(repo.wjoin(f)):
121 for f in sorted(directories, reverse=True):
110 if act:
122 if match(f) and not os.listdir(repo.wjoin(f)):
111 ui.note(_('removing directory %s\n') % f)
123 if act:
112 remove(os.rmdir, f)
124 ui.note(_('removing directory %s\n') % f)
125 remove(os.rmdir, f)
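The purge option handling above reduces to one defaulting rule: if neither `--files` nor `--dirs` is given, remove both; otherwise remove only what was asked for. A minimal sketch of that rule (the helper name is illustrative):

```python
def purge_targets(files_opt, dirs_opt):
    """Map the --files/--dirs flags to (removefiles, removedirs)."""
    removefiles = files_opt
    removedirs = dirs_opt
    # neither flag given: keep the historical behavior and purge both
    if not removefiles and not removedirs:
        removefiles = removedirs = True
    return removefiles, removedirs
```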
@@ -138,9 +138,7 b' def rebase(ui, repo, **opts):'
138 skipped = set()
138 skipped = set()
139 targetancestors = set()
139 targetancestors = set()
140
140
141 editor = None
141 editor = cmdutil.getcommiteditor(**opts)
142 if opts.get('edit'):
143 editor = cmdutil.commitforceeditor
144
142
145 lock = wlock = None
143 lock = wlock = None
146 try:
144 try:
@@ -385,7 +383,7 b' def rebase(ui, repo, **opts):'
385 for rebased in state:
383 for rebased in state:
386 if rebased not in skipped and state[rebased] > nullmerge:
384 if rebased not in skipped and state[rebased] > nullmerge:
387 commitmsg += '\n* %s' % repo[rebased].description()
385 commitmsg += '\n* %s' % repo[rebased].description()
388 editor = cmdutil.commitforceeditor
386 editor = cmdutil.getcommiteditor(edit=True)
389 newrev = concludenode(repo, rev, p1, external, commitmsg=commitmsg,
387 newrev = concludenode(repo, rev, p1, external, commitmsg=commitmsg,
390 extrafn=extrafn, editor=editor)
388 extrafn=extrafn, editor=editor)
391 for oldrev in state.iterkeys():
389 for oldrev in state.iterkeys():
@@ -542,7 +540,8 b' def rebasenode(repo, rev, p1, state, col'
542 repo.ui.debug(" detach base %d:%s\n" % (repo[base].rev(), repo[base]))
540 repo.ui.debug(" detach base %d:%s\n" % (repo[base].rev(), repo[base]))
543 # When collapsing in-place, the parent is the common ancestor, we
541 # When collapsing in-place, the parent is the common ancestor, we
544 # have to allow merging with it.
542 # have to allow merging with it.
545 return merge.update(repo, rev, True, True, False, base, collapse)
543 return merge.update(repo, rev, True, True, False, base, collapse,
544 labels=['dest', 'source'])
546
545
547 def nearestrebased(repo, rev, state):
546 def nearestrebased(repo, rev, state):
548 """return the nearest ancestors of rev in the rebase result"""
547 """return the nearest ancestors of rev in the rebase result"""
@@ -459,6 +459,11 b' def qrefresh(origfn, ui, repo, *pats, **'
459 # backup all changed files
459 # backup all changed files
460 dorecord(ui, repo, committomq, 'qrefresh', True, *pats, **opts)
460 dorecord(ui, repo, committomq, 'qrefresh', True, *pats, **opts)
461
461
462 # This command registration is replaced during uisetup().
463 @command('qrecord',
464 [],
465 _('hg qrecord [OPTION]... PATCH [FILE]...'),
466 inferrepo=True)
462 def qrecord(ui, repo, patch, *pats, **opts):
467 def qrecord(ui, repo, patch, *pats, **opts):
463 '''interactively record a new patch
468 '''interactively record a new patch
464
469
@@ -598,9 +603,7 b' def dorecord(ui, repo, commitfunc, cmdsu'
598 # patch. Now is the time to delegate the job to
603 # patch. Now is the time to delegate the job to
599 # commit/qrefresh or the like!
604 # commit/qrefresh or the like!
600
605
601 # it is important to first chdir to repo root -- we'll call
606 # Make all of the pathnames absolute.
602 # a highlevel command with list of pathnames relative to
603 # repo root
604 newfiles = [repo.wjoin(nf) for nf in newfiles]
607 newfiles = [repo.wjoin(nf) for nf in newfiles]
605 commitfunc(ui, repo, *newfiles, **opts)
608 commitfunc(ui, repo, *newfiles, **opts)
606
609
@@ -637,10 +640,6 b' def dorecord(ui, repo, commitfunc, cmdsu'
637 finally:
640 finally:
638 ui.write = oldwrite
641 ui.write = oldwrite
639
642
640 cmdtable["qrecord"] = \
641 (qrecord, [], # placeholder until mq is available
642 _('hg qrecord [OPTION]... PATCH [FILE]...'))
643
644 def uisetup(ui):
643 def uisetup(ui):
645 try:
644 try:
646 mq = extensions.find('mq')
645 mq = extensions.find('mq')
@@ -661,5 +660,3 b' def uisetup(ui):'
661 def _wrapcmd(cmd, table, wrapfn, msg):
660 def _wrapcmd(cmd, table, wrapfn, msg):
662 entry = extensions.wrapcommand(table, cmd, wrapfn)
661 entry = extensions.wrapcommand(table, cmd, wrapfn)
663 entry[1].append(('i', 'interactive', None, msg))
662 entry[1].append(('i', 'interactive', None, msg))
664
665 commands.inferrepo += " record qrecord"
@@ -7,12 +7,15 b''
7
7
8 """recreates hardlinks between repository clones"""
8 """recreates hardlinks between repository clones"""
9
9
10 from mercurial import hg, util
10 from mercurial import cmdutil, hg, util
11 from mercurial.i18n import _
11 from mercurial.i18n import _
12 import os, stat
12 import os, stat
13
13
14 cmdtable = {}
15 command = cmdutil.command(cmdtable)
14 testedwith = 'internal'
16 testedwith = 'internal'
15
17
18 @command('relink', [], _('[ORIGIN]'))
16 def relink(ui, repo, origin=None, **opts):
19 def relink(ui, repo, origin=None, **opts):
17 """recreate hardlinks between two repositories
20 """recreate hardlinks between two repositories
18
21
@@ -178,11 +181,3 b' def do_relink(src, dst, files, ui):'
178
181
179 ui.status(_('relinked %d files (%s reclaimed)\n') %
182 ui.status(_('relinked %d files (%s reclaimed)\n') %
180 (relinked, util.bytecount(savedbytes)))
183 (relinked, util.bytecount(savedbytes)))
181
182 cmdtable = {
183 'relink': (
184 relink,
185 [],
186 _('[ORIGIN]')
187 )
188 }
@@ -6,10 +6,16 b''
6 '''share a common history between several working directories'''
6 '''share a common history between several working directories'''
7
7
8 from mercurial.i18n import _
8 from mercurial.i18n import _
9 from mercurial import hg, commands, util
9 from mercurial import cmdutil, hg, util
10
10
11 cmdtable = {}
12 command = cmdutil.command(cmdtable)
11 testedwith = 'internal'
13 testedwith = 'internal'
12
14
15 @command('share',
16 [('U', 'noupdate', None, _('do not create a working copy'))],
17 _('[-U] SOURCE [DEST]'),
18 norepo=True)
13 def share(ui, source, dest=None, noupdate=False):
19 def share(ui, source, dest=None, noupdate=False):
14 """create a new shared repository
20 """create a new shared repository
15
21
@@ -30,6 +36,7 b' def share(ui, source, dest=None, noupdat'
30
36
31 return hg.share(ui, source, dest, not noupdate)
37 return hg.share(ui, source, dest, not noupdate)
32
38
39 @command('unshare', [], '')
33 def unshare(ui, repo):
40 def unshare(ui, repo):
34 """convert a shared repository to a normal one
41 """convert a shared repository to a normal one
35
42
@@ -60,16 +67,3 b' def unshare(ui, repo):'
60
67
61 # update store, spath, sopener and sjoin of repo
68 # update store, spath, sopener and sjoin of repo
62 repo.unfiltered().__init__(repo.baseui, repo.root)
69 repo.unfiltered().__init__(repo.baseui, repo.root)
63
64 cmdtable = {
65 "share":
66 (share,
67 [('U', 'noupdate', None, _('do not create a working copy'))],
68 _('[-U] SOURCE [DEST]')),
69 "unshare":
70 (unshare,
71 [],
72 ''),
73 }
74
75 commands.norepo += " share"
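Both relink and share move from hand-written `cmdtable` dicts (and string appends to `commands.norepo`) to the `@command` decorator from `cmdutil`. A simplified sketch of how such a decorator populates the table at import time; this stand-in omits the `norepo`/`inferrepo` bookkeeping the real `cmdutil.command` performs:

```python
cmdtable = {}

def command(name, options=(), synopsis=''):
    """Register the decorated function as command `name` in cmdtable."""
    def decorator(func):
        cmdtable[name] = (func, list(options), synopsis)
        return func
    return decorator

@command('relink', [], '[ORIGIN]')
def relink(ui, repo, origin=None, **opts):
    """recreate hardlinks between two repositories (stub body)"""
    return 0
```

The decorated function stays directly callable, while the table entry is built as a side effect of module import, which is what lets the extensions above drop their trailing dict literals.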
@@ -178,7 +178,8 b' def createcmd(ui, repo, pats, opts):'
178 if hasmq:
178 if hasmq:
179 saved, repo.mq.checkapplied = repo.mq.checkapplied, False
179 saved, repo.mq.checkapplied = repo.mq.checkapplied, False
180 try:
180 try:
181 return repo.commit(message, user, opts.get('date'), match)
181 return repo.commit(message, user, opts.get('date'), match,
182 editor=cmdutil.getcommiteditor(**opts))
182 finally:
183 finally:
183 if hasmq:
184 if hasmq:
184 repo.mq.checkapplied = saved
185 repo.mq.checkapplied = saved
@@ -635,6 +636,8 b' def unshelve(ui, repo, *shelved, **opts)'
635 _('shelve with the specified commit date'), _('DATE')),
636 _('shelve with the specified commit date'), _('DATE')),
636 ('d', 'delete', None,
637 ('d', 'delete', None,
637 _('delete the named shelved change(s)')),
638 _('delete the named shelved change(s)')),
639 ('e', 'edit', False,
640 _('invoke editor on commit messages')),
638 ('l', 'list', None,
641 ('l', 'list', None,
639 _('list current shelves')),
642 _('list current shelves')),
640 ('m', 'message', '',
643 ('m', 'message', '',
@@ -675,20 +678,32 b' def shelvecmd(ui, repo, *pats, **opts):'
675 '''
678 '''
676 cmdutil.checkunfinished(repo)
679 cmdutil.checkunfinished(repo)
677
680
678 def checkopt(opt, incompatible):
681 allowables = [
682 ('addremove', 'create'), # 'create' is pseudo action
683 ('cleanup', 'cleanup'),
684 # ('date', 'create'), # ignored for passing '--date "0 0"' in tests
685 ('delete', 'delete'),
686 ('edit', 'create'),
687 ('list', 'list'),
688 ('message', 'create'),
689 ('name', 'create'),
690 ('patch', 'list'),
691 ('stat', 'list'),
692 ]
693 def checkopt(opt):
679 if opts[opt]:
694 if opts[opt]:
680 for i in incompatible.split():
695 for i, allowable in allowables:
681 if opts[i]:
696 if opts[i] and opt != allowable:
682 raise util.Abort(_("options '--%s' and '--%s' may not be "
697 raise util.Abort(_("options '--%s' and '--%s' may not be "
683 "used together") % (opt, i))
698 "used together") % (opt, i))
684 return True
699 return True
685 if checkopt('cleanup', 'addremove delete list message name patch stat'):
700 if checkopt('cleanup'):
686 if pats:
701 if pats:
687 raise util.Abort(_("cannot specify names when using '--cleanup'"))
702 raise util.Abort(_("cannot specify names when using '--cleanup'"))
688 return cleanupcmd(ui, repo)
703 return cleanupcmd(ui, repo)
689 elif checkopt('delete', 'addremove cleanup list message name patch stat'):
704 elif checkopt('delete'):
690 return deletecmd(ui, repo, pats)
705 return deletecmd(ui, repo, pats)
691 elif checkopt('list', 'addremove cleanup delete message name'):
706 elif checkopt('list'):
692 return listcmd(ui, repo, pats, opts)
707 return listcmd(ui, repo, pats, opts)
693 else:
708 else:
694 for i in ('patch', 'stat'):
709 for i in ('patch', 'stat'):
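The shelve refactor above replaces per-option blacklist strings with one table mapping each option to the single sub-action it is allowed with. A standalone sketch of that validation (plain dicts and `ValueError` stand in for `opts` and `util.Abort`):

```python
# each option is valid with exactly one sub-action
ALLOWABLES = [
    ('cleanup', 'cleanup'),
    ('delete', 'delete'),
    ('edit', 'create'),
    ('list', 'list'),
    ('message', 'create'),
    ('patch', 'list'),
    ('stat', 'list'),
]

def checkopt(opt, opts):
    """Return True if sub-action `opt` is requested; reject any other
    set option that belongs to a different sub-action."""
    if opts.get(opt):
        for i, allowable in ALLOWABLES:
            if opts.get(i) and opt != allowable:
                raise ValueError("options '--%s' and '--%s' may not be "
                                 "used together" % (opt, i))
        return True
    return False
```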
@@ -42,7 +42,7 b' def checklocalchanges(repo, force=False,'
42 raise util.Abort(_("local changed subrepos found" + excsuffix))
42 raise util.Abort(_("local changed subrepos found" + excsuffix))
43 return m, a, r, d
43 return m, a, r, d
44
44
45 def strip(ui, repo, revs, update=True, backup="all", force=None):
45 def strip(ui, repo, revs, update=True, backup="all", force=None, bookmark=None):
46 wlock = lock = None
46 wlock = lock = None
47 try:
47 try:
48 wlock = repo.wlock()
48 wlock = repo.wlock()
@@ -59,6 +59,14 b' def strip(ui, repo, revs, update=True, b'
59 repo.dirstate.write()
59 repo.dirstate.write()
60
60
61 repair.strip(ui, repo, revs, backup)
61 repair.strip(ui, repo, revs, backup)
62
63 marks = repo._bookmarks
64 if bookmark:
65 if bookmark == repo._bookmarkcurrent:
66 bookmarks.unsetcurrent(repo)
67 del marks[bookmark]
68 marks.write()
69 ui.write(_("bookmark '%s' deleted\n") % bookmark)
62 finally:
70 finally:
63 release(lock, wlock)
71 release(lock, wlock)
64
72
@@ -70,9 +78,6 b' def strip(ui, repo, revs, update=True, b'
70 'option)'), _('REV')),
78 'option)'), _('REV')),
71 ('f', 'force', None, _('force removal of changesets, discard '
79 ('f', 'force', None, _('force removal of changesets, discard '
72 'uncommitted changes (no backup)')),
80 'uncommitted changes (no backup)')),
73 ('b', 'backup', None, _('bundle only changesets with local revision'
74 ' number greater than REV which are not'
75 ' descendants of REV (DEPRECATED)')),
76 ('', 'no-backup', None, _('no backups')),
81 ('', 'no-backup', None, _('no backups')),
77 ('', 'nobackup', None, _('no backups (DEPRECATED)')),
82 ('', 'nobackup', None, _('no backups (DEPRECATED)')),
78 ('n', '', None, _('ignored (DEPRECATED)')),
83 ('n', '', None, _('ignored (DEPRECATED)')),
@@ -205,15 +210,9 b' def stripcmd(ui, repo, *revs, **opts):'
205 repo.dirstate.write()
210 repo.dirstate.write()
206 update = False
211 update = False
207
212
208 if opts.get('bookmark'):
209 if mark == repo._bookmarkcurrent:
210 bookmarks.unsetcurrent(repo)
211 del marks[mark]
212 marks.write()
213 ui.write(_("bookmark '%s' deleted\n") % mark)
214
213
215 strip(ui, repo, revs, backup=backup, update=update,
214 strip(ui, repo, revs, backup=backup, update=update,
216 force=opts.get('force'))
215 force=opts.get('force'), bookmark=opts.get('bookmark'))
217 finally:
216 finally:
218 wlock.release()
217 wlock.release()
219
218
@@ -80,13 +80,13 b' class transplants(object):'
80 self.dirty = True
80 self.dirty = True
81
81
82 class transplanter(object):
82 class transplanter(object):
83 def __init__(self, ui, repo):
83 def __init__(self, ui, repo, opts):
84 self.ui = ui
84 self.ui = ui
85 self.path = repo.join('transplant')
85 self.path = repo.join('transplant')
86 self.opener = scmutil.opener(self.path)
86 self.opener = scmutil.opener(self.path)
87 self.transplants = transplants(self.path, 'transplants',
87 self.transplants = transplants(self.path, 'transplants',
88 opener=self.opener)
88 opener=self.opener)
89 self.editor = None
89 self.editor = cmdutil.getcommiteditor(**opts)
90
90
91 def applied(self, repo, node, parent):
91 def applied(self, repo, node, parent):
92 '''returns True if a node is already an ancestor of parent
92 '''returns True if a node is already an ancestor of parent
@@ -599,9 +599,7 b' def transplant(ui, repo, *revs, **opts):'
599 if not opts.get('filter'):
599 if not opts.get('filter'):
600 opts['filter'] = ui.config('transplant', 'filter')
600 opts['filter'] = ui.config('transplant', 'filter')
601
601
602 tp = transplanter(ui, repo)
602 tp = transplanter(ui, repo, opts)
603 if opts.get('edit'):
604 tp.editor = cmdutil.commitforceeditor
605
603
606 cmdutil.checkunfinished(repo)
604 cmdutil.checkunfinished(repo)
607 p1, p2 = repo.dirstate.parents()
605 p1, p2 = repo.dirstate.parents()
@@ -58,7 +58,7 b' def read(repo):'
58 if repo.filtername is not None:
58 if repo.filtername is not None:
59 msg += ' (%s)' % repo.filtername
59 msg += ' (%s)' % repo.filtername
60 msg += ': %s\n'
60 msg += ': %s\n'
61 repo.ui.warn(msg % inst)
61 repo.ui.debug(msg % inst)
62 partial = None
62 partial = None
63 return partial
63 return partial
64
64
@@ -221,7 +221,8 b' class branchcache(dict):'
221 repo.ui.log('branchcache',
221 repo.ui.log('branchcache',
222 'wrote %s branch cache with %d labels and %d nodes\n',
222 'wrote %s branch cache with %d labels and %d nodes\n',
223 repo.filtername, len(self), nodecount)
223 repo.filtername, len(self), nodecount)
224 except (IOError, OSError, util.Abort):
224 except (IOError, OSError, util.Abort), inst:
225 repo.ui.debug("couldn't write branch cache: %s\n" % inst)
225 # Abort may be raised by a read-only opener
226 # Abort may be raised by a read-only opener
226 pass
227 pass
227
228
@@ -113,6 +113,8 b' Binary format is as follow'
113
113
114 Mandatory parameters come first, then the advisory ones.
114 Mandatory parameters come first, then the advisory ones.
115
115
116 Each parameter's key MUST be unique within the part.
117
116 :payload:
118 :payload:
117
119
118 payload is a series of `<chunksize><chunkdata>`.
120 payload is a series of `<chunksize><chunkdata>`.
@@ -144,6 +146,7 b' import util'
144 import struct
146 import struct
145 import urllib
147 import urllib
146 import string
148 import string
149 import pushkey
147
150
148 import changegroup, error
151 import changegroup, error
149 from i18n import _
152 from i18n import _
@@ -170,18 +173,14 b' def _makefpartparamsizes(nbparams):'
170 """
173 """
171 return '>'+('BB'*nbparams)
174 return '>'+('BB'*nbparams)
172
175
173 class UnknownPartError(KeyError):
174 """error raised when no handler is found for a Mandatory part"""
175 pass
176
177 parthandlermapping = {}
176 parthandlermapping = {}
178
177
179 def parthandler(parttype):
178 def parthandler(parttype, params=()):
180 """decorator that registers a function as a bundle2 part handler
179 """decorator that registers a function as a bundle2 part handler
181
180
182 eg::
181 eg::
183
182
184 @parthandler('myparttype')
183 @parthandler('myparttype', ('mandatory', 'param', 'handled'))
185 def myparttypehandler(...):
184 def myparttypehandler(...):
186 '''process a part of type "my part".'''
185 '''process a part of type "my part".'''
187 ...
186 ...
@@ -190,6 +189,7 b' def parthandler(parttype):'
190 lparttype = parttype.lower() # enforce lower case matching.
189 lparttype = parttype.lower() # enforce lower case matching.
191 assert lparttype not in parthandlermapping
190 assert lparttype not in parthandlermapping
192 parthandlermapping[lparttype] = func
191 parthandlermapping[lparttype] = func
192 func.params = frozenset(params)
193 return func
193 return func
194 return _decorator
194 return _decorator
195
195
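The registration mechanism in the hunk above can be sketched as a standalone decorator. This is a simplified stand-in for the real `parthandlermapping`/`parthandler` pair in `mercurial/bundle2.py`, with a placeholder handler body.

```python
# Simplified sketch of the part-handler registry; the real one lives
# in mercurial/bundle2.py.
parthandlermapping = {}

def parthandler(parttype, params=()):
    """Register a function as a bundle2 part handler (sketch)."""
    def _decorator(func):
        lparttype = parttype.lower()  # part types match case-insensitively
        assert lparttype not in parthandlermapping
        parthandlermapping[lparttype] = func
        func.params = frozenset(params)  # parameters the handler understands
        return func
    return _decorator

@parthandler('b2x:myparttype', ('mandatory', 'param', 'handled'))
def myparttypehandler(op, inpart):
    """process a part of type "my part" (placeholder body)."""
    return inpart
```

Recording `func.params` is what later lets `processbundle` reject a part whose mandatory keys the handler does not understand.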
@@ -295,18 +295,25 b' def processbundle(repo, unbundler, trans'
295 # part key are matched lower case
295 # part key are matched lower case
296 key = parttype.lower()
296 key = parttype.lower()
297 try:
297 try:
298 handler = parthandlermapping[key]
298 handler = parthandlermapping.get(key)
299 if handler is None:
300 raise error.BundleValueError(parttype=key)
299 op.ui.debug('found a handler for part %r\n' % parttype)
301 op.ui.debug('found a handler for part %r\n' % parttype)
300 except KeyError:
302 unknownparams = part.mandatorykeys - handler.params
303 if unknownparams:
304 unknownparams = list(unknownparams)
305 unknownparams.sort()
306 raise error.BundleValueError(parttype=key,
307 params=unknownparams)
308 except error.BundleValueError, exc:
301 if key != parttype: # mandatory parts
309 if key != parttype: # mandatory parts
302 # todo:
310 raise
303 # - use a more precise exception
311 op.ui.debug('ignoring unsupported advisory part %s\n' % exc)
304 raise UnknownPartError(key)
305 op.ui.debug('ignoring unknown advisory part %r\n' % key)
306 # consuming the part
312 # consuming the part
307 part.read()
313 part.read()
308 continue
314 continue
309
315
316
310 # handler is called outside the above try block so that we don't
317 # handler is called outside the above try block so that we don't
311 # risk catching KeyErrors from anything other than the
318 # risk catching KeyErrors from anything other than the
312 # parthandlermapping lookup (any KeyError raised by handler()
319 # parthandlermapping lookup (any KeyError raised by handler()
@@ -321,11 +328,8 b' def processbundle(repo, unbundler, trans'
321 if output is not None:
328 if output is not None:
322 output = op.ui.popbuffer()
329 output = op.ui.popbuffer()
323 if output:
330 if output:
324 outpart = bundlepart('b2x:output',
331 outpart = op.reply.newpart('b2x:output', data=output)
325 advisoryparams=[('in-reply-to',
332 outpart.addparam('in-reply-to', str(part.id), mandatory=False)
326 str(part.id))],
327 data=output)
328 op.reply.addpart(outpart)
329 part.read()
333 part.read()
330 except Exception, exc:
334 except Exception, exc:
331 if part is not None:
335 if part is not None:
@@ -381,7 +385,7 b' def encodecaps(caps):'
381 class bundle20(object):
385 class bundle20(object):
382 """represent an outgoing bundle2 container
386 """represent an outgoing bundle2 container
383
387
384 Use the `addparam` method to add a stream level parameter, and `addpart` to
388 Use the `addparam` method to add a stream level parameter, and `newpart` to
385 populate it. Then call `getchunks` to retrieve all the binary chunks of
389 populate it. Then call `getchunks` to retrieve all the binary chunks of
386 data that compose the bundle2 container."""
390 data that compose the bundle2 container."""
387
391
@@ -391,6 +395,12 b' class bundle20(object):'
391 self._parts = []
395 self._parts = []
392 self.capabilities = dict(capabilities)
396 self.capabilities = dict(capabilities)
393
397
398 @property
399 def nbparts(self):
400 """total number of parts added to the bundler"""
401 return len(self._parts)
402
403 # methods used to define the bundle2 content
394 def addparam(self, name, value=None):
404 def addparam(self, name, value=None):
395 """add a stream level parameter"""
405 """add a stream level parameter"""
396 if not name:
406 if not name:
@@ -407,6 +417,20 b' class bundle20(object):'
407 part.id = len(self._parts) # very cheap counter
417 part.id = len(self._parts) # very cheap counter
408 self._parts.append(part)
418 self._parts.append(part)
409
419
420 def newpart(self, typeid, *args, **kwargs):
421 """create a new part and add it to the container
422
423 The part is directly added to the container. For now, this means
424 that any failure to properly initialize the part after calling
425 ``newpart`` will result in a failure of the whole bundling process.
426
427 You can still fall back to manually creating and adding a part if
428 you need finer control."""
429 part = bundlepart(typeid, *args, **kwargs)
430 self.addpart(part)
431 return part
432
433 # methods used to generate the bundle2 stream
410 def getchunks(self):
434 def getchunks(self):
411 self.ui.debug('start emission of %s stream\n' % _magicstring)
435 self.ui.debug('start emission of %s stream\n' % _magicstring)
412 yield _magicstring
436 yield _magicstring
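The `addpart`/`newpart` relationship shown above can be sketched without the rest of bundle2. `SketchPart` and `SketchBundler` are invented stand-ins for `bundlepart` and `bundle20`; only the id-assignment and counting behavior from the hunk is reproduced.

```python
class SketchPart:
    """Stand-in for bundlepart: just a type, data, and an id slot."""
    def __init__(self, typeid, data=''):
        self.id = None
        self.type = typeid
        self.data = data

class SketchBundler:
    """Stand-in for bundle20, keeping only the part-management methods."""
    def __init__(self):
        self._parts = []

    @property
    def nbparts(self):
        """total number of parts added to the bundler"""
        return len(self._parts)

    def addpart(self, part):
        part.id = len(self._parts)  # very cheap counter, as in the original
        self._parts.append(part)

    def newpart(self, typeid, *args, **kwargs):
        part = SketchPart(typeid, *args, **kwargs)
        self.addpart(part)  # added immediately, hence the docstring's caveat
        return part

bundler = SketchBundler()
part = bundler.newpart('b2x:output', data='hello')
```

Because `newpart` adds the part before returning it, a caller that fails while configuring the returned part has already committed it to the bundle, which is exactly the caveat the docstring warns about.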
@@ -505,7 +529,7 b' class unbundle20(unpackermixin):'
505 if name[0].islower():
529 if name[0].islower():
506 self.ui.debug("ignoring unknown parameter %r\n" % name)
530 self.ui.debug("ignoring unknown parameter %r\n" % name)
507 else:
531 else:
508 raise KeyError(name)
532 raise error.BundleValueError(params=(name,))
509
533
510
534
511 def iterparts(self):
535 def iterparts(self):
@@ -536,17 +560,71 b' class bundlepart(object):'
536
560
537 The part `type` is used to route the part to the application level
561 The part `type` is used to route the part to the application level
538 handler.
562 handler.
563
564 The part payload is contained in ``part.data``. It could be raw bytes or a
565 generator of byte chunks.
566
567 You can add parameters to the part using the ``addparam`` method.
568 Parameters can be either mandatory (default) or advisory. Remote side
569 should be able to safely ignore the advisory ones.
570
571 Both data and parameters cannot be modified after the generation has begun.
539 """
572 """
540
573
541 def __init__(self, parttype, mandatoryparams=(), advisoryparams=(),
574 def __init__(self, parttype, mandatoryparams=(), advisoryparams=(),
542 data=''):
575 data=''):
543 self.id = None
576 self.id = None
544 self.type = parttype
577 self.type = parttype
545 self.data = data
578 self._data = data
546 self.mandatoryparams = mandatoryparams
579 self._mandatoryparams = list(mandatoryparams)
547 self.advisoryparams = advisoryparams
580 self._advisoryparams = list(advisoryparams)
581 # checking for duplicated entries
582 self._seenparams = set()
583 for pname, __ in self._mandatoryparams + self._advisoryparams:
584 if pname in self._seenparams:
585 raise RuntimeError('duplicated params: %s' % pname)
586 self._seenparams.add(pname)
587 # status of the part's generation:
588 # - None: not started,
589 # - False: currently being generated,
590 # - True: generation done.
591 self._generated = None
592
593 # methods used to define the part content
594 def __setdata(self, data):
595 if self._generated is not None:
596 raise error.ReadOnlyPartError('part is being generated')
597 self._data = data
598 def __getdata(self):
599 return self._data
600 data = property(__getdata, __setdata)
548
601
602 @property
603 def mandatoryparams(self):
604 # make it an immutable tuple to force people through ``addparam``
605 return tuple(self._mandatoryparams)
606
607 @property
608 def advisoryparams(self):
609 # make it an immutable tuple to force people through ``addparam``
610 return tuple(self._advisoryparams)
611
612 def addparam(self, name, value='', mandatory=True):
613 if self._generated is not None:
614 raise error.ReadOnlyPartError('part is being generated')
615 if name in self._seenparams:
616 raise ValueError('duplicated params: %s' % name)
617 self._seenparams.add(name)
618 params = self._advisoryparams
619 if mandatory:
620 params = self._mandatoryparams
621 params.append((name, value))
622
623 # methods used to generate the bundle2 stream
549 def getchunks(self):
624 def getchunks(self):
625 if self._generated is not None:
626 raise RuntimeError('part can only be consumed once')
627 self._generated = False
550 #### header
628 #### header
551 ## parttype
629 ## parttype
552 header = [_pack(_fparttypesize, len(self.type)),
630 header = [_pack(_fparttypesize, len(self.type)),
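The parameter-handling logic added to `bundlepart` above (duplicate detection, mandatory vs. advisory lists, read-only once generation starts) can be exercised in a small sketch. `SketchParamPart` is an invented name; only the `addparam` behavior from the hunk is reproduced, with `RuntimeError` standing in for `error.ReadOnlyPartError`.

```python
class SketchParamPart:
    """Sketch of bundlepart's parameter handling (simplified names)."""
    def __init__(self):
        self._mandatoryparams = []
        self._advisoryparams = []
        self._seenparams = set()
        # None: not started, False: being generated, True: done
        self._generated = None

    def addparam(self, name, value='', mandatory=True):
        if self._generated is not None:
            raise RuntimeError('part is being generated')
        if name in self._seenparams:
            raise ValueError('duplicated params: %s' % name)
        self._seenparams.add(name)
        params = self._advisoryparams
        if mandatory:
            params = self._mandatoryparams
        params.append((name, value))

    @property
    def mandatoryparams(self):
        # immutable tuple, forcing callers through addparam
        return tuple(self._mandatoryparams)

p = SketchParamPart()
p.addparam('in-reply-to', '1', mandatory=False)
p.addparam('return', '0')
```

Adding the same key twice raises `ValueError`, matching the uniqueness requirement stated in the format documentation.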
@@ -584,6 +662,7 b' class bundlepart(object):'
584 yield chunk
662 yield chunk
585 # end of payload
663 # end of payload
586 yield _pack(_fpayloadsize, 0)
664 yield _pack(_fpayloadsize, 0)
665 self._generated = True
587
666
588 def _payloadchunks(self):
667 def _payloadchunks(self):
589 """yield chunks of the part payload
668 """yield chunks of the part payload
@@ -616,6 +695,8 b' class unbundlepart(unpackermixin):'
616 self.type = None
695 self.type = None
617 self.mandatoryparams = None
696 self.mandatoryparams = None
618 self.advisoryparams = None
697 self.advisoryparams = None
698 self.params = None
699 self.mandatorykeys = ()
619 self._payloadstream = None
700 self._payloadstream = None
620 self._readheader()
701 self._readheader()
621
702
@@ -633,6 +714,16 b' class unbundlepart(unpackermixin):'
633 data = self._fromheader(struct.calcsize(format))
714 data = self._fromheader(struct.calcsize(format))
634 return _unpack(format, data)
715 return _unpack(format, data)
635
716
717 def _initparams(self, mandatoryparams, advisoryparams):
718 """internal function to setup all logic related parameters"""
719 # make it read only to prevent people touching it by mistake.
720 self.mandatoryparams = tuple(mandatoryparams)
721 self.advisoryparams = tuple(advisoryparams)
722 # user friendly UI
723 self.params = dict(self.mandatoryparams)
724 self.params.update(dict(self.advisoryparams))
725 self.mandatorykeys = frozenset(p[0] for p in mandatoryparams)
726
636 def _readheader(self):
727 def _readheader(self):
637 """read the header and setup the object"""
728 """read the header and setup the object"""
638 typesize = self._unpackheader(_fparttypesize)[0]
729 typesize = self._unpackheader(_fparttypesize)[0]
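The `_initparams` hunk above builds two user-facing views from the raw parameter lists: a merged `params` dict and a `mandatorykeys` frozenset. A minimal sketch of that merge, as a free function with invented names:

```python
def initparams(mandatoryparams, advisoryparams):
    """Sketch of unbundlepart._initparams: build the user-friendly views."""
    mandatoryparams = tuple(mandatoryparams)
    advisoryparams = tuple(advisoryparams)
    # merged mapping; keys are unique within a part per the format spec
    params = dict(mandatoryparams)
    params.update(dict(advisoryparams))
    mandatorykeys = frozenset(p[0] for p in mandatoryparams)
    return params, mandatorykeys

params, mkeys = initparams([('namespace', 'phases')], [('hint', 'x')])
```

`mandatorykeys` is what `processbundle` later compares against `handler.params` to detect mandatory parameters the handler cannot process.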
@@ -659,8 +750,7 b' class unbundlepart(unpackermixin):'
659 advparams = []
750 advparams = []
660 for key, value in advsizes:
751 for key, value in advsizes:
661 advparams.append((self._fromheader(key), self._fromheader(value)))
752 advparams.append((self._fromheader(key), self._fromheader(value)))
662 self.mandatoryparams = manparams
753 self._initparams(manparams, advparams)
663 self.advisoryparams = advparams
664 ## part payload
754 ## part payload
665 def payloadchunks():
755 def payloadchunks():
666 payloadsize = self._unpack(_fpayloadsize)[0]
756 payloadsize = self._unpack(_fpayloadsize)[0]
@@ -685,6 +775,13 b' class unbundlepart(unpackermixin):'
685 self.consumed = True
775 self.consumed = True
686 return data
776 return data
687
777
778 def bundle2caps(remote):
779 """return the bundlecapabilities of a peer as dict"""
780 raw = remote.capable('bundle2-exp')
781 if not raw and raw != '':
782 return {}
783 capsblob = urllib.unquote(remote.capable('bundle2-exp'))
784 return decodecaps(capsblob)
688
785
689 @parthandler('b2x:changegroup')
786 @parthandler('b2x:changegroup')
690 def handlechangegroup(op, inpart):
787 def handlechangegroup(op, inpart):
@@ -705,17 +802,16 b' def handlechangegroup(op, inpart):'
705 if op.reply is not None:
802 if op.reply is not None:
706 # This is definitely not the final form of this
803 # This is definitely not the final form of this
707 # return. But one needs to start somewhere.
804 # return. But one needs to start somewhere.
708 part = bundlepart('b2x:reply:changegroup', (),
805 part = op.reply.newpart('b2x:reply:changegroup')
709 [('in-reply-to', str(inpart.id)),
806 part.addparam('in-reply-to', str(inpart.id), mandatory=False)
710 ('return', '%i' % ret)])
807 part.addparam('return', '%i' % ret, mandatory=False)
711 op.reply.addpart(part)
712 assert not inpart.read()
808 assert not inpart.read()
713
809
714 @parthandler('b2x:reply:changegroup')
810 @parthandler('b2x:reply:changegroup', ('return', 'in-reply-to'))
715 def handlechangegroup(op, inpart):
811 def handlechangegroup(op, inpart):
716 p = dict(inpart.advisoryparams)
812 ret = int(inpart.params['return'])
717 ret = int(p['return'])
813 replyto = int(inpart.params['in-reply-to'])
718 op.records.add('changegroup', {'return': ret}, int(p['in-reply-to']))
814 op.records.add('changegroup', {'return': ret}, replyto)
719
815
720 @parthandler('b2x:check:heads')
816 @parthandler('b2x:check:heads')
721 def handlechangegroup(op, inpart):
817 def handlechangegroup(op, inpart):
@@ -748,21 +844,58 b' def handlereplycaps(op, inpart):'
748 if op.reply is None:
844 if op.reply is None:
749 op.reply = bundle20(op.ui, caps)
845 op.reply = bundle20(op.ui, caps)
750
846
751 @parthandler('b2x:error:abort')
847 @parthandler('b2x:error:abort', ('message', 'hint'))
752 def handlereplycaps(op, inpart):
848 def handlereplycaps(op, inpart):
753 """Used to transmit abort error over the wire"""
849 """Used to transmit abort error over the wire"""
754 manargs = dict(inpart.mandatoryparams)
850 raise util.Abort(inpart.params['message'], hint=inpart.params.get('hint'))
755 advargs = dict(inpart.advisoryparams)
756 raise util.Abort(manargs['message'], hint=advargs.get('hint'))
757
851
758 @parthandler('b2x:error:unknownpart')
852 @parthandler('b2x:error:unsupportedcontent', ('parttype', 'params'))
759 def handlereplycaps(op, inpart):
853 def handlereplycaps(op, inpart):
760 """Used to transmit unknown part error over the wire"""
854 """Used to transmit unknown content error over the wire"""
761 manargs = dict(inpart.mandatoryparams)
855 kwargs = {}
762 raise UnknownPartError(manargs['parttype'])
856 parttype = inpart.params.get('parttype')
857 if parttype is not None:
858 kwargs['parttype'] = parttype
859 params = inpart.params.get('params')
860 if params is not None:
861 kwargs['params'] = params.split('\0')
763
862
764 @parthandler('b2x:error:pushraced')
863 raise error.BundleValueError(**kwargs)
864
865 @parthandler('b2x:error:pushraced', ('message',))
765 def handlereplycaps(op, inpart):
866 def handlereplycaps(op, inpart):
766 """Used to transmit push race error over the wire"""
867 """Used to transmit push race error over the wire"""
767 manargs = dict(inpart.mandatoryparams)
868 raise error.ResponseError(_('push failed:'), inpart.params['message'])
768 raise error.ResponseError(_('push failed:'), manargs['message'])
869
870 @parthandler('b2x:listkeys', ('namespace',))
871 def handlelistkeys(op, inpart):
872 """retrieve pushkey namespace content stored in a bundle2"""
873 namespace = inpart.params['namespace']
874 r = pushkey.decodekeys(inpart.read())
875 op.records.add('listkeys', (namespace, r))
876
877 @parthandler('b2x:pushkey', ('namespace', 'key', 'old', 'new'))
878 def handlepushkey(op, inpart):
879 """process a pushkey request"""
880 dec = pushkey.decode
881 namespace = dec(inpart.params['namespace'])
882 key = dec(inpart.params['key'])
883 old = dec(inpart.params['old'])
884 new = dec(inpart.params['new'])
885 ret = op.repo.pushkey(namespace, key, old, new)
886 record = {'namespace': namespace,
887 'key': key,
888 'old': old,
889 'new': new}
890 op.records.add('pushkey', record)
891 if op.reply is not None:
892 rpart = op.reply.newpart('b2x:reply:pushkey')
893 rpart.addparam('in-reply-to', str(inpart.id), mandatory=False)
894 rpart.addparam('return', '%i' % ret, mandatory=False)
895
896 @parthandler('b2x:reply:pushkey', ('return', 'in-reply-to'))
897 def handlepushkeyreply(op, inpart):
898 """retrieve the result of a pushkey request"""
899 ret = int(inpart.params['return'])
900 partid = int(inpart.params['in-reply-to'])
901 op.records.add('pushkey', {'return': ret}, partid)
@@ -493,6 +493,25 b' def getlocalbundle(repo, source, outgoin'
493 bundler = bundle10(repo, bundlecaps)
493 bundler = bundle10(repo, bundlecaps)
494 return getsubset(repo, outgoing, bundler, source)
494 return getsubset(repo, outgoing, bundler, source)
495
495
496 def _computeoutgoing(repo, heads, common):
497 """Computes which revs are outgoing given a set of common
498 and a set of heads.
499
500 This is a separate function so extensions can have access to
501 the logic.
502
503 Returns a discovery.outgoing object.
504 """
505 cl = repo.changelog
506 if common:
507 hasnode = cl.hasnode
508 common = [n for n in common if hasnode(n)]
509 else:
510 common = [nullid]
511 if not heads:
512 heads = cl.heads()
513 return discovery.outgoing(cl, common, heads)
514
496 def getbundle(repo, source, heads=None, common=None, bundlecaps=None):
515 def getbundle(repo, source, heads=None, common=None, bundlecaps=None):
497 """Like changegroupsubset, but returns the set difference between the
516 """Like changegroupsubset, but returns the set difference between the
498 ancestors of heads and the ancestors of common.
517 ancestors of heads and the ancestors of common.
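The default-filling logic factored out into `_computeoutgoing` above can be sketched without repository objects. `hasnode` and `allheads` below are stand-ins for the changelog queries, and `NULLID` stands in for `mercurial.node.nullid`.

```python
NULLID = b'\x00' * 20  # stand-in for mercurial.node.nullid

def computeoutgoing(hasnode, allheads, heads, common):
    """Sketch of _computeoutgoing's default filling: unknown common
    nodes are dropped, and empty inputs fall back to nullid / all heads."""
    if common:
        common = [n for n in common if hasnode(n)]
    else:
        common = [NULLID]
    if not heads:
        heads = allheads
    return common, heads

# No common and no heads requested: everything is outgoing.
common, heads = computeoutgoing(lambda n: True, [b'h1'], [], [])
```

Note the asymmetry: an empty `common` becomes `[nullid]`, but a non-empty `common` whose nodes are all unknown filters down to an empty list.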
@@ -502,15 +521,7 b' def getbundle(repo, source, heads=None, '
502 The nodes in common might not all be known locally due to the way the
521 The nodes in common might not all be known locally due to the way the
503 current discovery protocol works.
522 current discovery protocol works.
504 """
523 """
505 cl = repo.changelog
524 outgoing = _computeoutgoing(repo, heads, common)
506 if common:
507 hasnode = cl.hasnode
508 common = [n for n in common if hasnode(n)]
509 else:
510 common = [nullid]
511 if not heads:
512 heads = cl.heads()
513 outgoing = discovery.outgoing(cl, common, heads)
514 return getlocalbundle(repo, source, outgoing, bundlecaps=bundlecaps)
525 return getlocalbundle(repo, source, outgoing, bundlecaps=bundlecaps)
515
526
516 def changegroup(repo, basenodes, source):
527 def changegroup(repo, basenodes, source):
@@ -109,6 +109,30 b' def logmessage(ui, opts):'
109 (logfile, inst.strerror))
109 (logfile, inst.strerror))
110 return message
110 return message
111
111
112 def getcommiteditor(edit=False, finishdesc=None, extramsg=None, **opts):
113 """get appropriate commit message editor according to '--edit' option
114
115 'finishdesc' is a function to be called with the edited commit message
116 (= 'description' of the new changeset) just after editing, but
117 before checking emptiness. It should return the actual text to be
118 stored into history. This allows changing the description before
119 storing.
120
121 'extramsg' is an extra message to be shown in the editor instead of
122 the 'Leave message empty to abort commit' line. The 'HG: ' prefix and
123 EOL are automatically added.
124
125 'getcommiteditor' returns 'commitforceeditor' regardless of 'edit'
126 if either 'finishdesc' or 'extramsg' is specified, because they are
127 specific to usage in MQ.
128 """
129 if edit or finishdesc or extramsg:
130 return lambda r, c, s: commitforceeditor(r, c, s,
131 finishdesc=finishdesc,
132 extramsg=extramsg)
133 else:
134 return commiteditor
135
112 def loglimit(opts):
136 def loglimit(opts):
113 """get the log limit according to option -l/--limit"""
137 """get the log limit according to option -l/--limit"""
114 limit = opts.get('limit')
138 limit = opts.get('limit')
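The selection rule in `getcommiteditor` above is simply "any of the three triggers forces the interactive editor". A sketch with placeholder return values ('forced' and 'default' stand in for `commitforceeditor` and `commiteditor`):

```python
def choose_editor(edit=False, finishdesc=None, extramsg=None, **opts):
    """Sketch of getcommiteditor's selection logic: --edit, a finishdesc
    callback, or an extramsg all force the interactive editor path."""
    if edit or finishdesc or extramsg:
        return 'forced'
    return 'default'
```

This is why passing `**opts` through is enough for callers like `tryimportone`: the presence of the 'edit' key alone decides which editor is returned.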
@@ -562,16 +586,16 b' def tryimportone(ui, repo, hunk, parents'
562 tmpname, message, user, date, branch, nodeid, p1, p2 = \
586 tmpname, message, user, date, branch, nodeid, p1, p2 = \
563 patch.extract(ui, hunk)
587 patch.extract(ui, hunk)
564
588
565 editor = commiteditor
589 editor = getcommiteditor(**opts)
566 if opts.get('edit'):
567 editor = commitforceeditor
568 update = not opts.get('bypass')
590 update = not opts.get('bypass')
569 strip = opts["strip"]
591 strip = opts["strip"]
570 sim = float(opts.get('similarity') or 0)
592 sim = float(opts.get('similarity') or 0)
571 if not tmpname:
593 if not tmpname:
572 return (None, None)
594 return (None, None, False)
573 msg = _('applied to working directory')
595 msg = _('applied to working directory')
574
596
597 rejects = False
598
575 try:
599 try:
576 cmdline_message = logmessage(ui, opts)
600 cmdline_message = logmessage(ui, opts)
577 if cmdline_message:
601 if cmdline_message:
@@ -617,9 +641,17 b' def tryimportone(ui, repo, hunk, parents'
617 if opts.get('exact') or opts.get('import_branch'):
641 if opts.get('exact') or opts.get('import_branch'):
618 repo.dirstate.setbranch(branch or 'default')
642 repo.dirstate.setbranch(branch or 'default')
619
643
644 partial = opts.get('partial', False)
620 files = set()
645 files = set()
621 patch.patch(ui, repo, tmpname, strip=strip, files=files,
646 try:
622 eolmode=None, similarity=sim / 100.0)
647 patch.patch(ui, repo, tmpname, strip=strip, files=files,
648 eolmode=None, similarity=sim / 100.0)
649 except patch.PatchError, e:
650 if not partial:
651 raise util.Abort(str(e))
652 if partial:
653 rejects = True
654
623 files = list(files)
655 files = list(files)
624 if opts.get('no_commit'):
656 if opts.get('no_commit'):
625 if message:
657 if message:
@@ -634,7 +666,7 b' def tryimportone(ui, repo, hunk, parents'
634 m = scmutil.matchfiles(repo, files or [])
666 m = scmutil.matchfiles(repo, files or [])
635 n = repo.commit(message, opts.get('user') or user,
667 n = repo.commit(message, opts.get('user') or user,
636 opts.get('date') or date, match=m,
668 opts.get('date') or date, match=m,
637 editor=editor)
669 editor=editor, force=partial)
638 else:
670 else:
639 if opts.get('exact') or opts.get('import_branch'):
671 if opts.get('exact') or opts.get('import_branch'):
640 branch = branch or 'default'
672 branch = branch or 'default'
@@ -653,8 +685,7 b' def tryimportone(ui, repo, hunk, parents'
653 opts.get('user') or user,
685 opts.get('user') or user,
654 opts.get('date') or date,
686 opts.get('date') or date,
655 branch, files, store,
687 branch, files, store,
656 editor=commiteditor)
688 editor=getcommiteditor())
657 repo.savecommitmessage(memctx.description())
658 n = memctx.commit()
689 n = memctx.commit()
659 finally:
690 finally:
660 store.close()
691 store.close()
@@ -663,7 +694,7 b' def tryimportone(ui, repo, hunk, parents'
663 if n:
694 if n:
664 # i18n: refers to a short changeset id
695 # i18n: refers to a short changeset id
665 msg = _('created %s') % short(n)
696 msg = _('created %s') % short(n)
666 return (msg, n)
697 return (msg, n, rejects)
667 finally:
698 finally:
668 os.unlink(tmpname)
699 os.unlink(tmpname)
669
700
@@ -1468,7 +1499,6 b' def _makelogfilematcher(repo, files, fol'
1468 fcache = {}
1499 fcache = {}
1469 fcacheready = [False]
1500 fcacheready = [False]
1470 pctx = repo['.']
1501 pctx = repo['.']
1471 wctx = repo[None]
1472
1502
1473 def populate():
1503 def populate():
1474 for fn in files:
1504 for fn in files:
@@ -1481,7 +1511,7 b' def _makelogfilematcher(repo, files, fol'
1481 # Lazy initialization
1511 # Lazy initialization
1482 fcacheready[0] = True
1512 fcacheready[0] = True
1483 populate()
1513 populate()
1484 return scmutil.match(wctx, fcache.get(rev, []), default='path')
1514 return scmutil.matchfiles(repo, fcache.get(rev, []))
1485
1515
1486 return filematcher
1516 return filematcher
1487
1517
@@ -1690,7 +1720,7 b' def getlogrevs(repo, pats, opts):'
1690 if opts.get('rev'):
1720 if opts.get('rev'):
1691 revs = scmutil.revrange(repo, opts['rev'])
1721 revs = scmutil.revrange(repo, opts['rev'])
1692 elif follow:
1722 elif follow:
1693 revs = revset.baseset(repo.revs('reverse(:.)'))
1723 revs = repo.revs('reverse(:.)')
1694 else:
1724 else:
1695 revs = revset.spanset(repo)
1725 revs = revset.spanset(repo)
1696 revs.reverse()
1726 revs.reverse()
@@ -2039,7 +2069,8 b' def amend(ui, repo, commitfunc, old, ext'
2039 try:
2069 try:
2040 fctx = ctx[path]
2070 fctx = ctx[path]
2041 flags = fctx.flags()
2071 flags = fctx.flags()
2042 mctx = context.memfilectx(fctx.path(), fctx.data(),
2072 mctx = context.memfilectx(repo,
2073 fctx.path(), fctx.data(),
2043 islink='l' in flags,
2074 islink='l' in flags,
2044 isexec='x' in flags,
2075 isexec='x' in flags,
2045 copied=copied.get(path))
2076 copied=copied.get(path))
@@ -2058,12 +2089,10 b' def amend(ui, repo, commitfunc, old, ext'
2058
2089
2059 user = opts.get('user') or old.user()
2090 user = opts.get('user') or old.user()
2060 date = opts.get('date') or old.date()
2091 date = opts.get('date') or old.date()
2061 editmsg = False
2092 editor = getcommiteditor(**opts)
2062 if not message:
2093 if not message:
2063 editmsg = True
2094 editor = getcommiteditor(edit=True)
2064 message = old.description()
2095 message = old.description()
2065 elif opts.get('edit'):
2066 editmsg = True
2067
2096
2068 pureextra = extra.copy()
2097 pureextra = extra.copy()
2069 extra['amend_source'] = old.hex()
2098 extra['amend_source'] = old.hex()
@@ -2075,10 +2104,8 b' def amend(ui, repo, commitfunc, old, ext'
2075 filectxfn=filectxfn,
2104 filectxfn=filectxfn,
2076 user=user,
2105 user=user,
2077 date=date,
2106 date=date,
2078 extra=extra)
2107 extra=extra,
2079 if editmsg:
2108 editor=editor)
2080 new._text = commitforceeditor(repo, new, [])
2081 repo.savecommitmessage(new.description())
2082
2109
2083 newdesc = changelog.stripdesc(new.description())
2110 newdesc = changelog.stripdesc(new.description())
2084 if ((not node)
2111 if ((not node)
@@ -2143,7 +2170,46 b' def commiteditor(repo, ctx, subs):'
2143 return ctx.description()
2170 return ctx.description()
2144 return commitforceeditor(repo, ctx, subs)
2171 return commitforceeditor(repo, ctx, subs)
2145
2172
2146 def commitforceeditor(repo, ctx, subs):
2173 def commitforceeditor(repo, ctx, subs, finishdesc=None, extramsg=None):
2174 if not extramsg:
2175 extramsg = _("Leave message empty to abort commit.")
2176 tmpl = repo.ui.config('committemplate', 'changeset', '').strip()
2177 if tmpl:
2178 committext = buildcommittemplate(repo, ctx, subs, extramsg, tmpl)
2179 else:
2180 committext = buildcommittext(repo, ctx, subs, extramsg)
2181
2182 # run editor in the repository root
2183 olddir = os.getcwd()
2184 os.chdir(repo.root)
2185 text = repo.ui.edit(committext, ctx.user(), ctx.extra())
2186 text = re.sub("(?m)^HG:.*(\n|$)", "", text)
2187 os.chdir(olddir)
2188
2189 if finishdesc:
2190 text = finishdesc(text)
2191 if not text.strip():
2192 raise util.Abort(_("empty commit message"))
2193
2194 return text
2195
2196 def buildcommittemplate(repo, ctx, subs, extramsg, tmpl):
2197 ui = repo.ui
2198 tmpl, mapfile = gettemplate(ui, tmpl, None)
2199
2200 try:
2201 t = changeset_templater(ui, repo, None, {}, tmpl, mapfile, False)
2202 except SyntaxError, inst:
2203 raise util.Abort(inst.args[0])
2204
2205 if not extramsg:
2206 extramsg = '' # ensure that extramsg is string
2207
2208 ui.pushbuffer()
2209 t.show(ctx, extramsg=extramsg)
2210 return ui.popbuffer()
2211
2212 def buildcommittext(repo, ctx, subs, extramsg):
2147 edittext = []
2213 edittext = []
2148 modified, added, removed = ctx.modified(), ctx.added(), ctx.removed()
2214 modified, added, removed = ctx.modified(), ctx.added(), ctx.removed()
2149 if ctx.description():
2215 if ctx.description():
@@ -2152,7 +2218,7 b' def commitforceeditor(repo, ctx, subs):'
2152 edittext.append("") # Empty line between message and comments.
2218 edittext.append("") # Empty line between message and comments.
2153 edittext.append(_("HG: Enter commit message."
2219 edittext.append(_("HG: Enter commit message."
2154 " Lines beginning with 'HG:' are removed."))
2220 " Lines beginning with 'HG:' are removed."))
2155 edittext.append(_("HG: Leave message empty to abort commit."))
2221 edittext.append("HG: %s" % extramsg)
2156 edittext.append("HG: --")
2222 edittext.append("HG: --")
2157 edittext.append(_("HG: user: %s") % ctx.user())
2223 edittext.append(_("HG: user: %s") % ctx.user())
2158 if ctx.p2():
2224 if ctx.p2():
@@ -2168,17 +2234,8 b' def commitforceeditor(repo, ctx, subs):'
2168 if not added and not modified and not removed:
2234 if not added and not modified and not removed:
2169 edittext.append(_("HG: no files changed"))
2235 edittext.append(_("HG: no files changed"))
2170 edittext.append("")
2236 edittext.append("")
2171 # run editor in the repository root
2172 olddir = os.getcwd()
2173 os.chdir(repo.root)
2174 text = repo.ui.edit("\n".join(edittext), ctx.user(), ctx.extra())
2175 text = re.sub("(?m)^HG:.*(\n|$)", "", text)
2176 os.chdir(olddir)
2177
2237
2178 if not text.strip():
2238 return "\n".join(edittext)
2179 raise util.Abort(_("empty commit message"))
2180
2181 return text
2182
2239
2183 def commitstatus(repo, node, branch, bheads=None, opts={}):
2240 def commitstatus(repo, node, branch, bheads=None, opts={}):
2184 ctx = repo[node]
2241 ctx = repo[node]
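The `HG:` helper-line stripping used by `commitforceeditor` above is a single multiline regex substitution. A self-contained sketch (the wrapper function name is invented; the regex is the one from the hunk):

```python
import re

def strip_hg_lines(text):
    """Strip 'HG:' helper lines, as commitforceeditor does after editing."""
    return re.sub("(?m)^HG:.*(\n|$)", "", text)

edited = "my message\n\nHG: Enter commit message.\nHG: --\nHG: user: alice\n"
```

The `(?m)` flag makes `^` match at every line start, so all `HG:` lines are removed wherever they appear, leaving only the user's text.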
@@ -2231,6 +2288,8 b' def revert(ui, repo, ctx, parents, *pats'
2231 node = ctx.node()
2288 node = ctx.node()
2232
2289
2233 mf = ctx.manifest()
2290 mf = ctx.manifest()
2291 if node == p2:
2292 parent = p2
2234 if node == parent:
2293 if node == parent:
2235 pmf = mf
2294 pmf = mf
2236 else:
2295 else:
@@ -2240,18 +2299,22 b' def revert(ui, repo, ctx, parents, *pats'
2240 # so have to walk both. do not print errors if files exist in one
2299 # so have to walk both. do not print errors if files exist in one
2241 # but not other.
2300 # but not other.
2242
2301
2302 # `names` is a mapping for all elements in working copy and target revision
2303 # The mapping is in the form:
2304 # <abs path in repo> -> (<path from CWD>, <exactly specified by matcher?>)
2243 names = {}
2305 names = {}
2244
2306
2245 wlock = repo.wlock()
2307 wlock = repo.wlock()
2246 try:
2308 try:
2247 # walk dirstate.
2309 ## filling of the `names` mapping
2310 # walk dirstate to fill `names`
2248
2311
2249 m = scmutil.match(repo[None], pats, opts)
2312 m = scmutil.match(repo[None], pats, opts)
2250 m.bad = lambda x, y: False
2313 m.bad = lambda x, y: False
2251 for abs in repo.walk(m):
2314 for abs in repo.walk(m):
2252 names[abs] = m.rel(abs), m.exact(abs)
2315 names[abs] = m.rel(abs), m.exact(abs)
2253
2316
2254 # walk target manifest.
2317 # walk target manifest to fill `names`
2255
2318
2256 def badfn(path, msg):
2319 def badfn(path, msg):
2257 if path in names:
2320 if path in names:
@@ -2272,11 +2335,13 b' def revert(ui, repo, ctx, parents, *pats'
2272
2335
2273 # get the list of subrepos that must be reverted
2336 # get the list of subrepos that must be reverted
2274 targetsubs = sorted(s for s in ctx.substate if m(s))
2337 targetsubs = sorted(s for s in ctx.substate if m(s))
2338
2339 # Find status of all files in `names`. (Against working directory parent)
2275 m = scmutil.matchfiles(repo, names)
2340 m = scmutil.matchfiles(repo, names)
2276 changes = repo.status(match=m)[:4]
2341 changes = repo.status(node1=parent, match=m)[:4]
2277 modified, added, removed, deleted = map(set, changes)
2342 modified, added, removed, deleted = map(set, changes)
2278
2343
2279 # if f is a rename, also revert the source
2344 # if f is a rename, update `names` to also revert the source
2280 cwd = repo.getcwd()
2345 cwd = repo.getcwd()
2281 for f in added:
2346 for f in added:
2282 src = repo.dirstate.copied(f)
2347 src = repo.dirstate.copied(f)
@@ -2284,15 +2349,19 b' def revert(ui, repo, ctx, parents, *pats'
2284 removed.add(src)
2349 removed.add(src)
2285 names[src] = (repo.pathto(src, cwd), True)
2350 names[src] = (repo.pathto(src, cwd), True)
2286
2351
2352 ## computation of the actions to perform on `names` content.
2353
2287 def removeforget(abs):
2354 def removeforget(abs):
2288 if repo.dirstate[abs] == 'a':
2355 if repo.dirstate[abs] == 'a':
2289 return _('forgetting %s\n')
2356 return _('forgetting %s\n')
2290 return _('removing %s\n')
2357 return _('removing %s\n')
2291
2358
2292 revert = ([], _('reverting %s\n'))
2359 # actions to be actually performed by revert
2293 add = ([], _('adding %s\n'))
2360 # (<list of files>, <message>) tuple
2294 remove = ([], removeforget)
2361 actions = {'revert': ([], _('reverting %s\n')),
2295 undelete = ([], _('undeleting %s\n'))
2362 'add': ([], _('adding %s\n')),
2363 'remove': ([], removeforget),
2364 'undelete': ([], _('undeleting %s\n'))}
2296
2365
2297 disptable = (
2366 disptable = (
2298 # dispatch table:
2367 # dispatch table:
@@ -2301,14 +2370,20 b' def revert(ui, repo, ctx, parents, *pats'
2301 # action if not in target manifest
2370 # action if not in target manifest
2302 # make backup if in target manifest
2371 # make backup if in target manifest
2303 # make backup if not in target manifest
2372 # make backup if not in target manifest
2304 (modified, revert, remove, True, True),
2373 (modified, (actions['revert'], True),
2305 (added, revert, remove, True, False),
2374 (actions['remove'], True)),
2306 (removed, undelete, None, True, False),
2375 (added, (actions['revert'], True),
2307 (deleted, revert, remove, False, False),
2376 (actions['remove'], False)),
2377 (removed, (actions['undelete'], True),
2378 (None, False)),
2379 (deleted, (actions['revert'], False),
2380 (actions['remove'], False)),
2308 )
2381 )
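The dispatch-table refactor above pairs each status set with `(action, make_backup)` tuples for the hit (file present in the target manifest) and miss cases. A simplified, self-contained sketch of that pattern follows; all names here are illustrative stand-ins, not Mercurial's real API:

```python
def classify(names, modified, added, removed, deleted, in_target):
    """Map each file to an action list plus a backup set, dispatch-table style."""
    actions = {'revert': [], 'add': [], 'remove': [], 'undelete': []}
    disptable = (
        # (status set, action if in target manifest, action if not)
        (modified, ('revert', True), ('remove', True)),
        (added,    ('revert', True), ('remove', False)),
        (removed,  ('undelete', True), (None, False)),
        (deleted,  ('revert', False), ('remove', False)),
    )
    backups = set()
    for f in sorted(names):
        for table, hit, miss in disptable:
            if f not in table:
                continue
            # pick the hit or miss tuple depending on the target manifest
            action, dobackup = hit if f in in_target else miss
            if action is not None:
                actions[action].append(f)
                if dobackup:
                    backups.add(f)
            break  # first matching status set wins
    return actions, backups
```

The real code additionally handles files untouched in the dirstate (the `for`/`else` branch), which this sketch omits.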
2309
2382
2310 for abs, (rel, exact) in sorted(names.items()):
2383 for abs, (rel, exact) in sorted(names.items()):
2384 # hash of the file in the target manifest (or None if missing from target)
2311 mfentry = mf.get(abs)
2385 mfentry = mf.get(abs)
2386 # target file to be touched on disk (relative to cwd)
2312 target = repo.wjoin(abs)
2387 target = repo.wjoin(abs)
2313 def handle(xlist, dobackup):
2388 def handle(xlist, dobackup):
2314 xlist[0].append(abs)
2389 xlist[0].append(abs)
@@ -2325,27 +2400,35 b' def revert(ui, repo, ctx, parents, *pats'
2325 if not isinstance(msg, basestring):
2400 if not isinstance(msg, basestring):
2326 msg = msg(abs)
2401 msg = msg(abs)
2327 ui.status(msg % rel)
2402 ui.status(msg % rel)
2328 for table, hitlist, misslist, backuphit, backupmiss in disptable:
2403 # search for the entry in the dispatch table.
2404 # if the file is in any of these sets, it was touched in the working
2405 # directory parent and we are sure it needs to be reverted.
2406 for table, hit, miss in disptable:
2329 if abs not in table:
2407 if abs not in table:
2330 continue
2408 continue
2331 # file has changed in dirstate
2409 # file has changed in dirstate
2332 if mfentry:
2410 if mfentry:
2333 handle(hitlist, backuphit)
2411 handle(*hit)
2334 elif misslist is not None:
2412 elif miss[0] is not None:
2335 handle(misslist, backupmiss)
2413 handle(*miss)
2336 break
2414 break
2337 else:
2415 else:
2416 # Not touched in current dirstate.
2417
2418 # file is unknown in parent, restore older version or ignore.
2338 if abs not in repo.dirstate:
2419 if abs not in repo.dirstate:
2339 if mfentry:
2420 if mfentry:
2340 handle(add, True)
2421 handle(actions['add'], True)
2341 elif exact:
2422 elif exact:
2342 ui.warn(_('file not managed: %s\n') % rel)
2423 ui.warn(_('file not managed: %s\n') % rel)
2343 continue
2424 continue
2344 # file has not changed in dirstate
2425
2426 # parent is target, no changes mean no changes
2345 if node == parent:
2427 if node == parent:
2346 if exact:
2428 if exact:
2347 ui.warn(_('no changes needed to %s\n') % rel)
2429 ui.warn(_('no changes needed to %s\n') % rel)
2348 continue
2430 continue
2431 # no change in dirstate but parent and target may differ
2349 if pmf is None:
2432 if pmf is None:
2350 # only need parent manifest in this unlikely case,
2433 # only need parent manifest in this unlikely case,
2351 # so do not read by default
2434 # so do not read by default
@@ -2355,11 +2438,12 b' def revert(ui, repo, ctx, parents, *pats'
2355 # manifests, do nothing
2438 # manifests, do nothing
2356 if (pmf[abs] != mfentry or
2439 if (pmf[abs] != mfentry or
2357 pmf.flags(abs) != mf.flags(abs)):
2440 pmf.flags(abs) != mf.flags(abs)):
2358 handle(revert, False)
2441 handle(actions['revert'], False)
2359 else:
2442 else:
2360 handle(remove, False)
2443 handle(actions['remove'], False)
2444
2361 if not opts.get('dry_run'):
2445 if not opts.get('dry_run'):
2362 _performrevert(repo, parents, ctx, revert, add, remove, undelete)
2446 _performrevert(repo, parents, ctx, actions)
2363
2447
2364 if targetsubs:
2448 if targetsubs:
2365 # Revert the subrepos on the revert list
2449 # Revert the subrepos on the revert list
@@ -2368,8 +2452,8 b' def revert(ui, repo, ctx, parents, *pats'
2368 finally:
2452 finally:
2369 wlock.release()
2453 wlock.release()
2370
2454
2371 def _performrevert(repo, parents, ctx, revert, add, remove, undelete):
2455 def _performrevert(repo, parents, ctx, actions):
2372 """function that actually perform all the action computed for revert
2456 """function that actually perform all the actions computed for revert
2373
2457
2374 This is an independent function to let extensions plug in and react to
2458 This is an independent function to let extensions plug in and react to
2375 the imminent revert.
2459 the imminent revert.
@@ -2383,7 +2467,7 b' def _performrevert(repo, parents, ctx, r'
2383 repo.wwrite(f, fc.data(), fc.flags())
2467 repo.wwrite(f, fc.data(), fc.flags())
2384
2468
2385 audit_path = pathutil.pathauditor(repo.root)
2469 audit_path = pathutil.pathauditor(repo.root)
2386 for f in remove[0]:
2470 for f in actions['remove'][0]:
2387 if repo.dirstate[f] == 'a':
2471 if repo.dirstate[f] == 'a':
2388 repo.dirstate.drop(f)
2472 repo.dirstate.drop(f)
2389 continue
2473 continue
@@ -2403,38 +2487,79 b' def _performrevert(repo, parents, ctx, r'
2403 normal = repo.dirstate.normallookup
2487 normal = repo.dirstate.normallookup
2404 else:
2488 else:
2405 normal = repo.dirstate.normal
2489 normal = repo.dirstate.normal
2406 for f in revert[0]:
2490 for f in actions['revert'][0]:
2407 checkout(f)
2491 checkout(f)
2408 if normal:
2492 if normal:
2409 normal(f)
2493 normal(f)
2410
2494
2411 for f in add[0]:
2495 for f in actions['add'][0]:
2412 checkout(f)
2496 checkout(f)
2413 repo.dirstate.add(f)
2497 repo.dirstate.add(f)
2414
2498
2415 normal = repo.dirstate.normallookup
2499 normal = repo.dirstate.normallookup
2416 if node == parent and p2 == nullid:
2500 if node == parent and p2 == nullid:
2417 normal = repo.dirstate.normal
2501 normal = repo.dirstate.normal
2418 for f in undelete[0]:
2502 for f in actions['undelete'][0]:
2419 checkout(f)
2503 checkout(f)
2420 normal(f)
2504 normal(f)
2421
2505
2422 copied = copies.pathcopies(repo[parent], ctx)
2506 copied = copies.pathcopies(repo[parent], ctx)
2423
2507
2424 for f in add[0] + undelete[0] + revert[0]:
2508 for f in actions['add'][0] + actions['undelete'][0] + actions['revert'][0]:
2425 if f in copied:
2509 if f in copied:
2426 repo.dirstate.copy(copied[f], f)
2510 repo.dirstate.copy(copied[f], f)
2427
2511
2428 def command(table):
2512 def command(table):
2429 '''returns a function object bound to table which can be used as
2513 """Returns a function object to be used as a decorator for making commands.
2430 a decorator for populating table as a command table'''
2514
2515 This function receives a command table as its argument. The table should
2516 be a dict.
2517
2518 The returned function can be used as a decorator for adding commands
2519 to that command table. This function accepts multiple arguments to define
2520 a command.
2521
2522 The first argument is the command name.
2523
2524 The options argument is an iterable of tuples defining command arguments.
2525 See ``mercurial.fancyopts.fancyopts()`` for the format of each tuple.
2431
2526
2432 def cmd(name, options=(), synopsis=None):
2527 The synopsis argument defines a short, one-line summary of how to use the
2528 command. This shows up in the help output.
2529
2530 The norepo argument defines whether the command does not require a
2531 local repository. Most commands operate against a repository, thus the
2532 default is False.
2533
2534 The optionalrepo argument defines whether the command optionally requires
2535 a local repository.
2536
2537 The inferrepo argument defines whether to try to find a repository from the
2538 command line arguments. If True, arguments will be examined for potential
2539 repository locations. See ``findrepo()``. If a repository is found, it
2540 will be used.
2541 """
2542 def cmd(name, options=(), synopsis=None, norepo=False, optionalrepo=False,
2543 inferrepo=False):
2433 def decorator(func):
2544 def decorator(func):
2434 if synopsis:
2545 if synopsis:
2435 table[name] = func, list(options), synopsis
2546 table[name] = func, list(options), synopsis
2436 else:
2547 else:
2437 table[name] = func, list(options)
2548 table[name] = func, list(options)
2549
2550 if norepo:
2551 # Avoid import cycle.
2552 import commands
2553 commands.norepo += ' %s' % ' '.join(parsealiases(name))
2554
2555 if optionalrepo:
2556 import commands
2557 commands.optionalrepo += ' %s' % ' '.join(parsealiases(name))
2558
2559 if inferrepo:
2560 import commands
2561 commands.inferrepo += ' %s' % ' '.join(parsealiases(name))
2562
2438 return func
2563 return func
2439 return decorator
2564 return decorator
2440
2565
@@ -14,6 +14,7 b' import hg, scmutil, util, revlog, copies'
14 import patch, help, encoding, templatekw, discovery
14 import patch, help, encoding, templatekw, discovery
15 import archival, changegroup, cmdutil, hbisect
15 import archival, changegroup, cmdutil, hbisect
16 import sshserver, hgweb, commandserver
16 import sshserver, hgweb, commandserver
17 import extensions
17 from hgweb import server as hgweb_server
18 from hgweb import server as hgweb_server
18 import merge as mergemod
19 import merge as mergemod
19 import minirst, revset, fileset
20 import minirst, revset, fileset
@@ -26,6 +27,18 b' table = {}'
26
27
27 command = cmdutil.command(table)
28 command = cmdutil.command(table)
28
29
30 # Space delimited list of commands that don't require local repositories.
31 # This should be populated by passing norepo=True into the @command decorator.
32 norepo = ''
33 # Space delimited list of commands that optionally require local repositories.
34 # This should be populated by passing optionalrepo=True into the @command
35 # decorator.
36 optionalrepo = ''
37 # Space delimited list of commands that will examine arguments looking for
38 # a repository. This should be populated by passing inferrepo=True into the
39 # @command decorator.
40 inferrepo = ''
41
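The three module-level strings above are space-delimited name lists that the decorator appends to via `parsealiases`. A hedged sketch of that flow; `parsealiases` here is a stand-in mimicking `mercurial.cmdutil.parsealiases`, which splits a `'name|alias1|alias2'` spec:

```python
def parsealiases(cmd):
    # stand-in: strip the '^' shortlist marker and split aliases
    return cmd.lstrip('^').split('|')

# how norepo grows as norepo=True commands are registered
norepo = ''
for name in ('init|i', 'clone'):  # hypothetical command specs
    norepo += ' %s' % ' '.join(parsealiases(name))
```

Dispatch code can then test membership with a simple `cmdname in norepo.split()`.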
29 # common command options
42 # common command options
30
43
31 globalopts = [
44 globalopts = [
@@ -147,7 +160,8 b' subrepoopts = ['
147
160
148 @command('^add',
161 @command('^add',
149 walkopts + subrepoopts + dryrunopts,
162 walkopts + subrepoopts + dryrunopts,
150 _('[OPTION]... [FILE]...'))
163 _('[OPTION]... [FILE]...'),
164 inferrepo=True)
151 def add(ui, repo, *pats, **opts):
165 def add(ui, repo, *pats, **opts):
152 """add the specified files on the next commit
166 """add the specified files on the next commit
153
167
@@ -183,7 +197,8 b' def add(ui, repo, *pats, **opts):'
183
197
184 @command('addremove',
198 @command('addremove',
185 similarityopts + walkopts + dryrunopts,
199 similarityopts + walkopts + dryrunopts,
186 _('[OPTION]... [FILE]...'))
200 _('[OPTION]... [FILE]...'),
201 inferrepo=True)
187 def addremove(ui, repo, *pats, **opts):
202 def addremove(ui, repo, *pats, **opts):
188 """add all new files, delete all missing files
203 """add all new files, delete all missing files
189
204
@@ -227,7 +242,8 b' def addremove(ui, repo, *pats, **opts):'
227 ('c', 'changeset', None, _('list the changeset')),
242 ('c', 'changeset', None, _('list the changeset')),
228 ('l', 'line-number', None, _('show line number at the first appearance'))
243 ('l', 'line-number', None, _('show line number at the first appearance'))
229 ] + diffwsopts + walkopts,
244 ] + diffwsopts + walkopts,
230 _('[-r REV] [-f] [-a] [-u] [-d] [-n] [-c] [-l] FILE...'))
245 _('[-r REV] [-f] [-a] [-u] [-d] [-n] [-c] [-l] FILE...'),
246 inferrepo=True)
231 def annotate(ui, repo, *pats, **opts):
247 def annotate(ui, repo, *pats, **opts):
232 """show changeset information by line for each file
248 """show changeset information by line for each file
233
249
@@ -386,6 +402,7 b' def archive(ui, repo, dest, **opts):'
386 ('', 'parent', '',
402 ('', 'parent', '',
387 _('parent to choose when backing out merge (DEPRECATED)'), _('REV')),
403 _('parent to choose when backing out merge (DEPRECATED)'), _('REV')),
388 ('r', 'rev', '', _('revision to backout'), _('REV')),
404 ('r', 'rev', '', _('revision to backout'), _('REV')),
405 ('e', 'edit', False, _('invoke editor on commit messages')),
389 ] + mergetoolopts + walkopts + commitopts + commitopts2,
406 ] + mergetoolopts + walkopts + commitopts + commitopts2,
390 _('[OPTION]... [-r] REV'))
407 _('[OPTION]... [-r] REV'))
391 def backout(ui, repo, node=None, rev=None, **opts):
408 def backout(ui, repo, node=None, rev=None, **opts):
@@ -487,13 +504,12 b' def backout(ui, repo, node=None, rev=Non'
487 cmdutil.revert(ui, repo, rctx, repo.dirstate.parents())
504 cmdutil.revert(ui, repo, rctx, repo.dirstate.parents())
488
505
489
506
490 e = cmdutil.commiteditor
491 if not opts['message'] and not opts['logfile']:
492 # we don't translate commit messages
493 opts['message'] = "Backed out changeset %s" % short(node)
494 e = cmdutil.commitforceeditor
495
496 def commitfunc(ui, repo, message, match, opts):
507 def commitfunc(ui, repo, message, match, opts):
508 e = cmdutil.getcommiteditor(**opts)
509 if not message:
510 # we don't translate commit messages
511 message = "Backed out changeset %s" % short(node)
512 e = cmdutil.getcommiteditor(edit=True)
497 return repo.commit(message, opts.get('user'), opts.get('date'),
513 return repo.commit(message, opts.get('user'), opts.get('date'),
498 match, editor=e)
514 match, editor=e)
499 newnode = cmdutil.commit(ui, repo, commitfunc, [], opts)
515 newnode = cmdutil.commit(ui, repo, commitfunc, [], opts)
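The hunk above moves the editor choice inside `commitfunc`: `getcommiteditor(**opts)` honors the user's `--edit` flag, and `getcommiteditor(edit=True)` forces the editor when no message was supplied. An illustrative sketch of that selection logic, using stub values rather than Mercurial's real editor callables:

```python
def getcommiteditor(edit=False, **opts):
    # stub: the real helper returns an editor callback, not a string
    return 'force-editor' if edit else 'plain-editor'

def pick_editor(message, opts):
    """Choose an editor the way backout's commitfunc does."""
    e = getcommiteditor(**opts)
    if not message:
        # no message supplied: force the editor so the user can write one
        e = getcommiteditor(edit=True)
    return e
```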
@@ -795,31 +811,45 b' def bisect(ui, repo, rev=None, extra=Non'
795 ('i', 'inactive', False, _('mark a bookmark inactive'))],
811 ('i', 'inactive', False, _('mark a bookmark inactive'))],
796 _('hg bookmarks [OPTIONS]... [NAME]...'))
812 _('hg bookmarks [OPTIONS]... [NAME]...'))
797 def bookmark(ui, repo, *names, **opts):
813 def bookmark(ui, repo, *names, **opts):
798 '''track a line of development with movable markers
814 '''create a new bookmark or list existing bookmarks
799
815
800 Bookmarks are pointers to certain commits that move when committing.
816 Bookmarks are labels on changesets to help track lines of development.
801 Bookmarks are local. They can be renamed, copied and deleted. It is
817 Bookmarks are unversioned and can be moved, renamed and deleted.
802 possible to use :hg:`merge NAME` to merge from a given bookmark, and
818 Deleting or moving a bookmark has no effect on the associated changesets.
803 :hg:`update NAME` to update to a given bookmark.
819
804
820 Creating or updating to a bookmark causes it to be marked as 'active'.
805 You can use :hg:`bookmark NAME` to set a bookmark on the working
821 Active bookmarks are indicated with a '*'.
806 directory's parent revision with the given name. If you specify
822 When a commit is made, an active bookmark will advance to the new commit.
807 a revision using -r REV (where REV may be an existing bookmark),
823 A plain :hg:`update` will also advance an active bookmark, if possible.
808 the bookmark is assigned to that revision.
824 Updating away from a bookmark will cause it to be deactivated.
809
825
810 Bookmarks can be pushed and pulled between repositories (see :hg:`help
826 Bookmarks can be pushed and pulled between repositories (see
811 push` and :hg:`help pull`). This requires both the local and remote
827 :hg:`help push` and :hg:`help pull`). If a shared bookmark has
812 repositories to support bookmarks. For versions prior to 1.8, this means
828 diverged, a new 'divergent bookmark' of the form 'name@path' will
813 the bookmarks extension must be enabled.
829 be created. Using :hg:`merge` will resolve the divergence.
814
830
815 If you set a bookmark called '@', new clones of the repository will
831 A bookmark named '@' has the special property that :hg:`clone` will
816 have that revision checked out (and the bookmark made active) by
832 check it out by default if it exists.
817 default.
833
818
834 .. container:: verbose
819 With -i/--inactive, the new bookmark will not be made the active
835
820 bookmark. If -r/--rev is given, the new bookmark will not be made
836 Examples:
821 active even if -i/--inactive is not given. If no NAME is given, the
837
822 current active bookmark will be marked inactive.
838 - create an active bookmark for a new line of development::
839
840 hg book new-feature
841
842 - create an inactive bookmark as a place marker::
843
844 hg book -i reviewed
845
846 - create an inactive bookmark on another changeset::
847
848 hg book -r .^ tested
849
850 - move the '@' bookmark from another branch::
851
852 hg book -f @
823 '''
853 '''
824 force = opts.get('force')
854 force = opts.get('force')
825 rev = opts.get('rev')
855 rev = opts.get('rev')
@@ -1155,7 +1185,8 b' def bundle(ui, repo, fname, dest=None, *'
1155 ('r', 'rev', '', _('print the given revision'), _('REV')),
1185 ('r', 'rev', '', _('print the given revision'), _('REV')),
1156 ('', 'decode', None, _('apply any matching decode filter')),
1186 ('', 'decode', None, _('apply any matching decode filter')),
1157 ] + walkopts,
1187 ] + walkopts,
1158 _('[OPTION]... FILE...'))
1188 _('[OPTION]... FILE...'),
1189 inferrepo=True)
1159 def cat(ui, repo, file1, *pats, **opts):
1190 def cat(ui, repo, file1, *pats, **opts):
1160 """output the current or given revision of files
1191 """output the current or given revision of files
1161
1192
@@ -1191,7 +1222,8 b' def cat(ui, repo, file1, *pats, **opts):'
1191 ('', 'pull', None, _('use pull protocol to copy metadata')),
1222 ('', 'pull', None, _('use pull protocol to copy metadata')),
1192 ('', 'uncompressed', None, _('use uncompressed transfer (fast over LAN)')),
1223 ('', 'uncompressed', None, _('use uncompressed transfer (fast over LAN)')),
1193 ] + remoteopts,
1224 ] + remoteopts,
1194 _('[OPTION]... SOURCE [DEST]'))
1225 _('[OPTION]... SOURCE [DEST]'),
1226 norepo=True)
1195 def clone(ui, source, dest=None, **opts):
1227 def clone(ui, source, dest=None, **opts):
1196 """make a copy of an existing repository
1228 """make a copy of an existing repository
1197
1229
@@ -1310,7 +1342,8 b' def clone(ui, source, dest=None, **opts)'
1310 ('e', 'edit', None,
1342 ('e', 'edit', None,
1311 _('further edit commit message already specified')),
1343 _('further edit commit message already specified')),
1312 ] + walkopts + commitopts + commitopts2 + subrepoopts,
1344 ] + walkopts + commitopts + commitopts2 + subrepoopts,
1313 _('[OPTION]... [FILE]...'))
1345 _('[OPTION]... [FILE]...'),
1346 inferrepo=True)
1314 def commit(ui, repo, *pats, **opts):
1347 def commit(ui, repo, *pats, **opts):
1315 """commit the specified files or all outstanding changes
1348 """commit the specified files or all outstanding changes
1316
1349
@@ -1347,8 +1380,6 b' def commit(ui, repo, *pats, **opts):'
1347
1380
1348 Returns 0 on success, 1 if nothing changed.
1381 Returns 0 on success, 1 if nothing changed.
1349 """
1382 """
1350 forceeditor = opts.get('edit')
1351
1352 if opts.get('subrepos'):
1383 if opts.get('subrepos'):
1353 if opts.get('amend'):
1384 if opts.get('amend'):
1354 raise util.Abort(_('cannot amend with --subrepos'))
1385 raise util.Abort(_('cannot amend with --subrepos'))
@@ -1410,10 +1441,6 b' def commit(ui, repo, *pats, **opts):'
1410 bookmarks.setcurrent(repo, bm)
1441 bookmarks.setcurrent(repo, bm)
1411 newmarks.write()
1442 newmarks.write()
1412 else:
1443 else:
1413 e = cmdutil.commiteditor
1414 if forceeditor:
1415 e = cmdutil.commitforceeditor
1416
1417 def commitfunc(ui, repo, message, match, opts):
1444 def commitfunc(ui, repo, message, match, opts):
1418 try:
1445 try:
1419 if opts.get('secret'):
1446 if opts.get('secret'):
@@ -1423,7 +1450,9 b' def commit(ui, repo, *pats, **opts):'
1423 'commit')
1450 'commit')
1424
1451
1425 return repo.commit(message, opts.get('user'), opts.get('date'),
1452 return repo.commit(message, opts.get('user'), opts.get('date'),
1426 match, editor=e, extra=extra)
1453 match,
1454 editor=cmdutil.getcommiteditor(**opts),
1455 extra=extra)
1427 finally:
1456 finally:
1428 ui.setconfig('phases', 'new-commit', oldcommitphase, 'commit')
1457 ui.setconfig('phases', 'new-commit', oldcommitphase, 'commit')
1429 repo.baseui.setconfig('phases', 'new-commit', oldcommitphase,
1458 repo.baseui.setconfig('phases', 'new-commit', oldcommitphase,
@@ -1448,7 +1477,8 b' def commit(ui, repo, *pats, **opts):'
1448 ('e', 'edit', None, _('edit user config')),
1477 ('e', 'edit', None, _('edit user config')),
1449 ('l', 'local', None, _('edit repository config')),
1478 ('l', 'local', None, _('edit repository config')),
1450 ('g', 'global', None, _('edit global config'))],
1479 ('g', 'global', None, _('edit global config'))],
1451 _('[-u] [NAME]...'))
1480 _('[-u] [NAME]...'),
1481 optionalrepo=True)
1452 def config(ui, repo, *values, **opts):
1482 def config(ui, repo, *values, **opts):
1453 """show combined config settings from all hgrc files
1483 """show combined config settings from all hgrc files
1454
1484
@@ -1567,7 +1597,7 b' def copy(ui, repo, *pats, **opts):'
1567 finally:
1597 finally:
1568 wlock.release()
1598 wlock.release()
1569
1599
1570 @command('debugancestor', [], _('[INDEX] REV1 REV2'))
1600 @command('debugancestor', [], _('[INDEX] REV1 REV2'), optionalrepo=True)
1571 def debugancestor(ui, repo, *args):
1601 def debugancestor(ui, repo, *args):
1572 """find the ancestor revision of two revisions in a given index"""
1602 """find the ancestor revision of two revisions in a given index"""
1573 if len(args) == 3:
1603 if len(args) == 3:
@@ -1686,17 +1716,17 b' def debugbuilddag(ui, repo, text=None,'
1686 ml[id * linesperrev] += " r%i" % id
1716 ml[id * linesperrev] += " r%i" % id
1687 mergedtext = "\n".join(ml)
1717 mergedtext = "\n".join(ml)
1688 files.append(fn)
1718 files.append(fn)
1689 fctxs[fn] = context.memfilectx(fn, mergedtext)
1719 fctxs[fn] = context.memfilectx(repo, fn, mergedtext)
1690
1720
1691 if overwritten_file:
1721 if overwritten_file:
1692 fn = "of"
1722 fn = "of"
1693 files.append(fn)
1723 files.append(fn)
1694 fctxs[fn] = context.memfilectx(fn, "r%i\n" % id)
1724 fctxs[fn] = context.memfilectx(repo, fn, "r%i\n" % id)
1695
1725
1696 if new_file:
1726 if new_file:
1697 fn = "nf%i" % id
1727 fn = "nf%i" % id
1698 files.append(fn)
1728 files.append(fn)
1699 fctxs[fn] = context.memfilectx(fn, "r%i\n" % id)
1729 fctxs[fn] = context.memfilectx(repo, fn, "r%i\n" % id)
1700 if len(ps) > 1:
1730 if len(ps) > 1:
1701 if not p2:
1731 if not p2:
1702 p2 = repo[ps[1]]
1732 p2 = repo[ps[1]]
@@ -1737,7 +1767,10 b' def debugbuilddag(ui, repo, text=None,'
1737 ui.progress(_('building'), None)
1767 ui.progress(_('building'), None)
1738 release(tr, lock)
1768 release(tr, lock)
1739
1769
1740 @command('debugbundle', [('a', 'all', None, _('show all details'))], _('FILE'))
1770 @command('debugbundle',
1771 [('a', 'all', None, _('show all details'))],
1772 _('FILE'),
1773 norepo=True)
1741 def debugbundle(ui, bundlepath, all=None, **opts):
1774 def debugbundle(ui, bundlepath, all=None, **opts):
1742 """lists the contents of a bundle"""
1775 """lists the contents of a bundle"""
1743 f = hg.openpath(ui, bundlepath)
1776 f = hg.openpath(ui, bundlepath)
@@ -1815,7 +1848,7 b' def debugcheckstate(ui, repo):'
1815 error = _(".hg/dirstate inconsistent with current parent's manifest")
1848 error = _(".hg/dirstate inconsistent with current parent's manifest")
1816 raise util.Abort(error)
1849 raise util.Abort(error)
1817
1850
1818 @command('debugcommands', [], _('[COMMAND]'))
1851 @command('debugcommands', [], _('[COMMAND]'), norepo=True)
1819 def debugcommands(ui, cmd='', *args):
1852 def debugcommands(ui, cmd='', *args):
1820 """list all available commands and options"""
1853 """list all available commands and options"""
1821 for cmd, vals in sorted(table.iteritems()):
1854 for cmd, vals in sorted(table.iteritems()):
@@ -1825,7 +1858,8 b" def debugcommands(ui, cmd='', *args):"
1825
1858
1826 @command('debugcomplete',
1859 @command('debugcomplete',
1827 [('o', 'options', None, _('show the command options'))],
1860 [('o', 'options', None, _('show the command options'))],
1828 _('[-o] CMD'))
1861 _('[-o] CMD'),
1862 norepo=True)
1829 def debugcomplete(ui, cmd='', **opts):
1863 def debugcomplete(ui, cmd='', **opts):
1830 """returns the completion list associated with the given command"""
1864 """returns the completion list associated with the given command"""
1831
1865
@@ -1855,7 +1889,8 b" def debugcomplete(ui, cmd='', **opts):"
1855 ('b', 'branches', None, _('annotate with branch names')),
1889 ('b', 'branches', None, _('annotate with branch names')),
1856 ('', 'dots', None, _('use dots for runs')),
1890 ('', 'dots', None, _('use dots for runs')),
1857 ('s', 'spaces', None, _('separate elements by spaces'))],
1891 ('s', 'spaces', None, _('separate elements by spaces'))],
1858 _('[OPTION]... [FILE [REV]...]'))
1892 _('[OPTION]... [FILE [REV]...]'),
1893 optionalrepo=True)
1859 def debugdag(ui, repo, file_=None, *revs, **opts):
1894 def debugdag(ui, repo, file_=None, *revs, **opts):
1860 """format the changelog or an index DAG as a concise textual description
1895 """format the changelog or an index DAG as a concise textual description
1861
1896
@@ -1929,7 +1964,8 b' def debugdata(ui, repo, file_, rev=None,'
1929
1964
1930 @command('debugdate',
1965 @command('debugdate',
1931 [('e', 'extended', None, _('try extended date formats'))],
1966 [('e', 'extended', None, _('try extended date formats'))],
1932 _('[-e] DATE [RANGE]'))
1967 _('[-e] DATE [RANGE]'),
1968 norepo=True, optionalrepo=True)
1933 def debugdate(ui, date, range=None, **opts):
1969 def debugdate(ui, date, range=None, **opts):
1934 """parse and display a date"""
1970 """parse and display a date"""
1935 if opts["extended"]:
1971 if opts["extended"]:
@@ -2025,7 +2061,7 b' def debugfileset(ui, repo, expr, **opts)'
2025 for f in ctx.getfileset(expr):
2061 for f in ctx.getfileset(expr):
2026 ui.write("%s\n" % f)
2062 ui.write("%s\n" % f)
2027
2063
2028 @command('debugfsinfo', [], _('[PATH]'))
2064 @command('debugfsinfo', [], _('[PATH]'), norepo=True)
2029 def debugfsinfo(ui, path="."):
2065 def debugfsinfo(ui, path="."):
2030 """show information detected about current filesystem"""
2066 """show information detected about current filesystem"""
2031 util.writefile('.debugfsinfo', '')
2067 util.writefile('.debugfsinfo', '')
@@ -2040,7 +2076,8 b' def debugfsinfo(ui, path="."):'
2040 [('H', 'head', [], _('id of head node'), _('ID')),
2076 [('H', 'head', [], _('id of head node'), _('ID')),
2041 ('C', 'common', [], _('id of common node'), _('ID')),
2077 ('C', 'common', [], _('id of common node'), _('ID')),
2042 ('t', 'type', 'bzip2', _('bundle compression type to use'), _('TYPE'))],
2078 ('t', 'type', 'bzip2', _('bundle compression type to use'), _('TYPE'))],
2043 _('REPO FILE [-H|-C ID]...'))
2079 _('REPO FILE [-H|-C ID]...'),
2080 norepo=True)
2044 def debuggetbundle(ui, repopath, bundlepath, head=None, common=None, **opts):
2081 def debuggetbundle(ui, repopath, bundlepath, head=None, common=None, **opts):
2045 """retrieves a bundle from a repo
2082 """retrieves a bundle from a repo
2046
2083
@@ -2080,7 +2117,8 b' def debugignore(ui, repo, *values, **opt'
2080 [('c', 'changelog', False, _('open changelog')),
2117 [('c', 'changelog', False, _('open changelog')),
2081 ('m', 'manifest', False, _('open manifest')),
2118 ('m', 'manifest', False, _('open manifest')),
2082 ('f', 'format', 0, _('revlog format'), _('FORMAT'))],
2119 ('f', 'format', 0, _('revlog format'), _('FORMAT'))],
2083 _('[-f FORMAT] -c|-m|FILE'))
2120 _('[-f FORMAT] -c|-m|FILE'),
2121 optionalrepo=True)
2084 def debugindex(ui, repo, file_=None, **opts):
2122 def debugindex(ui, repo, file_=None, **opts):
2085 """dump the contents of an index file"""
2123 """dump the contents of an index file"""
2086 r = cmdutil.openrevlog(repo, 'debugindex', file_, opts)
2124 r = cmdutil.openrevlog(repo, 'debugindex', file_, opts)
@@ -2122,7 +2160,7 b' def debugindex(ui, repo, file_=None, **o'
2122 i, r.flags(i), r.start(i), r.length(i), r.rawsize(i),
2160 i, r.flags(i), r.start(i), r.length(i), r.rawsize(i),
2123 base, r.linkrev(i), pr[0], pr[1], short(node)))
2161 base, r.linkrev(i), pr[0], pr[1], short(node)))
2124
2162
2125 @command('debugindexdot', [], _('FILE'))
2163 @command('debugindexdot', [], _('FILE'), optionalrepo=True)
2126 def debugindexdot(ui, repo, file_):
2164 def debugindexdot(ui, repo, file_):
2127 """dump an index DAG as a graphviz dot file"""
2165 """dump an index DAG as a graphviz dot file"""
2128 r = None
2166 r = None
@@ -2141,7 +2179,7 b' def debugindexdot(ui, repo, file_):'
2141 ui.write("\t%d -> %d\n" % (r.rev(pp[1]), i))
2179 ui.write("\t%d -> %d\n" % (r.rev(pp[1]), i))
2142 ui.write("}\n")
2180 ui.write("}\n")
2143
2181
2144 @command('debuginstall', [], '')
2182 @command('debuginstall', [], '', norepo=True)
2145 def debuginstall(ui):
2183 def debuginstall(ui):
2146 '''test Mercurial installation
2184 '''test Mercurial installation
2147
2185
@@ -2239,7 +2277,7 b' def debuginstall(ui):'
2239
2277
2240 return problems
2278 return problems
2241
2279
2242 @command('debugknown', [], _('REPO ID...'))
2280 @command('debugknown', [], _('REPO ID...'), norepo=True)
2243 def debugknown(ui, repopath, *ids, **opts):
2281 def debugknown(ui, repopath, *ids, **opts):
2244 """test whether node ids are known to a repo
2282 """test whether node ids are known to a repo
2245
2283
@@ -2277,6 +2315,7 b' def debugobsolete(ui, repo, precursor=No'
2277 """create arbitrary obsolete marker
2315 """create arbitrary obsolete marker
2278
2316
2279 With no arguments, displays the list of obsolescence markers."""
2317 With no arguments, displays the list of obsolescence markers."""
2318
2280 def parsenodeid(s):
2319 def parsenodeid(s):
2281 try:
2320 try:
2282 # We do not use revsingle/revrange functions here to accept
2321 # We do not use revsingle/revrange functions here to accept
@@ -2376,7 +2415,7 b' def debugpathcomplete(ui, repo, *specs, '
2376 ui.write('\n'.join(repo.pathto(p, cwd) for p in sorted(files)))
2415 ui.write('\n'.join(repo.pathto(p, cwd) for p in sorted(files)))
2377 ui.write('\n')
2416 ui.write('\n')
2378
2417
2379 @command('debugpushkey', [], _('REPO NAMESPACE [KEY OLD NEW]'))
2418 @command('debugpushkey', [], _('REPO NAMESPACE [KEY OLD NEW]'), norepo=True)
2380 def debugpushkey(ui, repopath, namespace, *keyinfo, **opts):
2419 def debugpushkey(ui, repopath, namespace, *keyinfo, **opts):
2381 '''access the pushkey key/value protocol
2420 '''access the pushkey key/value protocol
2382
2421
@@ -2461,7 +2500,8 b' def debugrename(ui, repo, file1, *pats, '
2461 [('c', 'changelog', False, _('open changelog')),
2500 [('c', 'changelog', False, _('open changelog')),
2462 ('m', 'manifest', False, _('open manifest')),
2501 ('m', 'manifest', False, _('open manifest')),
2463 ('d', 'dump', False, _('dump index data'))],
2502 ('d', 'dump', False, _('dump index data'))],
2464 _('-c|-m|FILE'))
2503 _('-c|-m|FILE'),
2504 optionalrepo=True)
2465 def debugrevlog(ui, repo, file_=None, **opts):
2505 def debugrevlog(ui, repo, file_=None, **opts):
2466 """show data and statistics about a revlog"""
2506 """show data and statistics about a revlog"""
2467 r = cmdutil.openrevlog(repo, 'debugrevlog', file_, opts)
2507 r = cmdutil.openrevlog(repo, 'debugrevlog', file_, opts)
@@ -2768,7 +2808,7 b' def debugsuccessorssets(ui, repo, *revs)'
2768 ui.write(node2str(node))
2808 ui.write(node2str(node))
2769 ui.write('\n')
2809 ui.write('\n')
2770
2810
2771 @command('debugwalk', walkopts, _('[OPTION]... [FILE]...'))
2811 @command('debugwalk', walkopts, _('[OPTION]... [FILE]...'), inferrepo=True)
2772 def debugwalk(ui, repo, *pats, **opts):
2812 def debugwalk(ui, repo, *pats, **opts):
2773 """show how files match on given patterns"""
2813 """show how files match on given patterns"""
2774 m = scmutil.match(repo[None], pats, opts)
2814 m = scmutil.match(repo[None], pats, opts)
@@ -2790,7 +2830,8 b' def debugwalk(ui, repo, *pats, **opts):'
2790 ('', 'four', '', 'four'),
2830 ('', 'four', '', 'four'),
2791 ('', 'five', '', 'five'),
2831 ('', 'five', '', 'five'),
2792 ] + remoteopts,
2832 ] + remoteopts,
2793 _('REPO [OPTIONS]... [ONE [TWO]]'))
2833 _('REPO [OPTIONS]... [ONE [TWO]]'),
2834 norepo=True)
2794 def debugwireargs(ui, repopath, *vals, **opts):
2835 def debugwireargs(ui, repopath, *vals, **opts):
2795 repo = hg.peer(ui, opts, repopath)
2836 repo = hg.peer(ui, opts, repopath)
2796 for opt in remoteopts:
2837 for opt in remoteopts:
@@ -2810,7 +2851,8 b' def debugwireargs(ui, repopath, *vals, *'
2810 [('r', 'rev', [], _('revision'), _('REV')),
2851 [('r', 'rev', [], _('revision'), _('REV')),
2811 ('c', 'change', '', _('change made by revision'), _('REV'))
2852 ('c', 'change', '', _('change made by revision'), _('REV'))
2812 ] + diffopts + diffopts2 + walkopts + subrepoopts,
2853 ] + diffopts + diffopts2 + walkopts + subrepoopts,
2813 _('[OPTION]... ([-c REV] | [-r REV1 [-r REV2]]) [FILE]...'))
2854 _('[OPTION]... ([-c REV] | [-r REV1 [-r REV2]]) [FILE]...'),
2855 inferrepo=True)
2814 def diff(ui, repo, *pats, **opts):
2856 def diff(ui, repo, *pats, **opts):
2815 """diff repository (or selected files)
2857 """diff repository (or selected files)
2816
2858
@@ -2972,7 +3014,7 b' def export(ui, repo, *changesets, **opts'
2972 switch_parent=opts.get('switch_parent'),
3014 switch_parent=opts.get('switch_parent'),
2973 opts=patch.diffopts(ui, opts))
3015 opts=patch.diffopts(ui, opts))
2974
3016
2975 @command('^forget', walkopts, _('[OPTION]... FILE...'))
3017 @command('^forget', walkopts, _('[OPTION]... FILE...'), inferrepo=True)
2976 def forget(ui, repo, *pats, **opts):
3018 def forget(ui, repo, *pats, **opts):
2977 """forget the specified files on the next commit
3019 """forget the specified files on the next commit
2978
3020
@@ -3077,9 +3119,7 b' def graft(ui, repo, *revs, **opts):'
3077 if not opts.get('date') and opts.get('currentdate'):
3119 if not opts.get('date') and opts.get('currentdate'):
3078 opts['date'] = "%d %d" % util.makedate()
3120 opts['date'] = "%d %d" % util.makedate()
3079
3121
3080 editor = None
3122 editor = cmdutil.getcommiteditor(**opts)
3081 if opts.get('edit'):
3082 editor = cmdutil.commitforceeditor
3083
3123
3084 cont = False
3124 cont = False
3085 if opts['continue']:
3125 if opts['continue']:
@@ -3186,7 +3226,8 b' def graft(ui, repo, *revs, **opts):'
3186 repo.ui.setconfig('ui', 'forcemerge', opts.get('tool', ''),
3226 repo.ui.setconfig('ui', 'forcemerge', opts.get('tool', ''),
3187 'graft')
3227 'graft')
3188 stats = mergemod.update(repo, ctx.node(), True, True, False,
3228 stats = mergemod.update(repo, ctx.node(), True, True, False,
3189 ctx.p1().node())
3229 ctx.p1().node(),
3230 labels=['local', 'graft'])
3190 finally:
3231 finally:
3191 repo.ui.setconfig('ui', 'forcemerge', '', 'graft')
3232 repo.ui.setconfig('ui', 'forcemerge', '', 'graft')
3192 # report any conflicts
3233 # report any conflicts
@@ -3238,7 +3279,8 b' def graft(ui, repo, *revs, **opts):'
3238 ('u', 'user', None, _('list the author (long with -v)')),
3279 ('u', 'user', None, _('list the author (long with -v)')),
3239 ('d', 'date', None, _('list the date (short with -q)')),
3280 ('d', 'date', None, _('list the date (short with -q)')),
3240 ] + walkopts,
3281 ] + walkopts,
3241 _('[OPTION]... PATTERN [FILE]...'))
3282 _('[OPTION]... PATTERN [FILE]...'),
3283 inferrepo=True)
3242 def grep(ui, repo, pattern, *pats, **opts):
3284 def grep(ui, repo, pattern, *pats, **opts):
3243 """search for a pattern in specified files and revisions
3285 """search for a pattern in specified files and revisions
3244
3286
@@ -3261,7 +3303,7 b' def grep(ui, repo, pattern, *pats, **opt'
3261 if opts.get('ignore_case'):
3303 if opts.get('ignore_case'):
3262 reflags |= re.I
3304 reflags |= re.I
3263 try:
3305 try:
3264 regexp = util.compilere(pattern, reflags)
3306 regexp = util.re.compile(pattern, reflags)
3265 except re.error, inst:
3307 except re.error, inst:
3266 ui.warn(_("grep: invalid match pattern: %s\n") % inst)
3308 ui.warn(_("grep: invalid match pattern: %s\n") % inst)
3267 return 1
3309 return 1
@@ -3517,7 +3559,8 b' def heads(ui, repo, *branchrevs, **opts)'
3517 ('c', 'command', None, _('show only help for commands')),
3559 ('c', 'command', None, _('show only help for commands')),
3518 ('k', 'keyword', '', _('show topics matching keyword')),
3560 ('k', 'keyword', '', _('show topics matching keyword')),
3519 ],
3561 ],
3520 _('[-ec] [TOPIC]'))
3562 _('[-ec] [TOPIC]'),
3563 norepo=True)
3521 def help_(ui, name=None, **opts):
3564 def help_(ui, name=None, **opts):
3522 """show help for a given topic or a help overview
3565 """show help for a given topic or a help overview
3523
3566
@@ -3552,7 +3595,8 b' def help_(ui, name=None, **opts):'
3552 ('t', 'tags', None, _('show tags')),
3595 ('t', 'tags', None, _('show tags')),
3553 ('B', 'bookmarks', None, _('show bookmarks')),
3596 ('B', 'bookmarks', None, _('show bookmarks')),
3554 ] + remoteopts,
3597 ] + remoteopts,
3555 _('[-nibtB] [-r REV] [SOURCE]'))
3598 _('[-nibtB] [-r REV] [SOURCE]'),
3599 optionalrepo=True)
3556 def identify(ui, repo, source=None, rev=None,
3600 def identify(ui, repo, source=None, rev=None,
3557 num=None, id=None, branch=None, tags=None, bookmarks=None, **opts):
3601 num=None, id=None, branch=None, tags=None, bookmarks=None, **opts):
3558 """identify the working copy or specified revision
3602 """identify the working copy or specified revision
@@ -3692,6 +3736,8 b' def identify(ui, repo, source=None, rev='
3692 _("don't commit, just update the working directory")),
3736 _("don't commit, just update the working directory")),
3693 ('', 'bypass', None,
3737 ('', 'bypass', None,
3694 _("apply patch without touching the working directory")),
3738 _("apply patch without touching the working directory")),
3739 ('', 'partial', None,
3740 _('commit even if some hunks fail')),
3695 ('', 'exact', None,
3741 ('', 'exact', None,
3696 _('apply patch to the nodes from which it was generated')),
3742 _('apply patch to the nodes from which it was generated')),
3697 ('', 'import-branch', None,
3743 ('', 'import-branch', None,
@@ -3733,6 +3779,16 b' def import_(ui, repo, patch1=None, *patc'
3733 With -s/--similarity, hg will attempt to discover renames and
3779 With -s/--similarity, hg will attempt to discover renames and
3734 copies in the patch in the same way as :hg:`addremove`.
3780 copies in the patch in the same way as :hg:`addremove`.
3735
3781
3782 Use --partial to ensure a changeset will be created from the patch
3783 even if some hunks fail to apply. Hunks that fail to apply will be
3784 written to a <target-file>.rej file. Conflicts can then be resolved
3785 by hand before :hg:`commit --amend` is run to update the created
3786 changeset. This flag exists to let people import patches that
3787 partially apply without losing the associated metadata (author,
3788 date, description, ...). Note that when none of the hunks apply
3789 cleanly, :hg:`import --partial` will create an empty changeset,
3790 importing only the patch metadata.
3791
3736 To read a patch from standard input, use "-" as the patch name. If
3792 To read a patch from standard input, use "-" as the patch name. If
3737 a URL is specified, the patch will be downloaded from it.
3793 a URL is specified, the patch will be downloaded from it.
3738 See :hg:`help dates` for a list of formats valid for -d/--date.
3794 See :hg:`help dates` for a list of formats valid for -d/--date.
@@ -3758,7 +3814,7 b' def import_(ui, repo, patch1=None, *patc'
3758
3814
3759 hg import --exact proposed-fix.patch
3815 hg import --exact proposed-fix.patch
3760
3816
3761 Returns 0 on success.
3817 Returns 0 on success, 1 on partial success (see --partial).
3762 """
3818 """
3763
3819
3764 if not patch1:
3820 if not patch1:
@@ -3790,6 +3846,7 b' def import_(ui, repo, patch1=None, *patc'
3790 base = opts["base"]
3846 base = opts["base"]
3791 wlock = lock = tr = None
3847 wlock = lock = tr = None
3792 msgs = []
3848 msgs = []
3849 ret = 0
3793
3850
3794
3851
3795 try:
3852 try:
@@ -3811,8 +3868,9 b' def import_(ui, repo, patch1=None, *patc'
3811
3868
3812 haspatch = False
3869 haspatch = False
3813 for hunk in patch.split(patchfile):
3870 for hunk in patch.split(patchfile):
3814 (msg, node) = cmdutil.tryimportone(ui, repo, hunk, parents,
3871 (msg, node, rej) = cmdutil.tryimportone(ui, repo, hunk,
3815 opts, msgs, hg.clean)
3872 parents, opts,
3873 msgs, hg.clean)
3816 if msg:
3874 if msg:
3817 haspatch = True
3875 haspatch = True
3818 ui.note(msg + '\n')
3876 ui.note(msg + '\n')
@@ -3820,6 +3878,12 b' def import_(ui, repo, patch1=None, *patc'
3820 parents = repo.parents()
3878 parents = repo.parents()
3821 else:
3879 else:
3822 parents = [repo[node]]
3880 parents = [repo[node]]
3881 if rej:
3882 ui.write_err(_("patch applied partially\n"))
3883 ui.write_err(("(fix the .rej files and run "
3884 "`hg commit --amend`)\n"))
3885 ret = 1
3886 break
3823
3887
3824 if not haspatch:
3888 if not haspatch:
3825 raise util.Abort(_('%s: no diffs found') % patchurl)
3889 raise util.Abort(_('%s: no diffs found') % patchurl)
@@ -3828,6 +3892,7 b' def import_(ui, repo, patch1=None, *patc'
3828 tr.close()
3892 tr.close()
3829 if msgs:
3893 if msgs:
3830 repo.savecommitmessage('\n* * *\n'.join(msgs))
3894 repo.savecommitmessage('\n* * *\n'.join(msgs))
3895 return ret
3831 except: # re-raises
3896 except: # re-raises
3832 # wlock.release() indirectly calls dirstate.write(): since
3897 # wlock.release() indirectly calls dirstate.write(): since
3833 # we're crashing, we do not want to change the working dir
3898 # we're crashing, we do not want to change the working dir
@@ -3913,7 +3978,8 b' def incoming(ui, repo, source="default",'
3913 del repo._subtoppath
3978 del repo._subtoppath
3914
3979
3915
3980
3916 @command('^init', remoteopts, _('[-e CMD] [--remotecmd CMD] [DEST]'))
3981 @command('^init', remoteopts, _('[-e CMD] [--remotecmd CMD] [DEST]'),
3982 norepo=True)
3917 def init(ui, dest=".", **opts):
3983 def init(ui, dest=".", **opts):
3918 """create a new repository in the given directory
3984 """create a new repository in the given directory
3919
3985
@@ -3993,7 +4059,8 b' def locate(ui, repo, *pats, **opts):'
3993 ('P', 'prune', [],
4059 ('P', 'prune', [],
3994 _('do not display revision or any of its ancestors'), _('REV')),
4060 _('do not display revision or any of its ancestors'), _('REV')),
3995 ] + logopts + walkopts,
4061 ] + logopts + walkopts,
3996 _('[OPTION]... [FILE]'))
4062 _('[OPTION]... [FILE]'),
4063 inferrepo=True)
3997 def log(ui, repo, *pats, **opts):
4064 def log(ui, repo, *pats, **opts):
3998 """show revision history of entire repository or files
4065 """show revision history of entire repository or files
3999
4066
@@ -4351,7 +4418,8 b' def outgoing(ui, repo, dest=None, **opts'
4351 @command('parents',
4418 @command('parents',
4352 [('r', 'rev', '', _('show parents of the specified revision'), _('REV')),
4419 [('r', 'rev', '', _('show parents of the specified revision'), _('REV')),
4353 ] + templateopts,
4420 ] + templateopts,
4354 _('[-r REV] [FILE]'))
4421 _('[-r REV] [FILE]'),
4422 inferrepo=True)
4355 def parents(ui, repo, file_=None, **opts):
4423 def parents(ui, repo, file_=None, **opts):
4356 """show the parents of the working directory or revision
4424 """show the parents of the working directory or revision
4357
4425
@@ -4394,7 +4462,7 b' def parents(ui, repo, file_=None, **opts'
4394 displayer.show(repo[n])
4462 displayer.show(repo[n])
4395 displayer.close()
4463 displayer.close()
4396
4464
4397 @command('paths', [], _('[NAME]'))
4465 @command('paths', [], _('[NAME]'), optionalrepo=True)
4398 def paths(ui, repo, search=None):
4466 def paths(ui, repo, search=None):
4399 """show aliases for remote repositories
4467 """show aliases for remote repositories
4400
4468
@@ -4748,7 +4816,8 b' def recover(ui, repo):'
4748 ('f', 'force', None,
4816 ('f', 'force', None,
4749 _('remove (and delete) file even if added or modified')),
4817 _('remove (and delete) file even if added or modified')),
4750 ] + walkopts,
4818 ] + walkopts,
4751 _('[OPTION]... FILE...'))
4819 _('[OPTION]... FILE...'),
4820 inferrepo=True)
4752 def remove(ui, repo, *pats, **opts):
4821 def remove(ui, repo, *pats, **opts):
4753 """remove the specified files on the next commit
4822 """remove the specified files on the next commit
4754
4823
@@ -4877,7 +4946,8 b' def rename(ui, repo, *pats, **opts):'
4877 ('u', 'unmark', None, _('mark files as unresolved')),
4946 ('u', 'unmark', None, _('mark files as unresolved')),
4878 ('n', 'no-status', None, _('hide status prefix'))]
4947 ('n', 'no-status', None, _('hide status prefix'))]
4879 + mergetoolopts + walkopts,
4948 + mergetoolopts + walkopts,
4880 _('[OPTION]... [FILE]...'))
4949 _('[OPTION]... [FILE]...'),
4950 inferrepo=True)
4881 def resolve(ui, repo, *pats, **opts):
4951 def resolve(ui, repo, *pats, **opts):
4882 """redo merges or set/view the merge status of files
4952 """redo merges or set/view the merge status of files
4883
4953
@@ -4930,46 +5000,66 b' def resolve(ui, repo, *pats, **opts):'
4930 wlock = repo.wlock()
5000 wlock = repo.wlock()
4931 try:
5001 try:
4932 ms = mergemod.mergestate(repo)
5002 ms = mergemod.mergestate(repo)
5003
5004 if not ms.active() and not show:
5005 raise util.Abort(
5006 _('resolve command not applicable when not merging'))
5007
4933 m = scmutil.match(repo[None], pats, opts)
5008 m = scmutil.match(repo[None], pats, opts)
4934 ret = 0
5009 ret = 0
5010 didwork = False
4935
5011
4936 for f in ms:
5012 for f in ms:
4937 if m(f):
5013 if not m(f):
4938 if show:
5014 continue
4939 if nostatus:
5015
4940 ui.write("%s\n" % f)
5016 didwork = True
4941 else:
5017
4942 ui.write("%s %s\n" % (ms[f].upper(), f),
5018 if show:
4943 label='resolve.' +
5019 if nostatus:
4944 {'u': 'unresolved', 'r': 'resolved'}[ms[f]])
5020 ui.write("%s\n" % f)
4945 elif mark:
4946 ms.mark(f, "r")
4947 elif unmark:
4948 ms.mark(f, "u")
4949 else:
5021 else:
4950 wctx = repo[None]
5022 ui.write("%s %s\n" % (ms[f].upper(), f),
4951
5023 label='resolve.' +
4952 # backup pre-resolve (merge uses .orig for its own purposes)
5024 {'u': 'unresolved', 'r': 'resolved'}[ms[f]])
4953 a = repo.wjoin(f)
5025 elif mark:
4954 util.copyfile(a, a + ".resolve")
5026 ms.mark(f, "r")
4955
5027 elif unmark:
4956 try:
5028 ms.mark(f, "u")
4957 # resolve file
5029 else:
4958 ui.setconfig('ui', 'forcemerge', opts.get('tool', ''),
5030 wctx = repo[None]
4959 'resolve')
5031
4960 if ms.resolve(f, wctx):
5032 # backup pre-resolve (merge uses .orig for its own purposes)
4961 ret = 1
5033 a = repo.wjoin(f)
4962 finally:
5034 util.copyfile(a, a + ".resolve")
4963 ui.setconfig('ui', 'forcemerge', '', 'resolve')
5035
4964 ms.commit()
5036 try:
4965
5037 # resolve file
4966 # replace filemerge's .orig file with our resolve file
5038 ui.setconfig('ui', 'forcemerge', opts.get('tool', ''),
4967 util.rename(a + ".resolve", a + ".orig")
5039 'resolve')
5040 if ms.resolve(f, wctx):
5041 ret = 1
5042 finally:
5043 ui.setconfig('ui', 'forcemerge', '', 'resolve')
5044 ms.commit()
5045
5046 # replace filemerge's .orig file with our resolve file
5047 util.rename(a + ".resolve", a + ".orig")
4968
5048
4969 ms.commit()
5049 ms.commit()
5050
5051 if not didwork and pats:
5052 ui.warn(_("arguments do not match paths that need resolving\n"))
5053
4970 finally:
5054 finally:
4971 wlock.release()
5055 wlock.release()
4972
5056
5057 # Nudge users into finishing an unfinished operation. We don't print
5058 # this with the list/show operation because we want list/show to remain
5059 # machine readable.
5060 if not list(ms.unresolved()) and not show:
5061 ui.status(_('no more unresolved files\n'))
5062
4973 return ret
5063 return ret
4974
5064
4975 @command('revert',
5065 @command('revert',
@@ -5126,7 +5216,8 b' def root(ui, repo):'
5126 ('', 'style', '', _('template style to use'), _('STYLE')),
5216 ('', 'style', '', _('template style to use'), _('STYLE')),
5127 ('6', 'ipv6', None, _('use IPv6 in addition to IPv4')),
5217 ('6', 'ipv6', None, _('use IPv6 in addition to IPv4')),
5128 ('', 'certificate', '', _('SSL certificate file'), _('FILE'))],
5218 ('', 'certificate', '', _('SSL certificate file'), _('FILE'))],
5129 _('[OPTION]...'))
5219 _('[OPTION]...'),
5220 optionalrepo=True)
5130 def serve(ui, repo, **opts):
5221 def serve(ui, repo, **opts):
5131 """start stand-alone webserver
5222 """start stand-alone webserver
5132
5223
@@ -5155,13 +5246,10 b' def serve(ui, repo, **opts):'
5155 if opts["stdio"] and opts["cmdserver"]:
5246 if opts["stdio"] and opts["cmdserver"]:
5156 raise util.Abort(_("cannot use --stdio with --cmdserver"))
5247 raise util.Abort(_("cannot use --stdio with --cmdserver"))
5157
5248
5158 def checkrepo():
5249 if opts["stdio"]:
5159 if repo is None:
5250 if repo is None:
5160 raise error.RepoError(_("there is no Mercurial repository here"
5251 raise error.RepoError(_("there is no Mercurial repository here"
5161 " (.hg not found)"))
5252 " (.hg not found)"))
5162
5163 if opts["stdio"]:
5164 checkrepo()
5165 s = sshserver.sshserver(ui, repo)
5253 s = sshserver.sshserver(ui, repo)
5166 s.serve_forever()
5254 s.serve_forever()
5167
5255
@@ -5232,6 +5320,7 b' class httpservice(object):'
5232 write = self.ui.write
5320 write = self.ui.write
5233 write(_('listening at http://%s%s/%s (bound to %s:%d)\n') %
5321 write(_('listening at http://%s%s/%s (bound to %s:%d)\n') %
5234 (fqaddr, port, prefix, bindaddr, self.httpd.port))
5322 (fqaddr, port, prefix, bindaddr, self.httpd.port))
5323 self.ui.flush() # avoid buffering of status message
5235
5324
5236 def run(self):
5325 def run(self):
5237 self.httpd.serve_forever()
5326 self.httpd.serve_forever()
@@ -5252,7 +5341,8 b' class httpservice(object):'
5252 ('', 'rev', [], _('show difference from revision'), _('REV')),
5341 ('', 'rev', [], _('show difference from revision'), _('REV')),
5253 ('', 'change', '', _('list the changed files of a revision'), _('REV')),
5342 ('', 'change', '', _('list the changed files of a revision'), _('REV')),
5254 ] + walkopts + subrepoopts,
5343 ] + walkopts + subrepoopts,
5255 _('[OPTION]... [FILE]...'))
5344 _('[OPTION]... [FILE]...'),
5345 inferrepo=True)
5256 def status(ui, repo, *pats, **opts):
5346 def status(ui, repo, *pats, **opts):
5257 """show changed files in the working directory
5347 """show changed files in the working directory
5258
5348
@@ -5689,16 +5779,15 b' def tag(ui, repo, name1, *names, **opts)'
5689 if date:
5779 if date:
5690 date = util.parsedate(date)
5780 date = util.parsedate(date)
5691
5781
5692 if opts.get('edit'):
5782 editor = cmdutil.getcommiteditor(**opts)
5693 message = ui.edit(message, ui.username())
5694 repo.savecommitmessage(message)
5695
5783
5696 # don't allow tagging the null rev
5784 # don't allow tagging the null rev
5697 if (not opts.get('remove') and
5785 if (not opts.get('remove') and
5698 scmutil.revsingle(repo, rev_).rev() == nullrev):
5786 scmutil.revsingle(repo, rev_).rev() == nullrev):
5699 raise util.Abort(_("cannot tag null revision"))
5787 raise util.Abort(_("cannot tag null revision"))
5700
5788
5701 repo.tag(names, r, message, opts.get('local'), opts.get('user'), date)
5789 repo.tag(names, r, message, opts.get('local'), opts.get('user'), date,
5790 editor=editor)
5702 finally:
5791 finally:
5703 release(lock, wlock)
5792 release(lock, wlock)
5704
5793
@@ -5791,9 +5880,11 b' def unbundle(ui, repo, fname1, *fnames, '
5791 ('c', 'check', None,
5880 ('c', 'check', None,
5792 _('update across branches if no uncommitted changes')),
5881 _('update across branches if no uncommitted changes')),
5793 ('d', 'date', '', _('tipmost revision matching date'), _('DATE')),
5882 ('d', 'date', '', _('tipmost revision matching date'), _('DATE')),
5794 ('r', 'rev', '', _('revision'), _('REV'))],
5883 ('r', 'rev', '', _('revision'), _('REV'))
5884 ] + mergetoolopts,
5795 _('[-c] [-C] [-d DATE] [[-r] REV]'))
5885 _('[-c] [-C] [-d DATE] [[-r] REV]'))
5796 def update(ui, repo, node=None, rev=None, clean=False, date=None, check=False):
5886 def update(ui, repo, node=None, rev=None, clean=False, date=None, check=False,
5887 tool=None):
5797 """update working directory (or switch revisions)
5888 """update working directory (or switch revisions)
5798
5889
5799 Update the repository's working directory to the specified
5890 Update the repository's working directory to the specified
@@ -5874,6 +5965,8 b' def update(ui, repo, node=None, rev=None'
5874 rev = repo[repo[None].branch()].rev()
5965 rev = repo[repo[None].branch()].rev()
5875 mergemod._checkunknown(repo, repo[None], repo[rev])
5966 mergemod._checkunknown(repo, repo[None], repo[rev])
5876
5967
5968 repo.ui.setconfig('ui', 'forcemerge', tool, 'update')
5969
5877 if clean:
5970 if clean:
5878 ret = hg.clean(repo, rev)
5971 ret = hg.clean(repo, rev)
5879 else:
5972 else:
@@ -5884,7 +5977,11 b' def update(ui, repo, node=None, rev=None'
5884 ui.status(_("updating bookmark %s\n") % repo._bookmarkcurrent)
5977 ui.status(_("updating bookmark %s\n") % repo._bookmarkcurrent)
5885 elif brev in repo._bookmarks:
5978 elif brev in repo._bookmarks:
5886 bookmarks.setcurrent(repo, brev)
5979 bookmarks.setcurrent(repo, brev)
5980 ui.status(_("(activating bookmark %s)\n") % brev)
5887 elif brev:
5981 elif brev:
5982 if repo._bookmarkcurrent:
5983 ui.status(_("(leaving bookmark %s)\n") %
5984 repo._bookmarkcurrent)
5888 bookmarks.unsetcurrent(repo)
5985 bookmarks.unsetcurrent(repo)
5889
5986
5890 return ret
5987 return ret
@@ -5908,7 +6005,7 b' def verify(ui, repo):'
5908 """
6005 """
5909 return hg.verify(repo)
6006 return hg.verify(repo)
5910
6007
5911 @command('version', [])
6008 @command('version', [], norepo=True)
5912 def version_(ui):
6009 def version_(ui):
5913 """output version and copyright information"""
6010 """output version and copyright information"""
5914 ui.write(_("Mercurial Distributed SCM (version %s)\n")
6011 ui.write(_("Mercurial Distributed SCM (version %s)\n")
@@ -5921,10 +6018,14 b' def version_(ui):'
5921 "not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.\n"
6018 "not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.\n"
5922 ))
6019 ))
5923
6020
5924 norepo = ("clone init version help debugcommands debugcomplete"
6021 ui.note(_("\nEnabled extensions:\n\n"))
5925 " debugdate debuginstall debugfsinfo debugpushkey debugwireargs"
6022 if ui.verbose:
5926 " debugknown debuggetbundle debugbundle")
6023 # format names and versions into columns
5927 optionalrepo = ("identify paths serve config showconfig debugancestor debugdag"
6024 names = []
5928 " debugdata debugindex debugindexdot debugrevlog")
6025 vers = []
5929 inferrepo = ("add addremove annotate cat commit diff grep forget log parents"
6026 for name, module in extensions.extensions():
5930 " remove resolve status debugwalk")
6027 names.append(name)
6028 vers.append(extensions.moduleversion(module))
6029 maxnamelen = max(len(n) for n in names)
6030 for i, name in enumerate(names):
6031 ui.write(" %-*s %s\n" % (maxnamelen, name, vers[i]))
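The new verbose `hg version` output above pads extension names to a common width before printing versions. A small standalone sketch of that column formatting (the names and versions below are illustrative, not real extension data):

```python
# Column formatting as in the verbose extension listing added above.
# The names/versions are made up for illustration.
names = ['rebase', 'mq', 'color']
vers = ['internal', 'internal', '1.0']
maxnamelen = max(len(n) for n in names)
lines = [" %-*s %s" % (maxnamelen, name, vers[i])
         for i, name in enumerate(names)]
print('\n'.join(lines))
```

`%-*s` left-justifies each name to the width of the longest one, so the version column lines up.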
@@ -9,37 +9,6 b' from i18n import _'
9 import error, util
9 import error, util
10 import os, errno
10 import os, errno
11
11
12 class sortdict(dict):
13 'a simple sorted dictionary'
14 def __init__(self, data=None):
15 self._list = []
16 if data:
17 self.update(data)
18 def copy(self):
19 return sortdict(self)
20 def __setitem__(self, key, val):
21 if key in self:
22 self._list.remove(key)
23 self._list.append(key)
24 dict.__setitem__(self, key, val)
25 def __iter__(self):
26 return self._list.__iter__()
27 def update(self, src):
28 for k in src:
29 self[k] = src[k]
30 def clear(self):
31 dict.clear(self)
32 self._list = []
33 def items(self):
34 return [(k, self[k]) for k in self._list]
35 def __delitem__(self, key):
36 dict.__delitem__(self, key)
37 self._list.remove(key)
38 def keys(self):
39 return self._list
40 def iterkeys(self):
41 return self._list.__iter__()
42
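The `sortdict` class deleted here is relocated to `util` rather than removed outright (the later hunks switch callers to `util.sortdict`). A trimmed sketch of the insertion-ordered behavior the config code depends on:

```python
class sortdict(dict):
    """dict iterating keys in insertion order; re-setting a key moves it
    to the end (trimmed sketch of the class moved into util here)"""
    def __init__(self, data=None):
        self._list = []
        if data:
            self.update(data)
    def __setitem__(self, key, val):
        if key in self:
            self._list.remove(key)
        self._list.append(key)
        dict.__setitem__(self, key, val)
    def __iter__(self):
        return iter(self._list)
    def update(self, src):
        for k in src:
            self[k] = src[k]
    def items(self):
        return [(k, self[k]) for k in self._list]

d = sortdict()
d['b'] = 1
d['a'] = 2
d['b'] = 3  # re-insertion moves 'b' behind 'a'
```

This is why config sections and items print back in the order they were read, regardless of hash order.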
43 class config(object):
12 class config(object):
44 def __init__(self, data=None):
13 def __init__(self, data=None):
45 self._data = {}
14 self._data = {}
@@ -65,7 +34,7 b' class config(object):'
65 del self._source[(s, n)]
34 del self._source[(s, n)]
66 for s in src:
35 for s in src:
67 if s not in self:
36 if s not in self:
68 self._data[s] = sortdict()
37 self._data[s] = util.sortdict()
69 self._data[s].update(src._data[s])
38 self._data[s].update(src._data[s])
70 self._source.update(src._source)
39 self._source.update(src._source)
71 def get(self, section, item, default=None):
40 def get(self, section, item, default=None):
@@ -91,7 +60,7 b' class config(object):'
91 return self._data.get(section, {}).items()
60 return self._data.get(section, {}).items()
92 def set(self, section, item, value, source=""):
61 def set(self, section, item, value, source=""):
93 if section not in self:
62 if section not in self:
94 self._data[section] = sortdict()
63 self._data[section] = util.sortdict()
95 self._data[section][item] = value
64 self._data[section][item] = value
96 if source:
65 if source:
97 self._source[(section, item)] = source
66 self._source[(section, item)] = source
@@ -111,13 +80,13 b' class config(object):'
111 self._source.pop((section, item), None)
80 self._source.pop((section, item), None)
112
81
113 def parse(self, src, data, sections=None, remap=None, include=None):
82 def parse(self, src, data, sections=None, remap=None, include=None):
114 sectionre = util.compilere(r'\[([^\[]+)\]')
83 sectionre = util.re.compile(r'\[([^\[]+)\]')
115 itemre = util.compilere(r'([^=\s][^=]*?)\s*=\s*(.*\S|)')
84 itemre = util.re.compile(r'([^=\s][^=]*?)\s*=\s*(.*\S|)')
116 contre = util.compilere(r'\s+(\S|\S.*\S)\s*$')
85 contre = util.re.compile(r'\s+(\S|\S.*\S)\s*$')
117 emptyre = util.compilere(r'(;|#|\s*$)')
86 emptyre = util.re.compile(r'(;|#|\s*$)')
118 commentre = util.compilere(r'(;|#)')
87 commentre = util.re.compile(r'(;|#)')
119 unsetre = util.compilere(r'%unset\s+(\S+)')
88 unsetre = util.re.compile(r'%unset\s+(\S+)')
120 includere = util.compilere(r'%include\s+(\S|\S.*\S)\s*$')
89 includere = util.re.compile(r'%include\s+(\S|\S.*\S)\s*$')
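The switch above from `util.compilere` to `util.re.compile` only changes how patterns get compiled (the `util.re` wrapper adds optional re2 acceleration); the classification the patterns perform is unchanged. A simplified subset, sketched with the standard `re` module standing in for `util.re`:

```python
import re

# Simplified line classification mirroring the parser's regexes above;
# this is a sketch, not the full parser (no continuations or %include).
sectionre = re.compile(r'\[([^\[]+)\]')
itemre = re.compile(r'([^=\s][^=]*?)\s*=\s*(.*\S|)')
emptyre = re.compile(r'(;|#|\s*$)')

def classify(line):
    if sectionre.match(line):
        return 'section'
    if emptyre.match(line):
        return 'comment-or-blank'
    if itemre.match(line):
        return 'item'
    return 'malformed'
```

Note the order matters: `emptyre` must be tried before `itemre` so comment lines are never mistaken for items.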
121 section = ""
90 section = ""
122 item = None
91 item = None
123 line = 0
92 line = 0
@@ -162,7 +131,7 b' class config(object):'
162 if remap:
131 if remap:
163 section = remap.get(section, section)
132 section = remap.get(section, section)
164 if section not in self:
133 if section not in self:
165 self._data[section] = sortdict()
134 self._data[section] = util.sortdict()
166 continue
135 continue
167 m = itemre.match(l)
136 m = itemre.match(l)
168 if m:
137 if m:
@@ -13,6 +13,7 b' import os, errno, stat'
13 import obsolete as obsmod
13 import obsolete as obsmod
14 import repoview
14 import repoview
15 import fileset
15 import fileset
16 import revlog
16
17
17 propertycache = util.propertycache
18 propertycache = util.propertycache
18
19
@@ -63,10 +64,87 b' class basectx(object):'
63 for f in sorted(self._manifest):
64 for f in sorted(self._manifest):
64 yield f
65 yield f
65
66
67 def _manifestmatches(self, match, s):
68 """generate a new manifest filtered by the match argument
69
70 This method is for internal use only and mainly exists to provide an
71 object oriented way for other contexts to customize the manifest
72 generation.
73 """
74 if match.always():
75 return self.manifest().copy()
76
77 files = match.files()
78 if (match.matchfn == match.exact or
79 (not match.anypats() and util.all(fn in self for fn in files))):
80 return self.manifest().intersectfiles(files)
81
82 mf = self.manifest().copy()
83 for fn in mf.keys():
84 if not match(fn):
85 del mf[fn]
86 return mf
87
88 def _matchstatus(self, other, s, match, listignored, listclean,
89 listunknown):
90 """return match.always if match is none
91
92 This internal method provides a way for child objects to override the
93 match operator.
94 """
95 return match or matchmod.always(self._repo.root, self._repo.getcwd())
96
97 def _prestatus(self, other, s, match, listignored, listclean, listunknown):
98 """provide a hook to allow child objects to preprocess status results
99
100 For example, this allows other contexts, such as workingctx, to query
101 the dirstate before comparing the manifests.
102 """
103 # load earliest manifest first for caching reasons
104 if self.rev() < other.rev():
105 self.manifest()
106 return s
107
108 def _poststatus(self, other, s, match, listignored, listclean, listunknown):
109 """provide a hook to allow child objects to postprocess status results
110
111 For example, this allows other contexts, such as workingctx, to filter
112 suspect symlinks in the case of FAT32 and NTFS filesytems.
113 """
114 return s
115
116 def _buildstatus(self, other, s, match, listignored, listclean,
117 listunknown):
118 """build a status with respect to another context"""
119 mf1 = other._manifestmatches(match, s)
120 mf2 = self._manifestmatches(match, s)
121
122 modified, added, clean = [], [], []
123 deleted, unknown, ignored = s[3], [], []
124 withflags = mf1.withflags() | mf2.withflags()
125 for fn, mf2node in mf2.iteritems():
126 if fn in mf1:
127 if (fn not in deleted and
128 ((fn in withflags and mf1.flags(fn) != mf2.flags(fn)) or
129 (mf1[fn] != mf2node and
130 (mf2node or self[fn].cmp(other[fn]))))):
131 modified.append(fn)
132 elif listclean:
133 clean.append(fn)
134 del mf1[fn]
135 elif fn not in deleted:
136 added.append(fn)
137 removed = mf1.keys()
138
139 return [modified, added, removed, deleted, unknown, ignored, clean]
140
66 @propertycache
141 @propertycache
67 def substate(self):
142 def substate(self):
68 return subrepo.state(self, self._repo.ui)
143 return subrepo.state(self, self._repo.ui)
69
144
145 def subrev(self, subpath):
146 return self.substate[subpath][1]
147
70 def rev(self):
148 def rev(self):
71 return self._rev
149 return self._rev
72 def node(self):
150 def node(self):
@@ -185,8 +263,7 b' class basectx(object):'
185 if ctx2 is not None:
263 if ctx2 is not None:
186 ctx2 = self._repo[ctx2]
264 ctx2 = self._repo[ctx2]
187 diffopts = patch.diffopts(self._repo.ui, opts)
265 diffopts = patch.diffopts(self._repo.ui, opts)
188 return patch.diff(self._repo, ctx2.node(), self.node(),
266 return patch.diff(self._repo, ctx2, self, match=match, opts=diffopts)
189 match=match, opts=diffopts)
190
267
191 @propertycache
268 @propertycache
192 def _dirs(self):
269 def _dirs(self):
@@ -198,19 +275,81 b' class basectx(object):'
198 def dirty(self):
275 def dirty(self):
199 return False
276 return False
200
277
278 def status(self, other=None, match=None, listignored=False,
279 listclean=False, listunknown=False, listsubrepos=False):
280 """return status of files between two nodes or node and working
281 directory.
282
283 If other is None, compare this node with working directory.
284
285 returns (modified, added, removed, deleted, unknown, ignored, clean)
286 """
287
288 ctx1 = self
289 ctx2 = self._repo[other]
290
291 # This next code block is, admittedly, fragile logic that tests for
292 # reversing the contexts and wouldn't need to exist if it weren't for
293 # the fast (and common) code path of comparing the working directory
294 # with its first parent.
295 #
296 # What we're aiming for here is the ability to call:
297 #
298 # workingctx.status(parentctx)
299 #
300 # If we always built the manifest for each context and compared those,
301 # then we'd be done. But the special case of the above call means we
302 # just copy the manifest of the parent.
303 reversed = False
304 if (not isinstance(ctx1, changectx)
305 and isinstance(ctx2, changectx)):
306 reversed = True
307 ctx1, ctx2 = ctx2, ctx1
308
309 r = [[], [], [], [], [], [], []]
310 match = ctx2._matchstatus(ctx1, r, match, listignored, listclean,
311 listunknown)
312 r = ctx2._prestatus(ctx1, r, match, listignored, listclean, listunknown)
313 r = ctx2._buildstatus(ctx1, r, match, listignored, listclean,
314 listunknown)
315 r = ctx2._poststatus(ctx1, r, match, listignored, listclean,
316 listunknown)
317
318 if reversed:
319 r[1], r[2], r[3], r[4] = r[2], r[1], r[4], r[3]
320
321 if listsubrepos:
322 for subpath, sub in scmutil.itersubrepos(ctx1, ctx2):
323 rev2 = ctx2.subrev(subpath)
324 try:
325 submatch = matchmod.narrowmatcher(subpath, match)
326 s = sub.status(rev2, match=submatch, ignored=listignored,
327 clean=listclean, unknown=listunknown,
328 listsubrepos=True)
329 for rfiles, sfiles in zip(r, s):
330 rfiles.extend("%s/%s" % (subpath, f) for f in sfiles)
331 except error.LookupError:
332 self._repo.ui.status(_("skipping missing "
333 "subrepository: %s\n") % subpath)
334
335 for l in r:
336 l.sort()
337
338 # we return a tuple to signify that this list isn't changing
339 return tuple(r)
340
341
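The fast-path swap in the hunk above (trading ctx1/ctx2, then flipping the result lists) can be sketched in isolation. `flip_status` is a hypothetical standalone helper, not Mercurial's API; it only restates the index swap from the `reversed` branch:

```python
# Sketch of reversing a status result: when the comparison direction is
# flipped, files "added" in one direction are "removed" in the other,
# and "deleted" swaps with "unknown".
def flip_status(r):
    """r is [modified, added, removed, deleted, unknown, ignored, clean]."""
    r = list(r)
    r[1], r[2], r[3], r[4] = r[2], r[1], r[4], r[3]
    return r

status = [['m.txt'], ['a.txt'], ['r.txt'], ['d.txt'], ['u.txt'], [], []]
flipped = flip_status(status)
assert flipped[1] == ['r.txt'] and flipped[2] == ['a.txt']
```

Note that "modified", "ignored", and "clean" are direction-independent, which is why only the middle four lists are exchanged.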
201 def makememctx(repo, parents, text, user, date, branch, files, store,
342 def makememctx(repo, parents, text, user, date, branch, files, store,
202 editor=None):
343 editor=None):
203 def getfilectx(repo, memctx, path):
344 def getfilectx(repo, memctx, path):
204 data, (islink, isexec), copied = store.getfile(path)
345 data, (islink, isexec), copied = store.getfile(path)
205 return memfilectx(path, data, islink=islink, isexec=isexec,
346 return memfilectx(repo, path, data, islink=islink, isexec=isexec,
206 copied=copied)
347 copied=copied, memctx=memctx)
207 extra = {}
348 extra = {}
208 if branch:
349 if branch:
209 extra['branch'] = encoding.fromlocal(branch)
350 extra['branch'] = encoding.fromlocal(branch)
210 ctx = memctx(repo, parents, text, files, getfilectx, user,
351 ctx = memctx(repo, parents, text, files, getfilectx, user,
211 date, extra)
352 date, extra, editor)
212 if editor:
213 ctx._text = editor(repo, ctx, [])
214 return ctx
353 return ctx
215
354
216 class changectx(basectx):
355 class changectx(basectx):
@@ -823,14 +962,7 b' class committablectx(basectx):'
823 if user:
962 if user:
824 self._user = user
963 self._user = user
825 if changes:
964 if changes:
826 self._status = list(changes[:4])
965 self._status = changes
827 self._unknown = changes[4]
828 self._ignored = changes[5]
829 self._clean = changes[6]
830 else:
831 self._unknown = None
832 self._ignored = None
833 self._clean = None
834
966
835 self._extra = {}
967 self._extra = {}
836 if extra:
968 if extra:
@@ -850,9 +982,6 b' class committablectx(basectx):'
850 def __nonzero__(self):
982 def __nonzero__(self):
851 return True
983 return True
852
984
853 def __contains__(self, key):
854 return self._repo.dirstate[key] not in "?r"
855
856 def _buildflagfunc(self):
985 def _buildflagfunc(self):
857 # Create a fallback function for getting file flags when the
986 # Create a fallback function for getting file flags when the
858 # filesystem doesn't support them
987 # filesystem doesn't support them
@@ -891,7 +1020,7 b' class committablectx(basectx):'
891
1020
892 @propertycache
1021 @propertycache
893 def _manifest(self):
1022 def _manifest(self):
894 """generate a manifest corresponding to the working directory"""
1023 """generate a manifest corresponding to the values in self._status"""
895
1024
896 man = self._parents[0].manifest().copy()
1025 man = self._parents[0].manifest().copy()
897 if len(self._parents) > 1:
1026 if len(self._parents) > 1:
@@ -905,7 +1034,7 b' class committablectx(basectx):'
905
1034
906 copied = self._repo.dirstate.copies()
1035 copied = self._repo.dirstate.copies()
907 ff = self._flagfunc
1036 ff = self._flagfunc
908 modified, added, removed, deleted = self._status
1037 modified, added, removed, deleted = self._status[:4]
909 for i, l in (("a", added), ("m", modified)):
1038 for i, l in (("a", added), ("m", modified)):
910 for f in l:
1039 for f in l:
911 orig = copied.get(f, f)
1040 orig = copied.get(f, f)
@@ -923,7 +1052,7 b' class committablectx(basectx):'
923
1052
924 @propertycache
1053 @propertycache
925 def _status(self):
1054 def _status(self):
926 return self._repo.status()[:4]
1055 return self._repo.status()
927
1056
928 @propertycache
1057 @propertycache
929 def _user(self):
1058 def _user(self):
@@ -933,21 +1062,8 b' class committablectx(basectx):'
933 def _date(self):
1062 def _date(self):
934 return util.makedate()
1063 return util.makedate()
935
1064
936 def status(self, ignored=False, clean=False, unknown=False):
1065 def subrev(self, subpath):
937 """Explicit status query
1066 return None
938 Unless this method is used to query the working copy status, the
939 _status property will implicitly read the status using its default
940 arguments."""
941 stat = self._repo.status(ignored=ignored, clean=clean, unknown=unknown)
942 self._unknown = self._ignored = self._clean = None
943 if unknown:
944 self._unknown = stat[4]
945 if ignored:
946 self._ignored = stat[5]
947 if clean:
948 self._clean = stat[6]
949 self._status = stat[:4]
950 return stat
951
1067
952 def user(self):
1068 def user(self):
953 return self._user or self._repo.ui.username()
1069 return self._user or self._repo.ui.username()
@@ -967,14 +1083,11 b' class committablectx(basectx):'
967 def deleted(self):
1083 def deleted(self):
968 return self._status[3]
1084 return self._status[3]
969 def unknown(self):
1085 def unknown(self):
970 assert self._unknown is not None # must call status first
1086 return self._status[4]
971 return self._unknown
972 def ignored(self):
1087 def ignored(self):
973 assert self._ignored is not None # must call status first
1088 return self._status[5]
974 return self._ignored
975 def clean(self):
1089 def clean(self):
976 assert self._clean is not None # must call status first
1090 return self._status[6]
977 return self._clean
978 def branch(self):
1091 def branch(self):
979 return encoding.tolocal(self._extra['branch'])
1092 return encoding.tolocal(self._extra['branch'])
980 def closesbranch(self):
1093 def closesbranch(self):
@@ -1069,6 +1182,9 b' class workingctx(committablectx):'
1069 if d[f] != 'r':
1182 if d[f] != 'r':
1070 yield f
1183 yield f
1071
1184
1185 def __contains__(self, key):
1186 return self._repo.dirstate[key] not in "?r"
1187
1072 @propertycache
1188 @propertycache
1073 def _parents(self):
1189 def _parents(self):
1074 p = self._repo.dirstate.parents()
1190 p = self._repo.dirstate.parents()
@@ -1180,6 +1296,170 b' class workingctx(committablectx):'
1180 finally:
1296 finally:
1181 wlock.release()
1297 wlock.release()
1182
1298
1299 def _filtersuspectsymlink(self, files):
1300 if not files or self._repo.dirstate._checklink:
1301 return files
1302
1303 # Symlink placeholders may get non-symlink-like contents
1304 # via user error or dereferencing by NFS or Samba servers,
1305 # so we filter out any placeholders that don't look like a
1306 # symlink
1307 sane = []
1308 for f in files:
1309 if self.flags(f) == 'l':
1310 d = self[f].data()
1311 if d == '' or len(d) >= 1024 or '\n' in d or util.binary(d):
1312 self._repo.ui.debug('ignoring suspect symlink placeholder'
1313 ' "%s"\n' % f)
1314 continue
1315 sane.append(f)
1316 return sane
1317
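As a rough illustration of the placeholder heuristic above, here is a hypothetical standalone predicate (not Mercurial's code; it substitutes a simple NUL-byte check for `util.binary`):

```python
# A "sane" symlink placeholder is non-empty, shorter than 1024 bytes,
# single-line, and not binary-looking -- anything else is assumed to be
# real file content that ended up in the placeholder by accident.
def looks_like_symlink_placeholder(d):
    return not (d == '' or len(d) >= 1024 or '\n' in d or '\0' in d)

assert looks_like_symlink_placeholder('target/file')
assert not looks_like_symlink_placeholder('')          # empty
assert not looks_like_symlink_placeholder('a\nb')      # multi-line
assert not looks_like_symlink_placeholder('x' * 2048)  # too long
```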
1318 def _checklookup(self, files):
1319 # check for any possibly clean files
1320 if not files:
1321 return [], []
1322
1323 modified = []
1324 fixup = []
1325 pctx = self._parents[0]
1326 # do a full compare of any files that might have changed
1327 for f in sorted(files):
1328 if (f not in pctx or self.flags(f) != pctx.flags(f)
1329 or pctx[f].cmp(self[f])):
1330 modified.append(f)
1331 else:
1332 fixup.append(f)
1333
1334 # update dirstate for files that are actually clean
1335 if fixup:
1336 try:
1337 # updating the dirstate is optional
1338 # so we don't wait on the lock
1339 normal = self._repo.dirstate.normal
1340 wlock = self._repo.wlock(False)
1341 try:
1342 for f in fixup:
1343 normal(f)
1344 finally:
1345 wlock.release()
1346 except error.LockError:
1347 pass
1348 return modified, fixup
1349
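The modified/fixup split performed by `_checklookup` can be sketched with plain dictionaries standing in for contexts (a hypothetical helper, not Mercurial's API): files whose cheap stat check was inconclusive are content-compared, then partitioned into really-modified files and "fixup" files that turn out to be clean.

```python
# parent_data / wdir_data map filename -> content, standing in for the
# parent context and the working directory respectively.
def checklookup(files, parent_data, wdir_data):
    modified, fixup = [], []
    for f in sorted(files):
        if f not in parent_data or parent_data[f] != wdir_data.get(f):
            modified.append(f)   # content really differs (or file is new)
        else:
            fixup.append(f)      # actually clean; dirstate can be refreshed
    return modified, fixup

m, fx = checklookup(['a', 'b', 'c'],
                    {'a': 'old', 'b': 'same'},
                    {'a': 'new', 'b': 'same', 'c': 'x'})
assert m == ['a', 'c'] and fx == ['b']
```

In the real method, the fixup list is then written back to the dirstate opportunistically, skipping the update if the write lock cannot be taken.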
1350 def _manifestmatches(self, match, s):
1351 """Slow path for workingctx
1352
1353 The fast path is when we compare the working directory to its parent
1354 which means this function is comparing with a non-parent; therefore we
1355 need to build a manifest and return what matches.
1356 """
1357 mf = self._repo['.']._manifestmatches(match, s)
1358 modified, added, removed = s[0:3]
1359 for f in modified + added:
1360 mf[f] = None
1361 mf.set(f, self.flags(f))
1362 for f in removed:
1363 if f in mf:
1364 del mf[f]
1365 return mf
1366
1367 def _prestatus(self, other, s, match, listignored, listclean, listunknown):
1368 """override the parent hook with a dirstate query
1369
1370 We use this prestatus hook to populate the status with information from
1371 the dirstate.
1372 """
1373 # doesn't need to call super; if that changes, be aware that super
1374 # calls self.manifest which would slow down the common case of calling
1375 # status against a workingctx's parent
1376 return self._dirstatestatus(match, listignored, listclean, listunknown)
1377
1378 def _poststatus(self, other, s, match, listignored, listclean, listunknown):
1379 """override the parent hook with a filter for suspect symlinks
1380
1381 We use this poststatus hook to filter out symlinks that might have
1382 accidentally ended up with the entire contents of the file they are
1383 supposed to be linking to.
1384 """
1385 s[0] = self._filtersuspectsymlink(s[0])
1386 self._status = s[:]
1387 return s
1388
1389 def _dirstatestatus(self, match=None, ignored=False, clean=False,
1390 unknown=False):
1391 '''Gets the status from the dirstate -- internal use only.'''
1392 listignored, listclean, listunknown = ignored, clean, unknown
1393 match = match or matchmod.always(self._repo.root, self._repo.getcwd())
1394 subrepos = []
1395 if '.hgsub' in self:
1396 subrepos = sorted(self.substate)
1397 s = self._repo.dirstate.status(match, subrepos, listignored,
1398 listclean, listunknown)
1399 cmp, modified, added, removed, deleted, unknown, ignored, clean = s
1400
1401 # check for any possibly clean files
1402 if cmp:
1403 modified2, fixup = self._checklookup(cmp)
1404 modified += modified2
1405
1406 # update dirstate for files that are actually clean
1407 if fixup and listclean:
1408 clean += fixup
1409
1410 return [modified, added, removed, deleted, unknown, ignored, clean]
1411
1412 def _buildstatus(self, other, s, match, listignored, listclean,
1413 listunknown):
1414 """build a status with respect to another context
1415
1416 This includes logic for maintaining the fast path of status when
1417 comparing the working directory against its parent, which is to skip
1418 building a new manifest when self (working directory) is comparing
1419 against its parent (repo['.']).
1420 """
1421 if other != self._repo['.']:
1422 s = super(workingctx, self)._buildstatus(other, s, match,
1423 listignored, listclean,
1424 listunknown)
1425 return s
1426
1427 def _matchstatus(self, other, s, match, listignored, listclean,
1428 listunknown):
1429 """override the match method with a filter for directory patterns
1430
1431 We use inheritance to customize the match.bad method only in cases of
1432 workingctx since it belongs only to the working directory when
1433 comparing against the parent changeset.
1434
1435 If we aren't comparing against the working directory's parent, then we
1436 just use the default match object sent to us.
1437 """
1438 superself = super(workingctx, self)
1439 match = superself._matchstatus(other, s, match, listignored, listclean,
1440 listunknown)
1441 if other != self._repo['.']:
1442 def bad(f, msg):
1443 # 'f' may be a directory pattern from 'match.files()',
1444 # so 'f not in ctx1' is not enough
1445 if f not in other and f not in other.dirs():
1446 self._repo.ui.warn('%s: %s\n' %
1447 (self._repo.dirstate.pathto(f), msg))
1448 match.bad = bad
1449 return match
1450
1451 def status(self, other='.', match=None, listignored=False,
1452 listclean=False, listunknown=False, listsubrepos=False):
1453 # yet to be determined: what to do if 'other' is a 'workingctx' or a
1454 # 'memctx'?
1455 s = super(workingctx, self).status(other, match, listignored, listclean,
1456 listunknown, listsubrepos)
1457 # calling 'super' subtly reversed the contexts, so we flip the results
1458 # (s[1] is 'added' and s[2] is 'removed')
1459 s = list(s)
1460 s[1], s[2] = s[2], s[1]
1461 return tuple(s)
1462
1183 class committablefilectx(basefilectx):
1463 class committablefilectx(basefilectx):
1184 """A committablefilectx provides common functionality for a file context
1464 """A committablefilectx provides common functionality for a file context
1185 that wants the ability to commit, e.g. workingfilectx or memfilectx."""
1465 that wants the ability to commit, e.g. workingfilectx or memfilectx."""
@@ -1259,7 +1539,7 b' class workingfilectx(committablefilectx)'
1259 # invert comparison to reuse the same code path
1539 # invert comparison to reuse the same code path
1260 return fctx.cmp(self)
1540 return fctx.cmp(self)
1261
1541
1262 class memctx(object):
1542 class memctx(committablectx):
1263 """Use memctx to perform in-memory commits via localrepo.commitctx().
1543 """Use memctx to perform in-memory commits via localrepo.commitctx().
1264
1544
1265 Revision information is supplied at initialization time while
1545 Revision information is supplied at initialization time while
@@ -1287,73 +1567,25 b' class memctx(object):'
1287 is a dictionary of metadata or is left empty.
1567 is a dictionary of metadata or is left empty.
1288 """
1568 """
1289 def __init__(self, repo, parents, text, files, filectxfn, user=None,
1569 def __init__(self, repo, parents, text, files, filectxfn, user=None,
1290 date=None, extra=None):
1570 date=None, extra=None, editor=False):
1291 self._repo = repo
1571 super(memctx, self).__init__(repo, text, user, date, extra)
1292 self._rev = None
1572 self._rev = None
1293 self._node = None
1573 self._node = None
1294 self._text = text
1295 self._date = date and util.parsedate(date) or util.makedate()
1296 self._user = user
1297 parents = [(p or nullid) for p in parents]
1574 parents = [(p or nullid) for p in parents]
1298 p1, p2 = parents
1575 p1, p2 = parents
1299 self._parents = [changectx(self._repo, p) for p in (p1, p2)]
1576 self._parents = [changectx(self._repo, p) for p in (p1, p2)]
1300 files = sorted(set(files))
1577 files = sorted(set(files))
1301 self._status = [files, [], [], [], []]
1578 self._status = [files, [], [], [], []]
1302 self._filectxfn = filectxfn
1579 self._filectxfn = filectxfn
1580 self.substate = None
1303
1581
1304 self._extra = extra and extra.copy() or {}
1582 self._extra = extra and extra.copy() or {}
1305 if self._extra.get('branch', '') == '':
1583 if self._extra.get('branch', '') == '':
1306 self._extra['branch'] = 'default'
1584 self._extra['branch'] = 'default'
1307
1585
1308 def __str__(self):
1586 if editor:
1309 return str(self._parents[0]) + "+"
1587 self._text = editor(self._repo, self, [])
1310
1588 self._repo.savecommitmessage(self._text)
1311 def __int__(self):
1312 return self._rev
1313
1314 def __nonzero__(self):
1315 return True
1316
1317 def __getitem__(self, key):
1318 return self.filectx(key)
1319
1320 def p1(self):
1321 return self._parents[0]
1322 def p2(self):
1323 return self._parents[1]
1324
1325 def user(self):
1326 return self._user or self._repo.ui.username()
1327 def date(self):
1328 return self._date
1329 def description(self):
1330 return self._text
1331 def files(self):
1332 return self.modified()
1333 def modified(self):
1334 return self._status[0]
1335 def added(self):
1336 return self._status[1]
1337 def removed(self):
1338 return self._status[2]
1339 def deleted(self):
1340 return self._status[3]
1341 def unknown(self):
1342 return self._status[4]
1343 def ignored(self):
1344 return self._status[5]
1345 def clean(self):
1346 return self._status[6]
1347 def branch(self):
1348 return encoding.tolocal(self._extra['branch'])
1349 def extra(self):
1350 return self._extra
1351 def flags(self, f):
1352 return self[f].flags()
1353
1354 def parents(self):
1355 """return contexts for each parent changeset"""
1356 return self._parents
1357
1589
1358 def filectx(self, path, filelog=None):
1590 def filectx(self, path, filelog=None):
1359 """get a file context from the working directory"""
1591 """get a file context from the working directory"""
@@ -1363,12 +1595,34 b' class memctx(object):'
1363 """commit context to the repo"""
1595 """commit context to the repo"""
1364 return self._repo.commitctx(self)
1596 return self._repo.commitctx(self)
1365
1597
1366 class memfilectx(object):
1598 @propertycache
1599 def _manifest(self):
1600 """generate a manifest based on the return values of filectxfn"""
1601
1602 # keep this simple for now; just worry about p1
1603 pctx = self._parents[0]
1604 man = pctx.manifest().copy()
1605
1606 for f, fnode in man.iteritems():
1607 p1node = nullid
1608 p2node = nullid
1609 p = pctx[f].parents()
1610 if len(p) > 0:
1611 p1node = p[0].node()
1612 if len(p) > 1:
1613 p2node = p[1].node()
1614 man[f] = revlog.hash(self[f].data(), p1node, p2node)
1615
1616 return man
1617
1618
1619 class memfilectx(committablefilectx):
1367 """memfilectx represents an in-memory file to commit.
1620 """memfilectx represents an in-memory file to commit.
1368
1621
1369 See memctx for more details.
1622 See memctx and commitablefilectx for more details.
1370 """
1623 """
1371 def __init__(self, path, data, islink=False, isexec=False, copied=None):
1624 def __init__(self, repo, path, data, islink=False,
1625 isexec=False, copied=None, memctx=None):
1372 """
1626 """
1373 path is the normalized file path relative to repository root.
1627 path is the normalized file path relative to repository root.
1374 data is the file content as a string.
1628 data is the file content as a string.
@@ -1376,21 +1630,17 b' class memfilectx(object):'
1376 isexec is True if the file is executable.
1630 isexec is True if the file is executable.
1377 copied is the source file path if current file was copied in the
1631 copied is the source file path if current file was copied in the
1378 revision being committed, or None."""
1632 revision being committed, or None."""
1379 self._path = path
1633 super(memfilectx, self).__init__(repo, path, None, memctx)
1380 self._data = data
1634 self._data = data
1381 self._flags = (islink and 'l' or '') + (isexec and 'x' or '')
1635 self._flags = (islink and 'l' or '') + (isexec and 'x' or '')
1382 self._copied = None
1636 self._copied = None
1383 if copied:
1637 if copied:
1384 self._copied = (copied, nullid)
1638 self._copied = (copied, nullid)
1385
1639
1386 def __nonzero__(self):
1387 return True
1388 def __str__(self):
1389 return "%s@%s" % (self.path(), self._changectx)
1390 def path(self):
1391 return self._path
1392 def data(self):
1640 def data(self):
1393 return self._data
1641 return self._data
1642 def size(self):
1643 return len(self.data())
1394 def flags(self):
1644 def flags(self):
1395 return self._flags
1645 return self._flags
1396 def isexec(self):
1646 def isexec(self):
@@ -24,13 +24,17 b' These imports will not be delayed:'
24 b = __import__(a)
24 b = __import__(a)
25 '''
25 '''
26
26
27 import __builtin__, os
27 import __builtin__, os, sys
28 _origimport = __import__
28 _origimport = __import__
29
29
30 nothing = object()
30 nothing = object()
31
31
32 try:
32 try:
33 _origimport(__builtin__.__name__, {}, {}, None, -1)
33 # Python 3 doesn't have implicit relative imports nor level -1.
34 level = -1
35 if sys.version_info[0] >= 3:
36 level = 0
37 _origimport(__builtin__.__name__, {}, {}, None, level)
34 except TypeError: # no level argument
38 except TypeError: # no level argument
35 def _import(name, globals, locals, fromlist, level):
39 def _import(name, globals, locals, fromlist, level):
36 "call _origimport with no level argument"
40 "call _origimport with no level argument"
@@ -55,7 +59,7 b' def _hgextimport(importfunc, name, globa'
55
59
56 class _demandmod(object):
60 class _demandmod(object):
57 """module demand-loader and proxy"""
61 """module demand-loader and proxy"""
58 def __init__(self, name, globals, locals, level=-1):
62 def __init__(self, name, globals, locals, level=level):
59 if '.' in name:
63 if '.' in name:
60 head, rest = name.split('.', 1)
64 head, rest = name.split('.', 1)
61 after = [rest]
65 after = [rest]
@@ -105,7 +109,7 b' class _demandmod(object):'
105 self._load()
109 self._load()
106 setattr(self._module, attr, val)
110 setattr(self._module, attr, val)
107
111
108 def _demandimport(name, globals=None, locals=None, fromlist=None, level=-1):
112 def _demandimport(name, globals=None, locals=None, fromlist=None, level=level):
109 if not locals or name in ignore or fromlist == ('*',):
113 if not locals or name in ignore or fromlist == ('*',):
110 # these cases we can't really delay
114 # these cases we can't really delay
111 return _hgextimport(_import, name, globals, locals, fromlist, level)
115 return _hgextimport(_import, name, globals, locals, fromlist, level)
@@ -138,25 +138,12 b' static int dirs_fromdict(PyObject *dirs,'
138 return -1;
138 return -1;
139 }
139 }
140 if (skipchar) {
140 if (skipchar) {
141 PyObject *st;
141 if (!dirstate_tuple_check(value)) {
142
143 if (!PyTuple_Check(value) ||
144 PyTuple_GET_SIZE(value) == 0) {
145 PyErr_SetString(PyExc_TypeError,
142 PyErr_SetString(PyExc_TypeError,
146 "expected non-empty tuple");
143 "expected a dirstate tuple");
147 return -1;
144 return -1;
148 }
145 }
149
146 if (((dirstateTupleObject *)value)->state == skipchar)
150 st = PyTuple_GET_ITEM(value, 0);
151
152 if (!PyString_Check(st) || PyString_GET_SIZE(st) == 0) {
153 PyErr_SetString(PyExc_TypeError,
154 "expected non-empty string "
155 "at tuple index 0");
156 return -1;
157 }
158
159 if (PyString_AS_STRING(st)[0] == skipchar)
160 continue;
147 continue;
161 }
148 }
162
149
@@ -14,6 +14,8 b' propertycache = util.propertycache'
14 filecache = scmutil.filecache
14 filecache = scmutil.filecache
15 _rangemask = 0x7fffffff
15 _rangemask = 0x7fffffff
16
16
17 dirstatetuple = parsers.dirstatetuple
18
17 class repocache(filecache):
19 class repocache(filecache):
18 """filecache for files in .hg/"""
20 """filecache for files in .hg/"""
19 def join(self, obj, fname):
21 def join(self, obj, fname):
@@ -335,7 +337,7 b' class dirstate(object):'
335 if oldstate in "?r" and "_dirs" in self.__dict__:
337 if oldstate in "?r" and "_dirs" in self.__dict__:
336 self._dirs.addpath(f)
338 self._dirs.addpath(f)
337 self._dirty = True
339 self._dirty = True
338 self._map[f] = (state, mode, size, mtime)
340 self._map[f] = dirstatetuple(state, mode, size, mtime)
339
341
340 def normal(self, f):
342 def normal(self, f):
341 '''Mark a file normal and clean.'''
343 '''Mark a file normal and clean.'''
@@ -400,7 +402,7 b' class dirstate(object):'
400 size = -1
402 size = -1
401 elif entry[0] == 'n' and entry[2] == -2: # other parent
403 elif entry[0] == 'n' and entry[2] == -2: # other parent
402 size = -2
404 size = -2
403 self._map[f] = ('r', 0, size, 0)
405 self._map[f] = dirstatetuple('r', 0, size, 0)
404 if size == 0 and f in self._copymap:
406 if size == 0 and f in self._copymap:
405 del self._copymap[f]
407 del self._copymap[f]
406
408
@@ -493,9 +495,9 b' class dirstate(object):'
493 self._map[f] = oldmap[f]
495 self._map[f] = oldmap[f]
494 else:
496 else:
495 if 'x' in allfiles.flags(f):
497 if 'x' in allfiles.flags(f):
496 self._map[f] = ('n', 0777, -1, 0)
498 self._map[f] = dirstatetuple('n', 0777, -1, 0)
497 else:
499 else:
498 self._map[f] = ('n', 0666, -1, 0)
500 self._map[f] = dirstatetuple('n', 0666, -1, 0)
499 self._pl = (parent, nullid)
501 self._pl = (parent, nullid)
500 self._dirty = True
502 self._dirty = True
501
503
@@ -821,7 +823,18 b' class dirstate(object):'
821 uadd(fn)
823 uadd(fn)
822 continue
824 continue
823
825
824 state, mode, size, time = dmap[fn]
826 # This is equivalent to 'state, mode, size, time = dmap[fn]' but not
827 # written like that for performance reasons. dmap[fn] is not a
828 # Python tuple in compiled builds. The CPython UNPACK_SEQUENCE
829 # opcode has fast paths when the value to be unpacked is a tuple or
830 # a list, but falls back to creating a full-fledged iterator in
831 # general. That is much slower than simply accessing and storing the
832 # tuple members one by one.
833 t = dmap[fn]
834 state = t[0]
835 mode = t[1]
836 size = t[2]
837 time = t[3]
825
838
826 if not st and state in "nma":
839 if not st and state in "nma":
827 dadd(fn)
840 dadd(fn)
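To make the performance comment concrete, here is a toy sequence type -- a stand-in for the C `dirstatetuple`, not the real object -- read with indexed access the way the patch does. UNPACK_SEQUENCE is fast for real tuples and lists, but unpacking an arbitrary sequence type goes through a generic iterator.

```python
class Entry(object):
    """Sequence-like but not a tuple, like the compiled dirstatetuple."""
    def __init__(self, *vals):
        self._vals = vals
    def __getitem__(self, i):
        return self._vals[i]
    def __len__(self):
        return len(self._vals)

e = Entry('n', 0o644, 12, 0)

# Indexed access, as in the patch: no iterator is created.
state, mode, size, mtime = e[0], e[1], e[2], e[3]
assert (state, mode, size, mtime) == ('n', 0o644, 12, 0)
```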
@@ -341,6 +341,10 b' def checkheads(repo, remote, outgoing, r'
341 if branch not in ('default', None):
341 if branch not in ('default', None):
342 error = _("push creates new remote head %s "
342 error = _("push creates new remote head %s "
343 "on branch '%s'!") % (short(dhs[0]), branch)
343 "on branch '%s'!") % (short(dhs[0]), branch)
344 elif repo[dhs[0]].bookmarks():
345 error = _("push creates new remote head %s "
346 "with bookmark '%s'!") % (
347 short(dhs[0]), repo[dhs[0]].bookmarks()[0])
344 else:
348 else:
345 error = _("push creates new remote head %s!"
349 error = _("push creates new remote head %s!"
346 ) % short(dhs[0])
350 ) % short(dhs[0])
@@ -225,7 +225,8 b' def _runcatch(req):'
225 # it might be anything, for example a string
225 # it might be anything, for example a string
226 reason = inst.reason
226 reason = inst.reason
227 ui.warn(_("abort: error: %s\n") % reason)
227 ui.warn(_("abort: error: %s\n") % reason)
228 elif util.safehasattr(inst, "args") and inst.args[0] == errno.EPIPE:
228 elif (util.safehasattr(inst, "args")
229 and inst.args and inst.args[0] == errno.EPIPE):
229 if ui.debugflag:
230 if ui.debugflag:
230 ui.warn(_("broken pipe\n"))
231 ui.warn(_("broken pipe\n"))
231 elif getattr(inst, "strerror", None):
232 elif getattr(inst, "strerror", None):
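The guard added in this hunk can be exercised on its own. `is_epipe` below is a hypothetical helper (using `hasattr` in place of Mercurial's `util.safehasattr`); without the emptiness check, an exception carrying an empty `args` tuple would raise IndexError inside the error handler itself.

```python
import errno

def is_epipe(inst):
    # Check that args is non-empty before indexing, as the patch does.
    return (hasattr(inst, "args")
            and bool(inst.args) and inst.args[0] == errno.EPIPE)

assert is_epipe(IOError(errno.EPIPE, "broken pipe"))
assert not is_epipe(IOError())  # empty args: no IndexError raised
```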
@@ -165,6 +165,99 b' def getcols(s, start, c):'
165 if colwidth(t) == c:
165 if colwidth(t) == c:
166 return t
166 return t
167
167
168 def trim(s, width, ellipsis='', leftside=False):
169 """Trim string 's' to at most 'width' columns (including 'ellipsis').
170
171 If 'leftside' is True, left side of string 's' is trimmed.
172 'ellipsis' is always placed at trimmed side.
173
174 >>> ellipsis = '+++'
175 >>> from mercurial import encoding
176 >>> encoding.encoding = 'utf-8'
177 >>> t = '1234567890'
178 >>> print trim(t, 12, ellipsis=ellipsis)
179 1234567890
180 >>> print trim(t, 10, ellipsis=ellipsis)
181 1234567890
182 >>> print trim(t, 8, ellipsis=ellipsis)
183 12345+++
184 >>> print trim(t, 8, ellipsis=ellipsis, leftside=True)
185 +++67890
186 >>> print trim(t, 8)
187 12345678
188 >>> print trim(t, 8, leftside=True)
189 34567890
190 >>> print trim(t, 3, ellipsis=ellipsis)
191 +++
192 >>> print trim(t, 1, ellipsis=ellipsis)
193 +
194 >>> u = u'\u3042\u3044\u3046\u3048\u304a' # 2 x 5 = 10 columns
195 >>> t = u.encode(encoding.encoding)
196 >>> print trim(t, 12, ellipsis=ellipsis)
197 \xe3\x81\x82\xe3\x81\x84\xe3\x81\x86\xe3\x81\x88\xe3\x81\x8a
198 >>> print trim(t, 10, ellipsis=ellipsis)
199 \xe3\x81\x82\xe3\x81\x84\xe3\x81\x86\xe3\x81\x88\xe3\x81\x8a
200 >>> print trim(t, 8, ellipsis=ellipsis)
201 \xe3\x81\x82\xe3\x81\x84+++
202 >>> print trim(t, 8, ellipsis=ellipsis, leftside=True)
203 +++\xe3\x81\x88\xe3\x81\x8a
204 >>> print trim(t, 5)
205 \xe3\x81\x82\xe3\x81\x84
206 >>> print trim(t, 5, leftside=True)
207 \xe3\x81\x88\xe3\x81\x8a
208 >>> print trim(t, 4, ellipsis=ellipsis)
209 +++
210 >>> print trim(t, 4, ellipsis=ellipsis, leftside=True)
211 +++
212 >>> t = '\x11\x22\x33\x44\x55\x66\x77\x88\x99\xaa' # invalid byte sequence
213 >>> print trim(t, 12, ellipsis=ellipsis)
214 \x11\x22\x33\x44\x55\x66\x77\x88\x99\xaa
215 >>> print trim(t, 10, ellipsis=ellipsis)
216 \x11\x22\x33\x44\x55\x66\x77\x88\x99\xaa
217 >>> print trim(t, 8, ellipsis=ellipsis)
218 \x11\x22\x33\x44\x55+++
219 >>> print trim(t, 8, ellipsis=ellipsis, leftside=True)
220 +++\x66\x77\x88\x99\xaa
221 >>> print trim(t, 8)
222 \x11\x22\x33\x44\x55\x66\x77\x88
223 >>> print trim(t, 8, leftside=True)
224 \x33\x44\x55\x66\x77\x88\x99\xaa
225 >>> print trim(t, 3, ellipsis=ellipsis)
226 +++
227 >>> print trim(t, 1, ellipsis=ellipsis)
228 +
229 """
230 try:
231 u = s.decode(encoding)
232 except UnicodeDecodeError:
233 if len(s) <= width: # trimming is not needed
234 return s
235 width -= len(ellipsis)
236 if width <= 0: # not enough room even for ellipsis
237 return ellipsis[:width + len(ellipsis)]
238 if leftside:
239 return ellipsis + s[-width:]
240 return s[:width] + ellipsis
241
242 if ucolwidth(u) <= width: # trimming is not needed
243 return s
244
245 width -= len(ellipsis)
246 if width <= 0: # not enough room even for ellipsis
247 return ellipsis[:width + len(ellipsis)]
248
249 if leftside:
250 uslice = lambda i: u[i:]
251 concat = lambda s: ellipsis + s
252 else:
253 uslice = lambda i: u[:-i]
254 concat = lambda s: s + ellipsis
255 for i in xrange(1, len(u)):
256 usub = uslice(i)
257 if ucolwidth(usub) <= width:
258 return concat(usub.encode(encoding))
259 return ellipsis # not enough room for multi-column characters
260
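The byte-level fallback branch of `trim` (taken on UnicodeDecodeError) follows the same ellipsis-budget arithmetic as the column-aware path; here it is as a hypothetical standalone sketch, operating purely on string length:

```python
def byte_trim(s, width, ellipsis='', leftside=False):
    if len(s) <= width:          # trimming is not needed
        return s
    width -= len(ellipsis)       # budget remaining after the ellipsis
    if width <= 0:               # not enough room even for the ellipsis
        return ellipsis[:width + len(ellipsis)]
    if leftside:
        return ellipsis + s[-width:]
    return s[:width] + ellipsis

assert byte_trim('1234567890', 8, '+++') == '12345+++'
assert byte_trim('1234567890', 8, '+++', leftside=True) == '+++67890'
assert byte_trim('1234567890', 12, '+++') == '1234567890'
```

The `ellipsis[:width + len(ellipsis)]` slice handles the degenerate case where even the ellipsis must be shortened, matching the `trim(t, 1, ...)` doctests above.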
168 def lower(s):
261 def lower(s):
169 "best-effort encoding-aware case-folding of local string s"
262 "best-effort encoding-aware case-folding of local string s"
170 try:
263 try:
@@ -98,3 +98,22 b' class SignatureError(Exception):'
98 class PushRaced(RuntimeError):
98 class PushRaced(RuntimeError):
99 """An exception raised during unbundling that indicate a push race"""
99 """An exception raised during unbundling that indicate a push race"""
100
100
101 # bundle2 related errors
102 class BundleValueError(ValueError):
103 """error raised when bundle2 cannot be processed"""
104
105 def __init__(self, parttype=None, params=()):
106 self.parttype = parttype
107 self.params = params
108 if self.parttype is None:
109 msg = 'Stream Parameter'
110 else:
111 msg = parttype
112 if self.params:
113 msg = '%s - %s' % (msg, ', '.join(self.params))
114 ValueError.__init__(self, msg)
115
116 class ReadOnlyPartError(RuntimeError):
117 """error raised when code tries to alter a part being generated"""
118 pass
119
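The message construction in `BundleValueError` above can be exercised standalone; this is the same logic restated outside Mercurial so that `str()` of the exception can be seen directly:

```python
class BundleValueError(ValueError):
    """error raised when bundle2 cannot be processed"""
    def __init__(self, parttype=None, params=()):
        self.parttype = parttype
        self.params = params
        if self.parttype is None:
            msg = 'Stream Parameter'   # error outside any part
        else:
            msg = parttype
        if self.params:
            msg = '%s - %s' % (msg, ', '.join(self.params))
        ValueError.__init__(self, msg)

assert str(BundleValueError()) == 'Stream Parameter'
assert str(BundleValueError('B2X:CHANGEGROUP', ('v2',))) == 'B2X:CHANGEGROUP - v2'
```

The part-type name in the example is taken from the push code later in this diff; any offending parameters are appended after a dash.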
@@ -9,7 +9,7 b' from i18n import _'
9 from node import hex, nullid
9 from node import hex, nullid
10 import errno, urllib
10 import errno, urllib
11 import util, scmutil, changegroup, base85, error
11 import util, scmutil, changegroup, base85, error
12 import discovery, phases, obsolete, bookmarks, bundle2
12 import discovery, phases, obsolete, bookmarks, bundle2, pushkey
13
13
14 def readbundle(ui, fh, fname, vfs=None):
14 def readbundle(ui, fh, fname, vfs=None):
15 header = changegroup.readexactly(fh, 4)
15 header = changegroup.readexactly(fh, 4)
@@ -61,6 +61,9 b' class pushoperation(object):'
61 self.newbranch = newbranch
61 self.newbranch = newbranch
62 # did a local lock get acquired?
62 # did a local lock get acquired?
63 self.locallocked = None
63 self.locallocked = None
64 # step already performed
65 # (used to check what steps have been already performed through bundle2)
66 self.stepsdone = set()
64 # Integer version of the push result
67 # Integer version of the push result
65 # - None means nothing to push
68 # - None means nothing to push
66 # - 0 means HTTP error
69 # - 0 means HTTP error
@@ -128,16 +131,11 b' def push(repo, remote, force=False, revs'
128 lock = pushop.remote.lock()
131 lock = pushop.remote.lock()
129 try:
132 try:
130 _pushdiscovery(pushop)
133 _pushdiscovery(pushop)
131 if _pushcheckoutgoing(pushop):
134 if (pushop.repo.ui.configbool('experimental', 'bundle2-exp',
132 pushop.repo.prepushoutgoinghooks(pushop.repo,
135 False)
133 pushop.remote,
136 and pushop.remote.capable('bundle2-exp')):
134 pushop.outgoing)
137 _pushbundle2(pushop)
135 if (pushop.repo.ui.configbool('experimental', 'bundle2-exp',
138 _pushchangeset(pushop)
136 False)
137 and pushop.remote.capable('bundle2-exp')):
138 _pushbundle2(pushop)
139 else:
140 _pushchangeset(pushop)
141 _pushcomputecommonheads(pushop)
139 _pushcomputecommonheads(pushop)
142 _pushsyncphase(pushop)
140 _pushsyncphase(pushop)
143 _pushobsolete(pushop)
141 _pushobsolete(pushop)
@@ -203,57 +201,73 b' def _pushcheckoutgoing(pushop):'
203 newbm)
201 newbm)
204 return True
202 return True
205
203
204 def _pushb2ctx(pushop, bundler):
205 """handle changegroup push through bundle2
206
207 addchangegroup result is stored in the ``pushop.ret`` attribute.
208 """
209 if 'changesets' in pushop.stepsdone:
210 return
211 pushop.stepsdone.add('changesets')
212 # Send known heads to the server for race detection.
214 if not _pushcheckoutgoing(pushop):
215 return
216 pushop.repo.prepushoutgoinghooks(pushop.repo,
217 pushop.remote,
218 pushop.outgoing)
219 if not pushop.force:
220 bundler.newpart('B2X:CHECK:HEADS', data=iter(pushop.remoteheads))
221 cg = changegroup.getlocalbundle(pushop.repo, 'push', pushop.outgoing)
222 cgpart = bundler.newpart('B2X:CHANGEGROUP', data=cg.getchunks())
223 def handlereply(op):
224 """extract addchangroup returns from server reply"""
225 cgreplies = op.records.getreplies(cgpart.id)
226 assert len(cgreplies['changegroup']) == 1
227 pushop.ret = cgreplies['changegroup'][0]['return']
228 return handlereply
229
230 # list of functions that may decide to add parts to an outgoing bundle2
231 bundle2partsgenerators = [_pushb2ctx]
232
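The ``bundle2partsgenerators`` list is a plugin point: each entry gets the push operation and the bundler, may add parts, and may return a reply handler that is invoked once the server's reply bundle has been processed. A simplified model of that registry (toy code, not the real hg API):

```python
# Toy registry mirroring bundle2partsgenerators: each generator may add
# parts to the bundler and return a reply handler (or None).
bundle2partsgenerators = []

def partsgenerator(func):
    """Decorator appending a generator to the registry."""
    bundle2partsgenerators.append(func)
    return func

class Bundler:
    def __init__(self):
        self.parts = []
    def newpart(self, name, data=None):
        self.parts.append(name)

@partsgenerator
def pushchangegroup(pushop, bundler):
    bundler.newpart('B2X:CHANGEGROUP')
    def handlereply(records):
        # stash the server's addchangegroup return value on the push op
        pushop['ret'] = records['changegroup']
    return handlereply

pushop = {}
bundler = Bundler()
handlers = [gen(pushop, bundler) for gen in bundle2partsgenerators]
for handler in handlers:
    if handler is not None:  # a generator may return None (nothing to do)
        handler({'changegroup': 1})
```

This is the shape `_pushbundle2` relies on below: generate all parts first, send one bundle, then feed the reply to every collected handler.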
206 def _pushbundle2(pushop):
233 def _pushbundle2(pushop):
207 """push data to the remote using bundle2
234 """push data to the remote using bundle2
208
235
209 The only currently supported type of data is changegroup but this will
236 The only currently supported type of data is changegroup but this will
210 evolve in the future."""
237 evolve in the future."""
211 # Send known head to the server for race detection.
238 bundler = bundle2.bundle20(pushop.ui, bundle2.bundle2caps(pushop.remote))
212 capsblob = urllib.unquote(pushop.remote.capable('bundle2-exp'))
213 caps = bundle2.decodecaps(capsblob)
214 bundler = bundle2.bundle20(pushop.ui, caps)
215 # create reply capability
239 # create reply capability
216 capsblob = bundle2.encodecaps(pushop.repo.bundle2caps)
240 capsblob = bundle2.encodecaps(pushop.repo.bundle2caps)
217 bundler.addpart(bundle2.bundlepart('b2x:replycaps', data=capsblob))
241 bundler.newpart('b2x:replycaps', data=capsblob)
218 if not pushop.force:
242 replyhandlers = []
219 part = bundle2.bundlepart('B2X:CHECK:HEADS',
243 for partgen in bundle2partsgenerators:
220 data=iter(pushop.remoteheads))
244 ret = partgen(pushop, bundler)
221 bundler.addpart(part)
245 replyhandlers.append(ret)
222 extrainfo = _pushbundle2extraparts(pushop, bundler)
246 # do not push if nothing to push
223 # add the changegroup bundle
247 if bundler.nbparts <= 1:
224 cg = changegroup.getlocalbundle(pushop.repo, 'push', pushop.outgoing)
248 return
225 cgpart = bundle2.bundlepart('B2X:CHANGEGROUP', data=cg.getchunks())
226 bundler.addpart(cgpart)
227 stream = util.chunkbuffer(bundler.getchunks())
249 stream = util.chunkbuffer(bundler.getchunks())
228 try:
250 try:
229 reply = pushop.remote.unbundle(stream, ['force'], 'push')
251 reply = pushop.remote.unbundle(stream, ['force'], 'push')
230 except bundle2.UnknownPartError, exc:
252 except error.BundleValueError, exc:
231 raise util.Abort('missing support for %s' % exc)
253 raise util.Abort('missing support for %s' % exc)
232 try:
254 try:
233 op = bundle2.processbundle(pushop.repo, reply)
255 op = bundle2.processbundle(pushop.repo, reply)
234 except bundle2.UnknownPartError, exc:
256 except error.BundleValueError, exc:
235 raise util.Abort('missing support for %s' % exc)
257 raise util.Abort('missing support for %s' % exc)
236 cgreplies = op.records.getreplies(cgpart.id)
258 for rephand in replyhandlers:
237 assert len(cgreplies['changegroup']) == 1
259 rephand(op)
238 pushop.ret = cgreplies['changegroup'][0]['return']
239 _pushbundle2extrareply(pushop, op, extrainfo)
240
241 def _pushbundle2extraparts(pushop, bundler):
242 """hook function to let extensions add parts
243
244 Return a dict to let extensions pass data to the reply processing.
245 """
246 return {}
247
248 def _pushbundle2extrareply(pushop, op, extrainfo):
249 """hook function to let extensions react to part replies
250
251 The dict from _pushbundle2extrareply is fed to this function.
252 """
253 pass
254
260
255 def _pushchangeset(pushop):
261 def _pushchangeset(pushop):
256 """Make the actual push of changeset bundle to remote repo"""
262 """Make the actual push of changeset bundle to remote repo"""
263 if 'changesets' in pushop.stepsdone:
264 return
265 pushop.stepsdone.add('changesets')
266 if not _pushcheckoutgoing(pushop):
267 return
268 pushop.repo.prepushoutgoinghooks(pushop.repo,
269 pushop.remote,
270 pushop.outgoing)
257 outgoing = pushop.outgoing
271 outgoing = pushop.outgoing
258 unbundle = pushop.remote.capable('unbundle')
272 unbundle = pushop.remote.capable('unbundle')
259 # TODO: get bundlecaps from remote
273 # TODO: get bundlecaps from remote
@@ -330,37 +344,6 b' def _pushsyncphase(pushop):'
330 """synchronise phase information locally and remotely"""
344 """synchronise phase information locally and remotely"""
331 unfi = pushop.repo.unfiltered()
345 unfi = pushop.repo.unfiltered()
332 cheads = pushop.commonheads
346 cheads = pushop.commonheads
333 if pushop.ret:
334 # push succeed, synchronize target of the push
335 cheads = pushop.outgoing.missingheads
336 elif pushop.revs is None:
337 # All out push fails. synchronize all common
338 cheads = pushop.outgoing.commonheads
339 else:
340 # I want cheads = heads(::missingheads and ::commonheads)
341 # (missingheads is revs with secret changeset filtered out)
342 #
343 # This can be expressed as:
344 # cheads = ( (missingheads and ::commonheads)
345 # + (commonheads and ::missingheads))"
346 # )
347 #
348 # while trying to push we already computed the following:
349 # common = (::commonheads)
350 # missing = ((commonheads::missingheads) - commonheads)
351 #
352 # We can pick:
353 # * missingheads part of common (::commonheads)
354 common = set(pushop.outgoing.common)
355 nm = pushop.repo.changelog.nodemap
356 cheads = [node for node in pushop.revs if nm[node] in common]
357 # and
358 # * commonheads parents on missing
359 revset = unfi.set('%ln and parents(roots(%ln))',
360 pushop.outgoing.commonheads,
361 pushop.outgoing.missing)
362 cheads.extend(c.node() for c in revset)
363 pushop.commonheads = cheads
364 # even when we don't push, exchanging phase data is useful
347 # even when we don't push, exchanging phase data is useful
365 remotephases = pushop.remote.listkeys('phases')
348 remotephases = pushop.remote.listkeys('phases')
366 if (pushop.ui.configbool('ui', '_usedassubrepo', False)
349 if (pushop.ui.configbool('ui', '_usedassubrepo', False)
@@ -395,16 +378,54 b' def _pushsyncphase(pushop):'
395 # Get the list of all revs draft on remote by public here.
378 # Get the list of all revs draft on remote by public here.
396 # XXX Beware that revset break if droots is not strictly
379 # XXX Beware that revset break if droots is not strictly
397 # XXX root we may want to ensure it is but it is costly
380 # XXX root we may want to ensure it is but it is costly
398 outdated = unfi.set('heads((%ln::%ln) and public())',
381 outdated = unfi.set('heads((%ln::%ln) and public())',
399 droots, cheads)
382 droots, cheads)
400 for newremotehead in outdated:
383
401 r = pushop.remote.pushkey('phases',
384 b2caps = bundle2.bundle2caps(pushop.remote)
402 newremotehead.hex(),
385 if 'b2x:pushkey' in b2caps:
403 str(phases.draft),
386 # server supports bundle2, let's do a batched push through it
404 str(phases.public))
387 #
405 if not r:
388 # This will eventually be unified with the changesets bundle2 push
406 pushop.ui.warn(_('updating %s to public failed!\n')
389 bundler = bundle2.bundle20(pushop.ui, b2caps)
407 % newremotehead)
390 capsblob = bundle2.encodecaps(pushop.repo.bundle2caps)
391 bundler.newpart('b2x:replycaps', data=capsblob)
392 part2node = []
393 enc = pushkey.encode
394 for newremotehead in outdated:
395 part = bundler.newpart('b2x:pushkey')
396 part.addparam('namespace', enc('phases'))
397 part.addparam('key', enc(newremotehead.hex()))
398 part.addparam('old', enc(str(phases.draft)))
399 part.addparam('new', enc(str(phases.public)))
400 part2node.append((part.id, newremotehead))
401 stream = util.chunkbuffer(bundler.getchunks())
402 try:
403 reply = pushop.remote.unbundle(stream, ['force'], 'push')
404 op = bundle2.processbundle(pushop.repo, reply)
405 except error.BundleValueError, exc:
406 raise util.Abort('missing support for %s' % exc)
407 for partid, node in part2node:
408 partrep = op.records.getreplies(partid)
409 results = partrep['pushkey']
410 assert len(results) <= 1
411 msg = None
412 if not results:
413 msg = _('server ignored update of %s to public!\n') % node
414 elif not int(results[0]['return']):
415 msg = _('updating %s to public failed!\n') % node
416 if msg is not None:
417 pushop.ui.warn(msg)
418
419 else:
420 # fall back to independent pushkey command
421 for newremotehead in outdated:
422 r = pushop.remote.pushkey('phases',
423 newremotehead.hex(),
424 str(phases.draft),
425 str(phases.public))
426 if not r:
427 pushop.ui.warn(_('updating %s to public failed!\n')
428 % newremotehead)
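The batched pushkey branch above maps each generated part id to the head it updates, then inspects the server's per-part reply records to decide which warning, if any, to emit. A simplified model of that reply bookkeeping (plain dicts stand in for hg's records object):

```python
def checkpushkeyreplies(part2node, records):
    """Model of the reply handling above: for each (partid, node) pair,
    look up the server's pushkey reply record and collect failure messages."""
    msgs = []
    for partid, node in part2node:
        results = records.get(partid, [])
        assert len(results) <= 1          # at most one reply per part
        if not results:
            msgs.append('server ignored update of %s to public!' % node)
        elif not int(results[0]['return']):
            msgs.append('updating %s to public failed!' % node)
    return msgs

msgs = checkpushkeyreplies(
    [(1, 'aaa'), (2, 'bbb'), (3, 'ccc')],
    {1: [{'return': 1}],    # accepted
     2: [{'return': 0}]})   # rejected; part 3 got no reply at all
```

Distinguishing "no reply" from "reply with a false return" is the point: an old or misbehaving server that silently drops the part gets its own warning.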
408
429
409 def _localphasemove(pushop, nodes, phase=phases.public):
430 def _localphasemove(pushop, nodes, phase=phases.public):
410 """move <nodes> to <phase> in the local source repo"""
431 """move <nodes> to <phase> in the local source repo"""
@@ -568,14 +589,15 b' def _pullbundle2(pullop):'
568 """pull data using bundle2
589 """pull data using bundle2
569
590
570 For now, the only supported data is the changegroup.
591 For now, the only supported data is the changegroup.
571 kwargs = {'bundlecaps': set(['HG2X'])}
592 remotecaps = bundle2.bundle2caps(pullop.remote)
572 capsblob = bundle2.encodecaps(pullop.repo.bundle2caps)
593 kwargs = {'bundlecaps': caps20to10(pullop.repo)}
573 kwargs['bundlecaps'].add('bundle2=' + urllib.quote(capsblob))
574 # pulling changegroup
594 # pulling changegroup
575 pullop.todosteps.remove('changegroup')
595 pullop.todosteps.remove('changegroup')
576
596
577 kwargs['common'] = pullop.common
597 kwargs['common'] = pullop.common
578 kwargs['heads'] = pullop.heads or pullop.rheads
598 kwargs['heads'] = pullop.heads or pullop.rheads
599 if 'b2x:listkeys' in remotecaps:
600 kwargs['listkeys'] = ['phase']
579 if not pullop.fetch:
601 if not pullop.fetch:
580 pullop.repo.ui.status(_("no changes found\n"))
602 pullop.repo.ui.status(_("no changes found\n"))
581 pullop.cgresult = 0
603 pullop.cgresult = 0
@@ -588,13 +610,18 b' def _pullbundle2(pullop):'
588 bundle = pullop.remote.getbundle('pull', **kwargs)
610 bundle = pullop.remote.getbundle('pull', **kwargs)
589 try:
611 try:
590 op = bundle2.processbundle(pullop.repo, bundle, pullop.gettransaction)
612 op = bundle2.processbundle(pullop.repo, bundle, pullop.gettransaction)
591 except bundle2.UnknownPartError, exc:
613 except error.BundleValueError, exc:
592 raise util.Abort('missing support for %s' % exc)
614 raise util.Abort('missing support for %s' % exc)
593
615
594 if pullop.fetch:
616 if pullop.fetch:
595 assert len(op.records['changegroup']) == 1
617 assert len(op.records['changegroup']) == 1
596 pullop.cgresult = op.records['changegroup'][0]['return']
618 pullop.cgresult = op.records['changegroup'][0]['return']
597
619
620 # processing phases change
621 for namespace, value in op.records['listkeys']:
622 if namespace == 'phases':
623 _pullapplyphases(pullop, value)
624
598 def _pullbundle2extraprepare(pullop, kwargs):
625 def _pullbundle2extraprepare(pullop, kwargs):
599 """hook function so that extensions can extend the getbundle call"""
626 """hook function so that extensions can extend the getbundle call"""
600 pass
627 pass
@@ -624,8 +651,8 b' def _pullchangeset(pullop):'
624 cg = pullop.remote.changegroup(pullop.fetch, 'pull')
651 cg = pullop.remote.changegroup(pullop.fetch, 'pull')
625 elif not pullop.remote.capable('changegroupsubset'):
652 elif not pullop.remote.capable('changegroupsubset'):
626 raise util.Abort(_("partial pull cannot be done because "
653 raise util.Abort(_("partial pull cannot be done because "
627 "other repository doesn't support "
654 "other repository doesn't support "
628 "changegroupsubset."))
655 "changegroupsubset."))
629 else:
656 else:
630 cg = pullop.remote.changegroupsubset(pullop.fetch, pullop.heads, 'pull')
657 cg = pullop.remote.changegroupsubset(pullop.fetch, pullop.heads, 'pull')
631 pullop.cgresult = changegroup.addchangegroup(pullop.repo, cg, 'pull',
658 pullop.cgresult = changegroup.addchangegroup(pullop.repo, cg, 'pull',
@@ -633,8 +660,12 b' def _pullchangeset(pullop):'
633
660
634 def _pullphase(pullop):
661 def _pullphase(pullop):
635 # Get remote phases data from remote
662 # Get remote phases data from remote
663 remotephases = pullop.remote.listkeys('phases')
664 _pullapplyphases(pullop, remotephases)
665
666 def _pullapplyphases(pullop, remotephases):
667 """apply phase movement from observed remote state"""
636 pullop.todosteps.remove('phases')
668 pullop.todosteps.remove('phases')
637 remotephases = pullop.remote.listkeys('phases')
638 publishing = bool(remotephases.get('publishing', False))
669 publishing = bool(remotephases.get('publishing', False))
639 if remotephases and not publishing:
670 if remotephases and not publishing:
640 # remote is new and unpublishing
671 # remote is new and unpublishing
@@ -672,6 +703,13 b' def _pullobsolete(pullop):'
672 pullop.repo.invalidatevolatilesets()
703 pullop.repo.invalidatevolatilesets()
673 return tr
704 return tr
674
705
706 def caps20to10(repo):
707 """return a set with appropriate options to use bundle20 during getbundle"""
708 caps = set(['HG2X'])
709 capsblob = bundle2.encodecaps(repo.bundle2caps)
710 caps.add('bundle2=' + urllib.quote(capsblob))
711 return caps
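``caps20to10`` advertises bundle2 support through the old bundle10 capability channel by URL-quoting the encoded capability blob into a ``bundle2=`` entry. A runnable sketch of the same encoding, using the Python 3 spelling of ``urllib.quote`` and a placeholder blob (the real one comes from ``bundle2.encodecaps(repo.bundle2caps)``):

```python
import urllib.parse

def caps20to10(capsblob):
    """Sketch of caps20to10: advertise bundle2 via the legacy bundlecaps
    set. `capsblob` stands in for bundle2.encodecaps(repo.bundle2caps)."""
    caps = {'HG2X'}
    # quote the blob so newlines/colons survive the flat capability list
    caps.add('bundle2=' + urllib.parse.quote(capsblob))
    return caps

caps = caps20to10('HG2X\nb2x:listkeys')
```

Quoting matters because the capability list is itself a flat, whitespace-separated format; the blob's newlines and colons must not leak into it.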
712
675 def getbundle(repo, source, heads=None, common=None, bundlecaps=None,
713 def getbundle(repo, source, heads=None, common=None, bundlecaps=None,
676 **kwargs):
714 **kwargs):
677 """return a full bundle (with potentially multiple kind of parts)
715 """return a full bundle (with potentially multiple kind of parts)
@@ -691,6 +729,9 b' def getbundle(repo, source, heads=None, '
691 cg = changegroup.getbundle(repo, source, heads=heads,
729 cg = changegroup.getbundle(repo, source, heads=heads,
692 common=common, bundlecaps=bundlecaps)
730 common=common, bundlecaps=bundlecaps)
693 if bundlecaps is None or 'HG2X' not in bundlecaps:
731 if bundlecaps is None or 'HG2X' not in bundlecaps:
732 if kwargs:
733 raise ValueError(_('unsupported getbundle arguments: %s')
734 % ', '.join(sorted(kwargs.keys())))
694 return cg
735 return cg
695 # very crude first implementation,
736 # very crude first implementation,
696 # the bundle API will change and the generation will be done lazily.
737 # the bundle API will change and the generation will be done lazily.
@@ -701,8 +742,13 b' def getbundle(repo, source, heads=None, '
701 b2caps.update(bundle2.decodecaps(blob))
742 b2caps.update(bundle2.decodecaps(blob))
702 bundler = bundle2.bundle20(repo.ui, b2caps)
743 bundler = bundle2.bundle20(repo.ui, b2caps)
703 if cg:
744 if cg:
704 part = bundle2.bundlepart('b2x:changegroup', data=cg.getchunks())
745 bundler.newpart('b2x:changegroup', data=cg.getchunks())
705 bundler.addpart(part)
746 listkeys = kwargs.get('listkeys', ())
747 for namespace in listkeys:
748 part = bundler.newpart('b2x:listkeys')
749 part.addparam('namespace', namespace)
750 keys = repo.listkeys(namespace).items()
751 part.data = pushkey.encodekeys(keys)
706 _getbundleextrapart(bundler, repo, source, heads=heads, common=common,
752 _getbundleextrapart(bundler, repo, source, heads=heads, common=common,
707 bundlecaps=bundlecaps, **kwargs)
753 bundlecaps=bundlecaps, **kwargs)
708 return util.chunkbuffer(bundler.getchunks())
754 return util.chunkbuffer(bundler.getchunks())
@@ -138,7 +138,7 b' def wrapcommand(table, command, wrapper)'
138 where orig is the original (wrapped) function, and *args, **kwargs
138 where orig is the original (wrapped) function, and *args, **kwargs
139 are the arguments passed to it.
139 are the arguments passed to it.
140 '''
140 '''
141 assert util.safehasattr(wrapper, '__call__')
141 assert callable(wrapper)
142 aliases, entry = cmdutil.findcmd(command, table)
142 aliases, entry = cmdutil.findcmd(command, table)
143 for alias, e in table.iteritems():
143 for alias, e in table.iteritems():
144 if e is entry:
144 if e is entry:
@@ -191,12 +191,12 b' def wrapfunction(container, funcname, wr'
191 your end users, you should play nicely with others by using the
191 your end users, you should play nicely with others by using the
192 subclass trick.
192 subclass trick.
193 '''
193 '''
194 assert util.safehasattr(wrapper, '__call__')
194 assert callable(wrapper)
195 def wrap(*args, **kwargs):
195 def wrap(*args, **kwargs):
196 return wrapper(origfn, *args, **kwargs)
196 return wrapper(origfn, *args, **kwargs)
197
197
198 origfn = getattr(container, funcname)
198 origfn = getattr(container, funcname)
199 assert util.safehasattr(origfn, '__call__')
199 assert callable(origfn)
200 setattr(container, funcname, wrap)
200 setattr(container, funcname, wrap)
201 return origfn
201 return origfn
202
202
@@ -367,3 +367,16 b' def enabled(shortname=True):'
367 exts[ename] = doc.splitlines()[0].strip()
367 exts[ename] = doc.splitlines()[0].strip()
368
368
369 return exts
369 return exts
370
371 def moduleversion(module):
372 '''return version information from given module as a string'''
373 if (util.safehasattr(module, 'getversion')
374 and callable(module.getversion)):
375 version = module.getversion()
376 elif util.safehasattr(module, '__version__'):
377 version = module.__version__
378 else:
379 version = ''
380 if isinstance(version, (list, tuple)):
381 version = '.'.join(str(o) for o in version)
382 return version
@@ -77,7 +77,7 b' def fancyopts(args, options, state, gnu='
77 # copy defaults to state
77 # copy defaults to state
78 if isinstance(default, list):
78 if isinstance(default, list):
79 state[name] = default[:]
79 state[name] = default[:]
80 elif getattr(default, '__call__', False):
80 elif callable(default):
81 state[name] = None
81 state[name] = None
82 else:
82 else:
83 state[name] = default
83 state[name] = default
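The fancyopts hunk above swaps ``getattr(default, '__call__', False)`` for the clearer ``callable(default)``; both detect the same "lazy default" case. A small sketch of the three-way default handling (illustrative function name, not the fancyopts API):

```python
def defaultstate(default):
    """Sketch of the default handling above: copy list defaults, map
    callable defaults to None, keep everything else as-is."""
    if isinstance(default, list):
        return default[:]   # copy so a shared default list is never mutated
    elif callable(default):
        return None         # callable() replaces getattr(d, '__call__', False)
    else:
        return default

base = [1, 2]
copied = defaultstate(base)
```

Copying list defaults is the subtle part: without the slice, every invocation would append into the single shared default object.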
@@ -7,8 +7,9 b''
7
7
8 from node import short
8 from node import short
9 from i18n import _
9 from i18n import _
10 import util, simplemerge, match, error
10 import util, simplemerge, match, error, templater, templatekw
11 import os, tempfile, re, filecmp
11 import os, tempfile, re, filecmp
12 import tagmerge
12
13
13 def _toolstr(ui, tool, part, default=""):
14 def _toolstr(ui, tool, part, default=""):
14 return ui.config("merge-tools", tool + "." + part, default)
15 return ui.config("merge-tools", tool + "." + part, default)
@@ -169,7 +170,7 b' def _ifail(repo, mynode, orig, fcd, fco,'
169 used to resolve these conflicts."""
170 used to resolve these conflicts."""
170 return 1
171 return 1
171
172
172 def _premerge(repo, toolconf, files):
173 def _premerge(repo, toolconf, files, labels=None):
173 tool, toolpath, binary, symlink = toolconf
174 tool, toolpath, binary, symlink = toolconf
174 if symlink:
175 if symlink:
175 return 1
176 return 1
@@ -190,7 +191,7 b' def _premerge(repo, toolconf, files):'
190 (tool, premerge, _valid))
191 (tool, premerge, _valid))
191
192
192 if premerge:
193 if premerge:
193 r = simplemerge.simplemerge(ui, a, b, c, quiet=True)
194 r = simplemerge.simplemerge(ui, a, b, c, quiet=True, label=labels)
194 if not r:
195 if not r:
195 ui.debug(" premerge successful\n")
196 ui.debug(" premerge successful\n")
196 return 0
197 return 0
@@ -201,7 +202,7 b' def _premerge(repo, toolconf, files):'
201 @internaltool('merge', True,
202 @internaltool('merge', True,
202 _("merging %s incomplete! "
203 _("merging %s incomplete! "
203 "(edit conflicts, then use 'hg resolve --mark')\n"))
204 "(edit conflicts, then use 'hg resolve --mark')\n"))
204 def _imerge(repo, mynode, orig, fcd, fco, fca, toolconf, files):
205 def _imerge(repo, mynode, orig, fcd, fco, fca, toolconf, files, labels=None):
205 """
206 """
206 Uses the internal non-interactive simple merge algorithm for merging
207 Uses the internal non-interactive simple merge algorithm for merging
207 files. It will fail if there are any conflicts and leave markers in
208 files. It will fail if there are any conflicts and leave markers in
@@ -211,19 +212,28 b' def _imerge(repo, mynode, orig, fcd, fco'
211 repo.ui.warn(_('warning: internal:merge cannot merge symlinks '
212 repo.ui.warn(_('warning: internal:merge cannot merge symlinks '
212 'for %s\n') % fcd.path())
213 'for %s\n') % fcd.path())
213 return False, 1
214 return False, 1
214
215 r = _premerge(repo, toolconf, files, labels=labels)
215 r = _premerge(repo, toolconf, files)
216 if r:
216 if r:
217 a, b, c, back = files
217 a, b, c, back = files
218
218
219 ui = repo.ui
219 ui = repo.ui
220
220
221 r = simplemerge.simplemerge(ui, a, b, c, label=['local', 'other'])
221 r = simplemerge.simplemerge(ui, a, b, c, label=labels, no_minimal=True)
222 return True, r
222 return True, r
223 return False, 0
223 return False, 0
224
224
225 @internaltool('tagmerge', True,
226 _("automatic tag merging of %s failed! "
227 "(use 'hg resolve --tool internal:merge' or another merge "
228 "tool of your choice)\n"))
229 def _itagmerge(repo, mynode, orig, fcd, fco, fca, toolconf, files, labels=None):
230 """
231 Uses the internal tag merge algorithm (experimental).
232 """
233 return tagmerge.merge(repo, fcd, fco, fca)
234
225 @internaltool('dump', True)
235 @internaltool('dump', True)
226 def _idump(repo, mynode, orig, fcd, fco, fca, toolconf, files):
236 def _idump(repo, mynode, orig, fcd, fco, fca, toolconf, files, labels=None):
227 """
237 """
228 Creates three versions of the files to merge, containing the
238 Creates three versions of the files to merge, containing the
229 contents of local, other and base. These files can then be used to
239 contents of local, other and base. These files can then be used to
@@ -231,7 +241,7 b' def _idump(repo, mynode, orig, fcd, fco,'
231 ``a.txt``, these files will accordingly be named ``a.txt.local``,
241 ``a.txt``, these files will accordingly be named ``a.txt.local``,
232 ``a.txt.other`` and ``a.txt.base`` and they will be placed in the
242 ``a.txt.other`` and ``a.txt.base`` and they will be placed in the
233 same directory as ``a.txt``."""
243 same directory as ``a.txt``."""
234 r = _premerge(repo, toolconf, files)
244 r = _premerge(repo, toolconf, files, labels=labels)
235 if r:
245 if r:
236 a, b, c, back = files
246 a, b, c, back = files
237
247
@@ -242,8 +252,8 b' def _idump(repo, mynode, orig, fcd, fco,'
242 repo.wwrite(fd + ".base", fca.data(), fca.flags())
252 repo.wwrite(fd + ".base", fca.data(), fca.flags())
243 return False, r
253 return False, r
244
254
245 def _xmerge(repo, mynode, orig, fcd, fco, fca, toolconf, files):
255 def _xmerge(repo, mynode, orig, fcd, fco, fca, toolconf, files, labels=None):
246 r = _premerge(repo, toolconf, files)
256 r = _premerge(repo, toolconf, files, labels=labels)
247 if r:
257 if r:
248 tool, toolpath, binary, symlink = toolconf
258 tool, toolpath, binary, symlink = toolconf
249 a, b, c, back = files
259 a, b, c, back = files
@@ -270,7 +280,57 b' def _xmerge(repo, mynode, orig, fcd, fco'
270 return True, r
280 return True, r
271 return False, 0
281 return False, 0
272
282
273 def filemerge(repo, mynode, orig, fcd, fco, fca):
283 def _formatconflictmarker(repo, ctx, template, label, pad):
284 """Applies the given template to the ctx, prefixed by the label.
285
286 Pad is the minimum width of the label prefix, so that multiple markers
287 can have aligned templated parts.
288 """
289 if ctx.node() is None:
290 ctx = ctx.p1()
291
292 props = templatekw.keywords.copy()
293 props['templ'] = template
294 props['ctx'] = ctx
295 props['repo'] = repo
296 templateresult = template('conflictmarker', **props)
297
298 label = ('%s:' % label).ljust(pad + 1)
299 mark = '%s %s' % (label, templater.stringify(templateresult))
300
301 if mark:
302 mark = mark.splitlines()[0] # split for safety
303
304 # 8 for the prefix of conflict marker lines (e.g. '<<<<<<< ')
305 return util.ellipsis(mark, 80 - 8)
306
307 _defaultconflictmarker = ('{node|short} ' +
308 '{ifeq(tags, "tip", "", "{tags} ")}' +
309 '{if(bookmarks, "{bookmarks} ")}' +
310 '{ifeq(branch, "default", "", "{branch} ")}' +
311 '- {author|user}: {desc|firstline}')
312
313 _defaultconflictlabels = ['local', 'other']
314
315 def _formatlabels(repo, fcd, fco, labels):
316 """Formats the given labels using the conflict marker template.
317
318 Returns a list of formatted labels.
319 """
320 cd = fcd.changectx()
321 co = fco.changectx()
322
323 ui = repo.ui
324 template = ui.config('ui', 'mergemarkertemplate', _defaultconflictmarker)
325 template = templater.parsestring(template, quoted=False)
326 tmpl = templater.templater(None, cache={ 'conflictmarker' : template })
327
328 pad = max(len(labels[0]), len(labels[1]))
329
330 return [_formatconflictmarker(repo, cd, tmpl, labels[0], pad),
331 _formatconflictmarker(repo, co, tmpl, labels[1], pad)]
332
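``_formatconflictmarker`` and ``_formatlabels`` pad each label to a common width so the templated parts of the ``local``/``other`` markers line up, keep only the first line of the rendered template, and truncate to fit after the 8-character ``<<<<<<< `` prefix. A self-contained sketch of that formatting (the truncation is a crude stand-in for ``util.ellipsis``, and the names are illustrative):

```python
def formatmarker(label, detail, pad, maxwidth=80 - 8):
    """Sketch of the conflict-marker label formatting above: pad the label
    so multiple markers align, keep only the first line, and truncate so
    the line fits after the 8-character '<<<<<<< ' prefix."""
    mark = '%s %s' % (('%s:' % label).ljust(pad + 1), detail)
    mark = mark.splitlines()[0]             # split for safety
    if len(mark) > maxwidth:
        mark = mark[:maxwidth - 3] + '...'  # crude stand-in for util.ellipsis
    return mark

labels = ['local', 'other']
pad = max(len(l) for l in labels)
m1 = formatmarker('local', 'abc123 - alice: fix parser', pad)
m2 = formatmarker('other', 'def456 - bob: tweak docs', pad)
```

Aligning on the longer label is what makes the two markers' hashes and descriptions start in the same column, which is the whole point of passing ``pad`` down.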
333 def filemerge(repo, mynode, orig, fcd, fco, fca, labels=None):
274 """perform a 3-way merge in the working directory
334 """perform a 3-way merge in the working directory
275
335
276 mynode = parent node before merge
336 mynode = parent node before merge
@@ -327,8 +387,17 b' def filemerge(repo, mynode, orig, fcd, f'
327
387
328 ui.debug("my %s other %s ancestor %s\n" % (fcd, fco, fca))
388 ui.debug("my %s other %s ancestor %s\n" % (fcd, fco, fca))
329
389
390 markerstyle = ui.config('ui', 'mergemarkers', 'basic')
391 if markerstyle == 'basic':
392 formattedlabels = _defaultconflictlabels
393 else:
394 if not labels:
395 labels = _defaultconflictlabels
396
397 formattedlabels = _formatlabels(repo, fcd, fco, labels)
398
330 needcheck, r = func(repo, mynode, orig, fcd, fco, fca, toolconf,
399 needcheck, r = func(repo, mynode, orig, fcd, fco, fca, toolconf,
331 (a, b, c, back))
400 (a, b, c, back), labels=formattedlabels)
332 if not needcheck:
401 if not needcheck:
333 if r:
402 if r:
334 if onfailure:
403 if onfailure:
@@ -404,7 +404,7 b' def help_(ui, name, unknowncmd=False, fu'
404 # description
404 # description
405 if not doc:
405 if not doc:
406 rst.append(" %s\n" % _("(no help text available)"))
406 rst.append(" %s\n" % _("(no help text available)"))
407 if util.safehasattr(doc, '__call__'):
407 if callable(doc):
408 rst += [" %s\n" % l for l in doc().splitlines()]
408 rst += [" %s\n" % l for l in doc().splitlines()]
409
409
410 if not ui.verbose:
410 if not ui.verbose:
@@ -481,8 +481,11 b' def help_(ui, name, unknowncmd=False, fu'
481 rst.append('%s:\n\n' % title)
481 rst.append('%s:\n\n' % title)
482 rst.extend(minirst.maketable(sorted(matches[t]), 1))
482 rst.extend(minirst.maketable(sorted(matches[t]), 1))
483 rst.append('\n')
483 rst.append('\n')
484 if not rst:
485 msg = _('no matches')
486 hint = _('try "hg help" for a list of topics')
487 raise util.Abort(msg, hint=hint)
484 elif name and name != 'shortlist':
488 elif name and name != 'shortlist':
485 i = None
486 if unknowncmd:
489 if unknowncmd:
487 queries = (helpextcmd,)
490 queries = (helpextcmd,)
488 elif opts.get('extension'):
491 elif opts.get('extension'):
@@ -494,12 +497,16 b' def help_(ui, name, unknowncmd=False, fu'
494 for f in queries:
497 for f in queries:
495 try:
498 try:
496 rst = f(name)
499 rst = f(name)
497 i = None
498 break
500 break
499 except error.UnknownCommand, inst:
501 except error.UnknownCommand:
500 i = inst
502 pass
501 if i:
503 else:
502 raise i
504 if unknowncmd:
505 raise error.UnknownCommand(name)
506 else:
507 msg = _('no such help topic: %s') % name
508 hint = _('try "hg help --keyword %s"') % name
509 raise util.Abort(msg, hint=hint)
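The rewritten lookup above replaces the saved-exception dance with Python's ``for``/``else``: the ``else`` body runs only when the loop finishes without ``break``, i.e. when no query function succeeded. A minimal model of that control flow (toy classes, not hg's error module):

```python
class UnknownCommand(Exception):
    pass

def lookup(name, queries):
    """Model of the help-topic lookup above: try each query in order;
    the for/else body runs only when no query succeeded (no break)."""
    for f in queries:
        try:
            rst = f(name)
            break                 # success: skip the else clause
        except UnknownCommand:
            pass                  # try the next query
    else:
        raise LookupError('no such help topic: %s' % name)
    return rst

def bytopic(name):
    if name != 'commit':
        raise UnknownCommand(name)
    return 'commit help'

ok = lookup('commit', [bytopic])
try:
    lookup('nope', [bytopic])
    missing = None
except LookupError as exc:
    missing = str(exc)
```

Compared with storing the last exception in a variable and re-raising it, the ``for``/``else`` form makes the "nothing matched" path explicit and lets the error message be built fresh at that point.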
503 else:
510 else:
504 # program name
511 # program name
505 if not ui.quiet:
512 if not ui.quiet:
@@ -330,6 +330,64 b' If no suitable authentication entry is f'
330 for credentials as usual if required by the remote.
330 for credentials as usual if required by the remote.
331
331
332
332
333 ``committemplate``
334 ------------------
335
336 ``changeset`` configuration in this section is used as the template to
337 customize the text shown in the editor when committing.
338
339 In addition to the pre-defined template keywords, the commit-log-specific
340 keyword below can be used for customization:
341
342 ``extramsg``
343 String: Extra message (typically 'Leave message empty to abort
344 commit.'). This may be changed by some commands or extensions.
345
346 For example, the template configuration below shows the same text as
347 is shown by default::
348
349 [committemplate]
350 changeset = {desc}\n\n
351 HG: Enter commit message. Lines beginning with 'HG:' are removed.
352 HG: {extramsg}
353 HG: --
354 HG: user: {author}\n{ifeq(p2rev, "-1", "",
355 "HG: branch merge\n")
356 }HG: branch '{branch}'\n{if(currentbookmark,
357 "HG: bookmark '{currentbookmark}'\n") }{subrepos %
358 "HG: subrepo {subrepo}\n" }{file_adds %
359 "HG: added {file}\n" }{file_mods %
360 "HG: changed {file}\n" }{file_dels %
361 "HG: removed {file}\n" }{if(files, "",
362 "HG: no files changed\n")}
363
364 .. note::
365
366 For some problematic encodings (see :hg:`help win32mbcs` for
367 details), this customization should be configured carefully to
368 avoid showing broken characters.
369
370 For example, if a multibyte character ending with backslash (0x5c) is
371 followed by the ASCII character 'n' in the customized template, the
372 sequence of backslash and 'n' is unexpectedly treated as a line feed
373 (and the multibyte character is broken, too).
374
375 The customized template is used for the commands below (``--edit`` may be
376 required):
377
378 - :hg:`backout`
379 - :hg:`commit`
380 - :hg:`fetch` (for merge commit only)
381 - :hg:`graft`
382 - :hg:`histedit`
383 - :hg:`import`
384 - :hg:`qfold`, :hg:`qnew` and :hg:`qrefresh`
385 - :hg:`rebase`
386 - :hg:`shelve`
387 - :hg:`sign`
388 - :hg:`tag`
389 - :hg:`transplant`
390
333 ``decode/encode``
391 ``decode/encode``
334 -----------------
392 -----------------
335
393
@@ -807,7 +865,9 b' Example::'
807 ---------------
865 ---------------
808
866
809 This section configures external merge tools to use for file-level
867 This section configures external merge tools to use for file-level
810 merges.
868 merges. This section has likely been preconfigured at install time.
869 Use :hg:`config merge-tools` to check the existing configuration.
870 Also see :hg:`help merge-tools` for more details.
811
871
812 Example ``~/.hgrc``::
872 Example ``~/.hgrc``::
813
873
@@ -819,6 +879,9 b' Example ``~/.hgrc``::'
819 # Give higher priority
879 # Give higher priority
820 kdiff3.priority = 1
880 kdiff3.priority = 1
821
881
882 # Changing the priority of preconfigured tool
883 vimdiff.priority = 0
884
822 # Define new tool
885 # Define new tool
823 myHtmlTool.args = -m $local $other $base $output
886 myHtmlTool.args = -m $local $other $base $output
824 myHtmlTool.regkey = Software\FooSoftware\HtmlMerge
887 myHtmlTool.regkey = Software\FooSoftware\HtmlMerge
@@ -838,7 +901,13 b' Supported arguments:'
838 ``args``
901 ``args``
839 The arguments to pass to the tool executable. You can refer to the
902 The arguments to pass to the tool executable. You can refer to the
840 files being merged as well as the output file through these
903 files being merged as well as the output file through these
841 variables: ``$base``, ``$local``, ``$other``, ``$output``.
904 variables: ``$base``, ``$local``, ``$other``, ``$output``. The meaning
905 of ``$local`` and ``$other`` can vary depending on which action is being
906 performed. During an update or merge, ``$local`` represents the original
907 state of the file, while ``$other`` represents the commit you are updating
908 to or the commit you are merging with. During a rebase, ``$local``
909 represents the destination of the rebase, and ``$other`` represents the
910 commit being rebased.
842 Default: ``$local $base $other``
911 Default: ``$local $base $other``
843
912
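The variable substitution described above can be sketched in Python. The helper name and mapping below are illustrative assumptions, not Mercurial's actual implementation (which also handles quoting and tool lookup):

```python
# Illustrative sketch of merge-tool argument substitution (hypothetical
# helper; variable names are the ones documented above).
def expand_args(args, local, other, base, output):
    mapping = {'$local': local, '$other': other,
               '$base': base, '$output': output}
    for var, path in mapping.items():
        args = args.replace(var, path)
    return args

# With the documented default template '$local $base $other':
result = expand_args('$local $base $other',
                     'a.txt', 'b.txt', 'base.txt', 'out.txt')
assert result == 'a.txt base.txt b.txt'
```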
844 ``premerge``
913 ``premerge``
@@ -1203,6 +1272,27 b' User interface controls.'
1203 For more information on merge tools see :hg:`help merge-tools`.
1272 For more information on merge tools see :hg:`help merge-tools`.
1204 For configuring merge tools see the ``[merge-tools]`` section.
1273 For configuring merge tools see the ``[merge-tools]`` section.
1205
1274
1275 ``mergemarkers``
1276 Sets the merge conflict marker label styling. The ``detailed``
1277 style uses the ``mergemarkertemplate`` setting to style the labels.
1278 The ``basic`` style just uses 'local' and 'other' as the marker label.
1279 One of ``basic`` or ``detailed``.
1280 Default is ``basic``.
1281
1282 ``mergemarkertemplate``
1283 The template used to print the commit description next to each conflict
1284 marker during merge conflicts. See :hg:`help templates` for the template
1285 format.
1286 Defaults to showing the hash, tags, branches, bookmarks, author, and
1287 the first line of the commit description.
1288 You have to pay attention to the encodings of managed files if you
1289 use non-ASCII characters in tags, branches, bookmarks, author
1290 and/or commit descriptions. At template expansion, non-ASCII
1291 characters use the encoding specified by the ``--encoding`` global
1292 option, ``HGENCODING`` or other locale-setting environment
1293 variables. A difference in encoding between the merged file and the
1294 conflict markers can cause serious problems.
1295
1206 ``portablefilenames``
1296 ``portablefilenames``
1207 Check for portable filenames. Can be ``warn``, ``ignore`` or ``abort``.
1297 Check for portable filenames. Can be ``warn``, ``ignore`` or ``abort``.
1208 Default is ``warn``.
1298 Default is ``warn``.
@@ -66,10 +66,14 b' In addition to filters, there are some b'
66
66
67 - shortest(node)
67 - shortest(node)
68
68
69 - startswith(string, text)
70
69 - strip(text[, chars])
71 - strip(text[, chars])
70
72
71 - sub(pat, repl, expr)
73 - sub(pat, repl, expr)
72
74
75 - word(number, text[, separator])
76
73 Also, for any expression that returns a list, there is a list operator:
77 Also, for any expression that returns a list, there is a list operator:
74
78
75 - expr % "{template}"
79 - expr % "{template}"
@@ -84,6 +88,10 b' Some sample command line templates:'
84
88
85 $ hg log -r 0 --template "files: {join(files, ', ')}\n"
89 $ hg log -r 0 --template "files: {join(files, ', ')}\n"
86
90
91 - Modify each line of a commit description::
92
93 $ hg log --template "{splitlines(desc) % '**** {line}\n'}"
94
87 - Format date::
95 - Format date::
88
96
89 $ hg log -r 0 --template "{date(date, '%Y')}\n"
97 $ hg log -r 0 --template "{date(date, '%Y')}\n"
@@ -120,3 +128,11 b' Some sample command line templates:'
120 - Mark the working copy parent with '@'::
128 - Mark the working copy parent with '@'::
121
129
122 $ hg log --template "{ifcontains(rev, revset('.'), '@')}\n"
130 $ hg log --template "{ifcontains(rev, revset('.'), '@')}\n"
131
132 - Show only commit descriptions that start with "template"::
133
134 $ hg log --template "{startswith(\"template\", firstline(desc))}\n"
135
136 - Print the first word of each line of a commit message::
137
138 $ hg log --template "{word(\"0\", desc)}\n"
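The two new functions exercised above behave roughly like the following sketch. The semantics are inferred from the examples, so treat these as approximations rather than Mercurial's implementation:

```python
# Approximate Python equivalents of the startswith() and word()
# template functions (illustrative only).
def tmpl_startswith(pattern, text):
    # yields the text when it starts with the pattern, otherwise ''
    return text if text.startswith(pattern) else ''

def tmpl_word(number, text, separator=None):
    # yields the Nth (0-based) word of the text, '' when out of range
    words = text.split(separator)
    n = int(number)
    return words[n] if n < len(words) else ''

assert tmpl_startswith('template', 'template docs') == 'template docs'
assert tmpl_startswith('template', 'other docs') == ''
assert tmpl_word('0', 'first second third') == 'first'
```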
@@ -172,15 +172,15 b' def share(ui, source, dest=None, update='
172
172
173 sharedpath = srcrepo.sharedpath # if our source is already sharing
173 sharedpath = srcrepo.sharedpath # if our source is already sharing
174
174
175 root = os.path.realpath(dest)
175 destwvfs = scmutil.vfs(dest, realpath=True)
176 roothg = os.path.join(root, '.hg')
176 destvfs = scmutil.vfs(os.path.join(destwvfs.base, '.hg'), realpath=True)
177
177
178 if os.path.exists(roothg):
178 if destvfs.lexists():
179 raise util.Abort(_('destination already exists'))
179 raise util.Abort(_('destination already exists'))
180
180
181 if not os.path.isdir(root):
181 if not destwvfs.isdir():
182 os.mkdir(root)
182 destwvfs.mkdir()
183 util.makedir(roothg, notindexed=True)
183 destvfs.makedir()
184
184
185 requirements = ''
185 requirements = ''
186 try:
186 try:
@@ -190,10 +190,10 b' def share(ui, source, dest=None, update='
190 raise
190 raise
191
191
192 requirements += 'shared\n'
192 requirements += 'shared\n'
193 util.writefile(os.path.join(roothg, 'requires'), requirements)
193 destvfs.write('requires', requirements)
194 util.writefile(os.path.join(roothg, 'sharedpath'), sharedpath)
194 destvfs.write('sharedpath', sharedpath)
195
195
196 r = repository(ui, root)
196 r = repository(ui, destwvfs.base)
197
197
198 default = srcrepo.ui.config('paths', 'default')
198 default = srcrepo.ui.config('paths', 'default')
199 if default:
199 if default:
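The hunks above replace direct `os.path` calls in `share()` with `scmutil.vfs` objects. The idea can be sketched with a minimal stand-in class — a simplification for illustration, not Mercurial's actual vfs:

```python
import os
import tempfile

# Minimal stand-in for the vfs idea: a wrapper anchored at a base
# directory, so callers pass relative names instead of building
# absolute paths by hand (simplified; not scmutil.vfs itself).
class MiniVFS(object):
    def __init__(self, base, realpath=False):
        self.base = os.path.realpath(base) if realpath else base

    def join(self, name):
        return os.path.join(self.base, name)

    def lexists(self, name=''):
        return os.path.lexists(self.join(name))

    def isdir(self, name=''):
        return os.path.isdir(self.join(name))

    def makedir(self, name=''):
        os.mkdir(self.join(name))

    def write(self, name, data):
        with open(self.join(name), 'wb') as f:
            f.write(data)
```

With such a wrapper, a call like `destvfs.write('requires', requirements)` stands in for the older `util.writefile(os.path.join(roothg, 'requires'), requirements)` pattern shown in the hunk.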
@@ -311,10 +311,12 b' def clone(ui, peeropts, source, dest=Non'
311
311
312 if not dest:
312 if not dest:
313 raise util.Abort(_("empty destination path is not valid"))
313 raise util.Abort(_("empty destination path is not valid"))
314 if os.path.exists(dest):
314
315 if not os.path.isdir(dest):
315 destvfs = scmutil.vfs(dest, expandpath=True)
316 if destvfs.lexists():
317 if not destvfs.isdir():
316 raise util.Abort(_("destination '%s' already exists") % dest)
318 raise util.Abort(_("destination '%s' already exists") % dest)
317 elif os.listdir(dest):
319 elif destvfs.listdir():
318 raise util.Abort(_("destination '%s' is not empty") % dest)
320 raise util.Abort(_("destination '%s' is not empty") % dest)
319
321
320 srclock = destlock = cleandir = None
322 srclock = destlock = cleandir = None
@@ -483,7 +485,8 b' def updaterepo(repo, node, overwrite):'
483 When overwrite is set, changes are clobbered, merged else
485 When overwrite is set, changes are clobbered, merged else
484
486
485 returns stats (see pydoc mercurial.merge.applyupdates)"""
487 returns stats (see pydoc mercurial.merge.applyupdates)"""
486 return mergemod.update(repo, node, False, overwrite, None)
488 return mergemod.update(repo, node, False, overwrite, None,
489 labels=['working copy', 'destination'])
487
490
488 def update(repo, node):
491 def update(repo, node):
489 """update the working directory to node, merging linear changes"""
492 """update the working directory to node, merging linear changes"""
@@ -19,7 +19,7 b' def _pythonhook(ui, repo, name, hname, f'
19 unmodified commands (e.g. mercurial.commands.update) can
19 unmodified commands (e.g. mercurial.commands.update) can
20 be run as hooks without wrappers to convert return values.'''
20 be run as hooks without wrappers to convert return values.'''
21
21
22 if util.safehasattr(funcname, '__call__'):
22 if callable(funcname):
23 obj = funcname
23 obj = funcname
24 funcname = obj.__module__ + "." + obj.__name__
24 funcname = obj.__module__ + "." + obj.__name__
25 else:
25 else:
@@ -70,7 +70,7 b' def _pythonhook(ui, repo, name, hname, f'
70 raise util.Abort(_('%s hook is invalid '
70 raise util.Abort(_('%s hook is invalid '
71 '("%s" is not defined)') %
71 '("%s" is not defined)') %
72 (hname, funcname))
72 (hname, funcname))
73 if not util.safehasattr(obj, '__call__'):
73 if not callable(obj):
74 raise util.Abort(_('%s hook is invalid '
74 raise util.Abort(_('%s hook is invalid '
75 '("%s" is not callable)') %
75 '("%s" is not callable)') %
76 (hname, funcname))
76 (hname, funcname))
@@ -117,7 +117,7 b' def _exthook(ui, repo, name, cmd, args, '
117 starttime = time.time()
117 starttime = time.time()
118 env = {}
118 env = {}
119 for k, v in args.iteritems():
119 for k, v in args.iteritems():
120 if util.safehasattr(v, '__call__'):
120 if callable(v):
121 v = v()
121 v = v()
122 if isinstance(v, dict):
122 if isinstance(v, dict):
123 # make the dictionary element order stable across Python
123 # make the dictionary element order stable across Python
@@ -184,7 +184,7 b' def hook(ui, repo, name, throw=False, **'
184 # files seem to be bogus, give up on redirecting (WSGI, etc)
184 # files seem to be bogus, give up on redirecting (WSGI, etc)
185 pass
185 pass
186
186
187 if util.safehasattr(cmd, '__call__'):
187 if callable(cmd):
188 r = _pythonhook(ui, repo, name, hname, cmd, args, throw) or r
188 r = _pythonhook(ui, repo, name, hname, cmd, args, throw) or r
189 elif cmd.startswith('python:'):
189 elif cmd.startswith('python:'):
190 if cmd.count(':') >= 2:
190 if cmd.count(':') >= 2:
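Several hunks above swap `util.safehasattr(x, '__call__')` for the builtin `callable()`. Both answer the same question — can this object be invoked? — as a quick sketch shows:

```python
# callable() covers plain functions, classes (their constructors),
# and instances that define __call__.
def func():
    return 42

class WithCall(object):
    def __call__(self):
        return 'called'

assert callable(func)
assert callable(WithCall)        # classes are callable
assert callable(WithCall())      # instances with __call__
assert not callable(42)
assert not callable('text')
```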
@@ -36,7 +36,11 b' def gettext(message):'
36 if message is None:
36 if message is None:
37 return message
37 return message
38
38
39 paragraphs = message.split('\n\n')
39 if type(message) is unicode:
40 # goofy unicode docstrings in test
41 paragraphs = message.split(u'\n\n')
42 else:
43 paragraphs = [p.decode("ascii") for p in message.split('\n\n')]
40 # Be careful not to translate the empty string -- it holds the
44 # Be careful not to translate the empty string -- it holds the
41 # meta data of the .po file.
45 # meta data of the .po file.
42 u = u'\n\n'.join([p and t.ugettext(p) or '' for p in paragraphs])
46 u = u'\n\n'.join([p and t.ugettext(p) or '' for p in paragraphs])
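The change above splits the docstring into paragraphs and decodes byte strings before the `ugettext` lookup, while unicode docstrings (seen in tests) are split directly. Adapted to modern Python types for illustration — the original targets Python 2 `str`/`unicode`:

```python
# Illustrative adaptation of the paragraph handling above: bytes
# docstrings are decoded paragraph by paragraph; already-decoded
# strings are split directly.
def split_paragraphs(message):
    if isinstance(message, str):
        return message.split('\n\n')
    return [p.decode('ascii') for p in message.split(b'\n\n')]

assert split_paragraphs(b'first\n\nsecond') == ['first', 'second']
assert split_paragraphs('first\n\nsecond') == ['first', 'second']
```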
@@ -180,7 +180,9 b' class localrepository(object):'
180 requirements = ['revlogv1']
180 requirements = ['revlogv1']
181 filtername = None
181 filtername = None
182
182
183 bundle2caps = {'HG2X': ()}
183 bundle2caps = {'HG2X': (),
184 'b2x:listkeys': (),
185 'b2x:pushkey': ()}
184
186
185 # a list of (ui, featureset) functions.
187 # a list of (ui, featureset) functions.
186 # only functions defined in module of enabled extensions are invoked
188 # only functions defined in module of enabled extensions are invoked
@@ -476,10 +478,17 b' class localrepository(object):'
476 return 'file:' + self.root
478 return 'file:' + self.root
477
479
478 def hook(self, name, throw=False, **args):
480 def hook(self, name, throw=False, **args):
481 """Call a hook, passing this repo instance.
482
483 This a convenience method to aid invoking hooks. Extensions likely
484 won't call this unless they have registered a custom hook or are
485 replacing code that is expected to call a hook.
486 """
479 return hook.hook(self.ui, self, name, throw, **args)
487 return hook.hook(self.ui, self, name, throw, **args)
480
488
481 @unfilteredmethod
489 @unfilteredmethod
482 def _tag(self, names, node, message, local, user, date, extra={}):
490 def _tag(self, names, node, message, local, user, date, extra={},
491 editor=False):
483 if isinstance(names, str):
492 if isinstance(names, str):
484 names = (names,)
493 names = (names,)
485
494
@@ -539,14 +548,15 b' class localrepository(object):'
539 self[None].add(['.hgtags'])
548 self[None].add(['.hgtags'])
540
549
541 m = matchmod.exact(self.root, '', ['.hgtags'])
550 m = matchmod.exact(self.root, '', ['.hgtags'])
542 tagnode = self.commit(message, user, date, extra=extra, match=m)
551 tagnode = self.commit(message, user, date, extra=extra, match=m,
552 editor=editor)
543
553
544 for name in names:
554 for name in names:
545 self.hook('tag', node=hex(node), tag=name, local=local)
555 self.hook('tag', node=hex(node), tag=name, local=local)
546
556
547 return tagnode
557 return tagnode
548
558
549 def tag(self, names, node, message, local, user, date):
559 def tag(self, names, node, message, local, user, date, editor=False):
550 '''tag a revision with one or more symbolic names.
560 '''tag a revision with one or more symbolic names.
551
561
552 names is a list of strings or, when adding a single tag, names may be a
562 names is a list of strings or, when adding a single tag, names may be a
@@ -574,7 +584,7 b' class localrepository(object):'
574 '(please commit .hgtags manually)'))
584 '(please commit .hgtags manually)'))
575
585
576 self.tags() # instantiate the cache
586 self.tags() # instantiate the cache
577 self._tag(names, node, message, local, user, date)
587 self._tag(names, node, message, local, user, date, editor=editor)
578
588
579 @filteredpropertycache
589 @filteredpropertycache
580 def _tagscache(self):
590 def _tagscache(self):
@@ -855,7 +865,8 b' class localrepository(object):'
855 # abort here if the journal already exists
865 # abort here if the journal already exists
856 if self.svfs.exists("journal"):
866 if self.svfs.exists("journal"):
857 raise error.RepoError(
867 raise error.RepoError(
858 _("abandoned transaction found - run hg recover"))
868 _("abandoned transaction found"),
869 hint=_("run 'hg recover' to clean up transaction"))
859
870
860 def onclose():
871 def onclose():
861 self.store.write(self._transref())
872 self.store.write(self._transref())
@@ -1501,149 +1512,9 b' class localrepository(object):'
1501 def status(self, node1='.', node2=None, match=None,
1512 def status(self, node1='.', node2=None, match=None,
1502 ignored=False, clean=False, unknown=False,
1513 ignored=False, clean=False, unknown=False,
1503 listsubrepos=False):
1514 listsubrepos=False):
1504 """return status of files between two nodes or node and working
1515 '''a convenience method that calls node1.status(node2)'''
1505 directory.
1516 return self[node1].status(node2, match, ignored, clean, unknown,
1506
1517 listsubrepos)
1507 If node1 is None, use the first dirstate parent instead.
1508 If node2 is None, compare node1 with working directory.
1509 """
1510
1511 def mfmatches(ctx):
1512 mf = ctx.manifest().copy()
1513 if match.always():
1514 return mf
1515 for fn in mf.keys():
1516 if not match(fn):
1517 del mf[fn]
1518 return mf
1519
1520 ctx1 = self[node1]
1521 ctx2 = self[node2]
1522
1523 working = ctx2.rev() is None
1524 parentworking = working and ctx1 == self['.']
1525 match = match or matchmod.always(self.root, self.getcwd())
1526 listignored, listclean, listunknown = ignored, clean, unknown
1527
1528 # load earliest manifest first for caching reasons
1529 if not working and ctx2.rev() < ctx1.rev():
1530 ctx2.manifest()
1531
1532 if not parentworking:
1533 def bad(f, msg):
1534 # 'f' may be a directory pattern from 'match.files()',
1535 # so 'f not in ctx1' is not enough
1536 if f not in ctx1 and f not in ctx1.dirs():
1537 self.ui.warn('%s: %s\n' % (self.dirstate.pathto(f), msg))
1538 match.bad = bad
1539
1540 if working: # we need to scan the working dir
1541 subrepos = []
1542 if '.hgsub' in self.dirstate:
1543 subrepos = sorted(ctx2.substate)
1544 s = self.dirstate.status(match, subrepos, listignored,
1545 listclean, listunknown)
1546 cmp, modified, added, removed, deleted, unknown, ignored, clean = s
1547
1548 # check for any possibly clean files
1549 if parentworking and cmp:
1550 fixup = []
1551 # do a full compare of any files that might have changed
1552 for f in sorted(cmp):
1553 if (f not in ctx1 or ctx2.flags(f) != ctx1.flags(f)
1554 or ctx1[f].cmp(ctx2[f])):
1555 modified.append(f)
1556 else:
1557 fixup.append(f)
1558
1559 # update dirstate for files that are actually clean
1560 if fixup:
1561 if listclean:
1562 clean += fixup
1563
1564 try:
1565 # updating the dirstate is optional
1566 # so we don't wait on the lock
1567 wlock = self.wlock(False)
1568 try:
1569 for f in fixup:
1570 self.dirstate.normal(f)
1571 finally:
1572 wlock.release()
1573 except error.LockError:
1574 pass
1575
1576 if not parentworking:
1577 mf1 = mfmatches(ctx1)
1578 if working:
1579 # we are comparing working dir against non-parent
1580 # generate a pseudo-manifest for the working dir
1581 mf2 = mfmatches(self['.'])
1582 for f in cmp + modified + added:
1583 mf2[f] = None
1584 mf2.set(f, ctx2.flags(f))
1585 for f in removed:
1586 if f in mf2:
1587 del mf2[f]
1588 else:
1589 # we are comparing two revisions
1590 deleted, unknown, ignored = [], [], []
1591 mf2 = mfmatches(ctx2)
1592
1593 modified, added, clean = [], [], []
1594 withflags = mf1.withflags() | mf2.withflags()
1595 for fn, mf2node in mf2.iteritems():
1596 if fn in mf1:
1597 if (fn not in deleted and
1598 ((fn in withflags and mf1.flags(fn) != mf2.flags(fn)) or
1599 (mf1[fn] != mf2node and
1600 (mf2node or ctx1[fn].cmp(ctx2[fn]))))):
1601 modified.append(fn)
1602 elif listclean:
1603 clean.append(fn)
1604 del mf1[fn]
1605 elif fn not in deleted:
1606 added.append(fn)
1607 removed = mf1.keys()
1608
1609 if working and modified and not self.dirstate._checklink:
1610 # Symlink placeholders may get non-symlink-like contents
1611 # via user error or dereferencing by NFS or Samba servers,
1612 # so we filter out any placeholders that don't look like a
1613 # symlink
1614 sane = []
1615 for f in modified:
1616 if ctx2.flags(f) == 'l':
1617 d = ctx2[f].data()
1618 if d == '' or len(d) >= 1024 or '\n' in d or util.binary(d):
1619 self.ui.debug('ignoring suspect symlink placeholder'
1620 ' "%s"\n' % f)
1621 continue
1622 sane.append(f)
1623 modified = sane
1624
1625 r = modified, added, removed, deleted, unknown, ignored, clean
1626
1627 if listsubrepos:
1628 for subpath, sub in scmutil.itersubrepos(ctx1, ctx2):
1629 if working:
1630 rev2 = None
1631 else:
1632 rev2 = ctx2.substate[subpath][1]
1633 try:
1634 submatch = matchmod.narrowmatcher(subpath, match)
1635 s = sub.status(rev2, match=submatch, ignored=listignored,
1636 clean=listclean, unknown=listunknown,
1637 listsubrepos=True)
1638 for rfiles, sfiles in zip(r, s):
1639 rfiles.extend("%s/%s" % (subpath, f) for f in sfiles)
1640 except error.LookupError:
1641 self.ui.status(_("skipping missing subrepository: %s\n")
1642 % subpath)
1643
1644 for l in r:
1645 l.sort()
1646 return r
1647
1518
1648 def heads(self, start=None):
1519 def heads(self, start=None):
1649 heads = self.changelog.heads(start)
1520 heads = self.changelog.heads(start)
@@ -25,6 +25,18 b' class manifestdict(dict):'
25 self._flags[f] = flags
25 self._flags[f] = flags
26 def copy(self):
26 def copy(self):
27 return manifestdict(self, dict.copy(self._flags))
27 return manifestdict(self, dict.copy(self._flags))
28 def intersectfiles(self, files):
29 '''make a new manifestdict with the intersection of self with files
30
31 The algorithm assumes that files is much smaller than self.'''
32 ret = manifestdict()
33 for fn in files:
34 if fn in self:
35 ret[fn] = self[fn]
36 flags = self._flags.get(fn, None)
37 if flags:
38 ret._flags[fn] = flags
39 return ret
28 def flagsdiff(self, d2):
40 def flagsdiff(self, d2):
29 return dicthelpers.diff(self._flags, d2._flags, "")
41 return dicthelpers.diff(self._flags, d2._flags, "")
30
42
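The new `intersectfiles()` iterates over the (presumed small) `files` list rather than the full manifest, as its docstring notes. A plain-dict sketch of the same idea:

```python
# Plain-dict sketch of intersectfiles(): walk the small 'files' list
# and keep only entries present in the (large) manifest.
def intersectfiles(manifest, files):
    return dict((fn, manifest[fn]) for fn in files if fn in manifest)

manifest = {'a.txt': 'n1', 'b.txt': 'n2', 'c.txt': 'n3'}
assert intersectfiles(manifest, ['b.txt', 'missing']) == {'b.txt': 'n2'}
```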
@@ -12,7 +12,7 b' from i18n import _'
12 def _rematcher(regex):
12 def _rematcher(regex):
13 '''compile the regexp with the best available regexp engine and return a
13 '''compile the regexp with the best available regexp engine and return a
14 matcher function'''
14 matcher function'''
15 m = util.compilere(regex)
15 m = util.re.compile(regex)
16 try:
16 try:
17 # slightly faster, provided by facebook's re2 bindings
17 # slightly faster, provided by facebook's re2 bindings
18 return m.test_match
18 return m.test_match
@@ -247,7 +247,7 b' def _globre(pat):'
247 i, n = 0, len(pat)
247 i, n = 0, len(pat)
248 res = ''
248 res = ''
249 group = 0
249 group = 0
250 escape = re.escape
250 escape = util.re.escape
251 def peek():
251 def peek():
252 return i < n and pat[i]
252 return i < n and pat[i]
253 while i < n:
253 while i < n:
@@ -310,11 +310,11 b' def _regex(kind, pat, globsuffix):'
310 if kind == 're':
310 if kind == 're':
311 return pat
311 return pat
312 if kind == 'path':
312 if kind == 'path':
313 return '^' + re.escape(pat) + '(?:/|$)'
313 return '^' + util.re.escape(pat) + '(?:/|$)'
314 if kind == 'relglob':
314 if kind == 'relglob':
315 return '(?:|.*/)' + _globre(pat) + globsuffix
315 return '(?:|.*/)' + _globre(pat) + globsuffix
316 if kind == 'relpath':
316 if kind == 'relpath':
317 return re.escape(pat) + '(?:/|$)'
317 return util.re.escape(pat) + '(?:/|$)'
318 if kind == 'relre':
318 if kind == 'relre':
319 if pat.startswith('^'):
319 if pat.startswith('^'):
320 return pat
320 return pat
@@ -37,6 +37,7 b' class diffopts(object):'
37 'showfunc': False,
37 'showfunc': False,
38 'git': False,
38 'git': False,
39 'nodates': False,
39 'nodates': False,
40 'nobinary': False,
40 'ignorews': False,
41 'ignorews': False,
41 'ignorewsamount': False,
42 'ignorewsamount': False,
42 'ignoreblanklines': False,
43 'ignoreblanklines': False,
This diff has been collapsed as it changes many lines (672 lines changed).
@@ -55,6 +55,8 b' class mergestate(object):'
55
55
56 def reset(self, node=None, other=None):
56 def reset(self, node=None, other=None):
57 self._state = {}
57 self._state = {}
58 self._local = None
59 self._other = None
58 if node:
60 if node:
59 self._local = node
61 self._local = node
60 self._other = other
62 self._other = other
@@ -68,6 +70,8 b' class mergestate(object):'
68 of on disk file.
70 of on disk file.
69 """
71 """
70 self._state = {}
72 self._state = {}
73 self._local = None
74 self._other = None
71 records = self._readrecords()
75 records = self._readrecords()
72 for rtype, record in records:
76 for rtype, record in records:
73 if rtype == 'L':
77 if rtype == 'L':
@@ -171,6 +175,18 b' class mergestate(object):'
171 raise
175 raise
172 return records
176 return records
173
177
178 def active(self):
179 """Whether mergestate is active.
180
181 Returns True if there appears to be mergestate. This is a rough proxy
182 for "is a merge in progress."
183 """
184 # Check local variables before looking at filesystem for performance
185 # reasons.
186 return bool(self._local) or bool(self._state) or \
187 self._repo.opener.exists(self.statepathv1) or \
188 self._repo.opener.exists(self.statepathv2)
189
174 def commit(self):
190 def commit(self):
175 """Write current state on disk (if necessary)"""
191 """Write current state on disk (if necessary)"""
176 if self._dirty:
192 if self._dirty:
@@ -232,10 +248,7 b' class mergestate(object):'
232 return self._state[dfile][0]
248 return self._state[dfile][0]
233
249
234 def __iter__(self):
250 def __iter__(self):
235 l = self._state.keys()
251 return iter(sorted(self._state))
236 l.sort()
237 for f in l:
238 yield f
239
252
240 def files(self):
253 def files(self):
241 return self._state.keys()
254 return self._state.keys()
@@ -244,7 +257,14 b' class mergestate(object):'
244 self._state[dfile][0] = state
257 self._state[dfile][0] = state
245 self._dirty = True
258 self._dirty = True
246
259
247 def resolve(self, dfile, wctx):
260 def unresolved(self):
261 """Obtain the paths of unresolved files."""
262
263 for f, entry in self._state.items():
264 if entry[0] == 'u':
265 yield f
266
267 def resolve(self, dfile, wctx, labels=None):
248 """rerun merge process for file path `dfile`"""
268 """rerun merge process for file path `dfile`"""
249 if self[dfile] == 'r':
269 if self[dfile] == 'r':
250 return 0
270 return 0
@@ -267,7 +287,8 b' class mergestate(object):'
267 f = self._repo.opener("merge/" + hash)
287 f = self._repo.opener("merge/" + hash)
268 self._repo.wwrite(dfile, f.read(), flags)
288 self._repo.wwrite(dfile, f.read(), flags)
269 f.close()
289 f.close()
270 r = filemerge.filemerge(self._repo, self._local, lfile, fcd, fco, fca)
290 r = filemerge.filemerge(self._repo, self._local, lfile, fcd, fco, fca,
291 labels=labels)
271 if r is None:
292 if r is None:
272 # no real conflict
293 # no real conflict
273 del self._state[dfile]
294 del self._state[dfile]
@@ -310,62 +331,44 b' def _forgetremoved(wctx, mctx, branchmer'
310 as removed.
331 as removed.
311 """
332 """
312
333
313 actions = []
334 ractions = []
314 state = branchmerge and 'r' or 'f'
335 factions = xactions = []
336 if branchmerge:
337 xactions = ractions
315 for f in wctx.deleted():
338 for f in wctx.deleted():
316 if f not in mctx:
339 if f not in mctx:
317 actions.append((f, state, None, "forget deleted"))
340 xactions.append((f, None, "forget deleted"))
318
341
319 if not branchmerge:
342 if not branchmerge:
320 for f in wctx.removed():
343 for f in wctx.removed():
321 if f not in mctx:
344 if f not in mctx:
322 actions.append((f, "f", None, "forget removed"))
345 factions.append((f, None, "forget removed"))
323
346
324 return actions
347 return ractions, factions
325
348
326 def _checkcollision(repo, wmf, actions):
349 def _checkcollision(repo, wmf, actions):
327 # build provisional merged manifest up
350 # build provisional merged manifest up
328 pmmf = set(wmf)
351 pmmf = set(wmf)
329
352
330 def addop(f, args):
353 if actions:
331 pmmf.add(f)
354 # k, dr, e and rd are no-op
332 def removeop(f, args):
355 for m in 'a', 'f', 'g', 'cd', 'dc':
333 pmmf.discard(f)
356 for f, args, msg in actions[m]:
334 def nop(f, args):
357 pmmf.add(f)
335 pass
358 for f, args, msg in actions['r']:
336
359 pmmf.discard(f)
337 def renamemoveop(f, args):
360 for f, args, msg in actions['dm']:
338 f2, flags = args
361 f2, flags = args
339 pmmf.discard(f2)
362 pmmf.discard(f2)
340 pmmf.add(f)
363 pmmf.add(f)
341 def renamegetop(f, args):
364 for f, args, msg in actions['dg']:
342 f2, flags = args
365 f2, flags = args
343 pmmf.add(f)
366 pmmf.add(f)
344 def mergeop(f, args):
367 for f, args, msg in actions['m']:
345 f1, f2, fa, move, anc = args
368 f1, f2, fa, move, anc = args
346 if move:
369 if move:
347 pmmf.discard(f1)
370 pmmf.discard(f1)
348 pmmf.add(f)
371 pmmf.add(f)
349
350 opmap = {
351 "a": addop,
352 "dm": renamemoveop,
353 "dg": renamegetop,
354 "dr": nop,
355 "e": nop,
356 "k": nop,
357 "f": addop, # untracked file should be kept in working directory
358 "g": addop,
359 "m": mergeop,
360 "r": removeop,
361 "rd": nop,
362 "cd": addop,
363 "dc": addop,
364 }
365 for f, m, args, msg in actions:
366 op = opmap.get(m)
367 assert op, m
368 op(f, args)
369
372
370 # check case-folding collision in provisional merged manifest
373 # check case-folding collision in provisional merged manifest
371 foldmap = {}
374 foldmap = {}
@@ -386,7 +389,8 b' def manifestmerge(repo, wctx, p2, pa, br'
386 acceptremote = accept the incoming changes without prompting
389 acceptremote = accept the incoming changes without prompting
387 """
390 """
388
391
389 actions, copy, movewithdir = [], {}, {}
392 actions = dict((m, []) for m in 'a f g cd dc r dm dg m dr e rd k'.split())
393 copy, movewithdir = {}, {}
390
394
391 # manifests fetched in order are going to be faster, so prime the caches
395 # manifests fetched in order are going to be faster, so prime the caches
392 [x.manifest() for x in
396 [x.manifest() for x in
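The refactor running through this file turns `manifestmerge`'s flat list of `(file, type, args, msg)` tuples into a dict of per-type lists, so later phases can process one action kind at a time without filtering. A sketch of the shape:

```python
# Sketch of the new actions structure: one list per action code,
# e.g. 'g' (get), 'r' (remove), 'm' (merge), matching the keys
# created in the hunk above.
actions = dict((m, []) for m in 'a f g cd dc r dm dg m dr e rd k'.split())
actions['g'].append(('file.txt', ('',), 'remote is newer'))
actions['r'].append(('old.txt', None, 'other deleted'))

assert [f for f, args, msg in actions['g']] == ['file.txt']
assert actions['m'] == []   # no merge actions queued yet
```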
@@ -396,9 +400,9 b' def manifestmerge(repo, wctx, p2, pa, br'
396 ret = copies.mergecopies(repo, wctx, p2, pa)
400 ret = copies.mergecopies(repo, wctx, p2, pa)
397 copy, movewithdir, diverge, renamedelete = ret
401 copy, movewithdir, diverge, renamedelete = ret
398 for of, fl in diverge.iteritems():
402 for of, fl in diverge.iteritems():
399 actions.append((of, "dr", (fl,), "divergent renames"))
403 actions['dr'].append((of, (fl,), "divergent renames"))
400 for of, fl in renamedelete.iteritems():
404 for of, fl in renamedelete.iteritems():
401 actions.append((of, "rd", (fl,), "rename and delete"))
405 actions['rd'].append((of, (fl,), "rename and delete"))
402
406
403 repo.ui.note(_("resolving manifests\n"))
407 repo.ui.note(_("resolving manifests\n"))
404 repo.ui.debug(" branchmerge: %s, force: %s, partial: %s\n"
408 repo.ui.debug(" branchmerge: %s, force: %s, partial: %s\n"
@@ -450,50 +454,50 b' def manifestmerge(repo, wctx, p2, pa, br'
450 fla = ma.flags(fa)
454 fla = ma.flags(fa)
451 nol = 'l' not in fl1 + fl2 + fla
455 nol = 'l' not in fl1 + fl2 + fla
452 if n2 == a and fl2 == fla:
456 if n2 == a and fl2 == fla:
453 actions.append((f, "k", (), "keep")) # remote unchanged
457 actions['k'].append((f, (), "keep")) # remote unchanged
454 elif n1 == a and fl1 == fla: # local unchanged - use remote
458 elif n1 == a and fl1 == fla: # local unchanged - use remote
455 if n1 == n2: # optimization: keep local content
459 if n1 == n2: # optimization: keep local content
456 actions.append((f, "e", (fl2,), "update permissions"))
460 actions['e'].append((f, (fl2,), "update permissions"))
457 else:
461 else:
458 actions.append((f, "g", (fl2,), "remote is newer"))
462 actions['g'].append((f, (fl2,), "remote is newer"))
459 elif nol and n2 == a: # remote only changed 'x'
463 elif nol and n2 == a: # remote only changed 'x'
460 actions.append((f, "e", (fl2,), "update permissions"))
464 actions['e'].append((f, (fl2,), "update permissions"))
461 elif nol and n1 == a: # local only changed 'x'
465 elif nol and n1 == a: # local only changed 'x'
462 actions.append((f, "g", (fl1,), "remote is newer"))
466 actions['g'].append((f, (fl1,), "remote is newer"))
463 else: # both changed something
467 else: # both changed something
464 actions.append((f, "m", (f, f, fa, False, pa.node()),
468 actions['m'].append((f, (f, f, fa, False, pa.node()),
465 "versions differ"))
469 "versions differ"))
466 elif f in copied: # files we'll deal with on m2 side
470 elif f in copied: # files we'll deal with on m2 side
467 pass
471 pass
468 elif n1 and f in movewithdir: # directory rename, move local
472 elif n1 and f in movewithdir: # directory rename, move local
469 f2 = movewithdir[f]
473 f2 = movewithdir[f]
470 actions.append((f2, "dm", (f, fl1),
474 actions['dm'].append((f2, (f, fl1),
471 "remote directory rename - move from " + f))
475 "remote directory rename - move from " + f))
472 elif n1 and f in copy:
476 elif n1 and f in copy:
473 f2 = copy[f]
477 f2 = copy[f]
474 actions.append((f, "m", (f, f2, f2, False, pa.node()),
478 actions['m'].append((f, (f, f2, f2, False, pa.node()),
475 "local copied/moved from " + f2))
479 "local copied/moved from " + f2))
476 elif n1 and f in ma: # clean, a different, no remote
480 elif n1 and f in ma: # clean, a different, no remote
477 if n1 != ma[f]:
481 if n1 != ma[f]:
478 if acceptremote:
482 if acceptremote:
479 actions.append((f, "r", None, "remote delete"))
483 actions['r'].append((f, None, "remote delete"))
480 else:
484 else:
481 actions.append((f, "cd", None, "prompt changed/deleted"))
485 actions['cd'].append((f, None, "prompt changed/deleted"))
482 elif n1[20:] == "a": # added, no remote
486 elif n1[20:] == "a": # added, no remote
483 actions.append((f, "f", None, "remote deleted"))
487 actions['f'].append((f, None, "remote deleted"))
484 else:
488 else:
485 actions.append((f, "r", None, "other deleted"))
489 actions['r'].append((f, None, "other deleted"))
486 elif n2 and f in movewithdir:
490 elif n2 and f in movewithdir:
487 f2 = movewithdir[f]
491 f2 = movewithdir[f]
488 actions.append((f2, "dg", (f, fl2),
492 actions['dg'].append((f2, (f, fl2),
489 "local directory rename - get from " + f))
493 "local directory rename - get from " + f))
490 elif n2 and f in copy:
494 elif n2 and f in copy:
491 f2 = copy[f]
495 f2 = copy[f]
492 if f2 in m2:
496 if f2 in m2:
493 actions.append((f, "m", (f2, f, f2, False, pa.node()),
497 actions['m'].append((f, (f2, f, f2, False, pa.node()),
494 "remote copied from " + f2))
498 "remote copied from " + f2))
495 else:
499 else:
496 actions.append((f, "m", (f2, f, f2, True, pa.node()),
500 actions['m'].append((f, (f2, f, f2, True, pa.node()),
497 "remote moved from " + f2))
501 "remote moved from " + f2))
498 elif n2 and f not in ma:
502 elif n2 and f not in ma:
499 # local unknown, remote created: the logic is described by the
503 # local unknown, remote created: the logic is described by the
@@ -509,17 +513,17 b' def manifestmerge(repo, wctx, p2, pa, br'
509 # Checking whether the files are different is expensive, so we
513 # Checking whether the files are different is expensive, so we
510 # don't do that when we can avoid it.
514 # don't do that when we can avoid it.
511 if force and not branchmerge:
515 if force and not branchmerge:
512 actions.append((f, "g", (fl2,), "remote created"))
516 actions['g'].append((f, (fl2,), "remote created"))
513 else:
517 else:
514 different = _checkunknownfile(repo, wctx, p2, f)
518 different = _checkunknownfile(repo, wctx, p2, f)
515 if force and branchmerge and different:
519 if force and branchmerge and different:
516 # FIXME: This is wrong - f is not in ma ...
520 # FIXME: This is wrong - f is not in ma ...
517 actions.append((f, "m", (f, f, f, False, pa.node()),
521 actions['m'].append((f, (f, f, f, False, pa.node()),
518 "remote differs from untracked local"))
522 "remote differs from untracked local"))
519 elif not force and different:
523 elif not force and different:
520 aborts.append((f, "ud"))
524 aborts.append((f, "ud"))
521 else:
525 else:
522 actions.append((f, "g", (fl2,), "remote created"))
526 actions['g'].append((f, (fl2,), "remote created"))
523 elif n2 and n2 != ma[f]:
527 elif n2 and n2 != ma[f]:
524 different = _checkunknownfile(repo, wctx, p2, f)
528 different = _checkunknownfile(repo, wctx, p2, f)
525 if not force and different:
529 if not force and different:
@@ -527,10 +531,10 b' def manifestmerge(repo, wctx, p2, pa, br'
527 else:
531 else:
528 # if different: old untracked f may be overwritten and lost
532 # if different: old untracked f may be overwritten and lost
529 if acceptremote:
533 if acceptremote:
530 actions.append((f, "g", (m2.flags(f),),
534 actions['g'].append((f, (m2.flags(f),),
531 "remote recreating"))
535 "remote recreating"))
532 else:
536 else:
533 actions.append((f, "dc", (m2.flags(f),),
537 actions['dc'].append((f, (m2.flags(f),),
534 "prompt deleted/changed"))
538 "prompt deleted/changed"))
535
539
536 for f, m in sorted(aborts):
540 for f, m in sorted(aborts):
@@ -545,44 +549,32 b' def manifestmerge(repo, wctx, p2, pa, br'
545 # check collision between files only in p2 for clean update
549 # check collision between files only in p2 for clean update
546 if (not branchmerge and
550 if (not branchmerge and
547 (force or not wctx.dirty(missing=True, branch=False))):
551 (force or not wctx.dirty(missing=True, branch=False))):
548 _checkcollision(repo, m2, [])
552 _checkcollision(repo, m2, None)
549 else:
553 else:
550 _checkcollision(repo, m1, actions)
554 _checkcollision(repo, m1, actions)
551
555
552 return actions
556 return actions
553
557
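This hunk changes `actions` from a flat list of `(f, m, args, msg)` tuples into a dict keyed by the action type `m`, whose values are lists of `(f, args, msg)` tuples. A minimal sketch of the two shapes (file names and payloads are made up for illustration):

```python
# Old shape: one flat list, with the action type embedded in each tuple.
old_actions = [
    ("a.txt", "g", ("x",), "remote created"),
    ("b.txt", "r", None, "other deleted"),
]

# New shape: a dict keyed by action type; tuples no longer carry the type.
new_actions = {"g": [], "r": [], "m": []}
for f, m, args, msg in old_actions:
    new_actions[m].append((f, args, msg))
```

Grouping by type up front lets later code iterate one action type at a time instead of re-filtering the flat list for each pass.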
554 def actionkey(a):
558 def batchremove(repo, actions):
555 return a[1] in "rf" and -1 or 0, a
559 """apply removes to the working directory
556
557 def getremove(repo, mctx, overwrite, args):
558 """apply usually-non-interactive updates to the working directory
559
560 mctx is the context to be merged into the working copy
561
560
562 yields tuples for progress updates
561 yields tuples for progress updates
563 """
562 """
564 verbose = repo.ui.verbose
563 verbose = repo.ui.verbose
565 unlink = util.unlinkpath
564 unlink = util.unlinkpath
566 wjoin = repo.wjoin
565 wjoin = repo.wjoin
567 fctx = mctx.filectx
568 wwrite = repo.wwrite
569 audit = repo.wopener.audit
566 audit = repo.wopener.audit
570 i = 0
567 i = 0
571 for arg in args:
568 for f, args, msg in actions:
572 f = arg[0]
569 repo.ui.debug(" %s: %s -> r\n" % (f, msg))
573 if arg[1] == 'r':
570 if verbose:
574 if verbose:
571 repo.ui.note(_("removing %s\n") % f)
575 repo.ui.note(_("removing %s\n") % f)
572 audit(f)
576 audit(f)
573 try:
577 try:
574 unlink(wjoin(f), ignoremissing=True)
578 unlink(wjoin(f), ignoremissing=True)
575 except OSError, inst:
579 except OSError, inst:
576 repo.ui.warn(_("update failed to remove %s: %s!\n") %
580 repo.ui.warn(_("update failed to remove %s: %s!\n") %
577 (f, inst.strerror))
581 (f, inst.strerror))
582 else:
583 if verbose:
584 repo.ui.note(_("getting %s\n") % f)
585 wwrite(f, fctx(f).data(), arg[2][0])
586 if i == 100:
578 if i == 100:
587 yield i, f
579 yield i, f
588 i = 0
580 i = 0
@@ -590,7 +582,30 b' def getremove(repo, mctx, overwrite, arg'
590 if i > 0:
582 if i > 0:
591 yield i, f
583 yield i, f
592
584
593 def applyupdates(repo, actions, wctx, mctx, overwrite):
585 def batchget(repo, mctx, actions):
586 """apply gets to the working directory
587
588 mctx is the context to get from
589
590 yields tuples for progress updates
591 """
592 verbose = repo.ui.verbose
593 fctx = mctx.filectx
594 wwrite = repo.wwrite
595 i = 0
596 for f, args, msg in actions:
597 repo.ui.debug(" %s: %s -> g\n" % (f, msg))
598 if verbose:
599 repo.ui.note(_("getting %s\n") % f)
600 wwrite(f, fctx(f).data(), args[0])
601 if i == 100:
602 yield i, f
603 i = 0
604 i += 1
605 if i > 0:
606 yield i, f
607
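Both `batchremove` and `batchget` yield `(count, filename)` tuples roughly every 100 files so the caller can advance a progress bar without per-file overhead, plus a final tail yield. A stripped-down sketch of that batching pattern (this version increments before the check so the yielded counts sum exactly to the number of items, which the original's check-then-increment ordering does not quite guarantee):

```python
def batched_progress(items, batch=100):
    """Yield (count, last_item) every `batch` items, plus a final partial batch."""
    i = 0
    last = None
    for last in items:
        # ... per-item work would happen here ...
        i += 1
        if i == batch:
            yield i, last
            i = 0
    if i:
        yield i, last
```

For 250 items this yields counts 100, 100, 50, so a consumer can simply add each count to a running total.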
608 def applyupdates(repo, actions, wctx, mctx, overwrite, labels=None):
594 """apply the merge action list to the working directory
609 """apply the merge action list to the working directory
595
610
596 wctx is the working copy context
611 wctx is the working copy context
@@ -604,29 +619,30 b' def applyupdates(repo, actions, wctx, mc'
604 ms = mergestate(repo)
619 ms = mergestate(repo)
605 ms.reset(wctx.p1().node(), mctx.node())
620 ms.reset(wctx.p1().node(), mctx.node())
606 moves = []
621 moves = []
607 actions.sort(key=actionkey)
622 for m, l in actions.items():
623 l.sort()
608
624
609 # prescan for merges
625 # prescan for merges
610 for a in actions:
626 for f, args, msg in actions['m']:
611 f, m, args, msg = a
627 f1, f2, fa, move, anc = args
612 repo.ui.debug(" %s: %s -> %s\n" % (f, msg, m))
628 if f == '.hgsubstate': # merged internally
613 if m == "m": # merge
629 continue
614 f1, f2, fa, move, anc = args
630 repo.ui.debug(" preserving %s for resolve of %s\n" % (f1, f))
615 if f == '.hgsubstate': # merged internally
631 fcl = wctx[f1]
616 continue
632 fco = mctx[f2]
617 repo.ui.debug(" preserving %s for resolve of %s\n" % (f1, f))
633 actx = repo[anc]
618 fcl = wctx[f1]
634 if fa in actx:
619 fco = mctx[f2]
635 fca = actx[fa]
620 actx = repo[anc]
636 else:
621 if fa in actx:
637 fca = repo.filectx(f1, fileid=nullrev)
622 fca = actx[fa]
638 ms.add(fcl, fco, fca, f)
623 else:
639 if f1 != f and move:
624 fca = repo.filectx(f1, fileid=nullrev)
640 moves.append(f1)
625 ms.add(fcl, fco, fca, f)
626 if f1 != f and move:
627 moves.append(f1)
628
641
629 audit = repo.wopener.audit
642 audit = repo.wopener.audit
643 _updating = _('updating')
644 _files = _('files')
645 progress = repo.ui.progress
630
646
631 # remove renamed files after safely stored
647 # remove renamed files after safely stored
632 for f in moves:
648 for f in moves:
@@ -635,86 +651,120 b' def applyupdates(repo, actions, wctx, mc'
635 audit(f)
651 audit(f)
636 util.unlinkpath(repo.wjoin(f))
652 util.unlinkpath(repo.wjoin(f))
637
653
638 numupdates = len([a for a in actions if a[1] != 'k'])
654 numupdates = sum(len(l) for m, l in actions.items() if m != 'k')
639 workeractions = [a for a in actions if a[1] in 'gr']
640 updateactions = [a for a in workeractions if a[1] == 'g']
641 updated = len(updateactions)
642 removeactions = [a for a in workeractions if a[1] == 'r']
643 removed = len(removeactions)
644 actions = [a for a in actions if a[1] not in 'grk']
645
655
646 hgsub = [a[1] for a in workeractions if a[0] == '.hgsubstate']
656 if [a for a in actions['r'] if a[0] == '.hgsubstate']:
647 if hgsub and hgsub[0] == 'r':
648 subrepo.submerge(repo, wctx, mctx, wctx, overwrite)
657 subrepo.submerge(repo, wctx, mctx, wctx, overwrite)
649
658
659 # remove in parallel (must come first)
650 z = 0
660 z = 0
651 prog = worker.worker(repo.ui, 0.001, getremove, (repo, mctx, overwrite),
661 prog = worker.worker(repo.ui, 0.001, batchremove, (repo,), actions['r'])
652 removeactions)
653 for i, item in prog:
662 for i, item in prog:
654 z += i
663 z += i
655 repo.ui.progress(_('updating'), z, item=item, total=numupdates,
664 progress(_updating, z, item=item, total=numupdates, unit=_files)
656 unit=_('files'))
665 removed = len(actions['r'])
657 prog = worker.worker(repo.ui, 0.001, getremove, (repo, mctx, overwrite),
666
658 updateactions)
667 # get in parallel
668 prog = worker.worker(repo.ui, 0.001, batchget, (repo, mctx), actions['g'])
659 for i, item in prog:
669 for i, item in prog:
660 z += i
670 z += i
661 repo.ui.progress(_('updating'), z, item=item, total=numupdates,
671 progress(_updating, z, item=item, total=numupdates, unit=_files)
662 unit=_('files'))
672 updated = len(actions['g'])
663
673
664 if hgsub and hgsub[0] == 'g':
674 if [a for a in actions['g'] if a[0] == '.hgsubstate']:
665 subrepo.submerge(repo, wctx, mctx, wctx, overwrite)
675 subrepo.submerge(repo, wctx, mctx, wctx, overwrite)
666
676
667 _updating = _('updating')
677 # forget (manifest only, just log it) (must come first)
668 _files = _('files')
678 for f, args, msg in actions['f']:
669 progress = repo.ui.progress
679 repo.ui.debug(" %s: %s -> f\n" % (f, msg))
680 z += 1
681 progress(_updating, z, item=f, total=numupdates, unit=_files)
682
683 # re-add (manifest only, just log it)
684 for f, args, msg in actions['a']:
685 repo.ui.debug(" %s: %s -> a\n" % (f, msg))
686 z += 1
687 progress(_updating, z, item=f, total=numupdates, unit=_files)
688
689 # keep (noop, just log it)
690 for f, args, msg in actions['k']:
691 repo.ui.debug(" %s: %s -> k\n" % (f, msg))
692 # no progress
670
693
671 for i, a in enumerate(actions):
694 # merge
672 f, m, args, msg = a
695 for f, args, msg in actions['m']:
673 progress(_updating, z + i + 1, item=f, total=numupdates, unit=_files)
696 repo.ui.debug(" %s: %s -> m\n" % (f, msg))
674 if m == "m": # merge
697 z += 1
675 f1, f2, fa, move, anc = args
698 progress(_updating, z, item=f, total=numupdates, unit=_files)
676 if f == '.hgsubstate': # subrepo states need updating
699 f1, f2, fa, move, anc = args
677 subrepo.submerge(repo, wctx, mctx, wctx.ancestor(mctx),
700 if f == '.hgsubstate': # subrepo states need updating
678 overwrite)
701 subrepo.submerge(repo, wctx, mctx, wctx.ancestor(mctx),
679 continue
702 overwrite)
680 audit(f)
703 continue
681 r = ms.resolve(f, wctx)
704 audit(f)
682 if r is not None and r > 0:
705 r = ms.resolve(f, wctx, labels=labels)
683 unresolved += 1
706 if r is not None and r > 0:
707 unresolved += 1
708 else:
709 if r is None:
710 updated += 1
684 else:
711 else:
685 if r is None:
712 merged += 1
686 updated += 1
713
687 else:
714 # directory rename, move local
688 merged += 1
715 for f, args, msg in actions['dm']:
689 elif m == "dm": # directory rename, move local
716 repo.ui.debug(" %s: %s -> dm\n" % (f, msg))
690 f0, flags = args
717 z += 1
691 repo.ui.note(_("moving %s to %s\n") % (f0, f))
718 progress(_updating, z, item=f, total=numupdates, unit=_files)
692 audit(f)
719 f0, flags = args
693 repo.wwrite(f, wctx.filectx(f0).data(), flags)
720 repo.ui.note(_("moving %s to %s\n") % (f0, f))
694 util.unlinkpath(repo.wjoin(f0))
721 audit(f)
695 updated += 1
722 repo.wwrite(f, wctx.filectx(f0).data(), flags)
696 elif m == "dg": # local directory rename, get
723 util.unlinkpath(repo.wjoin(f0))
697 f0, flags = args
724 updated += 1
698 repo.ui.note(_("getting %s to %s\n") % (f0, f))
725
699 repo.wwrite(f, mctx.filectx(f0).data(), flags)
726 # local directory rename, get
700 updated += 1
727 for f, args, msg in actions['dg']:
701 elif m == "dr": # divergent renames
728 repo.ui.debug(" %s: %s -> dg\n" % (f, msg))
702 fl, = args
729 z += 1
703 repo.ui.warn(_("note: possible conflict - %s was renamed "
730 progress(_updating, z, item=f, total=numupdates, unit=_files)
704 "multiple times to:\n") % f)
731 f0, flags = args
705 for nf in fl:
732 repo.ui.note(_("getting %s to %s\n") % (f0, f))
706 repo.ui.warn(" %s\n" % nf)
733 repo.wwrite(f, mctx.filectx(f0).data(), flags)
707 elif m == "rd": # rename and delete
734 updated += 1
708 fl, = args
735
709 repo.ui.warn(_("note: possible conflict - %s was deleted "
736 # divergent renames
710 "and renamed to:\n") % f)
737 for f, args, msg in actions['dr']:
711 for nf in fl:
738 repo.ui.debug(" %s: %s -> dr\n" % (f, msg))
712 repo.ui.warn(" %s\n" % nf)
739 z += 1
713 elif m == "e": # exec
740 progress(_updating, z, item=f, total=numupdates, unit=_files)
714 flags, = args
741 fl, = args
715 audit(f)
742 repo.ui.warn(_("note: possible conflict - %s was renamed "
716 util.setflags(repo.wjoin(f), 'l' in flags, 'x' in flags)
743 "multiple times to:\n") % f)
717 updated += 1
744 for nf in fl:
745 repo.ui.warn(" %s\n" % nf)
746
747 # rename and delete
748 for f, args, msg in actions['rd']:
749 repo.ui.debug(" %s: %s -> rd\n" % (f, msg))
750 z += 1
751 progress(_updating, z, item=f, total=numupdates, unit=_files)
752 fl, = args
753 repo.ui.warn(_("note: possible conflict - %s was deleted "
754 "and renamed to:\n") % f)
755 for nf in fl:
756 repo.ui.warn(" %s\n" % nf)
757
758 # exec
759 for f, args, msg in actions['e']:
760 repo.ui.debug(" %s: %s -> e\n" % (f, msg))
761 z += 1
762 progress(_updating, z, item=f, total=numupdates, unit=_files)
763 flags, = args
764 audit(f)
765 util.setflags(repo.wjoin(f), 'l' in flags, 'x' in flags)
766 updated += 1
767
718 ms.commit()
768 ms.commit()
719 progress(_updating, None, total=numupdates, unit=_files)
769 progress(_updating, None, total=numupdates, unit=_files)
720
770
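With `actions` now a dict, `applyupdates` must impose the processing order explicitly: removes run first (in parallel), then gets, then the bookkeeping and merge passes. A sketch of driving a per-type action dict in a fixed order (the order list here loosely mirrors the hunk above and is illustrative, not authoritative):

```python
# Removes must free paths before gets write into them, so the order is
# fixed by an explicit list rather than by dict iteration order.
ORDER = ["r", "f", "g", "a", "k", "m", "dm", "dg", "dr", "rd", "e"]

def process(actions):
    done = []
    for m in ORDER:
        for f, args, msg in actions.get(m, []):
            done.append((m, f))  # stand-in for the real per-type work
    return done
```

This keeps the ordering contract in one place even though the container no longer encodes it.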
@@ -735,163 +785,173 b' def calculateupdates(repo, wctx, mctx, a'
735 (wctx, mctx, _(' and ').join(str(anc) for anc in ancestors)))
785 (wctx, mctx, _(' and ').join(str(anc) for anc in ancestors)))
736
786
737 # Call for bids
787 # Call for bids
738 fbids = {} # mapping filename to list of action bids
788 fbids = {} # mapping filename to bids (action method to list of actions)
739 for ancestor in ancestors:
789 for ancestor in ancestors:
740 repo.ui.note(_('\ncalculating bids for ancestor %s\n') % ancestor)
790 repo.ui.note(_('\ncalculating bids for ancestor %s\n') % ancestor)
741 actions = manifestmerge(repo, wctx, mctx, ancestor,
791 actions = manifestmerge(repo, wctx, mctx, ancestor,
742 branchmerge, force,
792 branchmerge, force,
743 partial, acceptremote, followcopies)
793 partial, acceptremote, followcopies)
744 for a in sorted(actions):
794 for m, l in sorted(actions.items()):
745 repo.ui.debug(' %s: %s\n' % (a[0], a[1]))
795 for a in l:
746 f = a[0]
796 f, args, msg = a
747 if f in fbids:
797 repo.ui.debug(' %s: %s -> %s\n' % (f, msg, m))
748 fbids[f].append(a)
798 if f in fbids:
749 else:
799 d = fbids[f]
750 fbids[f] = [a]
800 if m in d:
801 d[m].append(a)
802 else:
803 d[m] = [a]
804 else:
805 fbids[f] = {m: [a]}
751
806
752 # Pick the best bid for each file
807 # Pick the best bid for each file
753 repo.ui.note(_('\nauction for merging merge bids\n'))
808 repo.ui.note(_('\nauction for merging merge bids\n'))
754 actions = []
809 actions = dict((m, []) for m in actions.keys())
755 for f, bidsl in sorted(fbids.items()):
810 for f, bids in sorted(fbids.items()):
811 # bids is a mapping from action method to list af actions
756 # Consensus?
812 # Consensus?
757 a0 = bidsl[0]
813 if len(bids) == 1: # all bids are the same kind of method
758 if util.all(a == a0 for a in bidsl[1:]): # len(bidsl) is > 1
814 m, l = bids.items()[0]
759 repo.ui.note(" %s: consensus for %s\n" % (f, a0[1]))
815 if util.all(a == l[0] for a in l[1:]): # len(bids) is > 1
760 actions.append(a0)
816 repo.ui.note(" %s: consensus for %s\n" % (f, m))
761 continue
817 actions[m].append(l[0])
762 # Group bids by kind of action
818 continue
763 bids = {}
764 for a in bidsl:
765 m = a[1]
766 if m in bids:
767 bids[m].append(a)
768 else:
769 bids[m] = [a]
770 # If keep is an option, just do it.
819 # If keep is an option, just do it.
771 if "k" in bids:
820 if "k" in bids:
772 repo.ui.note(" %s: picking 'keep' action\n" % f)
821 repo.ui.note(" %s: picking 'keep' action\n" % f)
773 actions.append(bids["k"][0])
822 actions['k'].append(bids["k"][0])
774 continue
823 continue
775 # If all gets agree [how could they not?], just do it.
824 # If there are gets and they all agree [how could they not?], do it.
776 if "g" in bids:
825 if "g" in bids:
777 ga0 = bids["g"][0]
826 ga0 = bids["g"][0]
778 if util.all(a == ga0 for a in bids["g"][1:]):
827 if util.all(a == ga0 for a in bids["g"][1:]):
779 repo.ui.note(" %s: picking 'get' action\n" % f)
828 repo.ui.note(" %s: picking 'get' action\n" % f)
780 actions.append(ga0)
829 actions['g'].append(ga0)
781 continue
830 continue
782 # TODO: Consider other simple actions such as mode changes
831 # TODO: Consider other simple actions such as mode changes
783 # Handle inefficient democrazy.
832 # Handle inefficient democrazy.
784 repo.ui.note(_(' %s: multiple bids for merge action:\n') % f)
833 repo.ui.note(_(' %s: multiple bids for merge action:\n') % f)
785 for _f, m, args, msg in bidsl:
834 for m, l in sorted(bids.items()):
786 repo.ui.note(' %s -> %s\n' % (msg, m))
835 for _f, args, msg in l:
836 repo.ui.note(' %s -> %s\n' % (msg, m))
787 # Pick random action. TODO: Instead, prompt user when resolving
837 # Pick random action. TODO: Instead, prompt user when resolving
788 a0 = bidsl[0]
838 m, l = bids.items()[0]
789 repo.ui.warn(_(' %s: ambiguous merge - picked %s action\n') %
839 repo.ui.warn(_(' %s: ambiguous merge - picked %s action\n') %
790 (f, a0[1]))
840 (f, m))
791 actions.append(a0)
841 actions[m].append(l[0])
792 continue
842 continue
793 repo.ui.note(_('end of auction\n\n'))
843 repo.ui.note(_('end of auction\n\n'))
794
844
795 # Filter out prompts.
845 # Prompt and create actions. TODO: Move this towards resolve phase.
796 newactions, prompts = [], []
846 for f, args, msg in actions['cd']:
797 for a in actions:
847 if repo.ui.promptchoice(
798 if a[1] in ("cd", "dc"):
848 _("local changed %s which remote deleted\n"
799 prompts.append(a)
849 "use (c)hanged version or (d)elete?"
850 "$$ &Changed $$ &Delete") % f, 0):
851 actions['r'].append((f, None, "prompt delete"))
800 else:
852 else:
801 newactions.append(a)
853 actions['a'].append((f, None, "prompt keep"))
802 # Prompt and create actions. TODO: Move this towards resolve phase.
854 del actions['cd'][:]
803 for f, m, args, msg in sorted(prompts):
855
804 if m == "cd":
856 for f, args, msg in actions['dc']:
805 if repo.ui.promptchoice(
857 flags, = args
806 _("local changed %s which remote deleted\n"
858 if repo.ui.promptchoice(
807 "use (c)hanged version or (d)elete?"
859 _("remote changed %s which local deleted\n"
808 "$$ &Changed $$ &Delete") % f, 0):
860 "use (c)hanged version or leave (d)eleted?"
809 newactions.append((f, "r", None, "prompt delete"))
861 "$$ &Changed $$ &Deleted") % f, 0) == 0:
810 else:
862 actions['g'].append((f, (flags,), "prompt recreating"))
811 newactions.append((f, "a", None, "prompt keep"))
863 del actions['dc'][:]
812 elif m == "dc":
813 flags, = args
814 if repo.ui.promptchoice(
815 _("remote changed %s which local deleted\n"
816 "use (c)hanged version or leave (d)eleted?"
817 "$$ &Changed $$ &Deleted") % f, 0) == 0:
818 newactions.append((f, "g", (flags,), "prompt recreating"))
819 else: assert False, m
820
864
821 if wctx.rev() is None:
865 if wctx.rev() is None:
822 newactions += _forgetremoved(wctx, mctx, branchmerge)
866 ractions, factions = _forgetremoved(wctx, mctx, branchmerge)
867 actions['r'].extend(ractions)
868 actions['f'].extend(factions)
823
869
824 return newactions
870 return actions
825
871
826 def recordupdates(repo, actions, branchmerge):
872 def recordupdates(repo, actions, branchmerge):
827 "record merge actions to the dirstate"
873 "record merge actions to the dirstate"
828
874 # remove (must come first)
829 for a in actions:
875 for f, args, msg in actions['r']:
830 f, m, args, msg = a
876 if branchmerge:
831 if m == "r": # remove
877 repo.dirstate.remove(f)
832 if branchmerge:
878 else:
833 repo.dirstate.remove(f)
834 else:
835 repo.dirstate.drop(f)
836 elif m == "a": # re-add
837 if not branchmerge:
838 repo.dirstate.add(f)
839 elif m == "f": # forget
840 repo.dirstate.drop(f)
879 repo.dirstate.drop(f)
841 elif m == "e": # exec change
880
842 repo.dirstate.normallookup(f)
881 # forget (must come first)
843 elif m == "k": # keep
882 for f, args, msg in actions['f']:
844 pass
883 repo.dirstate.drop(f)
845 elif m == "g": # get
884
846 if branchmerge:
885 # re-add
847 repo.dirstate.otherparent(f)
886 for f, args, msg in actions['a']:
848 else:
887 if not branchmerge:
849 repo.dirstate.normal(f)
888 repo.dirstate.add(f)
850 elif m == "m": # merge
889
851 f1, f2, fa, move, anc = args
890 # exec change
852 if branchmerge:
891 for f, args, msg in actions['e']:
853 # We've done a branch merge, mark this file as merged
892 repo.dirstate.normallookup(f)
854 # so that we properly record the merger later
893
855 repo.dirstate.merge(f)
894 # keep
856 if f1 != f2: # copy/rename
895 for f, args, msg in actions['k']:
857 if move:
896 pass
858 repo.dirstate.remove(f1)
897
859 if f1 != f:
898 # get
860 repo.dirstate.copy(f1, f)
899 for f, args, msg in actions['g']:
861 else:
900 if branchmerge:
862 repo.dirstate.copy(f2, f)
901 repo.dirstate.otherparent(f)
863 else:
902 else:
864 # We've update-merged a locally modified file, so
903 repo.dirstate.normal(f)
865 # we set the dirstate to emulate a normal checkout
904
866 # of that file some time in the past. Thus our
905 # merge
867 # merge will appear as a normal local file
906 for f, args, msg in actions['m']:
868 # modification.
907 f1, f2, fa, move, anc = args
869 if f2 == f: # file not locally copied/moved
908 if branchmerge:
870 repo.dirstate.normallookup(f)
909 # We've done a branch merge, mark this file as merged
910 # so that we properly record the merger later
911 repo.dirstate.merge(f)
912 if f1 != f2: # copy/rename
871 if move:
913 if move:
872 repo.dirstate.drop(f1)
914 repo.dirstate.remove(f1)
873 elif m == "dm": # directory rename, move local
915 if f1 != f:
874 f0, flag = args
916 repo.dirstate.copy(f1, f)
875 if f0 not in repo.dirstate:
917 else:
876 # untracked file moved
918 repo.dirstate.copy(f2, f)
877 continue
919 else:
878 if branchmerge:
920 # We've update-merged a locally modified file, so
879 repo.dirstate.add(f)
921 # we set the dirstate to emulate a normal checkout
880 repo.dirstate.remove(f0)
922 # of that file some time in the past. Thus our
881 repo.dirstate.copy(f0, f)
923 # merge will appear as a normal local file
882 else:
924 # modification.
883 repo.dirstate.normal(f)
925 if f2 == f: # file not locally copied/moved
884 repo.dirstate.drop(f0)
926 repo.dirstate.normallookup(f)
885 elif m == "dg": # directory rename, get
927 if move:
886 f0, flag = args
928 repo.dirstate.drop(f1)
887 if branchmerge:
929
888 repo.dirstate.add(f)
930 # directory rename, move local
889 repo.dirstate.copy(f0, f)
931 for f, args, msg in actions['dm']:
890 else:
932 f0, flag = args
891 repo.dirstate.normal(f)
933 if f0 not in repo.dirstate:
934 # untracked file moved
935 continue
936 if branchmerge:
937 repo.dirstate.add(f)
938 repo.dirstate.remove(f0)
939 repo.dirstate.copy(f0, f)
940 else:
941 repo.dirstate.normal(f)
942 repo.dirstate.drop(f0)
943
944 # directory rename, get
945 for f, args, msg in actions['dg']:
946 f0, flag = args
947 if branchmerge:
948 repo.dirstate.add(f)
949 repo.dirstate.copy(f0, f)
950 else:
951 repo.dirstate.normal(f)
892
952
893 def update(repo, node, branchmerge, force, partial, ancestor=None,
953 def update(repo, node, branchmerge, force, partial, ancestor=None,
894 mergeancestor=False):
954 mergeancestor=False, labels=None):
895 """
955 """
896 Perform a merge between the working directory and the given node
956 Perform a merge between the working directory and the given node
897
957
@@ -1071,7 +1131,7 b' def update(repo, node, branchmerge, forc'
1071 # note that we're in the middle of an update
1131 # note that we're in the middle of an update
1072 repo.vfs.write('updatestate', p2.hex())
1132 repo.vfs.write('updatestate', p2.hex())
1073
1133
1074 stats = applyupdates(repo, actions, wc, p2, overwrite)
1134 stats = applyupdates(repo, actions, wc, p2, overwrite, labels=labels)
1075
1135
1076 if not partial:
1136 if not partial:
1077 repo.setparents(fp1, fp2)
1137 repo.setparents(fp1, fp2)
@@ -56,7 +56,7 b' def replace(text, substs):'
56 # on strings in local encoding causes invalid byte sequences.
56 # on strings in local encoding causes invalid byte sequences.
57 utext = text.decode(encoding.encoding)
57 utext = text.decode(encoding.encoding)
58 for f, t in substs:
58 for f, t in substs:
59 utext = utext.replace(f, t)
59 utext = utext.replace(f.decode("ascii"), t.decode("ascii"))
60 return utext.encode(encoding.encoding)
60 return utext.encode(encoding.encoding)
61
61
62 _blockre = re.compile(r"\n(?:\s*\n)+")
62 _blockre = re.compile(r"\n(?:\s*\n)+")
@@ -153,6 +153,122 b' quit:'
153 return NULL;
153 return NULL;
154 }
154 }
155
155
156 static inline dirstateTupleObject *make_dirstate_tuple(char state, int mode,
157 int size, int mtime)
158 {
159 dirstateTupleObject *t = PyObject_New(dirstateTupleObject,
160 &dirstateTupleType);
161 if (!t)
162 return NULL;
163 t->state = state;
164 t->mode = mode;
165 t->size = size;
166 t->mtime = mtime;
167 return t;
168 }
169
170 static PyObject *dirstate_tuple_new(PyTypeObject *subtype, PyObject *args,
171 PyObject *kwds)
172 {
173 /* We do all the initialization here and not a tp_init function because
174 * dirstate_tuple is immutable. */
175 dirstateTupleObject *t;
176 char state;
177 int size, mode, mtime;
178 if (!PyArg_ParseTuple(args, "ciii", &state, &mode, &size, &mtime))
179 return NULL;
180
181 t = (dirstateTupleObject *)subtype->tp_alloc(subtype, 1);
182 if (!t)
183 return NULL;
184 t->state = state;
185 t->mode = mode;
186 t->size = size;
187 t->mtime = mtime;
188
189 return (PyObject *)t;
190 }
191
192 static void dirstate_tuple_dealloc(PyObject *o)
193 {
194 PyObject_Del(o);
195 }
196
197 static Py_ssize_t dirstate_tuple_length(PyObject *o)
198 {
199 return 4;
200 }
201
202 static PyObject *dirstate_tuple_item(PyObject *o, Py_ssize_t i)
203 {
204 dirstateTupleObject *t = (dirstateTupleObject *)o;
205 switch (i) {
206 case 0:
207 return PyBytes_FromStringAndSize(&t->state, 1);
208 case 1:
209 return PyInt_FromLong(t->mode);
210 case 2:
211 return PyInt_FromLong(t->size);
212 case 3:
213 return PyInt_FromLong(t->mtime);
214 default:
215 PyErr_SetString(PyExc_IndexError, "index out of range");
216 return NULL;
217 }
218 }
219
220 static PySequenceMethods dirstate_tuple_sq = {
221 dirstate_tuple_length, /* sq_length */
222 0, /* sq_concat */
223 0, /* sq_repeat */
224 dirstate_tuple_item, /* sq_item */
225 0, /* sq_ass_item */
226 0, /* sq_contains */
227 0, /* sq_inplace_concat */
228 0 /* sq_inplace_repeat */
229 };
230
231 PyTypeObject dirstateTupleType = {
232 PyVarObject_HEAD_INIT(NULL, 0)
233 "dirstate_tuple", /* tp_name */
234 sizeof(dirstateTupleObject),/* tp_basicsize */
235 0, /* tp_itemsize */
236 (destructor)dirstate_tuple_dealloc, /* tp_dealloc */
237 0, /* tp_print */
238 0, /* tp_getattr */
239 0, /* tp_setattr */
240 0, /* tp_compare */
241 0, /* tp_repr */
242 0, /* tp_as_number */
243 &dirstate_tuple_sq, /* tp_as_sequence */
244 0, /* tp_as_mapping */
245 0, /* tp_hash */
246 0, /* tp_call */
247 0, /* tp_str */
248 0, /* tp_getattro */
249 0, /* tp_setattro */
250 0, /* tp_as_buffer */
251 Py_TPFLAGS_DEFAULT, /* tp_flags */
252 "dirstate tuple", /* tp_doc */
253 0, /* tp_traverse */
254 0, /* tp_clear */
255 0, /* tp_richcompare */
256 0, /* tp_weaklistoffset */
257 0, /* tp_iter */
258 0, /* tp_iternext */
259 0, /* tp_methods */
260 0, /* tp_members */
261 0, /* tp_getset */
262 0, /* tp_base */
263 0, /* tp_dict */
264 0, /* tp_descr_get */
265 0, /* tp_descr_set */
266 0, /* tp_dictoffset */
267 0, /* tp_init */
268 0, /* tp_alloc */
269 dirstate_tuple_new, /* tp_new */
270 };
271
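The new C type behaves like an immutable 4-item sequence `(state, mode, size, mtime)`, exposed to Python as `parsers.dirstatetuple` and cheaper than a generic tuple because it stores the four fields unboxed. A pure-Python stand-in with the same sequence protocol (a sketch of the behavior, not the C implementation):

```python
class DirstateTuple:
    """Immutable (state, mode, size, mtime) record, indexable like a tuple."""
    __slots__ = ("_v",)

    def __init__(self, state, mode, size, mtime):
        self._v = (state, mode, size, mtime)

    def __len__(self):
        return 4  # fixed arity, matching dirstate_tuple_length

    def __getitem__(self, i):
        return self._v[i]  # IndexError for out-of-range, as in the C type
```

Implementing only `sq_length` and `sq_item` (here `__len__`/`__getitem__`) is enough for the unpacking and indexing patterns the dirstate code uses.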
156 static PyObject *parse_dirstate(PyObject *self, PyObject *args)
272 static PyObject *parse_dirstate(PyObject *self, PyObject *args)
157 {
273 {
158 PyObject *dmap, *cmap, *parents = NULL, *ret = NULL;
274 PyObject *dmap, *cmap, *parents = NULL, *ret = NULL;
@@ -192,11 +308,8 b' static PyObject *parse_dirstate(PyObject'
192 goto quit;
308 goto quit;
193 }
309 }
194
310
195 entry = Py_BuildValue("ciii", state, mode, size, mtime);
311 entry = (PyObject *)make_dirstate_tuple(state, mode, size,
196 if (!entry)
312 mtime);
197 goto quit;
198 PyObject_GC_UnTrack(entry); /* don't waste time with this */
199
200 cpos = memchr(cur, 0, flen);
313 cpos = memchr(cur, 0, flen);
201 if (cpos) {
314 if (cpos) {
202 fname = PyBytes_FromStringAndSize(cur, cpos - cur);
315 fname = PyBytes_FromStringAndSize(cur, cpos - cur);
@@ -229,39 +342,13 b' quit:'
229 return ret;
342 return ret;
230 }
343 }
231
344
232 static inline int getintat(PyObject *tuple, int off, uint32_t *v)
233 {
234 PyObject *o = PyTuple_GET_ITEM(tuple, off);
235 long val;
236
237 if (PyInt_Check(o))
238 val = PyInt_AS_LONG(o);
239 else if (PyLong_Check(o)) {
240 val = PyLong_AsLong(o);
241 if (val == -1 && PyErr_Occurred())
242 return -1;
243 } else {
244 PyErr_SetString(PyExc_TypeError, "expected an int or long");
245 return -1;
246 }
247 if (LONG_MAX > INT_MAX && (val > INT_MAX || val < INT_MIN)) {
248 PyErr_SetString(PyExc_OverflowError,
249 "Python value too large to convert to uint32_t");
250 return -1;
251 }
252 *v = (uint32_t)val;
253 return 0;
254 }
255
256 static PyObject *dirstate_unset;
257
258 /*
345 /*
259 * Efficiently pack a dirstate object into its on-disk format.
346 * Efficiently pack a dirstate object into its on-disk format.
260 */
347 */
261 static PyObject *pack_dirstate(PyObject *self, PyObject *args)
348 static PyObject *pack_dirstate(PyObject *self, PyObject *args)
262 {
349 {
263 PyObject *packobj = NULL;
350 PyObject *packobj = NULL;
264 PyObject *map, *copymap, *pl;
351 PyObject *map, *copymap, *pl, *mtime_unset = NULL;
265 Py_ssize_t nbytes, pos, l;
352 Py_ssize_t nbytes, pos, l;
266 PyObject *k, *v, *pn;
353 PyObject *k, *v, *pn;
267 char *p, *s;
354 char *p, *s;
@@ -318,34 +405,38 b' static PyObject *pack_dirstate(PyObject '
318 p += 20;
405 p += 20;
319
406
320 for (pos = 0; PyDict_Next(map, &pos, &k, &v); ) {
407 for (pos = 0; PyDict_Next(map, &pos, &k, &v); ) {
408 dirstateTupleObject *tuple;
409 char state;
321 uint32_t mode, size, mtime;
410 uint32_t mode, size, mtime;
322 Py_ssize_t len, l;
411 Py_ssize_t len, l;
323 PyObject *o;
412 PyObject *o;
324 char *s, *t;
413 char *t;
325
414
326 if (!PyTuple_Check(v) || PyTuple_GET_SIZE(v) != 4) {
415 if (!dirstate_tuple_check(v)) {
327 PyErr_SetString(PyExc_TypeError, "expected a 4-tuple");
416 PyErr_SetString(PyExc_TypeError,
328 goto bail;
417 "expected a dirstate tuple");
329 }
330 o = PyTuple_GET_ITEM(v, 0);
331 if (PyString_AsStringAndSize(o, &s, &l) == -1 || l != 1) {
332 PyErr_SetString(PyExc_TypeError, "expected one byte");
333 goto bail;
418 goto bail;
334 }
419 }
335 *p++ = *s;
420 tuple = (dirstateTupleObject *)v;
336 if (getintat(v, 1, &mode) == -1)
421
337 goto bail;
422 state = tuple->state;
338 if (getintat(v, 2, &size) == -1)
423 mode = tuple->mode;
339 goto bail;
424 size = tuple->size;
340 if (getintat(v, 3, &mtime) == -1)
425 mtime = tuple->mtime;
341 goto bail;
426 if (state == 'n' && mtime == (uint32_t)now) {
342 if (*s == 'n' && mtime == (uint32_t)now) {
343 /* See pure/parsers.py:pack_dirstate for why we do
427 /* See pure/parsers.py:pack_dirstate for why we do
344 * this. */
428 * this. */
345 if (PyDict_SetItem(map, k, dirstate_unset) == -1)
429 mtime = -1;
430 mtime_unset = (PyObject *)make_dirstate_tuple(
431 state, mode, size, mtime);
432 if (!mtime_unset)
346 goto bail;
433 goto bail;
347 mtime = -1;
434 if (PyDict_SetItem(map, k, mtime_unset) == -1)
435 goto bail;
436 Py_DECREF(mtime_unset);
437 mtime_unset = NULL;
348 }
438 }
439 *p++ = state;
349 putbe32(mode, p);
440 putbe32(mode, p);
350 putbe32(size, p + 4);
441 putbe32(size, p + 4);
351 putbe32(mtime, p + 8);
442 putbe32(mtime, p + 8);
@@ -374,6 +465,7 b' static PyObject *pack_dirstate(PyObject '
374
465
375 return packobj;
466 return packobj;
376 bail:
467 bail:
468 Py_XDECREF(mtime_unset);
377 Py_XDECREF(packobj);
469 Py_XDECREF(packobj);
378 return NULL;
470 return NULL;
379 }
471 }
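`pack_dirstate` rewrites any clean entry whose mtime equals the write time to mtime `-1`: a file modified again within the same second would be indistinguishable from the recorded state, so the entry is forced back to "recheck contents next time". A sketch of that guard in Python, mirroring the logic that `pure/parsers.py` documents (signature simplified for illustration):

```python
def pack_entry(state, mode, size, mtime, now):
    """Return the (state, mode, size, mtime) tuple actually written to disk."""
    # A clean file whose mtime equals "now" could still change within this
    # same second without the mtime moving, so mark the mtime unset (-1)
    # and force a content comparison on the next status check.
    if state == "n" and mtime == now:
        mtime = -1
    return (state, mode, size, mtime)
```

In the C hunk this is the `mtime_unset` dance: a replacement dirstate tuple is built with mtime `-1` and stored back into the in-memory map as well, keeping memory and disk consistent.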
@@ -2016,18 +2108,19 @@ static void module_init(PyObject *mod)
 	dirs_module_init(mod);
 
 	indexType.tp_new = PyType_GenericNew;
-	if (PyType_Ready(&indexType) < 0)
+	if (PyType_Ready(&indexType) < 0 ||
+	    PyType_Ready(&dirstateTupleType) < 0)
 		return;
 	Py_INCREF(&indexType);
-
 	PyModule_AddObject(mod, "index", (PyObject *)&indexType);
+	Py_INCREF(&dirstateTupleType);
+	PyModule_AddObject(mod, "dirstatetuple",
+			   (PyObject *)&dirstateTupleType);
 
 	nullentry = Py_BuildValue("iiiiiiis#", 0, 0, 0,
				  -1, -1, -1, -1, nullid, 20);
 	if (nullentry)
 		PyObject_GC_UnTrack(nullentry);
-
-	dirstate_unset = Py_BuildValue("ciii", 'n', 0, -1, -1);
 }
 
 static int check_python_version(void)
@@ -417,12 +417,12 @@ class fsbackend(abstractbackend):
         return os.path.join(self.opener.base, f)
 
     def getfile(self, fname):
-        path = self._join(fname)
-        if os.path.islink(path):
-            return (os.readlink(path), (True, False))
+        if self.opener.islink(fname):
+            return (self.opener.readlink(fname), (True, False))
 
         isexec = False
         try:
-            isexec = os.lstat(path).st_mode & 0100 != 0
+            isexec = self.opener.lstat(fname).st_mode & 0100 != 0
         except OSError, e:
             if e.errno != errno.ENOENT:
                 raise
@@ -431,17 +431,17 @@ class fsbackend(abstractbackend):
     def setfile(self, fname, data, mode, copysource):
         islink, isexec = mode
         if data is None:
-            util.setflags(self._join(fname), islink, isexec)
+            self.opener.setflags(fname, islink, isexec)
             return
         if islink:
             self.opener.symlink(data, fname)
         else:
             self.opener.write(fname, data)
         if isexec:
-            util.setflags(self._join(fname), False, True)
+            self.opener.setflags(fname, False, True)
 
     def unlink(self, fname):
-        util.unlinkpath(self._join(fname), ignoremissing=True)
+        self.opener.unlinkpath(fname, ignoremissing=True)
 
     def writerej(self, fname, failed, total, lines):
         fname = fname + ".rej"
@@ -453,7 +453,7 @@ class fsbackend(abstractbackend):
         fp.close()
 
     def exists(self, fname):
-        return os.path.lexists(self._join(fname))
+        return self.opener.lexists(fname)
 
 class workingbackend(fsbackend):
     def __init__(self, ui, repo, similarity):
@@ -1521,14 +1521,11 @@ def patch(ui, repo, patchname, strip=1,
     patcher = ui.config('ui', 'patch')
     if files is None:
         files = set()
-    try:
-        if patcher:
-            return _externalpatch(ui, repo, patcher, patchname, strip,
-                                  files, similarity)
-        return internalpatch(ui, repo, patchname, strip, files, eolmode,
-                             similarity)
-    except PatchError, err:
-        raise util.Abort(str(err))
+    if patcher:
+        return _externalpatch(ui, repo, patcher, patchname, strip,
+                              files, similarity)
+    return internalpatch(ui, repo, patchname, strip, files, eolmode,
+                         similarity)
 
 def changedfiles(ui, repo, patchpath, strip=1):
     backend = fsbackend(ui, repo.root)
@@ -1564,6 +1561,7 @@ def diffopts(ui, opts=None, untrusted=Fa
         text=opts and opts.get('text'),
         git=get('git'),
         nodates=get('nodates'),
+        nobinary=get('nobinary'),
         showfunc=get('show_function', 'showfunc'),
         ignorews=get('ignore_all_space', 'ignorews'),
         ignorewsamount=get('ignore_space_change', 'ignorewsamount'),
@@ -1624,7 +1622,7 @@ def diff(repo, node1=None, node2=None, m
 
     revs = None
     hexfunc = repo.ui.debugflag and hex or short
-    revs = [hexfunc(node) for node in [node1, node2] if node]
+    revs = [hexfunc(node) for node in [ctx1.node(), ctx2.node()] if node]
 
     copy = {}
     if opts.git or opts.upgrade:
@@ -1818,7 +1816,7 @@ def trydiff(repo, revs, ctx1, ctx2, modi
         if dodiff:
             if opts.git or revs:
                 header.insert(0, diffline(join(a), join(b), revs))
-            if dodiff == 'binary':
+            if dodiff == 'binary' and not opts.nobinary:
                 text = mdiff.b85diff(to, tn)
                 if text:
                     addindexmeta(header, [gitindex(to), gitindex(tn)])
@@ -15,6 +15,12 @@ import struct, zlib, cStringIO
 _decompress = zlib.decompress
 _sha = util.sha1
 
+# Some code below makes tuples directly because it's more convenient. However,
+# code outside this module should always use dirstatetuple.
+def dirstatetuple(*x):
+    # x is a tuple
+    return x
+
 def parse_manifest(mfdict, fdict, lines):
     for l in lines.splitlines():
         f, n = l.split('\0')
@@ -104,7 +110,7 @@ def pack_dirstate(dmap, copymap, pl, now
             # dirstate, forcing future 'status' calls to compare the
             # contents of the file if the size is the same. This prevents
             # mistakenly treating such files as clean.
-            e = (e[0], e[1], e[2], -1)
+            e = dirstatetuple(e[0], e[1], e[2], -1)
             dmap[f] = e
 
         if f in copymap:
@@ -5,7 +5,7 @@
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.
 
-import bookmarks, phases, obsolete
+import bookmarks, phases, obsolete, encoding
 
 def _nslist(repo):
     n = {}
@@ -37,3 +37,18 @@ def list(repo, namespace):
     lk = _get(namespace)[1]
     return lk(repo)
 
+encode = encoding.fromlocal
+
+decode = encoding.tolocal
+
+def encodekeys(keys):
+    """encode the content of a pushkey namespace for exchange over the wire"""
+    return '\n'.join(['%s\t%s' % (encode(k), encode(v)) for k, v in keys])
+
+def decodekeys(data):
+    """decode the content of a pushkey namespace from exchange over the wire"""
+    result = {}
+    for l in data.splitlines():
+        k, v = l.split('\t')
+        result[decode(k)] = decode(v)
+    return result
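A standalone sketch of the wire format introduced above: one key/value pair per line, tab-separated. Identity functions stand in for `encoding.fromlocal`/`encoding.tolocal` so the sketch is self-contained:

```python
def encodekeys(keys):
    # one "key\tvalue" pair per line, as in the pushkey wire protocol above
    return '\n'.join('%s\t%s' % (k, v) for k, v in keys)

def decodekeys(data):
    # inverse: split lines, then split each line on the first tab
    result = {}
    for l in data.splitlines():
        k, v = l.split('\t')
        result[k] = v
    return result
```

The real functions additionally convert between the local and UTF-8 encodings on each key and value.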
@@ -5,7 +5,7 @@
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.
 
-import os, builtins
+import builtins
 
 from numbers import Number
 
@@ -52,13 +52,6 @@ def bytesformatter(format, args):
     return ret.encode('utf-8', 'surrogateescape')
 builtins.bytesformatter = bytesformatter
 
-# Create bytes equivalents for os.environ values
-for key in list(os.environ.keys()):
-    # UTF-8 is fine for us
-    bkey = key.encode('utf-8', 'surrogateescape')
-    bvalue = os.environ[key].encode('utf-8', 'surrogateescape')
-    os.environ[bkey] = bvalue
-
 origord = builtins.ord
 def fakeord(char):
     if isinstance(char, int):
@@ -399,6 +399,9 @@ def only(repo, subset, x):
     args = getargs(x, 1, 2, _('only takes one or two arguments'))
     include = getset(repo, spanset(repo), args[0]).set()
     if len(args) == 1:
+        if len(include) == 0:
+            return baseset([])
+
         descendants = set(_revdescendants(repo, include, False))
         exclude = [rev for rev in cl.headrevs()
                    if not rev in descendants and not rev in include]
@@ -1102,16 +1105,6 @@ def minrev(repo, subset, x):
         return baseset([m])
     return baseset([])
 
-def _missingancestors(repo, subset, x):
-    # i18n: "_missingancestors" is a keyword
-    revs, bases = getargs(x, 2, 2,
-                          _("_missingancestors requires two arguments"))
-    rs = baseset(repo)
-    revs = getset(repo, rs, revs)
-    bases = getset(repo, rs, bases)
-    missing = set(repo.changelog.findmissingrevs(bases, revs))
-    return baseset([r for r in subset if r in missing])
-
 def modifies(repo, subset, x):
     """``modifies(pattern)``
     Changesets modifying files matched by pattern.
@@ -1724,7 +1717,6 @@ symbols = {
     "max": maxrev,
     "merge": merge,
     "min": minrev,
-    "_missingancestors": _missingancestors,
     "modifies": modifies,
     "obsolete": obsolete,
     "origin": origin,
@@ -1796,7 +1788,6 @@ safesymbols = set([
     "max",
     "merge",
     "min",
-    "_missingancestors",
     "modifies",
     "obsolete",
     "origin",
@@ -1867,7 +1858,7 @@ def optimize(x, small):
         wb, tb = optimize(x[2], True)
 
         # (::x and not ::y)/(not ::y and ::x) have a fast path
-        def ismissingancestors(revs, bases):
+        def isonly(revs, bases):
             return (
                 revs[0] == 'func'
                 and getstring(revs[1], _('not a symbol')) == 'ancestors'
@@ -1876,12 +1867,10 @@ def optimize(x, small):
                 and getstring(bases[1][1], _('not a symbol')) == 'ancestors')
 
         w = min(wa, wb)
-        if ismissingancestors(ta, tb):
-            return w, ('func', ('symbol', '_missingancestors'),
-                       ('list', ta[2], tb[1][2]))
-        if ismissingancestors(tb, ta):
-            return w, ('func', ('symbol', '_missingancestors'),
-                       ('list', tb[2], ta[1][2]))
+        if isonly(ta, tb):
+            return w, ('func', ('symbol', 'only'), ('list', ta[2], tb[1][2]))
+        if isonly(tb, ta):
+            return w, ('func', ('symbol', 'only'), ('list', tb[2], ta[1][2]))
 
         if wa > wb:
             return w, (op, tb, ta)
@@ -2243,11 +2232,7 @@ class baseset(list):
         """Returns a new object with the substraction of the two collections.
 
         This is part of the mandatory API for smartset."""
-        if isinstance(other, baseset):
-            s = other.set()
-        else:
-            s = set(other)
-        return baseset(self.set() - s)
+        return self.filter(lambda x: x not in other)
 
     def __and__(self, other):
         """Returns a new object with the intersection of the two collections.
@@ -2764,10 +2749,6 @@ class spanset(_orderedsetmixin):
         if self._start < self._end:
             self.reverse()
 
-    def _contained(self, rev):
-        return (rev <= self._start and rev > self._end) or (rev >= self._start
-            and rev < self._end)
-
     def __iter__(self):
         if self._start <= self._end:
             iterrange = xrange(self._start, self._end)
@@ -2825,7 +2806,7 @@ class spanset(_orderedsetmixin):
         start = self._start
         end = self._end
         for rev in self._hiddenrevs:
-            if (end < rev <= start) or (start <= rev and rev < end):
+            if (end < rev <= start) or (start <= rev < end):
                 count += 1
         return abs(self._end - self._start) - count
@@ -178,9 +178,15 @@ class abstractvfs(object):
     def islink(self, path=None):
         return os.path.islink(self.join(path))
 
+    def lexists(self, path=None):
+        return os.path.lexists(self.join(path))
+
     def lstat(self, path=None):
         return os.lstat(self.join(path))
 
+    def listdir(self, path=None):
+        return os.listdir(self.join(path))
+
     def makedir(self, path=None, notindexed=True):
         return util.makedir(self.join(path), notindexed)
 
@@ -223,6 +229,9 @@ class abstractvfs(object):
     def unlink(self, path=None):
         return util.unlink(self.join(path))
 
+    def unlinkpath(self, path=None, ignoremissing=False):
+        return util.unlinkpath(self.join(path), ignoremissing)
+
     def utime(self, path=None, t=None):
         return os.utime(self.join(path), t)
 
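The vfs methods added above all follow one pattern: resolve the relative name against the vfs base with `join()`, then delegate to the matching `os` helper. A minimal standalone sketch of that pattern; the class name is illustrative, not Mercurial's:

```python
import os

class MiniVFS(object):
    """Toy filesystem wrapper: every method resolves paths via join()."""
    def __init__(self, base):
        self.base = base

    def join(self, path):
        # path=None means "the base directory itself", as in abstractvfs
        return os.path.join(self.base, path) if path else self.base

    def lexists(self, path=None):
        return os.path.lexists(self.join(path))

    def listdir(self, path=None):
        return os.listdir(self.join(path))
```

Centralizing the join step is what lets callers like the patch backend pass repo-relative names everywhere.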
@@ -416,11 +416,11 @@ def simplemerge(ui, local, base, other,
     name_a = local
     name_b = other
     labels = opts.get('label', [])
-    if labels:
-        name_a = labels.pop(0)
-    if labels:
-        name_b = labels.pop(0)
-    if labels:
+    if len(labels) > 0:
+        name_a = labels[0]
+    if len(labels) > 1:
+        name_b = labels[1]
+    if len(labels) > 2:
         raise util.Abort(_("can only specify two labels."))
 
     try:
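The change above replaces destructive `pop(0)` calls with plain indexing, so the caller's label list is no longer mutated. The behaviour can be sketched independently (`pick_labels` is an illustrative name, and ValueError stands in for util.Abort):

```python
def pick_labels(labels, default_a, default_b):
    # up to two labels override the defaults; a third is an error
    if len(labels) > 2:
        raise ValueError("can only specify two labels")
    name_a = labels[0] if len(labels) > 0 else default_a
    name_b = labels[1] if len(labels) > 1 else default_b
    return name_a, name_b
```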
@@ -35,8 +35,10 @@ def _calcfilehash(filename):
     data = ''
     if os.path.exists(filename):
         fd = open(filename, 'rb')
-        data = fd.read()
-        fd.close()
+        try:
+            data = fd.read()
+        finally:
+            fd.close()
     return util.sha1(data).hexdigest()
 
 class SubrepoAbort(error.Abort):
@@ -205,12 +207,13 @@ def submerge(repo, wctx, mctx, actx, ove
                 sm[s] = r
             else:
                 debug(s, "both sides changed")
+                srepo = wctx.sub(s)
                 option = repo.ui.promptchoice(
                     _(' subrepository %s diverged (local revision: %s, '
                       'remote revision: %s)\n'
                       '(M)erge, keep (l)ocal or keep (r)emote?'
                       '$$ &Merge $$ &Local $$ &Remote')
-                    % (s, l[1][:12], r[1][:12]), 0)
+                    % (s, srepo.shortid(l[1]), srepo.shortid(r[1])), 0)
                 if option == 0:
                     wctx.sub(s).merge(r)
                     sm[s] = l
@@ -502,6 +505,9 @@ class abstractsubrepo(object):
                          % (substate[0], substate[2]))
         return []
 
+    def shortid(self, revid):
+        return revid
+
 class hgsubrepo(abstractsubrepo):
     def __init__(self, ctx, path, state):
         self._path = path
@@ -521,8 +527,14 @@ class hgsubrepo(abstractsubrepo):
             self._initrepo(r, state[0], create)
 
     def storeclean(self, path):
+        lock = self._repo.lock()
+        try:
+            return self._storeclean(path)
+        finally:
+            lock.release()
+
+    def _storeclean(self, path):
         clean = True
-        lock = self._repo.lock()
         itercache = self._calcstorehash(path)
         try:
             for filehash in self._readstorehashcache(path):
@@ -539,7 +551,6 @@ class hgsubrepo(abstractsubrepo):
                 clean = False
         except StopIteration:
             pass
-        lock.release()
         return clean
 
     def _calcstorehash(self, remotepath):
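`storeclean` now becomes a thin wrapper that takes the repository lock and delegates to `_storeclean`, so the lock is released on every exit path instead of only on success. The wrapper pattern in isolation, with a toy threading lock and invented names:

```python
import threading

class Repo(object):
    def __init__(self):
        self._lock = threading.Lock()

    def storeclean(self, path):
        # hold the lock for the whole check; finally guarantees release
        # even if _storeclean raises
        self._lock.acquire()
        try:
            return self._storeclean(path)
        finally:
            self._lock.release()

    def _storeclean(self, path):
        return True  # the real code compares cached store hashes here
```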
@@ -565,8 +576,10 @@ class hgsubrepo(abstractsubrepo):
         if not os.path.exists(cachefile):
             return ''
         fd = open(cachefile, 'r')
-        pullstate = fd.readlines()
-        fd.close()
+        try:
+            pullstate = fd.readlines()
+        finally:
+            fd.close()
         return pullstate
 
     def _cachestorehash(self, remotepath):
@@ -577,14 +590,18 @@ class hgsubrepo(abstractsubrepo):
         '''
         cachefile = self._getstorehashcachepath(remotepath)
         lock = self._repo.lock()
-        storehash = list(self._calcstorehash(remotepath))
-        cachedir = os.path.dirname(cachefile)
-        if not os.path.exists(cachedir):
-            util.makedirs(cachedir, notindexed=True)
-        fd = open(cachefile, 'w')
-        fd.writelines(storehash)
-        fd.close()
-        lock.release()
+        try:
+            storehash = list(self._calcstorehash(remotepath))
+            cachedir = os.path.dirname(cachefile)
+            if not os.path.exists(cachedir):
+                util.makedirs(cachedir, notindexed=True)
+            fd = open(cachefile, 'w')
+            try:
+                fd.writelines(storehash)
+            finally:
+                fd.close()
+        finally:
+            lock.release()
 
     @annotatesubrepoerror
     def _initrepo(self, parentrepo, source, create):
@@ -592,12 +609,11 @@ class hgsubrepo(abstractsubrepo):
         self._repo._subsource = source
 
         if create:
-            fp = self._repo.opener("hgrc", "w", text=True)
-            fp.write('[paths]\n')
+            lines = ['[paths]\n']
 
             def addpathconfig(key, value):
                 if value:
-                    fp.write('%s = %s\n' % (key, value))
+                    lines.append('%s = %s\n' % (key, value))
                 self._repo.ui.setconfig('paths', key, value, 'subrepo')
 
             defpath = _abssource(self._repo, abort=False)
@@ -605,7 +621,12 @@ class hgsubrepo(abstractsubrepo):
             addpathconfig('default', defpath)
             if defpath != defpushpath:
                 addpathconfig('default-push', defpushpath)
-            fp.close()
+
+            fp = self._repo.opener("hgrc", "w", text=True)
+            try:
+                fp.write(''.join(lines))
+            finally:
+                fp.close()
 
     @annotatesubrepoerror
     def add(self, ui, match, dryrun, listsubrepos, prefix, explicitonly):
@@ -867,6 +888,9 @@ class hgsubrepo(abstractsubrepo):
             pats = []
         cmdutil.revert(ui, self._repo, ctx, parents, *pats, **opts)
 
+    def shortid(self, revid):
+        return revid[:12]
+
 class svnsubrepo(abstractsubrepo):
     def __init__(self, ctx, path, state):
         self._path = path
@@ -1563,6 +1587,9 @@ class gitsubrepo(abstractsubrepo):
             deleted = unknown = ignored = clean = []
         return modified, added, removed, deleted, unknown, ignored, clean
 
+    def shortid(self, revid):
+        return revid[:7]
+
 types = {
     'hg': hgsubrepo,
     'svn': svnsubrepo,
@@ -12,6 +12,7 @@
 
 from node import nullid, bin, hex, short
 from i18n import _
+import util
 import encoding
 import error
 import errno
@@ -83,20 +84,29 @@ def readlocaltags(ui, repo, alltags, tag
 
     _updatetags(filetags, "local", alltags, tagtypes)
 
-def _readtags(ui, repo, lines, fn, recode=None):
+def _readtaghist(ui, repo, lines, fn, recode=None, calcnodelines=False):
     '''Read tag definitions from a file (or any source of lines).
-    Return a mapping from tag name to (node, hist): node is the node id
-    from the last line read for that name, and hist is the list of node
-    ids previously associated with it (in file order). All node ids are
-    binary, not hex.'''
+    This function returns two sortdicts with similar information:
+    - the first dict, bingtaglist, contains the tag information as expected by
+      the _readtags function, i.e. a mapping from tag name to (node, hist):
+      - node is the node id from the last line read for that name,
+      - hist is the list of node ids previously associated with it (in file
+        order). All node ids are binary, not hex.
+    - the second dict, hextaglines, is a mapping from tag name to a list of
+      [hexnode, line number] pairs, ordered from the oldest to the newest node.
+    When calcnodelines is False the hextaglines dict is not calculated (an
+    empty dict is returned). This is done to improve this function's
+    performance in cases where the line numbers are not needed.
+    '''
 
-    filetags = {} # map tag name to (node, hist)
+    bintaghist = util.sortdict()
+    hextaglines = util.sortdict()
     count = 0
 
     def warn(msg):
         ui.warn(_("%s, line %s: %s\n") % (fn, count, msg))
 
-    for line in lines:
+    for nline, line in enumerate(lines):
         count += 1
         if not line:
             continue
@@ -115,11 +125,28 @@ def _readtags(ui, repo, lines, fn, recod
             continue
 
         # update filetags
-        hist = []
-        if name in filetags:
-            n, hist = filetags[name]
-            hist.append(n)
-        filetags[name] = (nodebin, hist)
+        if calcnodelines:
+            # map tag name to a list of line numbers
+            if name not in hextaglines:
+                hextaglines[name] = []
+            hextaglines[name].append([nodehex, nline])
+            continue
+        # map tag name to (node, hist)
+        if name not in bintaghist:
+            bintaghist[name] = []
+        bintaghist[name].append(nodebin)
+    return bintaghist, hextaglines
+
+def _readtags(ui, repo, lines, fn, recode=None, calcnodelines=False):
+    '''Read tag definitions from a file (or any source of lines).
+    Return a mapping from tag name to (node, hist): node is the node id
+    from the last line read for that name, and hist is the list of node
+    ids previously associated with it (in file order). All node ids are
+    binary, not hex.'''
+    filetags, nodelines = _readtaghist(ui, repo, lines, fn, recode=recode,
+                                       calcnodelines=calcnodelines)
+    for tag, taghist in filetags.items():
+        filetags[tag] = (taghist[-1], taghist[:-1])
     return filetags
 
 def _updatetags(filetags, tagtype, alltags, tagtypes):
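The rewritten `_readtags` above first collects every node seen for a tag, then keeps the last one as the current node and the earlier ones as history. A self-contained sketch of that two-step scheme on `node name` lines; `read_tags` and the space-separated line format are simplifications of the real tags-file parsing:

```python
def read_tags(lines):
    # pass 1: collect, in file order, every node recorded for each tag
    hist = {}
    for line in lines:
        if not line:
            continue
        node, name = line.split(' ', 1)
        hist.setdefault(name, []).append(node)
    # pass 2: last node wins; earlier nodes become the tag's history
    return {name: (nodes[-1], nodes[:-1]) for name, nodes in hist.items()}
```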
@@ -8,6 +8,7 @@
 import cgi, re, os, time, urllib
 import encoding, node, util
 import hbisect
+import templatekw
 
 def addbreaks(text):
     """:addbreaks: Any text. Add an XHTML "<br />" tag before the end of
@@ -302,6 +303,10 @@ def shortdate(text):
     """:shortdate: Date. Returns a date like "2006-09-18"."""
     return util.shortdate(text)
 
+def splitlines(text):
+    """:splitlines: Any text. Split text into a list of lines."""
+    return templatekw.showlist('line', text.splitlines(), 'lines')
+
 def stringescape(text):
     return text.encode('string_escape')
 
@@ -384,6 +389,7 @@ filters = {
     "short": short,
     "shortbisect": shortbisect,
     "shortdate": shortdate,
+    "splitlines": splitlines,
     "stringescape": stringescape,
     "stringify": stringify,
     "strip": strip,
@@ -208,6 +208,17 @@ def showchildren(**args):
     childrevs = ['%d:%s' % (cctx, cctx) for cctx in ctx.children()]
     return showlist('children', childrevs, element='child', **args)
 
+def showcurrentbookmark(**args):
+    """:currentbookmark: String. The active bookmark, if it is
+    associated with the changeset"""
+    import bookmarks as bookmarks # to avoid circular import issues
+    repo = args['repo']
+    if bookmarks.iscurrent(repo):
+        current = repo._bookmarkcurrent
+        if current in args['ctx'].bookmarks():
+            return current
+    return ''
+
 def showdate(repo, ctx, templ, **args):
     """:date: Date information. The date when the changeset was committed."""
     return ctx.date()
@@ -345,6 +356,22 @@ def showrev(repo, ctx, templ, **args):
     """:rev: Integer. The repository-local changeset revision number."""
     return ctx.rev()
 
+def showsubrepos(**args):
+    """:subrepos: List of strings. Updated subrepositories in the changeset."""
+    ctx = args['ctx']
+    substate = ctx.substate
+    if not substate:
+        return showlist('subrepo', [], **args)
+    psubstate = ctx.parents()[0].substate or {}
+    subrepos = []
+    for sub in substate:
+        if sub not in psubstate or substate[sub] != psubstate[sub]:
+            subrepos.append(sub) # modified or newly added in ctx
+    for sub in psubstate:
+        if sub not in substate:
+            subrepos.append(sub) # removed in ctx
+    return showlist('subrepo', sorted(subrepos), **args)
+
 def showtags(**args):
     """:tags: List of strings. Any tags associated with the changeset."""
     return showlist('tag', args['ctx'].tags(), **args)
@@ -364,6 +391,7 @@ keywords = {
     'branches': showbranches,
     'bookmarks': showbookmarks,
     'children': showchildren,
+    'currentbookmark': showcurrentbookmark,
     'date': showdate,
     'desc': showdescription,
     'diffstat': showdiffstat,
@@ -385,6 +413,7 @@ keywords = {
     'phase': showphase,
     'phaseidx': showphaseidx,
     'rev': showrev,
+    'subrepos': showsubrepos,
     'tags': showtags,
 }
 
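The new `:subrepos` keyword compares the changeset's substate against its first parent's: a subrepo is reported if it was added, changed, or removed. The set logic can be sketched on plain dicts, with an illustrative function name:

```python
def changed_subrepos(substate, psubstate):
    # subrepos modified or newly added relative to the parent...
    changed = [s for s in substate
               if s not in psubstate or substate[s] != psubstate[s]]
    # ...plus subrepos that the parent had but this changeset dropped
    removed = [s for s in psubstate if s not in substate]
    return sorted(changed + removed)
```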
@@ -111,7 +111,7 b' def compileexp(exp, context):'
111 def getsymbol(exp):
111 def getsymbol(exp):
112 if exp[0] == 'symbol':
112 if exp[0] == 'symbol':
113 return exp[1]
113 return exp[1]
114 raise error.ParseError(_("expected a symbol"))
114 raise error.ParseError(_("expected a symbol, got '%s'") % exp[0])
115
115
116 def getlist(x):
116 def getlist(x):
117 if not x:
117 if not x:
@@ -148,7 +148,7 b' def runsymbol(context, mapping, key):'
148 v = context.process(key, mapping)
148 v = context.process(key, mapping)
149 except TemplateNotFound:
149 except TemplateNotFound:
150 v = ''
150 v = ''
151 if util.safehasattr(v, '__call__'):
151 if callable(v):
152 return v(**mapping)
152 return v(**mapping)
153 if isinstance(v, types.GeneratorType):
153 if isinstance(v, types.GeneratorType):
154 v = list(v)
154 v = list(v)
@@ -185,7 +185,7 b' def runtemplate(context, mapping, templa'
185 def runmap(context, mapping, data):
185 def runmap(context, mapping, data):
186 func, data, ctmpl = data
186 func, data, ctmpl = data
187 d = func(context, mapping, data)
187 d = func(context, mapping, data)
188 if util.safehasattr(d, '__call__'):
188 if callable(d):
189 d = d()
189 d = d()
190
190
191 lm = mapping.copy()
191 lm = mapping.copy()
@@ -335,7 +335,7 b' def join(context, mapping, args):'
335 raise error.ParseError(_("join expects one or two arguments"))
335 raise error.ParseError(_("join expects one or two arguments"))
336
336
337 joinset = args[0][0](context, mapping, args[0][1])
337 joinset = args[0][0](context, mapping, args[0][1])
338 if util.safehasattr(joinset, '__call__'):
338 if callable(joinset):
339 jf = joinset.joinfmt
339 jf = joinset.joinfmt
340 joinset = [jf(x) for x in joinset()]
340 joinset = [jf(x) for x in joinset()]
341
341
@@ -466,6 +466,36 b' def sub(context, mapping, args):'
466 src = stringify(_evalifliteral(args[2], context, mapping))
466 src = stringify(_evalifliteral(args[2], context, mapping))
467 yield re.sub(pat, rpl, src)
467 yield re.sub(pat, rpl, src)
468
468
469 def startswith(context, mapping, args):
470 if len(args) != 2:
471 raise error.ParseError(_("startswith expects two arguments"))
472
473 patn = stringify(args[0][0](context, mapping, args[0][1]))
474 text = stringify(args[1][0](context, mapping, args[1][1]))
475 if text.startswith(patn):
476 return text
477 return ''
478
479
480 def word(context, mapping, args):
481 """return nth word from a string"""
482 if not (2 <= len(args) <= 3):
483 raise error.ParseError(_("word expects two or three arguments, got %d")
484 % len(args))
485
486 num = int(stringify(args[0][0](context, mapping, args[0][1])))
487 text = stringify(args[1][0](context, mapping, args[1][1]))
488 if len(args) == 3:
489 splitter = stringify(args[2][0](context, mapping, args[2][1]))
490 else:
491 splitter = None
492
493 tokens = text.split(splitter)
494 if num >= len(tokens):
495 return ''
496 else:
497 return tokens[num]
498
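The `startswith` and `word` template functions added above can be sketched as plain Python, with the templater plumbing (argument unpacking, `stringify`) elided; the standalone names below are illustrative, not Mercurial APIs:

```python
def startswith(pat, text):
    # return the full text when it starts with pat, else the empty string
    return text if text.startswith(pat) else ''

def word(num, text, splitter=None):
    # return the num-th whitespace- (or splitter-) separated token,
    # or '' when the index is out of range
    tokens = text.split(splitter)
    return tokens[num] if num < len(tokens) else ''
```

This mirrors the hunk's behaviour of returning an empty string rather than raising when the word index is past the end of the token list.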
469 methods = {
499 methods = {
470 "string": lambda e, c: (runstring, e[1]),
500 "string": lambda e, c: (runstring, e[1]),
471 "rawstring": lambda e, c: (runrawstring, e[1]),
501 "rawstring": lambda e, c: (runrawstring, e[1]),
@@ -490,8 +520,10 b' funcs = {'
490 "revset": revset,
520 "revset": revset,
491 "rstdoc": rstdoc,
521 "rstdoc": rstdoc,
492 "shortest": shortest,
522 "shortest": shortest,
523 "startswith": startswith,
493 "strip": strip,
524 "strip": strip,
494 "sub": sub,
525 "sub": sub,
526 "word": word,
495 }
527 }
496
528
497 # template engine
529 # template engine
@@ -32,7 +32,7 b''
32 </tr>
32 </tr>
33 <tr>
33 <tr>
34 <th style="text-align:left;vertical-align:top;">description</th>
34 <th style="text-align:left;vertical-align:top;">description</th>
35 <td>{desc|strip|escape|addbreaks|nonempty}</td>
35 <td>{desc|strip|escape|websub|addbreaks|nonempty}</td>
36 </tr>
36 </tr>
37 <tr>
37 <tr>
38 <th style="text-align:left;vertical-align:top;">files</th>
38 <th style="text-align:left;vertical-align:top;">files</th>
@@ -27,7 +27,7 b''
27 </tr>
27 </tr>
28 <tr>
28 <tr>
29 <th style="text-align:left;vertical-align:top;">description</th>
29 <th style="text-align:left;vertical-align:top;">description</th>
30 <td>{desc|strip|escape|addbreaks|nonempty}</td>
30 <td>{desc|strip|escape|websub|addbreaks|nonempty}</td>
31 </tr>
31 </tr>
32 <tr>
32 <tr>
33 <th style="text-align:left;vertical-align:top;">files</th>
33 <th style="text-align:left;vertical-align:top;">files</th>
@@ -1,7 +1,7 b''
1 <item>
1 <item>
2 <title>{desc|strip|firstline|strip|escape}</title>
2 <title>{desc|strip|firstline|strip|escape}</title>
3 <link>{urlbase}{url|urlescape}log{node|short}/{file|urlescape}</link>
3 <link>{urlbase}{url|urlescape}log{node|short}/{file|urlescape}</link>
4 <description><![CDATA[{desc|strip|escape|addbreaks|nonempty}]]></description>
4 <description><![CDATA[{desc|strip|escape|websub|addbreaks|nonempty}]]></description>
5 <author>{author|obfuscate}</author>
5 <author>{author|obfuscate}</author>
6 <pubDate>{date|rfc822date}</pubDate>
6 <pubDate>{date|rfc822date}</pubDate>
7 </item>
7 </item>
@@ -151,6 +151,17 b' typedef unsigned __int64 uint64_t;'
151 #define inline __inline
151 #define inline __inline
152 #endif
152 #endif
153
153
154 typedef struct {
155 PyObject_HEAD
156 char state;
157 int mode;
158 int size;
159 int mtime;
160 } dirstateTupleObject;
161
162 extern PyTypeObject dirstateTupleType;
163 #define dirstate_tuple_check(op) (Py_TYPE(op) == &dirstateTupleType)
164
154 static inline uint32_t getbe32(const char *c)
165 static inline uint32_t getbe32(const char *c)
155 {
166 {
156 const unsigned char *d = (const unsigned char *)c;
167 const unsigned char *d = (const unsigned char *)c;
@@ -15,7 +15,8 b' hide platform-specific details from the '
15
15
16 from i18n import _
16 from i18n import _
17 import error, osutil, encoding
17 import error, osutil, encoding
18 import errno, re, shutil, sys, tempfile, traceback
18 import errno, shutil, sys, tempfile, traceback
19 import re as remod
19 import os, time, datetime, calendar, textwrap, signal, collections
20 import os, time, datetime, calendar, textwrap, signal, collections
20 import imp, socket, urllib
21 import imp, socket, urllib
21
22
@@ -223,6 +224,37 b' except AttributeError:'
223 del self[i]
224 del self[i]
224 break
225 break
225
226
227 class sortdict(dict):
228 '''a simple sorted dictionary'''
229 def __init__(self, data=None):
230 self._list = []
231 if data:
232 self.update(data)
233 def copy(self):
234 return sortdict(self)
235 def __setitem__(self, key, val):
236 if key in self:
237 self._list.remove(key)
238 self._list.append(key)
239 dict.__setitem__(self, key, val)
240 def __iter__(self):
241 return self._list.__iter__()
242 def update(self, src):
243 for k in src:
244 self[k] = src[k]
245 def clear(self):
246 dict.clear(self)
247 self._list = []
248 def items(self):
249 return [(k, self[k]) for k in self._list]
250 def __delitem__(self, key):
251 dict.__delitem__(self, key)
252 self._list.remove(key)
253 def keys(self):
254 return self._list
255 def iterkeys(self):
256 return self._list.__iter__()
257
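The `sortdict` class added above keeps keys in insertion order, and re-assigning an existing key moves it to the end — unlike `collections.OrderedDict`, which keeps the key's original position. A minimal standalone sketch of that behaviour (the name and trimmed method set are illustrative):

```python
class SortDict(dict):
    """Minimal sketch of util.sortdict: iteration follows insertion
    order, and re-setting an existing key moves it to the end."""
    def __init__(self):
        self._list = []

    def __setitem__(self, key, val):
        if key in self:
            self._list.remove(key)  # re-assignment moves key to the end
        self._list.append(key)
        dict.__setitem__(self, key, val)

    def __delitem__(self, key):
        dict.__delitem__(self, key)
        self._list.remove(key)

    def __iter__(self):
        return iter(self._list)
```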
226 class lrucachedict(object):
258 class lrucachedict(object):
227 '''cache most recent gets from or sets to this dictionary'''
259 '''cache most recent gets from or sets to this dictionary'''
228 def __init__(self, maxsize):
260 def __init__(self, maxsize):
@@ -684,29 +716,50 b' try:'
684 except ImportError:
716 except ImportError:
685 _re2 = False
717 _re2 = False
686
718
687 def compilere(pat, flags=0):
719 class _re(object):
688 '''Compile a regular expression, using re2 if possible
720 def _checkre2(self):
689
721 global _re2
690 For best performance, use only re2-compatible regexp features. The
691 only flags from the re module that are re2-compatible are
692 IGNORECASE and MULTILINE.'''
693 global _re2
694 if _re2 is None:
695 try:
722 try:
696 # check if match works, see issue3964
723 # check if match works, see issue3964
697 _re2 = bool(re2.match(r'\[([^\[]+)\]', '[ui]'))
724 _re2 = bool(re2.match(r'\[([^\[]+)\]', '[ui]'))
698 except ImportError:
725 except ImportError:
699 _re2 = False
726 _re2 = False
700 if _re2 and (flags & ~(re.IGNORECASE | re.MULTILINE)) == 0:
727
701 if flags & re.IGNORECASE:
728 def compile(self, pat, flags=0):
702 pat = '(?i)' + pat
729 '''Compile a regular expression, using re2 if possible
703 if flags & re.MULTILINE:
730
704 pat = '(?m)' + pat
731 For best performance, use only re2-compatible regexp features. The
705 try:
732 only flags from the re module that are re2-compatible are
706 return re2.compile(pat)
733 IGNORECASE and MULTILINE.'''
707 except re2.error:
734 if _re2 is None:
708 pass
735 self._checkre2()
709 return re.compile(pat, flags)
736 if _re2 and (flags & ~(remod.IGNORECASE | remod.MULTILINE)) == 0:
737 if flags & remod.IGNORECASE:
738 pat = '(?i)' + pat
739 if flags & remod.MULTILINE:
740 pat = '(?m)' + pat
741 try:
742 return re2.compile(pat)
743 except re2.error:
744 pass
745 return remod.compile(pat, flags)
746
747 @propertycache
748 def escape(self):
749 '''Return the version of escape corresponding to self.compile.
750
751 This is imperfect because whether re2 or re is used for a particular
752 function depends on the flags, etc, but it's the best we can do.
753 '''
754 global _re2
755 if _re2 is None:
756 self._checkre2()
757 if _re2:
758 return re2.escape
759 else:
760 return remod.escape
761
762 re = _re()
710
763
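The refactored `_re` class above preserves the existing fallback logic: prefer the optional `re2` engine when it is importable and the flags are re2-compatible, translating `IGNORECASE`/`MULTILINE` into inline `(?i)`/`(?m)` prefixes, and fall back to the stdlib `re` otherwise. A flattened sketch of that logic (re2 is a third-party module and is usually absent):

```python
import re

def compilere(pat, flags=0):
    # prefer re2 when available and the flags are re2-compatible;
    # otherwise compile with the stdlib re module
    try:
        import re2  # optional accelerated engine; often not installed
    except ImportError:
        re2 = None
    if re2 is not None and (flags & ~(re.IGNORECASE | re.MULTILINE)) == 0:
        if flags & re.IGNORECASE:
            pat = '(?i)' + pat
        if flags & re.MULTILINE:
            pat = '(?m)' + pat
        try:
            return re2.compile(pat)
        except re2.error:
            pass
    return re.compile(pat, flags)
```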
711 _fspathcache = {}
764 _fspathcache = {}
712 def fspath(name, root):
765 def fspath(name, root):
@@ -730,7 +783,7 b' def fspath(name, root):'
730 seps = seps + os.altsep
783 seps = seps + os.altsep
731 # Protect backslashes. This gets silly very quickly.
784 # Protect backslashes. This gets silly very quickly.
732 seps.replace('\\','\\\\')
785 seps.replace('\\','\\\\')
733 pattern = re.compile(r'([^%s]+)|([%s]+)' % (seps, seps))
786 pattern = remod.compile(r'([^%s]+)|([%s]+)' % (seps, seps))
734 dir = os.path.normpath(root)
787 dir = os.path.normpath(root)
735 result = []
788 result = []
736 for part, sep in pattern.findall(name):
789 for part, sep in pattern.findall(name):
@@ -1287,23 +1340,9 b' def email(author):'
1287 r = None
1340 r = None
1288 return author[author.find('<') + 1:r]
1341 return author[author.find('<') + 1:r]
1289
1342
1290 def _ellipsis(text, maxlength):
1291 if len(text) <= maxlength:
1292 return text, False
1293 else:
1294 return "%s..." % (text[:maxlength - 3]), True
1295
1296 def ellipsis(text, maxlength=400):
1343 def ellipsis(text, maxlength=400):
1297 """Trim string to at most maxlength (default: 400) characters."""
1344 """Trim string to at most maxlength (default: 400) columns in display."""
1298 try:
1345 return encoding.trim(text, maxlength, ellipsis='...')
1299 # use unicode not to split at intermediate multi-byte sequence
1300 utext, truncated = _ellipsis(text.decode(encoding.encoding),
1301 maxlength)
1302 if not truncated:
1303 return text
1304 return utext.encode(encoding.encoding)
1305 except (UnicodeDecodeError, UnicodeEncodeError):
1306 return _ellipsis(text, maxlength)[0]
1307
1346
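The hunk above replaces the byte/codepoint trimming in `ellipsis` with `encoding.trim`, so `maxlength` now counts display columns (double-width East Asian characters occupy two). For reference, the retired behaviour amounts to this simple length-based sketch:

```python
def ellipsis(text, maxlength=400):
    # length-based sketch of the old behaviour; the new code delegates
    # to encoding.trim so maxlength counts display columns instead
    if len(text) <= maxlength:
        return text
    return text[:maxlength - 3] + '...'
```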
1308 def unitcountfn(*unittable):
1347 def unitcountfn(*unittable):
1309 '''return a function that renders a readable count of some quantity'''
1348 '''return a function that renders a readable count of some quantity'''
@@ -1548,7 +1587,7 b' def interpolate(prefix, mapping, s, fn=N'
1548 else:
1587 else:
1549 prefix_char = prefix
1588 prefix_char = prefix
1550 mapping[prefix_char] = prefix_char
1589 mapping[prefix_char] = prefix_char
1551 r = re.compile(r'%s(%s)' % (prefix, patterns))
1590 r = remod.compile(r'%s(%s)' % (prefix, patterns))
1552 return r.sub(lambda x: fn(mapping[x.group()[1:]]), s)
1591 return r.sub(lambda x: fn(mapping[x.group()[1:]]), s)
1553
1592
1554 def getport(port):
1593 def getport(port):
@@ -1663,7 +1702,7 b' class url(object):'
1663
1702
1664 _safechars = "!~*'()+"
1703 _safechars = "!~*'()+"
1665 _safepchars = "/!~*'()+:\\"
1704 _safepchars = "/!~*'()+:\\"
1666 _matchscheme = re.compile(r'^[a-zA-Z0-9+.\-]+:').match
1705 _matchscheme = remod.compile(r'^[a-zA-Z0-9+.\-]+:').match
1667
1706
1668 def __init__(self, path, parsequery=True, parsefragment=True):
1707 def __init__(self, path, parsequery=True, parsefragment=True):
1669 # We slowly chomp away at path until we have only the path left
1708 # We slowly chomp away at path until we have only the path left
@@ -8,7 +8,7 b''
8 import urllib, tempfile, os, sys
8 import urllib, tempfile, os, sys
9 from i18n import _
9 from i18n import _
10 from node import bin, hex
10 from node import bin, hex
11 import changegroup as changegroupmod, bundle2
11 import changegroup as changegroupmod, bundle2, pushkey as pushkeymod
12 import peer, error, encoding, util, store, exchange
12 import peer, error, encoding, util, store, exchange
13
13
14
14
@@ -190,6 +190,21 b' def unescapearg(escaped):'
190 .replace(':,', ',')
190 .replace(':,', ',')
191 .replace('::', ':'))
191 .replace('::', ':'))
192
192
193 # mapping of options accepted by getbundle and their types
194 #
195 # Meant to be extended by extensions. It is the extensions' responsibility to
196 # such options are properly processed in exchange.getbundle.
197 #
198 # supported types are:
199 #
200 # :nodes: list of binary nodes
201 # :csv: list of comma-separated values
202 # :plain: string with no transformation needed.
203 gboptsmap = {'heads': 'nodes',
204 'common': 'nodes',
205 'bundlecaps': 'csv',
206 'listkeys': 'csv'}
207
193 # client side
208 # client side
194
209
195 class wirepeer(peer.peerrepository):
210 class wirepeer(peer.peerrepository):
@@ -303,11 +318,7 b' class wirepeer(peer.peerrepository):'
303 self.ui.debug('preparing listkeys for "%s"\n' % namespace)
318 self.ui.debug('preparing listkeys for "%s"\n' % namespace)
304 yield {'namespace': encoding.fromlocal(namespace)}, f
319 yield {'namespace': encoding.fromlocal(namespace)}, f
305 d = f.value
320 d = f.value
306 r = {}
321 yield pushkeymod.decodekeys(d)
307 for l in d.splitlines():
308 k, v = l.split('\t')
309 r[encoding.tolocal(k)] = encoding.tolocal(v)
310 yield r
311
322
312 def stream_out(self):
323 def stream_out(self):
313 return self._callstream('stream_out')
324 return self._callstream('stream_out')
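The hunk above swaps the inline tab/newline parsing in `listkeys` for `pushkeymod.decodekeys`, centralizing the wire format (one `key<TAB>value` pair per line). A sketch of both directions, with the `encoding.fromlocal`/`tolocal` conversions of the real functions elided:

```python
def encodekeys(keys):
    # sketch of pushkey.encodekeys: one "key TAB value" pair per line
    return '\n'.join('%s\t%s' % (k, v) for k, v in keys)

def decodekeys(data):
    # sketch of pushkey.decodekeys: inverse of encodekeys
    d = {}
    for line in data.splitlines():
        k, v = line.split('\t')
        d[k] = v
    return d
```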
@@ -325,18 +336,25 b' class wirepeer(peer.peerrepository):'
325 bases=bases, heads=heads)
336 bases=bases, heads=heads)
326 return changegroupmod.unbundle10(f, 'UN')
337 return changegroupmod.unbundle10(f, 'UN')
327
338
328 def getbundle(self, source, heads=None, common=None, bundlecaps=None,
339 def getbundle(self, source, **kwargs):
329 **kwargs):
330 self.requirecap('getbundle', _('look up remote changes'))
340 self.requirecap('getbundle', _('look up remote changes'))
331 opts = {}
341 opts = {}
332 if heads is not None:
342 for key, value in kwargs.iteritems():
333 opts['heads'] = encodelist(heads)
343 if value is None:
334 if common is not None:
344 continue
335 opts['common'] = encodelist(common)
345 keytype = gboptsmap.get(key)
336 if bundlecaps is not None:
346 if keytype is None:
337 opts['bundlecaps'] = ','.join(bundlecaps)
347 assert False, 'unexpected'
338 opts.update(kwargs)
348 elif keytype == 'nodes':
349 value = encodelist(value)
350 elif keytype == 'csv':
351 value = ','.join(value)
352 elif keytype != 'plain':
353 raise KeyError('unknown getbundle option type %s'
354 % keytype)
355 opts[key] = value
339 f = self._callcompressable("getbundle", **opts)
356 f = self._callcompressable("getbundle", **opts)
357 bundlecaps = kwargs.get('bundlecaps')
340 if bundlecaps is not None and 'HG2X' in bundlecaps:
358 if bundlecaps is not None and 'HG2X' in bundlecaps:
341 return bundle2.unbundle20(self.ui, f)
359 return bundle2.unbundle20(self.ui, f)
342 else:
360 else:
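The client-side `getbundle` rewrite above drives argument encoding from `gboptsmap` instead of hard-coded parameters, so extensions can register new option types. A standalone sketch of that loop; the real `'nodes'` branch uses `wireproto.encodelist` on binary node ids, simplified here to a space join of strings:

```python
GBOPTSMAP = {'heads': 'nodes', 'common': 'nodes',
             'bundlecaps': 'csv', 'listkeys': 'csv'}

def encodegetbundleopts(**kwargs):
    # sketch of the wirepeer.getbundle encoding loop: skip None values,
    # then encode each option according to its registered type
    opts = {}
    for key, value in kwargs.items():
        if value is None:
            continue
        keytype = GBOPTSMAP.get(key)
        if keytype == 'nodes':
            value = ' '.join(value)  # simplified stand-in for encodelist
        elif keytype == 'csv':
            value = ','.join(value)
        elif keytype != 'plain':
            raise KeyError('unknown getbundle option type %s' % keytype)
        opts[key] = value
    return opts
```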
@@ -489,7 +507,7 b' def options(cmd, keys, others):'
489 opts[k] = others[k]
507 opts[k] = others[k]
490 del others[k]
508 del others[k]
491 if others:
509 if others:
492 sys.stderr.write("abort: %s got unexpected arguments %s\n"
510 sys.stderr.write("warning: %s ignored unexpected arguments %s\n"
493 % (cmd, ",".join(others)))
511 % (cmd, ",".join(others)))
494 return opts
512 return opts
495
513
@@ -627,12 +645,16 b" gboptslist = ['heads', 'common', 'bundle"
627
645
628 @wireprotocommand('getbundle', '*')
646 @wireprotocommand('getbundle', '*')
629 def getbundle(repo, proto, others):
647 def getbundle(repo, proto, others):
630 opts = options('getbundle', gboptslist, others)
648 opts = options('getbundle', gboptsmap.keys(), others)
631 for k, v in opts.iteritems():
649 for k, v in opts.iteritems():
632 if k in ('heads', 'common'):
650 keytype = gboptsmap[k]
651 if keytype == 'nodes':
633 opts[k] = decodelist(v)
652 opts[k] = decodelist(v)
634 elif k == 'bundlecaps':
653 elif keytype == 'csv':
635 opts[k] = set(v.split(','))
654 opts[k] = set(v.split(','))
655 elif keytype != 'plain':
656 raise KeyError('unknown getbundle option type %s'
657 % keytype)
636 cg = exchange.getbundle(repo, 'serve', **opts)
658 cg = exchange.getbundle(repo, 'serve', **opts)
637 return streamres(proto.groupchunks(cg))
659 return streamres(proto.groupchunks(cg))
638
660
@@ -655,9 +677,7 b' def hello(repo, proto):'
655 @wireprotocommand('listkeys', 'namespace')
677 @wireprotocommand('listkeys', 'namespace')
656 def listkeys(repo, proto, namespace):
678 def listkeys(repo, proto, namespace):
657 d = repo.listkeys(encoding.tolocal(namespace)).items()
679 d = repo.listkeys(encoding.tolocal(namespace)).items()
658 t = '\n'.join(['%s\t%s' % (encoding.fromlocal(k), encoding.fromlocal(v))
680 return pushkeymod.encodekeys(d)
659 for k, v in d])
660 return t
661
681
662 @wireprotocommand('lookup', 'key')
682 @wireprotocommand('lookup', 'key')
663 def lookup(repo, proto, key):
683 def lookup(repo, proto, key):
@@ -809,11 +829,13 b' def unbundle(repo, proto, heads):'
809 finally:
829 finally:
810 fp.close()
830 fp.close()
811 os.unlink(tempname)
831 os.unlink(tempname)
812 except bundle2.UnknownPartError, exc:
832 except error.BundleValueError, exc:
813 bundler = bundle2.bundle20(repo.ui)
833 bundler = bundle2.bundle20(repo.ui)
814 part = bundle2.bundlepart('B2X:ERROR:UNKNOWNPART',
834 errpart = bundler.newpart('B2X:ERROR:UNSUPPORTEDCONTENT')
815 [('parttype', str(exc))])
835 if exc.parttype is not None:
816 bundler.addpart(part)
836 errpart.addparam('parttype', exc.parttype)
837 if exc.params:
838 errpart.addparam('params', '\0'.join(exc.params))
817 return streamres(bundler.getchunks())
839 return streamres(bundler.getchunks())
818 except util.Abort, inst:
840 except util.Abort, inst:
819 # The old code we moved used sys.stderr directly.
841 # The old code we moved used sys.stderr directly.
@@ -835,9 +857,7 b' def unbundle(repo, proto, heads):'
835 except error.PushRaced, exc:
857 except error.PushRaced, exc:
836 if getattr(exc, 'duringunbundle2', False):
858 if getattr(exc, 'duringunbundle2', False):
837 bundler = bundle2.bundle20(repo.ui)
859 bundler = bundle2.bundle20(repo.ui)
838 part = bundle2.bundlepart('B2X:ERROR:PUSHRACED',
860 bundler.newpart('B2X:ERROR:PUSHRACED', [('message', str(exc))])
839 [('message', str(exc))])
840 bundler.addpart(part)
841 return streamres(bundler.getchunks())
861 return streamres(bundler.getchunks())
842 else:
862 else:
843 return pusherr(str(exc))
863 return pusherr(str(exc))
@@ -513,7 +513,7 b" if sys.platform == 'darwin' and os.path."
513 version = version[0]
513 version = version[0]
514 xcode4 = (version.startswith('Xcode') and
514 xcode4 = (version.startswith('Xcode') and
515 StrictVersion(version.split()[1]) >= StrictVersion('4.0'))
515 StrictVersion(version.split()[1]) >= StrictVersion('4.0'))
516 xcode51 = re.match(r'^Xcode\s+5\.1\.', version) is not None
516 xcode51 = re.match(r'^Xcode\s+5\.1', version) is not None
517 else:
517 else:
518 # xcodebuild returns empty on OS X Lion with XCode 4.3 not
518 # xcodebuild returns empty on OS X Lion with XCode 4.3 not
519 # installed, but instead with only command-line tools. Assume
519 # installed, but instead with only command-line tools. Assume
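The one-character setup.py fix above drops the trailing `\.` from the Xcode 5.1 pattern: the old regex only matched point releases like "Xcode 5.1.1" and missed plain "Xcode 5.1". A small check capturing the fixed behaviour (the helper name is illustrative):

```python
import re

def isxcode51(version):
    # relaxed pattern from the fix: matches "Xcode 5.1" and "Xcode 5.1.x"
    # alike; the old r'^Xcode\s+5\.1\.' required a trailing dot
    return re.match(r'^Xcode\s+5\.1', version) is not None
```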
@@ -1,8 +1,14 b''
1 # Extension dedicated to test patch.diff() upgrade modes
1 # Extension dedicated to test patch.diff() upgrade modes
2 #
2 #
3 #
3 #
4 from mercurial import scmutil, patch, util
4 from mercurial import cmdutil, scmutil, patch, util
5
5
6 cmdtable = {}
7 command = cmdutil.command(cmdtable)
8
9 @command('autodiff',
10 [('', 'git', '', 'git upgrade mode (yes/no/auto/warn/abort)')],
11 '[OPTION]... [FILE]...')
6 def autodiff(ui, repo, *pats, **opts):
12 def autodiff(ui, repo, *pats, **opts):
7 diffopts = patch.diffopts(ui, opts)
13 diffopts = patch.diffopts(ui, opts)
8 git = opts.get('git', 'no')
14 git = opts.get('git', 'no')
@@ -36,11 +42,3 b' def autodiff(ui, repo, *pats, **opts):'
36 ui.write(chunk)
42 ui.write(chunk)
37 for fn in sorted(brokenfiles):
43 for fn in sorted(brokenfiles):
38 ui.write(('data lost for: %s\n' % fn))
44 ui.write(('data lost for: %s\n' % fn))
39
40 cmdtable = {
41 "autodiff":
42 (autodiff,
43 [('', 'git', '', 'git upgrade mode (yes/no/auto/warn/abort)'),
44 ],
45 '[OPTION]... [FILE]...'),
46 }
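The autodiff test extension above is converted from a hand-written `cmdtable` dict to the `cmdutil.command` decorator. The registry pattern behind that decorator can be sketched generically, without importing Mercurial (the stand-in `command` below only imitates `cmdutil.command`'s registration side effect):

```python
cmdtable = {}

def command(name, options, synopsis):
    # sketch of the cmdutil.command pattern: register the decorated
    # function with its flags and synopsis, then return it unchanged
    def register(func):
        cmdtable[name] = (func, options, synopsis)
        return func
    return register

@command('autodiff',
         [('', 'git', '', 'git upgrade mode (yes/no/auto/warn/abort)')],
         '[OPTION]... [FILE]...')
def autodiff(ui, repo, *pats, **opts):
    return 'autodiff called'
```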
@@ -29,12 +29,15 b' lines = []'
29 for line in sys.stdin:
29 for line in sys.stdin:
30 # We whitelist tests (see more messages in pyflakes.messages)
30 # We whitelist tests (see more messages in pyflakes.messages)
31 pats = [
31 pats = [
32 r"imported but unused",
32 (r"imported but unused", None),
33 r"local variable '.*' is assigned to but never used",
33 (r"local variable '.*' is assigned to but never used", None),
34 r"unable to detect undefined names",
34 (r"unable to detect undefined names", None),
35 (r"undefined name '.*'",
36 r"undefined name '(WindowsError|memoryview)'")
35 ]
37 ]
36 for msgtype, pat in enumerate(pats):
38
37 if re.search(pat, line):
39 for msgtype, (pat, excl) in enumerate(pats):
40 if re.search(pat, line) and (not excl or not re.search(excl, line)):
38 break # pattern matches
41 break # pattern matches
39 else:
42 else:
40 continue # no pattern matched, next line
43 continue # no pattern matched, next line
@@ -49,3 +52,7 b' for line in sys.stdin:'
49 for msgtype, line in sorted(lines, key=makekey):
52 for msgtype, line in sorted(lines, key=makekey):
50 sys.stdout.write(line)
53 sys.stdout.write(line)
51 print
54 print
55
56 # self test of "undefined name" detection for other than 'memoryview'
57 if False:
58 print undefinedname
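The pyflakes-filter change above turns each whitelist entry into a `(pattern, exclusion)` pair, so "undefined name" warnings are kept in general but the known false positives (`WindowsError`, `memoryview`) are dropped. The matching logic, extracted into a standalone predicate:

```python
import re

pats = [
    (r"imported but unused", None),
    (r"undefined name '.*'",
     r"undefined name '(WindowsError|memoryview)'"),
]

def whitelisted(line):
    # a message is kept when some pattern matches and its exclusion
    # pattern (if any) does not
    for pat, excl in pats:
        if re.search(pat, line) and (not excl or not re.search(excl, line)):
            return True
    return False
```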
This diff has been collapsed as it changes many lines (2238 lines changed).
@@ -57,6 +57,7 b' import re'
57 import threading
57 import threading
58 import killdaemons as killmod
58 import killdaemons as killmod
59 import Queue as queue
59 import Queue as queue
60 import unittest
60
61
61 processlock = threading.Lock()
62 processlock = threading.Lock()
62
63
@@ -92,18 +93,12 b' def Popen4(cmd, wd, timeout, env=None):'
92
93
93 return p
94 return p
94
95
95 # reserved exit code to skip test (used by hghave)
96 SKIPPED_STATUS = 80
97 SKIPPED_PREFIX = 'skipped: '
98 FAILED_PREFIX = 'hghave check failed: '
99 PYTHON = sys.executable.replace('\\', '/')
96 PYTHON = sys.executable.replace('\\', '/')
100 IMPL_PATH = 'PYTHONPATH'
97 IMPL_PATH = 'PYTHONPATH'
101 if 'java' in sys.platform:
98 if 'java' in sys.platform:
102 IMPL_PATH = 'JYTHONPATH'
99 IMPL_PATH = 'JYTHONPATH'
103
100
104 requiredtools = [os.path.basename(sys.executable), "diff", "grep", "unzip",
101 TESTDIR = HGTMP = INST = BINDIR = TMPBINDIR = PYTHONDIR = None
105 "gunzip", "bunzip2", "sed"]
106 createdfiles = []
107
102
108 defaults = {
103 defaults = {
109 'jobs': ('HGTEST_JOBS', 1),
104 'jobs': ('HGTEST_JOBS', 1),
@@ -117,7 +112,7 b' def parselistfiles(files, listtype, warn'
117 for filename in files:
112 for filename in files:
118 try:
113 try:
119 path = os.path.expanduser(os.path.expandvars(filename))
114 path = os.path.expanduser(os.path.expandvars(filename))
120 f = open(path, "r")
115 f = open(path, "rb")
121 except IOError, err:
116 except IOError, err:
122 if err.errno != errno.ENOENT:
117 if err.errno != errno.ENOENT:
123 raise
118 raise
@@ -134,6 +129,7 b' def parselistfiles(files, listtype, warn'
134 return entries
129 return entries
135
130
136 def getparser():
131 def getparser():
132 """Obtain the OptionParser used by the CLI."""
137 parser = optparse.OptionParser("%prog [options] [tests]")
133 parser = optparse.OptionParser("%prog [options] [tests]")
138
134
139 # keep these sorted
135 # keep these sorted
@@ -214,6 +210,7 b' def getparser():'
214 return parser
210 return parser
215
211
216 def parseargs(args, parser):
212 def parseargs(args, parser):
213 """Parse arguments with our OptionParser and validate results."""
217 (options, args) = parser.parse_args(args)
214 (options, args) = parser.parse_args(args)
218
215
219 # jython is always pure
216 # jython is always pure
@@ -285,47 +282,33 b' def rename(src, dst):'
285 shutil.copy(src, dst)
282 shutil.copy(src, dst)
286 os.remove(src)
283 os.remove(src)
287
284
288 def parsehghaveoutput(lines):
285 def getdiff(expected, output, ref, err):
289 '''Parse hghave log lines.
290 Return tuple of lists (missing, failed):
291 * the missing/unknown features
292 * the features for which existence check failed'''
293 missing = []
294 failed = []
295 for line in lines:
296 if line.startswith(SKIPPED_PREFIX):
297 line = line.splitlines()[0]
298 missing.append(line[len(SKIPPED_PREFIX):])
299 elif line.startswith(FAILED_PREFIX):
300 line = line.splitlines()[0]
301 failed.append(line[len(FAILED_PREFIX):])
302
303 return missing, failed
304
305 def showdiff(expected, output, ref, err):
306 print
307 servefail = False
286 servefail = False
287 lines = []
308 for line in difflib.unified_diff(expected, output, ref, err):
288 for line in difflib.unified_diff(expected, output, ref, err):
309 sys.stdout.write(line)
289 if line.startswith('+++') or line.startswith('---'):
290 if line.endswith(' \n'):
291 line = line[:-2] + '\n'
292 lines.append(line)
310 if not servefail and line.startswith(
293 if not servefail and line.startswith(
311 '+ abort: child process failed to start'):
294 '+ abort: child process failed to start'):
312 servefail = True
295 servefail = True
313 return {'servefail': servefail}
314
296
297 return servefail, lines
315
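The `showdiff` → `getdiff` conversion above stops printing directly and instead returns the diff lines, trimming the trailing space that (Python 2's) `difflib` leaves on `+++`/`---` header lines when no file date is given, and flagging a failed server start. A self-contained version of the new function:

```python
import difflib

def getdiff(expected, output, ref, err):
    # build the unified diff as a list instead of printing it, and note
    # whether the output contains a child-process start failure
    servefail = False
    lines = []
    for line in difflib.unified_diff(expected, output, ref, err):
        if line.startswith('+++') or line.startswith('---'):
            if line.endswith(' \n'):
                line = line[:-2] + '\n'  # drop difflib's trailing space
        lines.append(line)
        if not servefail and line.startswith(
                '+ abort: child process failed to start'):
            servefail = True
    return servefail, lines
```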
298
316 verbose = False
299 verbose = False
317 def vlog(*msg):
300 def vlog(*msg):
318 if verbose is not False:
301 """Log only when in verbose mode."""
319 iolock.acquire()
302 if verbose is False:
320 if verbose:
303 return
321 print verbose,
304
322 for m in msg:
305 return log(*msg)
323 print m,
324 print
325 sys.stdout.flush()
326 iolock.release()
327
306
328 def log(*msg):
307 def log(*msg):
308 """Log something to stdout.
309
310 Arguments are strings to print.
311 """
329 iolock.acquire()
312 iolock.acquire()
330 if verbose:
313 if verbose:
331 print verbose,
314 print verbose,
@@ -335,80 +318,6 b' def log(*msg):'
335 sys.stdout.flush()
318 sys.stdout.flush()
336 iolock.release()
319 iolock.release()
337
320
338 def findprogram(program):
339 """Search PATH for a executable program"""
340 for p in os.environ.get('PATH', os.defpath).split(os.pathsep):
341 name = os.path.join(p, program)
342 if os.name == 'nt' or os.access(name, os.X_OK):
343 return name
344 return None
345
346 def createhgrc(path, options):
347 # create a fresh hgrc
348 hgrc = open(path, 'w')
349 hgrc.write('[ui]\n')
350 hgrc.write('slash = True\n')
351 hgrc.write('interactive = False\n')
352 hgrc.write('[defaults]\n')
353 hgrc.write('backout = -d "0 0"\n')
354 hgrc.write('commit = -d "0 0"\n')
355 hgrc.write('shelve = --date "0 0"\n')
356 hgrc.write('tag = -d "0 0"\n')
357 if options.extra_config_opt:
358 for opt in options.extra_config_opt:
359 section, key = opt.split('.', 1)
360 assert '=' in key, ('extra config opt %s must '
361 'have an = for assignment' % opt)
362 hgrc.write('[%s]\n%s\n' % (section, key))
363 hgrc.close()
364
365 def createenv(options, testtmp, threadtmp, port):
366 env = os.environ.copy()
367 env['TESTTMP'] = testtmp
368 env['HOME'] = testtmp
369 env["HGPORT"] = str(port)
370 env["HGPORT1"] = str(port + 1)
371 env["HGPORT2"] = str(port + 2)
372 env["HGRCPATH"] = os.path.join(threadtmp, '.hgrc')
373 env["DAEMON_PIDS"] = os.path.join(threadtmp, 'daemon.pids')
374 env["HGEDITOR"] = sys.executable + ' -c "import sys; sys.exit(0)"'
375 env["HGMERGE"] = "internal:merge"
376 env["HGUSER"] = "test"
377 env["HGENCODING"] = "ascii"
378 env["HGENCODINGMODE"] = "strict"
379
380 # Reset some environment variables to well-known values so that
381 # the tests produce repeatable output.
382 env['LANG'] = env['LC_ALL'] = env['LANGUAGE'] = 'C'
383 env['TZ'] = 'GMT'
384 env["EMAIL"] = "Foo Bar <foo.bar@example.com>"
385 env['COLUMNS'] = '80'
386 env['TERM'] = 'xterm'
387
388 for k in ('HG HGPROF CDPATH GREP_OPTIONS http_proxy no_proxy ' +
389 'NO_PROXY').split():
390 if k in env:
391 del env[k]
392
393 # unset env related to hooks
394 for k in env.keys():
395 if k.startswith('HG_'):
396 del env[k]
397
398 return env
399
400 def checktools():
401 # Before we go any further, check for pre-requisite tools
402 # stuff from coreutils (cat, rm, etc) are not tested
403 for p in requiredtools:
404 if os.name == 'nt' and not p.endswith('.exe'):
405 p += '.exe'
406 found = findprogram(p)
407 if found:
408 vlog("# Found prerequisite", p, "at", found)
409 else:
410 print "WARNING: Did not find prerequisite tool: "+p
411
412 def terminate(proc):
321 def terminate(proc):
413 """Terminate subprocess (with fallback for Python versions < 2.6)"""
322 """Terminate subprocess (with fallback for Python versions < 2.6)"""
414 vlog('# Terminating process %d' % proc.pid)
323 vlog('# Terminating process %d' % proc.pid)
@@ -421,264 +330,413 b' def killdaemons(pidfile):'
421 return killmod.killdaemons(pidfile, tryhard=False, remove=True,
330 return killmod.killdaemons(pidfile, tryhard=False, remove=True,
422 logfn=vlog)
331 logfn=vlog)
423
332
424 def cleanup(options):
333 class Test(unittest.TestCase):
425 if not options.keep_tmpdir:
334 """Encapsulates a single, runnable test.
426 vlog("# Cleaning up HGTMP", HGTMP)
335
427 shutil.rmtree(HGTMP, True)
336 While this class conforms to the unittest.TestCase API, it differs in that
428 for f in createdfiles:
337 instances need to be instantiated manually. (Typically, unittest.TestCase
429 try:
338 classes are instantiated automatically by scanning modules.)
430 os.remove(f)
339 """
431 except OSError:
340
432 pass
341 # Status code reserved for skipped tests (used by hghave).
342 SKIPPED_STATUS = 80
343
344 def __init__(self, path, tmpdir, keeptmpdir=False,
345 debug=False,
346 timeout=defaults['timeout'],
347 startport=defaults['port'], extraconfigopts=None,
348 py3kwarnings=False, shell=None):
349 """Create a test from parameters.
433
350
434 def usecorrectpython():
351 path is the full path to the file defining the test.
435 # some tests run python interpreter. they must use same
352
436 # interpreter we use or bad things will happen.
353 tmpdir is the main temporary directory to use for this test.
437 pyexename = sys.platform == 'win32' and 'python.exe' or 'python'
354
438 if getattr(os, 'symlink', None):
355 keeptmpdir determines whether to keep the test's temporary directory
439 vlog("# Making python executable in test path a symlink to '%s'" %
356 after execution. It defaults to removal (False).
440 sys.executable)
441 mypython = os.path.join(TMPBINDIR, pyexename)
442 try:
443 if os.readlink(mypython) == sys.executable:
444 return
445 os.unlink(mypython)
446 except OSError, err:
447 if err.errno != errno.ENOENT:
448 raise
449 if findprogram(pyexename) != sys.executable:
450 try:
451 os.symlink(sys.executable, mypython)
452 createdfiles.append(mypython)
453 except OSError, err:
454 # child processes may race, which is harmless
455 if err.errno != errno.EEXIST:
456 raise
457 else:
458 exedir, exename = os.path.split(sys.executable)
459 vlog("# Modifying search path to find %s as %s in '%s'" %
460 (exename, pyexename, exedir))
461 path = os.environ['PATH'].split(os.pathsep)
462 while exedir in path:
463 path.remove(exedir)
464 os.environ['PATH'] = os.pathsep.join([exedir] + path)
465 if not findprogram(pyexename):
466 print "WARNING: Cannot find %s in search path" % pyexename
467
357
468 def installhg(options):
358 debug mode will make the test execute verbosely, with unfiltered
469 vlog("# Performing temporary installation of HG")
359 output.
470 installerrs = os.path.join("tests", "install.err")
360
471 compiler = ''
361 timeout controls the maximum run time of the test. It is ignored when
472 if options.compiler:
362 debug is True.
473 compiler = '--compiler ' + options.compiler
363
474 pure = options.pure and "--pure" or ""
364 startport controls the starting port number to use for this test. Each
-    py3 = ''
-    if sys.version_info[0] == 3:
-        py3 = '--c2to3'
-
-    # Run installer in hg root
-    script = os.path.realpath(sys.argv[0])
-    hgroot = os.path.dirname(os.path.dirname(script))
-    os.chdir(hgroot)
-    nohome = '--home=""'
-    if os.name == 'nt':
-        # The --home="" trick works only on OS where os.sep == '/'
-        # because of a distutils convert_path() fast-path. Avoid it at
-        # least on Windows for now, deal with .pydistutils.cfg bugs
-        # when they happen.
-        nohome = ''
-    cmd = ('%(exe)s setup.py %(py3)s %(pure)s clean --all'
-           ' build %(compiler)s --build-base="%(base)s"'
-           ' install --force --prefix="%(prefix)s" --install-lib="%(libdir)s"'
-           ' --install-scripts="%(bindir)s" %(nohome)s >%(logfile)s 2>&1'
-           % {'exe': sys.executable, 'py3': py3, 'pure': pure,
-              'compiler': compiler, 'base': os.path.join(HGTMP, "build"),
-              'prefix': INST, 'libdir': PYTHONDIR, 'bindir': BINDIR,
-              'nohome': nohome, 'logfile': installerrs})
-    vlog("# Running", cmd)
-    if os.system(cmd) == 0:
-        if not options.verbose:
-            os.remove(installerrs)
-    else:
-        f = open(installerrs)
-        for line in f:
-            print line,
-        f.close()
-        sys.exit(1)
-    os.chdir(TESTDIR)
-
-    usecorrectpython()
-
-    if options.py3k_warnings and not options.anycoverage:
-        vlog("# Updating hg command to enable Py3k Warnings switch")
-        f = open(os.path.join(BINDIR, 'hg'), 'r')
-        lines = [line.rstrip() for line in f]
-        lines[0] += ' -3'
-        f.close()
-        f = open(os.path.join(BINDIR, 'hg'), 'w')
-        for line in lines:
-            f.write(line + '\n')
-        f.close()
-
-    hgbat = os.path.join(BINDIR, 'hg.bat')
-    if os.path.isfile(hgbat):
-        # hg.bat expects to be put in bin/scripts while run-tests.py
-        # installation layout put it in bin/ directly. Fix it
-        f = open(hgbat, 'rb')
-        data = f.read()
-        f.close()
-        if '"%~dp0..\python" "%~dp0hg" %*' in data:
-            data = data.replace('"%~dp0..\python" "%~dp0hg" %*',
-                                '"%~dp0python" "%~dp0hg" %*')
-            f = open(hgbat, 'wb')
-            f.write(data)
-            f.close()
-        else:
-            print 'WARNING: cannot fix hg.bat reference to python.exe'
-
-    if options.anycoverage:
-        custom = os.path.join(TESTDIR, 'sitecustomize.py')
-        target = os.path.join(PYTHONDIR, 'sitecustomize.py')
-        vlog('# Installing coverage trigger to %s' % target)
-        shutil.copyfile(custom, target)
-        rc = os.path.join(TESTDIR, '.coveragerc')
-        vlog('# Installing coverage rc to %s' % rc)
-        os.environ['COVERAGE_PROCESS_START'] = rc
-        fn = os.path.join(INST, '..', '.coverage')
-        os.environ['COVERAGE_FILE'] = fn
-
-def outputtimes(options):
-    vlog('# Producing time report')
-    times.sort(key=lambda t: (t[1], t[0]), reverse=True)
-    cols = '%7.3f %s'
-    print '\n%-7s %s' % ('Time', 'Test')
-    for test, timetaken in times:
-        print cols % (timetaken, test)
-
-def outputcoverage(options):
-
-    vlog('# Producing coverage report')
-    os.chdir(PYTHONDIR)
-
-    def covrun(*args):
-        cmd = 'coverage %s' % ' '.join(args)
-        vlog('# Running: %s' % cmd)
-        os.system(cmd)
-
-    covrun('-c')
-    omit = ','.join(os.path.join(x, '*') for x in [BINDIR, TESTDIR])
-    covrun('-i', '-r', '"--omit=%s"' % omit) # report
-    if options.htmlcov:
-        htmldir = os.path.join(TESTDIR, 'htmlcov')
-        covrun('-i', '-b', '"--directory=%s"' % htmldir, '"--omit=%s"' % omit)
-    if options.annotate:
-        adir = os.path.join(TESTDIR, 'annotated')
-        if not os.path.isdir(adir):
-            os.mkdir(adir)
-        covrun('-i', '-a', '"--directory=%s"' % adir, '"--omit=%s"' % omit)
-
-def pytest(test, wd, options, replacements, env):
-    py3kswitch = options.py3k_warnings and ' -3' or ''
-    cmd = '%s%s "%s"' % (PYTHON, py3kswitch, test)
-    vlog("# Running", cmd)
-    if os.name == 'nt':
-        replacements.append((r'\r\n', '\n'))
-    return run(cmd, wd, options, replacements, env)
-
-needescape = re.compile(r'[\x00-\x08\x0b-\x1f\x7f-\xff]').search
-escapesub = re.compile(r'[\x00-\x08\x0b-\x1f\\\x7f-\xff]').sub
-escapemap = dict((chr(i), r'\x%02x' % i) for i in range(256))
-escapemap.update({'\\': '\\\\', '\r': r'\r'})
-def escapef(m):
-    return escapemap[m.group(0)]
-def stringescape(s):
-    return escapesub(escapef, s)
-
-def rematch(el, l):
-    try:
-        # use \Z to ensure that the regex matches to the end of the string
-        if os.name == 'nt':
-            return re.match(el + r'\r?\n\Z', l)
-        return re.match(el + r'\n\Z', l)
-    except re.error:
-        # el is an invalid regex
-        return False
-
-def globmatch(el, l):
-    # The only supported special characters are * and ? plus / which also
-    # matches \ on windows. Escaping of these characters is supported.
-    if el + '\n' == l:
-        if os.altsep:
-            # matching on "/" is not needed for this line
-            return '-glob'
-        return True
-    i, n = 0, len(el)
-    res = ''
-    while i < n:
-        c = el[i]
-        i += 1
-        if c == '\\' and el[i] in '*?\\/':
-            res += el[i - 1:i + 1]
-            i += 1
-        elif c == '*':
-            res += '.*'
-        elif c == '?':
-            res += '.'
-        elif c == '/' and os.altsep:
-            res += '[/\\\\]'
-        else:
-            res += re.escape(c)
-    return rematch(res, l)
+        test will reserve 3 port numbers for execution. It is the caller's
+        responsibility to allocate a non-overlapping port range to Test
+        instances.
+
+        extraconfigopts is an iterable of extra hgrc config options. Values
+        must have the form "key=value" (something understood by hgrc). Values
+        of the form "foo.key=value" will result in "[foo] key=value".
+
+        py3kwarnings enables Py3k warnings.
+
+        shell is the shell to execute tests in.
+        """
+
+        self.path = path
+        self.name = os.path.basename(path)
+        self._testdir = os.path.dirname(path)
+        self.errpath = os.path.join(self._testdir, '%s.err' % self.name)
+
+        self._threadtmp = tmpdir
+        self._keeptmpdir = keeptmpdir
+        self._debug = debug
+        self._timeout = timeout
+        self._startport = startport
+        self._extraconfigopts = extraconfigopts or []
+        self._py3kwarnings = py3kwarnings
+        self._shell = shell
+
+        self._aborted = False
+        self._daemonpids = []
+        self._finished = None
+        self._ret = None
+        self._out = None
+        self._skipped = None
+        self._testtmp = None
+
+        # If we're not in --debug mode and reference output file exists,
+        # check test output against it.
+        if debug:
+            self._refout = None # to match "out is None"
+        elif os.path.exists(self.refpath):
+            f = open(self.refpath, 'rb')
+            self._refout = f.read().splitlines(True)
+            f.close()
+        else:
+            self._refout = []
+
+    def __str__(self):
+        return self.name
+
+    def shortDescription(self):
+        return self.name
+
+    def setUp(self):
+        """Tasks to perform before run()."""
+        self._finished = False
+        self._ret = None
+        self._out = None
+        self._skipped = None
+
+        try:
+            os.mkdir(self._threadtmp)
+        except OSError, e:
+            if e.errno != errno.EEXIST:
+                raise
+
+        self._testtmp = os.path.join(self._threadtmp,
+                                     os.path.basename(self.path))
+        os.mkdir(self._testtmp)
+
+        # Remove any previous output files.
+        if os.path.exists(self.errpath):
+            os.remove(self.errpath)
+
+    def run(self, result):
+        """Run this test and report results against a TestResult instance."""
+        # This function is extremely similar to unittest.TestCase.run(). Once
+        # we require Python 2.7 (or at least its version of unittest), this
+        # function can largely go away.
+        self._result = result
+        result.startTest(self)
+        try:
+            try:
+                self.setUp()
+            except (KeyboardInterrupt, SystemExit):
+                self._aborted = True
+                raise
+            except Exception:
+                result.addError(self, sys.exc_info())
+                return
+
+            success = False
+            try:
+                self.runTest()
+            except KeyboardInterrupt:
+                self._aborted = True
+                raise
+            except SkipTest, e:
+                result.addSkip(self, str(e))
+            except IgnoreTest, e:
+                result.addIgnore(self, str(e))
+            except WarnTest, e:
+                result.addWarn(self, str(e))
+            except self.failureException, e:
+                # This differs from unittest in that we don't capture
+                # the stack trace. This is for historical reasons and
+                # this decision could be revisited in the future,
+                # especially for PythonTest instances.
+                if result.addFailure(self, str(e)):
+                    success = True
+            except Exception:
+                result.addError(self, sys.exc_info())
+            else:
+                success = True
+
+            try:
+                self.tearDown()
+            except (KeyboardInterrupt, SystemExit):
+                self._aborted = True
+                raise
+            except Exception:
+                result.addError(self, sys.exc_info())
+                success = False
+
+            if success:
+                result.addSuccess(self)
+        finally:
+            result.stopTest(self, interrupted=self._aborted)
+
+    def runTest(self):
+        """Run this test instance.
+
+        This will return a tuple describing the result of the test.
+        """
+        replacements = self._getreplacements()
+        env = self._getenv()
+        self._daemonpids.append(env['DAEMON_PIDS'])
+        self._createhgrc(env['HGRCPATH'])
+
+        vlog('# Test', self.name)
+
+        ret, out = self._run(replacements, env)
+        self._finished = True
+        self._ret = ret
+        self._out = out
+
+        def describe(ret):
+            if ret < 0:
+                return 'killed by signal: %d' % -ret
+            return 'returned error code %d' % ret
+
+        self._skipped = False
+
+        if ret == self.SKIPPED_STATUS:
+            if out is None: # Debug mode, nothing to parse.
+                missing = ['unknown']
+                failed = None
+            else:
+                missing, failed = TTest.parsehghaveoutput(out)
+
+            if not missing:
+                missing = ['irrelevant']
+
+            if failed:
+                self.fail('hg have failed checking for %s' % failed[-1])
+            else:
+                self._skipped = True
+                raise SkipTest(missing[-1])
+        elif ret == 'timeout':
+            self.fail('timed out')
+        elif ret is False:
+            raise WarnTest('no result code from test')
+        elif out != self._refout:
+            # Diff generation may rely on written .err file.
+            if (ret != 0 or out != self._refout) and not self._skipped \
+                and not self._debug:
+                f = open(self.errpath, 'wb')
+                for line in out:
+                    f.write(line)
+                f.close()
+
+            # The result object handles diff calculation for us.
+            if self._result.addOutputMismatch(self, ret, out, self._refout):
+                # change was accepted, skip failing
+                return
+
+            if ret:
+                msg = 'output changed and ' + describe(ret)
+            else:
+                msg = 'output changed'
+
+            self.fail(msg)
+        elif ret:
+            self.fail(describe(ret))
+
+    def tearDown(self):
+        """Tasks to perform after run()."""
+        for entry in self._daemonpids:
+            killdaemons(entry)
+        self._daemonpids = []
+
+        if not self._keeptmpdir:
+            shutil.rmtree(self._testtmp, True)
+            shutil.rmtree(self._threadtmp, True)
+
+        if (self._ret != 0 or self._out != self._refout) and not self._skipped \
+            and not self._debug and self._out:
+            f = open(self.errpath, 'wb')
+            for line in self._out:
+                f.write(line)
+            f.close()
+
+        vlog("# Ret was:", self._ret)
+
+    def _run(self, replacements, env):
+        # This should be implemented in child classes to run tests.
+        raise SkipTest('unknown test type')
+
+    def abort(self):
+        """Terminate execution of this test."""
+        self._aborted = True
+
+    def _getreplacements(self):
+        """Obtain a mapping of text replacements to apply to test output.
+
+        Test output needs to be normalized so it can be compared to expected
+        output. This function defines how some of that normalization will
+        occur.
+        """
+        r = [
+            (r':%s\b' % self._startport, ':$HGPORT'),
+            (r':%s\b' % (self._startport + 1), ':$HGPORT1'),
+            (r':%s\b' % (self._startport + 2), ':$HGPORT2'),
+            ]
+
+        if os.name == 'nt':
+            r.append(
+                (''.join(c.isalpha() and '[%s%s]' % (c.lower(), c.upper()) or
+                    c in '/\\' and r'[/\\]' or c.isdigit() and c or '\\' + c
+                    for c in self._testtmp), '$TESTTMP'))
+        else:
+            r.append((re.escape(self._testtmp), '$TESTTMP'))
+
+        return r
+
+    def _getenv(self):
+        """Obtain environment variables to use during test execution."""
+        env = os.environ.copy()
+        env['TESTTMP'] = self._testtmp
+        env['HOME'] = self._testtmp
+        env["HGPORT"] = str(self._startport)
+        env["HGPORT1"] = str(self._startport + 1)
+        env["HGPORT2"] = str(self._startport + 2)
+        env["HGRCPATH"] = os.path.join(self._threadtmp, '.hgrc')
+        env["DAEMON_PIDS"] = os.path.join(self._threadtmp, 'daemon.pids')
+        env["HGEDITOR"] = sys.executable + ' -c "import sys; sys.exit(0)"'
+        env["HGMERGE"] = "internal:merge"
+        env["HGUSER"] = "test"
+        env["HGENCODING"] = "ascii"
+        env["HGENCODINGMODE"] = "strict"
+
+        # Reset some environment variables to well-known values so that
+        # the tests produce repeatable output.
+        env['LANG'] = env['LC_ALL'] = env['LANGUAGE'] = 'C'
+        env['TZ'] = 'GMT'
+        env["EMAIL"] = "Foo Bar <foo.bar@example.com>"
+        env['COLUMNS'] = '80'
+        env['TERM'] = 'xterm'
+
+        for k in ('HG HGPROF CDPATH GREP_OPTIONS http_proxy no_proxy ' +
+                  'NO_PROXY').split():
+            if k in env:
+                del env[k]
+
+        # unset env related to hooks
+        for k in env.keys():
+            if k.startswith('HG_'):
+                del env[k]
+
+        return env
-
-def linematch(el, l):
-    if el == l: # perfect match (fast)
-        return True
-    if el:
-        if el.endswith(" (esc)\n"):
-            el = el[:-7].decode('string-escape') + '\n'
-        if el == l or os.name == 'nt' and el[:-1] + '\r\n' == l:
-            return True
-        if el.endswith(" (re)\n"):
-            return rematch(el[:-6], l)
-        if el.endswith(" (glob)\n"):
-            return globmatch(el[:-8], l)
-        if os.altsep and l.replace('\\', '/') == el:
-            return '+glob'
-    return False
-
-def tsttest(test, wd, options, replacements, env):
-    # We generate a shell script which outputs unique markers to line
-    # up script results with our source. These markers include input
-    # line number and the last return code
-    salt = "SALT" + str(time.time())
-    def addsalt(line, inpython):
-        if inpython:
-            script.append('%s %d 0\n' % (salt, line))
-        else:
-            script.append('echo %s %s $?\n' % (salt, line))
-
-    # After we run the shell script, we re-unify the script output
-    # with non-active parts of the source, with synchronization by our
-    # SALT line number markers. The after table contains the
-    # non-active components, ordered by line number
-    after = {}
-    pos = prepos = -1
-
-    # Expected shell script output
-    expected = {}
-
-    # We keep track of whether or not we're in a Python block so we
-    # can generate the surrounding doctest magic
-    inpython = False
-
-    # True or False when in a true or false conditional section
-    skipping = None
-
-    def hghave(reqs):
-        # TODO: do something smarter when all other uses of hghave is gone
-        tdir = TESTDIR.replace('\\', '/')
-        proc = Popen4('%s -c "%s/hghave %s"' %
-                      (options.shell, tdir, ' '.join(reqs)), wd, 0)
-        stdout, stderr = proc.communicate()
-        ret = proc.wait()
-        if wifexited(ret):
+
+    def _createhgrc(self, path):
+        """Create an hgrc file for this test."""
+        hgrc = open(path, 'wb')
+        hgrc.write('[ui]\n')
+        hgrc.write('slash = True\n')
+        hgrc.write('interactive = False\n')
+        hgrc.write('mergemarkers = detailed\n')
+        hgrc.write('[defaults]\n')
+        hgrc.write('backout = -d "0 0"\n')
+        hgrc.write('commit = -d "0 0"\n')
+        hgrc.write('shelve = --date "0 0"\n')
+        hgrc.write('tag = -d "0 0"\n')
+        for opt in self._extraconfigopts:
+            section, key = opt.split('.', 1)
+            assert '=' in key, ('extra config opt %s must '
+                                'have an = for assignment' % opt)
+            hgrc.write('[%s]\n%s\n' % (section, key))
+        hgrc.close()
+
+    def fail(self, msg):
+        # unittest differentiates between errored and failed.
+        # Failed is denoted by AssertionError (by default at least).
+        raise AssertionError(msg)
+
+class PythonTest(Test):
+    """A Python-based test."""
+
+    @property
+    def refpath(self):
+        return os.path.join(self._testdir, '%s.out' % self.name)
+
+    def _run(self, replacements, env):
+        py3kswitch = self._py3kwarnings and ' -3' or ''
+        cmd = '%s%s "%s"' % (PYTHON, py3kswitch, self.path)
+        vlog("# Running", cmd)
+        if os.name == 'nt':
+            replacements.append((r'\r\n', '\n'))
+        result = run(cmd, self._testtmp, replacements, env,
+                     debug=self._debug, timeout=self._timeout)
+        if self._aborted:
+            raise KeyboardInterrupt()
+
+        return result
+
+class TTest(Test):
+    """A "t test" is a test backed by a .t file."""
+
+    SKIPPED_PREFIX = 'skipped: '
+    FAILED_PREFIX = 'hghave check failed: '
+    NEEDESCAPE = re.compile(r'[\x00-\x08\x0b-\x1f\x7f-\xff]').search
+
+    ESCAPESUB = re.compile(r'[\x00-\x08\x0b-\x1f\\\x7f-\xff]').sub
+    ESCAPEMAP = dict((chr(i), r'\x%02x' % i) for i in range(256))
+    ESCAPEMAP.update({'\\': '\\\\', '\r': r'\r'})
+
+    @property
+    def refpath(self):
+        return os.path.join(self._testdir, self.name)
+
+    def _run(self, replacements, env):
+        f = open(self.path, 'rb')
+        lines = f.readlines()
+        f.close()
+
+        salt, script, after, expected = self._parsetest(lines)
+
+        # Write out the generated script.
+        fname = '%s.sh' % self._testtmp
+        f = open(fname, 'wb')
+        for l in script:
+            f.write(l)
+        f.close()
+
+        cmd = '%s "%s"' % (self._shell, fname)
+        vlog("# Running", cmd)
+
+        exitcode, output = run(cmd, self._testtmp, replacements, env,
+                               debug=self._debug, timeout=self._timeout)
+
+        if self._aborted:
+            raise KeyboardInterrupt()
+
+        # Do not merge output if skipped. Return hghave message instead.
+        # Similarly, with --debug, output is None.
+        if exitcode == self.SKIPPED_STATUS or output is None:
+            return exitcode, output
+
+        return self._processoutput(exitcode, output, salt, after, expected)
+
+    def _hghave(self, reqs):
+        # TODO do something smarter when all other uses of hghave are gone.
+        tdir = self._testdir.replace('\\', '/')
+        proc = Popen4('%s -c "%s/hghave %s"' %
+                      (self._shell, tdir, ' '.join(reqs)),
+                      self._testtmp, 0)
+        stdout, stderr = proc.communicate()
+        ret = proc.wait()
+        if wifexited(ret):
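The `globmatch` helper shown in this hunk supports only `*`, `?`, and backslash-escapes of those characters, translating the pattern into a regular expression that is then anchored to the end of the line. A minimal standalone sketch of the same translation (simplified: it skips the Windows `os.altsep` handling, so `/` is always literal):

```python
import re

def glob_to_regex(pattern):
    """Translate run-tests style glob syntax into a regex source string.

    Only '*' and '?' are special; a backslash may escape '*', '?', '\\'
    and '/'. Everything else is matched literally.
    """
    i, n = 0, len(pattern)
    res = ''
    while i < n:
        c = pattern[i]
        i += 1
        if c == '\\' and i < n and pattern[i] in '*?\\/':
            res += re.escape(pattern[i])  # keep the escaped char literal
            i += 1
        elif c == '*':
            res += '.*'   # any run of characters
        elif c == '?':
            res += '.'    # exactly one character
        else:
            res += re.escape(c)
    return res

def globmatch(pattern, line):
    # Anchor with \Z so trailing junk on the line is not accepted.
    return re.match(glob_to_regex(pattern) + r'\n\Z', line) is not None
```

Reusing the regex engine this way is the design choice the diff makes too: glob matching becomes a thin translation layer over `rematch`, so both `(re)` and `(glob)` expected lines share one anchored matching path.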
@@ -686,172 +744,271 b' def tsttest(test, wd, options, replaceme'
-            if ret == 2:
-                print stdout
-                sys.exit(1)
-        return ret == 0
-
-    f = open(test)
-    t = f.readlines()
-    f.close()
-
-    script = []
-    if options.debug:
-        script.append('set -x\n')
-    if os.getenv('MSYSTEM'):
-        script.append('alias pwd="pwd -W"\n')
-    n = 0
-    for n, l in enumerate(t):
-        if not l.endswith('\n'):
-            l += '\n'
-        if l.startswith('#if'):
-            lsplit = l.split()
-            if len(lsplit) < 2 or lsplit[0] != '#if':
-                after.setdefault(pos, []).append('  !!! invalid #if\n')
-            if skipping is not None:
-                after.setdefault(pos, []).append('  !!! nested #if\n')
-            skipping = not hghave(lsplit[1:])
-            after.setdefault(pos, []).append(l)
-        elif l.startswith('#else'):
-            if skipping is None:
-                after.setdefault(pos, []).append('  !!! missing #if\n')
-            skipping = not skipping
-            after.setdefault(pos, []).append(l)
-        elif l.startswith('#endif'):
-            if skipping is None:
-                after.setdefault(pos, []).append('  !!! missing #if\n')
-            skipping = None
-            after.setdefault(pos, []).append(l)
-        elif skipping:
-            after.setdefault(pos, []).append(l)
-        elif l.startswith('  >>> '): # python inlines
-            after.setdefault(pos, []).append(l)
-            prepos = pos
-            pos = n
-            if not inpython:
-                # we've just entered a Python block, add the header
-                inpython = True
-                addsalt(prepos, False) # make sure we report the exit code
-                script.append('%s -m heredoctest <<EOF\n' % PYTHON)
-            addsalt(n, True)
-            script.append(l[2:])
-        elif l.startswith('  ... '): # python inlines
-            after.setdefault(prepos, []).append(l)
-            script.append(l[2:])
-        elif l.startswith('  $ '): # commands
-            if inpython:
-                script.append("EOF\n")
-                inpython = False
-            after.setdefault(pos, []).append(l)
-            prepos = pos
-            pos = n
-            addsalt(n, False)
-            cmd = l[4:].split()
-            if len(cmd) == 2 and cmd[0] == 'cd':
-                l = '  $ cd %s || exit 1\n' % cmd[1]
-            script.append(l[4:])
-        elif l.startswith('  > '): # continuations
-            after.setdefault(prepos, []).append(l)
-            script.append(l[4:])
-        elif l.startswith('  '): # results
-            # queue up a list of expected results
-            expected.setdefault(pos, []).append(l[2:])
-        else:
-            if inpython:
-                script.append("EOF\n")
-                inpython = False
-            # non-command/result - queue up for merged output
-            after.setdefault(pos, []).append(l)
-
-    if inpython:
-        script.append("EOF\n")
-    if skipping is not None:
-        after.setdefault(pos, []).append('  !!! missing #endif\n')
-    addsalt(n + 1, False)
-
-    # Write out the script and execute it
-    name = wd + '.sh'
-    f = open(name, 'w')
-    for l in script:
-        f.write(l)
-    f.close()
-
-    cmd = '%s "%s"' % (options.shell, name)
-    vlog("# Running", cmd)
-    exitcode, output = run(cmd, wd, options, replacements, env)
-    # do not merge output if skipped, return hghave message instead
-    # similarly, with --debug, output is None
-    if exitcode == SKIPPED_STATUS or output is None:
-        return exitcode, output
-
-    # Merge the script output back into a unified test
-
-    warnonly = 1 # 1: not yet, 2: yes, 3: for sure not
-    if exitcode != 0: # failure has been reported
-        warnonly = 3 # set to "for sure not"
-    pos = -1
-    postout = []
-    for l in output:
-        lout, lcmd = l, None
-        if salt in l:
-            lout, lcmd = l.split(salt, 1)
-
-        if lout:
-            if not lout.endswith('\n'):
-                lout += ' (no-eol)\n'
-
-            # find the expected output at the current position
-            el = None
-            if pos in expected and expected[pos]:
-                el = expected[pos].pop(0)
-
-            r = linematch(el, lout)
-            if isinstance(r, str):
-                if r == '+glob':
-                    lout = el[:-1] + ' (glob)\n'
-                    r = '' # warn only this line
-                elif r == '-glob':
-                    lout = ''.join(el.rsplit(' (glob)', 1))
-                    r = '' # warn only this line
-                else:
-                    log('\ninfo, unknown linematch result: %r\n' % r)
-                    r = False
-            if r:
-                postout.append("  " + el)
-            else:
-                if needescape(lout):
-                    lout = stringescape(lout.rstrip('\n')) + " (esc)\n"
-                postout.append("  " + lout) # let diff deal with it
-                if r != '': # if line failed
-                    warnonly = 3 # set to "for sure not"
-                elif warnonly == 1: # is "not yet" (and line is warn only)
-                    warnonly = 2 # set to "yes" do warn
-
-        if lcmd:
-            # add on last return code
-            ret = int(lcmd.split()[1])
-            if ret != 0:
-                postout.append("  [%s]\n" % ret)
-            if pos in after:
-                # merge in non-active test bits
-                postout += after.pop(pos)
-            pos = int(lcmd.split()[0])
-
-    if pos in after:
-        postout += after.pop(pos)
-
-    if warnonly == 2:
-        exitcode = False # set exitcode to warned
-    return exitcode, postout
+            if ret == 2:
+                print stdout
+                sys.exit(1)
+
+        return ret == 0
+
+    def _parsetest(self, lines):
+        # We generate a shell script which outputs unique markers to line
+        # up script results with our source. These markers include input
+        # line number and the last return code.
+        salt = "SALT" + str(time.time())
+        def addsalt(line, inpython):
+            if inpython:
+                script.append('%s %d 0\n' % (salt, line))
+            else:
+                script.append('echo %s %s $?\n' % (salt, line))
+
+        script = []
+
+        # After we run the shell script, we re-unify the script output
+        # with non-active parts of the source, with synchronization by our
+        # SALT line number markers. The after table contains the non-active
+        # components, ordered by line number.
+        after = {}
+
+        # Expected shell script output.
+        expected = {}
+
+        pos = prepos = -1
+
+        # True or False when in a true or false conditional section
+        skipping = None
+
+        # We keep track of whether or not we're in a Python block so we
+        # can generate the surrounding doctest magic.
+        inpython = False
+
+        if self._debug:
+            script.append('set -x\n')
+        if os.getenv('MSYSTEM'):
+            script.append('alias pwd="pwd -W"\n')
+
+        for n, l in enumerate(lines):
+            if not l.endswith('\n'):
+                l += '\n'
+            if l.startswith('#if'):
+                lsplit = l.split()
+                if len(lsplit) < 2 or lsplit[0] != '#if':
+                    after.setdefault(pos, []).append('  !!! invalid #if\n')
+                if skipping is not None:
+                    after.setdefault(pos, []).append('  !!! nested #if\n')
+                skipping = not self._hghave(lsplit[1:])
+                after.setdefault(pos, []).append(l)
+            elif l.startswith('#else'):
+                if skipping is None:
+                    after.setdefault(pos, []).append('  !!! missing #if\n')
+                skipping = not skipping
+                after.setdefault(pos, []).append(l)
+            elif l.startswith('#endif'):
+                if skipping is None:
+                    after.setdefault(pos, []).append('  !!! missing #if\n')
+                skipping = None
+                after.setdefault(pos, []).append(l)
+            elif skipping:
+                after.setdefault(pos, []).append(l)
+            elif l.startswith('  >>> '): # python inlines
+                after.setdefault(pos, []).append(l)
+                prepos = pos
+                pos = n
+                if not inpython:
+                    # We've just entered a Python block. Add the header.
+                    inpython = True
+                    addsalt(prepos, False) # Make sure we report the exit code.
+                    script.append('%s -m heredoctest <<EOF\n' % PYTHON)
+                addsalt(n, True)
+                script.append(l[2:])
+            elif l.startswith('  ... '): # python inlines
+                after.setdefault(prepos, []).append(l)
+                script.append(l[2:])
+            elif l.startswith('  $ '): # commands
+                if inpython:
+                    script.append('EOF\n')
+                    inpython = False
+                after.setdefault(pos, []).append(l)
+                prepos = pos
+                pos = n
+                addsalt(n, False)
+                cmd = l[4:].split()
+                if len(cmd) == 2 and cmd[0] == 'cd':
+                    l = '  $ cd %s || exit 1\n' % cmd[1]
+                script.append(l[4:])
+            elif l.startswith('  > '): # continuations
+                after.setdefault(prepos, []).append(l)
+                script.append(l[4:])
+            elif l.startswith('  '): # results
+                # Queue up a list of expected results.
+                expected.setdefault(pos, []).append(l[2:])
+            else:
+                if inpython:
+                    script.append('EOF\n')
+                    inpython = False
+                # Non-command/result. Queue up for merged output.
+                after.setdefault(pos, []).append(l)
+
+        if inpython:
+            script.append('EOF\n')
+        if skipping is not None:
+            after.setdefault(pos, []).append('  !!! missing #endif\n')
+        addsalt(n + 1, False)
+
+        return salt, script, after, expected
+
+    def _processoutput(self, exitcode, output, salt, after, expected):
+        # Merge the script output back into a unified test.
+        warnonly = 1 # 1: not yet; 2: yes; 3: for sure not
+        if exitcode != 0:
+            warnonly = 3
+
+        pos = -1
+        postout = []
+        for l in output:
+            lout, lcmd = l, None
+            if salt in l:
+                lout, lcmd = l.split(salt, 1)
+
+            if lout:
+                if not lout.endswith('\n'):
+                    lout += ' (no-eol)\n'
+
+                # Find the expected output at the current position.
+                el = None
+                if expected.get(pos, None):
+                    el = expected[pos].pop(0)
+
+                r = TTest.linematch(el, lout)
+                if isinstance(r, str):
+                    if r == '+glob':
+                        lout = el[:-1] + ' (glob)\n'
+                        r = '' # Warn only this line.
+                    elif r == '-glob':
+                        lout = ''.join(el.rsplit(' (glob)', 1))
+                        r = '' # Warn only this line.
+                    else:
+                        log('\ninfo, unknown linematch result: %r\n' % r)
+                        r = False
+                if r:
+                    postout.append('  ' + el)
+                else:
+                    if self.NEEDESCAPE(lout):
+                        lout = TTest._stringescape('%s (esc)\n' %
+                                                   lout.rstrip('\n'))
+                    postout.append('  ' + lout) # Let diff deal with it.
+                    if r != '': # If line failed.
+                        warnonly = 3 # for sure not
+                    elif warnonly == 1: # Is "not yet" and line is warn only.
+                        warnonly = 2 # Yes do warn.
+
+            if lcmd:
+                # Add on last return code.
+                ret = int(lcmd.split()[1])
+                if ret != 0:
+                    postout.append('  [%s]\n' % ret)
+                if pos in after:
+                    # Merge in non-active test bits.
+                    postout += after.pop(pos)
+                pos = int(lcmd.split()[0])
+
+        if pos in after:
+            postout += after.pop(pos)
+
+        if warnonly == 2:
+            exitcode = False # Set exitcode to warned.
+
+        return exitcode, postout
+
+    @staticmethod
+    def rematch(el, l):
+        try:
+            # use \Z to ensure that the regex matches to the end of the string
+            if os.name == 'nt':
+                return re.match(el + r'\r?\n\Z', l)
+            return re.match(el + r'\n\Z', l)
+        except re.error:
+            # el is an invalid regex
+            return False
+
+    @staticmethod
+    def globmatch(el, l):
+        # The only supported special characters are * and ? plus / which also
+        # matches \ on windows. Escaping of these characters is supported.
+        if el + '\n' == l:
+            if os.altsep:
+                # matching on "/" is not needed for this line
+                return '-glob'
+            return True
+        i, n = 0, len(el)
+        res = ''
+        while i < n:
+            c = el[i]
+            i += 1
+            if c == '\\' and el[i] in '*?\\/':
+                res += el[i - 1:i + 1]
+                i += 1
+            elif c == '*':
+                res += '.*'
+            elif c == '?':
+                res += '.'
+            elif c == '/' and os.altsep:
+                res += '[/\\\\]'
+            else:
+                res += re.escape(c)
+        return TTest.rematch(res, l)
+
+    @staticmethod
+    def linematch(el, l):
+        if el == l: # perfect match (fast)
+            return True
+        if el:
+            if el.endswith(" (esc)\n"):
+                el = el[:-7].decode('string-escape') + '\n'
+            if el == l or os.name == 'nt' and el[:-1] + '\r\n' == l:
+                return True
+            if el.endswith(" (re)\n"):
+                return TTest.rematch(el[:-6], l)
+            if el.endswith(" (glob)\n"):
+                return TTest.globmatch(el[:-8], l)
+            if os.altsep and l.replace('\\', '/') == el:
+                return '+glob'
+        return False
+
+    @staticmethod
+    def parsehghaveoutput(lines):
+        '''Parse hghave log lines.
+
+        Return tuple of lists (missing, failed):
+          * the missing/unknown features
+          * the features for which existence check failed'''
+        missing = []
+        failed = []
+        for line in lines:
+            if line.startswith(TTest.SKIPPED_PREFIX):
+                line = line.splitlines()[0]
+                missing.append(line[len(TTest.SKIPPED_PREFIX):])
+            elif line.startswith(TTest.FAILED_PREFIX):
+                line = line.splitlines()[0]
+                failed.append(line[len(TTest.FAILED_PREFIX):])
+
+        return missing, failed
+
+    @staticmethod
+    def _escapef(m):
+        return TTest.ESCAPEMAP[m.group(0)]
+
+    @staticmethod
+    def _stringescape(s):
+        return TTest.ESCAPESUB(TTest._escapef, s)
+
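The escape machinery at the end of this hunk rewrites non-printable bytes in captured output as `\xNN` sequences so that the expected output recorded in a `.t` file (marked with `(esc)`) stays editable as plain text. A standalone sketch of the same table-driven substitution:

```python
import re

# Bytes outside the printable ASCII range, plus backslash itself, trigger
# escaping; each is replaced via a precomputed character -> escape table.
ESCAPESUB = re.compile(r'[\x00-\x08\x0b-\x1f\\\x7f-\xff]').sub
ESCAPEMAP = dict((chr(i), r'\x%02x' % i) for i in range(256))
ESCAPEMAP.update({'\\': '\\\\', '\r': r'\r'})

def stringescape(s):
    # The sub() callback looks each matched character up in the table.
    return ESCAPESUB(lambda m: ESCAPEMAP[m.group(0)], s)
```

Escaping backslash as well is what makes the transformation unambiguous: a literal `\x07` in the output and a real BEL byte escape to different strings, so the recorded expectation can be decoded back exactly.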
-
-wifexited = getattr(os, "WIFEXITED", lambda x: False)
-def run(cmd, wd, options, replacements, env):
-    """Run command in a sub-process, capturing the output (stdout and stderr).
-    Return a tuple (exitcode, output). output is None in debug mode."""
-    # TODO: Use subprocess.Popen if we're running on Python 2.4
-    if options.debug:
-        proc = subprocess.Popen(cmd, shell=True, cwd=wd, env=env)
-        ret = proc.wait()
-        return (ret, None)
-
-    proc = Popen4(cmd, wd, options.timeout, env)
-    def cleanup():
-        terminate(proc)
-        ret = proc.wait()
+
+wifexited = getattr(os, "WIFEXITED", lambda x: False)
+def run(cmd, wd, replacements, env, debug=False, timeout=None):
+    """Run command in a sub-process, capturing the output (stdout and stderr).
+    Return a tuple (exitcode, output). output is None in debug mode."""
+    if debug:
+        proc = subprocess.Popen(cmd, shell=True, cwd=wd, env=env)
+        ret = proc.wait()
+        return (ret, None)
+
+    proc = Popen4(cmd, wd, timeout, env)
+    def cleanup():
+        terminate(proc)
+        ret = proc.wait()
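The SALT-marker scheme that `_parsetest` and `_processoutput` implement above can be shown in miniature: after every command the generated script echoes a marker carrying the source line number and `$?`, and the merged output stream is then sliced back per command at those markers. This is an illustrative reduction, not run-tests.py's actual code; it assumes a POSIX `sh` so that `false` exits non-zero:

```python
import subprocess
import time

# One fake test: source line number -> shell command.
commands = {1: 'echo hello', 2: 'false'}

# Emit each command followed by a unique marker with line number and $?.
salt = 'SALT%d' % int(time.time())
script = []
for lineno, cmd in sorted(commands.items()):
    script.append(cmd)
    script.append('echo %s %d $?' % (salt, lineno))

proc = subprocess.run('\n'.join(script), shell=True,
                      capture_output=True, text=True)

# Re-unify: everything up to a marker belongs to that marker's source line.
results = {}  # line number -> (exit status, output lines)
buf = []
for line in proc.stdout.splitlines():
    if line.startswith(salt):
        _, lineno, status = line.split()
        results[int(lineno)] = (int(status), buf)
        buf = []
    else:
        buf.append(line)
```

Because the marker is echoed from inside the same shell, output and exit status stay correctly attributed even when a command prints nothing or several lines, which is exactly why the real runner can diff a `.t` file line by line.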
@@ -880,442 +1037,787 b' def run(cmd, wd, options, replacements, '
880 if ret:
1037 if ret:
881 killdaemons(env['DAEMON_PIDS'])
1038 killdaemons(env['DAEMON_PIDS'])
882
1039
883 if abort:
884 raise KeyboardInterrupt()
885
886 for s, r in replacements:
1040 for s, r in replacements:
887 output = re.sub(s, r, output)
1041 output = re.sub(s, r, output)
888 return ret, output.splitlines(True)
1042 return ret, output.splitlines(True)
889
1043
890 def runone(options, test, count):
1044 iolock = threading.Lock()
891 '''returns a result element: (code, test, msg)'''
1045
1046 class SkipTest(Exception):
1047 """Raised to indicate that a test is to be skipped."""
892
1048
893 def skip(msg):
1049 class IgnoreTest(Exception):
894 if options.verbose:
1050 """Raised to indicate that a test is to be ignored."""
895 log("\nSkipping %s: %s" % (testpath, msg))
1051
896 return 's', test, msg
1052 class WarnTest(Exception):
1053 """Raised to indicate that a test warned."""
897
1054
898 def fail(msg, ret):
1055 class TestResult(unittest._TextTestResult):
899 warned = ret is False
1056 """Holds results when executing via unittest."""
900 if not options.nodiff:
1057 # Don't worry too much about accessing the non-public _TextTestResult.
901 log("\n%s: %s %s" % (warned and 'Warning' or 'ERROR', test, msg))
1058 # It is relatively common in Python testing tools.
902 if (not ret and options.interactive
1059 def __init__(self, options, *args, **kwargs):
903 and os.path.exists(testpath + ".err")):
1060 super(TestResult, self).__init__(*args, **kwargs)
904 iolock.acquire()
1061
905 print "Accept this change? [n] ",
1062 self._options = options
906 answer = sys.stdin.readline().strip()
907 iolock.release()
908 if answer.lower() in "y yes".split():
909 if test.endswith(".t"):
910 rename(testpath + ".err", testpath)
911 else:
912 rename(testpath + ".err", testpath + ".out")
913 return '.', test, ''
914 return warned and '~' or '!', test, msg
915
@@ -916,25 +1063,15 @@
-    def success():
-        return '.', test, ''
-
-    def ignore(msg):
-        return 'i', test, msg
-
-    def describe(ret):
-        if ret < 0:
-            return 'killed by signal %d' % -ret
-        return 'returned error code %d' % ret
-
-    testpath = os.path.join(TESTDIR, test)
-    err = os.path.join(TESTDIR, test + ".err")
-    lctest = test.lower()
-
-    if not os.path.exists(testpath):
-        return skip("doesn't exist")
-
-    if not (options.whitelisted and test in options.whitelisted):
-        if options.blacklist and test in options.blacklist:
-            return skip("blacklisted")
-
-    if options.retest and not os.path.exists(test + ".err"):
-        return ignore("not retesting")
-
+
+        # unittest.TestResult didn't have skipped until 2.7. We need to
+        # polyfill it.
+        self.skipped = []
+
+        # We have a custom "ignored" result that isn't present in any Python
+        # unittest implementation. It is very similar to skipped. It may make
+        # sense to map it into skip some day.
+        self.ignored = []
+
+        # We have a custom "warned" result that isn't present in any Python
+        # unittest implementation. It is very similar to failed. It may make
+        # sense to map it into fail some day.
+        self.warned = []
+
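The new `TestResult` subclass polyfills result categories (skipped, ignored, warned) that plain `unittest` does not track, storing them as lists of `(test, reason)` pairs alongside the built-in `failures` and `errors`. A minimal standalone sketch of that pattern, using only the standard library (class and method names here are illustrative, not taken from run-tests.py):

```python
import unittest

class PolyfillResult(unittest.TestResult):
    """Result object with extra categories, mirroring the hunk above."""

    def __init__(self, *args, **kwargs):
        super(PolyfillResult, self).__init__(*args, **kwargs)
        # Categories unittest doesn't provide are plain lists of
        # (test, reason) pairs, just like failures/errors.
        self.ignored = []
        self.warned = []

    def addIgnore(self, test, reason):
        self.ignored.append((test, reason))

    def addWarn(self, test, reason):
        self.warned.append((test, reason))

result = PolyfillResult()
result.addIgnore('test-foo.t', 'not retesting')
result.addWarn('test-bar.t', 'output changed slightly')
print(len(result.ignored), len(result.warned))  # 1 1
```

Because the custom categories are stored with the same shape as `failures`, reporting code can iterate over all of them uniformly.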
@@ -941,20 +1078,14 @@
-    if options.keywords:
-        fp = open(test)
-        t = fp.read().lower() + test.lower()
-        fp.close()
-        for k in options.keywords.lower().split():
-            if k in t:
-                break
-        else:
-            return ignore("doesn't match keyword")
-
-    if not os.path.basename(lctest).startswith("test-"):
-        return skip("not a test file")
-    for ext, func, out in testtypes:
-        if lctest.endswith(ext):
-            runner = func
-            ref = os.path.join(TESTDIR, test + out)
-            break
-    else:
-        return skip("unknown test type")
-
+        self.times = []
+        self._started = {}
+
+    def addFailure(self, test, reason):
+        self.failures.append((test, reason))
+
+        if self._options.first:
+            self.stop()
+        else:
+            if not self._options.nodiff:
+                self.stream.write('\nERROR: %s output changed\n' % test)
+
+            self.stream.write('!')
+
@@ -961,27 +1092,16 @@
-    vlog("# Test", test)
-
-    if os.path.exists(err):
-        os.remove(err)       # Remove any previous output files
-
-    # Make a tmp subdirectory to work in
-    threadtmp = os.path.join(HGTMP, "child%d" % count)
-    testtmp = os.path.join(threadtmp, os.path.basename(test))
-    os.mkdir(threadtmp)
-    os.mkdir(testtmp)
-
-    port = options.port + count * 3
-    replacements = [
-        (r':%s\b' % port, ':$HGPORT'),
-        (r':%s\b' % (port + 1), ':$HGPORT1'),
-        (r':%s\b' % (port + 2), ':$HGPORT2'),
-        ]
-    if os.name == 'nt':
-        replacements.append(
-            (''.join(c.isalpha() and '[%s%s]' % (c.lower(), c.upper()) or
-                     c in '/\\' and r'[/\\]' or
-                     c.isdigit() and c or
-                     '\\' + c
-                     for c in testtmp), '$TESTTMP'))
-    else:
-        replacements.append((re.escape(testtmp), '$TESTTMP'))
-
+    def addError(self, *args, **kwargs):
+        super(TestResult, self).addError(*args, **kwargs)
+
+        if self._options.first:
+            self.stop()
+
+    # Polyfill.
+    def addSkip(self, test, reason):
+        self.skipped.append((test, reason))
+
+        if self.showAll:
+            self.stream.writeln('skipped %s' % reason)
+        else:
+            self.stream.write('s')
+            self.stream.flush()
+
@@ -988,29 +1108,25 @@
-    env = createenv(options, testtmp, threadtmp, port)
-    createhgrc(env['HGRCPATH'], options)
-
-    starttime = time.time()
-    try:
-        ret, out = runner(testpath, testtmp, options, replacements, env)
-    except KeyboardInterrupt:
-        endtime = time.time()
-        log('INTERRUPTED: %s (after %d seconds)' % (test, endtime - starttime))
-        raise
-    endtime = time.time()
-    times.append((test, endtime - starttime))
-    vlog("# Ret was:", ret)
-
-    killdaemons(env['DAEMON_PIDS'])
-
-    skipped = (ret == SKIPPED_STATUS)
-
-    # If we're not in --debug mode and reference output file exists,
-    # check test output against it.
-    if options.debug:
-        refout = None                   # to match "out is None"
-    elif os.path.exists(ref):
-        f = open(ref, "r")
-        refout = f.read().splitlines(True)
-        f.close()
-    else:
-        refout = []
-
+    def addIgnore(self, test, reason):
+        self.ignored.append((test, reason))
+
+        if self.showAll:
+            self.stream.writeln('ignored %s' % reason)
+        else:
+            if reason != 'not retesting':
+                self.stream.write('i')
+            self.stream.flush()
+
+    def addWarn(self, test, reason):
+        self.warned.append((test, reason))
+
+        if self._options.first:
+            self.stop()
+
+        if self.showAll:
+            self.stream.writeln('warned %s' % reason)
+        else:
+            self.stream.write('~')
+            self.stream.flush()
+
+    def addOutputMismatch(self, test, ret, got, expected):
+        """Record a mismatch in test output for a particular test."""
+
@@ -1017,53 +1133,34 @@
-    if (ret != 0 or out != refout) and not skipped and not options.debug:
-        # Save errors to a file for diagnosis
-        f = open(err, "wb")
-        for line in out:
-            f.write(line)
-        f.close()
-
-    if skipped:
-        if out is None:                 # debug mode: nothing to parse
-            missing = ['unknown']
-            failed = None
-        else:
-            missing, failed = parsehghaveoutput(out)
-        if not missing:
-            missing = ['irrelevant']
-        if failed:
-            result = fail("hghave failed checking for %s" % failed[-1], ret)
-            skipped = False
-        else:
-            result = skip(missing[-1])
-    elif ret == 'timeout':
-        result = fail("timed out", ret)
-    elif out != refout:
-        info = {}
-        if not options.nodiff:
-            iolock.acquire()
-            if options.view:
-                os.system("%s %s %s" % (options.view, ref, err))
-            else:
-                info = showdiff(refout, out, ref, err)
-            iolock.release()
-        msg = ""
-        if info.get('servefail'): msg += "serve failed and "
-        if ret:
-            msg += "output changed and " + describe(ret)
-        else:
-            msg += "output changed"
-        result = fail(msg, ret)
-    elif ret:
-        result = fail(describe(ret), ret)
-    else:
-        result = success()
-
-    if not options.verbose:
-        iolock.acquire()
-        sys.stdout.write(result[0])
-        sys.stdout.flush()
-        iolock.release()
-
-    if not options.keep_tmpdir:
-        shutil.rmtree(threadtmp, True)
-    return result
-
+        accepted = False
+
+        iolock.acquire()
+        if self._options.nodiff:
+            pass
+        elif self._options.view:
+            os.system("%s %s %s" %
+                      (self._options.view, test.refpath, test.errpath))
+        else:
+            failed, lines = getdiff(expected, got,
+                                    test.refpath, test.errpath)
+            if failed:
+                self.addFailure(test, 'diff generation failed')
+            else:
+                self.stream.write('\n')
+                for line in lines:
+                    self.stream.write(line)
+                self.stream.flush()
+
+        # handle interactive prompt without releasing iolock
+        if self._options.interactive:
+            self.stream.write('Accept this change? [n] ')
+            answer = sys.stdin.readline().strip()
+            if answer.lower() in ('y', 'yes'):
+                if test.name.endswith('.t'):
+                    rename(test.errpath, test.path)
+                else:
+                    rename(test.errpath, '%s.out' % test.path)
+                accepted = True
+
+        iolock.release()
+
+        return accepted
+
@@ -1070,17 +1167,18 @@
-_hgpath = None
-
-def _gethgpath():
-    """Return the path to the mercurial package that is actually found by
-    the current Python interpreter."""
-    global _hgpath
-    if _hgpath is not None:
-        return _hgpath
-
-    cmd = '%s -c "import mercurial; print (mercurial.__path__[0])"'
-    pipe = os.popen(cmd % PYTHON)
-    try:
-        _hgpath = pipe.read().strip()
-    finally:
-        pipe.close()
-    return _hgpath
-
+    def startTest(self, test):
+        super(TestResult, self).startTest(test)
+
+        self._started[test.name] = time.time()
+
+    def stopTest(self, test, interrupted=False):
+        super(TestResult, self).stopTest(test)
+
+        self.times.append((test.name, time.time() - self._started[test.name]))
+        del self._started[test.name]
+
+        if interrupted:
+            self.stream.writeln('INTERRUPTED: %s (after %d seconds)' % (
+                test.name, self.times[-1][1]))
+
+class TestSuite(unittest.TestSuite):
+    """Custom unitest TestSuite that knows how to execute Mercurial tests."""
+
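The `startTest`/`stopTest` overrides above record a start timestamp per test and append `(name, duration)` pairs when each test finishes. The same bookkeeping can be sketched against plain `unittest` in isolation (the class and test names below are illustrative only):

```python
import time
import unittest

class TimingResult(unittest.TestResult):
    """Records per-test wall time via the startTest/stopTest hooks."""

    def __init__(self, *args, **kwargs):
        super(TimingResult, self).__init__(*args, **kwargs)
        self.times = []      # list of (test name, seconds) pairs
        self._started = {}   # test name -> start timestamp

    def startTest(self, test):
        super(TimingResult, self).startTest(test)
        self._started[str(test)] = time.time()

    def stopTest(self, test):
        super(TimingResult, self).stopTest(test)
        name = str(test)
        self.times.append((name, time.time() - self._started.pop(name)))

class Trivial(unittest.TestCase):
    def test_nothing(self):
        pass

result = TimingResult()
unittest.TestLoader().loadTestsFromTestCase(Trivial).run(result)
print(len(result.times))  # 1
```

Popping the start time in `stopTest` keeps `_started` from growing, matching the `del self._started[test.name]` in the hunk above.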
@@ -1087,31 +1185,37 @@
-def _checkhglib(verb):
-    """Ensure that the 'mercurial' package imported by python is
-    the one we expect it to be. If not, print a warning to stderr."""
-    expecthg = os.path.join(PYTHONDIR, 'mercurial')
-    actualhg = _gethgpath()
-    if os.path.abspath(actualhg) != os.path.abspath(expecthg):
-        sys.stderr.write('warning: %s with unexpected mercurial lib: %s\n'
-                         '         (expected %s)\n'
-                         % (verb, actualhg, expecthg))
-
-results = {'.':[], '!':[], '~': [], 's':[], 'i':[]}
-times = []
-iolock = threading.Lock()
-abort = False
-
-def scheduletests(options, tests):
-    jobs = options.jobs
-    done = queue.Queue()
-    running = 0
-    count = 0
-    global abort
-
-    def job(test, count):
-        try:
-            done.put(runone(options, test, count))
-        except KeyboardInterrupt:
-            pass
-        except: # re-raises
-            done.put(('!', test, 'run-test raised an error, see traceback'))
-            raise
-
+    def __init__(self, testdir, jobs=1, whitelist=None, blacklist=None,
+                 retest=False, keywords=None, loop=False,
+                 *args, **kwargs):
+        """Create a new instance that can run tests with a configuration.
+
+        testdir specifies the directory where tests are executed from. This
+        is typically the ``tests`` directory from Mercurial's source
+        repository.
+
+        jobs specifies the number of jobs to run concurrently. Each test
+        executes on its own thread. Tests actually spawn new processes, so
+        state mutation should not be an issue.
+
+        whitelist and blacklist denote tests that have been whitelisted and
+        blacklisted, respectively. These arguments don't belong in TestSuite.
+        Instead, whitelist and blacklist should be handled by the thing that
+        populates the TestSuite with tests. They are present to preserve
+        backwards compatible behavior which reports skipped tests as part
+        of the results.
+
+        retest denotes whether to retest failed tests. This arguably belongs
+        outside of TestSuite.
+
+        keywords denotes key words that will be used to filter which tests
+        to execute. This arguably belongs outside of TestSuite.
+
+        loop denotes whether to loop over tests forever.
+        """
+        super(TestSuite, self).__init__(*args, **kwargs)
+
+        self._jobs = jobs
+        self._whitelist = whitelist
+        self._blacklist = blacklist
+        self._retest = retest
+        self._keywords = keywords
+        self._loop = loop
+
@@ -1118,21 +1222,78 @@
-    try:
-        while tests or running:
-            if not done.empty() or running == jobs or not tests:
-                try:
-                    code, test, msg = done.get(True, 1)
-                    results[code].append((test, msg))
-                    if options.first and code not in '.si':
-                        break
-                except queue.Empty:
-                    continue
-                running -= 1
-            if tests and not running == jobs:
-                test = tests.pop(0)
-                if options.loop:
-                    tests.append(test)
-                t = threading.Thread(target=job, name=test, args=(test, count))
-                t.start()
-                running += 1
-                count += 1
-    except KeyboardInterrupt:
-        abort = True
+    def run(self, result):
+        # We have a number of filters that need to be applied. We do this
+        # here instead of inside Test because it makes the running logic for
+        # Test simpler.
+        tests = []
+        for test in self._tests:
+            if not os.path.exists(test.path):
+                result.addSkip(test, "Doesn't exist")
+                continue
+
+            if not (self._whitelist and test.name in self._whitelist):
+                if self._blacklist and test.name in self._blacklist:
+                    result.addSkip(test, 'blacklisted')
+                    continue
+
+                if self._retest and not os.path.exists(test.errpath):
+                    result.addIgnore(test, 'not retesting')
+                    continue
+
+                if self._keywords:
+                    f = open(test.path, 'rb')
+                    t = f.read().lower() + test.name.lower()
+                    f.close()
+                    ignored = False
+                    for k in self._keywords.lower().split():
+                        if k not in t:
+                            result.addIgnore(test, "doesn't match keyword")
+                            ignored = True
+                            break
+
+                    if ignored:
+                        continue
+
+            tests.append(test)
+
+        runtests = list(tests)
+        done = queue.Queue()
+        running = 0
+
+        def job(test, result):
+            try:
+                test(result)
+                done.put(None)
+            except KeyboardInterrupt:
+                pass
+            except: # re-raises
+                done.put(('!', test, 'run-test raised an error, see traceback'))
+                raise
+
+        try:
+            while tests or running:
+                if not done.empty() or running == self._jobs or not tests:
+                    try:
+                        done.get(True, 1)
+                        if result and result.shouldStop:
+                            break
+                    except queue.Empty:
+                        continue
+                    running -= 1
+                if tests and not running == self._jobs:
+                    test = tests.pop(0)
+                    if self._loop:
+                        tests.append(test)
+                    t = threading.Thread(target=job, name=test.name,
+                                         args=(test, result))
+                    t.start()
+                    running += 1
+        except KeyboardInterrupt:
+            for test in runtests:
+                test.abort()
+
+        return result
+
+class TextTestRunner(unittest.TextTestRunner):
+    """Custom unittest test runner that uses appropriate settings."""
+
+    def __init__(self, runner, *args, **kwargs):
+        super(TextTestRunner, self).__init__(*args, **kwargs)
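Both the old `scheduletests` and the new `TestSuite.run` use the same scheduling idiom: keep at most `jobs` worker threads alive, and have each worker signal completion through a `Queue` that the main loop polls with a one-second timeout. A self-contained sketch of that loop (the `schedule` helper is illustrative, not part of run-tests.py):

```python
import queue
import threading

def schedule(tasks, jobs=2):
    """Run callables with at most `jobs` running at once.

    A Queue carries completion signals back to the main thread, which
    is what lets the loop below cap concurrency without a thread pool.
    """
    done = queue.Queue()
    results = []
    pending = list(tasks)
    running = 0

    def job(task):
        results.append(task())  # append before signalling completion
        done.put(None)

    while pending or running:
        # Wait for a completion when we are saturated or out of new work.
        if not done.empty() or running == jobs or not pending:
            try:
                done.get(True, 1)
            except queue.Empty:
                continue
            running -= 1
        if pending and running < jobs:
            t = threading.Thread(target=job, args=(pending.pop(0),))
            t.start()
            running += 1
    return results

out = schedule([lambda i=i: i * i for i in range(5)], jobs=3)
print(sorted(out))  # [0, 1, 4, 9, 16]
```

Because each worker appends its result before putting the completion token, every result is visible by the time the loop drains the queue and exits.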
@@ -1139,20 +1300,62 @@
-
-def runtests(options, tests):
-    try:
-        if INST:
-            installhg(options)
-            _checkhglib("Testing")
-        else:
-            usecorrectpython()
-
-        if options.restart:
-            orig = list(tests)
-            while tests:
-                if os.path.exists(tests[0] + ".err"):
-                    break
-                tests.pop(0)
-            if not tests:
-                print "running all tests"
-                tests = orig
-
-        scheduletests(options, tests)
+
+        self._runner = runner
+
+    def run(self, test):
+        result = TestResult(self._runner.options, self.stream,
+                            self.descriptions, self.verbosity)
+
+        test(result)
+
+        failed = len(result.failures)
+        warned = len(result.warned)
+        skipped = len(result.skipped)
+        ignored = len(result.ignored)
+
+        self.stream.writeln('')
+
+        if not self._runner.options.noskips:
+            for test, msg in result.skipped:
+                self.stream.writeln('Skipped %s: %s' % (test.name, msg))
+        for test, msg in result.warned:
+            self.stream.writeln('Warned %s: %s' % (test.name, msg))
+        for test, msg in result.failures:
+            self.stream.writeln('Failed %s: %s' % (test.name, msg))
+        for test, msg in result.errors:
+            self.stream.writeln('Errored %s: %s' % (test.name, msg))
+
+        self._runner._checkhglib('Tested')
+
+        # When '--retest' is enabled, only failure tests run. At this point
+        # "result.testsRun" holds the count of failure test that has run. But
+        # as while printing output, we have subtracted the skipped and ignored
+        # count from "result.testsRun". Therefore, to make the count remain
+        # the same, we need to add skipped and ignored count in here.
+        if self._runner.options.retest:
+            result.testsRun = result.testsRun + skipped + ignored
+
+        # This differs from unittest's default output in that we don't count
+        # skipped and ignored tests as part of the total test count.
+        self.stream.writeln('# Ran %d tests, %d skipped, %d warned, %d failed.'
+                            % (result.testsRun - skipped - ignored,
+                               skipped + ignored, warned, failed))
+        if failed:
+            self.stream.writeln('python hash seed: %s' %
+                                os.environ['PYTHONHASHSEED'])
+        if self._runner.options.time:
+            self.printtimes(result.times)
+
+        return result
+
+    def printtimes(self, times):
+        self.stream.writeln('# Producing time report')
+        times.sort(key=lambda t: (t[1], t[0]), reverse=True)
+        cols = '%7.3f   %s'
+        self.stream.writeln('%-7s   %s' % ('Time', 'Test'))
+        for test, timetaken in times:
+            self.stream.writeln(cols % (timetaken, test))
+
+class TestRunner(object):
+    """Holds context for executing tests.
+
+    Tests rely on a lot of state. This object holds it for them.
+    """
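The `printtimes` report above reduces to a reverse sort on `(duration, name)` pairs plus fixed-width formatting, so the slowest tests come first and ties are broken by name. A standalone sketch with made-up test names:

```python
# Hypothetical timing data in the same (name, seconds) shape as result.times.
times = [('test-a.t', 0.5), ('test-b.t', 2.25), ('test-c.t', 0.5)]

# Sort slowest first; for equal durations, reverse name order breaks ties,
# exactly as printtimes does with key=lambda t: (t[1], t[0]).
times.sort(key=lambda t: (t[1], t[0]), reverse=True)

for name, taken in times:
    print('%7.3f   %s' % (taken, name))
```

The first printed row is `test-b.t` at 2.250 seconds; the two 0.5-second tests follow, `test-c.t` before `test-a.t`.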
@@ -1159,7 +1362,43 @@
-
-    failed = len(results['!'])
-    warned = len(results['~'])
-    tested = len(results['.']) + failed + warned
-    skipped = len(results['s'])
-    ignored = len(results['i'])
-
+
+    # Programs required to run tests.
+    REQUIREDTOOLS = [
+        os.path.basename(sys.executable),
+        'diff',
+        'grep',
+        'unzip',
+        'gunzip',
+        'bunzip2',
+        'sed',
+    ]
+
+    # Maps file extensions to test class.
+    TESTTYPES = [
+        ('.py', PythonTest),
+        ('.t', TTest),
+    ]
+
+    def __init__(self):
+        self.options = None
+        self._testdir = None
+        self._hgtmp = None
+        self._installdir = None
+        self._bindir = None
+        self._tmpbinddir = None
+        self._pythondir = None
+        self._coveragefile = None
+        self._createdfiles = []
+        self._hgpath = None
+
+    def run(self, args, parser=None):
+        """Run the test suite."""
+        oldmask = os.umask(022)
+        try:
+            parser = parser or getparser()
+            options, args = parseargs(args, parser)
+            self.options = options
+
+            self._checktools()
+            tests = self.findtests(args)
+            return self._run(tests)
+        finally:
+            os.umask(oldmask)
@@ -1166,15 +1405,37 @@
-        print
-        if not options.noskips:
-            for s in results['s']:
-                print "Skipped %s: %s" % s
-        for s in results['~']:
-            print "Warned %s: %s" % s
-        for s in results['!']:
-            print "Failed %s: %s" % s
-        _checkhglib("Tested")
-        print "# Ran %d tests, %d skipped, %d warned, %d failed." % (
-            tested, skipped + ignored, warned, failed)
-        if results['!']:
-            print 'python hash seed:', os.environ['PYTHONHASHSEED']
-        if options.time:
-            outputtimes(options)
+
+    def _run(self, tests):
+        if self.options.random:
+            random.shuffle(tests)
+        else:
+            # keywords for slow tests
+            slow = 'svn gendoc check-code-hg'.split()
+            def sortkey(f):
+                # run largest tests first, as they tend to take the longest
+                try:
+                    val = -os.stat(f).st_size
+                except OSError, e:
+                    if e.errno != errno.ENOENT:
+                        raise
+                    return -1e9 # file does not exist, tell early
+                for kw in slow:
+                    if kw in f:
+                        val *= 10
+                return val
+            tests.sort(key=sortkey)
+
+        self._testdir = os.environ['TESTDIR'] = os.getcwd()
+
+        if 'PYTHONHASHSEED' not in os.environ:
+            # use a random python hash seed all the time
+            # we do the randomness ourself to know what seed is used
+            os.environ['PYTHONHASHSEED'] = str(random.getrandbits(32))
+
+        if self.options.tmpdir:
+            self.options.keep_tmpdir = True
+            tmpdir = self.options.tmpdir
+            if os.path.exists(tmpdir):
+                # Meaning of tmpdir has changed since 1.3: we used to create
+                # HGTMP inside tmpdir; now HGTMP is tmpdir. So fail if
+                # tmpdir already exists.
+                print "error: temp dir %r already exists" % tmpdir
+                return 1
@@ -1181,11 +1442,43 @@
-
-        if options.anycoverage:
-            outputcoverage(options)
-    except KeyboardInterrupt:
-        failed = True
-        print "\ninterrupted!"
-
-    if failed:
-        return 1
-    if warned:
-        return 80
+
+            # Automatically removing tmpdir sounds convenient, but could
+            # really annoy anyone in the habit of using "--tmpdir=/tmp"
+            # or "--tmpdir=$HOME".
+            #vlog("# Removing temp dir", tmpdir)
+            #shutil.rmtree(tmpdir)
+            os.makedirs(tmpdir)
+        else:
+            d = None
+            if os.name == 'nt':
+                # without this, we get the default temp dir location, but
+                # in all lowercase, which causes troubles with paths (issue3490)
+                d = os.getenv('TMP')
+            tmpdir = tempfile.mkdtemp('', 'hgtests.', d)
+        self._hgtmp = os.environ['HGTMP'] = os.path.realpath(tmpdir)
+
+        if self.options.with_hg:
+            self._installdir = None
+            self._bindir = os.path.dirname(os.path.realpath(
+                self.options.with_hg))
+            self._tmpbindir = os.path.join(self._hgtmp, 'install', 'bin')
+            os.makedirs(self._tmpbindir)
+
+            # This looks redundant with how Python initializes sys.path from
+            # the location of the script being executed. Needed because the
+            # "hg" specified by --with-hg is not the only Python script
+            # executed in the test suite that needs to import 'mercurial'
+            # ... which means it's not really redundant at all.
+            self._pythondir = self._bindir
+        else:
+            self._installdir = os.path.join(self._hgtmp, "install")
+            self._bindir = os.environ["BINDIR"] = \
+                os.path.join(self._installdir, "bin")
+            self._tmpbindir = self._bindir
+            self._pythondir = os.path.join(self._installdir, "lib", "python")
+
+        os.environ["BINDIR"] = self._bindir
+        os.environ["PYTHON"] = PYTHON
+
+        path = [self._bindir] + os.environ["PATH"].split(os.pathsep)
+        if self._tmpbindir != self._bindir:
+            path = [self._tmpbindir] + path
+        os.environ["PATH"] = os.pathsep.join(path)
@@ -1192,20 +1485,47 @@
-
-testtypes = [('.py', pytest, '.out'),
-             ('.t', tsttest, '')]
-
-def main(args, parser=None):
-    parser = parser or getparser()
-    (options, args) = parseargs(args, parser)
-    os.umask(022)
-
-    checktools()
-
-    if not args:
-        if options.changed:
-            proc = Popen4('hg st --rev "%s" -man0 .' % options.changed,
-                          None, 0)
-            stdout, stderr = proc.communicate()
-            args = stdout.strip('\0').split('\0')
-        else:
-            args = os.listdir(".")
-
+
+        # Include TESTDIR in PYTHONPATH so that out-of-tree extensions
+        # can run .../tests/run-tests.py test-foo where test-foo
+        # adds an extension to HGRC. Also include run-test.py directory to
+        # import modules like heredoctest.
+        pypath = [self._pythondir, self._testdir,
+                  os.path.abspath(os.path.dirname(__file__))]
+        # We have to augment PYTHONPATH, rather than simply replacing
+        # it, in case external libraries are only available via current
+        # PYTHONPATH. (In particular, the Subversion bindings on OS X
+        # are in /opt/subversion.)
+        oldpypath = os.environ.get(IMPL_PATH)
+        if oldpypath:
+            pypath.append(oldpypath)
+        os.environ[IMPL_PATH] = os.pathsep.join(pypath)
+
+        self._coveragefile = os.path.join(self._testdir, '.coverage')
+
+        vlog("# Using TESTDIR", self._testdir)
+        vlog("# Using HGTMP", self._hgtmp)
+        vlog("# Using PATH", os.environ["PATH"])
+        vlog("# Using", IMPL_PATH, os.environ[IMPL_PATH])
+
+        try:
+            return self._runtests(tests) or 0
+        finally:
+            time.sleep(.1)
+            self._cleanup()
+
+    def findtests(self, args):
+        """Finds possible test files from arguments.
+
+        If you wish to inject custom tests into the test harness, this would
+        be a good function to monkeypatch or override in a derived class.
+        """
+        if not args:
+            if self.options.changed:
+                proc = Popen4('hg st --rev "%s" -man0 .' %
+                              self.options.changed, None, 0)
+                stdout, stderr = proc.communicate()
+                args = stdout.strip('\0').split('\0')
+            else:
+                args = os.listdir('.')
+
+        return [t for t in args
+                if os.path.basename(t).startswith('test-')
+                and (t.endswith('.py') or t.endswith('.t'))]
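Both the old `main` and the new `findtests` select candidate tests with the same filter: keep only entries whose basename starts with `test-` and which end in `.py` or `.t`. The filter in isolation:

```python
import os

def findtests(args):
    """Keep test-*.py / test-*.t entries, as in the hunk above."""
    return [t for t in args
            if os.path.basename(t).startswith('test-')
            and (t.endswith('.py') or t.endswith('.t'))]

candidates = ['test-log.t', 'README', 'test-commit.py', 'helper.py']
print(findtests(candidates))  # ['test-log.t', 'test-commit.py']
```

Filtering on the basename means a path such as `tests/test-log.t` passes even though the full string does not start with `test-`.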
@@ -1212,40 +1532,94 @@
-    tests = [t for t in args
-             if os.path.basename(t).startswith("test-")
-             and (t.endswith(".py") or t.endswith(".t"))]
-
-    if options.random:
-        random.shuffle(tests)
-    else:
-        # keywords for slow tests
-        slow = 'svn gendoc check-code-hg'.split()
-        def sortkey(f):
-            # run largest tests first, as they tend to take the longest
-            try:
-                val = -os.stat(f).st_size
-            except OSError, e:
-                if e.errno != errno.ENOENT:
-                    raise
-                return -1e9 # file does not exist, tell early
-            for kw in slow:
-                if kw in f:
-                    val *= 10
-            return val
-        tests.sort(key=sortkey)
-
-    if 'PYTHONHASHSEED' not in os.environ:
-        # use a random python hash seed all the time
-        # we do the randomness ourself to know what seed is used
-        os.environ['PYTHONHASHSEED'] = str(random.getrandbits(32))
-
-    global TESTDIR, HGTMP, INST, BINDIR, TMPBINDIR, PYTHONDIR, COVERAGE_FILE
-    TESTDIR = os.environ["TESTDIR"] = os.getcwd()
-    if options.tmpdir:
-        options.keep_tmpdir = True
-        tmpdir = options.tmpdir
-        if os.path.exists(tmpdir):
-            # Meaning of tmpdir has changed since 1.3: we used to create
-            # HGTMP inside tmpdir; now HGTMP is tmpdir. So fail if
-            # tmpdir already exists.
-            print "error: temp dir %r already exists" % tmpdir
-            return 1
-
+
+    def _runtests(self, tests):
+        try:
+            if self._installdir:
+                self._installhg()
+                self._checkhglib("Testing")
+            else:
+                self._usecorrectpython()
+
+            if self.options.restart:
+                orig = list(tests)
+                while tests:
+                    if os.path.exists(tests[0] + ".err"):
+                        break
+                    tests.pop(0)
+                if not tests:
+                    print "running all tests"
+                    tests = orig
+
+            tests = [self._gettest(t, i) for i, t in enumerate(tests)]
+
+            failed = False
+            warned = False
+
+            suite = TestSuite(self._testdir,
+                              jobs=self.options.jobs,
+                              whitelist=self.options.whitelisted,
+                              blacklist=self.options.blacklist,
+                              retest=self.options.retest,
+                              keywords=self.options.keywords,
+                              loop=self.options.loop,
+                              tests=tests)
+            verbosity = 1
+            if self.options.verbose:
+                verbosity = 2
+            runner = TextTestRunner(self, verbosity=verbosity)
+            result = runner.run(suite)
+
+            if result.failures:
+                failed = True
+            if result.warned:
+                warned = True
+
+            if self.options.anycoverage:
+                self._outputcoverage()
+        except KeyboardInterrupt:
+            failed = True
+            print "\ninterrupted!"
+
+        if failed:
+            return 1
+        if warned:
+            return 80
+
+    def _gettest(self, test, count):
+        """Obtain a Test by looking at its filename.

+        Returns a Test instance. The Test may not be runnable if it doesn't
+        map to a known type.
+        """
+        lctest = test.lower()
+        testcls = Test
+
+        for ext, cls in self.TESTTYPES:
+            if lctest.endswith(ext):
+                testcls = cls
+                break
+
+        refpath = os.path.join(self._testdir, test)
+        tmpdir = os.path.join(self._hgtmp, 'child%d' % count)
+
+        return testcls(refpath, tmpdir,
+                       keeptmpdir=self.options.keep_tmpdir,
+                       debug=self.options.debug,
+                       timeout=self.options.timeout,
+                       startport=self.options.port + count * 3,
+                       extraconfigopts=self.options.extra_config_opt,
+                       py3kwarnings=self.options.py3k_warnings,
+                       shell=self.options.shell)
+
+    def _cleanup(self):
+        """Clean up state from this test invocation."""
+
+        if self.options.keep_tmpdir:
+            return
+
+        vlog("# Cleaning up HGTMP", self._hgtmp)
+        shutil.rmtree(self._hgtmp, True)
+        for f in self._createdfiles:
+            try:
+                os.remove(f)
+            except OSError:
+                pass
+
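The `sortkey` helper in `_run` orders tests largest-first by negating the file size, multiplies the key by 10 for known-slow keyword matches, and returns a large sentinel for missing files so they sort to the front of the queue. The same key function in isolation (Python 3 `except ... as` syntax; the filename is illustrative):

```python
import errno
import os

def sortkey(f, slow=('svn', 'gendoc', 'check-code-hg')):
    """Largest-first sort key, as in _run above."""
    try:
        # Negative size: bigger files yield smaller (earlier) keys.
        val = -os.stat(f).st_size
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise
        return -1e9  # file does not exist, tell early
    for kw in slow:
        if kw in f:
            val *= 10
    return val

print(sortkey('no-such-test.t'))  # -1000000000.0
```

Sorting a test list with `tests.sort(key=sortkey)` then starts the longest-running tests first, which improves overall wall time when running with multiple jobs.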
1252 # Automatically removing tmpdir sounds convenient, but could
1626 def _usecorrectpython(self):
1253 # really annoy anyone in the habit of using "--tmpdir=/tmp"
1627 """Configure the environment to use the appropriate Python in tests."""
1254 # or "--tmpdir=$HOME".
1628 # Tests must use the same interpreter as us or bad things will happen.
1255 #vlog("# Removing temp dir", tmpdir)
1629 pyexename = sys.platform == 'win32' and 'python.exe' or 'python'
1256 #shutil.rmtree(tmpdir)
1630 if getattr(os, 'symlink', None):
1257 os.makedirs(tmpdir)
1631 vlog("# Making python executable in test path a symlink to '%s'" %
1258 else:
1632 sys.executable)
1259 d = None
1633 mypython = os.path.join(self._tmpbindir, pyexename)
1260 if os.name == 'nt':
1634 try:
1261 # without this, we get the default temp dir location, but
1635 if os.readlink(mypython) == sys.executable:
1262 # in all lowercase, which causes troubles with paths (issue3490)
1636 return
1263 d = os.getenv('TMP')
1637 os.unlink(mypython)
1264 tmpdir = tempfile.mkdtemp('', 'hgtests.', d)
1638 except OSError, err:
1265 HGTMP = os.environ['HGTMP'] = os.path.realpath(tmpdir)
1639 if err.errno != errno.ENOENT:
1640 raise
1641 if self._findprogram(pyexename) != sys.executable:
1642 try:
1643 os.symlink(sys.executable, mypython)
1644 self._createdfiles.append(mypython)
1645 except OSError, err:
1646 # child processes may race, which is harmless
1647 if err.errno != errno.EEXIST:
1648 raise
1649 else:
1650 exedir, exename = os.path.split(sys.executable)
1651 vlog("# Modifying search path to find %s as %s in '%s'" %
1652 (exename, pyexename, exedir))
1653 path = os.environ['PATH'].split(os.pathsep)
1654 while exedir in path:
1655 path.remove(exedir)
1656 os.environ['PATH'] = os.pathsep.join([exedir] + path)
1657 if not self._findprogram(pyexename):
1658 print "WARNING: Cannot find %s in search path" % pyexename
1659
1660 def _installhg(self):
1661 """Install hg into the test environment.
1266
1662
1267 if options.with_hg:
1663 This will also configure hg with the appropriate testing settings.
1268 INST = None
1664 """
1269 BINDIR = os.path.dirname(os.path.realpath(options.with_hg))
1665 vlog("# Performing temporary installation of HG")
1270 TMPBINDIR = os.path.join(HGTMP, 'install', 'bin')
1666 installerrs = os.path.join("tests", "install.err")
1271 os.makedirs(TMPBINDIR)
1667 compiler = ''
1668 if self.options.compiler:
1669 compiler = '--compiler ' + self.options.compiler
1670 pure = self.options.pure and "--pure" or ""
1671 py3 = ''
1672 if sys.version_info[0] == 3:
1673 py3 = '--c2to3'
1272
1674
1273 # This looks redundant with how Python initializes sys.path from
1675 # Run installer in hg root
1274 # the location of the script being executed. Needed because the
1676 script = os.path.realpath(sys.argv[0])
1275 # "hg" specified by --with-hg is not the only Python script
1677 hgroot = os.path.dirname(os.path.dirname(script))
1276 # executed in the test suite that needs to import 'mercurial'
1678 os.chdir(hgroot)
1277 # ... which means it's not really redundant at all.
1679 nohome = '--home=""'
1278 PYTHONDIR = BINDIR
1680 if os.name == 'nt':
1279 else:
1681 # The --home="" trick works only on OS where os.sep == '/'
1280 INST = os.path.join(HGTMP, "install")
1682 # because of a distutils convert_path() fast-path. Avoid it at
1281 BINDIR = os.environ["BINDIR"] = os.path.join(INST, "bin")
1683 # least on Windows for now, deal with .pydistutils.cfg bugs
1282 TMPBINDIR = BINDIR
1684 # when they happen.
1283 PYTHONDIR = os.path.join(INST, "lib", "python")
1685 nohome = ''
1686 cmd = ('%(exe)s setup.py %(py3)s %(pure)s clean --all'
1687 ' build %(compiler)s --build-base="%(base)s"'
1688 ' install --force --prefix="%(prefix)s"'
1689 ' --install-lib="%(libdir)s"'
1690 ' --install-scripts="%(bindir)s" %(nohome)s >%(logfile)s 2>&1'
1691 % {'exe': sys.executable, 'py3': py3, 'pure': pure,
1692 'compiler': compiler,
1693 'base': os.path.join(self._hgtmp, "build"),
1694 'prefix': self._installdir, 'libdir': self._pythondir,
1695 'bindir': self._bindir,
1696 'nohome': nohome, 'logfile': installerrs})
1697 vlog("# Running", cmd)
1698 if os.system(cmd) == 0:
1699 if not self.options.verbose:
1700 os.remove(installerrs)
1701 else:
1702 f = open(installerrs, 'rb')
1703 for line in f:
1704 print line,
1705 f.close()
1706 sys.exit(1)
1707 os.chdir(self._testdir)
1708
1709 self._usecorrectpython()
1710
1711 if self.options.py3k_warnings and not self.options.anycoverage:
1712 vlog("# Updating hg command to enable Py3k Warnings switch")
1713 f = open(os.path.join(self._bindir, 'hg'), 'rb')
1714 lines = [line.rstrip() for line in f]
1715 lines[0] += ' -3'
1716 f.close()
1717 f = open(os.path.join(self._bindir, 'hg'), 'wb')
1718 for line in lines:
1719 f.write(line + '\n')
1720 f.close()
1284
1721
1285 os.environ["BINDIR"] = BINDIR
1722 hgbat = os.path.join(self._bindir, 'hg.bat')
1286 os.environ["PYTHON"] = PYTHON
1723 if os.path.isfile(hgbat):
1724 # hg.bat expects to be put in bin/scripts while run-tests.py
1725 # installation layout put it in bin/ directly. Fix it
1726 f = open(hgbat, 'rb')
1727 data = f.read()
1728 f.close()
1729 if '"%~dp0..\python" "%~dp0hg" %*' in data:
1730 data = data.replace('"%~dp0..\python" "%~dp0hg" %*',
1731 '"%~dp0python" "%~dp0hg" %*')
1732 f = open(hgbat, 'wb')
1733 f.write(data)
1734 f.close()
1735 else:
1736 print 'WARNING: cannot fix hg.bat reference to python.exe'
1287
1737
1288 path = [BINDIR] + os.environ["PATH"].split(os.pathsep)
1738 if self.options.anycoverage:
1289 if TMPBINDIR != BINDIR:
1739 custom = os.path.join(self._testdir, 'sitecustomize.py')
1290 path = [TMPBINDIR] + path
1740 target = os.path.join(self._pythondir, 'sitecustomize.py')
1291 os.environ["PATH"] = os.pathsep.join(path)
1741 vlog('# Installing coverage trigger to %s' % target)
1742 shutil.copyfile(custom, target)
1743 rc = os.path.join(self._testdir, '.coveragerc')
1744 vlog('# Installing coverage rc to %s' % rc)
1745 os.environ['COVERAGE_PROCESS_START'] = rc
1746 fn = os.path.join(self._installdir, '..', '.coverage')
1747 os.environ['COVERAGE_FILE'] = fn
1748
1749 def _checkhglib(self, verb):
1750 """Ensure that the 'mercurial' package imported by python is
1751 the one we expect it to be. If not, print a warning to stderr."""
1752 if ((self._bindir == self._pythondir) and
1753 (self._bindir != self._tmpbindir)):
1754 # The pythondir has been inferred from the --with-hg flag.
1755 # We cannot expect anything sensible here
1756 return
1757 expecthg = os.path.join(self._pythondir, 'mercurial')
1758 actualhg = self._gethgpath()
1759 if os.path.abspath(actualhg) != os.path.abspath(expecthg):
1760 sys.stderr.write('warning: %s with unexpected mercurial lib: %s\n'
1761 ' (expected %s)\n'
1762 % (verb, actualhg, expecthg))
1763 def _gethgpath(self):
1764 """Return the path to the mercurial package that is actually found by
1765 the current Python interpreter."""
1766 if self._hgpath is not None:
1767 return self._hgpath
1292
1768
1293 # Include TESTDIR in PYTHONPATH so that out-of-tree extensions
1769 cmd = '%s -c "import mercurial; print (mercurial.__path__[0])"'
1294 # can run .../tests/run-tests.py test-foo where test-foo
1770 pipe = os.popen(cmd % PYTHON)
1295 # adds an extension to HGRC. Also include run-test.py directory to import
1771 try:
1296 # modules like heredoctest.
1772 self._hgpath = pipe.read().strip()
1297 pypath = [PYTHONDIR, TESTDIR, os.path.abspath(os.path.dirname(__file__))]
1773 finally:
1298 # We have to augment PYTHONPATH, rather than simply replacing
1774 pipe.close()
1299 # it, in case external libraries are only available via current
1775
1300 # PYTHONPATH. (In particular, the Subversion bindings on OS X
1776 return self._hgpath
1301 # are in /opt/subversion.)
1777
1302 oldpypath = os.environ.get(IMPL_PATH)
1778 def _outputcoverage(self):
1303 if oldpypath:
1779 """Produce code coverage output."""
1304 pypath.append(oldpypath)
1780 vlog('# Producing coverage report')
1305 os.environ[IMPL_PATH] = os.pathsep.join(pypath)
1781 os.chdir(self._pythondir)
1782
1783 def covrun(*args):
1784 cmd = 'coverage %s' % ' '.join(args)
1785 vlog('# Running: %s' % cmd)
1786 os.system(cmd)
1306
1787
1307 COVERAGE_FILE = os.path.join(TESTDIR, ".coverage")
1788 covrun('-c')
1789 omit = ','.join(os.path.join(x, '*') for x in
1790 [self._bindir, self._testdir])
1791 covrun('-i', '-r', '"--omit=%s"' % omit) # report
1792 if self.options.htmlcov:
1793 htmldir = os.path.join(self._testdir, 'htmlcov')
1794 covrun('-i', '-b', '"--directory=%s"' % htmldir,
1795 '"--omit=%s"' % omit)
1796 if self.options.annotate:
1797 adir = os.path.join(self._testdir, 'annotated')
1798 if not os.path.isdir(adir):
1799 os.mkdir(adir)
1800 covrun('-i', '-a', '"--directory=%s"' % adir, '"--omit=%s"' % omit)
1308
1801
1309 vlog("# Using TESTDIR", TESTDIR)
1802 def _findprogram(self, program):
1310 vlog("# Using HGTMP", HGTMP)
1803 """Search PATH for a executable program"""
1311 vlog("# Using PATH", os.environ["PATH"])
1804 for p in os.environ.get('PATH', os.defpath).split(os.pathsep):
1312 vlog("# Using", IMPL_PATH, os.environ[IMPL_PATH])
1805 name = os.path.join(p, program)
1806 if os.name == 'nt' or os.access(name, os.X_OK):
1807 return name
1808 return None
1313
1809
1314 try:
1810 def _checktools(self):
1315 return runtests(options, tests) or 0
1811 """Ensure tools required to run tests are present."""
1316 finally:
1812 for p in self.REQUIREDTOOLS:
1317 time.sleep(.1)
1813 if os.name == 'nt' and not p.endswith('.exe'):
1318 cleanup(options)
1814 p += '.exe'
1815 found = self._findprogram(p)
1816 if found:
1817 vlog("# Found prerequisite", p, "at", found)
1818 else:
1819 print "WARNING: Did not find prerequisite tool: %s " % p
1319
1820
1320 if __name__ == '__main__':
1821 if __name__ == '__main__':
1321 sys.exit(main(sys.argv[1:]))
1822 runner = TestRunner()
1823 sys.exit(runner.run(sys.argv[1:]))
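The `_findprogram` method added in the hunk above walks the PATH entries looking for an executable. A standalone sketch of the same lookup follows; the module-level function name is illustrative, not part of run-tests.py's interface:

```python
import os

def findprogram(program):
    # Walk PATH (falling back to os.defpath, as run-tests.py does) and
    # return the first matching executable, or None if nothing is found.
    for p in os.environ.get('PATH', os.defpath).split(os.pathsep):
        name = os.path.join(p, program)
        # os.access(X_OK) is not meaningful on Windows, so run-tests.py
        # accepts any candidate there; this sketch keeps that behavior.
        if os.name == 'nt' or os.access(name, os.X_OK):
            return name
    return None
```

`_checktools` uses exactly this lookup to warn about missing prerequisites before the test run starts.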
@@ -107,6 +107,7 b' should fail'
107 M a
107 M a
108 ? a.orig
108 ? a.orig
109 $ hg resolve -m a
109 $ hg resolve -m a
110 no more unresolved files
110 $ hg ci -m merge
111 $ hg ci -m merge
111
112
112 Issue683: peculiarity with hg revert of a removed then added file
113 Issue683: peculiarity with hg revert of a removed then added file
@@ -11,6 +11,8 b' should complain'
11 [255]
11 [255]
12
12
13 basic operation
13 basic operation
14 (this also tests that editor is invoked if the commit message is not
15 specified explicitly)
14
16
15 $ echo a > a
17 $ echo a > a
16 $ hg commit -d '0 0' -A -m a
18 $ hg commit -d '0 0' -A -m a
@@ -18,8 +20,19 b' basic operation'
18 $ echo b >> a
20 $ echo b >> a
19 $ hg commit -d '1 0' -m b
21 $ hg commit -d '1 0' -m b
20
22
21 $ hg backout -d '2 0' tip --tool=true
23 $ hg status --rev tip --rev "tip^1"
24 M a
25 $ HGEDITOR=cat hg backout -d '2 0' tip --tool=true
22 reverting a
26 reverting a
27 Backed out changeset a820f4f40a57
28
29
30 HG: Enter commit message. Lines beginning with 'HG:' are removed.
31 HG: Leave message empty to abort commit.
32 HG: --
33 HG: user: test
34 HG: branch 'default'
35 HG: changed a
23 changeset 2:2929462c3dff backs out changeset 1:a820f4f40a57
36 changeset 2:2929462c3dff backs out changeset 1:a820f4f40a57
24 $ cat a
37 $ cat a
25 a
38 a
@@ -31,6 +44,8 b' basic operation'
31 update: (current)
44 update: (current)
32
45
33 file that was removed is recreated
46 file that was removed is recreated
47 (this also tests that the editor is not invoked if the commit message is
48 specified explicitly)
34
49
35 $ cd ..
50 $ cd ..
36 $ hg init remove
51 $ hg init remove
@@ -43,7 +58,7 b' file that was removed is recreated'
43 $ hg rm a
58 $ hg rm a
44 $ hg commit -d '1 0' -m b
59 $ hg commit -d '1 0' -m b
45
60
46 $ hg backout -d '2 0' tip --tool=true
61 $ HGEDITOR=cat hg backout -d '2 0' tip --tool=true -m "Backed out changeset 76862dcce372"
47 adding a
62 adding a
48 changeset 2:de31bdc76c0d backs out changeset 1:76862dcce372
63 changeset 2:de31bdc76c0d backs out changeset 1:76862dcce372
49 $ cat a
64 $ cat a
@@ -340,9 +355,21 b' without --merge'
340 update: (current)
355 update: (current)
341
356
342 with --merge
357 with --merge
358 (this also tests that the editor is invoked if '--edit' is specified
359 explicitly regardless of '--message')
360
343 $ hg update -qC
361 $ hg update -qC
344 $ hg backout --merge -d '3 0' -r 1 -m 'backout on branch1' --tool=true
362 $ HGEDITOR=cat hg backout --merge -d '3 0' -r 1 -m 'backout on branch1' --tool=true --edit
345 removing file1
363 removing file1
364 backout on branch1
365
366
367 HG: Enter commit message. Lines beginning with 'HG:' are removed.
368 HG: Leave message empty to abort commit.
369 HG: --
370 HG: user: test
371 HG: branch 'branch2'
372 HG: removed file1
346 created new head
373 created new head
347 changeset 3:d4e8f6db59fb backs out changeset 1:bf1602f437f3
374 changeset 3:d4e8f6db59fb backs out changeset 1:bf1602f437f3
348 merging with changeset 3:d4e8f6db59fb
375 merging with changeset 3:d4e8f6db59fb
@@ -490,6 +517,7 b' Test usage of `hg resolve` in case of co'
490 merging foo
517 merging foo
491 my foo@b71750c4b0fd+ other foo@a30dd8addae3 ancestor foo@913609522437
518 my foo@b71750c4b0fd+ other foo@a30dd8addae3 ancestor foo@913609522437
492 premerge successful
519 premerge successful
520 no more unresolved files
493 $ hg status
521 $ hg status
494 M foo
522 M foo
495 ? foo.orig
523 ? foo.orig
@@ -7,6 +7,7 b' Create a repository:'
7 defaults.tag=-d "0 0"
7 defaults.tag=-d "0 0"
8 ui.slash=True
8 ui.slash=True
9 ui.interactive=False
9 ui.interactive=False
10 ui.mergemarkers=detailed
10 $ hg init t
11 $ hg init t
11 $ cd t
12 $ cd t
12
13
@@ -24,6 +24,7 b' update to bookmark X'
24
24
25 $ hg update X
25 $ hg update X
26 0 files updated, 0 files merged, 0 files removed, 0 files unresolved
26 0 files updated, 0 files merged, 0 files removed, 0 files unresolved
27 (activating bookmark X)
27
28
28 list bookmarks
29 list bookmarks
29
30
@@ -71,6 +72,7 b' list bookmarks'
71 Verify that switching to Z updates the current bookmark:
72 Verify that switching to Z updates the current bookmark:
72 $ hg update Z
73 $ hg update Z
73 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
74 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
75 (activating bookmark Z)
74 $ hg bookmark
76 $ hg bookmark
75 Y 0:719295282060
77 Y 0:719295282060
76 * Z -1:000000000000
78 * Z -1:000000000000
@@ -78,6 +80,7 b' Verify that switching to Z updates the c'
78 Switch back to Y for the remaining tests in this file:
80 Switch back to Y for the remaining tests in this file:
79 $ hg update Y
81 $ hg update Y
80 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
82 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
83 (activating bookmark Y)
81
84
82 delete bookmarks
85 delete bookmarks
83
86
@@ -152,6 +155,7 b' bare update moves the active bookmark fo'
152 $ hg bookmark X@2 -r 2
155 $ hg bookmark X@2 -r 2
153 $ hg update X
156 $ hg update X
154 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
157 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
158 (activating bookmark X)
155 $ hg bookmarks
159 $ hg bookmarks
156 * X 0:719295282060
160 * X 0:719295282060
157 X@1 1:cc586d725fbe
161 X@1 1:cc586d725fbe
@@ -32,6 +32,7 b''
32
32
33 $ hg up -C 3
33 $ hg up -C 3
34 0 files updated, 0 files merged, 0 files removed, 0 files unresolved
34 0 files updated, 0 files merged, 0 files removed, 0 files unresolved
35 (leaving bookmark c)
35 $ echo d > d
36 $ echo d > d
36 $ hg add d
37 $ hg add d
37 $ hg commit -m'd'
38 $ hg commit -m'd'
@@ -54,6 +55,7 b''
54
55
55 $ hg up -C 4
56 $ hg up -C 4
56 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
57 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
58 (leaving bookmark e)
57 $ hg merge
59 $ hg merge
58 abort: heads are bookmarked - please merge with an explicit rev
60 abort: heads are bookmarked - please merge with an explicit rev
59 (run 'hg heads' to see all heads)
61 (run 'hg heads' to see all heads)
@@ -63,6 +65,7 b''
63
65
64 $ hg up -C e
66 $ hg up -C e
65 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
67 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
68 (activating bookmark e)
66 $ hg merge
69 $ hg merge
67 abort: no matching bookmark to merge - please merge with an explicit rev or bookmark
70 abort: no matching bookmark to merge - please merge with an explicit rev or bookmark
68 (run 'hg heads' to see all heads)
71 (run 'hg heads' to see all heads)
@@ -72,6 +75,7 b''
72
75
73 $ hg up -C 4
76 $ hg up -C 4
74 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
77 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
78 (leaving bookmark e)
75 $ echo f > f
79 $ echo f > f
76 $ hg commit -Am "f"
80 $ hg commit -Am "f"
77 adding f
81 adding f
@@ -96,6 +100,7 b''
96
100
97 $ hg up -C e
101 $ hg up -C e
98 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
102 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
103 (activating bookmark e)
99 $ hg bookmarks
104 $ hg bookmarks
100 b 1:d2ae7f538514
105 b 1:d2ae7f538514
101 c 3:b8f96cf4688b
106 c 3:b8f96cf4688b
@@ -114,6 +119,7 b''
114
119
115 $ hg up -C 6
120 $ hg up -C 6
116 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
121 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
122 (leaving bookmark e)
117 $ echo g > g
123 $ echo g > g
118 $ hg commit -Am 'g'
124 $ hg commit -Am 'g'
119 adding g
125 adding g
@@ -274,7 +274,7 b' diverging a remote bookmark fails'
274 $ hg push http://localhost:$HGPORT2/
274 $ hg push http://localhost:$HGPORT2/
275 pushing to http://localhost:$HGPORT2/
275 pushing to http://localhost:$HGPORT2/
276 searching for changes
276 searching for changes
277 abort: push creates new remote head c922c0139ca0!
277 abort: push creates new remote head c922c0139ca0 with bookmark 'Y'!
278 (merge or see "hg help push" for details about pushing new heads)
278 (merge or see "hg help push" for details about pushing new heads)
279 [255]
279 [255]
280 $ hg -R ../a book
280 $ hg -R ../a book
@@ -290,7 +290,7 b' Unrelated marker does not alter the deci'
290 $ hg push http://localhost:$HGPORT2/
290 $ hg push http://localhost:$HGPORT2/
291 pushing to http://localhost:$HGPORT2/
291 pushing to http://localhost:$HGPORT2/
292 searching for changes
292 searching for changes
293 abort: push creates new remote head c922c0139ca0!
293 abort: push creates new remote head c922c0139ca0 with bookmark 'Y'!
294 (merge or see "hg help push" for details about pushing new heads)
294 (merge or see "hg help push" for details about pushing new heads)
295 [255]
295 [255]
296 $ hg -R ../a book
296 $ hg -R ../a book
@@ -411,6 +411,7 b' bookmark, not all outgoing changes:'
411 $ hg commit -m 'add bar'
411 $ hg commit -m 'add bar'
412 $ hg co "tip^"
412 $ hg co "tip^"
413 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
413 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
414 (leaving bookmark @)
414 $ hg book add-foo
415 $ hg book add-foo
415 $ hg book -r tip add-bar
416 $ hg book -r tip add-bar
416 Note: this push *must* push only a single changeset, as that's the point
417 Note: this push *must* push only a single changeset, as that's the point
@@ -38,6 +38,7 b' update to -2 (deactivates the active boo'
38
38
39 $ hg update -r -2
39 $ hg update -r -2
40 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
40 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
41 (leaving bookmark test2)
41
42
42 $ echo eee>>qqq.txt
43 $ echo eee>>qqq.txt
43
44
@@ -118,6 +118,7 b' bookmark rev 0 again'
118
118
119 $ hg update X
119 $ hg update X
120 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
120 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
121 (activating bookmark X)
121 $ echo c > c
122 $ echo c > c
122 $ hg add c
123 $ hg add c
123 $ hg commit -m 2
124 $ hg commit -m 2
@@ -501,6 +502,7 b" update to current bookmark if it's not t"
501 $ hg update
502 $ hg update
502 updating to active bookmark Z
503 updating to active bookmark Z
503 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
504 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
505 (activating bookmark Z)
504 $ hg bookmarks
506 $ hg bookmarks
505 X2 1:925d80f479bb
507 X2 1:925d80f479bb
506 Y 2:db815d6d32e6
508 Y 2:db815d6d32e6
@@ -513,6 +515,7 b' pull --update works the same as pull && '
513 moving bookmark 'Y' forward from db815d6d32e6
515 moving bookmark 'Y' forward from db815d6d32e6
514 $ hg -R cloned-bookmarks-update update Y
516 $ hg -R cloned-bookmarks-update update Y
515 0 files updated, 0 files merged, 0 files removed, 0 files unresolved
517 0 files updated, 0 files merged, 0 files removed, 0 files unresolved
518 (activating bookmark Y)
516 $ hg -R cloned-bookmarks-update pull --update .
519 $ hg -R cloned-bookmarks-update pull --update .
517 pulling from .
520 pulling from .
518 searching for changes
521 searching for changes
@@ -582,6 +585,7 b' test stripping a non-checked-out but boo'
582 $ hg book should-end-on-two
585 $ hg book should-end-on-two
583 $ hg co --clean 4
586 $ hg co --clean 4
584 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
587 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
588 (leaving bookmark should-end-on-two)
585 $ hg book four
589 $ hg book four
586 $ hg --config extensions.mq= strip 3
590 $ hg --config extensions.mq= strip 3
587 saved backup bundle to * (glob)
591 saved backup bundle to * (glob)
@@ -47,9 +47,7 b' Create an extension to test bundle2 API'
47 > op.ui.write('received ping request (id %i)\n' % part.id)
47 > op.ui.write('received ping request (id %i)\n' % part.id)
48 > if op.reply is not None and 'ping-pong' in op.reply.capabilities:
48 > if op.reply is not None and 'ping-pong' in op.reply.capabilities:
49 > op.ui.write_err('replying to ping request (id %i)\n' % part.id)
49 > op.ui.write_err('replying to ping request (id %i)\n' % part.id)
50 > rpart = bundle2.bundlepart('test:pong',
50 > op.reply.newpart('test:pong', [('in-reply-to', str(part.id))])
51 > [('in-reply-to', str(part.id))])
52 > op.reply.addpart(rpart)
53 >
51 >
54 > @bundle2.parthandler('test:debugreply')
52 > @bundle2.parthandler('test:debugreply')
55 > def debugreply(op, part):
53 > def debugreply(op, part):
@@ -66,6 +64,7 b' Create an extension to test bundle2 API'
66 > @command('bundle2',
64 > @command('bundle2',
67 > [('', 'param', [], 'stream level parameter'),
65 > [('', 'param', [], 'stream level parameter'),
68 > ('', 'unknown', False, 'include an unknown mandatory part in the bundle'),
66 > ('', 'unknown', False, 'include an unknown mandatory part in the bundle'),
67 > ('', 'unknownparams', False, 'include unknown part parameters in the bundle'),
69 > ('', 'parts', False, 'include some arbitrary parts to the bundle'),
68 > ('', 'parts', False, 'include some arbitrary parts to the bundle'),
70 > ('', 'reply', False, 'produce a reply bundle'),
69 > ('', 'reply', False, 'produce a reply bundle'),
71 > ('', 'pushrace', False, 'includes a check:head part with unknown nodes'),
70 > ('', 'pushrace', False, 'includes a check:head part with unknown nodes'),
@@ -83,11 +82,12 b' Create an extension to test bundle2 API'
83 >
82 >
84 > if opts['reply']:
83 > if opts['reply']:
85 > capsstring = 'ping-pong\nelephants=babar,celeste\ncity%3D%21=celeste%2Cville'
84 > capsstring = 'ping-pong\nelephants=babar,celeste\ncity%3D%21=celeste%2Cville'
86 > bundler.addpart(bundle2.bundlepart('b2x:replycaps', data=capsstring))
85 > bundler.newpart('b2x:replycaps', data=capsstring)
87 >
86 >
88 > if opts['pushrace']:
87 > if opts['pushrace']:
89 > dummynode = '01234567890123456789'
88 > # also serves to test the assignment of data outside of init
90 > bundler.addpart(bundle2.bundlepart('b2x:check:heads', data=dummynode))
89 > part = bundler.newpart('b2x:check:heads')
90 > part.data = '01234567890123456789'
91 >
91 >
92 > revs = opts['rev']
92 > revs = opts['rev']
93 > if 'rev' in opts:
93 > if 'rev' in opts:
@@ -99,31 +99,27 b' Create an extension to test bundle2 API'
99 > headcommon = [c.node() for c in repo.set('parents(%ld) - %ld', revs, revs)]
99 > headcommon = [c.node() for c in repo.set('parents(%ld) - %ld', revs, revs)]
100 > outgoing = discovery.outgoing(repo.changelog, headcommon, headmissing)
100 > outgoing = discovery.outgoing(repo.changelog, headcommon, headmissing)
101 > cg = changegroup.getlocalbundle(repo, 'test:bundle2', outgoing, None)
101 > cg = changegroup.getlocalbundle(repo, 'test:bundle2', outgoing, None)
102 > part = bundle2.bundlepart('b2x:changegroup', data=cg.getchunks())
102 > bundler.newpart('b2x:changegroup', data=cg.getchunks())
103 > bundler.addpart(part)
104 >
103 >
105 > if opts['parts']:
104 > if opts['parts']:
106 > part = bundle2.bundlepart('test:empty')
105 > bundler.newpart('test:empty')
107 > bundler.addpart(part)
108 > # add a second one to make sure we handle multiple parts
106 > # add a second one to make sure we handle multiple parts
109 > part = bundle2.bundlepart('test:empty')
107 > bundler.newpart('test:empty')
110 > bundler.addpart(part)
108 > bundler.newpart('test:song', data=ELEPHANTSSONG)
111 > part = bundle2.bundlepart('test:song', data=ELEPHANTSSONG)
109 > bundler.newpart('test:debugreply')
112 > bundler.addpart(part)
110 > mathpart = bundler.newpart('test:math')
113 > part = bundle2.bundlepart('test:debugreply')
111 > mathpart.addparam('pi', '3.14')
114 > bundler.addpart(part)
112 > mathpart.addparam('e', '2.72')
115 > part = bundle2.bundlepart('test:math',
113 > mathpart.addparam('cooking', 'raw', mandatory=False)
116 > [('pi', '3.14'), ('e', '2.72')],
114 > mathpart.data = '42'
117 > [('cooking', 'raw')],
115 > # advisory known part with unknown mandatory param
118 > '42')
116 > bundler.newpart('test:song', [('randomparam','')])
119 > bundler.addpart(part)
120 > if opts['unknown']:
117 > if opts['unknown']:
121 > part = bundle2.bundlepart('test:UNKNOWN',
118 > bundler.newpart('test:UNKNOWN', data='some random content')
122 > data='some random content')
119 > if opts['unknownparams']:
123 > bundler.addpart(part)
120 > bundler.newpart('test:SONG', [('randomparams', '')])
124 > if opts['parts']:
121 > if opts['parts']:
125 > part = bundle2.bundlepart('test:ping')
122 > bundler.newpart('test:ping')
126 > bundler.addpart(part)
127 >
123 >
128 > if path is None:
124 > if path is None:
129 > file = sys.stdout
125 > file = sys.stdout
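The hunk above migrates the test extension from constructing `bundlepart` objects by hand to the `bundler.newpart(...)` factory, whose returned part accepts parameters via `addparam` with a `mandatory` flag. A minimal self-contained sketch of that builder pattern; the class and method names mirror the test code, but this is an illustration of the API shape, not Mercurial's actual bundle2 module:

```python
class Part(object):
    """A named payload with mandatory and advisory parameters."""
    def __init__(self, parttype, data=''):
        self.type = parttype
        self.data = data
        self.mandatoryparams = []
        self.advisoryparams = []

    def addparam(self, name, value='', mandatory=True):
        # Mandatory parameters must be understood by the receiver;
        # advisory ones may be ignored.
        target = self.mandatoryparams if mandatory else self.advisoryparams
        target.append((name, value))

class Bundler(object):
    """Collects parts; newpart both creates and registers a part."""
    def __init__(self):
        self.parts = []

    def newpart(self, parttype, data=''):
        part = Part(parttype, data)
        self.parts.append(part)
        return part

# Usage mirroring the test:math part built in the hunk above.
bundler = Bundler()
mathpart = bundler.newpart('test:math')
mathpart.addparam('pi', '3.14')
mathpart.addparam('e', '2.72')
mathpart.addparam('cooking', 'raw', mandatory=False)
mathpart.data = '42'
```

Returning the part from `newpart` is what lets the test assign `part.data` after creation, which the hunk's comment calls out explicitly.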
@@ -144,7 +140,7 b' Create an extension to test bundle2 API'
144 > unbundler = bundle2.unbundle20(ui, sys.stdin)
140 > unbundler = bundle2.unbundle20(ui, sys.stdin)
145 > op = bundle2.processbundle(repo, unbundler, lambda: tr)
141 > op = bundle2.processbundle(repo, unbundler, lambda: tr)
146 > tr.close()
142 > tr.close()
147 > except KeyError, exc:
143 > except error.BundleValueError, exc:
148 > raise util.Abort('missing support for %s' % exc)
144 > raise util.Abort('missing support for %s' % exc)
149 > except error.PushRaced, exc:
145 > except error.PushRaced, exc:
150 > raise util.Abort('push race: %s' % exc)
146 > raise util.Abort('push race: %s' % exc)
@@ -170,7 +166,7 b' Create an extension to test bundle2 API'
170 > unbundler = bundle2.unbundle20(ui, sys.stdin)
166 > unbundler = bundle2.unbundle20(ui, sys.stdin)
171 > try:
167 > try:
172 > params = unbundler.params
168 > params = unbundler.params
173 > except KeyError, exc:
169 > except error.BundleValueError, exc:
174 > raise util.Abort('unknown parameters: %s' % exc)
170 > raise util.Abort('unknown parameters: %s' % exc)
175 > ui.write('options count: %i\n' % len(params))
171 > ui.write('options count: %i\n' % len(params))
176 > for key in sorted(params):
172 > for key in sorted(params):
@@ -194,9 +190,12 b' Create an extension to test bundle2 API'
194 > bundle2-exp=True
190 > bundle2-exp=True
195 > [ui]
191 > [ui]
196 > ssh=python "$TESTDIR/dummyssh"
192 > ssh=python "$TESTDIR/dummyssh"
193 > logtemplate={rev}:{node|short} {phase} {author} {desc|firstline}
197 > [web]
194 > [web]
198 > push_ssl = false
195 > push_ssl = false
199 > allow_push = *
196 > allow_push = *
197 > [phases]
198 > publish=False
200 > EOF
199 > EOF
201
200
202 The extension requires a repo (currently unused)
201 The extension requires a repo (currently unused)
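The extension above registers part handlers with the `@bundle2.parthandler(...)` decorator. The dispatch pattern it relies on can be sketched in plain Python; the names below are illustrative, not the real bundle2 internals:

```python
# Registry mapping a part type name to the function that handles it.
parthandlermapping = {}

def parthandler(parttype):
    # Decorator factory: record the decorated function under parttype.
    def _decorator(func):
        parthandlermapping[parttype] = func
        return func
    return _decorator

@parthandler('test:ping')
def handleping(op, part):
    # A handler receives the bundle operation and the incoming part.
    return 'received ping request (id %i)' % part['id']

# Dispatch an incoming part to whichever handler claimed its type.
result = parthandlermapping['test:ping'](None, {'id': 6})
```

Parts whose type has no entry in the registry are what the test output later reports as "ignoring unsupported advisory part", or an abort when the part is mandatory.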
@@ -308,7 +307,7 b' Test unknown mandatory option'
308 ---------------------------------------------------
307 ---------------------------------------------------
309
308
310 $ hg bundle2 --param 'Gravity' | hg statbundle2
309 $ hg bundle2 --param 'Gravity' | hg statbundle2
311 abort: unknown parameters: 'Gravity'
310 abort: unknown parameters: Stream Parameter - Gravity
312 [255]
311 [255]
313
312
314 Test debug output
313 Test debug output
@@ -372,6 +371,7 b' Test part'
372 bundle part: "test:song"
371 bundle part: "test:song"
373 bundle part: "test:debugreply"
372 bundle part: "test:debugreply"
374 bundle part: "test:math"
373 bundle part: "test:math"
374 bundle part: "test:song"
375 bundle part: "test:ping"
375 bundle part: "test:ping"
376 end of bundle
376 end of bundle
377
377
@@ -380,7 +380,7 b' Test part'
380 test:empty\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x11 (esc)
380 test:empty\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x11 (esc)
381 test:empty\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x10 test:song\x00\x00\x00\x02\x00\x00\x00\x00\x00\xb2Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko (esc)
381 test:empty\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x10 test:song\x00\x00\x00\x02\x00\x00\x00\x00\x00\xb2Patali Dirapata, Cromda Cromda Ripalo, Pata Pata, Ko Ko Ko (esc)
382 Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
382 Bokoro Dipoulito, Rondi Rondi Pepino, Pata Pata, Ko Ko Ko
383 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.\x00\x00\x00\x00\x00\x16\x0ftest:debugreply\x00\x00\x00\x03\x00\x00\x00\x00\x00\x00\x00+ test:math\x00\x00\x00\x04\x02\x01\x02\x04\x01\x04\x07\x03pi3.14e2.72cookingraw\x00\x00\x00\x0242\x00\x00\x00\x00\x00\x10 test:ping\x00\x00\x00\x05\x00\x00\x00\x00\x00\x00\x00\x00 (no-eol) (esc)
383 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.\x00\x00\x00\x00\x00\x16\x0ftest:debugreply\x00\x00\x00\x03\x00\x00\x00\x00\x00\x00\x00+ test:math\x00\x00\x00\x04\x02\x01\x02\x04\x01\x04\x07\x03pi3.14e2.72cookingraw\x00\x00\x00\x0242\x00\x00\x00\x00\x00\x1d test:song\x00\x00\x00\x05\x01\x00\x0b\x00randomparam\x00\x00\x00\x00\x00\x10 test:ping\x00\x00\x00\x06\x00\x00\x00\x00\x00\x00\x00\x00 (no-eol) (esc)
384
384
385
385
386 $ hg statbundle2 < ../parts.hg2
386 $ hg statbundle2 < ../parts.hg2
@@ -405,11 +405,15 b' Test part'
405 mandatory: 2
405 mandatory: 2
406 advisory: 1
406 advisory: 1
407 payload: 2 bytes
407 payload: 2 bytes
408 :test:song:
409 mandatory: 1
410 advisory: 0
411 payload: 0 bytes
408 :test:ping:
412 :test:ping:
409 mandatory: 0
413 mandatory: 0
410 advisory: 0
414 advisory: 0
411 payload: 0 bytes
415 payload: 0 bytes
412 parts count: 6
416 parts count: 7
413
417
414 $ hg statbundle2 --debug < ../parts.hg2
418 $ hg statbundle2 --debug < ../parts.hg2
415 start processing of HG2X stream
419 start processing of HG2X stream
@@ -463,9 +467,18 b' Test part'
463 payload chunk size: 2
467 payload chunk size: 2
464 payload chunk size: 0
468 payload chunk size: 0
465 payload: 2 bytes
469 payload: 2 bytes
470 part header size: 29
471 part type: "test:song"
472 part id: "5"
473 part parameters: 1
474 :test:song:
475 mandatory: 1
476 advisory: 0
477 payload chunk size: 0
478 payload: 0 bytes
466 part header size: 16
479 part header size: 16
467 part type: "test:ping"
480 part type: "test:ping"
468 part id: "5"
481 part id: "6"
469 part parameters: 0
482 part parameters: 0
470 :test:ping:
483 :test:ping:
471 mandatory: 0
484 mandatory: 0
@@ -474,7 +487,7 b' Test part'
474 payload: 0 bytes
487 payload: 0 bytes
475 part header size: 0
488 part header size: 0
476 end of bundle2 stream
489 end of bundle2 stream
477 parts count: 6
490 parts count: 7
478
491
479 Test actual unbundling of test part
492 Test actual unbundling of test part
480 =======================================
493 =======================================
@@ -489,13 +502,13 b' Process the bundle'
489 part type: "test:empty"
502 part type: "test:empty"
490 part id: "0"
503 part id: "0"
491 part parameters: 0
504 part parameters: 0
492 ignoring unknown advisory part 'test:empty'
505 ignoring unsupported advisory part test:empty
493 payload chunk size: 0
506 payload chunk size: 0
494 part header size: 17
507 part header size: 17
495 part type: "test:empty"
508 part type: "test:empty"
496 part id: "1"
509 part id: "1"
497 part parameters: 0
510 part parameters: 0
498 ignoring unknown advisory part 'test:empty'
511 ignoring unsupported advisory part test:empty
499 payload chunk size: 0
512 payload chunk size: 0
500 part header size: 16
513 part header size: 16
501 part type: "test:song"
514 part type: "test:song"
@@ -519,15 +532,22 b' Process the bundle'
519 part type: "test:math"
532 part type: "test:math"
520 part id: "4"
533 part id: "4"
521 part parameters: 3
534 part parameters: 3
522 ignoring unknown advisory part 'test:math'
535 ignoring unsupported advisory part test:math
523 payload chunk size: 2
536 payload chunk size: 2
524 payload chunk size: 0
537 payload chunk size: 0
538 part header size: 29
539 part type: "test:song"
540 part id: "5"
541 part parameters: 1
542 found a handler for part 'test:song'
543 ignoring unsupported advisory part test:song - randomparam
544 payload chunk size: 0
525 part header size: 16
545 part header size: 16
526 part type: "test:ping"
546 part type: "test:ping"
527 part id: "5"
547 part id: "6"
528 part parameters: 0
548 part parameters: 0
529 found a handler for part 'test:ping'
549 found a handler for part 'test:ping'
530 received ping request (id 5)
550 received ping request (id 6)
531 payload chunk size: 0
551 payload chunk size: 0
532 part header size: 0
552 part header size: 0
533 end of bundle2 stream
553 end of bundle2 stream
@@ -546,7 +566,17 b' Unbundle with an unknown mandatory part'
546 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.
566 Emana Karassoli, Loucra Loucra Ponponto, Pata Pata, Ko Ko Ko.
547 debugreply: no reply
567 debugreply: no reply
548 0 unread bytes
568 0 unread bytes
549 abort: missing support for 'test:unknown'
569 abort: missing support for test:unknown
570 [255]
571
572 Unbundle with an unknown mandatory part parameters
573 (should abort)
574
575 $ hg bundle2 --unknownparams ../unknown.hg2
576
577 $ hg unbundle2 < ../unknown.hg2
578 0 unread bytes
579 abort: missing support for test:song - randomparams
550 [255]
580 [255]
551
581
552 unbundle with a reply
582 unbundle with a reply
@@ -572,9 +602,9 b' The reply is a bundle'
572 debugreply: 'babar'
602 debugreply: 'babar'
573 debugreply: 'celeste'
603 debugreply: 'celeste'
574 debugreply: 'ping-pong'
604 debugreply: 'ping-pong'
575 \x00\x00\x00\x00\x00\x1e test:pong\x00\x00\x00\x02\x01\x00\x0b\x01in-reply-to6\x00\x00\x00\x00\x00\x1f (esc)
605 \x00\x00\x00\x00\x00\x1e test:pong\x00\x00\x00\x02\x01\x00\x0b\x01in-reply-to7\x00\x00\x00\x00\x00\x1f (esc)
576 b2x:output\x00\x00\x00\x03\x00\x01\x0b\x01in-reply-to6\x00\x00\x00=received ping request (id 6) (esc)
606 b2x:output\x00\x00\x00\x03\x00\x01\x0b\x01in-reply-to7\x00\x00\x00=received ping request (id 7) (esc)
577 replying to ping request (id 6)
607 replying to ping request (id 7)
578 \x00\x00\x00\x00\x00\x00 (no-eol) (esc)
608 \x00\x00\x00\x00\x00\x00 (no-eol) (esc)
579
609
580 The reply is valid
610 The reply is valid
@@ -613,8 +643,8 b' Unbundle the reply to get the output:'
613 remote: debugreply: 'babar'
643 remote: debugreply: 'babar'
614 remote: debugreply: 'celeste'
644 remote: debugreply: 'celeste'
615 remote: debugreply: 'ping-pong'
645 remote: debugreply: 'ping-pong'
616 remote: received ping request (id 6)
646 remote: received ping request (id 7)
617 remote: replying to ping request (id 6)
647 remote: replying to ping request (id 7)
618 0 unread bytes
648 0 unread bytes
619
649
620 Test push race detection
650 Test push race detection
@@ -637,57 +667,23 b' Support for changegroup'
637 (run 'hg heads' to see heads, 'hg merge' to merge)
667 (run 'hg heads' to see heads, 'hg merge' to merge)
638
668
639 $ hg log -G
669 $ hg log -G
640 o changeset: 8:02de42196ebe
670 o 8:02de42196ebe draft Nicolas Dumazet <nicdumz.commits@gmail.com> H
641 | tag: tip
642 | parent: 6:24b6387c8c8c
643 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
644 | date: Sat Apr 30 15:24:48 2011 +0200
645 | summary: H
646 |
647 | o changeset: 7:eea13746799a
648 |/| parent: 6:24b6387c8c8c
649 | | parent: 5:9520eea781bc
650 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
651 | | date: Sat Apr 30 15:24:48 2011 +0200
652 | | summary: G
653 | |
654 o | changeset: 6:24b6387c8c8c
655 | | parent: 1:cd010b8cd998
656 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
657 | | date: Sat Apr 30 15:24:48 2011 +0200
658 | | summary: F
659 | |
660 | o changeset: 5:9520eea781bc
661 |/ parent: 1:cd010b8cd998
662 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
663 | date: Sat Apr 30 15:24:48 2011 +0200
664 | summary: E
665 |
671 |
666 | o changeset: 4:32af7686d403
672 | o 7:eea13746799a draft Nicolas Dumazet <nicdumz.commits@gmail.com> G
667 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
673 |/|
668 | | date: Sat Apr 30 15:24:48 2011 +0200
674 o | 6:24b6387c8c8c draft Nicolas Dumazet <nicdumz.commits@gmail.com> F
669 | | summary: D
670 | |
675 | |
671 | o changeset: 3:5fddd98957c8
676 | o 5:9520eea781bc draft Nicolas Dumazet <nicdumz.commits@gmail.com> E
672 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
677 |/
673 | | date: Sat Apr 30 15:24:48 2011 +0200
678 | o 4:32af7686d403 draft Nicolas Dumazet <nicdumz.commits@gmail.com> D
674 | | summary: C
675 | |
679 | |
676 | o changeset: 2:42ccdea3bb16
680 | o 3:5fddd98957c8 draft Nicolas Dumazet <nicdumz.commits@gmail.com> C
677 |/ user: Nicolas Dumazet <nicdumz.commits@gmail.com>
681 | |
678 | date: Sat Apr 30 15:24:48 2011 +0200
682 | o 2:42ccdea3bb16 draft Nicolas Dumazet <nicdumz.commits@gmail.com> B
679 | summary: B
683 |/
680 |
684 o 1:cd010b8cd998 draft Nicolas Dumazet <nicdumz.commits@gmail.com> A
681 o changeset: 1:cd010b8cd998
682 parent: -1:000000000000
683 user: Nicolas Dumazet <nicdumz.commits@gmail.com>
684 date: Sat Apr 30 15:24:48 2011 +0200
685 summary: A
686
685
687 @ changeset: 0:3903775176ed
686 @ 0:3903775176ed draft test a
688 user: test
689 date: Thu Jan 01 00:00:00 1970 +0000
690 summary: a
691
687
692
688
693 $ hg bundle2 --debug --rev '8+7+5+4' ../rev.hg2
689 $ hg bundle2 --debug --rev '8+7+5+4' ../rev.hg2
@@ -768,6 +764,7 b' Real world exchange'
768 clone --pull
764 clone --pull
769
765
770 $ cd ..
766 $ cd ..
767 $ hg -R main phase --public cd010b8cd998
771 $ hg clone main other --pull --rev 9520eea781bc
768 $ hg clone main other --pull --rev 9520eea781bc
772 adding changesets
769 adding changesets
773 adding manifests
770 adding manifests
@@ -776,20 +773,14 b' clone --pull'
776 updating to branch default
773 updating to branch default
777 2 files updated, 0 files merged, 0 files removed, 0 files unresolved
774 2 files updated, 0 files merged, 0 files removed, 0 files unresolved
778 $ hg -R other log -G
775 $ hg -R other log -G
779 @ changeset: 1:9520eea781bc
776 @ 1:9520eea781bc draft Nicolas Dumazet <nicdumz.commits@gmail.com> E
780 | tag: tip
781 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
782 | date: Sat Apr 30 15:24:48 2011 +0200
783 | summary: E
784 |
777 |
785 o changeset: 0:cd010b8cd998
778 o 0:cd010b8cd998 public Nicolas Dumazet <nicdumz.commits@gmail.com> A
786 user: Nicolas Dumazet <nicdumz.commits@gmail.com>
787 date: Sat Apr 30 15:24:48 2011 +0200
788 summary: A
789
779
790
780
791 pull
781 pull
792
782
783 $ hg -R main phase --public 9520eea781bc
793 $ hg -R other pull -r 24b6387c8c8c
784 $ hg -R other pull -r 24b6387c8c8c
794 pulling from $TESTTMP/main (glob)
785 pulling from $TESTTMP/main (glob)
795 searching for changes
786 searching for changes
@@ -798,15 +789,43 b' pull'
798 adding file changes
789 adding file changes
799 added 1 changesets with 1 changes to 1 files (+1 heads)
790 added 1 changesets with 1 changes to 1 files (+1 heads)
800 (run 'hg heads' to see heads, 'hg merge' to merge)
791 (run 'hg heads' to see heads, 'hg merge' to merge)
792 $ hg -R other log -G
793 o 2:24b6387c8c8c draft Nicolas Dumazet <nicdumz.commits@gmail.com> F
794 |
795 | @ 1:9520eea781bc draft Nicolas Dumazet <nicdumz.commits@gmail.com> E
796 |/
797 o 0:cd010b8cd998 public Nicolas Dumazet <nicdumz.commits@gmail.com> A
798
801
799
800 pull empty (with phase movement)
801
802 $ hg -R main phase --public 24b6387c8c8c
803 $ hg -R other pull -r 24b6387c8c8c
804 pulling from $TESTTMP/main (glob)
805 no changes found
806 $ hg -R other log -G
807 o 2:24b6387c8c8c public Nicolas Dumazet <nicdumz.commits@gmail.com> F
808 |
809 | @ 1:9520eea781bc draft Nicolas Dumazet <nicdumz.commits@gmail.com> E
810 |/
811 o 0:cd010b8cd998 public Nicolas Dumazet <nicdumz.commits@gmail.com> A
812
802 pull empty
813 pull empty
803
814
804 $ hg -R other pull -r 24b6387c8c8c
815 $ hg -R other pull -r 24b6387c8c8c
805 pulling from $TESTTMP/main (glob)
816 pulling from $TESTTMP/main (glob)
806 no changes found
817 no changes found
818 $ hg -R other log -G
819 o 2:24b6387c8c8c public Nicolas Dumazet <nicdumz.commits@gmail.com> F
820 |
821 | @ 1:9520eea781bc draft Nicolas Dumazet <nicdumz.commits@gmail.com> E
822 |/
823 o 0:cd010b8cd998 public Nicolas Dumazet <nicdumz.commits@gmail.com> A
824
807
825
808 push
826 push
809
827
828 $ hg -R main phase --public eea13746799a
810 $ hg -R main push other --rev eea13746799a
829 $ hg -R main push other --rev eea13746799a
811 pushing to other
830 pushing to other
812 searching for changes
831 searching for changes
@@ -814,6 +833,15 b' push'
814 remote: adding manifests
833 remote: adding manifests
815 remote: adding file changes
834 remote: adding file changes
816 remote: added 1 changesets with 0 changes to 0 files (-1 heads)
835 remote: added 1 changesets with 0 changes to 0 files (-1 heads)
836 $ hg -R other log -G
837 o 3:eea13746799a public Nicolas Dumazet <nicdumz.commits@gmail.com> G
838 |\
839 | o 2:24b6387c8c8c public Nicolas Dumazet <nicdumz.commits@gmail.com> F
840 | |
841 @ | 1:9520eea781bc public Nicolas Dumazet <nicdumz.commits@gmail.com> E
842 |/
843 o 0:cd010b8cd998 public Nicolas Dumazet <nicdumz.commits@gmail.com> A
844
817
845
818 pull over ssh
846 pull over ssh
819
847
@@ -850,12 +878,28 b' push over ssh'
850 remote: adding manifests
878 remote: adding manifests
851 remote: adding file changes
879 remote: adding file changes
852 remote: added 1 changesets with 1 changes to 1 files
880 remote: added 1 changesets with 1 changes to 1 files
881 $ hg -R other log -G
882 o 6:5fddd98957c8 draft Nicolas Dumazet <nicdumz.commits@gmail.com> C
883 |
884 o 5:42ccdea3bb16 draft Nicolas Dumazet <nicdumz.commits@gmail.com> B
885 |
886 | o 4:02de42196ebe draft Nicolas Dumazet <nicdumz.commits@gmail.com> H
887 | |
888 | | o 3:eea13746799a public Nicolas Dumazet <nicdumz.commits@gmail.com> G
889 | |/|
890 | o | 2:24b6387c8c8c public Nicolas Dumazet <nicdumz.commits@gmail.com> F
891 |/ /
892 | @ 1:9520eea781bc public Nicolas Dumazet <nicdumz.commits@gmail.com> E
893 |/
894 o 0:cd010b8cd998 public Nicolas Dumazet <nicdumz.commits@gmail.com> A
895
853
896
854 push over http
897 push over http
855
898
856 $ hg -R other serve -p $HGPORT2 -d --pid-file=other.pid -E other-error.log
899 $ hg -R other serve -p $HGPORT2 -d --pid-file=other.pid -E other-error.log
857 $ cat other.pid >> $DAEMON_PIDS
900 $ cat other.pid >> $DAEMON_PIDS
858
901
902 $ hg -R main phase --public 32af7686d403
859 $ hg -R main push http://localhost:$HGPORT2/ -r 32af7686d403
903 $ hg -R main push http://localhost:$HGPORT2/ -r 32af7686d403
860 pushing to http://localhost:$HGPORT2/
904 pushing to http://localhost:$HGPORT2/
861 searching for changes
905 searching for changes
@@ -868,51 +912,21 b' push over http'
868 Check final content.
912 Check final content.
869
913
870 $ hg -R other log -G
914 $ hg -R other log -G
871 o changeset: 7:32af7686d403
915 o 7:32af7686d403 public Nicolas Dumazet <nicdumz.commits@gmail.com> D
872 | tag: tip
873 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
874 | date: Sat Apr 30 15:24:48 2011 +0200
875 | summary: D
876 |
916 |
877 o changeset: 6:5fddd98957c8
917 o 6:5fddd98957c8 public Nicolas Dumazet <nicdumz.commits@gmail.com> C
878 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
879 | date: Sat Apr 30 15:24:48 2011 +0200
880 | summary: C
881 |
918 |
882 o changeset: 5:42ccdea3bb16
919 o 5:42ccdea3bb16 public Nicolas Dumazet <nicdumz.commits@gmail.com> B
883 | parent: 0:cd010b8cd998
884 | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
885 | date: Sat Apr 30 15:24:48 2011 +0200
886 | summary: B
887 |
920 |
888 | o changeset: 4:02de42196ebe
921 | o 4:02de42196ebe draft Nicolas Dumazet <nicdumz.commits@gmail.com> H
889 | | parent: 2:24b6387c8c8c
890 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
891 | | date: Sat Apr 30 15:24:48 2011 +0200
892 | | summary: H
893 | |
922 | |
894 | | o changeset: 3:eea13746799a
923 | | o 3:eea13746799a public Nicolas Dumazet <nicdumz.commits@gmail.com> G
895 | |/| parent: 2:24b6387c8c8c
924 | |/|
896 | | | parent: 1:9520eea781bc
925 | o | 2:24b6387c8c8c public Nicolas Dumazet <nicdumz.commits@gmail.com> F
897 | | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
926 |/ /
898 | | | date: Sat Apr 30 15:24:48 2011 +0200
927 | @ 1:9520eea781bc public Nicolas Dumazet <nicdumz.commits@gmail.com> E
899 | | | summary: G
928 |/
900 | | |
929 o 0:cd010b8cd998 public Nicolas Dumazet <nicdumz.commits@gmail.com> A
901 | o | changeset: 2:24b6387c8c8c
902 |/ / parent: 0:cd010b8cd998
903 | | user: Nicolas Dumazet <nicdumz.commits@gmail.com>
904 | | date: Sat Apr 30 15:24:48 2011 +0200
905 | | summary: F
906 | |
907 | @ changeset: 1:9520eea781bc
908 |/ user: Nicolas Dumazet <nicdumz.commits@gmail.com>
909 | date: Sat Apr 30 15:24:48 2011 +0200
910 | summary: E
911 |
912 o changeset: 0:cd010b8cd998
913 user: Nicolas Dumazet <nicdumz.commits@gmail.com>
914 date: Sat Apr 30 15:24:48 2011 +0200
915 summary: A
916
930
917
931
918 Error Handling
932 Error Handling
@@ -933,27 +947,24 b' Setting up'
933 > from mercurial import exchange
947 > from mercurial import exchange
934 > from mercurial import extensions
948 > from mercurial import extensions
935 >
949 >
936 > def _pushbundle2failpart(orig, pushop, bundler):
950 > def _pushbundle2failpart(pushop, bundler):
937 > extradata = orig(pushop, bundler)
938 > reason = pushop.ui.config('failpush', 'reason', None)
951 > reason = pushop.ui.config('failpush', 'reason', None)
939 > part = None
952 > part = None
940 > if reason == 'abort':
953 > if reason == 'abort':
941 > part = bundle2.bundlepart('test:abort')
954 > bundler.newpart('test:abort')
942 > if reason == 'unknown':
955 > if reason == 'unknown':
943 > part = bundle2.bundlepart('TEST:UNKNOWN')
956 > bundler.newpart('TEST:UNKNOWN')
944 > if reason == 'race':
957 > if reason == 'race':
945 > # 20 Bytes of crap
958 > # 20 Bytes of crap
946 > part = bundle2.bundlepart('b2x:check:heads', data='01234567890123456789')
959 > bundler.newpart('b2x:check:heads', data='01234567890123456789')
947 > if part is not None:
960 > return lambda op: None
948 > bundler.addpart(part)
949 > return extradata
950 >
961 >
951 > @bundle2.parthandler("test:abort")
962 > @bundle2.parthandler("test:abort")
952 > def handleabort(op, part):
963 > def handleabort(op, part):
953 > raise util.Abort('Abandon ship!', hint="don't panic")
964 > raise util.Abort('Abandon ship!', hint="don't panic")
954 >
965 >
955 > def uisetup(ui):
966 > def uisetup(ui):
956 > extensions.wrapfunction(exchange, '_pushbundle2extraparts', _pushbundle2failpart)
967 > exchange.bundle2partsgenerators.insert(0, _pushbundle2failpart)
957 >
968 >
958 > EOF
969 > EOF
959
970
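The new hook style above replaces `extensions.wrapfunction` with inserting a generator into `exchange.bundle2partsgenerators`. A minimal Python sketch of such a registry pattern, with illustrative names only (this is not Mercurial's actual API):

```python
# Minimal sketch of a "parts generators" registry, in the spirit of
# exchange.bundle2partsgenerators used by the test extension above.
bundle2partsgenerators = []

def partsgenerator(func):
    # decorator: register a generator that may append parts to a bundler
    bundle2partsgenerators.append(func)
    return func

@partsgenerator
def changegrouppart(bundler):
    # the normal part a push would add
    bundler.append('changegroup')

def buildbundle():
    # run every registered generator in order, collecting parts
    bundler = []
    for gen in bundle2partsgenerators:
        gen(bundler)
    return bundler

# a test extension can prepend a failing part, much like _pushbundle2failpart:
bundle2partsgenerators.insert(0, lambda bundler: bundler.append('test:abort'))
```

Because generators run in list order, inserting at index 0 guarantees the failing part is processed first, which is exactly what the error-handling tests rely on.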
@@ -1015,19 +1026,19 b' Doing the actual push: unknown mandatory'
1015 $ hg -R main push other -r e7ec4e813ba6
1026 $ hg -R main push other -r e7ec4e813ba6
1016 pushing to other
1027 pushing to other
1017 searching for changes
1028 searching for changes
1018 abort: missing support for 'test:unknown'
1029 abort: missing support for test:unknown
1019 [255]
1030 [255]
1020
1031
1021 $ hg -R main push ssh://user@dummy/other -r e7ec4e813ba6
1032 $ hg -R main push ssh://user@dummy/other -r e7ec4e813ba6
1022 pushing to ssh://user@dummy/other
1033 pushing to ssh://user@dummy/other
1023 searching for changes
1034 searching for changes
1024 abort: missing support for "'test:unknown'"
1035 abort: missing support for test:unknown
1025 [255]
1036 [255]
1026
1037
1027 $ hg -R main push http://localhost:$HGPORT2/ -r e7ec4e813ba6
1038 $ hg -R main push http://localhost:$HGPORT2/ -r e7ec4e813ba6
1028 pushing to http://localhost:$HGPORT2/
1039 pushing to http://localhost:$HGPORT2/
1029 searching for changes
1040 searching for changes
1030 abort: missing support for "'test:unknown'"
1041 abort: missing support for test:unknown
1031 [255]
1042 [255]
1032
1043
1033 Doing the actual push: race
1044 Doing the actual push: race
@@ -1,36 +1,17 b''
1 #if test-repo
2
1 $ check_code="$TESTDIR"/../contrib/check-code.py
3 $ check_code="$TESTDIR"/../contrib/check-code.py
2 $ cd "$TESTDIR"/..
4 $ cd "$TESTDIR"/..
3 $ if hg identify -q > /dev/null 2>&1; then :
4 > else
5 > echo "skipped: not a Mercurial working dir" >&2
6 > exit 80
7 > fi
8
9 Prepare check for Python files without py extension
10
11 $ cp \
12 > hg \
13 > hgweb.cgi \
14 > contrib/convert-repo \
15 > contrib/dumprevlog \
16 > contrib/hgweb.fcgi \
17 > contrib/hgweb.wsgi \
18 > contrib/simplemerge \
19 > contrib/undumprevlog \
20 > i18n/hggettext \
21 > i18n/posplit \
22 > tests/hghave \
23 > tests/dummyssh \
24 > "$TESTTMP"/
25 $ for f in "$TESTTMP"/*; do mv "$f" "$f.py"; done
26
5
27 New errors are not allowed. Warnings are strongly discouraged.
6 New errors are not allowed. Warnings are strongly discouraged.
28 (The string "no-che?k-code" keeps check-code from skipping this file.)
7 (The string "no-che?k-code" keeps check-code from skipping this file.)
29
8
30 $ { hg manifest 2>/dev/null; ls "$TESTTMP"/*.py | sed 's-\\-/-g'; } |
9 $ hg locate | sed 's-\\-/-g' |
31 > xargs "$check_code" --warnings --per-file=0 || false
10 > xargs "$check_code" --warnings --per-file=0 || false
32 Skipping hgext/zeroconf/Zeroconf.py it has no-che?k-code (glob)
11 Skipping hgext/zeroconf/Zeroconf.py it has no-che?k-code (glob)
33 Skipping i18n/polib.py it has no-che?k-code (glob)
12 Skipping i18n/polib.py it has no-che?k-code (glob)
34 Skipping mercurial/httpclient/__init__.py it has no-che?k-code (glob)
13 Skipping mercurial/httpclient/__init__.py it has no-che?k-code (glob)
35 Skipping mercurial/httpclient/_readers.py it has no-che?k-code (glob)
14 Skipping mercurial/httpclient/_readers.py it has no-che?k-code (glob)
36 Skipping mercurial/httpclient/socketutil.py it has no-che?k-code (glob)
15 Skipping mercurial/httpclient/socketutil.py it has no-che?k-code (glob)
16
17 #endif
@@ -284,3 +284,19 b''
284 > print _(
284 > print _(
285 don't use % inside _()
285 don't use % inside _()
286 [1]
286 [1]
287
288 web templates
289
290 $ mkdir -p mercurial/templates
291 $ cat > mercurial/templates/example.tmpl <<EOF
292 > {desc}
293 > {desc|escape}
294 > {desc|firstline}
295 > {desc|websub}
296 > EOF
297
298 $ "$check_code" --warnings mercurial/templates/example.tmpl
299 mercurial/templates/example.tmpl:2:
300 > {desc|escape}
301 warning: follow desc keyword with either firstline or websub
302 [1]
@@ -5,7 +5,7 b''
5 run pyflakes on all tracked files ending in .py or without a file ending
5 run pyflakes on all tracked files ending in .py or without a file ending
6 (skipping binary file random-seed)
6 (skipping binary file random-seed)
7
7
8 $ hg manifest 2>/dev/null | egrep "\.py$|^[^.]*$" | grep -v /random_seed$ \
8 $ hg locate 'set:**.py or grep("^!#.*python")' 2>/dev/null \
9 > | xargs pyflakes 2>/dev/null | "$TESTDIR/filterpyflakes.py"
9 > | xargs pyflakes 2>/dev/null | "$TESTDIR/filterpyflakes.py"
10 contrib/win32/hgwebdir_wsgi.py:*: 'win32traceutil' imported but unused (glob)
10 contrib/win32/hgwebdir_wsgi.py:*: 'win32traceutil' imported but unused (glob)
11 setup.py:*: 'sha' imported but unused (glob)
11 setup.py:*: 'sha' imported but unused (glob)
@@ -17,5 +17,6 b' run pyflakes on all tracked files ending'
17 tests/hghave.py:*: 'pygments' imported but unused (glob)
17 tests/hghave.py:*: 'pygments' imported but unused (glob)
18 tests/hghave.py:*: 'ssl' imported but unused (glob)
18 tests/hghave.py:*: 'ssl' imported but unused (glob)
19 contrib/win32/hgwebdir_wsgi.py:93: 'from isapi.install import *' used; unable to detect undefined names (glob)
19 contrib/win32/hgwebdir_wsgi.py:93: 'from isapi.install import *' used; unable to detect undefined names (glob)
20 tests/filterpyflakes.py:58: undefined name 'undefinedname'
20
21
21 #endif
22 #endif
@@ -1850,6 +1850,15 b' Test current bookmark templating'
1850 2 bar* foo
1850 2 bar* foo
1851 1
1851 1
1852 0
1852 0
1853 $ hg log --template "{rev} {currentbookmark}\n"
1854 2 bar
1855 1
1856 0
1857 $ hg bookmarks --inactive bar
1858 $ hg log --template "{rev} {currentbookmark}\n"
1859 2
1860 1
1861 0
1853
1862
1854 Test stringify on sub expressions
1863 Test stringify on sub expressions
1855
1864
@@ -1859,3 +1868,119 b' Test stringify on sub expressions'
1859 $ hg log -R a -r 8 --template '{strip(if("1", if("1", "-abc-")), if("1", if("1", "-")))}\n'
1868 $ hg log -R a -r 8 --template '{strip(if("1", if("1", "-abc-")), if("1", if("1", "-")))}\n'
1860 abc
1869 abc
1861
1870
1871 Test splitlines
1872
1873 $ hg log -Gv -R a --template "{splitlines(desc) % 'foo {line}\n'}"
1874 @ foo future
1875 |
1876 o foo third
1877 |
1878 o foo second
1879
1880 o foo merge
1881 |\
1882 | o foo new head
1883 | |
1884 o | foo new branch
1885 |/
1886 o foo no user, no domain
1887 |
1888 o foo no person
1889 |
1890 o foo other 1
1891 | foo other 2
1892 | foo
1893 | foo other 3
1894 o foo line 1
1895 foo line 2
1896
1897 Test startswith
1898 $ hg log -Gv -R a --template "{startswith(desc)}"
1899 hg: parse error: startswith expects two arguments
1900 [255]
1901
1902 $ hg log -Gv -R a --template "{startswith('line', desc)}"
1903 @
1904 |
1905 o
1906 |
1907 o
1908
1909 o
1910 |\
1911 | o
1912 | |
1913 o |
1914 |/
1915 o
1916 |
1917 o
1918 |
1919 o
1920 |
1921 o line 1
1922 line 2
1923
1924 Test bad template with better error message
1925
1926 $ hg log -Gv -R a --template '{desc|user()}'
1927 hg: parse error: expected a symbol, got 'func'
1928 [255]
1929
1930 Test word function (including index out of bounds graceful failure)
1931
1932 $ hg log -Gv -R a --template "{word('1', desc)}"
1933 @
1934 |
1935 o
1936 |
1937 o
1938
1939 o
1940 |\
1941 | o head
1942 | |
1943 o | branch
1944 |/
1945 o user,
1946 |
1947 o person
1948 |
1949 o 1
1950 |
1951 o 1
1952
1953
1954 Test word third parameter used as splitter
1955
1956 $ hg log -Gv -R a --template "{word('0', desc, 'o')}"
1957 @ future
1958 |
1959 o third
1960 |
1961 o sec
1962
1963 o merge
1964 |\
1965 | o new head
1966 | |
1967 o | new branch
1968 |/
1969 o n
1970 |
1971 o n
1972 |
1973 o
1974 |
1975 o line 1
1976 line 2
1977
1978 Test word error messages for not enough and too many arguments
1979
1980 $ hg log -Gv -R a --template "{word('0')}"
1981 hg: parse error: word expects two or three arguments, got 1
1982 [255]
1983
1984 $ hg log -Gv -R a --template "{word('0', desc, 'o', 'h', 'b', 'o', 'y')}"
1985 hg: parse error: word expects two or three arguments, got 7
1986 [255]
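The `{word(...)}` tests above exercise an index that may fall out of bounds and an optional third argument used as the splitter. A small standalone Python sketch of that behavior (the helper name is illustrative, not Mercurial's implementation):

```python
def word(n, text, splitter=None):
    # Return the n-th whitespace- (or splitter-) separated word, or ''
    # when the index is out of bounds, mirroring the graceful failure
    # the {word(...)} template tests above demonstrate.
    parts = text.split(splitter)
    if 0 <= n < len(parts):
        return parts[n]
    return ''
```

For example, `word(1, 'line 1')` yields `'1'`, and `word(0, 'second', 'o')` yields `'sec'`, matching the test output above.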
@@ -341,10 +341,9 b" if __name__ == '__main__':"
341 check(mqoutsidechanges)
341 check(mqoutsidechanges)
342 dbg = open('dbgui.py', 'w')
342 dbg = open('dbgui.py', 'w')
343 dbg.write('from mercurial import cmdutil, commands\n'
343 dbg.write('from mercurial import cmdutil, commands\n'
344 'commands.norepo += " debuggetpass"\n'
345 'cmdtable = {}\n'
344 'cmdtable = {}\n'
346 'command = cmdutil.command(cmdtable)\n'
345 'command = cmdutil.command(cmdtable)\n'
347 '@command("debuggetpass")\n'
346 '@command("debuggetpass", norepo=True)\n'
348 'def debuggetpass(ui):\n'
347 'def debuggetpass(ui):\n'
349 ' ui.write("%s\\n" % ui.getpass())\n')
348 ' ui.write("%s\\n" % ui.getpass())\n')
350 dbg.close()
349 dbg.close()
@@ -80,6 +80,7 b' defaults.shelve=--date "0 0"'
80 defaults.tag=-d "0 0"
80 defaults.tag=-d "0 0"
81 ui.slash=True
81 ui.slash=True
82 ui.interactive=False
82 ui.interactive=False
83 ui.mergemarkers=detailed
83 ui.foo=bar
84 ui.foo=bar
84 ui.nontty=true
85 ui.nontty=true
85 runcommand init foo
86 runcommand init foo
@@ -90,6 +91,7 b' defaults.shelve=--date "0 0"'
90 defaults.tag=-d "0 0"
91 defaults.tag=-d "0 0"
91 ui.slash=True
92 ui.slash=True
92 ui.interactive=False
93 ui.interactive=False
94 ui.mergemarkers=detailed
93 ui.nontty=true
95 ui.nontty=true
94
96
95 testing hookoutput:
97 testing hookoutput:
@@ -177,6 +179,7 b' testing phasecacheafterstrip:'
177
179
178 runcommand update -C 0
180 runcommand update -C 0
179 1 files updated, 0 files merged, 2 files removed, 0 files unresolved
181 1 files updated, 0 files merged, 2 files removed, 0 files unresolved
182 (leaving bookmark bm3)
180 runcommand commit -Am. a
183 runcommand commit -Am. a
181 created new head
184 created new head
182 runcommand log -Gq
185 runcommand log -Gq
@@ -586,9 +586,10 b' Amend a merge changeset (with renames an'
586 merging cc incomplete! (edit conflicts, then use 'hg resolve --mark')
586 merging cc incomplete! (edit conflicts, then use 'hg resolve --mark')
587 [1]
587 [1]
588 $ hg resolve -m cc
588 $ hg resolve -m cc
589 no more unresolved files
589 $ hg ci -m 'merge bar'
590 $ hg ci -m 'merge bar'
590 $ hg log --config diff.git=1 -pr .
591 $ hg log --config diff.git=1 -pr .
591 changeset: 23:d51446492733
592 changeset: 23:93cd4445f720
592 tag: tip
593 tag: tip
593 parent: 22:30d96aeaf27b
594 parent: 22:30d96aeaf27b
594 parent: 21:1aa437659d19
595 parent: 21:1aa437659d19
@@ -603,11 +604,11 b' Amend a merge changeset (with renames an'
603 --- a/cc
604 --- a/cc
604 +++ b/cc
605 +++ b/cc
605 @@ -1,1 +1,5 @@
606 @@ -1,1 +1,5 @@
606 +<<<<<<< local
607 +<<<<<<< local: 30d96aeaf27b - test: aa
607 dd
608 dd
608 +=======
609 +=======
609 +cc
610 +cc
610 +>>>>>>> other
611 +>>>>>>> other: 1aa437659d19 bar - test: aazzcc
611 diff --git a/z b/zz
612 diff --git a/z b/zz
612 rename from z
613 rename from z
613 rename to zz
614 rename to zz
@@ -620,7 +621,7 b' Amend a merge changeset (with renames an'
620 cc not renamed
621 cc not renamed
621 $ hg ci --amend -m 'merge bar (amend message)'
622 $ hg ci --amend -m 'merge bar (amend message)'
622 $ hg log --config diff.git=1 -pr .
623 $ hg log --config diff.git=1 -pr .
623 changeset: 24:59de3dce7a79
624 changeset: 24:832b50f2c271
624 tag: tip
625 tag: tip
625 parent: 22:30d96aeaf27b
626 parent: 22:30d96aeaf27b
626 parent: 21:1aa437659d19
627 parent: 21:1aa437659d19
@@ -635,11 +636,11 b' Amend a merge changeset (with renames an'
635 --- a/cc
636 --- a/cc
636 +++ b/cc
637 +++ b/cc
637 @@ -1,1 +1,5 @@
638 @@ -1,1 +1,5 @@
638 +<<<<<<< local
639 +<<<<<<< local: 30d96aeaf27b - test: aa
639 dd
640 dd
640 +=======
641 +=======
641 +cc
642 +cc
642 +>>>>>>> other
643 +>>>>>>> other: 1aa437659d19 bar - test: aazzcc
643 diff --git a/z b/zz
644 diff --git a/z b/zz
644 rename from z
645 rename from z
645 rename to zz
646 rename to zz
@@ -653,7 +654,7 b' Amend a merge changeset (with renames an'
653 $ hg mv zz z
654 $ hg mv zz z
654 $ hg ci --amend -m 'merge bar (undo rename)'
655 $ hg ci --amend -m 'merge bar (undo rename)'
655 $ hg log --config diff.git=1 -pr .
656 $ hg log --config diff.git=1 -pr .
656 changeset: 26:7fb89c461f81
657 changeset: 26:bdafc5c72f74
657 tag: tip
658 tag: tip
658 parent: 22:30d96aeaf27b
659 parent: 22:30d96aeaf27b
659 parent: 21:1aa437659d19
660 parent: 21:1aa437659d19
@@ -668,11 +669,11 b' Amend a merge changeset (with renames an'
668 --- a/cc
669 --- a/cc
669 +++ b/cc
670 +++ b/cc
670 @@ -1,1 +1,5 @@
671 @@ -1,1 +1,5 @@
671 +<<<<<<< local
672 +<<<<<<< local: 30d96aeaf27b - test: aa
672 dd
673 dd
673 +=======
674 +=======
674 +cc
675 +cc
675 +>>>>>>> other
676 +>>>>>>> other: 1aa437659d19 bar - test: aazzcc
676
677
677 $ hg debugrename z
678 $ hg debugrename z
678 z not renamed
679 z not renamed
@@ -689,9 +690,9 b' Amend a merge changeset (with renames du'
689 $ echo aa >> aaa
690 $ echo aa >> aaa
690 $ hg ci -m 'merge bar again'
691 $ hg ci -m 'merge bar again'
691 $ hg log --config diff.git=1 -pr .
692 $ hg log --config diff.git=1 -pr .
692 changeset: 28:982d7a34ffee
693 changeset: 28:32f19415b634
693 tag: tip
694 tag: tip
694 parent: 26:7fb89c461f81
695 parent: 26:bdafc5c72f74
695 parent: 27:4c94d5bc65f5
696 parent: 27:4c94d5bc65f5
696 user: test
697 user: test
697 date: Thu Jan 01 00:00:00 1970 +0000
698 date: Thu Jan 01 00:00:00 1970 +0000
@@ -724,9 +725,9 b' Amend a merge changeset (with renames du'
724 $ hg mv aaa aa
725 $ hg mv aaa aa
725 $ hg ci --amend -m 'merge bar again (undo rename)'
726 $ hg ci --amend -m 'merge bar again (undo rename)'
726 $ hg log --config diff.git=1 -pr .
727 $ hg log --config diff.git=1 -pr .
727 changeset: 30:522688c0e71b
728 changeset: 30:1e2a06b3d312
728 tag: tip
729 tag: tip
729 parent: 26:7fb89c461f81
730 parent: 26:bdafc5c72f74
730 parent: 27:4c94d5bc65f5
731 parent: 27:4c94d5bc65f5
731 user: test
732 user: test
732 date: Thu Jan 01 00:00:00 1970 +0000
733 date: Thu Jan 01 00:00:00 1970 +0000
@@ -764,9 +765,9 b' Amend a merge changeset (with manifest-l'
764 use (c)hanged version or (d)elete? c
765 use (c)hanged version or (d)elete? c
765 $ hg ci -m 'merge bar (with conflicts)'
766 $ hg ci -m 'merge bar (with conflicts)'
766 $ hg log --config diff.git=1 -pr .
767 $ hg log --config diff.git=1 -pr .
767 changeset: 33:5f9904c491b8
768 changeset: 33:97a298b0c59f
768 tag: tip
769 tag: tip
769 parent: 32:01780b896f58
770 parent: 32:3d78ce4226b8
770 parent: 31:67db8847a540
771 parent: 31:67db8847a540
771 user: test
772 user: test
772 date: Thu Jan 01 00:00:00 1970 +0000
773 date: Thu Jan 01 00:00:00 1970 +0000
@@ -776,9 +777,9 b' Amend a merge changeset (with manifest-l'
776 $ hg rm aa
777 $ hg rm aa
777 $ hg ci --amend -m 'merge bar (with conflicts, amended)'
778 $ hg ci --amend -m 'merge bar (with conflicts, amended)'
778 $ hg log --config diff.git=1 -pr .
779 $ hg log --config diff.git=1 -pr .
779 changeset: 35:6ce0c89781a3
780 changeset: 35:6de0c1bde1c8
780 tag: tip
781 tag: tip
781 parent: 32:01780b896f58
782 parent: 32:3d78ce4226b8
782 parent: 31:67db8847a540
783 parent: 31:67db8847a540
783 user: test
784 user: test
784 date: Thu Jan 01 00:00:00 1970 +0000
785 date: Thu Jan 01 00:00:00 1970 +0000
@@ -41,6 +41,7 b' Correct the conflict without marking the'
41 Mark the conflict as resolved and commit
41 Mark the conflict as resolved and commit
42
42
43 $ hg resolve -m A
43 $ hg resolve -m A
44 no more unresolved files
44 $ hg commit -m "Merged"
45 $ hg commit -m "Merged"
45
46
46 $ cd ..
47 $ cd ..
@@ -354,6 +354,48 b' test saving last-message.txt'
354
354
355 test saving last-message.txt
355 test saving last-message.txt
356
356
357 test that '[committemplate] changeset' definition and commit log
358 specific template keywords work well
359
360 $ cat >> .hg/hgrc <<EOF
361 > [committemplate]
362 > changeset = HG: this is customized commit template
363 > HG: {extramsg}
364 > {if(currentbookmark,
365 > "HG: bookmark '{currentbookmark}' is activated\n",
366 > "HG: no bookmark is activated\n")}{subrepos %
367 > "HG: subrepo '{subrepo}' is changed\n"}
368 > EOF
369
370 $ hg init sub2
371 $ echo a > sub2/a
372 $ hg -R sub2 add sub2/a
373 $ echo 'sub2 = sub2' >> .hgsub
374
375 $ HGEDITOR=cat hg commit -S -q
376 HG: this is customized commit template
377 HG: Leave message empty to abort commit.
378 HG: bookmark 'currentbookmark' is activated
379 HG: subrepo 'sub' is changed
380 HG: subrepo 'sub2' is changed
381 abort: empty commit message
382 [255]
383
384 $ hg bookmark --inactive currentbookmark
385 $ hg forget .hgsub
386 $ HGEDITOR=cat hg commit -q
387 HG: this is customized commit template
388 HG: Leave message empty to abort commit.
389 HG: no bookmark is activated
390 abort: empty commit message
391 [255]
392
393 $ cat >> .hg/hgrc <<EOF
394 > # disable customizing for subsequent tests
395 > [committemplate]
396 > changeset =
397 > EOF
398
357 $ cd ..
399 $ cd ..
358
400
359
401
@@ -212,10 +212,10 b' Show all commands + options'
212 serve: accesslog, daemon, daemon-pipefds, errorlog, port, address, prefix, name, web-conf, webdir-conf, pid-file, stdio, cmdserver, templates, style, ipv6, certificate
212 serve: accesslog, daemon, daemon-pipefds, errorlog, port, address, prefix, name, web-conf, webdir-conf, pid-file, stdio, cmdserver, templates, style, ipv6, certificate
213 status: all, modified, added, removed, deleted, clean, unknown, ignored, no-status, copies, print0, rev, change, include, exclude, subrepos
213 status: all, modified, added, removed, deleted, clean, unknown, ignored, no-status, copies, print0, rev, change, include, exclude, subrepos
214 summary: remote
214 summary: remote
215 update: clean, check, date, rev
215 update: clean, check, date, rev, tool
216 addremove: similarity, include, exclude, dry-run
216 addremove: similarity, include, exclude, dry-run
217 archive: no-decode, prefix, rev, type, subrepos, include, exclude
217 archive: no-decode, prefix, rev, type, subrepos, include, exclude
218 backout: merge, parent, rev, tool, include, exclude, message, logfile, date, user
218 backout: merge, parent, rev, edit, tool, include, exclude, message, logfile, date, user
219 bisect: reset, good, bad, skip, extend, command, noupdate
219 bisect: reset, good, bad, skip, extend, command, noupdate
220 bookmarks: force, rev, delete, rename, inactive
220 bookmarks: force, rev, delete, rename, inactive
221 branch: force, clean
221 branch: force, clean
@@ -262,7 +262,7 b' Show all commands + options'
262 heads: rev, topo, active, closed, style, template
262 heads: rev, topo, active, closed, style, template
263 help: extension, command, keyword
263 help: extension, command, keyword
264 identify: rev, num, id, branch, tags, bookmarks, ssh, remotecmd, insecure
264 identify: rev, num, id, branch, tags, bookmarks, ssh, remotecmd, insecure
265 import: strip, base, edit, force, no-commit, bypass, exact, import-branch, message, logfile, date, user, similarity
265 import: strip, base, edit, force, no-commit, bypass, partial, exact, import-branch, message, logfile, date, user, similarity
266 incoming: force, newest-first, bundle, rev, bookmarks, branch, patch, git, limit, no-merges, stat, graph, style, template, ssh, remotecmd, insecure, subrepos
266 incoming: force, newest-first, bundle, rev, bookmarks, branch, patch, git, limit, no-merges, stat, graph, style, template, ssh, remotecmd, insecure, subrepos
267 locate: rev, print0, fullpath, include, exclude
267 locate: rev, print0, fullpath, include, exclude
268 manifest: rev, all
268 manifest: rev, all
@@ -1,12 +1,36 b''
1 $ hg init
1 $ hg init
2 $ echo "nothing" > a
2 $ cat << EOF > a
3 > Small Mathematical Series.
4 > One
5 > Two
6 > Three
7 > Four
8 > Five
9 > Hop we are done.
10 > EOF
3 $ hg add a
11 $ hg add a
4 $ hg commit -m ancestor
12 $ hg commit -m ancestor
5 $ echo "something" > a
13 $ cat << EOF > a
14 > Small Mathematical Series.
15 > 1
16 > 2
17 > 3
18 > 4
19 > 5
20 > Hop we are done.
21 > EOF
6 $ hg commit -m branch1
22 $ hg commit -m branch1
7 $ hg co 0
23 $ hg co 0
8 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
24 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
9 $ echo "something else" > a
25 $ cat << EOF > a
26 > Small Mathematical Series.
27 > 1
28 > 2
29 > 3
30 > 6
31 > 8
32 > Hop we are done.
33 > EOF
10 $ hg commit -m branch2
34 $ hg commit -m branch2
11 created new head
35 created new head
12
36
@@ -19,15 +43,158 b''
19 [1]
43 [1]
20
44
21 $ hg id
45 $ hg id
22 32e80765d7fe+75234512624c+ tip
46 618808747361+c0c68e4fe667+ tip
23
47
24 $ cat a
48 $ cat a
25 <<<<<<< local
49 Small Mathematical Series.
26 something else
50 <<<<<<< local: 618808747361 - test: branch2
51 1
52 2
53 3
54 6
55 8
27 =======
56 =======
28 something
57 1
29 >>>>>>> other
58 2
59 3
60 4
61 5
62 >>>>>>> other: c0c68e4fe667 - test: branch1
63 Hop we are done.
30
64
31 $ hg status
65 $ hg status
32 M a
66 M a
33 ? a.orig
67 ? a.orig
68
69 Verify custom conflict markers
70
71 $ hg up -q --clean .
72 $ printf "\n[ui]\nmergemarkertemplate={author} {rev}\n" >> .hg/hgrc
73
74 $ hg merge 1
75 merging a
76 warning: conflicts during merge.
77 merging a incomplete! (edit conflicts, then use 'hg resolve --mark')
78 0 files updated, 0 files merged, 0 files removed, 1 files unresolved
79 use 'hg resolve' to retry unresolved file merges or 'hg update -C .' to abandon
80 [1]
81
82 $ cat a
83 Small Mathematical Series.
84 <<<<<<< local: test 2
85 1
86 2
87 3
88 6
89 8
90 =======
91 1
92 2
93 3
94 4
95 5
96 >>>>>>> other: test 1
97 Hop we are done.
98
99 Verify that a multi-line custom conflict marker template is split and only its first line is used
100
101 $ hg up -q --clean .
102 $ cat >> .hg/hgrc <<EOF
103 > [ui]
104 > mergemarkertemplate={author} {rev}\nfoo\nbar\nbaz
105 > EOF
106
107 $ hg -q merge 1
108 warning: conflicts during merge.
109 merging a incomplete! (edit conflicts, then use 'hg resolve --mark')
110 [1]
111
112 $ cat a
113 Small Mathematical Series.
114 <<<<<<< local: test 2
115 1
116 2
117 3
118 6
119 8
120 =======
121 1
122 2
123 3
124 4
125 5
126 >>>>>>> other: test 1
127 Hop we are done.
128
129 Verify line trimming of custom conflict marker using multi-byte characters
130
131 $ hg up -q --clean .
132 $ python <<EOF
133 > fp = open('logfile', 'w')
134 > fp.write('12345678901234567890123456789012345678901234567890' +
135 > '1234567890') # there are 5 more columns for 80 columns
136 >
137 > # 2 x 4 = 8 columns, but 3 x 4 = 12 bytes
138 > fp.write(u'\u3042\u3044\u3046\u3048'.encode('utf-8'))
139 >
140 > fp.close()
141 > EOF
142 $ hg add logfile
143 $ hg --encoding utf-8 commit --logfile logfile
144
145 $ cat >> .hg/hgrc <<EOF
146 > [ui]
147 > mergemarkertemplate={desc|firstline}
148 > EOF
149
150 $ hg -q --encoding utf-8 merge 1
151 warning: conflicts during merge.
152 merging a incomplete! (edit conflicts, then use 'hg resolve --mark')
153 [1]
154
155 $ cat a
156 Small Mathematical Series.
157 <<<<<<< local: 123456789012345678901234567890123456789012345678901234567890\xe3\x81\x82... (esc)
158 1
159 2
160 3
161 6
162 8
163 =======
164 1
165 2
166 3
167 4
168 5
169 >>>>>>> other: branch1
170 Hop we are done.
171
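The `...` ellipsis in the marker above shows the label being trimmed so the whole line fits in 80 columns, with wide (two-column) characters counted correctly and the cut made at a character boundary rather than mid-byte. A rough sketch of column-aware trimming under those assumptions (illustrative only; the helper names here are hypothetical, not Mercurial's internals):

```python
# Trim a label to a column budget, counting East Asian full-width and
# wide characters as two columns, never splitting a character.
import unicodedata

def colwidth(s):
    # Full-width ('F') and wide ('W') characters occupy two columns.
    return sum(2 if unicodedata.east_asian_width(c) in 'FW' else 1
               for c in s)

def trim(label, maxcol, ellipsis='...'):
    if colwidth(label) <= maxcol:
        return label
    budget = maxcol - len(ellipsis)
    out, used = [], 0
    for ch in label:
        w = 2 if unicodedata.east_asian_width(ch) in 'FW' else 1
        if used + w > budget:
            break
        out.append(ch)
        used += w
    return ''.join(out) + ellipsis
```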
172 Verify basic conflict markers
173
174 $ hg up -q --clean 2
175 $ printf "\n[ui]\nmergemarkers=basic\n" >> .hg/hgrc
176
177 $ hg merge 1
178 merging a
179 warning: conflicts during merge.
180 merging a incomplete! (edit conflicts, then use 'hg resolve --mark')
181 0 files updated, 0 files merged, 0 files removed, 1 files unresolved
182 use 'hg resolve' to retry unresolved file merges or 'hg update -C .' to abandon
183 [1]
184
185 $ cat a
186 Small Mathematical Series.
187 <<<<<<< local
188 1
189 2
190 3
191 6
192 8
193 =======
194 1
195 2
196 3
197 4
198 5
199 >>>>>>> other
200 Hop we are done.
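For reference, the `<<<<<<<` / `=======` / `>>>>>>>` lines checked throughout these tests are ordinary text in the working copy; a tool can flag an unresolved merge with a plain scan like the following (an illustrative sketch, not anything Mercurial ships):

```python
# Detect leftover conflict markers in file content. The exact prefixes
# mirror the markers shown in the test output above.
def has_conflict_markers(text):
    prefixes = ('<<<<<<< ', '=======', '>>>>>>> ')
    return any(line.startswith(p)
               for line in text.splitlines()
               for p in prefixes)
```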
@@ -21,7 +21,7 b' print "workingfilectx.date =", repo[None'
21 # test memctx with non-ASCII commit message
21 # test memctx with non-ASCII commit message
22
22
23 def filectxfn(repo, memctx, path):
23 def filectxfn(repo, memctx, path):
24 return context.memfilectx("foo", "")
24 return context.memfilectx(repo, "foo", "")
25
25
26 ctx = context.memctx(repo, ['tip', None],
26 ctx = context.memctx(repo, ['tip', None],
27 encoding.tolocal("Gr\xc3\xbcezi!"),
27 encoding.tolocal("Gr\xc3\xbcezi!"),
@@ -30,3 +30,23 b' ctx.commit()'
30 for enc in "ASCII", "Latin-1", "UTF-8":
30 for enc in "ASCII", "Latin-1", "UTF-8":
31 encoding.encoding = enc
31 encoding.encoding = enc
32 print "%-8s: %s" % (enc, repo["tip"].description())
32 print "%-8s: %s" % (enc, repo["tip"].description())
33
34 # test performing a status
35
36 def getfilectx(repo, memctx, f):
37 fctx = memctx.parents()[0][f]
38 data, flags = fctx.data(), fctx.flags()
39 if f == 'foo':
40 data += 'bar\n'
41 return context.memfilectx(repo, f, data, 'l' in flags, 'x' in flags)
42
43 ctxa = repo.changectx(0)
44 ctxb = context.memctx(repo, [ctxa.node(), None], "test diff", ["foo"],
45 getfilectx, ctxa.user(), ctxa.date())
46
47 print ctxb.status(ctxa)
48
49 # test performing a diff on a memctx
50
51 for d in ctxb.diff(ctxa, git=True):
52 print d
@@ -2,3 +2,12 b' workingfilectx.date = (1000, 0)'
2 ASCII : Gr?ezi!
2 ASCII : Gr?ezi!
3 Latin-1 : Gr�ezi!
3 Latin-1 : Gr�ezi!
4 UTF-8 : Grüezi!
4 UTF-8 : Grüezi!
5 (['foo'], [], [], [], [], [], [])
6 diff --git a/foo b/foo
7
8 --- a/foo
9 +++ b/foo
10 @@ -1,1 +1,2 @@
11 foo
12 +bar
13
@@ -16,8 +16,10 b''
16 $ echo file > foo/file
16 $ echo file > foo/file
17 $ hg ci -qAm 'add foo/file'
17 $ hg ci -qAm 'add foo/file'
18 $ hg tag some-tag
18 $ hg tag some-tag
19 $ hg tag -l local-tag
19 $ hg log
20 $ hg log
20 changeset: 3:593cbf6fb2b4
21 changeset: 3:593cbf6fb2b4
22 tag: local-tag
21 tag: tip
23 tag: tip
22 user: test
24 user: test
23 date: Thu Jan 01 00:00:00 1970 +0000
25 date: Thu Jan 01 00:00:00 1970 +0000
@@ -390,3 +392,148 b' More source changes'
390 o 0 a4a1dae0fe35 "1: add a and dir/b" files: 0 a
392 o 0 a4a1dae0fe35 "1: add a and dir/b" files: 0 a
391
393
392 $ cd ..
394 $ cd ..
395
396 Two way tests
397
398 $ hg init 0
399 $ echo f > 0/f
400 $ echo a > 0/a-only
401 $ echo b > 0/b-only
402 $ hg -R 0 ci -Aqm0
403
404 $ cat << EOF > filemap-a
405 > exclude b-only
406 > EOF
407 $ cat << EOF > filemap-b
408 > exclude a-only
409 > EOF
410 $ hg convert --filemap filemap-a 0 a
411 initializing destination a repository
412 scanning source...
413 sorting...
414 converting...
415 0 0
416 $ hg -R a up -q
417 $ echo a > a/f
418 $ hg -R a ci -ma
419
420 $ hg convert --filemap filemap-b 0 b
421 initializing destination b repository
422 scanning source...
423 sorting...
424 converting...
425 0 0
426 $ hg -R b up -q
427 $ echo b > b/f
428 $ hg -R b ci -mb
429
430 $ tail */.hg/shamap
431 ==> 0/.hg/shamap <==
432 86f3f774ffb682bffb5dc3c1d3b3da637cb9a0d6 8a028c7c77f6c7bd6d63bc3f02ca9f779eabf16a
433 dd9f218eb91fb857f2a62fe023e1d64a4e7812fe 8a028c7c77f6c7bd6d63bc3f02ca9f779eabf16a
434
435 ==> a/.hg/shamap <==
436 8a028c7c77f6c7bd6d63bc3f02ca9f779eabf16a 86f3f774ffb682bffb5dc3c1d3b3da637cb9a0d6
437
438 ==> b/.hg/shamap <==
439 8a028c7c77f6c7bd6d63bc3f02ca9f779eabf16a dd9f218eb91fb857f2a62fe023e1d64a4e7812fe
440
441 $ hg convert a 0
442 scanning source...
443 sorting...
444 converting...
445 0 a
446
447 $ hg convert b 0
448 scanning source...
449 sorting...
450 converting...
451 0 b
452
453 $ hg -R 0 log -G
454 o changeset: 2:637fbbbe96b6
455 | tag: tip
456 | parent: 0:8a028c7c77f6
457 | user: test
458 | date: Thu Jan 01 00:00:00 1970 +0000
459 | summary: b
460 |
461 | o changeset: 1:ec7b9c96e692
462 |/ user: test
463 | date: Thu Jan 01 00:00:00 1970 +0000
464 | summary: a
465 |
466 @ changeset: 0:8a028c7c77f6
467 user: test
468 date: Thu Jan 01 00:00:00 1970 +0000
469 summary: 0
470
471 $ hg convert --filemap filemap-b 0 a --config convert.hg.revs=1::
472 scanning source...
473 sorting...
474 converting...
475
476 $ hg -R 0 up -r1
477 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
478 $ echo f >> 0/f
479 $ hg -R 0 ci -mx
480
481 $ hg convert --filemap filemap-b 0 a --config convert.hg.revs=1::
482 scanning source...
483 sorting...
484 converting...
485 0 x
486
487 $ hg -R a log -G -T '{rev} {desc|firstline} ({files})\n'
488 o 2 x (f)
489 |
490 @ 1 a (f)
491 |
492 o 0 0 (a-only f)
493
494 $ hg -R a mani -r tip
495 a-only
496 f
497
498 An additional round, demonstrating that unchanged files don't get converted
499
500 $ echo f >> 0/f
501 $ echo f >> 0/a-only
502 $ hg -R 0 ci -m "extra f+a-only change"
503
504 $ hg convert --filemap filemap-b 0 a --config convert.hg.revs=1::
505 scanning source...
506 sorting...
507 converting...
508 0 extra f+a-only change
509
510 $ hg -R a log -G -T '{rev} {desc|firstline} ({files})\n'
511 o 3 extra f+a-only change (f)
512 |
513 o 2 x (f)
514 |
515 @ 1 a (f)
516 |
517 o 0 0 (a-only f)
518
519
520 Conversion after rollback
521
522 $ hg -R a rollback -f
523 repository tip rolled back to revision 2 (undo commit)
524
525 $ hg convert --filemap filemap-b 0 a --config convert.hg.revs=1::
526 scanning source...
527 sorting...
528 converting...
529 0 extra f+a-only change
530
531 $ hg -R a log -G -T '{rev} {desc|firstline} ({files})\n'
532 o 3 extra f+a-only change (f)
533 |
534 o 2 x (f)
535 |
536 @ 1 a (f)
537 |
538 o 0 0 (a-only f)
539
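The `exclude` lines in filemap-a and filemap-b above simply drop the named paths (and anything under them) from the converted repository. A small sketch of that filtering with hypothetical helper names (the real convert extension's filemaps also support `include` and `rename`, which this sketch omits):

```python
# Parse "exclude <path>" lines from a filemap and filter a file list.
# Illustrative only; not the convert extension's actual parser.
def parse_filemap(text):
    excludes = set()
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 2 and parts[0] == 'exclude':
            excludes.add(parts[1])
    return excludes

def filter_files(files, excludes):
    # A file is dropped if it equals an excluded path or lives under it.
    def excluded(f):
        return any(f == e or f.startswith(e + '/') for e in excludes)
    return [f for f in files if not excluded(f)]
```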
@@ -24,6 +24,7 b''
24 $ hg ci -m 'merge local copy' -d '3 0'
24 $ hg ci -m 'merge local copy' -d '3 0'
25 $ hg up -C 1
25 $ hg up -C 1
26 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
26 1 files updated, 0 files merged, 1 files removed, 0 files unresolved
27 (leaving bookmark premerge1)
27 $ hg bookmark premerge2
28 $ hg bookmark premerge2
28 $ hg merge 2
29 $ hg merge 2
29 merging foo and baz to baz
30 merging foo and baz to baz
@@ -352,6 +352,7 b' Branchy history'
352 [1]
352 [1]
353 $ hg --cwd b revert -r 2 b
353 $ hg --cwd b revert -r 2 b
354 $ hg --cwd b resolve -m b
354 $ hg --cwd b resolve -m b
355 no more unresolved files
355 $ hg --cwd b ci -d '5 0' -m 'merge'
356 $ hg --cwd b ci -d '5 0' -m 'merge'
356
357
357 Expect 4 changes
358 Expect 4 changes
@@ -31,16 +31,16 b''
31 resolving manifests
31 resolving manifests
32 branchmerge: True, force: False, partial: False
32 branchmerge: True, force: False, partial: False
33 ancestor: b8bf91eeebbc, local: add3f11052fa+, remote: 17c05bb7fcb6
33 ancestor: b8bf91eeebbc, local: add3f11052fa+, remote: 17c05bb7fcb6
34 preserving a for resolve of b
35 preserving a for resolve of c
36 removing a
34 b: remote moved from a -> m
37 b: remote moved from a -> m
35 preserving a for resolve of b
36 c: remote moved from a -> m
37 preserving a for resolve of c
38 removing a
39 updating: b 1/2 files (50.00%)
38 updating: b 1/2 files (50.00%)
40 picked tool 'internal:merge' for b (binary False symlink False)
39 picked tool 'internal:merge' for b (binary False symlink False)
41 merging a and b to b
40 merging a and b to b
42 my b@add3f11052fa+ other b@17c05bb7fcb6 ancestor a@b8bf91eeebbc
41 my b@add3f11052fa+ other b@17c05bb7fcb6 ancestor a@b8bf91eeebbc
43 premerge successful
42 premerge successful
43 c: remote moved from a -> m
44 updating: c 2/2 files (100.00%)
44 updating: c 2/2 files (100.00%)
45 picked tool 'internal:merge' for c (binary False symlink False)
45 picked tool 'internal:merge' for c (binary False symlink False)
46 merging a and c to c
46 merging a and c to c
@@ -37,4 +37,8 b''
37
37
38 $ hg diff --git -r 0 -r 2
38 $ hg diff --git -r 0 -r 2
39
39
40 $ hg diff --config diff.nobinary=True --git -r 0 -r 1
41 diff --git a/binfile.bin b/binfile.bin
42 Binary file binfile.bin has changed
43
40 $ cd ..
44 $ cd ..
@@ -35,15 +35,15 b" we get conflicts that shouldn't be there"
35 resolving manifests
35 resolving manifests
36 branchmerge: True, force: False, partial: False
36 branchmerge: True, force: False, partial: False
37 ancestor: e6dc8efe11cc, local: 6a0df1dad128+, remote: 484bf6903104
37 ancestor: e6dc8efe11cc, local: 6a0df1dad128+, remote: 484bf6903104
38 preserving foo for resolve of bar
39 preserving foo for resolve of foo
38 bar: remote copied from foo -> m
40 bar: remote copied from foo -> m
39 preserving foo for resolve of bar
40 foo: versions differ -> m
41 preserving foo for resolve of foo
42 updating: bar 1/2 files (50.00%)
41 updating: bar 1/2 files (50.00%)
43 picked tool 'internal:merge' for bar (binary False symlink False)
42 picked tool 'internal:merge' for bar (binary False symlink False)
44 merging foo and bar to bar
43 merging foo and bar to bar
45 my bar@6a0df1dad128+ other bar@484bf6903104 ancestor foo@e6dc8efe11cc
44 my bar@6a0df1dad128+ other bar@484bf6903104 ancestor foo@e6dc8efe11cc
46 premerge successful
45 premerge successful
46 foo: versions differ -> m
47 updating: foo 2/2 files (100.00%)
47 updating: foo 2/2 files (100.00%)
48 picked tool 'internal:merge' for foo (binary False symlink False)
48 picked tool 'internal:merge' for foo (binary False symlink False)
49 merging foo
49 merging foo
@@ -16,19 +16,18 b' Test alignment of multibyte characters'
16 > f = file('l', 'w'); f.write(l); f.close()
16 > f = file('l', 'w'); f.write(l); f.close()
17 > # instant extension to show list of options
17 > # instant extension to show list of options
18 > f = file('showoptlist.py', 'w'); f.write("""# encoding: utf-8
18 > f = file('showoptlist.py', 'w'); f.write("""# encoding: utf-8
19 > from mercurial import cmdutil
20 > cmdtable = {}
21 > command = cmdutil.command(cmdtable)
22 >
23 > @command('showoptlist',
24 > [('s', 'opt1', '', 'short width' + ' %(s)s' * 8, '%(s)s'),
25 > ('m', 'opt2', '', 'middle width' + ' %(m)s' * 8, '%(m)s'),
26 > ('l', 'opt3', '', 'long width' + ' %(l)s' * 8, '%(l)s')],
27 > '')
19 > def showoptlist(ui, repo, *pats, **opts):
28 > def showoptlist(ui, repo, *pats, **opts):
20 > '''dummy command to show option descriptions'''
29 > '''dummy command to show option descriptions'''
21 > return 0
30 > return 0
22 > cmdtable = {
23 > 'showoptlist':
24 > (showoptlist,
25 > [('s', 'opt1', '', 'short width' + ' %(s)s' * 8, '%(s)s'),
26 > ('m', 'opt2', '', 'middle width' + ' %(m)s' * 8, '%(m)s'),
27 > ('l', 'opt3', '', 'long width' + ' %(l)s' * 8, '%(l)s')
28 > ],
29 > ""
30 > )
31 > }
32 > """ % globals())
31 > """ % globals())
33 > f.close()
32 > f.close()
34 > EOF
33 > EOF
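The hunk above ports the test extension from a hand-built `cmdtable` dict to the `cmdutil.command` decorator. A self-contained sketch of that registration pattern, with a stand-in `command` factory (not the real `mercurial.cmdutil`, which the test imports instead):

```python
# Decorator-based command registration: the decorator records each
# function in cmdtable keyed by name, replacing the hand-written dict.
cmdtable = {}

def command(name, options, synopsis):
    # Stand-in for mercurial.cmdutil.command, for illustration only.
    def register(func):
        cmdtable[name] = (func, options, synopsis)
        return func
    return register

@command('showoptlist',
         [('s', 'opt1', '', 'short width', '')],
         '')
def showoptlist(ui, repo, *pats, **opts):
    '''dummy command to show option descriptions'''
    return 0
```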
@@ -6,7 +6,13 b' Test text wrapping for multibyte charact'
6 define commands to display help text
6 define commands to display help text
7
7
8 $ cat << EOF > show.py
8 $ cat << EOF > show.py
9 > from mercurial import cmdutil
10 >
11 > cmdtable = {}
12 > command = cmdutil.command(cmdtable)
13 >
9 > # Japanese full-width characters:
14 > # Japanese full-width characters:
15 > @command('show_full_ja', [], '')
10 > def show_full_ja(ui, **opts):
16 > def show_full_ja(ui, **opts):
11 > u'''\u3042\u3044\u3046\u3048\u304a\u304b\u304d\u304f\u3051 \u3042\u3044\u3046\u3048\u304a\u304b\u304d\u304f\u3051 \u3042\u3044\u3046\u3048\u304a\u304b\u304d\u304f\u3051
17 > u'''\u3042\u3044\u3046\u3048\u304a\u304b\u304d\u304f\u3051 \u3042\u3044\u3046\u3048\u304a\u304b\u304d\u304f\u3051 \u3042\u3044\u3046\u3048\u304a\u304b\u304d\u304f\u3051
12 >
18 >
@@ -16,6 +22,7 b' define commands to display help text'
16 > '''
22 > '''
17 >
23 >
18 > # Japanese half-width characters:
24 > # Japanese half-width characters:
25 > @command('show_half_ja', [], '')
19 > def show_half_ja(ui, *opts):
26 > def show_half_ja(ui, *opts):
20 > u'''\uff71\uff72\uff73\uff74\uff75\uff76\uff77\uff78\uff79 \uff71\uff72\uff73\uff74\uff75\uff76\uff77\uff78\uff79 \uff71\uff72\uff73\uff74\uff75\uff76\uff77\uff78\uff79 \uff71\uff72\uff73\uff74\uff75\uff76\uff77\uff78\uff79
27 > u'''\uff71\uff72\uff73\uff74\uff75\uff76\uff77\uff78\uff79 \uff71\uff72\uff73\uff74\uff75\uff76\uff77\uff78\uff79 \uff71\uff72\uff73\uff74\uff75\uff76\uff77\uff78\uff79 \uff71\uff72\uff73\uff74\uff75\uff76\uff77\uff78\uff79
21 >
28 >
@@ -25,6 +32,7 b' define commands to display help text'
25 > '''
32 > '''
26 >
33 >
27 > # Japanese ambiguous-width characters:
34 > # Japanese ambiguous-width characters:
35 > @command('show_ambig_ja', [], '')
28 > def show_ambig_ja(ui, **opts):
36 > def show_ambig_ja(ui, **opts):
29 > u'''\u03b1\u03b2\u03b3\u03b4\u03c5\u03b6\u03b7\u03b8\u25cb \u03b1\u03b2\u03b3\u03b4\u03c5\u03b6\u03b7\u03b8\u25cb \u03b1\u03b2\u03b3\u03b4\u03c5\u03b6\u03b7\u03b8\u25cb
37 > u'''\u03b1\u03b2\u03b3\u03b4\u03c5\u03b6\u03b7\u03b8\u25cb \u03b1\u03b2\u03b3\u03b4\u03c5\u03b6\u03b7\u03b8\u25cb \u03b1\u03b2\u03b3\u03b4\u03c5\u03b6\u03b7\u03b8\u25cb
30 >
38 >
@@ -34,6 +42,7 b' define commands to display help text'
34 > '''
42 > '''
35 >
43 >
36 > # Russian ambiguous-width characters:
44 > # Russian ambiguous-width characters:
45 > @command('show_ambig_ru', [], '')
37 > def show_ambig_ru(ui, **opts):
46 > def show_ambig_ru(ui, **opts):
38 > u'''\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438 \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438 \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438 \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438 \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438
47 > u'''\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438 \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438 \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438 \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438 \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438
39 >
48 >
@@ -41,13 +50,6 b' define commands to display help text'
41 >
50 >
42 > \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438
51 > \u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438
43 > '''
52 > '''
44 >
45 > cmdtable = {
46 > 'show_full_ja': (show_full_ja, [], ""),
47 > 'show_half_ja': (show_half_ja, [], ""),
48 > 'show_ambig_ja': (show_ambig_ja, [], ""),
49 > 'show_ambig_ru': (show_ambig_ru, [], ""),
50 > }
51 > EOF
53 > EOF
52
54
53 "COLUMNS=60" means that no line is wider than 58 columns
55 "COLUMNS=60" means that no line is wider than 58 columns
1 NO CONTENT: modified file
NO CONTENT: modified file
The requested commit or file is too big and content was truncated.
1 NO CONTENT: file copied from tests/test-run-tests.t to tests/test-unified-test.t