setdiscovery: always add exponential sample to the heads...
Pierre-Yves David
r23813:932f814b default
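Reading the diff, the change itself is small (a net three-line reduction in the first hunk below): `_takefullsample` used to return early with just the heads whenever the requested sample size was already covered by them, so no exponentially spaced nodes were ever added for repositories with very many heads. After this revision the heads merely seed the sample, the exponential sample is always mixed in, and `_limitsample` enforces the size bound. In outline:

    # before: with enough heads, the exponential sample was skipped entirely
    always = dag.headsetofconnecteds(nodes)
    if size <= len(always):
        return always
    sample = always

    # after: heads seed the sample; exponential nodes are always added
    sample = always = dag.headsetofconnecteds(nodes)

The updated expectations in the test diff below show the payoff: discovery against the >200-head repository now converges in 6 queries instead of 7, with the undecided set shrinking much faster per round.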
mercurial/setdiscovery.py
@@ -1,258 +1,255 @@
# setdiscovery.py - improved discovery of common nodeset for mercurial
#
# Copyright 2010 Benoit Boissinot <bboissin@gmail.com>
# and Peter Arrenbrecht <peter@arrenbrecht.ch>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
8 """
8 """
9 Algorithm works in the following way. You have two repository: local and
9 Algorithm works in the following way. You have two repository: local and
10 remote. They both contains a DAG of changelists.
10 remote. They both contains a DAG of changelists.
11
11
12 The goal of the discovery protocol is to find one set of node *common*,
12 The goal of the discovery protocol is to find one set of node *common*,
13 the set of nodes shared by local and remote.
13 the set of nodes shared by local and remote.
14
14
15 One of the issue with the original protocol was latency, it could
15 One of the issue with the original protocol was latency, it could
16 potentially require lots of roundtrips to discover that the local repo was a
16 potentially require lots of roundtrips to discover that the local repo was a
17 subset of remote (which is a very common case, you usually have few changes
17 subset of remote (which is a very common case, you usually have few changes
18 compared to upstream, while upstream probably had lots of development).
18 compared to upstream, while upstream probably had lots of development).
19
19
20 The new protocol only requires one interface for the remote repo: `known()`,
20 The new protocol only requires one interface for the remote repo: `known()`,
21 which given a set of changelists tells you if they are present in the DAG.
21 which given a set of changelists tells you if they are present in the DAG.
22
22
23 The algorithm then works as follow:
23 The algorithm then works as follow:
24
24
25 - We will be using three sets, `common`, `missing`, `unknown`. Originally
25 - We will be using three sets, `common`, `missing`, `unknown`. Originally
26 all nodes are in `unknown`.
26 all nodes are in `unknown`.
27 - Take a sample from `unknown`, call `remote.known(sample)`
27 - Take a sample from `unknown`, call `remote.known(sample)`
28 - For each node that remote knows, move it and all its ancestors to `common`
28 - For each node that remote knows, move it and all its ancestors to `common`
29 - For each node that remote doesn't know, move it and all its descendants
29 - For each node that remote doesn't know, move it and all its descendants
30 to `missing`
30 to `missing`
31 - Iterate until `unknown` is empty
31 - Iterate until `unknown` is empty
32
32
33 There are a couple optimizations, first is instead of starting with a random
33 There are a couple optimizations, first is instead of starting with a random
34 sample of missing, start by sending all heads, in the case where the local
34 sample of missing, start by sending all heads, in the case where the local
35 repo is a subset, you computed the answer in one round trip.
35 repo is a subset, you computed the answer in one round trip.
36
36
37 Then you can do something similar to the bisecting strategy used when
37 Then you can do something similar to the bisecting strategy used when
38 finding faulty changesets. Instead of random samples, you can try picking
38 finding faulty changesets. Instead of random samples, you can try picking
39 nodes that will maximize the number of nodes that will be
39 nodes that will maximize the number of nodes that will be
40 classified with it (since all ancestors or descendants will be marked as well).
40 classified with it (since all ancestors or descendants will be marked as well).
41 """
41 """

from node import nullid, nullrev
from i18n import _
import random
import util, dagutil

def _updatesample(dag, nodes, sample, always, quicksamplesize=0):
    """update an existing sample to match the expected size

    The sample is updated with nodes exponentially distant from each head of the
    <nodes> set. (H~1, H~2, H~4, H~8, etc).

    If a target size is specified, the sampling will stop once this size is
    reached. Otherwise sampling will happen until roots of the <nodes> set are
    reached.

    :dag: a dag object from dagutil
    :nodes: set of nodes we want to discover (if None, assume the whole dag)
    :sample: a sample to update
    :always: set of notable nodes that will be part of the sample anyway
    :quicksamplesize: optional target size of the sample"""
    # if nodes is empty we scan the entire graph
    if nodes:
        heads = dag.headsetofconnecteds(nodes)
    else:
        heads = dag.heads()
    dist = {}
    visit = util.deque(heads)
    seen = set()
    factor = 1
    while visit:
        curr = visit.popleft()
        if curr in seen:
            continue
        d = dist.setdefault(curr, 1)
        if d > factor:
            factor *= 2
        if d == factor:
            if curr not in always: # need this check for the early exit below
                sample.add(curr)
                if quicksamplesize and (len(sample) >= quicksamplesize):
                    return
        seen.add(curr)
        for p in dag.parents(curr):
            if not nodes or p in nodes:
                dist.setdefault(p, d + 1)
                visit.append(p)

def _setupsample(dag, nodes, size):
    always = dag.headsetofconnecteds(nodes)
    desiredlen = size - len(always)
    if desiredlen <= 0:
        # This could be bad if there are very many heads, all unknown to the
        # server. We're counting on long request support here.
        return always, None, desiredlen
    return always, set(), desiredlen

def _takequicksample(dag, nodes, size):
    always, sample, desiredlen = _setupsample(dag, nodes, size)
    if sample is None:
        return always
    _updatesample(dag, None, sample, always, quicksamplesize=desiredlen)
    sample.update(always)
    return sample

def _takefullsample(dag, nodes, size):
-    always = dag.headsetofconnecteds(nodes)
-    if size <= len(always):
-        return always
-    sample = always
+    sample = always = dag.headsetofconnecteds(nodes)
    # update from heads
    _updatesample(dag, nodes, sample, always)
    # update from roots
    _updatesample(dag.inverse(), nodes, sample, always)
    assert sample
    sample.update(always)
    sample = _limitsample(sample, size)
    if len(sample) < size:
        more = size - len(sample)
        sample.update(random.sample(list(nodes - sample), more))
    return sample

def _limitsample(sample, desiredlen):
    """return a random subset of sample of at most desiredlen items"""
    if len(sample) > desiredlen:
        sample = set(random.sample(sample, desiredlen))
    return sample

def findcommonheads(ui, local, remote,
                    initialsamplesize=100,
                    fullsamplesize=200,
                    abortwhenunrelated=True):
    '''Return a tuple (common, anyincoming, remoteheads) used to identify
    missing nodes from or in remote.
    '''
    roundtrips = 0
    cl = local.changelog
    dag = dagutil.revlogdag(cl)

    # early exit if we know all the specified remote heads already
    ui.debug("query 1; heads\n")
    roundtrips += 1
    ownheads = dag.heads()
    sample = _limitsample(ownheads, initialsamplesize)
    # indices between sample and externalized version must match
    sample = list(sample)
    if remote.local():
        # stopgap until we have a proper localpeer that supports batch()
        srvheadhashes = remote.heads()
        yesno = remote.known(dag.externalizeall(sample))
    elif remote.capable('batch'):
        batch = remote.batch()
        srvheadhashesref = batch.heads()
        yesnoref = batch.known(dag.externalizeall(sample))
        batch.submit()
        srvheadhashes = srvheadhashesref.value
        yesno = yesnoref.value
    else:
        # compatibility with pre-batch, but post-known remotes during 1.9
        # development
        srvheadhashes = remote.heads()
        sample = []

    if cl.tip() == nullid:
        if srvheadhashes != [nullid]:
            return [nullid], True, srvheadhashes
        return [nullid], False, []

    # start actual discovery (we note this before the next "if" for
    # compatibility reasons)
    ui.status(_("searching for changes\n"))

    srvheads = dag.internalizeall(srvheadhashes, filterunknown=True)
    if len(srvheads) == len(srvheadhashes):
        ui.debug("all remote heads known locally\n")
        return (srvheadhashes, False, srvheadhashes,)

    if sample and len(ownheads) <= initialsamplesize and util.all(yesno):
        ui.note(_("all local heads known remotely\n"))
        ownheadhashes = dag.externalizeall(ownheads)
        return (ownheadhashes, True, srvheadhashes,)

    # full blown discovery

    # own nodes I know we both know
    # treat remote heads (and maybe own heads) as a first implicit sample
    # response
    common = cl.incrementalmissingrevs(srvheads)
    commoninsample = set(n for i, n in enumerate(sample) if yesno[i])
    common.addbases(commoninsample)
    # own nodes where I don't know if remote knows them
    undecided = set(common.missingancestors(ownheads))
    # own nodes I know remote lacks
    missing = set()

    full = False
    while undecided:

        if sample:
            missinginsample = [n for i, n in enumerate(sample) if not yesno[i]]
            missing.update(dag.descendantset(missinginsample, missing))

        undecided.difference_update(missing)

        if not undecided:
            break

        if full or common.hasbases():
            if full:
                ui.note(_("sampling from both directions\n"))
            else:
                ui.debug("taking initial sample\n")
            samplefunc = _takefullsample
            targetsize = fullsamplesize
        else:
            # use even cheaper initial sample
            ui.debug("taking quick initial sample\n")
            samplefunc = _takequicksample
            targetsize = initialsamplesize
        if len(undecided) < targetsize:
            sample = list(undecided)
        else:
            sample = samplefunc(dag, undecided, targetsize)
            sample = _limitsample(sample, targetsize)

        roundtrips += 1
        ui.progress(_('searching'), roundtrips, unit=_('queries'))
        ui.debug("query %i; still undecided: %i, sample size is: %i\n"
                 % (roundtrips, len(undecided), len(sample)))
        # indices between sample and externalized version must match
        sample = list(sample)
        yesno = remote.known(dag.externalizeall(sample))
        full = True

        if sample:
            commoninsample = set(n for i, n in enumerate(sample) if yesno[i])
            common.addbases(commoninsample)
            common.removeancestorsfrom(undecided)

    # heads(common) == heads(common.bases) since common represents common.bases
    # and all its ancestors
    result = dag.headsetofconnecteds(common.bases)
    # common.bases can include nullrev, but our contract requires us to not
    # return any heads in that case, so discard that
    result.discard(nullrev)
    ui.progress(_('searching'), None)
    ui.debug("%d total queries\n" % roundtrips)

    if not result and srvheadhashes != [nullid]:
        if abortwhenunrelated:
            raise util.Abort(_("repository is unrelated"))
        else:
            ui.warn(_("warning: repository is unrelated\n"))
        return (set([nullid]), True, srvheadhashes,)

    anyincoming = (srvheadhashes != [nullid])
    return dag.externalizeall(result), anyincoming, srvheadhashes
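To make the sampling shape concrete, here is a minimal self-contained sketch of the exponential walk that `_updatesample` performs. It is illustrative only: it uses a plain dict-of-parents DAG and a made-up function name instead of Mercurial's `dagutil` objects, and it omits the `nodes` membership filter, but the distance bookkeeping is the same (keep nodes at distance 1, 2, 4, 8, ... from each head):

    from collections import deque

    def exponential_sample(parents, heads, always, quicksamplesize=0):
        """Collect nodes at exponentially growing distances from `heads`.

        parents: dict mapping node -> list of parents (toy DAG, not dagutil)
        heads:   nodes to start the walk from (distance 1)
        always:  nodes that are part of the sample anyway, so never re-added
        """
        sample = set()
        dist = {}
        visit = deque(heads)
        seen = set()
        factor = 1
        while visit:
            curr = visit.popleft()
            if curr in seen:
                continue
            d = dist.setdefault(curr, 1)
            if d > factor:         # walked past the target distance:
                factor *= 2        # double it, as in _updatesample
            if d == factor:        # keep nodes at distance 1, 2, 4, 8, ...
                if curr not in always:
                    sample.add(curr)
                    if quicksamplesize and len(sample) >= quicksamplesize:
                        return sample
            seen.add(curr)
            for p in parents.get(curr, []):
                dist.setdefault(p, d + 1)
                visit.append(p)
        return sample

    # On a linear chain 0 <- 1 <- ... <- 9 with head 9, the sample keeps the
    # nodes at distances 1, 2, 4 and 8 from the head: {9, 8, 6, 2}.
    chain = {n: [n - 1] for n in range(1, 10)}
    print(sorted(exponential_sample(chain, [9], always=set())))  # [2, 6, 8, 9]

With the early return gone, `_takefullsample` now runs this walk from the heads and again on the inverted DAG from the roots even when the heads alone would fill the quota, which is what lets each query cut deeper into the undecided set.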
tests/test-setdiscovery.t
@@ -1,409 +1,406 @@

Function to test discovery between two repos in both directions, using both the local shortcut
(which is currently not activated by default) and the full remotable protocol:

  $ testdesc() { # revs_a, revs_b, dagdesc
  > if [ -d foo ]; then rm -rf foo; fi
  > hg init foo
  > cd foo
  > hg debugbuilddag "$3"
  > hg clone . a $1 --quiet
  > hg clone . b $2 --quiet
  > echo
  > echo "% -- a -> b tree"
  > hg -R a debugdiscovery b --verbose --old
  > echo
  > echo "% -- a -> b set"
  > hg -R a debugdiscovery b --verbose --debug
  > echo
  > echo "% -- b -> a tree"
  > hg -R b debugdiscovery a --verbose --old
  > echo
  > echo "% -- b -> a set"
  > hg -R b debugdiscovery a --verbose --debug
  > cd ..
  > }


Small superset:

  $ testdesc '-ra1 -ra2' '-rb1 -rb2 -rb3' '
  > +2:f +1:a1:b1
  > <f +4 :a2
  > +5 :b2
  > <f +3 :b3'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 01241442b3c2 66f7d451a68b b5714e113bc0
  common heads: 01241442b3c2 b5714e113bc0
  local is subset

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  all local heads known remotely
  common heads: 01241442b3c2 b5714e113bc0
  local is subset

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 01241442b3c2 b5714e113bc0
  common heads: 01241442b3c2 b5714e113bc0
  remote is subset

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  all remote heads known locally
  common heads: 01241442b3c2 b5714e113bc0
  remote is subset


Many new:

  $ testdesc '-ra1 -ra2' '-rb' '
  > +2:f +3:a1 +3:b
  > <f +30 :a2'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: bebd167eb94d
  common heads: bebd167eb94d

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking initial sample
  searching: 2 queries
  query 2; still undecided: 29, sample size is: 29
  2 total queries
  common heads: bebd167eb94d

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 66f7d451a68b bebd167eb94d
  common heads: bebd167eb94d

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking initial sample
  searching: 2 queries
  query 2; still undecided: 2, sample size is: 2
  2 total queries
  common heads: bebd167eb94d


Both sides many new with stub:

  $ testdesc '-ra1 -ra2' '-rb' '
  > +2:f +2:a1 +30 :b
  > <f +30 :a2'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 2dc09a01254d
  common heads: 2dc09a01254d

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking initial sample
  searching: 2 queries
  query 2; still undecided: 29, sample size is: 29
  2 total queries
  common heads: 2dc09a01254d

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 2dc09a01254d 66f7d451a68b
  common heads: 2dc09a01254d

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking initial sample
  searching: 2 queries
  query 2; still undecided: 29, sample size is: 29
  2 total queries
  common heads: 2dc09a01254d


Both many new:

  $ testdesc '-ra' '-rb' '
  > +2:f +30 :b
  > <f +30 :a'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 66f7d451a68b
  common heads: 66f7d451a68b

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 31, sample size is: 31
  2 total queries
  common heads: 66f7d451a68b

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 66f7d451a68b
  common heads: 66f7d451a68b

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 31, sample size is: 31
  2 total queries
  common heads: 66f7d451a68b


Both many new skewed:

  $ testdesc '-ra' '-rb' '
  > +2:f +30 :b
  > <f +50 :a'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 66f7d451a68b
  common heads: 66f7d451a68b

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 51, sample size is: 51
  2 total queries
  common heads: 66f7d451a68b

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 66f7d451a68b
  common heads: 66f7d451a68b

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 31, sample size is: 31
  2 total queries
  common heads: 66f7d451a68b


Both many new on top of long history:

  $ testdesc '-ra' '-rb' '
  > +1000:f +30 :b
  > <f +50 :a'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 7ead0cba2838
  common heads: 7ead0cba2838

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 1049, sample size is: 11
  sampling from both directions
  searching: 3 queries
  query 3; still undecided: 31, sample size is: 31
  3 total queries
  common heads: 7ead0cba2838

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 7ead0cba2838
  common heads: 7ead0cba2838

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 1029, sample size is: 11
  sampling from both directions
  searching: 3 queries
  query 3; still undecided: 15, sample size is: 15
  3 total queries
  common heads: 7ead0cba2838


One with >200 heads, which used to use up all of the sample:

  $ hg init manyheads
  $ cd manyheads
  $ echo "+300:r @a" >dagdesc
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "@b *r+3" >>dagdesc # one more head
  $ hg debugbuilddag <dagdesc
  reading DAG from stdin

  $ hg heads -t --template . | wc -c
  \s*261 (re)

  $ hg clone -b a . a
  adding changesets
  adding manifests
  adding file changes
  added 1340 changesets with 0 changes to 0 files (+259 heads)
  updating to branch a
  0 files updated, 0 files merged, 0 files removed, 0 files unresolved
  $ hg clone -b b . b
  adding changesets
  adding manifests
  adding file changes
  added 304 changesets with 0 changes to 0 files
  updating to branch b
  0 files updated, 0 files merged, 0 files removed, 0 files unresolved

  $ hg -R a debugdiscovery b --debug --verbose
  comparing with b
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 1240, sample size is: 100
  sampling from both directions
  searching: 3 queries
  query 3; still undecided: 1140, sample size is: 200
  sampling from both directions
  searching: 4 queries
-  query 4; still undecided: 940, sample size is: 200
+  query 4; still undecided: 592, sample size is: 200
  sampling from both directions
  searching: 5 queries
-  query 5; still undecided: 740, sample size is: 200
+  query 5; still undecided: 292, sample size is: 200
  sampling from both directions
  searching: 6 queries
-  query 6; still undecided: 540, sample size is: 200
-  sampling from both directions
-  searching: 7 queries
-  query 7; still undecided: 46, sample size is: 46
-  7 total queries
+  query 6; still undecided: 51, sample size is: 51
+  6 total queries
  common heads: 3ee37d65064a

Test actual protocol when pulling one new head in addition to common heads

  $ hg clone -U b c
  $ hg -R c id -ir tip
  513314ca8b3a
  $ hg -R c up -qr default
  $ touch c/f
  $ hg -R c ci -Aqm "extra head"
  $ hg -R c id -i
  e64a39e7da8b

  $ hg serve -R c -p $HGPORT -d --pid-file=hg.pid -A access.log -E errors.log
  $ cat hg.pid >> $DAEMON_PIDS

  $ hg -R b incoming http://localhost:$HGPORT/ -T '{node|short}\n'
  comparing with http://localhost:$HGPORT/
  searching for changes
  e64a39e7da8b

  $ "$TESTDIR/killdaemons.py" $DAEMON_PIDS
  $ cut -d' ' -f6- access.log | grep -v cmd=known # cmd=known uses random sampling
  "GET /?cmd=capabilities HTTP/1.1" 200 -
  "GET /?cmd=batch HTTP/1.1" 200 - x-hgarg-1:cmds=heads+%3Bknown+nodes%3D513314ca8b3ae4dac8eec56966265b00fcf866db
  "GET /?cmd=getbundle HTTP/1.1" 200 - x-hgarg-1:common=513314ca8b3ae4dac8eec56966265b00fcf866db&heads=e64a39e7da8b0d54bc63e81169aff001c13b3477
  "GET /?cmd=listkeys HTTP/1.1" 200 - x-hgarg-1:namespace=phases
  $ cat errors.log

  $ cd ..


Issue 4438 - test coverage for 3ef893520a85 issues.

  $ mkdir issue4438
  $ cd issue4438
#if false
generate new bundles:
  $ hg init r1
  $ for i in `seq 101`; do hg -R r1 up -qr null && hg -R r1 branch -q b$i && hg -R r1 ci -qmb$i; done
  $ hg clone -q r1 r2
  $ for i in `seq 10`; do hg -R r1 up -qr null && hg -R r1 branch -q c$i && hg -R r1 ci -qmc$i; done
  $ hg -R r2 branch -q r2change && hg -R r2 ci -qmr2change
  $ hg -R r1 bundle -qa $TESTDIR/bundles/issue4438-r1.hg
  $ hg -R r2 bundle -qa $TESTDIR/bundles/issue4438-r2.hg
#else
use existing bundles:
  $ hg clone -q $TESTDIR/bundles/issue4438-r1.hg r1
  $ hg clone -q $TESTDIR/bundles/issue4438-r2.hg r2
#endif

Set iteration order could cause wrong and unstable results - fixed in 73cfaa348650:

  $ hg -R r1 outgoing r2 -T'{rev} '
  comparing with r2
  searching for changes
  101 102 103 104 105 106 107 108 109 110 (no-eol)

The case where all the 'initialsamplesize' samples already were common would
give 'all remote heads known locally' without checking the remaining heads -
fixed in 86c35b7ae300:

  $ cat >> $TESTTMP/unrandomsample.py << EOF
  > import random
  > def sample(population, k):
  >     return sorted(population)[:k]
  > random.sample = sample
  > EOF

  $ cat >> r1/.hg/hgrc << EOF
  > [extensions]
  > unrandomsample = $TESTTMP/unrandomsample.py
  > EOF

  $ hg -R r1 outgoing r2 -T'{rev} '
  comparing with r2
  searching for changes
  101 102 103 104 105 106 107 108 109 110 (no-eol)
  $ cd ..
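For intuition about the query counts asserted above, here is a rough, self-contained simulation of the classification loop described in setdiscovery.py's docstring. The DAG representation and helper names (`reachable`, `discover`) are illustrative, not Mercurial's API; the membership test against `remote_nodes` stands in for one `remote.known()` round trip:

    import random
    from collections import deque

    def reachable(edges, nodes):
        """All nodes reachable from `nodes` (inclusive) along `edges`."""
        seen, visit = set(), deque(nodes)
        while visit:
            n = visit.popleft()
            if n not in seen:
                seen.add(n)
                visit.extend(edges.get(n, []))
        return seen

    def discover(parents, children, local_nodes, remote_nodes, samplesize=2):
        """Toy discovery: classify every local node as common or missing."""
        undecided = set(local_nodes)
        common, missing, roundtrips = set(), set(), 0
        while undecided:
            sample = random.sample(sorted(undecided),
                                   min(samplesize, len(undecided)))
            roundtrips += 1
            yesno = [n in remote_nodes for n in sample]  # one remote.known()
            for n, known in zip(sample, yesno):
                if known:  # remote has it: it and its ancestors are common
                    common |= reachable(parents, [n])
                else:      # remote lacks it: it and its descendants are missing
                    missing |= reachable(children, [n])
            undecided -= common | missing
        return common, missing, roundtrips

    # Linear history 0..9 where the remote only has 0..5: discovery always
    # ends with common == {0..5} and missing == {6..9}; the number of round
    # trips depends on which nodes happen to be sampled.
    parents = {n: [n - 1] for n in range(1, 10)}
    children = {n: [n + 1] for n in range(0, 9)}
    common, missing, trips = discover(parents, children, range(10),
                                      set(range(6)))
    print(sorted(common), sorted(missing), trips)

The real implementation improves on this in exactly the ways the tests exercise: it sends the heads first, it prefers sample nodes that classify many others at once (the exponential sample), and it batches the head and known requests, which is why most layouts above converge in two or three queries.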