setdiscovery: limit the size of the initial sample (issue4411)...
Pierre-Yves David
r23084:3ef89352 stable
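The fix is the single changed line in `findcommonheads` below: the first `known()` query now sends at most `initialsamplesize` (default 100) of the local heads instead of all of them, so the very first wire request stays bounded even on repositories with huge numbers of heads (the issue4411 case). Here is a minimal sketch of the capping behavior, using a standalone copy of `_limitsample`; the `list()` call is added in this copy so the sketch also runs on Python 3, where `random.sample` no longer accepts a set:

import random

def _limitsample(sample, desiredlen):
    """return a random subset of sample of at most desiredlen items"""
    if len(sample) > desiredlen:
        # list() added so this copy also runs on Python 3, where
        # random.sample() no longer accepts a set directly
        sample = set(random.sample(list(sample), desiredlen))
    return sample

ownheads = set(range(1340))              # e.g. a repo with 1340 heads
assert len(_limitsample(ownheads, 100)) == 100
assert _limitsample({1, 2, 3}, 100) == {1, 2, 3}   # small inputs pass through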
mercurial/setdiscovery.py
@@ -1,237 +1,237 @@
# setdiscovery.py - improved discovery of common nodeset for mercurial
#
# Copyright 2010 Benoit Boissinot <bboissin@gmail.com>
# and Peter Arrenbrecht <peter@arrenbrecht.ch>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
"""
The algorithm works in the following way. You have two repositories: local
and remote. They both contain a DAG of changelists.

The goal of the discovery protocol is to find one set of nodes, *common*,
the set of nodes shared by local and remote.

One of the issues with the original protocol was latency: it could
potentially require lots of roundtrips to discover that the local repo was a
subset of remote (which is a very common case; you usually have few changes
compared to upstream, while upstream probably had lots of development).

The new protocol requires only one interface for the remote repo: `known()`,
which, given a set of changelists, tells you if they are present in the DAG.

The algorithm then works as follows:

- We will be using three sets, `common`, `missing`, `unknown`. Originally
  all nodes are in `unknown`.
- Take a sample from `unknown`, call `remote.known(sample)`
- For each node that remote knows, move it and all its ancestors to `common`
- For each node that remote doesn't know, move it and all its descendants
  to `missing`
- Iterate until `unknown` is empty

There are a couple of optimizations. The first is that instead of starting
with a random sample of missing, we start by sending all heads; in the case
where the local repo is a subset, the answer is computed in one round trip.

Then you can do something similar to the bisecting strategy used when
finding faulty changesets: instead of random samples, try picking nodes
that will maximize the number of nodes classified with them (since all
ancestors or descendants will be marked as well).
"""

from node import nullid
from i18n import _
import random
import util, dagutil

def _updatesample(dag, nodes, sample, always, quicksamplesize=0):
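    # Breadth-first walk from the heads of `nodes` (or of the whole graph
    # when `nodes` is empty) towards the roots: a node is added to `sample`
    # when its distance from a head is an exact power of two (1, 2, 4, 8,
    # ...). Nodes already in `always` are skipped, and the walk returns early
    # once `quicksamplesize` nodes have been sampled (when a limit is given).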
    # if nodes is empty we scan the entire graph
    if nodes:
        heads = dag.headsetofconnecteds(nodes)
    else:
        heads = dag.heads()
    dist = {}
    visit = util.deque(heads)
    seen = set()
    factor = 1
    while visit:
        curr = visit.popleft()
        if curr in seen:
            continue
        d = dist.setdefault(curr, 1)
        if d > factor:
            factor *= 2
        if d == factor:
            if curr not in always: # need this check for the early exit below
                sample.add(curr)
                if quicksamplesize and (len(sample) >= quicksamplesize):
                    return
        seen.add(curr)
        for p in dag.parents(curr):
            if not nodes or p in nodes:
                dist.setdefault(p, d + 1)
                visit.append(p)

def _setupsample(dag, nodes, size):
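    # Returns an (always, sample, desiredlen) triple: when `nodes` already
    # fits within `size`, everything is sent and there is nothing to grow
    # (sample is None). Otherwise `always` holds the heads of `nodes`, which
    # are always queried, and `desiredlen` is how many more nodes the caller
    # may add; if the heads alone already exceed `size`, sample is again
    # None and only the heads are sent.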
    if len(nodes) <= size:
        return set(nodes), None, 0
    always = dag.headsetofconnecteds(nodes)
    desiredlen = size - len(always)
    if desiredlen <= 0:
        # This could be bad if there are very many heads, all unknown to the
        # server. We're counting on long request support here.
        return always, None, desiredlen
    return always, set(), desiredlen

def _takequicksample(dag, nodes, size, initial):
    always, sample, desiredlen = _setupsample(dag, nodes, size)
    if sample is None:
        return always
    if initial:
        fromset = None
    else:
        fromset = nodes
    _updatesample(dag, fromset, sample, always, quicksamplesize=desiredlen)
    sample.update(always)
    return sample

def _takefullsample(dag, nodes, size):
    always, sample, desiredlen = _setupsample(dag, nodes, size)
    if sample is None:
        return always
    # update from heads
    _updatesample(dag, nodes, sample, always)
    # update from roots
    _updatesample(dag.inverse(), nodes, sample, always)
    assert sample
    sample = _limitsample(sample, desiredlen)
    if len(sample) < desiredlen:
        more = desiredlen - len(sample)
        sample.update(random.sample(list(nodes - sample - always), more))
    sample.update(always)
    return sample

def _limitsample(sample, desiredlen):
    """return a random subset of sample of at most desiredlen items"""
    if len(sample) > desiredlen:
        sample = set(random.sample(sample, desiredlen))
    return sample

def findcommonheads(ui, local, remote,
                    initialsamplesize=100,
                    fullsamplesize=200,
                    abortwhenunrelated=True):
    '''Return a tuple (common, anyincoming, remoteheads) used to identify
    missing nodes from or in remote.
    '''
    roundtrips = 0
    cl = local.changelog
    dag = dagutil.revlogdag(cl)

    # early exit if we know all the specified remote heads already
    ui.debug("query 1; heads\n")
    roundtrips += 1
    ownheads = dag.heads()
-    sample = ownheads
+    sample = _limitsample(ownheads, initialsamplesize)
    if remote.local():
        # stopgap until we have a proper localpeer that supports batch()
        srvheadhashes = remote.heads()
        yesno = remote.known(dag.externalizeall(sample))
    elif remote.capable('batch'):
        batch = remote.batch()
        srvheadhashesref = batch.heads()
        yesnoref = batch.known(dag.externalizeall(sample))
        batch.submit()
        srvheadhashes = srvheadhashesref.value
        yesno = yesnoref.value
    else:
        # compatibility with pre-batch, but post-known remotes during 1.9
        # development
        srvheadhashes = remote.heads()
        sample = []

    if cl.tip() == nullid:
        if srvheadhashes != [nullid]:
            return [nullid], True, srvheadhashes
        return [nullid], False, []

    # start actual discovery (we note this before the next "if" for
    # compatibility reasons)
    ui.status(_("searching for changes\n"))

    srvheads = dag.internalizeall(srvheadhashes, filterunknown=True)
    if len(srvheads) == len(srvheadhashes):
        ui.debug("all remote heads known locally\n")
        return (srvheadhashes, False, srvheadhashes,)

    if sample and util.all(yesno):
        ui.note(_("all local heads known remotely\n"))
        ownheadhashes = dag.externalizeall(ownheads)
        return (ownheadhashes, True, srvheadhashes,)

    # full blown discovery

    # own nodes where I don't know if remote knows them
    undecided = dag.nodeset()
    # own nodes I know we both know
    common = set()
    # own nodes I know remote lacks
    missing = set()

    # treat remote heads (and maybe own heads) as a first implicit sample
    # response
    common.update(dag.ancestorset(srvheads))
    undecided.difference_update(common)

    full = False
    while undecided:

        if sample:
            commoninsample = set(n for i, n in enumerate(sample) if yesno[i])
            common.update(dag.ancestorset(commoninsample, common))

            missinginsample = [n for i, n in enumerate(sample) if not yesno[i]]
            missing.update(dag.descendantset(missinginsample, missing))

            undecided.difference_update(missing)
            undecided.difference_update(common)

        if not undecided:
            break

        if full:
            ui.note(_("sampling from both directions\n"))
            sample = _takefullsample(dag, undecided, size=fullsamplesize)
        elif common:
            # use cheapish initial sample
            ui.debug("taking initial sample\n")
            sample = _takefullsample(dag, undecided, size=fullsamplesize)
        else:
            # use even cheaper initial sample
            ui.debug("taking quick initial sample\n")
            sample = _takequicksample(dag, undecided, size=initialsamplesize,
                                      initial=True)

        roundtrips += 1
        ui.progress(_('searching'), roundtrips, unit=_('queries'))
        ui.debug("query %i; still undecided: %i, sample size is: %i\n"
                 % (roundtrips, len(undecided), len(sample)))
        # indices between sample and externalized version must match
        sample = list(sample)
        yesno = remote.known(dag.externalizeall(sample))
        full = True

    result = dag.headsetofconnecteds(common)
    ui.progress(_('searching'), None)
    ui.debug("%d total queries\n" % roundtrips)

    if not result and srvheadhashes != [nullid]:
        if abortwhenunrelated:
            raise util.Abort(_("repository is unrelated"))
        else:
            ui.warn(_("warning: repository is unrelated\n"))
        return (set([nullid]), True, srvheadhashes,)

    anyincoming = (srvheadhashes != [nullid])
    return dag.externalizeall(result), anyincoming, srvheadhashes
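The loop above implements the sampling algorithm from the module docstring: sample some undecided nodes, ask the remote which of them it knows, mark ancestors of known nodes as common and descendants of unknown nodes as missing, and repeat. The following is a toy, self-contained simulation of that classification loop on a hypothetical linear DAG; it is plain Python for illustration, not Mercurial's dagutil API:

import random

# hypothetical toy DAG: a linear chain of 20 revisions, 0 is the root
parents = {r: [r - 1] for r in range(1, 20)}
parents[0] = []

def ancestors(rev):
    # a rev counts as its own ancestor here, matching the sets above
    out = set()
    while True:
        out.add(rev)
        if not parents[rev]:
            return out
        rev = parents[rev][0]

def descendants(rev):
    return set(r for r in range(20) if rev in ancestors(r))

def remoteknown(revs):
    return [r <= 11 for r in revs]   # stand-in: remote only has revs 0..11

undecided, common, missing = set(range(20)), set(), set()
roundtrips = 0
while undecided:
    sample = random.sample(sorted(undecided), min(3, len(undecided)))
    roundtrips += 1
    for rev, known in zip(sample, remoteknown(sample)):
        if known:
            common |= ancestors(rev)
        else:
            missing |= descendants(rev)
    undecided -= common | missing

print("common head: %d, found in %d round trips" % (max(common), roundtrips))

On this chain the loop always converges on rev 11 as the common head: revs above 11 can only be classified missing, and rev 11 itself can only leave `undecided` by being sampled and answered as known.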
tests/test-setdiscovery.t
@@ -1,353 +1,353 @@

Function to test discovery between two repos in both directions, using both the local shortcut
(which is currently not activated by default) and the full remotable protocol:

  $ testdesc() { # revs_a, revs_b, dagdesc
  > if [ -d foo ]; then rm -rf foo; fi
  > hg init foo
  > cd foo
  > hg debugbuilddag "$3"
  > hg clone . a $1 --quiet
  > hg clone . b $2 --quiet
  > echo
  > echo "% -- a -> b tree"
  > hg -R a debugdiscovery b --verbose --old
  > echo
  > echo "% -- a -> b set"
  > hg -R a debugdiscovery b --verbose --debug
  > echo
  > echo "% -- b -> a tree"
  > hg -R b debugdiscovery a --verbose --old
  > echo
  > echo "% -- b -> a set"
  > hg -R b debugdiscovery a --verbose --debug
  > cd ..
  > }

Small superset:

  $ testdesc '-ra1 -ra2' '-rb1 -rb2 -rb3' '
  > +2:f +1:a1:b1
  > <f +4 :a2
  > +5 :b2
  > <f +3 :b3'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 01241442b3c2 66f7d451a68b b5714e113bc0
  common heads: 01241442b3c2 b5714e113bc0
  local is subset

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  all local heads known remotely
  common heads: 01241442b3c2 b5714e113bc0
  local is subset

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 01241442b3c2 b5714e113bc0
  common heads: 01241442b3c2 b5714e113bc0
  remote is subset

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  all remote heads known locally
  common heads: 01241442b3c2 b5714e113bc0
  remote is subset

Many new:

  $ testdesc '-ra1 -ra2' '-rb' '
  > +2:f +3:a1 +3:b
  > <f +30 :a2'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: bebd167eb94d
  common heads: bebd167eb94d

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking initial sample
  searching: 2 queries
  query 2; still undecided: 29, sample size is: 29
  2 total queries
  common heads: bebd167eb94d

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 66f7d451a68b bebd167eb94d
  common heads: bebd167eb94d

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking initial sample
  searching: 2 queries
  query 2; still undecided: 2, sample size is: 2
  2 total queries
  common heads: bebd167eb94d

Both sides many new with stub:

  $ testdesc '-ra1 -ra2' '-rb' '
  > +2:f +2:a1 +30 :b
  > <f +30 :a2'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 2dc09a01254d
  common heads: 2dc09a01254d

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking initial sample
  searching: 2 queries
  query 2; still undecided: 29, sample size is: 29
  2 total queries
  common heads: 2dc09a01254d

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 2dc09a01254d 66f7d451a68b
  common heads: 2dc09a01254d

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking initial sample
  searching: 2 queries
  query 2; still undecided: 29, sample size is: 29
  2 total queries
  common heads: 2dc09a01254d

Both many new:

  $ testdesc '-ra' '-rb' '
  > +2:f +30 :b
  > <f +30 :a'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 66f7d451a68b
  common heads: 66f7d451a68b

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 31, sample size is: 31
  2 total queries
  common heads: 66f7d451a68b

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 66f7d451a68b
  common heads: 66f7d451a68b

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 31, sample size is: 31
  2 total queries
  common heads: 66f7d451a68b

Both many new skewed:

  $ testdesc '-ra' '-rb' '
  > +2:f +30 :b
  > <f +50 :a'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 66f7d451a68b
  common heads: 66f7d451a68b

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 51, sample size is: 51
  2 total queries
  common heads: 66f7d451a68b

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 66f7d451a68b
  common heads: 66f7d451a68b

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 31, sample size is: 31
  2 total queries
  common heads: 66f7d451a68b

Both many new on top of long history:

  $ testdesc '-ra' '-rb' '
  > +1000:f +30 :b
  > <f +50 :a'

  % -- a -> b tree
  comparing with b
  searching for changes
  unpruned common: 7ead0cba2838
  common heads: 7ead0cba2838

  % -- a -> b set
  comparing with b
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 1049, sample size is: 11
  sampling from both directions
  searching: 3 queries
  query 3; still undecided: 31, sample size is: 31
  3 total queries
  common heads: 7ead0cba2838

  % -- b -> a tree
  comparing with a
  searching for changes
  unpruned common: 7ead0cba2838
  common heads: 7ead0cba2838

  % -- b -> a set
  comparing with a
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
  query 2; still undecided: 1029, sample size is: 11
  sampling from both directions
  searching: 3 queries
  query 3; still undecided: 15, sample size is: 15
  3 total queries
  common heads: 7ead0cba2838

One with >200 heads, which used to use up all of the sample:

  $ hg init manyheads
  $ cd manyheads
  $ echo "+300:r @a" >dagdesc
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3 *r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3*r+3" >>dagdesc # 20 heads
  $ echo "@b *r+3" >>dagdesc # one more head
  $ hg debugbuilddag <dagdesc
  reading DAG from stdin

  $ hg heads -t --template . | wc -c
  \s*261 (re)

  $ hg clone -b a . a
  adding changesets
  adding manifests
  adding file changes
  added 1340 changesets with 0 changes to 0 files (+259 heads)
  updating to branch a
  0 files updated, 0 files merged, 0 files removed, 0 files unresolved
  $ hg clone -b b . b
  adding changesets
  adding manifests
  adding file changes
  added 304 changesets with 0 changes to 0 files
  updating to branch b
  0 files updated, 0 files merged, 0 files removed, 0 files unresolved

  $ hg -R a debugdiscovery b --debug --verbose
  comparing with b
  query 1; heads
  searching for changes
  taking quick initial sample
  searching: 2 queries
-  query 2; still undecided: 1080, sample size is: 260
+  query 2; still undecided: 1240, sample size is: 260
  sampling from both directions
  searching: 3 queries
-  query 3; still undecided: 820, sample size is: 260
+  query 3; still undecided: 980, sample size is: 260
  sampling from both directions
  searching: 4 queries
-  query 4; still undecided: 560, sample size is: 260
+  query 4; still undecided: 720, sample size is: 260
  sampling from both directions
  searching: 5 queries
-  query 5; still undecided: 300, sample size is: 200
+  query 5; still undecided: 460, sample size is: 200
  5 total queries
  common heads: 3ee37d65064a

Test actual protocol when pulling one new head in addition to common heads

  $ hg clone -U b c
  $ hg -R c id -ir tip
  513314ca8b3a
  $ hg -R c up -qr default
  $ touch c/f
  $ hg -R c ci -Aqm "extra head"
  $ hg -R c id -i
  e64a39e7da8b

  $ hg serve -R c -p $HGPORT -d --pid-file=hg.pid -A access.log -E errors.log
  $ cat hg.pid >> $DAEMON_PIDS

  $ hg -R b incoming http://localhost:$HGPORT/ -T '{node|short}\n'
  comparing with http://localhost:$HGPORT/
  searching for changes
  e64a39e7da8b

  $ "$TESTDIR/killdaemons.py" $DAEMON_PIDS
  $ cut -d' ' -f6- access.log | grep -v cmd=known # cmd=known uses random sampling
  "GET /?cmd=capabilities HTTP/1.1" 200 -
  "GET /?cmd=batch HTTP/1.1" 200 - x-hgarg-1:cmds=heads+%3Bknown+nodes%3D513314ca8b3ae4dac8eec56966265b00fcf866db
  "GET /?cmd=getbundle HTTP/1.1" 200 - x-hgarg-1:common=513314ca8b3ae4dac8eec56966265b00fcf866db&heads=e64a39e7da8b0d54bc63e81169aff001c13b3477
  $ cat errors.log

  $ cd ..
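A reading of the manyheads numbers in the updated output above, as a rough sanity check. The "a" clone reports 1340 changesets and 260 topological heads (1 plus the 259 extra heads from "hg clone"); since those heads outnumber fullsamplesize, _setupsample hits its desiredlen <= 0 branch and each later round samples only the heads of the undecided set, which is why undecided shrinks by about 260 per query. The repository statistics below are read off the test output, not computed independently:

total, heads, initialsamplesize, fullsamplesize = 1340, 260, 100, 200

assert total - heads == 1080              # old output: query 1 sent all 260 heads
assert total - initialsamplesize == 1240  # new output: query 1 capped at 100 heads

# later rounds each classify roughly the ~260 heads of the undecided set
for undecided in 1240, 980, 720:
    print("query classifies 260 nodes, %d -> %d undecided" % (undecided, undecided - 260))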