fix: update commit hash references in the new commits...
Matt Harbison
r46305:04de8a1e default
@@ -1,931 +1,937
1 1 # fix - rewrite file content in changesets and working copy
2 2 #
3 3 # Copyright 2018 Google LLC.
4 4 #
5 5 # This software may be used and distributed according to the terms of the
6 6 # GNU General Public License version 2 or any later version.
7 7 """rewrite file content in changesets or working copy (EXPERIMENTAL)
8 8
9 9 Provides a command that runs configured tools on the contents of modified files,
10 10 writing back any fixes to the working copy or replacing changesets.
11 11
12 12 Here is an example configuration that causes :hg:`fix` to apply automatic
13 13 formatting fixes to modified lines in C++ code::
14 14
15 15 [fix]
16 16 clang-format:command=clang-format --assume-filename={rootpath}
17 17 clang-format:linerange=--lines={first}:{last}
18 18 clang-format:pattern=set:**.cpp or **.hpp
19 19
20 20 The :command suboption forms the first part of the shell command that will be
21 21 used to fix a file. The content of the file is passed on standard input, and the
22 22 fixed file content is expected on standard output. Any output on standard error
23 23 will be displayed as a warning. If the exit status is not zero, the file will
24 24 not be affected. A placeholder warning is displayed if there is a non-zero exit
25 25 status but no standard error output. Some values may be substituted into the
26 26 command::
27 27
28 28 {rootpath} The path of the file being fixed, relative to the repo root
29 29 {basename} The name of the file being fixed, without the directory path
30 30
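A minimal fixer tool that follows this protocol only needs to read the file content from standard input and write the fixed content to standard output. Here is a hedged sketch in Python, mirroring the uppercasing helper used in the test file further below (the transformation itself is only illustrative):

    import sys
    from mercurial.utils.procutil import setbinary

    # Treat stdin/stdout as binary so file content round-trips unchanged.
    setbinary(sys.stdin)
    setbinary(sys.stdout)

    # Any transformation works here; exiting non-zero would leave the file untouched.
    sys.stdout.write(sys.stdin.read().upper())

Such a script would be wired up with something like mytool:command="$PYTHON" /path/to/tool.py, where mytool is a hypothetical fixer name.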
31 31 If the :linerange suboption is set, the tool will only be run if there are
32 32 changed lines in a file. The value of this suboption is appended to the shell
33 33 command once for every range of changed lines in the file. Some values may be
34 34 substituted into the command::
35 35
36 36 {first} The 1-based line number of the first line in the modified range
37 37 {last} The 1-based line number of the last line in the modified range
38 38
39 39 Deleted sections of a file will be ignored by :linerange, because there is no
40 40 corresponding line range in the version being fixed.
41 41
42 42 By default, tools that set :linerange will only be executed if there is at least
43 43 one changed line range. This is meant to prevent accidents like running a code
44 44 formatter in such a way that it unexpectedly reformats the whole file. If such a
45 45 tool needs to operate on unchanged files, it should set the :skipclean suboption
46 46 to false.
47 47
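For example, a line-range-aware tool that should also run on files with no changed line ranges could turn skipclean off explicitly; the tool name and pattern here are illustrative only, not part of this changeset:

    [fix]
    mytool:command = mytool
    mytool:linerange = --lines={first}:{last}
    mytool:pattern = set:**.py
    mytool:skipclean = false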
48 48 The :pattern suboption determines which files will be passed through each
49 49 configured tool. See :hg:`help patterns` for possible values. However, all
50 50 patterns are relative to the repo root, even if that help text says they are relative
51 51 to the current working directory. If there are file arguments to :hg:`fix`, the
52 52 intersection of these patterns is used.
53 53
54 54 There is also a configurable limit for the maximum size of file that will be
55 55 processed by :hg:`fix`::
56 56
57 57 [fix]
58 58 maxfilesize = 2MB
59 59
60 60 Normally, execution of configured tools will continue after a failure (indicated
61 61 by a non-zero exit status). It can also be configured to abort after the first
62 62 such failure, so that no files will be affected if any tool fails. This abort
63 63 will also cause :hg:`fix` to exit with a non-zero status::
64 64
65 65 [fix]
66 66 failure = abort
67 67
68 68 When multiple tools are configured to affect a file, they execute in an order
69 69 defined by the :priority suboption. The priority suboption has a default value
70 70 of zero for each tool. Tools are executed in order of descending priority. The
71 71 execution order of tools with equal priority is unspecified. For example, you
72 72 could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
73 73 in a text file by ensuring that 'sort' runs before 'head'::
74 74
75 75 [fix]
76 76 sort:command = sort -n
77 77 head:command = head -n 10
78 78 sort:pattern = numbers.txt
79 79 head:pattern = numbers.txt
80 80 sort:priority = 2
81 81 head:priority = 1
82 82
83 83 To account for changes made by each tool, the line numbers used for incremental
84 84 formatting are recomputed before executing the next tool. So, each tool may see
85 85 different values for the arguments added by the :linerange suboption.
86 86
87 87 Each fixer tool is allowed to return some metadata in addition to the fixed file
88 88 content. The metadata must be placed before the file content on stdout,
89 89 separated from the file content by a zero byte. The metadata is parsed as a JSON
90 90 value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
91 91 is expected to produce this metadata encoding if and only if the :metadata
92 92 suboption is true::
93 93
94 94 [fix]
95 95 tool:command = tool --prepend-json-metadata
96 96 tool:metadata = true
97 97
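A hedged sketch of what such a metadata-producing tool might write to stdout, assuming it wants to report a single boolean (the field name is illustrative, not something this extension defines):

    import json
    import sys

    data = sys.stdin.buffer.read()
    fixed = data.upper()
    metadata = {"changed": fixed != data}  # any JSON value is allowed

    out = sys.stdout.buffer
    # JSON metadata first, then a zero byte, then the fixed file content.
    out.write(json.dumps(metadata).encode("utf-8"))
    out.write(b"\0")
    out.write(fixed)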
98 98 The metadata values are passed to hooks, which can be used to print summaries or
99 99 perform other post-fixing work. The supported hooks are::
100 100
101 101 "postfixfile"
102 102 Run once for each file in each revision where any fixer tools made changes
103 103 to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
104 104 and "$HG_METADATA" with a map of fixer names to metadata values from fixer
105 105 tools that affected the file. Fixer tools that didn't affect the file have a
106 106 value of None. Only fixer tools that executed are present in the metadata.
107 107
108 108 "postfix"
109 109 Run once after all files and revisions have been handled. Provides
110 110 "$HG_REPLACEMENTS" with information about what revisions were created and
111 111 made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
112 112 files in the working copy were updated. Provides a list "$HG_METADATA"
113 113 mapping fixer tool names to lists of metadata values returned from
114 114 executions that modified a file. This aggregates the same metadata
115 115 previously passed to the "postfixfile" hook.
116 116
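As a minimal sketch, a shell hook could be configured to log each fixed file using the variables described above (the echo command is only illustrative):

    [hooks]
    postfixfile = echo "fixed $HG_PATH in revision $HG_REV"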
117 117 Fixer tools are run in the repository's root directory. This allows them to read
118 118 configuration files from the working copy, or even write to the working copy.
119 119 The working copy is not updated to match the revision being fixed. In fact,
120 120 several revisions may be fixed in parallel. Writes to the working copy are not
121 121 amended into the revision being fixed; fixer tools should always write fixed
122 122 file content back to stdout as documented above.
123 123 """
124 124
125 125 from __future__ import absolute_import
126 126
127 127 import collections
128 128 import itertools
129 129 import os
130 130 import re
131 131 import subprocess
132 132
133 133 from mercurial.i18n import _
134 134 from mercurial.node import nullrev
135 135 from mercurial.node import wdirrev
136 136
137 137 from mercurial.utils import procutil
138 138
139 139 from mercurial import (
140 140 cmdutil,
141 141 context,
142 142 copies,
143 143 error,
144 144 match as matchmod,
145 145 mdiff,
146 146 merge,
147 147 mergestate as mergestatemod,
148 148 pycompat,
149 149 registrar,
150 150 rewriteutil,
151 151 scmutil,
152 152 util,
153 153 worker,
154 154 )
155 155
156 156 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
157 157 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
158 158 # be specifying the version(s) of Mercurial they are tested with, or
159 159 # leave the attribute unspecified.
160 160 testedwith = b'ships-with-hg-core'
161 161
162 162 cmdtable = {}
163 163 command = registrar.command(cmdtable)
164 164
165 165 configtable = {}
166 166 configitem = registrar.configitem(configtable)
167 167
168 168 # Register the suboptions allowed for each configured fixer, and default values.
169 169 FIXER_ATTRS = {
170 170 b'command': None,
171 171 b'linerange': None,
172 172 b'pattern': None,
173 173 b'priority': 0,
174 174 b'metadata': False,
175 175 b'skipclean': True,
176 176 b'enabled': True,
177 177 }
178 178
179 179 for key, default in FIXER_ATTRS.items():
180 180 configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)
181 181
182 182 # A good default size allows most source code files to be fixed, but avoids
183 183 # letting fixer tools choke on huge inputs, which could be surprising to the
184 184 # user.
185 185 configitem(b'fix', b'maxfilesize', default=b'2MB')
186 186
187 187 # Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
188 188 # This helps users write shell scripts that stop when a fixer tool signals a
189 189 # problem.
190 190 configitem(b'fix', b'failure', default=b'continue')
191 191
192 192
193 193 def checktoolfailureaction(ui, message, hint=None):
194 194 """Abort with 'message' if fix.failure=abort"""
195 195 action = ui.config(b'fix', b'failure')
196 196 if action not in (b'continue', b'abort'):
197 197 raise error.Abort(
198 198 _(b'unknown fix.failure action: %s') % (action,),
199 199 hint=_(b'use "continue" or "abort"'),
200 200 )
201 201 if action == b'abort':
202 202 raise error.Abort(message, hint=hint)
203 203
204 204
205 205 allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
206 206 baseopt = (
207 207 b'',
208 208 b'base',
209 209 [],
210 210 _(
211 211 b'revisions to diff against (overrides automatic '
212 212 b'selection, and applies to every revision being '
213 213 b'fixed)'
214 214 ),
215 215 _(b'REV'),
216 216 )
217 217 revopt = (b'r', b'rev', [], _(b'revisions to fix (ADVANCED)'), _(b'REV'))
218 218 sourceopt = (
219 219 b's',
220 220 b'source',
221 221 [],
222 222 _(b'fix the specified revisions and their descendants'),
223 223 _(b'REV'),
224 224 )
225 225 wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
226 226 wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
227 227 usage = _(b'[OPTION]... [FILE]...')
228 228
229 229
230 230 @command(
231 231 b'fix',
232 232 [allopt, baseopt, revopt, sourceopt, wdiropt, wholeopt],
233 233 usage,
234 234 helpcategory=command.CATEGORY_FILE_CONTENTS,
235 235 )
236 236 def fix(ui, repo, *pats, **opts):
237 237 """rewrite file content in changesets or working directory
238 238
239 239 Runs any configured tools to fix the content of files. Only affects files
240 240 with changes, unless file arguments are provided. Only affects changed lines
241 241 of files, unless the --whole flag is used. Some tools may always affect the
242 242 whole file regardless of --whole.
243 243
244 244 If --working-dir is used, files with uncommitted changes in the working copy
245 245 will be fixed. Note that no backups are made.
246 246
247 247 If revisions are specified with --source, those revisions and their
248 248 descendants will be checked, and they may be replaced with new revisions
249 249 that have fixed file content. By automatically including the descendants,
250 250 no merging, rebasing, or evolution will be required. If an ancestor of the
251 251 working copy is included, then the working copy itself will also be fixed,
252 252 and the working copy will be updated to the fixed parent.
253 253
254 254 When determining what lines of each file to fix at each revision, the whole
255 255 set of revisions being fixed is considered, so that fixes to earlier
256 256 revisions are not forgotten in later ones. The --base flag can be used to
257 257 override this default behavior, though it is not usually desirable to do so.
258 258 """
259 259 opts = pycompat.byteskwargs(opts)
260 260 cmdutil.check_at_most_one_arg(opts, b'all', b'source', b'rev')
261 261 cmdutil.check_incompatible_arguments(
262 262 opts, b'working_dir', [b'all', b'source']
263 263 )
264 264
265 265 with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
266 266 revstofix = getrevstofix(ui, repo, opts)
267 267 basectxs = getbasectxs(repo, opts, revstofix)
268 268 workqueue, numitems = getworkqueue(
269 269 ui, repo, pats, opts, revstofix, basectxs
270 270 )
271 271 basepaths = getbasepaths(repo, opts, workqueue, basectxs)
272 272 fixers = getfixers(ui)
273 273
274 274 # Rather than letting each worker independently fetch the files
275 275 # (which also would add complications for shared/keepalive
276 276 # connections), prefetch them all first.
277 277 _prefetchfiles(repo, workqueue, basepaths)
278 278
279 279 # There are no data dependencies between the workers fixing each file
280 280 # revision, so we can use all available parallelism.
281 281 def getfixes(items):
282 282 for rev, path in items:
283 283 ctx = repo[rev]
284 284 olddata = ctx[path].data()
285 285 metadata, newdata = fixfile(
286 286 ui, repo, opts, fixers, ctx, path, basepaths, basectxs[rev]
287 287 )
288 288 # Don't waste memory/time passing unchanged content back, but
289 289 # produce one result per item either way.
290 290 yield (
291 291 rev,
292 292 path,
293 293 metadata,
294 294 newdata if newdata != olddata else None,
295 295 )
296 296
297 297 results = worker.worker(
298 298 ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
299 299 )
300 300
301 301 # We have to hold on to the data for each successor revision in memory
302 302 # until all its parents are committed. We ensure this by committing and
303 303 # freeing memory for the revisions in some topological order. This
304 304 # leaves a little bit of memory efficiency on the table, but also makes
305 305 # the tests deterministic. It might also be considered a feature since
306 306 # it makes the results more easily reproducible.
307 307 filedata = collections.defaultdict(dict)
308 308 aggregatemetadata = collections.defaultdict(list)
309 309 replacements = {}
310 310 wdirwritten = False
311 311 commitorder = sorted(revstofix, reverse=True)
312 312 with ui.makeprogress(
313 313 topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
314 314 ) as progress:
315 315 for rev, path, filerevmetadata, newdata in results:
316 316 progress.increment(item=path)
317 317 for fixername, fixermetadata in filerevmetadata.items():
318 318 aggregatemetadata[fixername].append(fixermetadata)
319 319 if newdata is not None:
320 320 filedata[rev][path] = newdata
321 321 hookargs = {
322 322 b'rev': rev,
323 323 b'path': path,
324 324 b'metadata': filerevmetadata,
325 325 }
326 326 repo.hook(
327 327 b'postfixfile',
328 328 throw=False,
329 329 **pycompat.strkwargs(hookargs)
330 330 )
331 331 numitems[rev] -= 1
332 332 # Apply the fixes for this and any other revisions that are
333 333 # ready and sitting at the front of the queue. Using a loop here
334 334 # prevents the queue from being blocked by the first revision to
335 335 # be ready out of order.
336 336 while commitorder and not numitems[commitorder[-1]]:
337 337 rev = commitorder.pop()
338 338 ctx = repo[rev]
339 339 if rev == wdirrev:
340 340 writeworkingdir(repo, ctx, filedata[rev], replacements)
341 341 wdirwritten = bool(filedata[rev])
342 342 else:
343 343 replacerev(ui, repo, ctx, filedata[rev], replacements)
344 344 del filedata[rev]
345 345
346 346 cleanup(repo, replacements, wdirwritten)
347 347 hookargs = {
348 348 b'replacements': replacements,
349 349 b'wdirwritten': wdirwritten,
350 350 b'metadata': aggregatemetadata,
351 351 }
352 352 repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))
353 353
354 354
355 355 def cleanup(repo, replacements, wdirwritten):
356 356 """Calls scmutil.cleanupnodes() with the given replacements.
357 357
358 358 "replacements" is a dict from nodeid to nodeid, with one key and one value
359 359 for every revision that was affected by fixing. This is slightly different
360 360 from cleanupnodes().
361 361
362 362 "wdirwritten" is a bool which tells whether the working copy was affected by
363 363 fixing, since it has no entry in "replacements".
364 364
365 365 Useful as a hook point for extending "hg fix" with output summarizing the
366 366 effects of the command, though we choose not to output anything here.
367 367 """
368 368 replacements = {
369 369 prec: [succ] for prec, succ in pycompat.iteritems(replacements)
370 370 }
371 371 scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)
372 372
373 373
374 374 def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
375 375 """"Constructs the list of files to be fixed at specific revisions
376 376
377 377 It is up to the caller how to consume the work items, and the only
378 378 dependence between them is that replacement revisions must be committed in
379 379 topological order. Each work item represents a file in the working copy or
380 380 in some revision that should be fixed and written back to the working copy
381 381 or into a replacement revision.
382 382
383 383 Work items for the same revision are grouped together, so that a worker
384 384 pool starting with the first N items in parallel is likely to finish the
385 385 first revision's work before other revisions. This can allow us to write
386 386 the result to disk and reduce memory footprint. At time of writing, the
387 387 partition strategy in worker.py seems favorable to this. We also sort the
388 388 items by ascending revision number to match the order in which we commit
389 389 the fixes later.
390 390 """
391 391 workqueue = []
392 392 numitems = collections.defaultdict(int)
393 393 maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
394 394 for rev in sorted(revstofix):
395 395 fixctx = repo[rev]
396 396 match = scmutil.match(fixctx, pats, opts)
397 397 for path in sorted(
398 398 pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
399 399 ):
400 400 fctx = fixctx[path]
401 401 if fctx.islink():
402 402 continue
403 403 if fctx.size() > maxfilesize:
404 404 ui.warn(
405 405 _(b'ignoring file larger than %s: %s\n')
406 406 % (util.bytecount(maxfilesize), path)
407 407 )
408 408 continue
409 409 workqueue.append((rev, path))
410 410 numitems[rev] += 1
411 411 return workqueue, numitems
412 412
413 413
414 414 def getrevstofix(ui, repo, opts):
415 415 """Returns the set of revision numbers that should be fixed"""
416 416 if opts[b'all']:
417 417 revs = repo.revs(b'(not public() and not obsolete()) or wdir()')
418 418 elif opts[b'source']:
419 419 source_revs = scmutil.revrange(repo, opts[b'source'])
420 420 revs = set(repo.revs(b'%ld::', source_revs))
421 421 if wdirrev in source_revs:
422 422 # `wdir()::` is currently empty, so manually add wdir
423 423 revs.add(wdirrev)
424 424 if repo[b'.'].rev() in revs:
425 425 revs.add(wdirrev)
426 426 else:
427 427 revs = set(scmutil.revrange(repo, opts[b'rev']))
428 428 if opts.get(b'working_dir'):
429 429 revs.add(wdirrev)
430 430 for rev in revs:
431 431 checkfixablectx(ui, repo, repo[rev])
432 432 # Allow fixing only wdir() even if there's an unfinished operation
433 433 if not (len(revs) == 1 and wdirrev in revs):
434 434 cmdutil.checkunfinished(repo)
435 435 rewriteutil.precheck(repo, revs, b'fix')
436 436 if wdirrev in revs and list(
437 437 mergestatemod.mergestate.read(repo).unresolved()
438 438 ):
439 439 raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
440 440 if not revs:
441 441 raise error.Abort(
442 442 b'no changesets specified', hint=b'use --rev or --working-dir'
443 443 )
444 444 return revs
445 445
446 446
447 447 def checkfixablectx(ui, repo, ctx):
448 448 """Aborts if the revision shouldn't be replaced with a fixed one."""
449 449 if ctx.obsolete():
450 450 # It would be better to actually check if the revision has a successor.
451 451 allowdivergence = ui.configbool(
452 452 b'experimental', b'evolution.allowdivergence'
453 453 )
454 454 if not allowdivergence:
455 455 raise error.Abort(
456 456 b'fixing obsolete revision could cause divergence'
457 457 )
458 458
459 459
460 460 def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
461 461 """Returns the set of files that should be fixed in a context
462 462
463 463 The result depends on the base contexts; we include any file that has
464 464 changed relative to any of the base contexts. Base contexts should be
465 465 ancestors of the context being fixed.
466 466 """
467 467 files = set()
468 468 for basectx in basectxs:
469 469 stat = basectx.status(
470 470 fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
471 471 )
472 472 files.update(
473 473 set(
474 474 itertools.chain(
475 475 stat.added, stat.modified, stat.clean, stat.unknown
476 476 )
477 477 )
478 478 )
479 479 return files
480 480
481 481
482 482 def lineranges(opts, path, basepaths, basectxs, fixctx, content2):
483 483 """Returns the set of line ranges that should be fixed in a file
484 484
485 485 Of the form [(10, 20), (30, 40)].
486 486
487 487 This depends on the given base contexts; we must consider lines that have
488 488 changed versus any of the base contexts, and whether the file has been
489 489 renamed versus any of them.
490 490
491 491 Another way to understand this is that we exclude line ranges that are
492 492 common to the file in all base contexts.
493 493 """
494 494 if opts.get(b'whole'):
495 495 # Return a range containing all lines. Rely on the diff implementation's
496 496 # idea of how many lines are in the file, instead of reimplementing it.
497 497 return difflineranges(b'', content2)
498 498
499 499 rangeslist = []
500 500 for basectx in basectxs:
501 501 basepath = basepaths.get((basectx.rev(), fixctx.rev(), path), path)
502 502
503 503 if basepath in basectx:
504 504 content1 = basectx[basepath].data()
505 505 else:
506 506 content1 = b''
507 507 rangeslist.extend(difflineranges(content1, content2))
508 508 return unionranges(rangeslist)
509 509
510 510
511 511 def getbasepaths(repo, opts, workqueue, basectxs):
512 512 if opts.get(b'whole'):
513 513 # Base paths will never be fetched for line range determination.
514 514 return {}
515 515
516 516 basepaths = {}
517 517 for rev, path in workqueue:
518 518 fixctx = repo[rev]
519 519 for basectx in basectxs[rev]:
520 520 basepath = copies.pathcopies(basectx, fixctx).get(path, path)
521 521 if basepath in basectx:
522 522 basepaths[(basectx.rev(), fixctx.rev(), path)] = basepath
523 523 return basepaths
524 524
525 525
526 526 def unionranges(rangeslist):
527 527 """Return the union of some closed intervals
528 528
529 529 >>> unionranges([])
530 530 []
531 531 >>> unionranges([(1, 100)])
532 532 [(1, 100)]
533 533 >>> unionranges([(1, 100), (1, 100)])
534 534 [(1, 100)]
535 535 >>> unionranges([(1, 100), (2, 100)])
536 536 [(1, 100)]
537 537 >>> unionranges([(1, 99), (1, 100)])
538 538 [(1, 100)]
539 539 >>> unionranges([(1, 100), (40, 60)])
540 540 [(1, 100)]
541 541 >>> unionranges([(1, 49), (50, 100)])
542 542 [(1, 100)]
543 543 >>> unionranges([(1, 48), (50, 100)])
544 544 [(1, 48), (50, 100)]
545 545 >>> unionranges([(1, 2), (3, 4), (5, 6)])
546 546 [(1, 6)]
547 547 """
548 548 rangeslist = sorted(set(rangeslist))
549 549 unioned = []
550 550 if rangeslist:
551 551 unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
552 552 for a, b in rangeslist:
553 553 c, d = unioned[-1]
554 554 if a > d + 1:
555 555 unioned.append((a, b))
556 556 else:
557 557 unioned[-1] = (c, max(b, d))
558 558 return unioned
559 559
560 560
561 561 def difflineranges(content1, content2):
562 562 """Return list of line number ranges in content2 that differ from content1.
563 563
564 564 Line numbers are 1-based. The numbers are the first and last line contained
565 565 in the range. Single-line ranges have the same line number for the first and
566 566 last line. Excludes any empty ranges that result from lines that are only
567 567 present in content1. Relies on mdiff's idea of where the line endings are in
568 568 the string.
569 569
570 570 >>> from mercurial import pycompat
571 571 >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
572 572 >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
573 573 >>> difflineranges2(b'', b'')
574 574 []
575 575 >>> difflineranges2(b'a', b'')
576 576 []
577 577 >>> difflineranges2(b'', b'A')
578 578 [(1, 1)]
579 579 >>> difflineranges2(b'a', b'a')
580 580 []
581 581 >>> difflineranges2(b'a', b'A')
582 582 [(1, 1)]
583 583 >>> difflineranges2(b'ab', b'')
584 584 []
585 585 >>> difflineranges2(b'', b'AB')
586 586 [(1, 2)]
587 587 >>> difflineranges2(b'abc', b'ac')
588 588 []
589 589 >>> difflineranges2(b'ab', b'aCb')
590 590 [(2, 2)]
591 591 >>> difflineranges2(b'abc', b'aBc')
592 592 [(2, 2)]
593 593 >>> difflineranges2(b'ab', b'AB')
594 594 [(1, 2)]
595 595 >>> difflineranges2(b'abcde', b'aBcDe')
596 596 [(2, 2), (4, 4)]
597 597 >>> difflineranges2(b'abcde', b'aBCDe')
598 598 [(2, 4)]
599 599 """
600 600 ranges = []
601 601 for lines, kind in mdiff.allblocks(content1, content2):
602 602 firstline, lastline = lines[2:4]
603 603 if kind == b'!' and firstline != lastline:
604 604 ranges.append((firstline + 1, lastline))
605 605 return ranges
606 606
607 607
608 608 def getbasectxs(repo, opts, revstofix):
609 609 """Returns a map of the base contexts for each revision
610 610
611 611 The base contexts determine which lines are considered modified when we
612 612 attempt to fix just the modified lines in a file. It also determines which
613 613 files we attempt to fix, so it is important to compute this even when
614 614 --whole is used.
615 615 """
616 616 # The --base flag overrides the usual logic, and we give every revision
617 617 # exactly the set of baserevs that the user specified.
618 618 if opts.get(b'base'):
619 619 baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
620 620 if not baserevs:
621 621 baserevs = {nullrev}
622 622 basectxs = {repo[rev] for rev in baserevs}
623 623 return {rev: basectxs for rev in revstofix}
624 624
625 625 # Proceed in topological order so that we can easily determine each
626 626 # revision's baserevs by looking at its parents and their baserevs.
627 627 basectxs = collections.defaultdict(set)
628 628 for rev in sorted(revstofix):
629 629 ctx = repo[rev]
630 630 for pctx in ctx.parents():
631 631 if pctx.rev() in basectxs:
632 632 basectxs[rev].update(basectxs[pctx.rev()])
633 633 else:
634 634 basectxs[rev].add(pctx)
635 635 return basectxs
636 636
637 637
638 638 def _prefetchfiles(repo, workqueue, basepaths):
639 639 toprefetch = set()
640 640
641 641 # Prefetch the files that will be fixed.
642 642 for rev, path in workqueue:
643 643 if rev == wdirrev:
644 644 continue
645 645 toprefetch.add((rev, path))
646 646
647 647 # Prefetch the base contents for lineranges().
648 648 for (baserev, fixrev, path), basepath in basepaths.items():
649 649 toprefetch.add((baserev, basepath))
650 650
651 651 if toprefetch:
652 652 scmutil.prefetchfiles(
653 653 repo,
654 654 [
655 655 (rev, scmutil.matchfiles(repo, [path]))
656 656 for rev, path in toprefetch
657 657 ],
658 658 )
659 659
660 660
661 661 def fixfile(ui, repo, opts, fixers, fixctx, path, basepaths, basectxs):
662 662 """Run any configured fixers that should affect the file in this context
663 663
664 664 Returns the file content that results from applying the fixers in some order
665 665 starting with the file's content in the fixctx. Fixers that support line
666 666 ranges will affect lines that have changed relative to any of the basectxs
667 667 (i.e. they will only avoid lines that are common to all basectxs).
668 668
669 669 A fixer tool's stdout will become the file's new content if and only if it
670 670 exits with code zero. The fixer tool's working directory is the repository's
671 671 root.
672 672 """
673 673 metadata = {}
674 674 newdata = fixctx[path].data()
675 675 for fixername, fixer in pycompat.iteritems(fixers):
676 676 if fixer.affects(opts, fixctx, path):
677 677 ranges = lineranges(
678 678 opts, path, basepaths, basectxs, fixctx, newdata
679 679 )
680 680 command = fixer.command(ui, path, ranges)
681 681 if command is None:
682 682 continue
683 683 ui.debug(b'subprocess: %s\n' % (command,))
684 684 proc = subprocess.Popen(
685 685 procutil.tonativestr(command),
686 686 shell=True,
687 687 cwd=procutil.tonativestr(repo.root),
688 688 stdin=subprocess.PIPE,
689 689 stdout=subprocess.PIPE,
690 690 stderr=subprocess.PIPE,
691 691 )
692 692 stdout, stderr = proc.communicate(newdata)
693 693 if stderr:
694 694 showstderr(ui, fixctx.rev(), fixername, stderr)
695 695 newerdata = stdout
696 696 if fixer.shouldoutputmetadata():
697 697 try:
698 698 metadatajson, newerdata = stdout.split(b'\0', 1)
699 699 metadata[fixername] = pycompat.json_loads(metadatajson)
700 700 except ValueError:
701 701 ui.warn(
702 702 _(b'ignored invalid output from fixer tool: %s\n')
703 703 % (fixername,)
704 704 )
705 705 continue
706 706 else:
707 707 metadata[fixername] = None
708 708 if proc.returncode == 0:
709 709 newdata = newerdata
710 710 else:
711 711 if not stderr:
712 712 message = _(b'exited with status %d\n') % (proc.returncode,)
713 713 showstderr(ui, fixctx.rev(), fixername, message)
714 714 checktoolfailureaction(
715 715 ui,
716 716 _(b'no fixes will be applied'),
717 717 hint=_(
718 718 b'use --config fix.failure=continue to apply any '
719 719 b'successful fixes anyway'
720 720 ),
721 721 )
722 722 return metadata, newdata
723 723
724 724
725 725 def showstderr(ui, rev, fixername, stderr):
726 726 """Writes the lines of the stderr string as warnings on the ui
727 727
728 728 Uses the revision number and fixername to give more context to each line of
729 729 the error message. Doesn't include file names, since those take up a lot of
730 730 space and would tend to be included in the error message if they were
731 731 relevant.
732 732 """
733 733 for line in re.split(b'[\r\n]+', stderr):
734 734 if line:
735 735 ui.warn(b'[')
736 736 if rev is None:
737 737 ui.warn(_(b'wdir'), label=b'evolve.rev')
738 738 else:
739 739 ui.warn(b'%d' % rev, label=b'evolve.rev')
740 740 ui.warn(b'] %s: %s\n' % (fixername, line))
741 741
742 742
743 743 def writeworkingdir(repo, ctx, filedata, replacements):
744 744 """Write new content to the working copy and check out the new p1 if any
745 745
746 746 We check out a new revision if and only if we fixed something in both the
747 747 working directory and its parent revision. This avoids the need for a full
748 748 update/merge, and means that the working directory simply isn't affected
749 749 unless the --working-dir flag is given.
750 750
751 751 Directly updates the dirstate for the affected files.
752 752 """
753 753 for path, data in pycompat.iteritems(filedata):
754 754 fctx = ctx[path]
755 755 fctx.write(data, fctx.flags())
756 756 if repo.dirstate[path] == b'n':
757 757 repo.dirstate.normallookup(path)
758 758
759 759 oldparentnodes = repo.dirstate.parents()
760 760 newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
761 761 if newparentnodes != oldparentnodes:
762 762 repo.setparents(*newparentnodes)
763 763
764 764
765 765 def replacerev(ui, repo, ctx, filedata, replacements):
766 766 """Commit a new revision like the given one, but with file content changes
767 767
768 768 "ctx" is the original revision to be replaced by a modified one.
769 769
770 770 "filedata" is a dict that maps paths to their new file content. All other
771 771 paths will be recreated from the original revision without changes.
772 772 "filedata" may contain paths that didn't exist in the original revision;
773 773 they will be added.
774 774
775 775 "replacements" is a dict that maps a single node to a single node, and it is
776 776 updated to indicate the original revision is replaced by the newly created
777 777 one. No entry is added if the replacement's node already exists.
778 778
779 779 The new revision has the same parents as the old one, unless those parents
780 780 have already been replaced, in which case those replacements are the parents
781 781 of this new revision. Thus, if revisions are replaced in topological order,
782 782 there is no need to rebase them into the original topology later.
783 783 """
784 784
785 785 p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
786 786 p1ctx, p2ctx = repo[p1rev], repo[p2rev]
787 787 newp1node = replacements.get(p1ctx.node(), p1ctx.node())
788 788 newp2node = replacements.get(p2ctx.node(), p2ctx.node())
789 789
790 790 # We don't want to create a revision that has no changes from the original,
791 791 # but we should if the original revision's parent has been replaced.
792 792 # Otherwise, we would produce an orphan that needs no actual human
793 793 # intervention to evolve. We can't rely on commit() to avoid creating the
794 794 # un-needed revision because the extra field added below produces a new hash
795 795 # regardless of file content changes.
796 796 if (
797 797 not filedata
798 798 and p1ctx.node() not in replacements
799 799 and p2ctx.node() not in replacements
800 800 ):
801 801 return
802 802
803 803 extra = ctx.extra().copy()
804 804 extra[b'fix_source'] = ctx.hex()
805 805
806 806 wctx = context.overlayworkingctx(repo)
807 807 wctx.setbase(repo[newp1node])
808 808 merge.revert_to(ctx, wc=wctx)
809 809 copies.graftcopies(wctx, ctx, ctx.p1())
810 810
811 811 for path in filedata.keys():
812 812 fctx = ctx[path]
813 813 copysource = fctx.copysource()
814 814 wctx.write(path, filedata[path], flags=fctx.flags())
815 815 if copysource:
816 816 wctx.markcopied(path, copysource)
817 817
818 desc = rewriteutil.update_hash_refs(
819 repo,
820 ctx.description(),
821 {oldnode: [newnode] for oldnode, newnode in replacements.items()},
822 )
823
818 824 memctx = wctx.tomemctx(
819 text=ctx.description(),
825 text=desc,
820 826 branch=ctx.branch(),
821 827 extra=extra,
822 828 date=ctx.date(),
823 829 parents=(newp1node, newp2node),
824 830 user=ctx.user(),
825 831 )
826 832
827 833 sucnode = memctx.commit()
828 834 prenode = ctx.node()
829 835 if prenode == sucnode:
830 836 ui.debug(b'node %s already existed\n' % (ctx.hex()))
831 837 else:
832 838 replacements[ctx.node()] = sucnode
833 839
834 840
835 841 def getfixers(ui):
836 842 """Returns a map of configured fixer tools indexed by their names
837 843
838 844 Each value is a Fixer object with methods that implement the behavior of the
839 845 fixer's config suboptions. Does not validate the config values.
840 846 """
841 847 fixers = {}
842 848 for name in fixernames(ui):
843 849 enabled = ui.configbool(b'fix', name + b':enabled')
844 850 command = ui.config(b'fix', name + b':command')
845 851 pattern = ui.config(b'fix', name + b':pattern')
846 852 linerange = ui.config(b'fix', name + b':linerange')
847 853 priority = ui.configint(b'fix', name + b':priority')
848 854 metadata = ui.configbool(b'fix', name + b':metadata')
849 855 skipclean = ui.configbool(b'fix', name + b':skipclean')
850 856 # Don't use a fixer if it has no pattern configured. It would be
851 857 # dangerous to let it affect all files. It would be pointless to let it
852 858 # affect no files. There is no reasonable subset of files to use as the
853 859 # default.
854 860 if command is None:
855 861 ui.warn(
856 862 _(b'fixer tool has no command configuration: %s\n') % (name,)
857 863 )
858 864 elif pattern is None:
859 865 ui.warn(
860 866 _(b'fixer tool has no pattern configuration: %s\n') % (name,)
861 867 )
862 868 elif not enabled:
863 869 ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
864 870 else:
865 871 fixers[name] = Fixer(
866 872 command, pattern, linerange, priority, metadata, skipclean
867 873 )
868 874 return collections.OrderedDict(
869 875 sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
870 876 )
871 877
872 878
873 879 def fixernames(ui):
874 880 """Returns the names of [fix] config options that have suboptions"""
875 881 names = set()
876 882 for k, v in ui.configitems(b'fix'):
877 883 if b':' in k:
878 884 names.add(k.split(b':', 1)[0])
879 885 return names
880 886
881 887
882 888 class Fixer(object):
883 889 """Wraps the raw config values for a fixer with methods"""
884 890
885 891 def __init__(
886 892 self, command, pattern, linerange, priority, metadata, skipclean
887 893 ):
888 894 self._command = command
889 895 self._pattern = pattern
890 896 self._linerange = linerange
891 897 self._priority = priority
892 898 self._metadata = metadata
893 899 self._skipclean = skipclean
894 900
895 901 def affects(self, opts, fixctx, path):
896 902 """Should this fixer run on the file at the given path and context?"""
897 903 repo = fixctx.repo()
898 904 matcher = matchmod.match(
899 905 repo.root, repo.root, [self._pattern], ctx=fixctx
900 906 )
901 907 return matcher(path)
902 908
903 909 def shouldoutputmetadata(self):
904 910 """Should the stdout of this fixer start with JSON and a null byte?"""
905 911 return self._metadata
906 912
907 913 def command(self, ui, path, ranges):
908 914 """A shell command to use to invoke this fixer on the given file/lines
909 915
910 916 May return None if there is no appropriate command to run for the given
911 917 parameters.
912 918 """
913 919 expand = cmdutil.rendercommandtemplate
914 920 parts = [
915 921 expand(
916 922 ui,
917 923 self._command,
918 924 {b'rootpath': path, b'basename': os.path.basename(path)},
919 925 )
920 926 ]
921 927 if self._linerange:
922 928 if self._skipclean and not ranges:
923 929 # No line ranges to fix, so don't run the fixer.
924 930 return None
925 931 for first, last in ranges:
926 932 parts.append(
927 933 expand(
928 934 ui, self._linerange, {b'first': first, b'last': last}
929 935 )
930 936 )
931 937 return b' '.join(parts)
@@ -1,521 +1,521
1 1 A script that implements uppercasing all letters in a file.
2 2
3 3 $ UPPERCASEPY="$TESTTMP/uppercase.py"
4 4 $ cat > $UPPERCASEPY <<EOF
5 5 > import sys
6 6 > from mercurial.utils.procutil import setbinary
7 7 > setbinary(sys.stdin)
8 8 > setbinary(sys.stdout)
9 9 > sys.stdout.write(sys.stdin.read().upper())
10 10 > EOF
11 11 $ TESTLINES="foo\nbar\nbaz\n"
12 12 $ printf $TESTLINES | "$PYTHON" $UPPERCASEPY
13 13 FOO
14 14 BAR
15 15 BAZ
16 16
17 17 Tests for the fix extension's behavior around non-trivial history topologies.
18 18 Looks for correct incremental fixing and reproduction of parent/child
19 19 relationships. We indicate fixed file content by uppercasing it.
20 20
21 21 $ cat >> $HGRCPATH <<EOF
22 22 > [extensions]
23 23 > fix =
24 24 > strip =
25 25 > [fix]
26 26 > uppercase-whole-file:command="$PYTHON" $UPPERCASEPY
27 27 > uppercase-whole-file:pattern=set:**
28 28 > EOF
29 29
30 30 This tests the only behavior that should really be affected by obsolescence, so
31 31 we'll test it with evolution off and on. This only changes the revision
32 32 numbers, if all is well.
33 33
34 34 #testcases obsstore-off obsstore-on
35 35 #if obsstore-on
36 36 $ cat >> $HGRCPATH <<EOF
37 37 > [experimental]
38 38 > evolution.createmarkers=True
39 39 > evolution.allowunstable=True
40 40 > EOF
41 41 #endif
42 42
43 43 Setting up the test topology. Scroll down to see the graph produced. We make it
44 44 clear which files were modified in each revision. It's enough to test at the
45 45 file granularity, because that demonstrates which baserevs were diffed against.
46 46 The computation of changed lines is orthogonal and tested separately.
47 47
48 48 $ hg init repo
49 49 $ cd repo
50 50
51 51 $ printf "aaaa\n" > a
52 52 $ hg commit -Am "change A"
53 53 adding a
54 54 $ printf "bbbb\n" > b
55 55 $ hg commit -Am "change B"
56 56 adding b
57 57 $ printf "cccc\n" > c
58 58 $ hg commit -Am "change C"
59 59 adding c
60 60 $ hg checkout 0
61 61 0 files updated, 0 files merged, 2 files removed, 0 files unresolved
62 62 $ printf "dddd\n" > d
63 63 $ hg commit -Am "change D"
64 64 adding d
65 65 created new head
66 66 $ hg merge -r 2
67 67 2 files updated, 0 files merged, 0 files removed, 0 files unresolved
68 68 (branch merge, don't forget to commit)
69 69 $ printf "eeee\n" > e
70 70 $ hg commit -Am "change E"
71 71 adding e
72 72 $ hg checkout 0
73 73 0 files updated, 0 files merged, 4 files removed, 0 files unresolved
74 74 $ printf "ffff\n" > f
75 75 $ hg commit -Am "change F"
76 76 adding f
77 77 created new head
78 78 $ hg checkout 0
79 79 0 files updated, 0 files merged, 1 files removed, 0 files unresolved
80 80 $ printf "gggg\n" > g
81 81 $ hg commit -Am "change G"
82 82 adding g
83 83 created new head
84 84 $ hg merge -r 5
85 85 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
86 86 (branch merge, don't forget to commit)
87 87 $ printf "hhhh\n" > h
88 $ hg commit -Am "change H"
88 $ hg commit -Am "change H (child of b53d63e816fb and 0e49f92ee6e9)"
89 89 adding h
90 90 $ hg merge -r 4
91 91 4 files updated, 0 files merged, 0 files removed, 0 files unresolved
92 92 (branch merge, don't forget to commit)
93 93 $ printf "iiii\n" > i
94 94 $ hg commit -Am "change I"
95 95 adding i
96 96 $ hg checkout 2
97 97 0 files updated, 0 files merged, 6 files removed, 0 files unresolved
98 98 $ printf "jjjj\n" > j
99 $ hg commit -Am "change J"
99 $ hg commit -Am "change J (child of 7f371349286e)"
100 100 adding j
101 101 created new head
102 102 $ hg checkout 7
103 103 3 files updated, 0 files merged, 3 files removed, 0 files unresolved
104 104 $ printf "kkkk\n" > k
105 105 $ hg add
106 106 adding k
107 107
108 $ hg log --graph --template '{rev} {desc}\n'
109 o 9 change J
108 $ hg log --graph --template '{rev}:{node|short} {desc}\n'
109 o 9:884041ccc490 change J (child of 7f371349286e)
110 110 |
111 | o 8 change I
111 | o 8:b7c772105fd2 change I
112 112 | |\
113 | | @ 7 change H
113 | | @ 7:4e7b9312dad2 change H (child of b53d63e816fb and 0e49f92ee6e9)
114 114 | | |\
115 | | | o 6 change G
115 | | | o 6:0e49f92ee6e9 change G
116 116 | | | |
117 | | o | 5 change F
117 | | o | 5:b53d63e816fb change F
118 118 | | |/
119 | o | 4 change E
119 | o | 4:ddad58af5e51 change E
120 120 |/| |
121 | o | 3 change D
121 | o | 3:c015ebfd2bfe change D
122 122 | |/
123 o | 2 change C
123 o | 2:7f371349286e change C
124 124 | |
125 o | 1 change B
125 o | 1:388fdd33fea0 change B
126 126 |/
127 o 0 change A
127 o 0:a55a84d97a24 change A
128 128
129 129
130 130 Fix all but the root revision and its four children.
131 131
132 132 $ hg fix -r '2|4|7|8|9' --working-dir
133 133 saved backup bundle to * (glob) (obsstore-off !)
134 134
135 135 The five revisions remain, but the other revisions were fixed and replaced. All
136 136 parent pointers have been accurately set to reproduce the previous topology
137 137 (though it is rendered in a slightly different order now).
138 138
139 139 #if obsstore-on
140 $ hg log --graph --template '{rev} {desc}\n'
141 o 14 change J
140 $ hg log --graph --template '{rev}:{node|short} {desc}\n'
141 o 14:d8d0e7974598 change J (child of 89de0da1d5da)
142 142 |
143 | o 13 change I
143 | o 13:4fc0b354461e change I
144 144 | |\
145 | | @ 12 change H
145 | | @ 12:1c45f3923443 change H (child of b53d63e816fb and 0e49f92ee6e9)
146 146 | | |\
147 | o | | 11 change E
147 | o | | 11:d75754455722 change E
148 148 |/| | |
149 o | | | 10 change C
149 o | | | 10:89de0da1d5da change C
150 150 | | | |
151 | | | o 6 change G
151 | | | o 6:0e49f92ee6e9 change G
152 152 | | | |
153 | | o | 5 change F
153 | | o | 5:b53d63e816fb change F
154 154 | | |/
155 | o / 3 change D
155 | o / 3:c015ebfd2bfe change D
156 156 | |/
157 o / 1 change B
157 o / 1:388fdd33fea0 change B
158 158 |/
159 o 0 change A
159 o 0:a55a84d97a24 change A
160 160
161 161 $ C=10
162 162 $ E=11
163 163 $ H=12
164 164 $ I=13
165 165 $ J=14
166 166 #else
167 $ hg log --graph --template '{rev} {desc}\n'
168 o 9 change J
167 $ hg log --graph --template '{rev}:{node|short} {desc}\n'
168 o 9:d8d0e7974598 change J (child of 89de0da1d5da)
169 169 |
170 | o 8 change I
170 | o 8:4fc0b354461e change I
171 171 | |\
172 | | @ 7 change H
172 | | @ 7:1c45f3923443 change H (child of b53d63e816fb and 0e49f92ee6e9)
173 173 | | |\
174 | o | | 6 change E
174 | o | | 6:d75754455722 change E
175 175 |/| | |
176 o | | | 5 change C
176 o | | | 5:89de0da1d5da change C
177 177 | | | |
178 | | | o 4 change G
178 | | | o 4:0e49f92ee6e9 change G
179 179 | | | |
180 | | o | 3 change F
180 | | o | 3:b53d63e816fb change F
181 181 | | |/
182 | o / 2 change D
182 | o / 2:c015ebfd2bfe change D
183 183 | |/
184 o / 1 change B
184 o / 1:388fdd33fea0 change B
185 185 |/
186 o 0 change A
186 o 0:a55a84d97a24 change A
187 187
188 188 $ C=5
189 189 $ E=6
190 190 $ H=7
191 191 $ I=8
192 192 $ J=9
193 193 #endif
194 194
195 195 Change C is a root of the set being fixed, so all we fix is what has changed
196 196 since its parent. That parent, change B, is its baserev.
197 197
198 198 $ hg cat -r $C 'set:**'
199 199 aaaa
200 200 bbbb
201 201 CCCC
202 202
203 203 Change E is a merge with only one parent being fixed. Its baserevs are the
204 204 unfixed parent plus the baserevs of the other parent. This evaluates to changes
205 205 B and D. We now have to decide what it means to incrementally fix a merge
206 206 commit. We choose to fix anything that has changed versus any baserev. Only the
207 207 undisturbed content of the common ancestor, change A, is unfixed.
208 208
209 209 $ hg cat -r $E 'set:**'
210 210 aaaa
211 211 BBBB
212 212 CCCC
213 213 DDDD
214 214 EEEE
215 215
216 216 Change H is a merge with neither parent being fixed. This is essentially
217 217 equivalent to the previous case because there is still only one baserev for
218 218 each parent of the merge.
219 219
220 220 $ hg cat -r $H 'set:**'
221 221 aaaa
222 222 FFFF
223 223 GGGG
224 224 HHHH
225 225
226 226 Change I is a merge that has four baserevs; two from each parent. We handle
227 227 multiple baserevs in the same way regardless of how many came from each parent.
228 228 So, fixing change I will fix any files that were not exactly the same in each
229 229 baserev.
230 230
231 231 $ hg cat -r $I 'set:**'
232 232 aaaa
233 233 BBBB
234 234 CCCC
235 235 DDDD
236 236 EEEE
237 237 FFFF
238 238 GGGG
239 239 HHHH
240 240 IIII
241 241
242 242 Change J is a simple case with one baserev, but its baserev is not its parent,
243 243 change C. Its baserev is its grandparent, change B.
244 244
245 245 $ hg cat -r $J 'set:**'
246 246 aaaa
247 247 bbbb
248 248 CCCC
249 249 JJJJ
250 250
251 251 The working copy was dirty, so it is treated much like a revision. The baserevs
252 252 for the working copy are inherited from its parent, change H, because it is
253 253 also being fixed.
254 254
255 255 $ cat *
256 256 aaaa
257 257 FFFF
258 258 GGGG
259 259 HHHH
260 260 KKKK
261 261
262 262 Change A was never a baserev because none of its children were to be fixed.
263 263
264 264 $ cd ..
265 265
266 266
267 267 Test the --source option. We only do this with obsstore on to avoid duplicating
268 268 test code. We rely on the other tests to prove that obsolescence is not an
269 269 important factor here.
270 270
271 271 #if obsstore-on
272 272 $ hg init source-arg
273 273 $ cd source-arg
274 274 $ printf "aaaa\n" > a
275 275 $ hg commit -Am "change A"
276 276 adding a
277 277 $ printf "bbbb\n" > b
278 278 $ hg commit -Am "change B"
279 279 adding b
280 280 $ printf "cccc\n" > c
281 281 $ hg commit -Am "change C"
282 282 adding c
283 283 $ hg checkout 0
284 284 0 files updated, 0 files merged, 2 files removed, 0 files unresolved
285 285 $ printf "dddd\n" > d
286 286 $ hg commit -Am "change D"
287 287 adding d
288 288 created new head
289 289 $ hg log --graph --template '{rev} {desc}\n'
290 290 @ 3 change D
291 291 |
292 292 | o 2 change C
293 293 | |
294 294 | o 1 change B
295 295 |/
296 296 o 0 change A
297 297
298 298
299 299 Test passing 'wdir()' to --source
300 300 $ printf "xxxx\n" > x
301 301 $ hg add x
302 302 $ hg fix -s 'wdir()'
303 303 $ cat *
304 304 aaaa
305 305 dddd
306 306 XXXX
307 307
308 308 Test passing '.' to --source
309 309 $ printf "xxxx\n" > x
310 310 $ hg fix -s .
311 311 $ hg log --graph --template '{rev} {desc}\n'
312 312 @ 4 change D
313 313 |
314 314 | o 2 change C
315 315 | |
316 316 | o 1 change B
317 317 |/
318 318 o 0 change A
319 319
320 320 $ cat *
321 321 aaaa
322 322 DDDD
323 323 XXXX
324 324 $ hg strip -qf 4
325 325 $ hg co -q 3
326 326
327 327 Test passing other branch to --source
328 328 $ printf "xxxx\n" > x
329 329 $ hg add x
330 330 $ hg fix -s 2
331 331 $ hg log --graph --template '{rev} {desc}\n'
332 332 o 4 change C
333 333 |
334 334 | @ 3 change D
335 335 | |
336 336 o | 1 change B
337 337 |/
338 338 o 0 change A
339 339
340 340 $ hg cat -r 4 b c
341 341 bbbb
342 342 CCCC
343 343 $ cat *
344 344 aaaa
345 345 dddd
346 346 xxxx
347 347 $ hg strip -qf 4
348 348
349 349 Test passing multiple revisions to --source
350 350 $ hg fix -s '2 + .'
351 351 $ hg log --graph --template '{rev} {desc}\n'
352 352 @ 5 change D
353 353 |
354 354 | o 4 change C
355 355 | |
356 356 | o 1 change B
357 357 |/
358 358 o 0 change A
359 359
360 360 $ hg cat -r 4 b c
361 361 bbbb
362 362 CCCC
363 363 $ cat *
364 364 aaaa
365 365 DDDD
366 366 XXXX
367 367
368 368 $ cd ..
369 369 #endif
370 370
371 371 The --all flag should fix anything that wouldn't cause a problem if you fixed
372 372 it, including the working copy. Obsolete revisions are not fixed because that
373 373 could cause divergence. Public revisions would cause an abort because they are
374 374 immutable. We can fix orphans because their successors are still just orphans
375 375 of the original obsolete parent. When obsolescence is off, we're just fixing and
376 376 replacing anything that isn't public.
377 377
378 378 $ hg init fixall
379 379 $ cd fixall
380 380 $ hg fix --all --working-dir
381 381 abort: cannot specify both --working-dir and --all
382 382 [255]
383 383
384 384 #if obsstore-on
385 385 $ printf "one\n" > foo.whole
386 386 $ hg commit -Aqm "first"
387 387 $ hg phase --public
388 388 $ hg tag --local root
389 389 $ printf "two\n" > foo.whole
390 390 $ hg commit -m "second"
391 391 $ printf "three\n" > foo.whole
392 392 $ hg commit -m "third" --secret
393 393 $ hg tag --local secret
394 394 $ hg checkout root
395 395 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
396 396 $ printf "four\n" > foo.whole
397 397 $ hg commit -m "fourth"
398 398 created new head
399 399 $ printf "five\n" > foo.whole
400 400 $ hg commit -m "fifth"
401 401 $ hg tag --local replaced
402 402 $ printf "six\n" > foo.whole
403 403 $ hg commit -m "sixth"
404 404 $ hg checkout replaced
405 405 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
406 406 $ printf "seven\n" > foo.whole
407 407 $ hg commit --amend
408 408 1 new orphan changesets
409 409 $ hg checkout secret
410 410 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
411 411 $ printf "uncommitted\n" > foo.whole
412 412
413 413 $ hg log --graph --template '{rev} {desc} {phase}\n'
414 414 o 6 fifth draft
415 415 |
416 416 | * 5 sixth draft
417 417 | |
418 418 | x 4 fifth draft
419 419 |/
420 420 o 3 fourth draft
421 421 |
422 422 | @ 2 third secret
423 423 | |
424 424 | o 1 second draft
425 425 |/
426 426 o 0 first public
427 427
428 428
429 429 $ hg fix --all
430 430
431 431 $ hg log --graph --template '{rev} {desc}\n' -r 'sort(all(), topo)' --hidden
432 432 o 11 fifth
433 433 |
434 434 o 9 fourth
435 435 |
436 436 | @ 8 third
437 437 | |
438 438 | o 7 second
439 439 |/
440 440 | * 10 sixth
441 441 | |
442 442 | | x 5 sixth
443 443 | |/
444 444 | x 4 fifth
445 445 | |
446 446 | | x 6 fifth
447 447 | |/
448 448 | x 3 fourth
449 449 |/
450 450 | x 2 third
451 451 | |
452 452 | x 1 second
453 453 |/
454 454 o 0 first
455 455
456 456
457 457 $ hg cat -r 7 foo.whole
458 458 TWO
459 459 $ hg cat -r 8 foo.whole
460 460 THREE
461 461 $ hg cat -r 9 foo.whole
462 462 FOUR
463 463 $ hg cat -r 10 foo.whole
464 464 SIX
465 465 $ hg cat -r 11 foo.whole
466 466 SEVEN
467 467 $ cat foo.whole
468 468 UNCOMMITTED
469 469 #else
470 470 $ printf "one\n" > foo.whole
471 471 $ hg commit -Aqm "first"
472 472 $ hg phase --public
473 473 $ hg tag --local root
474 474 $ printf "two\n" > foo.whole
475 475 $ hg commit -m "second"
476 476 $ printf "three\n" > foo.whole
477 477 $ hg commit -m "third" --secret
478 478 $ hg tag --local secret
479 479 $ hg checkout root
480 480 1 files updated, 0 files merged, 0 files removed, 0 files unresolved
481 481 $ printf "four\n" > foo.whole
482 482 $ hg commit -m "fourth"
483 483 created new head
484 484 $ printf "uncommitted\n" > foo.whole
485 485
486 486 $ hg log --graph --template '{rev} {desc} {phase}\n'
487 487 @ 3 fourth draft
488 488 |
489 489 | o 2 third secret
490 490 | |
491 491 | o 1 second draft
492 492 |/
493 493 o 0 first public
494 494
495 495
496 496 $ hg fix --all
497 497 saved backup bundle to * (glob)
498 498
499 499 $ hg log --graph --template '{rev} {desc} {phase}\n'
500 500 @ 3 fourth draft
501 501 |
502 502 | o 2 third secret
503 503 | |
504 504 | o 1 second draft
505 505 |/
506 506 o 0 first public
507 507
508 508 $ hg cat -r 0 foo.whole
509 509 one
510 510 $ hg cat -r 1 foo.whole
511 511 TWO
512 512 $ hg cat -r 2 foo.whole
513 513 THREE
514 514 $ hg cat -r 3 foo.whole
515 515 FOUR
516 516 $ cat foo.whole
517 517 UNCOMMITTED
518 518 #endif
519 519
520 520 $ cd ..
521 521