fix: use obsolete.isenabled() to check for experimental.allowdivergence...
av6 -
r48598:e69c82bf stable
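The change below swaps a direct `ui.configbool(b'experimental', b'evolution.allowdivergence')` read for Mercurial's `obsolete.isenabled(repo, obsolete.allowdivergenceopt)` helper, so divergence is only "allowed" when evolution itself is enabled. A self-contained sketch of that behavioral difference; the `UI`/`Repo` classes and this `isenabled` are simplified stand-ins, not Mercurial's real implementations:

```python
# Simplified stand-ins for Mercurial's ui/repo objects; NOT the real API.
class UI:
    def __init__(self, config):
        self._config = config

    def configbool(self, section, name):
        # Old approach: read the boolean directly, ignoring whether
        # evolution is enabled at all.
        return bool(self._config.get((section, name), False))

    def configlist(self, section, name):
        return self._config.get((section, name), [])


class Repo:
    def __init__(self, ui):
        self.ui = ui


ALLOWDIVERGENCEOPT = 'allowdivergence'  # mirrors obsolete.allowdivergenceopt


def isenabled(repo, option):
    # Sketch of the new approach: an option only counts as enabled when it
    # appears in the experimental.evolution list (or 'all' is set), so
    # divergence is never allowed in a repo where evolution is off.
    enabled = set(repo.ui.configlist('experimental', 'evolution'))
    return 'all' in enabled or option in enabled


# Evolution disabled entirely: the raw config read still says True.
repo = Repo(UI({('experimental', 'evolution.allowdivergence'): True}))
print(repo.ui.configbool('experimental', 'evolution.allowdivergence'))  # True
print(isenabled(repo, ALLOWDIVERGENCEOPT))                              # False

# Evolution enabled with divergence allowed: both checks agree.
repo = Repo(UI({('experimental', 'evolution'): ['createmarkers', 'allowdivergence']}))
print(isenabled(repo, ALLOWDIVERGENCEOPT))                              # True
```

The practical effect is visible in the first case: with evolution off, the old check could still report divergence as permitted, while the new check cannot.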
@@ -1,942 +1,940 @@
1 1 # fix - rewrite file content in changesets and working copy
2 2 #
3 3 # Copyright 2018 Google LLC.
4 4 #
5 5 # This software may be used and distributed according to the terms of the
6 6 # GNU General Public License version 2 or any later version.
7 7 """rewrite file content in changesets or working copy (EXPERIMENTAL)
8 8
9 9 Provides a command that runs configured tools on the contents of modified files,
10 10 writing back any fixes to the working copy or replacing changesets.
11 11
12 12 Here is an example configuration that causes :hg:`fix` to apply automatic
13 13 formatting fixes to modified lines in C++ code::
14 14
15 15 [fix]
16 16 clang-format:command=clang-format --assume-filename={rootpath}
17 17 clang-format:linerange=--lines={first}:{last}
18 18 clang-format:pattern=set:**.cpp or **.hpp
19 19
20 20 The :command suboption forms the first part of the shell command that will be
21 21 used to fix a file. The content of the file is passed on standard input, and the
22 22 fixed file content is expected on standard output. Any output on standard error
23 23 will be displayed as a warning. If the exit status is not zero, the file will
24 24 not be affected. A placeholder warning is displayed if there is a non-zero exit
25 25 status but no standard error output. Some values may be substituted into the
26 26 command::
27 27
28 28 {rootpath} The path of the file being fixed, relative to the repo root
29 29 {basename} The name of the file being fixed, without the directory path
30 30
31 31 If the :linerange suboption is set, the tool will only be run if there are
32 32 changed lines in a file. The value of this suboption is appended to the shell
33 33 command once for every range of changed lines in the file. Some values may be
34 34 substituted into the command::
35 35
36 36 {first} The 1-based line number of the first line in the modified range
37 37 {last} The 1-based line number of the last line in the modified range
38 38
39 39 Deleted sections of a file will be ignored by :linerange, because there is no
40 40 corresponding line range in the version being fixed.
41 41
42 42 By default, tools that set :linerange will only be executed if there is at least
43 43 one changed line range. This is meant to prevent accidents like running a code
44 44 formatter in such a way that it unexpectedly reformats the whole file. If such a
45 45 tool needs to operate on unchanged files, it should set the :skipclean suboption
46 46 to false.
47 47
48 48 The :pattern suboption determines which files will be passed through each
49 49 configured tool. See :hg:`help patterns` for possible values. However, all
50 50 patterns are relative to the repo root, even if that text says they are relative
51 51 to the current working directory. If there are file arguments to :hg:`fix`, the
52 52 intersection of these patterns is used.
53 53
54 54 There is also a configurable limit for the maximum size of file that will be
55 55 processed by :hg:`fix`::
56 56
57 57 [fix]
58 58 maxfilesize = 2MB
59 59
60 60 Normally, execution of configured tools will continue after a failure (indicated
61 61 by a non-zero exit status). It can also be configured to abort after the first
62 62 such failure, so that no files will be affected if any tool fails. This abort
63 63 will also cause :hg:`fix` to exit with a non-zero status::
64 64
65 65 [fix]
66 66 failure = abort
67 67
68 68 When multiple tools are configured to affect a file, they execute in an order
69 69 defined by the :priority suboption. The priority suboption has a default value
70 70 of zero for each tool. Tools are executed in order of descending priority. The
71 71 execution order of tools with equal priority is unspecified. For example, you
72 72 could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
73 73 in a text file by ensuring that 'sort' runs before 'head'::
74 74
75 75 [fix]
76 76 sort:command = sort -n
77 77 head:command = head -n 10
78 78 sort:pattern = numbers.txt
79 79 head:pattern = numbers.txt
80 80 sort:priority = 2
81 81 head:priority = 1
82 82
83 83 To account for changes made by each tool, the line numbers used for incremental
84 84 formatting are recomputed before executing the next tool. So, each tool may see
85 85 different values for the arguments added by the :linerange suboption.
86 86
87 87 Each fixer tool is allowed to return some metadata in addition to the fixed file
88 88 content. The metadata must be placed before the file content on stdout,
89 89 separated from the file content by a zero byte. The metadata is parsed as a JSON
90 90 value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
91 91 is expected to produce this metadata encoding if and only if the :metadata
92 92 suboption is true::
93 93
94 94 [fix]
95 95 tool:command = tool --prepend-json-metadata
96 96 tool:metadata = true
97 97
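The docstring above specifies the :metadata wire format: a JSON document, a zero byte, then the fixed file content, all on stdout. A minimal sketch of the tool side of that protocol; `run_tool`, the trailing-whitespace fix, and the `lines_changed` key are illustrative inventions, not part of any shipped tool:

```python
import json


def run_tool(data):
    """Return stdout bytes for a fixer honoring the :metadata protocol:
    JSON metadata, a zero byte, then the fixed content."""
    lines = data.split(b'\n')
    # Illustrative fix: strip trailing whitespace from every line.
    fixed_lines = [line.rstrip() for line in lines]
    changed = sum(1 for a, b in zip(lines, fixed_lines) if a != b)
    metadata = json.dumps({'lines_changed': changed}).encode('utf-8')
    return metadata + b'\0' + b'\n'.join(fixed_lines)


out = run_tool(b'keep  \nclean\n')
# The consumer splits on the first zero byte, as fixfile() does with
# stdout.split(b'\0', 1).
metadatajson, newdata = out.split(b'\0', 1)
print(json.loads(metadatajson))  # {'lines_changed': 1}
print(newdata)                   # b'keep\nclean\n'
```

Because the metadata must parse as JSON, it should be UTF-8 and contain no zero bytes, matching the constraint stated above.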
98 98 The metadata values are passed to hooks, which can be used to print summaries or
99 99 perform other post-fixing work. The supported hooks are::
100 100
101 101 "postfixfile"
102 102 Run once for each file in each revision where any fixer tools made changes
103 103 to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
104 104 and "$HG_METADATA" with a map of fixer names to metadata values from fixer
105 105 tools that affected the file. Fixer tools that didn't affect the file have a
106 106 value of None. Only fixer tools that executed are present in the metadata.
107 107
108 108 "postfix"
109 109 Run once after all files and revisions have been handled. Provides
110 110 "$HG_REPLACEMENTS" with information about what revisions were created and
111 111 made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
112 112 files in the working copy were updated. Provides a list "$HG_METADATA"
113 113 mapping fixer tool names to lists of metadata values returned from
114 114 executions that modified a file. This aggregates the same metadata
115 115 previously passed to the "postfixfile" hook.
116 116
117 117 Fixer tools are run in the repository's root directory. This allows them to read
118 118 configuration files from the working copy, or even write to the working copy.
119 119 The working copy is not updated to match the revision being fixed. In fact,
120 120 several revisions may be fixed in parallel. Writes to the working copy are not
121 121 amended into the revision being fixed; fixer tools should always write fixed
122 122 file content back to stdout as documented above.
123 123 """
124 124
125 125 from __future__ import absolute_import
126 126
127 127 import collections
128 128 import itertools
129 129 import os
130 130 import re
131 131 import subprocess
132 132
133 133 from mercurial.i18n import _
134 134 from mercurial.node import (
135 135 nullid,
136 136 nullrev,
137 137 wdirrev,
138 138 )
139 139
140 140 from mercurial.utils import procutil
141 141
142 142 from mercurial import (
143 143 cmdutil,
144 144 context,
145 145 copies,
146 146 error,
147 147 match as matchmod,
148 148 mdiff,
149 149 merge,
150 150 mergestate as mergestatemod,
151 obsolete,
151 152 pycompat,
152 153 registrar,
153 154 rewriteutil,
154 155 scmutil,
155 156 util,
156 157 worker,
157 158 )
158 159
159 160 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
160 161 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
161 162 # be specifying the version(s) of Mercurial they are tested with, or
162 163 # leave the attribute unspecified.
163 164 testedwith = b'ships-with-hg-core'
164 165
165 166 cmdtable = {}
166 167 command = registrar.command(cmdtable)
167 168
168 169 configtable = {}
169 170 configitem = registrar.configitem(configtable)
170 171
171 172 # Register the suboptions allowed for each configured fixer, and default values.
172 173 FIXER_ATTRS = {
173 174 b'command': None,
174 175 b'linerange': None,
175 176 b'pattern': None,
176 177 b'priority': 0,
177 178 b'metadata': False,
178 179 b'skipclean': True,
179 180 b'enabled': True,
180 181 }
181 182
182 183 for key, default in FIXER_ATTRS.items():
183 184 configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)
184 185
185 186 # A good default size allows most source code files to be fixed, but avoids
186 187 # letting fixer tools choke on huge inputs, which could be surprising to the
187 188 # user.
188 189 configitem(b'fix', b'maxfilesize', default=b'2MB')
189 190
190 191 # Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
191 192 # This helps users write shell scripts that stop when a fixer tool signals a
192 193 # problem.
193 194 configitem(b'fix', b'failure', default=b'continue')
194 195
195 196
196 197 def checktoolfailureaction(ui, message, hint=None):
197 198 """Abort with 'message' if fix.failure=abort"""
198 199 action = ui.config(b'fix', b'failure')
199 200 if action not in (b'continue', b'abort'):
200 201 raise error.Abort(
201 202 _(b'unknown fix.failure action: %s') % (action,),
202 203 hint=_(b'use "continue" or "abort"'),
203 204 )
204 205 if action == b'abort':
205 206 raise error.Abort(message, hint=hint)
206 207
207 208
208 209 allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
209 210 baseopt = (
210 211 b'',
211 212 b'base',
212 213 [],
213 214 _(
214 215 b'revisions to diff against (overrides automatic '
215 216 b'selection, and applies to every revision being '
216 217 b'fixed)'
217 218 ),
218 219 _(b'REV'),
219 220 )
220 221 revopt = (b'r', b'rev', [], _(b'revisions to fix (ADVANCED)'), _(b'REV'))
221 222 sourceopt = (
222 223 b's',
223 224 b'source',
224 225 [],
225 226 _(b'fix the specified revisions and their descendants'),
226 227 _(b'REV'),
227 228 )
228 229 wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
229 230 wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
230 231 usage = _(b'[OPTION]... [FILE]...')
231 232
232 233
233 234 @command(
234 235 b'fix',
235 236 [allopt, baseopt, revopt, sourceopt, wdiropt, wholeopt],
236 237 usage,
237 238 helpcategory=command.CATEGORY_FILE_CONTENTS,
238 239 )
239 240 def fix(ui, repo, *pats, **opts):
240 241 """rewrite file content in changesets or working directory
241 242
242 243 Runs any configured tools to fix the content of files. Only affects files
243 244 with changes, unless file arguments are provided. Only affects changed lines
244 245 of files, unless the --whole flag is used. Some tools may always affect the
245 246 whole file regardless of --whole.
246 247
247 248 If --working-dir is used, files with uncommitted changes in the working copy
248 249 will be fixed. Note that no backups are made.
249 250
250 251 If revisions are specified with --source, those revisions and their
251 252 descendants will be checked, and they may be replaced with new revisions
252 253 that have fixed file content. By automatically including the descendants,
253 254 no merging, rebasing, or evolution will be required. If an ancestor of the
254 255 working copy is included, then the working copy itself will also be fixed,
255 256 and the working copy will be updated to the fixed parent.
256 257
257 258 When determining what lines of each file to fix at each revision, the whole
258 259 set of revisions being fixed is considered, so that fixes to earlier
259 260 revisions are not forgotten in later ones. The --base flag can be used to
260 261 override this default behavior, though it is not usually desirable to do so.
261 262 """
262 263 opts = pycompat.byteskwargs(opts)
263 264 cmdutil.check_at_most_one_arg(opts, b'all', b'source', b'rev')
264 265 cmdutil.check_incompatible_arguments(
265 266 opts, b'working_dir', [b'all', b'source']
266 267 )
267 268
268 269 with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
269 270 revstofix = getrevstofix(ui, repo, opts)
270 271 basectxs = getbasectxs(repo, opts, revstofix)
271 272 workqueue, numitems = getworkqueue(
272 273 ui, repo, pats, opts, revstofix, basectxs
273 274 )
274 275 basepaths = getbasepaths(repo, opts, workqueue, basectxs)
275 276 fixers = getfixers(ui)
276 277
277 278 # Rather than letting each worker independently fetch the files
278 279 # (which also would add complications for shared/keepalive
279 280 # connections), prefetch them all first.
280 281 _prefetchfiles(repo, workqueue, basepaths)
281 282
282 283 # There are no data dependencies between the workers fixing each file
283 284 # revision, so we can use all available parallelism.
284 285 def getfixes(items):
285 286 for rev, path in items:
286 287 ctx = repo[rev]
287 288 olddata = ctx[path].data()
288 289 metadata, newdata = fixfile(
289 290 ui, repo, opts, fixers, ctx, path, basepaths, basectxs[rev]
290 291 )
291 292 # Don't waste memory/time passing unchanged content back, but
292 293 # produce one result per item either way.
293 294 yield (
294 295 rev,
295 296 path,
296 297 metadata,
297 298 newdata if newdata != olddata else None,
298 299 )
299 300
300 301 results = worker.worker(
301 302 ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
302 303 )
303 304
304 305 # We have to hold on to the data for each successor revision in memory
305 306 # until all its parents are committed. We ensure this by committing and
306 307 # freeing memory for the revisions in some topological order. This
307 308 # leaves a little bit of memory efficiency on the table, but also makes
308 309 # the tests deterministic. It might also be considered a feature since
309 310 # it makes the results more easily reproducible.
310 311 filedata = collections.defaultdict(dict)
311 312 aggregatemetadata = collections.defaultdict(list)
312 313 replacements = {}
313 314 wdirwritten = False
314 315 commitorder = sorted(revstofix, reverse=True)
315 316 with ui.makeprogress(
316 317 topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
317 318 ) as progress:
318 319 for rev, path, filerevmetadata, newdata in results:
319 320 progress.increment(item=path)
320 321 for fixername, fixermetadata in filerevmetadata.items():
321 322 aggregatemetadata[fixername].append(fixermetadata)
322 323 if newdata is not None:
323 324 filedata[rev][path] = newdata
324 325 hookargs = {
325 326 b'rev': rev,
326 327 b'path': path,
327 328 b'metadata': filerevmetadata,
328 329 }
329 330 repo.hook(
330 331 b'postfixfile',
331 332 throw=False,
332 333 **pycompat.strkwargs(hookargs)
333 334 )
334 335 numitems[rev] -= 1
335 336 # Apply the fixes for this and any other revisions that are
336 337 # ready and sitting at the front of the queue. Using a loop here
337 338 # prevents the queue from being blocked by the first revision to
338 339 # be ready out of order.
339 340 while commitorder and not numitems[commitorder[-1]]:
340 341 rev = commitorder.pop()
341 342 ctx = repo[rev]
342 343 if rev == wdirrev:
343 344 writeworkingdir(repo, ctx, filedata[rev], replacements)
344 345 wdirwritten = bool(filedata[rev])
345 346 else:
346 347 replacerev(ui, repo, ctx, filedata[rev], replacements)
347 348 del filedata[rev]
348 349
349 350 cleanup(repo, replacements, wdirwritten)
350 351 hookargs = {
351 352 b'replacements': replacements,
352 353 b'wdirwritten': wdirwritten,
353 354 b'metadata': aggregatemetadata,
354 355 }
355 356 repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))
356 357
357 358
358 359 def cleanup(repo, replacements, wdirwritten):
359 360 """Calls scmutil.cleanupnodes() with the given replacements.
360 361
361 362 "replacements" is a dict from nodeid to nodeid, with one key and one value
362 363 for every revision that was affected by fixing. This is slightly different
363 364 from cleanupnodes().
364 365
365 366 "wdirwritten" is a bool which tells whether the working copy was affected by
366 367 fixing, since it has no entry in "replacements".
367 368
368 369 Useful as a hook point for extending "hg fix" with output summarizing the
369 370 effects of the command, though we choose not to output anything here.
370 371 """
371 372 replacements = {
372 373 prec: [succ] for prec, succ in pycompat.iteritems(replacements)
373 374 }
374 375 scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)
375 376
376 377
377 378 def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
378 379 """Constructs the list of files to be fixed at specific revisions
379 380
380 381 It is up to the caller how to consume the work items, and the only
381 382 dependence between them is that replacement revisions must be committed in
382 383 topological order. Each work item represents a file in the working copy or
383 384 in some revision that should be fixed and written back to the working copy
384 385 or into a replacement revision.
385 386
386 387 Work items for the same revision are grouped together, so that a worker
387 388 pool starting with the first N items in parallel is likely to finish the
388 389 first revision's work before other revisions. This can allow us to write
389 390 the result to disk and reduce memory footprint. At time of writing, the
390 391 partition strategy in worker.py seems favorable to this. We also sort the
391 392 items by ascending revision number to match the order in which we commit
392 393 the fixes later.
393 394 """
394 395 workqueue = []
395 396 numitems = collections.defaultdict(int)
396 397 maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
397 398 for rev in sorted(revstofix):
398 399 fixctx = repo[rev]
399 400 match = scmutil.match(fixctx, pats, opts)
400 401 for path in sorted(
401 402 pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
402 403 ):
403 404 fctx = fixctx[path]
404 405 if fctx.islink():
405 406 continue
406 407 if fctx.size() > maxfilesize:
407 408 ui.warn(
408 409 _(b'ignoring file larger than %s: %s\n')
409 410 % (util.bytecount(maxfilesize), path)
410 411 )
411 412 continue
412 413 workqueue.append((rev, path))
413 414 numitems[rev] += 1
414 415 return workqueue, numitems
415 416
416 417
417 418 def getrevstofix(ui, repo, opts):
418 419 """Returns the set of revision numbers that should be fixed"""
419 420 if opts[b'all']:
420 421 revs = repo.revs(b'(not public() and not obsolete()) or wdir()')
421 422 elif opts[b'source']:
422 423 source_revs = scmutil.revrange(repo, opts[b'source'])
423 424 revs = set(repo.revs(b'(%ld::) - obsolete()', source_revs))
424 425 if wdirrev in source_revs:
425 426 # `wdir()::` is currently empty, so manually add wdir
426 427 revs.add(wdirrev)
427 428 if repo[b'.'].rev() in revs:
428 429 revs.add(wdirrev)
429 430 else:
430 431 revs = set(scmutil.revrange(repo, opts[b'rev']))
431 432 if opts.get(b'working_dir'):
432 433 revs.add(wdirrev)
433 434 for rev in revs:
434 435 checkfixablectx(ui, repo, repo[rev])
435 436 # Allow fixing only wdir() even if there's an unfinished operation
436 437 if not (len(revs) == 1 and wdirrev in revs):
437 438 cmdutil.checkunfinished(repo)
438 439 rewriteutil.precheck(repo, revs, b'fix')
439 440 if (
440 441 wdirrev in revs
441 442 and mergestatemod.mergestate.read(repo).unresolvedcount()
442 443 ):
443 444 raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
444 445 if not revs:
445 446 raise error.Abort(
446 447 b'no changesets specified', hint=b'use --source or --working-dir'
447 448 )
448 449 return revs
449 450
450 451
451 452 def checkfixablectx(ui, repo, ctx):
452 453 """Aborts if the revision shouldn't be replaced with a fixed one."""
453 454 if ctx.obsolete():
454 455 # It would be better to actually check if the revision has a successor.
455 allowdivergence = ui.configbool(
456 b'experimental', b'evolution.allowdivergence'
457 )
458 if not allowdivergence:
456 if not obsolete.isenabled(repo, obsolete.allowdivergenceopt):
459 457 raise error.Abort(
460 458 b'fixing obsolete revision could cause divergence'
461 459 )
462 460
463 461
464 462 def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
465 463 """Returns the set of files that should be fixed in a context
466 464
467 465 The result depends on the base contexts; we include any file that has
468 466 changed relative to any of the base contexts. Base contexts should be
469 467 ancestors of the context being fixed.
470 468 """
471 469 files = set()
472 470 for basectx in basectxs:
473 471 stat = basectx.status(
474 472 fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
475 473 )
476 474 files.update(
477 475 set(
478 476 itertools.chain(
479 477 stat.added, stat.modified, stat.clean, stat.unknown
480 478 )
481 479 )
482 480 )
483 481 return files
484 482
485 483
486 484 def lineranges(opts, path, basepaths, basectxs, fixctx, content2):
487 485 """Returns the set of line ranges that should be fixed in a file
488 486
489 487 Of the form [(10, 20), (30, 40)].
490 488
491 489 This depends on the given base contexts; we must consider lines that have
492 490 changed versus any of the base contexts, and whether the file has been
493 491 renamed versus any of them.
494 492
495 493 Another way to understand this is that we exclude line ranges that are
496 494 common to the file in all base contexts.
497 495 """
498 496 if opts.get(b'whole'):
499 497 # Return a range containing all lines. Rely on the diff implementation's
500 498 # idea of how many lines are in the file, instead of reimplementing it.
501 499 return difflineranges(b'', content2)
502 500
503 501 rangeslist = []
504 502 for basectx in basectxs:
505 503 basepath = basepaths.get((basectx.rev(), fixctx.rev(), path), path)
506 504
507 505 if basepath in basectx:
508 506 content1 = basectx[basepath].data()
509 507 else:
510 508 content1 = b''
511 509 rangeslist.extend(difflineranges(content1, content2))
512 510 return unionranges(rangeslist)
513 511
514 512
515 513 def getbasepaths(repo, opts, workqueue, basectxs):
516 514 if opts.get(b'whole'):
517 515 # Base paths will never be fetched for line range determination.
518 516 return {}
519 517
520 518 basepaths = {}
521 519 for rev, path in workqueue:
522 520 fixctx = repo[rev]
523 521 for basectx in basectxs[rev]:
524 522 basepath = copies.pathcopies(basectx, fixctx).get(path, path)
525 523 if basepath in basectx:
526 524 basepaths[(basectx.rev(), fixctx.rev(), path)] = basepath
527 525 return basepaths
528 526
529 527
530 528 def unionranges(rangeslist):
531 529 """Return the union of some closed intervals
532 530
533 531 >>> unionranges([])
534 532 []
535 533 >>> unionranges([(1, 100)])
536 534 [(1, 100)]
537 535 >>> unionranges([(1, 100), (1, 100)])
538 536 [(1, 100)]
539 537 >>> unionranges([(1, 100), (2, 100)])
540 538 [(1, 100)]
541 539 >>> unionranges([(1, 99), (1, 100)])
542 540 [(1, 100)]
543 541 >>> unionranges([(1, 100), (40, 60)])
544 542 [(1, 100)]
545 543 >>> unionranges([(1, 49), (50, 100)])
546 544 [(1, 100)]
547 545 >>> unionranges([(1, 48), (50, 100)])
548 546 [(1, 48), (50, 100)]
549 547 >>> unionranges([(1, 2), (3, 4), (5, 6)])
550 548 [(1, 6)]
551 549 """
552 550 rangeslist = sorted(set(rangeslist))
553 551 unioned = []
554 552 if rangeslist:
555 553 unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
556 554 for a, b in rangeslist:
557 555 c, d = unioned[-1]
558 556 if a > d + 1:
559 557 unioned.append((a, b))
560 558 else:
561 559 unioned[-1] = (c, max(b, d))
562 560 return unioned
563 561
564 562
565 563 def difflineranges(content1, content2):
566 564 """Return list of line number ranges in content2 that differ from content1.
567 565
568 566 Line numbers are 1-based. The numbers are the first and last line contained
569 567 in the range. Single-line ranges have the same line number for the first and
570 568 last line. Excludes any empty ranges that result from lines that are only
571 569 present in content1. Relies on mdiff's idea of where the line endings are in
572 570 the string.
573 571
574 572 >>> from mercurial import pycompat
575 573 >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
576 574 >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
577 575 >>> difflineranges2(b'', b'')
578 576 []
579 577 >>> difflineranges2(b'a', b'')
580 578 []
581 579 >>> difflineranges2(b'', b'A')
582 580 [(1, 1)]
583 581 >>> difflineranges2(b'a', b'a')
584 582 []
585 583 >>> difflineranges2(b'a', b'A')
586 584 [(1, 1)]
587 585 >>> difflineranges2(b'ab', b'')
588 586 []
589 587 >>> difflineranges2(b'', b'AB')
590 588 [(1, 2)]
591 589 >>> difflineranges2(b'abc', b'ac')
592 590 []
593 591 >>> difflineranges2(b'ab', b'aCb')
594 592 [(2, 2)]
595 593 >>> difflineranges2(b'abc', b'aBc')
596 594 [(2, 2)]
597 595 >>> difflineranges2(b'ab', b'AB')
598 596 [(1, 2)]
599 597 >>> difflineranges2(b'abcde', b'aBcDe')
600 598 [(2, 2), (4, 4)]
601 599 >>> difflineranges2(b'abcde', b'aBCDe')
602 600 [(2, 4)]
603 601 """
604 602 ranges = []
605 603 for lines, kind in mdiff.allblocks(content1, content2):
606 604 firstline, lastline = lines[2:4]
607 605 if kind == b'!' and firstline != lastline:
608 606 ranges.append((firstline + 1, lastline))
609 607 return ranges
610 608
611 609
612 610 def getbasectxs(repo, opts, revstofix):
613 611 """Returns a map of the base contexts for each revision
614 612
615 613 The base contexts determine which lines are considered modified when we
616 614 attempt to fix just the modified lines in a file. It also determines which
617 615 files we attempt to fix, so it is important to compute this even when
618 616 --whole is used.
619 617 """
620 618 # The --base flag overrides the usual logic, and we give every revision
621 619 # exactly the set of baserevs that the user specified.
622 620 if opts.get(b'base'):
623 621 baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
624 622 if not baserevs:
625 623 baserevs = {nullrev}
626 624 basectxs = {repo[rev] for rev in baserevs}
627 625 return {rev: basectxs for rev in revstofix}
628 626
629 627 # Proceed in topological order so that we can easily determine each
630 628 # revision's baserevs by looking at its parents and their baserevs.
631 629 basectxs = collections.defaultdict(set)
632 630 for rev in sorted(revstofix):
633 631 ctx = repo[rev]
634 632 for pctx in ctx.parents():
635 633 if pctx.rev() in basectxs:
636 634 basectxs[rev].update(basectxs[pctx.rev()])
637 635 else:
638 636 basectxs[rev].add(pctx)
639 637 return basectxs
640 638
641 639
642 640 def _prefetchfiles(repo, workqueue, basepaths):
643 641 toprefetch = set()
644 642
645 643 # Prefetch the files that will be fixed.
646 644 for rev, path in workqueue:
647 645 if rev == wdirrev:
648 646 continue
649 647 toprefetch.add((rev, path))
650 648
651 649 # Prefetch the base contents for lineranges().
652 650 for (baserev, fixrev, path), basepath in basepaths.items():
653 651 toprefetch.add((baserev, basepath))
654 652
655 653 if toprefetch:
656 654 scmutil.prefetchfiles(
657 655 repo,
658 656 [
659 657 (rev, scmutil.matchfiles(repo, [path]))
660 658 for rev, path in toprefetch
661 659 ],
662 660 )
663 661
664 662
665 663 def fixfile(ui, repo, opts, fixers, fixctx, path, basepaths, basectxs):
666 664 """Run any configured fixers that should affect the file in this context
667 665
668 666 Returns the file content that results from applying the fixers in some order
669 667 starting with the file's content in the fixctx. Fixers that support line
670 668 ranges will affect lines that have changed relative to any of the basectxs
671 669 (i.e. they will only avoid lines that are common to all basectxs).
672 670
673 671 A fixer tool's stdout will become the file's new content if and only if it
674 672 exits with code zero. The fixer tool's working directory is the repository's
675 673 root.
676 674 """
677 675 metadata = {}
678 676 newdata = fixctx[path].data()
679 677 for fixername, fixer in pycompat.iteritems(fixers):
680 678 if fixer.affects(opts, fixctx, path):
681 679 ranges = lineranges(
682 680 opts, path, basepaths, basectxs, fixctx, newdata
683 681 )
684 682 command = fixer.command(ui, path, ranges)
685 683 if command is None:
686 684 continue
687 685 ui.debug(b'subprocess: %s\n' % (command,))
688 686 proc = subprocess.Popen(
689 687 procutil.tonativestr(command),
690 688 shell=True,
691 689 cwd=procutil.tonativestr(repo.root),
692 690 stdin=subprocess.PIPE,
693 691 stdout=subprocess.PIPE,
694 692 stderr=subprocess.PIPE,
695 693 )
696 694 stdout, stderr = proc.communicate(newdata)
697 695 if stderr:
698 696 showstderr(ui, fixctx.rev(), fixername, stderr)
699 697 newerdata = stdout
700 698 if fixer.shouldoutputmetadata():
701 699 try:
702 700 metadatajson, newerdata = stdout.split(b'\0', 1)
703 701 metadata[fixername] = pycompat.json_loads(metadatajson)
704 702 except ValueError:
705 703 ui.warn(
706 704 _(b'ignored invalid output from fixer tool: %s\n')
707 705 % (fixername,)
708 706 )
709 707 continue
710 708 else:
711 709 metadata[fixername] = None
712 710 if proc.returncode == 0:
713 711 newdata = newerdata
714 712 else:
715 713 if not stderr:
716 714 message = _(b'exited with status %d\n') % (proc.returncode,)
717 715 showstderr(ui, fixctx.rev(), fixername, message)
718 716 checktoolfailureaction(
719 717 ui,
720 718 _(b'no fixes will be applied'),
721 719 hint=_(
722 720 b'use --config fix.failure=continue to apply any '
723 721 b'successful fixes anyway'
724 722 ),
725 723 )
726 724 return metadata, newdata
727 725
728 726
729 727 def showstderr(ui, rev, fixername, stderr):
730 728 """Writes the lines of the stderr string as warnings on the ui
731 729
732 730 Uses the revision number and fixername to give more context to each line of
733 731 the error message. Doesn't include file names, since those take up a lot of
734 732 space and would tend to be included in the error message if they were
735 733 relevant.
736 734 """
737 735 for line in re.split(b'[\r\n]+', stderr):
738 736 if line:
739 737 ui.warn(b'[')
740 738 if rev is None:
741 739 ui.warn(_(b'wdir'), label=b'evolve.rev')
742 740 else:
743 741 ui.warn(b'%d' % rev, label=b'evolve.rev')
744 742 ui.warn(b'] %s: %s\n' % (fixername, line))
745 743
746 744
747 745 def writeworkingdir(repo, ctx, filedata, replacements):
748 746 """Write new content to the working copy and check out the new p1 if any
749 747
750 748 We check out a new revision if and only if we fixed something in both the
751 749 working directory and its parent revision. This avoids the need for a full
752 750 update/merge, and means that the working directory simply isn't affected
753 751 unless the --working-dir flag is given.
754 752
755 753 Directly updates the dirstate for the affected files.
756 754 """
757 755 assert repo.dirstate.p2() == nullid
758 756
759 757 for path, data in pycompat.iteritems(filedata):
760 758 fctx = ctx[path]
761 759 fctx.write(data, fctx.flags())
762 760
763 761 oldp1 = repo.dirstate.p1()
764 762 newp1 = replacements.get(oldp1, oldp1)
765 763 if newp1 != oldp1:
766 764 with repo.dirstate.parentchange():
767 765 scmutil.movedirstate(repo, repo[newp1])


def replacerev(ui, repo, ctx, filedata, replacements):
    """Commit a new revision like the given one, but with file content changes

    "ctx" is the original revision to be replaced by a modified one.

    "filedata" is a dict that maps paths to their new file content. All other
    paths will be recreated from the original revision without changes.
    "filedata" may contain paths that didn't exist in the original revision;
    they will be added.

    "replacements" is a dict that maps a single node to a single node, and it is
    updated to indicate the original revision is replaced by the newly created
    one. No entry is added if the replacement's node already exists.

    The new revision has the same parents as the old one, unless those parents
    have already been replaced, in which case those replacements are the parents
    of this new revision. Thus, if revisions are replaced in topological order,
    there is no need to rebase them into the original topology later.
    """

    p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
    p1ctx, p2ctx = repo[p1rev], repo[p2rev]
    newp1node = replacements.get(p1ctx.node(), p1ctx.node())
    newp2node = replacements.get(p2ctx.node(), p2ctx.node())

    # We don't want to create a revision that has no changes from the original,
    # but we should if the original revision's parent has been replaced.
    # Otherwise, we would produce an orphan that needs no actual human
    # intervention to evolve. We can't rely on commit() to avoid creating the
    # un-needed revision because the extra field added below produces a new hash
    # regardless of file content changes.
    if (
        not filedata
        and p1ctx.node() not in replacements
        and p2ctx.node() not in replacements
    ):
        return

    extra = ctx.extra().copy()
    extra[b'fix_source'] = ctx.hex()

    wctx = context.overlayworkingctx(repo)
    wctx.setbase(repo[newp1node])
    merge.revert_to(ctx, wc=wctx)
    copies.graftcopies(wctx, ctx, ctx.p1())

    for path in filedata.keys():
        fctx = ctx[path]
        copysource = fctx.copysource()
        wctx.write(path, filedata[path], flags=fctx.flags())
        if copysource:
            wctx.markcopied(path, copysource)

    desc = rewriteutil.update_hash_refs(
        repo,
        ctx.description(),
        {oldnode: [newnode] for oldnode, newnode in replacements.items()},
    )

    memctx = wctx.tomemctx(
        text=desc,
        branch=ctx.branch(),
        extra=extra,
        date=ctx.date(),
        parents=(newp1node, newp2node),
        user=ctx.user(),
    )

    sucnode = memctx.commit()
    prenode = ctx.node()
    if prenode == sucnode:
        ui.debug(b'node %s already existed\n' % (ctx.hex()))
    else:
        replacements[ctx.node()] = sucnode


def getfixers(ui):
    """Returns a map of configured fixer tools indexed by their names

    Each value is a Fixer object with methods that implement the behavior of the
    fixer's config suboptions. Does not validate the config values.
    """
    fixers = {}
    for name in fixernames(ui):
        enabled = ui.configbool(b'fix', name + b':enabled')
        command = ui.config(b'fix', name + b':command')
        pattern = ui.config(b'fix', name + b':pattern')
        linerange = ui.config(b'fix', name + b':linerange')
        priority = ui.configint(b'fix', name + b':priority')
        metadata = ui.configbool(b'fix', name + b':metadata')
        skipclean = ui.configbool(b'fix', name + b':skipclean')
        # Don't use a fixer if it has no pattern configured. It would be
        # dangerous to let it affect all files. It would be pointless to let it
        # affect no files. There is no reasonable subset of files to use as the
        # default.
        if command is None:
            ui.warn(
                _(b'fixer tool has no command configuration: %s\n') % (name,)
            )
        elif pattern is None:
            ui.warn(
                _(b'fixer tool has no pattern configuration: %s\n') % (name,)
            )
        elif not enabled:
            ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
        else:
            fixers[name] = Fixer(
                command, pattern, linerange, priority, metadata, skipclean
            )
    return collections.OrderedDict(
        sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
    )
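# Illustrative example (editor's note; the tool name and values below are
# hypothetical, not part of the original source): with an hgrc containing
#
#   [fix]
#   sort-json:command = jq --sort-keys .
#   sort-json:pattern = set:**.json
#   sort-json:priority = 10
#
# getfixers(ui) would return an OrderedDict mapping b'sort-json' to a Fixer
# instance, ordered so that tools with a higher :priority value come first.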


def fixernames(ui):
    """Returns the names of [fix] config options that have suboptions"""
    names = set()
    for k, v in ui.configitems(b'fix'):
        if b':' in k:
            names.add(k.split(b':', 1)[0])
    return names


class Fixer(object):
    """Wraps the raw config values for a fixer with methods"""

    def __init__(
        self, command, pattern, linerange, priority, metadata, skipclean
    ):
        self._command = command
        self._pattern = pattern
        self._linerange = linerange
        self._priority = priority
        self._metadata = metadata
        self._skipclean = skipclean

    def affects(self, opts, fixctx, path):
        """Should this fixer run on the file at the given path and context?"""
        repo = fixctx.repo()
        matcher = matchmod.match(
            repo.root, repo.root, [self._pattern], ctx=fixctx
        )
        return matcher(path)

    def shouldoutputmetadata(self):
        """Should the stdout of this fixer start with JSON and a null byte?"""
        return self._metadata

    def command(self, ui, path, ranges):
        """A shell command to use to invoke this fixer on the given file/lines

        May return None if there is no appropriate command to run for the given
        parameters.
        """
        expand = cmdutil.rendercommandtemplate
        parts = [
            expand(
                ui,
                self._command,
                {b'rootpath': path, b'basename': os.path.basename(path)},
            )
        ]
        if self._linerange:
            if self._skipclean and not ranges:
                # No line ranges to fix, so don't run the fixer.
                return None
            for first, last in ranges:
                parts.append(
                    expand(
                        ui, self._linerange, {b'first': first, b'last': last}
                    )
                )
        return b' '.join(parts)
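    # Illustrative example (editor's note, not part of the original source):
    # with the clang-format configuration shown in the module docstring,
    #
    #   command   = b'clang-format --assume-filename={rootpath}'
    #   linerange = b'--lines={first}:{last}'
    #
    # calling fixer.command(ui, b'src/foo.cpp', [(1, 4), (7, 7)]) would render
    # roughly:
    #
    #   clang-format --assume-filename=src/foo.cpp --lines=1:4 --lines=7:7
    #
    # With :skipclean set and an empty list of changed ranges, command()
    # returns None and the tool is not invoked at all.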