fix: remove a never-true check for unset pattern in Fixer.affects()...
Martin von Zweigbergk
r43495:0e2a2fab default
@@ -1,878 +1,876 b''
1 1 # fix - rewrite file content in changesets and working copy
2 2 #
3 3 # Copyright 2018 Google LLC.
4 4 #
5 5 # This software may be used and distributed according to the terms of the
6 6 # GNU General Public License version 2 or any later version.
7 7 """rewrite file content in changesets or working copy (EXPERIMENTAL)
8 8
9 9 Provides a command that runs configured tools on the contents of modified files,
10 10 writing back any fixes to the working copy or replacing changesets.
11 11
12 12 Here is an example configuration that causes :hg:`fix` to apply automatic
13 13 formatting fixes to modified lines in C++ code::
14 14
15 15 [fix]
16 16 clang-format:command=clang-format --assume-filename={rootpath}
17 17 clang-format:linerange=--lines={first}:{last}
18 18 clang-format:pattern=set:**.cpp or **.hpp
19 19
20 20 The :command suboption forms the first part of the shell command that will be
21 21 used to fix a file. The content of the file is passed on standard input, and the
22 22 fixed file content is expected on standard output. Any output on standard error
23 23 will be displayed as a warning. If the exit status is not zero, the file will
24 24 not be affected. A placeholder warning is displayed if there is a non-zero exit
25 25 status but no standard error output. Some values may be substituted into the
26 26 command::
27 27
28 28 {rootpath} The path of the file being fixed, relative to the repo root
29 29 {basename} The name of the file being fixed, without the directory path
30 30
31 31 If the :linerange suboption is set, the tool will only be run if there are
32 32 changed lines in a file. The value of this suboption is appended to the shell
33 33 command once for every range of changed lines in the file. Some values may be
34 34 substituted into the command::
35 35
36 36 {first} The 1-based line number of the first line in the modified range
37 37 {last} The 1-based line number of the last line in the modified range
38 38
39 39 Deleted sections of a file will be ignored by :linerange, because there is no
40 40 corresponding line range in the version being fixed.
41 41
42 42 By default, tools that set :linerange will only be executed if there is at least
43 43 one changed line range. This is meant to prevent accidents like running a code
44 44 formatter in such a way that it unexpectedly reformats the whole file. If such a
45 45 tool needs to operate on unchanged files, it should set the :skipclean suboption
46 46 to false.
47 47
48 48 The :pattern suboption determines which files will be passed through each
49 49 configured tool. See :hg:`help patterns` for possible values. If there are file
50 50 arguments to :hg:`fix`, the intersection of these patterns is used.
51 51
52 52 There is also a configurable limit for the maximum size of file that will be
53 53 processed by :hg:`fix`::
54 54
55 55 [fix]
56 56 maxfilesize = 2MB
57 57
58 58 Normally, execution of configured tools will continue after a failure (indicated
59 59 by a non-zero exit status). It can also be configured to abort after the first
60 60 such failure, so that no files will be affected if any tool fails. This abort
61 61 will also cause :hg:`fix` to exit with a non-zero status::
62 62
63 63 [fix]
64 64 failure = abort
65 65
66 66 When multiple tools are configured to affect a file, they execute in an order
67 67 defined by the :priority suboption. The priority suboption has a default value
68 68 of zero for each tool. Tools are executed in order of descending priority. The
69 69 execution order of tools with equal priority is unspecified. For example, you
70 70 could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
71 71 in a text file by ensuring that 'sort' runs before 'head'::
72 72
73 73 [fix]
74 74 sort:command = sort -n
75 75 head:command = head -n 10
76 76 sort:pattern = numbers.txt
77 77 head:pattern = numbers.txt
78 78 sort:priority = 2
79 79 head:priority = 1
80 80
81 81 To account for changes made by each tool, the line numbers used for incremental
82 82 formatting are recomputed before executing the next tool. So, each tool may see
83 83 different values for the arguments added by the :linerange suboption.
84 84
85 85 Each fixer tool is allowed to return some metadata in addition to the fixed file
86 86 content. The metadata must be placed before the file content on stdout,
87 87 separated from the file content by a zero byte. The metadata is parsed as a JSON
88 88 value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
89 89 is expected to produce this metadata encoding if and only if the :metadata
90 90 suboption is true::
91 91
92 92 [fix]
93 93 tool:command = tool --prepend-json-metadata
94 94 tool:metadata = true
95 95
96 96 The metadata values are passed to hooks, which can be used to print summaries or
97 97 perform other post-fixing work. The supported hooks are::
98 98
99 99 "postfixfile"
100 100 Run once for each file in each revision where any fixer tools made changes
101 101 to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
102 102 and "$HG_METADATA" with a map of fixer names to metadata values from fixer
103 103 tools that affected the file. Fixer tools that didn't affect the file have a
104 104 value of None. Only fixer tools that executed are present in the metadata.
105 105
106 106 "postfix"
107 107 Run once after all files and revisions have been handled. Provides
108 108 "$HG_REPLACEMENTS" with information about what revisions were created and
109 109 made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
110 110 files in the working copy were updated. Provides a list "$HG_METADATA"
111 111 mapping fixer tool names to lists of metadata values returned from
112 112 executions that modified a file. This aggregates the same metadata
113 113 previously passed to the "postfixfile" hook.
114 114
115 115 Fixer tools are run in the repository's root directory. This allows them to read
116 116 configuration files from the working copy, or even write to the working copy.
117 117 The working copy is not updated to match the revision being fixed. In fact,
118 118 several revisions may be fixed in parallel. Writes to the working copy are not
119 119 amended into the revision being fixed; fixer tools should always write fixed
120 120 file content back to stdout as documented above.
121 121 """
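
The "postfixfile" and "postfix" hooks described in the docstring above are configured like any ordinary Mercurial hook and read the listed environment variables. As an illustrative sketch (the echo commands here are placeholders, not part of this extension), an hgrc entry could look like::

    [hooks]
    postfixfile = echo "fixed $HG_PATH in revision $HG_REV"
    postfix = echo "fix run complete; working dir written: $HG_WDIRWRITTEN"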
122 122
123 123 from __future__ import absolute_import
124 124
125 125 import collections
126 126 import itertools
127 127 import json
128 128 import os
129 129 import re
130 130 import subprocess
131 131
132 132 from mercurial.i18n import _
133 133 from mercurial.node import nullrev
134 134 from mercurial.node import wdirrev
135 135
136 136 from mercurial.utils import procutil
137 137
138 138 from mercurial import (
139 139 cmdutil,
140 140 context,
141 141 copies,
142 142 error,
143 143 mdiff,
144 144 merge,
145 145 obsolete,
146 146 pycompat,
147 147 registrar,
148 148 scmutil,
149 149 util,
150 150 worker,
151 151 )
152 152
153 153 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
154 154 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
155 155 # be specifying the version(s) of Mercurial they are tested with, or
156 156 # leave the attribute unspecified.
157 157 testedwith = b'ships-with-hg-core'
158 158
159 159 cmdtable = {}
160 160 command = registrar.command(cmdtable)
161 161
162 162 configtable = {}
163 163 configitem = registrar.configitem(configtable)
164 164
165 165 # Register the suboptions allowed for each configured fixer, and default values.
166 166 FIXER_ATTRS = {
167 167 b'command': None,
168 168 b'linerange': None,
169 169 b'pattern': None,
170 170 b'priority': 0,
171 171 b'metadata': False,
172 172 b'skipclean': True,
173 173 b'enabled': True,
174 174 }
175 175
176 176 for key, default in FIXER_ATTRS.items():
177 177 configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)
178 178
179 179 # A good default size allows most source code files to be fixed, but avoids
180 180 # letting fixer tools choke on huge inputs, which could be surprising to the
181 181 # user.
182 182 configitem(b'fix', b'maxfilesize', default=b'2MB')
183 183
184 184 # Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
185 185 # This helps users write shell scripts that stop when a fixer tool signals a
186 186 # problem.
187 187 configitem(b'fix', b'failure', default=b'continue')
188 188
189 189
190 190 def checktoolfailureaction(ui, message, hint=None):
191 191 """Abort with 'message' if fix.failure=abort"""
192 192 action = ui.config(b'fix', b'failure')
193 193 if action not in (b'continue', b'abort'):
194 194 raise error.Abort(
195 195 _(b'unknown fix.failure action: %s') % (action,),
196 196 hint=_(b'use "continue" or "abort"'),
197 197 )
198 198 if action == b'abort':
199 199 raise error.Abort(message, hint=hint)
200 200
201 201
202 202 allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
203 203 baseopt = (
204 204 b'',
205 205 b'base',
206 206 [],
207 207 _(
208 208 b'revisions to diff against (overrides automatic '
209 209 b'selection, and applies to every revision being '
210 210 b'fixed)'
211 211 ),
212 212 _(b'REV'),
213 213 )
214 214 revopt = (b'r', b'rev', [], _(b'revisions to fix'), _(b'REV'))
215 215 wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
216 216 wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
217 217 usage = _(b'[OPTION]... [FILE]...')
218 218
219 219
220 220 @command(
221 221 b'fix',
222 222 [allopt, baseopt, revopt, wdiropt, wholeopt],
223 223 usage,
224 224 helpcategory=command.CATEGORY_FILE_CONTENTS,
225 225 )
226 226 def fix(ui, repo, *pats, **opts):
227 227 """rewrite file content in changesets or working directory
228 228
229 229 Runs any configured tools to fix the content of files. Only affects files
230 230 with changes, unless file arguments are provided. Only affects changed lines
231 231 of files, unless the --whole flag is used. Some tools may always affect the
232 232 whole file regardless of --whole.
233 233
234 234 If revisions are specified with --rev, those revisions will be checked, and
235 235 they may be replaced with new revisions that have fixed file content. It is
236 236 desirable to specify all descendants of each specified revision, so that the
237 237 fixes propagate to the descendants. If all descendants are fixed at the same
238 238 time, no merging, rebasing, or evolution will be required.
239 239
240 240 If --working-dir is used, files with uncommitted changes in the working copy
241 241 will be fixed. If the checked-out revision is also fixed, the working
242 242 directory will update to the replacement revision.
243 243
244 244 When determining what lines of each file to fix at each revision, the whole
245 245 set of revisions being fixed is considered, so that fixes to earlier
246 246 revisions are not forgotten in later ones. The --base flag can be used to
247 247 override this default behavior, though it is not usually desirable to do so.
248 248 """
249 249 opts = pycompat.byteskwargs(opts)
250 250 if opts[b'all']:
251 251 if opts[b'rev']:
252 252 raise error.Abort(_(b'cannot specify both "--rev" and "--all"'))
253 253 opts[b'rev'] = [b'not public() and not obsolete()']
254 254 opts[b'working_dir'] = True
255 255 with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
256 256 revstofix = getrevstofix(ui, repo, opts)
257 257 basectxs = getbasectxs(repo, opts, revstofix)
258 258 workqueue, numitems = getworkqueue(
259 259 ui, repo, pats, opts, revstofix, basectxs
260 260 )
261 261 fixers = getfixers(ui)
262 262
263 263 # There are no data dependencies between the workers fixing each file
264 264 # revision, so we can use all available parallelism.
265 265 def getfixes(items):
266 266 for rev, path in items:
267 267 ctx = repo[rev]
268 268 olddata = ctx[path].data()
269 269 metadata, newdata = fixfile(
270 270 ui, repo, opts, fixers, ctx, path, basectxs[rev]
271 271 )
272 272 # Don't waste memory/time passing unchanged content back, but
273 273 # produce one result per item either way.
274 274 yield (
275 275 rev,
276 276 path,
277 277 metadata,
278 278 newdata if newdata != olddata else None,
279 279 )
280 280
281 281 results = worker.worker(
282 282 ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
283 283 )
284 284
285 285 # We have to hold on to the data for each successor revision in memory
286 286 # until all its parents are committed. We ensure this by committing and
287 287 # freeing memory for the revisions in some topological order. This
288 288 # leaves a little bit of memory efficiency on the table, but also makes
289 289 # the tests deterministic. It might also be considered a feature since
290 290 # it makes the results more easily reproducible.
291 291 filedata = collections.defaultdict(dict)
292 292 aggregatemetadata = collections.defaultdict(list)
293 293 replacements = {}
294 294 wdirwritten = False
295 295 commitorder = sorted(revstofix, reverse=True)
296 296 with ui.makeprogress(
297 297 topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
298 298 ) as progress:
299 299 for rev, path, filerevmetadata, newdata in results:
300 300 progress.increment(item=path)
301 301 for fixername, fixermetadata in filerevmetadata.items():
302 302 aggregatemetadata[fixername].append(fixermetadata)
303 303 if newdata is not None:
304 304 filedata[rev][path] = newdata
305 305 hookargs = {
306 306 b'rev': rev,
307 307 b'path': path,
308 308 b'metadata': filerevmetadata,
309 309 }
310 310 repo.hook(
311 311 b'postfixfile',
312 312 throw=False,
313 313 **pycompat.strkwargs(hookargs)
314 314 )
315 315 numitems[rev] -= 1
316 316 # Apply the fixes for this and any other revisions that are
317 317 # ready and sitting at the front of the queue. Using a loop here
318 318 # prevents the queue from being blocked by the first revision to
319 319 # be ready out of order.
320 320 while commitorder and not numitems[commitorder[-1]]:
321 321 rev = commitorder.pop()
322 322 ctx = repo[rev]
323 323 if rev == wdirrev:
324 324 writeworkingdir(repo, ctx, filedata[rev], replacements)
325 325 wdirwritten = bool(filedata[rev])
326 326 else:
327 327 replacerev(ui, repo, ctx, filedata[rev], replacements)
328 328 del filedata[rev]
329 329
330 330 cleanup(repo, replacements, wdirwritten)
331 331 hookargs = {
332 332 b'replacements': replacements,
333 333 b'wdirwritten': wdirwritten,
334 334 b'metadata': aggregatemetadata,
335 335 }
336 336 repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))
337 337
338 338
339 339 def cleanup(repo, replacements, wdirwritten):
340 340 """Calls scmutil.cleanupnodes() with the given replacements.
341 341
342 342 "replacements" is a dict from nodeid to nodeid, with one key and one value
343 343 for every revision that was affected by fixing. This is slightly different
344 344 from cleanupnodes().
345 345
346 346 "wdirwritten" is a bool which tells whether the working copy was affected by
347 347 fixing, since it has no entry in "replacements".
348 348
349 349 Useful as a hook point for extending "hg fix" with output summarizing the
350 350 effects of the command, though we choose not to output anything here.
351 351 """
352 352 replacements = {
353 353 prec: [succ] for prec, succ in pycompat.iteritems(replacements)
354 354 }
355 355 scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)
356 356
357 357
358 358 def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
359 359 """"Constructs the list of files to be fixed at specific revisions
360 360
361 361 It is up to the caller how to consume the work items, and the only
362 362 dependence between them is that replacement revisions must be committed in
363 363 topological order. Each work item represents a file in the working copy or
364 364 in some revision that should be fixed and written back to the working copy
365 365 or into a replacement revision.
366 366
367 367 Work items for the same revision are grouped together, so that a worker
368 368 pool starting with the first N items in parallel is likely to finish the
369 369 first revision's work before other revisions. This can allow us to write
370 370 the result to disk and reduce memory footprint. At time of writing, the
371 371 partition strategy in worker.py seems favorable to this. We also sort the
372 372 items by ascending revision number to match the order in which we commit
373 373 the fixes later.
374 374 """
375 375 workqueue = []
376 376 numitems = collections.defaultdict(int)
377 377 maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
378 378 for rev in sorted(revstofix):
379 379 fixctx = repo[rev]
380 380 match = scmutil.match(fixctx, pats, opts)
381 381 for path in sorted(
382 382 pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
383 383 ):
384 384 fctx = fixctx[path]
385 385 if fctx.islink():
386 386 continue
387 387 if fctx.size() > maxfilesize:
388 388 ui.warn(
389 389 _(b'ignoring file larger than %s: %s\n')
390 390 % (util.bytecount(maxfilesize), path)
391 391 )
392 392 continue
393 393 workqueue.append((rev, path))
394 394 numitems[rev] += 1
395 395 return workqueue, numitems
396 396
397 397
398 398 def getrevstofix(ui, repo, opts):
399 399 """Returns the set of revision numbers that should be fixed"""
400 400 revs = set(scmutil.revrange(repo, opts[b'rev']))
401 401 for rev in revs:
402 402 checkfixablectx(ui, repo, repo[rev])
403 403 if revs:
404 404 cmdutil.checkunfinished(repo)
405 405 checknodescendants(repo, revs)
406 406 if opts.get(b'working_dir'):
407 407 revs.add(wdirrev)
408 408 if list(merge.mergestate.read(repo).unresolved()):
409 409 raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
410 410 if not revs:
411 411 raise error.Abort(
412 412 b'no changesets specified', hint=b'use --rev or --working-dir'
413 413 )
414 414 return revs
415 415
416 416
417 417 def checknodescendants(repo, revs):
418 418 if not obsolete.isenabled(repo, obsolete.allowunstableopt) and repo.revs(
419 419 b'(%ld::) - (%ld)', revs, revs
420 420 ):
421 421 raise error.Abort(
422 422 _(b'can only fix a changeset together with all its descendants')
423 423 )
424 424
425 425
426 426 def checkfixablectx(ui, repo, ctx):
427 427 """Aborts if the revision shouldn't be replaced with a fixed one."""
428 428 if not ctx.mutable():
429 429 raise error.Abort(
430 430 b'can\'t fix immutable changeset %s'
431 431 % (scmutil.formatchangeid(ctx),)
432 432 )
433 433 if ctx.obsolete():
434 434 # It would be better to actually check if the revision has a successor.
435 435 allowdivergence = ui.configbool(
436 436 b'experimental', b'evolution.allowdivergence'
437 437 )
438 438 if not allowdivergence:
439 439 raise error.Abort(
440 440 b'fixing obsolete revision could cause divergence'
441 441 )
442 442
443 443
444 444 def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
445 445 """Returns the set of files that should be fixed in a context
446 446
447 447 The result depends on the base contexts; we include any file that has
448 448 changed relative to any of the base contexts. Base contexts should be
449 449 ancestors of the context being fixed.
450 450 """
451 451 files = set()
452 452 for basectx in basectxs:
453 453 stat = basectx.status(
454 454 fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
455 455 )
456 456 files.update(
457 457 set(
458 458 itertools.chain(
459 459 stat.added, stat.modified, stat.clean, stat.unknown
460 460 )
461 461 )
462 462 )
463 463 return files
464 464
465 465
466 466 def lineranges(opts, path, basectxs, fixctx, content2):
467 467 """Returns the set of line ranges that should be fixed in a file
468 468
469 469 Of the form [(10, 20), (30, 40)].
470 470
471 471 This depends on the given base contexts; we must consider lines that have
472 472 changed versus any of the base contexts, and whether the file has been
473 473 renamed versus any of them.
474 474
475 475 Another way to understand this is that we exclude line ranges that are
476 476 common to the file in all base contexts.
477 477 """
478 478 if opts.get(b'whole'):
479 479 # Return a range containing all lines. Rely on the diff implementation's
480 480 # idea of how many lines are in the file, instead of reimplementing it.
481 481 return difflineranges(b'', content2)
482 482
483 483 rangeslist = []
484 484 for basectx in basectxs:
485 485 basepath = copies.pathcopies(basectx, fixctx).get(path, path)
486 486 if basepath in basectx:
487 487 content1 = basectx[basepath].data()
488 488 else:
489 489 content1 = b''
490 490 rangeslist.extend(difflineranges(content1, content2))
491 491 return unionranges(rangeslist)
492 492
493 493
494 494 def unionranges(rangeslist):
495 495 """Return the union of some closed intervals
496 496
497 497 >>> unionranges([])
498 498 []
499 499 >>> unionranges([(1, 100)])
500 500 [(1, 100)]
501 501 >>> unionranges([(1, 100), (1, 100)])
502 502 [(1, 100)]
503 503 >>> unionranges([(1, 100), (2, 100)])
504 504 [(1, 100)]
505 505 >>> unionranges([(1, 99), (1, 100)])
506 506 [(1, 100)]
507 507 >>> unionranges([(1, 100), (40, 60)])
508 508 [(1, 100)]
509 509 >>> unionranges([(1, 49), (50, 100)])
510 510 [(1, 100)]
511 511 >>> unionranges([(1, 48), (50, 100)])
512 512 [(1, 48), (50, 100)]
513 513 >>> unionranges([(1, 2), (3, 4), (5, 6)])
514 514 [(1, 6)]
515 515 """
516 516 rangeslist = sorted(set(rangeslist))
517 517 unioned = []
518 518 if rangeslist:
519 519 unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
520 520 for a, b in rangeslist:
521 521 c, d = unioned[-1]
522 522 if a > d + 1:
523 523 unioned.append((a, b))
524 524 else:
525 525 unioned[-1] = (c, max(b, d))
526 526 return unioned
527 527
528 528
529 529 def difflineranges(content1, content2):
530 530 """Return list of line number ranges in content2 that differ from content1.
531 531
532 532 Line numbers are 1-based. The numbers are the first and last line contained
533 533 in the range. Single-line ranges have the same line number for the first and
534 534 last line. Excludes any empty ranges that result from lines that are only
535 535 present in content1. Relies on mdiff's idea of where the line endings are in
536 536 the string.
537 537
538 538 >>> from mercurial import pycompat
539 539 >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
540 540 >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
541 541 >>> difflineranges2(b'', b'')
542 542 []
543 543 >>> difflineranges2(b'a', b'')
544 544 []
545 545 >>> difflineranges2(b'', b'A')
546 546 [(1, 1)]
547 547 >>> difflineranges2(b'a', b'a')
548 548 []
549 549 >>> difflineranges2(b'a', b'A')
550 550 [(1, 1)]
551 551 >>> difflineranges2(b'ab', b'')
552 552 []
553 553 >>> difflineranges2(b'', b'AB')
554 554 [(1, 2)]
555 555 >>> difflineranges2(b'abc', b'ac')
556 556 []
557 557 >>> difflineranges2(b'ab', b'aCb')
558 558 [(2, 2)]
559 559 >>> difflineranges2(b'abc', b'aBc')
560 560 [(2, 2)]
561 561 >>> difflineranges2(b'ab', b'AB')
562 562 [(1, 2)]
563 563 >>> difflineranges2(b'abcde', b'aBcDe')
564 564 [(2, 2), (4, 4)]
565 565 >>> difflineranges2(b'abcde', b'aBCDe')
566 566 [(2, 4)]
567 567 """
568 568 ranges = []
569 569 for lines, kind in mdiff.allblocks(content1, content2):
570 570 firstline, lastline = lines[2:4]
571 571 if kind == b'!' and firstline != lastline:
572 572 ranges.append((firstline + 1, lastline))
573 573 return ranges
574 574
575 575
576 576 def getbasectxs(repo, opts, revstofix):
577 577 """Returns a map of the base contexts for each revision
578 578
579 579 The base contexts determine which lines are considered modified when we
580 580 attempt to fix just the modified lines in a file. It also determines which
581 581 files we attempt to fix, so it is important to compute this even when
582 582 --whole is used.
583 583 """
584 584 # The --base flag overrides the usual logic, and we give every revision
585 585 # exactly the set of baserevs that the user specified.
586 586 if opts.get(b'base'):
587 587 baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
588 588 if not baserevs:
589 589 baserevs = {nullrev}
590 590 basectxs = {repo[rev] for rev in baserevs}
591 591 return {rev: basectxs for rev in revstofix}
592 592
593 593 # Proceed in topological order so that we can easily determine each
594 594 # revision's baserevs by looking at its parents and their baserevs.
595 595 basectxs = collections.defaultdict(set)
596 596 for rev in sorted(revstofix):
597 597 ctx = repo[rev]
598 598 for pctx in ctx.parents():
599 599 if pctx.rev() in basectxs:
600 600 basectxs[rev].update(basectxs[pctx.rev()])
601 601 else:
602 602 basectxs[rev].add(pctx)
603 603 return basectxs
604 604
605 605
606 606 def fixfile(ui, repo, opts, fixers, fixctx, path, basectxs):
607 607 """Run any configured fixers that should affect the file in this context
608 608
609 609 Returns the file content that results from applying the fixers in some order
610 610 starting with the file's content in the fixctx. Fixers that support line
611 611 ranges will affect lines that have changed relative to any of the basectxs
612 612 (i.e. they will only avoid lines that are common to all basectxs).
613 613
614 614 A fixer tool's stdout will become the file's new content if and only if it
615 615 exits with code zero. The fixer tool's working directory is the repository's
616 616 root.
617 617 """
618 618 metadata = {}
619 619 newdata = fixctx[path].data()
620 620 for fixername, fixer in pycompat.iteritems(fixers):
621 621 if fixer.affects(opts, fixctx, path):
622 622 ranges = lineranges(opts, path, basectxs, fixctx, newdata)
623 623 command = fixer.command(ui, path, ranges)
624 624 if command is None:
625 625 continue
626 626 ui.debug(b'subprocess: %s\n' % (command,))
627 627 proc = subprocess.Popen(
628 628 procutil.tonativestr(command),
629 629 shell=True,
630 630 cwd=procutil.tonativestr(repo.root),
631 631 stdin=subprocess.PIPE,
632 632 stdout=subprocess.PIPE,
633 633 stderr=subprocess.PIPE,
634 634 )
635 635 stdout, stderr = proc.communicate(newdata)
636 636 if stderr:
637 637 showstderr(ui, fixctx.rev(), fixername, stderr)
638 638 newerdata = stdout
639 639 if fixer.shouldoutputmetadata():
640 640 try:
641 641 metadatajson, newerdata = stdout.split(b'\0', 1)
642 642 metadata[fixername] = json.loads(metadatajson)
643 643 except ValueError:
644 644 ui.warn(
645 645 _(b'ignored invalid output from fixer tool: %s\n')
646 646 % (fixername,)
647 647 )
648 648 continue
649 649 else:
650 650 metadata[fixername] = None
651 651 if proc.returncode == 0:
652 652 newdata = newerdata
653 653 else:
654 654 if not stderr:
655 655 message = _(b'exited with status %d\n') % (proc.returncode,)
656 656 showstderr(ui, fixctx.rev(), fixername, message)
657 657 checktoolfailureaction(
658 658 ui,
659 659 _(b'no fixes will be applied'),
660 660 hint=_(
661 661 b'use --config fix.failure=continue to apply any '
662 662 b'successful fixes anyway'
663 663 ),
664 664 )
665 665 return metadata, newdata
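
As a rough illustration of the metadata protocol that fixfile() parses above (JSON, then a NUL byte, then the fixed content on stdout), a hypothetical standalone fixer tool might look like the following sketch; the script name and metadata key are made up for the example:

    # strip_ws.py - hypothetical fixer: strips trailing whitespace and reports
    # how many lines it changed via the metadata protocol.
    import json
    import sys

    data = sys.stdin.buffer.read()
    lines = data.split(b'\n')
    fixed_lines = [line.rstrip() for line in lines]
    changed = sum(1 for a, b in zip(lines, fixed_lines) if a != b)

    # JSON metadata first (UTF-8, no NUL bytes), then a single NUL separator,
    # then the fixed file content, all on stdout.
    sys.stdout.buffer.write(json.dumps({'changedlines': changed}).encode('utf-8'))
    sys.stdout.buffer.write(b'\0')
    sys.stdout.buffer.write(b'\n'.join(fixed_lines))

It would be wired up with something like "tool:command = python3 strip_ws.py" and "tool:metadata = true" (tool name hypothetical).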
666 666
667 667
668 668 def showstderr(ui, rev, fixername, stderr):
669 669 """Writes the lines of the stderr string as warnings on the ui
670 670
671 671 Uses the revision number and fixername to give more context to each line of
672 672 the error message. Doesn't include file names, since those take up a lot of
673 673 space and would tend to be included in the error message if they were
674 674 relevant.
675 675 """
676 676 for line in re.split(b'[\r\n]+', stderr):
677 677 if line:
678 678 ui.warn(b'[')
679 679 if rev is None:
680 680 ui.warn(_(b'wdir'), label=b'evolve.rev')
681 681 else:
682 682 ui.warn((str(rev)), label=b'evolve.rev')
683 683 ui.warn(b'] %s: %s\n' % (fixername, line))
684 684
685 685
686 686 def writeworkingdir(repo, ctx, filedata, replacements):
687 687 """Write new content to the working copy and check out the new p1 if any
688 688
689 689 We check out a new revision if and only if we fixed something in both the
690 690 working directory and its parent revision. This avoids the need for a full
691 691 update/merge, and means that the working directory simply isn't affected
692 692 unless the --working-dir flag is given.
693 693
694 694 Directly updates the dirstate for the affected files.
695 695 """
696 696 for path, data in pycompat.iteritems(filedata):
697 697 fctx = ctx[path]
698 698 fctx.write(data, fctx.flags())
699 699 if repo.dirstate[path] == b'n':
700 700 repo.dirstate.normallookup(path)
701 701
702 702 oldparentnodes = repo.dirstate.parents()
703 703 newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
704 704 if newparentnodes != oldparentnodes:
705 705 repo.setparents(*newparentnodes)
706 706
707 707
708 708 def replacerev(ui, repo, ctx, filedata, replacements):
709 709 """Commit a new revision like the given one, but with file content changes
710 710
711 711 "ctx" is the original revision to be replaced by a modified one.
712 712
713 713 "filedata" is a dict that maps paths to their new file content. All other
714 714 paths will be recreated from the original revision without changes.
715 715 "filedata" may contain paths that didn't exist in the original revision;
716 716 they will be added.
717 717
718 718 "replacements" is a dict that maps a single node to a single node, and it is
719 719 updated to indicate the original revision is replaced by the newly created
720 720 one. No entry is added if the replacement's node already exists.
721 721
722 722 The new revision has the same parents as the old one, unless those parents
723 723 have already been replaced, in which case those replacements are the parents
724 724 of this new revision. Thus, if revisions are replaced in topological order,
725 725 there is no need to rebase them into the original topology later.
726 726 """
727 727
728 728 p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
729 729 p1ctx, p2ctx = repo[p1rev], repo[p2rev]
730 730 newp1node = replacements.get(p1ctx.node(), p1ctx.node())
731 731 newp2node = replacements.get(p2ctx.node(), p2ctx.node())
732 732
733 733 # We don't want to create a revision that has no changes from the original,
734 734 # but we should if the original revision's parent has been replaced.
735 735 # Otherwise, we would produce an orphan that needs no actual human
736 736 # intervention to evolve. We can't rely on commit() to avoid creating the
737 737 # un-needed revision because the extra field added below produces a new hash
738 738 # regardless of file content changes.
739 739 if (
740 740 not filedata
741 741 and p1ctx.node() not in replacements
742 742 and p2ctx.node() not in replacements
743 743 ):
744 744 return
745 745
746 746 def filectxfn(repo, memctx, path):
747 747 if path not in ctx:
748 748 return None
749 749 fctx = ctx[path]
750 750 copysource = fctx.copysource()
751 751 return context.memfilectx(
752 752 repo,
753 753 memctx,
754 754 path=fctx.path(),
755 755 data=filedata.get(path, fctx.data()),
756 756 islink=fctx.islink(),
757 757 isexec=fctx.isexec(),
758 758 copysource=copysource,
759 759 )
760 760
761 761 extra = ctx.extra().copy()
762 762 extra[b'fix_source'] = ctx.hex()
763 763
764 764 memctx = context.memctx(
765 765 repo,
766 766 parents=(newp1node, newp2node),
767 767 text=ctx.description(),
768 768 files=set(ctx.files()) | set(filedata.keys()),
769 769 filectxfn=filectxfn,
770 770 user=ctx.user(),
771 771 date=ctx.date(),
772 772 extra=extra,
773 773 branch=ctx.branch(),
774 774 editor=None,
775 775 )
776 776 sucnode = memctx.commit()
777 777 prenode = ctx.node()
778 778 if prenode == sucnode:
779 779 ui.debug(b'node %s already existed\n' % (ctx.hex()))
780 780 else:
781 781 replacements[ctx.node()] = sucnode
782 782
783 783
784 784 def getfixers(ui):
785 785 """Returns a map of configured fixer tools indexed by their names
786 786
787 787 Each value is a Fixer object with methods that implement the behavior of the
788 788 fixer's config suboptions. Does not validate the config values.
789 789 """
790 790 fixers = {}
791 791 for name in fixernames(ui):
792 792 enabled = ui.configbool(b'fix', name + b':enabled')
793 793 command = ui.config(b'fix', name + b':command')
794 794 pattern = ui.config(b'fix', name + b':pattern')
795 795 linerange = ui.config(b'fix', name + b':linerange')
796 796 priority = ui.configint(b'fix', name + b':priority')
797 797 metadata = ui.configbool(b'fix', name + b':metadata')
798 798 skipclean = ui.configbool(b'fix', name + b':skipclean')
799 799 # Don't use a fixer if it has no pattern configured. It would be
800 800 # dangerous to let it affect all files. It would be pointless to let it
801 801 # affect no files. There is no reasonable subset of files to use as the
802 802 # default.
803 803 if command is None:
804 804 ui.warn(
805 805 _(b'fixer tool has no command configuration: %s\n') % (name,)
806 806 )
807 807 elif pattern is None:
808 808 ui.warn(
809 809 _(b'fixer tool has no pattern configuration: %s\n') % (name,)
810 810 )
811 811 elif not enabled:
812 812 ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
813 813 else:
814 814 fixers[name] = Fixer(
815 815 command, pattern, linerange, priority, metadata, skipclean
816 816 )
817 817 return collections.OrderedDict(
818 818 sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
819 819 )
820 820
821 821
822 822 def fixernames(ui):
823 823 """Returns the names of [fix] config options that have suboptions"""
824 824 names = set()
825 825 for k, v in ui.configitems(b'fix'):
826 826 if b':' in k:
827 827 names.add(k.split(b':', 1)[0])
828 828 return names
829 829
830 830
831 831 class Fixer(object):
832 832 """Wraps the raw config values for a fixer with methods"""
833 833
834 834 def __init__(
835 835 self, command, pattern, linerange, priority, metadata, skipclean
836 836 ):
837 837 self._command = command
838 838 self._pattern = pattern
839 839 self._linerange = linerange
840 840 self._priority = priority
841 841 self._metadata = metadata
842 842 self._skipclean = skipclean
843 843
844 844 def affects(self, opts, fixctx, path):
845 845 """Should this fixer run on the file at the given path and context?"""
846 return self._pattern is not None and scmutil.match(
847 fixctx, [self._pattern], opts
848 )(path)
846 return scmutil.match(fixctx, [self._pattern], opts)(path)
849 847
850 848 def shouldoutputmetadata(self):
851 849 """Should the stdout of this fixer start with JSON and a null byte?"""
852 850 return self._metadata
853 851
854 852 def command(self, ui, path, ranges):
855 853 """A shell command to use to invoke this fixer on the given file/lines
856 854
857 855 May return None if there is no appropriate command to run for the given
858 856 parameters.
859 857 """
860 858 expand = cmdutil.rendercommandtemplate
861 859 parts = [
862 860 expand(
863 861 ui,
864 862 self._command,
865 863 {b'rootpath': path, b'basename': os.path.basename(path)},
866 864 )
867 865 ]
868 866 if self._linerange:
869 867 if self._skipclean and not ranges:
870 868 # No line ranges to fix, so don't run the fixer.
871 869 return None
872 870 for first, last in ranges:
873 871 parts.append(
874 872 expand(
875 873 ui, self._linerange, {b'first': first, b'last': last}
876 874 )
877 875 )
878 876 return b' '.join(parts)
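
To make the template expansion in Fixer.command() concrete: with the clang-format configuration from the module docstring and, for the sake of example, a file src/foo.cpp with changed line ranges (10, 14) and (40, 40) (path and ranges hypothetical), the joined shell command would come out roughly as::

    clang-format --assume-filename=src/foo.cpp --lines=10:14 --lines=40:40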