fix: fix registration of config item defaults...
Martin von Zweigbergk -
r43488:5cb3e6f4 default
@@ -1,871 +1,870 b''
1 1 # fix - rewrite file content in changesets and working copy
2 2 #
3 3 # Copyright 2018 Google LLC.
4 4 #
5 5 # This software may be used and distributed according to the terms of the
6 6 # GNU General Public License version 2 or any later version.
7 7 """rewrite file content in changesets or working copy (EXPERIMENTAL)
8 8
9 9 Provides a command that runs configured tools on the contents of modified files,
10 10 writing back any fixes to the working copy or replacing changesets.
11 11
12 12 Here is an example configuration that causes :hg:`fix` to apply automatic
13 13 formatting fixes to modified lines in C++ code::
14 14
15 15 [fix]
16 16 clang-format:command=clang-format --assume-filename={rootpath}
17 17 clang-format:linerange=--lines={first}:{last}
18 18 clang-format:pattern=set:**.cpp or **.hpp
19 19
20 20 The :command suboption forms the first part of the shell command that will be
21 21 used to fix a file. The content of the file is passed on standard input, and the
22 22 fixed file content is expected on standard output. Any output on standard error
23 23 will be displayed as a warning. If the exit status is not zero, the file will
24 24 not be affected. A placeholder warning is displayed if there is a non-zero exit
25 25 status but no standard error output. Some values may be substituted into the
26 26 command::
27 27
28 28 {rootpath} The path of the file being fixed, relative to the repo root
29 29 {basename} The name of the file being fixed, without the directory path
30 30
31 31 If the :linerange suboption is set, the tool will only be run if there are
32 32 changed lines in a file. The value of this suboption is appended to the shell
33 33 command once for every range of changed lines in the file. Some values may be
34 34 substituted into the command::
35 35
36 36 {first} The 1-based line number of the first line in the modified range
37 37 {last} The 1-based line number of the last line in the modified range
38 38
39 39 Deleted sections of a file will be ignored by :linerange, because there is no
40 40 corresponding line range in the version being fixed.
41 41
42 42 By default, tools that set :linerange will only be executed if there is at least
43 43 one changed line range. This is meant to prevent accidents like running a code
44 44 formatter in such a way that it unexpectedly reformats the whole file. If such a
45 45 tool needs to operate on unchanged files, it should set the :skipclean suboption
46 46 to false.
47 47
48 48 The :pattern suboption determines which files will be passed through each
49 49 configured tool. See :hg:`help patterns` for possible values. If there are file
50 50 arguments to :hg:`fix`, the intersection of these patterns is used.
51 51
52 52 There is also a configurable limit for the maximum size of file that will be
53 53 processed by :hg:`fix`::
54 54
55 55 [fix]
56 56 maxfilesize = 2MB
57 57
58 58 Normally, execution of configured tools will continue after a failure (indicated
59 59 by a non-zero exit status). It can also be configured to abort after the first
60 60 such failure, so that no files will be affected if any tool fails. This abort
61 61 will also cause :hg:`fix` to exit with a non-zero status::
62 62
63 63 [fix]
64 64 failure = abort
65 65
66 66 When multiple tools are configured to affect a file, they execute in an order
67 67 defined by the :priority suboption. The priority suboption has a default value
68 68 of zero for each tool. Tools are executed in order of descending priority. The
69 69 execution order of tools with equal priority is unspecified. For example, you
70 70 could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
71 71 in a text file by ensuring that 'sort' runs before 'head'::
72 72
73 73 [fix]
74 74 sort:command = sort -n
75 75 head:command = head -n 10
76 76 sort:pattern = numbers.txt
77 77 head:pattern = numbers.txt
78 78 sort:priority = 2
79 79 head:priority = 1
80 80
81 81 To account for changes made by each tool, the line numbers used for incremental
82 82 formatting are recomputed before executing the next tool. So, each tool may see
83 83 different values for the arguments added by the :linerange suboption.
84 84
85 85 Each fixer tool is allowed to return some metadata in addition to the fixed file
86 86 content. The metadata must be placed before the file content on stdout,
87 87 separated from the file content by a zero byte. The metadata is parsed as a JSON
88 88 value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
89 89 is expected to produce this metadata encoding if and only if the :metadata
90 90 suboption is true::
91 91
92 92 [fix]
93 93 tool:command = tool --prepend-json-metadata
94 94 tool:metadata = true
95 95
96 96 The metadata values are passed to hooks, which can be used to print summaries or
97 97 perform other post-fixing work. The supported hooks are::
98 98
99 99 "postfixfile"
100 100 Run once for each file in each revision where any fixer tools made changes
101 101 to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
102 102 and "$HG_METADATA" with a map of fixer names to metadata values from fixer
103 103 tools that affected the file. Fixer tools that didn't affect the file have a
104 104 value of None. Only fixer tools that executed are present in the metadata.
105 105
106 106 "postfix"
107 107 Run once after all files and revisions have been handled. Provides
108 108 "$HG_REPLACEMENTS" with information about what revisions were created and
109 109 made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
110 110 files in the working copy were updated. Provides a list "$HG_METADATA"
111 111 mapping fixer tool names to lists of metadata values returned from
112 112 executions that modified a file. This aggregates the same metadata
113 113 previously passed to the "postfixfile" hook.
114 114
115 115 Fixer tools are run in the repository's root directory. This allows them to read
116 116 configuration files from the working copy, or even write to the working copy.
117 117 The working copy is not updated to match the revision being fixed. In fact,
118 118 several revisions may be fixed in parallel. Writes to the working copy are not
119 119 amended into the revision being fixed; fixer tools should always write fixed
120 120 file content back to stdout as documented above.
121 121 """
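The metadata protocol described above (UTF-8 JSON, then a zero byte, then the fixed file content on stdout) can be sketched as a tiny standalone tool. This is a hypothetical illustration, not part of the extension: the tab-expanding "fix" and the metadata keys are made up.

```python
import json

def fix_with_metadata(data):
    """Build the stdout payload `hg fix` expects when :metadata is true:
    JSON metadata, a zero byte separator, then the fixed file content."""
    fixed = data.replace(b'\t', b'    ')  # toy "fix": expand tabs to spaces
    # The metadata must be UTF-8 encoded JSON and must contain no zero bytes,
    # since the first zero byte on stdout marks where the file content starts.
    meta = json.dumps({'changed': fixed != data}).encode('utf-8')
    return meta + b'\0' + fixed

# A real tool would be invoked by hg fix with the file on stdin, e.g.:
#   sys.stdout.buffer.write(fix_with_metadata(sys.stdin.buffer.read()))
```

The extension splits the tool's output on the first zero byte (see `stdout.split(b'\0', 1)` in `fixfile` below), so any later zero bytes belong to the file content.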
122 122
123 123 from __future__ import absolute_import
124 124
125 125 import collections
126 126 import itertools
127 127 import json
128 128 import os
129 129 import re
130 130 import subprocess
131 131
132 132 from mercurial.i18n import _
133 133 from mercurial.node import nullrev
134 134 from mercurial.node import wdirrev
135 135 from mercurial.pycompat import setattr
136 136
137 137 from mercurial.utils import (
138 138 procutil,
139 139 stringutil,
140 140 )
141 141
142 142 from mercurial import (
143 143 cmdutil,
144 144 context,
145 145 copies,
146 146 error,
147 147 mdiff,
148 148 merge,
149 149 obsolete,
150 150 pycompat,
151 151 registrar,
152 152 scmutil,
153 153 util,
154 154 worker,
155 155 )
156 156
157 157 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
158 158 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
159 159 # be specifying the version(s) of Mercurial they are tested with, or
160 160 # leave the attribute unspecified.
161 161 testedwith = b'ships-with-hg-core'
162 162
163 163 cmdtable = {}
164 164 command = registrar.command(cmdtable)
165 165
166 166 configtable = {}
167 167 configitem = registrar.configitem(configtable)
168 168
169 169 # Register the suboptions allowed for each configured fixer, and default values.
170 170 FIXER_ATTRS = {
171 171 b'command': None,
172 172 b'linerange': None,
173 173 b'pattern': None,
174 174 b'priority': 0,
175 175 b'metadata': b'false',
176 176 b'skipclean': b'true',
177 177 b'enabled': b'true',
178 178 }
179 179
180 180 for key, default in FIXER_ATTRS.items():
181 configitem(b'fix', b'.*(:%s)?' % key, default=default, generic=True)
181 configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)
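The one-line change above is the substance of this commit. In the old pattern the suboption group was optional, so `.*` alone matched every `[fix]` item name and the defaults registered for different suboptions could collide; anchoring the pattern restricts each registration to names that actually end in that suboption. A rough demonstration with Python's `re` module (the matching done inside Mercurial's config machinery is simplified here):

```python
import re

key = 'command'
old = re.compile(r'.*(:%s)?' % key)  # optional group: '.*' alone matches anything
new = re.compile(r'.*:%s$' % key)    # anchored: only genuine ':command' items

# The old pattern fully matches an unrelated suboption, so the default
# registered for 'command' could shadow the one meant for 'priority'.
assert old.fullmatch('sort:priority') is not None
assert new.match('sort:priority') is None
assert new.match('sort:command') is not None
```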
182 182
183 183 # A good default size allows most source code files to be fixed, but avoids
184 184 # letting fixer tools choke on huge inputs, which could be surprising to the
185 185 # user.
186 186 configitem(b'fix', b'maxfilesize', default=b'2MB')
187 187
188 188 # Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
189 189 # This helps users write shell scripts that stop when a fixer tool signals a
190 190 # problem.
191 191 configitem(b'fix', b'failure', default=b'continue')
192 192
193 193
194 194 def checktoolfailureaction(ui, message, hint=None):
195 195 """Abort with 'message' if fix.failure=abort"""
196 196 action = ui.config(b'fix', b'failure')
197 197 if action not in (b'continue', b'abort'):
198 198 raise error.Abort(
199 199 _(b'unknown fix.failure action: %s') % (action,),
200 200 hint=_(b'use "continue" or "abort"'),
201 201 )
202 202 if action == b'abort':
203 203 raise error.Abort(message, hint=hint)
204 204
205 205
206 206 allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
207 207 baseopt = (
208 208 b'',
209 209 b'base',
210 210 [],
211 211 _(
212 212 b'revisions to diff against (overrides automatic '
213 213 b'selection, and applies to every revision being '
214 214 b'fixed)'
215 215 ),
216 216 _(b'REV'),
217 217 )
218 218 revopt = (b'r', b'rev', [], _(b'revisions to fix'), _(b'REV'))
219 219 wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
220 220 wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
221 221 usage = _(b'[OPTION]... [FILE]...')
222 222
223 223
224 224 @command(
225 225 b'fix',
226 226 [allopt, baseopt, revopt, wdiropt, wholeopt],
227 227 usage,
228 228 helpcategory=command.CATEGORY_FILE_CONTENTS,
229 229 )
230 230 def fix(ui, repo, *pats, **opts):
231 231 """rewrite file content in changesets or working directory
232 232
233 233 Runs any configured tools to fix the content of files. Only affects files
234 234 with changes, unless file arguments are provided. Only affects changed lines
235 235 of files, unless the --whole flag is used. Some tools may always affect the
236 236 whole file regardless of --whole.
237 237
238 238 If revisions are specified with --rev, those revisions will be checked, and
239 239 they may be replaced with new revisions that have fixed file content. It is
240 240 desirable to specify all descendants of each specified revision, so that the
241 241 fixes propagate to the descendants. If all descendants are fixed at the same
242 242 time, no merging, rebasing, or evolution will be required.
243 243
244 244 If --working-dir is used, files with uncommitted changes in the working copy
245 245 will be fixed. If the checked-out revision is also fixed, the working
246 246 directory will update to the replacement revision.
247 247
248 248 When determining what lines of each file to fix at each revision, the whole
249 249 set of revisions being fixed is considered, so that fixes to earlier
250 250 revisions are not forgotten in later ones. The --base flag can be used to
251 251 override this default behavior, though it is not usually desirable to do so.
252 252 """
253 253 opts = pycompat.byteskwargs(opts)
254 254 if opts[b'all']:
255 255 if opts[b'rev']:
256 256 raise error.Abort(_(b'cannot specify both "--rev" and "--all"'))
257 257 opts[b'rev'] = [b'not public() and not obsolete()']
258 258 opts[b'working_dir'] = True
259 259 with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
260 260 revstofix = getrevstofix(ui, repo, opts)
261 261 basectxs = getbasectxs(repo, opts, revstofix)
262 262 workqueue, numitems = getworkqueue(
263 263 ui, repo, pats, opts, revstofix, basectxs
264 264 )
265 265 fixers = getfixers(ui)
266 266
267 267 # There are no data dependencies between the workers fixing each file
268 268 # revision, so we can use all available parallelism.
269 269 def getfixes(items):
270 270 for rev, path in items:
271 271 ctx = repo[rev]
272 272 olddata = ctx[path].data()
273 273 metadata, newdata = fixfile(
274 274 ui, repo, opts, fixers, ctx, path, basectxs[rev]
275 275 )
276 276 # Don't waste memory/time passing unchanged content back, but
277 277 # produce one result per item either way.
278 278 yield (
279 279 rev,
280 280 path,
281 281 metadata,
282 282 newdata if newdata != olddata else None,
283 283 )
284 284
285 285 results = worker.worker(
286 286 ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
287 287 )
288 288
289 289 # We have to hold on to the data for each successor revision in memory
290 290 # until all its parents are committed. We ensure this by committing and
291 291 # freeing memory for the revisions in some topological order. This
292 292 # leaves a little bit of memory efficiency on the table, but also makes
293 293 # the tests deterministic. It might also be considered a feature since
294 294 # it makes the results more easily reproducible.
295 295 filedata = collections.defaultdict(dict)
296 296 aggregatemetadata = collections.defaultdict(list)
297 297 replacements = {}
298 298 wdirwritten = False
299 299 commitorder = sorted(revstofix, reverse=True)
300 300 with ui.makeprogress(
301 301 topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
302 302 ) as progress:
303 303 for rev, path, filerevmetadata, newdata in results:
304 304 progress.increment(item=path)
305 305 for fixername, fixermetadata in filerevmetadata.items():
306 306 aggregatemetadata[fixername].append(fixermetadata)
307 307 if newdata is not None:
308 308 filedata[rev][path] = newdata
309 309 hookargs = {
310 310 b'rev': rev,
311 311 b'path': path,
312 312 b'metadata': filerevmetadata,
313 313 }
314 314 repo.hook(
315 315 b'postfixfile',
316 316 throw=False,
317 317 **pycompat.strkwargs(hookargs)
318 318 )
319 319 numitems[rev] -= 1
320 320 # Apply the fixes for this and any other revisions that are
321 321 # ready and sitting at the front of the queue. Using a loop here
322 322 # prevents the queue from being blocked by the first revision to
323 323 # be ready out of order.
324 324 while commitorder and not numitems[commitorder[-1]]:
325 325 rev = commitorder.pop()
326 326 ctx = repo[rev]
327 327 if rev == wdirrev:
328 328 writeworkingdir(repo, ctx, filedata[rev], replacements)
329 329 wdirwritten = bool(filedata[rev])
330 330 else:
331 331 replacerev(ui, repo, ctx, filedata[rev], replacements)
332 332 del filedata[rev]
333 333
334 334 cleanup(repo, replacements, wdirwritten)
335 335 hookargs = {
336 336 b'replacements': replacements,
337 337 b'wdirwritten': wdirwritten,
338 338 b'metadata': aggregatemetadata,
339 339 }
340 340 repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))
341 341
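The `hookargs` dicts assembled above reach in-process Python hooks as plain keyword arguments, the same values that shell hooks see as `$HG_REV`, `$HG_PATH`, and `$HG_METADATA`. As a sketch only (the hook name `report` and its hgrc wiring are hypothetical), a "postfixfile" hook might look like:

```python
# Hypothetical hgrc wiring:
#   [hooks]
#   postfixfile.report = python:myhooks.report
def report(ui, repo, hooktype=None, rev=None, path=None, metadata=None,
           **kwargs):
    # "metadata" maps each executed fixer's name to the JSON value it
    # returned, or None if the tool ran but reported no metadata.
    for fixer, value in sorted((metadata or {}).items()):
        if value is not None:
            ui.status(b'%s: fixer "%s" returned metadata\n' % (path, fixer))
```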
342 342
343 343 def cleanup(repo, replacements, wdirwritten):
344 344 """Calls scmutil.cleanupnodes() with the given replacements.
345 345
346 346 "replacements" is a dict from nodeid to nodeid, with one key and one value
347 347 for every revision that was affected by fixing. This is slightly different
348 348 from cleanupnodes().
349 349
350 350 "wdirwritten" is a bool which tells whether the working copy was affected by
351 351 fixing, since it has no entry in "replacements".
352 352
353 353 Useful as a hook point for extending "hg fix" with output summarizing the
354 354 effects of the command, though we choose not to output anything here.
355 355 """
356 356 replacements = {
357 357 prec: [succ] for prec, succ in pycompat.iteritems(replacements)
358 358 }
359 359 scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)
360 360
361 361
362 362 def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
363 363 """Constructs the list of files to be fixed at specific revisions
364 364
365 365 It is up to the caller how to consume the work items, and the only
366 366 dependence between them is that replacement revisions must be committed in
367 367 topological order. Each work item represents a file in the working copy or
368 368 in some revision that should be fixed and written back to the working copy
369 369 or into a replacement revision.
370 370
371 371 Work items for the same revision are grouped together, so that a worker
372 372 pool starting with the first N items in parallel is likely to finish the
373 373 first revision's work before other revisions. This can allow us to write
374 374 the result to disk and reduce memory footprint. At time of writing, the
375 375 partition strategy in worker.py seems favorable to this. We also sort the
376 376 items by ascending revision number to match the order in which we commit
377 377 the fixes later.
378 378 """
379 379 workqueue = []
380 380 numitems = collections.defaultdict(int)
381 381 maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
382 382 for rev in sorted(revstofix):
383 383 fixctx = repo[rev]
384 384 match = scmutil.match(fixctx, pats, opts)
385 385 for path in sorted(
386 386 pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
387 387 ):
388 388 fctx = fixctx[path]
389 389 if fctx.islink():
390 390 continue
391 391 if fctx.size() > maxfilesize:
392 392 ui.warn(
393 393 _(b'ignoring file larger than %s: %s\n')
394 394 % (util.bytecount(maxfilesize), path)
395 395 )
396 396 continue
397 397 workqueue.append((rev, path))
398 398 numitems[rev] += 1
399 399 return workqueue, numitems
400 400
401 401
402 402 def getrevstofix(ui, repo, opts):
403 403 """Returns the set of revision numbers that should be fixed"""
404 404 revs = set(scmutil.revrange(repo, opts[b'rev']))
405 405 for rev in revs:
406 406 checkfixablectx(ui, repo, repo[rev])
407 407 if revs:
408 408 cmdutil.checkunfinished(repo)
409 409 checknodescendants(repo, revs)
410 410 if opts.get(b'working_dir'):
411 411 revs.add(wdirrev)
412 412 if list(merge.mergestate.read(repo).unresolved()):
413 413 raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
414 414 if not revs:
415 415 raise error.Abort(
416 416 b'no changesets specified', hint=b'use --rev or --working-dir'
417 417 )
418 418 return revs
419 419
420 420
421 421 def checknodescendants(repo, revs):
422 422 if not obsolete.isenabled(repo, obsolete.allowunstableopt) and repo.revs(
423 423 b'(%ld::) - (%ld)', revs, revs
424 424 ):
425 425 raise error.Abort(
426 426 _(b'can only fix a changeset together with all its descendants')
427 427 )
428 428
429 429
430 430 def checkfixablectx(ui, repo, ctx):
431 431 """Aborts if the revision shouldn't be replaced with a fixed one."""
432 432 if not ctx.mutable():
433 433 raise error.Abort(
434 434 b'can\'t fix immutable changeset %s'
435 435 % (scmutil.formatchangeid(ctx),)
436 436 )
437 437 if ctx.obsolete():
438 438 # It would be better to actually check if the revision has a successor.
439 439 allowdivergence = ui.configbool(
440 440 b'experimental', b'evolution.allowdivergence'
441 441 )
442 442 if not allowdivergence:
443 443 raise error.Abort(
444 444 b'fixing obsolete revision could cause divergence'
445 445 )
446 446
447 447
448 448 def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
449 449 """Returns the set of files that should be fixed in a context
450 450
451 451 The result depends on the base contexts; we include any file that has
452 452 changed relative to any of the base contexts. Base contexts should be
453 453 ancestors of the context being fixed.
454 454 """
455 455 files = set()
456 456 for basectx in basectxs:
457 457 stat = basectx.status(
458 458 fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
459 459 )
460 460 files.update(
461 461 set(
462 462 itertools.chain(
463 463 stat.added, stat.modified, stat.clean, stat.unknown
464 464 )
465 465 )
466 466 )
467 467 return files
468 468
469 469
470 470 def lineranges(opts, path, basectxs, fixctx, content2):
471 471 """Returns the set of line ranges that should be fixed in a file
472 472
473 473 Of the form [(10, 20), (30, 40)].
474 474
475 475 This depends on the given base contexts; we must consider lines that have
476 476 changed versus any of the base contexts, and whether the file has been
477 477 renamed versus any of them.
478 478
479 479 Another way to understand this is that we exclude line ranges that are
480 480 common to the file in all base contexts.
481 481 """
482 482 if opts.get(b'whole'):
483 483 # Return a range containing all lines. Rely on the diff implementation's
484 484 # idea of how many lines are in the file, instead of reimplementing it.
485 485 return difflineranges(b'', content2)
486 486
487 487 rangeslist = []
488 488 for basectx in basectxs:
489 489 basepath = copies.pathcopies(basectx, fixctx).get(path, path)
490 490 if basepath in basectx:
491 491 content1 = basectx[basepath].data()
492 492 else:
493 493 content1 = b''
494 494 rangeslist.extend(difflineranges(content1, content2))
495 495 return unionranges(rangeslist)
496 496
497 497
498 498 def unionranges(rangeslist):
499 499 """Return the union of some closed intervals
500 500
501 501 >>> unionranges([])
502 502 []
503 503 >>> unionranges([(1, 100)])
504 504 [(1, 100)]
505 505 >>> unionranges([(1, 100), (1, 100)])
506 506 [(1, 100)]
507 507 >>> unionranges([(1, 100), (2, 100)])
508 508 [(1, 100)]
509 509 >>> unionranges([(1, 99), (1, 100)])
510 510 [(1, 100)]
511 511 >>> unionranges([(1, 100), (40, 60)])
512 512 [(1, 100)]
513 513 >>> unionranges([(1, 49), (50, 100)])
514 514 [(1, 100)]
515 515 >>> unionranges([(1, 48), (50, 100)])
516 516 [(1, 48), (50, 100)]
517 517 >>> unionranges([(1, 2), (3, 4), (5, 6)])
518 518 [(1, 6)]
519 519 """
520 520 rangeslist = sorted(set(rangeslist))
521 521 unioned = []
522 522 if rangeslist:
523 523 unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
524 524 for a, b in rangeslist:
525 525 c, d = unioned[-1]
526 526 if a > d + 1:
527 527 unioned.append((a, b))
528 528 else:
529 529 unioned[-1] = (c, max(b, d))
530 530 return unioned
531 531
532 532
533 533 def difflineranges(content1, content2):
534 534 """Return list of line number ranges in content2 that differ from content1.
535 535
536 536 Line numbers are 1-based. The numbers are the first and last line contained
537 537 in the range. Single-line ranges have the same line number for the first and
538 538 last line. Excludes any empty ranges that result from lines that are only
539 539 present in content1. Relies on mdiff's idea of where the line endings are in
540 540 the string.
541 541
542 542 >>> from mercurial import pycompat
543 543 >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
544 544 >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
545 545 >>> difflineranges2(b'', b'')
546 546 []
547 547 >>> difflineranges2(b'a', b'')
548 548 []
549 549 >>> difflineranges2(b'', b'A')
550 550 [(1, 1)]
551 551 >>> difflineranges2(b'a', b'a')
552 552 []
553 553 >>> difflineranges2(b'a', b'A')
554 554 [(1, 1)]
555 555 >>> difflineranges2(b'ab', b'')
556 556 []
557 557 >>> difflineranges2(b'', b'AB')
558 558 [(1, 2)]
559 559 >>> difflineranges2(b'abc', b'ac')
560 560 []
561 561 >>> difflineranges2(b'ab', b'aCb')
562 562 [(2, 2)]
563 563 >>> difflineranges2(b'abc', b'aBc')
564 564 [(2, 2)]
565 565 >>> difflineranges2(b'ab', b'AB')
566 566 [(1, 2)]
567 567 >>> difflineranges2(b'abcde', b'aBcDe')
568 568 [(2, 2), (4, 4)]
569 569 >>> difflineranges2(b'abcde', b'aBCDe')
570 570 [(2, 4)]
571 571 """
572 572 ranges = []
573 573 for lines, kind in mdiff.allblocks(content1, content2):
574 574 firstline, lastline = lines[2:4]
575 575 if kind == b'!' and firstline != lastline:
576 576 ranges.append((firstline + 1, lastline))
577 577 return ranges
578 578
579 579
580 580 def getbasectxs(repo, opts, revstofix):
581 581 """Returns a map of the base contexts for each revision
582 582
583 583 The base contexts determine which lines are considered modified when we
584 584 attempt to fix just the modified lines in a file. It also determines which
585 585 files we attempt to fix, so it is important to compute this even when
586 586 --whole is used.
587 587 """
588 588 # The --base flag overrides the usual logic, and we give every revision
589 589 # exactly the set of baserevs that the user specified.
590 590 if opts.get(b'base'):
591 591 baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
592 592 if not baserevs:
593 593 baserevs = {nullrev}
594 594 basectxs = {repo[rev] for rev in baserevs}
595 595 return {rev: basectxs for rev in revstofix}
596 596
597 597 # Proceed in topological order so that we can easily determine each
598 598 # revision's baserevs by looking at its parents and their baserevs.
599 599 basectxs = collections.defaultdict(set)
600 600 for rev in sorted(revstofix):
601 601 ctx = repo[rev]
602 602 for pctx in ctx.parents():
603 603 if pctx.rev() in basectxs:
604 604 basectxs[rev].update(basectxs[pctx.rev()])
605 605 else:
606 606 basectxs[rev].add(pctx)
607 607 return basectxs
608 608
609 609
610 610 def fixfile(ui, repo, opts, fixers, fixctx, path, basectxs):
611 611 """Run any configured fixers that should affect the file in this context
612 612
613 613 Returns the file content that results from applying the fixers in some order
614 614 starting with the file's content in the fixctx. Fixers that support line
615 615 ranges will affect lines that have changed relative to any of the basectxs
616 616 (i.e. they will only avoid lines that are common to all basectxs).
617 617
618 618 A fixer tool's stdout will become the file's new content if and only if it
619 619 exits with code zero. The fixer tool's working directory is the repository's
620 620 root.
621 621 """
622 622 metadata = {}
623 623 newdata = fixctx[path].data()
624 624 for fixername, fixer in pycompat.iteritems(fixers):
625 625 if fixer.affects(opts, fixctx, path):
626 626 ranges = lineranges(opts, path, basectxs, fixctx, newdata)
627 627 command = fixer.command(ui, path, ranges)
628 628 if command is None:
629 629 continue
630 630 ui.debug(b'subprocess: %s\n' % (command,))
631 631 proc = subprocess.Popen(
632 632 procutil.tonativestr(command),
633 633 shell=True,
634 634 cwd=procutil.tonativestr(repo.root),
635 635 stdin=subprocess.PIPE,
636 636 stdout=subprocess.PIPE,
637 637 stderr=subprocess.PIPE,
638 638 )
639 639 stdout, stderr = proc.communicate(newdata)
640 640 if stderr:
641 641 showstderr(ui, fixctx.rev(), fixername, stderr)
642 642 newerdata = stdout
643 643 if fixer.shouldoutputmetadata():
644 644 try:
645 645 metadatajson, newerdata = stdout.split(b'\0', 1)
646 646 metadata[fixername] = json.loads(metadatajson)
647 647 except ValueError:
648 648 ui.warn(
649 649 _(b'ignored invalid output from fixer tool: %s\n')
650 650 % (fixername,)
651 651 )
652 652 continue
653 653 else:
654 654 metadata[fixername] = None
655 655 if proc.returncode == 0:
656 656 newdata = newerdata
657 657 else:
658 658 if not stderr:
659 659 message = _(b'exited with status %d\n') % (proc.returncode,)
660 660 showstderr(ui, fixctx.rev(), fixername, message)
661 661 checktoolfailureaction(
662 662 ui,
663 663 _(b'no fixes will be applied'),
664 664 hint=_(
665 665 b'use --config fix.failure=continue to apply any '
666 666 b'successful fixes anyway'
667 667 ),
668 668 )
669 669 return metadata, newdata
670 670
671 671
672 672 def showstderr(ui, rev, fixername, stderr):
673 673 """Writes the lines of the stderr string as warnings on the ui
674 674
675 675 Uses the revision number and fixername to give more context to each line of
676 676 the error message. Doesn't include file names, since those take up a lot of
677 677 space and would tend to be included in the error message if they were
678 678 relevant.
679 679 """
680 680 for line in re.split(b'[\r\n]+', stderr):
681 681 if line:
682 682 ui.warn(b'[')
683 683 if rev is None:
684 684 ui.warn(_(b'wdir'), label=b'evolve.rev')
685 685 else:
686 686 ui.warn((b'%d' % rev), label=b'evolve.rev')
687 687 ui.warn(b'] %s: %s\n' % (fixername, line))
688 688
689 689
690 690 def writeworkingdir(repo, ctx, filedata, replacements):
691 691 """Write new content to the working copy and check out the new p1 if any
692 692
693 693 We check out a new revision if and only if we fixed something in both the
694 694 working directory and its parent revision. This avoids the need for a full
695 695 update/merge, and means that the working directory simply isn't affected
696 696 unless the --working-dir flag is given.
697 697
698 698 Directly updates the dirstate for the affected files.
699 699 """
700 700 for path, data in pycompat.iteritems(filedata):
701 701 fctx = ctx[path]
702 702 fctx.write(data, fctx.flags())
703 703 if repo.dirstate[path] == b'n':
704 704 repo.dirstate.normallookup(path)
705 705
706 706 oldparentnodes = repo.dirstate.parents()
707 707 newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
708 708 if newparentnodes != oldparentnodes:
709 709 repo.setparents(*newparentnodes)
710 710
711 711
712 712 def replacerev(ui, repo, ctx, filedata, replacements):
713 713 """Commit a new revision like the given one, but with file content changes
714 714
715 715 "ctx" is the original revision to be replaced by a modified one.
716 716
717 717 "filedata" is a dict that maps paths to their new file content. All other
718 718 paths will be recreated from the original revision without changes.
719 719 "filedata" may contain paths that didn't exist in the original revision;
720 720 they will be added.
721 721
722 722 "replacements" is a dict that maps a single node to a single node, and it is
723 723 updated to indicate the original revision is replaced by the newly created
724 724 one. No entry is added if the replacement's node already exists.
725 725
726 726 The new revision has the same parents as the old one, unless those parents
727 727 have already been replaced, in which case those replacements are the parents
728 728 of this new revision. Thus, if revisions are replaced in topological order,
729 729 there is no need to rebase them into the original topology later.
730 730 """
731 731
732 732 p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
733 733 p1ctx, p2ctx = repo[p1rev], repo[p2rev]
734 734 newp1node = replacements.get(p1ctx.node(), p1ctx.node())
735 735 newp2node = replacements.get(p2ctx.node(), p2ctx.node())
736 736
737 737 # We don't want to create a revision that has no changes from the original,
738 738 # but we should if the original revision's parent has been replaced.
739 739 # Otherwise, we would produce an orphan that needs no actual human
740 740 # intervention to evolve. We can't rely on commit() to avoid creating the
741 741 # un-needed revision because the extra field added below produces a new hash
742 742 # regardless of file content changes.
743 743 if (
744 744 not filedata
745 745 and p1ctx.node() not in replacements
746 746 and p2ctx.node() not in replacements
747 747 ):
748 748 return
749 749
750 750 def filectxfn(repo, memctx, path):
751 751 if path not in ctx:
752 752 return None
753 753 fctx = ctx[path]
754 754 copysource = fctx.copysource()
755 755 return context.memfilectx(
756 756 repo,
757 757 memctx,
758 758 path=fctx.path(),
759 759 data=filedata.get(path, fctx.data()),
760 760 islink=fctx.islink(),
761 761 isexec=fctx.isexec(),
762 762 copysource=copysource,
763 763 )
764 764
765 765 extra = ctx.extra().copy()
766 766 extra[b'fix_source'] = ctx.hex()
767 767
768 768 memctx = context.memctx(
769 769 repo,
770 770 parents=(newp1node, newp2node),
771 771 text=ctx.description(),
772 772 files=set(ctx.files()) | set(filedata.keys()),
773 773 filectxfn=filectxfn,
774 774 user=ctx.user(),
775 775 date=ctx.date(),
776 776 extra=extra,
777 777 branch=ctx.branch(),
778 778 editor=None,
779 779 )
780 780 sucnode = memctx.commit()
781 781 prenode = ctx.node()
782 782 if prenode == sucnode:
783 783 ui.debug(b'node %s already existed\n' % (ctx.hex()))
784 784 else:
785 785 replacements[ctx.node()] = sucnode
786 786
787 787
788 788 def getfixers(ui):
789 789 """Returns a map of configured fixer tools indexed by their names
790 790
791 791 Each value is a Fixer object with methods that implement the behavior of the
792 792 fixer's config suboptions. Does not validate the config values.
793 793 """
794 794 fixers = {}
795 795 for name in fixernames(ui):
796 796 fixers[name] = Fixer()
797 attrs = ui.configsuboptions(b'fix', name)[1]
798 797 for key, default in FIXER_ATTRS.items():
799 798 setattr(
800 799 fixers[name],
801 800 pycompat.sysstr(b'_' + key),
802 attrs.get(key, default),
801 ui.config(b'fix', name + b':' + key, default),
803 802 )
804 803 fixers[name]._priority = int(fixers[name]._priority)
805 804 fixers[name]._metadata = stringutil.parsebool(fixers[name]._metadata)
806 805 fixers[name]._skipclean = stringutil.parsebool(fixers[name]._skipclean)
807 806 fixers[name]._enabled = stringutil.parsebool(fixers[name]._enabled)
808 807 # Don't use a fixer if it has no pattern configured. It would be
809 808 # dangerous to let it affect all files. It would be pointless to let it
810 809 # affect no files. There is no reasonable subset of files to use as the
811 810 # default.
812 811 if fixers[name]._pattern is None:
813 812 ui.warn(
814 813 _(b'fixer tool has no pattern configuration: %s\n') % (name,)
815 814 )
816 815 del fixers[name]
817 816 elif not fixers[name]._enabled:
818 817 ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
819 818 del fixers[name]
820 819 return collections.OrderedDict(
821 820 sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
822 821 )
823 822
824 823
825 824 def fixernames(ui):
826 825 """Returns the names of [fix] config options that have suboptions"""
827 826 names = set()
828 827 for k, v in ui.configitems(b'fix'):
829 828 if b':' in k:
830 829 names.add(k.split(b':', 1)[0])
831 830 return names
832 831
833 832
834 833 class Fixer(object):
835 834 """Wraps the raw config values for a fixer with methods"""
836 835
837 836 def affects(self, opts, fixctx, path):
838 837 """Should this fixer run on the file at the given path and context?"""
839 838 return self._pattern is not None and scmutil.match(
840 839 fixctx, [self._pattern], opts
841 840 )(path)
842 841
843 842 def shouldoutputmetadata(self):
844 843 """Should the stdout of this fixer start with JSON and a null byte?"""
845 844 return self._metadata
846 845
847 846 def command(self, ui, path, ranges):
848 847 """A shell command to use to invoke this fixer on the given file/lines
849 848
850 849 May return None if there is no appropriate command to run for the given
851 850 parameters.
852 851 """
853 852 expand = cmdutil.rendercommandtemplate
854 853 parts = [
855 854 expand(
856 855 ui,
857 856 self._command,
858 857 {b'rootpath': path, b'basename': os.path.basename(path)},
859 858 )
860 859 ]
861 860 if self._linerange:
862 861 if self._skipclean and not ranges:
863 862 # No line ranges to fix, so don't run the fixer.
864 863 return None
865 864 for first, last in ranges:
866 865 parts.append(
867 866 expand(
868 867 ui, self._linerange, {b'first': first, b'last': last}
869 868 )
870 869 )
871 870 return b' '.join(parts)