fix: pass line ranges as value instead of callback...
Danny Hooper
r43003:e9f50307 default
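The refactor this commit describes can be sketched outside Mercurial as follows. This is a minimal illustration, not the actual extension code: before the change, `Fixer.command()` received a zero-argument callback and computed the line ranges lazily; after the change, the caller computes the ranges once and passes the value in.

```python
# Minimal sketch (hypothetical, not the real Mercurial fix extension) of
# the refactor: command() takes the line ranges as a value rather than a
# callback that produces them.

def command_with_callback(linerange_template, rangesfn):
    # Old shape: ranges computed lazily inside command() via rangesfn().
    parts = []
    for first, last in rangesfn():
        parts.append(linerange_template.format(first=first, last=last))
    return ' '.join(parts)

def command_with_value(linerange_template, ranges):
    # New shape: the caller computes ranges once and passes the list.
    parts = []
    for first, last in ranges:
        parts.append(linerange_template.format(first=first, last=last))
    return ' '.join(parts)

ranges = [(1, 5), (10, 12)]
assert (command_with_callback('--lines={first}:{last}', lambda: ranges)
        == command_with_value('--lines={first}:{last}', ranges)
        == '--lines=1:5 --lines=10:12')
```

Both shapes render the same command-line fragment; the value-passing form simply moves the computation to the call site, which the diff below does in `fixfile()`.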
@@ -1,778 +1,777 b''
1 1 # fix - rewrite file content in changesets and working copy
2 2 #
3 3 # Copyright 2018 Google LLC.
4 4 #
5 5 # This software may be used and distributed according to the terms of the
6 6 # GNU General Public License version 2 or any later version.
7 7 """rewrite file content in changesets or working copy (EXPERIMENTAL)
8 8
9 9 Provides a command that runs configured tools on the contents of modified files,
10 10 writing back any fixes to the working copy or replacing changesets.
11 11
12 12 Here is an example configuration that causes :hg:`fix` to apply automatic
13 13 formatting fixes to modified lines in C++ code::
14 14
15 15 [fix]
16 16 clang-format:command=clang-format --assume-filename={rootpath}
17 17 clang-format:linerange=--lines={first}:{last}
18 18 clang-format:pattern=set:**.cpp or **.hpp
19 19
20 20 The :command suboption forms the first part of the shell command that will be
21 21 used to fix a file. The content of the file is passed on standard input, and the
22 22 fixed file content is expected on standard output. Any output on standard error
23 23 will be displayed as a warning. If the exit status is not zero, the file will
24 24 not be affected. A placeholder warning is displayed if there is a non-zero exit
25 25 status but no standard error output. Some values may be substituted into the
26 26 command::
27 27
28 28 {rootpath} The path of the file being fixed, relative to the repo root
29 29 {basename} The name of the file being fixed, without the directory path
30 30
31 31 If the :linerange suboption is set, the tool will only be run if there are
32 32 changed lines in a file. The value of this suboption is appended to the shell
33 33 command once for every range of changed lines in the file. Some values may be
34 34 substituted into the command::
35 35
36 36 {first} The 1-based line number of the first line in the modified range
37 37 {last} The 1-based line number of the last line in the modified range
38 38
39 39 Deleted sections of a file will be ignored by :linerange, because there is no
40 40 corresponding line range in the version being fixed.
41 41
42 42 By default, tools that set :linerange will only be executed if there is at least
43 43 one changed line range. This is meant to prevent accidents like running a code
44 44 formatter in such a way that it unexpectedly reformats the whole file. If such a
45 45 tool needs to operate on unchanged files, it should set the :skipclean suboption
46 46 to false.
47 47
48 48 The :pattern suboption determines which files will be passed through each
49 49 configured tool. See :hg:`help patterns` for possible values. If there are file
50 50 arguments to :hg:`fix`, the intersection of these patterns is used.
51 51
52 52 There is also a configurable limit for the maximum size of file that will be
53 53 processed by :hg:`fix`::
54 54
55 55 [fix]
56 56 maxfilesize = 2MB
57 57
58 58 Normally, execution of configured tools will continue after a failure (indicated
59 59 by a non-zero exit status). It can also be configured to abort after the first
60 60 such failure, so that no files will be affected if any tool fails. This abort
61 61 will also cause :hg:`fix` to exit with a non-zero status::
62 62
63 63 [fix]
64 64 failure = abort
65 65
66 66 When multiple tools are configured to affect a file, they execute in an order
67 67 defined by the :priority suboption. The priority suboption has a default value
68 68 of zero for each tool. Tools are executed in order of descending priority. The
69 69 execution order of tools with equal priority is unspecified. For example, you
70 70 could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
71 71 in a text file by ensuring that 'sort' runs before 'head'::
72 72
73 73 [fix]
74 74 sort:command = sort -n
75 75 head:command = head -n 10
76 76 sort:pattern = numbers.txt
77 77 head:pattern = numbers.txt
78 78 sort:priority = 2
79 79 head:priority = 1
80 80
81 81 To account for changes made by each tool, the line numbers used for incremental
82 82 formatting are recomputed before executing the next tool. So, each tool may see
83 83 different values for the arguments added by the :linerange suboption.
84 84
85 85 Each fixer tool is allowed to return some metadata in addition to the fixed file
86 86 content. The metadata must be placed before the file content on stdout,
87 87 separated from the file content by a zero byte. The metadata is parsed as a JSON
88 88 value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
89 89 is expected to produce this metadata encoding if and only if the :metadata
90 90 suboption is true::
91 91
92 92 [fix]
93 93 tool:command = tool --prepend-json-metadata
94 94 tool:metadata = true
95 95
96 96 The metadata values are passed to hooks, which can be used to print summaries or
97 97 perform other post-fixing work. The supported hooks are::
98 98
99 99 "postfixfile"
100 100 Run once for each file in each revision where any fixer tools made changes
101 101 to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
102 102 and "$HG_METADATA" with a map of fixer names to metadata values from fixer
103 103 tools that affected the file. Fixer tools that didn't affect the file have a
104 104 value of None. Only fixer tools that executed are present in the metadata.
105 105
106 106 "postfix"
107 107 Run once after all files and revisions have been handled. Provides
108 108 "$HG_REPLACEMENTS" with information about what revisions were created and
109 109 made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
110 110 files in the working copy were updated. Provides a list "$HG_METADATA"
111 111 mapping fixer tool names to lists of metadata values returned from
112 112 executions that modified a file. This aggregates the same metadata
113 113 previously passed to the "postfixfile" hook.
114 114
115 115 Fixer tools are run in the repository's root directory. This allows them to read
116 116 configuration files from the working copy, or even write to the working copy.
117 117 The working copy is not updated to match the revision being fixed. In fact,
118 118 several revisions may be fixed in parallel. Writes to the working copy are not
119 119 amended into the revision being fixed; fixer tools should always write fixed
120 120 file content back to stdout as documented above.
121 121 """
122 122
123 123 from __future__ import absolute_import
124 124
125 125 import collections
126 126 import itertools
127 127 import json
128 128 import os
129 129 import re
130 130 import subprocess
131 131
132 132 from mercurial.i18n import _
133 133 from mercurial.node import nullrev
134 134 from mercurial.node import wdirrev
135 135
136 136 from mercurial.utils import (
137 137 procutil,
138 138 stringutil,
139 139 )
140 140
141 141 from mercurial import (
142 142 cmdutil,
143 143 context,
144 144 copies,
145 145 error,
146 146 mdiff,
147 147 merge,
148 148 obsolete,
149 149 pycompat,
150 150 registrar,
151 151 scmutil,
152 152 util,
153 153 worker,
154 154 )
155 155
156 156 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
157 157 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
158 158 # be specifying the version(s) of Mercurial they are tested with, or
159 159 # leave the attribute unspecified.
160 160 testedwith = 'ships-with-hg-core'
161 161
162 162 cmdtable = {}
163 163 command = registrar.command(cmdtable)
164 164
165 165 configtable = {}
166 166 configitem = registrar.configitem(configtable)
167 167
168 168 # Register the suboptions allowed for each configured fixer, and default values.
169 169 FIXER_ATTRS = {
170 170 'command': None,
171 171 'linerange': None,
172 172 'pattern': None,
173 173 'priority': 0,
174 174 'metadata': 'false',
175 175 'skipclean': 'true',
176 176 }
177 177
178 178 for key, default in FIXER_ATTRS.items():
179 179 configitem('fix', '.*(:%s)?' % key, default=default, generic=True)
180 180
181 181 # A good default size allows most source code files to be fixed, but avoids
182 182 # letting fixer tools choke on huge inputs, which could be surprising to the
183 183 # user.
184 184 configitem('fix', 'maxfilesize', default='2MB')
185 185
186 186 # Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
187 187 # This helps users write shell scripts that stop when a fixer tool signals a
188 188 # problem.
189 189 configitem('fix', 'failure', default='continue')
190 190
191 191 def checktoolfailureaction(ui, message, hint=None):
192 192 """Abort with 'message' if fix.failure=abort"""
193 193 action = ui.config('fix', 'failure')
194 194 if action not in ('continue', 'abort'):
195 195 raise error.Abort(_('unknown fix.failure action: %s') % (action,),
196 196 hint=_('use "continue" or "abort"'))
197 197 if action == 'abort':
198 198 raise error.Abort(message, hint=hint)
199 199
200 200 allopt = ('', 'all', False, _('fix all non-public non-obsolete revisions'))
201 201 baseopt = ('', 'base', [], _('revisions to diff against (overrides automatic '
202 202 'selection, and applies to every revision being '
203 203 'fixed)'), _('REV'))
204 204 revopt = ('r', 'rev', [], _('revisions to fix'), _('REV'))
205 205 wdiropt = ('w', 'working-dir', False, _('fix the working directory'))
206 206 wholeopt = ('', 'whole', False, _('always fix every line of a file'))
207 207 usage = _('[OPTION]... [FILE]...')
208 208
209 209 @command('fix', [allopt, baseopt, revopt, wdiropt, wholeopt], usage,
210 210 helpcategory=command.CATEGORY_FILE_CONTENTS)
211 211 def fix(ui, repo, *pats, **opts):
212 212 """rewrite file content in changesets or working directory
213 213
214 214 Runs any configured tools to fix the content of files. Only affects files
215 215 with changes, unless file arguments are provided. Only affects changed lines
216 216 of files, unless the --whole flag is used. Some tools may always affect the
217 217 whole file regardless of --whole.
218 218
219 219 If revisions are specified with --rev, those revisions will be checked, and
220 220 they may be replaced with new revisions that have fixed file content. It is
221 221 desirable to specify all descendants of each specified revision, so that the
222 222 fixes propagate to the descendants. If all descendants are fixed at the same
223 223 time, no merging, rebasing, or evolution will be required.
224 224
225 225 If --working-dir is used, files with uncommitted changes in the working copy
226 226 will be fixed. If the checked-out revision is also fixed, the working
227 227 directory will update to the replacement revision.
228 228
229 229 When determining what lines of each file to fix at each revision, the whole
230 230 set of revisions being fixed is considered, so that fixes to earlier
231 231 revisions are not forgotten in later ones. The --base flag can be used to
232 232 override this default behavior, though it is not usually desirable to do so.
233 233 """
234 234 opts = pycompat.byteskwargs(opts)
235 235 if opts['all']:
236 236 if opts['rev']:
237 237 raise error.Abort(_('cannot specify both "--rev" and "--all"'))
238 238 opts['rev'] = ['not public() and not obsolete()']
239 239 opts['working_dir'] = True
240 240 with repo.wlock(), repo.lock(), repo.transaction('fix'):
241 241 revstofix = getrevstofix(ui, repo, opts)
242 242 basectxs = getbasectxs(repo, opts, revstofix)
243 243 workqueue, numitems = getworkqueue(ui, repo, pats, opts, revstofix,
244 244 basectxs)
245 245 fixers = getfixers(ui)
246 246
247 247 # There are no data dependencies between the workers fixing each file
248 248 # revision, so we can use all available parallelism.
249 249 def getfixes(items):
250 250 for rev, path in items:
251 251 ctx = repo[rev]
252 252 olddata = ctx[path].data()
253 253 metadata, newdata = fixfile(ui, repo, opts, fixers, ctx, path,
254 254 basectxs[rev])
255 255 # Don't waste memory/time passing unchanged content back, but
256 256 # produce one result per item either way.
257 257 yield (rev, path, metadata,
258 258 newdata if newdata != olddata else None)
259 259 results = worker.worker(ui, 1.0, getfixes, tuple(), workqueue,
260 260 threadsafe=False)
261 261
262 262 # We have to hold on to the data for each successor revision in memory
263 263 # until all its parents are committed. We ensure this by committing and
264 264 # freeing memory for the revisions in some topological order. This
265 265 # leaves a little bit of memory efficiency on the table, but also makes
266 266 # the tests deterministic. It might also be considered a feature since
267 267 # it makes the results more easily reproducible.
268 268 filedata = collections.defaultdict(dict)
269 269 aggregatemetadata = collections.defaultdict(list)
270 270 replacements = {}
271 271 wdirwritten = False
272 272 commitorder = sorted(revstofix, reverse=True)
273 273 with ui.makeprogress(topic=_('fixing'), unit=_('files'),
274 274 total=sum(numitems.values())) as progress:
275 275 for rev, path, filerevmetadata, newdata in results:
276 276 progress.increment(item=path)
277 277 for fixername, fixermetadata in filerevmetadata.items():
278 278 aggregatemetadata[fixername].append(fixermetadata)
279 279 if newdata is not None:
280 280 filedata[rev][path] = newdata
281 281 hookargs = {
282 282 'rev': rev,
283 283 'path': path,
284 284 'metadata': filerevmetadata,
285 285 }
286 286 repo.hook('postfixfile', throw=False,
287 287 **pycompat.strkwargs(hookargs))
288 288 numitems[rev] -= 1
289 289 # Apply the fixes for this and any other revisions that are
290 290 # ready and sitting at the front of the queue. Using a loop here
291 291 # prevents the queue from being blocked by the first revision to
292 292 # be ready out of order.
293 293 while commitorder and not numitems[commitorder[-1]]:
294 294 rev = commitorder.pop()
295 295 ctx = repo[rev]
296 296 if rev == wdirrev:
297 297 writeworkingdir(repo, ctx, filedata[rev], replacements)
298 298 wdirwritten = bool(filedata[rev])
299 299 else:
300 300 replacerev(ui, repo, ctx, filedata[rev], replacements)
301 301 del filedata[rev]
302 302
303 303 cleanup(repo, replacements, wdirwritten)
304 304 hookargs = {
305 305 'replacements': replacements,
306 306 'wdirwritten': wdirwritten,
307 307 'metadata': aggregatemetadata,
308 308 }
309 309 repo.hook('postfix', throw=True, **pycompat.strkwargs(hookargs))
310 310
311 311 def cleanup(repo, replacements, wdirwritten):
312 312 """Calls scmutil.cleanupnodes() with the given replacements.
313 313
314 314 "replacements" is a dict from nodeid to nodeid, with one key and one value
315 315 for every revision that was affected by fixing. This is slightly different
316 316 from cleanupnodes().
317 317
318 318 "wdirwritten" is a bool which tells whether the working copy was affected by
319 319 fixing, since it has no entry in "replacements".
320 320
321 321 Useful as a hook point for extending "hg fix" with output summarizing the
322 322 effects of the command, though we choose not to output anything here.
323 323 """
324 324 replacements = {prec: [succ] for prec, succ in replacements.iteritems()}
325 325 scmutil.cleanupnodes(repo, replacements, 'fix', fixphase=True)
326 326
327 327 def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
328 328 """Constructs the list of files to be fixed at specific revisions
329 329
330 330 It is up to the caller how to consume the work items, and the only
331 331 dependence between them is that replacement revisions must be committed in
332 332 topological order. Each work item represents a file in the working copy or
333 333 in some revision that should be fixed and written back to the working copy
334 334 or into a replacement revision.
335 335
336 336 Work items for the same revision are grouped together, so that a worker
337 337 pool starting with the first N items in parallel is likely to finish the
338 338 first revision's work before other revisions. This can allow us to write
339 339 the result to disk and reduce memory footprint. At time of writing, the
340 340 partition strategy in worker.py seems favorable to this. We also sort the
341 341 items by ascending revision number to match the order in which we commit
342 342 the fixes later.
343 343 """
344 344 workqueue = []
345 345 numitems = collections.defaultdict(int)
346 346 maxfilesize = ui.configbytes('fix', 'maxfilesize')
347 347 for rev in sorted(revstofix):
348 348 fixctx = repo[rev]
349 349 match = scmutil.match(fixctx, pats, opts)
350 350 for path in sorted(pathstofix(
351 351 ui, repo, pats, opts, match, basectxs[rev], fixctx)):
352 352 fctx = fixctx[path]
353 353 if fctx.islink():
354 354 continue
355 355 if fctx.size() > maxfilesize:
356 356 ui.warn(_('ignoring file larger than %s: %s\n') %
357 357 (util.bytecount(maxfilesize), path))
358 358 continue
359 359 workqueue.append((rev, path))
360 360 numitems[rev] += 1
361 361 return workqueue, numitems
362 362
363 363 def getrevstofix(ui, repo, opts):
364 364 """Returns the set of revision numbers that should be fixed"""
365 365 revs = set(scmutil.revrange(repo, opts['rev']))
366 366 for rev in revs:
367 367 checkfixablectx(ui, repo, repo[rev])
368 368 if revs:
369 369 cmdutil.checkunfinished(repo)
370 370 checknodescendants(repo, revs)
371 371 if opts.get('working_dir'):
372 372 revs.add(wdirrev)
373 373 if list(merge.mergestate.read(repo).unresolved()):
374 374 raise error.Abort('unresolved conflicts', hint="use 'hg resolve'")
375 375 if not revs:
376 376 raise error.Abort(
377 377 'no changesets specified', hint='use --rev or --working-dir')
378 378 return revs
379 379
380 380 def checknodescendants(repo, revs):
381 381 if (not obsolete.isenabled(repo, obsolete.allowunstableopt) and
382 382 repo.revs('(%ld::) - (%ld)', revs, revs)):
383 383 raise error.Abort(_('can only fix a changeset together '
384 384 'with all its descendants'))
385 385
386 386 def checkfixablectx(ui, repo, ctx):
387 387 """Aborts if the revision shouldn't be replaced with a fixed one."""
388 388 if not ctx.mutable():
389 389 raise error.Abort('can\'t fix immutable changeset %s' %
390 390 (scmutil.formatchangeid(ctx),))
391 391 if ctx.obsolete():
392 392 # It would be better to actually check if the revision has a successor.
393 393 allowdivergence = ui.configbool('experimental',
394 394 'evolution.allowdivergence')
395 395 if not allowdivergence:
396 396 raise error.Abort('fixing obsolete revision could cause divergence')
397 397
398 398 def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
399 399 """Returns the set of files that should be fixed in a context
400 400
401 401 The result depends on the base contexts; we include any file that has
402 402 changed relative to any of the base contexts. Base contexts should be
403 403 ancestors of the context being fixed.
404 404 """
405 405 files = set()
406 406 for basectx in basectxs:
407 407 stat = basectx.status(fixctx, match=match, listclean=bool(pats),
408 408 listunknown=bool(pats))
409 409 files.update(
410 410 set(itertools.chain(stat.added, stat.modified, stat.clean,
411 411 stat.unknown)))
412 412 return files
413 413
414 414 def lineranges(opts, path, basectxs, fixctx, content2):
415 415 """Returns the set of line ranges that should be fixed in a file
416 416
417 417 Of the form [(10, 20), (30, 40)].
418 418
419 419 This depends on the given base contexts; we must consider lines that have
420 420 changed versus any of the base contexts, and whether the file has been
421 421 renamed versus any of them.
422 422
423 423 Another way to understand this is that we exclude line ranges that are
424 424 common to the file in all base contexts.
425 425 """
426 426 if opts.get('whole'):
427 427 # Return a range containing all lines. Rely on the diff implementation's
428 428 # idea of how many lines are in the file, instead of reimplementing it.
429 429 return difflineranges('', content2)
430 430
431 431 rangeslist = []
432 432 for basectx in basectxs:
433 433 basepath = copies.pathcopies(basectx, fixctx).get(path, path)
434 434 if basepath in basectx:
435 435 content1 = basectx[basepath].data()
436 436 else:
437 437 content1 = ''
438 438 rangeslist.extend(difflineranges(content1, content2))
439 439 return unionranges(rangeslist)
440 440
441 441 def unionranges(rangeslist):
442 442 """Return the union of some closed intervals
443 443
444 444 >>> unionranges([])
445 445 []
446 446 >>> unionranges([(1, 100)])
447 447 [(1, 100)]
448 448 >>> unionranges([(1, 100), (1, 100)])
449 449 [(1, 100)]
450 450 >>> unionranges([(1, 100), (2, 100)])
451 451 [(1, 100)]
452 452 >>> unionranges([(1, 99), (1, 100)])
453 453 [(1, 100)]
454 454 >>> unionranges([(1, 100), (40, 60)])
455 455 [(1, 100)]
456 456 >>> unionranges([(1, 49), (50, 100)])
457 457 [(1, 100)]
458 458 >>> unionranges([(1, 48), (50, 100)])
459 459 [(1, 48), (50, 100)]
460 460 >>> unionranges([(1, 2), (3, 4), (5, 6)])
461 461 [(1, 6)]
462 462 """
463 463 rangeslist = sorted(set(rangeslist))
464 464 unioned = []
465 465 if rangeslist:
466 466 unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
467 467 for a, b in rangeslist:
468 468 c, d = unioned[-1]
469 469 if a > d + 1:
470 470 unioned.append((a, b))
471 471 else:
472 472 unioned[-1] = (c, max(b, d))
473 473 return unioned
474 474
475 475 def difflineranges(content1, content2):
476 476 """Return list of line number ranges in content2 that differ from content1.
477 477
478 478 Line numbers are 1-based. The numbers are the first and last line contained
479 479 in the range. Single-line ranges have the same line number for the first and
480 480 last line. Excludes any empty ranges that result from lines that are only
481 481 present in content1. Relies on mdiff's idea of where the line endings are in
482 482 the string.
483 483
484 484 >>> from mercurial import pycompat
485 485 >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
486 486 >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
487 487 >>> difflineranges2(b'', b'')
488 488 []
489 489 >>> difflineranges2(b'a', b'')
490 490 []
491 491 >>> difflineranges2(b'', b'A')
492 492 [(1, 1)]
493 493 >>> difflineranges2(b'a', b'a')
494 494 []
495 495 >>> difflineranges2(b'a', b'A')
496 496 [(1, 1)]
497 497 >>> difflineranges2(b'ab', b'')
498 498 []
499 499 >>> difflineranges2(b'', b'AB')
500 500 [(1, 2)]
501 501 >>> difflineranges2(b'abc', b'ac')
502 502 []
503 503 >>> difflineranges2(b'ab', b'aCb')
504 504 [(2, 2)]
505 505 >>> difflineranges2(b'abc', b'aBc')
506 506 [(2, 2)]
507 507 >>> difflineranges2(b'ab', b'AB')
508 508 [(1, 2)]
509 509 >>> difflineranges2(b'abcde', b'aBcDe')
510 510 [(2, 2), (4, 4)]
511 511 >>> difflineranges2(b'abcde', b'aBCDe')
512 512 [(2, 4)]
513 513 """
514 514 ranges = []
515 515 for lines, kind in mdiff.allblocks(content1, content2):
516 516 firstline, lastline = lines[2:4]
517 517 if kind == '!' and firstline != lastline:
518 518 ranges.append((firstline + 1, lastline))
519 519 return ranges
520 520
521 521 def getbasectxs(repo, opts, revstofix):
522 522 """Returns a map of the base contexts for each revision
523 523
524 524 The base contexts determine which lines are considered modified when we
525 525 attempt to fix just the modified lines in a file. It also determines which
526 526 files we attempt to fix, so it is important to compute this even when
527 527 --whole is used.
528 528 """
529 529 # The --base flag overrides the usual logic, and we give every revision
530 530 # exactly the set of baserevs that the user specified.
531 531 if opts.get('base'):
532 532 baserevs = set(scmutil.revrange(repo, opts.get('base')))
533 533 if not baserevs:
534 534 baserevs = {nullrev}
535 535 basectxs = {repo[rev] for rev in baserevs}
536 536 return {rev: basectxs for rev in revstofix}
537 537
538 538 # Proceed in topological order so that we can easily determine each
539 539 # revision's baserevs by looking at its parents and their baserevs.
540 540 basectxs = collections.defaultdict(set)
541 541 for rev in sorted(revstofix):
542 542 ctx = repo[rev]
543 543 for pctx in ctx.parents():
544 544 if pctx.rev() in basectxs:
545 545 basectxs[rev].update(basectxs[pctx.rev()])
546 546 else:
547 547 basectxs[rev].add(pctx)
548 548 return basectxs
549 549
550 550 def fixfile(ui, repo, opts, fixers, fixctx, path, basectxs):
551 551 """Run any configured fixers that should affect the file in this context
552 552
553 553 Returns the file content that results from applying the fixers in some order
554 554 starting with the file's content in the fixctx. Fixers that support line
555 555 ranges will affect lines that have changed relative to any of the basectxs
556 556 (i.e. they will only avoid lines that are common to all basectxs).
557 557
558 558 A fixer tool's stdout will become the file's new content if and only if it
559 559 exits with code zero. The fixer tool's working directory is the repository's
560 560 root.
561 561 """
562 562 metadata = {}
563 563 newdata = fixctx[path].data()
564 564 for fixername, fixer in fixers.iteritems():
565 565 if fixer.affects(opts, fixctx, path):
566 rangesfn = lambda: lineranges(opts, path, basectxs, fixctx, newdata)
567 command = fixer.command(ui, path, rangesfn)
566 ranges = lineranges(opts, path, basectxs, fixctx, newdata)
567 command = fixer.command(ui, path, ranges)
568 568 if command is None:
569 569 continue
570 570 ui.debug('subprocess: %s\n' % (command,))
571 571 proc = subprocess.Popen(
572 572 procutil.tonativestr(command),
573 573 shell=True,
574 574 cwd=repo.root,
575 575 stdin=subprocess.PIPE,
576 576 stdout=subprocess.PIPE,
577 577 stderr=subprocess.PIPE)
578 578 stdout, stderr = proc.communicate(newdata)
579 579 if stderr:
580 580 showstderr(ui, fixctx.rev(), fixername, stderr)
581 581 newerdata = stdout
582 582 if fixer.shouldoutputmetadata():
583 583 try:
584 584 metadatajson, newerdata = stdout.split('\0', 1)
585 585 metadata[fixername] = json.loads(metadatajson)
586 586 except ValueError:
587 587 ui.warn(_('ignored invalid output from fixer tool: %s\n') %
588 588 (fixername,))
589 589 continue
590 590 else:
591 591 metadata[fixername] = None
592 592 if proc.returncode == 0:
593 593 newdata = newerdata
594 594 else:
595 595 if not stderr:
596 596 message = _('exited with status %d\n') % (proc.returncode,)
597 597 showstderr(ui, fixctx.rev(), fixername, message)
598 598 checktoolfailureaction(
599 599 ui, _('no fixes will be applied'),
600 600 hint=_('use --config fix.failure=continue to apply any '
601 601 'successful fixes anyway'))
602 602 return metadata, newdata
603 603
604 604 def showstderr(ui, rev, fixername, stderr):
605 605 """Writes the lines of the stderr string as warnings on the ui
606 606
607 607 Uses the revision number and fixername to give more context to each line of
608 608 the error message. Doesn't include file names, since those take up a lot of
609 609 space and would tend to be included in the error message if they were
610 610 relevant.
611 611 """
612 612 for line in re.split('[\r\n]+', stderr):
613 613 if line:
614 614 ui.warn(('['))
615 615 if rev is None:
616 616 ui.warn(_('wdir'), label='evolve.rev')
617 617 else:
618 618 ui.warn((str(rev)), label='evolve.rev')
619 619 ui.warn(('] %s: %s\n') % (fixername, line))
620 620
621 621 def writeworkingdir(repo, ctx, filedata, replacements):
622 622 """Write new content to the working copy and check out the new p1 if any
623 623
624 624 We check out a new revision if and only if we fixed something in both the
625 625 working directory and its parent revision. This avoids the need for a full
626 626 update/merge, and means that the working directory simply isn't affected
627 627 unless the --working-dir flag is given.
628 628
629 629 Directly updates the dirstate for the affected files.
630 630 """
631 631 for path, data in filedata.iteritems():
632 632 fctx = ctx[path]
633 633 fctx.write(data, fctx.flags())
634 634 if repo.dirstate[path] == 'n':
635 635 repo.dirstate.normallookup(path)
636 636
637 637 oldparentnodes = repo.dirstate.parents()
638 638 newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
639 639 if newparentnodes != oldparentnodes:
640 640 repo.setparents(*newparentnodes)
641 641
642 642 def replacerev(ui, repo, ctx, filedata, replacements):
643 643 """Commit a new revision like the given one, but with file content changes
644 644
645 645 "ctx" is the original revision to be replaced by a modified one.
646 646
647 647 "filedata" is a dict that maps paths to their new file content. All other
648 648 paths will be recreated from the original revision without changes.
649 649 "filedata" may contain paths that didn't exist in the original revision;
650 650 they will be added.
651 651
652 652 "replacements" is a dict that maps a single node to a single node, and it is
653 653 updated to indicate the original revision is replaced by the newly created
654 654 one. No entry is added if the replacement's node already exists.
655 655
656 656 The new revision has the same parents as the old one, unless those parents
657 657 have already been replaced, in which case those replacements are the parents
658 658 of this new revision. Thus, if revisions are replaced in topological order,
659 659 there is no need to rebase them into the original topology later.
660 660 """
661 661
662 662 p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
663 663 p1ctx, p2ctx = repo[p1rev], repo[p2rev]
664 664 newp1node = replacements.get(p1ctx.node(), p1ctx.node())
665 665 newp2node = replacements.get(p2ctx.node(), p2ctx.node())
666 666
667 667 # We don't want to create a revision that has no changes from the original,
668 668 # but we should if the original revision's parent has been replaced.
669 669 # Otherwise, we would produce an orphan that needs no actual human
670 670 # intervention to evolve. We can't rely on commit() to avoid creating the
671 671 # un-needed revision because the extra field added below produces a new hash
672 672 # regardless of file content changes.
673 673 if (not filedata and
674 674 p1ctx.node() not in replacements and
675 675 p2ctx.node() not in replacements):
676 676 return
677 677
678 678 def filectxfn(repo, memctx, path):
679 679 if path not in ctx:
680 680 return None
681 681 fctx = ctx[path]
682 682 copysource = fctx.copysource()
683 683 return context.memfilectx(
684 684 repo,
685 685 memctx,
686 686 path=fctx.path(),
687 687 data=filedata.get(path, fctx.data()),
688 688 islink=fctx.islink(),
689 689 isexec=fctx.isexec(),
690 690 copysource=copysource)
691 691
692 692 extra = ctx.extra().copy()
693 693 extra['fix_source'] = ctx.hex()
694 694
695 695 memctx = context.memctx(
696 696 repo,
697 697 parents=(newp1node, newp2node),
698 698 text=ctx.description(),
699 699 files=set(ctx.files()) | set(filedata.keys()),
700 700 filectxfn=filectxfn,
701 701 user=ctx.user(),
702 702 date=ctx.date(),
703 703 extra=extra,
704 704 branch=ctx.branch(),
705 705 editor=None)
706 706 sucnode = memctx.commit()
707 707 prenode = ctx.node()
708 708 if prenode == sucnode:
709 709 ui.debug('node %s already existed\n' % (ctx.hex()))
710 710 else:
711 711 replacements[ctx.node()] = sucnode
712 712
713 713 def getfixers(ui):
714 714 """Returns a map of configured fixer tools indexed by their names
715 715
716 716 Each value is a Fixer object with methods that implement the behavior of the
717 717 fixer's config suboptions. Does not validate the config values.
718 718 """
719 719 fixers = {}
720 720 for name in fixernames(ui):
721 721 fixers[name] = Fixer()
722 722 attrs = ui.configsuboptions('fix', name)[1]
723 723 for key, default in FIXER_ATTRS.items():
724 724 setattr(fixers[name], pycompat.sysstr('_' + key),
725 725 attrs.get(key, default))
726 726 fixers[name]._priority = int(fixers[name]._priority)
727 727 fixers[name]._metadata = stringutil.parsebool(fixers[name]._metadata)
728 728 fixers[name]._skipclean = stringutil.parsebool(fixers[name]._skipclean)
729 729 # Don't use a fixer if it has no pattern configured. It would be
730 730 # dangerous to let it affect all files. It would be pointless to let it
731 731 # affect no files. There is no reasonable subset of files to use as the
732 732 # default.
733 733 if fixers[name]._pattern is None:
734 734 ui.warn(
735 735 _('fixer tool has no pattern configuration: %s\n') % (name,))
736 736 del fixers[name]
737 737 return collections.OrderedDict(
738 738 sorted(fixers.items(), key=lambda item: item[1]._priority,
739 739 reverse=True))
740 740
741 741 def fixernames(ui):
742 742 """Returns the names of [fix] config options that have suboptions"""
743 743 names = set()
744 744 for k, v in ui.configitems('fix'):
745 745 if ':' in k:
746 746 names.add(k.split(':', 1)[0])
747 747 return names
748 748
749 749 class Fixer(object):
750 750 """Wraps the raw config values for a fixer with methods"""
751 751
752 752 def affects(self, opts, fixctx, path):
753 753 """Should this fixer run on the file at the given path and context?"""
754 754 return (self._pattern is not None and
755 755 scmutil.match(fixctx, [self._pattern], opts)(path))
756 756
757 757 def shouldoutputmetadata(self):
758 758 """Should the stdout of this fixer start with JSON and a null byte?"""
759 759 return self._metadata
760 760
761 def command(self, ui, path, rangesfn):
761 def command(self, ui, path, ranges):
762 762 """A shell command to use to invoke this fixer on the given file/lines
763 763
764 764 May return None if there is no appropriate command to run for the given
765 765 parameters.
766 766 """
767 767 expand = cmdutil.rendercommandtemplate
768 768 parts = [expand(ui, self._command,
769 769 {'rootpath': path, 'basename': os.path.basename(path)})]
770 770 if self._linerange:
771 ranges = rangesfn()
772 771 if self._skipclean and not ranges:
773 772 # No line ranges to fix, so don't run the fixer.
774 773 return None
775 774 for first, last in ranges:
776 775 parts.append(expand(ui, self._linerange,
777 776 {'first': first, 'last': last}))
778 777 return ' '.join(parts)
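The `:metadata` protocol documented in the module docstring above (a JSON value, then a zero byte, then the fixed file content on stdout) can be exercised with a toy fixer tool. This is a hypothetical sketch: the `run_fixer` name and the uppercase "fix" are stand-ins, not anything the extension ships.

```python
import json

# Hypothetical fixer tool body honoring the fix extension's :metadata
# protocol: JSON metadata, then a zero byte, then the fixed content.
def run_fixer(data):
    fixed = data.upper()  # stand-in "fix": uppercase the input
    metadata = {'changed_bytes': sum(a != b for a, b in zip(data, fixed))}
    return json.dumps(metadata).encode('utf-8') + b'\0' + fixed

# The extension splits the tool's stdout on the first zero byte, as
# fixfile() does with stdout.split('\0', 1):
metadatajson, newcontent = run_fixer(b'abc').split(b'\0', 1)
assert newcontent == b'ABC'
assert json.loads(metadatajson) == {'changed_bytes': 3}
```

A real tool would be wired up with `tool:command = ...` and `tool:metadata = true` as shown in the docstring, reading the file content from stdin and writing this byte stream to stdout.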