# fix - rewrite file content in changesets and working copy
#
# Copyright 2018 Google LLC.
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
"""rewrite file content in changesets or working copy (EXPERIMENTAL)
Provides a command that runs configured tools on the contents of modified files,
writing back any fixes to the working copy or replacing changesets.
Here is an example configuration that causes :hg:`fix` to apply automatic
formatting fixes to modified lines in C++ code::
[fix]
clang-format:command=clang-format --assume-filename={rootpath}
clang-format:linerange=--lines={first}:{last}
  clang-format:pattern=set:**.cpp or **.hpp

The :command suboption forms the first part of the shell command that will be
used to fix a file. The content of the file is passed on standard input, and the
fixed file content is expected on standard output. Any output on standard error
will be displayed as a warning. If the exit status is not zero, the file will
not be affected. A placeholder warning is displayed if there is a non-zero exit
status but no standard error output. Some values may be substituted into the
command::

  {rootpath} The path of the file being fixed, relative to the repo root
  {basename} The name of the file being fixed, without the directory path

If the :linerange suboption is set, the tool will only be run if there are
changed lines in a file. The value of this suboption is appended to the shell
command once for every range of changed lines in the file. Some values may be
substituted into the command::

  {first} The 1-based line number of the first line in the modified range
  {last} The 1-based line number of the last line in the modified range

The :pattern suboption determines which files will be passed through each
configured tool. See :hg:`help patterns` for possible values. If there are file
arguments to :hg:`fix`, the intersection of these patterns is used.

There is also a configurable limit for the maximum size of file that will be
processed by :hg:`fix`::

  [fix]
  maxfilesize = 2MB

Normally, execution of configured tools will continue after a failure (indicated
by a non-zero exit status). It can also be configured to abort after the first
such failure, so that no files will be affected if any tool fails. This abort
will also cause :hg:`fix` to exit with a non-zero status::

  [fix]
  failure = abort

When multiple tools are configured to affect a file, they execute in an order
defined by the :priority suboption. The priority suboption has a default value
of zero for each tool. Tools are executed in order of descending priority. The
execution order of tools with equal priority is unspecified. For example, you
could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
in a text file by ensuring that 'sort' runs before 'head'::

  [fix]
  sort:command = sort -n
  head:command = head -n 10
  sort:pattern = numbers.txt
  head:pattern = numbers.txt
  sort:priority = 2
  head:priority = 1

To account for changes made by each tool, the line numbers used for incremental
formatting are recomputed before executing the next tool. So, each tool may see
different values for the arguments added by the :linerange suboption.
"""

from __future__ import absolute_import

import collections
import itertools
import os
import re
import subprocess

from mercurial.i18n import _
from mercurial.node import nullrev
from mercurial.node import wdirrev

from mercurial.utils import (
    procutil,
)

from mercurial import (
    cmdutil,
    context,
    copies,
    error,
    mdiff,
    merge,
    obsolete,
    pycompat,
    registrar,
    scmutil,
    util,
    worker,
)

# Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
# extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
# be specifying the version(s) of Mercurial they are tested with, or
# leave the attribute unspecified.
testedwith = 'ships-with-hg-core'

cmdtable = {}
command = registrar.command(cmdtable)

configtable = {}
configitem = registrar.configitem(configtable)

# Register the suboptions allowed for each configured fixer.
FIXER_ATTRS = {
    'command': None,
    'linerange': None,
    'fileset': None,
    'pattern': None,
    'priority': 0,
}

for key, default in FIXER_ATTRS.items():
    configitem('fix', '.*(:%s)?' % key, default=default, generic=True)
# A good default size allows most source code files to be fixed, but avoids
# letting fixer tools choke on huge inputs, which could be surprising to the
# user.
configitem('fix', 'maxfilesize', default='2MB')

# Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
# This helps users do shell scripts that stop when a fixer tool signals a
# problem.
configitem('fix', 'failure', default='continue')

def checktoolfailureaction(ui, message, hint=None):
    """Abort with 'message' if fix.failure=abort"""
    action = ui.config('fix', 'failure')
    if action not in ('continue', 'abort'):
        raise error.Abort(_('unknown fix.failure action: %s') % (action,),
                          hint=_('use "continue" or "abort"'))
    if action == 'abort':
        raise error.Abort(message, hint=hint)

allopt = ('', 'all', False, _('fix all non-public non-obsolete revisions'))
baseopt = ('', 'base', [], _('revisions to diff against (overrides automatic '
                             'selection, and applies to every revision being '
                             'fixed)'), _('REV'))
revopt = ('r', 'rev', [], _('revisions to fix'), _('REV'))
wdiropt = ('w', 'working-dir', False, _('fix the working directory'))
wholeopt = ('', 'whole', False, _('always fix every line of a file'))
usage = _('[OPTION]... [FILE]...')

@command('fix', [allopt, baseopt, revopt, wdiropt, wholeopt], usage,
        helpcategory=command.CATEGORY_FILE_CONTENTS)
def fix(ui, repo, *pats, **opts):
"""rewrite file content in changesets or working directory
Runs any configured tools to fix the content of files. Only affects files
with changes, unless file arguments are provided. Only affects changed lines
of files, unless the --whole flag is used. Some tools may always affect the
whole file regardless of --whole.
If revisions are specified with --rev, those revisions will be checked, and
they may be replaced with new revisions that have fixed file content. It is
desirable to specify all descendants of each specified revision, so that the
fixes propagate to the descendants. If all descendants are fixed at the same
time, no merging, rebasing, or evolution will be required.
If --working-dir is used, files with uncommitted changes in the working copy
will be fixed. If the checked-out revision is also fixed, the working
directory will update to the replacement revision.
When determining what lines of each file to fix at each revision, the whole
set of revisions being fixed is considered, so that fixes to earlier
revisions are not forgotten in later ones. The --base flag can be used to
override this default behavior, though it is not usually desirable to do so.
"""
    opts = pycompat.byteskwargs(opts)
    if opts['all']:
        if opts['rev']:
            raise error.Abort(_('cannot specify both "--rev" and "--all"'))
        opts['rev'] = ['not public() and not obsolete()']
        opts['working_dir'] = True
    with repo.wlock(), repo.lock(), repo.transaction('fix'):
        revstofix = getrevstofix(ui, repo, opts)
        basectxs = getbasectxs(repo, opts, revstofix)
        workqueue, numitems = getworkqueue(ui, repo, pats, opts, revstofix,
                                           basectxs)
        fixers = getfixers(ui)

        # There are no data dependencies between the workers fixing each file
        # revision, so we can use all available parallelism.
        def getfixes(items):
            for rev, path in items:
                ctx = repo[rev]
                olddata = ctx[path].data()
                newdata = fixfile(ui, opts, fixers, ctx, path, basectxs[rev])
                # Don't waste memory/time passing unchanged content back, but
                # produce one result per item either way.
                yield (rev, path, newdata if newdata != olddata else None)

        results = worker.worker(ui, 1.0, getfixes, tuple(), workqueue,
                                threadsafe=False)

        # We have to hold on to the data for each successor revision in memory
        # until all its parents are committed. We ensure this by committing and
        # freeing memory for the revisions in some topological order. This
        # leaves a little bit of memory efficiency on the table, but also makes
        # the tests deterministic. It might also be considered a feature since
        # it makes the results more easily reproducible.
        filedata = collections.defaultdict(dict)
        replacements = {}
        wdirwritten = False
        commitorder = sorted(revstofix, reverse=True)
        with ui.makeprogress(topic=_('fixing'), unit=_('files'),
                             total=sum(numitems.values())) as progress:
            for rev, path, newdata in results:
                progress.increment(item=path)
                if newdata is not None:
                    filedata[rev][path] = newdata
                numitems[rev] -= 1
                # Apply the fixes for this and any other revisions that are
                # ready and sitting at the front of the queue. Using a loop here
                # prevents the queue from being blocked by the first revision to
                # be ready out of order.
                while commitorder and not numitems[commitorder[-1]]:
                    rev = commitorder.pop()
                    ctx = repo[rev]
                    if rev == wdirrev:
                        writeworkingdir(repo, ctx, filedata[rev], replacements)
                        wdirwritten = bool(filedata[rev])
                    else:
                        replacerev(ui, repo, ctx, filedata[rev], replacements)
                    del filedata[rev]

        cleanup(repo, replacements, wdirwritten)

def cleanup(repo, replacements, wdirwritten):
"""Calls scmutil.cleanupnodes() with the given replacements.
"replacements" is a dict from nodeid to nodeid, with one key and one value
for every revision that was affected by fixing. This is slightly different
from cleanupnodes().
"wdirwritten" is a bool which tells whether the working copy was affected by
fixing, since it has no entry in "replacements".
Useful as a hook point for extending "hg fix" with output summarizing the
effects of the command, though we choose not to output anything here.
"""
replacements = {prec: [succ] for prec, succ in replacements.iteritems()}
scmutil.cleanupnodes(repo, replacements, 'fix', fixphase=True)

def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
""""Constructs the list of files to be fixed at specific revisions
It is up to the caller how to consume the work items, and the only
dependence between them is that replacement revisions must be committed in
topological order. Each work item represents a file in the working copy or
in some revision that should be fixed and written back to the working copy
or into a replacement revision.
    Work items for the same revision are grouped together, so that a worker
    pool starting with the first N items in parallel is likely to finish the
    first revision's work before other revisions. This can allow us to write
    the result to disk and reduce memory footprint. At time of writing, the
    partition strategy in worker.py seems favorable to this. We also sort the
    items by ascending revision number to match the order in which we commit
    the fixes later.
    """
    workqueue = []
    numitems = collections.defaultdict(int)
    maxfilesize = ui.configbytes('fix', 'maxfilesize')
    for rev in sorted(revstofix):
        fixctx = repo[rev]
        match = scmutil.match(fixctx, pats, opts)
        for path in pathstofix(ui, repo, pats, opts, match, basectxs[rev],
                               fixctx):
            if path not in fixctx:
                continue
            fctx = fixctx[path]
            if fctx.islink():
                continue
            if fctx.size() > maxfilesize:
                ui.warn(_('ignoring file larger than %s: %s\n') %
                        (util.bytecount(maxfilesize), path))
                continue
            workqueue.append((rev, path))
            numitems[rev] += 1
    return workqueue, numitems

def getrevstofix(ui, repo, opts):
    """Returns the set of revision numbers that should be fixed"""
    revs = set(scmutil.revrange(repo, opts['rev']))
    for rev in revs:
        checkfixablectx(ui, repo, repo[rev])
    if revs:
        cmdutil.checkunfinished(repo)
        checknodescendants(repo, revs)
    if opts.get('working_dir'):
        revs.add(wdirrev)
        if list(merge.mergestate.read(repo).unresolved()):
            raise error.Abort('unresolved conflicts', hint="use 'hg resolve'")
    if not revs:
        raise error.Abort(
            'no changesets specified', hint='use --rev or --working-dir')
    return revs

def checknodescendants(repo, revs):
    if (not obsolete.isenabled(repo, obsolete.allowunstableopt) and
            repo.revs('(%ld::) - (%ld)', revs, revs)):
        raise error.Abort(_('can only fix a changeset together '
                            'with all its descendants'))

def checkfixablectx(ui, repo, ctx):
    """Aborts if the revision shouldn't be replaced with a fixed one."""
    if not ctx.mutable():
        raise error.Abort('can\'t fix immutable changeset %s' %
                          (scmutil.formatchangeid(ctx),))
    if ctx.obsolete():
        # It would be better to actually check if the revision has a successor.
        allowdivergence = ui.configbool('experimental',
                                        'evolution.allowdivergence')
        if not allowdivergence:
            raise error.Abort('fixing obsolete revision could cause divergence')

def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
    """Returns the set of files that should be fixed in a context

    The result depends on the base contexts; we include any file that has
    changed relative to any of the base contexts. Base contexts should be
    ancestors of the context being fixed.
    """
    files = set()
    for basectx in basectxs:
        stat = basectx.status(fixctx, match=match, listclean=bool(pats),
                              listunknown=bool(pats))
        files.update(
            set(itertools.chain(stat.added, stat.modified, stat.clean,
                                stat.unknown)))
    return files

def lineranges(opts, path, basectxs, fixctx, content2):
    """Returns the set of line ranges that should be fixed in a file

    Of the form [(10, 20), (30, 40)].

    This depends on the given base contexts; we must consider lines that have
    changed versus any of the base contexts, and whether the file has been
    renamed versus any of them.

    Another way to understand this is that we exclude line ranges that are
    common to the file in all base contexts.
    """
    if opts.get('whole'):
        # Return a range containing all lines. Rely on the diff implementation's
        # idea of how many lines are in the file, instead of reimplementing it.
        return difflineranges('', content2)

    rangeslist = []
    for basectx in basectxs:
        basepath = copies.pathcopies(basectx, fixctx).get(path, path)
        if basepath in basectx:
            content1 = basectx[basepath].data()
        else:
            content1 = ''
        rangeslist.extend(difflineranges(content1, content2))
    return unionranges(rangeslist)

def unionranges(rangeslist):
    """Return the union of some closed intervals

    >>> unionranges([])
    []
    >>> unionranges([(1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (2, 100)])
    [(1, 100)]
    >>> unionranges([(1, 99), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (40, 60)])
    [(1, 100)]
    >>> unionranges([(1, 49), (50, 100)])
    [(1, 100)]
    >>> unionranges([(1, 48), (50, 100)])
    [(1, 48), (50, 100)]
    >>> unionranges([(1, 2), (3, 4), (5, 6)])
    [(1, 6)]
    """
    rangeslist = sorted(set(rangeslist))
    unioned = []
    if rangeslist:
        unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
    for a, b in rangeslist:
        c, d = unioned[-1]
        if a > d + 1:
            unioned.append((a, b))
        else:
            unioned[-1] = (c, max(b, d))
    return unioned

def difflineranges(content1, content2):
    """Return list of line number ranges in content2 that differ from content1.

    Line numbers are 1-based. The numbers are the first and last line contained
    in the range. Single-line ranges have the same line number for the first and
    last line. Excludes any empty ranges that result from lines that are only
    present in content1. Relies on mdiff's idea of where the line endings are in
    the string.

    >>> from mercurial import pycompat
    >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
    >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
    >>> difflineranges2(b'', b'')
    []
    >>> difflineranges2(b'a', b'')
    []
    >>> difflineranges2(b'', b'A')
    [(1, 1)]
    >>> difflineranges2(b'a', b'a')
    []
    >>> difflineranges2(b'a', b'A')
    [(1, 1)]
    >>> difflineranges2(b'ab', b'')
    []
    >>> difflineranges2(b'', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abc', b'ac')
    []
    >>> difflineranges2(b'ab', b'aCb')
    [(2, 2)]
    >>> difflineranges2(b'abc', b'aBc')
    [(2, 2)]
    >>> difflineranges2(b'ab', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abcde', b'aBcDe')
    [(2, 2), (4, 4)]
    >>> difflineranges2(b'abcde', b'aBCDe')
    [(2, 4)]
    """
    ranges = []
    for lines, kind in mdiff.allblocks(content1, content2):
        firstline, lastline = lines[2:4]
        if kind == '!' and firstline != lastline:
            ranges.append((firstline + 1, lastline))
    return ranges

def getbasectxs(repo, opts, revstofix):
    """Returns a map of the base contexts for each revision

    The base contexts determine which lines are considered modified when we
    attempt to fix just the modified lines in a file. It also determines which
    files we attempt to fix, so it is important to compute this even when
    --whole is used.
    """
    # The --base flag overrides the usual logic, and we give every revision
    # exactly the set of baserevs that the user specified.
    if opts.get('base'):
        baserevs = set(scmutil.revrange(repo, opts.get('base')))
        if not baserevs:
            baserevs = {nullrev}
        basectxs = {repo[rev] for rev in baserevs}
        return {rev: basectxs for rev in revstofix}

    # Proceed in topological order so that we can easily determine each
    # revision's baserevs by looking at its parents and their baserevs.
    basectxs = collections.defaultdict(set)
    for rev in sorted(revstofix):
        ctx = repo[rev]
        for pctx in ctx.parents():
            if pctx.rev() in basectxs:
                basectxs[rev].update(basectxs[pctx.rev()])
            else:
                basectxs[rev].add(pctx)
    return basectxs

def fixfile(ui, opts, fixers, fixctx, path, basectxs):
    """Run any configured fixers that should affect the file in this context

    Returns the file content that results from applying the fixers in some
    order starting with the file's content in the fixctx. Fixers that support
    line ranges will affect lines that have changed relative to any of the
    basectxs (i.e. they will only avoid lines that are common to all basectxs).

    A fixer tool's stdout will become the file's new content if and only if it
    exits with code zero.
    """
    newdata = fixctx[path].data()
    for fixername, fixer in fixers.iteritems():
        if fixer.affects(opts, fixctx, path):
            rangesfn = lambda: lineranges(opts, path, basectxs, fixctx, newdata)
            command = fixer.command(ui, path, rangesfn)
            if command is None:
                continue
            ui.debug('subprocess: %s\n' % (command,))
            proc = subprocess.Popen(
                procutil.tonativestr(command),
                shell=True,
                cwd=procutil.tonativestr(b'/'),
                stdin=subprocess.PIPE,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE)
            newerdata, stderr = proc.communicate(newdata)
            if stderr:
                showstderr(ui, fixctx.rev(), fixername, stderr)
            if proc.returncode == 0:
                newdata = newerdata
            else:
                if not stderr:
                    message = _('exited with status %d\n') % (proc.returncode,)
                    showstderr(ui, fixctx.rev(), fixername, message)
                checktoolfailureaction(
                    ui, _('no fixes will be applied'),
                    hint=_('use --config fix.failure=continue to apply any '
                           'successful fixes anyway'))
    return newdata

def showstderr(ui, rev, fixername, stderr):
    """Writes the lines of the stderr string as warnings on the ui

    Uses the revision number and fixername to give more context to each line of
    the error message. Doesn't include file names, since those take up a lot of
    space and would tend to be included in the error message if they were
    relevant.
    """
    for line in re.split('[\r\n]+', stderr):
        if line:
            ui.warn(('['))
            if rev is None:
                ui.warn(_('wdir'), label='evolve.rev')
            else:
                ui.warn((str(rev)), label='evolve.rev')
            ui.warn(('] %s: %s\n') % (fixername, line))

def writeworkingdir(repo, ctx, filedata, replacements):
    """Write new content to the working copy and check out the new p1 if any

    We check out a new revision if and only if we fixed something in both the
    working directory and its parent revision. This avoids the need for a full
    update/merge, and means that the working directory simply isn't affected
    unless the --working-dir flag is given.

    Directly updates the dirstate for the affected files.
    """
    for path, data in filedata.iteritems():
        fctx = ctx[path]
        fctx.write(data, fctx.flags())
        if repo.dirstate[path] == 'n':
            repo.dirstate.normallookup(path)

    oldparentnodes = repo.dirstate.parents()
    newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
    if newparentnodes != oldparentnodes:
        repo.setparents(*newparentnodes)

def replacerev(ui, repo, ctx, filedata, replacements):
    """Commit a new revision like the given one, but with file content changes

    "ctx" is the original revision to be replaced by a modified one.

    "filedata" is a dict that maps paths to their new file content. All other
    paths will be recreated from the original revision without changes.
    "filedata" may contain paths that didn't exist in the original revision;
    they will be added.

    "replacements" is a dict that maps a single node to a single node, and it is
    updated to indicate the original revision is replaced by the newly created
    one. No entry is added if the replacement's node already exists.

    The new revision has the same parents as the old one, unless those parents
    have already been replaced, in which case those replacements are the parents
    of this new revision. Thus, if revisions are replaced in topological order,
    there is no need to rebase them into the original topology later.
    """
    p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
    p1ctx, p2ctx = repo[p1rev], repo[p2rev]
    newp1node = replacements.get(p1ctx.node(), p1ctx.node())
    newp2node = replacements.get(p2ctx.node(), p2ctx.node())

    # We don't want to create a revision that has no changes from the original,
    # but we should if the original revision's parent has been replaced.
    # Otherwise, we would produce an orphan that needs no actual human
    # intervention to evolve. We can't rely on commit() to avoid creating the
    # un-needed revision because the extra field added below produces a new hash
    # regardless of file content changes.
    if (not filedata and
            p1ctx.node() not in replacements and
            p2ctx.node() not in replacements):
        return

    def filectxfn(repo, memctx, path):
        if path not in ctx:
            return None
        fctx = ctx[path]
        copied = fctx.renamed()
        if copied:
            copied = copied[0]
        return context.memfilectx(
            repo,
            memctx,
            path=fctx.path(),
            data=filedata.get(path, fctx.data()),
            islink=fctx.islink(),
            isexec=fctx.isexec(),
            copied=copied)

    extra = ctx.extra().copy()
    extra['fix_source'] = ctx.hex()

    memctx = context.memctx(
        repo,
        parents=(newp1node, newp2node),
        text=ctx.description(),
        files=set(ctx.files()) | set(filedata.keys()),
        filectxfn=filectxfn,
        user=ctx.user(),
        date=ctx.date(),
        extra=extra,
        branch=ctx.branch(),
        editor=None)
    sucnode = memctx.commit()
    prenode = ctx.node()
    if prenode == sucnode:
        ui.debug('node %s already existed\n' % (ctx.hex()))
    else:
        replacements[ctx.node()] = sucnode

def getfixers(ui):
"""Returns a map of configured fixer tools indexed by their names
Each value is a Fixer object with methods that implement the behavior of the
fixer's config suboptions. Does not validate the config values.
"""
    fixers = {}
    for name in fixernames(ui):
        fixers[name] = Fixer()
        attrs = ui.configsuboptions('fix', name)[1]
        if 'fileset' in attrs and 'pattern' not in attrs:
            ui.warn(_('the fix.tool:fileset config name is deprecated; '
                      'please rename it to fix.tool:pattern\n'))
            attrs['pattern'] = attrs['fileset']
        for key, default in FIXER_ATTRS.items():
            setattr(fixers[name], pycompat.sysstr('_' + key),
                    attrs.get(key, default))
        fixers[name]._priority = int(fixers[name]._priority)
    return collections.OrderedDict(
        sorted(fixers.items(), key=lambda item: item[1]._priority,
               reverse=True))

def fixernames(ui):
"""Returns the names of [fix] config options that have suboptions"""
names = set()
for k, v in ui.configitems('fix'):
if ':' in k:
names.add(k.split(':', 1)[0])
return names
class Fixer(object):
    """Wraps the raw config values for a fixer with methods"""

    def affects(self, opts, fixctx, path):
        """Should this fixer run on the file at the given path and context?"""
        return scmutil.match(fixctx, [self._pattern], opts)(path)

    def command(self, ui, path, rangesfn):
        """A shell command to use to invoke this fixer on the given file/lines

        May return None if there is no appropriate command to run for the given
        parameters.
        """
        expand = cmdutil.rendercommandtemplate
        parts = [expand(ui, self._command,
                        {'rootpath': path, 'basename': os.path.basename(path)})]
        if self._linerange:
            ranges = rangesfn()
            if not ranges:
                # No line ranges to fix, so don't run the fixer.
                return None
            for first, last in ranges:
                parts.append(expand(ui, self._linerange,
                                    {'first': first, 'last': last}))
        return ' '.join(parts)