fix: remove a never-true check for unset pattern in Fixer.affects()...
Martin von Zweigbergk
r43495:0e2a2fab default
@@ -1,878 +1,876 b''
# fix - rewrite file content in changesets and working copy
#
# Copyright 2018 Google LLC.
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
"""rewrite file content in changesets or working copy (EXPERIMENTAL)

Provides a command that runs configured tools on the contents of modified files,
writing back any fixes to the working copy or replacing changesets.

Here is an example configuration that causes :hg:`fix` to apply automatic
formatting fixes to modified lines in C++ code::

  [fix]
  clang-format:command=clang-format --assume-filename={rootpath}
  clang-format:linerange=--lines={first}:{last}
  clang-format:pattern=set:**.cpp or **.hpp

The :command suboption forms the first part of the shell command that will be
used to fix a file. The content of the file is passed on standard input, and the
fixed file content is expected on standard output. Any output on standard error
will be displayed as a warning. If the exit status is not zero, the file will
not be affected. A placeholder warning is displayed if there is a non-zero exit
status but no standard error output. Some values may be substituted into the
command::

  {rootpath} The path of the file being fixed, relative to the repo root
  {basename} The name of the file being fixed, without the directory path

If the :linerange suboption is set, the tool will only be run if there are
changed lines in a file. The value of this suboption is appended to the shell
command once for every range of changed lines in the file. Some values may be
substituted into the command::

  {first} The 1-based line number of the first line in the modified range
  {last} The 1-based line number of the last line in the modified range

Deleted sections of a file will be ignored by :linerange, because there is no
corresponding line range in the version being fixed.

By default, tools that set :linerange will only be executed if there is at least
one changed line range. This is meant to prevent accidents like running a code
formatter in such a way that it unexpectedly reformats the whole file. If such a
tool needs to operate on unchanged files, it should set the :skipclean suboption
to false.

The :pattern suboption determines which files will be passed through each
configured tool. See :hg:`help patterns` for possible values. If there are file
arguments to :hg:`fix`, the intersection of these patterns is used.

There is also a configurable limit for the maximum size of file that will be
processed by :hg:`fix`::

  [fix]
  maxfilesize = 2MB

Normally, execution of configured tools will continue after a failure (indicated
by a non-zero exit status). It can also be configured to abort after the first
such failure, so that no files will be affected if any tool fails. This abort
will also cause :hg:`fix` to exit with a non-zero status::

  [fix]
  failure = abort

When multiple tools are configured to affect a file, they execute in an order
defined by the :priority suboption. The priority suboption has a default value
of zero for each tool. Tools are executed in order of descending priority. The
execution order of tools with equal priority is unspecified. For example, you
could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
in a text file by ensuring that 'sort' runs before 'head'::

  [fix]
  sort:command = sort -n
  head:command = head -n 10
  sort:pattern = numbers.txt
  head:pattern = numbers.txt
  sort:priority = 2
  head:priority = 1
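The sort/head example above can be sketched as a standalone Python script (a hypothetical illustration, not part of the extension; the two functions stand in for the configured shell commands):

```python
# Sketch of priority ordering: each "fixer" transforms file content, and
# fixers run in descending priority order, each one's output feeding the
# next, as `hg fix` chains configured tools.

def sort_numbers(text):
    """Mimics `sort -n`: sort lines numerically."""
    return "\n".join(sorted(text.splitlines(), key=int)) + "\n"

def head10(text):
    """Mimics `head -n 10`: keep only the first 10 lines."""
    return "\n".join(text.splitlines()[:10]) + "\n"

# (name, priority, function) for each configured fixer.
fixers = [("head", 1, head10), ("sort", 2, sort_numbers)]

def run_fixers(content):
    # Descending priority: "sort" (2) runs before "head" (1).
    for _name, _priority, fn in sorted(fixers, key=lambda f: -f[1]):
        content = fn(content)
    return content

numbers = "\n".join(str(n) for n in [5, 3, 12, 1, 9, 7, 2, 11, 4, 8, 6, 10]) + "\n"
result = run_fixers(numbers)  # the 10 smallest numbers, in ascending order
```

Reversing the priorities would instead take the first 10 lines as-is and then sort them, which is why the ordering matters.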

To account for changes made by each tool, the line numbers used for incremental
formatting are recomputed before executing the next tool. So, each tool may see
different values for the arguments added by the :linerange suboption.

Each fixer tool is allowed to return some metadata in addition to the fixed file
content. The metadata must be placed before the file content on stdout,
separated from the file content by a zero byte. The metadata is parsed as a JSON
value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
is expected to produce this metadata encoding if and only if the :metadata
suboption is true::

  [fix]
  tool:command = tool --prepend-json-metadata
  tool:metadata = true
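For illustration, a minimal fixer tool speaking this protocol could look like the following (a hypothetical sketch: the whitespace-stripping behavior and the "changedlines" metadata key are invented for the example; a real tool only needs to write UTF-8 JSON, a zero byte, then the fixed content to stdout):

```python
# Hypothetical fixer: strips trailing whitespace and reports how many
# lines it changed as JSON metadata, using the zero-byte framing that
# the :metadata suboption expects on stdout.
import json

def fix_content(data):
    """Return (metadata, fixed_bytes) for the given input bytes."""
    fixed = []
    changed = 0
    for line in data.split(b"\n"):
        stripped = line.rstrip()
        if stripped != line:
            changed += 1
        fixed.append(stripped)
    return {"changedlines": changed}, b"\n".join(fixed)

def encode_output(metadata, fixedbytes):
    """JSON metadata first, then a zero byte, then the fixed content."""
    return json.dumps(metadata).encode("utf-8") + b"\0" + fixedbytes

# A real tool invoked by `hg fix` would read the file from stdin and
# write the framed result to stdout, e.g.:
#   sys.stdout.buffer.write(encode_output(*fix_content(sys.stdin.buffer.read())))
```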

The metadata values are passed to hooks, which can be used to print summaries or
perform other post-fixing work. The supported hooks are::

  "postfixfile"
    Run once for each file in each revision where any fixer tools made changes
    to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
    and "$HG_METADATA" with a map of fixer names to metadata values from fixer
    tools that affected the file. Fixer tools that didn't affect the file have a
    value of None. Only fixer tools that executed are present in the metadata.

  "postfix"
    Run once after all files and revisions have been handled. Provides
    "$HG_REPLACEMENTS" with information about what revisions were created and
    made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
    files in the working copy were updated. Provides a list "$HG_METADATA"
    mapping fixer tool names to lists of metadata values returned from
    executions that modified a file. This aggregates the same metadata
    previously passed to the "postfixfile" hook.
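A shell hook consuming these variables might be configured like so (a hypothetical example; the echoed message is arbitrary):

```ini
[hooks]
# Print a one-line summary after `hg fix` finishes.
postfix = echo "wrote working dir: $HG_WDIRWRITTEN; metadata: $HG_METADATA"
```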

Fixer tools are run in the repository's root directory. This allows them to read
configuration files from the working copy, or even write to the working copy.
The working copy is not updated to match the revision being fixed. In fact,
several revisions may be fixed in parallel. Writes to the working copy are not
amended into the revision being fixed; fixer tools should always write fixed
file content back to stdout as documented above.
"""

from __future__ import absolute_import

import collections
import itertools
import json
import os
import re
import subprocess

from mercurial.i18n import _
from mercurial.node import nullrev
from mercurial.node import wdirrev

from mercurial.utils import procutil

from mercurial import (
    cmdutil,
    context,
    copies,
    error,
    mdiff,
    merge,
    obsolete,
    pycompat,
    registrar,
    scmutil,
    util,
    worker,
)

# Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
# extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
# be specifying the version(s) of Mercurial they are tested with, or
# leave the attribute unspecified.
testedwith = b'ships-with-hg-core'

cmdtable = {}
command = registrar.command(cmdtable)

configtable = {}
configitem = registrar.configitem(configtable)

# Register the suboptions allowed for each configured fixer, and default values.
FIXER_ATTRS = {
    b'command': None,
    b'linerange': None,
    b'pattern': None,
    b'priority': 0,
    b'metadata': False,
    b'skipclean': True,
    b'enabled': True,
}

for key, default in FIXER_ATTRS.items():
    configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)

# A good default size allows most source code files to be fixed, but avoids
# letting fixer tools choke on huge inputs, which could be surprising to the
# user.
configitem(b'fix', b'maxfilesize', default=b'2MB')

# Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
# This helps users do shell scripts that stop when a fixer tool signals a
# problem.
configitem(b'fix', b'failure', default=b'continue')


def checktoolfailureaction(ui, message, hint=None):
    """Abort with 'message' if fix.failure=abort"""
    action = ui.config(b'fix', b'failure')
    if action not in (b'continue', b'abort'):
        raise error.Abort(
            _(b'unknown fix.failure action: %s') % (action,),
            hint=_(b'use "continue" or "abort"'),
        )
    if action == b'abort':
        raise error.Abort(message, hint=hint)


allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
baseopt = (
    b'',
    b'base',
    [],
    _(
        b'revisions to diff against (overrides automatic '
        b'selection, and applies to every revision being '
        b'fixed)'
    ),
    _(b'REV'),
)
revopt = (b'r', b'rev', [], _(b'revisions to fix'), _(b'REV'))
wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
usage = _(b'[OPTION]... [FILE]...')


@command(
    b'fix',
    [allopt, baseopt, revopt, wdiropt, wholeopt],
    usage,
    helpcategory=command.CATEGORY_FILE_CONTENTS,
)
def fix(ui, repo, *pats, **opts):
    """rewrite file content in changesets or working directory

    Runs any configured tools to fix the content of files. Only affects files
    with changes, unless file arguments are provided. Only affects changed lines
    of files, unless the --whole flag is used. Some tools may always affect the
    whole file regardless of --whole.

    If revisions are specified with --rev, those revisions will be checked, and
    they may be replaced with new revisions that have fixed file content. It is
    desirable to specify all descendants of each specified revision, so that the
    fixes propagate to the descendants. If all descendants are fixed at the same
    time, no merging, rebasing, or evolution will be required.

    If --working-dir is used, files with uncommitted changes in the working copy
    will be fixed. If the checked-out revision is also fixed, the working
    directory will update to the replacement revision.

    When determining what lines of each file to fix at each revision, the whole
    set of revisions being fixed is considered, so that fixes to earlier
    revisions are not forgotten in later ones. The --base flag can be used to
    override this default behavior, though it is not usually desirable to do so.
    """
    opts = pycompat.byteskwargs(opts)
    if opts[b'all']:
        if opts[b'rev']:
            raise error.Abort(_(b'cannot specify both "--rev" and "--all"'))
        opts[b'rev'] = [b'not public() and not obsolete()']
        opts[b'working_dir'] = True
    with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
        revstofix = getrevstofix(ui, repo, opts)
        basectxs = getbasectxs(repo, opts, revstofix)
        workqueue, numitems = getworkqueue(
            ui, repo, pats, opts, revstofix, basectxs
        )
        fixers = getfixers(ui)

        # There are no data dependencies between the workers fixing each file
        # revision, so we can use all available parallelism.
        def getfixes(items):
            for rev, path in items:
                ctx = repo[rev]
                olddata = ctx[path].data()
                metadata, newdata = fixfile(
                    ui, repo, opts, fixers, ctx, path, basectxs[rev]
                )
                # Don't waste memory/time passing unchanged content back, but
                # produce one result per item either way.
                yield (
                    rev,
                    path,
                    metadata,
                    newdata if newdata != olddata else None,
                )

        results = worker.worker(
            ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
        )

        # We have to hold on to the data for each successor revision in memory
        # until all its parents are committed. We ensure this by committing and
        # freeing memory for the revisions in some topological order. This
        # leaves a little bit of memory efficiency on the table, but also makes
        # the tests deterministic. It might also be considered a feature since
        # it makes the results more easily reproducible.
        filedata = collections.defaultdict(dict)
        aggregatemetadata = collections.defaultdict(list)
        replacements = {}
        wdirwritten = False
        commitorder = sorted(revstofix, reverse=True)
        with ui.makeprogress(
            topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
        ) as progress:
            for rev, path, filerevmetadata, newdata in results:
                progress.increment(item=path)
                for fixername, fixermetadata in filerevmetadata.items():
                    aggregatemetadata[fixername].append(fixermetadata)
                if newdata is not None:
                    filedata[rev][path] = newdata
                    hookargs = {
                        b'rev': rev,
                        b'path': path,
                        b'metadata': filerevmetadata,
                    }
                    repo.hook(
                        b'postfixfile',
                        throw=False,
                        **pycompat.strkwargs(hookargs)
                    )
                numitems[rev] -= 1
                # Apply the fixes for this and any other revisions that are
                # ready and sitting at the front of the queue. Using a loop here
                # prevents the queue from being blocked by the first revision to
                # be ready out of order.
                while commitorder and not numitems[commitorder[-1]]:
                    rev = commitorder.pop()
                    ctx = repo[rev]
                    if rev == wdirrev:
                        writeworkingdir(repo, ctx, filedata[rev], replacements)
                        wdirwritten = bool(filedata[rev])
                    else:
                        replacerev(ui, repo, ctx, filedata[rev], replacements)
                    del filedata[rev]

        cleanup(repo, replacements, wdirwritten)
        hookargs = {
            b'replacements': replacements,
            b'wdirwritten': wdirwritten,
            b'metadata': aggregatemetadata,
        }
        repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))


def cleanup(repo, replacements, wdirwritten):
    """Calls scmutil.cleanupnodes() with the given replacements.

    "replacements" is a dict from nodeid to nodeid, with one key and one value
    for every revision that was affected by fixing. This is slightly different
    from cleanupnodes().

    "wdirwritten" is a bool which tells whether the working copy was affected by
    fixing, since it has no entry in "replacements".

    Useful as a hook point for extending "hg fix" with output summarizing the
    effects of the command, though we choose not to output anything here.
    """
    replacements = {
        prec: [succ] for prec, succ in pycompat.iteritems(replacements)
    }
    scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)


def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
359 """"Constructs the list of files to be fixed at specific revisions
359 """"Constructs the list of files to be fixed at specific revisions
360
360
361 It is up to the caller how to consume the work items, and the only
361 It is up to the caller how to consume the work items, and the only
362 dependence between them is that replacement revisions must be committed in
362 dependence between them is that replacement revisions must be committed in
363 topological order. Each work item represents a file in the working copy or
363 topological order. Each work item represents a file in the working copy or
364 in some revision that should be fixed and written back to the working copy
364 in some revision that should be fixed and written back to the working copy
365 or into a replacement revision.
365 or into a replacement revision.
366
366
367 Work items for the same revision are grouped together, so that a worker
367 Work items for the same revision are grouped together, so that a worker
368 pool starting with the first N items in parallel is likely to finish the
368 pool starting with the first N items in parallel is likely to finish the
369 first revision's work before other revisions. This can allow us to write
369 first revision's work before other revisions. This can allow us to write
370 the result to disk and reduce memory footprint. At time of writing, the
370 the result to disk and reduce memory footprint. At time of writing, the
371 partition strategy in worker.py seems favorable to this. We also sort the
371 partition strategy in worker.py seems favorable to this. We also sort the
372 items by ascending revision number to match the order in which we commit
372 items by ascending revision number to match the order in which we commit
373 the fixes later.
373 the fixes later.
374 """
374 """
375 workqueue = []
375 workqueue = []
376 numitems = collections.defaultdict(int)
376 numitems = collections.defaultdict(int)
377 maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
377 maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
378 for rev in sorted(revstofix):
378 for rev in sorted(revstofix):
379 fixctx = repo[rev]
379 fixctx = repo[rev]
380 match = scmutil.match(fixctx, pats, opts)
380 match = scmutil.match(fixctx, pats, opts)
381 for path in sorted(
381 for path in sorted(
382 pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
382 pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
383 ):
383 ):
384 fctx = fixctx[path]
384 fctx = fixctx[path]
385 if fctx.islink():
385 if fctx.islink():
386 continue
386 continue
387 if fctx.size() > maxfilesize:
387 if fctx.size() > maxfilesize:
388 ui.warn(
388 ui.warn(
389 _(b'ignoring file larger than %s: %s\n')
389 _(b'ignoring file larger than %s: %s\n')
390 % (util.bytecount(maxfilesize), path)
390 % (util.bytecount(maxfilesize), path)
391 )
391 )
392 continue
392 continue
393 workqueue.append((rev, path))
393 workqueue.append((rev, path))
394 numitems[rev] += 1
394 numitems[rev] += 1
395 return workqueue, numitems
395 return workqueue, numitems


def getrevstofix(ui, repo, opts):
    """Returns the set of revision numbers that should be fixed"""
    revs = set(scmutil.revrange(repo, opts[b'rev']))
    for rev in revs:
        checkfixablectx(ui, repo, repo[rev])
    if revs:
        cmdutil.checkunfinished(repo)
        checknodescendants(repo, revs)
    if opts.get(b'working_dir'):
        revs.add(wdirrev)
        if list(merge.mergestate.read(repo).unresolved()):
            raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
    if not revs:
        raise error.Abort(
            b'no changesets specified', hint=b'use --rev or --working-dir'
        )
    return revs


def checknodescendants(repo, revs):
    if not obsolete.isenabled(repo, obsolete.allowunstableopt) and repo.revs(
        b'(%ld::) - (%ld)', revs, revs
    ):
        raise error.Abort(
            _(b'can only fix a changeset together with all its descendants')
        )


def checkfixablectx(ui, repo, ctx):
    """Aborts if the revision shouldn't be replaced with a fixed one."""
    if not ctx.mutable():
        raise error.Abort(
            b'can\'t fix immutable changeset %s'
            % (scmutil.formatchangeid(ctx),)
        )
    if ctx.obsolete():
        # It would be better to actually check if the revision has a successor.
        allowdivergence = ui.configbool(
            b'experimental', b'evolution.allowdivergence'
        )
        if not allowdivergence:
            raise error.Abort(
                b'fixing obsolete revision could cause divergence'
            )


def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
    """Returns the set of files that should be fixed in a context

    The result depends on the base contexts; we include any file that has
    changed relative to any of the base contexts. Base contexts should be
    ancestors of the context being fixed.
    """
    files = set()
    for basectx in basectxs:
        stat = basectx.status(
            fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
        )
        files.update(
            set(
                itertools.chain(
                    stat.added, stat.modified, stat.clean, stat.unknown
                )
            )
        )
    return files


def lineranges(opts, path, basectxs, fixctx, content2):
    """Returns the set of line ranges that should be fixed in a file

    Of the form [(10, 20), (30, 40)].

    This depends on the given base contexts; we must consider lines that have
    changed versus any of the base contexts, and whether the file has been
    renamed versus any of them.

    Another way to understand this is that we exclude line ranges that are
    common to the file in all base contexts.
    """
    if opts.get(b'whole'):
        # Return a range containing all lines. Rely on the diff implementation's
        # idea of how many lines are in the file, instead of reimplementing it.
        return difflineranges(b'', content2)

    rangeslist = []
    for basectx in basectxs:
        basepath = copies.pathcopies(basectx, fixctx).get(path, path)
        if basepath in basectx:
            content1 = basectx[basepath].data()
        else:
            content1 = b''
        rangeslist.extend(difflineranges(content1, content2))
    return unionranges(rangeslist)


def unionranges(rangeslist):
    """Return the union of some closed intervals

    >>> unionranges([])
    []
    >>> unionranges([(1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (2, 100)])
    [(1, 100)]
    >>> unionranges([(1, 99), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (40, 60)])
    [(1, 100)]
    >>> unionranges([(1, 49), (50, 100)])
    [(1, 100)]
    >>> unionranges([(1, 48), (50, 100)])
    [(1, 48), (50, 100)]
    >>> unionranges([(1, 2), (3, 4), (5, 6)])
    [(1, 6)]
    """
    rangeslist = sorted(set(rangeslist))
    unioned = []
    if rangeslist:
        unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
    for a, b in rangeslist:
        c, d = unioned[-1]
        if a > d + 1:
            unioned.append((a, b))
        else:
            unioned[-1] = (c, max(b, d))
    return unioned


def difflineranges(content1, content2):
    """Return list of line number ranges in content2 that differ from content1.

    Line numbers are 1-based. The numbers are the first and last line contained
    in the range. Single-line ranges have the same line number for the first and
    last line. Excludes any empty ranges that result from lines that are only
    present in content1. Relies on mdiff's idea of where the line endings are in
    the string.

    >>> from mercurial import pycompat
    >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
    >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
    >>> difflineranges2(b'', b'')
    []
    >>> difflineranges2(b'a', b'')
    []
    >>> difflineranges2(b'', b'A')
    [(1, 1)]
    >>> difflineranges2(b'a', b'a')
    []
    >>> difflineranges2(b'a', b'A')
    [(1, 1)]
    >>> difflineranges2(b'ab', b'')
    []
    >>> difflineranges2(b'', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abc', b'ac')
    []
    >>> difflineranges2(b'ab', b'aCb')
    [(2, 2)]
    >>> difflineranges2(b'abc', b'aBc')
    [(2, 2)]
    >>> difflineranges2(b'ab', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abcde', b'aBcDe')
    [(2, 2), (4, 4)]
    >>> difflineranges2(b'abcde', b'aBCDe')
    [(2, 4)]
    """
    ranges = []
    for lines, kind in mdiff.allblocks(content1, content2):
        firstline, lastline = lines[2:4]
        if kind == b'!' and firstline != lastline:
            ranges.append((firstline + 1, lastline))
    return ranges


def getbasectxs(repo, opts, revstofix):
    """Returns a map of the base contexts for each revision

    The base contexts determine which lines are considered modified when we
    attempt to fix just the modified lines in a file. It also determines which
    files we attempt to fix, so it is important to compute this even when
    --whole is used.
    """
    # The --base flag overrides the usual logic, and we give every revision
    # exactly the set of baserevs that the user specified.
    if opts.get(b'base'):
        baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
        if not baserevs:
            baserevs = {nullrev}
        basectxs = {repo[rev] for rev in baserevs}
        return {rev: basectxs for rev in revstofix}

    # Proceed in topological order so that we can easily determine each
    # revision's baserevs by looking at its parents and their baserevs.
    basectxs = collections.defaultdict(set)
    for rev in sorted(revstofix):
        ctx = repo[rev]
        for pctx in ctx.parents():
            if pctx.rev() in basectxs:
                basectxs[rev].update(basectxs[pctx.rev()])
            else:
                basectxs[rev].add(pctx)
    return basectxs
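The topological propagation above can be sketched on a toy DAG. Plain dicts and ints stand in for the repo and contexts here; `toy_basectxs` and its dict shapes are illustrative only, not Mercurial APIs:

```python
import collections

def toy_basectxs(parents, revstofix):
    """parents maps rev -> list of parent revs; revstofix is a set of revs.

    Mirrors the loop above: a parent that is itself being fixed contributes
    its own bases, while an unfixed parent is a base itself.
    """
    basectxs = collections.defaultdict(set)
    for rev in sorted(revstofix):  # ascending == topological in this toy
        for p in parents[rev]:
            if p in basectxs:
                basectxs[rev].update(basectxs[p])
            else:
                basectxs[rev].add(p)
    return dict(basectxs)

# Linear history 0 <- 1 <- 2, fixing revs 1 and 2: both use rev 0 as base,
# so rev 2's "modified lines" are measured against rev 0 and cover changes
# made in either revision being rewritten.
print(toy_basectxs({1: [0], 2: [1]}, {1, 2}))  # {1: {0}, 2: {0}}
```

Fixing only rev 2 in the same history would instead diff it against its immediate parent, rev 1.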


def fixfile(ui, repo, opts, fixers, fixctx, path, basectxs):
    """Run any configured fixers that should affect the file in this context

    Returns the file content that results from applying the fixers in some order
    starting with the file's content in the fixctx. Fixers that support line
    ranges will affect lines that have changed relative to any of the basectxs
    (i.e. they will only avoid lines that are common to all basectxs).

    A fixer tool's stdout will become the file's new content if and only if it
    exits with code zero. The fixer tool's working directory is the repository's
    root.
    """
    metadata = {}
    newdata = fixctx[path].data()
    for fixername, fixer in pycompat.iteritems(fixers):
        if fixer.affects(opts, fixctx, path):
            ranges = lineranges(opts, path, basectxs, fixctx, newdata)
            command = fixer.command(ui, path, ranges)
            if command is None:
                continue
            ui.debug(b'subprocess: %s\n' % (command,))
            proc = subprocess.Popen(
                procutil.tonativestr(command),
                shell=True,
                cwd=procutil.tonativestr(repo.root),
                stdin=subprocess.PIPE,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
            )
            stdout, stderr = proc.communicate(newdata)
            if stderr:
                showstderr(ui, fixctx.rev(), fixername, stderr)
            newerdata = stdout
            if fixer.shouldoutputmetadata():
                try:
                    metadatajson, newerdata = stdout.split(b'\0', 1)
                    metadata[fixername] = json.loads(metadatajson)
                except ValueError:
                    ui.warn(
                        _(b'ignored invalid output from fixer tool: %s\n')
                        % (fixername,)
                    )
                    continue
            else:
                metadata[fixername] = None
            if proc.returncode == 0:
                newdata = newerdata
            else:
                if not stderr:
                    message = _(b'exited with status %d\n') % (proc.returncode,)
                    showstderr(ui, fixctx.rev(), fixername, message)
                checktoolfailureaction(
                    ui,
                    _(b'no fixes will be applied'),
                    hint=_(
                        b'use --config fix.failure=continue to apply any '
                        b'successful fixes anyway'
                    ),
                )
    return metadata, newdata
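The stdin/stdout contract described in the docstring can be demonstrated outside Mercurial. In this sketch, `run_fixer` is a hypothetical stand-in for the loop body above, and a `python -c` one-liner plays the role of a fixer tool:

```python
import subprocess
import sys

def run_fixer(argv, content):
    """Pipe content through a fixer tool; its stdout replaces the content
    only when the tool exits with status zero, as fixfile() requires."""
    proc = subprocess.run(argv, input=content, capture_output=True)
    return proc.stdout if proc.returncode == 0 else content

# A toy "fixer" that uppercases whatever it reads on stdin.
upper = [sys.executable, "-c",
         "import sys; sys.stdout.write(sys.stdin.read().upper())"]
# A failing tool: its output is discarded and the original content is kept.
broken = [sys.executable, "-c", "import sys; sys.exit(1)"]

print(run_fixer(upper, b"hello\n"))   # b'HELLO\n'
print(run_fixer(broken, b"hello\n"))  # b'hello\n'
```

Unlike the real code, this sketch passes an argv list rather than a shell string, and omits the repository-root working directory and stderr forwarding.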


def showstderr(ui, rev, fixername, stderr):
    """Writes the lines of the stderr string as warnings on the ui

    Uses the revision number and fixername to give more context to each line of
    the error message. Doesn't include file names, since those take up a lot of
    space and would tend to be included in the error message if they were
    relevant.
    """
    for line in re.split(b'[\r\n]+', stderr):
        if line:
            ui.warn(b'[')
            if rev is None:
                ui.warn(_(b'wdir'), label=b'evolve.rev')
            else:
                ui.warn((str(rev)), label=b'evolve.rev')
            ui.warn(b'] %s: %s\n' % (fixername, line))


def writeworkingdir(repo, ctx, filedata, replacements):
    """Write new content to the working copy and check out the new p1 if any

    We check out a new revision if and only if we fixed something in both the
    working directory and its parent revision. This avoids the need for a full
    update/merge, and means that the working directory simply isn't affected
    unless the --working-dir flag is given.

    Directly updates the dirstate for the affected files.
    """
    for path, data in pycompat.iteritems(filedata):
        fctx = ctx[path]
        fctx.write(data, fctx.flags())
        if repo.dirstate[path] == b'n':
            repo.dirstate.normallookup(path)

    oldparentnodes = repo.dirstate.parents()
    newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
    if newparentnodes != oldparentnodes:
        repo.setparents(*newparentnodes)


def replacerev(ui, repo, ctx, filedata, replacements):
    """Commit a new revision like the given one, but with file content changes

    "ctx" is the original revision to be replaced by a modified one.

    "filedata" is a dict that maps paths to their new file content. All other
    paths will be recreated from the original revision without changes.
    "filedata" may contain paths that didn't exist in the original revision;
    they will be added.

    "replacements" is a dict that maps a single node to a single node, and it is
    updated to indicate the original revision is replaced by the newly created
    one. No entry is added if the replacement's node already exists.

    The new revision has the same parents as the old one, unless those parents
    have already been replaced, in which case those replacements are the parents
    of this new revision. Thus, if revisions are replaced in topological order,
    there is no need to rebase them into the original topology later.
    """

    p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
    p1ctx, p2ctx = repo[p1rev], repo[p2rev]
    newp1node = replacements.get(p1ctx.node(), p1ctx.node())
    newp2node = replacements.get(p2ctx.node(), p2ctx.node())

    # We don't want to create a revision that has no changes from the original,
    # but we should if the original revision's parent has been replaced.
    # Otherwise, we would produce an orphan that needs no actual human
    # intervention to evolve. We can't rely on commit() to avoid creating the
    # un-needed revision because the extra field added below produces a new hash
    # regardless of file content changes.
    if (
        not filedata
        and p1ctx.node() not in replacements
        and p2ctx.node() not in replacements
    ):
        return

    def filectxfn(repo, memctx, path):
        if path not in ctx:
            return None
        fctx = ctx[path]
        copysource = fctx.copysource()
        return context.memfilectx(
            repo,
            memctx,
            path=fctx.path(),
            data=filedata.get(path, fctx.data()),
            islink=fctx.islink(),
            isexec=fctx.isexec(),
            copysource=copysource,
        )

    extra = ctx.extra().copy()
    extra[b'fix_source'] = ctx.hex()

    memctx = context.memctx(
        repo,
        parents=(newp1node, newp2node),
        text=ctx.description(),
        files=set(ctx.files()) | set(filedata.keys()),
        filectxfn=filectxfn,
        user=ctx.user(),
        date=ctx.date(),
        extra=extra,
        branch=ctx.branch(),
        editor=None,
    )
    sucnode = memctx.commit()
    prenode = ctx.node()
    if prenode == sucnode:
        ui.debug(b'node %s already existed\n' % (ctx.hex()))
    else:
        replacements[ctx.node()] = sucnode


def getfixers(ui):
    """Returns a map of configured fixer tools indexed by their names

    Each value is a Fixer object with methods that implement the behavior of the
    fixer's config suboptions. Does not validate the config values.
    """
    fixers = {}
    for name in fixernames(ui):
        enabled = ui.configbool(b'fix', name + b':enabled')
        command = ui.config(b'fix', name + b':command')
        pattern = ui.config(b'fix', name + b':pattern')
        linerange = ui.config(b'fix', name + b':linerange')
        priority = ui.configint(b'fix', name + b':priority')
        metadata = ui.configbool(b'fix', name + b':metadata')
        skipclean = ui.configbool(b'fix', name + b':skipclean')
        # Don't use a fixer if it has no pattern configured. It would be
        # dangerous to let it affect all files. It would be pointless to let it
        # affect no files. There is no reasonable subset of files to use as the
        # default.
        if command is None:
            ui.warn(
                _(b'fixer tool has no command configuration: %s\n') % (name,)
            )
        elif pattern is None:
            ui.warn(
                _(b'fixer tool has no pattern configuration: %s\n') % (name,)
            )
        elif not enabled:
            ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
        else:
            fixers[name] = Fixer(
                command, pattern, linerange, priority, metadata, skipclean
            )
    return collections.OrderedDict(
        sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
    )


def fixernames(ui):
    """Returns the names of [fix] config options that have suboptions"""
    names = set()
    for k, v in ui.configitems(b'fix'):
        if b':' in k:
            names.add(k.split(b':', 1)[0])
    return names


class Fixer(object):
    """Wraps the raw config values for a fixer with methods"""

    def __init__(
        self, command, pattern, linerange, priority, metadata, skipclean
    ):
        self._command = command
        self._pattern = pattern
        self._linerange = linerange
        self._priority = priority
        self._metadata = metadata
        self._skipclean = skipclean

    def affects(self, opts, fixctx, path):
        """Should this fixer run on the file at the given path and context?"""
        return scmutil.match(fixctx, [self._pattern], opts)(path)

    def shouldoutputmetadata(self):
        """Should the stdout of this fixer start with JSON and a null byte?"""
        return self._metadata

    def command(self, ui, path, ranges):
        """A shell command to use to invoke this fixer on the given file/lines

        May return None if there is no appropriate command to run for the given
        parameters.
        """
        expand = cmdutil.rendercommandtemplate
        parts = [
            expand(
                ui,
                self._command,
                {b'rootpath': path, b'basename': os.path.basename(path)},
            )
        ]
        if self._linerange:
            if self._skipclean and not ranges:
                # No line ranges to fix, so don't run the fixer.
                return None
            for first, last in ranges:
                parts.append(
                    expand(
                        ui, self._linerange, {b'first': first, b'last': last}
                    )
                )
        return b' '.join(parts)
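For the clang-format configuration shown in the module docstring, the assembly in command() behaves roughly like the following sketch, which substitutes plain str.format for Mercurial's command templater (`toy_command` is illustrative only, not the extension's API):

```python
import os

def toy_command(command, linerange, path, ranges, skipclean=True):
    """Assemble a shell command the way Fixer.command() does."""
    parts = [command.format(rootpath=path,
                            basename=os.path.basename(path))]
    if linerange:
        if skipclean and not ranges:
            return None  # nothing changed, so don't run the tool at all
        for first, last in ranges:
            parts.append(linerange.format(first=first, last=last))
    return " ".join(parts)

cmd = toy_command("clang-format --assume-filename={rootpath}",
                  "--lines={first}:{last}",
                  "src/foo.cpp", [(10, 20), (30, 40)])
# cmd: clang-format --assume-filename=src/foo.cpp --lines=10:20 --lines=30:40
```

Note how one `--lines` argument is appended per changed range, so the tool only reformats the lines that `lineranges()` reported as modified.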