fix: fix registration of config item defaults...
Martin von Zweigbergk
r43488:5cb3e6f4 default
@@ -1,871 +1,870 b''
# fix - rewrite file content in changesets and working copy
#
# Copyright 2018 Google LLC.
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
"""rewrite file content in changesets or working copy (EXPERIMENTAL)

Provides a command that runs configured tools on the contents of modified files,
writing back any fixes to the working copy or replacing changesets.

Here is an example configuration that causes :hg:`fix` to apply automatic
formatting fixes to modified lines in C++ code::

  [fix]
  clang-format:command=clang-format --assume-filename={rootpath}
  clang-format:linerange=--lines={first}:{last}
  clang-format:pattern=set:**.cpp or **.hpp

The :command suboption forms the first part of the shell command that will be
used to fix a file. The content of the file is passed on standard input, and the
fixed file content is expected on standard output. Any output on standard error
will be displayed as a warning. If the exit status is not zero, the file will
not be affected. A placeholder warning is displayed if there is a non-zero exit
status but no standard error output. Some values may be substituted into the
command::

  {rootpath}  The path of the file being fixed, relative to the repo root
  {basename}  The name of the file being fixed, without the directory path

If the :linerange suboption is set, the tool will only be run if there are
changed lines in a file. The value of this suboption is appended to the shell
command once for every range of changed lines in the file. Some values may be
substituted into the command::

  {first}  The 1-based line number of the first line in the modified range
  {last}   The 1-based line number of the last line in the modified range

Deleted sections of a file will be ignored by :linerange, because there is no
corresponding line range in the version being fixed.

By default, tools that set :linerange will only be executed if there is at least
one changed line range. This is meant to prevent accidents like running a code
formatter in such a way that it unexpectedly reformats the whole file. If such a
tool needs to operate on unchanged files, it should set the :skipclean suboption
to false.

The :pattern suboption determines which files will be passed through each
configured tool. See :hg:`help patterns` for possible values. If there are file
arguments to :hg:`fix`, the intersection of these patterns is used.

There is also a configurable limit for the maximum size of file that will be
processed by :hg:`fix`::

  [fix]
  maxfilesize = 2MB

Normally, execution of configured tools will continue after a failure (indicated
by a non-zero exit status). It can also be configured to abort after the first
such failure, so that no files will be affected if any tool fails. This abort
will also cause :hg:`fix` to exit with a non-zero status::

  [fix]
  failure = abort

When multiple tools are configured to affect a file, they execute in an order
defined by the :priority suboption. The priority suboption has a default value
of zero for each tool. Tools are executed in order of descending priority. The
execution order of tools with equal priority is unspecified. For example, you
could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
in a text file by ensuring that 'sort' runs before 'head'::

  [fix]
  sort:command = sort -n
  head:command = head -n 10
  sort:pattern = numbers.txt
  head:pattern = numbers.txt
  sort:priority = 2
  head:priority = 1

To account for changes made by each tool, the line numbers used for incremental
formatting are recomputed before executing the next tool. So, each tool may see
different values for the arguments added by the :linerange suboption.

Each fixer tool is allowed to return some metadata in addition to the fixed file
content. The metadata must be placed before the file content on stdout,
separated from the file content by a zero byte. The metadata is parsed as a JSON
value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
is expected to produce this metadata encoding if and only if the :metadata
suboption is true::

  [fix]
  tool:command = tool --prepend-json-metadata
  tool:metadata = true

The metadata values are passed to hooks, which can be used to print summaries or
perform other post-fixing work. The supported hooks are::

  "postfixfile"
    Run once for each file in each revision where any fixer tools made changes
    to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
    and "$HG_METADATA" with a map of fixer names to metadata values from fixer
    tools that affected the file. Fixer tools that didn't affect the file have a
    value of None. Only fixer tools that executed are present in the metadata.

  "postfix"
    Run once after all files and revisions have been handled. Provides
    "$HG_REPLACEMENTS" with information about what revisions were created and
    made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
    files in the working copy were updated. Provides a list "$HG_METADATA"
    mapping fixer tool names to lists of metadata values returned from
    executions that modified a file. This aggregates the same metadata
    previously passed to the "postfixfile" hook.

Fixer tools are run in the repository's root directory. This allows them to read
configuration files from the working copy, or even write to the working copy.
The working copy is not updated to match the revision being fixed. In fact,
several revisions may be fixed in parallel. Writes to the working copy are not
amended into the revision being fixed; fixer tools should always write fixed
file content back to stdout as documented above.
"""
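
The :metadata protocol described in the docstring (UTF-8 JSON, then a zero
byte, then the fixed file content, all on stdout) can be sketched as a
standalone fixer tool. This script is illustrative only, not part of
Mercurial; the whitespace-stripping behavior and the "changed_lines" key are
invented for the example:

```python
import json
import sys


def run_fixer(content):
    # Hypothetical fix: strip trailing whitespace, counting changed lines.
    fixed_lines = []
    changed = 0
    for line in content.split(b"\n"):
        stripped = line.rstrip()
        if stripped != line:
            changed += 1
        fixed_lines.append(stripped)
    # Per the protocol above: UTF-8 JSON metadata, a zero byte, then the
    # fixed file content.
    metadata = json.dumps({"changed_lines": changed}).encode("utf-8")
    return metadata + b"\0" + b"\n".join(fixed_lines)


if __name__ == "__main__":
    sys.stdout.buffer.write(run_fixer(sys.stdin.buffer.read()))
```

A tool like this might be wired up with something like `tool:command = python3
fixer.py` and `tool:metadata = true`, after which :hg:`fix` would pass the
parsed JSON on to the "postfixfile" and "postfix" hooks.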

from __future__ import absolute_import

import collections
import itertools
import json
import os
import re
import subprocess

from mercurial.i18n import _
from mercurial.node import nullrev
from mercurial.node import wdirrev
from mercurial.pycompat import setattr

from mercurial.utils import (
    procutil,
    stringutil,
)

from mercurial import (
    cmdutil,
    context,
    copies,
    error,
    mdiff,
    merge,
    obsolete,
    pycompat,
    registrar,
    scmutil,
    util,
    worker,
)

# Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
# extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
# be specifying the version(s) of Mercurial they are tested with, or
# leave the attribute unspecified.
testedwith = b'ships-with-hg-core'

cmdtable = {}
command = registrar.command(cmdtable)

configtable = {}
configitem = registrar.configitem(configtable)

# Register the suboptions allowed for each configured fixer, and default values.
FIXER_ATTRS = {
    b'command': None,
    b'linerange': None,
    b'pattern': None,
    b'priority': 0,
    b'metadata': b'false',
    b'skipclean': b'true',
    b'enabled': b'true',
}

for key, default in FIXER_ATTRS.items():
-    configitem(b'fix', b'.*(:%s)?' % key, default=default, generic=True)
+    configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)
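
The one-line change above is the point of this commit: the generic pattern for
registering suboption defaults is anchored. A sketch of the difference
(assuming `re.match`-style prefix matching for generic config items; the key
names here are illustrative):

```python
import re

# Old, over-broad pattern: because the suboption group is optional and
# '.*' matches anything, ANY key under [fix] matched.
OLD = re.compile(br'.*(:command)?')

# New, anchored pattern: only keys that really end in ':command' match.
NEW = re.compile(br'.*:command$')

assert OLD.match(b'maxfilesize')             # unrelated key matched
assert OLD.match(b'clang-format:linerange')  # wrong suboption matched too
assert NEW.match(b'clang-format:command')    # intended suboption matches
assert not NEW.match(b'maxfilesize')
assert not NEW.match(b'clang-format:linerange')
```

With the old pattern, each suboption's default was effectively registered for
keys it was never meant to cover; anchoring restricts each default to its own
suboption.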

# A good default size allows most source code files to be fixed, but avoids
# letting fixer tools choke on huge inputs, which could be surprising to the
# user.
configitem(b'fix', b'maxfilesize', default=b'2MB')

# Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
# This helps users write shell scripts that stop when a fixer tool signals a
# problem.
configitem(b'fix', b'failure', default=b'continue')


def checktoolfailureaction(ui, message, hint=None):
    """Abort with 'message' if fix.failure=abort"""
    action = ui.config(b'fix', b'failure')
    if action not in (b'continue', b'abort'):
        raise error.Abort(
            _(b'unknown fix.failure action: %s') % (action,),
            hint=_(b'use "continue" or "abort"'),
        )
    if action == b'abort':
        raise error.Abort(message, hint=hint)


allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
baseopt = (
    b'',
    b'base',
    [],
    _(
        b'revisions to diff against (overrides automatic '
        b'selection, and applies to every revision being '
        b'fixed)'
    ),
    _(b'REV'),
)
revopt = (b'r', b'rev', [], _(b'revisions to fix'), _(b'REV'))
wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
usage = _(b'[OPTION]... [FILE]...')


@command(
    b'fix',
    [allopt, baseopt, revopt, wdiropt, wholeopt],
    usage,
    helpcategory=command.CATEGORY_FILE_CONTENTS,
)
def fix(ui, repo, *pats, **opts):
    """rewrite file content in changesets or working directory

    Runs any configured tools to fix the content of files. Only affects files
    with changes, unless file arguments are provided. Only affects changed lines
    of files, unless the --whole flag is used. Some tools may always affect the
    whole file regardless of --whole.

    If revisions are specified with --rev, those revisions will be checked, and
    they may be replaced with new revisions that have fixed file content. It is
    desirable to specify all descendants of each specified revision, so that the
    fixes propagate to the descendants. If all descendants are fixed at the same
    time, no merging, rebasing, or evolution will be required.

    If --working-dir is used, files with uncommitted changes in the working copy
    will be fixed. If the checked-out revision is also fixed, the working
    directory will update to the replacement revision.

    When determining what lines of each file to fix at each revision, the whole
    set of revisions being fixed is considered, so that fixes to earlier
    revisions are not forgotten in later ones. The --base flag can be used to
    override this default behavior, though it is not usually desirable to do so.
    """
    opts = pycompat.byteskwargs(opts)
    if opts[b'all']:
        if opts[b'rev']:
            raise error.Abort(_(b'cannot specify both "--rev" and "--all"'))
        opts[b'rev'] = [b'not public() and not obsolete()']
        opts[b'working_dir'] = True
    with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
        revstofix = getrevstofix(ui, repo, opts)
        basectxs = getbasectxs(repo, opts, revstofix)
        workqueue, numitems = getworkqueue(
            ui, repo, pats, opts, revstofix, basectxs
        )
        fixers = getfixers(ui)

        # There are no data dependencies between the workers fixing each file
        # revision, so we can use all available parallelism.
        def getfixes(items):
            for rev, path in items:
                ctx = repo[rev]
                olddata = ctx[path].data()
                metadata, newdata = fixfile(
                    ui, repo, opts, fixers, ctx, path, basectxs[rev]
                )
                # Don't waste memory/time passing unchanged content back, but
                # produce one result per item either way.
                yield (
                    rev,
                    path,
                    metadata,
                    newdata if newdata != olddata else None,
                )

        results = worker.worker(
            ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
        )

        # We have to hold on to the data for each successor revision in memory
        # until all its parents are committed. We ensure this by committing and
        # freeing memory for the revisions in some topological order. This
        # leaves a little bit of memory efficiency on the table, but also makes
        # the tests deterministic. It might also be considered a feature since
        # it makes the results more easily reproducible.
        filedata = collections.defaultdict(dict)
        aggregatemetadata = collections.defaultdict(list)
        replacements = {}
        wdirwritten = False
        commitorder = sorted(revstofix, reverse=True)
        with ui.makeprogress(
            topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
        ) as progress:
            for rev, path, filerevmetadata, newdata in results:
                progress.increment(item=path)
                for fixername, fixermetadata in filerevmetadata.items():
                    aggregatemetadata[fixername].append(fixermetadata)
                if newdata is not None:
                    filedata[rev][path] = newdata
                    hookargs = {
                        b'rev': rev,
                        b'path': path,
                        b'metadata': filerevmetadata,
                    }
                    repo.hook(
                        b'postfixfile',
                        throw=False,
                        **pycompat.strkwargs(hookargs)
                    )
                numitems[rev] -= 1
                # Apply the fixes for this and any other revisions that are
                # ready and sitting at the front of the queue. Using a loop here
                # prevents the queue from being blocked by the first revision to
                # be ready out of order.
                while commitorder and not numitems[commitorder[-1]]:
                    rev = commitorder.pop()
                    ctx = repo[rev]
                    if rev == wdirrev:
                        writeworkingdir(repo, ctx, filedata[rev], replacements)
                        wdirwritten = bool(filedata[rev])
                    else:
                        replacerev(ui, repo, ctx, filedata[rev], replacements)
                    del filedata[rev]

        cleanup(repo, replacements, wdirwritten)
        hookargs = {
            b'replacements': replacements,
            b'wdirwritten': wdirwritten,
            b'metadata': aggregatemetadata,
        }
        repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))


def cleanup(repo, replacements, wdirwritten):
    """Calls scmutil.cleanupnodes() with the given replacements.

    "replacements" is a dict from nodeid to nodeid, with one key and one value
    for every revision that was affected by fixing. This is slightly different
    from cleanupnodes().

    "wdirwritten" is a bool which tells whether the working copy was affected by
    fixing, since it has no entry in "replacements".

    Useful as a hook point for extending "hg fix" with output summarizing the
    effects of the command, though we choose not to output anything here.
    """
    replacements = {
        prec: [succ] for prec, succ in pycompat.iteritems(replacements)
    }
    scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)


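The dict comprehension in cleanup() adapts shapes: "hg fix" tracks exactly one
successor per precursor, while scmutil.cleanupnodes() expects a list of
successors per precursor. A toy illustration with made-up node ids (real ones
are binary hashes):

```python
# Hypothetical precursor -> successor mapping, one successor each.
replacements = {b'prec1': b'succ1', b'prec2': b'succ2'}

# Wrap each single successor in a list, as cleanupnodes() expects.
wrapped = {prec: [succ] for prec, succ in replacements.items()}
```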
def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
    """Constructs the list of files to be fixed at specific revisions

    It is up to the caller how to consume the work items, and the only
    dependence between them is that replacement revisions must be committed in
    topological order. Each work item represents a file in the working copy or
    in some revision that should be fixed and written back to the working copy
    or into a replacement revision.

    Work items for the same revision are grouped together, so that a worker
    pool starting with the first N items in parallel is likely to finish the
    first revision's work before other revisions. This can allow us to write
    the result to disk and reduce memory footprint. At time of writing, the
    partition strategy in worker.py seems favorable to this. We also sort the
    items by ascending revision number to match the order in which we commit
    the fixes later.
    """
    workqueue = []
    numitems = collections.defaultdict(int)
    maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
    for rev in sorted(revstofix):
        fixctx = repo[rev]
        match = scmutil.match(fixctx, pats, opts)
        for path in sorted(
            pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
        ):
            fctx = fixctx[path]
            if fctx.islink():
                continue
            if fctx.size() > maxfilesize:
                ui.warn(
                    _(b'ignoring file larger than %s: %s\n')
                    % (util.bytecount(maxfilesize), path)
                )
                continue
            workqueue.append((rev, path))
            numitems[rev] += 1
    return workqueue, numitems


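The grouping and ordering that getworkqueue()'s docstring describes can be
sketched without a repository (revision numbers and paths below are made up):

```python
import collections


def build_workqueue(files_by_rev):
    # Sort by ascending revision so queue order matches the later commit
    # order, and keep each revision's files contiguous so a worker pool
    # tends to finish one revision before starting the next.
    workqueue = []
    numitems = collections.defaultdict(int)
    for rev in sorted(files_by_rev):
        for path in sorted(files_by_rev[rev]):
            workqueue.append((rev, path))
            numitems[rev] += 1
    return workqueue, numitems
```

The numitems counts are what lets the caller commit a revision as soon as its
last file result arrives, as the loop in fix() does.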
def getrevstofix(ui, repo, opts):
    """Returns the set of revision numbers that should be fixed"""
    revs = set(scmutil.revrange(repo, opts[b'rev']))
    for rev in revs:
        checkfixablectx(ui, repo, repo[rev])
    if revs:
        cmdutil.checkunfinished(repo)
        checknodescendants(repo, revs)
    if opts.get(b'working_dir'):
        revs.add(wdirrev)
        if list(merge.mergestate.read(repo).unresolved()):
            raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
    if not revs:
        raise error.Abort(
            b'no changesets specified', hint=b'use --rev or --working-dir'
        )
    return revs


def checknodescendants(repo, revs):
    if not obsolete.isenabled(repo, obsolete.allowunstableopt) and repo.revs(
        b'(%ld::) - (%ld)', revs, revs
    ):
        raise error.Abort(
            _(b'can only fix a changeset together with all its descendants')
        )


def checkfixablectx(ui, repo, ctx):
    """Aborts if the revision shouldn't be replaced with a fixed one."""
    if not ctx.mutable():
433 raise error.Abort(
433 raise error.Abort(
434 b'can\'t fix immutable changeset %s'
434 b'can\'t fix immutable changeset %s'
435 % (scmutil.formatchangeid(ctx),)
435 % (scmutil.formatchangeid(ctx),)
436 )
436 )
437 if ctx.obsolete():
437 if ctx.obsolete():
438 # It would be better to actually check if the revision has a successor.
438 # It would be better to actually check if the revision has a successor.
439 allowdivergence = ui.configbool(
439 allowdivergence = ui.configbool(
440 b'experimental', b'evolution.allowdivergence'
440 b'experimental', b'evolution.allowdivergence'
441 )
441 )
442 if not allowdivergence:
442 if not allowdivergence:
443 raise error.Abort(
443 raise error.Abort(
444 b'fixing obsolete revision could cause divergence'
444 b'fixing obsolete revision could cause divergence'
445 )
445 )
446
446
447
447
448 def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
448 def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
449 """Returns the set of files that should be fixed in a context
449 """Returns the set of files that should be fixed in a context
450
450
451 The result depends on the base contexts; we include any file that has
451 The result depends on the base contexts; we include any file that has
452 changed relative to any of the base contexts. Base contexts should be
452 changed relative to any of the base contexts. Base contexts should be
453 ancestors of the context being fixed.
453 ancestors of the context being fixed.
454 """
454 """
455 files = set()
455 files = set()
456 for basectx in basectxs:
456 for basectx in basectxs:
457 stat = basectx.status(
457 stat = basectx.status(
458 fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
458 fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
459 )
459 )
460 files.update(
460 files.update(
461 set(
461 set(
462 itertools.chain(
462 itertools.chain(
463 stat.added, stat.modified, stat.clean, stat.unknown
463 stat.added, stat.modified, stat.clean, stat.unknown
464 )
464 )
465 )
465 )
466 )
466 )
467 return files
467 return files
468
468
469
469
470 def lineranges(opts, path, basectxs, fixctx, content2):
470 def lineranges(opts, path, basectxs, fixctx, content2):
471 """Returns the set of line ranges that should be fixed in a file
471 """Returns the set of line ranges that should be fixed in a file
472
472
473 Of the form [(10, 20), (30, 40)].
473 Of the form [(10, 20), (30, 40)].
474
474
475 This depends on the given base contexts; we must consider lines that have
475 This depends on the given base contexts; we must consider lines that have
476 changed versus any of the base contexts, and whether the file has been
476 changed versus any of the base contexts, and whether the file has been
477 renamed versus any of them.
477 renamed versus any of them.
478
478
479 Another way to understand this is that we exclude line ranges that are
479 Another way to understand this is that we exclude line ranges that are
480 common to the file in all base contexts.
480 common to the file in all base contexts.
481 """
481 """
482 if opts.get(b'whole'):
482 if opts.get(b'whole'):
483 # Return a range containing all lines. Rely on the diff implementation's
483 # Return a range containing all lines. Rely on the diff implementation's
484 # idea of how many lines are in the file, instead of reimplementing it.
484 # idea of how many lines are in the file, instead of reimplementing it.
485 return difflineranges(b'', content2)
485 return difflineranges(b'', content2)
486
486
487 rangeslist = []
487 rangeslist = []
488 for basectx in basectxs:
488 for basectx in basectxs:
489 basepath = copies.pathcopies(basectx, fixctx).get(path, path)
489 basepath = copies.pathcopies(basectx, fixctx).get(path, path)
490 if basepath in basectx:
490 if basepath in basectx:
491 content1 = basectx[basepath].data()
491 content1 = basectx[basepath].data()
492 else:
492 else:
493 content1 = b''
493 content1 = b''
494 rangeslist.extend(difflineranges(content1, content2))
494 rangeslist.extend(difflineranges(content1, content2))
495 return unionranges(rangeslist)
495 return unionranges(rangeslist)
496
496
497
497
498 def unionranges(rangeslist):
498 def unionranges(rangeslist):
499 """Return the union of some closed intervals
499 """Return the union of some closed intervals
500
500
501 >>> unionranges([])
501 >>> unionranges([])
502 []
502 []
503 >>> unionranges([(1, 100)])
503 >>> unionranges([(1, 100)])
504 [(1, 100)]
504 [(1, 100)]
505 >>> unionranges([(1, 100), (1, 100)])
505 >>> unionranges([(1, 100), (1, 100)])
506 [(1, 100)]
506 [(1, 100)]
507 >>> unionranges([(1, 100), (2, 100)])
507 >>> unionranges([(1, 100), (2, 100)])
508 [(1, 100)]
508 [(1, 100)]
509 >>> unionranges([(1, 99), (1, 100)])
509 >>> unionranges([(1, 99), (1, 100)])
510 [(1, 100)]
510 [(1, 100)]
511 >>> unionranges([(1, 100), (40, 60)])
511 >>> unionranges([(1, 100), (40, 60)])
512 [(1, 100)]
512 [(1, 100)]
513 >>> unionranges([(1, 49), (50, 100)])
513 >>> unionranges([(1, 49), (50, 100)])
514 [(1, 100)]
514 [(1, 100)]
515 >>> unionranges([(1, 48), (50, 100)])
515 >>> unionranges([(1, 48), (50, 100)])
516 [(1, 48), (50, 100)]
516 [(1, 48), (50, 100)]
517 >>> unionranges([(1, 2), (3, 4), (5, 6)])
517 >>> unionranges([(1, 2), (3, 4), (5, 6)])
518 [(1, 6)]
518 [(1, 6)]
519 """
519 """
520 rangeslist = sorted(set(rangeslist))
520 rangeslist = sorted(set(rangeslist))
521 unioned = []
521 unioned = []
522 if rangeslist:
522 if rangeslist:
523 unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
523 unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
524 for a, b in rangeslist:
524 for a, b in rangeslist:
525 c, d = unioned[-1]
525 c, d = unioned[-1]
526 if a > d + 1:
526 if a > d + 1:
527 unioned.append((a, b))
527 unioned.append((a, b))
528 else:
528 else:
529 unioned[-1] = (c, max(b, d))
529 unioned[-1] = (c, max(b, d))
530 return unioned
530 return unioned
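The interval-union pass in `unionranges` is self-contained, so it can be exercised outside Mercurial. A minimal standalone copy of the same algorithm (plain Python, no hg imports; the function name is kept only for clarity):

```python
def unionranges(rangeslist):
    """Merge closed, 1-based (first, last) line ranges, coalescing ranges
    that overlap or are directly adjacent, e.g. (1, 49) and (50, 100)."""
    rangeslist = sorted(set(rangeslist))
    unioned = []
    if rangeslist:
        unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
    for a, b in rangeslist:
        c, d = unioned[-1]
        if a > d + 1:
            # Disjoint from the previous range: start a new one.
            unioned.append((a, b))
        else:
            # Overlapping or adjacent: extend the previous range.
            unioned[-1] = (c, max(b, d))
    return unioned

print(unionranges([(1, 2), (3, 4), (5, 6)]))  # [(1, 6)]
print(unionranges([(1, 48), (50, 100)]))      # [(1, 48), (50, 100)]
```

Because ranges are closed and 1-based, adjacency (`a == d + 1`) merges just like overlap does, which is why `(1, 49)` and `(50, 100)` collapse into one range.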
def difflineranges(content1, content2):
    """Return list of line number ranges in content2 that differ from content1.

    Line numbers are 1-based. The numbers are the first and last line contained
    in the range. Single-line ranges have the same line number for the first and
    last line. Excludes any empty ranges that result from lines that are only
    present in content1. Relies on mdiff's idea of where the line endings are in
    the string.

    >>> from mercurial import pycompat
    >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
    >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
    >>> difflineranges2(b'', b'')
    []
    >>> difflineranges2(b'a', b'')
    []
    >>> difflineranges2(b'', b'A')
    [(1, 1)]
    >>> difflineranges2(b'a', b'a')
    []
    >>> difflineranges2(b'a', b'A')
    [(1, 1)]
    >>> difflineranges2(b'ab', b'')
    []
    >>> difflineranges2(b'', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abc', b'ac')
    []
    >>> difflineranges2(b'ab', b'aCb')
    [(2, 2)]
    >>> difflineranges2(b'abc', b'aBc')
    [(2, 2)]
    >>> difflineranges2(b'ab', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abcde', b'aBcDe')
    [(2, 2), (4, 4)]
    >>> difflineranges2(b'abcde', b'aBCDe')
    [(2, 4)]
    """
    ranges = []
    for lines, kind in mdiff.allblocks(content1, content2):
        firstline, lastline = lines[2:4]
        if kind == b'!' and firstline != lastline:
            ranges.append((firstline + 1, lastline))
    return ranges
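`difflineranges` leans on Mercurial's `mdiff.allblocks`. Outside hg, a rough equivalent can be sketched with the standard library's `difflib`; this is an approximation (difflib's matching is not guaranteed to pick the same blocks as bdiff in every case), but it illustrates the same contract: 1-based ranges of lines in the second input that changed, with pure deletions excluded:

```python
import difflib

def difflineranges(content1, content2):
    """Return 1-based (first, last) ranges of lines in content2 that
    differ from content1, skipping pure deletions (empty ranges)."""
    lines1 = content1.splitlines()
    lines2 = content2.splitlines()
    ranges = []
    matcher = difflib.SequenceMatcher(None, lines1, lines2)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        # 'replace' and 'insert' produce lines in content2; 'delete'
        # spans are empty on the content2 side and are dropped.
        if tag in ('replace', 'insert') and j1 != j2:
            ranges.append((j1 + 1, j2))
    return ranges

print(difflineranges('a\nb\nc', 'a\nB\nc'))  # [(2, 2)]
```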
def getbasectxs(repo, opts, revstofix):
    """Returns a map of the base contexts for each revision

    The base contexts determine which lines are considered modified when we
    attempt to fix just the modified lines in a file. It also determines which
    files we attempt to fix, so it is important to compute this even when
    --whole is used.
    """
    # The --base flag overrides the usual logic, and we give every revision
    # exactly the set of baserevs that the user specified.
    if opts.get(b'base'):
        baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
        if not baserevs:
            baserevs = {nullrev}
        basectxs = {repo[rev] for rev in baserevs}
        return {rev: basectxs for rev in revstofix}

    # Proceed in topological order so that we can easily determine each
    # revision's baserevs by looking at its parents and their baserevs.
    basectxs = collections.defaultdict(set)
    for rev in sorted(revstofix):
        ctx = repo[rev]
        for pctx in ctx.parents():
            if pctx.rev() in basectxs:
                basectxs[rev].update(basectxs[pctx.rev()])
            else:
                basectxs[rev].add(pctx)
    return basectxs
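The topological propagation in the second half of `getbasectxs` can be pictured with plain integers. This is a hypothetical sketch, not the hg API: parentage is given as a dict, and a parent that is itself being fixed contributes its own bases instead of itself:

```python
import collections

def getbases(parents, revstofix):
    """For each revision to fix (processed in ascending, i.e. topological,
    order), collect its base revisions: each parent contributes itself,
    unless that parent is also being fixed, in which case it contributes
    its own bases instead."""
    bases = collections.defaultdict(set)
    for rev in sorted(revstofix):
        for parent in parents[rev]:
            if parent in bases:
                # Parent is also being fixed: inherit its bases.
                bases[rev].update(bases[parent])
            else:
                bases[rev].add(parent)
    return dict(bases)

# A linear history 1 <- 2 <- 3 where revisions 2 and 3 are being fixed:
print(getbases({2: [1], 3: [2]}, {2, 3}))  # {2: {1}, 3: {1}}
```

Processing in ascending revision order is what makes the single pass sufficient: a parent's bases are always computed before any of its children look them up.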
def fixfile(ui, repo, opts, fixers, fixctx, path, basectxs):
    """Run any configured fixers that should affect the file in this context

    Returns the file content that results from applying the fixers in some order
    starting with the file's content in the fixctx. Fixers that support line
    ranges will affect lines that have changed relative to any of the basectxs
    (i.e. they will only avoid lines that are common to all basectxs).

    A fixer tool's stdout will become the file's new content if and only if it
    exits with code zero. The fixer tool's working directory is the repository's
    root.
    """
    metadata = {}
    newdata = fixctx[path].data()
    for fixername, fixer in pycompat.iteritems(fixers):
        if fixer.affects(opts, fixctx, path):
            ranges = lineranges(opts, path, basectxs, fixctx, newdata)
            command = fixer.command(ui, path, ranges)
            if command is None:
                continue
            ui.debug(b'subprocess: %s\n' % (command,))
            proc = subprocess.Popen(
                procutil.tonativestr(command),
                shell=True,
                cwd=procutil.tonativestr(repo.root),
                stdin=subprocess.PIPE,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
            )
            stdout, stderr = proc.communicate(newdata)
            if stderr:
                showstderr(ui, fixctx.rev(), fixername, stderr)
            newerdata = stdout
            if fixer.shouldoutputmetadata():
                try:
                    metadatajson, newerdata = stdout.split(b'\0', 1)
                    metadata[fixername] = json.loads(metadatajson)
                except ValueError:
                    ui.warn(
                        _(b'ignored invalid output from fixer tool: %s\n')
                        % (fixername,)
                    )
                    continue
            else:
                metadata[fixername] = None
            if proc.returncode == 0:
                newdata = newerdata
            else:
                if not stderr:
                    message = _(b'exited with status %d\n') % (proc.returncode,)
                    showstderr(ui, fixctx.rev(), fixername, message)
                checktoolfailureaction(
                    ui,
                    _(b'no fixes will be applied'),
                    hint=_(
                        b'use --config fix.failure=continue to apply any '
                        b'successful fixes anyway'
                    ),
                )
    return metadata, newdata
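The subprocess contract in `fixfile` (file content on stdin, fixed content on stdout, result adopted only on exit status 0) can be exercised with any shell filter. A hedged sketch using the running Python interpreter as a stand-in "formatter" (the `runfixer` name is hypothetical; assumes a POSIX-style shell for `shell=True`):

```python
import subprocess
import sys

def runfixer(command, data):
    """Feed data to a fixer command on stdin; adopt its stdout only if
    it exits with status 0, mirroring fix's contract."""
    proc = subprocess.Popen(
        command,
        shell=True,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    stdout, stderr = proc.communicate(data)
    if proc.returncode == 0:
        return stdout
    return data  # tool failed: keep the original content

# A stand-in "formatter" that uppercases its input:
upper = '%s -c "import sys; sys.stdout.write(sys.stdin.read().upper())"' % (
    sys.executable,
)
print(runfixer(upper, b'fix me'))  # b'FIX ME'
```

Note that, as in `fixfile`, a non-zero exit leaves the content untouched rather than propagating a partial result.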
def showstderr(ui, rev, fixername, stderr):
    """Writes the lines of the stderr string as warnings on the ui

    Uses the revision number and fixername to give more context to each line of
    the error message. Doesn't include file names, since those take up a lot of
    space and would tend to be included in the error message if they were
    relevant.
    """
    for line in re.split(b'[\r\n]+', stderr):
        if line:
            ui.warn(b'[')
            if rev is None:
                ui.warn(_(b'wdir'), label=b'evolve.rev')
            else:
                ui.warn((str(rev)), label=b'evolve.rev')
            ui.warn(b'] %s: %s\n' % (fixername, line))


def writeworkingdir(repo, ctx, filedata, replacements):
    """Write new content to the working copy and check out the new p1 if any

    We check out a new revision if and only if we fixed something in both the
    working directory and its parent revision. This avoids the need for a full
    update/merge, and means that the working directory simply isn't affected
    unless the --working-dir flag is given.

    Directly updates the dirstate for the affected files.
    """
    for path, data in pycompat.iteritems(filedata):
        fctx = ctx[path]
        fctx.write(data, fctx.flags())
        if repo.dirstate[path] == b'n':
            repo.dirstate.normallookup(path)

    oldparentnodes = repo.dirstate.parents()
    newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
    if newparentnodes != oldparentnodes:
        repo.setparents(*newparentnodes)


def replacerev(ui, repo, ctx, filedata, replacements):
    """Commit a new revision like the given one, but with file content changes

    "ctx" is the original revision to be replaced by a modified one.

    "filedata" is a dict that maps paths to their new file content. All other
    paths will be recreated from the original revision without changes.
    "filedata" may contain paths that didn't exist in the original revision;
    they will be added.

    "replacements" is a dict that maps a single node to a single node, and it is
    updated to indicate the original revision is replaced by the newly created
    one. No entry is added if the replacement's node already exists.

    The new revision has the same parents as the old one, unless those parents
    have already been replaced, in which case those replacements are the parents
    of this new revision. Thus, if revisions are replaced in topological order,
    there is no need to rebase them into the original topology later.
    """

    p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
    p1ctx, p2ctx = repo[p1rev], repo[p2rev]
    newp1node = replacements.get(p1ctx.node(), p1ctx.node())
    newp2node = replacements.get(p2ctx.node(), p2ctx.node())

    # We don't want to create a revision that has no changes from the original,
    # but we should if the original revision's parent has been replaced.
    # Otherwise, we would produce an orphan that needs no actual human
    # intervention to evolve. We can't rely on commit() to avoid creating the
    # un-needed revision because the extra field added below produces a new hash
    # regardless of file content changes.
    if (
        not filedata
        and p1ctx.node() not in replacements
        and p2ctx.node() not in replacements
    ):
        return

    def filectxfn(repo, memctx, path):
        if path not in ctx:
            return None
        fctx = ctx[path]
        copysource = fctx.copysource()
        return context.memfilectx(
            repo,
            memctx,
            path=fctx.path(),
            data=filedata.get(path, fctx.data()),
            islink=fctx.islink(),
            isexec=fctx.isexec(),
            copysource=copysource,
        )

    extra = ctx.extra().copy()
    extra[b'fix_source'] = ctx.hex()

    memctx = context.memctx(
        repo,
        parents=(newp1node, newp2node),
        text=ctx.description(),
        files=set(ctx.files()) | set(filedata.keys()),
        filectxfn=filectxfn,
        user=ctx.user(),
        date=ctx.date(),
        extra=extra,
        branch=ctx.branch(),
        editor=None,
    )
    sucnode = memctx.commit()
    prenode = ctx.node()
    if prenode == sucnode:
        ui.debug(b'node %s already existed\n' % (ctx.hex()))
    else:
        replacements[ctx.node()] = sucnode


def getfixers(ui):
    """Returns a map of configured fixer tools indexed by their names

    Each value is a Fixer object with methods that implement the behavior of the
    fixer's config suboptions. Does not validate the config values.
    """
    fixers = {}
    for name in fixernames(ui):
        fixers[name] = Fixer()
-       attrs = ui.configsuboptions(b'fix', name)[1]
        for key, default in FIXER_ATTRS.items():
            setattr(
                fixers[name],
                pycompat.sysstr(b'_' + key),
-               attrs.get(key, default),
+               ui.config(b'fix', name + b':' + key, default),
            )
        fixers[name]._priority = int(fixers[name]._priority)
        fixers[name]._metadata = stringutil.parsebool(fixers[name]._metadata)
        fixers[name]._skipclean = stringutil.parsebool(fixers[name]._skipclean)
        fixers[name]._enabled = stringutil.parsebool(fixers[name]._enabled)
        # Don't use a fixer if it has no pattern configured. It would be
        # dangerous to let it affect all files. It would be pointless to let it
        # affect no files. There is no reasonable subset of files to use as the
        # default.
        if fixers[name]._pattern is None:
            ui.warn(
                _(b'fixer tool has no pattern configuration: %s\n') % (name,)
            )
            del fixers[name]
        elif not fixers[name]._enabled:
            ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
            del fixers[name]
    return collections.OrderedDict(
        sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
    )
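The change in `getfixers` is the point of this commit: suboption values are now read through `ui.config()`, which falls back to registered config-item defaults, instead of through the dict of suboptions the user happened to write. One plausible toy model of that lookup (plain dicts, hypothetical names; not the real hg `ui` API or its registration machinery):

```python
# Defaults as they would be registered for each suboption
# (in hg, via registrar.configitem):
registered = {
    'fix.mytool:command': None,
    'fix.mytool:linerange': None,
    'fix.mytool:priority': 0,
}

# The user's hgrc set only :command for this tool:
userconfig = {'fix.mytool:command': 'mytool --stdin'}

def config(key, default=None):
    """ui.config-style lookup: a user-set value wins, otherwise the
    registered default applies, otherwise the caller's fallback."""
    if key in userconfig:
        return userconfig[key]
    return registered.get(key, default)

print(config('fix.mytool:command'))   # 'mytool --stdin'
print(config('fix.mytool:priority'))  # 0
```

The old `configsuboptions`-based code only ever saw the keys present in `userconfig`, so defaults registered for unset suboptions never took effect; routing every lookup through the central `config()` path fixes that.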
def fixernames(ui):
    """Returns the names of [fix] config options that have suboptions"""
    names = set()
    for k, v in ui.configitems(b'fix'):
        if b':' in k:
            names.add(k.split(b':', 1)[0])
    return names


class Fixer(object):
    """Wraps the raw config values for a fixer with methods"""

    def affects(self, opts, fixctx, path):
        """Should this fixer run on the file at the given path and context?"""
        return self._pattern is not None and scmutil.match(
            fixctx, [self._pattern], opts
        )(path)

    def shouldoutputmetadata(self):
        """Should the stdout of this fixer start with JSON and a null byte?"""
        return self._metadata

    def command(self, ui, path, ranges):
        """A shell command to use to invoke this fixer on the given file/lines

        May return None if there is no appropriate command to run for the given
        parameters.
        """
        expand = cmdutil.rendercommandtemplate
        parts = [
            expand(
                ui,
                self._command,
                {b'rootpath': path, b'basename': os.path.basename(path)},
            )
        ]
        if self._linerange:
            if self._skipclean and not ranges:
                # No line ranges to fix, so don't run the fixer.
                return None
            for first, last in ranges:
                parts.append(
                    expand(
                        ui, self._linerange, {b'first': first, b'last': last}
                    )
                )
        return b' '.join(parts)
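`Fixer.command` stitches the `:command` template together with one `:linerange` expansion per changed range, matching the example configuration in the module docstring. The shape of the result can be sketched with plain string formatting (a simplification: `rendercommandtemplate` supports a richer template syntax than `str.format`, and `buildcommand` is a hypothetical name):

```python
def buildcommand(command, linerange, path, ranges):
    """Mimic Fixer.command: the expanded :command template followed by
    one :linerange expansion per (first, last) pair."""
    parts = [command.format(rootpath=path)]
    for first, last in ranges:
        parts.append(linerange.format(first=first, last=last))
    return ' '.join(parts)

print(buildcommand(
    'clang-format --assume-filename={rootpath}',
    '--lines={first}:{last}',
    'src/foo.cpp',
    [(1, 4), (10, 12)],
))
# clang-format --assume-filename=src/foo.cpp --lines=1:4 --lines=10:12
```

Repeating the `:linerange` flag once per range is why the suboption is a template rather than a fixed argument: tools like clang-format accept `--lines` multiple times.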