fix: make Fixer initialization more explicit for clarity...
Martin von Zweigbergk -
r43493:0101db49 default
@@ -1,870 +1,874 b''
1 # fix - rewrite file content in changesets and working copy
1 # fix - rewrite file content in changesets and working copy
2 #
2 #
3 # Copyright 2018 Google LLC.
3 # Copyright 2018 Google LLC.
4 #
4 #
5 # This software may be used and distributed according to the terms of the
5 # This software may be used and distributed according to the terms of the
6 # GNU General Public License version 2 or any later version.
6 # GNU General Public License version 2 or any later version.
7 """rewrite file content in changesets or working copy (EXPERIMENTAL)
7 """rewrite file content in changesets or working copy (EXPERIMENTAL)
8
8
9 Provides a command that runs configured tools on the contents of modified files,
9 Provides a command that runs configured tools on the contents of modified files,
10 writing back any fixes to the working copy or replacing changesets.
10 writing back any fixes to the working copy or replacing changesets.
11
11
12 Here is an example configuration that causes :hg:`fix` to apply automatic
12 Here is an example configuration that causes :hg:`fix` to apply automatic
13 formatting fixes to modified lines in C++ code::
13 formatting fixes to modified lines in C++ code::
14
14
15 [fix]
15 [fix]
16 clang-format:command=clang-format --assume-filename={rootpath}
16 clang-format:command=clang-format --assume-filename={rootpath}
17 clang-format:linerange=--lines={first}:{last}
17 clang-format:linerange=--lines={first}:{last}
18 clang-format:pattern=set:**.cpp or **.hpp
18 clang-format:pattern=set:**.cpp or **.hpp
19
19
20 The :command suboption forms the first part of the shell command that will be
20 The :command suboption forms the first part of the shell command that will be
21 used to fix a file. The content of the file is passed on standard input, and the
21 used to fix a file. The content of the file is passed on standard input, and the
22 fixed file content is expected on standard output. Any output on standard error
22 fixed file content is expected on standard output. Any output on standard error
23 will be displayed as a warning. If the exit status is not zero, the file will
23 will be displayed as a warning. If the exit status is not zero, the file will
24 not be affected. A placeholder warning is displayed if there is a non-zero exit
24 not be affected. A placeholder warning is displayed if there is a non-zero exit
25 status but no standard error output. Some values may be substituted into the
25 status but no standard error output. Some values may be substituted into the
26 command::
26 command::
27
27
28 {rootpath} The path of the file being fixed, relative to the repo root
28 {rootpath} The path of the file being fixed, relative to the repo root
29 {basename} The name of the file being fixed, without the directory path
29 {basename} The name of the file being fixed, without the directory path
30
30
31 If the :linerange suboption is set, the tool will only be run if there are
31 If the :linerange suboption is set, the tool will only be run if there are
32 changed lines in a file. The value of this suboption is appended to the shell
32 changed lines in a file. The value of this suboption is appended to the shell
33 command once for every range of changed lines in the file. Some values may be
33 command once for every range of changed lines in the file. Some values may be
34 substituted into the command::
34 substituted into the command::
35
35
36 {first} The 1-based line number of the first line in the modified range
36 {first} The 1-based line number of the first line in the modified range
37 {last} The 1-based line number of the last line in the modified range
37 {last} The 1-based line number of the last line in the modified range
38
38
39 Deleted sections of a file will be ignored by :linerange, because there is no
39 Deleted sections of a file will be ignored by :linerange, because there is no
40 corresponding line range in the version being fixed.
40 corresponding line range in the version being fixed.
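To make the placeholder substitution concrete, here is a small sketch (not code from this extension; `buildcommand` is an invented name, and `str.format` stands in for Mercurial's real template expansion) of how :command and :linerange combine into one shell command:

```python
def buildcommand(command, linerange, path, ranges):
    """Expand :command once, then append one :linerange argument per
    range of changed lines, as described above."""
    basename = path.rsplit('/', 1)[-1]
    parts = [command.format(rootpath=path, basename=basename)]
    for first, last in ranges:
        parts.append(linerange.format(first=first, last=last))
    return ' '.join(parts)
```

With the clang-format configuration above and two changed ranges, this would yield a command like `clang-format --assume-filename=src/main.cpp --lines=3:5 --lines=40:40`.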
41
41
42 By default, tools that set :linerange will only be executed if there is at least
42 By default, tools that set :linerange will only be executed if there is at least
43 one changed line range. This is meant to prevent accidents like running a code
43 one changed line range. This is meant to prevent accidents like running a code
44 formatter in such a way that it unexpectedly reformats the whole file. If such a
44 formatter in such a way that it unexpectedly reformats the whole file. If such a
45 tool needs to operate on unchanged files, it should set the :skipclean suboption
45 tool needs to operate on unchanged files, it should set the :skipclean suboption
46 to false.
46 to false.
47
47
48 The :pattern suboption determines which files will be passed through each
48 The :pattern suboption determines which files will be passed through each
49 configured tool. See :hg:`help patterns` for possible values. If there are file
49 configured tool. See :hg:`help patterns` for possible values. If there are file
50 arguments to :hg:`fix`, the intersection of these patterns is used.
50 arguments to :hg:`fix`, the intersection of these patterns is used.
51
51
52 There is also a configurable limit for the maximum size of file that will be
52 There is also a configurable limit for the maximum size of file that will be
53 processed by :hg:`fix`::
53 processed by :hg:`fix`::
54
54
55 [fix]
55 [fix]
56 maxfilesize = 2MB
56 maxfilesize = 2MB
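Values like 2MB are parsed as human-readable byte counts (the code below uses ui.configbytes for this). A minimal sketch of that style of parsing, with an invented helper name and only a few suffixes handled:

```python
def parsebytes(value):
    """Parse a human-readable size such as '2MB' into a byte count.

    Only a few binary suffixes are handled in this sketch; Mercurial's
    real parser accepts more forms.
    """
    v = value.strip().lower()
    for suffix, multiplier in (('gb', 1024 ** 3), ('mb', 1024 ** 2),
                               ('kb', 1024), ('b', 1)):
        if v.endswith(suffix):
            return int(float(v[:-len(suffix)]) * multiplier)
    return int(v)
```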
57
57
58 Normally, execution of configured tools will continue after a failure (indicated
58 Normally, execution of configured tools will continue after a failure (indicated
59 by a non-zero exit status). It can also be configured to abort after the first
59 by a non-zero exit status). It can also be configured to abort after the first
60 such failure, so that no files will be affected if any tool fails. This abort
60 such failure, so that no files will be affected if any tool fails. This abort
61 will also cause :hg:`fix` to exit with a non-zero status::
61 will also cause :hg:`fix` to exit with a non-zero status::
62
62
63 [fix]
63 [fix]
64 failure = abort
64 failure = abort
65
65
66 When multiple tools are configured to affect a file, they execute in an order
66 When multiple tools are configured to affect a file, they execute in an order
67 defined by the :priority suboption. The priority suboption has a default value
67 defined by the :priority suboption. The priority suboption has a default value
68 of zero for each tool. Tools are executed in order of descending priority. The
68 of zero for each tool. Tools are executed in order of descending priority. The
69 execution order of tools with equal priority is unspecified. For example, you
69 execution order of tools with equal priority is unspecified. For example, you
70 could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
70 could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
71 in a text file by ensuring that 'sort' runs before 'head'::
71 in a text file by ensuring that 'sort' runs before 'head'::
72
72
73 [fix]
73 [fix]
74 sort:command = sort -n
74 sort:command = sort -n
75 head:command = head -n 10
75 head:command = head -n 10
76 sort:pattern = numbers.txt
76 sort:pattern = numbers.txt
77 head:pattern = numbers.txt
77 head:pattern = numbers.txt
78 sort:priority = 2
78 sort:priority = 2
79 head:priority = 1
79 head:priority = 1
80
80
81 To account for changes made by each tool, the line numbers used for incremental
81 To account for changes made by each tool, the line numbers used for incremental
82 formatting are recomputed before executing the next tool. So, each tool may see
82 formatting are recomputed before executing the next tool. So, each tool may see
83 different values for the arguments added by the :linerange suboption.
83 different values for the arguments added by the :linerange suboption.
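Under those rules, the execution order for the sort/head example can be sketched as follows (hypothetical helper, not code from this extension):

```python
def executionorder(priorities):
    """Order tool names by descending :priority (default 0).

    Order among equal priorities is unspecified by hg fix; sorted() is
    stable, so this sketch happens to keep insertion order for ties.
    """
    return sorted(priorities, key=lambda name: -priorities.get(name, 0))
```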
84
84
85 Each fixer tool is allowed to return some metadata in addition to the fixed file
85 Each fixer tool is allowed to return some metadata in addition to the fixed file
86 content. The metadata must be placed before the file content on stdout,
86 content. The metadata must be placed before the file content on stdout,
87 separated from the file content by a zero byte. The metadata is parsed as a JSON
87 separated from the file content by a zero byte. The metadata is parsed as a JSON
88 value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
88 value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
89 is expected to produce this metadata encoding if and only if the :metadata
89 is expected to produce this metadata encoding if and only if the :metadata
90 suboption is true::
90 suboption is true::
91
91
92 [fix]
92 [fix]
93 tool:command = tool --prepend-json-metadata
93 tool:command = tool --prepend-json-metadata
94 tool:metadata = true
94 tool:metadata = true
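The framing above is easy to get wrong at the byte level. As an illustrative sketch (not part of fix.py; the helper name is invented), a fixer tool with :metadata=true could assemble its stdout payload like this:

```python
import json


def emit_with_metadata(fixed_content, metadata):
    """Build the stdout payload for a fixer configured with :metadata=true.

    The JSON-encoded metadata comes first, then a single zero byte,
    then the fixed file content, per the protocol described above.
    """
    blob = json.dumps(metadata).encode('utf-8')
    if b'\0' in blob:
        raise ValueError('metadata must not contain zero bytes')
    return blob + b'\0' + fixed_content
```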
95
95
96 The metadata values are passed to hooks, which can be used to print summaries or
96 The metadata values are passed to hooks, which can be used to print summaries or
97 perform other post-fixing work. The supported hooks are::
97 perform other post-fixing work. The supported hooks are::
98
98
99 "postfixfile"
99 "postfixfile"
100 Run once for each file in each revision where any fixer tools made changes
100 Run once for each file in each revision where any fixer tools made changes
101 to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
101 to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
102 and "$HG_METADATA" with a map of fixer names to metadata values from fixer
102 and "$HG_METADATA" with a map of fixer names to metadata values from fixer
103 tools that affected the file. Fixer tools that didn't affect the file have a
103 tools that affected the file. Fixer tools that didn't affect the file have a
104 value of None. Only fixer tools that executed are present in the metadata.
104 value of None. Only fixer tools that executed are present in the metadata.
105
105
106 "postfix"
106 "postfix"
107 Run once after all files and revisions have been handled. Provides
107 Run once after all files and revisions have been handled. Provides
108 "$HG_REPLACEMENTS" with information about what revisions were created and
108 "$HG_REPLACEMENTS" with information about what revisions were created and
109 made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
109 made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
110 files in the working copy were updated. Provides a list "$HG_METADATA"
110 files in the working copy were updated. Provides a list "$HG_METADATA"
111 mapping fixer tool names to lists of metadata values returned from
111 mapping fixer tool names to lists of metadata values returned from
112 executions that modified a file. This aggregates the same metadata
112 executions that modified a file. This aggregates the same metadata
113 previously passed to the "postfixfile" hook.
113 previously passed to the "postfixfile" hook.
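To make the relationship between the two hooks concrete, here is a sketch (hypothetical helper; the fix() command below builds its aggregate the same way) of how the per-file metadata maps seen by "postfixfile" roll up into the per-run mapping seen by "postfix":

```python
import collections


def aggregate_metadata(perfile_results):
    """Roll per-file {fixername: metadata} maps (as passed to the
    "postfixfile" hook) into {fixername: [metadata, ...]} (as passed
    to the "postfix" hook)."""
    aggregate = collections.defaultdict(list)
    for filerevmetadata in perfile_results:
        for fixername, fixermetadata in filerevmetadata.items():
            aggregate[fixername].append(fixermetadata)
    return dict(aggregate)
```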
114
114
115 Fixer tools are run in the repository's root directory. This allows them to read
115 Fixer tools are run in the repository's root directory. This allows them to read
116 configuration files from the working copy, or even write to the working copy.
116 configuration files from the working copy, or even write to the working copy.
117 The working copy is not updated to match the revision being fixed. In fact,
117 The working copy is not updated to match the revision being fixed. In fact,
118 several revisions may be fixed in parallel. Writes to the working copy are not
118 several revisions may be fixed in parallel. Writes to the working copy are not
119 amended into the revision being fixed; fixer tools should always write fixed
119 amended into the revision being fixed; fixer tools should always write fixed
120 file content back to stdout as documented above.
120 file content back to stdout as documented above.
121 """
121 """
122
122
123 from __future__ import absolute_import
123 from __future__ import absolute_import
124
124
125 import collections
125 import collections
126 import itertools
126 import itertools
127 import json
127 import json
128 import os
128 import os
129 import re
129 import re
130 import subprocess
130 import subprocess
131
131
132 from mercurial.i18n import _
132 from mercurial.i18n import _
133 from mercurial.node import nullrev
133 from mercurial.node import nullrev
134 from mercurial.node import wdirrev
134 from mercurial.node import wdirrev
135 from mercurial.pycompat import setattr
136
135
137 from mercurial.utils import (
136 from mercurial.utils import procutil
138 procutil,
139 stringutil,
140 )
141
137
142 from mercurial import (
138 from mercurial import (
143 cmdutil,
139 cmdutil,
144 context,
140 context,
145 copies,
141 copies,
146 error,
142 error,
147 mdiff,
143 mdiff,
148 merge,
144 merge,
149 obsolete,
145 obsolete,
150 pycompat,
146 pycompat,
151 registrar,
147 registrar,
152 scmutil,
148 scmutil,
153 util,
149 util,
154 worker,
150 worker,
155 )
151 )
156
152
157 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
153 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
158 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
154 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
159 # be specifying the version(s) of Mercurial they are tested with, or
155 # be specifying the version(s) of Mercurial they are tested with, or
160 # leave the attribute unspecified.
156 # leave the attribute unspecified.
161 testedwith = b'ships-with-hg-core'
157 testedwith = b'ships-with-hg-core'
162
158
163 cmdtable = {}
159 cmdtable = {}
164 command = registrar.command(cmdtable)
160 command = registrar.command(cmdtable)
165
161
166 configtable = {}
162 configtable = {}
167 configitem = registrar.configitem(configtable)
163 configitem = registrar.configitem(configtable)
168
164
169 # Register the suboptions allowed for each configured fixer, and default values.
165 # Register the suboptions allowed for each configured fixer, and default values.
170 FIXER_ATTRS = {
166 FIXER_ATTRS = {
171 b'command': None,
167 b'command': None,
172 b'linerange': None,
168 b'linerange': None,
173 b'pattern': None,
169 b'pattern': None,
174 b'priority': 0,
170 b'priority': 0,
175 b'metadata': b'false',
171 b'metadata': False,
176 b'skipclean': b'true',
172 b'skipclean': True,
177 b'enabled': b'true',
173 b'enabled': True,
178 }
174 }
179
175
180 for key, default in FIXER_ATTRS.items():
176 for key, default in FIXER_ATTRS.items():
181 configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)
177 configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)
182
178
183 # A good default size allows most source code files to be fixed, but avoids
179 # A good default size allows most source code files to be fixed, but avoids
184 # letting fixer tools choke on huge inputs, which could be surprising to the
180 # letting fixer tools choke on huge inputs, which could be surprising to the
185 # user.
181 # user.
186 configitem(b'fix', b'maxfilesize', default=b'2MB')
182 configitem(b'fix', b'maxfilesize', default=b'2MB')
187
183
188 # Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
184 # Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
189 # This helps users write shell scripts that stop when a fixer tool signals a
185 # This helps users write shell scripts that stop when a fixer tool signals a
190 # problem.
186 # problem.
191 configitem(b'fix', b'failure', default=b'continue')
187 configitem(b'fix', b'failure', default=b'continue')
192
188
193
189
194 def checktoolfailureaction(ui, message, hint=None):
190 def checktoolfailureaction(ui, message, hint=None):
195 """Abort with 'message' if fix.failure=abort"""
191 """Abort with 'message' if fix.failure=abort"""
196 action = ui.config(b'fix', b'failure')
192 action = ui.config(b'fix', b'failure')
197 if action not in (b'continue', b'abort'):
193 if action not in (b'continue', b'abort'):
198 raise error.Abort(
194 raise error.Abort(
199 _(b'unknown fix.failure action: %s') % (action,),
195 _(b'unknown fix.failure action: %s') % (action,),
200 hint=_(b'use "continue" or "abort"'),
196 hint=_(b'use "continue" or "abort"'),
201 )
197 )
202 if action == b'abort':
198 if action == b'abort':
203 raise error.Abort(message, hint=hint)
199 raise error.Abort(message, hint=hint)
204
200
205
201
206 allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
202 allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
207 baseopt = (
203 baseopt = (
208 b'',
204 b'',
209 b'base',
205 b'base',
210 [],
206 [],
211 _(
207 _(
212 b'revisions to diff against (overrides automatic '
208 b'revisions to diff against (overrides automatic '
213 b'selection, and applies to every revision being '
209 b'selection, and applies to every revision being '
214 b'fixed)'
210 b'fixed)'
215 ),
211 ),
216 _(b'REV'),
212 _(b'REV'),
217 )
213 )
218 revopt = (b'r', b'rev', [], _(b'revisions to fix'), _(b'REV'))
214 revopt = (b'r', b'rev', [], _(b'revisions to fix'), _(b'REV'))
219 wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
215 wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
220 wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
216 wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
221 usage = _(b'[OPTION]... [FILE]...')
217 usage = _(b'[OPTION]... [FILE]...')
222
218
223
219
224 @command(
220 @command(
225 b'fix',
221 b'fix',
226 [allopt, baseopt, revopt, wdiropt, wholeopt],
222 [allopt, baseopt, revopt, wdiropt, wholeopt],
227 usage,
223 usage,
228 helpcategory=command.CATEGORY_FILE_CONTENTS,
224 helpcategory=command.CATEGORY_FILE_CONTENTS,
229 )
225 )
230 def fix(ui, repo, *pats, **opts):
226 def fix(ui, repo, *pats, **opts):
231 """rewrite file content in changesets or working directory
227 """rewrite file content in changesets or working directory
232
228
233 Runs any configured tools to fix the content of files. Only affects files
229 Runs any configured tools to fix the content of files. Only affects files
234 with changes, unless file arguments are provided. Only affects changed lines
230 with changes, unless file arguments are provided. Only affects changed lines
235 of files, unless the --whole flag is used. Some tools may always affect the
231 of files, unless the --whole flag is used. Some tools may always affect the
236 whole file regardless of --whole.
232 whole file regardless of --whole.
237
233
238 If revisions are specified with --rev, those revisions will be checked, and
234 If revisions are specified with --rev, those revisions will be checked, and
239 they may be replaced with new revisions that have fixed file content. It is
235 they may be replaced with new revisions that have fixed file content. It is
240 desirable to specify all descendants of each specified revision, so that the
236 desirable to specify all descendants of each specified revision, so that the
241 fixes propagate to the descendants. If all descendants are fixed at the same
237 fixes propagate to the descendants. If all descendants are fixed at the same
242 time, no merging, rebasing, or evolution will be required.
238 time, no merging, rebasing, or evolution will be required.
243
239
244 If --working-dir is used, files with uncommitted changes in the working copy
240 If --working-dir is used, files with uncommitted changes in the working copy
245 will be fixed. If the checked-out revision is also fixed, the working
241 will be fixed. If the checked-out revision is also fixed, the working
246 directory will update to the replacement revision.
242 directory will update to the replacement revision.
247
243
248 When determining what lines of each file to fix at each revision, the whole
244 When determining what lines of each file to fix at each revision, the whole
249 set of revisions being fixed is considered, so that fixes to earlier
245 set of revisions being fixed is considered, so that fixes to earlier
250 revisions are not forgotten in later ones. The --base flag can be used to
246 revisions are not forgotten in later ones. The --base flag can be used to
251 override this default behavior, though it is not usually desirable to do so.
247 override this default behavior, though it is not usually desirable to do so.
252 """
248 """
253 opts = pycompat.byteskwargs(opts)
249 opts = pycompat.byteskwargs(opts)
254 if opts[b'all']:
250 if opts[b'all']:
255 if opts[b'rev']:
251 if opts[b'rev']:
256 raise error.Abort(_(b'cannot specify both "--rev" and "--all"'))
252 raise error.Abort(_(b'cannot specify both "--rev" and "--all"'))
257 opts[b'rev'] = [b'not public() and not obsolete()']
253 opts[b'rev'] = [b'not public() and not obsolete()']
258 opts[b'working_dir'] = True
254 opts[b'working_dir'] = True
259 with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
255 with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
260 revstofix = getrevstofix(ui, repo, opts)
256 revstofix = getrevstofix(ui, repo, opts)
261 basectxs = getbasectxs(repo, opts, revstofix)
257 basectxs = getbasectxs(repo, opts, revstofix)
262 workqueue, numitems = getworkqueue(
258 workqueue, numitems = getworkqueue(
263 ui, repo, pats, opts, revstofix, basectxs
259 ui, repo, pats, opts, revstofix, basectxs
264 )
260 )
265 fixers = getfixers(ui)
261 fixers = getfixers(ui)
266
262
267 # There are no data dependencies between the workers fixing each file
263 # There are no data dependencies between the workers fixing each file
268 # revision, so we can use all available parallelism.
264 # revision, so we can use all available parallelism.
269 def getfixes(items):
265 def getfixes(items):
270 for rev, path in items:
266 for rev, path in items:
271 ctx = repo[rev]
267 ctx = repo[rev]
272 olddata = ctx[path].data()
268 olddata = ctx[path].data()
273 metadata, newdata = fixfile(
269 metadata, newdata = fixfile(
274 ui, repo, opts, fixers, ctx, path, basectxs[rev]
270 ui, repo, opts, fixers, ctx, path, basectxs[rev]
275 )
271 )
276 # Don't waste memory/time passing unchanged content back, but
272 # Don't waste memory/time passing unchanged content back, but
277 # produce one result per item either way.
273 # produce one result per item either way.
278 yield (
274 yield (
279 rev,
275 rev,
280 path,
276 path,
281 metadata,
277 metadata,
282 newdata if newdata != olddata else None,
278 newdata if newdata != olddata else None,
283 )
279 )
284
280
285 results = worker.worker(
281 results = worker.worker(
286 ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
282 ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
287 )
283 )
288
284
289 # We have to hold on to the data for each successor revision in memory
285 # We have to hold on to the data for each successor revision in memory
290 # until all its parents are committed. We ensure this by committing and
286 # until all its parents are committed. We ensure this by committing and
291 # freeing memory for the revisions in some topological order. This
287 # freeing memory for the revisions in some topological order. This
292 # leaves a little bit of memory efficiency on the table, but also makes
288 # leaves a little bit of memory efficiency on the table, but also makes
293 # the tests deterministic. It might also be considered a feature since
289 # the tests deterministic. It might also be considered a feature since
294 # it makes the results more easily reproducible.
290 # it makes the results more easily reproducible.
295 filedata = collections.defaultdict(dict)
291 filedata = collections.defaultdict(dict)
296 aggregatemetadata = collections.defaultdict(list)
292 aggregatemetadata = collections.defaultdict(list)
297 replacements = {}
293 replacements = {}
298 wdirwritten = False
294 wdirwritten = False
299 commitorder = sorted(revstofix, reverse=True)
295 commitorder = sorted(revstofix, reverse=True)
300 with ui.makeprogress(
296 with ui.makeprogress(
301 topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
297 topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
302 ) as progress:
298 ) as progress:
303 for rev, path, filerevmetadata, newdata in results:
299 for rev, path, filerevmetadata, newdata in results:
304 progress.increment(item=path)
300 progress.increment(item=path)
305 for fixername, fixermetadata in filerevmetadata.items():
301 for fixername, fixermetadata in filerevmetadata.items():
306 aggregatemetadata[fixername].append(fixermetadata)
302 aggregatemetadata[fixername].append(fixermetadata)
307 if newdata is not None:
303 if newdata is not None:
308 filedata[rev][path] = newdata
304 filedata[rev][path] = newdata
309 hookargs = {
305 hookargs = {
310 b'rev': rev,
306 b'rev': rev,
311 b'path': path,
307 b'path': path,
312 b'metadata': filerevmetadata,
308 b'metadata': filerevmetadata,
313 }
309 }
314 repo.hook(
310 repo.hook(
315 b'postfixfile',
311 b'postfixfile',
316 throw=False,
312 throw=False,
317 **pycompat.strkwargs(hookargs)
313 **pycompat.strkwargs(hookargs)
318 )
314 )
319 numitems[rev] -= 1
315 numitems[rev] -= 1
320 # Apply the fixes for this and any other revisions that are
316 # Apply the fixes for this and any other revisions that are
321 # ready and sitting at the front of the queue. Using a loop here
317 # ready and sitting at the front of the queue. Using a loop here
322 # prevents the queue from being blocked by the first revision to
318 # prevents the queue from being blocked by the first revision to
323 # be ready out of order.
319 # be ready out of order.
324 while commitorder and not numitems[commitorder[-1]]:
320 while commitorder and not numitems[commitorder[-1]]:
325 rev = commitorder.pop()
321 rev = commitorder.pop()
326 ctx = repo[rev]
322 ctx = repo[rev]
327 if rev == wdirrev:
323 if rev == wdirrev:
328 writeworkingdir(repo, ctx, filedata[rev], replacements)
324 writeworkingdir(repo, ctx, filedata[rev], replacements)
329 wdirwritten = bool(filedata[rev])
325 wdirwritten = bool(filedata[rev])
330 else:
326 else:
331 replacerev(ui, repo, ctx, filedata[rev], replacements)
327 replacerev(ui, repo, ctx, filedata[rev], replacements)
332 del filedata[rev]
328 del filedata[rev]
333
329
334 cleanup(repo, replacements, wdirwritten)
330 cleanup(repo, replacements, wdirwritten)
335 hookargs = {
331 hookargs = {
336 b'replacements': replacements,
332 b'replacements': replacements,
337 b'wdirwritten': wdirwritten,
333 b'wdirwritten': wdirwritten,
338 b'metadata': aggregatemetadata,
334 b'metadata': aggregatemetadata,
339 }
335 }
340 repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))
336 repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))
341
337
342
338
343 def cleanup(repo, replacements, wdirwritten):
339 def cleanup(repo, replacements, wdirwritten):
344 """Calls scmutil.cleanupnodes() with the given replacements.
340 """Calls scmutil.cleanupnodes() with the given replacements.
345
341
346 "replacements" is a dict from nodeid to nodeid, with one key and one value
342 "replacements" is a dict from nodeid to nodeid, with one key and one value
347 for every revision that was affected by fixing. This is slightly different
343 for every revision that was affected by fixing. This is slightly different
348 from cleanupnodes().
344 from cleanupnodes().
349
345
350 "wdirwritten" is a bool which tells whether the working copy was affected by
346 "wdirwritten" is a bool which tells whether the working copy was affected by
351 fixing, since it has no entry in "replacements".
347 fixing, since it has no entry in "replacements".
352
348
353 Useful as a hook point for extending "hg fix" with output summarizing the
349 Useful as a hook point for extending "hg fix" with output summarizing the
354 effects of the command, though we choose not to output anything here.
350 effects of the command, though we choose not to output anything here.
355 """
351 """
356 replacements = {
352 replacements = {
357 prec: [succ] for prec, succ in pycompat.iteritems(replacements)
353 prec: [succ] for prec, succ in pycompat.iteritems(replacements)
358 }
354 }
359 scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)
355 scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)
360
356
361
357
362 def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
358 def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
363 """Constructs the list of files to be fixed at specific revisions
359 """Constructs the list of files to be fixed at specific revisions
364
360
365 It is up to the caller how to consume the work items, and the only
361 It is up to the caller how to consume the work items, and the only
366 dependence between them is that replacement revisions must be committed in
362 dependence between them is that replacement revisions must be committed in
367 topological order. Each work item represents a file in the working copy or
363 topological order. Each work item represents a file in the working copy or
368 in some revision that should be fixed and written back to the working copy
364 in some revision that should be fixed and written back to the working copy
369 or into a replacement revision.
365 or into a replacement revision.
370
366
371 Work items for the same revision are grouped together, so that a worker
367 Work items for the same revision are grouped together, so that a worker
372 pool starting with the first N items in parallel is likely to finish the
368 pool starting with the first N items in parallel is likely to finish the
    first revision's work before other revisions. This can allow us to write
    the result to disk and reduce memory footprint. At time of writing, the
    partition strategy in worker.py seems favorable to this. We also sort the
    items by ascending revision number to match the order in which we commit
    the fixes later.
    """
    workqueue = []
    numitems = collections.defaultdict(int)
    maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
    for rev in sorted(revstofix):
        fixctx = repo[rev]
        match = scmutil.match(fixctx, pats, opts)
        for path in sorted(
            pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
        ):
            fctx = fixctx[path]
            if fctx.islink():
                continue
            if fctx.size() > maxfilesize:
                ui.warn(
                    _(b'ignoring file larger than %s: %s\n')
                    % (util.bytecount(maxfilesize), path)
                )
                continue
            workqueue.append((rev, path))
            numitems[rev] += 1
    return workqueue, numitems


def getrevstofix(ui, repo, opts):
    """Returns the set of revision numbers that should be fixed"""
    revs = set(scmutil.revrange(repo, opts[b'rev']))
    for rev in revs:
        checkfixablectx(ui, repo, repo[rev])
    if revs:
        cmdutil.checkunfinished(repo)
        checknodescendants(repo, revs)
    if opts.get(b'working_dir'):
        revs.add(wdirrev)
        if list(merge.mergestate.read(repo).unresolved()):
            raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
    if not revs:
        raise error.Abort(
            b'no changesets specified', hint=b'use --rev or --working-dir'
        )
    return revs


def checknodescendants(repo, revs):
    if not obsolete.isenabled(repo, obsolete.allowunstableopt) and repo.revs(
        b'(%ld::) - (%ld)', revs, revs
    ):
        raise error.Abort(
            _(b'can only fix a changeset together with all its descendants')
        )


def checkfixablectx(ui, repo, ctx):
    """Aborts if the revision shouldn't be replaced with a fixed one."""
    if not ctx.mutable():
        raise error.Abort(
            b'can\'t fix immutable changeset %s'
            % (scmutil.formatchangeid(ctx),)
        )
    if ctx.obsolete():
        # It would be better to actually check if the revision has a successor.
        allowdivergence = ui.configbool(
            b'experimental', b'evolution.allowdivergence'
        )
        if not allowdivergence:
            raise error.Abort(
                b'fixing obsolete revision could cause divergence'
            )


def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
    """Returns the set of files that should be fixed in a context

    The result depends on the base contexts; we include any file that has
    changed relative to any of the base contexts. Base contexts should be
    ancestors of the context being fixed.
    """
    files = set()
    for basectx in basectxs:
        stat = basectx.status(
            fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
        )
        files.update(
            set(
                itertools.chain(
                    stat.added, stat.modified, stat.clean, stat.unknown
                )
            )
        )
    return files


def lineranges(opts, path, basectxs, fixctx, content2):
    """Returns the set of line ranges that should be fixed in a file

    Of the form [(10, 20), (30, 40)].

    This depends on the given base contexts; we must consider lines that have
    changed versus any of the base contexts, and whether the file has been
    renamed versus any of them.

    Another way to understand this is that we exclude line ranges that are
    common to the file in all base contexts.
    """
    if opts.get(b'whole'):
        # Return a range containing all lines. Rely on the diff implementation's
        # idea of how many lines are in the file, instead of reimplementing it.
        return difflineranges(b'', content2)

    rangeslist = []
    for basectx in basectxs:
        basepath = copies.pathcopies(basectx, fixctx).get(path, path)
        if basepath in basectx:
            content1 = basectx[basepath].data()
        else:
            content1 = b''
        rangeslist.extend(difflineranges(content1, content2))
    return unionranges(rangeslist)


def unionranges(rangeslist):
    """Return the union of some closed intervals

    >>> unionranges([])
    []
    >>> unionranges([(1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (2, 100)])
    [(1, 100)]
    >>> unionranges([(1, 99), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (40, 60)])
    [(1, 100)]
    >>> unionranges([(1, 49), (50, 100)])
    [(1, 100)]
    >>> unionranges([(1, 48), (50, 100)])
    [(1, 48), (50, 100)]
    >>> unionranges([(1, 2), (3, 4), (5, 6)])
    [(1, 6)]
    """
    rangeslist = sorted(set(rangeslist))
    unioned = []
    if rangeslist:
        unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
    for a, b in rangeslist:
        c, d = unioned[-1]
        if a > d + 1:
            unioned.append((a, b))
        else:
            unioned[-1] = (c, max(b, d))
    return unioned


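As a standalone illustration of the interval-union step performed by `unionranges` above, here is a minimal pure-Python sketch of the same merge logic, free of Mercurial imports; the function name is illustrative, not part of the extension:

```python
def union_closed_ranges(ranges):
    """Merge closed integer intervals, joining ranges that touch or overlap.

    Mirrors the merge logic of unionranges(): sort the unique intervals,
    then fold each into the running result, merging whenever the next
    interval starts at or before one past the current end.
    """
    merged = []
    for start, end in sorted(set(ranges)):
        if merged and start <= merged[-1][1] + 1:
            # Adjacent or overlapping: extend the previous interval.
            prev_start, prev_end = merged[-1]
            merged[-1] = (prev_start, max(prev_end, end))
        else:
            merged.append((start, end))
    return merged
```

Note that adjacency counts as overlap here: `(1, 49)` and `(50, 100)` merge into `(1, 100)`, because together they cover lines 1 through 100 with no gap.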
def difflineranges(content1, content2):
    """Return list of line number ranges in content2 that differ from content1.

    Line numbers are 1-based. The numbers are the first and last line contained
    in the range. Single-line ranges have the same line number for the first and
    last line. Excludes any empty ranges that result from lines that are only
    present in content1. Relies on mdiff's idea of where the line endings are in
    the string.

    >>> from mercurial import pycompat
    >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
    >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
    >>> difflineranges2(b'', b'')
    []
    >>> difflineranges2(b'a', b'')
    []
    >>> difflineranges2(b'', b'A')
    [(1, 1)]
    >>> difflineranges2(b'a', b'a')
    []
    >>> difflineranges2(b'a', b'A')
    [(1, 1)]
    >>> difflineranges2(b'ab', b'')
    []
    >>> difflineranges2(b'', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abc', b'ac')
    []
    >>> difflineranges2(b'ab', b'aCb')
    [(2, 2)]
    >>> difflineranges2(b'abc', b'aBc')
    [(2, 2)]
    >>> difflineranges2(b'ab', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abcde', b'aBcDe')
    [(2, 2), (4, 4)]
    >>> difflineranges2(b'abcde', b'aBCDe')
    [(2, 4)]
    """
    ranges = []
    for lines, kind in mdiff.allblocks(content1, content2):
        firstline, lastline = lines[2:4]
        if kind == b'!' and firstline != lastline:
            ranges.append((firstline + 1, lastline))
    return ranges


def getbasectxs(repo, opts, revstofix):
    """Returns a map of the base contexts for each revision

    The base contexts determine which lines are considered modified when we
    attempt to fix just the modified lines in a file. It also determines which
    files we attempt to fix, so it is important to compute this even when
    --whole is used.
    """
    # The --base flag overrides the usual logic, and we give every revision
    # exactly the set of baserevs that the user specified.
    if opts.get(b'base'):
        baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
        if not baserevs:
            baserevs = {nullrev}
        basectxs = {repo[rev] for rev in baserevs}
        return {rev: basectxs for rev in revstofix}

    # Proceed in topological order so that we can easily determine each
    # revision's baserevs by looking at its parents and their baserevs.
    basectxs = collections.defaultdict(set)
    for rev in sorted(revstofix):
        ctx = repo[rev]
        for pctx in ctx.parents():
            if pctx.rev() in basectxs:
                basectxs[rev].update(basectxs[pctx.rev()])
            else:
                basectxs[rev].add(pctx)
    return basectxs


def fixfile(ui, repo, opts, fixers, fixctx, path, basectxs):
    """Run any configured fixers that should affect the file in this context

    Returns the file content that results from applying the fixers in some order
    starting with the file's content in the fixctx. Fixers that support line
    ranges will affect lines that have changed relative to any of the basectxs
    (i.e. they will only avoid lines that are common to all basectxs).

    A fixer tool's stdout will become the file's new content if and only if it
    exits with code zero. The fixer tool's working directory is the repository's
    root.
    """
    metadata = {}
    newdata = fixctx[path].data()
    for fixername, fixer in pycompat.iteritems(fixers):
        if fixer.affects(opts, fixctx, path):
            ranges = lineranges(opts, path, basectxs, fixctx, newdata)
            command = fixer.command(ui, path, ranges)
            if command is None:
                continue
            ui.debug(b'subprocess: %s\n' % (command,))
            proc = subprocess.Popen(
                procutil.tonativestr(command),
                shell=True,
                cwd=procutil.tonativestr(repo.root),
                stdin=subprocess.PIPE,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
            )
            stdout, stderr = proc.communicate(newdata)
            if stderr:
                showstderr(ui, fixctx.rev(), fixername, stderr)
            newerdata = stdout
            if fixer.shouldoutputmetadata():
                try:
                    metadatajson, newerdata = stdout.split(b'\0', 1)
                    metadata[fixername] = json.loads(metadatajson)
                except ValueError:
                    ui.warn(
                        _(b'ignored invalid output from fixer tool: %s\n')
                        % (fixername,)
                    )
                    continue
            else:
                metadata[fixername] = None
            if proc.returncode == 0:
                newdata = newerdata
            else:
                if not stderr:
                    message = _(b'exited with status %d\n') % (proc.returncode,)
                    showstderr(ui, fixctx.rev(), fixername, message)
                checktoolfailureaction(
                    ui,
                    _(b'no fixes will be applied'),
                    hint=_(
                        b'use --config fix.failure=continue to apply any '
                        b'successful fixes anyway'
                    ),
                )
    return metadata, newdata


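The contract that `fixfile` enforces — the tool reads the file on stdin and its stdout is adopted only on a zero exit status — can be sketched outside Mercurial like this; `run_fixer` and the inline uppercasing "fixer" are illustrative stand-ins, not part of the extension's API:

```python
import subprocess
import sys


def run_fixer(command_argv, data):
    """Feed content to a fixer tool on stdin; adopt stdout only on exit 0.

    A simplified model of the subprocess handling in fixfile() above:
    if the tool fails, the original content is kept unchanged.
    """
    proc = subprocess.Popen(
        command_argv,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    stdout, _stderr = proc.communicate(data)
    return stdout if proc.returncode == 0 else data


# A tiny stand-in "fixer": uppercase whatever arrives on stdin.
fixer = [
    sys.executable,
    '-c',
    'import sys; sys.stdout.write(sys.stdin.read().upper())',
]
```

A failing tool (nonzero exit) leaves the input untouched, which is the same guarantee `fixfile` gives before `checktoolfailureaction` decides whether to abort or continue.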
def showstderr(ui, rev, fixername, stderr):
    """Writes the lines of the stderr string as warnings on the ui

    Uses the revision number and fixername to give more context to each line of
    the error message. Doesn't include file names, since those take up a lot of
    space and would tend to be included in the error message if they were
    relevant.
    """
    for line in re.split(b'[\r\n]+', stderr):
        if line:
            ui.warn(b'[')
            if rev is None:
                ui.warn(_(b'wdir'), label=b'evolve.rev')
            else:
                ui.warn((str(rev)), label=b'evolve.rev')
            ui.warn(b'] %s: %s\n' % (fixername, line))


def writeworkingdir(repo, ctx, filedata, replacements):
    """Write new content to the working copy and check out the new p1 if any

    We check out a new revision if and only if we fixed something in both the
    working directory and its parent revision. This avoids the need for a full
    update/merge, and means that the working directory simply isn't affected
    unless the --working-dir flag is given.

    Directly updates the dirstate for the affected files.
    """
    for path, data in pycompat.iteritems(filedata):
        fctx = ctx[path]
        fctx.write(data, fctx.flags())
        if repo.dirstate[path] == b'n':
            repo.dirstate.normallookup(path)

    oldparentnodes = repo.dirstate.parents()
    newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
    if newparentnodes != oldparentnodes:
        repo.setparents(*newparentnodes)


def replacerev(ui, repo, ctx, filedata, replacements):
    """Commit a new revision like the given one, but with file content changes

    "ctx" is the original revision to be replaced by a modified one.

    "filedata" is a dict that maps paths to their new file content. All other
    paths will be recreated from the original revision without changes.
    "filedata" may contain paths that didn't exist in the original revision;
    they will be added.

    "replacements" is a dict that maps a single node to a single node, and it is
    updated to indicate the original revision is replaced by the newly created
    one. No entry is added if the replacement's node already exists.

    The new revision has the same parents as the old one, unless those parents
    have already been replaced, in which case those replacements are the parents
    of this new revision. Thus, if revisions are replaced in topological order,
    there is no need to rebase them into the original topology later.
    """

    p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
    p1ctx, p2ctx = repo[p1rev], repo[p2rev]
    newp1node = replacements.get(p1ctx.node(), p1ctx.node())
    newp2node = replacements.get(p2ctx.node(), p2ctx.node())

    # We don't want to create a revision that has no changes from the original,
    # but we should if the original revision's parent has been replaced.
    # Otherwise, we would produce an orphan that needs no actual human
    # intervention to evolve. We can't rely on commit() to avoid creating the
    # un-needed revision because the extra field added below produces a new hash
    # regardless of file content changes.
    if (
        not filedata
        and p1ctx.node() not in replacements
        and p2ctx.node() not in replacements
    ):
        return

    def filectxfn(repo, memctx, path):
        if path not in ctx:
            return None
        fctx = ctx[path]
        copysource = fctx.copysource()
        return context.memfilectx(
            repo,
            memctx,
            path=fctx.path(),
            data=filedata.get(path, fctx.data()),
            islink=fctx.islink(),
            isexec=fctx.isexec(),
            copysource=copysource,
        )

    extra = ctx.extra().copy()
    extra[b'fix_source'] = ctx.hex()

    memctx = context.memctx(
        repo,
        parents=(newp1node, newp2node),
        text=ctx.description(),
        files=set(ctx.files()) | set(filedata.keys()),
        filectxfn=filectxfn,
        user=ctx.user(),
        date=ctx.date(),
        extra=extra,
        branch=ctx.branch(),
        editor=None,
    )
    sucnode = memctx.commit()
    prenode = ctx.node()
    if prenode == sucnode:
        ui.debug(b'node %s already existed\n' % (ctx.hex()))
    else:
        replacements[ctx.node()] = sucnode


def getfixers(ui):
    """Returns a map of configured fixer tools indexed by their names

    Each value is a Fixer object with methods that implement the behavior of the
    fixer's config suboptions. Does not validate the config values.
    """
    fixers = {}
    for name in fixernames(ui):
        enabled = ui.configbool(b'fix', name + b':enabled')
        command = ui.config(b'fix', name + b':command')
        pattern = ui.config(b'fix', name + b':pattern')
        linerange = ui.config(b'fix', name + b':linerange')
        priority = ui.configint(b'fix', name + b':priority')
        metadata = ui.configbool(b'fix', name + b':metadata')
        skipclean = ui.configbool(b'fix', name + b':skipclean')
        # Don't use a fixer if it has no pattern configured. It would be
        # dangerous to let it affect all files. It would be pointless to let it
        # affect no files. There is no reasonable subset of files to use as the
        # default.
        if pattern is None:
            ui.warn(
                _(b'fixer tool has no pattern configuration: %s\n') % (name,)
            )
        elif not enabled:
            ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
        else:
            fixers[name] = Fixer(
                command, pattern, linerange, priority, metadata, skipclean
            )
    return collections.OrderedDict(
        sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
    )


def fixernames(ui):
    """Returns the names of [fix] config options that have suboptions"""
    names = set()
    for k, v in ui.configitems(b'fix'):
        if b':' in k:
            names.add(k.split(b':', 1)[0])
    return names


class Fixer(object):
    """Wraps the raw config values for a fixer with methods"""

    def __init__(
        self, command, pattern, linerange, priority, metadata, skipclean
    ):
        self._command = command
        self._pattern = pattern
        self._linerange = linerange
        self._priority = priority
        self._metadata = metadata
        self._skipclean = skipclean

    def affects(self, opts, fixctx, path):
        """Should this fixer run on the file at the given path and context?"""
        return self._pattern is not None and scmutil.match(
            fixctx, [self._pattern], opts
        )(path)

    def shouldoutputmetadata(self):
        """Should the stdout of this fixer start with JSON and a null byte?"""
        return self._metadata

    def command(self, ui, path, ranges):
        """A shell command to use to invoke this fixer on the given file/lines

        May return None if there is no appropriate command to run for the given
        parameters.
        """
        expand = cmdutil.rendercommandtemplate
        parts = [
            expand(
                ui,
                self._command,
                {b'rootpath': path, b'basename': os.path.basename(path)},
            )
        ]
        if self._linerange:
            if self._skipclean and not ranges:
                # No line ranges to fix, so don't run the fixer.
                return None
            for first, last in ranges:
                parts.append(
                    expand(
                        ui, self._linerange, {b'first': first, b'last': last}
                    )
                )
        return b' '.join(parts)
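`Fixer.command` above renders the `:command` template once and appends one rendered `:linerange` fragment per changed range. Mercurial does this with its own template engine (`cmdutil.rendercommandtemplate`); the sketch below models only the placeholder substitution with naive string replacement, and `render_command` is an illustrative name, not the extension's API:

```python
def render_command(command_tmpl, linerange_tmpl, path, ranges):
    """Expand a fixer's :command and :linerange templates (simplified).

    Substitutes {rootpath} in the command template, then appends one
    linerange fragment per (first, last) pair, as Fixer.command() does.
    """
    parts = [command_tmpl.replace('{rootpath}', path)]
    for first, last in ranges:
        parts.append(
            linerange_tmpl.replace('{first}', str(first)).replace(
                '{last}', str(last)
            )
        )
    return ' '.join(parts)
```

With the clang-format configuration from the module docstring, two changed ranges in `src/main.cpp` would yield a single shell command carrying two `--lines=` arguments.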