fix: make Fixer initialization more explicit for clarity...
Martin von Zweigbergk
r43490:2e3aa8ed default draft
@@ -1,870 +1,876 @@
 # fix - rewrite file content in changesets and working copy
 #
 # Copyright 2018 Google LLC.
 #
 # This software may be used and distributed according to the terms of the
 # GNU General Public License version 2 or any later version.
 """rewrite file content in changesets or working copy (EXPERIMENTAL)
 
 Provides a command that runs configured tools on the contents of modified files,
 writing back any fixes to the working copy or replacing changesets.
 
 Here is an example configuration that causes :hg:`fix` to apply automatic
 formatting fixes to modified lines in C++ code::
 
   [fix]
   clang-format:command=clang-format --assume-filename={rootpath}
   clang-format:linerange=--lines={first}:{last}
   clang-format:pattern=set:**.cpp or **.hpp
 
 The :command suboption forms the first part of the shell command that will be
 used to fix a file. The content of the file is passed on standard input, and the
 fixed file content is expected on standard output. Any output on standard error
 will be displayed as a warning. If the exit status is not zero, the file will
 not be affected. A placeholder warning is displayed if there is a non-zero exit
 status but no standard error output. Some values may be substituted into the
 command::
 
   {rootpath} The path of the file being fixed, relative to the repo root
   {basename} The name of the file being fixed, without the directory path
 
 If the :linerange suboption is set, the tool will only be run if there are
 changed lines in a file. The value of this suboption is appended to the shell
 command once for every range of changed lines in the file. Some values may be
 substituted into the command::
 
   {first} The 1-based line number of the first line in the modified range
   {last} The 1-based line number of the last line in the modified range
 
 Deleted sections of a file will be ignored by :linerange, because there is no
 corresponding line range in the version being fixed.
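The :linerange behavior described above (one range per run of changed lines, with pure deletions producing no range) can be sketched with Python's standard `difflib`. This is an illustrative approximation, not the extension's own implementation:

```python
import difflib

def changedranges(oldlines, newlines):
    """Return 1-based (first, last) ranges of lines in `newlines` that
    differ from `oldlines`. Pure deletions yield no range, mirroring
    how :linerange ignores deleted sections."""
    matcher = difflib.SequenceMatcher(None, oldlines, newlines)
    ranges = []
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        # 'replace' and 'insert' have lines on the new side; 'delete' does not.
        if op in ("replace", "insert") and j2 > j1:
            ranges.append((j1 + 1, j2))
    return ranges
```

Each resulting (first, last) pair would be substituted into a template like `--lines={first}:{last}` and appended to the tool's command, once per range.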
 
 By default, tools that set :linerange will only be executed if there is at least
 one changed line range. This is meant to prevent accidents like running a code
 formatter in such a way that it unexpectedly reformats the whole file. If such a
 tool needs to operate on unchanged files, it should set the :skipclean suboption
 to false.
 
 The :pattern suboption determines which files will be passed through each
 configured tool. See :hg:`help patterns` for possible values. If there are file
 arguments to :hg:`fix`, the intersection of these patterns is used.
 
 There is also a configurable limit for the maximum size of file that will be
 processed by :hg:`fix`::
 
   [fix]
   maxfilesize = 2MB
 
 Normally, execution of configured tools will continue after a failure (indicated
 by a non-zero exit status). It can also be configured to abort after the first
 such failure, so that no files will be affected if any tool fails. This abort
 will also cause :hg:`fix` to exit with a non-zero status::
 
   [fix]
   failure = abort
 
 When multiple tools are configured to affect a file, they execute in an order
 defined by the :priority suboption. The priority suboption has a default value
 of zero for each tool. Tools are executed in order of descending priority. The
 execution order of tools with equal priority is unspecified. For example, you
 could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
 in a text file by ensuring that 'sort' runs before 'head'::
 
   [fix]
   sort:command = sort -n
   head:command = head -n 10
   sort:pattern = numbers.txt
   head:pattern = numbers.txt
   sort:priority = 2
   head:priority = 1
 
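The descending-priority ordering in the example above can be sketched in a few lines (hypothetical tool names; this is not the extension's internal code):

```python
# Hypothetical fixer names mapped to their :priority values (default 0).
fixers = {"sort": 2, "head": 1, "clang-format": 0}

# Tools run in descending priority; the order of equal-priority tools is
# unspecified, so a sort like this is only one possible ordering.
runorder = sorted(fixers, key=lambda name: fixers[name], reverse=True)
```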
 To account for changes made by each tool, the line numbers used for incremental
 formatting are recomputed before executing the next tool. So, each tool may see
 different values for the arguments added by the :linerange suboption.
 
 Each fixer tool is allowed to return some metadata in addition to the fixed file
 content. The metadata must be placed before the file content on stdout,
 separated from the file content by a zero byte. The metadata is parsed as a JSON
 value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
 is expected to produce this metadata encoding if and only if the :metadata
 suboption is true::
 
   [fix]
   tool:command = tool --prepend-json-metadata
   tool:metadata = true
 
 The metadata values are passed to hooks, which can be used to print summaries or
 perform other post-fixing work. The supported hooks are::
 
   "postfixfile"
     Run once for each file in each revision where any fixer tools made changes
     to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
     and "$HG_METADATA" with a map of fixer names to metadata values from fixer
     tools that affected the file. Fixer tools that didn't affect the file have a
     value of None. Only fixer tools that executed are present in the metadata.
 
   "postfix"
     Run once after all files and revisions have been handled. Provides
     "$HG_REPLACEMENTS" with information about what revisions were created and
     made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
     files in the working copy were updated. Provides a list "$HG_METADATA"
     mapping fixer tool names to lists of metadata values returned from
     executions that modified a file. This aggregates the same metadata
     previously passed to the "postfixfile" hook.
 
 Fixer tools are run in the repository's root directory. This allows them to read
 configuration files from the working copy, or even write to the working copy.
 The working copy is not updated to match the revision being fixed. In fact,
 several revisions may be fixed in parallel. Writes to the working copy are not
 amended into the revision being fixed; fixer tools should always write fixed
 file content back to stdout as documented above.
 """
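The stdin/stdout and :metadata protocol described in the docstring can be illustrated with a small stand-alone fixer tool. This is a sketch of the documented protocol only; the tab-expansion "fix" and the `tabs_replaced` metadata key are invented for the example:

```python
import json
import sys

def fixcontent(data):
    """Example 'fix': expand tabs to four spaces, reporting a count."""
    metadata = {"tabs_replaced": data.count(b"\t")}
    return metadata, data.replace(b"\t", b"    ")

def main():
    # The file content arrives on standard input.
    data = sys.stdin.buffer.read()
    metadata, fixed = fixcontent(data)
    out = sys.stdout.buffer
    # With tool:metadata=true, emit UTF-8 JSON, then a zero byte, then
    # the fixed file content, as the protocol above requires.
    out.write(json.dumps(metadata).encode("utf-8"))
    out.write(b"\0")
    out.write(fixed)
```

Such a script would be wired up with a `tool:command` pointing at it and `tool:metadata = true`; without that suboption, the tool must write only the fixed content.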
 
 from __future__ import absolute_import
 
 import collections
 import itertools
 import json
 import os
 import re
 import subprocess
 
 from mercurial.i18n import _
 from mercurial.node import nullrev
 from mercurial.node import wdirrev
-from mercurial.pycompat import setattr
 
 from mercurial.utils import (
     procutil,
-    stringutil,
 )
 
 from mercurial import (
     cmdutil,
     context,
     copies,
     error,
     mdiff,
     merge,
     obsolete,
     pycompat,
     registrar,
     scmutil,
     util,
     worker,
 )
 
 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
 # be specifying the version(s) of Mercurial they are tested with, or
 # leave the attribute unspecified.
 testedwith = b'ships-with-hg-core'
 
 cmdtable = {}
 command = registrar.command(cmdtable)
 
 configtable = {}
 configitem = registrar.configitem(configtable)
 
 # Register the suboptions allowed for each configured fixer, and default values.
 FIXER_ATTRS = {
     b'command': None,
     b'linerange': None,
     b'pattern': None,
     b'priority': 0,
-    b'metadata': b'false',
-    b'skipclean': b'true',
-    b'enabled': b'true',
+    b'metadata': False,
+    b'skipclean': True,
+    b'enabled': True,
 }
 
 for key, default in FIXER_ATTRS.items():
     configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)
 
 # A good default size allows most source code files to be fixed, but avoids
 # letting fixer tools choke on huge inputs, which could be surprising to the
 # user.
 configitem(b'fix', b'maxfilesize', default=b'2MB')
 
 # Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
 # This helps users do shell scripts that stop when a fixer tool signals a
 # problem.
 configitem(b'fix', b'failure', default=b'continue')
 
 
 def checktoolfailureaction(ui, message, hint=None):
     """Abort with 'message' if fix.failure=abort"""
     action = ui.config(b'fix', b'failure')
     if action not in (b'continue', b'abort'):
         raise error.Abort(
             _(b'unknown fix.failure action: %s') % (action,),
             hint=_(b'use "continue" or "abort"'),
         )
     if action == b'abort':
         raise error.Abort(message, hint=hint)
 
 
 allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
 baseopt = (
     b'',
     b'base',
     [],
     _(
         b'revisions to diff against (overrides automatic '
         b'selection, and applies to every revision being '
         b'fixed)'
     ),
     _(b'REV'),
 )
 revopt = (b'r', b'rev', [], _(b'revisions to fix'), _(b'REV'))
 wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
 wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
 usage = _(b'[OPTION]... [FILE]...')
 
 
 @command(
     b'fix',
     [allopt, baseopt, revopt, wdiropt, wholeopt],
     usage,
     helpcategory=command.CATEGORY_FILE_CONTENTS,
 )
 def fix(ui, repo, *pats, **opts):
     """rewrite file content in changesets or working directory
 
     Runs any configured tools to fix the content of files. Only affects files
     with changes, unless file arguments are provided. Only affects changed lines
     of files, unless the --whole flag is used. Some tools may always affect the
     whole file regardless of --whole.
 
     If revisions are specified with --rev, those revisions will be checked, and
     they may be replaced with new revisions that have fixed file content. It is
     desirable to specify all descendants of each specified revision, so that the
     fixes propagate to the descendants. If all descendants are fixed at the same
     time, no merging, rebasing, or evolution will be required.
 
     If --working-dir is used, files with uncommitted changes in the working copy
     will be fixed. If the checked-out revision is also fixed, the working
     directory will update to the replacement revision.
 
     When determining what lines of each file to fix at each revision, the whole
     set of revisions being fixed is considered, so that fixes to earlier
     revisions are not forgotten in later ones. The --base flag can be used to
     override this default behavior, though it is not usually desirable to do so.
     """
     opts = pycompat.byteskwargs(opts)
     if opts[b'all']:
         if opts[b'rev']:
             raise error.Abort(_(b'cannot specify both "--rev" and "--all"'))
         opts[b'rev'] = [b'not public() and not obsolete()']
         opts[b'working_dir'] = True
     with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
         revstofix = getrevstofix(ui, repo, opts)
         basectxs = getbasectxs(repo, opts, revstofix)
         workqueue, numitems = getworkqueue(
             ui, repo, pats, opts, revstofix, basectxs
         )
         fixers = getfixers(ui)
 
         # There are no data dependencies between the workers fixing each file
         # revision, so we can use all available parallelism.
         def getfixes(items):
             for rev, path in items:
                 ctx = repo[rev]
                 olddata = ctx[path].data()
                 metadata, newdata = fixfile(
                     ui, repo, opts, fixers, ctx, path, basectxs[rev]
                 )
                 # Don't waste memory/time passing unchanged content back, but
                 # produce one result per item either way.
                 yield (
                     rev,
                     path,
                     metadata,
                     newdata if newdata != olddata else None,
                 )
 
         results = worker.worker(
             ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
         )
 
         # We have to hold on to the data for each successor revision in memory
         # until all its parents are committed. We ensure this by committing and
         # freeing memory for the revisions in some topological order. This
         # leaves a little bit of memory efficiency on the table, but also makes
         # the tests deterministic. It might also be considered a feature since
         # it makes the results more easily reproducible.
         filedata = collections.defaultdict(dict)
         aggregatemetadata = collections.defaultdict(list)
         replacements = {}
         wdirwritten = False
         commitorder = sorted(revstofix, reverse=True)
         with ui.makeprogress(
             topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
         ) as progress:
             for rev, path, filerevmetadata, newdata in results:
                 progress.increment(item=path)
                 for fixername, fixermetadata in filerevmetadata.items():
                     aggregatemetadata[fixername].append(fixermetadata)
                 if newdata is not None:
                     filedata[rev][path] = newdata
                     hookargs = {
                         b'rev': rev,
                         b'path': path,
                         b'metadata': filerevmetadata,
                     }
                     repo.hook(
                         b'postfixfile',
                         throw=False,
                         **pycompat.strkwargs(hookargs)
                     )
                 numitems[rev] -= 1
                 # Apply the fixes for this and any other revisions that are
                 # ready and sitting at the front of the queue. Using a loop here
                 # prevents the queue from being blocked by the first revision to
                 # be ready out of order.
                 while commitorder and not numitems[commitorder[-1]]:
                     rev = commitorder.pop()
                     ctx = repo[rev]
                     if rev == wdirrev:
                         writeworkingdir(repo, ctx, filedata[rev], replacements)
                         wdirwritten = bool(filedata[rev])
                     else:
                         replacerev(ui, repo, ctx, filedata[rev], replacements)
                     del filedata[rev]
 
         cleanup(repo, replacements, wdirwritten)
         hookargs = {
             b'replacements': replacements,
             b'wdirwritten': wdirwritten,
             b'metadata': aggregatemetadata,
         }
         repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))
 
 
 def cleanup(repo, replacements, wdirwritten):
     """Calls scmutil.cleanupnodes() with the given replacements.
 
     "replacements" is a dict from nodeid to nodeid, with one key and one value
     for every revision that was affected by fixing. This is slightly different
     from cleanupnodes().
 
     "wdirwritten" is a bool which tells whether the working copy was affected by
     fixing, since it has no entry in "replacements".
 
     Useful as a hook point for extending "hg fix" with output summarizing the
     effects of the command, though we choose not to output anything here.
     """
     replacements = {
         prec: [succ] for prec, succ in pycompat.iteritems(replacements)
     }
     scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)
 
 
 def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
     """Constructs the list of files to be fixed at specific revisions
 
     It is up to the caller how to consume the work items, and the only
     dependence between them is that replacement revisions must be committed in
     topological order. Each work item represents a file in the working copy or
     in some revision that should be fixed and written back to the working copy
     or into a replacement revision.
 
     Work items for the same revision are grouped together, so that a worker
     pool starting with the first N items in parallel is likely to finish the
     first revision's work before other revisions. This can allow us to write
     the result to disk and reduce memory footprint. At time of writing, the
     partition strategy in worker.py seems favorable to this. We also sort the
     items by ascending revision number to match the order in which we commit
     the fixes later.
     """
     workqueue = []
     numitems = collections.defaultdict(int)
     maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
     for rev in sorted(revstofix):
         fixctx = repo[rev]
         match = scmutil.match(fixctx, pats, opts)
         for path in sorted(
             pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
         ):
             fctx = fixctx[path]
             if fctx.islink():
                 continue
             if fctx.size() > maxfilesize:
                 ui.warn(
                     _(b'ignoring file larger than %s: %s\n')
                     % (util.bytecount(maxfilesize), path)
                 )
                 continue
             workqueue.append((rev, path))
             numitems[rev] += 1
     return workqueue, numitems
400
398
401
399
def getrevstofix(ui, repo, opts):
    """Returns the set of revision numbers that should be fixed"""
    revs = set(scmutil.revrange(repo, opts[b'rev']))
    for rev in revs:
        checkfixablectx(ui, repo, repo[rev])
    if revs:
        cmdutil.checkunfinished(repo)
        checknodescendants(repo, revs)
    if opts.get(b'working_dir'):
        revs.add(wdirrev)
        if list(merge.mergestate.read(repo).unresolved()):
            raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
    if not revs:
        raise error.Abort(
            b'no changesets specified', hint=b'use --rev or --working-dir'
        )
    return revs


def checknodescendants(repo, revs):
    if not obsolete.isenabled(repo, obsolete.allowunstableopt) and repo.revs(
        b'(%ld::) - (%ld)', revs, revs
    ):
        raise error.Abort(
            _(b'can only fix a changeset together with all its descendants')
        )


def checkfixablectx(ui, repo, ctx):
    """Aborts if the revision shouldn't be replaced with a fixed one."""
    if not ctx.mutable():
        raise error.Abort(
            b'can\'t fix immutable changeset %s'
            % (scmutil.formatchangeid(ctx),)
        )
    if ctx.obsolete():
        # It would be better to actually check if the revision has a successor.
        allowdivergence = ui.configbool(
            b'experimental', b'evolution.allowdivergence'
        )
        if not allowdivergence:
            raise error.Abort(
                b'fixing obsolete revision could cause divergence'
            )


def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
    """Returns the set of files that should be fixed in a context

    The result depends on the base contexts; we include any file that has
    changed relative to any of the base contexts. Base contexts should be
    ancestors of the context being fixed.
    """
    files = set()
    for basectx in basectxs:
        stat = basectx.status(
            fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
        )
        files.update(
            set(
                itertools.chain(
                    stat.added, stat.modified, stat.clean, stat.unknown
                )
            )
        )
    return files


def lineranges(opts, path, basectxs, fixctx, content2):
    """Returns the set of line ranges that should be fixed in a file

    Of the form [(10, 20), (30, 40)].

    This depends on the given base contexts; we must consider lines that have
    changed versus any of the base contexts, and whether the file has been
    renamed versus any of them.

    Another way to understand this is that we exclude line ranges that are
    common to the file in all base contexts.
    """
    if opts.get(b'whole'):
        # Return a range containing all lines. Rely on the diff implementation's
        # idea of how many lines are in the file, instead of reimplementing it.
        return difflineranges(b'', content2)

    rangeslist = []
    for basectx in basectxs:
        basepath = copies.pathcopies(basectx, fixctx).get(path, path)
        if basepath in basectx:
            content1 = basectx[basepath].data()
        else:
            content1 = b''
        rangeslist.extend(difflineranges(content1, content2))
    return unionranges(rangeslist)


def unionranges(rangeslist):
    """Return the union of some closed intervals

    >>> unionranges([])
    []
    >>> unionranges([(1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (2, 100)])
    [(1, 100)]
    >>> unionranges([(1, 99), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (40, 60)])
    [(1, 100)]
    >>> unionranges([(1, 49), (50, 100)])
    [(1, 100)]
    >>> unionranges([(1, 48), (50, 100)])
    [(1, 48), (50, 100)]
    >>> unionranges([(1, 2), (3, 4), (5, 6)])
    [(1, 6)]
    """
    rangeslist = sorted(set(rangeslist))
    unioned = []
    if rangeslist:
        unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
    for a, b in rangeslist:
        c, d = unioned[-1]
        if a > d + 1:
            unioned.append((a, b))
        else:
            unioned[-1] = (c, max(b, d))
    return unioned


def difflineranges(content1, content2):
    """Return list of line number ranges in content2 that differ from content1.

    Line numbers are 1-based. The numbers are the first and last line contained
    in the range. Single-line ranges have the same line number for the first and
    last line. Excludes any empty ranges that result from lines that are only
    present in content1. Relies on mdiff's idea of where the line endings are in
    the string.

    >>> from mercurial import pycompat
    >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
    >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
    >>> difflineranges2(b'', b'')
    []
    >>> difflineranges2(b'a', b'')
    []
    >>> difflineranges2(b'', b'A')
    [(1, 1)]
    >>> difflineranges2(b'a', b'a')
    []
    >>> difflineranges2(b'a', b'A')
    [(1, 1)]
    >>> difflineranges2(b'ab', b'')
    []
    >>> difflineranges2(b'', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abc', b'ac')
    []
    >>> difflineranges2(b'ab', b'aCb')
    [(2, 2)]
    >>> difflineranges2(b'abc', b'aBc')
    [(2, 2)]
    >>> difflineranges2(b'ab', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abcde', b'aBcDe')
    [(2, 2), (4, 4)]
    >>> difflineranges2(b'abcde', b'aBCDe')
    [(2, 4)]
    """
    ranges = []
    for lines, kind in mdiff.allblocks(content1, content2):
        firstline, lastline = lines[2:4]
        if kind == b'!' and firstline != lastline:
            ranges.append((firstline + 1, lastline))
    return ranges


def getbasectxs(repo, opts, revstofix):
    """Returns a map of the base contexts for each revision

    The base contexts determine which lines are considered modified when we
    attempt to fix just the modified lines in a file. It also determines which
    files we attempt to fix, so it is important to compute this even when
    --whole is used.
    """
    # The --base flag overrides the usual logic, and we give every revision
    # exactly the set of baserevs that the user specified.
    if opts.get(b'base'):
        baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
        if not baserevs:
            baserevs = {nullrev}
        basectxs = {repo[rev] for rev in baserevs}
        return {rev: basectxs for rev in revstofix}

    # Proceed in topological order so that we can easily determine each
    # revision's baserevs by looking at its parents and their baserevs.
    basectxs = collections.defaultdict(set)
    for rev in sorted(revstofix):
        ctx = repo[rev]
        for pctx in ctx.parents():
            if pctx.rev() in basectxs:
                basectxs[rev].update(basectxs[pctx.rev()])
            else:
                basectxs[rev].add(pctx)
    return basectxs


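The parent-propagation loop above can be sketched on a toy DAG of plain integers. This is a hedged, standalone approximation (the real code operates on repository contexts, not ints, and `toybasectxs` and its inputs are invented for illustration): a parent that is itself being fixed passes along its own bases, while an untouched parent becomes a base itself.

```python
import collections

def toybasectxs(parents, revstofix):
    """Propagate base revisions through a DAG given as {rev: [parent revs]}.

    Mirrors getbasectxs(): process revisions in ascending (topological)
    order so each parent's bases are known before its children are seen.
    """
    basectxs = collections.defaultdict(set)
    for rev in sorted(revstofix):
        for prev in parents[rev]:
            if prev in basectxs:
                # Parent is also being fixed: inherit its bases.
                basectxs[rev].update(basectxs[prev])
            else:
                # Parent is left alone: it is a base itself.
                basectxs[rev].add(prev)
    return dict(basectxs)

# Fixing a chain 2 -> 3 on top of untouched rev 1: both get base {1}.
print(toybasectxs({2: [1], 3: [2]}, {2, 3}))  # {2: {1}, 3: {1}}
```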
def fixfile(ui, repo, opts, fixers, fixctx, path, basectxs):
    """Run any configured fixers that should affect the file in this context

    Returns the file content that results from applying the fixers in some order
    starting with the file's content in the fixctx. Fixers that support line
    ranges will affect lines that have changed relative to any of the basectxs
    (i.e. they will only avoid lines that are common to all basectxs).

    A fixer tool's stdout will become the file's new content if and only if it
    exits with code zero. The fixer tool's working directory is the repository's
    root.
    """
    metadata = {}
    newdata = fixctx[path].data()
    for fixername, fixer in pycompat.iteritems(fixers):
        if fixer.affects(opts, fixctx, path):
            ranges = lineranges(opts, path, basectxs, fixctx, newdata)
            command = fixer.command(ui, path, ranges)
            if command is None:
                continue
            ui.debug(b'subprocess: %s\n' % (command,))
            proc = subprocess.Popen(
                procutil.tonativestr(command),
                shell=True,
                cwd=procutil.tonativestr(repo.root),
                stdin=subprocess.PIPE,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
            )
            stdout, stderr = proc.communicate(newdata)
            if stderr:
                showstderr(ui, fixctx.rev(), fixername, stderr)
            newerdata = stdout
            if fixer.shouldoutputmetadata():
                try:
                    metadatajson, newerdata = stdout.split(b'\0', 1)
                    metadata[fixername] = json.loads(metadatajson)
                except ValueError:
                    ui.warn(
                        _(b'ignored invalid output from fixer tool: %s\n')
                        % (fixername,)
                    )
                    continue
            else:
                metadata[fixername] = None
            if proc.returncode == 0:
                newdata = newerdata
            else:
                if not stderr:
                    message = _(b'exited with status %d\n') % (proc.returncode,)
                    showstderr(ui, fixctx.rev(), fixername, message)
                checktoolfailureaction(
                    ui,
                    _(b'no fixes will be applied'),
                    hint=_(
                        b'use --config fix.failure=continue to apply any '
                        b'successful fixes anyway'
                    ),
                )
    return metadata, newdata


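The accept-stdout-only-on-exit-zero contract described above can be demonstrated in isolation. This is a minimal sketch, not the extension's code: `runfixer` is an invented helper, and a tiny Python one-liner stands in for a real fixer tool.

```python
import subprocess
import sys

def runfixer(argv, data):
    """Pipe data through a fixer process; keep its stdout only on exit 0."""
    proc = subprocess.Popen(
        argv,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    stdout, stderr = proc.communicate(data)
    # Mirror fixfile(): a failing tool leaves the content unchanged.
    return stdout if proc.returncode == 0 else data

# A trivial stand-in "fixer" that uppercases its input.
tool = [
    sys.executable,
    '-c',
    'import sys; sys.stdout.write(sys.stdin.read().upper())',
]
print(runfixer(tool, b'hello\n'))
```

Unlike the extension, which runs the configured command through the shell from the repository root, this sketch passes an argv list directly; the exit-code handling is the part being illustrated.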
def showstderr(ui, rev, fixername, stderr):
    """Writes the lines of the stderr string as warnings on the ui

    Uses the revision number and fixername to give more context to each line of
    the error message. Doesn't include file names, since those take up a lot of
    space and would tend to be included in the error message if they were
    relevant.
    """
    for line in re.split(b'[\r\n]+', stderr):
        if line:
            ui.warn(b'[')
            if rev is None:
                ui.warn(_(b'wdir'), label=b'evolve.rev')
            else:
                ui.warn(b'%d' % rev, label=b'evolve.rev')
            ui.warn(b'] %s: %s\n' % (fixername, line))


def writeworkingdir(repo, ctx, filedata, replacements):
    """Write new content to the working copy and check out the new p1 if any

    We check out a new revision if and only if we fixed something in both the
    working directory and its parent revision. This avoids the need for a full
    update/merge, and means that the working directory simply isn't affected
    unless the --working-dir flag is given.

    Directly updates the dirstate for the affected files.
    """
    for path, data in pycompat.iteritems(filedata):
        fctx = ctx[path]
        fctx.write(data, fctx.flags())
        if repo.dirstate[path] == b'n':
            repo.dirstate.normallookup(path)

    oldparentnodes = repo.dirstate.parents()
    newparentnodes = [replacements.get(n, n) for n in oldparentnodes]
    if newparentnodes != oldparentnodes:
        repo.setparents(*newparentnodes)


def replacerev(ui, repo, ctx, filedata, replacements):
    """Commit a new revision like the given one, but with file content changes

    "ctx" is the original revision to be replaced by a modified one.

    "filedata" is a dict that maps paths to their new file content. All other
    paths will be recreated from the original revision without changes.
    "filedata" may contain paths that didn't exist in the original revision;
    they will be added.

    "replacements" is a dict that maps a single node to a single node, and it is
    updated to indicate the original revision is replaced by the newly created
    one. No entry is added if the replacement's node already exists.

    The new revision has the same parents as the old one, unless those parents
    have already been replaced, in which case those replacements are the parents
    of this new revision. Thus, if revisions are replaced in topological order,
    there is no need to rebase them into the original topology later.
    """

    p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
    p1ctx, p2ctx = repo[p1rev], repo[p2rev]
    newp1node = replacements.get(p1ctx.node(), p1ctx.node())
    newp2node = replacements.get(p2ctx.node(), p2ctx.node())

    # We don't want to create a revision that has no changes from the original,
    # but we should if the original revision's parent has been replaced.
    # Otherwise, we would produce an orphan that needs no actual human
    # intervention to evolve. We can't rely on commit() to avoid creating the
    # un-needed revision because the extra field added below produces a new hash
    # regardless of file content changes.
    if (
        not filedata
        and p1ctx.node() not in replacements
        and p2ctx.node() not in replacements
    ):
        return

    def filectxfn(repo, memctx, path):
        if path not in ctx:
            return None
        fctx = ctx[path]
        copysource = fctx.copysource()
        return context.memfilectx(
            repo,
            memctx,
            path=fctx.path(),
            data=filedata.get(path, fctx.data()),
            islink=fctx.islink(),
            isexec=fctx.isexec(),
            copysource=copysource,
        )

    extra = ctx.extra().copy()
    extra[b'fix_source'] = ctx.hex()

    memctx = context.memctx(
        repo,
        parents=(newp1node, newp2node),
        text=ctx.description(),
        files=set(ctx.files()) | set(filedata.keys()),
        filectxfn=filectxfn,
        user=ctx.user(),
        date=ctx.date(),
        extra=extra,
        branch=ctx.branch(),
        editor=None,
    )
    sucnode = memctx.commit()
    prenode = ctx.node()
    if prenode == sucnode:
        ui.debug(b'node %s already existed\n' % (ctx.hex()))
    else:
        replacements[ctx.node()] = sucnode


def getfixers(ui):
    """Returns a map of configured fixer tools indexed by their names

    Each value is a Fixer object with methods that implement the behavior of the
    fixer's config suboptions. Does not validate the config values.
    """
    fixers = {}
    for name in fixernames(ui):
        enabled = ui.configbool(b'fix', name + b':enabled')
        command = ui.config(b'fix', name + b':command')
        pattern = ui.config(b'fix', name + b':pattern')
        linerange = ui.config(b'fix', name + b':linerange')
        priority = ui.configint(b'fix', name + b':priority')
        metadata = ui.configbool(b'fix', name + b':metadata')
        skipclean = ui.configbool(b'fix', name + b':skipclean')
        # Don't use a fixer if it has no pattern configured. It would be
        # dangerous to let it affect all files. It would be pointless to let it
        # affect no files. There is no reasonable subset of files to use as the
        # default.
        if pattern is None:
            ui.warn(
                _(b'fixer tool has no pattern configuration: %s\n') % (name,)
            )
        elif not enabled:
            ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
        else:
            fixers[name] = Fixer(
                command, pattern, linerange, priority, metadata, skipclean
            )
    return collections.OrderedDict(
        sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
    )


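The suboptions read above map one-to-one onto `[fix]` keys in an hgrc file. As a hedged illustration only (the tool name and values below are invented, not part of this change), a fixer entry could look like:

```ini
[fix]
sort-lines:command = sort
sort-lines:pattern = set:**.txt
sort-lines:enabled = true
sort-lines:priority = 10
sort-lines:skipclean = true
```

A missing pattern disables the tool with a warning, `enabled = false` skips it silently, and higher `priority` values run earlier.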
def fixernames(ui):
    """Returns the names of [fix] config options that have suboptions"""
    names = set()
    for k, v in ui.configitems(b'fix'):
        if b':' in k:
            names.add(k.split(b':', 1)[0])
    return names


class Fixer(object):
    """Wraps the raw config values for a fixer with methods"""

    def __init__(
        self, command, pattern, linerange, priority, metadata, skipclean
    ):
        self._command = command
        self._pattern = pattern
        self._linerange = linerange
        self._priority = priority
        self._metadata = metadata
        self._skipclean = skipclean

    def affects(self, opts, fixctx, path):
        """Should this fixer run on the file at the given path and context?"""
        return self._pattern is not None and scmutil.match(
            fixctx, [self._pattern], opts
        )(path)

    def shouldoutputmetadata(self):
        """Should the stdout of this fixer start with JSON and a null byte?"""
        return self._metadata

    def command(self, ui, path, ranges):
        """A shell command to use to invoke this fixer on the given file/lines

        May return None if there is no appropriate command to run for the given
        parameters.
        """
        expand = cmdutil.rendercommandtemplate
        parts = [
            expand(
                ui,
                self._command,
                {b'rootpath': path, b'basename': os.path.basename(path)},
            )
        ]
        if self._linerange:
            if self._skipclean and not ranges:
                # No line ranges to fix, so don't run the fixer.
                return None
            for first, last in ranges:
                parts.append(
                    expand(
                        ui, self._linerange, {b'first': first, b'last': last}
                    )
                )
        return b' '.join(parts)
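The expansion in Fixer.command can be approximated without Mercurial's template engine. This standalone sketch (an invented `buildcommand` helper using plain `str.format` placeholders in place of `rendercommandtemplate`) shows how one linerange flag is appended per changed range, and how a clean file short-circuits to no command at all:

```python
def buildcommand(command, linerange, ranges, skipclean=True):
    """Mimic Fixer.command: base command plus one expanded flag per range."""
    parts = [command]
    if linerange:
        if skipclean and not ranges:
            # No changed lines, so the tool would have nothing to do.
            return None
        for first, last in ranges:
            parts.append(linerange.format(first=first, last=last))
    return ' '.join(parts)

cmd = buildcommand(
    'clang-format --assume-filename=src/a.cpp',
    '--lines={first}:{last}',
    [(10, 20), (30, 40)],
)
print(cmd)
# clang-format --assume-filename=src/a.cpp --lines=10:20 --lines=30:40
```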