fix: use obsolete.isenabled() to check for experimental.allowdivergence...
av6 -
r48598:e69c82bf stable
# fix - rewrite file content in changesets and working copy
#
# Copyright 2018 Google LLC.
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
"""rewrite file content in changesets or working copy (EXPERIMENTAL)

Provides a command that runs configured tools on the contents of modified files,
writing back any fixes to the working copy or replacing changesets.

Here is an example configuration that causes :hg:`fix` to apply automatic
formatting fixes to modified lines in C++ code::

  [fix]
  clang-format:command=clang-format --assume-filename={rootpath}
  clang-format:linerange=--lines={first}:{last}
  clang-format:pattern=set:**.cpp or **.hpp

The :command suboption forms the first part of the shell command that will be
used to fix a file. The content of the file is passed on standard input, and the
fixed file content is expected on standard output. Any output on standard error
will be displayed as a warning. If the exit status is not zero, the file will
not be affected. A placeholder warning is displayed if there is a non-zero exit
status but no standard error output. Some values may be substituted into the
command::

  {rootpath} The path of the file being fixed, relative to the repo root
  {basename} The name of the file being fixed, without the directory path

If the :linerange suboption is set, the tool will only be run if there are
changed lines in a file. The value of this suboption is appended to the shell
command once for every range of changed lines in the file. Some values may be
substituted into the command::

  {first} The 1-based line number of the first line in the modified range
  {last} The 1-based line number of the last line in the modified range

Deleted sections of a file will be ignored by :linerange, because there is no
corresponding line range in the version being fixed.

By default, tools that set :linerange will only be executed if there is at least
one changed line range. This is meant to prevent accidents like running a code
formatter in such a way that it unexpectedly reformats the whole file. If such a
tool needs to operate on unchanged files, it should set the :skipclean suboption
to false.
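
For example, a hypothetical whole-file tool (the tool name and command below are
illustrative, not part of this extension) could opt in to running on files with
no changed lines::

  [fix]
  sortlines:command = sort
  sortlines:pattern = set:**.txt
  sortlines:skipclean = false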

The :pattern suboption determines which files will be passed through each
configured tool. See :hg:`help patterns` for possible values. However, all
patterns are relative to the repo root, even if that text says they are relative
to the current working directory. If there are file arguments to :hg:`fix`, the
intersection of these patterns is used.

There is also a configurable limit for the maximum size of file that will be
processed by :hg:`fix`::

  [fix]
  maxfilesize = 2MB

Normally, execution of configured tools will continue after a failure (indicated
by a non-zero exit status). It can also be configured to abort after the first
such failure, so that no files will be affected if any tool fails. This abort
will also cause :hg:`fix` to exit with a non-zero status::

  [fix]
  failure = abort

When multiple tools are configured to affect a file, they execute in an order
defined by the :priority suboption. The priority suboption has a default value
of zero for each tool. Tools are executed in order of descending priority. The
execution order of tools with equal priority is unspecified. For example, you
could use the 'sort' and 'head' utilities to keep only the 10 smallest numbers
in a text file by ensuring that 'sort' runs before 'head'::

  [fix]
  sort:command = sort -n
  head:command = head -n 10
  sort:pattern = numbers.txt
  head:pattern = numbers.txt
  sort:priority = 2
  head:priority = 1

To account for changes made by each tool, the line numbers used for incremental
formatting are recomputed before executing the next tool. So, each tool may see
different values for the arguments added by the :linerange suboption.

Each fixer tool is allowed to return some metadata in addition to the fixed file
content. The metadata must be placed before the file content on stdout,
separated from the file content by a zero byte. The metadata is parsed as a JSON
value (so, it should be UTF-8 encoded and contain no zero bytes). A fixer tool
is expected to produce this metadata encoding if and only if the :metadata
suboption is true::

  [fix]
  tool:command = tool --prepend-json-metadata
  tool:metadata = true
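
A tool configured this way would be expected to write its stdout in the
following layout, where "<NUL>" stands for the single zero byte and the JSON
keys are purely illustrative::

  {"fixed_lines": 2}<NUL><fixed file content>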

The metadata values are passed to hooks, which can be used to print summaries or
perform other post-fixing work. The supported hooks are::

  "postfixfile"
    Run once for each file in each revision where any fixer tools made changes
    to the file content. Provides "$HG_REV" and "$HG_PATH" to identify the file,
    and "$HG_METADATA" with a map of fixer names to metadata values from fixer
    tools that affected the file. Fixer tools that didn't affect the file have a
    value of None. Only fixer tools that executed are present in the metadata.

  "postfix"
    Run once after all files and revisions have been handled. Provides
    "$HG_REPLACEMENTS" with information about what revisions were created and
    made obsolete. Provides a boolean "$HG_WDIRWRITTEN" to indicate whether any
    files in the working copy were updated. Provides a list "$HG_METADATA"
    mapping fixer tool names to lists of metadata values returned from
    executions that modified a file. This aggregates the same metadata
    previously passed to the "postfixfile" hook.
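
These hooks are configured like any other Mercurial hook; for example (the hook
targets below are hypothetical)::

  [hooks]
  postfixfile = /path/to/report-fixed-file.sh
  postfix = python:myhooks.summarize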

Fixer tools are run in the repository's root directory. This allows them to read
configuration files from the working copy, or even write to the working copy.
The working copy is not updated to match the revision being fixed. In fact,
several revisions may be fixed in parallel. Writes to the working copy are not
amended into the revision being fixed; fixer tools should always write fixed
file content back to stdout as documented above.
"""

from __future__ import absolute_import

import collections
import itertools
import os
import re
import subprocess

from mercurial.i18n import _
from mercurial.node import (
    nullid,
    nullrev,
    wdirrev,
)

from mercurial.utils import procutil

from mercurial import (
    cmdutil,
    context,
    copies,
    error,
    match as matchmod,
    mdiff,
    merge,
    mergestate as mergestatemod,
    obsolete,
    pycompat,
    registrar,
    rewriteutil,
    scmutil,
    util,
    worker,
)

# Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
# extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
# be specifying the version(s) of Mercurial they are tested with, or
# leave the attribute unspecified.
testedwith = b'ships-with-hg-core'

cmdtable = {}
command = registrar.command(cmdtable)

configtable = {}
configitem = registrar.configitem(configtable)

# Register the suboptions allowed for each configured fixer, and default values.
FIXER_ATTRS = {
    b'command': None,
    b'linerange': None,
    b'pattern': None,
    b'priority': 0,
    b'metadata': False,
    b'skipclean': True,
    b'enabled': True,
}

for key, default in FIXER_ATTRS.items():
    configitem(b'fix', b'.*:%s$' % key, default=default, generic=True)

# A good default size allows most source code files to be fixed, but avoids
# letting fixer tools choke on huge inputs, which could be surprising to the
# user.
configitem(b'fix', b'maxfilesize', default=b'2MB')

# Allow fix commands to exit non-zero if an executed fixer tool exits non-zero.
# This helps users write shell scripts that stop when a fixer tool signals a
# problem.
configitem(b'fix', b'failure', default=b'continue')


def checktoolfailureaction(ui, message, hint=None):
    """Abort with 'message' if fix.failure=abort"""
    action = ui.config(b'fix', b'failure')
    if action not in (b'continue', b'abort'):
        raise error.Abort(
            _(b'unknown fix.failure action: %s') % (action,),
            hint=_(b'use "continue" or "abort"'),
        )
    if action == b'abort':
        raise error.Abort(message, hint=hint)


allopt = (b'', b'all', False, _(b'fix all non-public non-obsolete revisions'))
baseopt = (
    b'',
    b'base',
    [],
    _(
        b'revisions to diff against (overrides automatic '
        b'selection, and applies to every revision being '
        b'fixed)'
    ),
    _(b'REV'),
)
revopt = (b'r', b'rev', [], _(b'revisions to fix (ADVANCED)'), _(b'REV'))
sourceopt = (
    b's',
    b'source',
    [],
    _(b'fix the specified revisions and their descendants'),
    _(b'REV'),
)
wdiropt = (b'w', b'working-dir', False, _(b'fix the working directory'))
wholeopt = (b'', b'whole', False, _(b'always fix every line of a file'))
usage = _(b'[OPTION]... [FILE]...')


@command(
    b'fix',
    [allopt, baseopt, revopt, sourceopt, wdiropt, wholeopt],
    usage,
    helpcategory=command.CATEGORY_FILE_CONTENTS,
)
def fix(ui, repo, *pats, **opts):
240 """rewrite file content in changesets or working directory
241 """rewrite file content in changesets or working directory
241
242
242 Runs any configured tools to fix the content of files. Only affects files
243 Runs any configured tools to fix the content of files. Only affects files
243 with changes, unless file arguments are provided. Only affects changed lines
244 with changes, unless file arguments are provided. Only affects changed lines
244 of files, unless the --whole flag is used. Some tools may always affect the
245 of files, unless the --whole flag is used. Some tools may always affect the
245 whole file regardless of --whole.
246 whole file regardless of --whole.
246
247
247 If --working-dir is used, files with uncommitted changes in the working copy
248 If --working-dir is used, files with uncommitted changes in the working copy
248 will be fixed. Note that no backup are made.
249 will be fixed. Note that no backup are made.
249
250
250 If revisions are specified with --source, those revisions and their
251 If revisions are specified with --source, those revisions and their
251 descendants will be checked, and they may be replaced with new revisions
252 descendants will be checked, and they may be replaced with new revisions
252 that have fixed file content. By automatically including the descendants,
253 that have fixed file content. By automatically including the descendants,
253 no merging, rebasing, or evolution will be required. If an ancestor of the
254 no merging, rebasing, or evolution will be required. If an ancestor of the
254 working copy is included, then the working copy itself will also be fixed,
255 working copy is included, then the working copy itself will also be fixed,
255 and the working copy will be updated to the fixed parent.
256 and the working copy will be updated to the fixed parent.
256
257
257 When determining what lines of each file to fix at each revision, the whole
258 When determining what lines of each file to fix at each revision, the whole
258 set of revisions being fixed is considered, so that fixes to earlier
259 set of revisions being fixed is considered, so that fixes to earlier
259 revisions are not forgotten in later ones. The --base flag can be used to
260 revisions are not forgotten in later ones. The --base flag can be used to
260 override this default behavior, though it is not usually desirable to do so.
261 override this default behavior, though it is not usually desirable to do so.
261 """
262 """
    opts = pycompat.byteskwargs(opts)
    cmdutil.check_at_most_one_arg(opts, b'all', b'source', b'rev')
    cmdutil.check_incompatible_arguments(
        opts, b'working_dir', [b'all', b'source']
    )

    with repo.wlock(), repo.lock(), repo.transaction(b'fix'):
        revstofix = getrevstofix(ui, repo, opts)
        basectxs = getbasectxs(repo, opts, revstofix)
        workqueue, numitems = getworkqueue(
            ui, repo, pats, opts, revstofix, basectxs
        )
        basepaths = getbasepaths(repo, opts, workqueue, basectxs)
        fixers = getfixers(ui)

        # Rather than letting each worker independently fetch the files
        # (which also would add complications for shared/keepalive
        # connections), prefetch them all first.
        _prefetchfiles(repo, workqueue, basepaths)

        # There are no data dependencies between the workers fixing each file
        # revision, so we can use all available parallelism.
        def getfixes(items):
            for rev, path in items:
                ctx = repo[rev]
                olddata = ctx[path].data()
                metadata, newdata = fixfile(
                    ui, repo, opts, fixers, ctx, path, basepaths, basectxs[rev]
                )
                # Don't waste memory/time passing unchanged content back, but
                # produce one result per item either way.
                yield (
                    rev,
                    path,
                    metadata,
                    newdata if newdata != olddata else None,
                )

        results = worker.worker(
            ui, 1.0, getfixes, tuple(), workqueue, threadsafe=False
        )

        # We have to hold on to the data for each successor revision in memory
        # until all its parents are committed. We ensure this by committing and
        # freeing memory for the revisions in some topological order. This
        # leaves a little bit of memory efficiency on the table, but also makes
        # the tests deterministic. It might also be considered a feature since
        # it makes the results more easily reproducible.
        filedata = collections.defaultdict(dict)
        aggregatemetadata = collections.defaultdict(list)
        replacements = {}
        wdirwritten = False
        commitorder = sorted(revstofix, reverse=True)
        with ui.makeprogress(
            topic=_(b'fixing'), unit=_(b'files'), total=sum(numitems.values())
        ) as progress:
            for rev, path, filerevmetadata, newdata in results:
                progress.increment(item=path)
                for fixername, fixermetadata in filerevmetadata.items():
                    aggregatemetadata[fixername].append(fixermetadata)
                if newdata is not None:
                    filedata[rev][path] = newdata
                    hookargs = {
                        b'rev': rev,
                        b'path': path,
                        b'metadata': filerevmetadata,
                    }
                    repo.hook(
                        b'postfixfile',
                        throw=False,
                        **pycompat.strkwargs(hookargs)
                    )
                numitems[rev] -= 1
                # Apply the fixes for this and any other revisions that are
                # ready and sitting at the front of the queue. Using a loop here
                # prevents the queue from being blocked by the first revision to
                # be ready out of order.
                while commitorder and not numitems[commitorder[-1]]:
                    rev = commitorder.pop()
                    ctx = repo[rev]
                    if rev == wdirrev:
                        writeworkingdir(repo, ctx, filedata[rev], replacements)
                        wdirwritten = bool(filedata[rev])
                    else:
                        replacerev(ui, repo, ctx, filedata[rev], replacements)
                    del filedata[rev]

        cleanup(repo, replacements, wdirwritten)
        hookargs = {
            b'replacements': replacements,
            b'wdirwritten': wdirwritten,
            b'metadata': aggregatemetadata,
        }
        repo.hook(b'postfix', throw=True, **pycompat.strkwargs(hookargs))


def cleanup(repo, replacements, wdirwritten):
    """Calls scmutil.cleanupnodes() with the given replacements.

    "replacements" is a dict from nodeid to nodeid, with one key and one value
    for every revision that was affected by fixing. This is slightly different
    from cleanupnodes().

    "wdirwritten" is a bool which tells whether the working copy was affected by
    fixing, since it has no entry in "replacements".

    Useful as a hook point for extending "hg fix" with output summarizing the
    effects of the command, though we choose not to output anything here.
    """
    replacements = {
        prec: [succ] for prec, succ in pycompat.iteritems(replacements)
    }
    scmutil.cleanupnodes(repo, replacements, b'fix', fixphase=True)


def getworkqueue(ui, repo, pats, opts, revstofix, basectxs):
    """Constructs the list of files to be fixed at specific revisions

    It is up to the caller how to consume the work items, and the only
    dependence between them is that replacement revisions must be committed in
    topological order. Each work item represents a file in the working copy or
    in some revision that should be fixed and written back to the working copy
    or into a replacement revision.

    Work items for the same revision are grouped together, so that a worker
    pool starting with the first N items in parallel is likely to finish the
    first revision's work before other revisions. This can allow us to write
    the result to disk and reduce memory footprint. At time of writing, the
    partition strategy in worker.py seems favorable to this. We also sort the
    items by ascending revision number to match the order in which we commit
    the fixes later.
    """
    workqueue = []
    numitems = collections.defaultdict(int)
    maxfilesize = ui.configbytes(b'fix', b'maxfilesize')
    for rev in sorted(revstofix):
        fixctx = repo[rev]
        match = scmutil.match(fixctx, pats, opts)
        for path in sorted(
            pathstofix(ui, repo, pats, opts, match, basectxs[rev], fixctx)
        ):
            fctx = fixctx[path]
            if fctx.islink():
                continue
            if fctx.size() > maxfilesize:
                ui.warn(
                    _(b'ignoring file larger than %s: %s\n')
                    % (util.bytecount(maxfilesize), path)
                )
                continue
            workqueue.append((rev, path))
            numitems[rev] += 1
    return workqueue, numitems
415
416
416
417
def getrevstofix(ui, repo, opts):
    """Returns the set of revision numbers that should be fixed"""
    if opts[b'all']:
        revs = repo.revs(b'(not public() and not obsolete()) or wdir()')
    elif opts[b'source']:
        source_revs = scmutil.revrange(repo, opts[b'source'])
        revs = set(repo.revs(b'(%ld::) - obsolete()', source_revs))
        if wdirrev in source_revs:
            # `wdir()::` is currently empty, so manually add wdir
            revs.add(wdirrev)
        if repo[b'.'].rev() in revs:
            revs.add(wdirrev)
    else:
        revs = set(scmutil.revrange(repo, opts[b'rev']))
        if opts.get(b'working_dir'):
            revs.add(wdirrev)
    for rev in revs:
        checkfixablectx(ui, repo, repo[rev])
    # Allow fixing only wdir() even if there's an unfinished operation
    if not (len(revs) == 1 and wdirrev in revs):
        cmdutil.checkunfinished(repo)
        rewriteutil.precheck(repo, revs, b'fix')
    if (
        wdirrev in revs
        and mergestatemod.mergestate.read(repo).unresolvedcount()
    ):
        raise error.Abort(b'unresolved conflicts', hint=b"use 'hg resolve'")
    if not revs:
        raise error.Abort(
            b'no changesets specified', hint=b'use --source or --working-dir'
        )
    return revs


def checkfixablectx(ui, repo, ctx):
    """Aborts if the revision shouldn't be replaced with a fixed one."""
    if ctx.obsolete():
        # It would be better to actually check if the revision has a successor.
        if not obsolete.isenabled(repo, obsolete.allowdivergenceopt):
            raise error.Abort(
                b'fixing obsolete revision could cause divergence'
            )


def pathstofix(ui, repo, pats, opts, match, basectxs, fixctx):
    """Returns the set of files that should be fixed in a context

    The result depends on the base contexts; we include any file that has
    changed relative to any of the base contexts. Base contexts should be
    ancestors of the context being fixed.
    """
    files = set()
    for basectx in basectxs:
        stat = basectx.status(
            fixctx, match=match, listclean=bool(pats), listunknown=bool(pats)
        )
        files.update(
            set(
                itertools.chain(
                    stat.added, stat.modified, stat.clean, stat.unknown
                )
            )
        )
    return files


def lineranges(opts, path, basepaths, basectxs, fixctx, content2):
    """Returns the set of line ranges that should be fixed in a file

    Of the form [(10, 20), (30, 40)].

    This depends on the given base contexts; we must consider lines that have
    changed versus any of the base contexts, and whether the file has been
    renamed versus any of them.

    Another way to understand this is that we exclude line ranges that are
    common to the file in all base contexts.
    """
    if opts.get(b'whole'):
        # Return a range containing all lines. Rely on the diff implementation's
        # idea of how many lines are in the file, instead of reimplementing it.
        return difflineranges(b'', content2)

    rangeslist = []
    for basectx in basectxs:
        basepath = basepaths.get((basectx.rev(), fixctx.rev(), path), path)

        if basepath in basectx:
            content1 = basectx[basepath].data()
        else:
            content1 = b''
        rangeslist.extend(difflineranges(content1, content2))
    return unionranges(rangeslist)


def getbasepaths(repo, opts, workqueue, basectxs):
    if opts.get(b'whole'):
        # Base paths will never be fetched for line range determination.
        return {}

    basepaths = {}
    for rev, path in workqueue:
        fixctx = repo[rev]
        for basectx in basectxs[rev]:
            basepath = copies.pathcopies(basectx, fixctx).get(path, path)
            if basepath in basectx:
                basepaths[(basectx.rev(), fixctx.rev(), path)] = basepath
    return basepaths


def unionranges(rangeslist):
    """Return the union of some closed intervals

    >>> unionranges([])
    []
    >>> unionranges([(1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (2, 100)])
    [(1, 100)]
    >>> unionranges([(1, 99), (1, 100)])
    [(1, 100)]
    >>> unionranges([(1, 100), (40, 60)])
    [(1, 100)]
    >>> unionranges([(1, 49), (50, 100)])
    [(1, 100)]
    >>> unionranges([(1, 48), (50, 100)])
    [(1, 48), (50, 100)]
    >>> unionranges([(1, 2), (3, 4), (5, 6)])
    [(1, 6)]
    """
    rangeslist = sorted(set(rangeslist))
    unioned = []
    if rangeslist:
        unioned, rangeslist = [rangeslist[0]], rangeslist[1:]
    for a, b in rangeslist:
        c, d = unioned[-1]
        if a > d + 1:
            unioned.append((a, b))
        else:
            unioned[-1] = (c, max(b, d))
    return unioned


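# The interval-union step above is self-contained enough to sketch in
# isolation. A minimal standalone version (illustrative, not the fix
# extension's code path) shows why adjacent closed intervals such as
# (1, 2) and (3, 4) merge: they cover consecutive line numbers.

```python
def union_closed_intervals(ranges):
    """Merge overlapping or adjacent closed integer intervals."""
    merged = []
    for a, b in sorted(set(ranges)):
        if merged and a <= merged[-1][1] + 1:
            # Overlaps or touches the previous interval: extend it.
            c, d = merged[-1]
            merged[-1] = (c, max(b, d))
        else:
            merged.append((a, b))
    return merged


print(union_closed_intervals([(1, 2), (3, 4), (50, 100), (1, 48)]))
# → [(1, 48), (50, 100)]
```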
def difflineranges(content1, content2):
    """Return list of line number ranges in content2 that differ from content1.

    Line numbers are 1-based. The numbers are the first and last line contained
    in the range. Single-line ranges have the same line number for the first and
    last line. Excludes any empty ranges that result from lines that are only
    present in content1. Relies on mdiff's idea of where the line endings are in
    the string.

    >>> from mercurial import pycompat
    >>> lines = lambda s: b'\\n'.join([c for c in pycompat.iterbytestr(s)])
    >>> difflineranges2 = lambda a, b: difflineranges(lines(a), lines(b))
    >>> difflineranges2(b'', b'')
    []
    >>> difflineranges2(b'a', b'')
    []
    >>> difflineranges2(b'', b'A')
    [(1, 1)]
    >>> difflineranges2(b'a', b'a')
    []
    >>> difflineranges2(b'a', b'A')
    [(1, 1)]
    >>> difflineranges2(b'ab', b'')
    []
    >>> difflineranges2(b'', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abc', b'ac')
    []
    >>> difflineranges2(b'ab', b'aCb')
    [(2, 2)]
    >>> difflineranges2(b'abc', b'aBc')
    [(2, 2)]
    >>> difflineranges2(b'ab', b'AB')
    [(1, 2)]
    >>> difflineranges2(b'abcde', b'aBcDe')
    [(2, 2), (4, 4)]
    >>> difflineranges2(b'abcde', b'aBCDe')
    [(2, 4)]
    """
    ranges = []
    for lines, kind in mdiff.allblocks(content1, content2):
        firstline, lastline = lines[2:4]
        if kind == b'!' and firstline != lastline:
            ranges.append((firstline + 1, lastline))
    return ranges


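# difflineranges() depends on Mercurial's mdiff, but the same idea can be
# sketched with the stdlib's difflib: only 'replace' and 'insert' opcodes
# produce lines present in content2, which mirrors the exclusion of ranges
# that exist only in content1. This is an illustration of the concept, not
# the mdiff algorithm, and its edge-case behavior may differ.

```python
import difflib


def changed_line_ranges(content1, content2):
    """1-based closed (first, last) ranges of lines in content2 that differ."""
    lines1 = content1.splitlines()
    lines2 = content2.splitlines()
    ranges = []
    matcher = difflib.SequenceMatcher(None, lines1, lines2)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        # Pure deletions (j1 == j2) leave nothing in content2 to fix.
        if tag in ('replace', 'insert') and j2 > j1:
            ranges.append((j1 + 1, j2))
    return ranges


print(changed_line_ranges('a\nb\nc', 'a\nB\nc'))
# → [(2, 2)]
```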
def getbasectxs(repo, opts, revstofix):
    """Returns a map of the base contexts for each revision

    The base contexts determine which lines are considered modified when we
    attempt to fix just the modified lines in a file. It also determines which
    files we attempt to fix, so it is important to compute this even when
    --whole is used.
    """
    # The --base flag overrides the usual logic, and we give every revision
    # exactly the set of baserevs that the user specified.
    if opts.get(b'base'):
        baserevs = set(scmutil.revrange(repo, opts.get(b'base')))
        if not baserevs:
            baserevs = {nullrev}
        basectxs = {repo[rev] for rev in baserevs}
        return {rev: basectxs for rev in revstofix}

    # Proceed in topological order so that we can easily determine each
    # revision's baserevs by looking at its parents and their baserevs.
    basectxs = collections.defaultdict(set)
    for rev in sorted(revstofix):
        ctx = repo[rev]
        for pctx in ctx.parents():
            if pctx.rev() in basectxs:
                basectxs[rev].update(basectxs[pctx.rev()])
            else:
                basectxs[rev].add(pctx)
    return basectxs


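# The parent-propagation step in getbasectxs() can be sketched on a plain
# DAG: walking revisions in ascending (topological) order, a revision
# inherits the base set of any parent that is itself being fixed, and
# otherwise uses that parent as a base. The revision numbers and `parents`
# map below are made up for illustration.

```python
import collections


def propagate_bases(revstofix, parents):
    """Map each fixed revision to its set of base revisions."""
    basesets = collections.defaultdict(set)
    for rev in sorted(revstofix):
        for parent in parents[rev]:
            if parent in basesets:
                # Parent is being fixed too: compare against *its* bases.
                basesets[rev].update(basesets[parent])
            else:
                basesets[rev].add(parent)
    return dict(basesets)


# Fixing a linear-plus-branch stack 1 <- 2, 1 <- 3 on top of public rev 0:
print(propagate_bases({1, 2, 3}, {1: [0], 2: [1], 3: [1]}))
# → every fixed revision is compared against rev 0
```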
def _prefetchfiles(repo, workqueue, basepaths):
    toprefetch = set()

    # Prefetch the files that will be fixed.
    for rev, path in workqueue:
        if rev == wdirrev:
            continue
        toprefetch.add((rev, path))

    # Prefetch the base contents for lineranges().
    for (baserev, fixrev, path), basepath in basepaths.items():
        toprefetch.add((baserev, basepath))

    if toprefetch:
        scmutil.prefetchfiles(
            repo,
            [
                (rev, scmutil.matchfiles(repo, [path]))
                for rev, path in toprefetch
            ],
        )


def fixfile(ui, repo, opts, fixers, fixctx, path, basepaths, basectxs):
    """Run any configured fixers that should affect the file in this context

    Returns the file content that results from applying the fixers in some order
    starting with the file's content in the fixctx. Fixers that support line
    ranges will affect lines that have changed relative to any of the basectxs
    (i.e. they will only avoid lines that are common to all basectxs).

    A fixer tool's stdout will become the file's new content if and only if it
    exits with code zero. The fixer tool's working directory is the repository's
    root.
    """
    metadata = {}
    newdata = fixctx[path].data()
    for fixername, fixer in pycompat.iteritems(fixers):
        if fixer.affects(opts, fixctx, path):
            ranges = lineranges(
                opts, path, basepaths, basectxs, fixctx, newdata
            )
            command = fixer.command(ui, path, ranges)
            if command is None:
                continue
            ui.debug(b'subprocess: %s\n' % (command,))
            proc = subprocess.Popen(
                procutil.tonativestr(command),
                shell=True,
                cwd=procutil.tonativestr(repo.root),
                stdin=subprocess.PIPE,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
            )
            stdout, stderr = proc.communicate(newdata)
            if stderr:
                showstderr(ui, fixctx.rev(), fixername, stderr)
            newerdata = stdout
            if fixer.shouldoutputmetadata():
                try:
                    metadatajson, newerdata = stdout.split(b'\0', 1)
                    metadata[fixername] = pycompat.json_loads(metadatajson)
                except ValueError:
                    ui.warn(
                        _(b'ignored invalid output from fixer tool: %s\n')
                        % (fixername,)
                    )
                    continue
            else:
                metadata[fixername] = None
            if proc.returncode == 0:
                newdata = newerdata
            else:
                if not stderr:
                    message = _(b'exited with status %d\n') % (proc.returncode,)
                    showstderr(ui, fixctx.rev(), fixername, message)
                checktoolfailureaction(
                    ui,
                    _(b'no fixes will be applied'),
                    hint=_(
                        b'use --config fix.failure=continue to apply any '
                        b'successful fixes anyway'
                    ),
                )
    return metadata, newdata


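# The fixer invocation in fixfile() boils down to: pipe the current content
# to a shell command and accept its stdout as the new content only when the
# command exits with status 0. A minimal standalone sketch (POSIX shell
# assumed; `tr` stands in for a real formatter, and the fallback-to-original
# behavior here corresponds to fix.failure=continue, not the abort default):

```python
import subprocess


def run_fixer(command, data):
    """Return the fixer's stdout on success, else the original bytes."""
    proc = subprocess.Popen(
        command,
        shell=True,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    stdout, stderr = proc.communicate(data)
    if proc.returncode != 0:
        # Non-zero exit: keep the original content.
        return data
    return stdout


print(run_fixer("tr 'a-z' 'A-Z'", b'hello\n'))
# → b'HELLO\n'
```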
def showstderr(ui, rev, fixername, stderr):
    """Writes the lines of the stderr string as warnings on the ui

    Uses the revision number and fixername to give more context to each line of
    the error message. Doesn't include file names, since those take up a lot of
    space and would tend to be included in the error message if they were
    relevant.
    """
    for line in re.split(b'[\r\n]+', stderr):
        if line:
            ui.warn(b'[')
            if rev is None:
                ui.warn(_(b'wdir'), label=b'evolve.rev')
            else:
                ui.warn(b'%d' % rev, label=b'evolve.rev')
            ui.warn(b'] %s: %s\n' % (fixername, line))


def writeworkingdir(repo, ctx, filedata, replacements):
    """Write new content to the working copy and check out the new p1 if any

    We check out a new revision if and only if we fixed something in both the
    working directory and its parent revision. This avoids the need for a full
    update/merge, and means that the working directory simply isn't affected
    unless the --working-dir flag is given.

    Directly updates the dirstate for the affected files.
    """
    assert repo.dirstate.p2() == nullid

    for path, data in pycompat.iteritems(filedata):
        fctx = ctx[path]
        fctx.write(data, fctx.flags())

    oldp1 = repo.dirstate.p1()
    newp1 = replacements.get(oldp1, oldp1)
    if newp1 != oldp1:
        with repo.dirstate.parentchange():
            scmutil.movedirstate(repo, repo[newp1])


def replacerev(ui, repo, ctx, filedata, replacements):
    """Commit a new revision like the given one, but with file content changes

    "ctx" is the original revision to be replaced by a modified one.

    "filedata" is a dict that maps paths to their new file content. All other
    paths will be recreated from the original revision without changes.
    "filedata" may contain paths that didn't exist in the original revision;
    they will be added.

    "replacements" is a dict that maps a single node to a single node, and it is
    updated to indicate the original revision is replaced by the newly created
    one. No entry is added if the replacement's node already exists.

    The new revision has the same parents as the old one, unless those parents
    have already been replaced, in which case those replacements are the parents
    of this new revision. Thus, if revisions are replaced in topological order,
    there is no need to rebase them into the original topology later.
    """

    p1rev, p2rev = repo.changelog.parentrevs(ctx.rev())
    p1ctx, p2ctx = repo[p1rev], repo[p2rev]
    newp1node = replacements.get(p1ctx.node(), p1ctx.node())
    newp2node = replacements.get(p2ctx.node(), p2ctx.node())

    # We don't want to create a revision that has no changes from the original,
    # but we should if the original revision's parent has been replaced.
    # Otherwise, we would produce an orphan that needs no actual human
    # intervention to evolve. We can't rely on commit() to avoid creating the
    # un-needed revision because the extra field added below produces a new hash
    # regardless of file content changes.
    if (
        not filedata
        and p1ctx.node() not in replacements
        and p2ctx.node() not in replacements
    ):
        return

    extra = ctx.extra().copy()
    extra[b'fix_source'] = ctx.hex()

    wctx = context.overlayworkingctx(repo)
    wctx.setbase(repo[newp1node])
    merge.revert_to(ctx, wc=wctx)
    copies.graftcopies(wctx, ctx, ctx.p1())

    for path in filedata.keys():
        fctx = ctx[path]
        copysource = fctx.copysource()
        wctx.write(path, filedata[path], flags=fctx.flags())
        if copysource:
            wctx.markcopied(path, copysource)

    desc = rewriteutil.update_hash_refs(
        repo,
        ctx.description(),
        {oldnode: [newnode] for oldnode, newnode in replacements.items()},
    )

    memctx = wctx.tomemctx(
        text=desc,
        branch=ctx.branch(),
        extra=extra,
        date=ctx.date(),
        parents=(newp1node, newp2node),
        user=ctx.user(),
    )

    sucnode = memctx.commit()
    prenode = ctx.node()
    if prenode == sucnode:
        ui.debug(b'node %s already existed\n' % (ctx.hex()))
    else:
        replacements[ctx.node()] = sucnode


846 def getfixers(ui):
844 def getfixers(ui):
847 """Returns a map of configured fixer tools indexed by their names
845 """Returns a map of configured fixer tools indexed by their names
848
846
849 Each value is a Fixer object with methods that implement the behavior of the
847 Each value is a Fixer object with methods that implement the behavior of the
850 fixer's config suboptions. Does not validate the config values.
848 fixer's config suboptions. Does not validate the config values.
851 """
849 """
852 fixers = {}
850 fixers = {}
853 for name in fixernames(ui):
851 for name in fixernames(ui):
854 enabled = ui.configbool(b'fix', name + b':enabled')
        enabled = ui.configbool(b'fix', name + b':enabled')
        command = ui.config(b'fix', name + b':command')
        pattern = ui.config(b'fix', name + b':pattern')
        linerange = ui.config(b'fix', name + b':linerange')
        priority = ui.configint(b'fix', name + b':priority')
        metadata = ui.configbool(b'fix', name + b':metadata')
        skipclean = ui.configbool(b'fix', name + b':skipclean')
        # Don't use a fixer if it has no pattern configured. It would be
        # dangerous to let it affect all files. It would be pointless to let it
        # affect no files. There is no reasonable subset of files to use as the
        # default.
        if command is None:
            ui.warn(
                _(b'fixer tool has no command configuration: %s\n') % (name,)
            )
        elif pattern is None:
            ui.warn(
                _(b'fixer tool has no pattern configuration: %s\n') % (name,)
            )
        elif not enabled:
            ui.debug(b'ignoring disabled fixer tool: %s\n' % (name,))
        else:
            fixers[name] = Fixer(
                command, pattern, linerange, priority, metadata, skipclean
            )
    return collections.OrderedDict(
        sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
    )

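# The tail of getfixers() above sorts the configured fixers by descending
# :priority before returning them, so higher-priority tools run first. A
# minimal, self-contained sketch of that ordering follows; the fixer names
# and priorities are hypothetical, and FakeFixer stands in for the real
# Fixer class (only the _priority attribute matters here).

```python
import collections


class FakeFixer(object):
    # Stand-in for Fixer: only _priority participates in the sort.
    def __init__(self, priority):
        self._priority = priority


# Hypothetical fixers; a higher priority means the tool runs earlier.
fixers = {
    b'clang-format': FakeFixer(0),
    b'sort-imports': FakeFixer(10),
    b'strip-trailing-ws': FakeFixer(5),
}

ordered = collections.OrderedDict(
    sorted(fixers.items(), key=lambda item: item[1]._priority, reverse=True)
)

print(list(ordered))
# [b'sort-imports', b'strip-trailing-ws', b'clang-format']
```

# Ties in priority fall back to dict iteration order, since sorted() is
# stable; tools that care about relative order should use distinct values.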
def fixernames(ui):
    """Returns the names of [fix] config options that have suboptions"""
    names = set()
    for k, v in ui.configitems(b'fix'):
        if b':' in k:
            names.add(k.split(b':', 1)[0])
    return names

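# fixernames() derives the set of tool names from [fix] suboption keys by
# splitting on the first colon. The sketch below shows that extraction on
# hypothetical (key, value) pairs, shaped like what ui.configitems(b'fix')
# yields; the keys here are illustrative, not from a real configuration.

```python
# Hypothetical [fix] section items; only colon-bearing keys name a fixer.
configitems = [
    (b'clang-format:command', b'clang-format --assume-filename={rootpath}'),
    (b'clang-format:pattern', b'set:**.cpp or **.hpp'),
    (b'maxfilesize', b'2MB'),  # no colon: a plain option, not a fixer suboption
]

names = set()
for k, v in configitems:
    if b':' in k:
        # split(b':', 1) keeps colons inside the suboption name intact.
        names.add(k.split(b':', 1)[0])

print(names)
# {b'clang-format'}
```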
class Fixer(object):
    """Wraps the raw config values for a fixer with methods"""

    def __init__(
        self, command, pattern, linerange, priority, metadata, skipclean
    ):
        self._command = command
        self._pattern = pattern
        self._linerange = linerange
        self._priority = priority
        self._metadata = metadata
        self._skipclean = skipclean

    def affects(self, opts, fixctx, path):
        """Should this fixer run on the file at the given path and context?"""
        repo = fixctx.repo()
        matcher = matchmod.match(
            repo.root, repo.root, [self._pattern], ctx=fixctx
        )
        return matcher(path)

    def shouldoutputmetadata(self):
        """Should the stdout of this fixer start with JSON and a null byte?"""
        return self._metadata

    def command(self, ui, path, ranges):
        """A shell command to use to invoke this fixer on the given file/lines

        May return None if there is no appropriate command to run for the given
        parameters.
        """
        expand = cmdutil.rendercommandtemplate
        parts = [
            expand(
                ui,
                self._command,
                {b'rootpath': path, b'basename': os.path.basename(path)},
            )
        ]
        if self._linerange:
            if self._skipclean and not ranges:
                # No line ranges to fix, so don't run the fixer.
                return None
            for first, last in ranges:
                parts.append(
                    expand(
                        ui, self._linerange, {b'first': first, b'last': last}
                    )
                )
        return b' '.join(parts)
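# Fixer.command() above concatenates the expanded :command template with one
# expanded :linerange argument per dirty line range. A simplified sketch of
# that assembly follows; expand() is a crude stand-in for the real
# cmdutil.rendercommandtemplate, and the file path and ranges are
# hypothetical examples matching the clang-format config from the module
# docstring.

```python
def expand(template, substitutions):
    # Simplified template expansion: substitute each {key} placeholder.
    # The real code delegates this to cmdutil.rendercommandtemplate.
    for key, value in substitutions.items():
        template = template.replace(b'{' + key + b'}', value)
    return template


command = b'clang-format --assume-filename={rootpath}'
linerange = b'--lines={first}:{last}'

# One :command expansion, then one :linerange expansion per dirty range.
parts = [expand(command, {b'rootpath': b'src/main.cpp'})]
for first, last in [(b'10', b'20'), (b'30', b'35')]:
    parts.append(expand(linerange, {b'first': first, b'last': last}))

result = b' '.join(parts)
print(result)
# b'clang-format --assume-filename=src/main.cpp --lines=10:20 --lines=30:35'
```

# With :skipclean enabled and no dirty ranges, the real method returns None
# instead, so an unchanged file never invokes the tool at all.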