repo: repo isolation, do not pass on repo.ui for creating new repos...
Simon Heimberg
r18825:f0564402 default
@@ -1,731 +1,731 @@
# keyword.py - $Keyword$ expansion for Mercurial
#
# Copyright 2007-2012 Christian Ebert <blacktrash@gmx.net>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
#
# $Id$
#
# Keyword expansion hack against the grain of a Distributed SCM
#
# There are many good reasons why this is not needed in a distributed
# SCM, still it may be useful in very small projects based on single
# files (like LaTeX packages), that are mostly addressed to an
# audience not running a version control system.
#
# For in-depth discussion refer to
# <http://mercurial.selenic.com/wiki/KeywordPlan>.
#
# Keyword expansion is based on Mercurial's changeset template mappings.
#
# Binary files are not touched.
#
# Files to act upon/ignore are specified in the [keyword] section.
# Customized keyword template mappings in the [keywordmaps] section.
#
# Run "hg help keyword" and "hg kwdemo" to get info on configuration.

'''expand keywords in tracked files

This extension expands RCS/CVS-like or self-customized $Keywords$ in
tracked text files selected by your configuration.

Keywords are only expanded in local repositories and not stored in the
change history. The mechanism can be regarded as a convenience for the
current user or for archive distribution.

Keywords expand to the changeset data pertaining to the latest change
relative to the working directory parent of each file.

Configuration is done in the [keyword], [keywordset] and [keywordmaps]
sections of hgrc files.

Example::

    [keyword]
    # expand keywords in every python file except those matching "x*"
    **.py =
    x* = ignore

    [keywordset]
    # prefer svn- over cvs-like default keywordmaps
    svn = True

.. note::
   The more specific you are in your filename patterns the less you
   lose speed in huge repositories.

For [keywordmaps] template mapping and expansion demonstration and
control run :hg:`kwdemo`. See :hg:`help templates` for a list of
available templates and filters.

Three additional date template filters are provided:

:``utcdate``:    "2006/09/18 15:13:13"
:``svnutcdate``: "2006-09-18 15:13:13Z"
:``svnisodate``: "2006-09-18 08:13:13 -700 (Mon, 18 Sep 2006)"

The default template mappings (view with :hg:`kwdemo -d`) can be
replaced with customized keywords and templates. Again, run
:hg:`kwdemo` to control the results of your configuration changes.

Before changing/disabling active keywords, you must run :hg:`kwshrink`
to avoid storing expanded keywords in the change history.

To force expansion after enabling it, or a configuration change, run
:hg:`kwexpand`.

Expansions spanning more than one line and incremental expansions,
like CVS' $Log$, are not supported. A keyword template map "Log =
{desc}" expands to the first line of the changeset description.
'''

from mercurial import commands, context, cmdutil, dispatch, filelog, extensions
from mercurial import localrepo, match, patch, templatefilters, templater, util
from mercurial import scmutil
from mercurial.hgweb import webcommands
from mercurial.i18n import _
import os, re, shutil, tempfile

commands.optionalrepo += ' kwdemo'
commands.inferrepo += ' kwexpand kwfiles kwshrink'

cmdtable = {}
command = cmdutil.command(cmdtable)
testedwith = 'internal'

# hg commands that do not act on keywords
nokwcommands = ('add addremove annotate bundle export grep incoming init log'
                ' outgoing push tip verify convert email glog')

# hg commands that trigger expansion only when writing to working dir,
# not when reading filelog, and unexpand when reading from working dir
restricted = 'merge kwexpand kwshrink record qrecord resolve transplant'

# names of extensions using dorecord
recordextensions = 'record'

colortable = {
    'kwfiles.enabled': 'green bold',
    'kwfiles.deleted': 'cyan bold underline',
    'kwfiles.enabledunknown': 'green',
    'kwfiles.ignored': 'bold',
    'kwfiles.ignoredunknown': 'none'
}

# date like in cvs' $Date
def utcdate(text):
    ''':utcdate: Date. Returns a UTC-date in this format: "2009/08/18 11:00:13".
    '''
    return util.datestr((util.parsedate(text)[0], 0), '%Y/%m/%d %H:%M:%S')
# date like in svn's $Date
def svnisodate(text):
    ''':svnisodate: Date. Returns a date in this format: "2009-08-18 13:00:13
    +0200 (Tue, 18 Aug 2009)".
    '''
    return util.datestr(text, '%Y-%m-%d %H:%M:%S %1%2 (%a, %d %b %Y)')
# date like in svn's $Id
def svnutcdate(text):
    ''':svnutcdate: Date. Returns a UTC-date in this format: "2009-08-18
    11:00:13Z".
    '''
    return util.datestr((util.parsedate(text)[0], 0), '%Y-%m-%d %H:%M:%SZ')

templatefilters.filters.update({'utcdate': utcdate,
                                'svnisodate': svnisodate,
                                'svnutcdate': svnutcdate})
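The three filters above only differ in formatting; each takes a Mercurial date (a `(unixtime, tz-offset)` pair) and renders it in one fixed layout. As a rough stdlib-only sketch of what each produces (`util.datestr`/`util.parsedate` are Mercurial internals, so plain `time` functions stand in here, and the `*_sketch` names are hypothetical):

```python
import time

# Mercurial stores dates as (unixtime, offset-seconds-west-of-UTC).
# Sample: 2009-08-18 11:00:13 UTC committed from a +0200 timezone.
date = (1250593213, -7200)

def utcdate_sketch(date):
    # cvs-style: UTC timestamp with slashes, offset discarded
    return time.strftime('%Y/%m/%d %H:%M:%S', time.gmtime(date[0]))

def svnutcdate_sketch(date):
    # svn $Id$-style: same instant, dashes and a trailing Z
    return time.strftime('%Y-%m-%d %H:%M:%SZ', time.gmtime(date[0]))

def svnisodate_sketch(date):
    # svn $Date$-style: local time plus the numeric UTC offset
    local = time.gmtime(date[0] - date[1])
    sign = '+' if date[1] <= 0 else '-'
    minutes = abs(date[1]) // 60
    return time.strftime('%Y-%m-%d %H:%M:%S', local) + \
        ' %s%02d%02d (%s)' % (sign, minutes // 60, minutes % 60,
                              time.strftime('%a, %d %b %Y', local))

print(utcdate_sketch(date))     # 2009/08/18 11:00:13
print(svnutcdate_sketch(date))  # 2009-08-18 11:00:13Z
print(svnisodate_sketch(date))  # 2009-08-18 13:00:13 +0200 (Tue, 18 Aug 2009)
```

The outputs match the example strings in the filter docstrings above.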

# make keyword tools accessible
kwtools = {'templater': None, 'hgcmd': ''}

def _defaultkwmaps(ui):
    '''Returns default keywordmaps according to keywordset configuration.'''
    templates = {
        'Revision': '{node|short}',
        'Author': '{author|user}',
    }
    kwsets = ({
        'Date': '{date|utcdate}',
        'RCSfile': '{file|basename},v',
        'RCSFile': '{file|basename},v', # kept for backwards compatibility
                                        # with hg-keyword
        'Source': '{root}/{file},v',
        'Id': '{file|basename},v {node|short} {date|utcdate} {author|user}',
        'Header': '{root}/{file},v {node|short} {date|utcdate} {author|user}',
    }, {
        'Date': '{date|svnisodate}',
        'Id': '{file|basename},v {node|short} {date|svnutcdate} {author|user}',
        'LastChangedRevision': '{node|short}',
        'LastChangedBy': '{author|user}',
        'LastChangedDate': '{date|svnisodate}',
    })
    templates.update(kwsets[ui.configbool('keywordset', 'svn')])
    return templates
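The cvs-or-svn selection on the last line relies on `bool` being a subclass of `int` in Python: `kwsets` is a 2-tuple, and the `configbool` result indexes it as 0 (cvs-like) or 1 (svn-like). A minimal standalone sketch of that selection, with a fake `svn` flag standing in for `ui.configbool('keywordset', 'svn')`:

```python
# templates starts with the maps common to both keyword sets
templates = {'Revision': '{node|short}', 'Author': '{author|user}'}

# 2-tuple: index 0 is the cvs-like set, index 1 the svn-like set
kwsets = ({'Date': '{date|utcdate}'},
          {'Date': '{date|svnisodate}'})

svn = True  # hypothetical stand-in for ui.configbool('keywordset', 'svn')
templates.update(kwsets[svn])  # True indexes as 1, False as 0
print(templates['Date'])  # {date|svnisodate}
```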

def _shrinktext(text, subfunc):
    '''Helper for keyword expansion removal in text.
    Depending on subfunc also returns number of substitutions.'''
    return subfunc(r'$\1$', text)

def _preselect(wstatus, changed):
    '''Retrieves modified and added files from a working directory state
    and returns the subset of each contained in given changed files
    retrieved from a change context.'''
    modified, added = wstatus[:2]
    modified = [f for f in modified if f in changed]
    added = [f for f in added if f in changed]
    return modified, added


class kwtemplater(object):
    '''
    Sets up keyword templates, corresponding keyword regex, and
    provides keyword substitution functions.
    '''

    def __init__(self, ui, repo, inc, exc):
        self.ui = ui
        self.repo = repo
        self.match = match.match(repo.root, '', [], inc, exc)
        self.restrict = kwtools['hgcmd'] in restricted.split()
        self.postcommit = False

        kwmaps = self.ui.configitems('keywordmaps')
        if kwmaps: # override default templates
            self.templates = dict((k, templater.parsestring(v, False))
                                  for k, v in kwmaps)
        else:
            self.templates = _defaultkwmaps(self.ui)

    @util.propertycache
    def escape(self):
        '''Returns bar-separated and escaped keywords.'''
        return '|'.join(map(re.escape, self.templates.keys()))

    @util.propertycache
    def rekw(self):
        '''Returns regex for unexpanded keywords.'''
        return re.compile(r'\$(%s)\$' % self.escape)

    @util.propertycache
    def rekwexp(self):
        '''Returns regex for expanded keywords.'''
        return re.compile(r'\$(%s): [^$\n\r]*? \$' % self.escape)
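The two cached patterns above drive both directions of the substitution: `rekw` matches the bare `$Keyword$` form (expansion target), while `rekwexp` matches the filled-in `$Keyword: value $` form (shrink target). A self-contained sketch of the round trip, with a fixed keyword list in place of `self.templates` and a faked expansion value (the real value comes from the changeset templater):

```python
import re

keywords = ['Id', 'Date']
escape = '|'.join(map(re.escape, keywords))
rekw = re.compile(r'\$(%s)\$' % escape)                   # unexpanded: $Id$
rekwexp = re.compile(r'\$(%s): [^$\n\r]*? \$' % escape)   # expanded: $Id: ... $

text = 'stamp: $Id$\n'
# expansion rewrites '$kw$' into '$kw: value $' (value faked here)
expanded = rekw.sub(lambda m: '$%s: demo.txt,v deadbeef $' % m.group(1), text)
print(expanded)  # stamp: $Id: demo.txt,v deadbeef $

# shrinking is what _shrinktext does: subn collapses it back and counts hits
shrunk, n = rekwexp.subn(r'$\1$', expanded)
print(shrunk, n)  # stamp: $Id$  1
```

The non-greedy `[^$\n\r]*?` keeps the expanded match on one line and stops at the first closing ` $`, which is why multi-line expansions like CVS' $Log$ are out of scope.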

    def substitute(self, data, path, ctx, subfunc):
        '''Replaces keywords in data with expanded template.'''
        def kwsub(mobj):
            kw = mobj.group(1)
            ct = cmdutil.changeset_templater(self.ui, self.repo,
                                             False, None, '', False)
            ct.use_template(self.templates[kw])
            self.ui.pushbuffer()
            ct.show(ctx, root=self.repo.root, file=path)
            ekw = templatefilters.firstline(self.ui.popbuffer())
            return '$%s: %s $' % (kw, ekw)
        return subfunc(kwsub, data)

    def linkctx(self, path, fileid):
        '''Similar to filelog.linkrev, but returns a changectx.'''
        return self.repo.filectx(path, fileid=fileid).changectx()

    def expand(self, path, node, data):
        '''Returns data with keywords expanded.'''
        if not self.restrict and self.match(path) and not util.binary(data):
            ctx = self.linkctx(path, node)
            return self.substitute(data, path, ctx, self.rekw.sub)
        return data

    def iskwfile(self, cand, ctx):
        '''Returns subset of candidates which are configured for keyword
        expansion but are not symbolic links.'''
        return [f for f in cand if self.match(f) and 'l' not in ctx.flags(f)]

    def overwrite(self, ctx, candidates, lookup, expand, rekw=False):
        '''Overwrites selected files expanding/shrinking keywords.'''
        if self.restrict or lookup or self.postcommit: # exclude kw_copy
            candidates = self.iskwfile(candidates, ctx)
        if not candidates:
            return
        kwcmd = self.restrict and lookup # kwexpand/kwshrink
        if self.restrict or expand and lookup:
            mf = ctx.manifest()
        if self.restrict or rekw:
            re_kw = self.rekw
        else:
            re_kw = self.rekwexp
        if expand:
            msg = _('overwriting %s expanding keywords\n')
        else:
            msg = _('overwriting %s shrinking keywords\n')
        for f in candidates:
            if self.restrict:
                data = self.repo.file(f).read(mf[f])
            else:
                data = self.repo.wread(f)
            if util.binary(data):
                continue
            if expand:
                if lookup:
                    ctx = self.linkctx(f, mf[f])
                data, found = self.substitute(data, f, ctx, re_kw.subn)
            elif self.restrict:
                found = re_kw.search(data)
            else:
                data, found = _shrinktext(data, re_kw.subn)
            if found:
                self.ui.note(msg % f)
                fp = self.repo.wopener(f, "wb", atomictemp=True)
                fp.write(data)
                fp.close()
                if kwcmd:
                    self.repo.dirstate.normal(f)
                elif self.postcommit:
                    self.repo.dirstate.normallookup(f)

    def shrink(self, fname, text):
        '''Returns text with all keyword substitutions removed.'''
        if self.match(fname) and not util.binary(text):
            return _shrinktext(text, self.rekwexp.sub)
        return text

    def shrinklines(self, fname, lines):
        '''Returns lines with keyword substitutions removed.'''
        if self.match(fname):
            text = ''.join(lines)
            if not util.binary(text):
                return _shrinktext(text, self.rekwexp.sub).splitlines(True)
        return lines

    def wread(self, fname, data):
        '''If in restricted mode returns data read from wdir with
        keyword substitutions removed.'''
        if self.restrict:
            return self.shrink(fname, data)
        return data

class kwfilelog(filelog.filelog):
    '''
    Subclass of filelog to hook into its read, add, cmp methods.
    Keywords are "stored" unexpanded, and processed on reading.
    '''
    def __init__(self, opener, kwt, path):
        super(kwfilelog, self).__init__(opener, path)
        self.kwt = kwt
        self.path = path

    def read(self, node):
        '''Expands keywords when reading filelog.'''
        data = super(kwfilelog, self).read(node)
        if self.renamed(node):
            return data
        return self.kwt.expand(self.path, node, data)

    def add(self, text, meta, tr, link, p1=None, p2=None):
        '''Removes keyword substitutions when adding to filelog.'''
        text = self.kwt.shrink(self.path, text)
        return super(kwfilelog, self).add(text, meta, tr, link, p1, p2)

    def cmp(self, node, text):
        '''Removes keyword substitutions for comparison.'''
        text = self.kwt.shrink(self.path, text)
        return super(kwfilelog, self).cmp(node, text)
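The kwfilelog contract is the heart of the extension: history only ever holds the bare `$Keyword$` form, because `add` shrinks on the way in and `read` expands on the way out. That invariant can be sketched with a toy store (`ToyStore` is purely illustrative, not Mercurial's filelog API, and the expansion value is faked):

```python
import re

REKW = re.compile(r'\$(Id)\$')                  # unexpanded form
REKWEXP = re.compile(r'\$(Id): [^$\n\r]*? \$')  # expanded form

class ToyStore(object):
    def __init__(self):
        self.revs = []

    def add(self, text):
        # like kwfilelog.add -> kwt.shrink: store unexpanded
        self.revs.append(REKWEXP.sub(r'$\1$', text))
        return len(self.revs) - 1

    def read(self, rev):
        # like kwfilelog.read -> kwt.expand: expand on the way out
        return REKW.sub(lambda m: '$%s: demo,v abc123 $' % m.group(1),
                        self.revs[rev])

store = ToyStore()
rev = store.add('$Id: demo,v oldvalue $\n')
assert store.revs[rev] == '$Id$\n'  # history holds only the shrunk form
print(store.read(rev))              # $Id: demo,v abc123 $
```

Note how a stale expansion committed by accident (`oldvalue`) is normalized away on `add`, so every reader sees an expansion computed from current changeset data.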

def _status(ui, repo, wctx, kwt, *pats, **opts):
    '''Bails out if [keyword] configuration is not active.
    Returns status of working directory.'''
    if kwt:
        return repo.status(match=scmutil.match(wctx, pats, opts), clean=True,
                           unknown=opts.get('unknown') or opts.get('all'))
    if ui.configitems('keyword'):
        raise util.Abort(_('[keyword] patterns cannot match'))
    raise util.Abort(_('no [keyword] patterns configured'))

def _kwfwrite(ui, repo, expand, *pats, **opts):
    '''Selects files and passes them to kwtemplater.overwrite.'''
    wctx = repo[None]
    if len(wctx.parents()) > 1:
        raise util.Abort(_('outstanding uncommitted merge'))
    kwt = kwtools['templater']
    wlock = repo.wlock()
    try:
        status = _status(ui, repo, wctx, kwt, *pats, **opts)
        modified, added, removed, deleted, unknown, ignored, clean = status
        if modified or added or removed or deleted:
            raise util.Abort(_('outstanding uncommitted changes'))
        kwt.overwrite(wctx, clean, True, expand)
    finally:
        wlock.release()

@command('kwdemo',
         [('d', 'default', None, _('show default keyword template maps')),
          ('f', 'rcfile', '',
           _('read maps from rcfile'), _('FILE'))],
         _('hg kwdemo [-d] [-f RCFILE] [TEMPLATEMAP]...'))
def demo(ui, repo, *args, **opts):
    '''print [keywordmaps] configuration and an expansion example

    Show current, custom, or default keyword template maps and their
    expansions.

    Extend the current configuration by specifying maps as arguments
    and using -f/--rcfile to source an external hgrc file.

    Use -d/--default to disable current configuration.

    See :hg:`help templates` for information on templates and filters.
    '''
    def demoitems(section, items):
        ui.write('[%s]\n' % section)
        for k, v in sorted(items):
            ui.write('%s = %s\n' % (k, v))

    fn = 'demo.txt'
    tmpdir = tempfile.mkdtemp('', 'kwdemo.')
    ui.note(_('creating temporary repository at %s\n') % tmpdir)
-   repo = localrepo.localrepository(ui, tmpdir, True)
+   repo = localrepo.localrepository(repo.baseui, tmpdir, True)
    ui.setconfig('keyword', fn, '')
    svn = ui.configbool('keywordset', 'svn')
    # explicitly set keywordset for demo output
    ui.setconfig('keywordset', 'svn', svn)

    uikwmaps = ui.configitems('keywordmaps')
    if args or opts.get('rcfile'):
        ui.status(_('\n\tconfiguration using custom keyword template maps\n'))
        if uikwmaps:
            ui.status(_('\textending current template maps\n'))
        if opts.get('default') or not uikwmaps:
            if svn:
                ui.status(_('\toverriding default svn keywordset\n'))
            else:
                ui.status(_('\toverriding default cvs keywordset\n'))
        if opts.get('rcfile'):
            ui.readconfig(opts.get('rcfile'))
        if args:
            # simulate hgrc parsing
            rcmaps = ['[keywordmaps]\n'] + [a + '\n' for a in args]
            fp = repo.opener('hgrc', 'w')
            fp.writelines(rcmaps)
            fp.close()
            ui.readconfig(repo.join('hgrc'))
        kwmaps = dict(ui.configitems('keywordmaps'))
    elif opts.get('default'):
        if svn:
            ui.status(_('\n\tconfiguration using default svn keywordset\n'))
        else:
            ui.status(_('\n\tconfiguration using default cvs keywordset\n'))
        kwmaps = _defaultkwmaps(ui)
        if uikwmaps:
            ui.status(_('\tdisabling current template maps\n'))
            for k, v in kwmaps.iteritems():
                ui.setconfig('keywordmaps', k, v)
    else:
        ui.status(_('\n\tconfiguration using current keyword template maps\n'))
        if uikwmaps:
            kwmaps = dict(uikwmaps)
        else:
            kwmaps = _defaultkwmaps(ui)

    uisetup(ui)
    reposetup(ui, repo)
    ui.write('[extensions]\nkeyword =\n')
    demoitems('keyword', ui.configitems('keyword'))
    demoitems('keywordset', ui.configitems('keywordset'))
    demoitems('keywordmaps', kwmaps.iteritems())
    keywords = '$' + '$\n$'.join(sorted(kwmaps.keys())) + '$\n'
    repo.wopener.write(fn, keywords)
    repo[None].add([fn])
    ui.note(_('\nkeywords written to %s:\n') % fn)
    ui.note(keywords)
    repo.dirstate.setbranch('demobranch')
    for name, cmd in ui.configitems('hooks'):
        if name.split('.', 1)[0].find('commit') > -1:
            repo.ui.setconfig('hooks', name, '')
    msg = _('hg keyword configuration and expansion example')
    ui.note("hg ci -m '%s'\n" % msg) # check-code-ignore
    repo.commit(text=msg)
    ui.status(_('\n\tkeywords expanded\n'))
    ui.write(repo.wread(fn))
    shutil.rmtree(tmpdir, ignore_errors=True)

@command('kwexpand', commands.walkopts, _('hg kwexpand [OPTION]... [FILE]...'))
def expand(ui, repo, *pats, **opts):
    '''expand keywords in the working directory

    Run after (re)enabling keyword expansion.

    kwexpand refuses to run if given files contain local changes.
    '''
    # 3rd argument sets expansion to True
    _kwfwrite(ui, repo, True, *pats, **opts)

@command('kwfiles',
         [('A', 'all', None, _('show keyword status flags of all files')),
          ('i', 'ignore', None, _('show files excluded from expansion')),
          ('u', 'unknown', None, _('only show unknown (not tracked) files')),
         ] + commands.walkopts,
         _('hg kwfiles [OPTION]... [FILE]...'))
def files(ui, repo, *pats, **opts):
469 def files(ui, repo, *pats, **opts):
470 '''show files configured for keyword expansion
470 '''show files configured for keyword expansion
471
471
472 List which files in the working directory are matched by the
472 List which files in the working directory are matched by the
473 [keyword] configuration patterns.
473 [keyword] configuration patterns.
474
474
475 Useful to prevent inadvertent keyword expansion and to speed up
475 Useful to prevent inadvertent keyword expansion and to speed up
476 execution by including only files that are actual candidates for
476 execution by including only files that are actual candidates for
477 expansion.
477 expansion.
478
478
479 See :hg:`help keyword` on how to construct patterns both for
479 See :hg:`help keyword` on how to construct patterns both for
480 inclusion and exclusion of files.
480 inclusion and exclusion of files.
481
481
482 With -A/--all and -v/--verbose the codes used to show the status
482 With -A/--all and -v/--verbose the codes used to show the status
483 of files are::
483 of files are::
484
484
485 K = keyword expansion candidate
485 K = keyword expansion candidate
486 k = keyword expansion candidate (not tracked)
486 k = keyword expansion candidate (not tracked)
487 I = ignored
487 I = ignored
488 i = ignored (not tracked)
488 i = ignored (not tracked)
489 '''
489 '''
490 kwt = kwtools['templater']
490 kwt = kwtools['templater']
491 wctx = repo[None]
491 wctx = repo[None]
492 status = _status(ui, repo, wctx, kwt, *pats, **opts)
492 status = _status(ui, repo, wctx, kwt, *pats, **opts)
493 cwd = pats and repo.getcwd() or ''
493 cwd = pats and repo.getcwd() or ''
494 modified, added, removed, deleted, unknown, ignored, clean = status
494 modified, added, removed, deleted, unknown, ignored, clean = status
495 files = []
495 files = []
496 if not opts.get('unknown') or opts.get('all'):
496 if not opts.get('unknown') or opts.get('all'):
497 files = sorted(modified + added + clean)
497 files = sorted(modified + added + clean)
498 kwfiles = kwt.iskwfile(files, wctx)
498 kwfiles = kwt.iskwfile(files, wctx)
499 kwdeleted = kwt.iskwfile(deleted, wctx)
499 kwdeleted = kwt.iskwfile(deleted, wctx)
500 kwunknown = kwt.iskwfile(unknown, wctx)
500 kwunknown = kwt.iskwfile(unknown, wctx)
501 if not opts.get('ignore') or opts.get('all'):
501 if not opts.get('ignore') or opts.get('all'):
502 showfiles = kwfiles, kwdeleted, kwunknown
502 showfiles = kwfiles, kwdeleted, kwunknown
503 else:
503 else:
504 showfiles = [], [], []
504 showfiles = [], [], []
505 if opts.get('all') or opts.get('ignore'):
505 if opts.get('all') or opts.get('ignore'):
506 showfiles += ([f for f in files if f not in kwfiles],
506 showfiles += ([f for f in files if f not in kwfiles],
507 [f for f in unknown if f not in kwunknown])
507 [f for f in unknown if f not in kwunknown])
508 kwlabels = 'enabled deleted enabledunknown ignored ignoredunknown'.split()
508 kwlabels = 'enabled deleted enabledunknown ignored ignoredunknown'.split()
509 kwstates = zip(kwlabels, 'K!kIi', showfiles)
509 kwstates = zip(kwlabels, 'K!kIi', showfiles)
510 fm = ui.formatter('kwfiles', opts)
510 fm = ui.formatter('kwfiles', opts)
511 fmt = '%.0s%s\n'
511 fmt = '%.0s%s\n'
512 if opts.get('all') or ui.verbose:
512 if opts.get('all') or ui.verbose:
513 fmt = '%s %s\n'
513 fmt = '%s %s\n'
514 for kwstate, char, filenames in kwstates:
514 for kwstate, char, filenames in kwstates:
515 label = 'kwfiles.' + kwstate
515 label = 'kwfiles.' + kwstate
516 for f in filenames:
516 for f in filenames:
517 fm.startitem()
517 fm.startitem()
518 fm.write('kwstatus path', fmt, char,
518 fm.write('kwstatus path', fmt, char,
519 repo.pathto(f, cwd), label=label)
519 repo.pathto(f, cwd), label=label)
520 fm.end()
520 fm.end()
521
521
522 @command('kwshrink', commands.walkopts, _('hg kwshrink [OPTION]... [FILE]...'))
522 @command('kwshrink', commands.walkopts, _('hg kwshrink [OPTION]... [FILE]...'))
523 def shrink(ui, repo, *pats, **opts):
523 def shrink(ui, repo, *pats, **opts):
524 '''revert expanded keywords in the working directory
524 '''revert expanded keywords in the working directory
525
525
526 Must be run before changing/disabling active keywords.
526 Must be run before changing/disabling active keywords.
527
527
528 kwshrink refuses to run if given files contain local changes.
528 kwshrink refuses to run if given files contain local changes.
529 '''
529 '''
530 # 3rd argument sets expansion to False
530 # 3rd argument sets expansion to False
531 _kwfwrite(ui, repo, False, *pats, **opts)
531 _kwfwrite(ui, repo, False, *pats, **opts)
532
532
533
533
534 def uisetup(ui):
534 def uisetup(ui):
535 ''' Monkeypatches dispatch._parse to retrieve user command.'''
535 ''' Monkeypatches dispatch._parse to retrieve user command.'''
536
536
537 def kwdispatch_parse(orig, ui, args):
537 def kwdispatch_parse(orig, ui, args):
538 '''Monkeypatch dispatch._parse to obtain running hg command.'''
538 '''Monkeypatch dispatch._parse to obtain running hg command.'''
539 cmd, func, args, options, cmdoptions = orig(ui, args)
539 cmd, func, args, options, cmdoptions = orig(ui, args)
540 kwtools['hgcmd'] = cmd
540 kwtools['hgcmd'] = cmd
541 return cmd, func, args, options, cmdoptions
541 return cmd, func, args, options, cmdoptions
542
542
543 extensions.wrapfunction(dispatch, '_parse', kwdispatch_parse)
543 extensions.wrapfunction(dispatch, '_parse', kwdispatch_parse)
544
544
545 def reposetup(ui, repo):
545 def reposetup(ui, repo):
546 '''Sets up repo as kwrepo for keyword substitution.
546 '''Sets up repo as kwrepo for keyword substitution.
547 Overrides file method to return kwfilelog instead of filelog
547 Overrides file method to return kwfilelog instead of filelog
548 if file matches user configuration.
548 if file matches user configuration.
549 Wraps commit to overwrite configured files with updated
549 Wraps commit to overwrite configured files with updated
550 keyword substitutions.
550 keyword substitutions.
551 Monkeypatches patch and webcommands.'''
551 Monkeypatches patch and webcommands.'''
552
552
553 try:
553 try:
554 if (not repo.local() or kwtools['hgcmd'] in nokwcommands.split()
554 if (not repo.local() or kwtools['hgcmd'] in nokwcommands.split()
555 or '.hg' in util.splitpath(repo.root)
555 or '.hg' in util.splitpath(repo.root)
556 or repo._url.startswith('bundle:')):
556 or repo._url.startswith('bundle:')):
557 return
557 return
558 except AttributeError:
558 except AttributeError:
559 pass
559 pass
560
560
561 inc, exc = [], ['.hg*']
561 inc, exc = [], ['.hg*']
562 for pat, opt in ui.configitems('keyword'):
562 for pat, opt in ui.configitems('keyword'):
563 if opt != 'ignore':
563 if opt != 'ignore':
564 inc.append(pat)
564 inc.append(pat)
565 else:
565 else:
566 exc.append(pat)
566 exc.append(pat)
567 if not inc:
567 if not inc:
568 return
568 return
569
569
570 kwtools['templater'] = kwt = kwtemplater(ui, repo, inc, exc)
570 kwtools['templater'] = kwt = kwtemplater(ui, repo, inc, exc)
571
571
572 class kwrepo(repo.__class__):
572 class kwrepo(repo.__class__):
573 def file(self, f):
573 def file(self, f):
574 if f[0] == '/':
574 if f[0] == '/':
575 f = f[1:]
575 f = f[1:]
576 return kwfilelog(self.sopener, kwt, f)
576 return kwfilelog(self.sopener, kwt, f)
577
577
578 def wread(self, filename):
578 def wread(self, filename):
579 data = super(kwrepo, self).wread(filename)
579 data = super(kwrepo, self).wread(filename)
580 return kwt.wread(filename, data)
580 return kwt.wread(filename, data)
581
581
582 def commit(self, *args, **opts):
582 def commit(self, *args, **opts):
583 # use custom commitctx for user commands
583 # use custom commitctx for user commands
584 # other extensions can still wrap repo.commitctx directly
584 # other extensions can still wrap repo.commitctx directly
585 self.commitctx = self.kwcommitctx
585 self.commitctx = self.kwcommitctx
586 try:
586 try:
587 return super(kwrepo, self).commit(*args, **opts)
587 return super(kwrepo, self).commit(*args, **opts)
588 finally:
588 finally:
589 del self.commitctx
589 del self.commitctx
590
590
591 def kwcommitctx(self, ctx, error=False):
591 def kwcommitctx(self, ctx, error=False):
592 n = super(kwrepo, self).commitctx(ctx, error)
592 n = super(kwrepo, self).commitctx(ctx, error)
593 # no lock needed, only called from repo.commit() which already locks
593 # no lock needed, only called from repo.commit() which already locks
594 if not kwt.postcommit:
594 if not kwt.postcommit:
595 restrict = kwt.restrict
595 restrict = kwt.restrict
596 kwt.restrict = True
596 kwt.restrict = True
597 kwt.overwrite(self[n], sorted(ctx.added() + ctx.modified()),
597 kwt.overwrite(self[n], sorted(ctx.added() + ctx.modified()),
598 False, True)
598 False, True)
599 kwt.restrict = restrict
599 kwt.restrict = restrict
600 return n
600 return n
601
601
602 def rollback(self, dryrun=False, force=False):
602 def rollback(self, dryrun=False, force=False):
603 wlock = self.wlock()
603 wlock = self.wlock()
604 try:
604 try:
605 if not dryrun:
605 if not dryrun:
606 changed = self['.'].files()
606 changed = self['.'].files()
607 ret = super(kwrepo, self).rollback(dryrun, force)
607 ret = super(kwrepo, self).rollback(dryrun, force)
608 if not dryrun:
608 if not dryrun:
609 ctx = self['.']
609 ctx = self['.']
610 modified, added = _preselect(self[None].status(), changed)
610 modified, added = _preselect(self[None].status(), changed)
611 kwt.overwrite(ctx, modified, True, True)
611 kwt.overwrite(ctx, modified, True, True)
612 kwt.overwrite(ctx, added, True, False)
612 kwt.overwrite(ctx, added, True, False)
613 return ret
613 return ret
614 finally:
614 finally:
615 wlock.release()
615 wlock.release()
616
616
617 # monkeypatches
617 # monkeypatches
618 def kwpatchfile_init(orig, self, ui, gp, backend, store, eolmode=None):
618 def kwpatchfile_init(orig, self, ui, gp, backend, store, eolmode=None):
619 '''Monkeypatch/wrap patch.patchfile.__init__ to avoid
619 '''Monkeypatch/wrap patch.patchfile.__init__ to avoid
620 rejects or conflicts due to expanded keywords in working dir.'''
620 rejects or conflicts due to expanded keywords in working dir.'''
621 orig(self, ui, gp, backend, store, eolmode)
621 orig(self, ui, gp, backend, store, eolmode)
622 # shrink keywords read from working dir
622 # shrink keywords read from working dir
623 self.lines = kwt.shrinklines(self.fname, self.lines)
623 self.lines = kwt.shrinklines(self.fname, self.lines)
624
624
625 def kw_diff(orig, repo, node1=None, node2=None, match=None, changes=None,
625 def kw_diff(orig, repo, node1=None, node2=None, match=None, changes=None,
626 opts=None, prefix=''):
626 opts=None, prefix=''):
627 '''Monkeypatch patch.diff to avoid expansion.'''
627 '''Monkeypatch patch.diff to avoid expansion.'''
628 kwt.restrict = True
628 kwt.restrict = True
629 return orig(repo, node1, node2, match, changes, opts, prefix)
629 return orig(repo, node1, node2, match, changes, opts, prefix)
630
630
631 def kwweb_skip(orig, web, req, tmpl):
631 def kwweb_skip(orig, web, req, tmpl):
632 '''Wraps webcommands.x turning off keyword expansion.'''
632 '''Wraps webcommands.x turning off keyword expansion.'''
633 kwt.match = util.never
633 kwt.match = util.never
634 return orig(web, req, tmpl)
634 return orig(web, req, tmpl)
635
635
636 def kw_amend(orig, ui, repo, commitfunc, old, extra, pats, opts):
636 def kw_amend(orig, ui, repo, commitfunc, old, extra, pats, opts):
637 '''Wraps cmdutil.amend expanding keywords after amend.'''
637 '''Wraps cmdutil.amend expanding keywords after amend.'''
638 wlock = repo.wlock()
638 wlock = repo.wlock()
639 try:
639 try:
640 kwt.postcommit = True
640 kwt.postcommit = True
641 newid = orig(ui, repo, commitfunc, old, extra, pats, opts)
641 newid = orig(ui, repo, commitfunc, old, extra, pats, opts)
642 if newid != old.node():
642 if newid != old.node():
643 ctx = repo[newid]
643 ctx = repo[newid]
644 kwt.restrict = True
644 kwt.restrict = True
645 kwt.overwrite(ctx, ctx.files(), False, True)
645 kwt.overwrite(ctx, ctx.files(), False, True)
646 kwt.restrict = False
646 kwt.restrict = False
647 return newid
647 return newid
648 finally:
648 finally:
649 wlock.release()
649 wlock.release()
650
650
651 def kw_copy(orig, ui, repo, pats, opts, rename=False):
651 def kw_copy(orig, ui, repo, pats, opts, rename=False):
652 '''Wraps cmdutil.copy so that copy/rename destinations do not
652 '''Wraps cmdutil.copy so that copy/rename destinations do not
653 contain expanded keywords.
653 contain expanded keywords.
654 Note that the source of a regular file destination may also be a
654 Note that the source of a regular file destination may also be a
655 symlink:
655 symlink:
656 hg cp sym x -> x is symlink
656 hg cp sym x -> x is symlink
657 cp sym x; hg cp -A sym x -> x is file (maybe expanded keywords)
657 cp sym x; hg cp -A sym x -> x is file (maybe expanded keywords)
658 For the latter we have to follow the symlink to find out whether its
658 For the latter we have to follow the symlink to find out whether its
659 target is configured for expansion and we therefore must unexpand the
659 target is configured for expansion and we therefore must unexpand the
660 keywords in the destination.'''
660 keywords in the destination.'''
661 wlock = repo.wlock()
661 wlock = repo.wlock()
662 try:
662 try:
663 orig(ui, repo, pats, opts, rename)
663 orig(ui, repo, pats, opts, rename)
664 if opts.get('dry_run'):
664 if opts.get('dry_run'):
665 return
665 return
666 wctx = repo[None]
666 wctx = repo[None]
667 cwd = repo.getcwd()
667 cwd = repo.getcwd()
668
668
669 def haskwsource(dest):
669 def haskwsource(dest):
670 '''Returns true if dest is a regular file and configured for
670 '''Returns true if dest is a regular file and configured for
671 expansion or a symlink which points to a file configured for
671 expansion or a symlink which points to a file configured for
672 expansion. '''
672 expansion. '''
673 source = repo.dirstate.copied(dest)
673 source = repo.dirstate.copied(dest)
674 if 'l' in wctx.flags(source):
674 if 'l' in wctx.flags(source):
675 source = scmutil.canonpath(repo.root, cwd,
675 source = scmutil.canonpath(repo.root, cwd,
676 os.path.realpath(source))
676 os.path.realpath(source))
677 return kwt.match(source)
677 return kwt.match(source)
678
678
679 candidates = [f for f in repo.dirstate.copies() if
679 candidates = [f for f in repo.dirstate.copies() if
680 'l' not in wctx.flags(f) and haskwsource(f)]
680 'l' not in wctx.flags(f) and haskwsource(f)]
681 kwt.overwrite(wctx, candidates, False, False)
681 kwt.overwrite(wctx, candidates, False, False)
682 finally:
682 finally:
683 wlock.release()
683 wlock.release()
684
684
685 def kw_dorecord(orig, ui, repo, commitfunc, *pats, **opts):
685 def kw_dorecord(orig, ui, repo, commitfunc, *pats, **opts):
686 '''Wraps record.dorecord expanding keywords after recording.'''
686 '''Wraps record.dorecord expanding keywords after recording.'''
687 wlock = repo.wlock()
687 wlock = repo.wlock()
688 try:
688 try:
689 # record returns 0 even when nothing has changed
689 # record returns 0 even when nothing has changed
690 # therefore compare nodes before and after
690 # therefore compare nodes before and after
691 kwt.postcommit = True
691 kwt.postcommit = True
692 ctx = repo['.']
692 ctx = repo['.']
693 wstatus = repo[None].status()
693 wstatus = repo[None].status()
694 ret = orig(ui, repo, commitfunc, *pats, **opts)
694 ret = orig(ui, repo, commitfunc, *pats, **opts)
695 recctx = repo['.']
695 recctx = repo['.']
696 if ctx != recctx:
696 if ctx != recctx:
697 modified, added = _preselect(wstatus, recctx.files())
697 modified, added = _preselect(wstatus, recctx.files())
698 kwt.restrict = False
698 kwt.restrict = False
699 kwt.overwrite(recctx, modified, False, True)
699 kwt.overwrite(recctx, modified, False, True)
700 kwt.overwrite(recctx, added, False, True, True)
700 kwt.overwrite(recctx, added, False, True, True)
701 kwt.restrict = True
701 kwt.restrict = True
702 return ret
702 return ret
703 finally:
703 finally:
704 wlock.release()
704 wlock.release()
705
705
706 def kwfilectx_cmp(orig, self, fctx):
706 def kwfilectx_cmp(orig, self, fctx):
707 # keyword affects data size, comparing wdir and filelog size does
707 # keyword affects data size, comparing wdir and filelog size does
708 # not make sense
708 # not make sense
709 if (fctx._filerev is None and
709 if (fctx._filerev is None and
710 (self._repo._encodefilterpats or
710 (self._repo._encodefilterpats or
711 kwt.match(fctx.path()) and 'l' not in fctx.flags() or
711 kwt.match(fctx.path()) and 'l' not in fctx.flags() or
712 self.size() - 4 == fctx.size()) or
712 self.size() - 4 == fctx.size()) or
713 self.size() == fctx.size()):
713 self.size() == fctx.size()):
714 return self._filelog.cmp(self._filenode, fctx.data())
714 return self._filelog.cmp(self._filenode, fctx.data())
715 return True
715 return True
716
716
717 extensions.wrapfunction(context.filectx, 'cmp', kwfilectx_cmp)
717 extensions.wrapfunction(context.filectx, 'cmp', kwfilectx_cmp)
718 extensions.wrapfunction(patch.patchfile, '__init__', kwpatchfile_init)
718 extensions.wrapfunction(patch.patchfile, '__init__', kwpatchfile_init)
719 extensions.wrapfunction(patch, 'diff', kw_diff)
719 extensions.wrapfunction(patch, 'diff', kw_diff)
720 extensions.wrapfunction(cmdutil, 'amend', kw_amend)
720 extensions.wrapfunction(cmdutil, 'amend', kw_amend)
721 extensions.wrapfunction(cmdutil, 'copy', kw_copy)
721 extensions.wrapfunction(cmdutil, 'copy', kw_copy)
722 for c in 'annotate changeset rev filediff diff'.split():
722 for c in 'annotate changeset rev filediff diff'.split():
723 extensions.wrapfunction(webcommands, c, kwweb_skip)
723 extensions.wrapfunction(webcommands, c, kwweb_skip)
724 for name in recordextensions.split():
724 for name in recordextensions.split():
725 try:
725 try:
726 record = extensions.find(name)
726 record = extensions.find(name)
727 extensions.wrapfunction(record, 'dorecord', kw_dorecord)
727 extensions.wrapfunction(record, 'dorecord', kw_dorecord)
728 except KeyError:
728 except KeyError:
729 pass
729 pass
730
730
731 repo.__class__ = kwrepo
731 repo.__class__ = kwrepo
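uisetup and reposetup above interpose on Mercurial internals through extensions.wrapfunction: the wrapper receives the original callable as its first argument and can record state or adjust behaviour around the call. A minimal self-contained sketch of that pattern (plain Python; the `Dispatch` class and names are illustrative stand-ins, not Mercurial APIs):

```python
# Sketch of the wrapfunction pattern used by uisetup/reposetup above.
# Mimics mercurial.extensions.wrapfunction's behaviour with made-up names.

def wrapfunction(container, funcname, wrapper):
    '''Replace container.funcname so wrapper runs instead, receiving
    the original callable as its first argument.'''
    origfn = getattr(container, funcname)
    def wrap(*args, **kwargs):
        return wrapper(origfn, *args, **kwargs)
    setattr(container, funcname, wrap)
    return origfn

class Dispatch(object):
    def parse(self, args):
        return ('status', args)

kwtools = {}

def kwdispatch_parse(orig, self, args):
    # call through to the original, then stash the running command,
    # just as the real kwdispatch_parse stores kwtools['hgcmd']
    cmd, rest = orig(self, args)
    kwtools['hgcmd'] = cmd
    return cmd, rest

wrapfunction(Dispatch, 'parse', kwdispatch_parse)
d = Dispatch()
result = d.parse(['-v'])
```

After wrapping, `d.parse` still returns its normal result while the wrapper records the command name as a side effect; this is how the extension learns which hg command is running before reposetup decides whether to activate.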
@@ -1,184 +1,184 @@
# Mercurial extension to provide 'hg relink' command
#
# Copyright (C) 2007 Brendan Cully <brendan@kublai.com>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

"""recreates hardlinks between repository clones"""

from mercurial import hg, util
from mercurial.i18n import _
import os, stat

testedwith = 'internal'

def relink(ui, repo, origin=None, **opts):
    """recreate hardlinks between two repositories

    When repositories are cloned locally, their data files will be
    hardlinked so that they only use the space of a single repository.

    Unfortunately, subsequent pulls into either repository will break
    hardlinks for any files touched by the new changesets, even if
    both repositories end up pulling the same changes.

    Similarly, passing --rev to "hg clone" will fail to use any
    hardlinks, falling back to a complete copy of the source
    repository.

    This command lets you recreate those hardlinks and reclaim that
    wasted space.

    This repository will be relinked to share space with ORIGIN, which
    must be on the same local disk. If ORIGIN is omitted, looks for
    "default-relink", then "default", in [paths].

    Do not attempt any read operations on this repository while the
    command is running. (Both repositories will be locked against
    writes.)
    """
    if (not util.safehasattr(util, 'samefile') or
        not util.safehasattr(util, 'samedevice')):
        raise util.Abort(_('hardlinks are not supported on this system'))
-    src = hg.repository(ui, ui.expandpath(origin or 'default-relink',
+    src = hg.repository(repo.baseui, ui.expandpath(origin or 'default-relink',
                                          origin or 'default'))
    ui.status(_('relinking %s to %s\n') % (src.store.path, repo.store.path))
    if repo.root == src.root:
        ui.status(_('there is nothing to relink\n'))
        return

    locallock = repo.lock()
    try:
        remotelock = src.lock()
        try:
            candidates = sorted(collect(src, ui))
            targets = prune(candidates, src.store.path, repo.store.path, ui)
            do_relink(src.store.path, repo.store.path, targets, ui)
        finally:
            remotelock.release()
    finally:
        locallock.release()

def collect(src, ui):
    seplen = len(os.path.sep)
    candidates = []
    live = len(src['tip'].manifest())
    # Your average repository has some files which were deleted before
    # the tip revision. We account for that by assuming that there are
    # 3 tracked files for every 2 live files as of the tip version of
    # the repository.
    #
    # mozilla-central as of 2010-06-10 had a ratio of just over 7:5.
    total = live * 3 // 2
    src = src.store.path
    pos = 0
    ui.status(_("tip has %d files, estimated total number of files: %s\n")
              % (live, total))
    for dirpath, dirnames, filenames in os.walk(src):
        dirnames.sort()
        relpath = dirpath[len(src) + seplen:]
        for filename in sorted(filenames):
            if filename[-2:] not in ('.d', '.i'):
                continue
            st = os.stat(os.path.join(dirpath, filename))
            if not stat.S_ISREG(st.st_mode):
                continue
            pos += 1
            candidates.append((os.path.join(relpath, filename), st))
            ui.progress(_('collecting'), pos, filename, _('files'), total)

    ui.progress(_('collecting'), None)
    ui.status(_('collected %d candidate storage files\n') % len(candidates))
    return candidates

def prune(candidates, src, dst, ui):
    def linkfilter(src, dst, st):
        try:
            ts = os.stat(dst)
        except OSError:
            # Destination doesn't have this file?
            return False
        if util.samefile(src, dst):
            return False
        if not util.samedevice(src, dst):
            # No point in continuing
            raise util.Abort(
                _('source and destination are on different devices'))
        if st.st_size != ts.st_size:
            return False
        return st

    targets = []
    total = len(candidates)
    pos = 0
    for fn, st in candidates:
        pos += 1
        srcpath = os.path.join(src, fn)
        tgt = os.path.join(dst, fn)
        ts = linkfilter(srcpath, tgt, st)
        if not ts:
            ui.debug('not linkable: %s\n' % fn)
            continue
        targets.append((fn, ts.st_size))
        ui.progress(_('pruning'), pos, fn, _('files'), total)

    ui.progress(_('pruning'), None)
    ui.status(_('pruned down to %d probably relinkable files\n') % len(targets))
    return targets

def do_relink(src, dst, files, ui):
    def relinkfile(src, dst):
        bak = dst + '.bak'
        os.rename(dst, bak)
        try:
            util.oslink(src, dst)
        except OSError:
            os.rename(bak, dst)
            raise
        os.remove(bak)

    CHUNKLEN = 65536
    relinked = 0
    savedbytes = 0

    pos = 0
    total = len(files)
    for f, sz in files:
        pos += 1
        source = os.path.join(src, f)
        tgt = os.path.join(dst, f)
        # Binary mode, so that read() works correctly, especially on Windows
        sfp = file(source, 'rb')
        dfp = file(tgt, 'rb')
        sin = sfp.read(CHUNKLEN)
        while sin:
            din = dfp.read(CHUNKLEN)
            if sin != din:
                break
            sin = sfp.read(CHUNKLEN)
        sfp.close()
        dfp.close()
        if sin:
            ui.debug('not linkable: %s\n' % f)
            continue
        try:
            relinkfile(source, tgt)
            ui.progress(_('relinking'), pos, f, _('files'), total)
            relinked += 1
            savedbytes += sz
        except OSError, inst:
            ui.warn('%s: %s\n' % (tgt, str(inst)))

    ui.progress(_('relinking'), None)

    ui.status(_('relinked %d files (%s reclaimed)\n') %
              (relinked, util.bytecount(savedbytes)))

cmdtable = {
    'relink': (
        relink,
        [],
        _('[ORIGIN]')
    )
}
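The core of do_relink above is two steps: confirm the source and destination files are byte-identical by comparing CHUNKLEN-sized reads, then swap the destination for a hardlink, keeping a .bak copy until the link succeeds. A standalone sketch of those two steps (Python 3 style with temporary files for illustration; this is not the extension's code and `samecontent` is a made-up helper name):

```python
import os
import tempfile

CHUNKLEN = 65536  # same chunk size the extension uses

def samecontent(a, b):
    '''Compare two files chunk by chunk, as the relink loop does.'''
    with open(a, 'rb') as sfp, open(b, 'rb') as dfp:
        while True:
            sin = sfp.read(CHUNKLEN)
            if sin != dfp.read(CHUNKLEN):
                return False
            if not sin:        # both streams exhausted together
                return True

def relinkfile(src, dst):
    '''Replace dst with a hardlink to src, restoring a backup on failure.'''
    bak = dst + '.bak'
    os.rename(dst, bak)        # keep a backup until the link succeeds
    try:
        os.link(src, dst)
    except OSError:
        os.rename(bak, dst)    # roll back on failure
        raise
    os.remove(bak)

tmp = tempfile.mkdtemp()
a = os.path.join(tmp, 'a')
b = os.path.join(tmp, 'b')
for p in (a, b):
    with open(p, 'wb') as fp:
        fp.write(b'x' * 100000)
if samecontent(a, b):
    relinkfile(a, b)           # a and b now share one inode
```

The backup-rename dance is what makes the operation safe to interrupt: at every point either the original destination or its .bak copy exists, so a failed os.link never loses data.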
@@ -1,75 +1,75 @@
# Copyright 2006, 2007 Matt Mackall <mpm@selenic.com>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

'''share a common history between several working directories'''

from mercurial.i18n import _
from mercurial import hg, commands, util

testedwith = 'internal'

def share(ui, source, dest=None, noupdate=False):
    """create a new shared repository

    Initialize a new repository and working directory that shares its
    history with another repository.

    .. note::
        using rollback or extensions that destroy/modify history (mq,
        rebase, etc.) can cause considerable confusion with shared
        clones. In particular, if two shared clones are both updated to
        the same changeset, and one of them destroys that changeset
        with rollback, the other clone will suddenly stop working: all
        operations will fail with "abort: working directory has unknown
        parent". The only known workaround is to use debugsetparents on
        the broken clone to reset it to a changeset that still exists
        (e.g. tip).
    """

    return hg.share(ui, source, dest, not noupdate)

def unshare(ui, repo):
    """convert a shared repository to a normal one

    Copy the store data to the repo and remove the sharedpath data.
    """

    if repo.sharedpath == repo.path:
        raise util.Abort(_("this is not a shared repo"))

    destlock = lock = None
    lock = repo.lock()
    try:
        # we use locks here because if we race with commit, we
        # can end up with extra data in the cloned revlogs that's
        # not pointed to by changesets, thus causing verify to
        # fail

        destlock = hg.copystore(ui, repo, repo.path)

        sharefile = repo.join('sharedpath')
        util.rename(sharefile, sharefile + '.old')

        repo.requirements.discard('sharedpath')
        repo._writerequirements()
    finally:
        destlock and destlock.release()
        lock and lock.release()

    # update store, spath, sopener and sjoin of repo
-    repo.__init__(ui, repo.root)
+    repo.__init__(repo.baseui, repo.root)

cmdtable = {
    "share":
    (share,
     [('U', 'noupdate', None, _('do not create a working copy'))],
     _('[-U] SOURCE [DEST]')),
    "unshare":
    (unshare,
     [],
     ''),
}

commands.norepo += " share"
@@ -1,381 +1,382 b''
# bundlerepo.py - repository class for viewing uncompressed bundles
#
# Copyright 2006, 2007 Benoit Boissinot <bboissin@gmail.com>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

"""Repository class for viewing uncompressed bundles.

This provides a read-only repository interface to bundles as if they
were part of the actual repository.
"""

from node import nullid
from i18n import _
import os, tempfile, shutil
import changegroup, util, mdiff, discovery, cmdutil, scmutil
import localrepo, changelog, manifest, filelog, revlog, error

class bundlerevlog(revlog.revlog):
    def __init__(self, opener, indexfile, bundle, linkmapper):
        # How it works:
        # To retrieve a revision, we need to know the offset of the revision in
        # the bundle (an unbundle object). We store this offset in the index
        # (start). The base of the delta is stored in the base field.
        #
        # To differentiate a rev in the bundle from a rev in the revlog, we
        # check revision against repotiprev.
        opener = scmutil.readonlyvfs(opener)
        revlog.revlog.__init__(self, opener, indexfile)
        self.bundle = bundle
        n = len(self)
        self.repotiprev = n - 1
        chain = None
        self.bundlerevs = set() # used by 'bundle()' revset expression
        while True:
            chunkdata = bundle.deltachunk(chain)
            if not chunkdata:
                break
            node = chunkdata['node']
            p1 = chunkdata['p1']
            p2 = chunkdata['p2']
            cs = chunkdata['cs']
            deltabase = chunkdata['deltabase']
            delta = chunkdata['delta']

            size = len(delta)
            start = bundle.tell() - size

            link = linkmapper(cs)
            if node in self.nodemap:
                # this can happen if two branches make the same change
                chain = node
                self.bundlerevs.add(self.nodemap[node])
                continue

            for p in (p1, p2):
                if p not in self.nodemap:
                    raise error.LookupError(p, self.indexfile,
                                            _("unknown parent"))

            if deltabase not in self.nodemap:
                raise LookupError(deltabase, self.indexfile,
                                  _('unknown delta base'))

            baserev = self.rev(deltabase)
            # start, size, full unc. size, base (unused), link, p1, p2, node
            e = (revlog.offset_type(start, 0), size, -1, baserev, link,
                 self.rev(p1), self.rev(p2), node)
            self.index.insert(-1, e)
            self.nodemap[node] = n
            self.bundlerevs.add(n)
            chain = node
            n += 1

    def _chunk(self, rev):
        # Warning: in case of bundle, the diff is against what we stored as
        # delta base, not against rev - 1
        # XXX: could use some caching
        if rev <= self.repotiprev:
            return revlog.revlog._chunk(self, rev)
        self.bundle.seek(self.start(rev))
        return self.bundle.read(self.length(rev))

    def revdiff(self, rev1, rev2):
        """return or calculate a delta between two revisions"""
        if rev1 > self.repotiprev and rev2 > self.repotiprev:
            # hot path for bundle
            revb = self.index[rev2][3]
            if revb == rev1:
                return self._chunk(rev2)
        elif rev1 <= self.repotiprev and rev2 <= self.repotiprev:
            return revlog.revlog.revdiff(self, rev1, rev2)

        return mdiff.textdiff(self.revision(self.node(rev1)),
                              self.revision(self.node(rev2)))

    def revision(self, nodeorrev):
        """return an uncompressed revision of a given node or revision
        number.
        """
        if isinstance(nodeorrev, int):
            rev = nodeorrev
            node = self.node(rev)
        else:
            node = nodeorrev
            rev = self.rev(node)

        if node == nullid:
            return ""

        text = None
        chain = []
        iterrev = rev
        # reconstruct the revision if it is from a changegroup
        while iterrev > self.repotiprev:
            if self._cache and self._cache[1] == iterrev:
                text = self._cache[2]
                break
            chain.append(iterrev)
            iterrev = self.index[iterrev][3]
        if text is None:
            text = revlog.revlog.revision(self, iterrev)

        while chain:
            delta = self._chunk(chain.pop())
            text = mdiff.patches(text, [delta])

        self._checkhash(text, node, rev)
        self._cache = (node, rev, text)
        return text

    def addrevision(self, text, transaction, link, p1=None, p2=None, d=None):
        raise NotImplementedError
    def addgroup(self, revs, linkmapper, transaction):
        raise NotImplementedError
    def strip(self, rev, minlink):
        raise NotImplementedError
    def checksize(self):
        raise NotImplementedError

class bundlechangelog(bundlerevlog, changelog.changelog):
    def __init__(self, opener, bundle):
        changelog.changelog.__init__(self, opener)
        linkmapper = lambda x: x
        bundlerevlog.__init__(self, opener, self.indexfile, bundle,
                              linkmapper)

class bundlemanifest(bundlerevlog, manifest.manifest):
    def __init__(self, opener, bundle, linkmapper):
        manifest.manifest.__init__(self, opener)
        bundlerevlog.__init__(self, opener, self.indexfile, bundle,
                              linkmapper)

class bundlefilelog(bundlerevlog, filelog.filelog):
    def __init__(self, opener, path, bundle, linkmapper, repo):
        filelog.filelog.__init__(self, opener, path)
        bundlerevlog.__init__(self, opener, self.indexfile, bundle,
                              linkmapper)
        self._repo = repo

    def _file(self, f):
        self._repo.file(f)

class bundlepeer(localrepo.localpeer):
    def canpush(self):
        return False

class bundlerepository(localrepo.localrepository):
    def __init__(self, ui, path, bundlename):
        self._tempparent = None
        try:
            localrepo.localrepository.__init__(self, ui, path)
        except error.RepoError:
            self._tempparent = tempfile.mkdtemp()
            localrepo.instance(ui, self._tempparent, 1)
            localrepo.localrepository.__init__(self, ui, self._tempparent)
        self.ui.setconfig('phases', 'publish', False)

        if path:
            self._url = 'bundle:' + util.expandpath(path) + '+' + bundlename
        else:
            self._url = 'bundle:' + bundlename

        self.tempfile = None
        f = util.posixfile(bundlename, "rb")
        self.bundle = changegroup.readbundle(f, bundlename)
        if self.bundle.compressed():
            fdtemp, temp = tempfile.mkstemp(prefix="hg-bundle-",
                                            suffix=".hg10un", dir=self.path)
            self.tempfile = temp
            fptemp = os.fdopen(fdtemp, 'wb')

            try:
                fptemp.write("HG10UN")
                while True:
                    chunk = self.bundle.read(2**18)
                    if not chunk:
                        break
                    fptemp.write(chunk)
            finally:
                fptemp.close()

            f = util.posixfile(self.tempfile, "rb")
            self.bundle = changegroup.readbundle(f, bundlename)

        # dict with the mapping 'filename' -> position in the bundle
        self.bundlefilespos = {}

    @localrepo.unfilteredpropertycache
    def changelog(self):
        # consume the header if it exists
        self.bundle.changelogheader()
        c = bundlechangelog(self.sopener, self.bundle)
        self.manstart = self.bundle.tell()
        return c

    @localrepo.unfilteredpropertycache
    def manifest(self):
        self.bundle.seek(self.manstart)
        # consume the header if it exists
        self.bundle.manifestheader()
        m = bundlemanifest(self.sopener, self.bundle, self.changelog.rev)
        self.filestart = self.bundle.tell()
        return m

    @localrepo.unfilteredpropertycache
    def manstart(self):
        self.changelog
        return self.manstart

    @localrepo.unfilteredpropertycache
    def filestart(self):
        self.manifest
        return self.filestart

    def url(self):
        return self._url

    def file(self, f):
        if not self.bundlefilespos:
            self.bundle.seek(self.filestart)
            while True:
                chunkdata = self.bundle.filelogheader()
                if not chunkdata:
                    break
                fname = chunkdata['filename']
                self.bundlefilespos[fname] = self.bundle.tell()
                while True:
                    c = self.bundle.deltachunk(None)
                    if not c:
                        break

        if f in self.bundlefilespos:
            self.bundle.seek(self.bundlefilespos[f])
            return bundlefilelog(self.sopener, f, self.bundle,
                                 self.changelog.rev, self)
        else:
            return filelog.filelog(self.sopener, f)

    def close(self):
        """Close assigned bundle file immediately."""
        self.bundle.close()
        if self.tempfile is not None:
            os.unlink(self.tempfile)
        if self._tempparent:
            shutil.rmtree(self._tempparent, True)

    def cancopy(self):
        return False

    def peer(self):
        return bundlepeer(self)

    def getcwd(self):
        return os.getcwd() # always outside the repo


def instance(ui, path, create):
    if create:
        raise util.Abort(_('cannot create new bundle repository'))
    parentpath = ui.config("bundle", "mainreporoot", "")
    if not parentpath:
        # try to find the correct path to the working directory repo
        parentpath = cmdutil.findrepo(os.getcwd())
        if parentpath is None:
            parentpath = ''
    if parentpath:
        # Try to make the full path relative so we get a nice, short URL.
        # In particular, we don't want temp dir names in test outputs.
        cwd = os.getcwd()
        if parentpath == cwd:
            parentpath = ''
        else:
            cwd = os.path.join(cwd,'')
            if parentpath.startswith(cwd):
                parentpath = parentpath[len(cwd):]
    u = util.url(path)
    path = u.localpath()
    if u.scheme == 'bundle':
        s = path.split("+", 1)
        if len(s) == 1:
            repopath, bundlename = parentpath, s[0]
        else:
            repopath, bundlename = s
    else:
        repopath, bundlename = parentpath, path
    return bundlerepository(ui, repopath, bundlename)

def getremotechanges(ui, repo, other, onlyheads=None, bundlename=None,
                     force=False):
    '''obtains a bundle of changes incoming from other

    "onlyheads" restricts the returned changes to those reachable from the
    specified heads.
    "bundlename", if given, stores the bundle to this file path permanently;
    otherwise it's stored to a temp file and gets deleted again when you call
    the returned "cleanupfn".
    "force" indicates whether to proceed on unrelated repos.

    Returns a tuple (local, csets, cleanupfn):

    "local" is a local repo from which to obtain the actual incoming
    changesets; it is a bundlerepo for the obtained bundle when the
    original "other" is remote.
    "csets" lists the incoming changeset node ids.
    "cleanupfn" must be called without arguments when you're done processing
    the changes; it closes both the original "other" and the one returned
    here.
    '''
    tmp = discovery.findcommonincoming(repo, other, heads=onlyheads,
                                       force=force)
    common, incoming, rheads = tmp
    if not incoming:
        try:
            if bundlename:
                os.unlink(bundlename)
        except OSError:
            pass
        return repo, [], other.close

    bundle = None
    bundlerepo = None
    localrepo = other.local()
    if bundlename or not localrepo:
        # create a bundle (uncompressed if other repo is not local)

        if other.capable('getbundle'):
            cg = other.getbundle('incoming', common=common, heads=rheads)
        elif onlyheads is None and not other.capable('changegroupsubset'):
            # compat with older servers when pulling all remote heads
            cg = other.changegroup(incoming, "incoming")
            rheads = None
        else:
            cg = other.changegroupsubset(incoming, rheads, 'incoming')
        bundletype = localrepo and "HG10BZ" or "HG10UN"
        fname = bundle = changegroup.writebundle(cg, bundlename, bundletype)
        # keep written bundle?
        if bundlename:
            bundle = None
        if not localrepo:
            # use the created uncompressed bundlerepo
-            localrepo = bundlerepo = bundlerepository(ui, repo.root, fname)
+            localrepo = bundlerepo = bundlerepository(repo.baseui, repo.root,
+                                                      fname)
            # this repo contains local and other now, so filter out local again
            common = repo.heads()
    if localrepo:
        # Part of common may be remotely filtered
        # So use an unfiltered version
        # The discovery process probably need cleanup to avoid that
        localrepo = localrepo.unfiltered()

    csets = localrepo.changelog.findmissing(common, rheads)

    def cleanup():
        if bundlerepo:
            bundlerepo.close()
        if bundle:
            os.unlink(bundle)
        other.close()

    return (localrepo, csets, cleanup)
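`bundlerevlog.revision` above rebuilds a bundled revision by walking delta parents back until it reaches a stored full text, then replaying the remembered deltas from oldest to newest. A minimal stand-alone sketch of that chain walk, where a "delta" is just a suffix to append instead of a real binary diff (purely illustrative, not Mercurial's storage format):

```python
# Minimal model: each rev is either a full text or (baserev, delta).
# Real revlogs apply binary patches (mdiff.patches); a string-append
# "delta" keeps the chain-walking logic visible.
revs = {
    0: ('full', 'base'),
    1: ('delta', (0, '+one')),
    2: ('delta', (1, '+two')),
}

def revision(rev):
    chain = []
    iterrev = rev
    # walk back until we hit a full text, remembering the delta revs
    while revs[iterrev][0] == 'delta':
        chain.append(iterrev)
        iterrev = revs[iterrev][1][0]
    text = revs[iterrev][1]
    # replay the deltas from oldest to newest
    while chain:
        text = text + revs[chain.pop()][1][1]
    return text

print(revision(2))  # -> base+one+two
```

The real method additionally short-circuits through `self._cache` when a revision on the chain was reconstructed recently, and stops the walk at `repotiprev` because anything at or below it lives in the on-disk revlog rather than the bundle.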