py3: replace os.name with pycompat.osname (part 2 of 2)
Pulkit Goyal
r30640:7a3e67bf default
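The point of this change is that on Python 3 `os.name` is a `str`, while Mercurial's internals compare against bytes literals such as `b'nt'`, so the code switches to `pycompat.osname`, which normalizes the value to bytes on both Python versions. A minimal sketch of the idea (the `osname` definition here is an illustration of what `mercurial.pycompat` provides, not a copy of it):

```python
import os
import sys

# On Python 3, os.name is a str ('posix', 'nt', ...); Mercurial works with
# bytes throughout, so expose a bytes-valued equivalent. On Python 2,
# os.name is already a bytes string.
if sys.version_info[0] >= 3:
    osname = os.name.encode('ascii')
else:
    osname = os.name

# Callers can now compare uniformly against bytes literals, as the
# rewritten lines in this diff do with pycompat.osname == 'nt':
is_windows = (osname == b'nt')
```

With this in place, a comparison like `pycompat.osname == 'nt'` keeps working unchanged once the module's string literals are treated as bytes, which is the convention the py3 porting series relies on.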
@@ -1,719 +1,718 b''
1 1 # color.py color output for Mercurial commands
2 2 #
3 3 # Copyright (C) 2007 Kevin Christen <kevin.christen@gmail.com>
4 4 #
5 5 # This software may be used and distributed according to the terms of the
6 6 # GNU General Public License version 2 or any later version.
7 7
8 8 '''colorize output from some commands
9 9
10 10 The color extension colorizes output from several Mercurial commands.
11 11 For example, the diff command shows additions in green and deletions
12 12 in red, while the status command shows modified files in magenta. Many
13 13 other commands have analogous colors. It is possible to customize
14 14 these colors.
15 15
16 16 Effects
17 17 -------
18 18
19 19 Other effects in addition to color, like bold and underlined text, are
20 20 also available. By default, the terminfo database is used to find the
21 21 terminal codes used to change color and effect. If terminfo is not
22 22 available, then effects are rendered with the ECMA-48 SGR control
23 23 function (aka ANSI escape codes).
24 24
25 25 The available effects in terminfo mode are 'blink', 'bold', 'dim',
26 26 'inverse', 'invisible', 'italic', 'standout', and 'underline'; in
27 27 ECMA-48 mode, the options are 'bold', 'inverse', 'italic', and
28 28 'underline'. How each is rendered depends on the terminal emulator.
29 29 Some may not be available for a given terminal type, and will be
30 30 silently ignored.
31 31
32 32 If the terminfo entry for your terminal is missing codes for an effect
33 33 or has the wrong codes, you can add or override those codes in your
34 34 configuration::
35 35
36 36 [color]
37 37 terminfo.dim = \E[2m
38 38
39 39 where '\E' is substituted with an escape character.
40 40
41 41 Labels
42 42 ------
43 43
44 44 Text receives color effects depending on the labels that it has. Many
45 45 default Mercurial commands emit labelled text. You can also define
46 46 your own labels in templates using the label function, see :hg:`help
47 47 templates`. A single portion of text may have more than one label. In
48 48 that case, effects given to the last label will override any other
49 49 effects. This includes the special "none" effect, which nullifies
50 50 other effects.
51 51
52 52 Labels are normally invisible. In order to see these labels and their
53 53 position in the text, use the global --color=debug option. The same
54 54 anchor text may be associated to multiple labels, e.g.
55 55
56 56 [log.changeset changeset.secret|changeset: 22611:6f0a53c8f587]
57 57
58 58 The following are the default effects for some default labels. Default
59 59 effects may be overridden from your configuration file::
60 60
61 61 [color]
62 62 status.modified = blue bold underline red_background
63 63 status.added = green bold
64 64 status.removed = red bold blue_background
65 65 status.deleted = cyan bold underline
66 66 status.unknown = magenta bold underline
67 67 status.ignored = black bold
68 68
69 69 # 'none' turns off all effects
70 70 status.clean = none
71 71 status.copied = none
72 72
73 73 qseries.applied = blue bold underline
74 74 qseries.unapplied = black bold
75 75 qseries.missing = red bold
76 76
77 77 diff.diffline = bold
78 78 diff.extended = cyan bold
79 79 diff.file_a = red bold
80 80 diff.file_b = green bold
81 81 diff.hunk = magenta
82 82 diff.deleted = red
83 83 diff.inserted = green
84 84 diff.changed = white
85 85 diff.tab =
86 86 diff.trailingwhitespace = bold red_background
87 87
88 88 # Blank so it inherits the style of the surrounding label
89 89 changeset.public =
90 90 changeset.draft =
91 91 changeset.secret =
92 92
93 93 resolve.unresolved = red bold
94 94 resolve.resolved = green bold
95 95
96 96 bookmarks.active = green
97 97
98 98 branches.active = none
99 99 branches.closed = black bold
100 100 branches.current = green
101 101 branches.inactive = none
102 102
103 103 tags.normal = green
104 104 tags.local = black bold
105 105
106 106 rebase.rebased = blue
107 107 rebase.remaining = red bold
108 108
109 109 shelve.age = cyan
110 110 shelve.newest = green bold
111 111 shelve.name = blue bold
112 112
113 113 histedit.remaining = red bold
114 114
115 115 Custom colors
116 116 -------------
117 117
118 118 Because there are only eight standard colors, this module allows you
119 119 to define color names for other color slots which might be available
120 120 for your terminal type, assuming terminfo mode. For instance::
121 121
122 122 color.brightblue = 12
123 123 color.pink = 207
124 124 color.orange = 202
125 125
126 126 to set 'brightblue' to color slot 12 (useful for 16 color terminals
127 127 that have brighter colors defined in the upper eight) and, 'pink' and
128 128 'orange' to colors in 256-color xterm's default color cube. These
129 129 defined colors may then be used as any of the pre-defined eight,
130 130 including appending '_background' to set the background to that color.
131 131
132 132 Modes
133 133 -----
134 134
135 135 By default, the color extension will use ANSI mode (or win32 mode on
136 136 Windows) if it detects a terminal. To override auto mode (to enable
137 137 terminfo mode, for example), set the following configuration option::
138 138
139 139 [color]
140 140 mode = terminfo
141 141
142 142 Any value other than 'ansi', 'win32', 'terminfo', or 'auto' will
143 143 disable color.
144 144
145 145 Note that on some systems, terminfo mode may cause problems when using
146 146 color with the pager extension and less -R. less with the -R option
147 147 will only display ECMA-48 color codes, and terminfo mode may sometimes
148 148 emit codes that less doesn't understand. You can work around this by
149 149 either using ansi mode (or auto mode), or by using less -r (which will
150 150 pass through all terminal control codes, not just color control
151 151 codes).
152 152
153 153 On some systems (such as MSYS in Windows), the terminal may support
154 154 a different color mode than the pager (activated via the "pager"
155 155 extension). It is possible to define separate modes depending on whether
156 156 the pager is active::
157 157
158 158 [color]
159 159 mode = auto
160 160 pagermode = ansi
161 161
162 162 If ``pagermode`` is not defined, the ``mode`` will be used.
163 163 '''
164 164
165 165 from __future__ import absolute_import
166 166
167 import os
168
169 167 from mercurial.i18n import _
170 168 from mercurial import (
171 169 cmdutil,
172 170 commands,
173 171 dispatch,
174 172 encoding,
175 173 extensions,
174 pycompat,
176 175 subrepo,
177 176 ui as uimod,
178 177 util,
179 178 )
180 179
181 180 cmdtable = {}
182 181 command = cmdutil.command(cmdtable)
183 182 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
184 183 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
185 184 # be specifying the version(s) of Mercurial they are tested with, or
186 185 # leave the attribute unspecified.
187 186 testedwith = 'ships-with-hg-core'
188 187
189 188 # start and stop parameters for effects
190 189 _effects = {'none': 0, 'black': 30, 'red': 31, 'green': 32, 'yellow': 33,
191 190 'blue': 34, 'magenta': 35, 'cyan': 36, 'white': 37, 'bold': 1,
192 191 'italic': 3, 'underline': 4, 'inverse': 7, 'dim': 2,
193 192 'black_background': 40, 'red_background': 41,
194 193 'green_background': 42, 'yellow_background': 43,
195 194 'blue_background': 44, 'purple_background': 45,
196 195 'cyan_background': 46, 'white_background': 47}
197 196
198 197 def _terminfosetup(ui, mode):
199 198 '''Initialize terminfo data and the terminal if we're in terminfo mode.'''
200 199
201 200 global _terminfo_params
202 201 # If we failed to load curses, we go ahead and return.
203 202 if not _terminfo_params:
204 203 return
205 204 # Otherwise, see what the config file says.
206 205 if mode not in ('auto', 'terminfo'):
207 206 return
208 207
209 208 _terminfo_params.update((key[6:], (False, int(val), ''))
210 209 for key, val in ui.configitems('color')
211 210 if key.startswith('color.'))
212 211 _terminfo_params.update((key[9:], (True, '', val.replace('\\E', '\x1b')))
213 212 for key, val in ui.configitems('color')
214 213 if key.startswith('terminfo.'))
215 214
216 215 try:
217 216 curses.setupterm()
218 217 except curses.error as e:
219 218 _terminfo_params = {}
220 219 return
221 220
222 221 for key, (b, e, c) in _terminfo_params.items():
223 222 if not b:
224 223 continue
225 224 if not c and not curses.tigetstr(e):
226 225 # Most terminals don't support dim, invis, etc, so don't be
227 226 # noisy and use ui.debug().
228 227 ui.debug("no terminfo entry for %s\n" % e)
229 228 del _terminfo_params[key]
230 229 if not curses.tigetstr('setaf') or not curses.tigetstr('setab'):
231 230 # Only warn about missing terminfo entries if we explicitly asked for
232 231 # terminfo mode.
233 232 if mode == "terminfo":
234 233 ui.warn(_("no terminfo entry for setab/setaf: reverting to "
235 234 "ECMA-48 color\n"))
236 235 _terminfo_params = {}
237 236
238 237 def _modesetup(ui, coloropt):
239 238 global _terminfo_params
240 239
241 240 if coloropt == 'debug':
242 241 return 'debug'
243 242
244 243 auto = (coloropt == 'auto')
245 244 always = not auto and util.parsebool(coloropt)
246 245 if not always and not auto:
247 246 return None
248 247
249 248 formatted = (always or (encoding.environ.get('TERM') != 'dumb'
250 249 and ui.formatted()))
251 250
252 251 mode = ui.config('color', 'mode', 'auto')
253 252
254 253 # If pager is active, color.pagermode overrides color.mode.
255 254 if getattr(ui, 'pageractive', False):
256 255 mode = ui.config('color', 'pagermode', mode)
257 256
258 257 realmode = mode
259 258 if mode == 'auto':
260 if os.name == 'nt':
259 if pycompat.osname == 'nt':
261 260 term = encoding.environ.get('TERM')
262 261 # TERM won't be defined in a vanilla cmd.exe environment.
263 262
264 263 # UNIX-like environments on Windows such as Cygwin and MSYS will
265 264 # set TERM. They appear to make a best effort attempt at setting it
266 265 # to something appropriate. However, not all environments with TERM
267 266 # defined support ANSI. Since "ansi" could result in terminal
268 267 # gibberish, we error on the side of selecting "win32". However, if
269 268 # w32effects is not defined, we almost certainly don't support
270 269 # "win32", so don't even try.
271 270 if (term and 'xterm' in term) or not w32effects:
272 271 realmode = 'ansi'
273 272 else:
274 273 realmode = 'win32'
275 274 else:
276 275 realmode = 'ansi'
277 276
278 277 def modewarn():
279 278 # only warn if color.mode was explicitly set and we're in
280 279 # a formatted terminal
281 280 if mode == realmode and ui.formatted():
282 281 ui.warn(_('warning: failed to set color mode to %s\n') % mode)
283 282
284 283 if realmode == 'win32':
285 284 _terminfo_params = {}
286 285 if not w32effects:
287 286 modewarn()
288 287 return None
289 288 _effects.update(w32effects)
290 289 elif realmode == 'ansi':
291 290 _terminfo_params = {}
292 291 elif realmode == 'terminfo':
293 292 _terminfosetup(ui, mode)
294 293 if not _terminfo_params:
295 294 ## FIXME Shouldn't we return None in this case too?
296 295 modewarn()
297 296 realmode = 'ansi'
298 297 else:
299 298 return None
300 299
301 300 if always or (auto and formatted):
302 301 return realmode
303 302 return None
304 303
305 304 try:
306 305 import curses
307 306 # Mapping from effect name to terminfo attribute name (or raw code) or
308 307 # color number. This will also force-load the curses module.
309 308 _terminfo_params = {'none': (True, 'sgr0', ''),
310 309 'standout': (True, 'smso', ''),
311 310 'underline': (True, 'smul', ''),
312 311 'reverse': (True, 'rev', ''),
313 312 'inverse': (True, 'rev', ''),
314 313 'blink': (True, 'blink', ''),
315 314 'dim': (True, 'dim', ''),
316 315 'bold': (True, 'bold', ''),
317 316 'invisible': (True, 'invis', ''),
318 317 'italic': (True, 'sitm', ''),
319 318 'black': (False, curses.COLOR_BLACK, ''),
320 319 'red': (False, curses.COLOR_RED, ''),
321 320 'green': (False, curses.COLOR_GREEN, ''),
322 321 'yellow': (False, curses.COLOR_YELLOW, ''),
323 322 'blue': (False, curses.COLOR_BLUE, ''),
324 323 'magenta': (False, curses.COLOR_MAGENTA, ''),
325 324 'cyan': (False, curses.COLOR_CYAN, ''),
326 325 'white': (False, curses.COLOR_WHITE, '')}
327 326 except ImportError:
328 327 _terminfo_params = {}
329 328
330 329 _styles = {'grep.match': 'red bold',
331 330 'grep.linenumber': 'green',
332 331 'grep.rev': 'green',
333 332 'grep.change': 'green',
334 333 'grep.sep': 'cyan',
335 334 'grep.filename': 'magenta',
336 335 'grep.user': 'magenta',
337 336 'grep.date': 'magenta',
338 337 'bookmarks.active': 'green',
339 338 'branches.active': 'none',
340 339 'branches.closed': 'black bold',
341 340 'branches.current': 'green',
342 341 'branches.inactive': 'none',
343 342 'diff.changed': 'white',
344 343 'diff.deleted': 'red',
345 344 'diff.diffline': 'bold',
346 345 'diff.extended': 'cyan bold',
347 346 'diff.file_a': 'red bold',
348 347 'diff.file_b': 'green bold',
349 348 'diff.hunk': 'magenta',
350 349 'diff.inserted': 'green',
351 350 'diff.tab': '',
352 351 'diff.trailingwhitespace': 'bold red_background',
353 352 'changeset.public' : '',
354 353 'changeset.draft' : '',
355 354 'changeset.secret' : '',
356 355 'diffstat.deleted': 'red',
357 356 'diffstat.inserted': 'green',
358 357 'histedit.remaining': 'red bold',
359 358 'ui.prompt': 'yellow',
360 359 'log.changeset': 'yellow',
361 360 'patchbomb.finalsummary': '',
362 361 'patchbomb.from': 'magenta',
363 362 'patchbomb.to': 'cyan',
364 363 'patchbomb.subject': 'green',
365 364 'patchbomb.diffstats': '',
366 365 'rebase.rebased': 'blue',
367 366 'rebase.remaining': 'red bold',
368 367 'resolve.resolved': 'green bold',
369 368 'resolve.unresolved': 'red bold',
370 369 'shelve.age': 'cyan',
371 370 'shelve.newest': 'green bold',
372 371 'shelve.name': 'blue bold',
373 372 'status.added': 'green bold',
374 373 'status.clean': 'none',
375 374 'status.copied': 'none',
376 375 'status.deleted': 'cyan bold underline',
377 376 'status.ignored': 'black bold',
378 377 'status.modified': 'blue bold',
379 378 'status.removed': 'red bold',
380 379 'status.unknown': 'magenta bold underline',
381 380 'tags.normal': 'green',
382 381 'tags.local': 'black bold'}
383 382
384 383
385 384 def _effect_str(effect):
386 385 '''Helper function for render_effects().'''
387 386
388 387 bg = False
389 388 if effect.endswith('_background'):
390 389 bg = True
391 390 effect = effect[:-11]
392 391 try:
393 392 attr, val, termcode = _terminfo_params[effect]
394 393 except KeyError:
395 394 return ''
396 395 if attr:
397 396 if termcode:
398 397 return termcode
399 398 else:
400 399 return curses.tigetstr(val)
401 400 elif bg:
402 401 return curses.tparm(curses.tigetstr('setab'), val)
403 402 else:
404 403 return curses.tparm(curses.tigetstr('setaf'), val)
405 404
406 405 def render_effects(text, effects):
407 406 'Wrap text in commands to turn on each effect.'
408 407 if not text:
409 408 return text
410 409 if not _terminfo_params:
411 410 start = [str(_effects[e]) for e in ['none'] + effects.split()]
412 411 start = '\033[' + ';'.join(start) + 'm'
413 412 stop = '\033[' + str(_effects['none']) + 'm'
414 413 else:
415 414 start = ''.join(_effect_str(effect)
416 415 for effect in ['none'] + effects.split())
417 416 stop = _effect_str('none')
418 417 return ''.join([start, text, stop])
419 418
420 419 def extstyles():
421 420 for name, ext in extensions.extensions():
422 421 _styles.update(getattr(ext, 'colortable', {}))
423 422
424 423 def valideffect(effect):
425 424 'Determine if the effect is valid or not.'
426 425 good = False
427 426 if not _terminfo_params and effect in _effects:
428 427 good = True
429 428 elif effect in _terminfo_params or effect[:-11] in _terminfo_params:
430 429 good = True
431 430 return good
432 431
433 432 def configstyles(ui):
434 433 for status, cfgeffects in ui.configitems('color'):
435 434 if '.' not in status or status.startswith(('color.', 'terminfo.')):
436 435 continue
437 436 cfgeffects = ui.configlist('color', status)
438 437 if cfgeffects:
439 438 good = []
440 439 for e in cfgeffects:
441 440 if valideffect(e):
442 441 good.append(e)
443 442 else:
444 443 ui.warn(_("ignoring unknown color/effect %r "
445 444 "(configured in color.%s)\n")
446 445 % (e, status))
447 446 _styles[status] = ' '.join(good)
448 447
449 448 class colorui(uimod.ui):
450 449 _colormode = 'ansi'
451 450 def write(self, *args, **opts):
452 451 if self._colormode is None:
453 452 return super(colorui, self).write(*args, **opts)
454 453
455 454 label = opts.get('label', '')
456 455 if self._buffers and not opts.get('prompt', False):
457 456 if self._bufferapplylabels:
458 457 self._buffers[-1].extend(self.label(a, label) for a in args)
459 458 else:
460 459 self._buffers[-1].extend(args)
461 460 elif self._colormode == 'win32':
462 461 for a in args:
463 462 win32print(a, super(colorui, self).write, **opts)
464 463 else:
465 464 return super(colorui, self).write(
466 465 *[self.label(a, label) for a in args], **opts)
467 466
468 467 def write_err(self, *args, **opts):
469 468 if self._colormode is None:
470 469 return super(colorui, self).write_err(*args, **opts)
471 470
472 471 label = opts.get('label', '')
473 472 if self._bufferstates and self._bufferstates[-1][0]:
474 473 return self.write(*args, **opts)
475 474 if self._colormode == 'win32':
476 475 for a in args:
477 476 win32print(a, super(colorui, self).write_err, **opts)
478 477 else:
479 478 return super(colorui, self).write_err(
480 479 *[self.label(a, label) for a in args], **opts)
481 480
482 481 def showlabel(self, msg, label):
483 482 if label and msg:
484 483 if msg[-1] == '\n':
485 484 return "[%s|%s]\n" % (label, msg[:-1])
486 485 else:
487 486 return "[%s|%s]" % (label, msg)
488 487 else:
489 488 return msg
490 489
491 490 def label(self, msg, label):
492 491 if self._colormode is None:
493 492 return super(colorui, self).label(msg, label)
494 493
495 494 if self._colormode == 'debug':
496 495 return self.showlabel(msg, label)
497 496
498 497 effects = []
499 498 for l in label.split():
500 499 s = _styles.get(l, '')
501 500 if s:
502 501 effects.append(s)
503 502 elif valideffect(l):
504 503 effects.append(l)
505 504 effects = ' '.join(effects)
506 505 if effects:
507 506 return '\n'.join([render_effects(line, effects)
508 507 for line in msg.split('\n')])
509 508 return msg
510 509
511 510 def uisetup(ui):
512 511 if ui.plain():
513 512 return
514 513 if not isinstance(ui, colorui):
515 514 colorui.__bases__ = (ui.__class__,)
516 515 ui.__class__ = colorui
517 516 def colorcmd(orig, ui_, opts, cmd, cmdfunc):
518 517 mode = _modesetup(ui_, opts['color'])
519 518 colorui._colormode = mode
520 519 if mode and mode != 'debug':
521 520 extstyles()
522 521 configstyles(ui_)
523 522 return orig(ui_, opts, cmd, cmdfunc)
524 523 def colorgit(orig, gitsub, commands, env=None, stream=False, cwd=None):
525 524 if gitsub.ui._colormode and len(commands) and commands[0] == "diff":
526 525 # insert the argument in the front,
527 526 # the end of git diff arguments is used for paths
528 527 commands.insert(1, '--color')
529 528 return orig(gitsub, commands, env, stream, cwd)
530 529 extensions.wrapfunction(dispatch, '_runcommand', colorcmd)
531 530 extensions.wrapfunction(subrepo.gitsubrepo, '_gitnodir', colorgit)
532 531
533 532 def extsetup(ui):
534 533 commands.globalopts.append(
535 534 ('', 'color', 'auto',
536 535 # i18n: 'always', 'auto', 'never', and 'debug' are keywords
537 536 # and should not be translated
538 537 _("when to colorize (boolean, always, auto, never, or debug)"),
539 538 _('TYPE')))
540 539
541 540 @command('debugcolor',
542 541 [('', 'style', None, _('show all configured styles'))],
543 542 'hg debugcolor')
544 543 def debugcolor(ui, repo, **opts):
545 544 """show available color, effects or style"""
546 545 ui.write(('color mode: %s\n') % ui._colormode)
547 546 if opts.get('style'):
548 547 return _debugdisplaystyle(ui)
549 548 else:
550 549 return _debugdisplaycolor(ui)
551 550
552 551 def _debugdisplaycolor(ui):
553 552 global _styles
554 553 oldstyle = _styles
555 554 try:
556 555 _styles = {}
557 556 for effect in _effects.keys():
558 557 _styles[effect] = effect
559 558 if _terminfo_params:
560 559 for k, v in ui.configitems('color'):
561 560 if k.startswith('color.'):
562 561 _styles[k] = k[6:]
563 562 elif k.startswith('terminfo.'):
564 563 _styles[k] = k[9:]
565 564 ui.write(_('available colors:\n'))
566 565 # sort label with a '_' after the other to group '_background' entry.
567 566 items = sorted(_styles.items(),
568 567 key=lambda i: ('_' in i[0], i[0], i[1]))
569 568 for colorname, label in items:
570 569 ui.write(('%s\n') % colorname, label=label)
571 570 finally:
572 571 _styles = oldstyle
573 572
574 573 def _debugdisplaystyle(ui):
575 574 ui.write(_('available style:\n'))
576 575 width = max(len(s) for s in _styles)
577 576 for label, effects in sorted(_styles.items()):
578 577 ui.write('%s' % label, label=label)
579 578 if effects:
580 579 # 50
581 580 ui.write(': ')
582 581 ui.write(' ' * (max(0, width - len(label))))
583 582 ui.write(', '.join(ui.label(e, e) for e in effects.split()))
584 583 ui.write('\n')
585 584
586 if os.name != 'nt':
585 if pycompat.osname != 'nt':
587 586 w32effects = None
588 587 else:
589 588 import ctypes
590 589 import re
591 590
592 591 _kernel32 = ctypes.windll.kernel32
593 592
594 593 _WORD = ctypes.c_ushort
595 594
596 595 _INVALID_HANDLE_VALUE = -1
597 596
598 597 class _COORD(ctypes.Structure):
599 598 _fields_ = [('X', ctypes.c_short),
600 599 ('Y', ctypes.c_short)]
601 600
602 601 class _SMALL_RECT(ctypes.Structure):
603 602 _fields_ = [('Left', ctypes.c_short),
604 603 ('Top', ctypes.c_short),
605 604 ('Right', ctypes.c_short),
606 605 ('Bottom', ctypes.c_short)]
607 606
608 607 class _CONSOLE_SCREEN_BUFFER_INFO(ctypes.Structure):
609 608 _fields_ = [('dwSize', _COORD),
610 609 ('dwCursorPosition', _COORD),
611 610 ('wAttributes', _WORD),
612 611 ('srWindow', _SMALL_RECT),
613 612 ('dwMaximumWindowSize', _COORD)]
614 613
615 614 _STD_OUTPUT_HANDLE = 0xfffffff5 # (DWORD)-11
616 615 _STD_ERROR_HANDLE = 0xfffffff4 # (DWORD)-12
617 616
618 617 _FOREGROUND_BLUE = 0x0001
619 618 _FOREGROUND_GREEN = 0x0002
620 619 _FOREGROUND_RED = 0x0004
621 620 _FOREGROUND_INTENSITY = 0x0008
622 621
623 622 _BACKGROUND_BLUE = 0x0010
624 623 _BACKGROUND_GREEN = 0x0020
625 624 _BACKGROUND_RED = 0x0040
626 625 _BACKGROUND_INTENSITY = 0x0080
627 626
628 627 _COMMON_LVB_REVERSE_VIDEO = 0x4000
629 628 _COMMON_LVB_UNDERSCORE = 0x8000
630 629
631 630 # http://msdn.microsoft.com/en-us/library/ms682088%28VS.85%29.aspx
632 631 w32effects = {
633 632 'none': -1,
634 633 'black': 0,
635 634 'red': _FOREGROUND_RED,
636 635 'green': _FOREGROUND_GREEN,
637 636 'yellow': _FOREGROUND_RED | _FOREGROUND_GREEN,
638 637 'blue': _FOREGROUND_BLUE,
639 638 'magenta': _FOREGROUND_BLUE | _FOREGROUND_RED,
640 639 'cyan': _FOREGROUND_BLUE | _FOREGROUND_GREEN,
641 640 'white': _FOREGROUND_RED | _FOREGROUND_GREEN | _FOREGROUND_BLUE,
642 641 'bold': _FOREGROUND_INTENSITY,
643 642 'black_background': 0x100, # unused value > 0x0f
644 643 'red_background': _BACKGROUND_RED,
645 644 'green_background': _BACKGROUND_GREEN,
646 645 'yellow_background': _BACKGROUND_RED | _BACKGROUND_GREEN,
647 646 'blue_background': _BACKGROUND_BLUE,
648 647 'purple_background': _BACKGROUND_BLUE | _BACKGROUND_RED,
649 648 'cyan_background': _BACKGROUND_BLUE | _BACKGROUND_GREEN,
650 649 'white_background': (_BACKGROUND_RED | _BACKGROUND_GREEN |
651 650 _BACKGROUND_BLUE),
652 651 'bold_background': _BACKGROUND_INTENSITY,
653 652 'underline': _COMMON_LVB_UNDERSCORE, # double-byte charsets only
654 653 'inverse': _COMMON_LVB_REVERSE_VIDEO, # double-byte charsets only
655 654 }
656 655
657 656 passthrough = set([_FOREGROUND_INTENSITY,
658 657 _BACKGROUND_INTENSITY,
659 658 _COMMON_LVB_UNDERSCORE,
660 659 _COMMON_LVB_REVERSE_VIDEO])
661 660
662 661 stdout = _kernel32.GetStdHandle(
663 662 _STD_OUTPUT_HANDLE) # don't close the handle returned
664 663 if stdout is None or stdout == _INVALID_HANDLE_VALUE:
665 664 w32effects = None
666 665 else:
667 666 csbi = _CONSOLE_SCREEN_BUFFER_INFO()
668 667 if not _kernel32.GetConsoleScreenBufferInfo(
669 668 stdout, ctypes.byref(csbi)):
670 669 # stdout may not support GetConsoleScreenBufferInfo()
671 670 # when called from subprocess or redirected
672 671 w32effects = None
673 672 else:
674 673 origattr = csbi.wAttributes
675 674 ansire = re.compile('\033\[([^m]*)m([^\033]*)(.*)',
676 675 re.MULTILINE | re.DOTALL)
677 676
678 677 def win32print(text, orig, **opts):
679 678 label = opts.get('label', '')
680 679 attr = origattr
681 680
682 681 def mapcolor(val, attr):
683 682 if val == -1:
684 683 return origattr
685 684 elif val in passthrough:
686 685 return attr | val
687 686 elif val > 0x0f:
688 687 return (val & 0x70) | (attr & 0x8f)
689 688 else:
690 689 return (val & 0x07) | (attr & 0xf8)
691 690
692 691 # determine console attributes based on labels
693 692 for l in label.split():
694 693 style = _styles.get(l, '')
695 694 for effect in style.split():
696 695 try:
697 696 attr = mapcolor(w32effects[effect], attr)
698 697 except KeyError:
699 698 # w32effects could not have certain attributes so we skip
700 699 # them if not found
701 700 pass
702 701 # hack to ensure regexp finds data
703 702 if not text.startswith('\033['):
704 703 text = '\033[m' + text
705 704
706 705 # Look for ANSI-like codes embedded in text
707 706 m = re.match(ansire, text)
708 707
709 708 try:
710 709 while m:
711 710 for sattr in m.group(1).split(';'):
712 711 if sattr:
713 712 attr = mapcolor(int(sattr), attr)
714 713 _kernel32.SetConsoleTextAttribute(stdout, attr)
715 714 orig(m.group(2), **opts)
716 715 m = re.match(ansire, m.group(3))
717 716 finally:
718 717 # Explicitly reset original attributes
719 718 _kernel32.SetConsoleTextAttribute(stdout, origattr)
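The ECMA-48 branch of `render_effects()` in the diff above builds a single SGR escape sequence: the numeric codes of `'none'` plus the requested effects, joined by semicolons between `ESC[` and `m`, with a reset after the text. A standalone sketch of that path (trimmed `_effects` table; same construction as the extension's non-terminfo branch):

```python
# Subset of the extension's _effects table: SGR parameter numbers.
_effects = {'none': 0, 'bold': 1, 'red': 31, 'green': 32,
            'red_background': 41}

def render(text, effects):
    # 'none' is always prepended so earlier state is cleared, then each
    # effect contributes its SGR code; a trailing reset restores defaults.
    start = ';'.join(str(_effects[e]) for e in ['none'] + effects.split())
    return '\033[%sm%s\033[0m' % (start, text)

print(repr(render('error', 'red bold')))
# -> '\x1b[0;31;1merror\x1b[0m'
```

The terminfo branch differs only in where the byte sequences come from: instead of hard-coded SGR numbers it asks curses (`tigetstr`/`tparm`) for the terminal's own codes, which is why the two modes can disagree under `less -R`.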
@@ -1,1353 +1,1353 b''
1 1 # Subversion 1.4/1.5 Python API backend
2 2 #
3 3 # Copyright(C) 2007 Daniel Holth et al
4 4 from __future__ import absolute_import
5 5
6 6 import os
7 7 import re
8 8 import tempfile
9 9 import xml.dom.minidom
10 10
11 11 from mercurial.i18n import _
12 12 from mercurial import (
13 13 encoding,
14 14 error,
15 15 pycompat,
16 16 scmutil,
17 17 util,
18 18 )
19 19
20 20 from . import common
21 21
22 22 pickle = util.pickle
23 23 stringio = util.stringio
24 24 propertycache = util.propertycache
25 25 urlerr = util.urlerr
26 26 urlreq = util.urlreq
27 27
28 28 commandline = common.commandline
29 29 commit = common.commit
30 30 converter_sink = common.converter_sink
31 31 converter_source = common.converter_source
32 32 decodeargs = common.decodeargs
33 33 encodeargs = common.encodeargs
34 34 makedatetimestamp = common.makedatetimestamp
35 35 mapfile = common.mapfile
36 36 MissingTool = common.MissingTool
37 37 NoRepo = common.NoRepo
38 38
39 39 # Subversion stuff. Works best with very recent Python SVN bindings
40 40 # e.g. SVN 1.5 or backports. Thanks to the bzr folks for enhancing
41 41 # these bindings.
42 42
43 43 try:
44 44 import svn
45 45 import svn.client
46 46 import svn.core
47 47 import svn.ra
48 48 import svn.delta
49 49 from . import transport
50 50 import warnings
51 51 warnings.filterwarnings('ignore',
52 52 module='svn.core',
53 53 category=DeprecationWarning)
54 54 svn.core.SubversionException # trigger import to catch error
55 55
56 56 except ImportError:
57 57 svn = None
58 58
59 59 class SvnPathNotFound(Exception):
60 60 pass
61 61
62 62 def revsplit(rev):
63 63 """Parse a revision string and return (uuid, path, revnum).
64 64 >>> revsplit('svn:a2147622-4a9f-4db4-a8d3-13562ff547b2'
65 65 ... '/proj%20B/mytrunk/mytrunk@1')
66 66 ('a2147622-4a9f-4db4-a8d3-13562ff547b2', '/proj%20B/mytrunk/mytrunk', 1)
67 67 >>> revsplit('svn:8af66a51-67f5-4354-b62c-98d67cc7be1d@1')
68 68 ('', '', 1)
69 69 >>> revsplit('@7')
70 70 ('', '', 7)
71 71 >>> revsplit('7')
72 72 ('', '', 0)
73 73 >>> revsplit('bad')
74 74 ('', '', 0)
75 75 """
76 76 parts = rev.rsplit('@', 1)
77 77 revnum = 0
78 78 if len(parts) > 1:
79 79 revnum = int(parts[1])
80 80 parts = parts[0].split('/', 1)
81 81 uuid = ''
82 82 mod = ''
83 83 if len(parts) > 1 and parts[0].startswith('svn:'):
84 84 uuid = parts[0][4:]
85 85 mod = '/' + parts[1]
86 86 return uuid, mod, revnum
87 87
88 88 def quote(s):
89 89 # As of svn 1.7, many svn calls expect "canonical" paths. In
90 90 # theory, we should call svn.core.*canonicalize() on all paths
91 91 # before passing them to the API. Instead, we assume the base url
92 92 # is canonical and copy the behaviour of svn URL encoding function
93 93 # so we can extend it safely with new components. The "safe"
94 94 # characters were taken from the "svn_uri__char_validity" table in
95 95 # libsvn_subr/path.c.
96 96 return urlreq.quote(s, "!$&'()*+,-./:=@_~")
97 97
98 98 def geturl(path):
99 99 try:
100 100 return svn.client.url_from_path(svn.core.svn_path_canonicalize(path))
101 101 except svn.core.SubversionException:
102 102 # svn.client.url_from_path() fails with local repositories
103 103 pass
104 104 if os.path.isdir(path):
105 105 path = os.path.normpath(os.path.abspath(path))
106 if os.name == 'nt':
106 if pycompat.osname == 'nt':
107 107 path = '/' + util.normpath(path)
108 108 # Module URL is later compared with the repository URL returned
109 109 # by svn API, which is UTF-8.
110 110 path = encoding.tolocal(path)
111 111 path = 'file://%s' % quote(path)
112 112 return svn.core.svn_path_canonicalize(path)
113 113
114 114 def optrev(number):
115 115 optrev = svn.core.svn_opt_revision_t()
116 116 optrev.kind = svn.core.svn_opt_revision_number
117 117 optrev.value.number = number
118 118 return optrev
119 119
120 120 class changedpath(object):
121 121 def __init__(self, p):
122 122 self.copyfrom_path = p.copyfrom_path
123 123 self.copyfrom_rev = p.copyfrom_rev
124 124 self.action = p.action
125 125
126 126 def get_log_child(fp, url, paths, start, end, limit=0,
127 127 discover_changed_paths=True, strict_node_history=False):
128 128 protocol = -1
129 129 def receiver(orig_paths, revnum, author, date, message, pool):
130 130 paths = {}
131 131 if orig_paths is not None:
132 132 for k, v in orig_paths.iteritems():
133 133 paths[k] = changedpath(v)
134 134 pickle.dump((paths, revnum, author, date, message),
135 135 fp, protocol)
136 136
137 137 try:
138 138 # Use an ra of our own so that our parent can consume
139 139 # our results without confusing the server.
140 140 t = transport.SvnRaTransport(url=url)
141 141 svn.ra.get_log(t.ra, paths, start, end, limit,
142 142 discover_changed_paths,
143 143 strict_node_history,
144 144 receiver)
145 145 except IOError:
146 146 # Caller may interrupt the iteration
147 147 pickle.dump(None, fp, protocol)
148 148 except Exception as inst:
149 149 pickle.dump(str(inst), fp, protocol)
150 150 else:
151 151 pickle.dump(None, fp, protocol)
152 152 fp.close()
153 153 # With large history, cleanup process goes crazy and suddenly
154 154 # consumes *huge* amount of memory. The output file being closed,
155 155 # there is no need for clean termination.
156 156 os._exit(0)
157 157
158 158 def debugsvnlog(ui, **opts):
159 159 """Fetch SVN log in a subprocess and channel them back to parent to
160 160 avoid memory collection issues.
161 161 """
162 162 if svn is None:
163 163 raise error.Abort(_('debugsvnlog could not load Subversion python '
164 164 'bindings'))
165 165
166 166 args = decodeargs(ui.fin.read())
167 167 get_log_child(ui.fout, *args)
168 168
169 169 class logstream(object):
170 170 """Interruptible revision log iterator."""
171 171 def __init__(self, stdout):
172 172 self._stdout = stdout
173 173
174 174 def __iter__(self):
175 175 while True:
176 176 try:
177 177 entry = pickle.load(self._stdout)
178 178 except EOFError:
179 179 raise error.Abort(_('Mercurial failed to run itself, check'
180 180 ' hg executable is in PATH'))
181 181 try:
182 182 orig_paths, revnum, author, date, message = entry
183 183 except (TypeError, ValueError):
184 184 if entry is None:
185 185 break
186 186 raise error.Abort(_("log stream exception '%s'") % entry)
187 187 yield entry
188 188
189 189 def close(self):
190 190 if self._stdout:
191 191 self._stdout.close()
192 192 self._stdout = None
193 193
194 194 class directlogstream(list):
195 195 """Direct revision log iterator.
196 196 This can be used for debugging and development but it will probably leak
197 197 memory and is not suitable for real conversions."""
198 198 def __init__(self, url, paths, start, end, limit=0,
199 199 discover_changed_paths=True, strict_node_history=False):
200 200
201 201 def receiver(orig_paths, revnum, author, date, message, pool):
202 202 paths = {}
203 203 if orig_paths is not None:
204 204 for k, v in orig_paths.iteritems():
205 205 paths[k] = changedpath(v)
206 206 self.append((paths, revnum, author, date, message))
207 207
208 208 # Use an ra of our own so that our parent can consume
209 209 # our results without confusing the server.
210 210 t = transport.SvnRaTransport(url=url)
211 211 svn.ra.get_log(t.ra, paths, start, end, limit,
212 212 discover_changed_paths,
213 213 strict_node_history,
214 214 receiver)
215 215
216 216 def close(self):
217 217 pass
218 218
219 219 # Check to see if the given path is a local Subversion repo. Verify this by
220 220 # looking for several svn-specific files and directories in the given
221 221 # directory.
222 222 def filecheck(ui, path, proto):
223 223 for x in ('locks', 'hooks', 'format', 'db'):
224 224 if not os.path.exists(os.path.join(path, x)):
225 225 return False
226 226 return True
227 227
228 228 # Check to see if a given path is the root of an svn repo over http. We verify
229 229 # this by requesting a version-controlled URL we know can't exist and looking
230 230 # for the svn-specific "not found" XML.
231 231 def httpcheck(ui, path, proto):
232 232 try:
233 233 opener = urlreq.buildopener()
234 234 rsp = opener.open('%s://%s/!svn/ver/0/.svn' % (proto, path))
235 235 data = rsp.read()
236 236 except urlerr.httperror as inst:
237 237 if inst.code != 404:
238 238 # Except for 404 we cannot know for sure this is not an svn repo
239 239 ui.warn(_('svn: cannot probe remote repository, assume it could '
240 240 'be a subversion repository. Use --source-type if you '
241 241 'know better.\n'))
242 242 return True
243 243 data = inst.fp.read()
244 244 except Exception:
245 245 # Could be urlerr.urlerror if the URL is invalid or anything else.
246 246 return False
247 247 return '<m:human-readable errcode="160013">' in data
248 248
249 249 protomap = {'http': httpcheck,
250 250 'https': httpcheck,
251 251 'file': filecheck,
252 252 }
253 253 def issvnurl(ui, url):
254 254 try:
255 255 proto, path = url.split('://', 1)
256 256 if proto == 'file':
257 257 if (pycompat.osname == 'nt' and path[:1] == '/'
258 258 and path[1:2].isalpha() and path[2:6].lower() == '%3a/'):
259 259 path = path[:2] + ':/' + path[6:]
260 260 path = urlreq.url2pathname(path)
261 261 except ValueError:
262 262 proto = 'file'
263 263 path = os.path.abspath(url)
264 264 if proto == 'file':
265 265 path = util.pconvert(path)
266 266 check = protomap.get(proto, lambda *args: False)
267 267 while '/' in path:
268 268 if check(ui, path, proto):
269 269 return True
270 270 path = path.rsplit('/', 1)[0]
271 271 return False
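`issvnurl`'s probing loop generalizes to a small helper: walk the URL path upward one component at a time until a probe succeeds. `probe_ancestors` is a hypothetical name, and `check` stands in for the `protomap` probes (`filecheck`/`httpcheck`):

```python
def probe_ancestors(path, check):
    # Walk the path upward one component at a time until a probe
    # succeeds, mirroring issvnurl's rsplit loop; return the matching
    # ancestor, or None when no component looks like a repo root.
    while '/' in path:
        if check(path):
            return path
        path = path.rsplit('/', 1)[0]
    return None

root = probe_ancestors('host/repo/trunk/src', lambda p: p == 'host/repo')
```

This lets a user point `hg convert` at any path *inside* a repository and still have the repository root discovered.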
272 272
273 273 # SVN conversion code stolen from bzr-svn and tailor
274 274 #
275 275 # Subversion looks like a versioned filesystem, branches structures
276 276 # are defined by conventions and not enforced by the tool. First,
277 277 # we define the potential branches (modules) as "trunk" and "branches"
278 278 # children directories. Revisions are then identified by their
279 279 # module and revision number (and a repository identifier).
280 280 #
281 281 # The revision graph is really a tree (or a forest). By default, a
282 282 # revision parent is the previous revision in the same module. If the
283 283 # module directory is copied/moved from another module then the
284 284 # revision is the module root and its parent the source revision in
285 285 # the parent module. A revision has at most one parent.
286 286 #
287 287 class svn_source(converter_source):
288 288 def __init__(self, ui, url, revs=None):
289 289 super(svn_source, self).__init__(ui, url, revs=revs)
290 290
291 291 if not (url.startswith('svn://') or url.startswith('svn+ssh://') or
292 292 (os.path.exists(url) and
293 293 os.path.exists(os.path.join(url, '.svn'))) or
294 294 issvnurl(ui, url)):
295 295 raise NoRepo(_("%s does not look like a Subversion repository")
296 296 % url)
297 297 if svn is None:
298 298 raise MissingTool(_('could not load Subversion python bindings'))
299 299
300 300 try:
301 301 version = svn.core.SVN_VER_MAJOR, svn.core.SVN_VER_MINOR
302 302 if version < (1, 4):
303 303 raise MissingTool(_('Subversion python bindings %d.%d found, '
304 304 '1.4 or later required') % version)
305 305 except AttributeError:
306 306 raise MissingTool(_('Subversion python bindings are too old, 1.4 '
307 307 'or later required'))
308 308
309 309 self.lastrevs = {}
310 310
311 311 latest = None
312 312 try:
313 313 # Support file://path@rev syntax. Useful e.g. to convert
314 314 # deleted branches.
315 315 at = url.rfind('@')
316 316 if at >= 0:
317 317 latest = int(url[at + 1:])
318 318 url = url[:at]
319 319 except ValueError:
320 320 pass
321 321 self.url = geturl(url)
322 322 self.encoding = 'UTF-8' # Subversion is always nominal UTF-8
323 323 try:
324 324 self.transport = transport.SvnRaTransport(url=self.url)
325 325 self.ra = self.transport.ra
326 326 self.ctx = self.transport.client
327 327 self.baseurl = svn.ra.get_repos_root(self.ra)
328 328 # Module is either empty or a repository path starting with
329 329 # a slash and not ending with a slash.
330 330 self.module = urlreq.unquote(self.url[len(self.baseurl):])
331 331 self.prevmodule = None
332 332 self.rootmodule = self.module
333 333 self.commits = {}
334 334 self.paths = {}
335 335 self.uuid = svn.ra.get_uuid(self.ra)
336 336 except svn.core.SubversionException:
337 337 ui.traceback()
338 338 svnversion = '%d.%d.%d' % (svn.core.SVN_VER_MAJOR,
339 339 svn.core.SVN_VER_MINOR,
340 340 svn.core.SVN_VER_MICRO)
341 341 raise NoRepo(_("%s does not look like a Subversion repository "
342 342 "to libsvn version %s")
343 343 % (self.url, svnversion))
344 344
345 345 if revs:
346 346 if len(revs) > 1:
347 347 raise error.Abort(_('subversion source does not support '
348 348 'specifying multiple revisions'))
349 349 try:
350 350 latest = int(revs[0])
351 351 except ValueError:
352 352 raise error.Abort(_('svn: revision %s is not an integer') %
353 353 revs[0])
354 354
355 355 self.trunkname = self.ui.config('convert', 'svn.trunk',
356 356 'trunk').strip('/')
357 357 self.startrev = self.ui.config('convert', 'svn.startrev', default=0)
358 358 try:
359 359 self.startrev = int(self.startrev)
360 360 if self.startrev < 0:
361 361 self.startrev = 0
362 362 except ValueError:
363 363 raise error.Abort(_('svn: start revision %s is not an integer')
364 364 % self.startrev)
365 365
366 366 try:
367 367 self.head = self.latest(self.module, latest)
368 368 except SvnPathNotFound:
369 369 self.head = None
370 370 if not self.head:
371 371 raise error.Abort(_('no revision found in module %s')
372 372 % self.module)
373 373 self.last_changed = self.revnum(self.head)
374 374
375 375 self._changescache = (None, None)
376 376
377 377 if os.path.exists(os.path.join(url, '.svn/entries')):
378 378 self.wc = url
379 379 else:
380 380 self.wc = None
381 381 self.convertfp = None
382 382
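The `file://path@rev` parsing in `__init__` can be sketched as a standalone helper; `split_rev_suffix` is a hypothetical name for illustration:

```python
def split_rev_suffix(url):
    # Split a trailing "@rev" off a Subversion URL, e.g.
    # "file:///repo@123" -> ("file:///repo", 123). Useful e.g. to
    # convert deleted branches, as the comment in __init__ notes;
    # a non-integer suffix leaves the URL untouched.
    at = url.rfind('@')
    if at >= 0:
        try:
            return url[:at], int(url[at + 1:])
        except ValueError:
            pass
    return url, None
```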
383 383 def setrevmap(self, revmap):
384 384 lastrevs = {}
385 385 for revid in revmap.iterkeys():
386 386 uuid, module, revnum = revsplit(revid)
387 387 lastrevnum = lastrevs.setdefault(module, revnum)
388 388 if revnum > lastrevnum:
389 389 lastrevs[module] = revnum
390 390 self.lastrevs = lastrevs
391 391
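`setrevmap`'s per-module maximum can be isolated as below; `lastrevs_by_module` and the toy `revsplit` are hypothetical stand-ins for the real helpers:

```python
def lastrevs_by_module(revids, revsplit):
    # Record the highest already-converted revision per module, as
    # setrevmap does when resuming an incremental conversion.
    lastrevs = {}
    for revid in revids:
        uuid, module, revnum = revsplit(revid)
        if revnum > lastrevs.setdefault(module, revnum):
            lastrevs[module] = revnum
    return lastrevs

# Toy revsplit: "<module>@<revnum>" with a fixed uuid.
toysplit = lambda r: ('uuid', r.split('@')[0], int(r.split('@')[1]))
last = lastrevs_by_module(['trunk@3', 'trunk@7', 'branch@2'], toysplit)
```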
392 392 def exists(self, path, optrev):
393 393 try:
394 394 svn.client.ls(self.url.rstrip('/') + '/' + quote(path),
395 395 optrev, False, self.ctx)
396 396 return True
397 397 except svn.core.SubversionException:
398 398 return False
399 399
400 400 def getheads(self):
401 401
402 402 def isdir(path, revnum):
403 403 kind = self._checkpath(path, revnum)
404 404 return kind == svn.core.svn_node_dir
405 405
406 406 def getcfgpath(name, rev):
407 407 cfgpath = self.ui.config('convert', 'svn.' + name)
408 408 if cfgpath is not None and cfgpath.strip() == '':
409 409 return None
410 410 path = (cfgpath or name).strip('/')
411 411 if not self.exists(path, rev):
412 412 if self.module.endswith(path) and name == 'trunk':
413 413 # we are converting from inside this directory
414 414 return None
415 415 if cfgpath:
416 416 raise error.Abort(_('expected %s to be at %r, but not found'
417 417 ) % (name, path))
418 418 return None
419 419 self.ui.note(_('found %s at %r\n') % (name, path))
420 420 return path
421 421
422 422 rev = optrev(self.last_changed)
423 423 oldmodule = ''
424 424 trunk = getcfgpath('trunk', rev)
425 425 self.tags = getcfgpath('tags', rev)
426 426 branches = getcfgpath('branches', rev)
427 427
428 428 # If the project has a trunk or branches, we will extract heads
429 429 # from them. We keep the project root otherwise.
430 430 if trunk:
431 431 oldmodule = self.module or ''
432 432 self.module += '/' + trunk
433 433 self.head = self.latest(self.module, self.last_changed)
434 434 if not self.head:
435 435 raise error.Abort(_('no revision found in module %s')
436 436 % self.module)
437 437
438 438 # First head in the list is the module's head
439 439 self.heads = [self.head]
440 440 if self.tags is not None:
441 441 self.tags = '%s/%s' % (oldmodule, (self.tags or 'tags'))
442 442
443 443 # Check if branches bring a few more heads to the list
444 444 if branches:
445 445 rpath = self.url.strip('/')
446 446 branchnames = svn.client.ls(rpath + '/' + quote(branches),
447 447 rev, False, self.ctx)
448 448 for branch in sorted(branchnames):
449 449 module = '%s/%s/%s' % (oldmodule, branches, branch)
450 450 if not isdir(module, self.last_changed):
451 451 continue
452 452 brevid = self.latest(module, self.last_changed)
453 453 if not brevid:
454 454 self.ui.note(_('ignoring empty branch %s\n') % branch)
455 455 continue
456 456 self.ui.note(_('found branch %s at %d\n') %
457 457 (branch, self.revnum(brevid)))
458 458 self.heads.append(brevid)
459 459
460 460 if self.startrev and self.heads:
461 461 if len(self.heads) > 1:
462 462 raise error.Abort(_('svn: start revision is not supported '
463 463 'with more than one branch'))
464 464 revnum = self.revnum(self.heads[0])
465 465 if revnum < self.startrev:
466 466 raise error.Abort(
467 467 _('svn: no revision found after start revision %d')
468 468 % self.startrev)
469 469
470 470 return self.heads
471 471
472 472 def _getchanges(self, rev, full):
473 473 (paths, parents) = self.paths[rev]
474 474 copies = {}
475 475 if parents:
476 476 files, self.removed, copies = self.expandpaths(rev, paths, parents)
477 477 if full or not parents:
478 478 # Perform a full checkout on roots
479 479 uuid, module, revnum = revsplit(rev)
480 480 entries = svn.client.ls(self.baseurl + quote(module),
481 481 optrev(revnum), True, self.ctx)
482 482 files = [n for n, e in entries.iteritems()
483 483 if e.kind == svn.core.svn_node_file]
484 484 self.removed = set()
485 485
486 486 files.sort()
487 487 files = zip(files, [rev] * len(files))
488 488 return (files, copies)
489 489
490 490 def getchanges(self, rev, full):
491 491 # reuse cache from getchangedfiles
492 492 if self._changescache[0] == rev and not full:
493 493 (files, copies) = self._changescache[1]
494 494 else:
495 495 (files, copies) = self._getchanges(rev, full)
496 496 # caller caches the result, so free it here to release memory
497 497 del self.paths[rev]
498 498 return (files, copies, set())
499 499
500 500 def getchangedfiles(self, rev, i):
501 501 # called from filemap - cache computed values for reuse in getchanges
502 502 (files, copies) = self._getchanges(rev, False)
503 503 self._changescache = (rev, (files, copies))
504 504 return [f[0] for f in files]
505 505
506 506 def getcommit(self, rev):
507 507 if rev not in self.commits:
508 508 uuid, module, revnum = revsplit(rev)
509 509 self.module = module
510 510 self.reparent(module)
511 511 # We assume that:
512 512 # - requests for revisions after "stop" come from the
513 513 # revision graph backward traversal. Cache all of them
514 514 # down to stop, they will be used eventually.
515 515 # - requests for revisions before "stop" come to get
516 516 # isolated branches parents. Just fetch what is needed.
517 517 stop = self.lastrevs.get(module, 0)
518 518 if revnum < stop:
519 519 stop = revnum + 1
520 520 self._fetch_revisions(revnum, stop)
521 521 if rev not in self.commits:
522 522 raise error.Abort(_('svn: revision %s not found') % revnum)
523 523 revcommit = self.commits[rev]
524 524 # caller caches the result, so free it here to release memory
525 525 del self.commits[rev]
526 526 return revcommit
527 527
528 528 def checkrevformat(self, revstr, mapname='splicemap'):
529 529 """Fails if the revision format does not match the expected format"""
530 530 if not re.match(r'svn:[0-9a-f]{8,8}-[0-9a-f]{4,4}-'
531 531 r'[0-9a-f]{4,4}-[0-9a-f]{4,4}-[0-9a-f]'
532 532 r'{12,12}(.*)\@[0-9]+$',revstr):
533 533 raise error.Abort(_('%s entry %s is not a valid revision'
534 534 ' identifier') % (mapname, revstr))
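The splicemap identifier format checked above — `svn:<uuid><module>@<revnum>` — can be validated standalone with the same regular expression; `is_valid_svn_revid` is a hypothetical name:

```python
import re

# "svn:" + a UUID (8-4-4-4-12 hex digits) + module path + "@" + revnum,
# matching the identifiers produced by svn_source.revid().
_revid_re = re.compile(r'svn:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-'
                       r'[0-9a-f]{4}-[0-9a-f]{12}(.*)@[0-9]+$')

def is_valid_svn_revid(revstr):
    return _revid_re.match(revstr) is not None

ok = is_valid_svn_revid('svn:12345678-1234-1234-1234-123456789abc/trunk@42')
```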
535 535
536 536 def numcommits(self):
537 537 return int(self.head.rsplit('@', 1)[1]) - self.startrev
538 538
539 539 def gettags(self):
540 540 tags = {}
541 541 if self.tags is None:
542 542 return tags
543 543
544 544 # svn tags are just a convention, project branches left in a
545 545 # 'tags' directory. There is no other relationship than
546 546 # ancestry, which is expensive to discover and makes them hard
547 547 # to update incrementally. Worse, past revisions may be
548 548 # referenced by tags far away in the future, requiring a deep
549 549 # history traversal on every calculation. Current code
550 550 # performs a single backward traversal, tracking moves within
551 551 # the tags directory (tag renaming) and recording a new tag
552 552 # every time a project is copied from outside the tags
553 553 # directory. It also lists deleted tags; this behaviour may
554 554 # change in the future.
555 555 pendings = []
556 556 tagspath = self.tags
557 557 start = svn.ra.get_latest_revnum(self.ra)
558 558 stream = self._getlog([self.tags], start, self.startrev)
559 559 try:
560 560 for entry in stream:
561 561 origpaths, revnum, author, date, message = entry
562 562 if not origpaths:
563 563 origpaths = []
564 564 copies = [(e.copyfrom_path, e.copyfrom_rev, p) for p, e
565 565 in origpaths.iteritems() if e.copyfrom_path]
566 566 # Apply moves/copies from more specific to general
567 567 copies.sort(reverse=True)
568 568
569 569 srctagspath = tagspath
570 570 if copies and copies[-1][2] == tagspath:
571 571 # Track tags directory moves
572 572 srctagspath = copies.pop()[0]
573 573
574 574 for source, sourcerev, dest in copies:
575 575 if not dest.startswith(tagspath + '/'):
576 576 continue
577 577 for tag in pendings:
578 578 if tag[0].startswith(dest):
579 579 tagpath = source + tag[0][len(dest):]
580 580 tag[:2] = [tagpath, sourcerev]
581 581 break
582 582 else:
583 583 pendings.append([source, sourcerev, dest])
584 584
585 585 # Filter out tags with children coming from different
586 586 # parts of the repository like:
587 587 # /tags/tag.1 (from /trunk:10)
588 588 # /tags/tag.1/foo (from /branches/foo:12)
589 589 # Here /tags/tag.1 is discarded as well as its children.
590 590 # It happens with tools like cvs2svn. Such tags cannot
591 591 # be represented in mercurial.
592 592 addeds = dict((p, e.copyfrom_path) for p, e
593 593 in origpaths.iteritems()
594 594 if e.action == 'A' and e.copyfrom_path)
595 595 badroots = set()
596 596 for destroot in addeds:
597 597 for source, sourcerev, dest in pendings:
598 598 if (not dest.startswith(destroot + '/')
599 599 or source.startswith(addeds[destroot] + '/')):
600 600 continue
601 601 badroots.add(destroot)
602 602 break
603 603
604 604 for badroot in badroots:
605 605 pendings = [p for p in pendings if p[2] != badroot
606 606 and not p[2].startswith(badroot + '/')]
607 607
608 608 # Distinguish tag renamings from tag creations
609 609 renamings = []
610 610 for source, sourcerev, dest in pendings:
611 611 tagname = dest.split('/')[-1]
612 612 if source.startswith(srctagspath):
613 613 renamings.append([source, sourcerev, tagname])
614 614 continue
615 615 if tagname in tags:
616 616 # Keep the latest tag value
617 617 continue
618 618 # From revision may be fake, get one with changes
619 619 try:
620 620 tagid = self.latest(source, sourcerev)
621 621 if tagid and tagname not in tags:
622 622 tags[tagname] = tagid
623 623 except SvnPathNotFound:
624 624 # It happens when we are following directories
625 625 # we assumed were copied with their parents
626 626 # but were really created in the tag
627 627 # directory.
628 628 pass
629 629 pendings = renamings
630 630 tagspath = srctagspath
631 631 finally:
632 632 stream.close()
633 633 return tags
634 634
635 635 def converted(self, rev, destrev):
636 636 if not self.wc:
637 637 return
638 638 if self.convertfp is None:
639 639 self.convertfp = open(os.path.join(self.wc, '.svn', 'hg-shamap'),
640 640 'a')
641 641 self.convertfp.write('%s %d\n' % (destrev, self.revnum(rev)))
642 642 self.convertfp.flush()
643 643
644 644 def revid(self, revnum, module=None):
645 645 return 'svn:%s%s@%s' % (self.uuid, module or self.module, revnum)
646 646
647 647 def revnum(self, rev):
648 648 return int(rev.split('@')[-1])
649 649
650 650 def latest(self, path, stop=None):
651 651 """Find the latest revid affecting path, up to stop revision
652 652 number. If stop is None, default to repository latest
653 653 revision. It may return a revision in a different module,
654 654 since a branch may be moved without a change being
655 655 reported. Return None if computed module does not belong to
656 656 rootmodule subtree.
657 657 """
658 658 def findchanges(path, start, stop=None):
659 659 stream = self._getlog([path], start, stop or 1)
660 660 try:
661 661 for entry in stream:
662 662 paths, revnum, author, date, message = entry
663 663 if stop is None and paths:
664 664 # We do not know the latest changed revision,
665 665 # keep the first one with changed paths.
666 666 break
667 667 if revnum <= stop:
668 668 break
669 669
670 670 for p in paths:
671 671 if (not path.startswith(p) or
672 672 not paths[p].copyfrom_path):
673 673 continue
674 674 newpath = paths[p].copyfrom_path + path[len(p):]
675 675 self.ui.debug("branch renamed from %s to %s at %d\n" %
676 676 (path, newpath, revnum))
677 677 path = newpath
678 678 break
679 679 if not paths:
680 680 revnum = None
681 681 return revnum, path
682 682 finally:
683 683 stream.close()
684 684
685 685 if not path.startswith(self.rootmodule):
686 686 # Requests on foreign branches may be forbidden at server level
687 687 self.ui.debug('ignoring foreign branch %r\n' % path)
688 688 return None
689 689
690 690 if stop is None:
691 691 stop = svn.ra.get_latest_revnum(self.ra)
692 692 try:
693 693 prevmodule = self.reparent('')
694 694 dirent = svn.ra.stat(self.ra, path.strip('/'), stop)
695 695 self.reparent(prevmodule)
696 696 except svn.core.SubversionException:
697 697 dirent = None
698 698 if not dirent:
699 699 raise SvnPathNotFound(_('%s not found up to revision %d')
700 700 % (path, stop))
701 701
702 702 # stat() gives us the previous revision on this line of
703 703 # development, but it might be in *another module*. Fetch the
704 704 # log and detect renames down to the latest revision.
705 705 revnum, realpath = findchanges(path, stop, dirent.created_rev)
706 706 if revnum is None:
707 707 # Tools like svnsync can create empty revisions, when
708 708 # synchronizing only a subtree for instance. These empty
709 709 # revisions' created_rev still have their original values
710 710 # despite all changes having disappeared and can be
711 711 # returned by ra.stat(), at least when stating the root
712 712 # module. In that case, do not trust created_rev and scan
713 713 # the whole history.
714 714 revnum, realpath = findchanges(path, stop)
715 715 if revnum is None:
716 716 self.ui.debug('ignoring empty branch %r\n' % realpath)
717 717 return None
718 718
719 719 if not realpath.startswith(self.rootmodule):
720 720 self.ui.debug('ignoring foreign branch %r\n' % realpath)
721 721 return None
722 722 return self.revid(revnum, realpath)
723 723
724 724 def reparent(self, module):
725 725 """Reparent the svn transport and return the previous parent."""
726 726 if self.prevmodule == module:
727 727 return module
728 728 svnurl = self.baseurl + quote(module)
729 729 prevmodule = self.prevmodule
730 730 if prevmodule is None:
731 731 prevmodule = ''
732 732 self.ui.debug("reparent to %s\n" % svnurl)
733 733 svn.ra.reparent(self.ra, svnurl)
734 734 self.prevmodule = module
735 735 return prevmodule
736 736
737 737 def expandpaths(self, rev, paths, parents):
738 738 changed, removed = set(), set()
739 739 copies = {}
740 740
741 741 new_module, revnum = revsplit(rev)[1:]
742 742 if new_module != self.module:
743 743 self.module = new_module
744 744 self.reparent(self.module)
745 745
746 746 for i, (path, ent) in enumerate(paths):
747 747 self.ui.progress(_('scanning paths'), i, item=path,
748 748 total=len(paths), unit=_('paths'))
749 749 entrypath = self.getrelpath(path)
750 750
751 751 kind = self._checkpath(entrypath, revnum)
752 752 if kind == svn.core.svn_node_file:
753 753 changed.add(self.recode(entrypath))
754 754 if not ent.copyfrom_path or not parents:
755 755 continue
756 756 # Copy sources not in parent revisions cannot be
757 757 # represented, ignore their origin for now
758 758 pmodule, prevnum = revsplit(parents[0])[1:]
759 759 if ent.copyfrom_rev < prevnum:
760 760 continue
761 761 copyfrom_path = self.getrelpath(ent.copyfrom_path, pmodule)
762 762 if not copyfrom_path:
763 763 continue
764 764 self.ui.debug("copied to %s from %s@%s\n" %
765 765 (entrypath, copyfrom_path, ent.copyfrom_rev))
766 766 copies[self.recode(entrypath)] = self.recode(copyfrom_path)
767 767 elif kind == 0: # gone, but had better be a deleted *file*
768 768 self.ui.debug("gone from %s\n" % ent.copyfrom_rev)
769 769 pmodule, prevnum = revsplit(parents[0])[1:]
770 770 parentpath = pmodule + "/" + entrypath
771 771 fromkind = self._checkpath(entrypath, prevnum, pmodule)
772 772
773 773 if fromkind == svn.core.svn_node_file:
774 774 removed.add(self.recode(entrypath))
775 775 elif fromkind == svn.core.svn_node_dir:
776 776 oroot = parentpath.strip('/')
777 777 nroot = path.strip('/')
778 778 children = self._iterfiles(oroot, prevnum)
779 779 for childpath in children:
780 780 childpath = childpath.replace(oroot, nroot)
781 781 childpath = self.getrelpath("/" + childpath, pmodule)
782 782 if childpath:
783 783 removed.add(self.recode(childpath))
784 784 else:
785 785 self.ui.debug('unknown path in revision %d: %s\n' % \
786 786 (revnum, path))
787 787 elif kind == svn.core.svn_node_dir:
788 788 if ent.action == 'M':
789 789 # If the directory just had a prop change,
790 790 # then we shouldn't need to look for its children.
791 791 continue
792 792 if ent.action == 'R' and parents:
793 793 # If a directory is replacing a file, mark the previous
794 794 # file as deleted
795 795 pmodule, prevnum = revsplit(parents[0])[1:]
796 796 pkind = self._checkpath(entrypath, prevnum, pmodule)
797 797 if pkind == svn.core.svn_node_file:
798 798 removed.add(self.recode(entrypath))
799 799 elif pkind == svn.core.svn_node_dir:
800 800 # We do not know what files were kept or removed,
801 801 # mark them all as changed.
802 802 for childpath in self._iterfiles(pmodule, prevnum):
803 803 childpath = self.getrelpath("/" + childpath)
804 804 if childpath:
805 805 changed.add(self.recode(childpath))
806 806
807 807 for childpath in self._iterfiles(path, revnum):
808 808 childpath = self.getrelpath("/" + childpath)
809 809 if childpath:
810 810 changed.add(self.recode(childpath))
811 811
812 812 # Handle directory copies
813 813 if not ent.copyfrom_path or not parents:
814 814 continue
815 815 # Copy sources not in parent revisions cannot be
816 816 # represented, ignore their origin for now
817 817 pmodule, prevnum = revsplit(parents[0])[1:]
818 818 if ent.copyfrom_rev < prevnum:
819 819 continue
820 820 copyfrompath = self.getrelpath(ent.copyfrom_path, pmodule)
821 821 if not copyfrompath:
822 822 continue
823 823 self.ui.debug("mark %s came from %s:%d\n"
824 824 % (path, copyfrompath, ent.copyfrom_rev))
825 825 children = self._iterfiles(ent.copyfrom_path, ent.copyfrom_rev)
826 826 for childpath in children:
827 827 childpath = self.getrelpath("/" + childpath, pmodule)
828 828 if not childpath:
829 829 continue
830 830 copytopath = path + childpath[len(copyfrompath):]
831 831 copytopath = self.getrelpath(copytopath)
832 832 copies[self.recode(copytopath)] = self.recode(childpath)
833 833
834 834 self.ui.progress(_('scanning paths'), None)
835 835 changed.update(removed)
836 836 return (list(changed), removed, copies)
837 837
838 838 def _fetch_revisions(self, from_revnum, to_revnum):
839 839 if from_revnum < to_revnum:
840 840 from_revnum, to_revnum = to_revnum, from_revnum
841 841
842 842 self.child_cset = None
843 843
844 844 def parselogentry(orig_paths, revnum, author, date, message):
845 845 """Return the parsed commit object or None, and True if
846 846 the revision is a branch root.
847 847 """
848 848 self.ui.debug("parsing revision %d (%d changes)\n" %
849 849 (revnum, len(orig_paths)))
850 850
851 851 branched = False
852 852 rev = self.revid(revnum)
853 853 # branch log might return entries for a parent we already have
854 854
855 855 if rev in self.commits or revnum < to_revnum:
856 856 return None, branched
857 857
858 858 parents = []
859 859 # check whether this revision is the start of a branch or part
860 860 # of a branch renaming
861 861 orig_paths = sorted(orig_paths.iteritems())
862 862 root_paths = [(p, e) for p, e in orig_paths
863 863 if self.module.startswith(p)]
864 864 if root_paths:
865 865 path, ent = root_paths[-1]
866 866 if ent.copyfrom_path:
867 867 branched = True
868 868 newpath = ent.copyfrom_path + self.module[len(path):]
869 869 # ent.copyfrom_rev may not be the actual last revision
870 870 previd = self.latest(newpath, ent.copyfrom_rev)
871 871 if previd is not None:
872 872 prevmodule, prevnum = revsplit(previd)[1:]
873 873 if prevnum >= self.startrev:
874 874 parents = [previd]
875 875 self.ui.note(
876 876 _('found parent of branch %s at %d: %s\n') %
877 877 (self.module, prevnum, prevmodule))
878 878 else:
879 879 self.ui.debug("no copyfrom path, don't know what to do.\n")
880 880
881 881 paths = []
882 882 # filter out unrelated paths
883 883 for path, ent in orig_paths:
884 884 if self.getrelpath(path) is None:
885 885 continue
886 886 paths.append((path, ent))
887 887
888 888 # Example SVN datetime. Includes microseconds.
889 889 # ISO-8601 conformant
890 890 # '2007-01-04T17:35:00.902377Z'
891 891 date = util.parsedate(date[:19] + " UTC", ["%Y-%m-%dT%H:%M:%S"])
892 892 if self.ui.configbool('convert', 'localtimezone'):
893 893 date = makedatetimestamp(date[0])
894 894
895 895 if message:
896 896 log = self.recode(message)
897 897 else:
898 898 log = ''
899 899
900 900 if author:
901 901 author = self.recode(author)
902 902 else:
903 903 author = ''
904 904
905 905 try:
906 906 branch = self.module.split("/")[-1]
907 907 if branch == self.trunkname:
908 908 branch = None
909 909 except IndexError:
910 910 branch = None
911 911
912 912 cset = commit(author=author,
913 913 date=util.datestr(date, '%Y-%m-%d %H:%M:%S %1%2'),
914 914 desc=log,
915 915 parents=parents,
916 916 branch=branch,
917 917 rev=rev)
918 918
919 919 self.commits[rev] = cset
920 920 # The parents list is *shared* among self.paths and the
921 921 # commit object. Both will be updated below.
922 922 self.paths[rev] = (paths, cset.parents)
923 923 if self.child_cset and not self.child_cset.parents:
924 924 self.child_cset.parents[:] = [rev]
925 925 self.child_cset = cset
926 926 return cset, branched
927 927
928 928 self.ui.note(_('fetching revision log for "%s" from %d to %d\n') %
929 929 (self.module, from_revnum, to_revnum))
930 930
931 931 try:
932 932 firstcset = None
933 933 lastonbranch = False
934 934 stream = self._getlog([self.module], from_revnum, to_revnum)
935 935 try:
936 936 for entry in stream:
937 937 paths, revnum, author, date, message = entry
938 938 if revnum < self.startrev:
939 939 lastonbranch = True
940 940 break
941 941 if not paths:
942 942 self.ui.debug('revision %d has no entries\n' % revnum)
943 943 # If we ever leave the loop on an empty
944 944 # revision, do not try to get a parent branch
945 945 lastonbranch = lastonbranch or revnum == 0
946 946 continue
947 947 cset, lastonbranch = parselogentry(paths, revnum, author,
948 948 date, message)
949 949 if cset:
950 950 firstcset = cset
951 951 if lastonbranch:
952 952 break
953 953 finally:
954 954 stream.close()
955 955
956 956 if not lastonbranch and firstcset and not firstcset.parents:
957 957 # The first revision of the sequence (the last fetched one)
958 958 # has invalid parents if not a branch root. Find the parent
959 959 # revision now, if any.
960 960 try:
961 961 firstrevnum = self.revnum(firstcset.rev)
962 962 if firstrevnum > 1:
963 963 latest = self.latest(self.module, firstrevnum - 1)
964 964 if latest:
965 965 firstcset.parents.append(latest)
966 966 except SvnPathNotFound:
967 967 pass
968 968 except svn.core.SubversionException as xxx_todo_changeme:
969 969 (inst, num) = xxx_todo_changeme.args
970 970 if num == svn.core.SVN_ERR_FS_NO_SUCH_REVISION:
971 971 raise error.Abort(_('svn: branch has no revision %s')
972 972 % to_revnum)
973 973 raise
974 974
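The timestamp handling in `parselogentry` — truncate the microseconds and parse the remainder as UTC — can be sketched with the stdlib. `parse_svn_date` is a hypothetical name standing in for the `util.parsedate` call:

```python
import calendar
import time

def parse_svn_date(date):
    # SVN reports ISO-8601 UTC with microseconds, e.g.
    # '2007-01-04T17:35:00.902377Z'. Keep only the first 19
    # characters (seconds precision) and parse them as UTC,
    # returning a (unixtime, tz offset) pair.
    st = time.strptime(date[:19], '%Y-%m-%dT%H:%M:%S')
    return calendar.timegm(st), 0  # offset 0: the timestamp is UTC

ts, offset = parse_svn_date('2007-01-04T17:35:00.902377Z')
```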
975 975 def getfile(self, file, rev):
976 976 # TODO: ra.get_file transmits the whole file instead of diffs.
977 977 if file in self.removed:
978 978 return None, None
979 979 mode = ''
980 980 try:
981 981 new_module, revnum = revsplit(rev)[1:]
982 982 if self.module != new_module:
983 983 self.module = new_module
984 984 self.reparent(self.module)
985 985 io = stringio()
986 986 info = svn.ra.get_file(self.ra, file, revnum, io)
987 987 data = io.getvalue()
988 988 # ra.get_file() seems to keep a reference on the input buffer
989 989 # preventing collection. Release it explicitly.
990 990 io.close()
991 991 if isinstance(info, list):
992 992 info = info[-1]
993 993 mode = ("svn:executable" in info) and 'x' or ''
994 994 mode = ("svn:special" in info) and 'l' or mode
995 995 except svn.core.SubversionException as e:
996 996 notfound = (svn.core.SVN_ERR_FS_NOT_FOUND,
997 997 svn.core.SVN_ERR_RA_DAV_PATH_NOT_FOUND)
998 998 if e.apr_err in notfound: # File not found
999 999 return None, None
1000 1000 raise
1001 1001 if mode == 'l':
1002 1002 link_prefix = "link "
1003 1003 if data.startswith(link_prefix):
1004 1004 data = data[len(link_prefix):]
1005 1005 return data, mode
1006 1006
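The `svn:special` symlink decoding at the end of `getfile` can be shown standalone; `decode_special` is a hypothetical name for illustration:

```python
def decode_special(data, mode):
    # svn:special files store symlink targets as "link <target>";
    # strip the prefix when the mode marks the file as a link ('l'),
    # mirroring the tail of svn_source.getfile.
    if mode == 'l':
        link_prefix = "link "
        if data.startswith(link_prefix):
            data = data[len(link_prefix):]
    return data
```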
1007 1007 def _iterfiles(self, path, revnum):
1008 1008 """Enumerate all files in path at revnum, recursively."""
1009 1009 path = path.strip('/')
1010 1010 pool = svn.core.Pool()
1011 1011 rpath = '/'.join([self.baseurl, quote(path)]).strip('/')
1012 1012 entries = svn.client.ls(rpath, optrev(revnum), True, self.ctx, pool)
1013 1013 if path:
1014 1014 path += '/'
1015 1015 return ((path + p) for p, e in entries.iteritems()
1016 1016 if e.kind == svn.core.svn_node_file)
1017 1017
1018 1018 def getrelpath(self, path, module=None):
1019 1019 if module is None:
1020 1020 module = self.module
1021 1021 # Given the repository url of this wc, say
1022 1022 # "http://server/plone/CMFPlone/branches/Plone-2_0-branch"
1023 1023 # extract the "entry" portion (a relative path) from what
1024 1024 # svn log --xml says, i.e.
1025 1025 # "/CMFPlone/branches/Plone-2_0-branch/tests/PloneTestCase.py"
1026 1026 # that is to say "tests/PloneTestCase.py"
1027 1027 if path.startswith(module):
1028 1028 relative = path.rstrip('/')[len(module):]
1029 1029 if relative.startswith('/'):
1030 1030 return relative[1:]
1031 1031 elif relative == '':
1032 1032 return relative
1033 1033
1034 1034 # The path is outside our tracked tree...
1035 1035 self.ui.debug('%r is not under %r, ignoring\n' % (path, module))
1036 1036 return None
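The path-extraction rule documented in the comment above can be sketched as a standalone helper (the name `relpath` is illustrative, not part of the module's API; it mirrors `getrelpath`'s string logic without the `ui.debug` call):

```python
# Minimal sketch of getrelpath's string handling: strip the module
# prefix from a repository path, returning a relative path, or None
# when the path falls outside the tracked tree.
def relpath(path, module):
    if path.startswith(module):
        relative = path.rstrip('/')[len(module):]
        if relative.startswith('/'):
            return relative[1:]
        if relative == '':
            return relative
    return None  # outside the tracked tree

# The example from the comment above:
assert relpath('/CMFPlone/branches/Plone-2_0-branch/tests/PloneTestCase.py',
               '/CMFPlone/branches/Plone-2_0-branch') == 'tests/PloneTestCase.py'
```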
1037 1037
1038 1038 def _checkpath(self, path, revnum, module=None):
1039 1039 if module is not None:
1040 1040 prevmodule = self.reparent('')
1041 1041 path = module + '/' + path
1042 1042 try:
1043 1043 # ra.check_path does not like leading slashes very much; they lead
1044 1044 # to PROPFIND subversion errors
1045 1045 return svn.ra.check_path(self.ra, path.strip('/'), revnum)
1046 1046 finally:
1047 1047 if module is not None:
1048 1048 self.reparent(prevmodule)
1049 1049
1050 1050 def _getlog(self, paths, start, end, limit=0, discover_changed_paths=True,
1051 1051 strict_node_history=False):
1052 1052 # Normalize path names, svn >= 1.5 only wants paths relative to
1053 1053 # supplied URL
1054 1054 relpaths = []
1055 1055 for p in paths:
1056 1056 if not p.startswith('/'):
1057 1057 p = self.module + '/' + p
1058 1058 relpaths.append(p.strip('/'))
1059 1059 args = [self.baseurl, relpaths, start, end, limit,
1060 1060 discover_changed_paths, strict_node_history]
1061 1061 # developer config: convert.svn.debugsvnlog
1062 1062 if not self.ui.configbool('convert', 'svn.debugsvnlog', True):
1063 1063 return directlogstream(*args)
1064 1064 arg = encodeargs(args)
1065 1065 hgexe = util.hgexecutable()
1066 1066 cmd = '%s debugsvnlog' % util.shellquote(hgexe)
1067 1067 stdin, stdout = util.popen2(util.quotecommand(cmd))
1068 1068 stdin.write(arg)
1069 1069 try:
1070 1070 stdin.close()
1071 1071 except IOError:
1072 1072 raise error.Abort(_('Mercurial failed to run itself, check'
1073 1073 ' hg executable is in PATH'))
1074 1074 return logstream(stdout)
1075 1075
1076 1076 pre_revprop_change = '''#!/bin/sh
1077 1077
1078 1078 REPOS="$1"
1079 1079 REV="$2"
1080 1080 USER="$3"
1081 1081 PROPNAME="$4"
1082 1082 ACTION="$5"
1083 1083
1084 1084 if [ "$ACTION" = "M" -a "$PROPNAME" = "svn:log" ]; then exit 0; fi
1085 1085 if [ "$ACTION" = "A" -a "$PROPNAME" = "hg:convert-branch" ]; then exit 0; fi
1086 1086 if [ "$ACTION" = "A" -a "$PROPNAME" = "hg:convert-rev" ]; then exit 0; fi
1087 1087
1088 1088 echo "Changing prohibited revision property" >&2
1089 1089 exit 1
1090 1090 '''
1091 1091
1092 1092 class svn_sink(converter_sink, commandline):
1093 1093 commit_re = re.compile(r'Committed revision (\d+).', re.M)
1094 1094 uuid_re = re.compile(r'Repository UUID:\s*(\S+)', re.M)
1095 1095
1096 1096 def prerun(self):
1097 1097 if self.wc:
1098 1098 os.chdir(self.wc)
1099 1099
1100 1100 def postrun(self):
1101 1101 if self.wc:
1102 1102 os.chdir(self.cwd)
1103 1103
1104 1104 def join(self, name):
1105 1105 return os.path.join(self.wc, '.svn', name)
1106 1106
1107 1107 def revmapfile(self):
1108 1108 return self.join('hg-shamap')
1109 1109
1110 1110 def authorfile(self):
1111 1111 return self.join('hg-authormap')
1112 1112
1113 1113 def __init__(self, ui, path):
1114 1114
1115 1115 converter_sink.__init__(self, ui, path)
1116 1116 commandline.__init__(self, ui, 'svn')
1117 1117 self.delete = []
1118 1118 self.setexec = []
1119 1119 self.delexec = []
1120 1120 self.copies = []
1121 1121 self.wc = None
1122 1122 self.cwd = pycompat.getcwd()
1123 1123
1124 1124 created = False
1125 1125 if os.path.isfile(os.path.join(path, '.svn', 'entries')):
1126 1126 self.wc = os.path.realpath(path)
1127 1127 self.run0('update')
1128 1128 else:
1129 1129 if not re.search(r'^(file|http|https|svn|svn\+ssh)\://', path):
1130 1130 path = os.path.realpath(path)
1131 1131 if os.path.isdir(os.path.dirname(path)):
1132 1132 if not os.path.exists(os.path.join(path, 'db', 'fs-type')):
1133 1133 ui.status(_('initializing svn repository %r\n') %
1134 1134 os.path.basename(path))
1135 1135 commandline(ui, 'svnadmin').run0('create', path)
1136 1136 created = path
1137 1137 path = util.normpath(path)
1138 1138 if not path.startswith('/'):
1139 1139 path = '/' + path
1140 1140 path = 'file://' + path
1141 1141
1142 1142 wcpath = os.path.join(pycompat.getcwd(), os.path.basename(path) +
1143 1143 '-wc')
1144 1144 ui.status(_('initializing svn working copy %r\n')
1145 1145 % os.path.basename(wcpath))
1146 1146 self.run0('checkout', path, wcpath)
1147 1147
1148 1148 self.wc = wcpath
1149 1149 self.opener = scmutil.opener(self.wc)
1150 1150 self.wopener = scmutil.opener(self.wc)
1151 1151 self.childmap = mapfile(ui, self.join('hg-childmap'))
1152 1152 if util.checkexec(self.wc):
1153 1153 self.is_exec = util.isexec
1154 1154 else:
1155 1155 self.is_exec = None
1156 1156
1157 1157 if created:
1158 1158 hook = os.path.join(created, 'hooks', 'pre-revprop-change')
1159 1159 fp = open(hook, 'w')
1160 1160 fp.write(pre_revprop_change)
1161 1161 fp.close()
1162 1162 util.setflags(hook, False, True)
1163 1163
1164 1164 output = self.run0('info')
1165 1165 self.uuid = self.uuid_re.search(output).group(1).strip()
1166 1166
1167 1167 def wjoin(self, *names):
1168 1168 return os.path.join(self.wc, *names)
1169 1169
1170 1170 @propertycache
1171 1171 def manifest(self):
1172 1172 # As of svn 1.7, the "add" command fails when receiving
1173 1173 # already tracked entries, so we have to track and filter them
1174 1174 # ourselves.
1175 1175 m = set()
1176 1176 output = self.run0('ls', recursive=True, xml=True)
1177 1177 doc = xml.dom.minidom.parseString(output)
1178 1178 for e in doc.getElementsByTagName('entry'):
1179 1179 for n in e.childNodes:
1180 1180 if n.nodeType != n.ELEMENT_NODE or n.tagName != 'name':
1181 1181 continue
1182 1182 name = ''.join(c.data for c in n.childNodes
1183 1183 if c.nodeType == c.TEXT_NODE)
1184 1184 # Entries are compared with names coming from
1185 1185 # mercurial, so bytes with undefined encoding. Our
1186 1186 # best bet is to assume they are in local
1187 1187 # encoding. They will be passed to command line calls
1188 1188 # later anyway, so they better be.
1189 1189 m.add(encoding.tolocal(name.encode('utf-8')))
1190 1190 break
1191 1191 return m
1192 1192
1193 1193 def putfile(self, filename, flags, data):
1194 1194 if 'l' in flags:
1195 1195 self.wopener.symlink(data, filename)
1196 1196 else:
1197 1197 try:
1198 1198 if os.path.islink(self.wjoin(filename)):
1199 1199 os.unlink(filename)
1200 1200 except OSError:
1201 1201 pass
1202 1202 self.wopener.write(filename, data)
1203 1203
1204 1204 if self.is_exec:
1205 1205 if self.is_exec(self.wjoin(filename)):
1206 1206 if 'x' not in flags:
1207 1207 self.delexec.append(filename)
1208 1208 else:
1209 1209 if 'x' in flags:
1210 1210 self.setexec.append(filename)
1211 1211 util.setflags(self.wjoin(filename), False, 'x' in flags)
1212 1212
1213 1213 def _copyfile(self, source, dest):
1214 1214 # SVN's copy command pukes if the destination file exists, but
1215 1215 # our copyfile method expects to record a copy that has
1216 1216 # already occurred. Cross the semantic gap.
1217 1217 wdest = self.wjoin(dest)
1218 1218 exists = os.path.lexists(wdest)
1219 1219 if exists:
1220 1220 fd, tempname = tempfile.mkstemp(
1221 1221 prefix='hg-copy-', dir=os.path.dirname(wdest))
1222 1222 os.close(fd)
1223 1223 os.unlink(tempname)
1224 1224 os.rename(wdest, tempname)
1225 1225 try:
1226 1226 self.run0('copy', source, dest)
1227 1227 finally:
1228 1228 self.manifest.add(dest)
1229 1229 if exists:
1230 1230 try:
1231 1231 os.unlink(wdest)
1232 1232 except OSError:
1233 1233 pass
1234 1234 os.rename(tempname, wdest)
1235 1235
1236 1236 def dirs_of(self, files):
1237 1237 dirs = set()
1238 1238 for f in files:
1239 1239 if os.path.isdir(self.wjoin(f)):
1240 1240 dirs.add(f)
1241 1241 i = len(f)
1242 1242 for i in iter(lambda: f.rfind('/', 0, i), -1):
1243 1243 dirs.add(f[:i])
1244 1244 return dirs
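The `iter(lambda: f.rfind('/', 0, i), -1)` idiom in `dirs_of` walks each `/` from right to left, yielding every ancestor prefix of a path. A small standalone sketch (the name `parent_dirs` is illustrative only):

```python
# Enumerate every ancestor directory prefix of a slash-separated path,
# using the same rfind-based loop as dirs_of above.
def parent_dirs(f):
    dirs = set()
    i = len(f)
    # rfind returns -1 when no '/' remains, which ends the iteration
    for i in iter(lambda: f.rfind('/', 0, i), -1):
        dirs.add(f[:i])
    return dirs

assert parent_dirs('a/b/c.txt') == {'a', 'a/b'}
```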
1245 1245
1246 1246 def add_dirs(self, files):
1247 1247 add_dirs = [d for d in sorted(self.dirs_of(files))
1248 1248 if d not in self.manifest]
1249 1249 if add_dirs:
1250 1250 self.manifest.update(add_dirs)
1251 1251 self.xargs(add_dirs, 'add', non_recursive=True, quiet=True)
1252 1252 return add_dirs
1253 1253
1254 1254 def add_files(self, files):
1255 1255 files = [f for f in files if f not in self.manifest]
1256 1256 if files:
1257 1257 self.manifest.update(files)
1258 1258 self.xargs(files, 'add', quiet=True)
1259 1259 return files
1260 1260
1261 1261 def addchild(self, parent, child):
1262 1262 self.childmap[parent] = child
1263 1263
1264 1264 def revid(self, rev):
1265 1265 return u"svn:%s@%s" % (self.uuid, rev)
1266 1266
1267 1267 def putcommit(self, files, copies, parents, commit, source, revmap, full,
1268 1268 cleanp2):
1269 1269 for parent in parents:
1270 1270 try:
1271 1271 return self.revid(self.childmap[parent])
1272 1272 except KeyError:
1273 1273 pass
1274 1274
1275 1275 # Apply changes to working copy
1276 1276 for f, v in files:
1277 1277 data, mode = source.getfile(f, v)
1278 1278 if data is None:
1279 1279 self.delete.append(f)
1280 1280 else:
1281 1281 self.putfile(f, mode, data)
1282 1282 if f in copies:
1283 1283 self.copies.append([copies[f], f])
1284 1284 if full:
1285 1285 self.delete.extend(sorted(self.manifest.difference(files)))
1286 1286 files = [f[0] for f in files]
1287 1287
1288 1288 entries = set(self.delete)
1289 1289 files = frozenset(files)
1290 1290 entries.update(self.add_dirs(files.difference(entries)))
1291 1291 if self.copies:
1292 1292 for s, d in self.copies:
1293 1293 self._copyfile(s, d)
1294 1294 self.copies = []
1295 1295 if self.delete:
1296 1296 self.xargs(self.delete, 'delete')
1297 1297 for f in self.delete:
1298 1298 self.manifest.remove(f)
1299 1299 self.delete = []
1300 1300 entries.update(self.add_files(files.difference(entries)))
1301 1301 if self.delexec:
1302 1302 self.xargs(self.delexec, 'propdel', 'svn:executable')
1303 1303 self.delexec = []
1304 1304 if self.setexec:
1305 1305 self.xargs(self.setexec, 'propset', 'svn:executable', '*')
1306 1306 self.setexec = []
1307 1307
1308 1308 fd, messagefile = tempfile.mkstemp(prefix='hg-convert-')
1309 1309 fp = os.fdopen(fd, 'w')
1310 1310 fp.write(commit.desc)
1311 1311 fp.close()
1312 1312 try:
1313 1313 output = self.run0('commit',
1314 1314 username=util.shortuser(commit.author),
1315 1315 file=messagefile,
1316 1316 encoding='utf-8')
1317 1317 try:
1318 1318 rev = self.commit_re.search(output).group(1)
1319 1319 except AttributeError:
1320 1320 if parents and not files:
1321 1321 return parents[0]
1322 1322 self.ui.warn(_('unexpected svn output:\n'))
1323 1323 self.ui.warn(output)
1324 1324 raise error.Abort(_('unable to cope with svn output'))
1325 1325 if commit.rev:
1326 1326 self.run('propset', 'hg:convert-rev', commit.rev,
1327 1327 revprop=True, revision=rev)
1328 1328 if commit.branch and commit.branch != 'default':
1329 1329 self.run('propset', 'hg:convert-branch', commit.branch,
1330 1330 revprop=True, revision=rev)
1331 1331 for parent in parents:
1332 1332 self.addchild(parent, rev)
1333 1333 return self.revid(rev)
1334 1334 finally:
1335 1335 os.unlink(messagefile)
1336 1336
1337 1337 def puttags(self, tags):
1338 1338 self.ui.warn(_('writing Subversion tags is not yet implemented\n'))
1339 1339 return None, None
1340 1340
1341 1341 def hascommitfrommap(self, rev):
1342 1342 # We trust that revisions referenced in a map are still present
1343 1343 # TODO: implement something better if necessary and feasible
1344 1344 return True
1345 1345
1346 1346 def hascommitforsplicemap(self, rev):
1347 1347 # This is not correct as one can convert to an existing subversion
1348 1348 # repository and childmap would not list all revisions. Too bad.
1349 1349 if rev in self.childmap:
1350 1350 return True
1351 1351 raise error.Abort(_('splice map revision %s not found in subversion '
1352 1352 'child map (revision lookups are not implemented)')
1353 1353 % rev)
@@ -1,662 +1,664 b''
1 1 # Copyright 2009-2010 Gregory P. Ward
2 2 # Copyright 2009-2010 Intelerad Medical Systems Incorporated
3 3 # Copyright 2010-2011 Fog Creek Software
4 4 # Copyright 2010-2011 Unity Technologies
5 5 #
6 6 # This software may be used and distributed according to the terms of the
7 7 # GNU General Public License version 2 or any later version.
8 8
9 9 '''largefiles utility code: must not import other modules in this package.'''
10 10 from __future__ import absolute_import
11 11
12 12 import copy
13 13 import hashlib
14 14 import os
15 15 import platform
16 16 import stat
17 17
18 18 from mercurial.i18n import _
19 19
20 20 from mercurial import (
21 21 dirstate,
22 22 error,
23 23 httpconnection,
24 24 match as matchmod,
25 25 node,
26 pycompat,
26 27 scmutil,
27 28 util,
28 29 )
29 30
30 31 shortname = '.hglf'
31 32 shortnameslash = shortname + '/'
32 33 longname = 'largefiles'
33 34
34 35 # -- Private worker functions ------------------------------------------
35 36
36 37 def getminsize(ui, assumelfiles, opt, default=10):
37 38 lfsize = opt
38 39 if not lfsize and assumelfiles:
39 40 lfsize = ui.config(longname, 'minsize', default=default)
40 41 if lfsize:
41 42 try:
42 43 lfsize = float(lfsize)
43 44 except ValueError:
44 45 raise error.Abort(_('largefiles: size must be a number (not %s)\n')
45 46 % lfsize)
46 47 if lfsize is None:
47 48 raise error.Abort(_('minimum size for largefiles must be specified'))
48 49 return lfsize
49 50
50 51 def link(src, dest):
51 52 """Try to create hardlink - if that fails, efficiently make a copy."""
52 53 util.makedirs(os.path.dirname(dest))
53 54 try:
54 55 util.oslink(src, dest)
55 56 except OSError:
56 57 # if hardlinks fail, fallback on atomic copy
57 58 with open(src, 'rb') as srcf:
58 59 with util.atomictempfile(dest) as dstf:
59 60 for chunk in util.filechunkiter(srcf):
60 61 dstf.write(chunk)
61 62 os.chmod(dest, os.stat(src).st_mode)
62 63
63 64 def usercachepath(ui, hash):
64 65 '''Return the correct location in the "global" largefiles cache for a file
65 66 with the given hash.
66 67 This cache is used for sharing of largefiles across repositories - both
67 68 to preserve download bandwidth and storage space.'''
68 69 return os.path.join(_usercachedir(ui), hash)
69 70
70 71 def _usercachedir(ui):
71 72 '''Return the location of the "global" largefiles cache.'''
72 73 path = ui.configpath(longname, 'usercache', None)
73 74 if path:
74 75 return path
75 if os.name == 'nt':
76 if pycompat.osname == 'nt':
76 77 appdata = os.getenv('LOCALAPPDATA', os.getenv('APPDATA'))
77 78 if appdata:
78 79 return os.path.join(appdata, longname)
79 80 elif platform.system() == 'Darwin':
80 81 home = os.getenv('HOME')
81 82 if home:
82 83 return os.path.join(home, 'Library', 'Caches', longname)
83 elif os.name == 'posix':
84 elif pycompat.osname == 'posix':
84 85 path = os.getenv('XDG_CACHE_HOME')
85 86 if path:
86 87 return os.path.join(path, longname)
87 88 home = os.getenv('HOME')
88 89 if home:
89 90 return os.path.join(home, '.cache', longname)
90 91 else:
91 raise error.Abort(_('unknown operating system: %s\n') % os.name)
92 raise error.Abort(_('unknown operating system: %s\n')
93 % pycompat.osname)
92 94 raise error.Abort(_('unknown %s usercache location') % longname)
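The lookup order in `_usercachedir` can be sketched standalone. This is an illustration only: the real code uses `pycompat.osname` (hg's bytes-safe `os.name`, the point of this changeset) and `ui.configpath` first; plain `os.name` is used here to keep the sketch self-contained, and it returns `None` instead of raising `error.Abort`:

```python
# Per-platform "global" largefiles cache location, mirroring the
# fallback chain in _usercachedir above.
import os
import platform

def usercachedir(longname='largefiles'):
    if os.name == 'nt':
        # Windows: LOCALAPPDATA, falling back to APPDATA
        appdata = os.getenv('LOCALAPPDATA', os.getenv('APPDATA'))
        if appdata:
            return os.path.join(appdata, longname)
    elif platform.system() == 'Darwin':
        home = os.getenv('HOME')
        if home:
            return os.path.join(home, 'Library', 'Caches', longname)
    elif os.name == 'posix':
        # XDG cache dir, falling back to ~/.cache
        path = os.getenv('XDG_CACHE_HOME')
        if path:
            return os.path.join(path, longname)
        home = os.getenv('HOME')
        if home:
            return os.path.join(home, '.cache', longname)
    return None  # sketch: real code raises error.Abort here
```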
93 95
94 96 def inusercache(ui, hash):
95 97 path = usercachepath(ui, hash)
96 98 return os.path.exists(path)
97 99
98 100 def findfile(repo, hash):
99 101 '''Return store path of the largefile with the specified hash.
100 102 As a side effect, the file might be linked from user cache.
101 103 Return None if the file can't be found locally.'''
102 104 path, exists = findstorepath(repo, hash)
103 105 if exists:
104 106 repo.ui.note(_('found %s in store\n') % hash)
105 107 return path
106 108 elif inusercache(repo.ui, hash):
107 109 repo.ui.note(_('found %s in system cache\n') % hash)
108 110 path = storepath(repo, hash)
109 111 link(usercachepath(repo.ui, hash), path)
110 112 return path
111 113 return None
112 114
113 115 class largefilesdirstate(dirstate.dirstate):
114 116 def __getitem__(self, key):
115 117 return super(largefilesdirstate, self).__getitem__(unixpath(key))
116 118 def normal(self, f):
117 119 return super(largefilesdirstate, self).normal(unixpath(f))
118 120 def remove(self, f):
119 121 return super(largefilesdirstate, self).remove(unixpath(f))
120 122 def add(self, f):
121 123 return super(largefilesdirstate, self).add(unixpath(f))
122 124 def drop(self, f):
123 125 return super(largefilesdirstate, self).drop(unixpath(f))
124 126 def forget(self, f):
125 127 return super(largefilesdirstate, self).forget(unixpath(f))
126 128 def normallookup(self, f):
127 129 return super(largefilesdirstate, self).normallookup(unixpath(f))
128 130 def _ignore(self, f):
129 131 return False
130 132 def write(self, tr=False):
131 133 # (1) disable PENDING mode always
132 134 # (lfdirstate isn't yet managed as a part of the transaction)
133 135 # (2) avoid develwarn 'use dirstate.write with ....'
134 136 super(largefilesdirstate, self).write(None)
135 137
136 138 def openlfdirstate(ui, repo, create=True):
137 139 '''
138 140 Return a dirstate object that tracks largefiles: i.e. its root is
139 141 the repo root, but it is saved in .hg/largefiles/dirstate.
140 142 '''
141 143 vfs = repo.vfs
142 144 lfstoredir = longname
143 145 opener = scmutil.opener(vfs.join(lfstoredir))
144 146 lfdirstate = largefilesdirstate(opener, ui, repo.root,
145 147 repo.dirstate._validate)
146 148
147 149 # If the largefiles dirstate does not exist, populate and create
148 150 # it. This ensures that we create it on the first meaningful
149 151 # largefiles operation in a new clone.
150 152 if create and not vfs.exists(vfs.join(lfstoredir, 'dirstate')):
151 153 matcher = getstandinmatcher(repo)
152 154 standins = repo.dirstate.walk(matcher, [], False, False)
153 155
154 156 if len(standins) > 0:
155 157 vfs.makedirs(lfstoredir)
156 158
157 159 for standin in standins:
158 160 lfile = splitstandin(standin)
159 161 lfdirstate.normallookup(lfile)
160 162 return lfdirstate
161 163
162 164 def lfdirstatestatus(lfdirstate, repo):
163 165 wctx = repo['.']
164 166 match = matchmod.always(repo.root, repo.getcwd())
165 167 unsure, s = lfdirstate.status(match, [], False, False, False)
166 168 modified, clean = s.modified, s.clean
167 169 for lfile in unsure:
168 170 try:
169 171 fctx = wctx[standin(lfile)]
170 172 except LookupError:
171 173 fctx = None
172 174 if not fctx or fctx.data().strip() != hashfile(repo.wjoin(lfile)):
173 175 modified.append(lfile)
174 176 else:
175 177 clean.append(lfile)
176 178 lfdirstate.normal(lfile)
177 179 return s
178 180
179 181 def listlfiles(repo, rev=None, matcher=None):
180 182 '''return a list of largefiles in the working copy or the
181 183 specified changeset'''
182 184
183 185 if matcher is None:
184 186 matcher = getstandinmatcher(repo)
185 187
186 188 # ignore unknown files in working directory
187 189 return [splitstandin(f)
188 190 for f in repo[rev].walk(matcher)
189 191 if rev is not None or repo.dirstate[f] != '?']
190 192
191 193 def instore(repo, hash, forcelocal=False):
192 194 '''Return true if a largefile with the given hash exists in the store'''
193 195 return os.path.exists(storepath(repo, hash, forcelocal))
194 196
195 197 def storepath(repo, hash, forcelocal=False):
196 198 '''Return the correct location in the repository largefiles store for a
197 199 file with the given hash.'''
198 200 if not forcelocal and repo.shared():
199 201 return repo.vfs.reljoin(repo.sharedpath, longname, hash)
200 202 return repo.join(longname, hash)
201 203
202 204 def findstorepath(repo, hash):
203 205 '''Search through the local store path(s) to find the file for the given
204 206 hash. If the file is not found, its path in the primary store is returned.
205 207 The return value is a tuple of (path, exists(path)).
206 208 '''
207 209 # For shared repos, the primary store is in the share source. But for
208 210 # backward compatibility, force a lookup in the local store if it wasn't
209 211 # found in the share source.
210 212 path = storepath(repo, hash, False)
211 213
212 214 if instore(repo, hash):
213 215 return (path, True)
214 216 elif repo.shared() and instore(repo, hash, True):
215 217 return storepath(repo, hash, True), True
216 218
217 219 return (path, False)
218 220
219 221 def copyfromcache(repo, hash, filename):
220 222 '''Copy the specified largefile from the repo or system cache to
221 223 filename in the repository. Return true on success or false if the
222 224 file was not found in either cache (which should not happen:
223 225 this is meant to be called only after ensuring that the needed
224 226 largefile exists in the cache).'''
225 227 wvfs = repo.wvfs
226 228 path = findfile(repo, hash)
227 229 if path is None:
228 230 return False
229 231 wvfs.makedirs(wvfs.dirname(wvfs.join(filename)))
230 232 # The write may fail before the file is fully written, but we
231 233 # don't use atomic writes in the working copy.
232 234 with open(path, 'rb') as srcfd:
233 235 with wvfs(filename, 'wb') as destfd:
234 236 gothash = copyandhash(
235 237 util.filechunkiter(srcfd), destfd)
236 238 if gothash != hash:
237 239 repo.ui.warn(_('%s: data corruption in %s with hash %s\n')
238 240 % (filename, path, gothash))
239 241 wvfs.unlink(filename)
240 242 return False
241 243 return True
242 244
243 245 def copytostore(repo, rev, file, uploaded=False):
244 246 wvfs = repo.wvfs
245 247 hash = readstandin(repo, file, rev)
246 248 if instore(repo, hash):
247 249 return
248 250 if wvfs.exists(file):
249 251 copytostoreabsolute(repo, wvfs.join(file), hash)
250 252 else:
251 253 repo.ui.warn(_("%s: largefile %s not available from local store\n") %
252 254 (file, hash))
253 255
254 256 def copyalltostore(repo, node):
255 257 '''Copy all largefiles in a given revision to the store'''
256 258
257 259 ctx = repo[node]
258 260 for filename in ctx.files():
259 261 if isstandin(filename) and filename in ctx.manifest():
260 262 realfile = splitstandin(filename)
261 263 copytostore(repo, ctx.node(), realfile)
262 264
263 265 def copytostoreabsolute(repo, file, hash):
264 266 if inusercache(repo.ui, hash):
265 267 link(usercachepath(repo.ui, hash), storepath(repo, hash))
266 268 else:
267 269 util.makedirs(os.path.dirname(storepath(repo, hash)))
268 270 with open(file, 'rb') as srcf:
269 271 with util.atomictempfile(storepath(repo, hash),
270 272 createmode=repo.store.createmode) as dstf:
271 273 for chunk in util.filechunkiter(srcf):
272 274 dstf.write(chunk)
273 275 linktousercache(repo, hash)
274 276
275 277 def linktousercache(repo, hash):
276 278 '''Link / copy the largefile with the specified hash from the store
277 279 to the cache.'''
278 280 path = usercachepath(repo.ui, hash)
279 281 link(storepath(repo, hash), path)
280 282
281 283 def getstandinmatcher(repo, rmatcher=None):
282 284 '''Return a match object that applies rmatcher to the standin directory'''
283 285 wvfs = repo.wvfs
284 286 standindir = shortname
285 287
286 288 # no warnings about missing files or directories
287 289 badfn = lambda f, msg: None
288 290
289 291 if rmatcher and not rmatcher.always():
290 292 pats = [wvfs.join(standindir, pat) for pat in rmatcher.files()]
291 293 if not pats:
292 294 pats = [wvfs.join(standindir)]
293 295 match = scmutil.match(repo[None], pats, badfn=badfn)
294 296 # if pats is empty, it would incorrectly always match, so clear _always
295 297 match._always = False
296 298 else:
297 299 # no patterns: relative to repo root
298 300 match = scmutil.match(repo[None], [wvfs.join(standindir)], badfn=badfn)
299 301 return match
300 302
301 303 def composestandinmatcher(repo, rmatcher):
302 304 '''Return a matcher that accepts standins corresponding to the
303 305 files accepted by rmatcher. Pass the list of files in the matcher
304 306 as the paths specified by the user.'''
305 307 smatcher = getstandinmatcher(repo, rmatcher)
306 308 isstandin = smatcher.matchfn
307 309 def composedmatchfn(f):
308 310 return isstandin(f) and rmatcher.matchfn(splitstandin(f))
309 311 smatcher.matchfn = composedmatchfn
310 312
311 313 return smatcher
312 314
313 315 def standin(filename):
314 316 '''Return the repo-relative path to the standin for the specified big
315 317 file.'''
316 318 # Notes:
317 319 # 1) Some callers want an absolute path, but for instance addlargefiles
318 320 # needs it repo-relative so it can be passed to repo[None].add(). So
319 321 # leave it up to the caller to use repo.wjoin() to get an absolute path.
320 322 # 2) Join with '/' because that's what dirstate always uses, even on
321 323 # Windows. Change existing separator to '/' first in case we are
322 324 # passed filenames from an external source (like the command line).
323 325 return shortnameslash + util.pconvert(filename)
324 326
325 327 def isstandin(filename):
326 328 '''Return true if filename is a big file standin. filename must be
327 329 in Mercurial's internal form (slash-separated).'''
328 330 return filename.startswith(shortnameslash)
329 331
330 332 def splitstandin(filename):
331 333 # Split on / because that's what dirstate always uses, even on Windows.
332 334 # Change local separator to / first just in case we are passed filenames
333 335 # from an external source (like the command line).
334 336 bits = util.pconvert(filename).split('/', 1)
335 337 if len(bits) == 2 and bits[0] == shortname:
336 338 return bits[1]
337 339 else:
338 340 return None
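The `standin`/`splitstandin` pair maps a largefile path to its `.hglf/`-prefixed standin and back. A round-trip sketch of that mapping (using `str.replace` in place of `util.pconvert` so the sketch is self-contained):

```python
# Round trip: largefile 'foo/bar.bin' <-> standin '.hglf/foo/bar.bin'.
shortname = '.hglf'
shortnameslash = shortname + '/'

def standin(filename):
    # join with '/', normalizing any backslash separators first
    return shortnameslash + filename.replace('\\', '/')

def splitstandin(filename):
    bits = filename.replace('\\', '/').split('/', 1)
    if len(bits) == 2 and bits[0] == shortname:
        return bits[1]
    return None  # not a standin path

assert standin('foo/bar.bin') == '.hglf/foo/bar.bin'
assert splitstandin(standin('foo/bar.bin')) == 'foo/bar.bin'
```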
339 341
340 342 def updatestandin(repo, standin):
341 343 file = repo.wjoin(splitstandin(standin))
342 344 if repo.wvfs.exists(splitstandin(standin)):
343 345 hash = hashfile(file)
344 346 executable = getexecutable(file)
345 347 writestandin(repo, standin, hash, executable)
346 348 else:
347 349 raise error.Abort(_('%s: file not found!') % splitstandin(standin))
348 350
349 351 def readstandin(repo, filename, node=None):
350 352 '''read hex hash from standin for filename at given node, or working
351 353 directory if no node is given'''
352 354 return repo[node][standin(filename)].data().strip()
353 355
354 356 def writestandin(repo, standin, hash, executable):
355 357 '''write hash to <repo.root>/<standin>'''
356 358 repo.wwrite(standin, hash + '\n', executable and 'x' or '')
357 359
358 360 def copyandhash(instream, outfile):
359 361 '''Read bytes from instream (iterable) and write them to outfile,
360 362 computing the SHA-1 hash of the data along the way. Return the hash.'''
361 363 hasher = hashlib.sha1('')
362 364 for data in instream:
363 365 hasher.update(data)
364 366 outfile.write(data)
365 367 return hasher.hexdigest()
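A usage sketch for the copy-while-hashing pattern above, fed from an in-memory chunk iterable (the call shape matches `copyandhash`'s docstring; the restated helper below is for the self-contained example):

```python
import hashlib
import io

def copyandhash(instream, outfile):
    # stream chunks to outfile, updating the SHA-1 incrementally
    hasher = hashlib.sha1()
    for data in instream:
        hasher.update(data)
        outfile.write(data)
    return hasher.hexdigest()

out = io.BytesIO()
digest = copyandhash([b'hello ', b'world'], out)
assert out.getvalue() == b'hello world'
assert digest == hashlib.sha1(b'hello world').hexdigest()
```

Streaming the hash this way avoids holding the whole largefile in memory, which is why `util.filechunkiter` is used as the input everywhere in this module.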
366 368
367 369 def hashrepofile(repo, file):
368 370 return hashfile(repo.wjoin(file))
369 371
370 372 def hashfile(file):
371 373 if not os.path.exists(file):
372 374 return ''
373 375 hasher = hashlib.sha1('')
374 376 with open(file, 'rb') as fd:
375 377 for data in util.filechunkiter(fd):
376 378 hasher.update(data)
377 379 return hasher.hexdigest()
378 380
379 381 def getexecutable(filename):
380 382 mode = os.stat(filename).st_mode
381 383 return ((mode & stat.S_IXUSR) and
382 384 (mode & stat.S_IXGRP) and
383 385 (mode & stat.S_IXOTH))
384 386
385 387 def urljoin(first, second, *arg):
386 388 def join(left, right):
387 389 if not left.endswith('/'):
388 390 left += '/'
389 391 if right.startswith('/'):
390 392 right = right[1:]
391 393 return left + right
392 394
393 395 url = join(first, second)
394 396 for a in arg:
395 397 url = join(url, a)
396 398 return url
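The slash-normalizing join above can be exercised directly; a sketch restating it with a quick check of the deduplicated separators:

```python
# URL join that collapses duplicate '/' at each boundary, as in
# urljoin above.
def urljoin(first, second, *arg):
    def join(left, right):
        if not left.endswith('/'):
            left += '/'
        if right.startswith('/'):
            right = right[1:]
        return left + right

    url = join(first, second)
    for a in arg:
        url = join(url, a)
    return url

assert urljoin('http://host/', '/repo', 'file') == 'http://host/repo/file'
assert urljoin('http://host', 'repo') == 'http://host/repo'
```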
397 399
398 400 def hexsha1(data):
399 401 """hexsha1 returns the hex-encoded sha1 sum of the data in the file-like
400 402 object data"""
401 403 h = hashlib.sha1()
402 404 for chunk in util.filechunkiter(data):
403 405 h.update(chunk)
404 406 return h.hexdigest()
405 407
406 408 def httpsendfile(ui, filename):
407 409 return httpconnection.httpsendfile(ui, filename, 'rb')
408 410
409 411 def unixpath(path):
410 412 '''Return a version of path normalized for use with the lfdirstate.'''
411 413 return util.pconvert(os.path.normpath(path))
412 414
413 415 def islfilesrepo(repo):
414 416 '''Return true if the repo is a largefile repo.'''
415 417 if ('largefiles' in repo.requirements and
416 418 any(shortnameslash in f[0] for f in repo.store.datafiles())):
417 419 return True
418 420
419 421 return any(openlfdirstate(repo.ui, repo, False))
420 422
421 423 class storeprotonotcapable(Exception):
422 424 def __init__(self, storetypes):
423 425 self.storetypes = storetypes
424 426
425 427 def getstandinsstate(repo):
426 428 standins = []
427 429 matcher = getstandinmatcher(repo)
428 430 for standin in repo.dirstate.walk(matcher, [], False, False):
429 431 lfile = splitstandin(standin)
430 432 try:
431 433 hash = readstandin(repo, lfile)
432 434 except IOError:
433 435 hash = None
434 436 standins.append((lfile, hash))
435 437 return standins
436 438
437 439 def synclfdirstate(repo, lfdirstate, lfile, normallookup):
438 440 lfstandin = standin(lfile)
439 441 if lfstandin in repo.dirstate:
440 442 stat = repo.dirstate._map[lfstandin]
441 443 state, mtime = stat[0], stat[3]
442 444 else:
443 445 state, mtime = '?', -1
444 446 if state == 'n':
445 447 if (normallookup or mtime < 0 or
446 448 not repo.wvfs.exists(lfile)):
447 449 # state 'n' doesn't ensure 'clean' in this case
448 450 lfdirstate.normallookup(lfile)
449 451 else:
450 452 lfdirstate.normal(lfile)
451 453 elif state == 'm':
452 454 lfdirstate.normallookup(lfile)
453 455 elif state == 'r':
454 456 lfdirstate.remove(lfile)
455 457 elif state == 'a':
456 458 lfdirstate.add(lfile)
457 459 elif state == '?':
458 460 lfdirstate.drop(lfile)
459 461
460 462 def markcommitted(orig, ctx, node):
461 463 repo = ctx.repo()
462 464
463 465 orig(node)
464 466
465 467 # ATTENTION: "ctx.files()" may differ from "repo[node].files()"
466 468 # because files coming from the 2nd parent are omitted in the latter.
467 469 #
468 470 # The former should be used to get targets of "synclfdirstate",
469 471 # because such files:
470 472 # - are marked as "a" by "patch.patch()" (e.g. via transplant), and
471 473 # - have to be marked as "n" after commit, but
472 474 # - aren't listed in "repo[node].files()"
473 475
474 476 lfdirstate = openlfdirstate(repo.ui, repo)
475 477 for f in ctx.files():
476 478 if isstandin(f):
477 479 lfile = splitstandin(f)
478 480 synclfdirstate(repo, lfdirstate, lfile, False)
479 481 lfdirstate.write()
480 482
481 483 # As part of committing, copy all of the largefiles into the cache.
482 484 copyalltostore(repo, node)
483 485
484 486 def getlfilestoupdate(oldstandins, newstandins):
485 487 changedstandins = set(oldstandins).symmetric_difference(set(newstandins))
486 488 filelist = []
487 489 for f in changedstandins:
488 490 if f[0] not in filelist:
489 491 filelist.append(f[0])
490 492 return filelist
491 493
492 494 def getlfilestoupload(repo, missing, addfunc):
493 495 for i, n in enumerate(missing):
494 496 repo.ui.progress(_('finding outgoing largefiles'), i,
495 497 unit=_('revisions'), total=len(missing))
496 498 parents = [p for p in repo[n].parents() if p != node.nullid]
497 499
498 500 oldlfstatus = repo.lfstatus
499 501 repo.lfstatus = False
500 502 try:
501 503 ctx = repo[n]
502 504 finally:
503 505 repo.lfstatus = oldlfstatus
504 506
505 507 files = set(ctx.files())
506 508 if len(parents) == 2:
507 509 mc = ctx.manifest()
508 510 mp1 = ctx.parents()[0].manifest()
509 511 mp2 = ctx.parents()[1].manifest()
510 512 for f in mp1:
511 513 if f not in mc:
512 514 files.add(f)
513 515 for f in mp2:
514 516 if f not in mc:
515 517 files.add(f)
516 518 for f in mc:
517 519 if mc[f] != mp1.get(f, None) or mc[f] != mp2.get(f, None):
518 520 files.add(f)
519 521 for fn in files:
520 522 if isstandin(fn) and fn in ctx:
521 523 addfunc(fn, ctx[fn].data().strip())
522 524 repo.ui.progress(_('finding outgoing largefiles'), None)
523 525
524 526 def updatestandinsbymatch(repo, match):
525 527 '''Update standins in the working directory according to specified match
526 528
527 529 This returns (possibly modified) ``match`` object to be used for
528 530 subsequent commit process.
529 531 '''
530 532
531 533 ui = repo.ui
532 534
533 535 # Case 1: user calls commit with no specific files or
534 536 # include/exclude patterns: refresh and commit all files that
535 537 # are "dirty".
536 538 if match is None or match.always():
537 539 # Spend a bit of time here to get a list of files we know
538 540 # are modified so we can compare only against those.
539 541 # It can cost a lot of time (several seconds)
540 542 # otherwise to update all standins if the largefiles are
541 543 # large.
542 544 lfdirstate = openlfdirstate(ui, repo)
543 545 dirtymatch = matchmod.always(repo.root, repo.getcwd())
544 546 unsure, s = lfdirstate.status(dirtymatch, [], False, False,
545 547 False)
546 548 modifiedfiles = unsure + s.modified + s.added + s.removed
547 549 lfiles = listlfiles(repo)
548 550 # this only loops through largefiles that exist (not
549 551 # removed/renamed)
550 552 for lfile in lfiles:
551 553 if lfile in modifiedfiles:
552 554 if repo.wvfs.exists(standin(lfile)):
553 555 # this handles the case where a rebase is being
554 556 # performed and the working copy is not updated
555 557 # yet.
556 558 if repo.wvfs.exists(lfile):
557 559 updatestandin(repo,
558 560 standin(lfile))
559 561
560 562 return match
561 563
562 564 lfiles = listlfiles(repo)
563 565 match._files = repo._subdirlfs(match.files(), lfiles)
564 566
565 567 # Case 2: user calls commit with specified patterns: refresh
566 568 # any matching big files.
567 569 smatcher = composestandinmatcher(repo, match)
568 570 standins = repo.dirstate.walk(smatcher, [], False, False)
569 571
570 572 # No matching big files: get out of the way and pass control to
571 573 # the usual commit() method.
572 574 if not standins:
573 575 return match
574 576
575 577 # Refresh all matching big files. It's possible that the
576 578 # commit will end up failing, in which case the big files will
577 579 # stay refreshed. No harm done: the user modified them and
578 580 # asked to commit them, so sooner or later we're going to
579 581 # refresh the standins. Might as well leave them refreshed.
580 582 lfdirstate = openlfdirstate(ui, repo)
581 583 for fstandin in standins:
582 584 lfile = splitstandin(fstandin)
583 585 if lfdirstate[lfile] != 'r':
584 586 updatestandin(repo, fstandin)
585 587
586 588 # Cook up a new matcher that only matches regular files or
587 589 # standins corresponding to the big files requested by the
588 590 # user. Have to modify _files to prevent commit() from
589 591 # complaining "not tracked" for big files.
590 592 match = copy.copy(match)
591 593 origmatchfn = match.matchfn
592 594
593 595 # Check both the list of largefiles and the list of
594 596 # standins because if a largefile was removed, it
595 597 # won't be in the list of largefiles at this point
596 598 match._files += sorted(standins)
597 599
598 600 actualfiles = []
599 601 for f in match._files:
600 602 fstandin = standin(f)
601 603
602 604 # For largefiles, only one of the normal and standin should be
603 605 # committed (except if one of them is a remove). In the case of a
604 606 # standin removal, drop the normal file if it is unknown to dirstate.
605 607 # Thus, skip plain largefile names but keep the standin.
606 608 if f in lfiles or fstandin in standins:
607 609 if repo.dirstate[fstandin] != 'r':
608 610 if repo.dirstate[f] != 'r':
609 611 continue
610 612 elif repo.dirstate[f] == '?':
611 613 continue
612 614
613 615 actualfiles.append(f)
614 616 match._files = actualfiles
615 617
616 618 def matchfn(f):
617 619 if origmatchfn(f):
618 620 return f not in lfiles
619 621 else:
620 622 return f in standins
621 623
622 624 match.matchfn = matchfn
623 625
624 626 return match
625 627
626 628 class automatedcommithook(object):
627 629 '''Stateful hook to update standins at the 1st commit of resuming
628 630
629 631 For efficiency, updating standins in the working directory should
630 632 be avoided during automated committing (like rebase, transplant and
631 633 so on), because they should already be updated before committing.
632 634
633 635 But the 1st commit of resuming automated committing (e.g. ``rebase
634 636 --continue``) should update them, because largefiles may be
635 637 modified manually.
636 638 '''
637 639 def __init__(self, resuming):
638 640 self.resuming = resuming
639 641
640 642 def __call__(self, repo, match):
641 643 if self.resuming:
642 644 self.resuming = False # avoids updating at subsequent commits
643 645 return updatestandinsbymatch(repo, match)
644 646 else:
645 647 return match
646 648
647 649 def getstatuswriter(ui, repo, forcibly=None):
648 650 '''Return the function to write largefiles specific status out
649 651
650 652 If ``forcibly`` is ``None``, this returns the last element of
651 653 ``repo._lfstatuswriters`` as "default" writer function.
652 654
653 655 Otherwise, this returns the function to always write out (or
654 656 ignore if ``not forcibly``) status.
655 657 '''
656 658 if forcibly is None and util.safehasattr(repo, '_largefilesenabled'):
657 659 return repo._lfstatuswriters[-1]
658 660 else:
659 661 if forcibly:
660 662 return ui.status # forcibly WRITE OUT
661 663 else:
662 664 return lambda *msg, **opts: None # forcibly IGNORE
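The ``getlfilestoupdate`` helper above is self-contained enough to run outside Mercurial. As a standalone sketch (assuming, as in the caller, that each standin is a ``(filename, hash)`` pair; the sample data is made up for illustration):

```python
def getlfilestoupdate(oldstandins, newstandins):
    # Filenames whose (filename, hash) entry differs between the two
    # lists: the symmetric difference keeps entries present on only one
    # side, then duplicate names are dropped, preserving encounter order.
    changedstandins = set(oldstandins).symmetric_difference(set(newstandins))
    filelist = []
    for f in changedstandins:
        if f[0] not in filelist:
            filelist.append(f[0])
    return filelist

old = [('a', 'hash1'), ('b', 'hash2')]
new = [('a', 'hash1'), ('b', 'hash3'), ('c', 'hash4')]
print(sorted(getlfilestoupdate(old, new)))   # → ['b', 'c']
```

A file appears in the result whether its standin was added, removed, or changed, since any of those leaves it on exactly one side of the symmetric difference.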
@@ -1,132 +1,133 b''
1 1 # Copyright 2009, Alexander Solovyov <piranha@piranha.org.ua>
2 2 #
3 3 # This software may be used and distributed according to the terms of the
4 4 # GNU General Public License version 2 or any later version.
5 5
6 6 """extend schemes with shortcuts to repository swarms
7 7
8 8 This extension allows you to specify shortcuts for parent URLs with a
9 9 lot of repositories to act like a scheme, for example::
10 10
11 11 [schemes]
12 12 py = http://code.python.org/hg/
13 13
14 14 After that you can use it like::
15 15
16 16 hg clone py://trunk/
17 17
18 18 Additionally there is support for some more complex schemas, for
19 19 example the one used by Google Code::
20 20
21 21 [schemes]
22 22 gcode = http://{1}.googlecode.com/hg/
23 23
24 24 The syntax is taken from Mercurial templates, and you can use an
25 25 unlimited number of variables, starting with ``{1}`` and continuing
26 26 with ``{2}``, ``{3}`` and so on. These variables receive parts of the
27 27 URL supplied, split by ``/``. Anything not specified as ``{part}``
28 28 will simply be appended to the URL.
30 30 For convenience, the extension adds these schemes by default::
31 31
32 32 [schemes]
33 33 py = http://hg.python.org/
34 34 bb = https://bitbucket.org/
35 35 bb+ssh = ssh://hg@bitbucket.org/
36 36 gcode = https://{1}.googlecode.com/hg/
37 37 kiln = https://{1}.kilnhg.com/Repo/
38 38
39 39 You can override a predefined scheme by defining a new scheme with the
40 40 same name.
41 41 """
42 42 from __future__ import absolute_import
43 43
44 44 import os
45 45 import re
46 46
47 47 from mercurial.i18n import _
48 48 from mercurial import (
49 49 cmdutil,
50 50 error,
51 51 extensions,
52 52 hg,
53 pycompat,
53 54 templater,
54 55 util,
55 56 )
56 57
57 58 cmdtable = {}
58 59 command = cmdutil.command(cmdtable)
59 60 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
60 61 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
61 62 # be specifying the version(s) of Mercurial they are tested with, or
62 63 # leave the attribute unspecified.
63 64 testedwith = 'ships-with-hg-core'
64 65
65 66
66 67 class ShortRepository(object):
67 68 def __init__(self, url, scheme, templater):
68 69 self.scheme = scheme
69 70 self.templater = templater
70 71 self.url = url
71 72 try:
72 73 self.parts = max(map(int, re.findall(r'\{(\d+)\}', self.url)))
73 74 except ValueError:
74 75 self.parts = 0
75 76
76 77 def __repr__(self):
77 78 return '<ShortRepository: %s>' % self.scheme
78 79
79 80 def instance(self, ui, url, create):
80 81 url = self.resolve(url)
81 82 return hg._peerlookup(url).instance(ui, url, create)
82 83
83 84 def resolve(self, url):
84 85 # Should this use the util.url class, or is manual parsing better?
85 86 try:
86 87 url = url.split('://', 1)[1]
87 88 except IndexError:
88 89 raise error.Abort(_("no '://' in scheme url '%s'") % url)
89 90 parts = url.split('/', self.parts)
90 91 if len(parts) > self.parts:
91 92 tail = parts[-1]
92 93 parts = parts[:-1]
93 94 else:
94 95 tail = ''
95 96 context = dict((str(i + 1), v) for i, v in enumerate(parts))
96 97 return ''.join(self.templater.process(self.url, context)) + tail
97 98
98 99 def hasdriveletter(orig, path):
99 100 if path:
100 101 for scheme in schemes:
101 102 if path.startswith(scheme + ':'):
102 103 return False
103 104 return orig(path)
104 105
105 106 schemes = {
106 107 'py': 'http://hg.python.org/',
107 108 'bb': 'https://bitbucket.org/',
108 109 'bb+ssh': 'ssh://hg@bitbucket.org/',
109 110 'gcode': 'https://{1}.googlecode.com/hg/',
110 111 'kiln': 'https://{1}.kilnhg.com/Repo/'
111 112 }
112 113
113 114 def extsetup(ui):
114 115 schemes.update(dict(ui.configitems('schemes')))
115 116 t = templater.engine(lambda x: x)
116 117 for scheme, url in schemes.items():
117 if (os.name == 'nt' and len(scheme) == 1 and scheme.isalpha()
118 if (pycompat.osname == 'nt' and len(scheme) == 1 and scheme.isalpha()
118 119 and os.path.exists('%s:\\' % scheme)):
119 120 raise error.Abort(_('custom scheme %s:// conflicts with drive '
120 121 'letter %s:\\\n') % (scheme, scheme.upper()))
121 122 hg.schemes[scheme] = ShortRepository(url, scheme, t)
122 123
123 124 extensions.wrapfunction(util, 'hasdriveletter', hasdriveletter)
124 125
125 126 @command('debugexpandscheme', norepo=True)
126 127 def expandscheme(ui, url, **opts):
127 128 """given a repo path, provide the scheme-expanded path
128 129 """
129 130 repo = hg._peerlookup(url)
130 131 if isinstance(repo, ShortRepository):
131 132 url = repo.resolve(url)
132 133 ui.write(url + '\n')
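The scheme expansion performed by ``ShortRepository.resolve`` above can be sketched as a standalone function. This is a simplified re-implementation that substitutes ``{N}`` placeholders directly instead of going through ``mercurial.templater``; the ``expand_scheme`` name and the ``myproj`` project name are made up for illustration:

```python
import re

def expand_scheme(template, url):
    """Expand a scheme shortcut URL against a template such as
    'https://{1}.googlecode.com/hg/' (sketch of ShortRepository.resolve)."""
    # highest {N} placeholder used in the template (0 if none)
    nparts = max([int(n) for n in re.findall(r'\{(\d+)\}', template)] or [0])
    try:
        rest = url.split('://', 1)[1]
    except IndexError:
        raise ValueError("no '://' in scheme url %r" % url)
    parts = rest.split('/', nparts)
    if len(parts) > nparts:
        tail = parts[-1]      # everything beyond the last placeholder
        parts = parts[:-1]
    else:
        tail = ''
    expanded = template
    for i, value in enumerate(parts):
        expanded = expanded.replace('{%d}' % (i + 1), value)
    return expanded + tail

print(expand_scheme('https://{1}.googlecode.com/hg/', 'gcode://myproj/trunk'))
# → https://myproj.googlecode.com/hg/trunk
print(expand_scheme('http://hg.python.org/', 'py://trunk/'))
# → http://hg.python.org/trunk/
```

The second call mirrors the ``hg clone py://trunk/`` example from the module docstring: with no placeholders in the template, the whole remainder of the URL is appended as the tail.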
@@ -1,194 +1,194 b''
1 1 # win32mbcs.py -- MBCS filename support for Mercurial
2 2 #
3 3 # Copyright (c) 2008 Shun-ichi Goto <shunichi.goto@gmail.com>
4 4 #
5 5 # Version: 0.3
6 6 # Author: Shun-ichi Goto <shunichi.goto@gmail.com>
7 7 #
8 8 # This software may be used and distributed according to the terms of the
9 9 # GNU General Public License version 2 or any later version.
10 10 #
11 11
12 12 '''allow the use of MBCS paths with problematic encodings
13 13
14 14 Some MBCS encodings are not good for some path operations (e.g.
15 15 splitting paths, case conversion, etc.) on their encoded bytes. We
16 16 call such an encoding (e.g. shift_jis and big5) a "problematic
17 17 encoding". This extension can be used to fix the issue with those
18 18 encodings by wrapping some functions to convert to Unicode strings
19 19 before path operations.
20 20
21 21 This extension is useful for:
22 22
23 23 - Japanese Windows users using shift_jis encoding.
24 24 - Chinese Windows users using big5 encoding.
25 25 - All users who use a repository with one of the problematic encodings
26 26 on a case-insensitive file system.
27 27
28 28 This extension is not needed for:
29 29
30 30 - Any user who uses only ASCII characters in paths.
31 31 - Any user who does not use any of the problematic encodings.
32 32
33 33 Note that there are some limitations on using this extension:
34 34
35 35 - You should use a single encoding in one repository.
36 36 - If the repository path ends with 0x5c, .hg/hgrc cannot be read.
37 37 - win32mbcs is not compatible with fixutf8 extension.
38 38
39 39 By default, win32mbcs uses encoding.encoding decided by Mercurial.
40 40 You can specify the encoding by config option::
41 41
42 42 [win32mbcs]
43 43 encoding = sjis
44 44
45 45 This is useful for users who want to commit with a UTF-8 log message.
46 46 '''
47 47 from __future__ import absolute_import
48 48
49 49 import os
50 50 import sys
51 51
52 52 from mercurial.i18n import _
53 53 from mercurial import (
54 54 encoding,
55 55 error,
56 56 pycompat,
57 57 )
58 58
59 59 # Note for extension authors: ONLY specify testedwith = 'ships-with-hg-core' for
60 60 # extensions which SHIP WITH MERCURIAL. Non-mainline extensions should
61 61 # be specifying the version(s) of Mercurial they are tested with, or
62 62 # leave the attribute unspecified.
63 63 testedwith = 'ships-with-hg-core'
64 64
65 65 _encoding = None # see extsetup
66 66
67 67 def decode(arg):
68 68 if isinstance(arg, str):
69 69 uarg = arg.decode(_encoding)
70 70 if arg == uarg.encode(_encoding):
71 71 return uarg
72 72 raise UnicodeError("Not local encoding")
73 73 elif isinstance(arg, tuple):
74 74 return tuple(map(decode, arg))
75 75 elif isinstance(arg, list):
76 76 return map(decode, arg)
77 77 elif isinstance(arg, dict):
78 78 for k, v in arg.items():
79 79 arg[k] = decode(v)
80 80 return arg
81 81
82 82 def encode(arg):
83 83 if isinstance(arg, unicode):
84 84 return arg.encode(_encoding)
85 85 elif isinstance(arg, tuple):
86 86 return tuple(map(encode, arg))
87 87 elif isinstance(arg, list):
88 88 return map(encode, arg)
89 89 elif isinstance(arg, dict):
90 90 for k, v in arg.items():
91 91 arg[k] = encode(v)
92 92 return arg
93 93
94 94 def appendsep(s):
95 95 # ensure the path ends with os.sep, appending it if necessary.
96 96 try:
97 97 us = decode(s)
98 98 except UnicodeError:
99 99 us = s
100 100 if us and us[-1] not in ':/\\':
101 101 s += pycompat.ossep
102 102 return s
103 103
104 104
105 105 def basewrapper(func, argtype, enc, dec, args, kwds):
106 106 # check if already converted, then call the original
107 107 for arg in args:
108 108 if isinstance(arg, argtype):
109 109 return func(*args, **kwds)
110 110
111 111 try:
112 112 # convert string arguments, call func, then convert back the
113 113 # return value.
114 114 return enc(func(*dec(args), **dec(kwds)))
115 115 except UnicodeError:
116 116 raise error.Abort(_("[win32mbcs] filename conversion failed with"
117 117 " %s encoding\n") % (_encoding))
118 118
119 119 def wrapper(func, args, kwds):
120 120 return basewrapper(func, unicode, encode, decode, args, kwds)
121 121
122 122
123 123 def reversewrapper(func, args, kwds):
124 124 return basewrapper(func, str, decode, encode, args, kwds)
125 125
126 126 def wrapperforlistdir(func, args, kwds):
127 127 # Ensure the 'path' argument ends with os.sep to avoid
128 128 # misinterpreting a trailing 0x5c (MBCS 2nd byte) as a path separator.
129 129 if args:
130 130 args = list(args)
131 131 args[0] = appendsep(args[0])
132 132 if 'path' in kwds:
133 133 kwds['path'] = appendsep(kwds['path'])
134 134 return func(*args, **kwds)
135 135
136 136 def wrapname(name, wrapper):
137 137 module, name = name.rsplit('.', 1)
138 138 module = sys.modules[module]
139 139 func = getattr(module, name)
140 140 def f(*args, **kwds):
141 141 return wrapper(func, args, kwds)
142 142 f.__name__ = func.__name__
143 143 setattr(module, name, f)
144 144
145 145 # List of functions to be wrapped.
146 146 # NOTE: os.path.dirname() and os.path.basename() are safe because
147 147 # they use result of os.path.split()
148 148 funcs = '''os.path.join os.path.split os.path.splitext
149 149 os.path.normpath os.makedirs mercurial.util.endswithsep
150 150 mercurial.util.splitpath mercurial.util.fscasesensitive
151 151 mercurial.util.fspath mercurial.util.pconvert mercurial.util.normpath
152 152 mercurial.util.checkwinfilename mercurial.util.checkosfilename
153 153 mercurial.util.split'''
154 154
155 155 # These functions must be called with locally encoded strings
156 156 # because they expect locally encoded arguments and misbehave
157 157 # when given unicode strings.
158 158 rfuncs = '''mercurial.encoding.upper mercurial.encoding.lower'''
159 159
160 160 # List of Windows specific functions to be wrapped.
161 161 winfuncs = '''os.path.splitunc'''
162 162
163 163 # codec and alias names of sjis and big5 to be faked.
164 164 problematic_encodings = '''big5 big5-tw csbig5 big5hkscs big5-hkscs
165 165 hkscs cp932 932 ms932 mskanji ms-kanji shift_jis csshiftjis shiftjis
166 166 sjis s_jis shift_jis_2004 shiftjis2004 sjis_2004 sjis2004
167 167 shift_jisx0213 shiftjisx0213 sjisx0213 s_jisx0213 950 cp950 ms950 '''
168 168
169 169 def extsetup(ui):
170 170 # TODO: decide use of config section for this extension
171 171 if ((not os.path.supports_unicode_filenames) and
172 172 (sys.platform != 'cygwin')):
173 173 ui.warn(_("[win32mbcs] cannot activate on this platform.\n"))
174 174 return
175 175 # determine encoding for filename
176 176 global _encoding
177 177 _encoding = ui.config('win32mbcs', 'encoding', encoding.encoding)
178 178 # fake is only for relevant environment.
179 179 if _encoding.lower() in problematic_encodings.split():
180 180 for f in funcs.split():
181 181 wrapname(f, wrapper)
182 if os.name == 'nt':
182 if pycompat.osname == 'nt':
183 183 for f in winfuncs.split():
184 184 wrapname(f, wrapper)
185 185 wrapname("mercurial.osutil.listdir", wrapperforlistdir)
186 186 # wrap functions to be called with local byte string arguments
187 187 for f in rfuncs.split():
188 188 wrapname(f, reversewrapper)
189 189 # Check sys.argv manually instead of using ui.debug() because
190 190 # command line options are not yet applied when
191 191 # extensions.loadall() is called.
192 192 if '--debug' in sys.argv:
193 193 ui.write(("[win32mbcs] activated with encoding: %s\n")
194 194 % _encoding)