# parser.py - simple top-down operator precedence parser for mercurial
#
# Copyright 2010 Matt Mackall <mpm@selenic.com>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

# see http://effbot.org/zone/simple-top-down-parsing.htm and
# http://eli.thegreenplace.net/2010/01/02/top-down-operator-precedence-parsing/
# for background

# takes a tokenizer and elements
# tokenizer is an iterator that returns (type, value, pos) tuples
# elements is a mapping of types to binding strength, primary, prefix, infix
# and suffix actions
# an action is a tree node name, a tree label, and an optional match
# __call__(program) parses program into a labeled tree
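
# For illustration only (a hypothetical sketch, not copied from any real
# grammar): an ``elements`` table for a tiny language with symbols, "or" and
# parentheses could look roughly like this, each entry being
# (binding strength, primary, prefix, infix, suffix):
#
#   elements = {
#       "(": (20, None, ("group", 1, ")"), None, None),
#       "or": (10, None, None, ("or", 10), None),
#       ")": (0, None, None, None, None),
#       "symbol": (0, "symbol", None, None, None),
#       "end": (0, None, None, None, None),
#   }
#
# and a matching tokenizer for "a or b" would yield
# ("symbol", "a", 0), ("or", None, 2), ("symbol", "b", 5), ("end", None, 6).
# The real tables live in the consumers of this module, e.g. revset.py and
# templater.py.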

from __future__ import absolute_import

from .i18n import _
from . import (
    error,
    util,
)

class parser(object):
    def __init__(self, elements, methods=None):
        self._elements = elements
        self._methods = methods
        self.current = None

    def _advance(self):
        'advance the tokenizer'
        t = self.current
        self.current = next(self._iter, None)
        return t

    def _hasnewterm(self):
        'True if next token may start new term'
        return any(self._elements[self.current[0]][1:3])

    def _match(self, m):
        'make sure the tokenizer matches an end condition'
        if self.current[0] != m:
            raise error.ParseError(_("unexpected token: %s") % self.current[0],
                                   self.current[2])
        self._advance()

    def _parseoperand(self, bind, m=None):
        'gather right-hand-side operand until an end condition or binding met'
        if m and self.current[0] == m:
            expr = None
        else:
            expr = self._parse(bind)
        if m:
            self._match(m)
        return expr

    def _parse(self, bind=0):
        token, value, pos = self._advance()
        # handle prefix rules on current token, take as primary if unambiguous
        primary, prefix = self._elements[token][1:3]
        if primary and not (prefix and self._hasnewterm()):
            expr = (primary, value)
        elif prefix:
            expr = (prefix[0], self._parseoperand(*prefix[1:]))
        else:
            raise error.ParseError(_("not a prefix: %s") % token, pos)

        # gather tokens until we meet a lower binding strength
        while bind < self._elements[self.current[0]][0]:
            token, value, pos = self._advance()
            # handle infix rules, take as suffix if unambiguous
            infix, suffix = self._elements[token][3:]
            if suffix and not (infix and self._hasnewterm()):
                expr = (suffix, expr)
            elif infix:
                expr = (infix[0], expr, self._parseoperand(*infix[1:]))
            else:
                raise error.ParseError(_("not an infix: %s") % token, pos)
        return expr

    def parse(self, tokeniter):
        'generate a parse tree from tokens'
        self._iter = tokeniter
        self._advance()
        res = self._parse()
        token, value, pos = self.current
        return res, pos

    def eval(self, tree):
        'recursively evaluate a parse tree using node methods'
        if not isinstance(tree, tuple):
            return tree
        return self._methods[tree[0]](*[self.eval(t) for t in tree[1:]])

    def __call__(self, tokeniter):
        'parse tokens into a parse tree and evaluate if methods given'
        t = self.parse(tokeniter)
        if self._methods:
            return self.eval(t)
        return t
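
# A minimal usage sketch with hypothetical names (``elements``, ``tokenize``
# and ``methods`` are supplied by a concrete grammar such as revset.py):
#
#   p = parser(elements)
#   tree, pos = p.parse(tokenize("a or b"))
#   # tree == ('or', ('symbol', 'a'), ('symbol', 'b'))
#
# When a ``methods`` mapping of node names to callables is given, eval()
# reduces such a tree to a final value, e.g.
# parser(elements, methods).eval(tree).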

def splitargspec(spec):
    """Parse spec of function arguments into (poskeys, varkey, keys, optkey)

    >>> splitargspec('')
    ([], None, [], None)
    >>> splitargspec('foo bar')
    ([], None, ['foo', 'bar'], None)
    >>> splitargspec('foo *bar baz **qux')
    (['foo'], 'bar', ['baz'], 'qux')
    >>> splitargspec('*foo')
    ([], 'foo', [], None)
    >>> splitargspec('**foo')
    ([], None, [], 'foo')
    """
    optkey = None
    pre, sep, post = spec.partition('**')
    if sep:
        posts = post.split()
        if not posts:
            raise error.ProgrammingError('no **optkey name provided')
        if len(posts) > 1:
            raise error.ProgrammingError('excessive **optkey names provided')
        optkey = posts[0]
        pre, sep, post = pre.partition('*')
    pres = pre.split()
    posts = post.split()
    if sep:
        if not posts:
            raise error.ProgrammingError('no *varkey name provided')
        return pres, posts[0], posts[1:], optkey
    return [], None, pres, optkey

def buildargsdict(trees, funcname, argspec, keyvaluenode, keynode):
    """Build dict from list containing positional and keyword arguments

    Arguments are specified by a tuple of ``(poskeys, varkey, keys, optkey)``
    where

    - ``poskeys``: list of names of positional arguments
    - ``varkey``: optional argument name that takes up remainder
    - ``keys``: list of names that can be either positional or keyword
      arguments
    - ``optkey``: optional argument name that takes up excess keyword arguments

    If ``varkey`` is specified, all ``keys`` must be given as keyword
    arguments.

    Invalid keywords, too few positional arguments, or too many positional
    arguments are rejected, but missing keyword arguments are just omitted.
    """
    poskeys, varkey, keys, optkey = argspec
    kwstart = next((i for i, x in enumerate(trees) if x[0] == keyvaluenode),
                   len(trees))
    if kwstart < len(poskeys):
        raise error.ParseError(_("%(func)s takes at least %(nargs)d positional "
                                 "arguments")
                               % {'func': funcname, 'nargs': len(poskeys)})
    if not varkey and kwstart > len(poskeys) + len(keys):
        raise error.ParseError(_("%(func)s takes at most %(nargs)d positional "
                                 "arguments")
                               % {'func': funcname,
                                  'nargs': len(poskeys) + len(keys)})
    args = util.sortdict()
    # consume positional arguments
    for k, x in zip(poskeys, trees[:kwstart]):
        args[k] = x
    if varkey:
        args[varkey] = trees[len(args):kwstart]
    else:
        for k, x in zip(keys, trees[len(args):kwstart]):
            args[k] = x
    # remainder should be keyword arguments
    if optkey:
        args[optkey] = util.sortdict()
    for x in trees[kwstart:]:
        if x[0] != keyvaluenode or x[1][0] != keynode:
            raise error.ParseError(_("%(func)s got an invalid argument")
                                   % {'func': funcname})
        k = x[1][1]
        if k in keys:
            d = args
        elif not optkey:
            raise error.ParseError(_("%(func)s got an unexpected keyword "
                                     "argument '%(key)s'")
                                   % {'func': funcname, 'key': k})
        else:
            d = args[optkey]
        if k in d:
            raise error.ParseError(_("%(func)s got multiple values for keyword "
                                     "argument '%(key)s'")
                                   % {'func': funcname, 'key': k})
        d[k] = x[2]
    return args
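
# Illustration with hypothetical node names (each grammar passes its own
# keyvaluenode/keynode): given ``argspec = splitargspec('first second')`` and
#
#   trees = [('symbol', '1'),
#            ('keyvalue', ('symbol', 'second'), ('symbol', '2'))]
#
# ``buildargsdict(trees, 'f', argspec, 'keyvalue', 'symbol')`` returns a
# sortdict mapping 'first' to ('symbol', '1') and 'second' to ('symbol', '2').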

def unescapestr(s):
    try:
        return util.unescapestr(s)
    except ValueError as e:
        # mangle Python's exception into our format
        raise error.ParseError(str(e).lower())

def _prettyformat(tree, leafnodes, level, lines):
    if not isinstance(tree, tuple) or tree[0] in leafnodes:
        lines.append((level, str(tree)))
    else:
        lines.append((level, '(%s' % tree[0]))
        for s in tree[1:]:
            _prettyformat(s, leafnodes, level + 1, lines)
        lines[-1:] = [(lines[-1][0], lines[-1][1] + ')')]

def prettyformat(tree, leafnodes):
    lines = []
    _prettyformat(tree, leafnodes, 0, lines)
    output = '\n'.join(('  ' * l + s) for l, s in lines)
    return output

def simplifyinfixops(tree, targetnodes):
    """Flatten chained infix operations to reduce usage of Python stack

    >>> def f(tree):
    ...     print prettyformat(simplifyinfixops(tree, ('or',)), ('symbol',))
    >>> f(('or',
    ...     ('or',
    ...       ('symbol', '1'),
    ...       ('symbol', '2')),
    ...     ('symbol', '3')))
    (or
      ('symbol', '1')
      ('symbol', '2')
      ('symbol', '3'))
    >>> f(('func',
    ...     ('symbol', 'p1'),
    ...     ('or',
    ...       ('or',
    ...         ('func',
    ...           ('symbol', 'sort'),
    ...           ('list',
    ...             ('or',
    ...               ('or',
    ...                 ('symbol', '1'),
    ...                 ('symbol', '2')),
    ...               ('symbol', '3')),
    ...             ('negate',
    ...               ('symbol', 'rev')))),
    ...         ('and',
    ...           ('symbol', '4'),
    ...           ('group',
    ...             ('or',
    ...               ('or',
    ...                 ('symbol', '5'),
    ...                 ('symbol', '6')),
    ...               ('symbol', '7'))))),
    ...       ('symbol', '8'))))
    (func
      ('symbol', 'p1')
      (or
        (func
          ('symbol', 'sort')
          (list
            (or
              ('symbol', '1')
              ('symbol', '2')
              ('symbol', '3'))
            (negate
              ('symbol', 'rev'))))
        (and
          ('symbol', '4')
          (group
            (or
              ('symbol', '5')
              ('symbol', '6')
              ('symbol', '7'))))
        ('symbol', '8')))
    """
    if not isinstance(tree, tuple):
        return tree
    op = tree[0]
    if op not in targetnodes:
        return (op,) + tuple(simplifyinfixops(x, targetnodes) for x in tree[1:])

    # walk down left nodes taking each right node. no recursion to left nodes
    # because infix operators are left-associative, i.e. left tree is deep.
    # e.g. '1 + 2 + 3' -> (+ (+ 1 2) 3) -> (+ 1 2 3)
    simplified = []
    x = tree
    while x[0] == op:
        l, r = x[1:]
        simplified.append(simplifyinfixops(r, targetnodes))
        x = l
    simplified.append(simplifyinfixops(x, targetnodes))
    simplified.append(op)
    return tuple(reversed(simplified))

def _buildtree(template, placeholder, replstack):
    if template == placeholder:
        return replstack.pop()
    if not isinstance(template, tuple):
        return template
    return tuple(_buildtree(x, placeholder, replstack) for x in template)

def buildtree(template, placeholder, *repls):
    """Create new tree by substituting placeholders by replacements

    >>> _ = ('symbol', '_')
    >>> def f(template, *repls):
    ...     return buildtree(template, _, *repls)
    >>> f(('func', ('symbol', 'only'), ('list', _, _)),
    ...   ('symbol', '1'), ('symbol', '2'))
    ('func', ('symbol', 'only'), ('list', ('symbol', '1'), ('symbol', '2')))
    >>> f(('and', _, ('not', _)), ('symbol', '1'), ('symbol', '2'))
    ('and', ('symbol', '1'), ('not', ('symbol', '2')))
    """
    if not isinstance(placeholder, tuple):
        raise error.ProgrammingError('placeholder must be a node tuple')
    replstack = list(reversed(repls))
    r = _buildtree(template, placeholder, replstack)
    if replstack:
        raise error.ProgrammingError('too many replacements')
    return r

def _matchtree(pattern, tree, placeholder, incompletenodes, matches):
    if pattern == tree:
        return True
    if not isinstance(pattern, tuple) or not isinstance(tree, tuple):
        return False
    if pattern == placeholder and tree[0] not in incompletenodes:
        matches.append(tree)
        return True
    if len(pattern) != len(tree):
        return False
    return all(_matchtree(p, x, placeholder, incompletenodes, matches)
               for p, x in zip(pattern, tree))

def matchtree(pattern, tree, placeholder=None, incompletenodes=()):
    """If a tree matches the pattern, return a list of the tree and nodes
    matched with the placeholder; otherwise None

    >>> def f(pattern, tree):
    ...     m = matchtree(pattern, tree, _, {'keyvalue', 'list'})
    ...     if m:
    ...         return m[1:]

    >>> _ = ('symbol', '_')
    >>> f(('func', ('symbol', 'ancestors'), _),
    ...   ('func', ('symbol', 'ancestors'), ('symbol', '1')))
    [('symbol', '1')]
    >>> f(('func', ('symbol', 'ancestors'), _),
    ...   ('func', ('symbol', 'ancestors'), None))
    >>> f(('range', ('dagrange', _, _), _),
    ...   ('range',
    ...     ('dagrange', ('symbol', '1'), ('symbol', '2')),
    ...     ('symbol', '3')))
    [('symbol', '1'), ('symbol', '2'), ('symbol', '3')]

    The placeholder does not match the specified incomplete nodes because
    an incomplete node (e.g. argument list) cannot construct an expression.

    >>> f(('func', ('symbol', 'ancestors'), _),
    ...   ('func', ('symbol', 'ancestors'),
    ...    ('list', ('symbol', '1'), ('symbol', '2'))))

    The placeholder may be omitted, but it shouldn't match a None node.

    >>> _ = None
    >>> f(('func', ('symbol', 'ancestors'), None),
    ...   ('func', ('symbol', 'ancestors'), ('symbol', '0')))
    """
    if placeholder is not None and not isinstance(placeholder, tuple):
        raise error.ProgrammingError('placeholder must be a node tuple')
    matches = [tree]
    if _matchtree(pattern, tree, placeholder, incompletenodes, matches):
        return matches

def parseerrordetail(inst):
    """Compose error message from specified ParseError object
    """
    if len(inst.args) > 1:
        return _('at %d: %s') % (inst.args[1], inst.args[0])
    else:
        return inst.args[0]

class alias(object):
    """Parsed result of alias"""

    def __init__(self, name, args, err, replacement):
        self.name = name
        self.args = args
        self.error = err
        self.replacement = replacement
        # whether own `error` information is already shown or not.
        # this avoids showing same warning multiple times at each
        # `expandaliases`.
        self.warned = False

class basealiasrules(object):
    """Parsing and expansion rule set of aliases

    This is a helper for fileset/revset/template aliases. A concrete rule set
    should be made by sub-classing this and implementing class/static methods.

    It supports alias expansion of symbol and function-call styles::

        # decl = defn
        h = heads(default)
        b($1) = ancestors($1) - ancestors(default)
    """
    # typically a config section, which will be included in error messages
    _section = None
    # tag of symbol node
    _symbolnode = 'symbol'

    def __new__(cls):
        raise TypeError("'%s' is not instantiatable" % cls.__name__)

    @staticmethod
    def _parse(spec):
        """Parse an alias name, arguments and definition"""
        raise NotImplementedError

    @staticmethod
    def _trygetfunc(tree):
        """Return (name, args) if tree is a function; otherwise None"""
        raise NotImplementedError

    @classmethod
    def _builddecl(cls, decl):
        """Parse an alias declaration into ``(name, args, errorstr)``

        This function analyzes the parsed tree. The parsing rule is provided
        by ``_parse()``.

        - ``name``: of declared alias (may be ``decl`` itself at error)
        - ``args``: list of argument names (or None for symbol declaration)
        - ``errorstr``: detail about detected error (or None)

        >>> sym = lambda x: ('symbol', x)
        >>> symlist = lambda *xs: ('list',) + tuple(sym(x) for x in xs)
        >>> func = lambda n, a: ('func', sym(n), a)
        >>> parsemap = {
        ...     'foo': sym('foo'),
        ...     '$foo': sym('$foo'),
        ...     'foo::bar': ('dagrange', sym('foo'), sym('bar')),
        ...     'foo()': func('foo', None),
        ...     '$foo()': func('$foo', None),
        ...     'foo($1, $2)': func('foo', symlist('$1', '$2')),
        ...     'foo(bar_bar, baz.baz)':
        ...         func('foo', symlist('bar_bar', 'baz.baz')),
        ...     'foo(bar($1, $2))':
        ...         func('foo', func('bar', symlist('$1', '$2'))),
        ...     'foo($1, $2, nested($1, $2))':
        ...         func('foo', (symlist('$1', '$2') +
        ...                      (func('nested', symlist('$1', '$2')),))),
        ...     'foo("bar")': func('foo', ('string', 'bar')),
        ...     'foo($1, $2': error.ParseError('unexpected token: end', 10),
        ...     'foo("bar': error.ParseError('unterminated string', 5),
        ...     'foo($1, $2, $1)': func('foo', symlist('$1', '$2', '$1')),
        ... }
        >>> def parse(expr):
        ...     x = parsemap[expr]
        ...     if isinstance(x, Exception):
        ...         raise x
        ...     return x
        >>> def trygetfunc(tree):
        ...     if not tree or tree[0] != 'func' or tree[1][0] != 'symbol':
        ...         return None
        ...     if not tree[2]:
        ...         return tree[1][1], []
        ...     if tree[2][0] == 'list':
        ...         return tree[1][1], list(tree[2][1:])
        ...     return tree[1][1], [tree[2]]
        >>> class aliasrules(basealiasrules):
        ...     _parse = staticmethod(parse)
        ...     _trygetfunc = staticmethod(trygetfunc)
        >>> builddecl = aliasrules._builddecl
        >>> builddecl('foo')
        ('foo', None, None)
        >>> builddecl('$foo')
        ('$foo', None, "invalid symbol '$foo'")
        >>> builddecl('foo::bar')
        ('foo::bar', None, 'invalid format')
        >>> builddecl('foo()')
        ('foo', [], None)
        >>> builddecl('$foo()')
        ('$foo()', None, "invalid function '$foo'")
        >>> builddecl('foo($1, $2)')
        ('foo', ['$1', '$2'], None)
        >>> builddecl('foo(bar_bar, baz.baz)')
        ('foo', ['bar_bar', 'baz.baz'], None)
        >>> builddecl('foo($1, $2, nested($1, $2))')
        ('foo($1, $2, nested($1, $2))', None, 'invalid argument list')
        >>> builddecl('foo(bar($1, $2))')
        ('foo(bar($1, $2))', None, 'invalid argument list')
        >>> builddecl('foo("bar")')
        ('foo("bar")', None, 'invalid argument list')
        >>> builddecl('foo($1, $2')
        ('foo($1, $2', None, 'at 10: unexpected token: end')
        >>> builddecl('foo("bar')
        ('foo("bar', None, 'at 5: unterminated string')
        >>> builddecl('foo($1, $2, $1)')
        ('foo', None, 'argument names collide with each other')
        """
        try:
            tree = cls._parse(decl)
        except error.ParseError as inst:
            return (decl, None, parseerrordetail(inst))

        if tree[0] == cls._symbolnode:
            # "name = ...." style
            name = tree[1]
            if name.startswith('$'):
                return (decl, None, _("invalid symbol '%s'") % name)
            return (name, None, None)

        func = cls._trygetfunc(tree)
        if func:
            # "name(arg, ....) = ...." style
            name, args = func
            if name.startswith('$'):
                return (decl, None, _("invalid function '%s'") % name)
            if any(t[0] != cls._symbolnode for t in args):
                return (decl, None, _("invalid argument list"))
            if len(args) != len(set(args)):
                return (name, None,
                        _("argument names collide with each other"))
            return (name, [t[1] for t in args], None)

        return (decl, None, _("invalid format"))

    @classmethod
    def _relabelargs(cls, tree, args):
        """Mark alias arguments as ``_aliasarg``"""
        if not isinstance(tree, tuple):
            return tree
        op = tree[0]
        if op != cls._symbolnode:
            return (op,) + tuple(cls._relabelargs(x, args) for x in tree[1:])

        assert len(tree) == 2
        sym = tree[1]
        if sym in args:
            op = '_aliasarg'
        elif sym.startswith('$'):
            raise error.ParseError(_("invalid symbol '%s'") % sym)
        return (op, sym)

    @classmethod
    def _builddefn(cls, defn, args):
        """Parse an alias definition into a tree and mark substitutions

        This function marks alias argument references as ``_aliasarg``. The
        parsing rule is provided by ``_parse()``.

        ``args`` is a list of alias argument names, or None if the alias
        is declared as a symbol.

        >>> parsemap = {
        ...     '$1 or foo': ('or', ('symbol', '$1'), ('symbol', 'foo')),
        ...     '$1 or $bar': ('or', ('symbol', '$1'), ('symbol', '$bar')),
        ...     '$10 or baz': ('or', ('symbol', '$10'), ('symbol', 'baz')),
        ...     '"$1" or "foo"': ('or', ('string', '$1'), ('string', 'foo')),
        ... }
        >>> class aliasrules(basealiasrules):
        ...     _parse = staticmethod(parsemap.__getitem__)
        ...     _trygetfunc = staticmethod(lambda x: None)
        >>> builddefn = aliasrules._builddefn
        >>> def pprint(tree):
        ...     print prettyformat(tree, ('_aliasarg', 'string', 'symbol'))
        >>> args = ['$1', '$2', 'foo']
        >>> pprint(builddefn('$1 or foo', args))
        (or
          ('_aliasarg', '$1')
          ('_aliasarg', 'foo'))
        >>> try:
        ...     builddefn('$1 or $bar', args)
        ... except error.ParseError as inst:
        ...     print parseerrordetail(inst)
        invalid symbol '$bar'
        >>> args = ['$1', '$10', 'foo']
        >>> pprint(builddefn('$10 or baz', args))
        (or
          ('_aliasarg', '$10')
          ('symbol', 'baz'))
        >>> pprint(builddefn('"$1" or "foo"', args))
        (or
          ('string', '$1')
          ('string', 'foo'))
        """
        tree = cls._parse(defn)
        if args:
            args = set(args)
        else:
            args = set()
        return cls._relabelargs(tree, args)

    @classmethod
    def build(cls, decl, defn):
        """Parse an alias declaration and definition into an alias object"""
        repl = efmt = None
        name, args, err = cls._builddecl(decl)
        if err:
            efmt = _('bad declaration of %(section)s "%(name)s": %(error)s')
        else:
            try:
                repl = cls._builddefn(defn, args)
            except error.ParseError as inst:
                err = parseerrordetail(inst)
                efmt = _('bad definition of %(section)s "%(name)s": %(error)s')
        if err:
            err = efmt % {'section': cls._section, 'name': name, 'error': err}
        return alias(name, args, err, repl)

    @classmethod
    def buildmap(cls, items):
        """Parse a list of alias (name, replacement) pairs into a dict of
        alias objects"""
        aliases = {}
        for decl, defn in items:
            a = cls.build(decl, defn)
            aliases[a.name] = a
        return aliases

    @classmethod
    def _getalias(cls, aliases, tree):
        """If tree looks like an unexpanded alias, return (alias, pattern-args)
        pair. Return None otherwise.
        """
        if not isinstance(tree, tuple):
            return None
        if tree[0] == cls._symbolnode:
            name = tree[1]
            a = aliases.get(name)
            if a and a.args is None:
                return a, None
        func = cls._trygetfunc(tree)
        if func:
            name, args = func
            a = aliases.get(name)
            if a and a.args is not None:
                return a, args
        return None

    @classmethod
    def _expandargs(cls, tree, args):
        """Replace _aliasarg instances with the substitution value of the
        same name in args, recursively.
        """
        if not isinstance(tree, tuple):
            return tree
        if tree[0] == '_aliasarg':
            sym = tree[1]
            return args[sym]
        return tuple(cls._expandargs(t, args) for t in tree)

    @classmethod
    def _expand(cls, aliases, tree, expanding, cache):
        if not isinstance(tree, tuple):
            return tree
        r = cls._getalias(aliases, tree)
        if r is None:
            return tuple(cls._expand(aliases, t, expanding, cache)
                         for t in tree)
        a, l = r
        if a.error:
            raise error.Abort(a.error)
        if a in expanding:
            raise error.ParseError(_('infinite expansion of %(section)s '
                                     '"%(name)s" detected')
                                   % {'section': cls._section, 'name': a.name})
        # get cacheable replacement tree by expanding aliases recursively
        expanding.append(a)
        if a.name not in cache:
            cache[a.name] = cls._expand(aliases, a.replacement, expanding,
                                        cache)
        result = cache[a.name]
        expanding.pop()
        if a.args is None:
            return result
        # substitute function arguments in replacement tree
        if len(l) != len(a.args):
            raise error.ParseError(_('invalid number of arguments: %d')
                                   % len(l))
        l = [cls._expand(aliases, t, [], cache) for t in l]
        return cls._expandargs(result, dict(zip(a.args, l)))

    @classmethod
    def expand(cls, aliases, tree):
        """Expand aliases in tree, recursively.

        'aliases' is a dictionary mapping user defined aliases to alias
        objects.
        """
        return cls._expand(aliases, tree, [], {})
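
# Expansion sketch (hypothetical example; concrete aliases typically come from
# hgrc sections such as [revsetalias]): for an alias built from
# ``('b($1)', 'ancestors($1) - ancestors(default)')``, expand() replaces every
# matching ``b(...)`` call node in a parsed tree with the alias replacement
# tree, substituting its ``_aliasarg`` nodes with the caller's argument trees.
# Recursive alias definitions are detected via the ``expanding`` stack and
# reported as an "infinite expansion" ParseError.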