# ===== fixer_base.py =====
# Copyright 2006 Google, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

"""Base class for fixers (optional, but recommended)."""

# Python imports
import itertools

# Local imports
from .patcomp import PatternCompiler
from . import pygram
from .fixer_util import does_tree_import

class BaseFix(object):

    """Optional base class for fixers.

    The subclass name must be FixFooBar where FooBar is the result of
    removing underscores and capitalizing the words of the fix name.
    For example, the class name for a fixer named 'has_key' should be
    FixHasKey.
    """

    PATTERN = None             # Most subclasses should override with a string literal
    pattern = None             # Compiled pattern, set by compile_pattern()
    pattern_tree = None        # Tree representation of the pattern
    options = None             # Options object passed to initializer
    filename = None            # The filename (set by set_filename)
    numbers = itertools.count(1) # For new_name()
    used_names = set()         # A set of all used NAMEs
    order = "post"             # Does the fixer prefer pre- or post-order traversal
    explicit = False           # Is this ignored by refactor.py -f all?
    run_order = 5              # Fixers will be sorted by run order before execution
                               # Lower numbers will be run first.
    _accept_type = None        # [Advanced and not public] This tells RefactoringTool
                               # which node type to accept when there's not a pattern.

    keep_line_order = False    # For the bottom matcher: match with the
                               # original line order
    BM_compatible = False      # Compatibility with the bottom matching
                               # module; every fixer should set this
                               # manually

    # Shortcut for access to Python grammar symbols
    syms = pygram.python_symbols

    def __init__(self, options, log):
        """Initializer.  Subclass may override.

        Args:
            options: a dict containing the options passed to RefactoringTool
                that could be used to customize the fixer through the command
                line.
            log: a list to append warnings and other messages to.
        """
        self.options = options
        self.log = log
        self.compile_pattern()

    def compile_pattern(self):
        """Compiles self.PATTERN into self.pattern.

        Subclass may override if it doesn't want to use
        self.{pattern,PATTERN} in .match().
        """
        if self.PATTERN is not None:
            PC = PatternCompiler()
            self.pattern, self.pattern_tree = PC.compile_pattern(self.PATTERN,
                                                                 with_tree=True)

    def set_filename(self, filename):
        """Set the filename.

        The main refactoring tool should call this.
        """
        self.filename = filename

    def match(self, node):
        """Returns match for a given parse tree node.

        Should return a true or false object (not necessarily a bool).
        It may return a non-empty dict of matching sub-nodes as
        returned by a matching pattern.

        Subclass may override.
        """
        results = {"node": node}
        return self.pattern.match(node, results) and results

    def transform(self, node, results):
        """Returns the transformation for a given parse tree node.

        Args:
          node: the root of the parse tree that matched the fixer.
          results: a dict mapping symbolic names to part of the match.

        Returns:
          None, or a node that is a modified copy of the
          argument node.  The node argument may also be modified in-place
          to effect the same change.

        Subclass *must* override.
        """
        raise NotImplementedError()
""" name = template while name in self.used_names: name = template + str(next(self.numbers)) self.used_names.add(name) return name def log_message(self, message): if self.first_log: self.first_log = False self.log.append("### In file %s ###" % self.filename) self.log.append(message) def cannot_convert(self, node, reason=None): """Warn the user that a given chunk of code is not valid Python 3, but that it cannot be converted automatically. First argument is the top-level node for the code in question. Optional second argument is why it can't be converted. """ lineno = node.get_lineno() for_output = node.clone() for_output.prefix = "" msg = "Line %d: could not convert: %s" self.log_message(msg % (lineno, for_output)) if reason: self.log_message(reason) def warning(self, node, reason): """Used for warning the user about possible uncertainty in the translation. First argument is the top-level node for the code in question. Optional second argument is why it can't be converted. """ lineno = node.get_lineno() self.log_message("Line %d: %s" % (lineno, reason)) def start_tree(self, tree, filename): """Some fixers need to maintain tree-wide state. This method is called once, at the start of tree fix-up. tree - the root node of the tree to be processed. filename - the name of the file the tree came from. """ self.used_names = tree.used_names self.set_filename(filename) self.numbers = itertools.count(1) self.first_log = True def finish_tree(self, tree, filename): """Some fixers need to maintain tree-wide state. This method is called once, at the conclusion of tree fix-up. tree - the root node of the tree to be processed. filename - the name of the file the tree came from. """ pass class ConditionalFix(BaseFix): """ Base class for fixers which not execute if an import is found. """ # This is the name of the import which, if found, will cause the test to be skipped skip_on = None def start_tree(self, *args): super(ConditionalFix, self).start_tree(*args) self._should_skip = None def should_skip(self, node): if self._should_skip is not None: return self._should_skip pkg = self.skip_on.split(".") name = pkg[-1] pkg = ".".join(pkg[:-1]) self._should_skip = does_tree_import(pkg, name, node) return self._should_skip Grammar.txt000064400000020770151027012300006670 0ustar00# Grammar for 2to3. This grammar supports Python 2.x and 3.x. # NOTE WELL: You should also follow all the steps listed at # https://devguide.python.org/grammar/ # Start symbols for the grammar: # file_input is a module or sequence of commands read from an input file; # single_input is a single interactive statement; # eval_input is the input for the eval() and input() functions. # NB: compound_stmt in single_input is followed by extra NEWLINE! 
# ===== Grammar.txt =====
# Grammar for 2to3. This grammar supports Python 2.x and 3.x.

# NOTE WELL: You should also follow all the steps listed at
# https://devguide.python.org/grammar/

# Start symbols for the grammar:
#       file_input is a module or sequence of commands read from an input file;
#       single_input is a single interactive statement;
#       eval_input is the input for the eval() and input() functions.
# NB: compound_stmt in single_input is followed by extra NEWLINE!

file_input: (NEWLINE | stmt)* ENDMARKER
single_input: NEWLINE | simple_stmt | compound_stmt NEWLINE
eval_input: testlist NEWLINE* ENDMARKER

decorator: '@' dotted_name [ '(' [arglist] ')' ] NEWLINE
decorators: decorator+
decorated: decorators (classdef | funcdef | async_funcdef)
async_funcdef: ASYNC funcdef
funcdef: 'def' NAME parameters ['->' test] ':' suite
parameters: '(' [typedargslist] ')'

# The following definition for typedargslist is equivalent to this set of rules:
#
#     arguments = argument (',' argument)*
#     argument = tfpdef ['=' test]
#     kwargs = '**' tname [',']
#     args = '*' [tname]
#     kwonly_kwargs = (',' argument)* [',' [kwargs]]
#     args_kwonly_kwargs = args kwonly_kwargs | kwargs
#     poskeyword_args_kwonly_kwargs = arguments [',' [args_kwonly_kwargs]]
#     typedargslist_no_posonly = poskeyword_args_kwonly_kwargs | args_kwonly_kwargs
#     typedargslist = (arguments ',' '/' [',' [typedargslist_no_posonly]]) | typedargslist_no_posonly
#
# It needs to be fully expanded to allow our LL(1) parser to work on it.

typedargslist: tfpdef ['=' test] (',' tfpdef ['=' test])* ',' '/' [
                     ',' [((tfpdef ['=' test] ',')* ('*' [tname] (',' tname ['=' test])*
                            [',' ['**' tname [',']]] | '**' tname [','])
                     | tfpdef ['=' test] (',' tfpdef ['=' test])* [','])]
                ] | ((tfpdef ['=' test] ',')* ('*' [tname] (',' tname ['=' test])*
                     [',' ['**' tname [',']]] | '**' tname [','])
                     | tfpdef ['=' test] (',' tfpdef ['=' test])* [','])

tname: NAME [':' test]
tfpdef: tname | '(' tfplist ')'
tfplist: tfpdef (',' tfpdef)* [',']

# The following definition for varargslist is equivalent to this set of rules:
#
#     arguments = argument (',' argument )*
#     argument = vfpdef ['=' test]
#     kwargs = '**' vname [',']
#     args = '*' [vname]
#     kwonly_kwargs = (',' argument )* [',' [kwargs]]
#     args_kwonly_kwargs = args kwonly_kwargs | kwargs
#     poskeyword_args_kwonly_kwargs = arguments [',' [args_kwonly_kwargs]]
#     vararglist_no_posonly = poskeyword_args_kwonly_kwargs | args_kwonly_kwargs
#     varargslist = (arguments ',' '/' [',' [vararglist_no_posonly]]) | vararglist_no_posonly
#
# It needs to be fully expanded to allow our LL(1) parser to work on it.
varargslist: vfpdef ['=' test] (',' vfpdef ['=' test])* ',' '/' [',' [
                     ((vfpdef ['=' test] ',')* ('*' [vname] (',' vname ['=' test])*
                            [',' ['**' vname [',']]] | '**' vname [','])
                     | vfpdef ['=' test] (',' vfpdef ['=' test])* [','])
                     ]] | ((vfpdef ['=' test] ',')*
                     ('*' [vname] (',' vname ['=' test])* [',' ['**' vname [',']]] | '**' vname [','])
                     | vfpdef ['=' test] (',' vfpdef ['=' test])* [','])

vname: NAME
vfpdef: vname | '(' vfplist ')'
vfplist: vfpdef (',' vfpdef)* [',']

stmt: simple_stmt | compound_stmt
simple_stmt: small_stmt (';' small_stmt)* [';'] NEWLINE
small_stmt: (expr_stmt | print_stmt  | del_stmt | pass_stmt | flow_stmt |
             import_stmt | global_stmt | exec_stmt | assert_stmt)
expr_stmt: testlist_star_expr (annassign | augassign (yield_expr|testlist) |
                     ('=' (yield_expr|testlist_star_expr))*)
annassign: ':' test ['=' test]
testlist_star_expr: (test|star_expr) (',' (test|star_expr))* [',']
augassign: ('+=' | '-=' | '*=' | '@=' | '/=' | '%=' | '&=' | '|=' | '^=' |
            '<<=' | '>>=' | '**=' | '//=')
# For normal and annotated assignments, additional restrictions enforced by the interpreter
print_stmt: 'print' ( [ test (',' test)* [','] ] |
                      '>>' test [ (',' test)+ [','] ] )
del_stmt: 'del' exprlist
pass_stmt: 'pass'
flow_stmt: break_stmt | continue_stmt | return_stmt | raise_stmt | yield_stmt
break_stmt: 'break'
continue_stmt: 'continue'
return_stmt: 'return' [testlist_star_expr]
yield_stmt: yield_expr
raise_stmt: 'raise' [test ['from' test | ',' test [',' test]]]
import_stmt: import_name | import_from
import_name: 'import' dotted_as_names
import_from: ('from' ('.'* dotted_name | '.'+)
              'import' ('*' | '(' import_as_names ')' | import_as_names))
import_as_name: NAME ['as' NAME]
dotted_as_name: dotted_name ['as' NAME]
import_as_names: import_as_name (',' import_as_name)* [',']
dotted_as_names: dotted_as_name (',' dotted_as_name)*
dotted_name: NAME ('.' NAME)*
global_stmt: ('global' | 'nonlocal') NAME (',' NAME)*
exec_stmt: 'exec' expr ['in' test [',' test]]
assert_stmt: 'assert' test [',' test]

compound_stmt: if_stmt | while_stmt | for_stmt | try_stmt | with_stmt | funcdef | classdef | decorated | async_stmt
async_stmt: ASYNC (funcdef | with_stmt | for_stmt)
if_stmt: 'if' namedexpr_test ':' suite ('elif' namedexpr_test ':' suite)* ['else' ':' suite]
while_stmt: 'while' namedexpr_test ':' suite ['else' ':' suite]
for_stmt: 'for' exprlist 'in' testlist ':' suite ['else' ':' suite]
try_stmt: ('try' ':' suite
           ((except_clause ':' suite)+ ['else' ':' suite]
            ['finally' ':' suite] |
            'finally' ':' suite))
with_stmt: 'with' with_item (',' with_item)*  ':' suite
with_item: test ['as' expr]
with_var: 'as' expr
# NB compile.c makes sure that the default except clause is last
except_clause: 'except' [test [(',' | 'as') test]]
suite: simple_stmt | NEWLINE INDENT stmt+ DEDENT

# Backward compatibility cruft to support:
# [ x for x in lambda: True, lambda: False if x() ]
# even while also allowing:
# lambda x: 5 if x else 2
# (But not a mix of the two)
testlist_safe: old_test [(',' old_test)+ [',']]
old_test: or_test | old_lambdef
old_lambdef: 'lambda' [varargslist] ':' old_test

namedexpr_test: test [':=' test]
test: or_test ['if' or_test 'else' test] | lambdef
or_test: and_test ('or' and_test)*
and_test: not_test ('and' not_test)*
not_test: 'not' not_test | comparison
comparison: expr (comp_op expr)*
comp_op: '<'|'>'|'=='|'>='|'<='|'<>'|'!='|'in'|'not' 'in'|'is'|'is' 'not'
star_expr: '*' expr
expr: xor_expr ('|' xor_expr)*
xor_expr: and_expr ('^' and_expr)*
and_expr: shift_expr ('&' shift_expr)*
shift_expr: arith_expr (('<<'|'>>') arith_expr)*
arith_expr: term (('+'|'-') term)*
term: factor (('*'|'@'|'/'|'%'|'//') factor)*
factor: ('+'|'-'|'~') factor | power
power: [AWAIT] atom trailer* ['**' factor]
atom: ('(' [yield_expr|testlist_gexp] ')' |
       '[' [listmaker] ']' |
       '{' [dictsetmaker] '}' |
       '`' testlist1 '`' |
       NAME | NUMBER | STRING+ | '.' '.' '.')
listmaker: (namedexpr_test|star_expr) ( comp_for | (',' (namedexpr_test|star_expr))* [','] )
testlist_gexp: (namedexpr_test|star_expr) ( comp_for | (',' (namedexpr_test|star_expr))* [','] )
lambdef: 'lambda' [varargslist] ':' test
trailer: '(' [arglist] ')' | '[' subscriptlist ']' | '.' NAME
subscriptlist: subscript (',' subscript)* [',']
subscript: test | [test] ':' [test] [sliceop]
sliceop: ':' [test]
exprlist: (expr|star_expr) (',' (expr|star_expr))* [',']
testlist: test (',' test)* [',']
dictsetmaker: ( ((test ':' test | '**' expr)
                 (comp_for | (',' (test ':' test | '**' expr))* [','])) |
                ((test | star_expr)
                 (comp_for | (',' (test | star_expr))* [','])) )

classdef: 'class' NAME ['(' [arglist] ')'] ':' suite

arglist: argument (',' argument)* [',']

# "test '=' test" is really "keyword '=' test", but we have no such token.
# These need to be in a single rule to avoid grammar that is ambiguous
# to our LL(1) parser. Even though 'test' includes '*expr' in star_expr,
# we explicitly match '*' here, too, to give it proper precedence.
# Illegal combinations and orderings are blocked in ast.c:
# multiple (test comp_for) arguments are blocked; keyword unpackings
# that precede iterable unpackings are blocked; etc.
argument: ( test [comp_for] |
            test ':=' test |
            test '=' test |
            '**' test |
            '*' test )

comp_iter: comp_for | comp_if
comp_for: [ASYNC] 'for' exprlist 'in' testlist_safe [comp_iter]
comp_if: 'if' old_test [comp_iter]

testlist1: test (',' test)*

# not used in grammar, but may appear in "node" passed from Parser to Compiler
encoding_decl: NAME

yield_expr: 'yield' [yield_arg]
yield_arg: 'from' test | testlist_star_expr
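# --- Example (editor's addition, not part of the original sources) ---
# A small sketch showing the grammar above in action: lib2to3 compiles
# Grammar.txt into parse tables (see pgen2/), and the driver uses those
# tables to parse source text into a pytree. Only public lib2to3 modules
# are used here.

from lib2to3 import pygram, pytree
from lib2to3.pgen2 import driver

d = driver.Driver(pygram.python_grammar, convert=pytree.convert)
tree = d.parse_string("x = [i for i in range(3)]\n")
print(type(tree), tree)   # a pytree Node whose str() round-trips the source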
# ===== fixer_util.py =====
"""Utility functions, node construction macros, etc."""
# Author: Collin Winter

# Local imports
from .pgen2 import token
from .pytree import Leaf, Node
from .pygram import python_symbols as syms
from . import patcomp


###########################################################
### Common node-construction "macros"
###########################################################

def KeywordArg(keyword, value):
    return Node(syms.argument,
                [keyword, Leaf(token.EQUAL, "="), value])

def LParen():
    return Leaf(token.LPAR, "(")

def RParen():
    return Leaf(token.RPAR, ")")

def Assign(target, source):
    """Build an assignment statement"""
    if not isinstance(target, list):
        target = [target]
    if not isinstance(source, list):
        source.prefix = " "
        source = [source]
    return Node(syms.atom,
                target + [Leaf(token.EQUAL, "=", prefix=" ")] + source)

def Name(name, prefix=None):
    """Return a NAME leaf"""
    return Leaf(token.NAME, name, prefix=prefix)

def Attr(obj, attr):
    """A node tuple for obj.attr"""
    return [obj, Node(syms.trailer, [Dot(), attr])]

def Comma():
    """A comma leaf"""
    return Leaf(token.COMMA, ",")

def Dot():
    """A period (.) leaf"""
    return Leaf(token.DOT, ".")

def ArgList(args, lparen=LParen(), rparen=RParen()):
    """A parenthesised argument list, used by Call()"""
    node = Node(syms.trailer, [lparen.clone(), rparen.clone()])
    if args:
        node.insert_child(1, Node(syms.arglist, args))
    return node

def Call(func_name, args=None, prefix=None):
    """A function call"""
    node = Node(syms.power, [func_name, ArgList(args)])
    if prefix is not None:
        node.prefix = prefix
    return node

def Newline():
    """A newline literal"""
    return Leaf(token.NEWLINE, "\n")

def BlankLine():
    """A blank line"""
    return Leaf(token.NEWLINE, "")

def Number(n, prefix=None):
    return Leaf(token.NUMBER, n, prefix=prefix)

def Subscript(index_node):
    """A numeric or string subscript"""
    return Node(syms.trailer, [Leaf(token.LBRACE, "["),
                               index_node,
                               Leaf(token.RBRACE, "]")])

def String(string, prefix=None):
    """A string leaf"""
    return Leaf(token.STRING, string, prefix=prefix)

def ListComp(xp, fp, it, test=None):
    """A list comprehension of the form [xp for fp in it if test].

    If test is None, the "if test" part is omitted.
    """
    xp.prefix = ""
    fp.prefix = " "
    it.prefix = " "
    for_leaf = Leaf(token.NAME, "for")
    for_leaf.prefix = " "
    in_leaf = Leaf(token.NAME, "in")
    in_leaf.prefix = " "
    inner_args = [for_leaf, fp, in_leaf, it]
    if test:
        test.prefix = " "
        if_leaf = Leaf(token.NAME, "if")
        if_leaf.prefix = " "
        inner_args.append(Node(syms.comp_if, [if_leaf, test]))
    inner = Node(syms.listmaker, [xp, Node(syms.comp_for, inner_args)])
    return Node(syms.atom,
                [Leaf(token.LBRACE, "["),
                 inner,
                 Leaf(token.RBRACE, "]")])

def FromImport(package_name, name_leafs):
    """Return an import statement in the form:
    from package import name_leafs"""
    # XXX: May not handle dotted imports properly (eg, package_name='foo.bar')
    #assert package_name == '.' or '.' not in package_name, "FromImport has "\
    #       "not been tested with dotted package names -- use at your own "\
    #       "peril!"

    for leaf in name_leafs:
        # Pull the leaves out of their old tree
        leaf.remove()

    children = [Leaf(token.NAME, "from"),
                Leaf(token.NAME, package_name, prefix=" "),
                Leaf(token.NAME, "import", prefix=" "),
                Node(syms.import_as_names, name_leafs)]
    imp = Node(syms.import_from, children)
    return imp

def ImportAndCall(node, results, names):
    """Returns an import statement and calls a method
    of the module:

    import module
    module.name()"""
    obj = results["obj"].clone()
    if obj.type == syms.arglist:
        newarglist = obj.clone()
    else:
        newarglist = Node(syms.arglist, [obj.clone()])
    after = results["after"]
    if after:
        after = [n.clone() for n in after]
    new = Node(syms.power,
               Attr(Name(names[0]), Name(names[1])) +
               [Node(syms.trailer,
                     [results["lpar"].clone(),
                      newarglist,
                      results["rpar"].clone()])] + after)
    new.prefix = node.prefix
    return new
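# --- Example (editor's addition, not part of the original sources) ---
# A quick sketch of the node-construction macros above, in the same style
# ImportAndCall uses: build module.func(a, b) as a pytree fragment and
# render it back to source. The names "module", "func", "a", "b" are
# placeholders.

from lib2to3.fixer_util import ArgList, Attr, Comma, Name, Node, syms

new = Node(syms.power,
           Attr(Name("module"), Name("func")) +
           [ArgList([Name("a"), Comma(), Name("b", prefix=" ")])])
print(str(new))   # -> module.func(a, b)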
""" prev = node.prev_sibling if prev is not None and prev.type == token.DOT: # Attribute lookup. return False parent = node.parent if parent.type in (syms.funcdef, syms.classdef): return False if parent.type == syms.expr_stmt and parent.children[0] is node: # Assignment. return False if parent.type == syms.parameters or \ (parent.type == syms.typedargslist and ( (prev is not None and prev.type == token.COMMA) or parent.children[0] is node )): # The name of an argument. return False return True def find_indentation(node): """Find the indentation of *node*.""" while node is not None: if node.type == syms.suite and len(node.children) > 2: indent = node.children[1] if indent.type == token.INDENT: return indent.value node = node.parent return "" ########################################################### ### The following functions are to find bindings in a suite ########################################################### def make_suite(node): if node.type == syms.suite: return node node = node.clone() parent, node.parent = node.parent, None suite = Node(syms.suite, [node]) suite.parent = parent return suite def find_root(node): """Find the top level namespace.""" # Scamper up to the top level namespace while node.type != syms.file_input: node = node.parent if not node: raise ValueError("root found before file_input node was found.") return node def does_tree_import(package, name, node): """ Returns true if name is imported from package at the top level of the tree which node belongs to. To cover the case of an import like 'import foo', use None for the package and 'foo' for the name. """ binding = find_binding(name, find_root(node), package) return bool(binding) def is_import(node): """Returns true if the node is an import statement.""" return node.type in (syms.import_name, syms.import_from) def touch_import(package, name, node): """ Works like `does_tree_import` but adds an import statement if it was not imported. """ def is_import_stmt(node): return (node.type == syms.simple_stmt and node.children and is_import(node.children[0])) root = find_root(node) if does_tree_import(package, name, root): return # figure out where to insert the new import. First try to find # the first import and then skip to the last one. insert_pos = offset = 0 for idx, node in enumerate(root.children): if not is_import_stmt(node): continue for offset, node2 in enumerate(root.children[idx:]): if not is_import_stmt(node2): break insert_pos = idx + offset break # if there are no imports where we can insert, find the docstring. # if that also fails, we stick to the beginning of the file if insert_pos == 0: for idx, node in enumerate(root.children): if (node.type == syms.simple_stmt and node.children and node.children[0].type == token.STRING): insert_pos = idx + 1 break if package is None: import_ = Node(syms.import_name, [ Leaf(token.NAME, "import"), Leaf(token.NAME, name, prefix=" ") ]) else: import_ = FromImport(package, [Leaf(token.NAME, name, prefix=" ")]) children = [import_, Newline()] root.insert_child(insert_pos, Node(syms.simple_stmt, children)) _def_syms = {syms.classdef, syms.funcdef} def find_binding(name, node, package=None): """ Returns the node which binds variable name, otherwise None. If optional argument package is supplied, only imports will be returned. 
_def_syms = {syms.classdef, syms.funcdef}
def find_binding(name, node, package=None):
    """ Returns the node which binds variable name, otherwise None.
        If optional argument package is supplied, only imports will
        be returned.
        See test cases for examples."""
    for child in node.children:
        ret = None
        if child.type == syms.for_stmt:
            if _find(name, child.children[1]):
                return child
            n = find_binding(name, make_suite(child.children[-1]), package)
            if n: ret = n
        elif child.type in (syms.if_stmt, syms.while_stmt):
            n = find_binding(name, make_suite(child.children[-1]), package)
            if n: ret = n
        elif child.type == syms.try_stmt:
            n = find_binding(name, make_suite(child.children[2]), package)
            if n:
                ret = n
            else:
                for i, kid in enumerate(child.children[3:]):
                    if kid.type == token.COLON and kid.value == ":":
                        # i+3 is the colon, i+4 is the suite
                        n = find_binding(name, make_suite(child.children[i+4]),
                                         package)
                        if n: ret = n
        elif child.type in _def_syms and child.children[1].value == name:
            ret = child
        elif _is_import_binding(child, name, package):
            ret = child
        elif child.type == syms.simple_stmt:
            ret = find_binding(name, child, package)
        elif child.type == syms.expr_stmt:
            if _find(name, child.children[0]):
                ret = child

        if ret:
            if not package:
                return ret
            if is_import(ret):
                return ret
    return None

_block_syms = {syms.funcdef, syms.classdef, syms.trailer}
def _find(name, node):
    nodes = [node]
    while nodes:
        node = nodes.pop()
        if node.type > 256 and node.type not in _block_syms:
            nodes.extend(node.children)
        elif node.type == token.NAME and node.value == name:
            return node
    return None

def _is_import_binding(node, name, package=None):
    """ Will return node if node will import name, or node
        will import * from package.  None is returned otherwise.
        See test cases for examples. """

    if node.type == syms.import_name and not package:
        imp = node.children[1]
        if imp.type == syms.dotted_as_names:
            for child in imp.children:
                if child.type == syms.dotted_as_name:
                    if child.children[2].value == name:
                        return node
                elif child.type == token.NAME and child.value == name:
                    return node
        elif imp.type == syms.dotted_as_name:
            last = imp.children[-1]
            if last.type == token.NAME and last.value == name:
                return node
        elif imp.type == token.NAME and imp.value == name:
            return node
    elif node.type == syms.import_from:
        # str(...) is used to make life easier here, because
        # from a.b import parses to ['import', ['a', '.', 'b'], ...]
        if package and str(node.children[1]).strip() != package:
            return None
        n = node.children[3]
        if package and _find("as", n):
            # See test_from_import_as for explanation
            return None
        elif n.type == syms.import_as_names and _find(name, n):
            return node
        elif n.type == syms.import_as_name:
            child = n.children[2]
            if child.type == token.NAME and child.value == name:
                return node
        elif n.type == token.NAME and n.value == name:
            return node
        elif package and n.type == token.STAR:
            return node
    return None
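# --- Example (editor's addition, not part of the original sources) ---
# Sketch of the binding/import helpers above: parse a snippet, add a missing
# import with touch_import, and confirm does_tree_import now sees it. The
# snippet text is illustrative.

from lib2to3 import pygram, pytree
from lib2to3.pgen2 import driver
from lib2to3.fixer_util import does_tree_import, touch_import

d = driver.Driver(pygram.python_grammar, convert=pytree.convert)
tree = d.parse_string("x = path.join('a', 'b')\n")
print(does_tree_import("os", "path", tree))   # False
touch_import("os", "path", tree)              # inserts "from os import path"
print(does_tree_import("os", "path", tree))   # True
print(str(tree))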
# ===== __main__.py =====
import sys
from .main import main

sys.exit(main("lib2to3.fixes"))


# ===== pgen2/parse.py =====
# Copyright 2004-2005 Elemental Security, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

"""Parser engine for the grammar tables generated by pgen.

The grammar table must be loaded first.

See Parser/parser.c in the Python distribution for additional info on
how this parsing engine works.

"""

# Local imports
from . import token

class ParseError(Exception):
    """Exception to signal the parser is stuck."""

    def __init__(self, msg, type, value, context):
        Exception.__init__(self, "%s: type=%r, value=%r, context=%r" %
                           (msg, type, value, context))
        self.msg = msg
        self.type = type
        self.value = value
        self.context = context

    def __reduce__(self):
        return type(self), (self.msg, self.type, self.value, self.context)

class Parser(object):
    """Parser engine.

    The proper usage sequence is:

    p = Parser(grammar, [converter])  # create instance
    p.setup([start])                  # prepare for parsing
    <for each input token>:
        if p.addtoken(...):           # parse a token; may raise ParseError
            break
    root = p.rootnode                 # root of abstract syntax tree

    A Parser instance may be reused by calling setup() repeatedly.

    A Parser instance contains state pertaining to the current token
    sequence, and should not be used concurrently by different threads
    to parse separate token sequences.

    See driver.py for how to get input tokens by tokenizing a file or
    string.

    Parsing is complete when addtoken() returns True; the root of the
    abstract syntax tree can then be retrieved from the rootnode
    instance variable.  When a syntax error occurs, addtoken() raises
    the ParseError exception.  There is no error recovery; the parser
    cannot be used after a syntax error was reported (but it can be
    reinitialized by calling setup()).

    """

    def __init__(self, grammar, convert=None):
        """Constructor.

        The grammar argument is a grammar.Grammar instance; see the
        grammar module for more information.

        The parser is not ready yet for parsing; you must call the
        setup() method to get it started.

        The optional convert argument is a function mapping concrete
        syntax tree nodes to abstract syntax tree nodes.  If not
        given, no conversion is done and the syntax tree produced is
        the concrete syntax tree.  If given, it must be a function of
        two arguments, the first being the grammar (a grammar.Grammar
        instance), and the second being the concrete syntax tree node
        to be converted.  The syntax tree is converted from the bottom
        up.

        A concrete syntax tree node is a (type, value, context, nodes)
        tuple, where type is the node type (a token or symbol number),
        value is None for symbols and a string for tokens, context is
        None or an opaque value used for error reporting (typically a
        (lineno, offset) pair), and nodes is a list of children for
        symbols, and None for tokens.

        An abstract syntax tree node may be anything; this is entirely
        up to the converter function.

        """
        self.grammar = grammar
        self.convert = convert or (lambda grammar, node: node)

    def setup(self, start=None):
        """Prepare for parsing.

        This *must* be called before starting to parse.

        The optional argument is an alternative start symbol; it
        defaults to the grammar's start symbol.

        You can use a Parser instance to parse any number of programs;
        each time you call setup() the parser is reset to an initial
        state determined by the (implicit or explicit) start symbol.

        """
        if start is None:
            start = self.grammar.start
        # Each stack entry is a tuple: (dfa, state, node).
        # A node is a tuple: (type, value, context, children),
        # where children is a list of nodes or None, and context may be None.
        newnode = (start, None, None, [])
        stackentry = (self.grammar.dfas[start], 0, newnode)
        self.stack = [stackentry]
        self.rootnode = None
        self.used_names = set() # Aliased to self.rootnode.used_names in pop()

    def addtoken(self, type, value, context):
        """Add a token; return True iff this is the end of the program."""
        # Map from token to label
        ilabel = self.classify(type, value, context)
        # Loop until the token is shifted; may raise exceptions
        while True:
            dfa, state, node = self.stack[-1]
            states, first = dfa
            arcs = states[state]
            # Look for a state with this label
            for i, newstate in arcs:
                t, v = self.grammar.labels[i]
                if ilabel == i:
                    # Look it up in the list of labels
                    assert t < 256
                    # Shift a token; we're done with it
                    self.shift(type, value, newstate, context)
                    # Pop while we are in an accept-only state
                    state = newstate
                    while states[state] == [(0, state)]:
                        self.pop()
                        if not self.stack:
                            # Done parsing!
                            return True
                        dfa, state, node = self.stack[-1]
                        states, first = dfa
                    # Done with this token
                    return False
                elif t >= 256:
                    # See if it's a symbol and if we're in its first set
                    itsdfa = self.grammar.dfas[t]
                    itsstates, itsfirst = itsdfa
                    if ilabel in itsfirst:
                        # Push a symbol
                        self.push(t, self.grammar.dfas[t], newstate, context)
                        break # To continue the outer while loop
            else:
                if (0, state) in arcs:
                    # An accepting state, pop it and try something else
                    self.pop()
                    if not self.stack:
                        # Done parsing, but another token is input
                        raise ParseError("too much input",
                                         type, value, context)
                else:
                    # No success finding a transition
                    raise ParseError("bad input", type, value, context)

    def classify(self, type, value, context):
        """Turn a token into a label.  (Internal)"""
        if type == token.NAME:
            # Keep a listing of all used names
            self.used_names.add(value)
            # Check for reserved words
            ilabel = self.grammar.keywords.get(value)
            if ilabel is not None:
                return ilabel
        ilabel = self.grammar.tokens.get(type)
        if ilabel is None:
            raise ParseError("bad token", type, value, context)
        return ilabel

    def shift(self, type, value, newstate, context):
        """Shift a token.  (Internal)"""
        dfa, state, node = self.stack[-1]
        newnode = (type, value, context, None)
        newnode = self.convert(self.grammar, newnode)
        if newnode is not None:
            node[-1].append(newnode)
        self.stack[-1] = (dfa, newstate, node)

    def push(self, type, newdfa, newstate, context):
        """Push a nonterminal.  (Internal)"""
        dfa, state, node = self.stack[-1]
        newnode = (type, None, context, [])
        self.stack[-1] = (dfa, newstate, node)
        self.stack.append((newdfa, 0, newnode))

    def pop(self):
        """Pop a nonterminal.  (Internal)"""
        popdfa, popstate, popnode = self.stack.pop()
        newnode = self.convert(self.grammar, popnode)
        if newnode is not None:
            if self.stack:
                dfa, state, node = self.stack[-1]
                node[-1].append(newnode)
            else:
                self.rootnode = newnode
                self.rootnode.used_names = self.used_names
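# --- Example (editor's addition, not part of the original sources) ---
# The usage sequence from the Parser docstring, spelled out: tokenize a
# string and feed each token to addtoken() until it reports completion.
# driver.Driver does the same thing plus whitespace/comment bookkeeping.

import io
from lib2to3 import pygram, pytree
from lib2to3.pgen2 import grammar, parse, token, tokenize

p = parse.Parser(pygram.python_grammar, convert=pytree.convert)
p.setup()
for t, value, start, end, line in tokenize.generate_tokens(
        io.StringIO("a = 1\n").readline):
    if t in (tokenize.COMMENT, tokenize.NL):
        continue                  # the driver folds these into token prefixes
    if t == token.OP:
        t = grammar.opmap[value]  # map the generic OP to the real operator
    if p.addtoken(t, value, ("", start)):
        break                     # ENDMARKER shifted: parsing is complete
print(p.rootnode)                 # the file_input pytree Node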
# ===== pgen2/driver.py =====
# Copyright 2004-2005 Elemental Security, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

# Modifications:
# Copyright 2006 Google, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

"""Parser driver.

This provides a high-level interface to parse a file into a syntax tree.

"""

__author__ = "Guido van Rossum <guido@python.org>"

__all__ = ["Driver", "load_grammar"]

# Python imports
import io
import os
import logging
import pkgutil
import sys

# Pgen imports
from . import grammar, parse, token, tokenize, pgen


class Driver(object):

    def __init__(self, grammar, convert=None, logger=None):
        self.grammar = grammar
        if logger is None:
            logger = logging.getLogger()
        self.logger = logger
        self.convert = convert

    def parse_tokens(self, tokens, debug=False):
        """Parse a series of tokens and return the syntax tree."""
        # XXX Move the prefix computation into a wrapper around tokenize.
        p = parse.Parser(self.grammar, self.convert)
        p.setup()
        lineno = 1
        column = 0
        type = value = start = end = line_text = None
        prefix = ""
        for quintuple in tokens:
            type, value, start, end, line_text = quintuple
            if start != (lineno, column):
                assert (lineno, column) <= start, ((lineno, column), start)
                s_lineno, s_column = start
                if lineno < s_lineno:
                    prefix += "\n" * (s_lineno - lineno)
                    lineno = s_lineno
                    column = 0
                if column < s_column:
                    prefix += line_text[column:s_column]
                    column = s_column
            if type in (tokenize.COMMENT, tokenize.NL):
                prefix += value
                lineno, column = end
                if value.endswith("\n"):
                    lineno += 1
                    column = 0
                continue
            if type == token.OP:
                type = grammar.opmap[value]
            if debug:
                self.logger.debug("%s %r (prefix=%r)",
                                  token.tok_name[type], value, prefix)
            if p.addtoken(type, value, (prefix, start)):
                if debug:
                    self.logger.debug("Stop.")
                break
            prefix = ""
            lineno, column = end
            if value.endswith("\n"):
                lineno += 1
                column = 0
        else:
            # We never broke out -- EOF is too soon (how can this happen???)
            raise parse.ParseError("incomplete input",
                                   type, value, (prefix, start))
        return p.rootnode

    def parse_stream_raw(self, stream, debug=False):
        """Parse a stream and return the syntax tree."""
        tokens = tokenize.generate_tokens(stream.readline)
        return self.parse_tokens(tokens, debug)

    def parse_stream(self, stream, debug=False):
        """Parse a stream and return the syntax tree."""
        return self.parse_stream_raw(stream, debug)

    def parse_file(self, filename, encoding=None, debug=False):
        """Parse a file and return the syntax tree."""
        with io.open(filename, "r", encoding=encoding) as stream:
            return self.parse_stream(stream, debug)

    def parse_string(self, text, debug=False):
        """Parse a string and return the syntax tree."""
        tokens = tokenize.generate_tokens(io.StringIO(text).readline)
        return self.parse_tokens(tokens, debug)


def _generate_pickle_name(gt):
    head, tail = os.path.splitext(gt)
    if tail == ".txt":
        tail = ""
    return head + tail + ".".join(map(str, sys.version_info)) + ".pickle"


def load_grammar(gt="Grammar.txt", gp=None,
                 save=True, force=False, logger=None):
    """Load the grammar (maybe from a pickle)."""
    if logger is None:
        logger = logging.getLogger()
    gp = _generate_pickle_name(gt) if gp is None else gp
    if force or not _newer(gp, gt):
        logger.info("Generating grammar tables from %s", gt)
        g = pgen.generate_grammar(gt)
        if save:
            logger.info("Writing grammar tables to %s", gp)
            try:
                g.dump(gp)
            except OSError as e:
                logger.info("Writing failed: %s", e)
    else:
        g = grammar.Grammar()
        g.load(gp)
    return g


def _newer(a, b):
    """Inquire whether file a was written since file b."""
    if not os.path.exists(a):
        return False
    if not os.path.exists(b):
        return True
    return os.path.getmtime(a) >= os.path.getmtime(b)


def load_packaged_grammar(package, grammar_source):
    """Normally, loads a pickled grammar by doing
        pkgutil.get_data(package, pickled_grammar)
    where *pickled_grammar* is computed from *grammar_source* by adding the
    Python version and using a ``.pickle`` extension.

    However, if *grammar_source* is an extant file, load_grammar(grammar_source)
    is called instead. This facilitates using a packaged grammar file when needed
    but preserves load_grammar's automatic regeneration behavior when possible.

    """
    if os.path.isfile(grammar_source):
        return load_grammar(grammar_source)
    pickled_name = _generate_pickle_name(os.path.basename(grammar_source))
    data = pkgutil.get_data(package, pickled_name)
    g = grammar.Grammar()
    g.loads(data)
    return g


def main(*args):
    """Main program, when run as a script: produce grammar pickle files.

    Calls load_grammar for each argument, a path to a grammar text file.
    """
    if not args:
        args = sys.argv[1:]
    logging.basicConfig(level=logging.INFO, stream=sys.stdout,
                        format='%(message)s')
    for gt in args:
        load_grammar(gt, save=True, force=True)
    return True

if __name__ == "__main__":
    sys.exit(int(not main()))
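# --- Example (editor's addition, not part of the original sources) ---
# Sketch of the caching scheme above: load_grammar() derives a per-Python-
# version pickle name from the grammar file and regenerates the tables only
# when the text file is newer (see _newer). The "Grammar.txt" path is an
# assumption; the file must exist on disk for load_grammar() to run.

from lib2to3.pgen2 import driver

print(driver._generate_pickle_name("Grammar.txt"))
# e.g. 'Grammar3.11.0.final.0.pickle' on CPython 3.11.0
g = driver.load_grammar("Grammar.txt", save=False)
print(len(g.dfas), "nonterminals,", len(g.labels), "labels")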
# ===== pgen2/token.py =====
#! /usr/bin/python3.11

"""Token constants (from "token.h")."""

#  Taken from Python (r53757) and modified to include some tokens
#   originally monkeypatched in by pgen2.tokenize

#--start constants--
ENDMARKER = 0
NAME = 1
NUMBER = 2
STRING = 3
NEWLINE = 4
INDENT = 5
DEDENT = 6
LPAR = 7
RPAR = 8
LSQB = 9
RSQB = 10
COLON = 11
COMMA = 12
SEMI = 13
PLUS = 14
MINUS = 15
STAR = 16
SLASH = 17
VBAR = 18
AMPER = 19
LESS = 20
GREATER = 21
EQUAL = 22
DOT = 23
PERCENT = 24
BACKQUOTE = 25
LBRACE = 26
RBRACE = 27
EQEQUAL = 28
NOTEQUAL = 29
LESSEQUAL = 30
GREATEREQUAL = 31
TILDE = 32
CIRCUMFLEX = 33
LEFTSHIFT = 34
RIGHTSHIFT = 35
DOUBLESTAR = 36
PLUSEQUAL = 37
MINEQUAL = 38
STAREQUAL = 39
SLASHEQUAL = 40
PERCENTEQUAL = 41
AMPEREQUAL = 42
VBAREQUAL = 43
CIRCUMFLEXEQUAL = 44
LEFTSHIFTEQUAL = 45
RIGHTSHIFTEQUAL = 46
DOUBLESTAREQUAL = 47
DOUBLESLASH = 48
DOUBLESLASHEQUAL = 49
AT = 50
ATEQUAL = 51
OP = 52
COMMENT = 53
NL = 54
RARROW = 55
AWAIT = 56
ASYNC = 57
ERRORTOKEN = 58
COLONEQUAL = 59
N_TOKENS = 60
NT_OFFSET = 256
#--end constants--

tok_name = {}
for _name, _value in list(globals().items()):
    if type(_value) is type(0):
        tok_name[_value] = _name

def ISTERMINAL(x):
    return x < NT_OFFSET

def ISNONTERMINAL(x):
    return x >= NT_OFFSET

def ISEOF(x):
    return x == ENDMARKER
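# --- Example (editor's addition, not part of the original sources) ---
# The constants above are plain ints; tok_name inverts them and the three
# predicates classify a numeric type. A quick sanity check:

from lib2to3.pgen2 import token

print(token.tok_name[token.NAME])        # 'NAME'
print(token.ISTERMINAL(token.NAME))      # True: below NT_OFFSET (256)
print(token.ISNONTERMINAL(300))          # True: symbol numbers start at 256
print(token.ISEOF(token.ENDMARKER))      # True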
# ===== pgen2/grammar.py =====
# Copyright 2004-2005 Elemental Security, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

"""This module defines the data structures used to represent a grammar.

These are a bit arcane because they are derived from the data
structures used by Python's 'pgen' parser generator.

There's also a table here mapping operators to their names in the
token module; the Python tokenize module reports all operators as the
fallback token code OP, but the parser needs the actual token code.

"""

# Python imports
import pickle

# Local imports
from . import token


class Grammar(object):
    """Pgen parsing tables conversion class.

    Once initialized, this class supplies the grammar tables for the
    parsing engine implemented by parse.py.  The parsing engine
    accesses the instance variables directly.  The class here does not
    provide initialization of the tables; several subclasses exist to
    do this (see the conv and pgen modules).

    The load() method reads the tables from a pickle file, which is
    much faster than the other ways offered by subclasses.  The pickle
    file is written by calling dump() (after loading the grammar
    tables using a subclass).  The report() method prints a readable
    representation of the tables to stdout, for debugging.

    The instance variables are as follows:

    symbol2number -- a dict mapping symbol names to numbers.  Symbol
                     numbers are always 256 or higher, to distinguish
                     them from token numbers, which are between 0 and
                     255 (inclusive).

    number2symbol -- a dict mapping numbers to symbol names;
                     these two are each other's inverse.

    states        -- a list of DFAs, where each DFA is a list of
                     states, each state is a list of arcs, and each
                     arc is a (i, j) pair where i is a label and j is
                     a state number.  The DFA number is the index into
                     this list.  (This name is slightly confusing.)
                     Final states are represented by a special arc of
                     the form (0, j) where j is its own state number.

    dfas          -- a dict mapping symbol numbers to (DFA, first)
                     pairs, where DFA is an item from the states list
                     above, and first is a set of tokens that can
                     begin this grammar rule (represented by a dict
                     whose values are always 1).

    labels        -- a list of (x, y) pairs where x is either a token
                     number or a symbol number, and y is either None
                     or a string; the strings are keywords.  The label
                     number is the index in this list; label numbers
                     are used to mark state transitions (arcs) in the
                     DFAs.

    start         -- the number of the grammar's start symbol.

    keywords      -- a dict mapping keyword strings to arc labels.

    tokens        -- a dict mapping token numbers to arc labels.

    """

    def __init__(self):
        self.symbol2number = {}
        self.number2symbol = {}
        self.states = []
        self.dfas = {}
        self.labels = [(0, "EMPTY")]
        self.keywords = {}
        self.tokens = {}
        self.symbol2label = {}
        self.start = 256

    def dump(self, filename):
        """Dump the grammar tables to a pickle file."""
        with open(filename, "wb") as f:
            pickle.dump(self.__dict__, f, pickle.HIGHEST_PROTOCOL)

    def load(self, filename):
        """Load the grammar tables from a pickle file."""
        with open(filename, "rb") as f:
            d = pickle.load(f)
        self.__dict__.update(d)

    def loads(self, pkl):
        """Load the grammar tables from a pickle bytes object."""
        self.__dict__.update(pickle.loads(pkl))

    def copy(self):
        """
        Copy the grammar.
        """
        new = self.__class__()
        for dict_attr in ("symbol2number", "number2symbol", "dfas", "keywords",
                          "tokens", "symbol2label"):
            setattr(new, dict_attr, getattr(self, dict_attr).copy())
        new.labels = self.labels[:]
        new.states = self.states[:]
        new.start = self.start
        return new

    def report(self):
        """Dump the grammar tables to standard output, for debugging."""
        from pprint import pprint
        print("s2n")
        pprint(self.symbol2number)
        print("n2s")
        pprint(self.number2symbol)
        print("states")
        pprint(self.states)
        print("dfas")
        pprint(self.dfas)
        print("labels")
        pprint(self.labels)
        print("start", self.start)


# Map from operator to number (since tokenize doesn't do this)

opmap_raw = """
( LPAR
) RPAR
[ LSQB
] RSQB
: COLON
, COMMA
; SEMI
+ PLUS
- MINUS
* STAR
/ SLASH
| VBAR
& AMPER
< LESS
> GREATER
= EQUAL
. DOT
% PERCENT
` BACKQUOTE
{ LBRACE
} RBRACE
@ AT
@= ATEQUAL
== EQEQUAL
!= NOTEQUAL
<> NOTEQUAL
<= LESSEQUAL
>= GREATEREQUAL
~ TILDE
^ CIRCUMFLEX
<< LEFTSHIFT
>> RIGHTSHIFT
** DOUBLESTAR
+= PLUSEQUAL
-= MINEQUAL
*= STAREQUAL
/= SLASHEQUAL
%= PERCENTEQUAL
&= AMPEREQUAL
|= VBAREQUAL
^= CIRCUMFLEXEQUAL
<<= LEFTSHIFTEQUAL
>>= RIGHTSHIFTEQUAL
**= DOUBLESTAREQUAL
// DOUBLESLASH
//= DOUBLESLASHEQUAL
-> RARROW
:= COLONEQUAL
"""

opmap = {}
for line in opmap_raw.splitlines():
    if line:
        op, name = line.split()
        opmap[op] = getattr(token, name)
del line, op, name
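# --- Example (editor's addition, not part of the original sources) ---
# Sketch of the table layout documented above, using the pre-built Python
# grammar shipped with lib2to3: symbol2number/number2symbol are inverses,
# and each nonterminal's dfas entry is a (DFA, first-set) pair.

from lib2to3 import pygram

g = pygram.python_grammar
n = g.symbol2number["file_input"]
print(n, g.number2symbol[n])        # the exact number varies by grammar
dfa, first = g.dfas[n]
print(len(dfa), "states; first set has", len(first), "labels")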
# ===== pgen2/conv.py =====
# Copyright 2004-2005 Elemental Security, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

"""Convert graminit.[ch] spit out by pgen to Python code.

Pgen is the Python parser generator.  It is useful to quickly create a
parser from a grammar file in Python's grammar notation.  But I don't
want my parsers to be written in C (yet), so I'm translating the
parsing tables to Python data structures and writing a Python parse
engine.

Note that the token numbers are constants determined by the standard
Python tokenizer.  The standard token module defines these numbers and
their names (the names are not used much).  The token numbers are
hardcoded into the Python tokenizer and into pgen.

A Python implementation of the Python tokenizer is also available, in
the standard tokenize module.

On the other hand, symbol numbers (representing the grammar's
non-terminals) are assigned by pgen based on the actual grammar
input.

Note: this module is pretty much obsolete; the pgen module generates
equivalent grammar tables directly from the Grammar.txt input file
without having to invoke the Python pgen C program.

"""

# Python imports
import re

# Local imports
from pgen2 import grammar, token


class Converter(grammar.Grammar):
    """Grammar subclass that reads classic pgen output files.

    The run() method reads the tables as produced by the pgen parser
    generator, typically contained in two C files, graminit.h and
    graminit.c.  The other methods are for internal use only.

    See the base class for more documentation.

    """

    def run(self, graminit_h, graminit_c):
        """Load the grammar tables from the text files written by pgen."""
        self.parse_graminit_h(graminit_h)
        self.parse_graminit_c(graminit_c)
        self.finish_off()

    def parse_graminit_h(self, filename):
        """Parse the .h file written by pgen.  (Internal)

        This file is a sequence of #define statements defining the
        nonterminals of the grammar as numbers.  We build two tables
        mapping the numbers to names and back.

        """
        try:
            f = open(filename)
        except OSError as err:
            print("Can't open %s: %s" % (filename, err))
            return False
        self.symbol2number = {}
        self.number2symbol = {}
        lineno = 0
        for line in f:
            lineno += 1
            mo = re.match(r"^#define\s+(\w+)\s+(\d+)$", line)
            if not mo and line.strip():
                print("%s(%s): can't parse %s" % (filename, lineno,
                                                  line.strip()))
            else:
                symbol, number = mo.groups()
                number = int(number)
                assert symbol not in self.symbol2number
                assert number not in self.number2symbol
                self.symbol2number[symbol] = number
                self.number2symbol[number] = symbol
        return True

    def parse_graminit_c(self, filename):
        """Parse the .c file written by pgen.  (Internal)

        The file looks as follows.  The first two lines are always this:

        #include "pgenheaders.h"
        #include "grammar.h"

        After that come four blocks:

        1) one or more state definitions
        2) a table defining dfas
        3) a table defining labels
        4) a struct defining the grammar

        A state definition has the following form:
        - one or more arc arrays, each of the form:
          static arc arcs_<n>_<m>[<k>] = {
                  {<i>, <j>},
                  ...
          };
        - followed by a state array, of the form:
          static state states_<s>[<t>] = {
                  {<k>, arcs_<n>_<m>},
                  ...
          };

        """
        try:
            f = open(filename)
        except OSError as err:
            print("Can't open %s: %s" % (filename, err))
            return False
        # The code below essentially uses f's iterator-ness!
        lineno = 0

        # Expect the two #include lines
        lineno, line = lineno+1, next(f)
        assert line == '#include "pgenheaders.h"\n', (lineno, line)
        lineno, line = lineno+1, next(f)
        assert line == '#include "grammar.h"\n', (lineno, line)

        # Parse the state definitions
        lineno, line = lineno+1, next(f)
        allarcs = {}
        states = []
        while line.startswith("static arc "):
            while line.startswith("static arc "):
                mo = re.match(r"static arc arcs_(\d+)_(\d+)\[(\d+)\] = {$",
                              line)
                assert mo, (lineno, line)
                n, m, k = list(map(int, mo.groups()))
                arcs = []
                for _ in range(k):
                    lineno, line = lineno+1, next(f)
                    mo = re.match(r"\s+{(\d+), (\d+)},$", line)
                    assert mo, (lineno, line)
                    i, j = list(map(int, mo.groups()))
                    arcs.append((i, j))
                lineno, line = lineno+1, next(f)
                assert line == "};\n", (lineno, line)
                allarcs[(n, m)] = arcs
                lineno, line = lineno+1, next(f)
            mo = re.match(r"static state states_(\d+)\[(\d+)\] = {$", line)
            assert mo, (lineno, line)
            s, t = list(map(int, mo.groups()))
            assert s == len(states), (lineno, line)
            state = []
            for _ in range(t):
                lineno, line = lineno+1, next(f)
                mo = re.match(r"\s+{(\d+), arcs_(\d+)_(\d+)},$", line)
                assert mo, (lineno, line)
                k, n, m = list(map(int, mo.groups()))
                arcs = allarcs[n, m]
                assert k == len(arcs), (lineno, line)
                state.append(arcs)
            states.append(state)
            lineno, line = lineno+1, next(f)
            assert line == "};\n", (lineno, line)
            lineno, line = lineno+1, next(f)
        self.states = states

        # Parse the dfas
        dfas = {}
        mo = re.match(r"static dfa dfas\[(\d+)\] = {$", line)
        assert mo, (lineno, line)
        ndfas = int(mo.group(1))
        for i in range(ndfas):
            lineno, line = lineno+1, next(f)
            mo = re.match(r'\s+{(\d+), "(\w+)", (\d+), (\d+), states_(\d+),$',
                          line)
            assert mo, (lineno, line)
            symbol = mo.group(2)
            number, x, y, z = list(map(int, mo.group(1, 3, 4, 5)))
            assert self.symbol2number[symbol] == number, (lineno, line)
            assert self.number2symbol[number] == symbol, (lineno, line)
            assert x == 0, (lineno, line)
            state = states[z]
            assert y == len(state), (lineno, line)
            lineno, line = lineno+1, next(f)
            mo = re.match(r'\s+("(?:\\\d\d\d)*")},$', line)
            assert mo, (lineno, line)
            first = {}
            rawbitset = eval(mo.group(1))
            for i, c in enumerate(rawbitset):
                byte = ord(c)
                for j in range(8):
                    if byte & (1<<j):
                        first[i*8 + j] = 1
            dfas[number] = (state, first)
        lineno, line = lineno+1, next(f)
        assert line == "};\n", (lineno, line)
        self.dfas = dfas

        # Parse the labels
        labels = []
        lineno, line = lineno+1, next(f)
        mo = re.match(r"static label labels\[(\d+)\] = {$", line)
        assert mo, (lineno, line)
        nlabels = int(mo.group(1))
        for i in range(nlabels):
            lineno, line = lineno+1, next(f)
            mo = re.match(r'\s+{(\d+), (0|"\w+")},$', line)
            assert mo, (lineno, line)
            x, y = mo.groups()
            x = int(x)
            if y == "0":
                y = None
            else:
                y = eval(y)
            labels.append((x, y))
        lineno, line = lineno+1, next(f)
        assert line == "};\n", (lineno, line)
        self.labels = labels

        # Parse the grammar struct
        lineno, line = lineno+1, next(f)
        assert line == "grammar _PyParser_Grammar = {\n", (lineno, line)
        lineno, line = lineno+1, next(f)
        mo = re.match(r"\s+(\d+),$", line)
        assert mo, (lineno, line)
        ndfas = int(mo.group(1))
        assert ndfas == len(self.dfas)
        lineno, line = lineno+1, next(f)
        assert line == "\tdfas,\n", (lineno, line)
        lineno, line = lineno+1, next(f)
        mo = re.match(r"\s+(\d+), labels},$", line)
        assert mo, (lineno, line)
        nlabels = int(mo.group(1))
        assert nlabels == len(self.labels), (lineno, line)
        lineno, line = lineno+1, next(f)
        mo = re.match(r"\s+(\d+)};$", line)
        assert mo, (lineno, line)
        start = int(mo.group(1))
        assert start in self.number2symbol, (lineno, line)
        self.start = start
        try:
            lineno, line = lineno+1, next(f)
        except StopIteration:
            pass
        else:
            assert 0, (lineno, line)

    def finish_off(self):
        """Create additional useful structures.  (Internal)."""
        self.keywords = {} # map from keyword strings to arc labels
        self.tokens = {}   # map from numeric token values to arc labels
        for ilabel, (type, value) in enumerate(self.labels):
            if type == token.NAME and value is not None:
                self.keywords[value] = ilabel
            elif value is None:
                self.tokens[type] = ilabel


# ===== pgen2/pgen.py =====
# Copyright 2004-2005 Elemental Security, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

# Pgen imports
from . import grammar, token, tokenize

class PgenGrammar(grammar.Grammar):
    pass

class ParserGenerator(object):

    def __init__(self, filename, stream=None):
        close_stream = None
        if stream is None:
            stream = open(filename, encoding="utf-8")
            close_stream = stream.close
        self.filename = filename
        self.stream = stream
        self.generator = tokenize.generate_tokens(stream.readline)
        self.gettoken() # Initialize lookahead
        self.dfas, self.startsymbol = self.parse()
        if close_stream is not None:
            close_stream()
        self.first = {} # map from symbol name to set of tokens
        self.addfirstsets()

    def make_grammar(self):
        c = PgenGrammar()
        names = list(self.dfas.keys())
        names.sort()
        names.remove(self.startsymbol)
        names.insert(0, self.startsymbol)
        for name in names:
            i = 256 + len(c.symbol2number)
            c.symbol2number[name] = i
            c.number2symbol[i] = name
        for name in names:
            dfa = self.dfas[name]
            states = []
            for state in dfa:
                arcs = []
                for label, next in sorted(state.arcs.items()):
                    arcs.append((self.make_label(c, label), dfa.index(next)))
                if state.isfinal:
                    arcs.append((0, dfa.index(state)))
                states.append(arcs)
            c.states.append(states)
            c.dfas[c.symbol2number[name]] = (states, self.make_first(c, name))
        c.start = c.symbol2number[self.startsymbol]
        return c

    def make_first(self, c, name):
        rawfirst = self.first[name]
        first = {}
        for label in sorted(rawfirst):
            ilabel = self.make_label(c, label)
            ##assert ilabel not in first # XXX failed on <> ... !=
            first[ilabel] = 1
        return first

    def make_label(self, c, label):
        # XXX Maybe this should be a method on a subclass of converter?
        ilabel = len(c.labels)
        if label[0].isalpha():
            # Either a symbol name or a named token
            if label in c.symbol2number:
                # A symbol name (a non-terminal)
                if label in c.symbol2label:
                    return c.symbol2label[label]
                else:
                    c.labels.append((c.symbol2number[label], None))
                    c.symbol2label[label] = ilabel
                    return ilabel
            else:
                # A named token (NAME, NUMBER, STRING)
                itoken = getattr(token, label, None)
                assert isinstance(itoken, int), label
                assert itoken in token.tok_name, label
                if itoken in c.tokens:
                    return c.tokens[itoken]
                else:
                    c.labels.append((itoken, None))
                    c.tokens[itoken] = ilabel
                    return ilabel
        else:
            # Either a keyword or an operator
            assert label[0] in ('"', "'"), label
            value = eval(label)
            if value[0].isalpha():
                # A keyword
                if value in c.keywords:
                    return c.keywords[value]
                else:
                    c.labels.append((token.NAME, value))
                    c.keywords[value] = ilabel
                    return ilabel
            else:
                # An operator (any non-numeric token)
                itoken = grammar.opmap[value] # Fails if unknown token
                if itoken in c.tokens:
                    return c.tokens[itoken]
                else:
                    c.labels.append((itoken, None))
                    c.tokens[itoken] = ilabel
                    return ilabel

    def addfirstsets(self):
        names = list(self.dfas.keys())
        names.sort()
        for name in names:
            if name not in self.first:
                self.calcfirst(name)
            #print name, self.first[name].keys()

    def calcfirst(self, name):
        dfa = self.dfas[name]
        self.first[name] = None # dummy to detect left recursion
        state = dfa[0]
        totalset = {}
        overlapcheck = {}
        for label, next in state.arcs.items():
            if label in self.dfas:
                if label in self.first:
                    fset = self.first[label]
                    if fset is None:
                        raise ValueError("recursion for rule %r" % name)
                else:
                    self.calcfirst(label)
                    fset = self.first[label]
                totalset.update(fset)
                overlapcheck[label] = fset
            else:
                totalset[label] = 1
                overlapcheck[label] = {label: 1}
        inverse = {}
        for label, itsfirst in overlapcheck.items():
            for symbol in itsfirst:
                if symbol in inverse:
                    raise ValueError("rule %s is ambiguous; %s is in the"
                                     " first sets of %s as well as %s" %
                                     (name, symbol, label, inverse[symbol]))
                inverse[symbol] = label
        self.first[name] = totalset

    def parse(self):
        dfas = {}
        startsymbol = None
        # MSTART: (NEWLINE | RULE)* ENDMARKER
        while self.type != token.ENDMARKER:
            while self.type == token.NEWLINE:
                self.gettoken()
            # RULE: NAME ':' RHS NEWLINE
            name = self.expect(token.NAME)
            self.expect(token.OP, ":")
            a, z = self.parse_rhs()
            self.expect(token.NEWLINE)
            #self.dump_nfa(name, a, z)
            dfa = self.make_dfa(a, z)
            #self.dump_dfa(name, dfa)
            oldlen = len(dfa)
            self.simplify_dfa(dfa)
            newlen = len(dfa)
            dfas[name] = dfa
            #print name, oldlen, newlen
            if startsymbol is None:
                startsymbol = name
        return dfas, startsymbol

    def make_dfa(self, start, finish):
        # To turn an NFA into a DFA, we define the states of the DFA
        # to correspond to *sets* of states of the NFA.  Then do some
        # state reduction.  Let's represent sets as dicts with 1 for
        # values.
        assert isinstance(start, NFAState)
        assert isinstance(finish, NFAState)
        def closure(state):
            base = {}
            addclosure(state, base)
            return base
        def addclosure(state, base):
            assert isinstance(state, NFAState)
            if state in base:
                return
            base[state] = 1
            for label, next in state.arcs:
                if label is None:
                    addclosure(next, base)
        states = [DFAState(closure(start), finish)]
        for state in states: # NB states grows while we're iterating
            arcs = {}
            for nfastate in state.nfaset:
                for label, next in nfastate.arcs:
                    if label is not None:
                        addclosure(next, arcs.setdefault(label, {}))
            for label, nfaset in sorted(arcs.items()):
                for st in states:
                    if st.nfaset == nfaset:
                        break
                else:
                    st = DFAState(nfaset, finish)
                    states.append(st)
                state.addarc(st, label)
        return states # List of DFAState instances; first one is start

    def dump_nfa(self, name, start, finish):
        print("Dump of NFA for", name)
        todo = [start]
        for i, state in enumerate(todo):
            print("  State", i, state is finish and "(final)" or "")
            for label, next in state.arcs:
                if next in todo:
                    j = todo.index(next)
                else:
                    j = len(todo)
                    todo.append(next)
                if label is None:
                    print("    -> %d" % j)
                else:
                    print("    %s -> %d" % (label, j))

    def dump_dfa(self, name, dfa):
        print("Dump of DFA for", name)
        for i, state in enumerate(dfa):
            print("  State", i, state.isfinal and "(final)" or "")
            for label, next in sorted(state.arcs.items()):
                print("    %s -> %d" % (label, dfa.index(next)))

    def simplify_dfa(self, dfa):
        # This is not theoretically optimal, but works well enough.
        # Algorithm: repeatedly look for two states that have the same
        # set of arcs (same labels pointing to the same nodes) and
        # unify them, until things stop changing.

        # dfa is a list of DFAState instances
        changes = True
        while changes:
            changes = False
            for i, state_i in enumerate(dfa):
                for j in range(i+1, len(dfa)):
                    state_j = dfa[j]
                    if state_i == state_j:
                        #print "  unify", i, j
                        del dfa[j]
                        for state in dfa:
                            state.unifystate(state_j, state_i)
                        changes = True
                        break

    def parse_rhs(self):
        # RHS: ALT ('|' ALT)*
        a, z = self.parse_alt()
        if self.value != "|":
            return a, z
        else:
            aa = NFAState()
            zz = NFAState()
            aa.addarc(a)
            z.addarc(zz)
            while self.value == "|":
                self.gettoken()
                a, z = self.parse_alt()
                aa.addarc(a)
                z.addarc(zz)
            return aa, zz

    def parse_alt(self):
        # ALT: ITEM+
        a, b = self.parse_item()
        while (self.value in ("(", "[") or
               self.type in (token.NAME, token.STRING)):
            c, d = self.parse_item()
            b.addarc(c)
            b = d
        return a, b

    def parse_item(self):
        # ITEM: '[' RHS ']' | ATOM ['+' | '*']
        if self.value == "[":
            self.gettoken()
            a, z = self.parse_rhs()
            self.expect(token.OP, "]")
            a.addarc(z)
            return a, z
        else:
            a, z = self.parse_atom()
            value = self.value
            if value not in ("+", "*"):
                return a, z
            self.gettoken()
            z.addarc(a)
            if value == "+":
                return a, z
            else:
                return a, a

    def parse_atom(self):
        # ATOM: '(' RHS ')' | NAME | STRING
        if self.value == "(":
            self.gettoken()
            a, z = self.parse_rhs()
            self.expect(token.OP, ")")
            return a, z
        elif self.type in (token.NAME, token.STRING):
            a = NFAState()
            z = NFAState()
            a.addarc(z, self.value)
            self.gettoken()
            return a, z
        else:
            self.raise_error("expected (...) or NAME or STRING, got %s/%s",
                             self.type, self.value)

    def expect(self, type, value=None):
        if self.type != type or (value is not None and self.value != value):
            self.raise_error("expected %s/%s, got %s/%s",
                             type, value, self.type, self.value)
        value = self.value
        self.gettoken()
        return value

    def gettoken(self):
        tup = next(self.generator)
        while tup[0] in (tokenize.COMMENT, tokenize.NL):
            tup = next(self.generator)
        self.type, self.value, self.begin, self.end, self.line = tup
        #print token.tok_name[self.type], repr(self.value)

    def raise_error(self, msg, *args):
        if args:
            try:
                msg = msg % args
            except:
                msg = " ".join([msg] + list(map(str, args)))
        raise SyntaxError(msg, (self.filename, self.end[0],
                                self.end[1], self.line))

class NFAState(object):

    def __init__(self):
        self.arcs = [] # list of (label, NFAState) pairs

    def addarc(self, next, label=None):
        assert label is None or isinstance(label, str)
        assert isinstance(next, NFAState)
        self.arcs.append((label, next))

class DFAState(object):

    def __init__(self, nfaset, final):
        assert isinstance(nfaset, dict)
        assert isinstance(next(iter(nfaset)), NFAState)
        assert isinstance(final, NFAState)
        self.nfaset = nfaset
        self.isfinal = final in nfaset
        self.arcs = {} # map from label to DFAState

    def addarc(self, next, label):
        assert isinstance(label, str)
        assert label not in self.arcs
        assert isinstance(next, DFAState)
        self.arcs[label] = next

    def unifystate(self, old, new):
        for label, next in self.arcs.items():
            if next is old:
                self.arcs[label] = new

    def __eq__(self, other):
        # Equality test -- ignore the nfaset instance variable
        assert isinstance(other, DFAState)
        if self.isfinal != other.isfinal:
            return False
        # Can't just return self.arcs == other.arcs, because that
        # would invoke this method recursively, with cycles...
        if len(self.arcs) != len(other.arcs):
            return False
        for label, next in self.arcs.items():
            if next is not other.arcs.get(label):
                return False
        return True

    __hash__ = None # For Py3 compatibility.

def generate_grammar(filename="Grammar.txt"):
    p = ParserGenerator(filename)
    return p.make_grammar()
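# --- Example (editor's addition, not part of the original sources) ---
# End-to-end sketch: generate_grammar() above turns a grammar text file into
# pickleable parse tables; this is exactly what driver.load_grammar() does
# before caching. "Grammar.txt" must exist on disk for this to run.

from lib2to3.pgen2 import pgen

g = pgen.generate_grammar("Grammar.txt")
print(g.number2symbol[g.start])   # the grammar's start symbol, e.g. 'file_input'
g.dump("Grammar.pickle")          # what load_grammar() caches between runs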
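    # 2. simplify_dfa() leans on DFAState.__eq__, which compares finality
    #    and arcs while deliberately ignoring the underlying NFA sets: two
    #    distinct states with identical arcs compare equal and get merged
    #    via unifystate().
    final = NFAState()
    s1 = DFAState({NFAState(): 1}, final)
    s2 = DFAState({NFAState(): 1}, final)
    assert s1 == s2                # same (empty) arcs, both non-final
    target = DFAState({NFAState(): 1}, final)
    s1.addarc(target, "NAME")
    assert s1 != s2                # arc sets now differ, so no unification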
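    # 3. End to end: compile a one-rule toy grammar and dump its DFA.  This
    #    assumes the initializer accepts an already-open stream as its
    #    second parameter (as the lib2to3 version does) and that the parsed
    #    rules land in the `dfas` attribute; the rule text is made up.
    import io
    pg = ParserGenerator("<demo>", io.StringIO("atom: NAME | STRING\n"))
    for name, dfa in pg.dfas.items():
        pg.dump_dfa(name, dfa)     # prints each state and its labelled arcs
    g = pg.make_grammar()          # the PgenGrammar consumed by the driver
    print("start symbol number:", g.start)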
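    # 4. Subset construction in isolation: make_dfa() only reads the NFA
    #    fragment passed in, so it can be driven with a hand-built pair of
    #    states (reusing `pg` from the previous demo purely as a carrier
    #    for the method).
    nfa_start, nfa_finish = NFAState(), NFAState()
    nfa_start.addarc(nfa_finish, "NAME")
    dfa_states = pg.make_dfa(nfa_start, nfa_finish)
    assert dfa_states[0].arcs["NAME"].isfinal   # NAME leads to a final state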
$ #$  K !>q JJJICii$**455Kg (--a00 s#'-$cCd!%eCi$u+wfm++sNNu||!5$d;;;;;&&%G!||"$",'+ '% "&"E4t<<<<<^^'% "&"E4t<<<<<m++&uoG&}}T377H &ll1oo $U3Y"+")MMM&*G%udT3KFFFFF$(%="&uvv,#' --"1"I.."1"I..RyD(($(%=#+G#4$6q8J$6#+E!H#5 ,0L!#'"+")MMM&*G%udD$?????))++,8 222$%,1W,<,<55%#($d#<<<<$dD9C'''"% ..# +$+AJ$$6$6$+AJ'$9$9$~~,0 3:2; 0#('!*#*1:wqz#*1:#////'+G'% "&IIII__'% "&udT3K>>>> !II%''HqL E))hl8'% "&udD$77777!49 #;s1u t====AgSCiii}~ !""+55rD!9tQi44444 b4)dAY 333333s ( 77__main__)S__doc__ __author__ __credits__rrecodecsrrlib2to3.pgen2.tokenr#rdir__all__r NameErrorstrrrr r) WhitespaceCommentIgnoreName Binnumber Hexnumber Octnumber Decnumber IntnumberExponent PointfloatExpfloat Floatnumber ImagnumberNumberSingleDoubleSingle3Double3 _litprefixTripleStringOperatorBracketSpecialFunny PlainTokenTokenContStr PseudoExtras PseudoTokenmapcompile tokenprogrr=rD _strprefixesrrrr ExceptionrPrWrdrrfrlASCIIrrrrrrrRsysr}argvopenrgstdinr rr r$s##0* F ########!!!! 0 0cc%jj 0 0 04,4,4, ,  EE EEE :99///111   cc*z122 2UU7^^ C  & 7 . E+X 6 6 E)Y 9 = = # U57H I IEERZOO [ X %eJ)) U& g(= > > z; 2 2 $ # 2 2 7 zE!:#5 6 6 z;;;; = = 5GWeU%%    %% - - h)) U65&$ / /  % ;;c:&&';;c:&&' ( (uZ&11 55vugtLLL 25#J Wg63838/ :{KM#sC%%M#sC%%&&&&' F##*"*V*<*<{ 9 9FFFFF 9GFFFF 987,777  9 EN//,///0//,///0  #J-- ---.-- ---. !!!!!!!!%%%%%Y%%%??? #-    &   6 6 6 6 6 6 6 6 p BJ@"( K K 2:0"( ; ;   G$G$G$R###*`4`4`4D zJJJ s38}}q((44 #4#4#=>>>>> (39% & & & & &sAA  A pgen2/__pycache__/grammar.cpython-311.pyc000064400000016605151027012300014116 0ustar00 !A?hdZddlZddlmZGddeZdZiZeD]*Z e r&e \Z Z e ee ee <+[ [ [ dS)aThis module defines the data structures used to represent a grammar. These are a bit arcane because they are derived from the data structures used by Python's 'pgen' parser generator. There's also a table here mapping operators to their names in the token module; the Python tokenize module reports all operators as the fallback token code OP, but the parser needs the actual token code. N)tokenc6eZdZdZdZdZdZdZdZdZ dS) Grammara Pgen parsing tables conversion class. Once initialized, this class supplies the grammar tables for the parsing engine implemented by parse.py. The parsing engine accesses the instance variables directly. The class here does not provide initialization of the tables; several subclasses exist to do this (see the conv and pgen modules). The load() method reads the tables from a pickle file, which is much faster than the other ways offered by subclasses. The pickle file is written by calling dump() (after loading the grammar tables using a subclass). The report() method prints a readable representation of the tables to stdout, for debugging. The instance variables are as follows: symbol2number -- a dict mapping symbol names to numbers. Symbol numbers are always 256 or higher, to distinguish them from token numbers, which are between 0 and 255 (inclusive). number2symbol -- a dict mapping numbers to symbol names; these two are each other's inverse. states -- a list of DFAs, where each DFA is a list of states, each state is a list of arcs, and each arc is a (i, j) pair where i is a label and j is a state number. The DFA number is the index into this list. (This name is slightly confusing.) Final states are represented by a special arc of the form (0, j) where j is its own state number. dfas -- a dict mapping symbol numbers to (DFA, first) pairs, where DFA is an item from the states list above, and first is a set of tokens that can begin this grammar rule (represented by a dict whose values are always 1). 
labels -- a list of (x, y) pairs where x is either a token number or a symbol number, and y is either None or a string; the strings are keywords. The label number is the index in this list; label numbers are used to mark state transitions (arcs) in the DFAs. start -- the number of the grammar's start symbol. keywords -- a dict mapping keyword strings to arc labels. tokens -- a dict mapping token numbers to arc labels. ci|_i|_g|_i|_dg|_i|_i|_i|_d|_dS)N)rEMPTY) symbol2number number2symbolstatesdfaslabelskeywordstokens symbol2labelstart)selfs ./usr/lib64/python3.11/lib2to3/pgen2/grammar.py__init__zGrammar.__init__LsJ  #n    ct|d5}tj|j|tjddddS#1swxYwYdS)z)Dump the grammar tables to a pickle file.wbN)openpickledump__dict__HIGHEST_PROTOCOL)rfilenamefs rrz Grammar.dumpWs (D ! ! CQ K q&*A B B B C C C C C C C C C C C C C C C C C Cs&AA Act|d5}tj|}dddn #1swxYwY|j|dS)z+Load the grammar tables from a pickle file.rbN)rrloadrupdate)rrrds rr"z Grammar.load\s (D ! ! Q AA                Qs 266c^|jtj|dS)z3Load the grammar tables from a pickle bytes object.N)rr#rloads)rpkls rr&z Grammar.loadsbs( V\#../////rc |}dD]3}t||t||4|jdd|_|jdd|_|j|_|S)z# Copy the grammar. )r r r rrrN) __class__setattrgetattrcopyrr r)rnew dict_attrs rr,z Grammar.copyfsnn4 E EI CGD)$<$<$A$A$C$C D D D D[^ [^ J  rcrddlm}td||jtd||jtd||jtd||jtd||jtd|jd S) z:Dump the grammar tables to standard output, for debugging.r)pprints2nn2sr r rrN)r0printr r r r rr)rr0s rreportzGrammar.reportss!!!!!! e t!""" e t!""" ht{ f ty ht{ gtz"""""rN) __name__ __module__ __qualname____doc__rrr"r&r,r4rrrrs|33j   CCC    000    # # # # #rra ( LPAR ) RPAR [ LSQB ] RSQB : COLON , COMMA ; SEMI + PLUS - MINUS * STAR / SLASH | VBAR & AMPER < LESS > GREATER = EQUAL . DOT % PERCENT ` BACKQUOTE { LBRACE } RBRACE @ AT @= ATEQUAL == EQEQUAL != NOTEQUAL <> NOTEQUAL <= LESSEQUAL >= GREATEREQUAL ~ TILDE ^ CIRCUMFLEX << LEFTSHIFT >> RIGHTSHIFT ** DOUBLESTAR += PLUSEQUAL -= MINEQUAL *= STAREQUAL /= SLASHEQUAL %= PERCENTEQUAL &= AMPEREQUAL |= VBAREQUAL ^= CIRCUMFLEXEQUAL <<= LEFTSHIFTEQUAL >>= RIGHTSHIFTEQUAL **= DOUBLESTAREQUAL // DOUBLESLASH //= DOUBLESLASHEQUAL -> RARROW := COLONEQUAL )r8rrobjectr opmap_rawopmap splitlineslinesplitopnamer+r9rrrCs   j#j#j#j#j#fj#j#j#^1  f   " "))D )::<<DGE4((b "dddrpgen2/__pycache__/__init__.cpython-311.opt-1.pyc000064400000000276151027012300015163 0ustar00 !A?h dZdS)zThe pgen2 package.N)__doc__//usr/lib64/python3.11/lib2to3/pgen2/__init__.pyrsrpgen2/__pycache__/conv.cpython-311.opt-2.pyc000064400000020633151027012300014371 0ustar00 !A?h%F ddlZddlmZmZGddejZdS)N)grammartokenc(eZdZ dZdZdZdZdS) Converterc |||||dSN)parse_graminit_hparse_graminit_c finish_off)self graminit_h graminit_cs +/usr/lib64/python3.11/lib2to3/pgen2/conv.pyrunz Converter.run/sCJ j))) j))) c  t|}n-#t$r }td|d|Yd}~dSd}~wwxYwi|_i|_d}|D]}|dz }t jd|}|s>|r*t|d|d|\|\}}t|}||j|<||j|<d S) N Can't open : Frz^#define\s+(\w+)\s+(\d+)$(z): can't parse T) openOSErrorprint symbol2number number2symbolrematchstripgroupsint) r filenameferrlinenolinemosymbolnumbers rr zConverter.parse_graminit_h5s4  XAA    E337 8 8 855555   4 4D aKF6==B 4$**,, 4(((FFF26**,,,@AAAA"$V.4"6*-3"6**ts =8=c  t|}n-#t$r }td|d|Yd}~dSd}~wwxYwd}|dzt|}}|dzt|}}|dzt|}}i}g}|drf|drt jd|}ttt| \} } } g} t| D]y} |dzt|}}t jd|}ttt| \}}| ||fz|dzt|}}| || | f<|dzt|}}|dt jd |}ttt| \}}g}t|D]} |dzt|}}t jd |}ttt| \} } } || | f} | | | ||dzt|}}|dzt|}}|df||_ i}t jd |}t|d}t|D]#}|dzt|}}t jd |}|d }ttt|dddd\}}}}||}|dzt|}}t jd|}i}t|d}t!|D]9\}}t#|}tdD]}|d|zzr d||dz|z<:||f||<%|dzt|}}||_g}|dzt|}}t jd|}t|d}t|D]}|dzt|}}t jd|}| \}}t|}|dkrd}nt|}| 
||f|dzt|}}||_|dzt|}}|dzt|}}t jd|}t|d}|dzt|}}|dzt|}}t jd|}t|d}|dzt|}}t jd|}t|d} | |_|dzt|}} |dzt|}}dS#t*$rYdSwxYw)NrrFrrz static arc z)static arc arcs_(\d+)_(\d+)\[(\d+)\] = {$z\s+{(\d+), (\d+)},$z'static state states_(\d+)\[(\d+)\] = {$z\s+{(\d+), arcs_(\d+)_(\d+)},$zstatic dfa dfas\[(\d+)\] = {$z0\s+{(\d+), "(\w+)", (\d+), (\d+), states_(\d+),$z\s+("(?:\\\d\d\d)*")},$z!static label labels\[(\d+)\] = {$z\s+{(\d+), (0|"\w+")},$0z \s+(\d+),$z\s+{(\d+), labels},$z \s+(\d+)$)rrrnext startswithrrlistmapr rrangeappendstatesgroupeval enumerateorddfaslabelsstart StopIteration)!r r!r"r#r$r%allarcsr6r&nmkarcs_ijststater;ndfasr'r(xyzfirst rawbitsetcbyter<nlabelsr=s! rr zConverter.parse_graminit_cTs 6 XAA    E337 8 8 855555 axaaxaaxaoom,,! -//-00 1XJ"$$s3 44551aq((A#)!8T!WWDF"8$??BC 5 566DAqKKA''''%axa"&A%axa//-00 1 DdKKBC--..DAqE1XX # #%axaX?FFs3 44551aq!t} T"""" MM% !!8T!WWDF!!8T!WWDFCoom,,! -D  X6 = =BHHQKK  u * *A!!8T!WWDFM  BXXa[[F"3sBHHQ1a,@,@#A#ABBOFAq!1IE!!8T!WWDF4d;;BERXXa[[))I!),, + +11vvq++Aq!t}+)*acAg+"5>DLLaxa axa X:D A Abhhqkk""w " "A!!8T!WWDF4d;;B99;;DAqAACxxGG MM1a& ! ! ! !axa axaaxa XmT * *BHHQKK  axaaxa X-t 4 4bhhqkk""axa XlD ) )BHHQKK   axa %!!8T!WWDFFF    DD s" =8=*[ [ [c i|_i|_t|jD]1\}\}}|tjkr | ||j|<%| ||j|<2dSr)keywordstokensr9r<rNAME)r ilabeltypevalues rr zConverter.finish_offsr?  %.t{%;%; + + !FMT5uz!!e&7'- e$$$* D!  + +rN)__name__ __module__ __qualname__rr r r rrrr$sY >c%c%c%J+++++rr)rpgen2rrGrammarrr]rrr`so4 ! ]+]+]+]+]+]+]+]+]+]+rpgen2/__pycache__/__init__.cpython-311.opt-2.pyc000064400000000232151027012300015154 0ustar00 !A?hdS)Nr//usr/lib64/python3.11/lib2to3/pgen2/__init__.pyrs rpgen2/__pycache__/token.cpython-311.opt-2.pyc000064400000004414151027012300014543 0ustar00 !A?h dZdZdZdZdZdZdZdZdZd Z d Z d Z d Z d Z dZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZd Z d!Z!d"Z"d#Z#d$Z$d%Z%d&Z&d'Z'd(Z(d)Z)d*Z*d+Z+d,Z,d-Z-d.Z.d/Z/d0Z0d1Z1d2Z2d3Z3d4Z4d5Z5d6Z6d7Z7d8Z8d9Z9d:Z:d;Z;de?e@AD] \ZBZCeDeCeDdureBe>eC<!d>ZEd?ZFd@ZGdAS)B  !"#$%&'()*+,-./0123456789:;<c|tkSN NT_OFFSETxs ,/usr/lib64/python3.11/lib2to3/pgen2/token.py ISTERMINALrGOs y=c|tkSrArBrDs rF ISNONTERMINALrJR >rHc|tkSrA) ENDMARKERrDs rFISEOFrNUrKrHN)HrMNAMENUMBERSTRINGNEWLINEINDENTDEDENTLPARRPARLSQBRSQBCOLONCOMMASEMIPLUSMINUSSTARSLASHVBARAMPERLESSGREATEREQUALDOTPERCENT BACKQUOTELBRACERBRACEEQEQUALNOTEQUAL LESSEQUAL GREATEREQUALTILDE CIRCUMFLEX LEFTSHIFT RIGHTSHIFT DOUBLESTAR PLUSEQUALMINEQUAL STAREQUAL SLASHEQUAL PERCENTEQUAL AMPEREQUAL VBAREQUALCIRCUMFLEXEQUALLEFTSHIFTEQUALRIGHTSHIFTEQUALDOUBLESTAREQUAL DOUBLESLASHDOUBLESLASHEQUALATATEQUALOPCOMMENTNLRARROWAWAITASYNC ERRORTOKEN COLONEQUALN_TOKENSrCtok_namelistglobalsitems_name_valuetyperGrJrNrHrFrs(                                                      T''))//++,,!!ME6 tF||ttAww rHpgen2/__pycache__/driver.cpython-311.pyc000064400000021150151027012300013752 0ustar00 !A?hQdZdZddgZddlZddlZddlZddlZddlZddlm Z m Z m Z m Z m Z GddeZd Z dd ZdZdZdZedkr$ejee dSdS)zZParser driver. This provides a high-level interface to parse a file into a syntax tree. z#Guido van Rossum Driver load_grammarN)grammarparsetokentokenizepgenc>eZdZd dZd dZd dZd dZd dZd dZdS) rNcZ||_|tj}||_||_dS)N)rlogging getLoggerloggerconvert)selfrrrs -/usr/lib64/python3.11/lib2to3/pgen2/driver.py__init__zDriver.__init__s. 
>&((F  FcXtj|j|j}|d}d}dx}x}x}x} } d} |D]D} | \}}}} } |||fkrE||f|ksJ||f|f|\} }|| kr| d| |z zz } | }d}||kr| | ||z } |}|t jt jfvr'| |z } | \}}|dr|dz }d}|tj krtj |}|r-|j dtj||| |||| |fr|r|j dn>d} | \}}|dr|dz }d}Ftjd||| |f|jS) z4Parse a series of tokens and return the syntax tree.rrN z%s %r (prefix=%r)zStop.zincomplete input)rParserrrsetupr COMMENTNLendswithrOPopmaprdebugtok_nameaddtoken ParseErrorrootnode)rtokensrplinenocolumntypevaluestartend line_textprefix quintuples_linenos_columns r parse_tokenszDriver.parse_tokens&s9 Lt| 4 4  1555u5u5sY$ A$ AI1: .D%Y((('5000FF3CU2K000%*"(H$$dh&788F%FFH$$ix88F%F((+666%!$>>$''aKFFux}U+ G !!"5"'."6vGGGzz$77 /K%%g...F NFF~~d## ! "#5#'AA Azrc`tj|j}|||Sz*Parse a stream and return the syntax tree.)r generate_tokensreadliner1)rstreamrr$s rparse_stream_rawzDriver.parse_stream_rawVs*)&/::  ///rc.|||Sr3)r7)rr6rs r parse_streamzDriver.parse_stream[s$$VU333rctj|d|5}|||cdddS#1swxYwYdS)z(Parse a file and return the syntax tree.r)encodingN)ioopenr9)rfilenamer<rr6s r parse_filezDriver.parse_file_s WXsX 6 6 6 4&$$VU33 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4s ;??ctjtj|j}|||S)z*Parse a string and return the syntax tree.)r r4r=StringIOr5r1)rtextrr$s r parse_stringzDriver.parse_stringds5)"+d*;*;*DEE  ///r)NN)F)NF) __name__ __module__ __qualname__rr1r7r9r@rDrrrrs....`0000 44444444 000000rctj|\}}|dkrd}||zdt t t jzdzS)Nz.txtr.z.pickle)ospathsplitextjoinmapstrsys version_info)gtheadtails r_generate_pickle_namerVjsV!!"%%JD$ v~~ $;#c3+;"<"<== = IIr Grammar.txtTFc|tj}|t|n|}|st||s|d|t j|}|rZ|d| ||nV#t$r }|d|Yd}~n1d}~wwxYwn(tj }| ||S)z'Load the grammar (maybe from a pickle).Nz!Generating grammar tables from %szWriting grammar tables to %szWriting failed: %s) r rrV_newerinfor generate_grammardumpOSErrorrGrammarload)rSgpsaveforcerges rrrqs~"$$&(j r " " "bB F2rNN  7<<<  !" % %  5 KK6 ; ; ; 5r  5 5 5 0!44444444 5  5 O   r Hs>B B>B99B>ctj|sdStj|sdStj|tj|kS)z0Inquire whether file a was written since file b.FT)rKrLexistsgetmtime)abs rrYrYsc 7>>!  u 7>>!  t 7  A  "'"2"21"5"5 55rc4tj|rt|St tj|}t j||}tj }| ||S)aNormally, loads a pickled grammar by doing pkgutil.get_data(package, pickled_grammar) where *pickled_grammar* is computed from *grammar_source* by adding the Python version and using a ``.pickle`` extension. However, if *grammar_source* is an extant file, load_grammar(grammar_source) is called instead. This facilitates using a packaged grammar file when needed but preserves load_grammar's automatic regeneration behavior when possible. ) rKrLisfilerrVbasenamepkgutilget_datarr^loads)packagegrammar_source pickled_namedatarcs rload_packaged_grammarrtsx w~~n%%,N+++()9)9.)I)IJJL  G\ 2 2DAGGDMMM Hrc|stjdd}tjtjtjd|D]}t |dddS)zMain program, when run as a script: produce grammar pickle files. Calls load_grammar for each argument, a path to a grammar text file. rNz %(message)s)levelr6formatT)rarb)rQargvr basicConfigINFOstdoutr)argsrSs rmainr}sl x| gl3:,....00Rd$///// 4r__main__)rWNTFN)__doc__ __author____all__r=rKr rmrQrrrrr r objectrrVrrYrtr}rEexitintrHrrrsF 3 ^ $  43333333333333J0J0J0J0J0VJ0J0J0ZJJJ'+04    *666   (    z CHSSTTVV__rpgen2/__pycache__/conv.cpython-311.pyc000064400000031677151027012300013443 0ustar00 !A?h%HdZddlZddlmZmZGddejZdS)aConvert graminit.[ch] spit out by pgen to Python code. Pgen is the Python parser generator. It is useful to quickly create a parser from a grammar file in Python's grammar notation. But I don't want my parsers to be written in C (yet), so I'm translating the parsing tables to Python data structures and writing a Python parse engine. 
Note that the token numbers are constants determined by the standard Python tokenizer. The standard token module defines these numbers and their names (the names are not used much). The token numbers are hardcoded into the Python tokenizer and into pgen. A Python implementation of the Python tokenizer is also available, in the standard tokenize module. On the other hand, symbol numbers (representing the grammar's non-terminals) are assigned by pgen based on the actual grammar input. Note: this module is pretty much obsolete; the pgen module generates equivalent grammar tables directly from the Grammar.txt input file without having to invoke the Python pgen C program. N)grammartokenc*eZdZdZdZdZdZdZdS) Convertera2Grammar subclass that reads classic pgen output files. The run() method reads the tables as produced by the pgen parser generator, typically contained in two C files, graminit.h and graminit.c. The other methods are for internal use only. See the base class for more documentation. c|||||dS)z|r*t|d|d |\|\}}t|}||jvsJ||jvsJ||j|<||j|<d S) zParse the .h file written by pgen. (Internal) This file is a sequence of #define statements defining the nonterminals of the grammar as numbers. We build two tables mapping the numbers to names and back. Can't open : NFrz^#define\s+(\w+)\s+(\d+)$(z): can't parse T) openOSErrorprint symbol2number number2symbolrematchstripgroupsint) r filenameferrlinenolinemosymbolnumbers rrzConverter.parse_graminit_h5sU XAA    E337 8 8 855555   4 4D aKF6==B 4$**,, 4(((FFF26**,,,@AAAA"$VT%77777T%77777-3"6*-3"6**ts <7<c @ t|}n-#t$r }td|d|Yd}~dSd}~wwxYwd}|dzt|}}|dks J||f|dzt|}}|dks J||f|dzt|}}i}g}|d r|d rKt jd |}|s J||fttt| \} } } g} t| D]} |dzt|}}t jd |}|s J||fttt| \}}| ||f|dzt|}}|d ks J||f| || | f<|dzt|}}|d Kt jd |}|s J||fttt| \}}|t|ks J||fg}t|D]} |dzt|}}t jd|}|s J||fttt| \} } } || | f} | t| ks J||f| | | ||dzt|}}|d ks J||f|dzt|}}|d ||_i}t jd|}|s J||ft|d}t|D]}|dzt|}}t jd|}|s J||f|d}ttt|dddd\}}}}|j||ks J||f|j||ks J||f|dks J||f||}|t|ks J||f|dzt|}}t jd|}|s J||fi}t%|d}t'|D]9\}}t)|}tdD]}|d|zzr d||dz|z<:||f||<|dzt|}}|d ks J||f||_g}|dzt|}}t jd|}|s J||ft|d}t|D]}|dzt|}}t jd|}|s J||f| \}}t|}|dkrd}nt%|}| ||f|dzt|}}|d ks J||f||_|dzt|}}|dks J||f|dzt|}}t jd|}|s J||ft|d}|t|jksJ|dzt|}}|dks J||f|dzt|}}t jd|}|s J||ft|d}|t|jks J||f|dzt|}}t jd|}|s J||ft|d} | |jvs J||f| |_|dzt|}}|d ks J||f |dzt|}}J||f#t0$rYdSwxYw)aParse the .c file written by pgen. (Internal) The file looks as follows. The first two lines are always this: #include "pgenheaders.h" #include "grammar.h" After that come four blocks: 1) one or more state definitions 2) a table defining dfas 3) a table defining labels 4) a struct defining the grammar A state definition has the following form: - one or more arc arrays, each of the form: static arc arcs__[] = { {, }, ... }; - followed by a state array, of the form: static state states_[] = { {, arcs__}, ... }; rrNFrrz#include "pgenheaders.h" z#include "grammar.h" z static arc z)static arc arcs_(\d+)_(\d+)\[(\d+)\] = {$z\s+{(\d+), (\d+)},$z}; z'static state states_(\d+)\[(\d+)\] = {$z\s+{(\d+), arcs_(\d+)_(\d+)},$zstatic dfa dfas\[(\d+)\] = {$z0\s+{(\d+), "(\w+)", (\d+), (\d+), states_(\d+),$z\s+("(?:\\\d\d\d)*")},$z!static label labels\[(\d+)\] = {$z\s+{(\d+), (0|"\w+")},$0zgrammar _PyParser_Grammar = { z \s+(\d+),$z dfas, z\s+{(\d+), labels},$z \s+(\d+)$)rrrnext startswithrrlistmaprrrangeappendlenstatesgrouprreval enumerateorddfaslabelsstart StopIteration)!r r r!r"r#r$allarcsr6r%nmkarcs_ijststater;ndfasr&r'xyzfirst rawbitsetcbyter<nlabelsr=s! 
rr zConverter.parse_graminit_cTs 8 XAA    E337 8 8 855555 axa3333fd^333axa////&$///axaoom,,! -//-00 1XJ"$$))FD>))rs3 44551aq((A#)!8T!WWDF"8$??B--~--2C 5 566DAqKKA''''%axav~~~~~~~"&A%axa//-00 1 DdKKB % %~ % %2C--..DAqF ###fd^###E1XX # #%axaX?FF))FD>))rs3 44551aq!t}CII~~~~~~~ T"""" MM% !!8T!WWDF6>>>FD>>>>!!8T!WWDFCoom,,! -D  X6 = =!!FD>!!rBHHQKK  u * *A!!8T!WWDFM  B % %~ % %2XXa[[F"3sBHHQ1a,@,@#A#ABBOFAq!%f-777&$777%f-777&$777666FD>6661IEE ???VTN???!!8T!WWDF4d;;B % %~ % %2ERXXa[[))I!),, + +11vvq++Aq!t}+)*acAg+"5>DLLaxav~~~~~~~ axa X:D A A!!FD>!!rbhhqkk""w " "A!!8T!WWDF4d;;B % %~ % %299;;DAqAACxxGG MM1a& ! ! ! !axav~~~~~~~ axa888864.888axa XmT * *!!FD>!!rBHHQKK  DI&&&&axa{"""VTN"""axa X-t 4 4!!FD>!!rbhhqkk""#dk*****VTN***axa XlD ) )!!FD>!!rBHHQKK  ****VTN*** axav~~~~~~~ %!!8T!WWDF %vtn $ $1    DD s" <7</d ddci|_i|_t|jD]1\}\}}|tjkr | ||j|<%| ||j|<2dS)z1Create additional useful structures. (Internal).N)keywordstokensr9r<rNAME)r ilabeltypevalues rr zConverter.finish_offso  %.t{%;%; + + !FMT5uz!!e&7'- e$$$* D!  + +rN)__name__ __module__ __qualname____doc__rrr r rrrr$s^ >c%c%c%J+++++rr)r]rpgen2rrGrammarrr^rrrast4 ! ]+]+]+]+]+]+]+]+]+]+rpgen2/__pycache__/grammar.cpython-311.opt-1.pyc000064400000016605151027012300015055 0ustar00 !A?hdZddlZddlmZGddeZdZiZeD]*Z e r&e \Z Z e ee ee <+[ [ [ dS)aThis module defines the data structures used to represent a grammar. These are a bit arcane because they are derived from the data structures used by Python's 'pgen' parser generator. There's also a table here mapping operators to their names in the token module; the Python tokenize module reports all operators as the fallback token code OP, but the parser needs the actual token code. N)tokenc6eZdZdZdZdZdZdZdZdZ dS) Grammara Pgen parsing tables conversion class. Once initialized, this class supplies the grammar tables for the parsing engine implemented by parse.py. The parsing engine accesses the instance variables directly. The class here does not provide initialization of the tables; several subclasses exist to do this (see the conv and pgen modules). The load() method reads the tables from a pickle file, which is much faster than the other ways offered by subclasses. The pickle file is written by calling dump() (after loading the grammar tables using a subclass). The report() method prints a readable representation of the tables to stdout, for debugging. The instance variables are as follows: symbol2number -- a dict mapping symbol names to numbers. Symbol numbers are always 256 or higher, to distinguish them from token numbers, which are between 0 and 255 (inclusive). number2symbol -- a dict mapping numbers to symbol names; these two are each other's inverse. states -- a list of DFAs, where each DFA is a list of states, each state is a list of arcs, and each arc is a (i, j) pair where i is a label and j is a state number. The DFA number is the index into this list. (This name is slightly confusing.) Final states are represented by a special arc of the form (0, j) where j is its own state number. dfas -- a dict mapping symbol numbers to (DFA, first) pairs, where DFA is an item from the states list above, and first is a set of tokens that can begin this grammar rule (represented by a dict whose values are always 1). labels -- a list of (x, y) pairs where x is either a token number or a symbol number, and y is either None or a string; the strings are keywords. The label number is the index in this list; label numbers are used to mark state transitions (arcs) in the DFAs. 
start -- the number of the grammar's start symbol. keywords -- a dict mapping keyword strings to arc labels. tokens -- a dict mapping token numbers to arc labels. ci|_i|_g|_i|_dg|_i|_i|_i|_d|_dS)N)rEMPTY) symbol2number number2symbolstatesdfaslabelskeywordstokens symbol2labelstart)selfs ./usr/lib64/python3.11/lib2to3/pgen2/grammar.py__init__zGrammar.__init__LsJ  #n    ct|d5}tj|j|tjddddS#1swxYwYdS)z)Dump the grammar tables to a pickle file.wbN)openpickledump__dict__HIGHEST_PROTOCOL)rfilenamefs rrz Grammar.dumpWs (D ! ! CQ K q&*A B B B C C C C C C C C C C C C C C C C C Cs&AA Act|d5}tj|}dddn #1swxYwY|j|dS)z+Load the grammar tables from a pickle file.rbN)rrloadrupdate)rrrds rr"z Grammar.load\s (D ! ! Q AA                Qs 266c^|jtj|dS)z3Load the grammar tables from a pickle bytes object.N)rr#rloads)rpkls rr&z Grammar.loadsbs( V\#../////rc |}dD]3}t||t||4|jdd|_|jdd|_|j|_|S)z# Copy the grammar. )r r r rrrN) __class__setattrgetattrcopyrr r)rnew dict_attrs rr,z Grammar.copyfsnn4 E EI CGD)$<$<$A$A$C$C D D D D[^ [^ J  rcrddlm}td||jtd||jtd||jtd||jtd||jtd|jd S) z:Dump the grammar tables to standard output, for debugging.r)pprints2nn2sr r rrN)r0printr r r r rr)rr0s rreportzGrammar.reportss!!!!!! e t!""" e t!""" ht{ f ty ht{ gtz"""""rN) __name__ __module__ __qualname____doc__rrr"r&r,r4rrrrs|33j   CCC    000    # # # # #rra ( LPAR ) RPAR [ LSQB ] RSQB : COLON , COMMA ; SEMI + PLUS - MINUS * STAR / SLASH | VBAR & AMPER < LESS > GREATER = EQUAL . DOT % PERCENT ` BACKQUOTE { LBRACE } RBRACE @ AT @= ATEQUAL == EQEQUAL != NOTEQUAL <> NOTEQUAL <= LESSEQUAL >= GREATEREQUAL ~ TILDE ^ CIRCUMFLEX << LEFTSHIFT >> RIGHTSHIFT ** DOUBLESTAR += PLUSEQUAL -= MINEQUAL *= STAREQUAL /= SLASHEQUAL %= PERCENTEQUAL &= AMPEREQUAL |= VBAREQUAL ^= CIRCUMFLEXEQUAL <<= LEFTSHIFTEQUAL >>= RIGHTSHIFTEQUAL **= DOUBLESTAREQUAL // DOUBLESLASH //= DOUBLESLASHEQUAL -> RARROW := COLONEQUAL )r8rrobjectr opmap_rawopmap splitlineslinesplitopnamer+r9rrrCs   j#j#j#j#j#fj#j#j#^1  f   " "))D )::<<DGE4((b "dddrpgen2/__pycache__/driver.cpython-311.opt-2.pyc000064400000017136151027012300014723 0ustar00 !A?hQ dZddgZddlZddlZddlZddlZddlZddlmZm Z m Z m Z m Z Gdde ZdZ dd Zd ZdZdZedkr$ejee dSdS)z#Guido van Rossum Driver load_grammarN)grammarparsetokentokenizepgenc>eZdZd dZd dZd dZd dZd dZd dZdS) rNcZ||_|tj}||_||_dSN)rlogging getLoggerloggerconvert)selfrrrs -/usr/lib64/python3.11/lib2to3/pgen2/driver.py__init__zDriver.__init__s. >&((F  Fc. tj|j|j}|d}d}dx}x}x}x} } d} |D].} | \}}}} } |||fkr/|\} }|| kr| d| |z zz } | }d}||kr| | ||z } |}|t jt jfvr'| |z } | \}}|dr|dz }d}|tj krtj |}|r-|j dtj||| |||| |fr|r|j dn>d} | \}}|dr|dz }d}0tjd||| |f|jS)Nrr z%s %r (prefix=%r)zStop.zincomplete input)rParserrrsetupr COMMENTNLendswithrOPopmaprdebugtok_nameaddtoken ParseErrorrootnode)rtokensr plinenocolumntypevaluestartend line_textprefix quintuples_linenos_columns r parse_tokenszDriver.parse_tokens&sB Lt| 4 4  1555u5u5sY$ A$ AI1: .D%Y(((%*"(H$$dh&788F%FFH$$ix88F%F((+666%!$>>$''aKFFux}U+ G !!"5"'."6vGGGzz$77 /K%%g...F NFF~~d## ! 
"#5#'AA Azrcb tj|j}|||Sr )r generate_tokensreadliner2)rstreamr r%s rparse_stream_rawzDriver.parse_stream_rawVs-8)&/::  ///rc0 |||Sr )r7)rr6r s r parse_streamzDriver.parse_stream[s8$$VU333rc tj|d|5}|||cdddS#1swxYwYdS)Nr)encoding)ioopenr9)rfilenamer<r r6s r parse_filezDriver.parse_file_s6 WXsX 6 6 6 4&$$VU33 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4s<AAc tjtj|j}|||Sr )r r4r=StringIOr5r2)rtextr r%s r parse_stringzDriver.parse_stringds88)"+d*;*;*DEE  ///r)NN)F)NF) __name__ __module__ __qualname__rr2r7r9r@rDrrrrs....`0000 44444444 000000rctj|\}}|dkrd}||zdt t t jzdzS)Nz.txtr.z.pickle)ospathsplitextjoinmapstrsys version_info)gtheadtails r_generate_pickle_namerVjsV!!"%%JD$ v~~ $;#c3+;"<"<== = IIr Grammar.txtTFc |tj}|t|n|}|st||s|d|t j|}|rZ|d| ||nV#t$r }|d|Yd}~n1d}~wwxYwn(tj }| ||S)Nz!Generating grammar tables from %szWriting grammar tables to %szWriting failed: %s) rrrV_newerinfor generate_grammardumpOSErrorrGrammarload)rSgpsaveforcerges rrrqs 1 ~"$$&(j r " " "bB F2rNN  7<<<  !" % %  5 KK6 ; ; ; 5r  5 5 5 0!44444444 5  5 O   r Hs?B B?B::B?c tj|sdStj|sdStj|tj|kS)NFT)rKrLexistsgetmtime)abs rrYrYsf: 7>>!  u 7>>!  t 7  A  "'"2"21"5"5 55rc6 tj|rt|St tj|}t j||}tj }| ||Sr ) rKrLisfilerrVbasenamepkgutilget_datarr^loads)packagegrammar_source pickled_namedatarcs rload_packaged_grammarrts}  w~~n%%,N+++()9)9.)I)IJJL  G\ 2 2DAGGDMMM Hrc |stjdd}tjtjtjd|D]}t |dddS)Nrz %(message)s)levelr6formatT)rarb)rQargvr basicConfigINFOstdoutr)argsrSs rmainr}sq x| gl3:,....00Rd$///// 4r__main__)rWNTFN) __author____all__r=rKrrmrQrrrrr r objectrrVrrYrtr}rEexitintrHrrrsA 3 ^ $  43333333333333J0J0J0J0J0VJ0J0J0ZJJJ'+04    *666   (    z CHSSTTVV__rpgen2/__pycache__/grammar.cpython-311.opt-2.pyc000064400000010517151027012300015052 0ustar00 !A?h ddlZddlmZGddeZdZiZeD]*Zer&e \Z Z e ee ee <+[[ [ dS)N)tokenc4eZdZ dZdZdZdZdZdZdS)Grammarci|_i|_g|_i|_dg|_i|_i|_i|_d|_dS)N)rEMPTY) symbol2number number2symbolstatesdfaslabelskeywordstokens symbol2labelstart)selfs ./usr/lib64/python3.11/lib2to3/pgen2/grammar.py__init__zGrammar.__init__LsJ  #n    c t|d5}tj|j|tjddddS#1swxYwYdS)Nwb)openpickledump__dict__HIGHEST_PROTOCOL)rfilenamefs rrz Grammar.dumpWs7 (D ! ! CQ K q&*A B B B C C C C C C C C C C C C C C C C C Cs&AA  A c t|d5}tj|}dddn #1swxYwY|j|dS)Nrb)rrloadrupdate)rrrds rr"z Grammar.load\s9 (D ! ! Q AA                Qs 377c` |jtj|dS)N)rr#rloads)rpkls rr&z Grammar.loadsbs+A V\#../////rc  |}dD]3}t||t||4|jdd|_|jdd|_|j|_|S)N)r r r rrr) __class__setattrgetattrcopyrr r)rnew dict_attrs rr,z Grammar.copyfs nn4 E EI CGD)$<$<$A$A$C$C D D D D[^ [^ J  rct ddlm}td||jtd||jtd||jtd||jtd||jtd|jdS) Nr)pprints2nn2sr r rr)r0printr r r r rr)rr0s rreportzGrammar.reportssH!!!!!! e t!""" e t!""" ht{ f ty ht{ gtz"""""rN) __name__ __module__ __qualname__rrr"r&r,r4rrrrsw3j   CCC    000    # # # # #rra ( LPAR ) RPAR [ LSQB ] RSQB : COLON , COMMA ; SEMI + PLUS - MINUS * STAR / SLASH | VBAR & AMPER < LESS > GREATER = EQUAL . 
pgen2/tokenize.py000064400000051177151027012300007757 0ustar00# Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006 Python Software Foundation.
# All rights reserved.

"""Tokenization help for Python programs.

generate_tokens(readline) is a generator that breaks a stream of
text into Python tokens.  It accepts a readline-like method which
is called repeatedly to get the next line of input (or "" for EOF).
It generates 5-tuples with these members:

    the token type (see token.py)
    the token (a string)
    the starting (row, column) indices of the token (a 2-tuple of ints)
    the ending (row, column) indices of the token (a 2-tuple of ints)
    the original line (string)

It is designed to match the working of the Python tokenizer exactly, except
that it produces COMMENT tokens for comments and gives type OP for all
operators.

Older entry points
    tokenize_loop(readline, tokeneater)
    tokenize(readline, tokeneater=printtoken)
are the same, except instead of generating tokens, tokeneater is a callback
function to which the 5 fields described above are passed as 5 arguments,
each time a new token is found."""

__author__ = 'Ka-Ping Yee <ping@lfw.org>'
__credits__ = \
    'GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, Skip Montanaro'

import string, re
from codecs import BOM_UTF8, lookup
from lib2to3.pgen2.token import *

from . import token
__all__ = [x for x in dir(token) if x[0] != '_'] + ["tokenize",
           "generate_tokens", "untokenize"]
del token

try:
    bytes
except NameError:
    # Support bytes type in Python <= 2.5, so 2to3 turns itself into
    # valid Python 3 code.
    bytes = str

def group(*choices): return '(' + '|'.join(choices) + ')'
def any(*choices): return group(*choices) + '*'
def maybe(*choices): return group(*choices) + '?'
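
# A minimal illustration of the three helpers above (the function name is
# ours, for demonstration only): group() builds an alternation, any()
# allows zero or more repetitions, and maybe() makes the group optional.
def _demo_regex_helpers():
    assert group('0', '1') == '(0|1)'    # alternation
    assert any('0', '1') == '(0|1)*'     # zero or more repetitions
    assert maybe('0', '1') == '(0|1)?'   # optional
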
def _combinations(*l): return set( x + y for x in l for y in l + ("",) if x.casefold() != y.casefold() ) Whitespace = r'[ \f\t]*' Comment = r'#[^\r\n]*' Ignore = Whitespace + any(r'\\\r?\n' + Whitespace) + maybe(Comment) Name = r'\w+' Binnumber = r'0[bB]_?[01]+(?:_[01]+)*' Hexnumber = r'0[xX]_?[\da-fA-F]+(?:_[\da-fA-F]+)*[lL]?' Octnumber = r'0[oO]?_?[0-7]+(?:_[0-7]+)*[lL]?' Decnumber = group(r'[1-9]\d*(?:_\d+)*[lL]?', '0[lL]?') Intnumber = group(Binnumber, Hexnumber, Octnumber, Decnumber) Exponent = r'[eE][-+]?\d+(?:_\d+)*' Pointfloat = group(r'\d+(?:_\d+)*\.(?:\d+(?:_\d+)*)?', r'\.\d+(?:_\d+)*') + maybe(Exponent) Expfloat = r'\d+(?:_\d+)*' + Exponent Floatnumber = group(Pointfloat, Expfloat) Imagnumber = group(r'\d+(?:_\d+)*[jJ]', Floatnumber + r'[jJ]') Number = group(Imagnumber, Floatnumber, Intnumber) # Tail end of ' string. Single = r"[^'\\]*(?:\\.[^'\\]*)*'" # Tail end of " string. Double = r'[^"\\]*(?:\\.[^"\\]*)*"' # Tail end of ''' string. Single3 = r"[^'\\]*(?:(?:\\.|'(?!''))[^'\\]*)*'''" # Tail end of """ string. Double3 = r'[^"\\]*(?:(?:\\.|"(?!""))[^"\\]*)*"""' _litprefix = r"(?:[uUrRbBfF]|[rR][fFbB]|[fFbBuU][rR])?" Triple = group(_litprefix + "'''", _litprefix + '"""') # Single-line ' or " string. String = group(_litprefix + r"'[^\n'\\]*(?:\\.[^\n'\\]*)*'", _litprefix + r'"[^\n"\\]*(?:\\.[^\n"\\]*)*"') # Because of leftmost-then-longest match semantics, be sure to put the # longest operators first (e.g., if = came before ==, == would get # recognized as two instances of =). Operator = group(r"\*\*=?", r">>=?", r"<<=?", r"<>", r"!=", r"//=?", r"->", r"[+\-*/%&@|^=<>]=?", r"~") Bracket = '[][(){}]' Special = group(r'\r?\n', r':=', r'[:;.,`@]') Funny = group(Operator, Bracket, Special) PlainToken = group(Number, Funny, String, Name) Token = Ignore + PlainToken # First (or only) line of ' or " string. ContStr = group(_litprefix + r"'[^\n'\\]*(?:\\.[^\n'\\]*)*" + group("'", r'\\\r?\n'), _litprefix + r'"[^\n"\\]*(?:\\.[^\n"\\]*)*' + group('"', r'\\\r?\n')) PseudoExtras = group(r'\\\r?\n', Comment, Triple) PseudoToken = Whitespace + group(PseudoExtras, Number, Funny, ContStr, Name) tokenprog, pseudoprog, single3prog, double3prog = map( re.compile, (Token, PseudoToken, Single3, Double3)) _strprefixes = ( _combinations('r', 'R', 'f', 'F') | _combinations('r', 'R', 'b', 'B') | {'u', 'U', 'ur', 'uR', 'Ur', 'UR'} ) endprogs = {"'": re.compile(Single), '"': re.compile(Double), "'''": single3prog, '"""': double3prog, **{f"{prefix}'''": single3prog for prefix in _strprefixes}, **{f'{prefix}"""': double3prog for prefix in _strprefixes}, **{prefix: None for prefix in _strprefixes}} triple_quoted = ( {"'''", '"""'} | {f"{prefix}'''" for prefix in _strprefixes} | {f'{prefix}"""' for prefix in _strprefixes} ) single_quoted = ( {"'", '"'} | {f"{prefix}'" for prefix in _strprefixes} | {f'{prefix}"' for prefix in _strprefixes} ) tabsize = 8 class TokenError(Exception): pass class StopTokenizing(Exception): pass def printtoken(type, token, xxx_todo_changeme, xxx_todo_changeme1, line): # for testing (srow, scol) = xxx_todo_changeme (erow, ecol) = xxx_todo_changeme1 print("%d,%d-%d,%d:\t%s\t%s" % \ (srow, scol, erow, ecol, tok_name[type], repr(token))) def tokenize(readline, tokeneater=printtoken): """ The tokenize() function accepts two parameters: one representing the input stream, and one providing an output mechanism for tokenize(). The first parameter, readline, must be a callable object which provides the same interface as the readline() method of built-in file objects. 
Each call to the function should return one line of input as a string. The second parameter, tokeneater, must also be a callable object. It is called once for each token, with five arguments, corresponding to the tuples generated by generate_tokens(). """ try: tokenize_loop(readline, tokeneater) except StopTokenizing: pass # backwards compatible interface def tokenize_loop(readline, tokeneater): for token_info in generate_tokens(readline): tokeneater(*token_info) class Untokenizer: def __init__(self): self.tokens = [] self.prev_row = 1 self.prev_col = 0 def add_whitespace(self, start): row, col = start assert row <= self.prev_row col_offset = col - self.prev_col if col_offset: self.tokens.append(" " * col_offset) def untokenize(self, iterable): for t in iterable: if len(t) == 2: self.compat(t, iterable) break tok_type, token, start, end, line = t self.add_whitespace(start) self.tokens.append(token) self.prev_row, self.prev_col = end if tok_type in (NEWLINE, NL): self.prev_row += 1 self.prev_col = 0 return "".join(self.tokens) def compat(self, token, iterable): startline = False indents = [] toks_append = self.tokens.append toknum, tokval = token if toknum in (NAME, NUMBER): tokval += ' ' if toknum in (NEWLINE, NL): startline = True for tok in iterable: toknum, tokval = tok[:2] if toknum in (NAME, NUMBER, ASYNC, AWAIT): tokval += ' ' if toknum == INDENT: indents.append(tokval) continue elif toknum == DEDENT: indents.pop() continue elif toknum in (NEWLINE, NL): startline = True elif startline and indents: toks_append(indents[-1]) startline = False toks_append(tokval) cookie_re = re.compile(r'^[ \t\f]*#.*?coding[:=][ \t]*([-\w.]+)', re.ASCII) blank_re = re.compile(br'^[ \t\f]*(?:[#\r\n]|$)', re.ASCII) def _get_normal_name(orig_enc): """Imitates get_normal_name in tokenizer.c.""" # Only care about the first 12 characters. enc = orig_enc[:12].lower().replace("_", "-") if enc == "utf-8" or enc.startswith("utf-8-"): return "utf-8" if enc in ("latin-1", "iso-8859-1", "iso-latin-1") or \ enc.startswith(("latin-1-", "iso-8859-1-", "iso-latin-1-")): return "iso-8859-1" return orig_enc def detect_encoding(readline): """ The detect_encoding() function is used to detect the encoding that should be used to decode a Python source file. It requires one argument, readline, in the same way as the tokenize() generator. It will call readline a maximum of twice, and return the encoding used (as a string) and a list of any lines (left as bytes) it has read in. It detects the encoding from the presence of a utf-8 bom or an encoding cookie as specified in pep-0263. If both a bom and a cookie are present, but disagree, a SyntaxError will be raised. If the encoding cookie is an invalid charset, raise a SyntaxError. Note that if a utf-8 bom is found, 'utf-8-sig' is returned. If no encoding is specified, then the default of 'utf-8' will be returned. 
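
    A sketch of typical use (path here is the caller's own variable, not
    part of this function):

        with open(path, 'rb') as f:
            encoding, first_lines = detect_encoding(f.readline)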
""" bom_found = False encoding = None default = 'utf-8' def read_or_stop(): try: return readline() except StopIteration: return bytes() def find_cookie(line): try: line_string = line.decode('ascii') except UnicodeDecodeError: return None match = cookie_re.match(line_string) if not match: return None encoding = _get_normal_name(match.group(1)) try: codec = lookup(encoding) except LookupError: # This behaviour mimics the Python interpreter raise SyntaxError("unknown encoding: " + encoding) if bom_found: if codec.name != 'utf-8': # This behaviour mimics the Python interpreter raise SyntaxError('encoding problem: utf-8') encoding += '-sig' return encoding first = read_or_stop() if first.startswith(BOM_UTF8): bom_found = True first = first[3:] default = 'utf-8-sig' if not first: return default, [] encoding = find_cookie(first) if encoding: return encoding, [first] if not blank_re.match(first): return default, [first] second = read_or_stop() if not second: return default, [first] encoding = find_cookie(second) if encoding: return encoding, [first, second] return default, [first, second] def untokenize(iterable): """Transform tokens back into Python source code. Each element returned by the iterable must be a token sequence with at least two elements, a token number and token value. If only two tokens are passed, the resulting output is poor. Round-trip invariant for full input: Untokenized source will match input source exactly Round-trip invariant for limited input: # Output text will tokenize the back to the input t1 = [tok[:2] for tok in generate_tokens(f.readline)] newcode = untokenize(t1) readline = iter(newcode.splitlines(1)).next t2 = [tok[:2] for tokin generate_tokens(readline)] assert t1 == t2 """ ut = Untokenizer() return ut.untokenize(iterable) def generate_tokens(readline): """ The generate_tokens() generator requires one argument, readline, which must be a callable object which provides the same interface as the readline() method of built-in file objects. Each call to the function should return one line of input as a string. Alternately, readline can be a callable function terminating with StopIteration: readline = open(myfile).next # Example of alternate readline The generator produces 5-tuples with these members: the token type; the token string; a 2-tuple (srow, scol) of ints specifying the row and column where the token begins in the source; a 2-tuple (erow, ecol) of ints specifying the row and column where the token ends in the source; and the line on which the token was found. The line passed is the physical line. 
""" lnum = parenlev = continued = 0 contstr, needcont = '', 0 contline = None indents = [0] # 'stashed' and 'async_*' are used for async/await parsing stashed = None async_def = False async_def_indent = 0 async_def_nl = False while 1: # loop over lines in stream try: line = readline() except StopIteration: line = '' lnum = lnum + 1 pos, max = 0, len(line) if contstr: # continued string if not line: raise TokenError("EOF in multi-line string", strstart) endmatch = endprog.match(line) if endmatch: pos = end = endmatch.end(0) yield (STRING, contstr + line[:end], strstart, (lnum, end), contline + line) contstr, needcont = '', 0 contline = None elif needcont and line[-2:] != '\\\n' and line[-3:] != '\\\r\n': yield (ERRORTOKEN, contstr + line, strstart, (lnum, len(line)), contline) contstr = '' contline = None continue else: contstr = contstr + line contline = contline + line continue elif parenlev == 0 and not continued: # new statement if not line: break column = 0 while pos < max: # measure leading whitespace if line[pos] == ' ': column = column + 1 elif line[pos] == '\t': column = (column//tabsize + 1)*tabsize elif line[pos] == '\f': column = 0 else: break pos = pos + 1 if pos == max: break if stashed: yield stashed stashed = None if line[pos] in '#\r\n': # skip comments or blank lines if line[pos] == '#': comment_token = line[pos:].rstrip('\r\n') nl_pos = pos + len(comment_token) yield (COMMENT, comment_token, (lnum, pos), (lnum, pos + len(comment_token)), line) yield (NL, line[nl_pos:], (lnum, nl_pos), (lnum, len(line)), line) else: yield ((NL, COMMENT)[line[pos] == '#'], line[pos:], (lnum, pos), (lnum, len(line)), line) continue if column > indents[-1]: # count indents or dedents indents.append(column) yield (INDENT, line[:pos], (lnum, 0), (lnum, pos), line) while column < indents[-1]: if column not in indents: raise IndentationError( "unindent does not match any outer indentation level", ("", lnum, pos, line)) indents = indents[:-1] if async_def and async_def_indent >= indents[-1]: async_def = False async_def_nl = False async_def_indent = 0 yield (DEDENT, '', (lnum, pos), (lnum, pos), line) if async_def and async_def_nl and async_def_indent >= indents[-1]: async_def = False async_def_nl = False async_def_indent = 0 else: # continued statement if not line: raise TokenError("EOF in multi-line statement", (lnum, 0)) continued = 0 while pos < max: pseudomatch = pseudoprog.match(line, pos) if pseudomatch: # scan for tokens start, end = pseudomatch.span(1) spos, epos, pos = (lnum, start), (lnum, end), end token, initial = line[start:end], line[start] if initial in string.digits or \ (initial == '.' 
and token != '.'): # ordinary number yield (NUMBER, token, spos, epos, line) elif initial in '\r\n': newline = NEWLINE if parenlev > 0: newline = NL elif async_def: async_def_nl = True if stashed: yield stashed stashed = None yield (newline, token, spos, epos, line) elif initial == '#': assert not token.endswith("\n") if stashed: yield stashed stashed = None yield (COMMENT, token, spos, epos, line) elif token in triple_quoted: endprog = endprogs[token] endmatch = endprog.match(line, pos) if endmatch: # all on one line pos = endmatch.end(0) token = line[start:pos] if stashed: yield stashed stashed = None yield (STRING, token, spos, (lnum, pos), line) else: strstart = (lnum, start) # multiple lines contstr = line[start:] contline = line break elif initial in single_quoted or \ token[:2] in single_quoted or \ token[:3] in single_quoted: if token[-1] == '\n': # continued string strstart = (lnum, start) endprog = (endprogs[initial] or endprogs[token[1]] or endprogs[token[2]]) contstr, needcont = line[start:], 1 contline = line break else: # ordinary string if stashed: yield stashed stashed = None yield (STRING, token, spos, epos, line) elif initial.isidentifier(): # ordinary name if token in ('async', 'await'): if async_def: yield (ASYNC if token == 'async' else AWAIT, token, spos, epos, line) continue tok = (NAME, token, spos, epos, line) if token == 'async' and not stashed: stashed = tok continue if token in ('def', 'for'): if (stashed and stashed[0] == NAME and stashed[1] == 'async'): if token == 'def': async_def = True async_def_indent = indents[-1] yield (ASYNC, stashed[1], stashed[2], stashed[3], stashed[4]) stashed = None if stashed: yield stashed stashed = None yield tok elif initial == '\\': # continued stmt # This yield is new; needed for better idempotency: if stashed: yield stashed stashed = None yield (NL, token, spos, (lnum, pos), line) continued = 1 else: if initial in '([{': parenlev = parenlev + 1 elif initial in ')]}': parenlev = parenlev - 1 if stashed: yield stashed stashed = None yield (OP, token, spos, epos, line) else: yield (ERRORTOKEN, line[pos], (lnum, pos), (lnum, pos+1), line) pos = pos + 1 if stashed: yield stashed stashed = None for indent in indents[1:]: # pop remaining indent levels yield (DEDENT, '', (lnum, 0), (lnum, 0), '') yield (ENDMARKER, '', (lnum, 0), (lnum, 0), '') if __name__ == '__main__': # testing import sys if len(sys.argv) > 1: tokenize(open(sys.argv[1]).readline) else: tokenize(sys.stdin.readline) pgen2/literals.py000064400000003143151027012300007740 0ustar00# Copyright 2004-2005 Elemental Security, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. 
"""Safely evaluate Python string literals without using eval().""" import re simple_escapes = {"a": "\a", "b": "\b", "f": "\f", "n": "\n", "r": "\r", "t": "\t", "v": "\v", "'": "'", '"': '"', "\\": "\\"} def escape(m): all, tail = m.group(0, 1) assert all.startswith("\\") esc = simple_escapes.get(tail) if esc is not None: return esc if tail.startswith("x"): hexes = tail[1:] if len(hexes) < 2: raise ValueError("invalid hex string escape ('\\%s')" % tail) try: i = int(hexes, 16) except ValueError: raise ValueError("invalid hex string escape ('\\%s')" % tail) from None else: try: i = int(tail, 8) except ValueError: raise ValueError("invalid octal string escape ('\\%s')" % tail) from None return chr(i) def evalString(s): assert s.startswith("'") or s.startswith('"'), repr(s[:1]) q = s[0] if s[:3] == q*3: q = q*3 assert s.endswith(q), repr(s[-len(q):]) assert len(s) >= 2*len(q) s = s[len(q):-len(q)] return re.sub(r"\\(\'|\"|\\|[abfnrtv]|x.{0,2}|[0-7]{1,3})", escape, s) def test(): for i in range(256): c = chr(i) s = repr(c) e = evalString(s) if e != c: print(i, c, s, e) if __name__ == "__main__": test() pgen2/__init__.py000064400000000217151027012300007657 0ustar00# Copyright 2004-2005 Elemental Security, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """The pgen2 package.""" pytree.py000064400000066506151027012300006432 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """ Python parse tree definitions. This is a very concrete parse tree; we need to keep every token and even the comments and whitespace between tokens. There's also a pattern matching implementation here. """ __author__ = "Guido van Rossum " import sys from io import StringIO HUGE = 0x7FFFFFFF # maximum repeat count, default max _type_reprs = {} def type_repr(type_num): global _type_reprs if not _type_reprs: from .pygram import python_symbols # printing tokens is possible but not as useful # from .pgen2 import token // token.__dict__.items(): for name, val in python_symbols.__dict__.items(): if type(val) == int: _type_reprs[val] = name return _type_reprs.setdefault(type_num, type_num) class Base(object): """ Abstract base class for Node and Leaf. This provides some default functionality and boilerplate using the template pattern. A node may be a subnode of at most one parent. """ # Default values for instance variables type = None # int: token number (< 256) or symbol number (>= 256) parent = None # Parent node pointer, or None children = () # Tuple of subnodes was_changed = False was_checked = False def __new__(cls, *args, **kwds): """Constructor that prevents Base from being instantiated.""" assert cls is not Base, "Cannot instantiate Base" return object.__new__(cls) def __eq__(self, other): """ Compare two nodes for equality. This calls the method _eq(). """ if self.__class__ is not other.__class__: return NotImplemented return self._eq(other) __hash__ = None # For Py3 compatibility. def _eq(self, other): """ Compare two nodes for equality. This is called by __eq__ and __ne__. It is only called if the two nodes have the same type. This must be implemented by the concrete subclass. Nodes should be considered equal if they have the same structure, ignoring the prefix string and other context information. """ raise NotImplementedError def clone(self): """ Return a cloned (deep) copy of self. This must be implemented by the concrete subclass. """ raise NotImplementedError def post_order(self): """ Return a post-order iterator for the tree. 


class Base(object):

    """
    Abstract base class for Node and Leaf.

    This provides some default functionality and boilerplate using the
    template pattern.

    A node may be a subnode of at most one parent.
    """

    # Default values for instance variables
    type = None           # int: token number (< 256) or symbol number (>= 256)
    parent = None         # Parent node pointer, or None
    children = ()         # Tuple of subnodes
    was_changed = False
    was_checked = False

    def __new__(cls, *args, **kwds):
        """Constructor that prevents Base from being instantiated."""
        assert cls is not Base, "Cannot instantiate Base"
        return object.__new__(cls)

    def __eq__(self, other):
        """
        Compare two nodes for equality.

        This calls the method _eq().
        """
        if self.__class__ is not other.__class__:
            return NotImplemented
        return self._eq(other)

    __hash__ = None  # For Py3 compatibility.

    def _eq(self, other):
        """
        Compare two nodes for equality.

        This is called by __eq__ and __ne__.  It is only called if the two
        nodes have the same type.  This must be implemented by the concrete
        subclass.  Nodes should be considered equal if they have the same
        structure, ignoring the prefix string and other context information.
        """
        raise NotImplementedError

    def clone(self):
        """
        Return a cloned (deep) copy of self.

        This must be implemented by the concrete subclass.
        """
        raise NotImplementedError

    def post_order(self):
        """
        Return a post-order iterator for the tree.

        This must be implemented by the concrete subclass.
        """
        raise NotImplementedError

    def pre_order(self):
        """
        Return a pre-order iterator for the tree.

        This must be implemented by the concrete subclass.
        """
        raise NotImplementedError

    def replace(self, new):
        """Replace this node with a new one in the parent."""
        assert self.parent is not None, str(self)
        assert new is not None
        if not isinstance(new, list):
            new = [new]
        l_children = []
        found = False
        for ch in self.parent.children:
            if ch is self:
                assert not found, (self.parent.children, self, new)
                if new is not None:
                    l_children.extend(new)
                found = True
            else:
                l_children.append(ch)
        assert found, (self.children, self, new)
        self.parent.changed()
        self.parent.children = l_children
        for x in new:
            x.parent = self.parent
        self.parent = None

    def get_lineno(self):
        """Return the line number which generated the invocant node."""
        node = self
        while not isinstance(node, Leaf):
            if not node.children:
                return
            node = node.children[0]
        return node.lineno

    def changed(self):
        if self.parent:
            self.parent.changed()
        self.was_changed = True

    def remove(self):
        """
        Remove the node from the tree. Returns the position of the node in its
        parent's children before it was removed.
        """
        if self.parent:
            for i, node in enumerate(self.parent.children):
                if node is self:
                    self.parent.changed()
                    del self.parent.children[i]
                    self.parent = None
                    return i

    @property
    def next_sibling(self):
        """
        The node immediately following the invocant in their parent's
        children list.  If the invocant does not have a next sibling, it
        is None.
        """
        if self.parent is None:
            return None

        # Can't use index(); we need to test by identity
        for i, child in enumerate(self.parent.children):
            if child is self:
                try:
                    return self.parent.children[i+1]
                except IndexError:
                    return None

    @property
    def prev_sibling(self):
        """
        The node immediately preceding the invocant in their parent's
        children list.  If the invocant does not have a previous sibling,
        it is None.
        """
        if self.parent is None:
            return None

        # Can't use index(); we need to test by identity
        for i, child in enumerate(self.parent.children):
            if child is self:
                if i == 0:
                    return None
                return self.parent.children[i-1]

    def leaves(self):
        for child in self.children:
            yield from child.leaves()

    def depth(self):
        if self.parent is None:
            return 0
        return 1 + self.parent.depth()

    def get_suffix(self):
        """
        Return the string immediately following the invocant node.  This is
        effectively equivalent to node.next_sibling.prefix
        """
        next_sib = self.next_sibling
        if next_sib is None:
            return ""
        return next_sib.prefix

    if sys.version_info < (3, 0):
        def __str__(self):
            return str(self).encode("ascii")
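
# A small illustrative sketch (the symbol number 257 is an example value of
# our own; any number >= 256 denotes a grammar symbol):
#
#     from lib2to3.pgen2 import token
#     leaf = Leaf(token.NAME, "x")
#     node = Node(257, [leaf])
#     assert leaf.parent is node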
""" return "".join(map(str, self.children)) if sys.version_info > (3, 0): __str__ = __unicode__ def _eq(self, other): """Compare two nodes for equality.""" return (self.type, self.children) == (other.type, other.children) def clone(self): """Return a cloned (deep) copy of self.""" return Node(self.type, [ch.clone() for ch in self.children], fixers_applied=self.fixers_applied) def post_order(self): """Return a post-order iterator for the tree.""" for child in self.children: yield from child.post_order() yield self def pre_order(self): """Return a pre-order iterator for the tree.""" yield self for child in self.children: yield from child.pre_order() @property def prefix(self): """ The whitespace and comments preceding this node in the input. """ if not self.children: return "" return self.children[0].prefix @prefix.setter def prefix(self, prefix): if self.children: self.children[0].prefix = prefix def set_child(self, i, child): """ Equivalent to 'node.children[i] = child'. This method also sets the child's parent attribute appropriately. """ child.parent = self self.children[i].parent = None self.children[i] = child self.changed() def insert_child(self, i, child): """ Equivalent to 'node.children.insert(i, child)'. This method also sets the child's parent attribute appropriately. """ child.parent = self self.children.insert(i, child) self.changed() def append_child(self, child): """ Equivalent to 'node.children.append(child)'. This method also sets the child's parent attribute appropriately. """ child.parent = self self.children.append(child) self.changed() class Leaf(Base): """Concrete implementation for leaf nodes.""" # Default values for instance variables _prefix = "" # Whitespace and comments preceding this token in the input lineno = 0 # Line where this token starts in the input column = 0 # Column where this token tarts in the input def __init__(self, type, value, context=None, prefix=None, fixers_applied=[]): """ Initializer. Takes a type constant (a token number < 256), a string value, and an optional context keyword argument. """ assert 0 <= type < 256, type if context is not None: self._prefix, (self.lineno, self.column) = context self.type = type self.value = value if prefix is not None: self._prefix = prefix self.fixers_applied = fixers_applied[:] def __repr__(self): """Return a canonical string representation.""" return "%s(%r, %r)" % (self.__class__.__name__, self.type, self.value) def __unicode__(self): """ Return a pretty string representation. This reproduces the input source exactly. """ return self.prefix + str(self.value) if sys.version_info > (3, 0): __str__ = __unicode__ def _eq(self, other): """Compare two nodes for equality.""" return (self.type, self.value) == (other.type, other.value) def clone(self): """Return a cloned (deep) copy of self.""" return Leaf(self.type, self.value, (self.prefix, (self.lineno, self.column)), fixers_applied=self.fixers_applied) def leaves(self): yield self def post_order(self): """Return a post-order iterator for the tree.""" yield self def pre_order(self): """Return a pre-order iterator for the tree.""" yield self @property def prefix(self): """ The whitespace and comments preceding this token in the input. """ return self._prefix @prefix.setter def prefix(self, prefix): self.changed() self._prefix = prefix def convert(gr, raw_node): """ Convert raw node information to a Node or Leaf instance. 
This is passed to the parser driver which calls it whenever a reduction of a grammar rule produces a new complete node, so that the tree is build strictly bottom-up. """ type, value, context, children = raw_node if children or type in gr.number2symbol: # If there's exactly one child, return that child instead of # creating a new node. if len(children) == 1: return children[0] return Node(type, children, context=context) else: return Leaf(type, value, context=context) class BasePattern(object): """ A pattern is a tree matching pattern. It looks for a specific node type (token or symbol), and optionally for a specific content. This is an abstract base class. There are three concrete subclasses: - LeafPattern matches a single leaf node; - NodePattern matches a single node (usually non-leaf); - WildcardPattern matches a sequence of nodes of variable length. """ # Defaults for instance variables type = None # Node type (token if < 256, symbol if >= 256) content = None # Optional content matching pattern name = None # Optional name used to store match in results dict def __new__(cls, *args, **kwds): """Constructor that prevents BasePattern from being instantiated.""" assert cls is not BasePattern, "Cannot instantiate BasePattern" return object.__new__(cls) def __repr__(self): args = [type_repr(self.type), self.content, self.name] while args and args[-1] is None: del args[-1] return "%s(%s)" % (self.__class__.__name__, ", ".join(map(repr, args))) def optimize(self): """ A subclass can define this as a hook for optimizations. Returns either self or another node with the same effect. """ return self def match(self, node, results=None): """ Does this pattern exactly match a node? Returns True if it matches, False if not. If results is not None, it must be a dict which will be updated with the nodes matching named subpatterns. Default implementation for non-wildcard patterns. """ if self.type is not None and node.type != self.type: return False if self.content is not None: r = None if results is not None: r = {} if not self._submatch(node, r): return False if r: results.update(r) if results is not None and self.name: results[self.name] = node return True def match_seq(self, nodes, results=None): """ Does this pattern exactly match a sequence of nodes? Default implementation for non-wildcard patterns. """ if len(nodes) != 1: return False return self.match(nodes[0], results) def generate_matches(self, nodes): """ Generator yielding all matches for this pattern. Default implementation for non-wildcard patterns. """ r = {} if nodes and self.match(nodes[0], r): yield 1, r class LeafPattern(BasePattern): def __init__(self, type=None, content=None, name=None): """ Initializer. Takes optional type, content, and name. The type, if given must be a token type (< 256). If not given, this matches any *leaf* node; the content may still be required. The content, if given, must be a string. If a name is given, the matching node is stored in the results dict under that key. """ if type is not None: assert 0 <= type < 256, type if content is not None: assert isinstance(content, str), repr(content) self.type = type self.content = content self.name = name def match(self, node, results=None): """Override match() to insist on a leaf node.""" if not isinstance(node, Leaf): return False return BasePattern.match(self, node, results) def _submatch(self, node, results=None): """ Match the pattern's content to the node's children. This assumes the node type matches and self.content is not None. 
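# ---------------------------------------------------------------------------
# A minimal sketch of LeafPattern: it matches one token by type and, when
# content is given, by exact value; a named match lands in the results dict.
from lib2to3 import pytree
from lib2to3.pgen2 import token

pat = pytree.LeafPattern(token.NAME, content="print", name="p")
leaf = pytree.Leaf(token.NAME, "print")
results = {}
assert pat.match(leaf, results) and results["p"] is leaf
assert not pat.match(pytree.Leaf(token.NAME, "write"))   # wrong value
# ---------------------------------------------------------------------------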
Returns True if it matches, False if not. If results is not None, it must be a dict which will be updated with the nodes matching named subpatterns. When returning False, the results dict may still be updated. """ return self.content == node.value class NodePattern(BasePattern): wildcards = False def __init__(self, type=None, content=None, name=None): """ Initializer. Takes optional type, content, and name. The type, if given, must be a symbol type (>= 256). If the type is None this matches *any* single node (leaf or not), except if content is not None, in which it only matches non-leaf nodes that also match the content pattern. The content, if not None, must be a sequence of Patterns that must match the node's children exactly. If the content is given, the type must not be None. If a name is given, the matching node is stored in the results dict under that key. """ if type is not None: assert type >= 256, type if content is not None: assert not isinstance(content, str), repr(content) content = list(content) for i, item in enumerate(content): assert isinstance(item, BasePattern), (i, item) if isinstance(item, WildcardPattern): self.wildcards = True self.type = type self.content = content self.name = name def _submatch(self, node, results=None): """ Match the pattern's content to the node's children. This assumes the node type matches and self.content is not None. Returns True if it matches, False if not. If results is not None, it must be a dict which will be updated with the nodes matching named subpatterns. When returning False, the results dict may still be updated. """ if self.wildcards: for c, r in generate_matches(self.content, node.children): if c == len(node.children): if results is not None: results.update(r) return True return False if len(self.content) != len(node.children): return False for subpattern, child in zip(self.content, node.children): if not subpattern.match(child, results): return False return True class WildcardPattern(BasePattern): """ A wildcard pattern can match zero or more nodes. This has all the flexibility needed to implement patterns like: .* .+ .? .{m,n} (a b c | d e | f) (...)* (...)+ (...)? (...){m,n} except it always uses non-greedy matching. """ def __init__(self, content=None, min=0, max=HUGE, name=None): """ Initializer. Args: content: optional sequence of subsequences of patterns; if absent, matches one node; if present, each subsequence is an alternative [*] min: optional minimum number of times to match, default 0 max: optional maximum number of times to match, default HUGE name: optional name assigned to this match [*] Thus, if content is [[a, b, c], [d, e], [f, g, h]] this is equivalent to (a b c | d e | f g h); if content is None, this is equivalent to '.' in regular expression terms. The min and max parameters work as follows: min=0, max=maxint: .* min=1, max=maxint: .+ min=0, max=1: .? min=1, max=1: . If content is not None, replace the dot with the parenthesized list of alternatives, e.g. 
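# ---------------------------------------------------------------------------
# A minimal sketch of NodePattern: without wildcards, the subpatterns must
# match the node's children one-to-one and in order.
from lib2to3 import pytree, pygram
from lib2to3.pgen2 import token

syms = pygram.python_symbols
pat = pytree.NodePattern(syms.arith_expr,
                         [pytree.LeafPattern(token.NAME),
                          pytree.LeafPattern(token.PLUS),
                          pytree.LeafPattern(token.NUMBER, name="n")])
tree = pytree.Node(syms.arith_expr, [pytree.Leaf(token.NAME, "x"),
                                     pytree.Leaf(token.PLUS, "+"),
                                     pytree.Leaf(token.NUMBER, "1")])
r = {}
assert pat.match(tree, r) and r["n"].value == "1"
# ---------------------------------------------------------------------------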
(a b c | d e | f g h)* """ assert 0 <= min <= max <= HUGE, (min, max) if content is not None: content = tuple(map(tuple, content)) # Protect against alterations # Check sanity of alternatives assert len(content), repr(content) # Can't have zero alternatives for alt in content: assert len(alt), repr(alt) # Can have empty alternatives self.content = content self.min = min self.max = max self.name = name def optimize(self): """Optimize certain stacked wildcard patterns.""" subpattern = None if (self.content is not None and len(self.content) == 1 and len(self.content[0]) == 1): subpattern = self.content[0][0] if self.min == 1 and self.max == 1: if self.content is None: return NodePattern(name=self.name) if subpattern is not None and self.name == subpattern.name: return subpattern.optimize() if (self.min <= 1 and isinstance(subpattern, WildcardPattern) and subpattern.min <= 1 and self.name == subpattern.name): return WildcardPattern(subpattern.content, self.min*subpattern.min, self.max*subpattern.max, subpattern.name) return self def match(self, node, results=None): """Does this pattern exactly match a node?""" return self.match_seq([node], results) def match_seq(self, nodes, results=None): """Does this pattern exactly match a sequence of nodes?""" for c, r in self.generate_matches(nodes): if c == len(nodes): if results is not None: results.update(r) if self.name: results[self.name] = list(nodes) return True return False def generate_matches(self, nodes): """ Generator yielding matches for a sequence of nodes. Args: nodes: sequence of nodes Yields: (count, results) tuples where: count: the match comprises nodes[:count]; results: dict containing named submatches. """ if self.content is None: # Shortcut for special case (see __init__.__doc__) for count in range(self.min, 1 + min(len(nodes), self.max)): r = {} if self.name: r[self.name] = nodes[:count] yield count, r elif self.name == "bare_name": yield self._bare_name_matches(nodes) else: # The reason for this is that hitting the recursion limit usually # results in some ugly messages about how RuntimeErrors are being # ignored. We only have to do this on CPython, though, because other # implementations don't have this nasty bug in the first place. if hasattr(sys, "getrefcount"): save_stderr = sys.stderr sys.stderr = StringIO() try: for count, r in self._recursive_matches(nodes, 0): if self.name: r[self.name] = nodes[:count] yield count, r except RuntimeError: # Fall back to the iterative pattern matching scheme if the # recursive scheme hits the recursion limit (RecursionError). 
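# ---------------------------------------------------------------------------
# A minimal sketch of WildcardPattern with content=None: it behaves like '.'
# repeated, and generate_matches() yields the shortest matches first (the
# class always matches non-greedily, as noted above).
from lib2to3 import pytree
from lib2to3.pgen2 import token

dot_star = pytree.WildcardPattern(min=0, name="rest")      # like '.*'
nodes = [pytree.Leaf(token.NAME, "a"), pytree.Leaf(token.NAME, "b")]
assert dot_star.match_seq(nodes, {})                       # whole sequence
assert [c for c, r in dot_star.generate_matches(nodes)] == [0, 1, 2]
# ---------------------------------------------------------------------------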
for count, r in self._iterative_matches(nodes): if self.name: r[self.name] = nodes[:count] yield count, r finally: if hasattr(sys, "getrefcount"): sys.stderr = save_stderr def _iterative_matches(self, nodes): """Helper to iteratively yield the matches.""" nodelen = len(nodes) if 0 >= self.min: yield 0, {} results = [] # generate matches that use just one alt from self.content for alt in self.content: for c, r in generate_matches(alt, nodes): yield c, r results.append((c, r)) # for each match, iterate down the nodes while results: new_results = [] for c0, r0 in results: # stop if the entire set of nodes has been matched if c0 < nodelen and c0 <= self.max: for alt in self.content: for c1, r1 in generate_matches(alt, nodes[c0:]): if c1 > 0: r = {} r.update(r0) r.update(r1) yield c0 + c1, r new_results.append((c0 + c1, r)) results = new_results def _bare_name_matches(self, nodes): """Special optimized matcher for bare_name.""" count = 0 r = {} done = False max = len(nodes) while not done and count < max: done = True for leaf in self.content: if leaf[0].match(nodes[count], r): count += 1 done = False break r[self.name] = nodes[:count] return count, r def _recursive_matches(self, nodes, count): """Helper to recursively yield the matches.""" assert self.content is not None if count >= self.min: yield 0, {} if count < self.max: for alt in self.content: for c0, r0 in generate_matches(alt, nodes): for c1, r1 in self._recursive_matches(nodes[c0:], count+1): r = {} r.update(r0) r.update(r1) yield c0 + c1, r class NegatedPattern(BasePattern): def __init__(self, content=None): """ Initializer. The argument is either a pattern or None. If it is None, this only matches an empty sequence (effectively '$' in regex lingo). If it is not None, this matches whenever the argument pattern doesn't have any matches. """ if content is not None: assert isinstance(content, BasePattern), repr(content) self.content = content def match(self, node): # We never match a node in its entirety return False def match_seq(self, nodes): # We only match an empty sequence of nodes in its entirety return len(nodes) == 0 def generate_matches(self, nodes): if self.content is None: # Return a match if there is an empty sequence if len(nodes) == 0: yield 0, {} else: # Return a match if the argument pattern has no matches for c, r in self.content.generate_matches(nodes): return yield 0, {} def generate_matches(patterns, nodes): """ Generator yielding matches for a sequence of patterns and nodes. Args: patterns: a sequence of patterns nodes: a sequence of nodes Yields: (count, results) tuples where: count: the entire sequence of patterns matches nodes[:count]; results: dict containing named submatches. """ if not patterns: yield 0, {} else: p, rest = patterns[0], patterns[1:] for c0, r0 in p.generate_matches(nodes): if not rest: yield c0, r0 else: for c1, r1 in generate_matches(rest, nodes[c0:]): r = {} r.update(r0) r.update(r1) yield c0 + c1, r btm_matcher.py000064400000014737151027012300007406 0ustar00"""A bottom-up tree matching algorithm implementation meant to speed up 2to3's matching process. After the tree patterns are reduced to their rarest linear path, a linear Aho-Corasick automaton is created. The linear automaton traverses the linear paths from the leaves to the root of the AST and returns a set of nodes for further matching. This reduces significantly the number of candidate nodes.""" __author__ = "George Boutsioukis " import logging import itertools from collections import defaultdict from . 
import pytree from .btm_utils import reduce_tree class BMNode(object): """Class for a node of the Aho-Corasick automaton used in matching""" count = itertools.count() def __init__(self): self.transition_table = {} self.fixers = [] self.id = next(BMNode.count) self.content = '' class BottomMatcher(object): """The main matcher class. After instantiating the patterns should be added using the add_fixer method""" def __init__(self): self.match = set() self.root = BMNode() self.nodes = [self.root] self.fixers = [] self.logger = logging.getLogger("RefactoringTool") def add_fixer(self, fixer): """Reduces a fixer's pattern tree to a linear path and adds it to the matcher(a common Aho-Corasick automaton). The fixer is appended on the matching states and called when they are reached""" self.fixers.append(fixer) tree = reduce_tree(fixer.pattern_tree) linear = tree.get_linear_subpattern() match_nodes = self.add(linear, start=self.root) for match_node in match_nodes: match_node.fixers.append(fixer) def add(self, pattern, start): "Recursively adds a linear pattern to the AC automaton" #print("adding pattern", pattern, "to", start) if not pattern: #print("empty pattern") return [start] if isinstance(pattern[0], tuple): #alternatives #print("alternatives") match_nodes = [] for alternative in pattern[0]: #add all alternatives, and add the rest of the pattern #to each end node end_nodes = self.add(alternative, start=start) for end in end_nodes: match_nodes.extend(self.add(pattern[1:], end)) return match_nodes else: #single token #not last if pattern[0] not in start.transition_table: #transition did not exist, create new next_node = BMNode() start.transition_table[pattern[0]] = next_node else: #transition exists already, follow next_node = start.transition_table[pattern[0]] if pattern[1:]: end_nodes = self.add(pattern[1:], start=next_node) else: end_nodes = [next_node] return end_nodes def run(self, leaves): """The main interface with the bottom matcher. The tree is traversed from the bottom using the constructed automaton. Nodes are only checked once as the tree is retraversed. When the automaton fails, we give it one more shot(in case the above tree matches as a whole with the rejected leaf), then we break for the next leaf. 
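# ---------------------------------------------------------------------------
# A rough sketch (assuming the stdlib lib2to3 package is importable): feed one
# BM-compatible fixer into a BottomMatcher and run it over a tree's leaves,
# the way RefactoringTool does.
from lib2to3 import btm_matcher, pygram, pytree
from lib2to3.pgen2 import driver
from lib2to3.fixes import fix_basestring

fixer = fix_basestring.FixBasestring({}, [])     # options dict, log list
bm = btm_matcher.BottomMatcher()
bm.add_fixer(fixer)
d = driver.Driver(pygram.python_grammar, convert=pytree.convert)
tree = d.parse_string("isinstance(x, basestring)\n")
matches = bm.run(tree.leaves())
assert matches[fixer][0].value == "basestring"
# ---------------------------------------------------------------------------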
There is the special case of multiple arguments(see code comments) where we recheck the nodes Args: The leaves of the AST tree to be matched Returns: A dictionary of node matches with fixers as the keys """ current_ac_node = self.root results = defaultdict(list) for leaf in leaves: current_ast_node = leaf while current_ast_node: current_ast_node.was_checked = True for child in current_ast_node.children: # multiple statements, recheck if isinstance(child, pytree.Leaf) and child.value == ";": current_ast_node.was_checked = False break if current_ast_node.type == 1: #name node_token = current_ast_node.value else: node_token = current_ast_node.type if node_token in current_ac_node.transition_table: #token matches current_ac_node = current_ac_node.transition_table[node_token] for fixer in current_ac_node.fixers: results[fixer].append(current_ast_node) else: #matching failed, reset automaton current_ac_node = self.root if (current_ast_node.parent is not None and current_ast_node.parent.was_checked): #the rest of the tree upwards has been checked, next leaf break #recheck the rejected node once from the root if node_token in current_ac_node.transition_table: #token matches current_ac_node = current_ac_node.transition_table[node_token] for fixer in current_ac_node.fixers: results[fixer].append(current_ast_node) current_ast_node = current_ast_node.parent return results def print_ac(self): "Prints a graphviz diagram of the BM automaton(for debugging)" print("digraph g{") def print_node(node): for subnode_key in node.transition_table.keys(): subnode = node.transition_table[subnode_key] print("%d -> %d [label=%s] //%s" % (node.id, subnode.id, type_repr(subnode_key), str(subnode.fixers))) if subnode_key == 1: print(subnode.content) print_node(subnode) print_node(self.root) print("}") # taken from pytree.py for debugging; only used by print_ac _type_reprs = {} def type_repr(type_num): global _type_reprs if not _type_reprs: from .pygram import python_symbols # printing tokens is possible but not as useful # from .pgen2 import token // token.__dict__.items(): for name, val in python_symbols.__dict__.items(): if type(val) == int: _type_reprs[val] = name return _type_reprs.setdefault(type_num, type_num) pygram.py000064400000002431151027012300006404 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Export the Python grammar and symbols.""" # Python imports import os # Local imports from .pgen2 import token from .pgen2 import driver from . import pytree # The grammar file _GRAMMAR_FILE = os.path.join(os.path.dirname(__file__), "Grammar.txt") _PATTERN_GRAMMAR_FILE = os.path.join(os.path.dirname(__file__), "PatternGrammar.txt") class Symbols(object): def __init__(self, grammar): """Initializer. Creates an attribute for each grammar symbol (nonterminal), whose value is the symbol's type (an int >= 256). 
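# ---------------------------------------------------------------------------
# A minimal sketch of what this module exports: Symbols turns every grammar
# nonterminal into an attribute, and the *_no_print_statement grammar simply
# drops 'print' from the keyword table.
from lib2to3 import pygram

assert pygram.python_symbols.funcdef >= 256        # nonterminals are >= 256
assert "print" in pygram.python_grammar.keywords
assert "print" not in pygram.python_grammar_no_print_statement.keywords
# ---------------------------------------------------------------------------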
""" for name, symbol in grammar.symbol2number.items(): setattr(self, name, symbol) python_grammar = driver.load_packaged_grammar("lib2to3", _GRAMMAR_FILE) python_symbols = Symbols(python_grammar) python_grammar_no_print_statement = python_grammar.copy() del python_grammar_no_print_statement.keywords["print"] python_grammar_no_print_and_exec_statement = python_grammar_no_print_statement.copy() del python_grammar_no_print_and_exec_statement.keywords["exec"] pattern_grammar = driver.load_packaged_grammar("lib2to3", _PATTERN_GRAMMAR_FILE) pattern_symbols = Symbols(pattern_grammar) btm_utils.py000064400000023331151027012300007111 0ustar00"Utility functions used by the btm_matcher module" from . import pytree from .pgen2 import grammar, token from .pygram import pattern_symbols, python_symbols syms = pattern_symbols pysyms = python_symbols tokens = grammar.opmap token_labels = token TYPE_ANY = -1 TYPE_ALTERNATIVES = -2 TYPE_GROUP = -3 class MinNode(object): """This class serves as an intermediate representation of the pattern tree during the conversion to sets of leaf-to-root subpatterns""" def __init__(self, type=None, name=None): self.type = type self.name = name self.children = [] self.leaf = False self.parent = None self.alternatives = [] self.group = [] def __repr__(self): return str(self.type) + ' ' + str(self.name) def leaf_to_root(self): """Internal method. Returns a characteristic path of the pattern tree. This method must be run for all leaves until the linear subpatterns are merged into a single""" node = self subp = [] while node: if node.type == TYPE_ALTERNATIVES: node.alternatives.append(subp) if len(node.alternatives) == len(node.children): #last alternative subp = [tuple(node.alternatives)] node.alternatives = [] node = node.parent continue else: node = node.parent subp = None break if node.type == TYPE_GROUP: node.group.append(subp) #probably should check the number of leaves if len(node.group) == len(node.children): subp = get_characteristic_subpattern(node.group) node.group = [] node = node.parent continue else: node = node.parent subp = None break if node.type == token_labels.NAME and node.name: #in case of type=name, use the name instead subp.append(node.name) else: subp.append(node.type) node = node.parent return subp def get_linear_subpattern(self): """Drives the leaf_to_root method. The reason that leaf_to_root must be run multiple times is because we need to reject 'group' matches; for example the alternative form (a | b c) creates a group [b c] that needs to be matched. Since matching multiple linear patterns overcomes the automaton's capabilities, leaf_to_root merges each group into a single choice based on 'characteristic'ity, i.e. (a|b c) -> (a|b) if b more characteristic than c Returns: The most 'characteristic'(as defined by get_characteristic_subpattern) path for the compiled pattern tree. """ for l in self.leaves(): subp = l.leaf_to_root() if subp: return subp def leaves(self): "Generator that returns the leaves of the tree" for child in self.children: yield from child.leaves() if not self.children: yield self def reduce_tree(node, parent=None): """ Internal function. Reduces a compiled pattern tree to an intermediate representation suitable for feeding the automaton. This also trims off any optional pattern elements(like [a], a*). 
""" new_node = None #switch on the node type if node.type == syms.Matcher: #skip node = node.children[0] if node.type == syms.Alternatives : #2 cases if len(node.children) <= 2: #just a single 'Alternative', skip this node new_node = reduce_tree(node.children[0], parent) else: #real alternatives new_node = MinNode(type=TYPE_ALTERNATIVES) #skip odd children('|' tokens) for child in node.children: if node.children.index(child)%2: continue reduced = reduce_tree(child, new_node) if reduced is not None: new_node.children.append(reduced) elif node.type == syms.Alternative: if len(node.children) > 1: new_node = MinNode(type=TYPE_GROUP) for child in node.children: reduced = reduce_tree(child, new_node) if reduced: new_node.children.append(reduced) if not new_node.children: # delete the group if all of the children were reduced to None new_node = None else: new_node = reduce_tree(node.children[0], parent) elif node.type == syms.Unit: if (isinstance(node.children[0], pytree.Leaf) and node.children[0].value == '('): #skip parentheses return reduce_tree(node.children[1], parent) if ((isinstance(node.children[0], pytree.Leaf) and node.children[0].value == '[') or (len(node.children)>1 and hasattr(node.children[1], "value") and node.children[1].value == '[')): #skip whole unit if its optional return None leaf = True details_node = None alternatives_node = None has_repeater = False repeater_node = None has_variable_name = False for child in node.children: if child.type == syms.Details: leaf = False details_node = child elif child.type == syms.Repeater: has_repeater = True repeater_node = child elif child.type == syms.Alternatives: alternatives_node = child if hasattr(child, 'value') and child.value == '=': # variable name has_variable_name = True #skip variable name if has_variable_name: #skip variable name, '=' name_leaf = node.children[2] if hasattr(name_leaf, 'value') and name_leaf.value == '(': # skip parenthesis name_leaf = node.children[3] else: name_leaf = node.children[0] #set node type if name_leaf.type == token_labels.NAME: #(python) non-name or wildcard if name_leaf.value == 'any': new_node = MinNode(type=TYPE_ANY) else: if hasattr(token_labels, name_leaf.value): new_node = MinNode(type=getattr(token_labels, name_leaf.value)) else: new_node = MinNode(type=getattr(pysyms, name_leaf.value)) elif name_leaf.type == token_labels.STRING: #(python) name or character; remove the apostrophes from #the string value name = name_leaf.value.strip("'") if name in tokens: new_node = MinNode(type=tokens[name]) else: new_node = MinNode(type=token_labels.NAME, name=name) elif name_leaf.type == syms.Alternatives: new_node = reduce_tree(alternatives_node, parent) #handle repeaters if has_repeater: if repeater_node.children[0].value == '*': #reduce to None new_node = None elif repeater_node.children[0].value == '+': #reduce to a single occurrence i.e. 
do nothing pass else: #TODO: handle {min, max} repeaters raise NotImplementedError #add children if details_node and new_node is not None: for child in details_node.children[1:-1]: #skip '<', '>' markers reduced = reduce_tree(child, new_node) if reduced is not None: new_node.children.append(reduced) if new_node: new_node.parent = parent return new_node def get_characteristic_subpattern(subpatterns): """Picks the most characteristic from a list of linear patterns Current order used is: names > common_names > common_chars """ if not isinstance(subpatterns, list): return subpatterns if len(subpatterns)==1: return subpatterns[0] # first pick out the ones containing variable names subpatterns_with_names = [] subpatterns_with_common_names = [] common_names = ['in', 'for', 'if' , 'not', 'None'] subpatterns_with_common_chars = [] common_chars = "[]().,:" for subpattern in subpatterns: if any(rec_test(subpattern, lambda x: type(x) is str)): if any(rec_test(subpattern, lambda x: isinstance(x, str) and x in common_chars)): subpatterns_with_common_chars.append(subpattern) elif any(rec_test(subpattern, lambda x: isinstance(x, str) and x in common_names)): subpatterns_with_common_names.append(subpattern) else: subpatterns_with_names.append(subpattern) if subpatterns_with_names: subpatterns = subpatterns_with_names elif subpatterns_with_common_names: subpatterns = subpatterns_with_common_names elif subpatterns_with_common_chars: subpatterns = subpatterns_with_common_chars # of the remaining subpatterns pick out the longest one return max(subpatterns, key=len) def rec_test(sequence, test_func): """Tests test_func on all items of sequence and items of included sub-iterables""" for x in sequence: if isinstance(x, (list, tuple)): yield from rec_test(x, test_func) else: yield test_func(x) patcomp.py000064400000015616151027012300006561 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Pattern compiler. The grammar is taken from PatternGrammar.txt. The compiler compiles a pattern to a pytree.*Pattern instance. """ __author__ = "Guido van Rossum " # Python imports import io # Fairly local imports from .pgen2 import driver, literals, token, tokenize, parse, grammar # Really local imports from . import pytree from . import pygram class PatternSyntaxError(Exception): pass def tokenize_wrapper(input): """Tokenizes a string suppressing significant whitespace.""" skip = {token.NEWLINE, token.INDENT, token.DEDENT} tokens = tokenize.generate_tokens(io.StringIO(input).readline) for quintuple in tokens: type, value, start, end, line_text = quintuple if type not in skip: yield quintuple class PatternCompiler(object): def __init__(self, grammar_file=None): """Initializer. Takes an optional alternative filename for the pattern grammar. 
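# ---------------------------------------------------------------------------
# A minimal sketch tying the pattern compiler to real code: compile a pattern
# string, parse a line of source, and look for matching nodes -- the same
# round trip every fixer PATTERN goes through. The no-print grammar makes
# print('hi') parse as an ordinary call.
from lib2to3 import patcomp, pygram, pytree
from lib2to3.pgen2 import driver

pattern = patcomp.PatternCompiler().compile_pattern(
    "power< 'print' trailer< '(' any* ')' > >")
d = driver.Driver(pygram.python_grammar_no_print_statement,
                  convert=pytree.convert)
tree = d.parse_string("print('hi')\n")
assert sum(1 for n in tree.pre_order() if pattern.match(n)) == 1
# ---------------------------------------------------------------------------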
""" if grammar_file is None: self.grammar = pygram.pattern_grammar self.syms = pygram.pattern_symbols else: self.grammar = driver.load_grammar(grammar_file) self.syms = pygram.Symbols(self.grammar) self.pygrammar = pygram.python_grammar self.pysyms = pygram.python_symbols self.driver = driver.Driver(self.grammar, convert=pattern_convert) def compile_pattern(self, input, debug=False, with_tree=False): """Compiles a pattern string to a nested pytree.*Pattern object.""" tokens = tokenize_wrapper(input) try: root = self.driver.parse_tokens(tokens, debug=debug) except parse.ParseError as e: raise PatternSyntaxError(str(e)) from None if with_tree: return self.compile_node(root), root else: return self.compile_node(root) def compile_node(self, node): """Compiles a node, recursively. This is one big switch on the node type. """ # XXX Optimize certain Wildcard-containing-Wildcard patterns # that can be merged if node.type == self.syms.Matcher: node = node.children[0] # Avoid unneeded recursion if node.type == self.syms.Alternatives: # Skip the odd children since they are just '|' tokens alts = [self.compile_node(ch) for ch in node.children[::2]] if len(alts) == 1: return alts[0] p = pytree.WildcardPattern([[a] for a in alts], min=1, max=1) return p.optimize() if node.type == self.syms.Alternative: units = [self.compile_node(ch) for ch in node.children] if len(units) == 1: return units[0] p = pytree.WildcardPattern([units], min=1, max=1) return p.optimize() if node.type == self.syms.NegatedUnit: pattern = self.compile_basic(node.children[1:]) p = pytree.NegatedPattern(pattern) return p.optimize() assert node.type == self.syms.Unit name = None nodes = node.children if len(nodes) >= 3 and nodes[1].type == token.EQUAL: name = nodes[0].value nodes = nodes[2:] repeat = None if len(nodes) >= 2 and nodes[-1].type == self.syms.Repeater: repeat = nodes[-1] nodes = nodes[:-1] # Now we've reduced it to: STRING | NAME [Details] | (...) | [...] pattern = self.compile_basic(nodes, repeat) if repeat is not None: assert repeat.type == self.syms.Repeater children = repeat.children child = children[0] if child.type == token.STAR: min = 0 max = pytree.HUGE elif child.type == token.PLUS: min = 1 max = pytree.HUGE elif child.type == token.LBRACE: assert children[-1].type == token.RBRACE assert len(children) in (3, 5) min = max = self.get_int(children[1]) if len(children) == 5: max = self.get_int(children[3]) else: assert False if min != 1 or max != 1: pattern = pattern.optimize() pattern = pytree.WildcardPattern([[pattern]], min=min, max=max) if name is not None: pattern.name = name return pattern.optimize() def compile_basic(self, nodes, repeat=None): # Compile STRING | NAME [Details] | (...) | [...] 
assert len(nodes) >= 1 node = nodes[0] if node.type == token.STRING: value = str(literals.evalString(node.value)) return pytree.LeafPattern(_type_of_literal(value), value) elif node.type == token.NAME: value = node.value if value.isupper(): if value not in TOKEN_MAP: raise PatternSyntaxError("Invalid token: %r" % value) if nodes[1:]: raise PatternSyntaxError("Can't have details for token") return pytree.LeafPattern(TOKEN_MAP[value]) else: if value == "any": type = None elif not value.startswith("_"): type = getattr(self.pysyms, value, None) if type is None: raise PatternSyntaxError("Invalid symbol: %r" % value) if nodes[1:]: # Details present content = [self.compile_node(nodes[1].children[1])] else: content = None return pytree.NodePattern(type, content) elif node.value == "(": return self.compile_node(nodes[1]) elif node.value == "[": assert repeat is None subpattern = self.compile_node(nodes[1]) return pytree.WildcardPattern([[subpattern]], min=0, max=1) assert False, node def get_int(self, node): assert node.type == token.NUMBER return int(node.value) # Map named tokens to the type value for a LeafPattern TOKEN_MAP = {"NAME": token.NAME, "STRING": token.STRING, "NUMBER": token.NUMBER, "TOKEN": None} def _type_of_literal(value): if value[0].isalpha(): return token.NAME elif value in grammar.opmap: return grammar.opmap[value] else: return None def pattern_convert(grammar, raw_node_info): """Converts raw node information to a Node or Leaf instance.""" type, value, context, children = raw_node_info if children or type in grammar.number2symbol: return pytree.Node(type, children, context=context) else: return pytree.Leaf(type, value, context=context) def compile_pattern(pattern): return PatternCompiler().compile_pattern(pattern) PatternGrammar.txt000064400000001431151027012300010217 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. # A grammar to describe tree matching patterns. # Not shown here: # - 'TOKEN' stands for any token (leaf node) # - 'any' stands for any node (leaf or interior) # With 'any' we can still specify the sub-structure. # The start symbol is 'Matcher'. Matcher: Alternatives ENDMARKER Alternatives: Alternative ('|' Alternative)* Alternative: (Unit | NegatedUnit)+ Unit: [NAME '='] ( STRING [Repeater] | NAME [Details] [Repeater] | '(' Alternatives ')' [Repeater] | '[' Alternatives ']' ) NegatedUnit: 'not' (STRING | NAME [Details] | '(' Alternatives ')') Repeater: '*' | '+' | '{' NUMBER [',' NUMBER] '}' Details: '<' Alternatives '>' fixes/fix_asserts.py000064400000001730151027012300010556 0ustar00"""Fixer that replaces deprecated unittest method names.""" # Author: Ezio Melotti from ..fixer_base import BaseFix from ..fixer_util import Name NAMES = dict( assert_="assertTrue", assertEquals="assertEqual", assertNotEquals="assertNotEqual", assertAlmostEquals="assertAlmostEqual", assertNotAlmostEquals="assertNotAlmostEqual", assertRegexpMatches="assertRegex", assertRaisesRegexp="assertRaisesRegex", failUnlessEqual="assertEqual", failIfEqual="assertNotEqual", failUnlessAlmostEqual="assertAlmostEqual", failIfAlmostEqual="assertNotAlmostEqual", failUnless="assertTrue", failUnlessRaises="assertRaises", failIf="assertFalse", ) class FixAsserts(BaseFix): PATTERN = """ power< any+ trailer< '.' 
meth=(%s)> any* > """ % '|'.join(map(repr, NAMES)) def transform(self, node, results): name = results["meth"][0] name.replace(Name(NAMES[str(name)], prefix=name.prefix)) fixes/fix_xrange.py000064400000005206151027012300010360 0ustar00# Copyright 2007 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer that changes xrange(...) into range(...).""" # Local imports from .. import fixer_base from ..fixer_util import Name, Call, consuming_calls from .. import patcomp class FixXrange(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< (name='range'|name='xrange') trailer< '(' args=any ')' > rest=any* > """ def start_tree(self, tree, filename): super(FixXrange, self).start_tree(tree, filename) self.transformed_xranges = set() def finish_tree(self, tree, filename): self.transformed_xranges = None def transform(self, node, results): name = results["name"] if name.value == "xrange": return self.transform_xrange(node, results) elif name.value == "range": return self.transform_range(node, results) else: raise ValueError(repr(name)) def transform_xrange(self, node, results): name = results["name"] name.replace(Name("range", prefix=name.prefix)) # This prevents the new range call from being wrapped in a list later. self.transformed_xranges.add(id(node)) def transform_range(self, node, results): if (id(node) not in self.transformed_xranges and not self.in_special_context(node)): range_call = Call(Name("range"), [results["args"].clone()]) # Encase the range call in list(). list_call = Call(Name("list"), [range_call], prefix=node.prefix) # Put things that were after the range() call after the list call. for n in results["rest"]: list_call.append_child(n) return list_call P1 = "power< func=NAME trailer< '(' node=any ')' > any* >" p1 = patcomp.compile_pattern(P1) P2 = """for_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > | comparison< any 'in' node=any any*> """ p2 = patcomp.compile_pattern(P2) def in_special_context(self, node): if node.parent is None: return False results = {} if (node.parent.parent is not None and self.p1.match(node.parent.parent, results) and results["node"] is node): # list(d.keys()) -> list(d.keys()), etc. return results["func"].value in consuming_calls # for ... in d.iterkeys() -> for ... in d.keys(), etc. return self.p2.match(node.parent, results) and results["node"] is node fixes/fix_zip.py000064400000002411151027012300007671 0ustar00""" Fixer that changes zip(seq0, seq1, ...) into list(zip(seq0, seq1, ...) unless there exists a 'from future_builtins import zip' statement in the top-level namespace. We avoid the transformation if the zip() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. """ # Local imports from .. 
import fixer_base from ..pytree import Node from ..pygram import python_symbols as syms from ..fixer_util import Name, ArgList, in_special_context class FixZip(fixer_base.ConditionalFix): BM_compatible = True PATTERN = """ power< 'zip' args=trailer< '(' [any] ')' > [trailers=trailer*] > """ skip_on = "future_builtins.zip" def transform(self, node, results): if self.should_skip(node): return if in_special_context(node): return None args = results['args'].clone() args.prefix = "" trailers = [] if 'trailers' in results: trailers = [n.clone() for n in results['trailers']] for n in trailers: n.prefix = "" new = Node(syms.power, [Name("zip"), args], prefix="") new = Node(syms.power, [Name("list"), ArgList([new])] + trailers) new.prefix = node.prefix return new fixes/fix_paren.py000064400000002312151027012300010174 0ustar00"""Fixer that adds parentheses where they are required This converts ``[x for x in 1, 2]`` to ``[x for x in (1, 2)]``.""" # By Taek Joo Kim and Benjamin Peterson # Local imports from .. import fixer_base from ..fixer_util import LParen, RParen # XXX This doesn't support nested for loops like [x for x in 1, 2 for x in 1, 2] class FixParen(fixer_base.BaseFix): BM_compatible = True PATTERN = """ atom< ('[' | '(') (listmaker< any comp_for< 'for' NAME 'in' target=testlist_safe< any (',' any)+ [','] > [any] > > | testlist_gexp< any comp_for< 'for' NAME 'in' target=testlist_safe< any (',' any)+ [','] > [any] > >) (']' | ')') > """ def transform(self, node, results): target = results["target"] lparen = LParen() lparen.prefix = target.prefix target.prefix = "" # Make it hug the parentheses target.insert_child(0, lparen) target.append_child(RParen()) fixes/fix_nonzero.py000064400000001117151027012300010563 0ustar00"""Fixer for __nonzero__ -> __bool__ methods.""" # Author: Collin Winter # Local imports from .. import fixer_base from ..fixer_util import Name class FixNonzero(fixer_base.BaseFix): BM_compatible = True PATTERN = """ classdef< 'class' any+ ':' suite< any* funcdef< 'def' name='__nonzero__' parameters< '(' NAME ')' > any+ > any* > > """ def transform(self, node, results): name = results["name"] new = Name("__bool__", prefix=name.prefix) name.replace(new) fixes/fix_intern.py000064400000002170151027012300010370 0ustar00# Copyright 2006 Georg Brandl. # Licensed to PSF under a Contributor Agreement. """Fixer for intern(). intern(s) -> sys.intern(s)""" # Local imports from .. import fixer_base from ..fixer_util import ImportAndCall, touch_import class FixIntern(fixer_base.BaseFix): BM_compatible = True order = "pre" PATTERN = """ power< 'intern' trailer< lpar='(' ( not(arglist | argument) any ','> ) rpar=')' > after=any* > """ def transform(self, node, results): if results: # I feel like we should be able to express this logic in the # PATTERN above but I don't know how to do it so... obj = results['obj'] if obj: if (obj.type == self.syms.argument and obj.children[0].value in {'**', '*'}): return # Make no change. names = ('sys', 'intern') new = ImportAndCall(node, results, names) touch_import(None, 'sys', node) return new fixes/fix_standarderror.py000064400000000701151027012300011741 0ustar00# Copyright 2007 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer for StandardError -> Exception.""" # Local imports from .. 
import fixer_base from ..fixer_util import Name class FixStandarderror(fixer_base.BaseFix): BM_compatible = True PATTERN = """ 'StandardError' """ def transform(self, node, results): return Name("Exception", prefix=node.prefix) fixes/fix_getcwdu.py000064400000000703151027012300010533 0ustar00""" Fixer that changes os.getcwdu() to os.getcwd(). """ # Author: Victor Stinner # Local imports from .. import fixer_base from ..fixer_util import Name class FixGetcwdu(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< 'os' trailer< dot='.' name='getcwdu' > any* > """ def transform(self, node, results): name = results["name"] name.replace(Name("getcwd", prefix=name.prefix)) fixes/fix_sys_exc.py000064400000002012151027012300010541 0ustar00"""Fixer for sys.exc_{type, value, traceback} sys.exc_type -> sys.exc_info()[0] sys.exc_value -> sys.exc_info()[1] sys.exc_traceback -> sys.exc_info()[2] """ # By Jeff Balogh and Benjamin Peterson # Local imports from .. import fixer_base from ..fixer_util import Attr, Call, Name, Number, Subscript, Node, syms class FixSysExc(fixer_base.BaseFix): # This order matches the ordering of sys.exc_info(). exc_info = ["exc_type", "exc_value", "exc_traceback"] BM_compatible = True PATTERN = """ power< 'sys' trailer< dot='.' attribute=(%s) > > """ % '|'.join("'%s'" % e for e in exc_info) def transform(self, node, results): sys_attr = results["attribute"][0] index = Number(self.exc_info.index(sys_attr.value)) call = Call(Name("exc_info"), prefix=sys_attr.prefix) attr = Attr(Name("sys"), call) attr[1].children[0].prefix = results["dot"].prefix attr.append(Subscript(index)) return Node(syms.power, attr, prefix=node.prefix) fixes/fix_repr.py000064400000001145151027012300010042 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer that transforms `xyzzy` into repr(xyzzy).""" # Local imports from .. import fixer_base from ..fixer_util import Call, Name, parenthesize class FixRepr(fixer_base.BaseFix): BM_compatible = True PATTERN = """ atom < '`' expr=any '`' > """ def transform(self, node, results): expr = results["expr"].clone() if expr.type == self.syms.testlist1: expr = parenthesize(expr) return Call(Name("repr"), [expr], prefix=node.prefix) fixes/fix_idioms.py000064400000011414151027012300010356 0ustar00"""Adjust some old Python 2 idioms to their modern counterparts. * Change some type comparisons to isinstance() calls: type(x) == T -> isinstance(x, T) type(x) is T -> isinstance(x, T) type(x) != T -> not isinstance(x, T) type(x) is not T -> not isinstance(x, T) * Change "while 1:" into "while True:". * Change both v = list(EXPR) v.sort() foo(v) and the more general v = EXPR v.sort() foo(v) into v = sorted(EXPR) foo(v) """ # Author: Jacques Frechet, Collin Winter # Local imports from .. import fixer_base from ..fixer_util import Call, Comma, Name, Node, BlankLine, syms CMP = "(n='!=' | '==' | 'is' | n=comp_op< 'is' 'not' >)" TYPE = "power< 'type' trailer< '(' x=any ')' > >" class FixIdioms(fixer_base.BaseFix): explicit = True # The user must ask for this fixer PATTERN = r""" isinstance=comparison< %s %s T=any > | isinstance=comparison< T=any %s %s > | while_stmt< 'while' while='1' ':' any+ > | sorted=any< any* simple_stmt< expr_stmt< id1=any '=' power< list='list' trailer< '(' (not arglist) any ')' > > > '\n' > sort= simple_stmt< power< id2=any trailer< '.' 
'sort' > trailer< '(' ')' > > '\n' > next=any* > | sorted=any< any* simple_stmt< expr_stmt< id1=any '=' expr=any > '\n' > sort= simple_stmt< power< id2=any trailer< '.' 'sort' > trailer< '(' ')' > > '\n' > next=any* > """ % (TYPE, CMP, CMP, TYPE) def match(self, node): r = super(FixIdioms, self).match(node) # If we've matched one of the sort/sorted subpatterns above, we # want to reject matches where the initial assignment and the # subsequent .sort() call involve different identifiers. if r and "sorted" in r: if r["id1"] == r["id2"]: return r return None return r def transform(self, node, results): if "isinstance" in results: return self.transform_isinstance(node, results) elif "while" in results: return self.transform_while(node, results) elif "sorted" in results: return self.transform_sort(node, results) else: raise RuntimeError("Invalid match") def transform_isinstance(self, node, results): x = results["x"].clone() # The thing inside of type() T = results["T"].clone() # The type being compared against x.prefix = "" T.prefix = " " test = Call(Name("isinstance"), [x, Comma(), T]) if "n" in results: test.prefix = " " test = Node(syms.not_test, [Name("not"), test]) test.prefix = node.prefix return test def transform_while(self, node, results): one = results["while"] one.replace(Name("True", prefix=one.prefix)) def transform_sort(self, node, results): sort_stmt = results["sort"] next_stmt = results["next"] list_call = results.get("list") simple_expr = results.get("expr") if list_call: list_call.replace(Name("sorted", prefix=list_call.prefix)) elif simple_expr: new = simple_expr.clone() new.prefix = "" simple_expr.replace(Call(Name("sorted"), [new], prefix=simple_expr.prefix)) else: raise RuntimeError("should not have reached here") sort_stmt.remove() btwn = sort_stmt.prefix # Keep any prefix lines between the sort_stmt and the list_call and # shove them right after the sorted() call. if "\n" in btwn: if next_stmt: # The new prefix should be everything from the sort_stmt's # prefix up to the last newline, then the old prefix after a new # line. prefix_lines = (btwn.rpartition("\n")[0], next_stmt[0].prefix) next_stmt[0].prefix = "\n".join(prefix_lines) else: assert list_call.parent assert list_call.next_sibling is None # Put a blank line after list_call and set its prefix. end_line = BlankLine() list_call.parent.append_child(end_line) assert list_call.next_sibling is end_line # The new prefix should be everything up to the first new line # of sort_stmt's prefix. end_line.prefix = btwn.rpartition("\n")[0] fixes/fix_unicode.py000064400000002350151027012300010517 0ustar00r"""Fixer for unicode. * Changes unicode to str and unichr to chr. * If "...\u..." is not unicode literal change it into "...\\u...". * Change u"..." into "...". """ from ..pgen2 import token from .. 
import fixer_base _mapping = {"unichr" : "chr", "unicode" : "str"} class FixUnicode(fixer_base.BaseFix): BM_compatible = True PATTERN = "STRING | 'unicode' | 'unichr'" def start_tree(self, tree, filename): super(FixUnicode, self).start_tree(tree, filename) self.unicode_literals = 'unicode_literals' in tree.future_features def transform(self, node, results): if node.type == token.NAME: new = node.clone() new.value = _mapping[node.value] return new elif node.type == token.STRING: val = node.value if not self.unicode_literals and val[0] in '\'"' and '\\' in val: val = r'\\'.join([ v.replace('\\u', r'\\u').replace('\\U', r'\\U') for v in val.split(r'\\') ]) if val[0] in 'uU': val = val[1:] if val == node.value: return node new = node.clone() new.value = val return new fixes/fix_basestring.py000064400000000500151027012300011225 0ustar00"""Fixer for basestring -> str.""" # Author: Christian Heimes # Local imports from .. import fixer_base from ..fixer_util import Name class FixBasestring(fixer_base.BaseFix): BM_compatible = True PATTERN = "'basestring'" def transform(self, node, results): return Name("str", prefix=node.prefix) fixes/fix_reduce.py000064400000001505151027012300010341 0ustar00# Copyright 2008 Armin Ronacher. # Licensed to PSF under a Contributor Agreement. """Fixer for reduce(). Makes sure reduce() is imported from the functools module if reduce is used in that module. """ from lib2to3 import fixer_base from lib2to3.fixer_util import touch_import class FixReduce(fixer_base.BaseFix): BM_compatible = True order = "pre" PATTERN = """ power< 'reduce' trailer< '(' arglist< ( (not(argument) any ',' not(argument > """ def transform(self, node, results): touch_import('functools', 'reduce', node) fixes/fix_filter.py000064400000005315151027012300010362 0ustar00# Copyright 2007 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer that changes filter(F, X) into list(filter(F, X)). We avoid the transformation if the filter() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. NOTE: This is still not correct if the original code was depending on filter(F, X) to return a string if X is a string and a tuple if X is a tuple. That would require type inference, which we don't do. Let Python 2.6 figure it out. """ # Local imports from .. 
import fixer_base from ..pytree import Node from ..pygram import python_symbols as syms from ..fixer_util import Name, ArgList, ListComp, in_special_context, parenthesize class FixFilter(fixer_base.ConditionalFix): BM_compatible = True PATTERN = """ filter_lambda=power< 'filter' trailer< '(' arglist< lambdef< 'lambda' (fp=NAME | vfpdef< '(' fp=NAME ')'> ) ':' xp=any > ',' it=any > ')' > [extra_trailers=trailer*] > | power< 'filter' trailer< '(' arglist< none='None' ',' seq=any > ')' > [extra_trailers=trailer*] > | power< 'filter' args=trailer< '(' [any] ')' > [extra_trailers=trailer*] > """ skip_on = "future_builtins.filter" def transform(self, node, results): if self.should_skip(node): return trailers = [] if 'extra_trailers' in results: for t in results['extra_trailers']: trailers.append(t.clone()) if "filter_lambda" in results: xp = results.get("xp").clone() if xp.type == syms.test: xp.prefix = "" xp = parenthesize(xp) new = ListComp(results.get("fp").clone(), results.get("fp").clone(), results.get("it").clone(), xp) new = Node(syms.power, [new] + trailers, prefix="") elif "none" in results: new = ListComp(Name("_f"), Name("_f"), results["seq"].clone(), Name("_f")) new = Node(syms.power, [new] + trailers, prefix="") else: if in_special_context(node): return None args = results['args'].clone() new = Node(syms.power, [Name("filter"), args], prefix="") new = Node(syms.power, [Name("list"), ArgList([new])] + trailers) new.prefix = "" new.prefix = node.prefix return new fixes/fix_apply.py000064400000004452151027012300010223 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer for apply(). This converts apply(func, v, k) into (func)(*v, **k).""" # Local imports from .. import pytree from ..pgen2 import token from .. import fixer_base from ..fixer_util import Call, Comma, parenthesize class FixApply(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< 'apply' trailer< '(' arglist< (not argument ')' > > """ def transform(self, node, results): syms = self.syms assert results func = results["func"] args = results["args"] kwds = results.get("kwds") # I feel like we should be able to express this logic in the # PATTERN above but I don't know how to do it so... if args: if (args.type == self.syms.argument and args.children[0].value in {'**', '*'}): return # Make no change. if kwds and (kwds.type == self.syms.argument and kwds.children[0].value == '**'): return # Make no change. prefix = node.prefix func = func.clone() if (func.type not in (token.NAME, syms.atom) and (func.type != syms.power or func.children[-2].type == token.DOUBLESTAR)): # Need to parenthesize func = parenthesize(func) func.prefix = "" args = args.clone() args.prefix = "" if kwds is not None: kwds = kwds.clone() kwds.prefix = "" l_newargs = [pytree.Leaf(token.STAR, "*"), args] if kwds is not None: l_newargs.extend([Comma(), pytree.Leaf(token.DOUBLESTAR, "**"), kwds]) l_newargs[-2].prefix = " " # that's the ** token # XXX Sometimes we could be cleverer, e.g. apply(f, (x, y) + t) # can be translated into f(x, y, *t) instead of f(*(x, y) + t) #new = pytree.Node(syms.power, (func, ArgList(l_newargs))) return Call(func, l_newargs, prefix=prefix) fixes/fix_dict.py000064400000007260151027012300010021 0ustar00# Copyright 2007 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer for dict methods. 
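# ---------------------------------------------------------------------------
# A minimal sketch of fixers like the ones above in action, driven through
# the public RefactoringTool interface instead of calling transform() by hand.
from lib2to3.refactor import RefactoringTool

rt = RefactoringTool(["lib2to3.fixes.fix_apply", "lib2to3.fixes.fix_filter"])
print(rt.refactor_string("apply(f, args)\nfilter(None, seq)\n", "<example>"))
# Expected result (printing the refactored tree reproduces the source):
#   f(*args)
#   [_f for _f in seq if _f]
# ---------------------------------------------------------------------------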
d.keys() -> list(d.keys()) d.items() -> list(d.items()) d.values() -> list(d.values()) d.iterkeys() -> iter(d.keys()) d.iteritems() -> iter(d.items()) d.itervalues() -> iter(d.values()) d.viewkeys() -> d.keys() d.viewitems() -> d.items() d.viewvalues() -> d.values() Except in certain very specific contexts: the iter() can be dropped when the context is list(), sorted(), iter() or for...in; the list() can be dropped when the context is list() or sorted() (but not iter() or for...in!). Special contexts that apply to both: list(), sorted(), tuple() set(), any(), all(), sum(). Note: iter(d.keys()) could be written as iter(d) but since the original d.iterkeys() was also redundant we don't fix this. And there are (rare) contexts where it makes a difference (e.g. when passing it as an argument to a function that introspects the argument). """ # Local imports from .. import pytree from .. import patcomp from .. import fixer_base from ..fixer_util import Name, Call, Dot from .. import fixer_util iter_exempt = fixer_util.consuming_calls | {"iter"} class FixDict(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< head=any+ trailer< '.' method=('keys'|'items'|'values'| 'iterkeys'|'iteritems'|'itervalues'| 'viewkeys'|'viewitems'|'viewvalues') > parens=trailer< '(' ')' > tail=any* > """ def transform(self, node, results): head = results["head"] method = results["method"][0] # Extract node for method name tail = results["tail"] syms = self.syms method_name = method.value isiter = method_name.startswith("iter") isview = method_name.startswith("view") if isiter or isview: method_name = method_name[4:] assert method_name in ("keys", "items", "values"), repr(method) head = [n.clone() for n in head] tail = [n.clone() for n in tail] special = not tail and self.in_special_context(node, isiter) args = head + [pytree.Node(syms.trailer, [Dot(), Name(method_name, prefix=method.prefix)]), results["parens"].clone()] new = pytree.Node(syms.power, args) if not (special or isview): new.prefix = "" new = Call(Name("iter" if isiter else "list"), [new]) if tail: new = pytree.Node(syms.power, [new] + tail) new.prefix = node.prefix return new P1 = "power< func=NAME trailer< '(' node=any ')' > any* >" p1 = patcomp.compile_pattern(P1) P2 = """for_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > """ p2 = patcomp.compile_pattern(P2) def in_special_context(self, node, isiter): if node.parent is None: return False results = {} if (node.parent.parent is not None and self.p1.match(node.parent.parent, results) and results["node"] is node): if isiter: # iter(d.iterkeys()) -> iter(d.keys()), etc. return results["func"].value in iter_exempt else: # list(d.keys()) -> list(d.keys()), etc. return results["func"].value in fixer_util.consuming_calls if not isiter: return False # for ... in d.iterkeys() -> for ... in d.keys(), etc. return self.p2.match(node.parent, results) and results["node"] is node fixes/fix_isinstance.py000064400000003110151027012300011224 0ustar00# Copyright 2008 Armin Ronacher. # Licensed to PSF under a Contributor Agreement. """Fixer that cleans up a tuple argument to isinstance after the tokens in it were fixed. This is mainly used to remove double occurrences of tokens as a leftover of the long -> int / unicode -> str conversion. eg. isinstance(x, (int, long)) -> isinstance(x, (int, int)) -> isinstance(x, int) """ from .. 
import fixer_base from ..fixer_util import token class FixIsinstance(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< 'isinstance' trailer< '(' arglist< any ',' atom< '(' args=testlist_gexp< any+ > ')' > > ')' > > """ run_order = 6 def transform(self, node, results): names_inserted = set() testlist = results["args"] args = testlist.children new_args = [] iterator = enumerate(args) for idx, arg in iterator: if arg.type == token.NAME and arg.value in names_inserted: if idx < len(args) - 1 and args[idx + 1].type == token.COMMA: next(iterator) continue else: new_args.append(arg) if arg.type == token.NAME: names_inserted.add(arg.value) if new_args and new_args[-1].type == token.COMMA: del new_args[-1] if len(new_args) == 1: atom = testlist.parent new_args[0].prefix = atom.prefix atom.replace(new_args[0]) else: args[:] = new_args node.changed() fixes/fix_input.py000064400000001304151027012300010226 0ustar00"""Fixer that changes input(...) into eval(input(...)).""" # Author: Andre Roberge # Local imports from .. import fixer_base from ..fixer_util import Call, Name from .. import patcomp context = patcomp.compile_pattern("power< 'eval' trailer< '(' any ')' > >") class FixInput(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< 'input' args=trailer< '(' [any] ')' > > """ def transform(self, node, results): # If we're already wrapped in an eval() call, we're done. if context.match(node.parent.parent): return new = node.clone() new.prefix = "" return Call(Name("eval"), [new], prefix=node.prefix) fixes/fix_imports.py000064400000013064151027012300010572 0ustar00"""Fix incompatible imports and module references.""" # Authors: Collin Winter, Nick Edds # Local imports from .. import fixer_base from ..fixer_util import Name, attr_chain MAPPING = {'StringIO': 'io', 'cStringIO': 'io', 'cPickle': 'pickle', '__builtin__' : 'builtins', 'copy_reg': 'copyreg', 'Queue': 'queue', 'SocketServer': 'socketserver', 'ConfigParser': 'configparser', 'repr': 'reprlib', 'FileDialog': 'tkinter.filedialog', 'tkFileDialog': 'tkinter.filedialog', 'SimpleDialog': 'tkinter.simpledialog', 'tkSimpleDialog': 'tkinter.simpledialog', 'tkColorChooser': 'tkinter.colorchooser', 'tkCommonDialog': 'tkinter.commondialog', 'Dialog': 'tkinter.dialog', 'Tkdnd': 'tkinter.dnd', 'tkFont': 'tkinter.font', 'tkMessageBox': 'tkinter.messagebox', 'ScrolledText': 'tkinter.scrolledtext', 'Tkconstants': 'tkinter.constants', 'Tix': 'tkinter.tix', 'ttk': 'tkinter.ttk', 'Tkinter': 'tkinter', 'markupbase': '_markupbase', '_winreg': 'winreg', 'thread': '_thread', 'dummy_thread': '_dummy_thread', # anydbm and whichdb are handled by fix_imports2 'dbhash': 'dbm.bsd', 'dumbdbm': 'dbm.dumb', 'dbm': 'dbm.ndbm', 'gdbm': 'dbm.gnu', 'xmlrpclib': 'xmlrpc.client', 'DocXMLRPCServer': 'xmlrpc.server', 'SimpleXMLRPCServer': 'xmlrpc.server', 'httplib': 'http.client', 'htmlentitydefs' : 'html.entities', 'HTMLParser' : 'html.parser', 'Cookie': 'http.cookies', 'cookielib': 'http.cookiejar', 'BaseHTTPServer': 'http.server', 'SimpleHTTPServer': 'http.server', 'CGIHTTPServer': 'http.server', #'test.test_support': 'test.support', 'commands': 'subprocess', 'UserString' : 'collections', 'UserList' : 'collections', 'urlparse' : 'urllib.parse', 'robotparser' : 'urllib.robotparser', } def alternates(members): return "(" + "|".join(map(repr, members)) + ")" def build_pattern(mapping=MAPPING): mod_list = ' | '.join(["module_name='%s'" % key for key in mapping]) bare_names = alternates(mapping.keys()) yield """name_import=import_name< 'import' ((%s) | 
multiple_imports=dotted_as_names< any* (%s) any* >) > """ % (mod_list, mod_list) yield """import_from< 'from' (%s) 'import' ['('] ( any | import_as_name< any 'as' any > | import_as_names< any* >) [')'] > """ % mod_list yield """import_name< 'import' (dotted_as_name< (%s) 'as' any > | multiple_imports=dotted_as_names< any* dotted_as_name< (%s) 'as' any > any* >) > """ % (mod_list, mod_list) # Find usages of module members in code e.g. thread.foo(bar) yield "power< bare_with_attr=(%s) trailer<'.' any > any* >" % bare_names class FixImports(fixer_base.BaseFix): BM_compatible = True keep_line_order = True # This is overridden in fix_imports2. mapping = MAPPING # We want to run this fixer late, so fix_import doesn't try to make stdlib # renames into relative imports. run_order = 6 def build_pattern(self): return "|".join(build_pattern(self.mapping)) def compile_pattern(self): # We override this, so MAPPING can be pragmatically altered and the # changes will be reflected in PATTERN. self.PATTERN = self.build_pattern() super(FixImports, self).compile_pattern() # Don't match the node if it's within another match. def match(self, node): match = super(FixImports, self).match results = match(node) if results: # Module usage could be in the trailer of an attribute lookup, so we # might have nested matches when "bare_with_attr" is present. if "bare_with_attr" not in results and \ any(match(obj) for obj in attr_chain(node, "parent")): return False return results return False def start_tree(self, tree, filename): super(FixImports, self).start_tree(tree, filename) self.replace = {} def transform(self, node, results): import_mod = results.get("module_name") if import_mod: mod_name = import_mod.value new_name = self.mapping[mod_name] import_mod.replace(Name(new_name, prefix=import_mod.prefix)) if "name_import" in results: # If it's not a "from x import x, y" or "import x as y" import, # marked its usage to be replaced. self.replace[mod_name] = new_name if "multiple_imports" in results: # This is a nasty hack to fix multiple imports on a line (e.g., # "import StringIO, urlparse"). The problem is that I can't # figure out an easy way to make a pattern recognize the keys of # MAPPING randomly sprinkled in an import statement. results = self.match(node) if results: self.transform(node, results) else: # Replace usage of the module. bare_name = results["bare_with_attr"][0] new_name = self.replace.get(bare_name.value) if new_name: bare_name.replace(Name(new_name, prefix=bare_name.prefix)) fixes/fix_execfile.py000064400000004000151027012300010647 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer for execfile. This converts usages of the execfile function into calls to the built-in exec() function. """ from .. import fixer_base from ..fixer_util import (Comma, Name, Call, LParen, RParen, Dot, Node, ArgList, String, syms) class FixExecfile(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< 'execfile' trailer< '(' arglist< filename=any [',' globals=any [',' locals=any ] ] > ')' > > | power< 'execfile' trailer< '(' filename=any ')' > > """ def transform(self, node, results): assert results filename = results["filename"] globals = results.get("globals") locals = results.get("locals") # Copy over the prefix from the right parentheses end of the execfile # call. execfile_paren = node.children[-1].children[-1].clone() # Construct open().read(). 
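# The overall rewrite performed below is:
#   execfile(fn, globs, locs) ->
#       exec(compile(open(fn, "rb").read(), fn, 'exec'), globs, locs)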
open_args = ArgList([filename.clone(), Comma(), String('"rb"', ' ')], rparen=execfile_paren) open_call = Node(syms.power, [Name("open"), open_args]) read = [Node(syms.trailer, [Dot(), Name('read')]), Node(syms.trailer, [LParen(), RParen()])] open_expr = [open_call] + read # Wrap the open call in a compile call. This is so the filename will be # preserved in the execed code. filename_arg = filename.clone() filename_arg.prefix = " " exec_str = String("'exec'", " ") compile_args = open_expr + [Comma(), filename_arg, Comma(), exec_str] compile_call = Call(Name("compile"), compile_args, "") # Finally, replace the execfile call with an exec call. args = [compile_call] if globals is not None: args.extend([Comma(), globals.clone()]) if locals is not None: args.extend([Comma(), locals.clone()]) return Call(Name("exec"), args, prefix=node.prefix) fixes/fix_long.py000064400000000734151027012300010034 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer that turns 'long' into 'int' everywhere. """ # Local imports from lib2to3 import fixer_base from lib2to3.fixer_util import is_probably_builtin class FixLong(fixer_base.BaseFix): BM_compatible = True PATTERN = "'long'" def transform(self, node, results): if is_probably_builtin(node): node.value = "int" node.changed() fixes/fix_raw_input.py000064400000000706151027012300011104 0ustar00"""Fixer that changes raw_input(...) into input(...).""" # Author: Andre Roberge # Local imports from .. import fixer_base from ..fixer_util import Name class FixRawInput(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< name='raw_input' trailer< '(' [any] ')' > any* > """ def transform(self, node, results): name = results["name"] name.replace(Name("input", prefix=name.prefix)) fixes/fix_renames.py000064400000004255151027012300010531 0ustar00"""Fix incompatible renames Fixes: * sys.maxint -> sys.maxsize """ # Author: Christian Heimes # based on Collin Winter's fix_import # Local imports from .. import fixer_base from ..fixer_util import Name, attr_chain MAPPING = {"sys": {"maxint" : "maxsize"}, } LOOKUP = {} def alternates(members): return "(" + "|".join(map(repr, members)) + ")" def build_pattern(): #bare = set() for module, replace in list(MAPPING.items()): for old_attr, new_attr in list(replace.items()): LOOKUP[(module, old_attr)] = new_attr #bare.add(module) #bare.add(old_attr) #yield """ # import_name< 'import' (module=%r # | dotted_as_names< any* module=%r any* >) > # """ % (module, module) yield """ import_from< 'from' module_name=%r 'import' ( attr_name=%r | import_as_name< attr_name=%r 'as' any >) > """ % (module, old_attr, old_attr) yield """ power< module_name=%r trailer< '.' 
attr_name=%r > any* > """ % (module, old_attr) #yield """bare_name=%s""" % alternates(bare) class FixRenames(fixer_base.BaseFix): BM_compatible = True PATTERN = "|".join(build_pattern()) order = "pre" # Pre-order tree traversal # Don't match the node if it's within another match def match(self, node): match = super(FixRenames, self).match results = match(node) if results: if any(match(obj) for obj in attr_chain(node, "parent")): return False return results return False #def start_tree(self, tree, filename): # super(FixRenames, self).start_tree(tree, filename) # self.replace = {} def transform(self, node, results): mod_name = results.get("module_name") attr_name = results.get("attr_name") #bare_name = results.get("bare_name") #import_mod = results.get("module") if mod_name and attr_name: new_attr = LOOKUP[(mod_name.value, attr_name.value)] attr_name.replace(Name(new_attr, prefix=attr_name.prefix)) fixes/fix_set_literal.py000064400000003241151027012300011400 0ustar00""" Optional fixer to transform set() calls to set literals. """ # Author: Benjamin Peterson from lib2to3 import fixer_base, pytree from lib2to3.fixer_util import token, syms class FixSetLiteral(fixer_base.BaseFix): BM_compatible = True explicit = True PATTERN = """power< 'set' trailer< '(' (atom=atom< '[' (items=listmaker< any ((',' any)* [',']) > | single=any) ']' > | atom< '(' items=testlist_gexp< any ((',' any)* [',']) > ')' > ) ')' > > """ def transform(self, node, results): single = results.get("single") if single: # Make a fake listmaker fake = pytree.Node(syms.listmaker, [single.clone()]) single.replace(fake) items = fake else: items = results["items"] # Build the contents of the literal literal = [pytree.Leaf(token.LBRACE, "{")] literal.extend(n.clone() for n in items.children) literal.append(pytree.Leaf(token.RBRACE, "}")) # Set the prefix of the right brace to that of the ')' or ']' literal[-1].prefix = items.next_sibling.prefix maker = pytree.Node(syms.dictsetmaker, literal) maker.prefix = node.prefix # If the original was a one tuple, we need to remove the extra comma. if len(maker.children) == 4: n = maker.children[2] n.remove() maker.children[-1].prefix = n.prefix # Finally, replace the set call with our shiny new literal. return maker fixes/fix_except.py000064400000006420151027012300010363 0ustar00"""Fixer for except statements with named exceptions. The following cases will be converted: - "except E, T:" where T is a name: except E as T: - "except E, T:" where T is not a name, tuple or list: except E as t: T = t This is done because the target of an "except" clause must be a name. - "except E, T:" where T is a tuple or list literal: except E as t: T = t.args """ # Author: Collin Winter # Local imports from .. import pytree from ..pgen2 import token from .. 
import fixer_base from ..fixer_util import Assign, Attr, Name, is_tuple, is_list, syms def find_excepts(nodes): for i, n in enumerate(nodes): if n.type == syms.except_clause: if n.children[0].value == 'except': yield (n, nodes[i+2]) class FixExcept(fixer_base.BaseFix): BM_compatible = True PATTERN = """ try_stmt< 'try' ':' (simple_stmt | suite) cleanup=(except_clause ':' (simple_stmt | suite))+ tail=(['except' ':' (simple_stmt | suite)] ['else' ':' (simple_stmt | suite)] ['finally' ':' (simple_stmt | suite)]) > """ def transform(self, node, results): syms = self.syms tail = [n.clone() for n in results["tail"]] try_cleanup = [ch.clone() for ch in results["cleanup"]] for except_clause, e_suite in find_excepts(try_cleanup): if len(except_clause.children) == 4: (E, comma, N) = except_clause.children[1:4] comma.replace(Name("as", prefix=" ")) if N.type != token.NAME: # Generate a new N for the except clause new_N = Name(self.new_name(), prefix=" ") target = N.clone() target.prefix = "" N.replace(new_N) new_N = new_N.clone() # Insert "old_N = new_N" as the first statement in # the except body. This loop skips leading whitespace # and indents #TODO(cwinter) suite-cleanup suite_stmts = e_suite.children for i, stmt in enumerate(suite_stmts): if isinstance(stmt, pytree.Node): break # The assignment is different if old_N is a tuple or list # In that case, the assignment is old_N = new_N.args if is_tuple(N) or is_list(N): assign = Assign(target, Attr(new_N, Name('args'))) else: assign = Assign(target, new_N) #TODO(cwinter) stopgap until children becomes a smart list for child in reversed(suite_stmts[:i]): e_suite.insert_child(0, child) e_suite.insert_child(i, assign) elif N.prefix == "": # No space after a comma is legal; no space after "as", # not so much. N.prefix = " " #TODO(cwinter) fix this when children becomes a smart list children = [c.clone() for c in node.children[:3]] + try_cleanup + tail return pytree.Node(node.type, children) fixes/fix_methodattrs.py000064400000001136151027012300011430 0ustar00"""Fix bound method attributes (method.im_? -> method.__?__). """ # Author: Christian Heimes # Local imports from .. import fixer_base from ..fixer_util import Name MAP = { "im_func" : "__func__", "im_self" : "__self__", "im_class" : "__self__.__class__" } class FixMethodattrs(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< any+ trailer< '.' attr=('im_func' | 'im_self' | 'im_class') > any* > """ def transform(self, node, results): attr = results["attr"][0] new = MAP[attr.value] attr.replace(Name(new, prefix=attr.prefix)) fixes/fix_metaclass.py000064400000020005151027012300011042 0ustar00"""Fixer for __metaclass__ = X -> (metaclass=X) methods. The various forms of classef (inherits nothing, inherits once, inherits many) don't parse the same in the CST so we look at ALL classes for a __metaclass__ and if we find one normalize the inherits to all be an arglist. For one-liner classes ('class X: pass') there is no indent/dedent so we normalize those into having a suite. Moving the __metaclass__ into the classdef can also cause the class body to be empty so there is some special casing for that as well. This fixer also tries very hard to keep original indenting and spacing in all those corner cases. """ # Author: Jack Diederich # Local imports from .. import fixer_base from ..pygram import token from ..fixer_util import syms, Node, Leaf def has_metaclass(parent): """ we have to check the cls_node without changing it. 
There are two possibilities: 1) clsdef => suite => simple_stmt => expr_stmt => Leaf('__meta') 2) clsdef => simple_stmt => expr_stmt => Leaf('__meta') """ for node in parent.children: if node.type == syms.suite: return has_metaclass(node) elif node.type == syms.simple_stmt and node.children: expr_node = node.children[0] if expr_node.type == syms.expr_stmt and expr_node.children: left_side = expr_node.children[0] if isinstance(left_side, Leaf) and \ left_side.value == '__metaclass__': return True return False def fixup_parse_tree(cls_node): """ one-line classes don't get a suite in the parse tree so we add one to normalize the tree """ for node in cls_node.children: if node.type == syms.suite: # already in the preferred format, do nothing return # !%@#! one-liners have no suite node, we have to fake one up for i, node in enumerate(cls_node.children): if node.type == token.COLON: break else: raise ValueError("No class suite and no ':'!") # move everything into a suite node suite = Node(syms.suite, []) while cls_node.children[i+1:]: move_node = cls_node.children[i+1] suite.append_child(move_node.clone()) move_node.remove() cls_node.append_child(suite) node = suite def fixup_simple_stmt(parent, i, stmt_node): """ if there is a semi-colon all the parts count as part of the same simple_stmt. We just want the __metaclass__ part so we move everything after the semi-colon into its own simple_stmt node """ for semi_ind, node in enumerate(stmt_node.children): if node.type == token.SEMI: # *sigh* break else: return node.remove() # kill the semicolon new_expr = Node(syms.expr_stmt, []) new_stmt = Node(syms.simple_stmt, [new_expr]) while stmt_node.children[semi_ind:]: move_node = stmt_node.children[semi_ind] new_expr.append_child(move_node.clone()) move_node.remove() parent.insert_child(i, new_stmt) new_leaf1 = new_stmt.children[0].children[0] old_leaf1 = stmt_node.children[0].children[0] new_leaf1.prefix = old_leaf1.prefix def remove_trailing_newline(node): if node.children and node.children[-1].type == token.NEWLINE: node.children[-1].remove() def find_metas(cls_node): # find the suite node (Mmm, sweet nodes) for node in cls_node.children: if node.type == syms.suite: break else: raise ValueError("No class suite!") # look for simple_stmt[ expr_stmt[ Leaf('__metaclass__') ] ] for i, simple_node in list(enumerate(node.children)): if simple_node.type == syms.simple_stmt and simple_node.children: expr_node = simple_node.children[0] if expr_node.type == syms.expr_stmt and expr_node.children: # Check if the expr_node is a simple assignment. left_node = expr_node.children[0] if isinstance(left_node, Leaf) and \ left_node.value == '__metaclass__': # We found an assignment to __metaclass__. 
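# e.g. for "class C: __metaclass__ = Meta; x = 1", split the
# statement at the semicolon so that only the __metaclass__
# assignment is yielded (and later moved into the class header).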
fixup_simple_stmt(node, i, simple_node) remove_trailing_newline(simple_node) yield (node, i, simple_node) def fixup_indent(suite): """ If an INDENT is followed by a thing with a prefix then nuke the prefix Otherwise we get in trouble when removing __metaclass__ at suite start """ kids = suite.children[::-1] # find the first indent while kids: node = kids.pop() if node.type == token.INDENT: break # find the first Leaf while kids: node = kids.pop() if isinstance(node, Leaf) and node.type != token.DEDENT: if node.prefix: node.prefix = '' return else: kids.extend(node.children[::-1]) class FixMetaclass(fixer_base.BaseFix): BM_compatible = True PATTERN = """ classdef """ def transform(self, node, results): if not has_metaclass(node): return fixup_parse_tree(node) # find metaclasses, keep the last one last_metaclass = None for suite, i, stmt in find_metas(node): last_metaclass = stmt stmt.remove() text_type = node.children[0].type # always Leaf(nnn, 'class') # figure out what kind of classdef we have if len(node.children) == 7: # Node(classdef, ['class', 'name', '(', arglist, ')', ':', suite]) # 0 1 2 3 4 5 6 if node.children[3].type == syms.arglist: arglist = node.children[3] # Node(classdef, ['class', 'name', '(', 'Parent', ')', ':', suite]) else: parent = node.children[3].clone() arglist = Node(syms.arglist, [parent]) node.set_child(3, arglist) elif len(node.children) == 6: # Node(classdef, ['class', 'name', '(', ')', ':', suite]) # 0 1 2 3 4 5 arglist = Node(syms.arglist, []) node.insert_child(3, arglist) elif len(node.children) == 4: # Node(classdef, ['class', 'name', ':', suite]) # 0 1 2 3 arglist = Node(syms.arglist, []) node.insert_child(2, Leaf(token.RPAR, ')')) node.insert_child(2, arglist) node.insert_child(2, Leaf(token.LPAR, '(')) else: raise ValueError("Unexpected class definition") # now stick the metaclass in the arglist meta_txt = last_metaclass.children[0].children[0] meta_txt.value = 'metaclass' orig_meta_prefix = meta_txt.prefix if arglist.children: arglist.append_child(Leaf(token.COMMA, ',')) meta_txt.prefix = ' ' else: meta_txt.prefix = '' # compact the expression "metaclass = Meta" -> "metaclass=Meta" expr_stmt = last_metaclass.children[0] assert expr_stmt.type == syms.expr_stmt expr_stmt.children[1].prefix = '' expr_stmt.children[2].prefix = '' arglist.append_child(last_metaclass) fixup_indent(suite) # check for empty suite if not suite.children: # one-liner that was just __metaclass_ suite.remove() pass_leaf = Leaf(text_type, 'pass') pass_leaf.prefix = orig_meta_prefix node.append_child(pass_leaf) node.append_child(Leaf(token.NEWLINE, '\n')) elif len(suite.children) > 1 and \ (suite.children[-2].type == token.INDENT and suite.children[-1].type == token.DEDENT): # there was only one line in the class body and it was __metaclass__ pass_leaf = Leaf(text_type, 'pass') suite.insert_child(-1, pass_leaf) suite.insert_child(-1, Leaf(token.NEWLINE, '\n')) fixes/fix_itertools.py000064400000003014151027012300011113 0ustar00""" Fixer for itertools.(imap|ifilter|izip) --> (map|filter|zip) and itertools.ifilterfalse --> itertools.filterfalse (bugs 2360-2363) imports from itertools are fixed in fix_itertools_import.py If itertools is imported as something else (ie: import itertools as it; it.izip(spam, eggs)) method calls will not get fixed. """ # Local imports from .. 
import fixer_base from ..fixer_util import Name class FixItertools(fixer_base.BaseFix): BM_compatible = True it_funcs = "('imap'|'ifilter'|'izip'|'izip_longest'|'ifilterfalse')" PATTERN = """ power< it='itertools' trailer< dot='.' func=%(it_funcs)s > trailer< '(' [any] ')' > > | power< func=%(it_funcs)s trailer< '(' [any] ')' > > """ %(locals()) # Needs to be run after fix_(map|zip|filter) run_order = 6 def transform(self, node, results): prefix = None func = results['func'][0] if ('it' in results and func.value not in ('ifilterfalse', 'izip_longest')): dot, it = (results['dot'], results['it']) # Remove the 'itertools' prefix = it.prefix it.remove() # Replace the node which contains ('.', 'function') with the # function (to be consistent with the second part of the pattern) dot.remove() func.parent.replace(func) prefix = prefix or func.prefix func.replace(Name(func.value[1:], prefix=prefix)) fixes/fix_future.py000064400000001043151027012300010401 0ustar00"""Remove __future__ imports from __future__ import foo is replaced with an empty line. """ # Author: Christian Heimes # Local imports from .. import fixer_base from ..fixer_util import BlankLine class FixFuture(fixer_base.BaseFix): BM_compatible = True PATTERN = """import_from< 'from' module_name="__future__" 'import' any >""" # This should be run last -- some things check for the import run_order = 10 def transform(self, node, results): new = BlankLine() new.prefix = node.prefix return new fixes/fix_tuple_params.py000064400000012675151027012300011600 0ustar00"""Fixer for function definitions with tuple parameters. def func(((a, b), c), d): ... -> def func(x, d): ((a, b), c) = x ... It will also support lambdas: lambda (x, y): x + y -> lambda t: t[0] + t[1] # The parens are a syntax error in Python 3 lambda (x): x + y -> lambda x: x + y """ # Author: Collin Winter # Local imports from .. import pytree from ..pgen2 import token from .. import fixer_base from ..fixer_util import Assign, Name, Newline, Number, Subscript, syms def is_docstring(stmt): return isinstance(stmt, pytree.Node) and \ stmt.children[0].type == token.STRING class FixTupleParams(fixer_base.BaseFix): run_order = 4 #use a lower order since lambda is part of other #patterns BM_compatible = True PATTERN = """ funcdef< 'def' any parameters< '(' args=any ')' > ['->' any] ':' suite=any+ > | lambda= lambdef< 'lambda' args=vfpdef< '(' inner=any ')' > ':' body=any > """ def transform(self, node, results): if "lambda" in results: return self.transform_lambda(node, results) new_lines = [] suite = results["suite"] args = results["args"] # This crap is so "def foo(...): x = 5; y = 7" is handled correctly. # TODO(cwinter): suite-cleanup if suite[0].children[1].type == token.INDENT: start = 2 indent = suite[0].children[1].value end = Newline() else: start = 0 indent = "; " end = pytree.Leaf(token.INDENT, "") # We need access to self for new_name(), and making this a method # doesn't feel right. Closing over self and new_lines makes the # code below cleaner. def handle_tuple(tuple_arg, add_prefix=False): n = Name(self.new_name()) arg = tuple_arg.clone() arg.prefix = "" stmt = Assign(arg, n.clone()) if add_prefix: n.prefix = " " tuple_arg.replace(n) new_lines.append(pytree.Node(syms.simple_stmt, [stmt, end.clone()])) if args.type == syms.tfpdef: handle_tuple(args) elif args.type == syms.typedargslist: for i, arg in enumerate(args.children): if arg.type == syms.tfpdef: # Without add_prefix, the emitted code is correct, # just ugly. 
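# e.g. "def f(a, (b, c)): ..." becomes
# "def f(a, xxx_todo_changeme): (b, c) = xxx_todo_changeme"
# (the name comes from BaseFix.new_name()'s default template).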
                    handle_tuple(arg, add_prefix=(i > 0))

        if not new_lines:
            return

        # This isn't strictly necessary, but it plays nicely with other fixers.
        # TODO(cwinter) get rid of this when children becomes a smart list
        for line in new_lines:
            line.parent = suite[0]

        # TODO(cwinter) suite-cleanup
        after = start
        if start == 0:
            new_lines[0].prefix = " "
        elif is_docstring(suite[0].children[start]):
            new_lines[0].prefix = indent
            after = start + 1

        for line in new_lines:
            line.parent = suite[0]
        suite[0].children[after:after] = new_lines
        for i in range(after+1, after+len(new_lines)+1):
            suite[0].children[i].prefix = indent
        suite[0].changed()

    def transform_lambda(self, node, results):
        args = results["args"]
        body = results["body"]
        inner = simplify_args(results["inner"])

        # Replace lambda ((((x)))): x  with lambda x: x
        if inner.type == token.NAME:
            inner = inner.clone()
            inner.prefix = " "
            args.replace(inner)
            return

        params = find_params(args)
        to_index = map_to_index(params)
        tup_name = self.new_name(tuple_name(params))

        new_param = Name(tup_name, prefix=" ")
        args.replace(new_param.clone())
        for n in body.post_order():
            if n.type == token.NAME and n.value in to_index:
                subscripts = [c.clone() for c in to_index[n.value]]
                new = pytree.Node(syms.power,
                                  [new_param.clone()] + subscripts)
                new.prefix = n.prefix
                n.replace(new)


### Helper functions for transform_lambda()

def simplify_args(node):
    if node.type in (syms.vfplist, token.NAME):
        return node
    elif node.type == syms.vfpdef:
        # These look like vfpdef< '(' x ')' > where x is NAME
        # or another vfpdef instance (leading to recursion).
        while node.type == syms.vfpdef:
            node = node.children[1]
        return node
    raise RuntimeError("Received unexpected node %s" % node)

def find_params(node):
    if node.type == syms.vfpdef:
        return find_params(node.children[1])
    elif node.type == token.NAME:
        return node.value
    return [find_params(c) for c in node.children if c.type != token.COMMA]

def map_to_index(param_list, prefix=[], d=None):
    if d is None:
        d = {}
    for i, obj in enumerate(param_list):
        trailer = [Subscript(Number(str(i)))]
        if isinstance(obj, list):
            map_to_index(obj, trailer, d=d)
        else:
            d[obj] = prefix + trailer
    return d

def tuple_name(param_list):
    l = []
    for obj in param_list:
        if isinstance(obj, list):
            l.append(tuple_name(obj))
        else:
            l.append(obj)
    return "_".join(l)

# ----- fixes/fix_reload.py -----
"""Fixer for reload().

reload(s) -> importlib.reload(s)"""

# Local imports
from .. import fixer_base
from ..fixer_util import ImportAndCall, touch_import


class FixReload(fixer_base.BaseFix):
    BM_compatible = True
    order = "pre"

    PATTERN = """
    power< 'reload'
           trailer< lpar='('
                    ( not(arglist | argument<any '=' any>) obj=any
                      | obj=arglist<(not argument<any '=' any>) any ','> )
                    rpar=')' >
           after=any*
    >
    """

    def transform(self, node, results):
        if results:
            # I feel like we should be able to express this logic in the
            # PATTERN above but I don't know how to do it so...
            obj = results['obj']
            if obj:
                if (obj.type == self.syms.argument and
                    obj.children[0].value in {'**', '*'}):
                    return  # Make no change.
        names = ('importlib', 'reload')
        new = ImportAndCall(node, results, names)
        touch_import(None, 'importlib', node)
        return new

# ----- fixes/fix_raise.py -----
"""Fixer for 'raise E, V, T'

raise         -> raise
raise E       -> raise E
raise E, V    -> raise E(V)
raise E, V, T -> raise E(V).with_traceback(T)
raise E, None, T -> raise E.with_traceback(T)

raise (((E, E'), E''), E'''), V -> raise E(V)
raise "foo", V, T               -> warns about string exceptions


CAVEATS:
1) "raise E, V" will be incorrectly translated if V is an exception
   instance.
The correct Python 3 idiom is raise E from V but since we can't detect instance-hood by syntax alone and since any client code would have to be changed as well, we don't automate this. """ # Author: Collin Winter # Local imports from .. import pytree from ..pgen2 import token from .. import fixer_base from ..fixer_util import Name, Call, Attr, ArgList, is_tuple class FixRaise(fixer_base.BaseFix): BM_compatible = True PATTERN = """ raise_stmt< 'raise' exc=any [',' val=any [',' tb=any]] > """ def transform(self, node, results): syms = self.syms exc = results["exc"].clone() if exc.type == token.STRING: msg = "Python 3 does not support string exceptions" self.cannot_convert(node, msg) return # Python 2 supports # raise ((((E1, E2), E3), E4), E5), V # as a synonym for # raise E1, V # Since Python 3 will not support this, we recurse down any tuple # literals, always taking the first element. if is_tuple(exc): while is_tuple(exc): # exc.children[1:-1] is the unparenthesized tuple # exc.children[1].children[0] is the first element of the tuple exc = exc.children[1].children[0].clone() exc.prefix = " " if "val" not in results: # One-argument raise new = pytree.Node(syms.raise_stmt, [Name("raise"), exc]) new.prefix = node.prefix return new val = results["val"].clone() if is_tuple(val): args = [c.clone() for c in val.children[1:-1]] else: val.prefix = "" args = [val] if "tb" in results: tb = results["tb"].clone() tb.prefix = "" e = exc # If there's a traceback and None is passed as the value, then don't # add a call, since the user probably just wants to add a # traceback. See issue #9661. if val.type != token.NAME or val.value != "None": e = Call(exc, args) with_tb = Attr(e, Name('with_traceback')) + [ArgList([tb])] new = pytree.Node(syms.simple_stmt, [Name("raise")] + with_tb) new.prefix = node.prefix return new else: return pytree.Node(syms.raise_stmt, [Name("raise"), Call(exc, args)], prefix=node.prefix) fixes/fix_types.py000064400000003356151027012300010244 0ustar00# Copyright 2007 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer for removing uses of the types module. These work for only the known names in the types module. The forms above can include types. or not. ie, It is assumed the module is imported either as: import types from types import ... # either * or specific types The import statements are not modified. There should be another fixer that handles at least the following constants: type([]) -> list type(()) -> tuple type('') -> str """ # Local imports from .. import fixer_base from ..fixer_util import Name _TYPE_MAPPING = { 'BooleanType' : 'bool', 'BufferType' : 'memoryview', 'ClassType' : 'type', 'ComplexType' : 'complex', 'DictType': 'dict', 'DictionaryType' : 'dict', 'EllipsisType' : 'type(Ellipsis)', #'FileType' : 'io.IOBase', 'FloatType': 'float', 'IntType': 'int', 'ListType': 'list', 'LongType': 'int', 'ObjectType' : 'object', 'NoneType': 'type(None)', 'NotImplementedType' : 'type(NotImplemented)', 'SliceType' : 'slice', 'StringType': 'bytes', # XXX ? 'StringTypes' : '(str,)', # XXX ? 'TupleType': 'tuple', 'TypeType' : 'type', 'UnicodeType': 'str', 'XRangeType' : 'range', } _pats = ["power< 'types' trailer< '.' 
name='%s' > >" % t for t in _TYPE_MAPPING] class FixTypes(fixer_base.BaseFix): BM_compatible = True PATTERN = '|'.join(_pats) def transform(self, node, results): new_value = _TYPE_MAPPING.get(results["name"].value) if new_value: return Name(new_value, prefix=node.prefix) return None fixes/fix_urllib.py000064400000020257151027012300010370 0ustar00"""Fix changes imports of urllib which are now incompatible. This is rather similar to fix_imports, but because of the more complex nature of the fixing for urllib, it has its own fixer. """ # Author: Nick Edds # Local imports from lib2to3.fixes.fix_imports import alternates, FixImports from lib2to3.fixer_util import (Name, Comma, FromImport, Newline, find_indentation, Node, syms) MAPPING = {"urllib": [ ("urllib.request", ["URLopener", "FancyURLopener", "urlretrieve", "_urlopener", "urlopen", "urlcleanup", "pathname2url", "url2pathname", "getproxies"]), ("urllib.parse", ["quote", "quote_plus", "unquote", "unquote_plus", "urlencode", "splitattr", "splithost", "splitnport", "splitpasswd", "splitport", "splitquery", "splittag", "splittype", "splituser", "splitvalue", ]), ("urllib.error", ["ContentTooShortError"])], "urllib2" : [ ("urllib.request", ["urlopen", "install_opener", "build_opener", "Request", "OpenerDirector", "BaseHandler", "HTTPDefaultErrorHandler", "HTTPRedirectHandler", "HTTPCookieProcessor", "ProxyHandler", "HTTPPasswordMgr", "HTTPPasswordMgrWithDefaultRealm", "AbstractBasicAuthHandler", "HTTPBasicAuthHandler", "ProxyBasicAuthHandler", "AbstractDigestAuthHandler", "HTTPDigestAuthHandler", "ProxyDigestAuthHandler", "HTTPHandler", "HTTPSHandler", "FileHandler", "FTPHandler", "CacheFTPHandler", "UnknownHandler"]), ("urllib.error", ["URLError", "HTTPError"]), ] } # Duplicate the url parsing functions for urllib2. MAPPING["urllib2"].append(MAPPING["urllib"][1]) def build_pattern(): bare = set() for old_module, changes in MAPPING.items(): for change in changes: new_module, members = change members = alternates(members) yield """import_name< 'import' (module=%r | dotted_as_names< any* module=%r any* >) > """ % (old_module, old_module) yield """import_from< 'from' mod_member=%r 'import' ( member=%s | import_as_name< member=%s 'as' any > | import_as_names< members=any* >) > """ % (old_module, members, members) yield """import_from< 'from' module_star=%r 'import' star='*' > """ % old_module yield """import_name< 'import' dotted_as_name< module_as=%r 'as' any > > """ % old_module # bare_with_attr has a special significance for FixImports.match(). yield """power< bare_with_attr=%r trailer< '.' member=%s > any* > """ % (old_module, members) class FixUrllib(FixImports): def build_pattern(self): return "|".join(build_pattern()) def transform_import(self, node, results): """Transform for the basic import case. Replaces the old import name with a comma separated list of its replacements. """ import_mod = results.get("module") pref = import_mod.prefix names = [] # create a Node list of the replacement modules for name in MAPPING[import_mod.value][:-1]: names.extend([Name(name[0], prefix=pref), Comma()]) names.append(Name(MAPPING[import_mod.value][-1][0], prefix=pref)) import_mod.replace(names) def transform_member(self, node, results): """Transform for imports of specific module elements. Replaces the module to be imported from with the appropriate new module. 
""" mod_member = results.get("mod_member") pref = mod_member.prefix member = results.get("member") # Simple case with only a single member being imported if member: # this may be a list of length one, or just a node if isinstance(member, list): member = member[0] new_name = None for change in MAPPING[mod_member.value]: if member.value in change[1]: new_name = change[0] break if new_name: mod_member.replace(Name(new_name, prefix=pref)) else: self.cannot_convert(node, "This is an invalid module element") # Multiple members being imported else: # a dictionary for replacements, order matters modules = [] mod_dict = {} members = results["members"] for member in members: # we only care about the actual members if member.type == syms.import_as_name: as_name = member.children[2].value member_name = member.children[0].value else: member_name = member.value as_name = None if member_name != ",": for change in MAPPING[mod_member.value]: if member_name in change[1]: if change[0] not in mod_dict: modules.append(change[0]) mod_dict.setdefault(change[0], []).append(member) new_nodes = [] indentation = find_indentation(node) first = True def handle_name(name, prefix): if name.type == syms.import_as_name: kids = [Name(name.children[0].value, prefix=prefix), name.children[1].clone(), name.children[2].clone()] return [Node(syms.import_as_name, kids)] return [Name(name.value, prefix=prefix)] for module in modules: elts = mod_dict[module] names = [] for elt in elts[:-1]: names.extend(handle_name(elt, pref)) names.append(Comma()) names.extend(handle_name(elts[-1], pref)) new = FromImport(module, names) if not first or node.parent.prefix.endswith(indentation): new.prefix = indentation new_nodes.append(new) first = False if new_nodes: nodes = [] for new_node in new_nodes[:-1]: nodes.extend([new_node, Newline()]) nodes.append(new_nodes[-1]) node.replace(nodes) else: self.cannot_convert(node, "All module elements are invalid") def transform_dot(self, node, results): """Transform for calls to module members in code.""" module_dot = results.get("bare_with_attr") member = results.get("member") new_name = None if isinstance(member, list): member = member[0] for change in MAPPING[module_dot.value]: if member.value in change[1]: new_name = change[0] break if new_name: module_dot.replace(Name(new_name, prefix=module_dot.prefix)) else: self.cannot_convert(node, "This is an invalid module element") def transform(self, node, results): if results.get("module"): self.transform_import(node, results) elif results.get("mod_member"): self.transform_member(node, results) elif results.get("bare_with_attr"): self.transform_dot(node, results) # Renaming and star imports are not supported for these modules. elif results.get("module_star"): self.cannot_convert(node, "Cannot handle star imports.") elif results.get("module_as"): self.cannot_convert(node, "This module is now multiple modules") fixes/fix_exitfunc.py000064400000004677151027012300010734 0ustar00""" Convert use of sys.exitfunc to use the atexit module. """ # Author: Benjamin Peterson from lib2to3 import pytree, fixer_base from lib2to3.fixer_util import Name, Attr, Call, Comma, Newline, syms class FixExitfunc(fixer_base.BaseFix): keep_line_order = True BM_compatible = True PATTERN = """ ( sys_import=import_name<'import' ('sys' | dotted_as_names< (any ',')* 'sys' (',' any)* > ) > | expr_stmt< power< 'sys' trailer< '.' 
'exitfunc' > > '=' func=any > ) """ def __init__(self, *args): super(FixExitfunc, self).__init__(*args) def start_tree(self, tree, filename): super(FixExitfunc, self).start_tree(tree, filename) self.sys_import = None def transform(self, node, results): # First, find the sys import. We'll just hope it's global scope. if "sys_import" in results: if self.sys_import is None: self.sys_import = results["sys_import"] return func = results["func"].clone() func.prefix = "" register = pytree.Node(syms.power, Attr(Name("atexit"), Name("register")) ) call = Call(register, [func], node.prefix) node.replace(call) if self.sys_import is None: # That's interesting. self.warning(node, "Can't find sys import; Please add an atexit " "import at the top of your file.") return # Now add an atexit import after the sys import. names = self.sys_import.children[1] if names.type == syms.dotted_as_names: names.append_child(Comma()) names.append_child(Name("atexit", " ")) else: containing_stmt = self.sys_import.parent position = containing_stmt.children.index(self.sys_import) stmt_container = containing_stmt.parent new_import = pytree.Node(syms.import_name, [Name("import"), Name("atexit", " ")] ) new = pytree.Node(syms.simple_stmt, [new_import]) containing_stmt.insert_child(position + 1, Newline()) containing_stmt.insert_child(position + 2, new) fixes/fix_operator.py000064400000006542151027012300010733 0ustar00"""Fixer for operator functions. operator.isCallable(obj) -> callable(obj) operator.sequenceIncludes(obj) -> operator.contains(obj) operator.isSequenceType(obj) -> isinstance(obj, collections.abc.Sequence) operator.isMappingType(obj) -> isinstance(obj, collections.abc.Mapping) operator.isNumberType(obj) -> isinstance(obj, numbers.Number) operator.repeat(obj, n) -> operator.mul(obj, n) operator.irepeat(obj, n) -> operator.imul(obj, n) """ import collections.abc # Local imports from lib2to3 import fixer_base from lib2to3.fixer_util import Call, Name, String, touch_import def invocation(s): def dec(f): f.invocation = s return f return dec class FixOperator(fixer_base.BaseFix): BM_compatible = True order = "pre" methods = """ method=('isCallable'|'sequenceIncludes' |'isSequenceType'|'isMappingType'|'isNumberType' |'repeat'|'irepeat') """ obj = "'(' obj=any ')'" PATTERN = """ power< module='operator' trailer< '.' 
%(methods)s > trailer< %(obj)s > > | power< %(methods)s trailer< %(obj)s > > """ % dict(methods=methods, obj=obj) def transform(self, node, results): method = self._check_method(node, results) if method is not None: return method(node, results) @invocation("operator.contains(%s)") def _sequenceIncludes(self, node, results): return self._handle_rename(node, results, "contains") @invocation("callable(%s)") def _isCallable(self, node, results): obj = results["obj"] return Call(Name("callable"), [obj.clone()], prefix=node.prefix) @invocation("operator.mul(%s)") def _repeat(self, node, results): return self._handle_rename(node, results, "mul") @invocation("operator.imul(%s)") def _irepeat(self, node, results): return self._handle_rename(node, results, "imul") @invocation("isinstance(%s, collections.abc.Sequence)") def _isSequenceType(self, node, results): return self._handle_type2abc(node, results, "collections.abc", "Sequence") @invocation("isinstance(%s, collections.abc.Mapping)") def _isMappingType(self, node, results): return self._handle_type2abc(node, results, "collections.abc", "Mapping") @invocation("isinstance(%s, numbers.Number)") def _isNumberType(self, node, results): return self._handle_type2abc(node, results, "numbers", "Number") def _handle_rename(self, node, results, name): method = results["method"][0] method.value = name method.changed() def _handle_type2abc(self, node, results, module, abc): touch_import(None, module, node) obj = results["obj"] args = [obj.clone(), String(", " + ".".join([module, abc]))] return Call(Name("isinstance"), args, prefix=node.prefix) def _check_method(self, node, results): method = getattr(self, "_" + results["method"][0].value) if isinstance(method, collections.abc.Callable): if "module" in results: return method else: sub = (str(results["obj"]),) invocation_str = method.invocation % sub self.warning(node, "You should use '%s' here." % invocation_str) return None fixes/fix_ne.py000064400000001073151027012300007474 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer that turns <> into !=.""" # Local imports from .. import pytree from ..pgen2 import token from .. import fixer_base class FixNe(fixer_base.BaseFix): # This is so simple that we don't need the pattern compiler. _accept_type = token.NOTEQUAL def match(self, node): # Override return node.value == "<>" def transform(self, node, results): new = pytree.Leaf(token.NOTEQUAL, "!=", prefix=node.prefix) return new fixes/fix_buffer.py000064400000001116151027012300010341 0ustar00# Copyright 2007 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer that changes buffer(...) into memoryview(...).""" # Local imports from .. import fixer_base from ..fixer_util import Name class FixBuffer(fixer_base.BaseFix): BM_compatible = True explicit = True # The user must ask for this fixer PATTERN = """ power< name='buffer' trailer< '(' [any] ')' > any* > """ def transform(self, node, results): name = results["name"] name.replace(Name("memoryview", prefix=name.prefix)) fixes/fix_next.py000064400000006146151027012300010056 0ustar00"""Fixer for it.next() -> next(it), per PEP 3114.""" # Author: Collin Winter # Things that currently aren't covered: # - listcomp "next" names aren't warned # - "with" statement targets aren't checked # Local imports from ..pgen2 import token from ..pygram import python_symbols as syms from .. 
import fixer_base from ..fixer_util import Name, Call, find_binding bind_warning = "Calls to builtin next() possibly shadowed by global binding" class FixNext(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< base=any+ trailer< '.' attr='next' > trailer< '(' ')' > > | power< head=any+ trailer< '.' attr='next' > not trailer< '(' ')' > > | classdef< 'class' any+ ':' suite< any* funcdef< 'def' name='next' parameters< '(' NAME ')' > any+ > any* > > | global=global_stmt< 'global' any* 'next' any* > """ order = "pre" # Pre-order tree traversal def start_tree(self, tree, filename): super(FixNext, self).start_tree(tree, filename) n = find_binding('next', tree) if n: self.warning(n, bind_warning) self.shadowed_next = True else: self.shadowed_next = False def transform(self, node, results): assert results base = results.get("base") attr = results.get("attr") name = results.get("name") if base: if self.shadowed_next: attr.replace(Name("__next__", prefix=attr.prefix)) else: base = [n.clone() for n in base] base[0].prefix = "" node.replace(Call(Name("next", prefix=node.prefix), base)) elif name: n = Name("__next__", prefix=name.prefix) name.replace(n) elif attr: # We don't do this transformation if we're assigning to "x.next". # Unfortunately, it doesn't seem possible to do this in PATTERN, # so it's being done here. if is_assign_target(node): head = results["head"] if "".join([str(n) for n in head]).strip() == '__builtin__': self.warning(node, bind_warning) return attr.replace(Name("__next__")) elif "global" in results: self.warning(node, bind_warning) self.shadowed_next = True ### The following functions help test if node is part of an assignment ### target. def is_assign_target(node): assign = find_assign(node) if assign is None: return False for child in assign.children: if child.type == token.EQUAL: return False elif is_subtree(child, node): return True return False def find_assign(node): if node.type == syms.expr_stmt: return node if node.type == syms.simple_stmt or node.parent is None: return None return find_assign(node.parent) def is_subtree(root, node): if root == node: return True return any(is_subtree(c, node) for c in root.children) fixes/fix_funcattrs.py000064400000001204151027012300011077 0ustar00"""Fix function attribute names (f.func_x -> f.__x__).""" # Author: Collin Winter # Local imports from .. import fixer_base from ..fixer_util import Name class FixFuncattrs(fixer_base.BaseFix): BM_compatible = True PATTERN = """ power< any+ trailer< '.' attr=('func_closure' | 'func_doc' | 'func_globals' | 'func_name' | 'func_defaults' | 'func_code' | 'func_dict') > any* > """ def transform(self, node, results): attr = results["attr"][0] attr.replace(Name(("__%s__" % attr.value[5:]), prefix=attr.prefix)) fixes/fix_itertools_imports.py000064400000004046151027012300012676 0ustar00""" Fixer for imports of itertools.(imap|ifilter|izip|ifilterfalse) """ # Local imports from lib2to3 import fixer_base from lib2to3.fixer_util import BlankLine, syms, token class FixItertoolsImports(fixer_base.BaseFix): BM_compatible = True PATTERN = """ import_from< 'from' 'itertools' 'import' imports=any > """ %(locals()) def transform(self, node, results): imports = results['imports'] if imports.type == syms.import_as_name or not imports.children: children = [imports] else: children = imports.children for child in children[::2]: if child.type == token.NAME: member = child.value name_node = child elif child.type == token.STAR: # Just leave the import as is. 
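# e.g. "from itertools import *" is left untouched.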
return else: assert child.type == syms.import_as_name name_node = child.children[0] member_name = name_node.value if member_name in ('imap', 'izip', 'ifilter'): child.value = None child.remove() elif member_name in ('ifilterfalse', 'izip_longest'): node.changed() name_node.value = ('filterfalse' if member_name[1] == 'f' else 'zip_longest') # Make sure the import statement is still sane children = imports.children[:] or [imports] remove_comma = True for child in children: if remove_comma and child.type == token.COMMA: child.remove() else: remove_comma ^= True while children and children[-1].type == token.COMMA: children.pop().remove() # If there are no imports left, just get rid of the entire statement if (not (imports.children or getattr(imports, 'value', None)) or imports.parent is None): p = node.prefix node = BlankLine() node.prefix = p return node fixes/fix_exec.py000064400000001723151027012300010020 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer for exec. This converts usages of the exec statement into calls to a built-in exec() function. exec code in ns1, ns2 -> exec(code, ns1, ns2) """ # Local imports from .. import fixer_base from ..fixer_util import Comma, Name, Call class FixExec(fixer_base.BaseFix): BM_compatible = True PATTERN = """ exec_stmt< 'exec' a=any 'in' b=any [',' c=any] > | exec_stmt< 'exec' (not atom<'(' [any] ')'>) a=any > """ def transform(self, node, results): assert results syms = self.syms a = results["a"] b = results.get("b") c = results.get("c") args = [a.clone()] args[0].prefix = "" if b is not None: args.extend([Comma(), b.clone()]) if c is not None: args.extend([Comma(), c.clone()]) return Call(Name("exec"), args, prefix=node.prefix) fixes/fix_print.py000064400000005434151027012300010233 0ustar00# Copyright 2006 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer for print. Change: 'print' into 'print()' 'print ...' into 'print(...)' 'print ... ,' into 'print(..., end=" ")' 'print >>x, ...' into 'print(..., file=x)' No changes are applied if print_function is imported from __future__ """ # Local imports from .. import patcomp from .. import pytree from ..pgen2 import token from .. import fixer_base from ..fixer_util import Name, Call, Comma, String parend_expr = patcomp.compile_pattern( """atom< '(' [atom|STRING|NAME] ')' >""" ) class FixPrint(fixer_base.BaseFix): BM_compatible = True PATTERN = """ simple_stmt< any* bare='print' any* > | print_stmt """ def transform(self, node, results): assert results bare_print = results.get("bare") if bare_print: # Special-case print all by itself bare_print.replace(Call(Name("print"), [], prefix=bare_print.prefix)) return assert node.children[0] == Name("print") args = node.children[1:] if len(args) == 1 and parend_expr.match(args[0]): # We don't want to keep sticking parens around an # already-parenthesised expression. return sep = end = file = None if args and args[-1] == Comma(): args = args[:-1] end = " " if args and args[0] == pytree.Leaf(token.RIGHTSHIFT, ">>"): assert len(args) >= 2 file = args[1].clone() args = args[3:] # Strip a possible comma after the file expression # Now synthesize a print(args, sep=..., end=..., file=...) node. 
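# e.g. print >>sys.stderr, "foo",  ->  print("foo", end=' ', file=sys.stderr)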
l_args = [arg.clone() for arg in args] if l_args: l_args[0].prefix = "" if sep is not None or end is not None or file is not None: if sep is not None: self.add_kwarg(l_args, "sep", String(repr(sep))) if end is not None: self.add_kwarg(l_args, "end", String(repr(end))) if file is not None: self.add_kwarg(l_args, "file", file) n_stmt = Call(Name("print"), l_args) n_stmt.prefix = node.prefix return n_stmt def add_kwarg(self, l_nodes, s_kwd, n_expr): # XXX All this prefix-setting may lose comments (though rarely) n_expr.prefix = "" n_argument = pytree.Node(self.syms.argument, (Name(s_kwd), pytree.Leaf(token.EQUAL, "="), n_expr)) if l_nodes: l_nodes.append(Comma()) n_argument.prefix = " " l_nodes.append(n_argument) fixes/fix_map.py000064400000007070151027012300007652 0ustar00# Copyright 2007 Google, Inc. All Rights Reserved. # Licensed to PSF under a Contributor Agreement. """Fixer that changes map(F, ...) into list(map(F, ...)) unless there exists a 'from future_builtins import map' statement in the top-level namespace. As a special case, map(None, X) is changed into list(X). (This is necessary because the semantics are changed in this case -- the new map(None, X) is equivalent to [(x,) for x in X].) We avoid the transformation (except for the special case mentioned above) if the map() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. NOTE: This is still not correct if the original code was depending on map(F, X, Y, ...) to go on until the longest argument is exhausted, substituting None for missing values -- like zip(), it now stops as soon as the shortest argument is exhausted. """ # Local imports from ..pgen2 import token from .. import fixer_base from ..fixer_util import Name, ArgList, Call, ListComp, in_special_context from ..pygram import python_symbols as syms from ..pytree import Node class FixMap(fixer_base.ConditionalFix): BM_compatible = True PATTERN = """ map_none=power< 'map' trailer< '(' arglist< 'None' ',' arg=any [','] > ')' > [extra_trailers=trailer*] > | map_lambda=power< 'map' trailer< '(' arglist< lambdef< 'lambda' (fp=NAME | vfpdef< '(' fp=NAME ')'> ) ':' xp=any > ',' it=any > ')' > [extra_trailers=trailer*] > | power< 'map' args=trailer< '(' [any] ')' > [extra_trailers=trailer*] > """ skip_on = 'future_builtins.map' def transform(self, node, results): if self.should_skip(node): return trailers = [] if 'extra_trailers' in results: for t in results['extra_trailers']: trailers.append(t.clone()) if node.parent.type == syms.simple_stmt: self.warning(node, "You should use a for loop here") new = node.clone() new.prefix = "" new = Call(Name("list"), [new]) elif "map_lambda" in results: new = ListComp(results["xp"].clone(), results["fp"].clone(), results["it"].clone()) new = Node(syms.power, [new] + trailers, prefix="") else: if "map_none" in results: new = results["arg"].clone() new.prefix = "" else: if "args" in results: args = results["args"] if args.type == syms.trailer and \ args.children[1].type == syms.arglist and \ args.children[1].children[0].type == token.NAME and \ args.children[1].children[0].value == "None": self.warning(node, "cannot convert map(None, ...) 
" "with multiple arguments because map() " "now truncates to the shortest sequence") return new = Node(syms.power, [Name("map"), args.clone()]) new.prefix = "" if in_special_context(node): return None new = Node(syms.power, [Name("list"), ArgList([new])] + trailers) new.prefix = "" new.prefix = node.prefix return new fixes/__pycache__/fix_nonzero.cpython-311.pyc000064400000002264151027012300015127 0ustar00 !A?hOHdZddlmZddlmZGddejZdS)z*Fixer for __nonzero__ -> __bool__ methods.) fixer_base)NameceZdZdZdZdZdS) FixNonzeroTz classdef< 'class' any+ ':' suite< any* funcdef< 'def' name='__nonzero__' parameters< '(' NAME ')' > any+ > any* > > cl|d}td|j}||dS)Nname__bool__)prefix)rr replace)selfnoderesultsrnews 2/usr/lib64/python3.11/lib2to3/fixes/fix_nonzero.py transformzFixNonzero.transforms7v:dk222 SN)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrrs/MGrrN)__doc__r fixer_utilrBaseFixrrrrrsh00     #     rfixes/__pycache__/fix_apply.cpython-311.pyc000064400000005412151027012300014560 0ustar00 !A?h* hdZddlmZddlmZddlmZddlmZmZm Z Gddej Z dS) zIFixer for apply(). This converts apply(func, v, k) into (func)(*v, **k).)pytree)token) fixer_base)CallComma parenthesizeceZdZdZdZdZdS)FixApplyTa. power< 'apply' trailer< '(' arglist< (not argument ')' > > c|j}|sJ|d}|d}|d}|r+|j|jjkr|jdjdvrdS|r-|j|jjkr|jdjdkrdS|j}|}|jtj |j fvr?|j|j ks |jdjtj krt|}d|_|}d|_||}d|_tjtjd |g}|N|t%tjtj d|gd |d_t'||| S) Nfuncargskwds>***rr )prefix)symsgettypeargumentchildrenvaluerclonerNAMEatompower DOUBLESTARrrLeafSTARextendrr) selfnoderesultsrr r rr l_newargss 0/usr/lib64/python3.11/lib2to3/fixes/fix_apply.py transformzFixApply.transformsywvv{{6""   TY/// a &+55  TY$)"444]1%+t33 Fzz|| Iej$)4 4 4 Y$* $ $ ]2  #u'7 7 7%%D zz||  ::<r5s99 22222222226464646464z!6464646464r*fixes/__pycache__/fix_imports2.cpython-311.opt-1.pyc000064400000001256151027012300016153 0ustar00 !A?h!FdZddlmZdddZGddejZdS)zTFix incompatible imports and module references that must be fixed after fix_imports.) fix_importsdbm)whichdbanydbmceZdZdZeZdS) FixImports2N)__name__ __module__ __qualname__ run_orderMAPPINGmapping3/usr/lib64/python3.11/lib2to3/fixes/fix_imports2.pyrr sIGGGrrN)__doc__rr FixImportsrrrrrsl   +(rfixes/__pycache__/fix_types.cpython-311.pyc000064400000004714151027012300014603 0ustar00 !A?hdZddlmZddlmZidddddd d d d d dd ddddddddddddddddddd d!d"d#d$d d%d&d'Zd(eDZGd)d*ejZd+S),aFixer for removing uses of the types module. These work for only the known names in the types module. The forms above can include types. or not. ie, It is assumed the module is imported either as: import types from types import ... # either * or specific types The import statements are not modified. There should be another fixer that handles at least the following constants: type([]) -> list type(()) -> tuple type('') -> str ) fixer_base)Name BooleanTypebool BufferType memoryview ClassTypetype ComplexTypecomplexDictTypedictDictionaryType EllipsisTypeztype(Ellipsis) FloatTypefloatIntTypeintListTypelistLongType ObjectTypeobjectNoneTypez type(None)NotImplementedTypeztype(NotImplemented) SliceTypeslice StringTypebytes StringTypesz(str,)tuplestrrange) TupleTypeTypeType UnicodeType XRangeTypecg|]}d|zS)z)power< 'types' trailer< '.' 
name='%s' > >).0ts 0/usr/lib64/python3.11/lib2to3/fixes/fix_types.py r-3sPPPQ 4q 8PPPcBeZdZdZdeZdZdS)FixTypesT|ct|dj}|rt||jSdS)Nname)prefix) _TYPE_MAPPINGgetvaluerr4)selfnoderesults new_values r, transformzFixTypes.transform9s>!%%gfo&;<<  7 $+666 6tr.N)__name__ __module__ __qualname__ BM_compatiblejoin_patsPATTERNr<r)r.r,r0r05s7MhhuooGr.r0N) __doc__r fixer_utilrr5rBBaseFixr0r)r.r,rHsp&| f   F  6  ) W 5 F E x L 5 g!" g#$ %&- 2 QP-PPPz!r.fixes/__pycache__/fix_numliterals.cpython-311.opt-2.pyc000064400000003020151027012300016723 0ustar00 !A?hR ddlmZddlmZddlmZGddejZdS))token) fixer_base)Numberc(eZdZejZdZdZdS)FixNumliteralscT|jdp|jddvS)N0Ll)value startswith)selfnodes 6/usr/lib64/python3.11/lib2to3/fixes/fix_numliterals.pymatchzFixNumliterals.matchs( %%c**Ddjn.DEc|j}|ddvr |dd}nV|drA|r-tt |dkr d|ddz}t ||jS)Nr r r 0o)prefix)r r isdigitlensetrr)rrresultsvals r transformzFixNumliterals.transformsj r7d??crc(CC ^^C  !S[[]] !s3s88}}q7H7HQRR.Cc$+....rN)__name__ __module__ __qualname__rNUMBER _accept_typerrrrrr s>r'sy /////Z'/////rfixes/__pycache__/fix_paren.cpython-311.pyc000064400000003364151027012300014544 0ustar00 !A?hLdZddlmZddlmZmZGddejZdS)ztFixer that adds parentheses where they are required This converts ``[x for x in 1, 2]`` to ``[x for x in (1, 2)]``.) fixer_base)LParenRParenceZdZdZdZdZdS)FixParenTa atom< ('[' | '(') (listmaker< any comp_for< 'for' NAME 'in' target=testlist_safe< any (',' any)+ [','] > [any] > > | testlist_gexp< any comp_for< 'for' NAME 'in' target=testlist_safe< any (',' any)+ [','] > [any] > >) (']' | ')') > c|d}t}|j|_d|_|d||t dS)Ntarget)rprefix insert_child append_childr)selfnoderesultsr lparens 0/usr/lib64/python3.11/lib2to3/fixes/fix_paren.py transformzFixParen.transform%sY"   Av&&&FHH%%%%%N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s/MG,&&&&&rrN)__doc__r r fixer_utilrrBaseFixrrrrrstCC'''''''' & & & & &z! 
& & & & &rfixes/__pycache__/fix_raise.cpython-311.opt-2.pyc000064400000006111151027012300015473 0ustar00 !A?hn n ddlmZddlmZddlmZddlmZmZmZm Z m Z Gddej Z dS))pytree)token) fixer_base)NameCallAttrArgListis_tupleceZdZdZdZdZdS)FixRaiseTzB raise_stmt< 'raise' exc=any [',' val=any [',' tb=any]] > ch|j}|d}|jtjkrd}|||dSt |rOt |r9|jdjd}t |9d|_d|vr7tj |j td|g}|j|_|S|d}t |rd|jdd D}n d |_|g}d |vr|d } d | _|} |jtj ks |jd krt||} t!| td t#| ggz} tj |jtdg| z}|j|_|Stj |j tdt||g|jS)Nexcz+Python 3 does not support string exceptions valraisec6g|]}|S)clone).0cs 0/usr/lib64/python3.11/lib2to3/fixes/fix_raise.py z&FixRaise.transform..Ds :::!AGGII:::tbNonewith_traceback)prefix)symsrtyperSTRINGcannot_convertr childrenr!rNode raise_stmtrNAMEvaluerrr simple_stmt) selfnoderesultsr"rmsgnewrargsrewith_tbs r transformzFixRaise.transform&syen""$$ 8u| # #?C   c * * * F C== 3-- :l1o.q177993-- :CJ   +doW s/CDDCCJJen""$$ C== ::s|AbD'9:::DDCJ5D 7??$$&&BBIAx5:%%f)<)<dOO1d#34455"GG+d.g'0IJJCCJJ;t $W tC?&*k333 3rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr4rrrr r s/MG4343434343rr N) rrpgen2rr fixer_utilrrrr r BaseFixr rrrr=s2<<<<<<<<<<<<<<;3;3;3;3;3z!;3;3;3;3;3rfixes/__pycache__/fix_set_literal.cpython-311.opt-2.pyc000064400000005164151027012300016706 0ustar00 !A?hN ddlmZmZddlmZmZGddejZdS)) fixer_basepytree)tokensymsc eZdZdZdZdZdZdS) FixSetLiteralTajpower< 'set' trailer< '(' (atom=atom< '[' (items=listmaker< any ((',' any)* [',']) > | single=any) ']' > | atom< '(' items=testlist_gexp< any ((',' any)* [',']) > ')' > ) ')' > > c|d}|rJtjtj|g}|||}n|d}tjtj dg}| d|j D| tjtj d|jj|d_tjtj|}|j|_t#|j dkr8|j d}||j|j d_|S) Nsingleitems{c3>K|]}|VdS)N)clone).0ns 6/usr/lib64/python3.11/lib2to3/fixes/fix_set_literal.py z*FixSetLiteral.transform..'s*99Qqwwyy999999})getrNoder listmakerrreplaceLeafrLBRACEextendchildrenappendRBRACE next_siblingprefix dictsetmakerlenremove) selfnoderesultsr faker literalmakerrs r transformzFixSetLiteral.transforms.X&&  %;t~ /?@@D NN4 EEG$E;u|S11299%.999999v{5<55666"/6  D-w77{  u~  ! # #q!A HHJJJ()EN2  % rN)__name__ __module__ __qualname__ BM_compatibleexplicitPATTERNr-rrrr s4MHGrrN)lib2to3rrlib2to3.fixer_utilrrBaseFixrr4rrr8ss '&&&&&&&********)))))J&)))))rfixes/__pycache__/fix_intern.cpython-311.opt-1.pyc000064400000002745151027012300015677 0ustar00 !A?hxLdZddlmZddlmZmZGddejZdS)z/Fixer for intern(). intern(s) -> sys.intern(s)) fixer_base) ImportAndCall touch_importc eZdZdZdZdZdZdS) FixInternTprez power< 'intern' trailer< lpar='(' ( not(arglist | argument) any ','> ) rpar=')' > after=any* > c|r5|d}|r+|j|jjkr|jdjdvrdSd}t |||}t dd||S)Nobj>***)sysinternr)typesymsargumentchildrenvaluerr)selfnoderesultsr namesnews 1/usr/lib64/python3.11/lib2to3/fixes/fix_intern.py transformzFixIntern.transformsu  %.C H 222LO)[88F!D'511T5$''' N)__name__ __module__ __qualname__ BM_compatibleorderPATTERNrrrrr s4M EG     rrN)__doc__r fixer_utilrrBaseFixrr#rrr(sr 44444444 "rfixes/__pycache__/fix_itertools.cpython-311.pyc000064400000004127151027012300015461 0ustar00 !A?h HdZddlmZddlmZGddejZdS)aT Fixer for itertools.(imap|ifilter|izip) --> (map|filter|zip) and itertools.ifilterfalse --> itertools.filterfalse (bugs 2360-2363) imports from itertools are fixed in fix_itertools_import.py If itertools is imported as something else (ie: import itertools as it; it.izip(spam, eggs)) method calls will not get fixed. ) fixer_base)Namec:eZdZdZdZdezZdZdZdS) FixItertoolsTz7('imap'|'ifilter'|'izip'|'izip_longest'|'ifilterfalse')z power< it='itertools' trailer< dot='.' 
func=%(it_funcs)s > trailer< '(' [any] ')' > > | power< func=%(it_funcs)s trailer< '(' [any] ')' > > cfd}|dd}d|vrb|jdvrY|d|d}}|j}|||j||p|j}|t |jdd|dS)Nfuncit) ifilterfalse izip_longestdot)prefix)valuerremoveparentreplacer)selfnoderesultsrr rr s 4/usr/lib64/python3.11/lib2to3/fixes/fix_itertools.py transformzFixItertools.transformsvq! GOO J> > >u~wt}CYF IIKKK JJLLL K   % % %&4; T$*QRR.88899999N) __name__ __module__ __qualname__ BM_compatibleit_funcslocalsPATTERN run_orderrrrrrsKMHH FHH GI:::::rrN)__doc__r fixer_utilrBaseFixrr#rrr(sl::::::%:::::rfixes/__pycache__/fix_renames.cpython-311.opt-1.pyc000064400000006437151027012300016034 0ustar00 !A?hhdZddlmZddlmZmZdddiiZiZdZdZ Gd d ej Z d S) z?Fix incompatible renames Fixes: * sys.maxint -> sys.maxsize ) fixer_base)Name attr_chainsysmaxintmaxsizec^ddtt|zdzS)N(|))joinmaprepr)memberss 2/usr/lib64/python3.11/lib2to3/fixes/fix_renames.py alternatesrs( #dG,,-- - 33c #KttD]Q\}}t|D]*\}}|t||f<d|d|d|dVd|d|dV+RdS)Nz3 import_from< 'from' module_name=z, 'import' ( attr_name=z | import_as_name< attr_name=z! 'as' any >) > z& power< module_name=z trailer< '.' attr_name=z > any* > )listMAPPINGitemsLOOKUP)modulereplaceold_attrnew_attrs r build_patternrs 00++"&w}}"7"7 + + Hh)1FFH% & & 8885 5 5 5 5  + + + + + +++rcfeZdZdZdeZdZfdZdZ xZ S) FixRenamesTr prectt|j|}|r-tfdt |dDrdS|SdS)Nc3.K|]}|VdS)N).0objmatchs r z#FixRenames.match..5s+DD#55::DDDDDDrparentF)superrr&anyr)selfnoderesultsr& __class__s @rr&zFixRenames.match1sij$''-%++  DDDDD()C)CDDDDD uNurc|d}|d}|rF|rFt|j|jf}|t ||jdSdSdS)N module_name attr_name)prefix)getrvaluerrr2)r+r,r-mod_namer1rs r transformzFixRenames.transform>s;;}--KK ,,   G  Gx~y?@H   d8I4DEEE F F F F F G G G Gr) __name__ __module__ __qualname__ BM_compatibler rPATTERNorderr&r6 __classcell__)r.s@rrr*soMhh}}''G EGGGGGGGrrN) __doc__r fixer_utilrrrrrrBaseFixrr#rrrBs)))))))) Hy)  444+++*GGGGG#GGGGGrfixes/__pycache__/fix_exitfunc.cpython-311.opt-2.pyc000064400000007432151027012300016224 0ustar00 !A?h ^ ddlmZmZddlmZmZmZmZmZm Z Gddej Z dS))pytree fixer_base)NameAttrCallCommaNewlinesymsc:eZdZdZdZdZfdZfdZdZxZ S) FixExitfuncTa ( sys_import=import_name<'import' ('sys' | dotted_as_names< (any ',')* 'sys' (',' any)* > ) > | expr_stmt< power< 'sys' trailer< '.' 'exitfunc' > > '=' func=any > ) cBtt|j|dSN)superr __init__)selfargs __class__s 3/usr/lib64/python3.11/lib2to3/fixes/fix_exitfunc.pyrzFixExitfunc.__init__s#)k4  )40000chtt|||d|_dSr)rr start_tree sys_import)rtreefilenamers rrzFixExitfunc.start_tree!s. k4  ++D(;;;rc d|vr|j |d|_dS|d}d|_tjt jttdtd}t||g|j}| ||j| |ddS|jj d}|j t jkrF|t!|tdddS|jj}|j |j}|j} tjt jtd tddg} tjt j| g} ||dzt-||d z| dS) NrfuncatexitregisterzKCan't find sys import; Please add an atexit import at the top of your file. import)rcloneprefixrNoder powerrrrreplacewarningchildrentypedotted_as_names append_childrparentindex import_name simple_stmt insert_childr ) rnoderesultsrrcallnamescontaining_stmtpositionstmt_container new_importnews r transformzFixExitfunc.transform%s 7 " "&"),"7 Fv$$&& ;tz#DNND4D4DEE!!Htfdk22 T ? " LL ? @ @ @ F(+ :- - -   uww ' ' '   tHc22 3 3 3 3 3"o4O&/55doFFH,3NT%5#H~~tHc/B/BC  J+d. 
==C  ( (Awyy A A A  ( (As ; ; ; ; ;r) __name__ __module__ __qualname__keep_line_order BM_compatiblePATTERNrrr< __classcell__)rs@rr r sqOM G11111#<#<#<#<#<#<#rHs '&&&&&&&EEEEEEEEEEEEEEEE=<=<=<=<=<*$=<=<=<=<=)z(power< 'type' trailer< '(' x=any ')' > >c XeZdZdZdedededed ZfdZdZdZ d Z d Z xZ S) FixIdiomsTz isinstance=comparison<  z8 T=any > | isinstance=comparison< T=any aX > | while_stmt< 'while' while='1' ':' any+ > | sorted=any< any* simple_stmt< expr_stmt< id1=any '=' power< list='list' trailer< '(' (not arglist) any ')' > > > '\n' > sort= simple_stmt< power< id2=any trailer< '.' 'sort' > trailer< '(' ')' > > '\n' > next=any* > | sorted=any< any* simple_stmt< expr_stmt< id1=any '=' expr=any > '\n' > sort= simple_stmt< power< id2=any trailer< '.' 'sort' > trailer< '(' ')' > > '\n' > next=any* > ctt||}|rd|vr|d|dkr|SdS|S)Nsortedid1id2)superr match)selfnoder __class__s 1/usr/lib64/python3.11/lib2to3/fixes/fix_idioms.pyrzFixIdioms.matchOsT )T " " ( ( . .  Qx1U8##4cd|vr|||Sd|vr|||Sd|vr|||Std)N isinstancewhilerz Invalid match)transform_isinstancetransform_whiletransform_sort RuntimeError)rrresultss r transformzFixIdioms.transformZss 7 " ",,T7;; ;   ''g66 6  &&tW55 5// /rcb|d}|d}d|_d|_ttd|t |g}d|vr0d|_t t jtd|g}|j|_|S)NxTr rnnot)cloneprefixrrrrr not_test)rrr r#r$tests rrzFixIdioms.transform_isinstanceds CL    CL   D&&EGGQ88 '>>DK U T':;;Dk  rch|d}|td|jdS)NrTruer))replacerr))rrr ones rrzFixIdioms.transform_whileps3g D 33344444rc|d}|d}|d}|d}|r*|td|jne|rT|}d|_|t td|g|jnt d||j}d |vr|rJ|d d |d jf} d | |d _dSt} |j | |d d | _dSdS) Nsortnextlistexprrr.r%zshould not have reached here ) getr/rr)r(rrremove rpartitionjoinrparent append_child) rrr sort_stmt next_stmt list_call simple_exprnewbtwn prefix_linesend_lines rrzFixIdioms.transform_sorttsFO FO KK'' kk&))  ?   d8I4DEEE F F F F  ?##%%CCJ   T(^^cU,7,>!@!@!@ A A A A=>> > 4<< ;!% 5 5a 8)A,:MN &*ii &=&= ! ### %;; --h777#'//$"7"7":! rQs<AAAAAAAAAAAAAAAA81s;s;s;s;s; "s;s;s;s;s;rfixes/__pycache__/fix_long.cpython-311.opt-2.pyc000064400000001622151027012300015331 0ustar00 !A?hF ddlmZddlmZGddejZdS)) fixer_base)is_probably_builtinceZdZdZdZdZdS)FixLongTz'long'c^t|rd|_|dSdS)Nint)rvaluechanged)selfnoderesultss //usr/lib64/python3.11/lib2to3/fixes/fix_long.py transformzFixLong.transforms4 t $ $ DJ LLNNNNN  N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s-MGrrN)lib2to3rlib2to3.fixer_utilrBaseFixrrrrrsg222222j rfixes/__pycache__/fix_standarderror.cpython-311.opt-2.pyc000064400000001560151027012300017245 0ustar00 !A?hF ddlmZddlmZGddejZdS)) fixer_base)NameceZdZdZdZdZdS)FixStandarderrorTz- 'StandardError' c.td|jS)N Exception)prefix)rr )selfnoderesultss 8/usr/lib64/python3.11/lib2to3/fixes/fix_standarderror.py transformzFixStandarderror.transformsK 4444N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrr rr s/MG55555rrN)r fixer_utilrBaseFixrrrr rsg,55555z)55555rfixes/__pycache__/fix_renames.cpython-311.opt-2.pyc000064400000006316151027012300016031 0ustar00 !A?hf ddlmZddlmZmZdddiiZiZdZdZGdd ej Z d S) ) fixer_base)Name attr_chainsysmaxintmaxsizec^ddtt|zdzS)N(|))joinmaprepr)memberss 2/usr/lib64/python3.11/lib2to3/fixes/fix_renames.py alternatesrs( #dG,,-- - 33c #KttD]Q\}}t|D]*\}}|t||f<d|d|d|dVd|d|dV+RdS)Nz3 import_from< 'from' module_name=z, 'import' ( attr_name=z | import_as_name< attr_name=z! 'as' any >) > z& power< module_name=z trailer< '.' 
attr_name=z > any* > )listMAPPINGitemsLOOKUP)modulereplaceold_attrnew_attrs r build_patternrs 00++"&w}}"7"7 + + Hh)1FFH% & & 8885 5 5 5 5  + + + + + +++rcfeZdZdZdeZdZfdZdZ xZ S) FixRenamesTr prectt|j|}|r-tfdt |dDrdS|SdS)Nc3.K|]}|VdS)N).0objmatchs r z#FixRenames.match..5s+DD#55::DDDDDDrparentF)superrr&anyr)selfnoderesultsr& __class__s @rr&zFixRenames.match1sij$''-%++  DDDDD()C)CDDDDD uNurc|d}|d}|rF|rFt|j|jf}|t ||jdSdSdS)N module_name attr_name)prefix)getrvaluerrr2)r+r,r-mod_namer1rs r transformzFixRenames.transform>s;;}--KK ,,   G  Gx~y?@H   d8I4DEEE F F F F F G G G Gr) __name__ __module__ __qualname__ BM_compatibler rPATTERNorderr&r6 __classcell__)r.s@rrr*soMhh}}''G EGGGGGGGrrN) r fixer_utilrrrrrrBaseFixrr#rrrAs)))))))) Hy)  444+++*GGGGG#GGGGGrfixes/__pycache__/fix_apply.cpython-311.opt-2.pyc000064400000005236151027012300015524 0ustar00 !A?h* f ddlmZddlmZddlmZddlmZmZmZGddej Z dS))pytree)token) fixer_base)CallComma parenthesizeceZdZdZdZdZdS)FixApplyTa. power< 'apply' trailer< '(' arglist< (not argument ')' > > c~|j}|d}|d}|d}|r+|j|jjkr|jdjdvrdS|r-|j|jjkr|jdjdkrdS|j}|}|jtj |j fvr?|j|j ks |jdjtj krt|}d|_|}d|_||}d|_tjtjd |g}|N|t%tjtj d|gd |d_t'||| S) Nfuncargskwds>***rr )prefix)symsgettypeargumentchildrenvaluerclonerNAMEatompower DOUBLESTARrrLeafSTARextendrr) selfnoderesultsrr r rr l_newargss 0/usr/lib64/python3.11/lib2to3/fixes/fix_apply.py transformzFixApply.transformsyvv{{6""   TY/// a &+55  TY$)"444]1%+t33 Fzz|| Iej$)4 4 4 Y$* $ $ ]2  #u'7 7 7%%D zz||  ::<r4s9 22222222226464646464z!6464646464r*fixes/__pycache__/fix_types.cpython-311.opt-1.pyc000064400000004714151027012300015542 0ustar00 !A?hdZddlmZddlmZidddddd d d d d dd ddddddddddddddddddd d!d"d#d$d d%d&d'Zd(eDZGd)d*ejZd+S),aFixer for removing uses of the types module. These work for only the known names in the types module. The forms above can include types. or not. ie, It is assumed the module is imported either as: import types from types import ... # either * or specific types The import statements are not modified. There should be another fixer that handles at least the following constants: type([]) -> list type(()) -> tuple type('') -> str ) fixer_base)Name BooleanTypebool BufferType memoryview ClassTypetype ComplexTypecomplexDictTypedictDictionaryType EllipsisTypeztype(Ellipsis) FloatTypefloatIntTypeintListTypelistLongType ObjectTypeobjectNoneTypez type(None)NotImplementedTypeztype(NotImplemented) SliceTypeslice StringTypebytes StringTypesz(str,)tuplestrrange) TupleTypeTypeType UnicodeType XRangeTypecg|]}d|zS)z)power< 'types' trailer< '.' name='%s' > >).0ts 0/usr/lib64/python3.11/lib2to3/fixes/fix_types.py r-3sPPPQ 4q 8PPPcBeZdZdZdeZdZdS)FixTypesT|ct|dj}|rt||jSdS)Nname)prefix) _TYPE_MAPPINGgetvaluerr4)selfnoderesults new_values r, transformzFixTypes.transform9s>!%%gfo&;<<  7 $+666 6tr.N)__name__ __module__ __qualname__ BM_compatiblejoin_patsPATTERNr<r)r.r,r0r05s7MhhuooGr.r0N) __doc__r fixer_utilrr5rBBaseFixr0r)r.r,rHsp&| f   F  6  ) W 5 F E x L 5 g!" g#$ %&- 2 QP-PPPz!r.fixes/__pycache__/fix_reload.cpython-311.opt-1.pyc000064400000002761151027012300015644 0ustar00 !A?h9LdZddlmZddlmZmZGddejZdS)z5Fixer for reload(). 
reload(s) -> importlib.reload(s)) fixer_base) ImportAndCall touch_importc eZdZdZdZdZdZdS) FixReloadTprez power< 'reload' trailer< lpar='(' ( not(arglist | argument) any ','> ) rpar=')' > after=any* > c|r5|d}|r+|j|jjkr|jdjdvrdSd}t |||}t dd||S)Nobj>***) importlibreloadr)typesymsargumentchildrenvaluerr)selfnoderesultsr namesnews 1/usr/lib64/python3.11/lib2to3/fixes/fix_reload.py transformzFixReload.transformsu  %.C H 222LO)[88F'D'511T;--- N)__name__ __module__ __qualname__ BM_compatibleorderPATTERNrrrrr s4M EG     rrN)__doc__r fixer_utilrrBaseFixrr#rrr(sr$$ 44444444 "rfixes/__pycache__/fix_tuple_params.cpython-311.opt-1.pyc000064400000020721151027012300017066 0ustar00 !A?hdZddlmZddlmZddlmZddlmZmZm Z m Z m Z m Z dZ GddejZd Zd Zgd fd Zd Zd S)a:Fixer for function definitions with tuple parameters. def func(((a, b), c), d): ... -> def func(x, d): ((a, b), c) = x ... It will also support lambdas: lambda (x, y): x + y -> lambda t: t[0] + t[1] # The parens are a syntax error in Python 3 lambda (x): x + y -> lambda x: x + y )pytree)token) fixer_base)AssignNameNewlineNumber Subscriptsymscvt|tjo|jdjt jkS)N) isinstancerNodechildrentyperSTRING)stmts 7/usr/lib64/python3.11/lib2to3/fixes/fix_tuple_params.py is_docstringrs/ dFK ( ( 1 =  EL 01c&eZdZdZdZdZdZdZdS)FixTupleParamsTa funcdef< 'def' any parameters< '(' args=any ')' > ['->' any] ':' suite=any+ > | lambda= lambdef< 'lambda' args=vfpdef< '(' inner=any ')' > ':' body=any > c d|vr||Sg |d}|d}|djdjtjkr)d}|djdj}t n#d}d}tjtjd d fd }|jtj kr ||nU|jtj kr@t|jD]+\}} | jtj kr|| |dk , sdS D]} |d| _ |} |dkrd d_n2t|dj|r| d_|dz} D]} |d| _  |dj| | <t!| dz| t# zdzD]}||dj|_|ddS)Nlambdasuiteargsr rz; Fct}|}d|_t ||}|rd|_||tjtj |gdS)Nr ) rnew_namecloneprefixrreplaceappendrrr simple_stmt) tuple_arg add_prefixnargrend new_linesselfs r handle_tuplez.FixTupleParams.transform..handle_tupleCsT]]__%%A//##CCJ#qwwyy))D    a   V[)9*. )<>> ? ? ? ? ?r)r)r!)F)transform_lambdarrrINDENTvaluerrLeafr tfpdef typedargslist enumerateparentr$rrangelenchanged)r.noderesultsrrstartindentr/ir+lineafterr,r-s` @@r transformzFixTupleParams.transform.sM w  ((w77 7  v 8 Q  $ 4 4E1X&q)/F))CCEF+elB//C ? ? ? ? ? ? ? ? 9 # # L     Y$, , ,#DM22 : :38t{**!L!a%9999  F # #D(DKK A::"%IaL   %(+E2 3 3 "(IaL AIE # #D(DKK)2a%+&uQwc)nn 4Q 677 1 1A*0E!H a ' ' arc|d}|d}t|d}|jtjkr2|}d|_||dSt|}t|}| t|}t|d} || | D]} | jtjkrv| j |vrmd|| j D} tjt j| g| z} | j| _| | dS)Nrbodyinnerr!)r$c6g|]}|S)r#.0cs r z3FixTupleParams.transform_lambda..s CCCAaggiiCCCr) simplify_argsrrNAMEr#r$r% find_params map_to_indexr" tuple_namer post_orderr2rrr power) r.r;r<rrDrEparamsto_indextup_name new_paramr* subscriptsnews rr0zFixTupleParams.transform_lambdans`vvgg.// : # #KKMMEEL LL    FT""''==F!3!344#...  Y__&&'''""  Av##8(;(;CC!'1BCCC k$*#,??#4#4"5 "BDDX  #   rN)__name__ __module__ __qualname__ run_order BM_compatiblePATTERNrBr0rGrrrrsDIMG>>>@rrc|jtjtjfvr|S|jtjkr9|jtjkr"|jd}|jtjk"|Std|z)NrzReceived unexpected node %s)rr vfplistrrMvfpdefr RuntimeErrorr;s rrLrLss yT\5:... dk ! 
!i4;&&=#Di4;&& 4t; < <.s, K K KqQVu{5J5JKNN5J5J5Jr)rr rarNrrrMr2rcs rrNrNsS yDK4=+,,, ej z K KDM K K KKrNc|i}t|D]_\}}ttt|g}t |t rt |||W||z||<`|S)N)d)r6r r strrlistrO) param_listr$rhr?objtrailers rrOrOsy J''&&3VCFF^^,,- c4  & g + + + + +g%AcFF Hrcg}|D]O}t|tr#|t|:||Pd|S)N_)rrjr&rPjoin)rklrls rrPrPsd A c4   HHZ__ % % % % HHSMMMM 88A;;r)__doc__rrpgen2rr fixer_utilrrrr r r rBaseFixrrLrNrOrPrGrrrvs*GGGGGGGGGGGGGGGG111gggggZ'gggX = = =LLL%'$     rfixes/__pycache__/fix_set_literal.cpython-311.pyc000064400000005300151027012300015736 0ustar00 !A?hPdZddlmZmZddlmZmZGddejZdS)z: Optional fixer to transform set() calls to set literals. ) fixer_basepytree)tokensymsc eZdZdZdZdZdZdS) FixSetLiteralTajpower< 'set' trailer< '(' (atom=atom< '[' (items=listmaker< any ((',' any)* [',']) > | single=any) ']' > | atom< '(' items=testlist_gexp< any ((',' any)* [',']) > ')' > ) ')' > > c|d}|rJtjtj|g}|||}n|d}tjtj dg}| d|j D| tjtj d|jj|d_tjtj|}|j|_t#|j dkr8|j d}||j|j d_|S) Nsingleitems{c3>K|]}|VdS)N)clone).0ns 6/usr/lib64/python3.11/lib2to3/fixes/fix_set_literal.py z*FixSetLiteral.transform..'s*99Qqwwyy999999})getrNoder listmakerrreplaceLeafrLBRACEextendchildrenappendRBRACE next_siblingprefix dictsetmakerlenremove) selfnoderesultsr faker literalmakerrs r transformzFixSetLiteral.transforms.X&&  %;t~ /?@@D NN4 EEG$E;u|S11299%.999999v{5<55666"/6  D-w77{  u~  ! # #q!A HHJJJ()EN2  % rN)__name__ __module__ __qualname__ BM_compatibleexplicitPATTERNr-rrrr s4MHGrrN) __doc__lib2to3rrlib2to3.fixer_utilrrBaseFixrr4rrr9sx '&&&&&&&********)))))J&)))))rfixes/__pycache__/fix_filter.cpython-311.opt-1.pyc000064400000007725151027012300015670 0ustar00 !A?h pdZddlmZddlmZddlmZddlm Z m Z m Z m Z m Z GddejZdS) aFixer that changes filter(F, X) into list(filter(F, X)). We avoid the transformation if the filter() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. NOTE: This is still not correct if the original code was depending on filter(F, X) to return a string if X is a string and a tuple if X is a tuple. That would require type inference, which we don't do. Let Python 2.6 figure it out. ) fixer_base)Node)python_symbols)NameArgListListCompin_special_context parenthesizec eZdZdZdZdZdZdS) FixFilterTaV filter_lambda=power< 'filter' trailer< '(' arglist< lambdef< 'lambda' (fp=NAME | vfpdef< '(' fp=NAME ')'> ) ':' xp=any > ',' it=any > ')' > [extra_trailers=trailer*] > | power< 'filter' trailer< '(' arglist< none='None' ',' seq=any > ')' > [extra_trailers=trailer*] > | power< 'filter' args=trailer< '(' [any] ')' > [extra_trailers=trailer*] > zfuture_builtins.filterc||rdSg}d|vr2|dD])}||*d|vr|d}|jt jkrd|_t|}t|d|d|d|}tt j |g|zd}n d|vrrttd td |d td }tt j |g|zd}nt|rdS|d }tt j td |gd}tt j td t|gg|z}d|_|j|_|S)Nextra_trailers filter_lambdaxpfpit)prefixnone_fseqargsfilterlist) should_skipappendclonegettypesymstestrr rrpowerrr r)selfnoderesultstrailerstrnewrs 1/usr/lib64/python3.11/lib2to3/fixes/fix_filter.py transformzFixFilter.transform:s   D ! !  F w & &-. 
+ + **** g % %T""((**Bw$)## !"%%7;;t,,2244";;t,,2244";;t,,2244b::CtzC58#3B???CC w  4::::"5>//11::''CtzC58#3B???CC"$'' t6?((**DtzDNND#9"EEECtzDLL'3%..#AH#LMMCCJ[  N)__name__ __module__ __qualname__ BM_compatiblePATTERNskip_onr*r+r)r r s6MG<'G$$$$$r+r N)__doc__rrpytreerpygramrr fixer_utilrrrr r ConditionalFixr r2r+r)r8s  ++++++RRRRRRRRRRRRRRGGGGG )GGGGGr+fixes/__pycache__/fix_intern.cpython-311.opt-2.pyc000064400000002644151027012300015676 0ustar00 !A?hxJ ddlmZddlmZmZGddejZdS)) fixer_base) ImportAndCall touch_importc eZdZdZdZdZdZdS) FixInternTprez power< 'intern' trailer< lpar='(' ( not(arglist | argument) any ','> ) rpar=')' > after=any* > c|r5|d}|r+|j|jjkr|jdjdvrdSd}t |||}t dd||S)Nobj>***)sysinternr)typesymsargumentchildrenvaluerr)selfnoderesultsr namesnews 1/usr/lib64/python3.11/lib2to3/fixes/fix_intern.py transformzFixIntern.transformsu  %.C H 222LO)[88F!D'511T5$''' N)__name__ __module__ __qualname__ BM_compatibleorderPATTERNrrrrr s4M EG     rrN)r fixer_utilrrBaseFixrr#rrr'sm 44444444 "rfixes/__pycache__/fix_filter.cpython-311.pyc000064400000007725151027012300014731 0ustar00 !A?h pdZddlmZddlmZddlmZddlm Z m Z m Z m Z m Z GddejZdS) aFixer that changes filter(F, X) into list(filter(F, X)). We avoid the transformation if the filter() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. NOTE: This is still not correct if the original code was depending on filter(F, X) to return a string if X is a string and a tuple if X is a tuple. That would require type inference, which we don't do. Let Python 2.6 figure it out. ) fixer_base)Node)python_symbols)NameArgListListCompin_special_context parenthesizec eZdZdZdZdZdZdS) FixFilterTaV filter_lambda=power< 'filter' trailer< '(' arglist< lambdef< 'lambda' (fp=NAME | vfpdef< '(' fp=NAME ')'> ) ':' xp=any > ',' it=any > ')' > [extra_trailers=trailer*] > | power< 'filter' trailer< '(' arglist< none='None' ',' seq=any > ')' > [extra_trailers=trailer*] > | power< 'filter' args=trailer< '(' [any] ')' > [extra_trailers=trailer*] > zfuture_builtins.filterc||rdSg}d|vr2|dD])}||*d|vr|d}|jt jkrd|_t|}t|d|d|d|}tt j |g|zd}n d|vrrttd td |d td }tt j |g|zd}nt|rdS|d }tt j td |gd}tt j td t|gg|z}d|_|j|_|S)Nextra_trailers filter_lambdaxpfpit)prefixnone_fseqargsfilterlist) should_skipappendclonegettypesymstestrr rrpowerrr r)selfnoderesultstrailerstrnewrs 1/usr/lib64/python3.11/lib2to3/fixes/fix_filter.py transformzFixFilter.transform:s   D ! !  F w & &-. 
+ + **** g % %T""((**Bw$)## !"%%7;;t,,2244";;t,,2244";;t,,2244b::CtzC58#3B???CC w  4::::"5>//11::''CtzC58#3B???CC"$'' t6?((**DtzDNND#9"EEECtzDLL'3%..#AH#LMMCCJ[  N)__name__ __module__ __qualname__ BM_compatiblePATTERNskip_onr*r+r)r r s6MG<'G$$$$$r+r N)__doc__rrpytreerpygramrr fixer_utilrrrr r ConditionalFixr r2r+r)r8s  ++++++RRRRRRRRRRRRRRGGGGG )GGGGGr+fixes/__pycache__/fix_execfile.cpython-311.opt-2.pyc000064400000005760151027012300016165 0ustar00 !A?hj ddlmZddlmZmZmZmZmZmZm Z m Z m Z m Z Gddej ZdS)) fixer_base) CommaNameCallLParenRParenDotNodeArgListStringsymsceZdZdZdZdZdS) FixExecfileTz power< 'execfile' trailer< '(' arglist< filename=any [',' globals=any [',' locals=any ] ] > ')' > > | power< 'execfile' trailer< '(' filename=any ')' > > ch|d}|d}|d}|jdjd}t|t t ddg|}t tjtd|g}t tj ttd gt tj ttgg} |g| z} |} d| _t d d} | t | t | gz} ttd | d }|g}|5|t |g|5|t |gttd ||jS)Nfilenameglobalslocalsz"rb" )rparenopenreadz'exec'compileexec)prefix)getchildrencloner rr r r powerrtrailerr rrrrextend)selfnoderesultsrrrexecfile_paren open_args open_callr open_expr filename_argexec_str compile_args compile_callargss 3/usr/lib64/python3.11/lib2to3/fixes/fix_execfile.py transformzFixExecfile.transforms:&++i((X&&r*3B7==??X^^--uwwvs8K8KL#1333 d6llI%>?? T\CEE4<<#899T\FHHfhh#788:K$&  ~~'' ! (C(( EGG\577H#MM DOO\2>> ~   KK'--//2 3 3 3   KK&,,..1 2 2 2DLL$t{;;;;N)__name__ __module__ __qualname__ BM_compatiblePATTERNr0r1r/rrs/MG <<<<r:s 111111111111111111111111&<&<&<&<&<*$&<&<&<&<& trailer< '(' [any] ')' > > | power< func=%(it_funcs)s trailer< '(' [any] ')' > > cfd}|dd}d|vrb|jdvrY|d|d}}|j}|||j||p|j}|t |jdd|dS)Nfuncit) ifilterfalse izip_longestdot)prefix)valuerremoveparentreplacer)selfnoderesultsrr rr s 4/usr/lib64/python3.11/lib2to3/fixes/fix_itertools.py transformzFixItertools.transformsvq! GOO J> > >u~wt}CYF IIKKK JJLLL K   % % %&4; T$*QRR.88899999N) __name__ __module__ __qualname__ BM_compatibleit_funcslocalsPATTERN run_orderrrrrrsKMHH FHH GI:::::rrN)r fixer_utilrBaseFixrr#rrr'sg::::::%:::::rfixes/__pycache__/fix_raise.cpython-311.opt-1.pyc000064400000007271151027012300015502 0ustar00 !A?hn pdZddlmZddlmZddlmZddlmZmZm Z m Z m Z Gddej Z dS) a[Fixer for 'raise E, V, T' raise -> raise raise E -> raise E raise E, V -> raise E(V) raise E, V, T -> raise E(V).with_traceback(T) raise E, None, T -> raise E.with_traceback(T) raise (((E, E'), E''), E'''), V -> raise E(V) raise "foo", V, T -> warns about string exceptions CAVEATS: 1) "raise E, V" will be incorrectly translated if V is an exception instance. The correct Python 3 idiom is raise E from V but since we can't detect instance-hood by syntax alone and since any client code would have to be changed as well, we don't automate this. 
)pytree)token) fixer_base)NameCallAttrArgListis_tupleceZdZdZdZdZdS)FixRaiseTzB raise_stmt< 'raise' exc=any [',' val=any [',' tb=any]] > ch|j}|d}|jtjkrd}|||dSt |rOt |r9|jdjd}t |9d|_d|vr7tj |j td|g}|j|_|S|d}t |rd|jdd D}n d |_|g}d |vr|d } d | _|} |jtj ks |jd krt||} t!| td t#| ggz} tj |jtdg| z}|j|_|Stj |j tdt||g|jS)Nexcz+Python 3 does not support string exceptions valraisec6g|]}|S)clone).0cs 0/usr/lib64/python3.11/lib2to3/fixes/fix_raise.py z&FixRaise.transform..Ds :::!AGGII:::tbNonewith_traceback)prefix)symsrtyperSTRINGcannot_convertr childrenr!rNode raise_stmtrNAMEvaluerrr simple_stmt) selfnoderesultsr"rmsgnewrargsrewith_tbs r transformzFixRaise.transform&syen""$$ 8u| # #?C   c * * * F C== 3-- :l1o.q177993-- :CJ   +doW s/CDDCCJJen""$$ C== ::s|AbD'9:::DDCJ5D 7??$$&&BBIAx5:%%f)<)<dOO1d#34455"GG+d.g'0IJJCCJJ;t $W tC?&*k333 3rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr4rrrr r s/MG4343434343rr N)__doc__rrpgen2rr fixer_utilrrrr r BaseFixr rrrr>s2<<<<<<<<<<<<<<;3;3;3;3;3z!;3;3;3;3;3rfixes/__pycache__/fix_exec.cpython-311.opt-2.pyc000064400000003156151027012300015322 0ustar00 !A?hN ddlmZddlmZmZmZGddejZdS)) fixer_base)CommaNameCallceZdZdZdZdZdS)FixExecTzx exec_stmt< 'exec' a=any 'in' b=any [',' c=any] > | exec_stmt< 'exec' (not atom<'(' [any] ')'>) a=any > c|j}|d}|d}|d}|g}d|d_|5|t |g|5|t |gt td||jS)Nabcexec)prefix)symsgetclonerextendrrr)selfnoderesultsrr r r argss //usr/lib64/python3.11/lib2to3/fixes/fix_exec.py transformzFixExec.transformsy CL KK   KK   {Q = KK!'')), - - - = KK!'')), - - -DLL$t{;;;;N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrrs/MG < < < < r$ss**********<<<< raise raise E -> raise E raise E, V -> raise E(V) raise E, V, T -> raise E(V).with_traceback(T) raise E, None, T -> raise E.with_traceback(T) raise (((E, E'), E''), E'''), V -> raise E(V) raise "foo", V, T -> warns about string exceptions CAVEATS: 1) "raise E, V" will be incorrectly translated if V is an exception instance. The correct Python 3 idiom is raise E from V but since we can't detect instance-hood by syntax alone and since any client code would have to be changed as well, we don't automate this. )pytree)token) fixer_base)NameCallAttrArgListis_tupleceZdZdZdZdZdS)FixRaiseTzB raise_stmt< 'raise' exc=any [',' val=any [',' tb=any]] > ch|j}|d}|jtjkrd}|||dSt |rOt |r9|jdjd}t |9d|_d|vr7tj |j td|g}|j|_|S|d}t |rd|jdd D}n d |_|g}d |vr|d } d | _|} |jtj ks |jd krt||} t!| td t#| ggz} tj |jtdg| z}|j|_|Stj |j tdt||g|jS)Nexcz+Python 3 does not support string exceptions valraisec6g|]}|S)clone).0cs 0/usr/lib64/python3.11/lib2to3/fixes/fix_raise.py z&FixRaise.transform..Ds :::!AGGII:::tbNonewith_traceback)prefix)symsrtyperSTRINGcannot_convertr childrenr!rNode raise_stmtrNAMEvaluerrr simple_stmt) selfnoderesultsr"rmsgnewrargsrewith_tbs r transformzFixRaise.transform&syen""$$ 8u| # #?C   c * * * F C== 3-- :l1o.q177993-- :CJ   +doW s/CDDCCJJen""$$ C== ::s|AbD'9:::DDCJ5D 7??$$&&BBIAx5:%%f)<)<dOO1d#34455"GG+d.g'0IJJCCJJ;t $W tC?&*k333 3rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr4rrrr r s/MG4343434343rr N)__doc__rrpgen2rr fixer_utilrrrr r BaseFixr rrrr>s2<<<<<<<<<<<<<<;3;3;3;3;3z!;3;3;3;3;3rfixes/__pycache__/fix_import.cpython-311.opt-1.pyc000064400000011276151027012300015711 0ustar00 !A?h ndZddlmZddlmZmZmZmZddlm Z m Z m Z dZ Gddej Zd S) zFixer for import statements. If spam is being imported from the local directory, this import: from spam import eggs Becomes: from .spam import eggs And this import: import spam Becomes: from . 
import spam ) fixer_base)dirnamejoinexistssep) FromImportsymstokenc#K|g}|r|}|jtjkr |jVn|jt jkr'dd|jDVn~|jt j kr!| |jdnH|jt j kr$| |jdddntd|dSdS)zF Walks over all the names imported in a dotted_as_names node. cg|] }|j S)value).0chs 1/usr/lib64/python3.11/lib2to3/fixes/fix_import.py z$traverse_imports..s<<<28<<<rNzunknown node type)poptyper NAMErr dotted_namerchildrendotted_as_nameappenddotted_as_namesextendAssertionError)namespendingnodes rtraverse_importsr$s gG  6{{}} 9 " "*     Y$* * *''< | import_name< 'import' imp=any > cvtt|||d|jv|_dS)Nabsolute_import)superr& start_treefuture_featuresskip)selftreename __class__s rr*zFixImport.start_tree/s6 i))$555%)== rc|jrdS|d}|jtjkrnt |ds|jd}t |d||jr%d|jz|_|dSdSd}d}t|D]}||rd}d}|r|r| |ddStd|g}|j |_ |S)Nimprr.FTz#absolute and local imports together) r,rr import_fromhasattrrprobably_a_local_importrchangedr$warningr prefix)r-r#resultsr2 have_local have_absolutemod_namenews r transformzFixImport.transform3s0 9  Fen 9( ( ( c7++ &l1oc7++ &++CI66 #)O    J!M,S11 ) )//99)!%JJ$(MM NLL'LMMMS3%((CCJJrcV|drdS|ddd}t|j}t ||}t t t|dsdSdt ddd d fD]}t ||zrd SdS) Nr3Frz __init__.pyz.pyz.pycz.soz.slz.pydT) startswithsplitrfilenamerrr)r-imp_name base_pathexts rr6z!FixImport.probably_a_local_importUs   s # # 5>>#q))!,DM** H-- d79--}==>> 53uf=  Ci#o&& tt ur) __name__ __module__ __qualname__ BM_compatiblePATTERNr*r?r6 __classcell__)r0s@rr&r&&scMG >>>>>   Drr&N)__doc__r ros.pathrrrr fixer_utilr r r r$BaseFixr&rrrrRs  ............0000000000666&===== "=====rfixes/__pycache__/fix_execfile.cpython-311.pyc000064400000006202151027012300015215 0ustar00 !A?hldZddlmZddlmZmZmZmZmZm Z m Z m Z m Z m Z GddejZdS)zoFixer for execfile. This converts usages of the execfile function into calls to the built-in exec() function. ) fixer_base) CommaNameCallLParenRParenDotNodeArgListStringsymsceZdZdZdZdZdS) FixExecfileTz power< 'execfile' trailer< '(' arglist< filename=any [',' globals=any [',' locals=any ] ] > ')' > > | power< 'execfile' trailer< '(' filename=any ')' > > cp|sJ|d}|d}|d}|jdjd}t|t t ddg|}t tjtd|g}t tj ttd gt tj ttgg} |g| z} |} d| _t d d} | t | t | gz} ttd | d }|g}|5|t |g|5|t |gttd ||jS)Nfilenameglobalslocalsz"rb" )rparenopenreadz'exec'compileexec)prefix)getchildrencloner rr r r powerrtrailerr rrrrextend)selfnoderesultsrrrexecfile_paren open_args open_callr open_expr filename_argexec_str compile_args compile_callargss 3/usr/lib64/python3.11/lib2to3/fixes/fix_execfile.py transformzFixExecfile.transformsw:&++i((X&&r*3B7==??X^^--uwwvs8K8KL#1333 d6llI%>?? T\CEE4<<#899T\FHHfhh#788:K$&  ~~'' ! (C(( EGG\577H#MM DOO\2>> ~   KK'--//2 3 3 3   KK&,,..1 2 2 2DLL$t{;;;;N)__name__ __module__ __qualname__ BM_compatiblePATTERNr0r1r/rrs/MG <<<<r;s 111111111111111111111111&<&<&<&<&<*$&<&<&<&<& rest=any* > ctt|||t|_dSN)superr start_treesettransformed_xranges)selftreefilename __class__s 1/usr/lib64/python3.11/lib2to3/fixes/fix_xrange.pyr zFixXrange.start_trees5 i))$999#&55   cd|_dSr )r)rrrs r finish_treezFixXrange.finish_trees#'   rc|d}|jdkr|||S|jdkr|||Stt |)Nnamexrangerange)valuetransform_xrangetransform_range ValueErrorreprrnoderesultsrs r transformzFixXrange.transformsev : ! 
!((w77 7 Z7 " "''g66 6T$ZZ(( (rc|d}|td|j|jt |dS)Nrrprefix)replacerr'raddidr!s rrzFixXrange.transform_xrange$sOv T'$+666777  $$RXX.....rcZt||jvr||stt d|dg}tt d|g|j}|dD]}|||SdSdS)Nrargslistr&rest)r*rin_special_contextrrcloner' append_child)rr"r# range_call list_callns rrzFixXrange.transform_range*s tHHD4 4 4''-- 5d7mmgfo.C.C.E.E-FGGJT&\\J<$(K111IV_ * *&&q))))  5 4 4 4rz3power< func=NAME trailer< '(' node=any ')' > any* >zfor_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > | comparison< any 'in' node=any any*> c |jdSi}|jjC|j|jj|r|d|ur|djtvS|j|j|o |d|uS)NFr"func)parentp1matchrrp2)rr"r#s rr/zFixXrange.in_special_context?s ; 5 K  *w}}T[/99 +v$&&6?(O; ;w}}T['22Nwv$7NNr)__name__ __module__ __qualname__ BM_compatiblePATTERNr rr$rrP1rcompile_patternr8P2r:r/ __classcell__)rs@rr r sMG )))))((()))///    ?B   $ $B B !  $ $B O O O O O O Orr N) __doc__r fixer_utilrrrrBaseFixr rrrIs654444444444=O=O=O=O=O "=O=O=O=O=Orfixes/__pycache__/fix_renames.cpython-311.pyc000064400000006437151027012300015075 0ustar00 !A?hhdZddlmZddlmZmZdddiiZiZdZdZ Gd d ej Z d S) z?Fix incompatible renames Fixes: * sys.maxint -> sys.maxsize ) fixer_base)Name attr_chainsysmaxintmaxsizec^ddtt|zdzS)N(|))joinmaprepr)memberss 2/usr/lib64/python3.11/lib2to3/fixes/fix_renames.py alternatesrs( #dG,,-- - 33c #KttD]Q\}}t|D]*\}}|t||f<d|d|d|dVd|d|dV+RdS)Nz3 import_from< 'from' module_name=z, 'import' ( attr_name=z | import_as_name< attr_name=z! 'as' any >) > z& power< module_name=z trailer< '.' attr_name=z > any* > )listMAPPINGitemsLOOKUP)modulereplaceold_attrnew_attrs r build_patternrs 00++"&w}}"7"7 + + Hh)1FFH% & & 8885 5 5 5 5  + + + + + +++rcfeZdZdZdeZdZfdZdZ xZ S) FixRenamesTr prectt|j|}|r-tfdt |dDrdS|SdS)Nc3.K|]}|VdS)N).0objmatchs r z#FixRenames.match..5s+DD#55::DDDDDDrparentF)superrr&anyr)selfnoderesultsr& __class__s @rr&zFixRenames.match1sij$''-%++  DDDDD()C)CDDDDD uNurc|d}|d}|rF|rFt|j|jf}|t ||jdSdSdS)N module_name attr_name)prefix)getrvaluerrr2)r+r,r-mod_namer1rs r transformzFixRenames.transform>s;;}--KK ,,   G  Gx~y?@H   d8I4DEEE F F F F F G G G Gr) __name__ __module__ __qualname__ BM_compatibler rPATTERNorderr&r6 __classcell__)r.s@rrr*soMhh}}''G EGGGGGGGrrN) __doc__r fixer_utilrrrrrrBaseFixrr#rrrBs)))))))) Hy)  444+++*GGGGG#GGGGGrfixes/__pycache__/fix_buffer.cpython-311.opt-1.pyc000064400000002102151027012300015634 0ustar00 !A?hNHdZddlmZddlmZGddejZdS)z4Fixer that changes buffer(...) into memoryview(...).) 
fixer_base)Namec eZdZdZdZdZdZdS) FixBufferTzR power< name='buffer' trailer< '(' [any] ')' > any* > ch|d}|td|jdS)Nname memoryview)prefix)replacerr )selfnoderesultsrs 1/usr/lib64/python3.11/lib2to3/fixes/fix_buffer.py transformzFixBuffer.transforms2v T,t{;;;<<<<<N)__name__ __module__ __qualname__ BM_compatibleexplicitPATTERNrrrrr s4MHG=====rrN)__doc__r fixer_utilrBaseFixrrrrrsj;: = = = = = " = = = = =rfixes/__pycache__/fix_itertools_imports.cpython-311.pyc000064400000005332151027012300017235 0ustar00 !A?h&PdZddlmZddlmZmZmZGddejZdS)zA Fixer for imports of itertools.(imap|ifilter|izip|ifilterfalse) ) fixer_base) BlankLinesymstokenc2eZdZdZdezZdZdS)FixItertoolsImportsTzT import_from< 'from' 'itertools' 'import' imports=any > c|d}|jtjks|js|g}n|j}|dddD]}|jtjkr |j}|}n<|jtjkrdS|jtjksJ|jd}|j}|dvrd|_||dvr)| |ddkrdnd |_|jddp|g}d } |D]3}| r*|jtj kr|.| d z} 4|r^|d jtj krC| |r|d jtj kC|jst|d dr|j |j} t}| |_|SdS) Nimportsr)imapizipifilter) ifilterfalse izip_longestf filterfalse zip_longestTvalue)typerimport_as_namechildrenrNAMErSTARremovechangedCOMMApopgetattrparentprefixr) selfnoderesultsr rchildmember name_node member_name remove_commaps r:stGG555555555511111*,11111r.fixes/__pycache__/fix_paren.cpython-311.opt-2.pyc000064400000003155151027012300015502 0ustar00 !A?hJ ddlmZddlmZmZGddejZdS)) fixer_base)LParenRParenceZdZdZdZdZdS)FixParenTa atom< ('[' | '(') (listmaker< any comp_for< 'for' NAME 'in' target=testlist_safe< any (',' any)+ [','] > [any] > > | testlist_gexp< any comp_for< 'for' NAME 'in' target=testlist_safe< any (',' any)+ [','] > [any] > >) (']' | ')') > c|d}t}|j|_d|_|d||t dS)Ntarget)rprefix insert_child append_childr)selfnoderesultsr lparens 0/usr/lib64/python3.11/lib2to3/fixes/fix_paren.py transformzFixParen.transform%sY"   Av&&&FHH%%%%%N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s/MG,&&&&&rrN)r r fixer_utilrrBaseFixrrrrrsnC'''''''' & & & & &z! 
& & & & &rfixes/__pycache__/fix_import.cpython-311.opt-2.pyc000064400000010615151027012300015706 0ustar00 !A?h l ddlmZddlmZmZmZmZddlmZm Z m Z dZ Gddej Z dS) ) fixer_base)dirnamejoinexistssep) FromImportsymstokenc#K |g}|r|}|jtjkr |jVn|jt jkr'dd|jDVn~|jt j kr!| |jdnH|jt j kr$| |jdddntd|dSdS)Ncg|] }|j S)value).0chs 1/usr/lib64/python3.11/lib2to3/fixes/fix_import.py z$traverse_imports..s<<<28<<<rzunknown node type)poptyper NAMErr dotted_namerchildrendotted_as_nameappenddotted_as_namesextendAssertionError)namespendingnodes rtraverse_importsr$sgG  6{{}} 9 " "*     Y$* * *''< | import_name< 'import' imp=any > cvtt|||d|jv|_dS)Nabsolute_import)superr& start_treefuture_featuresskip)selftreename __class__s rr*zFixImport.start_tree/s6 i))$555%)== rc|jrdS|d}|jtjkrnt |ds|jd}t |d||jr%d|jz|_|dSdSd}d}t|D]}||rd}d}|r|r| |ddStd|g}|j |_ |S)Nimprr.FTz#absolute and local imports together) r,rr import_fromhasattrrprobably_a_local_importrchangedr$warningr prefix)r-r#resultsr2 have_local have_absolutemod_namenews r transformzFixImport.transform3s0 9  Fen 9( ( ( c7++ &l1oc7++ &++CI66 #)O    J!M,S11 ) )//99)!%JJ$(MM NLL'LMMMS3%((CCJJrcV|drdS|ddd}t|j}t ||}t t t|dsdSdt ddd d fD]}t ||zrd SdS) Nr3Frz __init__.pyz.pyz.pycz.soz.slz.pydT) startswithsplitrfilenamerrr)r-imp_name base_pathexts rr6z!FixImport.probably_a_local_importUs   s # # 5>>#q))!,DM** H-- d79--}==>> 53uf=  Ci#o&& tt ur) __name__ __module__ __qualname__ BM_compatiblePATTERNr*r?r6 __classcell__)r0s@rr&r&&scMG >>>>>   Drr&N)r ros.pathrrrr fixer_utilr r r r$BaseFixr&rrrrQs ............0000000000666&===== "=====rfixes/__pycache__/fix_reload.cpython-311.opt-2.pyc000064400000002652151027012300015644 0ustar00 !A?h9J ddlmZddlmZmZGddejZdS)) fixer_base) ImportAndCall touch_importc eZdZdZdZdZdZdS) FixReloadTprez power< 'reload' trailer< lpar='(' ( not(arglist | argument) any ','> ) rpar=')' > after=any* > c|r5|d}|r+|j|jjkr|jdjdvrdSd}t |||}t dd||S)Nobj>***) importlibreloadr)typesymsargumentchildrenvaluerr)selfnoderesultsr namesnews 1/usr/lib64/python3.11/lib2to3/fixes/fix_reload.py transformzFixReload.transformsu  %.C H 222LO)[88F'D'511T;--- N)__name__ __module__ __qualname__ BM_compatibleorderPATTERNrrrrr s4M EG     rrN)r fixer_utilrrBaseFixrr#rrr'sm$ 44444444 "rfixes/__pycache__/fix_unicode.cpython-311.pyc000064400000004746151027012300015072 0ustar00 !A?hRdZddlmZddlmZdddZGddejZd S) zFixer for unicode. * Changes unicode to str and unichr to chr. * If "...\u..." is not unicode literal change it into "...\\u...". * Change u"..." into "...". )token) fixer_basechrstr)unichrunicodec,eZdZdZdZfdZdZxZS) FixUnicodeTzSTRING | 'unicode' | 'unichr'cvtt|||d|jv|_dS)Nunicode_literals)superr start_treefuture_featuresr )selftreefilename __class__s 2/usr/lib64/python3.11/lib2to3/fixes/fix_unicode.pyrzFixUnicode.start_trees9 j$**4::: 2d6J Jc|jtjkr-|}t|j|_|S|jtjkr|j}|js@|ddvr6d|vr2dd| dD}|ddvr |dd}||jkr|S|}||_|SdS)Nz'"\z\\cbg|],}|dddd-S)z\uz\\uz\Uz\\U)replace).0vs r z(FixUnicode.transform.. 
sF"""IIeV,,44UFCC"""ruU) typerNAMEclone_mappingvalueSTRINGr joinsplit)rnoderesultsnewvals r transformzFixUnicode.transforms 9 " "**,,C ,CIJ Y%, & &*C( SVu__jj"" YYu--"""1v~~!""gdj   **,,CCIJ' &r)__name__ __module__ __qualname__ BM_compatiblePATTERNrr, __classcell__)rs@rr r sVM-GKKKKKrr N)__doc__pgen2rrr#BaseFixr rrr8sy% 0 0#rfixes/__pycache__/fix_operator.cpython-311.opt-2.pyc000064400000013211151027012300016222 0ustar00 !A?hb ` ddlZddlmZddlmZmZmZmZdZ Gddej Z dS)N) fixer_base)CallNameString touch_importcfd}|S)Nc|_|SN) invocation)fss 3/usr/lib64/python3.11/lib2to3/fixes/fix_operator.pydeczinvocation..decs )r rs` rr r s# JrcneZdZdZdZdZdZdeeezZdZ e dd Z e d d Z e d d Z e ddZe ddZe ddZe ddZdZdZdZdS) FixOperatorTprez method=('isCallable'|'sequenceIncludes' |'isSequenceType'|'isMappingType'|'isNumberType' |'repeat'|'irepeat') z'(' obj=any ')'z power< module='operator' trailer< '.' %(methods)s > trailer< %(obj)s > > | power< %(methods)s trailer< %(obj)s > > )methodsobjcN|||}| |||SdSr ) _check_method)selfnoderesultsmethods r transformzFixOperator.transform+s7##D'22  6$(( (  rzoperator.contains(%s)c0|||dS)Ncontains_handle_renamerrrs r_sequenceIncludeszFixOperator._sequenceIncludes0s""4*===rz callable(%s)c|d}ttd|g|jS)Nrcallableprefix)rrcloner')rrrrs r _isCallablezFixOperator._isCallable4s4enD$$syy{{mDKHHHHrzoperator.mul(%s)c0|||dS)Nmulr r"s r_repeatzFixOperator._repeat9s""4%888rzoperator.imul(%s)c0|||dS)Nimulr r"s r_irepeatzFixOperator._irepeat=s""4&999rz(isinstance(%s, collections.abc.Sequence)c2|||ddS)Ncollections.abcSequence_handle_type2abcr"s r_isSequenceTypezFixOperator._isSequenceTypeAs$$T74EzRRRrz'isinstance(%s, collections.abc.Mapping)c2|||ddS)Nr1Mappingr3r"s r_isMappingTypezFixOperator._isMappingTypeEs$$T74EyQQQrzisinstance(%s, numbers.Number)c2|||ddS)NnumbersNumberr3r"s r _isNumberTypezFixOperator._isNumberTypeIs$$T7IxHHHrcX|dd}||_|dS)Nrr)valuechanged)rrrnamers rr!zFixOperator._handle_renameMs."1% rctd|||d}|tdd||gzg}t t d||jS)Nrz, . isinstancer&)rr(rjoinrrr')rrrmoduleabcrargss rr4zFixOperator._handle_type2abcRskT64(((en VD388VSM+B+B$BCCDD&&T[AAAArc t|d|ddjz}t|tjjr?d|vr|St |df}|j|z}||d|zdS)N_rrrErzYou should use '%s' here.) getattrr>rC collectionsrFCallablestrr warning)rrrrsubinvocation_strs rrzFixOperator._check_methodXssWX%6q%9%??@@ fko6 7 7 Q7"" 75>**,!'!2S!8 T#>#OPPPtrN)__name__ __module__ __qualname__ BM_compatibleorderrrdictPATTERNrr r#r)r,r/r5r8r<r!r4rrrrrrsM EG C Dc222 3G))) Z'((>>)(>ZII IZ"##99$#9Z#$$::%$:Z:;;SS<;SZ9::RR;:RZ011II21I BBB     rr) collections.abcrKlib2to3rlib2to3.fixer_utilrrrrr BaseFixrrrrr\s ????????????GGGGG*$GGGGGrfixes/__pycache__/fix_sys_exc.cpython-311.pyc000064400000004272151027012300015113 0ustar00 !A?h `dZddlmZddlmZmZmZmZmZm Z m Z Gddej Z dS)zFixer for sys.exc_{type, value, traceback} sys.exc_type -> sys.exc_info()[0] sys.exc_value -> sys.exc_info()[1] sys.exc_traceback -> sys.exc_info()[2] ) fixer_base)AttrCallNameNumber SubscriptNodesymscdeZdZgdZdZdddeDzZdZdS) FixSysExc)exc_type exc_value exc_tracebackTzN power< 'sys' trailer< dot='.' 
attribute=(%s) > > |c# K|] }d|zV dS)z'%s'N).0es 2/usr/lib64/python3.11/lib2to3/fixes/fix_sys_exc.py zFixSysExc.s&::AVaZ::::::c|dd}t|j|j}t t d|j}tt d|}|dj|djd_| t|ttj ||jS)N attributeexc_info)prefixsysdot)rrindexvaluerrrrchildrenappendrr r power)selfnoderesultssys_attrr callattrs r transformzFixSysExc.transforms;'*t}**8>::;;D$$X_===DKK&&%,U^%:Q" Ie$$%%%DJT[9999rN)__name__ __module__ __qualname__r BM_compatiblejoinPATTERNr+rrrr r s]999HMHH:::::::;G:::::rr N) __doc__r fixer_utilrrrrrr r BaseFixr rrrr6sHHHHHHHHHHHHHHHHHH::::: ":::::rfixes/__pycache__/fix_xrange.cpython-311.pyc000064400000010160151027012300014713 0ustar00 !A?h \dZddlmZddlmZmZmZddlmZGddejZ dS)z/Fixer that changes xrange(...) into range(...).) fixer_base)NameCallconsuming_calls)patcompceZdZdZdZfdZdZdZdZdZ dZ e j e Z d Ze j eZd ZxZS) FixXrangeTz power< (name='range'|name='xrange') trailer< '(' args=any ')' > rest=any* > ctt|||t|_dSN)superr start_treesettransformed_xranges)selftreefilename __class__s 1/usr/lib64/python3.11/lib2to3/fixes/fix_xrange.pyr zFixXrange.start_trees5 i))$999#&55   cd|_dSr )r)rrrs r finish_treezFixXrange.finish_trees#'   rc|d}|jdkr|||S|jdkr|||Stt |)Nnamexrangerange)valuetransform_xrangetransform_range ValueErrorreprrnoderesultsrs r transformzFixXrange.transformsev : ! !((w77 7 Z7 " "''g66 6T$ZZ(( (rc|d}|td|j|jt |dS)Nrrprefix)replacerr'raddidr!s rrzFixXrange.transform_xrange$sOv T'$+666777  $$RXX.....rcZt||jvr||stt d|dg}tt d|g|j}|dD]}|||SdSdS)Nrargslistr&rest)r*rin_special_contextrrcloner' append_child)rr"r# range_call list_callns rrzFixXrange.transform_range*s tHHD4 4 4''-- 5d7mmgfo.C.C.E.E-FGGJT&\\J<$(K111IV_ * *&&q))))  5 4 4 4rz3power< func=NAME trailer< '(' node=any ')' > any* >zfor_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > | comparison< any 'in' node=any any*> c |jdSi}|jjC|j|jj|r|d|ur|djtvS|j|j|o |d|uS)NFr"func)parentp1matchrrp2)rr"r#s rr/zFixXrange.in_special_context?s ; 5 K  *w}}T[/99 +v$&&6?(O; ;w}}T['22Nwv$7NNr)__name__ __module__ __qualname__ BM_compatiblePATTERNr rr$rrP1rcompile_patternr8P2r:r/ __classcell__)rs@rr r sMG )))))((()))///    ?B   $ $B B !  $ $B O O O O O O Orr N) __doc__r fixer_utilrrrrBaseFixr rrrIs654444444444=O=O=O=O=O "=O=O=O=O=Orfixes/__pycache__/fix_map.cpython-311.pyc000064400000011340151027012300014205 0ustar00 !A?h8|dZddlmZddlmZddlmZmZmZm Z m Z ddl m Z ddlmZGddejZd S) aFixer that changes map(F, ...) into list(map(F, ...)) unless there exists a 'from future_builtins import map' statement in the top-level namespace. As a special case, map(None, X) is changed into list(X). (This is necessary because the semantics are changed in this case -- the new map(None, X) is equivalent to [(x,) for x in X].) We avoid the transformation (except for the special case mentioned above) if the map() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. NOTE: This is still not correct if the original code was depending on map(F, X, Y, ...) to go on until the longest argument is exhausted, substituting None for missing values -- like zip(), it now stops as soon as the shortest argument is exhausted. 
)token) fixer_base)NameArgListCallListCompin_special_context)python_symbols)Nodec eZdZdZdZdZdZdS)FixMapTaL map_none=power< 'map' trailer< '(' arglist< 'None' ',' arg=any [','] > ')' > [extra_trailers=trailer*] > | map_lambda=power< 'map' trailer< '(' arglist< lambdef< 'lambda' (fp=NAME | vfpdef< '(' fp=NAME ')'> ) ':' xp=any > ',' it=any > ')' > [extra_trailers=trailer*] > | power< 'map' args=trailer< '(' [any] ')' > [extra_trailers=trailer*] > zfuture_builtins.mapcN||rdSg}d|vr2|dD])}||*|jjt jkrQ||d|}d|_ttd|g}nd|vr{t|d|d|d}tt j |g|zd }n_d |vr"|d }d|_nd |vr|d }|jt jkr|jd jt jkrd|jd jdjt"jkr9|jd jdjdkr||ddStt j td|g}d|_t)|rdStt j tdt+|gg|z}d|_|j|_|S)Nextra_trailerszYou should use a for loop herelist map_lambdaxpfpit)prefixmap_noneargargsNonezjcannot convert map(None, ...) with multiple arguments because map() now truncates to the shortest sequencemap) should_skipappendcloneparenttypesyms simple_stmtwarningrrrrr powertrailerchildrenarglistrNAMEvaluer r)selfnoderesultstrailerstnewrs ./usr/lib64/python3.11/lib2to3/fixes/fix_map.py transformzFixMap.transform@sn   D ! !  F w & &-. + + **** ; t/ / / LL? @ @ @**,,CCJtF||cU++CC W $ $74=..00"4=..00"4=..0022CtzC58#3B???CCW$$en**,, W$$"6?DyDL00}Q', <<}Q'038EJFF}Q'039VCC T,NOOOtzDKK+FGGC!#CJ%d++ 4tzDLL'3%..#AH#LMMCCJ[  N)__name__ __module__ __qualname__ BM_compatiblePATTERNskip_onr3r4r2r r s6MG:$G.....r4r N)__doc__pgen2rrr fixer_utilrrrrr pygramr r#pytreer ConditionalFixr r;r4r2rBs&JJJJJJJJJJJJJJ++++++PPPPPZ &PPPPPr4fixes/__pycache__/fix_print.cpython-311.pyc000064400000010044151027012300014564 0ustar00 !A?h dZddlmZddlmZddlmZddlmZddlmZm Z m Z m Z ej dZ Gdd ejZd S) a Fixer for print. Change: 'print' into 'print()' 'print ...' into 'print(...)' 'print ... ,' into 'print(..., end=" ")' 'print >>x, ...' into 'print(..., file=x)' No changes are applied if print_function is imported from __future__ )patcomp)pytree)token) fixer_base)NameCallCommaStringz"atom< '(' [atom|STRING|NAME] ')' >c"eZdZdZdZdZdZdS)FixPrintTzP simple_stmt< any* bare='print' any* > | print_stmt c (|sJ|d}|r9|ttdg|jdS|jdtdksJ|jdd}t |dkr"t|drdSdx}x}}|r$|dtkr |dd}d}|rb|dtj tj dkr9t |d ksJ|d}|d d}d |D}|r d |d_||||1||d t!t#||1||dt!t#||||d|ttd|} |j| _| S)Nbareprint)prefix z>>rc6g|]}|S)clone).0args 0/usr/lib64/python3.11/lib2to3/fixes/fix_print.py z&FixPrint.transform..?s ...##))++...sependfile)getreplacerrrchildrenlen parend_exprmatchr rLeafr RIGHTSHIFTr add_kwargr repr) selfnoderesults bare_printargsrr r!l_argsn_stmts r transformzFixPrint.transform%sw[[((     tDMM2&0&7 9 9 9 : : : F}Q4==0000}QRR  t99>>k//Q88> FcD  DH''9DC  DGv{5+;TBBBBt99>>>>7==??D8D.....  
"!F1I  ?co1AvufT#YY.?.?@@@vufT#YY.?.?@@@vvt444d7mmV,,   rc*d|_tj|jjt |tjtjd|f}|r(| td|_| |dS)Nr=r) rrNodesymsargumentrr(rEQUALappendr )r,l_nodess_kwdn_expr n_arguments rr*zFixPrint.add_kwargMs [!3"&u++"(+ek3"?"?"("*++   $ NN577 # # # #J z"""""rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr3r*rrrr r s?MG&&&P # # # # #rr N)__doc__rrrpgen2rr fixer_utilrrr r compile_patternr&BaseFixr rrrrIs  222222222222&g%6 :#:#:#:#:#z!:#:#:#:#:#rfixes/__pycache__/fix_ne.cpython-311.pyc000064400000002136151027012300014035 0ustar00 !A?h;TdZddlmZddlmZddlmZGddejZdS)zFixer that turns <> into !=.)pytree)token) fixer_basec(eZdZejZdZdZdS)FixNec|jdkS)Nz<>)value)selfnodes -/usr/lib64/python3.11/lib2to3/fixes/fix_ne.pymatchz FixNe.matchszT!!cRtjtjd|j}|S)Nz!=)prefix)rLeafrNOTEQUALr)r r resultsnews r transformzFixNe.transforms!k%.$t{CCC rN)__name__ __module__ __qualname__rr _accept_typer rrr rr s;>L"""rrN)__doc__rpgen2rrBaseFixrrrr rs|#"     J      rfixes/__pycache__/fix_next.cpython-311.pyc000064400000012154151027012300014412 0ustar00 !A?hf ~dZddlmZddlmZddlmZddlm Z m Z m Z dZ Gddej Zd Zd Zd Zd S) z.Fixer for it.next() -> next(it), per PEP 3114.)token)python_symbols) fixer_base)NameCall find_bindingz;Calls to builtin next() possibly shadowed by global bindingc0eZdZdZdZdZfdZdZxZS)FixNextTa power< base=any+ trailer< '.' attr='next' > trailer< '(' ')' > > | power< head=any+ trailer< '.' attr='next' > not trailer< '(' ')' > > | classdef< 'class' any+ ':' suite< any* funcdef< 'def' name='next' parameters< '(' NAME ')' > any+ > any* > > | global=global_stmt< 'global' any* 'next' any* > prectt|||td|}|r$||t d|_dSd|_dS)NnextTF)superr start_treerwarning bind_warning shadowed_next)selftreefilenamen __class__s //usr/lib64/python3.11/lib2to3/fixes/fix_next.pyrzFixNext.start_tree$sj gt''h777  & &  ' LLL ) ) )!%D   !&D   cd|sJ|d}|d}|d}|r|jr+|td|jdSd|D}d|d_|t td |j|dS|r-td|j}||dS|rt |rZ|d }dd |Dd kr| |tdS|tddSd |vr$| |td|_dSdS)Nbaseattrname__next__)prefixc6g|]}|S)clone.0rs r z%FixNext.transform..9s 000a 000rr headc,g|]}t|Sr!)strr#s rr%z%FixNext.transform..Es111qCFF111r __builtin__globalT) getrreplacerrris_assign_targetjoinstriprr)rnoderesultsrrrrr(s r transformzFixNext.transform.sw{{6""{{6""{{6""  &! K T*T[AAABBBBB004000!#Q T$vdk"B"B"BDIIJJJJJ  &Z 444A LLOOOOO  & %% v7711D1112288::mKKLL|444 LLj)) * * * * *  LL| , , ,!%D   ! r) __name__ __module__ __qualname__ BM_compatiblePATTERNorderrr4 __classcell__)rs@rr r sZM G E'''''&&&&&&&rr ct|}|dS|jD]-}|jtjkrdSt ||rdS.dS)NFT) find_assignchildrentyperEQUAL is_subtree)r2assignchilds rr/r/Qsc   F ~u : $ $55 t $ $ 44  5rc|jtjkr|S|jtjks|jdSt |jSN)r?syms expr_stmt simple_stmtparentr=)r2s rr=r=]sB yDN""  yD$$$ (;t t{ # ##rcT|krdStfd|jDS)NTc38K|]}t|VdSrE)rA)r$cr2s r zis_subtree..gs-::qz!T""::::::r)anyr>)rootr2s `rrArAds6 t||t ::::DM::: : ::rN)__doc__pgen2rpygramrrFr&r fixer_utilrrrrBaseFixr r/r=rAr!rrrUs44++++++1111111111L :&:&:&:&:&j :&:&:&@   $$$;;;;;rfixes/__pycache__/fix_operator.cpython-311.opt-1.pyc000064400000014156151027012300016232 0ustar00 !A?hb bdZddlZddlmZddlmZmZmZm Z dZ Gddej Z dS)aFixer for operator functions. 
operator.isCallable(obj) -> callable(obj) operator.sequenceIncludes(obj) -> operator.contains(obj) operator.isSequenceType(obj) -> isinstance(obj, collections.abc.Sequence) operator.isMappingType(obj) -> isinstance(obj, collections.abc.Mapping) operator.isNumberType(obj) -> isinstance(obj, numbers.Number) operator.repeat(obj, n) -> operator.mul(obj, n) operator.irepeat(obj, n) -> operator.imul(obj, n) N) fixer_base)CallNameString touch_importcfd}|S)Nc|_|SN) invocation)fss 3/usr/lib64/python3.11/lib2to3/fixes/fix_operator.pydeczinvocation..decs )r rs` rr r s# JrcneZdZdZdZdZdZdeeezZdZ e dd Z e d d Z e d d Z e ddZe ddZe ddZe ddZdZdZdZdS) FixOperatorTprez method=('isCallable'|'sequenceIncludes' |'isSequenceType'|'isMappingType'|'isNumberType' |'repeat'|'irepeat') z'(' obj=any ')'z power< module='operator' trailer< '.' %(methods)s > trailer< %(obj)s > > | power< %(methods)s trailer< %(obj)s > > )methodsobjcN|||}| |||SdSr ) _check_method)selfnoderesultsmethods r transformzFixOperator.transform+s7##D'22  6$(( (  rzoperator.contains(%s)c0|||dS)Ncontains_handle_renamerrrs r_sequenceIncludeszFixOperator._sequenceIncludes0s""4*===rz callable(%s)c|d}ttd|g|jS)Nrcallableprefix)rrcloner')rrrrs r _isCallablezFixOperator._isCallable4s4enD$$syy{{mDKHHHHrzoperator.mul(%s)c0|||dS)Nmulr r"s r_repeatzFixOperator._repeat9s""4%888rzoperator.imul(%s)c0|||dS)Nimulr r"s r_irepeatzFixOperator._irepeat=s""4&999rz(isinstance(%s, collections.abc.Sequence)c2|||ddS)Ncollections.abcSequence_handle_type2abcr"s r_isSequenceTypezFixOperator._isSequenceTypeAs$$T74EzRRRrz'isinstance(%s, collections.abc.Mapping)c2|||ddS)Nr1Mappingr3r"s r_isMappingTypezFixOperator._isMappingTypeEs$$T74EyQQQrzisinstance(%s, numbers.Number)c2|||ddS)NnumbersNumberr3r"s r _isNumberTypezFixOperator._isNumberTypeIs$$T7IxHHHrcX|dd}||_|dS)Nrr)valuechanged)rrrnamers rr!zFixOperator._handle_renameMs."1% rctd|||d}|tdd||gzg}t t d||jS)Nrz, . isinstancer&)rr(rjoinrrr')rrrmoduleabcrargss rr4zFixOperator._handle_type2abcRskT64(((en VD388VSM+B+B$BCCDD&&T[AAAArc t|d|ddjz}t|tjjr?d|vr|St |df}|j|z}||d|zdS)N_rrrErzYou should use '%s' here.) getattrr>rC collectionsrFCallablestrr warning)rrrrsubinvocation_strs rrzFixOperator._check_methodXssWX%6q%9%??@@ fko6 7 7 Q7"" 75>**,!'!2S!8 T#>#OPPPtrN)__name__ __module__ __qualname__ BM_compatibleorderrrdictPATTERNrr r#r)r,r/r5r8r<r!r4rrrrrrsM EG C Dc222 3G))) Z'((>>)(>ZII IZ"##99$#9Z#$$::%$:Z:;;SS<;SZ9::RR;:RZ011II21I BBB     rr) __doc__collections.abcrKlib2to3rlib2to3.fixer_utilrrrrr BaseFixrrrrr]s  ????????????GGGGG*$GGGGGrfixes/__pycache__/fix_idioms.cpython-311.pyc000064400000013700151027012300014716 0ustar00 !A?h ddZddlmZddlmZmZmZmZmZm Z dZ dZ Gddej Z dS) aAdjust some old Python 2 idioms to their modern counterparts. * Change some type comparisons to isinstance() calls: type(x) == T -> isinstance(x, T) type(x) is T -> isinstance(x, T) type(x) != T -> not isinstance(x, T) type(x) is not T -> not isinstance(x, T) * Change "while 1:" into "while True:". 
* Change both v = list(EXPR) v.sort() foo(v) and the more general v = EXPR v.sort() foo(v) into v = sorted(EXPR) foo(v) ) fixer_base)CallCommaNameNode BlankLinesymsz0(n='!=' | '==' | 'is' | n=comp_op< 'is' 'not' >)z(power< 'type' trailer< '(' x=any ')' > >c XeZdZdZdedededed ZfdZdZdZ d Z d Z xZ S) FixIdiomsTz isinstance=comparison<  z8 T=any > | isinstance=comparison< T=any aX > | while_stmt< 'while' while='1' ':' any+ > | sorted=any< any* simple_stmt< expr_stmt< id1=any '=' power< list='list' trailer< '(' (not arglist) any ')' > > > '\n' > sort= simple_stmt< power< id2=any trailer< '.' 'sort' > trailer< '(' ')' > > '\n' > next=any* > | sorted=any< any* simple_stmt< expr_stmt< id1=any '=' expr=any > '\n' > sort= simple_stmt< power< id2=any trailer< '.' 'sort' > trailer< '(' ')' > > '\n' > next=any* > ctt||}|rd|vr|d|dkr|SdS|S)Nsortedid1id2)superr match)selfnoder __class__s 1/usr/lib64/python3.11/lib2to3/fixes/fix_idioms.pyrzFixIdioms.matchOsT )T " " ( ( . .  Qx1U8##4cd|vr|||Sd|vr|||Sd|vr|||Std)N isinstancewhilerz Invalid match)transform_isinstancetransform_whiletransform_sort RuntimeError)rrresultss r transformzFixIdioms.transformZss 7 " ",,T7;; ;   ''g66 6  &&tW55 5// /rcb|d}|d}d|_d|_ttd|t |g}d|vr0d|_t t jtd|g}|j|_|S)NxTr rnnot)cloneprefixrrrrr not_test)rrr r#r$tests rrzFixIdioms.transform_isinstanceds CL    CL   D&&EGGQ88 '>>DK U T':;;Dk  rch|d}|td|jdS)NrTruer))replacerr))rrr ones rrzFixIdioms.transform_whileps3g D 33344444rc@|d}|d}|d}|d}|r*|td|jne|rT|}d|_|t td|g|jnt d||j}d |vr|rJ|d d |d jf} d | |d _dS|j sJ|j Jt} |j | |j | usJ|d d | _dSdS) Nsortnextlistexprrr.r%zshould not have reached here )getr/rr)r(rrremove rpartitionjoinparent next_siblingr append_child) rrr sort_stmt next_stmt list_call simple_exprnewbtwn prefix_linesend_lines rrzFixIdioms.transform_sorttsFO FO KK'' kk&))  ?   d8I4DEEE F F F F  ?##%%CCJ   T(^^cU,7,>!@!@!@ A A A A=>> > 4<< ;!% 5 5a 8)A,:MN &*ii &=&= ! ### '''' -555$;; --h777 -9999#'//$"7"7":! rSs<AAAAAAAAAAAAAAAA81s;s;s;s;s; "s;s;s;s;s;rfixes/__pycache__/fix_methodattrs.cpython-311.pyc000064400000002403151027012300015766 0ustar00 !A?h^TdZddlmZddlmZddddZGdd ejZd S) z;Fix bound method attributes (method.im_? -> method.__?__). ) fixer_base)Name__func____self__z__self__.__class__)im_funcim_selfim_classceZdZdZdZdZdS)FixMethodattrsTzU power< any+ trailer< '.' attr=('im_func' | 'im_self' | 'im_class') > any* > c|dd}t|j}|t||jdS)Nattr)prefix)MAPvaluereplacerr)selfnoderesultsr news 6/usr/lib64/python3.11/lib2to3/fixes/fix_methodattrs.py transformzFixMethodattrs.transformsBvq!$*o T#dk22233333N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrr r s/MG44444rr N)__doc__r fixer_utilrrBaseFixr rrrr$s % 4 4 4 4 4Z' 4 4 4 4 4rfixes/__pycache__/fix_itertools.cpython-311.opt-1.pyc000064400000004127151027012300016420 0ustar00 !A?h HdZddlmZddlmZGddejZdS)aT Fixer for itertools.(imap|ifilter|izip) --> (map|filter|zip) and itertools.ifilterfalse --> itertools.filterfalse (bugs 2360-2363) imports from itertools are fixed in fix_itertools_import.py If itertools is imported as something else (ie: import itertools as it; it.izip(spam, eggs)) method calls will not get fixed. ) fixer_base)Namec:eZdZdZdZdezZdZdZdS) FixItertoolsTz7('imap'|'ifilter'|'izip'|'izip_longest'|'ifilterfalse')z power< it='itertools' trailer< dot='.' 
fixes/__pycache__/fix_funcattrs.cpython-311.opt-2.pyc (bytecode only; renames
f.func_closure, f.func_doc, f.func_globals, f.func_name, f.func_defaults,
f.func_code and f.func_dict to the corresponding __x__ attributes)

fixes/__pycache__/fix_repr.cpython-311.pyc (recovered docstring):
Fixer that transforms `xyzzy` into repr(xyzzy).

fixes/__pycache__/fix_print.cpython-311.opt-1.pyc (recovered docstring):
Fixer for print.

Change:
    'print'          into 'print()'
    'print ...'      into 'print(...)'
    'print ... ,'    into 'print(..., end=" ")'
    'print >>x, ...' into 'print(..., file=x)'

No changes are applied if print_function is imported from __future__.

fixes/__pycache__/fix_basestring.cpython-311.pyc (recovered docstring):
Fixer for basestring -> str.
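A minimal sketch (illustration only, not part of the recovered sources) of
the output forms the print and repr fixers produce:

    import sys

    print("hello")                   # was: print 'hello'
    print("partial", end=" ")        # was: print 'partial',
    print("rest")
    print("error", file=sys.stderr)  # was: print >>sys.stderr, 'error'
    print(repr(2 + 2))               # was: print `2 + 2`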
fixes/__pycache__/fix_tuple_params.cpython-311.opt-2.pyc (bytecode only; the
full fix_tuple_params docstring is recovered further below)

fixes/__pycache__/fix_reload.cpython-311.pyc (recovered docstring):
Fixer for reload().
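A minimal sketch (illustration only, not part of the recovered sources) of
the modern replacement the reload fixer emits:

    import importlib
    import json

    json = importlib.reload(json)   # was: reload(json)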
    reload(s) -> importlib.reload(s)

fixes/__pycache__/fix_dict.cpython-311.pyc (recovered docstring):
Fixer for dict methods.

    d.keys()       -> list(d.keys())
    d.items()      -> list(d.items())
    d.values()     -> list(d.values())
    d.iterkeys()   -> iter(d.keys())
    d.iteritems()  -> iter(d.items())
    d.itervalues() -> iter(d.values())
    d.viewkeys()   -> d.keys()
    d.viewitems()  -> d.items()
    d.viewvalues() -> d.values()

Except in certain very specific contexts: the iter() can be dropped when the
context is list(), sorted(), iter() or for...in; the list() can be dropped
when the context is list() or sorted() (but not iter() or for...in!). Special
contexts that apply to both: list(), sorted(), tuple(), set(), any(), all(),
sum().

Note: iter(d.keys()) could be written as iter(d) but since the original
d.iterkeys() was also redundant we don't fix this. And there are (rare)
contexts where it makes a difference (e.g. when passing it as an argument to
a function that introspects the argument).

fixes/__pycache__/fix_isinstance.cpython-311.pyc (recovered docstring):
Fixer that cleans up a tuple argument to isinstance after the tokens in it
were fixed. This is mainly used to remove double occurrences of tokens as a
leftover of the long -> int / unicode -> str conversion. eg.
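A minimal sketch (illustration only, not part of the recovered sources) of
the dict-method rewrites, including the contexts where the wrapper is dropped:

    d = {"a": 1, "b": 2}

    keys = list(d.keys())        # was: keys = d.keys()
    for k in d.keys():           # iter(...) wrapper is dropped in a for loop
        pass
    pairs = sorted(d.items())    # list(...) wrapper is dropped inside sorted()
    assert pairs == [("a", 1), ("b", 2)]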
    isinstance(x, (int, long)) -> isinstance(x, (int, int)) -> isinstance(x, int)

fixes/__pycache__/fix_metaclass.cpython-311.opt-2.pyc (bytecode only; the full
fix_metaclass docstring is recovered further below)
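A one-line sketch (illustration only, not part of the recovered sources) of
the cleanup the isinstance fixer performs:

    x = 7
    # long -> int rewriting first yields isinstance(x, (int, int));
    # the cleanup pass collapses the duplicate to:
    assert isinstance(x, int)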
fixes/__pycache__/fix_funcattrs.cpython-311.opt-1.pyc (recovered docstring):
Fix function attribute names (f.func_x -> f.__x__).

fixes/__pycache__/fix_except.cpython-311.opt-1.pyc (recovered docstring):
Fixer for except statements with named exceptions.

The following cases will be converted:

- "except E, T:" where T is a name:

    except E as T:

- "except E, T:" where T is not a name, tuple or list:

    except E as t:
        T = t

  This is done because the target of an "except" clause must be a name.

- "except E, T:" where T is a tuple or list literal:

    except E as t:
        T = t.args

fixes/__pycache__/fix_getcwdu.cpython-311.opt-2.pyc (bytecode only; renames
os.getcwdu() to os.getcwd())
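A minimal sketch (illustration only, not part of the recovered sources) of
the converted except forms in modern Python:

    try:
        1 / 0
    except ZeroDivisionError as e:   # was: except ZeroDivisionError, e:
        assert e.args

    try:
        raise ValueError("a", "b")
    except ValueError as t:          # was: except ValueError, (first, second):
        first, second = t.args
        assert (first, second) == ("a", "b")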
fixes/__pycache__/fix_itertools_imports.cpython-311.opt-1.pyc (recovered docstring):
Fixer for imports of itertools.(imap|ifilter|izip|ifilterfalse).

fixes/__pycache__/fix_xreadlines.cpython-311.opt-1.pyc (recovered docstring):
Fix "for x in f.xreadlines()" -> "for x in f".

This fixer will also convert g(f.xreadlines) into g(f.__iter__).
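A minimal sketch (illustration only, not part of the recovered sources) of
the itertools and xreadlines rewrites in modern Python:

    import io
    from itertools import filterfalse   # was: from itertools import ifilterfalse

    odds = list(filterfalse(lambda n: n % 2 == 0, range(6)))  # was: ifilterfalse
    pairs = list(zip("ab", "cd"))                             # was: itertools.izip
    squares = list(map(lambda n: n * n, range(3)))            # was: itertools.imap
    assert odds == [1, 3, 5]

    buf = io.StringIO("one\ntwo\n")
    for line in buf:    # was: for line in f.xreadlines():
        pass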
import_as_name< any 'as' any > | import_as_names< any* >) [')'] > z(import_name< 'import' (dotted_as_name< (zg) 'as' any > | multiple_imports=dotted_as_names< any* dotted_as_name< (z!) 'as' any > any* >) > z3power< bare_with_attr=(%s) trailer<'.' any > any* >)rIrMkeys)mappingmod_list bare_namess rL build_patternrYAszzGGwGGGHHHGLLNN++JJ888 %%%%  888 %%%% @* LLLLLLrNcNeZdZdZdZeZdZdZfdZ fdZ fdZ dZ xZ S) FixImportsTcPdt|jS)NrG)rIrYrV)selfs rLrYzFixImports.build_pattern`sxx dl33444rNc||_tt|dSN)rYPATTERNsuperr[compile_pattern)r^ __class__s rLrczFixImports.compile_patterncs:))++  j$//11111rNctt|j|}|r1d|vr+tfdt |dDrdS|SdS)Nbare_with_attrc3.K|]}|VdSr`rQ)rRobjmatchs rL z#FixImports.match..qs+IIsc IIIIIIrNparentF)rbr[rianyr)r^noderesultsrirds @rLrizFixImports.matchjsvj$''-%++   w..IIIIjx.H.HIIIII/uNurNchtt|||i|_dSr`)rbr[ start_treereplace)r^treefilenamerds rLrpzFixImports.start_treevs. j$**4::: rNc|d}|r|j}|j|}|t ||jd|vr ||j|<d|vr/||}|r|||dSdSdS|dd}|j|j}|r+|t ||jdSdS)N module_name)prefix name_importmultiple_importsrf)getvaluerVrqrrvri transform)r^rmrn import_modmod_namenew_name bare_names rLr|zFixImports.transformzs'[[//  K!'H|H-H   tHZ5FGGG H H H''*2 X&!W,, **T**2NN411111-, 22 01!4I|'' 88H K!!$x 8H"I"I"IJJJJJ K KrN)__name__ __module__ __qualname__ BM_compatiblekeep_line_orderMAPPINGrV run_orderrYrcrirpr| __classcell__)rds@rLr[r[UsMOGI55522222     KKKKKKKrNr[N) r fixer_utilrrrrMrYBaseFixr[rQrNrLrs5))))))))2 :2  2  h2  :2  y 2  G 2  > 2  >2  92  -2  /2  12  32  32  32  %2  M!2 2 " ^#2 $ /%2 & 1'2 ( -)2 * -+2 , --2 . i/2 0 12 2 h32 4 Y52 6 ?72 : Y;2 < j=2 > *?2 @ 9A2 B C2 D oE2 2 F"1#-'#(*,)#'%&/c2 2 2 j444"MMMM(<K<K<K<K<K#<K<K<K<K<KrNfixes/__pycache__/fix_xreadlines.cpython-311.pyc000064400000003075151027012300015574 0ustar00 !A?hHdZddlmZddlmZGddejZdS)zpFix "for x in f.xreadlines()" -> "for x in f". This fixer will also convert g(f.xreadlines) into g(f.__iter__).) fixer_base)NameceZdZdZdZdZdS) FixXreadlinesTz power< call=any+ trailer< '.' 'xreadlines' > trailer< '(' ')' > > | power< any+ trailer< '.' no_call='xreadlines' > > c|d}|r+|td|jdS|d|dDdS)Nno_call__iter__)prefixc6g|]}|S)clone).0xs 5/usr/lib64/python3.11/lib2to3/fixes/fix_xreadlines.py z+FixXreadlines.transform..s ===!''))===call)getreplacerr )selfnoderesultsrs r transformzFixXreadlines.transformsm++i((  ? OODGNCCC D D D D D LL==WV_=== > > > > >rN)__name__ __module__ __qualname__ BM_compatiblePATTERNrr rrrr s/MG ?????rrN)__doc__r fixer_utilrBaseFixrr rrr#snDD ?????J&?????rfixes/__pycache__/fix_raw_input.cpython-311.opt-1.pyc000064400000002060151027012300016376 0ustar00 !A?hHdZddlmZddlmZGddejZdS)z2Fixer that changes raw_input(...) into input(...).) fixer_base)NameceZdZdZdZdZdS) FixRawInputTzU power< name='raw_input' trailer< '(' [any] ')' > any* > ch|d}|td|jdS)Nnameinput)prefix)replacerr )selfnoderesultsrs 4/usr/lib64/python3.11/lib2to3/fixes/fix_raw_input.py transformzFixRawInput.transforms2v T'$+66677777N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrrs/MG88888rrN)__doc__r fixer_utilrBaseFixrrrrrsh88 8 8 8 8 8*$ 8 8 8 8 8rfixes/__pycache__/fix_apply.cpython-311.opt-1.pyc000064400000005371151027012300015523 0ustar00 !A?h* hdZddlmZddlmZddlmZddlmZmZm Z Gddej Z dS) zIFixer for apply(). This converts apply(func, v, k) into (func)(*v, **k).)pytree)token) fixer_base)CallComma parenthesizeceZdZdZdZdZdS)FixApplyTa. 
power< 'apply' trailer< '(' arglist< (not argument ')' > > c~|j}|d}|d}|d}|r+|j|jjkr|jdjdvrdS|r-|j|jjkr|jdjdkrdS|j}|}|jtj |j fvr?|j|j ks |jdjtj krt|}d|_|}d|_||}d|_tjtjd |g}|N|t%tjtj d|gd |d_t'||| S) Nfuncargskwds>***rr )prefix)symsgettypeargumentchildrenvaluerclonerNAMEatompower DOUBLESTARrrLeafSTARextendrr) selfnoderesultsrr r rr l_newargss 0/usr/lib64/python3.11/lib2to3/fixes/fix_apply.py transformzFixApply.transformsyvv{{6""   TY/// a &+55  TY$)"444]1%+t33 Fzz|| Iej$)4 4 4 Y$* $ $ ]2  #u'7 7 7%%D zz||  ::<r5s99 22222222226464646464z!6464646464r*fixes/__pycache__/fix_future.cpython-311.opt-1.pyc000064400000001761151027012300015707 0ustar00 !A?h#HdZddlmZddlmZGddejZdS)zVRemove __future__ imports from __future__ import foo is replaced with an empty line. ) fixer_base) BlankLinec eZdZdZdZdZdZdS) FixFutureTz;import_from< 'from' module_name="__future__" 'import' any > c:t}|j|_|S)N)rprefix)selfnoderesultsnews 1/usr/lib64/python3.11/lib2to3/fixes/fix_future.py transformzFixFuture.transformskk[  N)__name__ __module__ __qualname__ BM_compatiblePATTERN run_orderrrrrr s4MOGIrrN)__doc__r fixer_utilrBaseFixrrrrrsl""""""      "     rfixes/__pycache__/fix_raw_input.cpython-311.pyc000064400000002060151027012300015437 0ustar00 !A?hHdZddlmZddlmZGddejZdS)z2Fixer that changes raw_input(...) into input(...).) fixer_base)NameceZdZdZdZdZdS) FixRawInputTzU power< name='raw_input' trailer< '(' [any] ')' > any* > ch|d}|td|jdS)Nnameinput)prefix)replacerr )selfnoderesultsrs 4/usr/lib64/python3.11/lib2to3/fixes/fix_raw_input.py transformzFixRawInput.transforms2v T'$+66677777N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrrs/MG88888rrN)__doc__r fixer_utilrBaseFixrrrrrsh88 8 8 8 8 8*$ 8 8 8 8 8rfixes/__pycache__/fix_asserts.cpython-311.opt-1.pyc000064400000003317151027012300016060 0ustar00 !A?hpdZddlmZddlmZedddddd d dddddd d ZGddeZdS)z5Fixer that replaces deprecated unittest method names.)BaseFix)Name assertTrue assertEqualassertNotEqualassertAlmostEqualassertNotAlmostEqual assertRegexassertRaisesRegex assertRaises assertFalse)assert_ assertEqualsassertNotEqualsassertAlmostEqualsassertNotAlmostEqualsassertRegexpMatchesassertRaisesRegexpfailUnlessEqual failIfEqualfailUnlessAlmostEqualfailIfAlmostEqual failUnlessfailUnlessRaisesfailIfcXeZdZddeeezZdZdS) FixAssertszH power< any+ trailer< '.' meth=(%s)> any* > |c|dd}|ttt||jdS)Nmeth)prefix)replacerNAMESstrr")selfnoderesultsnames 2/usr/lib64/python3.11/lib2to3/fixes/fix_asserts.py transformzFixAsserts.transform sBvq! T%D *4;???@@@@@N) __name__ __module__ __qualname__joinmapreprr$PATTERNr+r,r*rrsOHHSSu--../GAAAAAr,rN)__doc__ fixer_baser fixer_utilrdictr$rr4r,r*r9s;;!   $*0%*! -,#    $AAAAAAAAAAr,fixes/__pycache__/fix_buffer.cpython-311.pyc000064400000002102151027012300014675 0ustar00 !A?hNHdZddlmZddlmZGddejZdS)z4Fixer that changes buffer(...) into memoryview(...).) fixer_base)Namec eZdZdZdZdZdZdS) FixBufferTzR power< name='buffer' trailer< '(' [any] ')' > any* > ch|d}|td|jdS)Nname memoryview)prefix)replacerr )selfnoderesultsrs 1/usr/lib64/python3.11/lib2to3/fixes/fix_buffer.py transformzFixBuffer.transforms2v T,t{;;;<<<<<N)__name__ __module__ __qualname__ BM_compatibleexplicitPATTERNrrrrr s4MHG=====rrN)__doc__r fixer_utilrBaseFixrrrrrsj;: = = = = = " = = = = =rfixes/__pycache__/fix_metaclass.cpython-311.pyc000064400000025005151027012300015407 0ustar00 !A?h dZddlmZddlmZddlmZmZmZdZ dZ dZ dZ d Z d ZGd d ejZd S)aFixer for __metaclass__ = X -> (metaclass=X) methods. 
The various forms of classef (inherits nothing, inherits once, inherits many) don't parse the same in the CST so we look at ALL classes for a __metaclass__ and if we find one normalize the inherits to all be an arglist. For one-liner classes ('class X: pass') there is no indent/dedent so we normalize those into having a suite. Moving the __metaclass__ into the classdef can also cause the class body to be empty so there is some special casing for that as well. This fixer also tries very hard to keep original indenting and spacing in all those corner cases. ) fixer_base)token)symsNodeLeafcP|jD]}|jtjkrt |cS|jtjkr`|jrY|jd}|jtjkr7|jr0|jd}t|tr|j dkrdSdS)z we have to check the cls_node without changing it. There are two possibilities: 1) clsdef => suite => simple_stmt => expr_stmt => Leaf('__meta') 2) clsdef => simple_stmt => expr_stmt => Leaf('__meta') __metaclass__TF) childrentypersuite has_metaclass simple_stmt expr_stmt isinstancervalue)parentnode expr_node left_sides 4/usr/lib64/python3.11/lib2to3/fixes/fix_metaclass.pyrrs     9 " " && & & & Y$* * *t} * a(I~//I4F/%.q1 i.. !?::44 5c |jD]}|jtjkrdSt |jD]\}}|jt jkrntdttjg}|j|dzdr]|j|dz}| | | |j|dzd]| ||}dS)zf one-line classes don't get a suite in the parse tree so we add one to normalize the tree NzNo class suite and no ':'!) r r rr enumeraterCOLON ValueErrorr append_childcloneremove)cls_noderir move_nodes rfixup_parse_treer$-s ! 9 " " FF # X.//774 9 # # E $5666 R E  AaCDD !%ac*  9??,,---  AaCDD ! %   DDDrcnt|jD]\}}|jtjkrndS|t tjg}t tj |g}|j|drW|j|}| | ||j|dW| |||jdjd}|jdjd} | j |_ dS)z if there is a semi-colon all the parts count as part of the same simple_stmt. We just want the __metaclass__ part so we move everything after the semi-colon into its own simple_stmt node Nr )rr r rSEMIr rrrrrr insert_childprefix) rr" stmt_nodesemi_indrnew_exprnew_stmtr# new_leaf1 old_leaf1s rfixup_simple_stmtr/Gs. $I$677$ 9 " " E # KKMMMDNB''HD$xj11H  XYY '&x0 ioo//000  XYY ' 8$$$!!$-a0I"1%.q1I 'Irc|jrA|jdjtjkr#|jddSdSdS)N)r r rNEWLINEr )rs rremove_trailing_newliner3_sQ }#r*/5=@@ b  """""##@@rc#K|jD]}|jtjkrnt dt t |jD]\}}|jtjkr|jr}|jd}|jtjkr[|jrT|jd}t|tr2|j dkr't|||t||||fVdS)NzNo class suite!r r )r r rr rlistrrrrrrr/r3)r!rr" simple_noder left_nodes r find_metasr8ds!,, 9 " " E #*+++y7788 1 1;  t/ / /K4H /#,Q/I~//I4F/%.q1 i..1!?::%dA{;;;+K888K0000 1 1rcp|jddd}|r,|}|jtjkrn|,|ru|}t |t r%|jtjkr|jrd|_dS| |jddd|sdSdS)z If an INDENT is followed by a thing with a prefix then nuke the prefix Otherwise we get in trouble when removing __metaclass__ at suite start Nr1) r popr rINDENTrrDEDENTr(extend)r kidsrs r fixup_indentr@{s >$$B$ D xxzz 9 $ $   -xxzz dD ! ! -di5<&?&?{ !  F KK ddd+ , , , -----rceZdZdZdZdZdS) FixMetaclassTz classdef ct|sdSt|d}t|D]\}}}|}||jdj}t |jdkr|jdjtjkr|jd}nN|jd } ttj| g}| d|nt |jdkr1ttjg}| d|nt |jdkrttjg}| dttjd| d|| dttjdnt#d |jdjd} d | _| j} |jr5|ttjd d | _nd | _|jd} | jtjksJd | jd_d | jd_||t/||jso|t|d} | | _|| |ttjddSt |jdkr|jdjtjkrx|jdjtjkrZt|d} | d| | dttjddSdSdSdS)Nr r)(zUnexpected class definition metaclass, r:rpass r1)rr$r8r r r lenrarglistrr set_childr'rrRPARLPARrrr(rCOMMArr@r2r<r=)selfrresultslast_metaclassr r"stmt text_typerQrmeta_txtorig_meta_prefixr pass_leafs r transformzFixMetaclass.transformsT""  F(..  NE1d!N KKMMMMM!$)  t}   " "}Q$ 44-*q)//11t|fX66q'****   1 $ $4<,,G   a ) ) ) )   1 $ $4<,,G   aej#!6!6 7 7 7   a ) ) )   aej#!6!6 7 7 7 7:;; ;"*1-6q9$#?   !  ek3!7!7 8 8 8!HOO HO#+A. 
~////') 1$') 1$^,,,U~ > LLNNNY//I/I    i ( ( (   d5=$77 8 8 8 8 8  1 $ $.$)U\99.$)U\99Y//I   r9 - - -   r4 t#<#< = = = = = % $9999rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr^rrrBrBs4MGL>L>L>L>L>rrBN)__doc__r:rpygramr fixer_utilrrrrr$r/r3r8r@BaseFixrBrdrrris())))))))))&4(((0### 111.---,S>S>S>S>S>:%S>S>S>S>S>rfixes/__pycache__/fix_urllib.cpython-311.opt-2.pyc000064400000023233151027012300015665 0ustar00 !A?h  ddlmZmZddlmZmZmZmZmZm Z m Z dgdfdgdfddgfgdgd fdd d gfgd Z e d  e dddZ GddeZdS)) alternates FixImports)NameComma FromImportNewlinefind_indentationNodesymszurllib.request) URLopenerFancyURLopener urlretrieve _urlopenerurlopen urlcleanup pathname2url url2pathname getproxiesz urllib.parse)quote quote_plusunquote unquote_plus urlencode splitattr splithost splitnport splitpasswd splitport splitquerysplittag splittype splituser splitvaluez urllib.errorContentTooShortError)rinstall_opener build_openerRequestOpenerDirector BaseHandlerHTTPDefaultErrorHandlerHTTPRedirectHandlerHTTPCookieProcessor ProxyHandlerHTTPPasswordMgrHTTPPasswordMgrWithDefaultRealmAbstractBasicAuthHandlerHTTPBasicAuthHandlerProxyBasicAuthHandlerAbstractDigestAuthHandlerHTTPDigestAuthHandlerProxyDigestAuthHandler HTTPHandler HTTPSHandler FileHandler FTPHandlerCacheFTPHandlerUnknownHandlerURLError HTTPError)urlliburllib2r?r>c #Kt}tD]P\}}|D]H}|\}}t|}d|d|dVd|d|d|dVd|zVd |zVd |d |d VIQdS) Nzimport_name< 'import' (module=zB | dotted_as_names< any* module=z any* >) > zimport_from< 'from' mod_member=z* 'import' ( member=z | import_as_name< member=z] 'as' any > | import_as_names< members=any* >) > zIimport_from< 'from' module_star=%r 'import' star='*' > ztimport_name< 'import' dotted_as_name< module_as=%r 'as' any > > zpower< bare_with_attr=z trailer< '.' member=z > any* > )setMAPPINGitemsr)bare old_modulechangeschange new_modulememberss 1/usr/lib64/python3.11/lib2to3/fixes/fix_urllib.py build_patternrL0s 55D&}}.. G . .F"( J ))GG$ZZZ1 1 1 1 1 $WWWggg7 7 7 7"# # # #"# # # # # $WWW. . . . .! ...c,eZdZdZdZdZdZdZdS) FixUrllibcDdtS)N|)joinrL)selfs rKrLzFixUrllib.build_patternIsxx (((rMc |d}|j}g}t|jddD]:}|t |d|t g;|t t|jdd|||dS)Nmodulerprefix) getrXrCvalueextendrrappendreplace)rSnoderesults import_modprefnamesnames rKtransform_importzFixUrllib.transform_importLs [[**  J,-crc2 @ @D LL$tAwt444egg> ? ? ? ? T'*"23B7:4HHHIII5!!!!!rMc |d}|j}|d}|rt|tr|d}d}t|jD]}|j|dvr |d}n|r&|t||dS||ddSg}i} |d} | D]}|j tj kr%|j dj} |j dj} n |j} d} | d krst|jD]`}| |dvrT|d| vr| |d| |dg |ag} t|}d }d }|D]}| |}g}|dd D]B}||||| t#C|||d |t%||}|r|jj|r||_| |d }| rdg}| dd D]%}||t+g&| | d ||dS||ddS)N mod_membermemberrr@rW!This is an invalid module elementrJ,TcL|jtjkryt|jdj||jd|jdg}ttj|gSt|j|gS)NrrWr@ri)typer import_as_namerchildrenrZcloner )rcrXkidss rK handle_namez/FixUrllib.transform_member..handle_names9 333 q!1!7GGG M!,2244 M!,22446D!!4d;;<<TZ77788rMrVFzAll module elements are invalid)rYrX isinstancelistrCrZr]rcannot_convertrlr rmrnr\ setdefaultr r[rrparentendswithr)rSr^r_rfrargnew_namerHmodulesmod_dictrJas_name member_name new_nodes indentationfirstrqrUeltsrbeltnewnodesnew_nodes rKtransform_memberzFixUrllib.transform_member\se [[..  X&& @ M&$'' #H!*"23  <6!9,,%ayHE- O""4#>#>#>?????##D*MNNNNN GHi(G! 
N N;$"555$oa06G"(/!"4":KK"(,K"G#%%")**:";NN&&)33%ay88 'vay 9 9 9$//q 2>>EEfMMMI*400KE 9 9 9"  '9**CLLS$!7!7888LL)))) [[b488999 //- 2 ; ;K H H-!,CJ  %%% M )#2#88HLL(GII!67777 Yr]+++ U#######D*KLLLLLrMc| |d}|d}d}t|tr|d}t|jD]}|j|dvr |d}n|r+|t ||jdS||ddS)Nbare_with_attrrgrr@rWrh) rYrrrsrCrZr]rrXrt)rSr^r_ module_dotrgrxrHs rK transform_dotzFixUrllib.transform_dots<[[!122 X&& fd # # AYFj./  F|vay((!!9)  K   tH+5+< > > > ? ? ? ? ?   &I J J J J JrMc|dr|||dS|dr|||dS|dr|||dS|dr||ddS|dr||ddSdS)NrUrfr module_starzCannot handle star imports. module_asz#This module is now multiple modules)rYrdrrrt)rSr^r_s rK transformzFixUrllib.transforms ;;x M  ! !$ 0 0 0 0 0 [[ & & M  ! !$ 0 0 0 0 0 [[) * * M   tW - - - - - [[ ' ' M   &C D D D D D [[ % % M   &K L L L L L M MrMN)__name__ __module__ __qualname__rLrdrrrrMrKrOrOGsn)))""" JMJMJMXKKK" M M M M MrMrON)lib2to3.fixes.fix_importsrrlib2to3.fixer_utilrrrrr r r rCr\rLrOrrMrKrs=<<<<<<<>>>>>>>>>>>>>>>>>>"CCCD ???@  +,. /" ' ' ' ( -/   B '(+A.///....}M}M}M}M}M }M}M}M}M}MrMfixes/__pycache__/fix_xreadlines.cpython-311.opt-2.pyc000064400000002672151027012300016536 0ustar00 !A?hF ddlmZddlmZGddejZdS)) fixer_base)NameceZdZdZdZdZdS) FixXreadlinesTz power< call=any+ trailer< '.' 'xreadlines' > trailer< '(' ')' > > | power< any+ trailer< '.' no_call='xreadlines' > > c|d}|r+|td|jdS|d|dDdS)Nno_call__iter__)prefixc6g|]}|S)clone).0xs 5/usr/lib64/python3.11/lib2to3/fixes/fix_xreadlines.py z+FixXreadlines.transform..s ===!''))===call)getreplacerr )selfnoderesultsrs r transformzFixXreadlines.transformsm++i((  ? OODGNCCC D D D D D LL==WV_=== > > > > >rN)__name__ __module__ __qualname__ BM_compatiblePATTERNrr rrrr s/MG ?????rrN)r fixer_utilrBaseFixrr rrr"shD ?????J&?????rfixes/__pycache__/fix_raw_input.cpython-311.opt-2.pyc000064400000001756151027012300016412 0ustar00 !A?hF ddlmZddlmZGddejZdS)) fixer_base)NameceZdZdZdZdZdS) FixRawInputTzU power< name='raw_input' trailer< '(' [any] ')' > any* > ch|d}|td|jdS)Nnameinput)prefix)replacerr )selfnoderesultsrs 4/usr/lib64/python3.11/lib2to3/fixes/fix_raw_input.py transformzFixRawInput.transforms2v T'$+66677777N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrrs/MG88888rrN)r fixer_utilrBaseFixrrrrrse8 8 8 8 8 8*$ 8 8 8 8 8rfixes/__pycache__/fix_exitfunc.cpython-311.pyc000064400000007543151027012300015267 0ustar00 !A?h `dZddlmZmZddlmZmZmZmZm Z m Z Gddej Z dS)z7 Convert use of sys.exitfunc to use the atexit module. )pytree fixer_base)NameAttrCallCommaNewlinesymsc:eZdZdZdZdZfdZfdZdZxZ S) FixExitfuncTa ( sys_import=import_name<'import' ('sys' | dotted_as_names< (any ',')* 'sys' (',' any)* > ) > | expr_stmt< power< 'sys' trailer< '.' 'exitfunc' > > '=' func=any > ) cBtt|j|dSN)superr __init__)selfargs __class__s 3/usr/lib64/python3.11/lib2to3/fixes/fix_exitfunc.pyrzFixExitfunc.__init__s#)k4  )40000chtt|||d|_dSr)rr start_tree sys_import)rtreefilenamers rrzFixExitfunc.start_tree!s. k4  ++D(;;;rc d|vr|j |d|_dS|d}d|_tjt jttdtd}t||g|j}| ||j| |ddS|jj d}|j t jkrF|t!|tdddS|jj}|j |j}|j} tjt jtd tddg} tjt j| g} ||dzt-||d z| dS) NrfuncatexitregisterzKCan't find sys import; Please add an atexit import at the top of your file. import)rcloneprefixrNoder powerrrrreplacewarningchildrentypedotted_as_names append_childrparentindex import_name simple_stmt insert_childr ) rnoderesultsrrcallnamescontaining_stmtpositionstmt_container new_importnews r transformzFixExitfunc.transform%s 7 " "&"),"7 Fv$$&& ;tz#DNND4D4DEE!!Htfdk22 T ? " LL ? @ @ @ F(+ :- - -   uww ' ' '   tHc22 3 3 3 3 3"o4O&/55doFFH,3NT%5#H~~tHc/B/BC  J+d. 
==C  ( (Awyy A A A  ( (As ; ; ; ; ;r) __name__ __module__ __qualname__keep_line_order BM_compatiblePATTERNrrr< __classcell__)rs@rr r sqOM G11111#<#<#<#<#<#<#rIs '&&&&&&&EEEEEEEEEEEEEEEE=<=<=<=<=<*$=<=<=<=<= any* > |c|dd}|ttt||jdS)Nmeth)prefix)replacerNAMESstrr")selfnoderesultsnames 2/usr/lib64/python3.11/lib2to3/fixes/fix_asserts.py transformzFixAsserts.transform sBvq! T%D *4;???@@@@@N) __name__ __module__ __qualname__joinmapreprr$PATTERNr+r,r*rrsOHHSSu--../GAAAAAr,rN) fixer_baser fixer_utilrdictr$rr4r,r*r8s;!   $*0%*! -,#    $AAAAAAAAAAr,fixes/__pycache__/fix_numliterals.cpython-311.opt-1.pyc000064400000003117151027012300016731 0ustar00 !A?hTdZddlmZddlmZddlmZGddejZdS)z-Fixer that turns 1L into 1, 0755 into 0o755. )token) fixer_base)Numberc(eZdZejZdZdZdS)FixNumliteralscT|jdp|jddvS)N0Ll)value startswith)selfnodes 6/usr/lib64/python3.11/lib2to3/fixes/fix_numliterals.pymatchzFixNumliterals.matchs( %%c**Ddjn.DEc|j}|ddvr |dd}nV|drA|r-tt |dkr d|ddz}t ||jS)Nr r r 0o)prefix)r r isdigitlensetrr)rrresultsvals r transformzFixNumliterals.transformsj r7d??crc(CC ^^C  !S[[]] !s3s88}}q7H7HQRR.Cc$+....rN)__name__ __module__ __qualname__rNUMBER _accept_typerrrrrr s>r(s~ /////Z'/////rfixes/__pycache__/fix_map.cpython-311.opt-2.pyc000064400000007703151027012300015155 0ustar00 !A?h8z ddlmZddlmZddlmZmZmZmZm Z ddl m Z ddl mZGddejZdS) )token) fixer_base)NameArgListCallListCompin_special_context)python_symbols)Nodec eZdZdZdZdZdZdS)FixMapTaL map_none=power< 'map' trailer< '(' arglist< 'None' ',' arg=any [','] > ')' > [extra_trailers=trailer*] > | map_lambda=power< 'map' trailer< '(' arglist< lambdef< 'lambda' (fp=NAME | vfpdef< '(' fp=NAME ')'> ) ':' xp=any > ',' it=any > ')' > [extra_trailers=trailer*] > | power< 'map' args=trailer< '(' [any] ')' > [extra_trailers=trailer*] > zfuture_builtins.mapcN||rdSg}d|vr2|dD])}||*|jjt jkrQ||d|}d|_ttd|g}nd|vr{t|d|d|d}tt j |g|zd }n_d |vr"|d }d|_nd |vr|d }|jt jkr|jd jt jkrd|jd jdjt"jkr9|jd jdjdkr||ddStt j td|g}d|_t)|rdStt j tdt+|gg|z}d|_|j|_|S)Nextra_trailerszYou should use a for loop herelist map_lambdaxpfpit)prefixmap_noneargargsNonezjcannot convert map(None, ...) with multiple arguments because map() now truncates to the shortest sequencemap) should_skipappendcloneparenttypesyms simple_stmtwarningrrrrr powertrailerchildrenarglistrNAMEvaluer r)selfnoderesultstrailerstnewrs ./usr/lib64/python3.11/lib2to3/fixes/fix_map.py transformzFixMap.transform@sn   D ! !  F w & &-. + + **** ; t/ / / LL? @ @ @**,,CCJtF||cU++CC W $ $74=..00"4=..00"4=..0022CtzC58#3B???CCW$$en**,, W$$"6?DyDL00}Q', <<}Q'038EJFF}Q'039VCC T,NOOOtzDKK+FGGC!#CJ%d++ 4tzDLL'3%..#AH#LMMCCJ[  N)__name__ __module__ __qualname__ BM_compatiblePATTERNskip_onr3r4r2r r s6MG:$G.....r4r N)pgen2rrr fixer_utilrrrrr pygramr r#pytreer ConditionalFixr r;r4r2rAs&JJJJJJJJJJJJJJ++++++PPPPPZ &PPPPPr4fixes/__pycache__/fix_map.cpython-311.opt-1.pyc000064400000011340151027012300015144 0ustar00 !A?h8|dZddlmZddlmZddlmZmZmZm Z m Z ddl m Z ddlmZGddejZd S) aFixer that changes map(F, ...) into list(map(F, ...)) unless there exists a 'from future_builtins import map' statement in the top-level namespace. As a special case, map(None, X) is changed into list(X). (This is necessary because the semantics are changed in this case -- the new map(None, X) is equivalent to [(x,) for x in X].) We avoid the transformation (except for the special case mentioned above) if the map() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. 
NOTE: This is still not correct if the original code was depending on map(F, X, Y, ...) to go on until the longest argument is exhausted, substituting None for missing values -- like zip(), it now stops as soon as the shortest argument is exhausted. )token) fixer_base)NameArgListCallListCompin_special_context)python_symbols)Nodec eZdZdZdZdZdZdS)FixMapTaL map_none=power< 'map' trailer< '(' arglist< 'None' ',' arg=any [','] > ')' > [extra_trailers=trailer*] > | map_lambda=power< 'map' trailer< '(' arglist< lambdef< 'lambda' (fp=NAME | vfpdef< '(' fp=NAME ')'> ) ':' xp=any > ',' it=any > ')' > [extra_trailers=trailer*] > | power< 'map' args=trailer< '(' [any] ')' > [extra_trailers=trailer*] > zfuture_builtins.mapcN||rdSg}d|vr2|dD])}||*|jjt jkrQ||d|}d|_ttd|g}nd|vr{t|d|d|d}tt j |g|zd }n_d |vr"|d }d|_nd |vr|d }|jt jkr|jd jt jkrd|jd jdjt"jkr9|jd jdjdkr||ddStt j td|g}d|_t)|rdStt j tdt+|gg|z}d|_|j|_|S)Nextra_trailerszYou should use a for loop herelist map_lambdaxpfpit)prefixmap_noneargargsNonezjcannot convert map(None, ...) with multiple arguments because map() now truncates to the shortest sequencemap) should_skipappendcloneparenttypesyms simple_stmtwarningrrrrr powertrailerchildrenarglistrNAMEvaluer r)selfnoderesultstrailerstnewrs ./usr/lib64/python3.11/lib2to3/fixes/fix_map.py transformzFixMap.transform@sn   D ! !  F w & &-. + + **** ; t/ / / LL? @ @ @**,,CCJtF||cU++CC W $ $74=..00"4=..00"4=..0022CtzC58#3B???CCW$$en**,, W$$"6?DyDL00}Q', <<}Q'038EJFF}Q'039VCC T,NOOOtzDKK+FGGC!#CJ%d++ 4tzDLL'3%..#AH#LMMCCJ[  N)__name__ __module__ __qualname__ BM_compatiblePATTERNskip_onr3r4r2r r s6MG:$G.....r4r N)__doc__pgen2rrr fixer_utilrrrrr pygramr r#pytreer ConditionalFixr r;r4r2rBs&JJJJJJJJJJJJJJ++++++PPPPPZ &PPPPPr4fixes/__pycache__/fix_getcwdu.cpython-311.opt-1.pyc000064400000002055151027012300016034 0ustar00 !A?hHdZddlmZddlmZGddejZdS)z1 Fixer that changes os.getcwdu() to os.getcwd(). ) fixer_base)NameceZdZdZdZdZdS) FixGetcwduTzR power< 'os' trailer< dot='.' name='getcwdu' > any* > ch|d}|td|jdS)Nnamegetcwd)prefix)replacerr )selfnoderesultsrs 2/usr/lib64/python3.11/lib2to3/fixes/fix_getcwdu.py transformzFixGetcwdu.transforms2v T(4;77788888N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s/MG99999rrN)__doc__r fixer_utilrBaseFixrrrrrsl  9 9 9 9 9# 9 9 9 9 9rfixes/__pycache__/fix_except.cpython-311.opt-2.pyc000064400000010365151027012300015666 0ustar00 !A?h x ddlmZddlmZddlmZddlmZmZmZm Z m Z m Z dZ Gddej ZdS) )pytree)token) fixer_base)AssignAttrNameis_tupleis_listsymsc#Kt|D]?\}}|jtjkr%|jdjdkr|||dzfV@dS)Nexceptr) enumeratetyper except_clausechildrenvalue)nodesins 1/usr/lib64/python3.11/lib2to3/fixes/fix_except.py find_exceptsrsh%  &&1 6T' ' 'z!}"h..%!*o%%%&&ceZdZdZdZdZdS) FixExceptTa1 try_stmt< 'try' ':' (simple_stmt | suite) cleanup=(except_clause ':' (simple_stmt | suite))+ tail=(['except' ':' (simple_stmt | suite)] ['else' ':' (simple_stmt | suite)] ['finally' ':' (simple_stmt | suite)]) > c j|j}d|dD}d|dD}t|D]\}}t|jdkr|jdd\}} } | t dd | jtjkrAt | d } | } d | _ | | | } |j} t| D]!\}}t|tjrn"t!| st#| r,t%| t'| t d }nt%| | }t)| d|D]}|d ||||| j d krd| _ d |jddD|z|z}tj|j|S)Nc6g|]}|Sclone).0rs r z'FixExcept.transform..2s 333a 333rtailc6g|]}|Srr)r!chs rr"z'FixExcept.transform..4s ???brxxzz???rcleanupas )prefixargsr c6g|]}|Srr)r!cs rr"z'FixExcept.transform..\s 999!AGGII999r)r rlenrreplacerrrNAMEnew_namer r+r isinstancerNoder r rrreversed insert_child)selfnoderesultsr r# try_cleanupre_suiteEcommaNnew_Ntarget suite_stmtsrstmtassignchildrs r transformzFixExcept.transform/s2y3376?333??GI,>??? 
&2;&?&?$ #$ # "M7=)**a// - 6qs ; E1 d44445556UZ'' ===EWWYYF$&FMIIe$$$!KKMME #*"2K#,[#9#9""4%dFK88"!E"  {{7gajj7!'UDLL0I0I!J!J!'!6!6"*+bqb/!:!:77,,Q6666((F3333X^^ #AH:9t}RaR'8999KG$N{49h///rN)__name__ __module__ __qualname__ BM_compatiblePATTERNrGrrrrr$s/MG.0.0.0.0.0rrN)r,rpgen2rr fixer_utilrrrr r r rBaseFixrrrrrPs0DDDDDDDDDDDDDDDD&&& 9090909090 "9090909090rfixes/__pycache__/fix_sys_exc.cpython-311.opt-1.pyc000064400000004272151027012300016052 0ustar00 !A?h `dZddlmZddlmZmZmZmZmZm Z m Z Gddej Z dS)zFixer for sys.exc_{type, value, traceback} sys.exc_type -> sys.exc_info()[0] sys.exc_value -> sys.exc_info()[1] sys.exc_traceback -> sys.exc_info()[2] ) fixer_base)AttrCallNameNumber SubscriptNodesymscdeZdZgdZdZdddeDzZdZdS) FixSysExc)exc_type exc_value exc_tracebackTzN power< 'sys' trailer< dot='.' attribute=(%s) > > |c# K|] }d|zV dS)z'%s'N).0es 2/usr/lib64/python3.11/lib2to3/fixes/fix_sys_exc.py zFixSysExc.s&::AVaZ::::::c|dd}t|j|j}t t d|j}tt d|}|dj|djd_| t|ttj ||jS)N attributeexc_info)prefixsysdot)rrindexvaluerrrrchildrenappendrr r power)selfnoderesultssys_attrr callattrs r transformzFixSysExc.transforms;'*t}**8>::;;D$$X_===DKK&&%,U^%:Q" Ie$$%%%DJT[9999rN)__name__ __module__ __qualname__r BM_compatiblejoinPATTERNr+rrrr r s]999HMHH:::::::;G:::::rr N) __doc__r fixer_utilrrrrrr r BaseFixr rrrr6sHHHHHHHHHHHHHHHHHH::::: ":::::rfixes/__pycache__/fix_zip.cpython-311.opt-1.pyc000064400000004460151027012300015176 0ustar00 !A?h hdZddlmZddlmZddlmZddlm Z m Z m Z Gddej Z dS) a7 Fixer that changes zip(seq0, seq1, ...) into list(zip(seq0, seq1, ...) unless there exists a 'from future_builtins import zip' statement in the top-level namespace. We avoid the transformation if the zip() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. ) fixer_base)Node)python_symbols)NameArgListin_special_contextc eZdZdZdZdZdZdS)FixZipTzN power< 'zip' args=trailer< '(' [any] ')' > [trailers=trailer*] > zfuture_builtins.zipc||rdSt|rdS|d}d|_g}d|vrd|dD}|D] }d|_ t t jtd|gd}t t jtdt|gg|z}|j|_|S)Nargstrailersc6g|]}|S)clone).0ns ./usr/lib64/python3.11/lib2to3/fixes/fix_zip.py z$FixZip.transform..'s ???a ???zip)prefixlist) should_skiprrrrsymspowerrr)selfnoderesultsr rrnews r transformzFixZip.transforms   D ! !  F d # # 4v$$&&   ??7:+>???H  4:U T22>>>4:V gsenn=HII[  rN)__name__ __module__ __qualname__ BM_compatiblePATTERNskip_onr!rrrr r s6MG $Grr N)__doc__r rpytreerpygramrr fixer_utilrrrConditionalFixr rrrr-s++++++::::::::::Z &rfixes/__pycache__/fix_basestring.cpython-311.opt-2.pyc000064400000001474151027012300016540 0ustar00 !A?h@F ddlmZddlmZGddejZdS)) fixer_base)NameceZdZdZdZdZdS) FixBasestringTz 'basestring'c.td|jS)Nstr)prefix)rr )selfnoderesultss 5/usr/lib64/python3.11/lib2to3/fixes/fix_basestring.py transformzFixBasestring.transform sE$+....N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrr rrs-MG/////rrN)r fixer_utilrBaseFixrrrr rse"/////J&/////rfixes/__pycache__/fix_long.cpython-311.pyc000064400000001723151027012300014373 0ustar00 !A?hHdZddlmZddlmZGddejZdS)z/Fixer that turns 'long' into 'int' everywhere. 
) fixer_base)is_probably_builtinceZdZdZdZdZdS)FixLongTz'long'c^t|rd|_|dSdS)Nint)rvaluechanged)selfnoderesultss //usr/lib64/python3.11/lib2to3/fixes/fix_long.py transformzFixLong.transforms4 t $ $ DJ LLNNNNN  N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s-MGrrN)__doc__lib2to3rlib2to3.fixer_utilrBaseFixrrrrrsl222222j rfixes/__pycache__/fix_next.cpython-311.opt-1.pyc000064400000012135151027012300015350 0ustar00 !A?hf ~dZddlmZddlmZddlmZddlm Z m Z m Z dZ Gddej Zd Zd Zd Zd S) z.Fixer for it.next() -> next(it), per PEP 3114.)token)python_symbols) fixer_base)NameCall find_bindingz;Calls to builtin next() possibly shadowed by global bindingc0eZdZdZdZdZfdZdZxZS)FixNextTa power< base=any+ trailer< '.' attr='next' > trailer< '(' ')' > > | power< head=any+ trailer< '.' attr='next' > not trailer< '(' ')' > > | classdef< 'class' any+ ':' suite< any* funcdef< 'def' name='next' parameters< '(' NAME ')' > any+ > any* > > | global=global_stmt< 'global' any* 'next' any* > prectt|||td|}|r$||t d|_dSd|_dS)NnextTF)superr start_treerwarning bind_warning shadowed_next)selftreefilenamen __class__s //usr/lib64/python3.11/lib2to3/fixes/fix_next.pyrzFixNext.start_tree$sj gt''h777  & &  ' LLL ) ) )!%D   !&D   c\|d}|d}|d}|r|jr+|td|jdSd|D}d|d_|t td |j|dS|r-td|j}||dS|rt |rZ|d }dd |Dd kr| |tdS|tddSd |vr$| |td|_dSdS)Nbaseattrname__next__)prefixc6g|]}|S)clone.0rs r z%FixNext.transform..9s 000a 000rr headc,g|]}t|Sr!)strr#s rr%z%FixNext.transform..Es111qCFF111r __builtin__globalT) getrreplacerrris_assign_targetjoinstriprr)rnoderesultsrrrrr(s r transformzFixNext.transform.s{{6""{{6""{{6""  &! K T*T[AAABBBBB004000!#Q T$vdk"B"B"BDIIJJJJJ  &Z 444A LLOOOOO  & %% v7711D1112288::mKKLL|444 LLj)) * * * * *  LL| , , ,!%D   ! r) __name__ __module__ __qualname__ BM_compatiblePATTERNorderrr4 __classcell__)rs@rr r sZM G E'''''&&&&&&&rr ct|}|dS|jD]-}|jtjkrdSt ||rdS.dS)NFT) find_assignchildrentyperEQUAL is_subtree)r2assignchilds rr/r/Qsc   F ~u : $ $55 t $ $ 44  5rc|jtjkr|S|jtjks|jdSt |jSN)r?syms expr_stmt simple_stmtparentr=)r2s rr=r=]sB yDN""  yD$$$ (;t t{ # ##rcT|krdStfd|jDS)NTc38K|]}t|VdSrE)rA)r$cr2s r zis_subtree..gs-::qz!T""::::::r)anyr>)rootr2s `rrArAds6 t||t ::::DM::: : ::rN)__doc__pgen2rpygramrrFr&r fixer_utilrrrrBaseFixr r/r=rAr!rrrUs44++++++1111111111L :&:&:&:&:&j :&:&:&@   $$$;;;;;rfixes/__pycache__/fix_ne.cpython-311.opt-1.pyc000064400000002136151027012300014774 0ustar00 !A?h;TdZddlmZddlmZddlmZGddejZdS)zFixer that turns <> into !=.)pytree)token) fixer_basec(eZdZejZdZdZdS)FixNec|jdkS)Nz<>)value)selfnodes -/usr/lib64/python3.11/lib2to3/fixes/fix_ne.pymatchz FixNe.matchszT!!cRtjtjd|j}|S)Nz!=)prefix)rLeafrNOTEQUALr)r r resultsnews r transformzFixNe.transforms!k%.$t{CCC rN)__name__ __module__ __qualname__rr _accept_typer rrr rr s;>L"""rrN)__doc__rpgen2rrBaseFixrrrr rs|#"     J      rfixes/__pycache__/fix_unicode.cpython-311.opt-2.pyc000064400000004462151027012300016025 0ustar00 !A?hP ddlmZddlmZdddZGddejZdS) )token) fixer_basechrstr)unichrunicodec,eZdZdZdZfdZdZxZS) FixUnicodeTzSTRING | 'unicode' | 'unichr'cvtt|||d|jv|_dS)Nunicode_literals)superr start_treefuture_featuresr )selftreefilename __class__s 2/usr/lib64/python3.11/lib2to3/fixes/fix_unicode.pyrzFixUnicode.start_trees9 j$**4::: 2d6J Jc|jtjkr-|}t|j|_|S|jtjkr|j}|js@|ddvr6d|vr2dd| dD}|ddvr |dd}||jkr|S|}||_|SdS)Nz'"\z\\cbg|],}|dddd-S)z\uz\\uz\Uz\\U)replace).0vs r z(FixUnicode.transform.. 
sF"""IIeV,,44UFCC"""ruU) typerNAMEclone_mappingvalueSTRINGr joinsplit)rnoderesultsnewvals r transformzFixUnicode.transforms 9 " "**,,C ,CIJ Y%, & &*C( SVu__jj"" YYu--"""1v~~!""gdj   **,,CCIJ' &r)__name__ __module__ __qualname__ BM_compatiblePATTERNrr, __classcell__)rs@rr r sVM-GKKKKKrr N)pgen2rrr#BaseFixr rrr7st% 0 0#rfixes/__pycache__/fix_imports.cpython-311.opt-1.pyc000064400000015442151027012300016073 0ustar00 !A?h4RdZddlmZddlmZmZiddddddd d d d d ddddddddddddddddddddd d!d"id#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdDdEdFdGdHdIdJdJdJdKdLdLdMdNdOZdPZefdQZGdRdSej Z dTS)Uz/Fix incompatible imports and module references.) fixer_base)Name attr_chainStringIOio cStringIOcPicklepickle __builtin__builtinscopy_regcopyregQueuequeue SocketServer socketserver ConfigParser configparserreprreprlib FileDialogztkinter.filedialog tkFileDialog SimpleDialogztkinter.simpledialogtkSimpleDialogtkColorChooserztkinter.colorchoosertkCommonDialogztkinter.commondialogDialogztkinter.dialogTkdndz tkinter.dndtkFontz tkinter.font tkMessageBoxztkinter.messagebox ScrolledTextztkinter.scrolledtext Tkconstantsztkinter.constantsTixz tkinter.tixttkz tkinter.ttkTkintertkinter markupbase _markupbase_winregwinregthread_thread dummy_thread _dummy_threaddbhashzdbm.bsddumbdbmzdbm.dumbdbmzdbm.ndbmgdbmzdbm.gnu xmlrpclibz xmlrpc.clientDocXMLRPCServerz xmlrpc.serverz http.clientz html.entitiesz html.parserz http.cookieszhttp.cookiejarz http.server subprocess collectionsz urllib.parsezurllib.robotparser)SimpleXMLRPCServerhttplibhtmlentitydefs HTMLParserCookie cookielibBaseHTTPServerSimpleHTTPServer CGIHTTPServercommands UserStringUserListurlparse robotparserc^ddtt|zdzS)N(|))joinmapr)memberss 2/usr/lib64/python3.11/lib2to3/fixes/fix_imports.py alternatesrM=s( #dG,,-- - 33c#Kdd|D}t|}d|d|dVd|zVd|d|d Vd |zVdS) Nz | cg|]}d|zS)zmodule_name='%s').0keys rL z!build_pattern..BsGGG-3GGGrNz$name_import=import_name< 'import' ((z;) | multiple_imports=dotted_as_names< any* (z) any* >) > zimport_from< 'from' (%s) 'import' ['('] ( any | import_as_name< any 'as' any > | import_as_names< any* >) [')'] > z(import_name< 'import' (dotted_as_name< (zg) 'as' any > | multiple_imports=dotted_as_names< any* dotted_as_name< (z!) 'as' any > any* >) > z3power< bare_with_attr=(%s) trailer<'.' any > any* >)rIrMkeys)mappingmod_list bare_namess rL build_patternrYAszzGGwGGGHHHGLLNN++JJ888 %%%%  888 %%%% @* LLLLLLrNcNeZdZdZdZeZdZdZfdZ fdZ fdZ dZ xZ S) FixImportsTcPdt|jS)NrG)rIrYrV)selfs rLrYzFixImports.build_pattern`sxx dl33444rNc||_tt|dSN)rYPATTERNsuperr[compile_pattern)r^ __class__s rLrczFixImports.compile_patterncs:))++  j$//11111rNctt|j|}|r1d|vr+tfdt |dDrdS|SdS)Nbare_with_attrc3.K|]}|VdSr`rQ)rRobjmatchs rL z#FixImports.match..qs+IIsc IIIIIIrNparentF)rbr[rianyr)r^noderesultsrirds @rLrizFixImports.matchjsvj$''-%++   w..IIIIjx.H.HIIIII/uNurNchtt|||i|_dSr`)rbr[ start_treereplace)r^treefilenamerds rLrpzFixImports.start_treevs. 
j$**4::: rNc|d}|r|j}|j|}|t ||jd|vr ||j|<d|vr/||}|r|||dSdSdS|dd}|j|j}|r+|t ||jdSdS)N module_name)prefix name_importmultiple_importsrf)getvaluerVrqrrvri transform)r^rmrn import_modmod_namenew_name bare_names rLr|zFixImports.transformzs'[[//  K!'H|H-H   tHZ5FGGG H H H''*2 X&!W,, **T**2NN411111-, 22 01!4I|'' 88H K!!$x 8H"I"I"IJJJJJ K KrN)__name__ __module__ __qualname__ BM_compatiblekeep_line_orderMAPPINGrV run_orderrYrcrirpr| __classcell__)rds@rLr[r[UsMOGI55522222     KKKKKKKrNr[N) __doc__r fixer_utilrrrrMrYBaseFixr[rQrNrLrs55))))))))2 :2  2  h2  :2  y 2  G 2  > 2  >2  92  -2  /2  12  32  32  32  %2  M!2 2 " ^#2 $ /%2 & 1'2 ( -)2 * -+2 , --2 . i/2 0 12 2 h32 4 Y52 6 ?72 : Y;2 < j=2 > *?2 @ 9A2 B C2 D oE2 2 F"1#-'#(*,)#'%&/c2 2 2 j444"MMMM(<K<K<K<K<K#<K<K<K<K<KrNfixes/__pycache__/fix_urllib.cpython-311.pyc000064400000024267151027012300014735 0ustar00 !A?h dZddlmZmZddlmZmZmZmZm Z m Z m Z dgdfdgdfdd gfgdgd fdd d gfgd Z e d e dddZGddeZdS)zFix changes imports of urllib which are now incompatible. This is rather similar to fix_imports, but because of the more complex nature of the fixing for urllib, it has its own fixer. ) alternates FixImports)NameComma FromImportNewlinefind_indentationNodesymszurllib.request) URLopenerFancyURLopener urlretrieve _urlopenerurlopen urlcleanup pathname2url url2pathname getproxiesz urllib.parse)quote quote_plusunquote unquote_plus urlencode splitattr splithost splitnport splitpasswd splitport splitquerysplittag splittype splituser splitvaluez urllib.errorContentTooShortError)rinstall_opener build_openerRequestOpenerDirector BaseHandlerHTTPDefaultErrorHandlerHTTPRedirectHandlerHTTPCookieProcessor ProxyHandlerHTTPPasswordMgrHTTPPasswordMgrWithDefaultRealmAbstractBasicAuthHandlerHTTPBasicAuthHandlerProxyBasicAuthHandlerAbstractDigestAuthHandlerHTTPDigestAuthHandlerProxyDigestAuthHandler HTTPHandler HTTPSHandler FileHandler FTPHandlerCacheFTPHandlerUnknownHandlerURLError HTTPError)urlliburllib2r?r>c #Kt}tD]P\}}|D]H}|\}}t|}d|d|dVd|d|d|dVd|zVd |zVd |d |d VIQdS) Nzimport_name< 'import' (module=zB | dotted_as_names< any* module=z any* >) > zimport_from< 'from' mod_member=z* 'import' ( member=z | import_as_name< member=z] 'as' any > | import_as_names< members=any* >) > zIimport_from< 'from' module_star=%r 'import' star='*' > ztimport_name< 'import' dotted_as_name< module_as=%r 'as' any > > zpower< bare_with_attr=z trailer< '.' member=z > any* > )setMAPPINGitemsr)bare old_modulechangeschange new_modulememberss 1/usr/lib64/python3.11/lib2to3/fixes/fix_urllib.py build_patternrL0s 55D&}}.. G . .F"( J ))GG$ZZZ1 1 1 1 1 $WWWggg7 7 7 7"# # # #"# # # # # $WWW. . . . .! ...c,eZdZdZdZdZdZdZdS) FixUrllibcDdtS)N|)joinrL)selfs rKrLzFixUrllib.build_patternIsxx (((rMc|d}|j}g}t|jddD]:}|t |d|t g;|t t|jdd|||dS)zTransform for the basic import case. Replaces the old import name with a comma separated list of its replacements. moduleNrprefix) getrXrCvalueextendrrappendreplace)rSnoderesults import_modprefnamesnames rKtransform_importzFixUrllib.transform_importLs [[**  J,-crc2 @ @D LL$tAwt444egg> ? ? ? ? T'*"23B7:4HHHIII5!!!!!rMc|d}|j}|d}|rt|tr|d}d}t|jD]}|j|dvr |d}n|r&|t||dS||ddSg}i} |d} | D]}|j tj kr%|j d j} |j dj} n |j} d} | d krst|jD]`}| |dvrT|d| vr| |d| |dg |ag} t|}d }d }|D]}| |}g}|dd D]B}||||| t#C|||d |t%||}|r|jj|r||_| |d}| rdg}| dd D]%}||t+g&| | d ||dS||ddS)zTransform for imports of specific module elements. Replaces the module to be imported from with the appropriate new module. 
mod_membermemberrNr@rW!This is an invalid module elementrJ,TcL|jtjkryt|jdj||jd|jdg}ttj|gSt|j|gS)NrrWr@ri)typer import_as_namerchildrenrZcloner )rcrXkidss rK handle_namez/FixUrllib.transform_member..handle_names9 333 q!1!7GGG M!,2244 M!,22446D!!4d;;<<TZ77788rMrVFzAll module elements are invalid)rYrX isinstancelistrCrZr]rcannot_convertrlr rmrnr\ setdefaultr r[rrparentendswithr)rSr^r_rfrargnew_namerHmodulesmod_dictrJas_name member_name new_nodes indentationfirstrqrUeltsrbeltnewnodesnew_nodes rKtransform_memberzFixUrllib.transform_member\s` [[..  X&& @ M&$'' #H!*"23  <6!9,,%ayHE- O""4#>#>#>?????##D*MNNNNN GHi(G! N N;$"555$oa06G"(/!"4":KK"(,K"G#%%")**:";NN&&)33%ay88 'vay 9 9 9$//q 2>>EEfMMMI*400KE 9 9 9"  '9**CLLS$!7!7888LL)))) [[b488999 //- 2 ; ;K H H-!,CJ  %%% M )#2#88HLL(GII!67777 Yr]+++ U#######D*KLLLLLrMcz|d}|d}d}t|tr|d}t|jD]}|j|dvr |d}n|r+|t ||jdS||ddS)z.Transform for calls to module members in code.bare_with_attrrgNrr@rWrh) rYrrrsrCrZr]rrXrt)rSr^r_ module_dotrgrxrHs rK transform_dotzFixUrllib.transform_dots[[!122 X&& fd # # AYFj./  F|vay((!!9)  K   tH+5+< > > > ? ? ? ? ?   &I J J J J JrMc|dr|||dS|dr|||dS|dr|||dS|dr||ddS|dr||ddSdS)NrUrfr module_starzCannot handle star imports. module_asz#This module is now multiple modules)rYrdrrrt)rSr^r_s rK transformzFixUrllib.transforms ;;x M  ! !$ 0 0 0 0 0 [[ & & M  ! !$ 0 0 0 0 0 [[) * * M   tW - - - - - [[ ' ' M   &C D D D D D [[ % % M   &K L L L L L M MrMN)__name__ __module__ __qualname__rLrdrrrrMrKrOrOGsn)))""" JMJMJMXKKK" M M M M MrMrON)__doc__lib2to3.fixes.fix_importsrrlib2to3.fixer_utilrrrrr r r rCr\rLrOrrMrKrs=<<<<<<<>>>>>>>>>>>>>>>>>>"CCCD ???@  +,. /" ' ' ' ( -/   B '(+A.///....}M}M}M}M}M }M}M}M}M}MrMfixes/__pycache__/fix_execfile.cpython-311.opt-1.pyc000064400000006161151027012300016160 0ustar00 !A?hldZddlmZddlmZmZmZmZmZm Z m Z m Z m Z m Z GddejZdS)zoFixer for execfile. This converts usages of the execfile function into calls to the built-in exec() function. ) fixer_base) CommaNameCallLParenRParenDotNodeArgListStringsymsceZdZdZdZdZdS) FixExecfileTz power< 'execfile' trailer< '(' arglist< filename=any [',' globals=any [',' locals=any ] ] > ')' > > | power< 'execfile' trailer< '(' filename=any ')' > > ch|d}|d}|d}|jdjd}t|t t ddg|}t tjtd|g}t tj ttd gt tj ttgg} |g| z} |} d| _t d d} | t | t | gz} ttd | d }|g}|5|t |g|5|t |gttd ||jS)Nfilenameglobalslocalsz"rb" )rparenopenreadz'exec'compileexec)prefix)getchildrencloner rr r r powerrtrailerr rrrrextend)selfnoderesultsrrrexecfile_paren open_args open_callr open_expr filename_argexec_str compile_args compile_callargss 3/usr/lib64/python3.11/lib2to3/fixes/fix_execfile.py transformzFixExecfile.transforms:&++i((X&&r*3B7==??X^^--uwwvs8K8KL#1333 d6llI%>?? T\CEE4<<#899T\FHHfhh#788:K$&  ~~'' ! (C(( EGG\577H#MM DOO\2>> ~   KK'--//2 3 3 3   KK&,,..1 2 2 2DLL$t{;;;;N)__name__ __module__ __qualname__ BM_compatiblePATTERNr0r1r/rrs/MG <<<<r;s 111111111111111111111111&<&<&<&<&<*$&<&<&<&<& >ceZdZdZdZdZdS)FixInputTzL power< 'input' args=trailer< '(' [any] ')' > > ct|jjrdS|}d|_t t d|g|jS)Neval)prefix)contextmatchparentcloner rr)selfnoderesultsnews 0/usr/lib64/python3.11/lib2to3/fixes/fix_input.py transformzFixInput.transformsS ==+ , ,  Fjjll DLL3% <<<<N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s/MG=====rrN) __doc__r r fixer_utilrrrcompile_patternr BaseFixrrrrr"s::######## "' !"J K K = = = = =z! = = = = =rfixes/__pycache__/fix_throw.cpython-311.opt-1.pyc000064400000005517151027012300015543 0ustar00 !A?h.pdZddlmZddlmZddlmZddlmZmZm Z m Z m Z Gddej Z dS) zFixer for generator.throw(E, V, T). 
g.throw(E) -> g.throw(E) g.throw(E, V) -> g.throw(E(V)) g.throw(E, V, T) -> g.throw(E(V).with_traceback(T)) g.throw("foo"[, V[, T]]) will warn about string exceptions.)pytree)token) fixer_base)NameCallArgListAttris_tupleceZdZdZdZdZdS)FixThrowTz power< any trailer< '.' 'throw' > trailer< '(' args=arglist< exc=any ',' val=any [',' tb=any] > ')' > > | power< any trailer< '.' 'throw' > trailer< '(' exc=any ')' > > c|j}|d}|jtjur||ddS|d}|dS|}t|rd|jddD}n d|_ |g}|d}d |vr|d }d|_ t||} t| td t|ggz} |tj|j| dS|t||dS) Nexcz+Python 3 does not support string exceptionsvalc6g|]}|S)clone).0cs 0/usr/lib64/python3.11/lib2to3/fixes/fix_throw.py z&FixThrow.transform..)s :::!AGGII:::argstbwith_traceback)symsrtyperSTRINGcannot_convertgetr childrenprefixrr rrreplacerNodepower) selfnoderesultsrrrr throw_argsrewith_tbs r transformzFixThrow.transforms[yen""$$ 8u| # #   &S T T T Fkk%   ; Fiikk C== ::s|AbD'9:::DDCJ5DV_ 7??$$&&BBIS$A1d#34455"GG   v{4:w?? @ @ @ @ @   tC / / / / /rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr.rrrr r s/MG00000rr N)__doc__rrpgen2rr fixer_utilrrrr r BaseFixr rrrr8s??<<<<<<<<<<<<<<(0(0(0(0(0z!(0(0(0(0(0rfixes/__pycache__/fix_numliterals.cpython-311.pyc000064400000003117151027012300015772 0ustar00 !A?hTdZddlmZddlmZddlmZGddejZdS)z-Fixer that turns 1L into 1, 0755 into 0o755. )token) fixer_base)Numberc(eZdZejZdZdZdS)FixNumliteralscT|jdp|jddvS)N0Ll)value startswith)selfnodes 6/usr/lib64/python3.11/lib2to3/fixes/fix_numliterals.pymatchzFixNumliterals.matchs( %%c**Ddjn.DEc|j}|ddvr |dd}nV|drA|r-tt |dkr d|ddz}t ||jS)Nr r r 0o)prefix)r r isdigitlensetrr)rrresultsvals r transformzFixNumliterals.transformsj r7d??crc(CC ^^C  !S[[]] !s3s88}}q7H7HQRR.Cc$+....rN)__name__ __module__ __qualname__rNUMBER _accept_typerrrrrr s>r(s~ /////Z'/////rfixes/__pycache__/fix_throw.cpython-311.pyc000064400000005517151027012300014604 0ustar00 !A?h.pdZddlmZddlmZddlmZddlmZmZm Z m Z m Z Gddej Z dS) zFixer for generator.throw(E, V, T). g.throw(E) -> g.throw(E) g.throw(E, V) -> g.throw(E(V)) g.throw(E, V, T) -> g.throw(E(V).with_traceback(T)) g.throw("foo"[, V[, T]]) will warn about string exceptions.)pytree)token) fixer_base)NameCallArgListAttris_tupleceZdZdZdZdZdS)FixThrowTz power< any trailer< '.' 'throw' > trailer< '(' args=arglist< exc=any ',' val=any [',' tb=any] > ')' > > | power< any trailer< '.' 'throw' > trailer< '(' exc=any ')' > > c|j}|d}|jtjur||ddS|d}|dS|}t|rd|jddD}n d|_ |g}|d}d |vr|d }d|_ t||} t| td t|ggz} |tj|j| dS|t||dS) Nexcz+Python 3 does not support string exceptionsvalc6g|]}|S)clone).0cs 0/usr/lib64/python3.11/lib2to3/fixes/fix_throw.py z&FixThrow.transform..)s :::!AGGII:::argstbwith_traceback)symsrtyperSTRINGcannot_convertgetr childrenprefixrr rrreplacerNodepower) selfnoderesultsrrrr throw_argsrewith_tbs r transformzFixThrow.transforms[yen""$$ 8u| # #   &S T T T Fkk%   ; Fiikk C== ::s|AbD'9:::DDCJ5DV_ 7??$$&&BBIS$A1d#34455"GG   v{4:w?? @ @ @ @ @   tC / / / / /rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr.rrrr r s/MG00000rr N)__doc__rrpgen2rr fixer_utilrrrr r BaseFixr rrrr8s??<<<<<<<<<<<<<<(0(0(0(0(0z!(0(0(0(0(0rfixes/__pycache__/fix_exec.cpython-311.pyc000064400000003446151027012300014364 0ustar00 !A?hPdZddlmZddlmZmZmZGddejZdS)zFixer for exec. This converts usages of the exec statement into calls to a built-in exec() function. 
exec code in ns1, ns2 -> exec(code, ns1, ns2) ) fixer_base)CommaNameCallceZdZdZdZdZdS)FixExecTzx exec_stmt< 'exec' a=any 'in' b=any [',' c=any] > | exec_stmt< 'exec' (not atom<'(' [any] ')'>) a=any > c|sJ|j}|d}|d}|d}|g}d|d_|5|t |g|5|t |gt td||jS)Nabcexec)prefix)symsgetclonerextendrrr)selfnoderesultsrr r r argss //usr/lib64/python3.11/lib2to3/fixes/fix_exec.py transformzFixExec.transformswy CL KK   KK   {Q = KK!'')), - - - = KK!'')), - - -DLL$t{;;;;N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrrs/MG < < < < r%sx**********<<<< def func(x, d): ((a, b), c) = x ... It will also support lambdas: lambda (x, y): x + y -> lambda t: t[0] + t[1] # The parens are a syntax error in Python 3 lambda (x): x + y -> lambda x: x + y )pytree)token) fixer_base)AssignNameNewlineNumber Subscriptsymscvt|tjo|jdjt jkS)N) isinstancerNodechildrentyperSTRING)stmts 7/usr/lib64/python3.11/lib2to3/fixes/fix_tuple_params.py is_docstringrs/ dFK ( ( 1 =  EL 01c&eZdZdZdZdZdZdZdS)FixTupleParamsTa funcdef< 'def' any parameters< '(' args=any ')' > ['->' any] ':' suite=any+ > | lambda= lambdef< 'lambda' args=vfpdef< '(' inner=any ')' > ':' body=any > c d|vr||Sg |d}|d}|djdjtjkr)d}|djdj}t n#d}d}tjtjd d fd }|jtj kr ||nU|jtj kr@t|jD]+\}} | jtj kr|| |dk , sdS D]} |d| _ |} |dkrd d_n2t|dj|r| d_|dz} D]} |d| _  |dj| | <t!| dz| t# zdzD]}||dj|_|ddS)Nlambdasuiteargsr rz; Fct}|}d|_t ||}|rd|_||tjtj |gdS)Nr ) rnew_namecloneprefixrreplaceappendrrr simple_stmt) tuple_arg add_prefixnargrend new_linesselfs r handle_tuplez.FixTupleParams.transform..handle_tupleCsT]]__%%A//##CCJ#qwwyy))D    a   V[)9*. )<>> ? ? ? ? ?r)r)r!)F)transform_lambdarrrINDENTvaluerrLeafr tfpdef typedargslist enumerateparentr$rrangelenchanged)r.noderesultsrrstartindentr/ir+lineafterr,r-s` @@r transformzFixTupleParams.transform.sM w  ((w77 7  v 8 Q  $ 4 4E1X&q)/F))CCEF+elB//C ? ? ? ? ? ? ? ? 9 # # L     Y$, , ,#DM22 : :38t{**!L!a%9999  F # #D(DKK A::"%IaL   %(+E2 3 3 "(IaL AIE # #D(DKK)2a%+&uQwc)nn 4Q 677 1 1A*0E!H a ' ' arc|d}|d}t|d}|jtjkr2|}d|_||dSt|}t|}| t|}t|d} || | D]} | jtjkrv| j |vrmd|| j D} tjt j| g| z} | j| _| | dS)Nrbodyinnerr!)r$c6g|]}|S)r#.0cs r z3FixTupleParams.transform_lambda..s CCCAaggiiCCCr) simplify_argsrrNAMEr#r$r% find_params map_to_indexr" tuple_namer post_orderr2rrr power) r.r;r<rrDrEparamsto_indextup_name new_paramr* subscriptsnews rr0zFixTupleParams.transform_lambdans`vvgg.// : # #KKMMEEL LL    FT""''==F!3!344#...  Y__&&'''""  Av##8(;(;CC!'1BCCC k$*#,??#4#4"5 "BDDX  #   rN)__name__ __module__ __qualname__ run_order BM_compatiblePATTERNrBr0rGrrrrsDIMG>>>@rrc|jtjtjfvr|S|jtjkr9|jtjkr"|jd}|jtjk"|Std|z)NrzReceived unexpected node %s)rr vfplistrrMvfpdefr RuntimeErrorr;s rrLrLss yT\5:... dk ! !i4;&&=#Di4;&& 4t; < <.s, K K KqQVu{5J5JKNN5J5J5Jr)rr rarNrrrMr2rcs rrNrNsS yDK4=+,,, ej z K KDM K K KKrNc|i}t|D]_\}}ttt|g}t |t rt |||W||z||<`|S)N)d)r6r r strrlistrO) param_listr$rhr?objtrailers rrOrOsy J''&&3VCFF^^,,- c4  & g + + + + +g%AcFF Hrcg}|D]O}t|tr#|t|:||Pd|S)N_)rrjr&rPjoin)rklrls rrPrPsd A c4   HHZ__ % % % % HHSMMMM 88A;;r)__doc__rrpgen2rr fixer_utilrrrr r r rBaseFixrrLrNrOrPrGrrrvs*GGGGGGGGGGGGGGGG111gggggZ'gggX = = =LLL%'$     rfixes/__pycache__/fix_basestring.cpython-311.opt-1.pyc000064400000001550151027012300016532 0ustar00 !A?h@HdZddlmZddlmZGddejZdS)zFixer for basestring -> str.) 
fixer_base)NameceZdZdZdZdZdS) FixBasestringTz 'basestring'c.td|jS)Nstr)prefix)rr )selfnoderesultss 5/usr/lib64/python3.11/lib2to3/fixes/fix_basestring.py transformzFixBasestring.transform sE$+....N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrr rrs-MG/////rrN)__doc__r fixer_utilrBaseFixrrrr rsh""/////J&/////rfixes/__pycache__/fix_future.cpython-311.opt-2.pyc000064400000001611151027012300015702 0ustar00 !A?h#F ddlmZddlmZGddejZdS)) fixer_base) BlankLinec eZdZdZdZdZdZdS) FixFutureTz;import_from< 'from' module_name="__future__" 'import' any > c:t}|j|_|S)N)rprefix)selfnoderesultsnews 1/usr/lib64/python3.11/lib2to3/fixes/fix_future.py transformzFixFuture.transformskk[  N)__name__ __module__ __qualname__ BM_compatiblePATTERN run_orderrrrrr s4MOGIrrN)r fixer_utilrBaseFixrrrrrsg""""""      "     rfixes/__pycache__/fix_print.cpython-311.opt-2.pyc000064400000007125151027012300015532 0ustar00 !A?h  ddlmZddlmZddlmZddlmZddlmZmZm Z m Z ej dZ Gddej Zd S) )patcomp)pytree)token) fixer_base)NameCallCommaStringz"atom< '(' [atom|STRING|NAME] ')' >c"eZdZdZdZdZdZdS)FixPrintTzP simple_stmt< any* bare='print' any* > | print_stmt c |d}|r9|ttdg|jdS|jdd}t |dkr"t|drdSdx}x}}|r$|dtkr |dd}d}|rM|dtj tj dkr$|d}|d d}d |D}|r d |d_||||1||d t!t#||1||d t!t#||||d|ttd|} |j| _| S)Nbareprint)prefix z>>c6g|]}|S)clone).0args 0/usr/lib64/python3.11/lib2to3/fixes/fix_print.py z&FixPrint.transform..?s ...##))++...sependfile)getreplacerrrchildrenlen parend_exprmatchr rLeafr RIGHTSHIFTr add_kwargr repr) selfnoderesults bare_printargsrr r!l_argsn_stmts r transformzFixPrint.transform%s[[((     tDMM2&0&7 9 9 9 : : : F}QRR  t99>>k//Q88> FcD  DH''9DC  DGv{5+;TBBBB7==??D8D.....  "!F1I  ?co1AvufT#YY.?.?@@@vufT#YY.?.?@@@vvt444d7mmV,,   rc*d|_tj|jjt |tjtjd|f}|r(| td|_| |dS)Nr=r) rrNodesymsargumentrr(rEQUALappendr )r,l_nodess_kwdn_expr n_arguments rr*zFixPrint.add_kwargMs [!3"&u++"(+ek3"?"?"("*++   $ NN577 # # # #J z"""""rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr3r*rrrr r s?MG&&&P # # # # #rr N)rrrpgen2rr fixer_utilrrr r compile_patternr&BaseFixr rrrrHs 222222222222&g%6 :#:#:#:#:#z!:#:#:#:#:#rfixes/__pycache__/fix_methodattrs.cpython-311.opt-1.pyc000064400000002403151027012300016725 0ustar00 !A?h^TdZddlmZddlmZddddZGdd ejZd S) z;Fix bound method attributes (method.im_? -> method.__?__). ) fixer_base)Name__func____self__z__self__.__class__)im_funcim_selfim_classceZdZdZdZdZdS)FixMethodattrsTzU power< any+ trailer< '.' attr=('im_func' | 'im_self' | 'im_class') > any* > c|dd}t|j}|t||jdS)Nattr)prefix)MAPvaluereplacerr)selfnoderesultsr news 6/usr/lib64/python3.11/lib2to3/fixes/fix_methodattrs.py transformzFixMethodattrs.transformsBvq!$*o T#dk22233333N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrr r s/MG44444rr N)__doc__r fixer_utilrrBaseFixr rrrr$s % 4 4 4 4 4Z' 4 4 4 4 4rfixes/__pycache__/fix_next.cpython-311.opt-2.pyc000064400000012037151027012300015352 0ustar00 !A?hf | ddlmZddlmZddlmZddlmZm Z m Z dZ Gddej Z dZd Zd Zd S) )token)python_symbols) fixer_base)NameCall find_bindingz;Calls to builtin next() possibly shadowed by global bindingc0eZdZdZdZdZfdZdZxZS)FixNextTa power< base=any+ trailer< '.' attr='next' > trailer< '(' ')' > > | power< head=any+ trailer< '.' 
attr='next' > not trailer< '(' ')' > > | classdef< 'class' any+ ':' suite< any* funcdef< 'def' name='next' parameters< '(' NAME ')' > any+ > any* > > | global=global_stmt< 'global' any* 'next' any* > prectt|||td|}|r$||t d|_dSd|_dS)NnextTF)superr start_treerwarning bind_warning shadowed_next)selftreefilenamen __class__s //usr/lib64/python3.11/lib2to3/fixes/fix_next.pyrzFixNext.start_tree$sj gt''h777  & &  ' LLL ) ) )!%D   !&D   c\|d}|d}|d}|r|jr+|td|jdSd|D}d|d_|t td |j|dS|r-td|j}||dS|rt |rZ|d }dd |Dd kr| |tdS|tddSd |vr$| |td|_dSdS)Nbaseattrname__next__)prefixc6g|]}|S)clone.0rs r z%FixNext.transform..9s 000a 000rr headc,g|]}t|Sr!)strr#s rr%z%FixNext.transform..Es111qCFF111r __builtin__globalT) getrreplacerrris_assign_targetjoinstriprr)rnoderesultsrrrrr(s r transformzFixNext.transform.s{{6""{{6""{{6""  &! K T*T[AAABBBBB004000!#Q T$vdk"B"B"BDIIJJJJJ  &Z 444A LLOOOOO  & %% v7711D1112288::mKKLL|444 LLj)) * * * * *  LL| , , ,!%D   ! r) __name__ __module__ __qualname__ BM_compatiblePATTERNorderrr4 __classcell__)rs@rr r sZM G E'''''&&&&&&&rr ct|}|dS|jD]-}|jtjkrdSt ||rdS.dS)NFT) find_assignchildrentyperEQUAL is_subtree)r2assignchilds rr/r/Qsc   F ~u : $ $55 t $ $ 44  5rc|jtjkr|S|jtjks|jdSt |jSN)r?syms expr_stmt simple_stmtparentr=)r2s rr=r=]sB yDN""  yD$$$ (;t t{ # ##rcT|krdStfd|jDS)NTc38K|]}t|VdSrE)rA)r$cr2s r zis_subtree..gs-::qz!T""::::::r)anyr>)rootr2s `rrArAds6 t||t ::::DM::: : ::rN)pgen2rpygramrrFr&r fixer_utilrrrrBaseFixr r/r=rAr!rrrTs4++++++1111111111L :&:&:&:&:&j :&:&:&@   $$$;;;;;rfixes/__pycache__/fix_getcwdu.cpython-311.pyc000064400000002055151027012300015075 0ustar00 !A?hHdZddlmZddlmZGddejZdS)z1 Fixer that changes os.getcwdu() to os.getcwd(). ) fixer_base)NameceZdZdZdZdZdS) FixGetcwduTzR power< 'os' trailer< dot='.' name='getcwdu' > any* > ch|d}|td|jdS)Nnamegetcwd)prefix)replacerr )selfnoderesultsrs 2/usr/lib64/python3.11/lib2to3/fixes/fix_getcwdu.py transformzFixGetcwdu.transforms2v T(4;77788888N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s/MG99999rrN)__doc__r fixer_utilrBaseFixrrrrrsl  9 9 9 9 9# 9 9 9 9 9rfixes/__pycache__/fix_has_key.cpython-311.pyc000064400000010557151027012300015064 0ustar00 !A?h| XdZddlmZddlmZddlmZmZGddejZdS)a&Fixer for has_key(). Calls to .has_key() methods are expressed in terms of the 'in' operator: d.has_key(k) -> k in d CAVEATS: 1) While the primary target of this fixer is dict.has_key(), the fixer will change any has_key() method call, regardless of its class. 2) Cases like this will not be converted: m = d.has_key if m(k): ... Only *calls* to has_key() are converted. While it is possible to convert the above to something like m = d.__contains__ if m(k): ... this is currently not done. )pytree) fixer_base)Name parenthesizeceZdZdZdZdZdS) FixHasKeyTa anchor=power< before=any+ trailer< '.' 'has_key' > trailer< '(' ( not(arglist | argument) arg=any ','> ) ')' > after=any* > | negation=not_test< 'not' anchor=power< before=any+ trailer< '.' 
'has_key' > trailer< '(' ( not(arglist | argument) arg=any ','> ) ')' > > > c R|sJ|j}|jj|jkr!|j|jrdS|d}|d}|j}d|dD}|d}|d} | r d| D} |j|j |j|j |j |j |j |jfvrt|}t!|dkr |d }nt#j|j|}d |_t)d d } |r-t)d d } t#j|j| | f} t#j|j || |f} | r:t| } t#j|j| ft-| z} |jj|j |j|j|j|j|j|j|j|jf vrt| } || _| S)Nnegationanchorc6g|]}|Sclone.0ns 2/usr/lib64/python3.11/lib2to3/fixes/fix_has_key.py z'FixHasKey.transform..Rs 777!''))777beforeargafterc6g|]}|Sr rrs rrz'FixHasKey.transform..Vs ...1QWWYY...r in)prefixnot)symsparenttypenot_testpatternmatchgetrr comparisonand_testor_testtestlambdefargumentrlenrNodepowerrcomp_optupleexprxor_exprand_expr shift_expr arith_exprtermfactor) selfnoderesultsr r r rrrrn_opn_notnews r transformzFixHasKey.transformGs#wy K  - - L  t{ + + .4;;z**"77WX%6777en""$$ G$$  /.....E 8  dit}N N Ns##C v;;!  AYFF[V44F D%%%  <s+++E;t|eT];;Dk$/Cv+>??  As##C+dj3&5<<*?@@C ; DM $ t $ $ TZ 9 9 9s##C  rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr?r rrrr&s/MG<&&&&&rrN) __doc__rr fixer_utilrrBaseFixrr rrrIs:++++++++GGGGG "GGGGGrfixes/__pycache__/fix_isinstance.cpython-311.opt-2.pyc000064400000004164151027012300016536 0ustar00 !A?hHF ddlmZddlmZGddejZdS)) fixer_base)tokenc eZdZdZdZdZdZdS) FixIsinstanceTz power< 'isinstance' trailer< '(' arglist< any ',' atom< '(' args=testlist_gexp< any+ > ')' > > ')' > > ct}|d}|j}g}t|}|D]\}} | jtjkrN| j|vrE|t|dz kr.||dzjtjkrt|gh| | | jtjkr| | j|r|djtjkr|d=t|dkr6|j } | j |d_ | |ddS||dd<|dS)Nargs)setchildren enumeratetyperNAMEvaluelenCOMMAnextappendaddparentprefixreplacechanged) selfnoderesultsnames_insertedtestlistr new_argsiteratoridxargatoms 5/usr/lib64/python3.11/lib2to3/fixes/fix_isinstance.py transformzFixIsinstance.transformsO6? T??  2 2HCx5:%%#)~*E*ETQ&&4a=+=+L+LNNN$$$8uz))"&&sy111   )U[88 x==A  ?D!%HQK  LL! % % % % %DG LLNNNNNN)__name__ __module__ __qualname__ BM_compatiblePATTERN run_orderr'r(r&rrs6MGIr(rN)r fixer_utilrBaseFixrr/r(r&r3sg$$$$$J&$$$$$r(fixes/__pycache__/fix_repr.cpython-311.opt-1.pyc000064400000002327151027012300015344 0ustar00 !A?hePdZddlmZddlmZmZmZGddejZdS)z/Fixer that transforms `xyzzy` into repr(xyzzy).) fixer_base)CallName parenthesizeceZdZdZdZdZdS)FixReprTz7 atom < '`' expr=any '`' > c|d}|j|jjkrt |}t t d|g|jS)Nexprrepr)prefix)clonetypesyms testlist1rrrr )selfnoderesultsr s //usr/lib64/python3.11/lib2to3/fixes/fix_repr.py transformzFixRepr.transformsUv$$&& 9 + + +%%DDLL4&====N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s/MG>>>>>rrN) __doc__r fixer_utilrrrBaseFixrrrrr!sv651111111111 > > > > >j > > > > >rfixes/__pycache__/fix_methodattrs.cpython-311.opt-2.pyc000064400000002266151027012300016735 0ustar00 !A?h^R ddlmZddlmZddddZGddejZd S) ) fixer_base)Name__func____self__z__self__.__class__)im_funcim_selfim_classceZdZdZdZdZdS)FixMethodattrsTzU power< any+ trailer< '.' attr=('im_func' | 'im_self' | 'im_class') > any* > c|dd}t|j}|t||jdS)Nattr)prefix)MAPvaluereplacerr)selfnoderesultsr news 6/usr/lib64/python3.11/lib2to3/fixes/fix_methodattrs.py transformzFixMethodattrs.transformsBvq!$*o T#dk22233333N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrr r s/MG44444rr N)r fixer_utilrrBaseFixr rrrr#s~ % 4 4 4 4 4Z' 4 4 4 4 4rfixes/__pycache__/fix_dict.cpython-311.opt-2.pyc000064400000007665151027012300015332 0ustar00 !A?h ddlmZddlmZddlmZddlmZmZmZddlmZejdhzZ Gddej Z d S) )pytree)patcomp) fixer_base)NameCallDot) fixer_utilitercjeZdZdZdZdZdZejeZ dZ eje Z dZ dS)FixDictTa power< head=any+ trailer< '.' 
method=('keys'|'items'|'values'| 'iterkeys'|'iteritems'|'itervalues'| 'viewkeys'|'viewitems'|'viewvalues') > parens=trailer< '(' ')' > tail=any* > c |d}|dd}|d}|j}|j}|d}|d} |s| r |dd}d|D}d |D}| o|||} |t j|jtt||j g|d  gz} t j|j | } | s+| s)d | _ tt|rdnd | g} |rt j|j | g|z} |j | _ | S)Nheadmethodtailr viewc6g|]}|Sclone.0ns //usr/lib64/python3.11/lib2to3/fixes/fix_dict.py z%FixDict.transform..A (((a (((c6g|]}|Srrrs rrz%FixDict.transform..Brr)prefixparenslist) symsvalue startswithin_special_contextrNodetrailerrrr rpowerr) selfnoderesultsrrrr$ method_nameisiterisviewspecialargsnews r transformzFixDict.transform6sv"1%vyl ''//''//  *V *%abb/K((4(((((4((((Dt66tVDDv{4<$'EE$(06 %?%?%?$@AAx(..00 22 k$*d++ B6 BCJtf8FF&99C5AAC  8+dj3%$,77C[  rz3power< func=NAME trailer< '(' node=any ')' > any* >zmfor_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > cH|jdSi}|jj^|j|jj|r9|d|ur/|r|djtvS|djt jvS|sdS|j|j|o |d|uS)NFr,func)parentp1matchr% iter_exemptr consuming_callsp2)r+r,r/r-s rr'zFixDict.in_special_contextZs ; 5 K  *w}}T[/99 +v$&& Kv, ;;v, 0JJJ 5w}}T['22Nwv$7NNrN) __name__ __module__ __qualname__ BM_compatiblePATTERNr4P1rcompile_patternr8P2r<r'rrrr r )swMG8 ?B   $ $B B !  $ $BOOOOOrr N) r"rrrr rrrr;r:BaseFixr rrrrFs6(((((((((((F83 AOAOAOAOAOj AOAOAOAOAOrfixes/__pycache__/fix_idioms.cpython-311.opt-1.pyc000064400000013510151027012300015654 0ustar00 !A?h ddZddlmZddlmZmZmZmZmZm Z dZ dZ Gddej Z dS) aAdjust some old Python 2 idioms to their modern counterparts. * Change some type comparisons to isinstance() calls: type(x) == T -> isinstance(x, T) type(x) is T -> isinstance(x, T) type(x) != T -> not isinstance(x, T) type(x) is not T -> not isinstance(x, T) * Change "while 1:" into "while True:". * Change both v = list(EXPR) v.sort() foo(v) and the more general v = EXPR v.sort() foo(v) into v = sorted(EXPR) foo(v) ) fixer_base)CallCommaNameNode BlankLinesymsz0(n='!=' | '==' | 'is' | n=comp_op< 'is' 'not' >)z(power< 'type' trailer< '(' x=any ')' > >c XeZdZdZdedededed ZfdZdZdZ d Z d Z xZ S) FixIdiomsTz isinstance=comparison<  z8 T=any > | isinstance=comparison< T=any aX > | while_stmt< 'while' while='1' ':' any+ > | sorted=any< any* simple_stmt< expr_stmt< id1=any '=' power< list='list' trailer< '(' (not arglist) any ')' > > > '\n' > sort= simple_stmt< power< id2=any trailer< '.' 'sort' > trailer< '(' ')' > > '\n' > next=any* > | sorted=any< any* simple_stmt< expr_stmt< id1=any '=' expr=any > '\n' > sort= simple_stmt< power< id2=any trailer< '.' 'sort' > trailer< '(' ')' > > '\n' > next=any* > ctt||}|rd|vr|d|dkr|SdS|S)Nsortedid1id2)superr match)selfnoder __class__s 1/usr/lib64/python3.11/lib2to3/fixes/fix_idioms.pyrzFixIdioms.matchOsT )T " " ( ( . .  Qx1U8##4cd|vr|||Sd|vr|||Sd|vr|||Std)N isinstancewhilerz Invalid match)transform_isinstancetransform_whiletransform_sort RuntimeError)rrresultss r transformzFixIdioms.transformZss 7 " ",,T7;; ;   ''g66 6  &&tW55 5// /rcb|d}|d}d|_d|_ttd|t |g}d|vr0d|_t t jtd|g}|j|_|S)NxTr rnnot)cloneprefixrrrrr not_test)rrr r#r$tests rrzFixIdioms.transform_isinstanceds CL    CL   D&&EGGQ88 '>>DK U T':;;Dk  rch|d}|td|jdS)NrTruer))replacerr))rrr ones rrzFixIdioms.transform_whileps3g D 33344444rc|d}|d}|d}|d}|r*|td|jne|rT|}d|_|t td|g|jnt d||j}d |vr|rJ|d d |d jf} d | |d _dSt} |j | |d d | _dSdS) Nsortnextlistexprrr.r%zshould not have reached here ) getr/rr)r(rrremove rpartitionjoinrparent append_child) rrr sort_stmt next_stmt list_call simple_exprnewbtwn prefix_linesend_lines rrzFixIdioms.transform_sorttsFO FO KK'' kk&))  ?   
d8I4DEEE F F F F  ?##%%CCJ   T(^^cU,7,>!@!@!@ A A A A=>> > 4<< ;!% 5 5a 8)A,:MN &*ii &=&= ! ### %;; --h777#'//$"7"7":! rRs<AAAAAAAAAAAAAAAA81s;s;s;s;s; "s;s;s;s;s;rfixes/__pycache__/fix_filter.cpython-311.opt-2.pyc000064400000007010151027012300015654 0ustar00 !A?h n ddlmZddlmZddlmZddlmZm Z m Z m Z m Z Gddej ZdS)) fixer_base)Node)python_symbols)NameArgListListCompin_special_context parenthesizec eZdZdZdZdZdZdS) FixFilterTaV filter_lambda=power< 'filter' trailer< '(' arglist< lambdef< 'lambda' (fp=NAME | vfpdef< '(' fp=NAME ')'> ) ':' xp=any > ',' it=any > ')' > [extra_trailers=trailer*] > | power< 'filter' trailer< '(' arglist< none='None' ',' seq=any > ')' > [extra_trailers=trailer*] > | power< 'filter' args=trailer< '(' [any] ')' > [extra_trailers=trailer*] > zfuture_builtins.filterc||rdSg}d|vr2|dD])}||*d|vr|d}|jt jkrd|_t|}t|d|d|d|}tt j |g|zd}n d|vrrttd td |d td }tt j |g|zd}nt|rdS|d }tt j td |gd}tt j td t|gg|z}d|_|j|_|S)Nextra_trailers filter_lambdaxpfpit)prefixnone_fseqargsfilterlist) should_skipappendclonegettypesymstestrr rrpowerrr r)selfnoderesultstrailerstrnewrs 1/usr/lib64/python3.11/lib2to3/fixes/fix_filter.py transformzFixFilter.transform:s   D ! !  F w & &-. + + **** g % %T""((**Bw$)## !"%%7;;t,,2244";;t,,2244";;t,,2244b::CtzC58#3B???CC w  4::::"5>//11::''CtzC58#3B???CC"$'' t6?((**DtzDNND#9"EEECtzDLL'3%..#AH#LMMCCJ[  N)__name__ __module__ __qualname__ BM_compatiblePATTERNskip_onr*r+r)r r s6MG<'G$$$$$r+r N)rrpytreerpygramrr fixer_utilrrrr r ConditionalFixr r2r+r)r7s ++++++RRRRRRRRRRRRRRGGGGG )GGGGGr+fixes/__pycache__/fix_has_key.cpython-311.opt-1.pyc000064400000010536151027012300016020 0ustar00 !A?h| XdZddlmZddlmZddlmZmZGddejZdS)a&Fixer for has_key(). Calls to .has_key() methods are expressed in terms of the 'in' operator: d.has_key(k) -> k in d CAVEATS: 1) While the primary target of this fixer is dict.has_key(), the fixer will change any has_key() method call, regardless of its class. 2) Cases like this will not be converted: m = d.has_key if m(k): ... Only *calls* to has_key() are converted. While it is possible to convert the above to something like m = d.__contains__ if m(k): ... this is currently not done. )pytree) fixer_base)Name parenthesizeceZdZdZdZdZdS) FixHasKeyTa anchor=power< before=any+ trailer< '.' 'has_key' > trailer< '(' ( not(arglist | argument) arg=any ','> ) ')' > after=any* > | negation=not_test< 'not' anchor=power< before=any+ trailer< '.' 'has_key' > trailer< '(' ( not(arglist | argument) arg=any ','> ) ')' > > > c J|j}|jj|jkr!|j|jrdS|d}|d}|j}d|dD}|d}|d} | r d| D} |j|j |j|j |j |j |j |jfvrt|}t!|dkr |d }nt#j|j|}d |_t)d d } |r-t)d d } t#j|j| | f} t#j|j || |f} | r:t| } t#j|j| ft-| z} |jj|j |j|j|j|j|j|j|j|jf vrt| } || _| S)Nnegationanchorc6g|]}|Sclone.0ns 2/usr/lib64/python3.11/lib2to3/fixes/fix_has_key.py z'FixHasKey.transform..Rs 777!''))777beforeargafterc6g|]}|Sr rrs rrz'FixHasKey.transform..Vs ...1QWWYY...r in)prefixnot)symsparenttypenot_testpatternmatchgetrr comparisonand_testor_testtestlambdefargumentrlenrNodepowerrcomp_optupleexprxor_exprand_expr shift_expr arith_exprtermfactor) selfnoderesultsr r r rrrrn_opn_notnews r transformzFixHasKey.transformGsy K  - - L  t{ + + .4;;z**"77WX%6777en""$$ G$$  /.....E 8  dit}N N Ns##C v;;!  AYFF[V44F D%%%  <s+++E;t|eT];;Dk$/Cv+>??  
As##C+dj3&5<<*?@@C ; DM $ t $ $ TZ 9 9 9s##C  rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr?r rrrr&s/MG<&&&&&rrN) __doc__rr fixer_utilrrBaseFixrr rrrIs:++++++++GGGGG "GGGGGrfixes/__pycache__/fix_standarderror.cpython-311.opt-1.pyc000064400000001645151027012300017250 0ustar00 !A?hHdZddlmZddlmZGddejZdS)z%Fixer for StandardError -> Exception.) fixer_base)NameceZdZdZdZdZdS)FixStandarderrorTz- 'StandardError' c.td|jS)N Exception)prefix)rr )selfnoderesultss 8/usr/lib64/python3.11/lib2to3/fixes/fix_standarderror.py transformzFixStandarderror.transformsK 4444N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrr rr s/MG55555rrN)__doc__r fixer_utilrBaseFixrrrr rsj,+55555z)55555rfixes/__pycache__/__init__.cpython-311.pyc000064400000000234151027012300014321 0ustar00 !A?h/dS)Nr//usr/lib64/python3.11/lib2to3/fixes/__init__.pyrsrfixes/__pycache__/fix_metaclass.cpython-311.opt-1.pyc000064400000024675151027012300016362 0ustar00 !A?h dZddlmZddlmZddlmZmZmZdZ dZ dZ dZ d Z d ZGd d ejZd S)aFixer for __metaclass__ = X -> (metaclass=X) methods. The various forms of classef (inherits nothing, inherits once, inherits many) don't parse the same in the CST so we look at ALL classes for a __metaclass__ and if we find one normalize the inherits to all be an arglist. For one-liner classes ('class X: pass') there is no indent/dedent so we normalize those into having a suite. Moving the __metaclass__ into the classdef can also cause the class body to be empty so there is some special casing for that as well. This fixer also tries very hard to keep original indenting and spacing in all those corner cases. ) fixer_base)token)symsNodeLeafcP|jD]}|jtjkrt |cS|jtjkr`|jrY|jd}|jtjkr7|jr0|jd}t|tr|j dkrdSdS)z we have to check the cls_node without changing it. There are two possibilities: 1) clsdef => suite => simple_stmt => expr_stmt => Leaf('__meta') 2) clsdef => simple_stmt => expr_stmt => Leaf('__meta') __metaclass__TF) childrentypersuite has_metaclass simple_stmt expr_stmt isinstancervalue)parentnode expr_node left_sides 4/usr/lib64/python3.11/lib2to3/fixes/fix_metaclass.pyrrs     9 " " && & & & Y$* * *t} * a(I~//I4F/%.q1 i.. !?::44 5c |jD]}|jtjkrdSt |jD]\}}|jt jkrntdttjg}|j|dzdr]|j|dz}| | | |j|dzd]| ||}dS)zf one-line classes don't get a suite in the parse tree so we add one to normalize the tree NzNo class suite and no ':'!) r r rr enumeraterCOLON ValueErrorr append_childcloneremove)cls_noderir move_nodes rfixup_parse_treer$-s ! 9 " " FF # X.//774 9 # # E $5666 R E  AaCDD !%ac*  9??,,---  AaCDD ! %   DDDrcnt|jD]\}}|jtjkrndS|t tjg}t tj |g}|j|drW|j|}| | ||j|dW| |||jdjd}|jdjd} | j |_ dS)z if there is a semi-colon all the parts count as part of the same simple_stmt. We just want the __metaclass__ part so we move everything after the semi-colon into its own simple_stmt node Nr )rr r rSEMIr rrrrrr insert_childprefix) rr" stmt_nodesemi_indrnew_exprnew_stmtr# new_leaf1 old_leaf1s rfixup_simple_stmtr/Gs. 
$I$677$ 9 " " E # KKMMMDNB''HD$xj11H  XYY '&x0 ioo//000  XYY ' 8$$$!!$-a0I"1%.q1I 'Irc|jrA|jdjtjkr#|jddSdSdS)N)r r rNEWLINEr )rs rremove_trailing_newliner3_sQ }#r*/5=@@ b  """""##@@rc#K|jD]}|jtjkrnt dt t |jD]\}}|jtjkr|jr}|jd}|jtjkr[|jrT|jd}t|tr2|j dkr't|||t||||fVdS)NzNo class suite!r r )r r rr rlistrrrrrrr/r3)r!rr" simple_noder left_nodes r find_metasr8ds!,, 9 " " E #*+++y7788 1 1;  t/ / /K4H /#,Q/I~//I4F/%.q1 i..1!?::%dA{;;;+K888K0000 1 1rcp|jddd}|r,|}|jtjkrn|,|ru|}t |t r%|jtjkr|jrd|_dS| |jddd|sdSdS)z If an INDENT is followed by a thing with a prefix then nuke the prefix Otherwise we get in trouble when removing __metaclass__ at suite start Nr1) r popr rINDENTrrDEDENTr(extend)r kidsrs r fixup_indentr@{s >$$B$ D xxzz 9 $ $   -xxzz dD ! ! -di5<&?&?{ !  F KK ddd+ , , , -----rceZdZdZdZdZdS) FixMetaclassTz classdef ct|sdSt|d}t|D]\}}}|}||jdj}t |jdkr|jdjtjkr|jd}nN|jd } ttj| g}| d|nt |jdkr1ttjg}| d|nt |jdkrttjg}| dttjd| d|| dttjdnt#d |jdjd} d | _| j} |jr5|ttjd d | _nd | _|jd} d | jd_d | jd_||t-||jso|t|d} | | _|| |ttjddSt |jdkr|jdjtjkrx|jdjtjkrZt|d} | d| | dttjddSdSdSdS)Nr r)(zUnexpected class definition metaclass, r:rpass r1)rr$r8r r r lenrarglistrr set_childr'rrRPARLPARrrr(rCOMMAr@r2r<r=)selfrresultslast_metaclassr r"stmt text_typerQrmeta_txtorig_meta_prefixr pass_leafs r transformzFixMetaclass.transformsT""  F(..  NE1d!N KKMMMMM!$)  t}   " "}Q$ 44-*q)//11t|fX66q'****   1 $ $4<,,G   a ) ) ) )   1 $ $4<,,G   aej#!6!6 7 7 7   a ) ) )   aej#!6!6 7 7 7 7:;; ;"*1-6q9$#?   !  ek3!7!7 8 8 8!HOO HO#+A. ') 1$') 1$^,,,U~ > LLNNNY//I/I    i ( ( (   d5=$77 8 8 8 8 8  1 $ $.$)U\99.$)U\99Y//I   r9 - - -   r4 t#<#< = = = = = % $9999rN)__name__ __module__ __qualname__ BM_compatiblePATTERNr^rrrBrBs4MGL>L>L>L>L>rrBN)__doc__r:rpygramr fixer_utilrrrrr$r/r3r8r@BaseFixrBrdrrris())))))))))&4(((0### 111.---,S>S>S>S>S>:%S>S>S>S>S>rfixes/__pycache__/fix_exec.cpython-311.opt-1.pyc000064400000003425151027012300015320 0ustar00 !A?hPdZddlmZddlmZmZmZGddejZdS)zFixer for exec. This converts usages of the exec statement into calls to a built-in exec() function. exec code in ns1, ns2 -> exec(code, ns1, ns2) ) fixer_base)CommaNameCallceZdZdZdZdZdS)FixExecTzx exec_stmt< 'exec' a=any 'in' b=any [',' c=any] > | exec_stmt< 'exec' (not atom<'(' [any] ')'>) a=any > c|j}|d}|d}|d}|g}d|d_|5|t |g|5|t |gt td||jS)Nabcexec)prefix)symsgetclonerextendrrr)selfnoderesultsrr r r argss //usr/lib64/python3.11/lib2to3/fixes/fix_exec.py transformzFixExec.transformsy CL KK   KK   {Q = KK!'')), - - - = KK!'')), - - -DLL$t{;;;;N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrrs/MG < < < < r%sx**********<<<< >).0ts 0/usr/lib64/python3.11/lib2to3/fixes/fix_types.py r-3sPPPQ 4q 8PPPcBeZdZdZdeZdZdS)FixTypesT|ct|dj}|rt||jSdS)Nname)prefix) _TYPE_MAPPINGgetvaluerr4)selfnoderesults new_values r, transformzFixTypes.transform9s>!%%gfo&;<<  7 $+666 6tr.N)__name__ __module__ __qualname__ BM_compatiblejoin_patsPATTERNr<r)r.r,r0r05s7MhhuooGr.r0N)r fixer_utilrr5rBBaseFixr0r)r.r,rGsk&| f   F  6  ) W 5 F E x L 5 g!" g#$ %&- 2 QP-PPPz!r.fixes/__pycache__/fix_set_literal.cpython-311.opt-1.pyc000064400000005300151027012300016675 0ustar00 !A?hPdZddlmZmZddlmZmZGddejZdS)z: Optional fixer to transform set() calls to set literals. 
) fixer_basepytree)tokensymsc eZdZdZdZdZdZdS) FixSetLiteralTajpower< 'set' trailer< '(' (atom=atom< '[' (items=listmaker< any ((',' any)* [',']) > | single=any) ']' > | atom< '(' items=testlist_gexp< any ((',' any)* [',']) > ')' > ) ')' > > c|d}|rJtjtj|g}|||}n|d}tjtj dg}| d|j D| tjtj d|jj|d_tjtj|}|j|_t#|j dkr8|j d}||j|j d_|S) Nsingleitems{c3>K|]}|VdS)N)clone).0ns 6/usr/lib64/python3.11/lib2to3/fixes/fix_set_literal.py z*FixSetLiteral.transform..'s*99Qqwwyy999999})getrNoder listmakerrreplaceLeafrLBRACEextendchildrenappendRBRACE next_siblingprefix dictsetmakerlenremove) selfnoderesultsr faker literalmakerrs r transformzFixSetLiteral.transforms.X&&  %;t~ /?@@D NN4 EEG$E;u|S11299%.999999v{5<55666"/6  D-w77{  u~  ! # #q!A HHJJJ()EN2  % rN)__name__ __module__ __qualname__ BM_compatibleexplicitPATTERNr-rrrr s4MHGrrN) __doc__lib2to3rrlib2to3.fixer_utilrrBaseFixrr4rrr9sx '&&&&&&&********)))))J&)))))rfixes/__pycache__/fix_zip.cpython-311.pyc000064400000004460151027012300014237 0ustar00 !A?h hdZddlmZddlmZddlmZddlm Z m Z m Z Gddej Z dS) a7 Fixer that changes zip(seq0, seq1, ...) into list(zip(seq0, seq1, ...) unless there exists a 'from future_builtins import zip' statement in the top-level namespace. We avoid the transformation if the zip() call is directly contained in iter(<>), list(<>), tuple(<>), sorted(<>), ...join(<>), or for V in <>:. ) fixer_base)Node)python_symbols)NameArgListin_special_contextc eZdZdZdZdZdZdS)FixZipTzN power< 'zip' args=trailer< '(' [any] ')' > [trailers=trailer*] > zfuture_builtins.zipc||rdSt|rdS|d}d|_g}d|vrd|dD}|D] }d|_ t t jtd|gd}t t jtdt|gg|z}|j|_|S)Nargstrailersc6g|]}|S)clone).0ns ./usr/lib64/python3.11/lib2to3/fixes/fix_zip.py z$FixZip.transform..'s ???a ???zip)prefixlist) should_skiprrrrsymspowerrr)selfnoderesultsr rrnews r transformzFixZip.transforms   D ! !  F d # # 4v$$&&   ??7:+>???H  4:U T22>>>4:V gsenn=HII[  rN)__name__ __module__ __qualname__ BM_compatiblePATTERNskip_onr!rrrr r s6MG $Grr N)__doc__r rpytreerpygramrr fixer_utilrrrConditionalFixr rrrr-s++++++::::::::::Z &rfixes/__pycache__/fix_except.cpython-311.pyc000064400000011264151027012300014725 0ustar00 !A?h zdZddlmZddlmZddlmZddlmZmZm Z m Z m Z m Z dZ GddejZd S) aFixer for except statements with named exceptions. The following cases will be converted: - "except E, T:" where T is a name: except E as T: - "except E, T:" where T is not a name, tuple or list: except E as t: T = t This is done because the target of an "except" clause must be a name. 
- "except E, T:" where T is a tuple or list literal: except E as t: T = t.args )pytree)token) fixer_base)AssignAttrNameis_tupleis_listsymsc#Kt|D]?\}}|jtjkr%|jdjdkr|||dzfV@dS)Nexceptr) enumeratetyper except_clausechildrenvalue)nodesins 1/usr/lib64/python3.11/lib2to3/fixes/fix_except.py find_exceptsrsh%  &&1 6T' ' 'z!}"h..%!*o%%%&&ceZdZdZdZdZdS) FixExceptTa1 try_stmt< 'try' ':' (simple_stmt | suite) cleanup=(except_clause ':' (simple_stmt | suite))+ tail=(['except' ':' (simple_stmt | suite)] ['else' ':' (simple_stmt | suite)] ['finally' ':' (simple_stmt | suite)]) > c j|j}d|dD}d|dD}t|D]\}}t|jdkr|jdd\}} } | t dd | jtjkrAt | d } | } d | _ | | | } |j} t| D]!\}}t|tjrn"t!| st#| r,t%| t'| t d }nt%| | }t)| d|D]}|d ||||| j d krd| _ d |jddD|z|z}tj|j|S)Nc6g|]}|Sclone).0rs r z'FixExcept.transform..2s 333a 333rtailc6g|]}|Srr)r!chs rr"z'FixExcept.transform..4s ???brxxzz???rcleanupas )prefixargsr c6g|]}|Srr)r!cs rr"z'FixExcept.transform..\s 999!AGGII999r)r rlenrreplacerrrNAMEnew_namer r+r isinstancerNoder r rrreversed insert_child)selfnoderesultsr r# try_cleanupre_suiteEcommaNnew_Ntarget suite_stmtsrstmtassignchildrs r transformzFixExcept.transform/s2y3376?333??GI,>??? &2;&?&?$ #$ # "M7=)**a// - 6qs ; E1 d44445556UZ'' ===EWWYYF$&FMIIe$$$!KKMME #*"2K#,[#9#9""4%dFK88"!E"  {{7gajj7!'UDLL0I0I!J!J!'!6!6"*+bqb/!:!:77,,Q6666((F3333X^^ #AH:9t}RaR'8999KG$N{49h///rN)__name__ __module__ __qualname__ BM_compatiblePATTERNrGrrrrr$s/MG.0.0.0.0.0rrN)__doc__r,rpgen2rr fixer_utilrrrr r r rBaseFixrrrrrQs0DDDDDDDDDDDDDDDD&&& 9090909090 "9090909090rfixes/__pycache__/fix_standarderror.cpython-311.pyc000064400000001645151027012300016311 0ustar00 !A?hHdZddlmZddlmZGddejZdS)z%Fixer for StandardError -> Exception.) fixer_base)NameceZdZdZdZdZdS)FixStandarderrorTz- 'StandardError' c.td|jS)N Exception)prefix)rr )selfnoderesultss 8/usr/lib64/python3.11/lib2to3/fixes/fix_standarderror.py transformzFixStandarderror.transformsK 4444N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrr rr s/MG55555rrN)__doc__r fixer_utilrBaseFixrrrr rsj,+55555z)55555rfixes/__pycache__/fix_xrange.cpython-311.opt-2.pyc000064400000010061151027012300015653 0ustar00 !A?h Z ddlmZddlmZmZmZddlmZGddejZdS)) fixer_base)NameCallconsuming_calls)patcompceZdZdZdZfdZdZdZdZdZ dZ e j e Z d Ze j eZd ZxZS) FixXrangeTz power< (name='range'|name='xrange') trailer< '(' args=any ')' > rest=any* > ctt|||t|_dSN)superr start_treesettransformed_xranges)selftreefilename __class__s 1/usr/lib64/python3.11/lib2to3/fixes/fix_xrange.pyr zFixXrange.start_trees5 i))$999#&55   cd|_dSr )r)rrrs r finish_treezFixXrange.finish_trees#'   rc|d}|jdkr|||S|jdkr|||Stt |)Nnamexrangerange)valuetransform_xrangetransform_range ValueErrorreprrnoderesultsrs r transformzFixXrange.transformsev : ! 
!((w77 7 Z7 " "''g66 6T$ZZ(( (rc|d}|td|j|jt |dS)Nrrprefix)replacerr'raddidr!s rrzFixXrange.transform_xrange$sOv T'$+666777  $$RXX.....rcZt||jvr||stt d|dg}tt d|g|j}|dD]}|||SdSdS)Nrargslistr&rest)r*rin_special_contextrrcloner' append_child)rr"r# range_call list_callns rrzFixXrange.transform_range*s tHHD4 4 4''-- 5d7mmgfo.C.C.E.E-FGGJT&\\J<$(K111IV_ * *&&q))))  5 4 4 4rz3power< func=NAME trailer< '(' node=any ')' > any* >zfor_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > | comparison< any 'in' node=any any*> c |jdSi}|jjC|j|jj|r|d|ur|djtvS|j|j|o |d|uS)NFr"func)parentp1matchrrp2)rr"r#s rr/zFixXrange.in_special_context?s ; 5 K  *w}}T[/99 +v$&&6?(O; ;w}}T['22Nwv$7NNr)__name__ __module__ __qualname__ BM_compatiblePATTERNr rr$rrP1rcompile_patternr8P2r:r/ __classcell__)rs@rr r sMG )))))((()))///    ?B   $ $B B !  $ $B O O O O O O Orr N) r fixer_utilrrrrBaseFixr rrrHs64444444444=O=O=O=O=O "=O=O=O=O=Orfixes/__pycache__/fix_reduce.cpython-311.pyc000064400000002475151027012300014710 0ustar00 !A?hEHdZddlmZddlmZGddejZdS)zqFixer for reduce(). Makes sure reduce() is imported from the functools module if reduce is used in that module. ) fixer_base touch_importc eZdZdZdZdZdZdS) FixReduceTpreai power< 'reduce' trailer< '(' arglist< ( (not(argument) any ',' not(argument > c(tdd|dS)N functoolsreducer)selfnoderesultss 1/usr/lib64/python3.11/lib2to3/fixes/fix_reduce.py transformzFixReduce.transform"s[(D11111N)__name__ __module__ __qualname__ BM_compatibleorderPATTERNrrrrrs4M E G22222rrN)__doc__lib2to3rlib2to3.fixer_utilrBaseFixrrrrrsl ++++++22222 "22222rfixes/__pycache__/fix_imports2.cpython-311.opt-2.pyc000064400000001110151027012300016141 0ustar00 !A?h!D ddlmZdddZGddejZdS)) fix_importsdbm)whichdbanydbmceZdZdZeZdS) FixImports2N)__name__ __module__ __qualname__ run_orderMAPPINGmapping3/usr/lib64/python3.11/lib2to3/fixes/fix_imports2.pyrr sIGGGrrN)rr FixImportsrrrrrsg   +(rfixes/__pycache__/fix_urllib.cpython-311.opt-1.pyc000064400000024267151027012300015674 0ustar00 !A?h dZddlmZmZddlmZmZmZmZm Z m Z m Z dgdfdgdfdd gfgdgd fdd d gfgd Z e d e dddZGddeZdS)zFix changes imports of urllib which are now incompatible. This is rather similar to fix_imports, but because of the more complex nature of the fixing for urllib, it has its own fixer. ) alternates FixImports)NameComma FromImportNewlinefind_indentationNodesymszurllib.request) URLopenerFancyURLopener urlretrieve _urlopenerurlopen urlcleanup pathname2url url2pathname getproxiesz urllib.parse)quote quote_plusunquote unquote_plus urlencode splitattr splithost splitnport splitpasswd splitport splitquerysplittag splittype splituser splitvaluez urllib.errorContentTooShortError)rinstall_opener build_openerRequestOpenerDirector BaseHandlerHTTPDefaultErrorHandlerHTTPRedirectHandlerHTTPCookieProcessor ProxyHandlerHTTPPasswordMgrHTTPPasswordMgrWithDefaultRealmAbstractBasicAuthHandlerHTTPBasicAuthHandlerProxyBasicAuthHandlerAbstractDigestAuthHandlerHTTPDigestAuthHandlerProxyDigestAuthHandler HTTPHandler HTTPSHandler FileHandler FTPHandlerCacheFTPHandlerUnknownHandlerURLError HTTPError)urlliburllib2r?r>c #Kt}tD]P\}}|D]H}|\}}t|}d|d|dVd|d|d|dVd|zVd |zVd |d |d VIQdS) Nzimport_name< 'import' (module=zB | dotted_as_names< any* module=z any* >) > zimport_from< 'from' mod_member=z* 'import' ( member=z | import_as_name< member=z] 'as' any > | import_as_names< members=any* >) > zIimport_from< 'from' module_star=%r 'import' star='*' > ztimport_name< 'import' dotted_as_name< module_as=%r 'as' any > > zpower< bare_with_attr=z trailer< '.' 
member=z > any* > )setMAPPINGitemsr)bare old_modulechangeschange new_modulememberss 1/usr/lib64/python3.11/lib2to3/fixes/fix_urllib.py build_patternrL0s 55D&}}.. G . .F"( J ))GG$ZZZ1 1 1 1 1 $WWWggg7 7 7 7"# # # #"# # # # # $WWW. . . . .! ...c,eZdZdZdZdZdZdZdS) FixUrllibcDdtS)N|)joinrL)selfs rKrLzFixUrllib.build_patternIsxx (((rMc|d}|j}g}t|jddD]:}|t |d|t g;|t t|jdd|||dS)zTransform for the basic import case. Replaces the old import name with a comma separated list of its replacements. moduleNrprefix) getrXrCvalueextendrrappendreplace)rSnoderesults import_modprefnamesnames rKtransform_importzFixUrllib.transform_importLs [[**  J,-crc2 @ @D LL$tAwt444egg> ? ? ? ? T'*"23B7:4HHHIII5!!!!!rMc|d}|j}|d}|rt|tr|d}d}t|jD]}|j|dvr |d}n|r&|t||dS||ddSg}i} |d} | D]}|j tj kr%|j d j} |j dj} n |j} d} | d krst|jD]`}| |dvrT|d| vr| |d| |dg |ag} t|}d }d }|D]}| |}g}|dd D]B}||||| t#C|||d |t%||}|r|jj|r||_| |d}| rdg}| dd D]%}||t+g&| | d ||dS||ddS)zTransform for imports of specific module elements. Replaces the module to be imported from with the appropriate new module. mod_membermemberrNr@rW!This is an invalid module elementrJ,TcL|jtjkryt|jdj||jd|jdg}ttj|gSt|j|gS)NrrWr@ri)typer import_as_namerchildrenrZcloner )rcrXkidss rK handle_namez/FixUrllib.transform_member..handle_names9 333 q!1!7GGG M!,2244 M!,22446D!!4d;;<<TZ77788rMrVFzAll module elements are invalid)rYrX isinstancelistrCrZr]rcannot_convertrlr rmrnr\ setdefaultr r[rrparentendswithr)rSr^r_rfrargnew_namerHmodulesmod_dictrJas_name member_name new_nodes indentationfirstrqrUeltsrbeltnewnodesnew_nodes rKtransform_memberzFixUrllib.transform_member\s` [[..  X&& @ M&$'' #H!*"23  <6!9,,%ayHE- O""4#>#>#>?????##D*MNNNNN GHi(G! N N;$"555$oa06G"(/!"4":KK"(,K"G#%%")**:";NN&&)33%ay88 'vay 9 9 9$//q 2>>EEfMMMI*400KE 9 9 9"  '9**CLLS$!7!7888LL)))) [[b488999 //- 2 ; ;K H H-!,CJ  %%% M )#2#88HLL(GII!67777 Yr]+++ U#######D*KLLLLLrMcz|d}|d}d}t|tr|d}t|jD]}|j|dvr |d}n|r+|t ||jdS||ddS)z.Transform for calls to module members in code.bare_with_attrrgNrr@rWrh) rYrrrsrCrZr]rrXrt)rSr^r_ module_dotrgrxrHs rK transform_dotzFixUrllib.transform_dots[[!122 X&& fd # # AYFj./  F|vay((!!9)  K   tH+5+< > > > ? ? ? ? ?   &I J J J J JrMc|dr|||dS|dr|||dS|dr|||dS|dr||ddS|dr||ddSdS)NrUrfr module_starzCannot handle star imports. module_asz#This module is now multiple modules)rYrdrrrt)rSr^r_s rK transformzFixUrllib.transforms ;;x M  ! !$ 0 0 0 0 0 [[ & & M  ! !$ 0 0 0 0 0 [[) * * M   tW - - - - - [[ ' ' M   &C D D D D D [[ % % M   &K L L L L L M MrMN)__name__ __module__ __qualname__rLrdrrrrMrKrOrOGsn)))""" JMJMJMXKKK" M M M M MrMrON)__doc__lib2to3.fixes.fix_importsrrlib2to3.fixer_utilrrrrr r r rCr\rLrOrrMrKrs=<<<<<<<>>>>>>>>>>>>>>>>>>"CCCD ???@  +,. 
/" ' ' ' ( -/   B '(+A.///....}M}M}M}M}M }M}M}M}M}MrMfixes/__pycache__/__init__.cpython-311.opt-1.pyc000064400000000234151027012300015260 0ustar00 !A?h/dS)Nr//usr/lib64/python3.11/lib2to3/fixes/__init__.pyrsrfixes/__pycache__/fix_buffer.cpython-311.opt-2.pyc000064400000001776151027012300015655 0ustar00 !A?hNF ddlmZddlmZGddejZdS)) fixer_base)Namec eZdZdZdZdZdZdS) FixBufferTzR power< name='buffer' trailer< '(' [any] ')' > any* > ch|d}|td|jdS)Nname memoryview)prefix)replacerr )selfnoderesultsrs 1/usr/lib64/python3.11/lib2to3/fixes/fix_buffer.py transformzFixBuffer.transforms2v T,t{;;;<<<<<N)__name__ __module__ __qualname__ BM_compatibleexplicitPATTERNrrrrr s4MHG=====rrN)r fixer_utilrBaseFixrrrrrsg; = = = = = " = = = = =rfixes/__pycache__/fix_future.cpython-311.pyc000064400000001761151027012300014750 0ustar00 !A?h#HdZddlmZddlmZGddejZdS)zVRemove __future__ imports from __future__ import foo is replaced with an empty line. ) fixer_base) BlankLinec eZdZdZdZdZdZdS) FixFutureTz;import_from< 'from' module_name="__future__" 'import' any > c:t}|j|_|S)N)rprefix)selfnoderesultsnews 1/usr/lib64/python3.11/lib2to3/fixes/fix_future.py transformzFixFuture.transformskk[  N)__name__ __module__ __qualname__ BM_compatiblePATTERN run_orderrrrrr s4MOGIrrN)__doc__r fixer_utilrBaseFixrrrrrsl""""""      "     rfixes/__pycache__/fix_asserts.cpython-311.pyc000064400000003317151027012300015121 0ustar00 !A?hpdZddlmZddlmZedddddd d dddddd d ZGddeZdS)z5Fixer that replaces deprecated unittest method names.)BaseFix)Name assertTrue assertEqualassertNotEqualassertAlmostEqualassertNotAlmostEqual assertRegexassertRaisesRegex assertRaises assertFalse)assert_ assertEqualsassertNotEqualsassertAlmostEqualsassertNotAlmostEqualsassertRegexpMatchesassertRaisesRegexpfailUnlessEqual failIfEqualfailUnlessAlmostEqualfailIfAlmostEqual failUnlessfailUnlessRaisesfailIfcXeZdZddeeezZdZdS) FixAssertszH power< any+ trailer< '.' meth=(%s)> any* > |c|dd}|ttt||jdS)Nmeth)prefix)replacerNAMESstrr")selfnoderesultsnames 2/usr/lib64/python3.11/lib2to3/fixes/fix_asserts.py transformzFixAsserts.transform sBvq! T%D *4;???@@@@@N) __name__ __module__ __qualname__joinmapreprr$PATTERNr+r,r*rrsOHHSSu--../GAAAAAr,rN)__doc__ fixer_baser fixer_utilrdictr$rr4r,r*r9s;;!   $*0%*! -,#    $AAAAAAAAAAr,fixes/__pycache__/fix_dict.cpython-311.opt-1.pyc000064400000011464151027012300015321 0ustar00 !A?hdZddlmZddlmZddlmZddlmZmZmZddlmZej dhzZ Gdd ej Z d S) ajFixer for dict methods. d.keys() -> list(d.keys()) d.items() -> list(d.items()) d.values() -> list(d.values()) d.iterkeys() -> iter(d.keys()) d.iteritems() -> iter(d.items()) d.itervalues() -> iter(d.values()) d.viewkeys() -> d.keys() d.viewitems() -> d.items() d.viewvalues() -> d.values() Except in certain very specific contexts: the iter() can be dropped when the context is list(), sorted(), iter() or for...in; the list() can be dropped when the context is list() or sorted() (but not iter() or for...in!). Special contexts that apply to both: list(), sorted(), tuple() set(), any(), all(), sum(). Note: iter(d.keys()) could be written as iter(d) but since the original d.iterkeys() was also redundant we don't fix this. And there are (rare) contexts where it makes a difference (e.g. when passing it as an argument to a function that introspects the argument). )pytree)patcomp) fixer_base)NameCallDot) fixer_utilitercjeZdZdZdZdZdZejeZ dZ eje Z dZ dS)FixDictTa power< head=any+ trailer< '.' 
method=('keys'|'items'|'values'| 'iterkeys'|'iteritems'|'itervalues'| 'viewkeys'|'viewitems'|'viewvalues') > parens=trailer< '(' ')' > tail=any* > c |d}|dd}|d}|j}|j}|d}|d} |s| r |dd}d|D}d |D}| o|||} |t j|jtt||j g|d  gz} t j|j | } | s+| s)d | _ tt|rdnd | g} |rt j|j | g|z} |j | _ | S)Nheadmethodtailr viewc6g|]}|Sclone.0ns //usr/lib64/python3.11/lib2to3/fixes/fix_dict.py z%FixDict.transform..A (((a (((c6g|]}|Srrrs rrz%FixDict.transform..Brr)prefixparenslist) symsvalue startswithin_special_contextrNodetrailerrrr rpowerr) selfnoderesultsrrrr$ method_nameisiterisviewspecialargsnews r transformzFixDict.transform6sv"1%vyl ''//''//  *V *%abb/K((4(((((4((((Dt66tVDDv{4<$'EE$(06 %?%?%?$@AAx(..00 22 k$*d++ B6 BCJtf8FF&99C5AAC  8+dj3%$,77C[  rz3power< func=NAME trailer< '(' node=any ')' > any* >zmfor_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > cH|jdSi}|jj^|j|jj|r9|d|ur/|r|djtvS|djt jvS|sdS|j|j|o |d|uS)NFr,func)parentp1matchr% iter_exemptr consuming_callsp2)r+r,r/r-s rr'zFixDict.in_special_contextZs ; 5 K  *w}}T[/99 +v$&& Kv, ;;v, 0JJJ 5w}}T['22Nwv$7NNrN) __name__ __module__ __qualname__ BM_compatiblePATTERNr4P1rcompile_patternr8P2r<r'rrrr r )swMG8 ?B   $ $B B !  $ $BOOOOOrr N) __doc__r"rrrr rrrr;r:BaseFixr rrrrGs6(((((((((((F83 AOAOAOAOAOj AOAOAOAOAOrfixes/__pycache__/fix_import.cpython-311.pyc000064400000011276151027012300014752 0ustar00 !A?h ndZddlmZddlmZmZmZmZddlm Z m Z m Z dZ Gddej Zd S) zFixer for import statements. If spam is being imported from the local directory, this import: from spam import eggs Becomes: from .spam import eggs And this import: import spam Becomes: from . import spam ) fixer_base)dirnamejoinexistssep) FromImportsymstokenc#K|g}|r|}|jtjkr |jVn|jt jkr'dd|jDVn~|jt j kr!| |jdnH|jt j kr$| |jdddntd|dSdS)zF Walks over all the names imported in a dotted_as_names node. cg|] }|j S)value).0chs 1/usr/lib64/python3.11/lib2to3/fixes/fix_import.py z$traverse_imports..s<<<28<<<rNzunknown node type)poptyper NAMErr dotted_namerchildrendotted_as_nameappenddotted_as_namesextendAssertionError)namespendingnodes rtraverse_importsr$s gG  6{{}} 9 " "*     Y$* * *''< | import_name< 'import' imp=any > cvtt|||d|jv|_dS)Nabsolute_import)superr& start_treefuture_featuresskip)selftreename __class__s rr*zFixImport.start_tree/s6 i))$555%)== rc|jrdS|d}|jtjkrnt |ds|jd}t |d||jr%d|jz|_|dSdSd}d}t|D]}||rd}d}|r|r| |ddStd|g}|j |_ |S)Nimprr.FTz#absolute and local imports together) r,rr import_fromhasattrrprobably_a_local_importrchangedr$warningr prefix)r-r#resultsr2 have_local have_absolutemod_namenews r transformzFixImport.transform3s0 9  Fen 9( ( ( c7++ &l1oc7++ &++CI66 #)O    J!M,S11 ) )//99)!%JJ$(MM NLL'LMMMS3%((CCJJrcV|drdS|ddd}t|j}t ||}t t t|dsdSdt ddd d fD]}t ||zrd SdS) Nr3Frz __init__.pyz.pyz.pycz.soz.slz.pydT) startswithsplitrfilenamerrr)r-imp_name base_pathexts rr6z!FixImport.probably_a_local_importUs   s # # 5>>#q))!,DM** H-- d79--}==>> 53uf=  Ci#o&& tt ur) __name__ __module__ __qualname__ BM_compatiblePATTERNr*r?r6 __classcell__)r0s@rr&r&&scMG >>>>>   Drr&N)__doc__r ros.pathrrrr fixer_utilr r r r$BaseFixr&rrrrRs  ............0000000000666&===== "=====rfixes/__pycache__/fix_paren.cpython-311.opt-1.pyc000064400000003364151027012300015503 0ustar00 !A?hLdZddlmZddlmZmZGddejZdS)ztFixer that adds parentheses where they are required This converts ``[x for x in 1, 2]`` to ``[x for x in (1, 2)]``.) 
fixer_base)LParenRParenceZdZdZdZdZdS)FixParenTa atom< ('[' | '(') (listmaker< any comp_for< 'for' NAME 'in' target=testlist_safe< any (',' any)+ [','] > [any] > > | testlist_gexp< any comp_for< 'for' NAME 'in' target=testlist_safe< any (',' any)+ [','] > [any] > >) (']' | ')') > c|d}t}|j|_d|_|d||t dS)Ntarget)rprefix insert_child append_childr)selfnoderesultsr lparens 0/usr/lib64/python3.11/lib2to3/fixes/fix_paren.py transformzFixParen.transform%sY"   Av&&&FHH%%%%%N)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrr s/MG,&&&&&rrN)__doc__r r fixer_utilrrBaseFixrrrrrstCC'''''''' & & & & &z! & & & & &rfixes/__pycache__/fix_sys_exc.cpython-311.opt-2.pyc000064400000004020151027012300016042 0ustar00 !A?h ^ ddlmZddlmZmZmZmZmZmZm Z Gddej Z dS)) fixer_base)AttrCallNameNumber SubscriptNodesymscdeZdZgdZdZdddeDzZdZdS) FixSysExc)exc_type exc_value exc_tracebackTzN power< 'sys' trailer< dot='.' attribute=(%s) > > |c# K|] }d|zV dS)z'%s'N).0es 2/usr/lib64/python3.11/lib2to3/fixes/fix_sys_exc.py zFixSysExc.s&::AVaZ::::::c|dd}t|j|j}t t d|j}tt d|}|dj|djd_| t|ttj ||jS)N attributeexc_info)prefixsysdot)rrindexvaluerrrrchildrenappendrr r power)selfnoderesultssys_attrr callattrs r transformzFixSysExc.transforms;'*t}**8>::;;D$$X_===DKK&&%,U^%:Q" Ie$$%%%DJT[9999rN)__name__ __module__ __qualname__r BM_compatiblejoinPATTERNr+rrrr r s]999HMHH:::::::;G:::::rr N) r fixer_utilrrrrrr r BaseFixr rrrr5sHHHHHHHHHHHHHHHHHH::::: ":::::rfixes/__pycache__/fix_nonzero.cpython-311.opt-1.pyc000064400000002264151027012300016066 0ustar00 !A?hOHdZddlmZddlmZGddejZdS)z*Fixer for __nonzero__ -> __bool__ methods.) fixer_base)NameceZdZdZdZdZdS) FixNonzeroTz classdef< 'class' any+ ':' suite< any* funcdef< 'def' name='__nonzero__' parameters< '(' NAME ')' > any+ > any* > > cl|d}td|j}||dS)Nname__bool__)prefix)rr replace)selfnoderesultsrnews 2/usr/lib64/python3.11/lib2to3/fixes/fix_nonzero.py transformzFixNonzero.transforms7v:dk222 SN)__name__ __module__ __qualname__ BM_compatiblePATTERNrrrrrs/MGrrN)__doc__r fixer_utilrBaseFixrrrrrsh00     #     rfixes/__pycache__/fix_ws_comma.cpython-311.pyc000064400000003143151027012300015237 0ustar00 !A?hBTdZddlmZddlmZddlmZGddejZdS)zFixer that changes 'a ,b' into 'a, b'. This also changes '{a :b}' into '{a: b}', but does not touch other uses of colons. It does not touch other uses of whitespace. )pytree)token) fixer_basec|eZdZdZdZejejdZejej dZ ee fZ dZ dS) FixWsCommaTzH any<(not(',') any)+ ',' ((not(',') any)+ ',')* [not(',') any]> ,:c|}d}|jD]H}||jvr)|j}|r d|vrd|_d}4|r|j}|sd|_d}I|S)NF T )clonechildrenSEPSprefixisspace)selfnoderesultsnewcommachildrs 3/usr/lib64/python3.11/lib2to3/fixes/fix_ws_comma.py transformzFixWsComma.transformsjjll\  E !!>>##&F(:(:#%EL+"\F!+'*  N) __name__ __module__ __qualname__explicitPATTERNrLeafrCOMMACOLONrrrrrr sdHG FK S ) )E FK S ) )E 5>DrrN)__doc__r rpgen2rrBaseFixrr$rrr(s~#rfixes/__pycache__/__init__.cpython-311.opt-2.pyc000064400000000234151027012300015261 0ustar00 !A?h/dS)Nr//usr/lib64/python3.11/lib2to3/fixes/__init__.pyrsrfixes/__pycache__/fix_throw.cpython-311.opt-2.pyc000064400000005147151027012300015543 0ustar00 !A?h.n ddlmZddlmZddlmZddlmZmZmZm Z m Z Gddej Z dS))pytree)token) fixer_base)NameCallArgListAttris_tupleceZdZdZdZdZdS)FixThrowTz power< any trailer< '.' 'throw' > trailer< '(' args=arglist< exc=any ',' val=any [',' tb=any] > ')' > > | power< any trailer< '.' 
[binary data omitted: compiled CPython 3.11 bytecode caches
(fixes/__pycache__/*.cpython-311*.pyc) for fix_throw, fix_imports, fix_repr,
fix_itertools_imports, fix_imports2, fix_isinstance, fix_zip, fix_unicode,
fix_ws_comma, fix_intern, fix_input, fix_long, fix_nonzero, fix_ne,
fix_reduce, fix_has_key, fix_funcattrs, fix_exitfunc and fix_operator.
These caches duplicate the corresponding fixes/*.py sources.]

fixes/fix_xreadlines.py
"""Fix "for x in f.xreadlines()" -> "for x in f".

This fixer will also convert g(f.xreadlines) into g(f.__iter__)."""
# Author: Collin Winter

# Local imports
from .. import fixer_base
from ..fixer_util import Name


class FixXreadlines(fixer_base.BaseFix):
    BM_compatible = True
    PATTERN = """
    power< call=any+ trailer< '.' 'xreadlines' > trailer< '(' ')' > >
    |
    power< any+ trailer< '.' no_call='xreadlines' > >
    """

    def transform(self, node, results):
        no_call = results.get("no_call")
        if no_call:
            no_call.replace(Name("__iter__", prefix=no_call.prefix))
        else:
            node.replace([x.clone() for x in results["call"]])
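# Illustrative usage (not part of the original archive): a minimal driver
# showing the rewrite FixXreadlines performs. The stock module path
# "lib2to3.fixes.fix_xreadlines" is assumed, and lib2to3 itself ships with
# CPython only up to 3.12.
from lib2to3.refactor import RefactoringTool

tool = RefactoringTool(["lib2to3.fixes.fix_xreadlines"])
# refactor_string() expects parseable source that ends with a newline.
print(tool.refactor_string("for line in f.xreadlines():\n    pass\n", "<example>"))
# Prints:
#   for line in f:
#       pass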
fixes/fix_throw.py
"""Fixer for generator.throw(E, V, T).

g.throw(E)       -> g.throw(E)
g.throw(E, V)    -> g.throw(E(V))
g.throw(E, V, T) -> g.throw(E(V).with_traceback(T))

g.throw("foo"[, V[, T]]) will warn about string exceptions."""
# Author: Collin Winter

# Local imports
from .. import pytree
from ..pgen2 import token
from .. import fixer_base
from ..fixer_util import Name, Call, ArgList, Attr, is_tuple


class FixThrow(fixer_base.BaseFix):
    BM_compatible = True
    PATTERN = """
    power< any trailer< '.' 'throw' >
           trailer< '(' args=arglist< exc=any ',' val=any [',' tb=any] > ')' >
    >
    |
    power< any trailer< '.' 'throw' > trailer< '(' exc=any ')' > >
    """

    def transform(self, node, results):
        syms = self.syms

        exc = results["exc"].clone()
        if exc.type is token.STRING:
            self.cannot_convert(node, "Python 3 does not support string exceptions")
            return

        # Leave "g.throw(E)" alone
        val = results.get("val")
        if val is None:
            return

        val = val.clone()
        if is_tuple(val):
            args = [c.clone() for c in val.children[1:-1]]
        else:
            val.prefix = ""
            args = [val]

        throw_args = results["args"]

        if "tb" in results:
            tb = results["tb"].clone()
            tb.prefix = ""

            e = Call(exc, args)
            with_tb = Attr(e, Name('with_traceback')) + [ArgList([tb])]
            throw_args.replace(pytree.Node(syms.power, with_tb))
        else:
            throw_args.replace(Call(exc, args))
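# Illustrative usage (not part of the original archive): the three-argument
# throw() form is the interesting case for FixThrow. A hedged sketch assuming
# the stock module path "lib2to3.fixes.fix_throw":
from lib2to3.refactor import RefactoringTool

tool = RefactoringTool(["lib2to3.fixes.fix_throw"])
print(tool.refactor_string("g.throw(E, V, T)\n", "<example>"))
# Prints:
#   g.throw(E(V).with_traceback(T))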
fixes/fix_numliterals.py
"""Fixer that turns 1L into 1, 0755 into 0o755.
"""
# Copyright 2007 Georg Brandl.
# Licensed to PSF under a Contributor Agreement.

# Local imports
from ..pgen2 import token
from .. import fixer_base
from ..fixer_util import Number


class FixNumliterals(fixer_base.BaseFix):
    # This is so simple that we don't need the pattern compiler.

    _accept_type = token.NUMBER

    def match(self, node):
        # Override
        return (node.value.startswith("0") or node.value[-1] in "Ll")

    def transform(self, node, results):
        val = node.value
        if val[-1] in 'Ll':
            val = val[:-1]
        elif val.startswith('0') and val.isdigit() and len(set(val)) > 1:
            val = "0o" + val[1:]

        return Number(val, prefix=node.prefix)

fixes/fix_imports2.py
"""Fix incompatible imports and module references that must be fixed after
fix_imports."""
from . import fix_imports


MAPPING = {
    'whichdb': 'dbm',
    'anydbm': 'dbm',
}


class FixImports2(fix_imports.FixImports):
    run_order = 7

    mapping = MAPPING
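# Illustrative usage (not part of the original archive): FixNumliterals
# rewrites old-style long and octal literals, and FixImports2 maps whichdb
# and anydbm onto dbm (its run_order of 7 schedules it after the main
# fix_imports pass). The module paths below are the stock ones, assumed here.
from lib2to3.refactor import RefactoringTool

tool = RefactoringTool(["lib2to3.fixes.fix_numliterals",
                        "lib2to3.fixes.fix_imports2"])
print(tool.refactor_string("import anydbm\nx = 0755\ny = 1L\n", "<example>"))
# Prints:
#   import dbm
#   x = 0o755
#   y = 1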
""" pending = [names] while pending: node = pending.pop() if node.type == token.NAME: yield node.value elif node.type == syms.dotted_name: yield "".join([ch.value for ch in node.children]) elif node.type == syms.dotted_as_name: pending.append(node.children[0]) elif node.type == syms.dotted_as_names: pending.extend(node.children[::-2]) else: raise AssertionError("unknown node type") class FixImport(fixer_base.BaseFix): BM_compatible = True PATTERN = """ import_from< 'from' imp=any 'import' ['('] any [')'] > | import_name< 'import' imp=any > """ def start_tree(self, tree, name): super(FixImport, self).start_tree(tree, name) self.skip = "absolute_import" in tree.future_features def transform(self, node, results): if self.skip: return imp = results['imp'] if node.type == syms.import_from: # Some imps are top-level (eg: 'import ham') # some are first level (eg: 'import ham.eggs') # some are third level (eg: 'import ham.eggs as spam') # Hence, the loop while not hasattr(imp, 'value'): imp = imp.children[0] if self.probably_a_local_import(imp.value): imp.value = "." + imp.value imp.changed() else: have_local = False have_absolute = False for mod_name in traverse_imports(imp): if self.probably_a_local_import(mod_name): have_local = True else: have_absolute = True if have_absolute: if have_local: # We won't handle both sibling and absolute imports in the # same statement at the moment. self.warning(node, "absolute and local imports together") return new = FromImport(".", [imp]) new.prefix = node.prefix return new def probably_a_local_import(self, imp_name): if imp_name.startswith("."): # Relative imports are certainly not local imports. return False imp_name = imp_name.split(".", 1)[0] base_path = dirname(self.filename) base_path = join(base_path, imp_name) # If there is no __init__.py next to the file its not in a package # so can't be a relative import. if not exists(join(dirname(base_path), "__init__.py")): return False for ext in [".py", sep, ".pyc", ".so", ".sl", ".pyd"]: if exists(base_path + ext): return True return False __pycache__/btm_utils.cpython-311.opt-2.pyc000064400000023472151027012300014417 0ustar00 !A?h& ddlmZddlmZmZddlmZmZeZeZ ej Z eZ dZ dZdZGddeZd d Zd Zd Zd S))pytree)grammartoken)pattern_symbolspython_symbolsc0eZdZ ddZdZdZdZdZdS)MinNodeNch||_||_g|_d|_d|_g|_g|_dS)NF)typenamechildrenleafparent alternativesgroup)selfrrs */usr/lib64/python3.11/lib2to3/btm_utils.py__init__zMinNode.__init__s8      cZt|jdzt|jzS)N )strrr)rs r__repr__zMinNode.__repr__s"49~~#c$)nn44rc |}g}|r^|jtkrr|j|t |jt |jkr$t |jg}g|_|j}{|j}d}n|jtkrq|j |t |j t |jkr#t|j }g|_ |j}|j}d}n[|jtj kr"|j r||j n||j|j}|^|SN)rTYPE_ALTERNATIVESrappendlenrtupler TYPE_GROUPrget_characteristic_subpattern token_labelsNAMEr)rnodesubps r leaf_to_rootzMinNode.leaf_to_root!s_ 7! y---!((...t())S-?-???!$"3445D(*D%;D;DDyJ&& !!$'''tz??c$-&8&8888DDD!#DJ;D;DDyL---$)- DI&&&& DI&&&;DC! 
D rcj |D]}|}|r|cSdSr)leavesr))rlr(s rget_linear_subpatternzMinNode.get_linear_subpatternKsO   A>>##D     rc#nK |jD]}|Ed{V|js|VdSdSr)rr+)rchilds rr+zMinNode.leaves`s^7] & &E||~~ % % % % % % % %} JJJJJ  r)NN)__name__ __module__ __qualname__rrr)r-r+rrr r sj555(((T*rr Nc d}|jtjkr |jd}|jtjkrt |jdkrt |jd|}nstt}|jD]L}|j |dzr t ||}||j |Mn|jtj krt |jdkrVtt}|jD].}t ||}|r|j |/|jsd}nt |jd|}nh|jtj krRt|jdtjr1|jdjdkrt |jd|St|jdtjr|jdjdksIt |jdkr3t%|jddr|jdjdkrdSd}d}d}d }d} d } |jD]j}|jtjkrd }|}n1|jtjkrd}|} n|jtjkr|}t%|dr |jd krd} k| r6|jd} t%| dr| jdkr |jd } n |jd} | jt*jkr| jd krtt.}nt%t*| jr)tt1t*| j}ntt1t2| j}n| jt*jkr[| jd } | t8vrtt8| }nAtt*j| }n%| jtjkrt ||}|r7| jdjdkrd}n| jdjdkrnt:|r@|>|jddD].}t ||}||j |/|r||_|S)N)rr([valueTF=any')rr*+r)rsymsMatcherr Alternativesr! reduce_treer rindexr Alternativer#Unit isinstancerLeafr9hasattrDetailsRepeaterr%r&TYPE_ANYgetattrpysymsSTRINGstriptokensNotImplementedErrorr) r'rnew_noder/reducedr details_nodealternatives_node has_repeater repeater_nodehas_variable_name name_leafrs rrCrCgsH yDL  }Q yD%%% t}   " ""4=#3V<\. . .%''"111<99N&GL)/,R,RSSSHH&GFIO,L,LMMMHH ^|2 2 2?((--Dv~~"t 555" (9EEE ^t0 0 0"#4f==H  *%a(.#55'*0C77*)  6H0%.qt4 6 6%eX66&%,,W555!  Orc t|ts|St|dkr|dSg}g}gdg}d|D]}tt |drtt |fdr||Vtt |fdr|||||r|}n |r|}n|r|}t |tS) Nrr5)inforifnotNonez[]().,:c.t|tuSr)rr)xs rz/get_characteristic_subpattern..sd1ggnrc6t|to|vSrrGr)rb common_charss rrcz/get_characteristic_subpattern..sjC&8&8&NQ,=Nrc6t|to|vSrre)rb common_namess rrcz/get_characteristic_subpattern..s 1c(:(:(PqL?Pr)key)rGlistr!r<rec_testr max) subpatternssubpatterns_with_namessubpatterns_with_common_namessubpatterns_with_common_chars subpatternrfrhs @@rr$r$st k4 ( ( ;1~ $&!666L$&!L! : : x $<$<== > > :8JNNNNPPQQ :-44Z@@@@XjPPPPRRSS :-44Z@@@@'--j9994, &43 &43 { $ $ $$rc#K |D]B}t|ttfrt||Ed{V5||VCdSr)rGrjr"rk)sequence test_funcrbs rrkrksv  a$ ' ' 9-- - - - - - - - -)A,,     rr)rpgen2rrpygramrrr@rNopmaprQr%rLrr#objectr rCr$rkr3rrrzs2!!!!!!!!33333333     UUUUUfUUUnBBBBJ#%#%#%Jr__pycache__/btm_utils.cpython-311.pyc000064400000026374151027012300013463 0ustar00 !A?h&dZddlmZddlmZmZddlmZmZeZ eZ ej Z eZ dZdZdZGdd eZdd Zd Zd Zd S)z0Utility functions used by the btm_matcher module)pytree)grammartoken)pattern_symbolspython_symbolsc2eZdZdZddZdZdZdZdZdS) MinNodezThis class serves as an intermediate representation of the pattern tree during the conversion to sets of leaf-to-root subpatternsNch||_||_g|_d|_d|_g|_g|_dS)NF)typenamechildrenleafparent alternativesgroup)selfrrs */usr/lib64/python3.11/lib2to3/btm_utils.py__init__zMinNode.__init__s8      cZt|jdzt|jzS)N )strrr)rs r__repr__zMinNode.__repr__s"49~~#c$)nn44rc|}g}|r^|jtkrr|j|t |jt |jkr$t |jg}g|_|j}{|j}d}n|jtkrq|j |t |j t |jkr#t|j }g|_ |j}|j}d}n[|jtj kr"|j r||j n||j|j}|^|S)zInternal method. Returns a characteristic path of the pattern tree. This method must be run for all leaves until the linear subpatterns are merged into a singleN)rTYPE_ALTERNATIVESrappendlenrtupler TYPE_GROUPrget_characteristic_subpattern token_labelsNAMEr)rnodesubps r leaf_to_rootzMinNode.leaf_to_root!sZ! y---!((...t())S-?-???!$"3445D(*D%;D;DDyJ&& !!$'''tz??c$-&8&8888DDD!#DJ;D;DDyL---$)- DI&&&& DI&&&;DC! D rch|D]}|}|r|cSdS)aDrives the leaf_to_root method. The reason that leaf_to_root must be run multiple times is because we need to reject 'group' matches; for example the alternative form (a | b c) creates a group [b c] that needs to be matched. 
Since matching multiple linear patterns overcomes the automaton's capabilities, leaf_to_root merges each group into a single choice based on 'characteristic'ity, i.e. (a|b c) -> (a|b) if b more characteristic than c Returns: The most 'characteristic'(as defined by get_characteristic_subpattern) path for the compiled pattern tree. N)leavesr()rlr's rget_linear_subpatternzMinNode.get_linear_subpatternKsJ   A>>##D     rc#lK|jD]}|Ed{V|js|VdSdS)z-Generator that returns the leaves of the treeN)rr*)rchilds rr*zMinNode.leaves`s[] & &E||~~ % % % % % % % %} JJJJJ  r)NN) __name__ __module__ __qualname____doc__rrr(r,r*rrr r so555(((T*rr Nc d}|jtjkr |jd}|jtjkrt |jdkrt |jd|}nstt}|jD]L}|j |dzr t ||}||j |Mn|jtj krt |jdkrVtt}|jD].}t ||}|r|j |/|jsd}nt |jd|}nh|jtj krRt|jdtjr1|jdjdkrt |jd|St|jdtjr|jdjdksIt |jdkr3t%|jddr|jdjdkrdSd }d}d}d }d} d } |jD]j}|jtjkrd }|}n1|jtjkrd }|} n|jtjkr|}t%|dr |jd krd } k| r6|jd} t%| dr| jdkr |jd } n |jd} | jt*jkr| jd krtt.}nt%t*| jr)tt1t*| j}ntt1t2| j}n| jt*jkr[| jd} | t8vrtt8| }nAtt*j| }n%| jtjkrt ||}|r7| jdjdkrd}n| jdjdkrnt:|r@|>|jddD].}t ||}||j |/|r||_|S)z Internal function. Reduces a compiled pattern tree to an intermediate representation suitable for feeding the automaton. This also trims off any optional pattern elements(like [a], a*). N)rr([valueTF=any')rr*+r)rsymsMatcherr Alternativesr reduce_treer rindexr Alternativer"Unit isinstancerLeafr9hasattrDetailsRepeaterr$r%TYPE_ANYgetattrpysymsSTRINGstriptokensNotImplementedErrorr) r&rnew_noder.reducedr details_nodealternatives_node has_repeater repeater_nodehas_variable_name name_leafrs rrCrCgsH yDL  }Q yD%%% t}   " ""4=#3V<\. . .%''"111<99N&GL)/,R,RSSSHH&GFIO,L,LMMMHH ^|2 2 2?((--Dv~~"t 555" (9EEE ^t0 0 0"#4f==H  *%a(.#55'*0C77*)  6H0%.qt4 6 6%eX66&%,,W555!  Orct|ts|St|dkr|dSg}g}gdg}d|D]}tt |drtt |fdr||Vtt |fdr|||||r|}n |r|}n|r|}t |tS) zPicks the most characteristic from a list of linear patterns Current order used is: names > common_names > common_chars rr5)inforifnotNonez[]().,:c.t|tuSN)rr)xs rz/get_characteristic_subpattern..sd1ggnrc6t|to|vSrbrGr)rc common_charss rrdz/get_characteristic_subpattern..sjC&8&8&NQ,=Nrc6t|to|vSrbrf)rc common_namess rrdz/get_characteristic_subpattern..s 1c(:(:(PqL?Pr)key)rGlistr r<rec_testrmax) subpatternssubpatterns_with_namessubpatterns_with_common_namessubpatterns_with_common_chars subpatternrgris @@rr#r#so k4 ( ( ;1~ $&!666L$&!L! : : x $<$<== > > :8JNNNNPPQQ :-44Z@@@@XjPPPPRRSS :-44Z@@@@'--j9994, &43 &43 { $ $ $$rc#K|D]B}t|ttfrt||Ed{V5||VCdS)zPTests test_func on all items of sequence and items of included sub-iterablesN)rGrkr!rl)sequence test_funcrcs rrlrlss a$ ' ' 9-- - - - - - - - -)A,,     rrb)r2rpgen2rrpygramrrr@rNopmaprQr$rLrr"objectr rCr#rlr3rrr{s22!!!!!!!!33333333     UUUUUfUUUnBBBBJ#%#%#%Jr__pycache__/patcomp.cpython-311.opt-1.pyc000064400000023741151027012300014056 0ustar00 !A?hdZdZddlZddlmZmZmZmZmZm Z ddl m Z ddl m Z Gdd e Zd ZGd d eZejejejdd ZdZdZdZdS)zPattern compiler. The grammar is taken from PatternGrammar.txt. The compiler compiles a pattern to a pytree.*Pattern instance. 
z#Guido van Rossum N)driverliteralstokentokenizeparsegrammar)pytree)pygramceZdZdS)PatternSyntaxErrorN)__name__ __module__ __qualname__(/usr/lib64/python3.11/lib2to3/patcomp.pyr r sDrr c#Ktjtjtjh}t jt j|j}|D]}|\}}}}}||vr|VdS)z6Tokenizes a string suppressing significant whitespace.N) rNEWLINEINDENTDEDENTrgenerate_tokensioStringIOreadline) inputskiptokens quintupletypevaluestartend line_texts rtokenize_wrapperr%ss M5< 6D  %bk%&8&8&A B BF -6*eUC t  OOOrc2eZdZddZd dZdZddZdZdS) PatternCompilerNcL|#tj|_tj|_n7t j||_tj|j|_tj|_ tj |_ t j |jt|_dS)z^Initializer. Takes an optional alternative filename for the pattern grammar. N)convert)r pattern_grammarr pattern_symbolssymsr load_grammarSymbolspython_grammar pygrammarpython_symbolspysymsDriverpattern_convert)self grammar_files r__init__zPatternCompiler.__init__(su  !1DL.DII!.|<z0PatternCompiler.compile_node..Os'GGGbD%%b))GGGrNrcg|]}|gSrr)rFas rrHz0PatternCompiler.compile_node..Rs':':':':':':rminmaxc:g|]}|SrrDrEs rrHz0PatternCompiler.compile_node..Vs'CCCrT&&r**CCCr)r r,Matcherchildren Alternativeslenr WildcardPatternoptimize Alternative NegatedUnit compile_basicNegatedPatternrEQUALr!RepeaterSTARHUGEPLUSLBRACEget_intname) r5nodealtspunitspatternrdnodesrepeatrTchildrMrNs ` rr=zPatternCompiler.compile_nodeCs 9 ) ) )=#D 9 . . .GGGGDM##A#4FGGGD4yyA~~Aw&':':T':':':qIIIA::<<  9 - - -CCCCT]CCCE5zzQQx&wA1===A::<<  9 - - -((qrr):;;G%g..A::<<   u::??uQx} ;;8>D!""IE u::??uRy~1CCC2YF#2#JE$$UF33  HQKEzUZ''kuz))ku|++!LL!555cx==A%%,,x{33Caxx3!88!**,, 07)#3OOO  GL!!!rc|d}|jtjkrHtt j|j}tjt||S|jtj kr|j}| rS|tvrtd|z|ddrtdtjt|S|dkrd}n?|ds*t|j|d}|td|z|ddr(||djdg}nd}tj||S|jdkr||dS|jd kr4||d}tj|ggdd SdS) NrzInvalid token: %rrzCan't have details for tokenany_zInvalid symbol: %r([rL)r rSTRINGr<r evalStringr!r LeafPattern_type_of_literalNAMEisupper TOKEN_MAPr startswithgetattrr2r=rT NodePatternrW)r5rjrkrer!r content subpatterns rr[zPatternCompiler.compile_basicsQx 9 $ $+DJ7788E%&6u&=&=uEE E Y%* $ $JE}} 9 )),-@5-HIII9M,-KLLL))E*:;;;E>>DD))#..O"4;t<rs=3  EDDDDDDDDDDDDDDD        IIIIIfIIIZZ||   99966666r__pycache__/__main__.cpython-311.opt-2.pyc000064400000000475151027012300014133 0ustar00 !A?hCLddlZddlmZejeddS)N)mainz lib2to3.fixes)sysrexit)/usr/lib64/python3.11/lib2to3/__main__.pyr sB o  r__pycache__/main.cpython-311.opt-1.pyc000064400000035546151027012300013345 0ustar00 !A?hN.dZddlmZmZddlZddlZddlZddlZddlZddl Z ddl m Z dZ Gdde j Zd Zd d ZdS) z Main program for 2to3. )with_statementprint_functionN)refactorc |}|}tj||||dddS)z%Return a unified diff of two strings.z (original)z (refactored))lineterm) splitlinesdifflib unified_diff)abfilenames %/usr/lib64/python3.11/lib2to3/main.py diff_textsrsF A A  1h ,n)+ - - --c<eZdZdZ dfd ZdZfdZdZxZS)StdoutRefactoringToola2 A refactoring tool that can avoid overwriting its input files. Prints output to stdout. Output files can optionally be written to a different directory and or have an extra file suffix appended to their name for use in situations where you do not want to replace the input files. rc ||_||_|r.|tjs|tjz }||_||_||_tt| |||dS)aF Args: fixers: A list of fixers to import. options: A dict with RefactoringTool configuration. explicit: A list of fixers to run even if they are explicit. nobackups: If true no backup '.bak' files will be created for those files that are being refactored. show_diffs: Should diffs of the refactoring be printed to stdout? input_base_dir: The base directory for all input files. This class will strip this path prefix off of filenames before substituting it with output_dir. Only meaningful if output_dir is supplied. 
All files processed by refactor() must start with this path. output_dir: If supplied, all converted files will be written into this directory tree instead of input_base_dir. append_suffix: If supplied, all files output by this tool will have this appended to their filename. Useful for changing .py to .py3 for example by passing append_suffix='3'. N) nobackups show_diffsendswithossep_input_base_dir _output_dir_append_suffixsuperr__init__) selffixersoptionsexplicitrrinput_base_dir output_dir append_suffix __class__s rrzStdoutRefactoringTool.__init__$s(#$  %."9"9"&"A"A % bf $N-%+ #T**33FGXNNNNNrcl|j|||f|jj|g|Ri|dSN)errorsappendloggererror)r msgargskwargss r log_errorzStdoutRefactoringTool.log_errorAsJ Cv./// #/////////rc|}|jrt||jr@tj|j|t |jd}ntd|d|j|jr ||jz }||krktj |}tj |s|rtj || d|||j s|dz}tj|r< tj|n&#t $r| d|YnwxYw tj||n'#t $r| d||YnwxYwt%t&|j}||||||j st+j||||krt+j||dSdS)Nz filename z( does not start with the input_base_dir zWriting converted %s to %s.z.bakzCan't remove backup %szCan't rename %s to %s)r startswithrrpathjoinlen ValueErrorrdirnameisdirmakedirs log_messagerlexistsremoveOSErrorrenamerr write_fileshutilcopymode) r new_textrold_textencoding orig_filenamer%backupwriter's rr@z StdoutRefactoringTool.write_fileEsd   J""4#788 J7<<(8(0T5I1J1J1K1K(LNN!j)143G3G"IJJJ   , + +H H $ $22J7==,, ( ( J'''   :M% ' ' '~ L&Fwv&& GGIf%%%%GGG$$%=vFFFFFG L (F++++ L L L  !8(FKKKKK L+T22= h(H555~ . OFH - - - H $ $ OM8 4 4 4 4 4 % $s$-E E%$E%)E??!F#"F#c|r|d|dS|d||jrt|||} |jU|j5|D]}t |t jdddn #1swxYwYdSdS|D]}t |dS#t$rtd|dYdSwxYwdS)NzNo changes to %sz Refactored %szcouldn't encode z's diff for your terminal) r;rr output_lockprintsysstdoutflushUnicodeEncodeErrorwarn)r oldnewrequal diff_lineslines r print_outputz"StdoutRefactoringTool.print_outputls|     / : : : : :   _h 7 7 7 'S(;;  '3!-//(2,, %d J,,.../////////////////// %/((D!$KKKK(()D"((%&&&FF  s< B<3B B<BB<BB<&B<<CC)rrr) __name__ __module__ __qualname____doc__rr1r@rV __classcell__)r's@rrrsBDOOOOOO:000%5%5%5%5%5NrrcBtd|tjdS)Nz WARNING: file)rKrLstderr)r.s rrPrPs$ E33 sz222222rc  tjd}|dddd|dd d gd |d ddddd|ddd gd |dddd|dddd|dddd|d d!dd"|d#dd$|d%d&dd'|d(d)dd*d+ |d,d-dd.d/d01|d2d3dd4|d5dd.d/d61d*}i}||\}}|jr"d7|d8<|jst d9d7|_|jr|js| d:|j r|js| d;|js|j rt d<|js|jr| d=|j r9td>tjD]}t||sd?S|s8td@t jAtdBt jAdCSdD|vr&d7}|jrtdEt jAdCS|jrd7|dF<|jrd7|dG<|jr t*jn t*j}t+jdH|It+jdJ}t5tj} t5fdK|jD} t5} |jrJd*} |jD]&} | dLkrd7} | dMz| z'| r| | n| }n| | }| | }tBj"#|}|r]|$tBj%s>tBj"&|stBj"'|}|jr;|(tBj%}|)dN|j|tUtW||tW| |j|j ||j|j O}|j,s|r|-nZ |||j|j.|j/n1#tj0$rtdPt jAYdSwxYw|1tetg|j,S)QzMain program. Args: fixer_pkg: the name of a package where the fixers are located. args: optional; a list of command line arguments. If omitted, sys.argv[1:] is used. Returns a suggested exit status (0, 1, 2). z2to3 [options] file|dir ...)usagez-dz--doctests_only store_truezFix up doctests only)actionhelpz-fz--fixr+z1Each FIX specifies a transformation; default: all)rcdefaultrdz-jz --processesstorerintzRun 2to3 concurrently)rcretyperdz-xz--nofixz'Prevent a transformation from being runz-lz --list-fixeszList available transformationsz-pz--print-functionz0Modify the grammar so that print() is a functionz-ez--exec-functionz/Modify the grammar so that exec() is a functionz-vz --verbosezMore verbose loggingz --no-diffsz#Don't show diffs of the refactoringz-wz--writezWrite back modified filesz-nz --nobackupsFz&Don't write backups for modified filesz-oz --output-dirstrrzXPut output files in this directory instead of overwriting the input files. 
Requires -n.)rcrhrerdz-Wz--write-unchanged-fileszYAlso write files even if no changes were required (useful with --output-dir); implies -w.z --add-suffixzuAppend this string to all output filenames. Requires -n if non-empty. ex: --add-suffix='3' will generate .py3 files.Twrite_unchanged_filesz&--write-unchanged-files/-W implies -w.z%Can't use --output-dir/-o without -n.z"Can't use --add-suffix without -n.z@not writing files and not printing diffs; that's not very usefulzCan't use -n without -wz2Available transformations for the -f/--fix option:rz1At least one file or directory argument required.r]zUse --help to show usage.-zCan't write to stdin.r exec_functionz%(name)s: %(message)s)formatlevelz lib2to3.mainc3(K|] }dz|zV dS).fix_N).0fix fixer_pkgs r zmain..s-LLsW,s2LLLLLLrallrqz7Output in %r will mirror the input directory %r layout.)r$r%r&z+Sorry, -j isn't supported on this platform.)4optparse OptionParser add_option parse_argsrjrHrPr%rr- add_suffixno_diffs list_fixesrKrget_all_fix_namesrLr_rrmverboseloggingDEBUGINFO basicConfig getLoggersetget_fixers_from_packagenofixrtaddunion differencerr4 commonprefixrrr9r8rstripinforsortedr*refactor_stdin doctests_only processesMultiprocessingUnsupported summarizergbool)rur/parserrflagsr"fixnameror, avail_fixesunwanted_fixesr# all_presentrt requested fixer_namesr$rts` rmainrs  ")F G G GF d-l1333 dGHbNPPP dM'1 '>@@@ dIhDFFF dN<;=== d.|MOOO d-lLNNN dK 1333 l<@BBB dIl6888 dM,CEEE dN7 (NOOO d5lABBB nW5"GHHH N E%%d++MGT$)-%&} ; 9 : : : >'"3> <===;'"3; 9::: =QW-Q OPPP =0W.0 ./// BCCC1)<<  G 'NNNN 1  A SSSS ) ;;;;q d{{ =  ) ; ; ; ;1'"&&!%o%_ >GMM',E 6eDDDD  ~ . .Fh6yAABBKLLLLgmLLLLLNuuH{ 0 ; 8 8Ce||"  Y03677773>LK%%h///H %%h// &&~66KW))$//N9~66rv>>9 n--9 888'..rv66 M& 8 8 8  ;  x(8(8  7#33))!,  . . .B 9            D'-1F#-////6   C:''''qq    tBI  s;'U##*VVr))rZ __future__rrrLrr rrArxrrrMultiprocessRefactoringToolrrPrrrrrrs65555555  ---eeeeeH@eeeN333L L L L L L r__pycache__/fixer_util.cpython-311.pyc000064400000053537151027012300013634 0ustar00 !A?hf;dZddlmZddlmZmZddlmZddl m Z dZ dZ dZ d Zd-d Zd Zd ZdZe e fdZd.dZdZdZd-dZdZd-dZd-dZdZdZdZdZdZhdZ dZ!da"da#d a$d!a%d"Z&d#Z'd$Z(d%Z)d&Z*d'Z+d(Z,d)Z-ej.ej/hZ0d-d*Z1ej/ej.ej2hZ3d+Z4d-d,Z5d S)/z1Utility functions, node construction macros, etc.)token)LeafNode)python_symbols)patcompclttj|ttjd|gS)N=)rsymsargumentrrEQUAL)keywordvalues +/usr/lib64/python3.11/lib2to3/fixer_util.py KeywordArgrs.  $u{C00%8 : ::c6ttjdS)N()rrLPARrrLParenr  C  rc6ttjdS)N))rrRPARrrrRParenrrrc t|ts|g}t|ts d|_|g}ttj|t tjddgz|zS)zBuild an assignment statement r prefix) isinstancelistrrr atomrrr )targetsources rAssignr%su fd # # fd # #   $u{C<<<==F H HHrNc:ttj||S)zReturn a NAME leafr)rrNAME)namers rNamer)$s  D 0 0 00rcV|ttjt|ggS)zA node tuple for obj.attr)rr trailerDot)objattrs rAttrr/(s! dlSUUDM22 33rc6ttjdS)z A comma leaf,)rrCOMMArrrCommar3,s  S ! !!rc6ttjdS)zA period (.) leaf.)rrDOTrrrr,r,0s  3  rcttj||g}|r.|dttj||S)z-A parenthesised argument list, used by Call()r)rr r+clone insert_childarglist)argslparenrparennodes rArgListr?4sW  v||~~v||~~> ? 
?D 7 !T$,55666 Krcjttj|t|g}|||_|S)zA function call)rr powerr?r) func_namer;rr>s rCallrC;s0  Y 6 7 7D  Krc6ttjdS)zA newline literal rrNEWLINErrrNewlinerHBs  t $ $$rc6ttjdS)z A blank linerFrrr BlankLinerKFs  r " ""rc:ttj||S)Nr)rrNUMBER)nrs rNumberrOJs  a / / //rc ttjttjd|ttjdgS)zA numeric or string subscript[])rr r+rrLBRACERBRACE) index_nodes r SubscriptrVMs=  tEL#66)#EL#668 9 99rc:ttj||S)z A string leafr)rrSTRING)stringrs rStringrZSs  fV 4 4 44rc pd|_d|_d|_ttjd}d|_ttjd}d|_||||g}|rWd|_ttjd}d|_|t t j||gt t j|t t j |g}t t j ttj d|ttj dgS)zuA list comprehension of the form [xp for fp in it if test]. If test is None, the "if test" part is omitted. rJrforinifrQrR) rrrr'appendrr comp_if listmakercomp_forr"rSrT) xpfpittestfor_leafin_leaf inner_argsif_leafinners rListComprlWs BIBIBIEJ&&HHO5:t$$GGNB,J ? uz4(($t|gt_==>>> "d4=*&E&E!F G GE  U\3//U\3//1 2 22rc@|D]}|ttjdttj|dttjddt t j|g}t t j|}|S)zO Return an import statement in the form: from package import name_leafsfromrrimport)removerrr'rr import_as_names import_from) package_name name_leafsleafchildrenimps r FromImportrxos UZ((UZc:::UZ#666T):668H t * *C Jrc l|d}|jtjkr|}n-t tj|g}|d}|r d|D}t tjt t|dt|dt tj|d||dggz|z}|j |_ |S)zfReturns an import statement and calls a method of the module: import module module.name()r-afterc6g|]}|Sr)r8).0rNs r z!ImportAndCall..s ***q***rrlparrpar) r8typer r:rrAr/r)r+r)r>resultsnamesr- newarglistrznews r ImportAndCallrs %.   C x4<YY[[ $, 66 G E +**E*** tzDqNNDqNN33T\fo++-- fo++--/001149 9 : :C CJ Jrct|tr'|jtt gkrdSt|tot |jdkot|jdt okt|jdtoKt|jdt o+|jdjdko|jdjdkS)z(Does the node represent a tuple literal?Tr~rrr)r rrvrrlenrrr>s ris_tuplers$$-FHHfhh3G"G"Gt tT " " .DM""a' .4=+T22 .4=+T22 .4=+T22  .  a &#-  .  a &#- /rc4t|tot|jdkokt|jdtoKt|jdto+|jdjdko|jdjdkS)z'Does the node represent a list literal?rr~rQrR)r rrrvrrrs ris_listrs tT " " /DM""Q& /4=+T22 /4=,d33 / a &#-  /  b!'3. 0rclttjt|t gSN)rr r"rrrs r parenthesizers#  FHHdFHH5 6 66r> allanymaxminsetsumr!tuplesorted enumeratec#^Kt||}|r|Vt||}|dSdS)alFollow an attribute chain. If you have a chain of objects where a.foo -> b, b.foo-> c, etc, use this to iterate over all objects in the chain. Iteration is terminated by getattr(x, attr) is None. Args: obj: the starting object attr: the name of the chaining attribute Yields: Each successive object in the chain. N)getattr)r-r.nexts r attr_chainrsU 3  D # tT"" #####rzefor_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > z power< ( 'iter' | 'list' | 'tuple' | 'sorted' | 'set' | 'sum' | 'any' | 'all' | 'enumerate' | (any* trailer< '.' 'join' >) ) trailer< '(' node=any ')' > any* > z` power< ( 'sorted' | 'enumerate' ) trailer< '(' arglist ')' > any* > FchtsMtjtatjtatjt adattt g}t |t|dD]*\}}i}|||r |d|urdS+dS)a Returns true if node is in an environment where all that is required of it is being iterable (ie, it doesn't matter if it returns a list or an iterator). See test_map_nochange in test_fixers.py for some examples and tests. Tparentr>F) pats_builtrcompile_patternp0p1p2ziprmatch)r>patternspatternrrs rin_special_contextrs   $R ( (  $R ( (  $R ( ( B|HxD()C)CDD == ) ) gfo.E.E44 5rc|j}||jtjkrdS|j}|jt jt jfvrdS|jt jkr|j d|urdS|jt j ks;|jt j kr(||jtj ks|j d|urdSdS)zG Check that something isn't an attribute or function name etc. NFr~T) prev_siblingrrr6rr funcdefclassdef expr_stmtrv parameters typedargslistr2)r>prevrs ris_probably_builtinrs  D DI22u [F {t|T]333u {dn$$);t)C)Cu {do%% [D. . .  
$)u{":": OA $ & &u 4rc|_|jtjkrAt|jdkr)|jd}|jt jkr|jS|j}|_dS)zFind the indentation of *node*.NrrrJ) rr suiterrvrINDENTrr)r>indents rfind_indentationrsd   9 " "s4='9'9A'='=]1%F{el**|#{   2rc|jtjkr|S|}|jdc}|_t tj|g}||_|Sr)rr rr8rr)r>rrs r make_suitersR yDJ ::<bindings rdoes_tree_importr/s' 44'::G ==rc@|jtjtjfvS)z0Returns true if the node is an import statement.)rr import_namerrrs r is_importr7s 9)4+;< <.is_import_stmt>s4 T--,$-,$-*++ -rNr~rrorr)rrrrvrr rrrXrrrr'rxrHr9) rr(r>rroot insert_posoffsetidxnode2import_rvs r touch_importr;s--- T??Dt,,Jt}-- T~d##  &t}STT':;;  MFE!>%((  6\  Q"4=11  IC T---$--}Q$ 44 1W t' X & & T# . . .*    WtEJS'I'I'I&JKK#Hj$t'7"B"BCCCCCrc P|jD]}d}|jtjkrNt ||jdr|cSt |t |jd|}|r|}n|jtjtjfvr/t |t |jd|}|r|}nK|jtj krt |t |jd|}|r|}nt|jddD]U\}}|jtj kr;|j dkr0t |t |j|dz|}|r|}Vn|jtvr|jdj |kr|}nmt|||r|}nY|jtjkrt |||}n2|jtjkrt ||jdr|}|r|s|cSt%|r|cSdS) z Returns the node which binds variable name, otherwise None. If optional argument package is supplied, only imports will be returned. See test cases for examples.Nrrrr:r~)rvrr for_stmt_findrrif_stmt while_stmttry_stmtrrCOLONr _def_syms_is_import_bindingrrr)r(r>rchildretrNikids rrris3 "" : & &T5>!,--  T:enR.@#A#A7KKAM# ZDL$/: : :T:enR.@#A#A7KKAM# Z4= ( (T:enQ.?#@#@'JJA &'qrr(:;;&&FAsx5;..393C3C(z%.1:M/N/NPWXX Ac & Z9 $ $):)@D)H)HCC tW 5 5 CC Z4+ + +tUG44CC Z4> ) )T5>!,--     ~~  4rc|g}|rl|}|jdkr)|jtvr||jn"|jt jkr |j|kr|S|ldS)N)popr _block_symsextendrvrr'r)r(r>nodess rrrs} FE yy{{ 9s??ty ;; LL ' ' ' ' Y%* $ $t););K  4rc.|jtjkr|s|jd}|jtjkr`|jD]V}|jtjkr|jdj|kr|cS2|jtjkr|j|kr|cSWn{|jtjkr1|jd}|jtjkr |j|kr|Sn5|jtjkr |j|kr|Sn|jtj kr|r2t|jd |krdS|jd}|rtd|rdS|jtj krt||r|S|jtjkr0|jd}|jtjkr |j|kr|Sn;|jtjkr |j|kr|S|r|jtjkr|SdS)z Will return node if node will import name, or node will import * from package. None is returned otherwise. See test cases for examples. rrrNras)rr rrvdotted_as_namesdotted_as_namerrr'rrstrstriprrqimport_as_nameSTAR)r>r(rrwrlastrNs rrrs   yD$$$W$mA 8t+ + +  :!444~a(.$66# 7Z5:--%+2E2EKKK  X, , ,<#DyEJ&&4:+=+= X # # T(9(9K d& & &  s4=+,,2244??4 M!   uT1~~ 4 Vt+ + +dA +K Vt* * *JqMEzUZ''EK4,?,? Vuz ! !agooK  5:--K 4rr)NN)6__doc__pgen2rpytreerrpygramrr rJrrrrr%r)r/r3r,r?rCrHrKrOrVrZrlrxrrrrconsuming_callsrrrrrrrrrrrrrrrrrr+rrrrrrrs77******:::!!!!!! H H H1111444"""    &&((%%%###0000999 555522220&8 / / /000777...###&  &.===*D*D*DZ]DL ) ((((T|T]DL9 ''''''r__pycache__/btm_matcher.cpython-311.opt-2.pyc000064400000014152151027012300014675 0ustar00 !A?h dZddlZddlZddlmZddlmZddlmZGdde Z Gd d e Z ia d Z dS) z+George Boutsioukis N) defaultdict)pytree) reduce_treec4eZdZ ejZdZdS)BMNodecli|_g|_ttj|_d|_dS)N)transition_tablefixersnextrcountidcontentselfs ,/usr/lib64/python3.11/lib2to3/btm_matcher.py__init__zBMNode.__init__s- " v|$$ N)__name__ __module__ __qualname__ itertoolsrrrrrrs5I IO  Errc.eZdZ dZdZdZdZdZdS) BottomMatcherct|_t|_|jg|_g|_t jd|_dS)NRefactoringTool) setmatchrrootnodesr logging getLoggerloggerrs rrzBottomMatcher.__init__sAUU HH i[  '(9:: rc |j|t|j}|}|||j}|D]}|j|dS)Nstart)r appendr pattern_treeget_linear_subpatternaddr!)rfixertreelinear match_nodes match_nodes r add_fixerzBottomMatcher.add_fixer%s  5!!!5-..++--hhvTYh77 % , ,J   $ $U + + + + , ,rc  |s|gSt|dtr\g}|dD]O}|||}|D]3}|||dd|4P|S|d|jvrt }||j|d<n|j|d}|ddr ||dd|}n|g}|S)Nrr'r) isinstancetupler,extendr r)rpatternr(r0 alternative end_nodesend next_nodes rr,zBottomMatcher.add1s'? 
7N gaj% ( ( K&qz C C !HH[H>> $CCC&&txx S'A'ABBBBC qz!777"HH 5>&wqz22"271:> qrr{ ( HHWQRR[ HBB &K  rc8 |j}tt}|D]}|}|rd|_|jD]0}t |t jr|jdkr d|_n1|j dkr|j}n|j }||j vr3|j |}|j D]}|| |nV|j}|j |j jrnD||j vr2|j |}|j D]}|| ||j }||S)NT;Fr)r!rlist was_checkedchildrenr4rLeafvaluetyper r r)parent) rleavescurrent_ac_noderesultsleafcurrent_ast_nodechild node_tokenr-s rrunzBottomMatcher.runSs )d### ;# ;D# "! ;/3 ,-6E!%55%+:L:L7<(4#(A--!1!7JJ!1!6J!AAA&5&Fz&RO!0!7@@--.>????@'+iO(/;,3?<"_%EEE*9*J:*V%4%;DDE#EN112BCCCC#3#: C#! ;Drcp tdfd|jtddS)Nz digraph g{c "|jD]s}|j|}td|j|jt |t |jfz|dkrt|j|tdS)Nz%d -> %d [label=%s] //%sr)r keysprintr type_reprstrr r)node subnode_keysubnode print_nodes rrVz*BottomMatcher.print_ac..print_nodes#499;; $ $ / <0w Ik,B,BCDWDWXYZZZ!##'/*** 7####  $ $r})rPr!)rrVs @rprint_aczBottomMatcher.print_acsPF l $ $ $ $ $  49 c rN)rrrrr2r,rLrXrrrrrsf+;;; , , ,   D666p     rrctsGddlm}|jD]'\}}t |t kr |t|<(t||S)Nr)python_symbols) _type_reprspygramrZ__dict__itemsrCint setdefault)type_numrZnamevals rrQrQsq 9******(06688 9 9ID#CyyCDS!1  ! !(H 5 55r) __author__r#r collectionsrr r btm_utilsrobjectrrr[rQrrrrhsG; ######""""""V}}}}}F}}}@ 66666r__pycache__/main.cpython-311.pyc000064400000035624151027012300012403 0ustar00 !A?hN.dZddlmZmZddlZddlZddlZddlZddlZddl Z ddl m Z dZ Gdde j Zd Zd d ZdS) z Main program for 2to3. )with_statementprint_functionN)refactorc |}|}tj||||dddS)z%Return a unified diff of two strings.z (original)z (refactored))lineterm) splitlinesdifflib unified_diff)abfilenames %/usr/lib64/python3.11/lib2to3/main.py diff_textsrsF A A  1h ,n)+ - - --c<eZdZdZ dfd ZdZfdZdZxZS)StdoutRefactoringToola2 A refactoring tool that can avoid overwriting its input files. Prints output to stdout. Output files can optionally be written to a different directory and or have an extra file suffix appended to their name for use in situations where you do not want to replace the input files. rc ||_||_|r.|tjs|tjz }||_||_||_tt| |||dS)aF Args: fixers: A list of fixers to import. options: A dict with RefactoringTool configuration. explicit: A list of fixers to run even if they are explicit. nobackups: If true no backup '.bak' files will be created for those files that are being refactored. show_diffs: Should diffs of the refactoring be printed to stdout? input_base_dir: The base directory for all input files. This class will strip this path prefix off of filenames before substituting it with output_dir. Only meaningful if output_dir is supplied. All files processed by refactor() must start with this path. output_dir: If supplied, all converted files will be written into this directory tree instead of input_base_dir. append_suffix: If supplied, all files output by this tool will have this appended to their filename. Useful for changing .py to .py3 for example by passing append_suffix='3'. 
N) nobackups show_diffsendswithossep_input_base_dir _output_dir_append_suffixsuperr__init__) selffixersoptionsexplicitrrinput_base_dir output_dir append_suffix __class__s rrzStdoutRefactoringTool.__init__$s(#$  %."9"9"&"A"A % bf $N-%+ #T**33FGXNNNNNrcl|j|||f|jj|g|Ri|dSN)errorsappendloggererror)r msgargskwargss r log_errorzStdoutRefactoringTool.log_errorAsJ Cv./// #/////////rc|}|jrt||jr@tj|j|t |jd}ntd|d|j|jr ||jz }||krktj |}tj |s|rtj || d|||j s|dz}tj|r< tj|n&#t $r| d|YnwxYw tj||n'#t $r| d||YnwxYwt%t&|j}||||||j st+j||||krt+j||dSdS)Nz filename z( does not start with the input_base_dir zWriting converted %s to %s.z.bakzCan't remove backup %szCan't rename %s to %s)r startswithrrpathjoinlen ValueErrorrdirnameisdirmakedirs log_messagerlexistsremoveOSErrorrenamerr write_fileshutilcopymode) r new_textrold_textencoding orig_filenamer%backupwriter's rr@z StdoutRefactoringTool.write_fileEsd   J""4#788 J7<<(8(0T5I1J1J1K1K(LNN!j)143G3G"IJJJ   , + +H H $ $22J7==,, ( ( J'''   :M% ' ' '~ L&Fwv&& GGIf%%%%GGG$$%=vFFFFFG L (F++++ L L L  !8(FKKKKK L+T22= h(H555~ . OFH - - - H $ $ OM8 4 4 4 4 4 % $s$-E E%$E%)E??!F#"F#c|r|d|dS|d||jrt|||} |jU|j5|D]}t |t jdddn #1swxYwYdSdS|D]}t |dS#t$rtd|dYdSwxYwdS)NzNo changes to %sz Refactored %szcouldn't encode z's diff for your terminal) r;rr output_lockprintsysstdoutflushUnicodeEncodeErrorwarn)r oldnewrequal diff_lineslines r print_outputz"StdoutRefactoringTool.print_outputls|     / : : : : :   _h 7 7 7 'S(;;  '3!-//(2,, %d J,,.../////////////////// %/((D!$KKKK(()D"((%&&&FF  s< B<3B B<BB<BB<&B<<CC)rrr) __name__ __module__ __qualname____doc__rr1r@rV __classcell__)r's@rrrsBDOOOOOO:000%5%5%5%5%5NrrcBtd|tjdS)Nz WARNING: file)rKrLstderr)r.s rrPrPs$ E33 sz222222rc  tjd}|dddd|dd d gd |d ddddd|ddd gd |dddd|dddd|dddd|d d!dd"|d#dd$|d%d&dd'|d(d)dd*d+ |d,d-dd.d/d01|d2d3dd4|d5dd.d/d61d*}i}||\}}|jr"d7|d8<|jst d9d7|_|jr|js| d:|j r|js| d;|js|j rt d<|js|jr| d=|j r9td>tjD]}t||sd?S|s8td@t jAtdBt jAdCSdD|vr&d7}|jrtdEt jAdCS|jrd7|dF<|jrd7|dG<|jr t*jn t*j}t+jdH|It+jdJ}t5tj} t5fdK|jD} t5} |jrJd*} |jD]&} | dLkrd7} | dMz| z'| r| | n| }n| | }| | }tBj"#|}|r]|$tBj%s>tBj"&|stBj"'|}|jr;|(tBj%}|)dN|j|tUtW||tW| |j|j ||j|j O}|j,s|r|-ng |||j|j.|j/n>#tj0$r,|j/dksJtdPt jAYdSwxYw|1tetg|j,S)QzMain program. Args: fixer_pkg: the name of a package where the fixers are located. args: optional; a list of command line arguments. If omitted, sys.argv[1:] is used. Returns a suggested exit status (0, 1, 2). z2to3 [options] file|dir ...)usagez-dz--doctests_only store_truezFix up doctests only)actionhelpz-fz--fixr+z1Each FIX specifies a transformation; default: all)rcdefaultrdz-jz --processesstorerintzRun 2to3 concurrently)rcretyperdz-xz--nofixz'Prevent a transformation from being runz-lz --list-fixeszList available transformationsz-pz--print-functionz0Modify the grammar so that print() is a functionz-ez--exec-functionz/Modify the grammar so that exec() is a functionz-vz --verbosezMore verbose loggingz --no-diffsz#Don't show diffs of the refactoringz-wz--writezWrite back modified filesz-nz --nobackupsFz&Don't write backups for modified filesz-oz --output-dirstrrzXPut output files in this directory instead of overwriting the input files. Requires -n.)rcrhrerdz-Wz--write-unchanged-fileszYAlso write files even if no changes were required (useful with --output-dir); implies -w.z --add-suffixzuAppend this string to all output filenames. Requires -n if non-empty. 
[__pycache__/fixer_util.cpython-311.opt-1.pyc: compiled -O bytecode for lib2to3/fixer_util.py; binary payload elided. Module docstring: "Utility functions, node construction macros, etc." Readable fragments include the docstrings of the construction macros (Name, Attr, Call, ArgList, ListComp, FromImport and friends) and of the query helpers (attr_chain, in_special_context, is_probably_builtin, find_indentation, does_tree_import, touch_import, find_binding).]
[__pycache__/btm_matcher.cpython-311.pyc: compiled bytecode for lib2to3/btm_matcher.py; binary payload elided. Module docstring: "A bottom-up tree matching algorithm implementation meant to speed up 2to3's matching process. After the tree patterns are reduced to their rarest linear path, a linear Aho-Corasick automaton is created. The linear automaton traverses the linear paths from the leaves to the root of the AST and returns a set of nodes for further matching. This reduces significantly the number of candidate nodes."]
[__pycache__/pytree.cpython-311.pyc: compiled bytecode for lib2to3/pytree.py; binary payload elided. Module docstring: "Python parse tree definitions. This is a very concrete parse tree; we need to keep every token and even the comments and whitespace between tokens. There's also a pattern matching implementation here."]
[Remainder of the pytree.cpython-311.pyc payload: readable fragments are limited to the method docstrings of the Base, Node and Leaf classes and of the pattern classes (BasePattern, LeafPattern, NodePattern, WildcardPattern); the surrounding bytecode is elided.]
[__pycache__/main.cpython-311.opt-2.pyc: compiled -OO (docstring-stripped) bytecode duplicating the lib2to3/main.py entry above; binary payload elided.]
[__pycache__/refactor.cpython-311.pyc: compiled bytecode for lib2to3/refactor.py; binary payload elided. Module docstring: "Refactoring framework. Used as a main program, this can refactor any number of files and/or recursively descend down directories. Imported as a module, this provides infrastructure to write your own refactoring tool." Readable fragments include the docstrings of get_all_fix_names, get_fixers_from_package and the RefactoringTool methods (refactor, refactor_dir, refactor_file, refactor_string, refactor_stdin, refactor_tree, refactor_docstring, refactor_doctest, summarize, parse_block).]
[__pycache__/fixer_base.cpython-311.opt-2.pyc: compiled -OO bytecode duplicating lib2to3/fixer_base.py, whose source opens this archive; binary payload elided.]
[__pycache__/btm_matcher.cpython-311.opt-1.pyc: compiled -O bytecode duplicating the lib2to3/btm_matcher.py entry above; binary payload elided.]
[__pycache__/refactor.cpython-311.opt-2.pyc: compiled -OO bytecode duplicating the lib2to3/refactor.py entry above; binary payload elided.]
[__pycache__/__init__.cpython-311.pyc: compiled bytecode for lib2to3/__init__.py; binary payload elided. Recoverable content: a DeprecationWarning reading "lib2to3 package is deprecated and may not be able to parse Python 3.10+".]
[__pycache__/patcomp.cpython-311.pyc: compiled bytecode for lib2to3/patcomp.py; binary payload elided. Module docstring: "Pattern compiler. The grammar is taken from PatternGrammar.txt. The compiler compiles a pattern to a pytree.*Pattern instance."]
[__pycache__/pytree.cpython-311.opt-1.pyc: compiled -O bytecode duplicating the lib2to3/pytree.py entry above; the archive breaks off partway through this entry.]
r[rr7r\r,s rr\z Node.prefixs# } 2}Q&&rc<|jr||jd_dSdSNrrr%r\s rr\z Node.prefixs+ = -&,DM!  # # # - -rct||_d|j|_||j|<|dS)z Equivalent to 'node.children[i] = child'. This method also sets the child's parent attribute appropriately. N)r6r7r:rOs r set_childzNode.set_child s7  "& a  a rcr||_|j|||dS)z Equivalent to 'node.children.insert(i, child)'. This method also sets the child's parent attribute appropriately. N)r6r7insertr:rOs r insert_childzNode.insert_child*s4   Q&&& rcp||_|j||dS)z Equivalent to 'node.children.append(child)'. This method also sets the child's parent attribute appropriately. N)r6r7r9r:rWs r append_childzNode.append_child3s2   U### rNNN)rfrgrhrirtrzrrmrnrer#r-r0r2rlr\setterrrrrrrrprps55 $''''2... 000 &  JJJ888  ))) ''X' ]--]-rrpceZdZdZdZdZdZddgfdZdZdZ e j dkre Z d Z d Zd Zd Zd ZedZejdZdS)rBz'Concrete implementation for leaf nodes.r[rNc||\|_\|_|_||_||_|||_|dd|_dS)z Initializer. Takes a type constant (a token number < 256), a string value, and an optional context keyword argument. N)_prefixrCcolumnr valuerr)r%r rrsr\rrs rrtz Leaf.__init__FsQ  7> 4DL44;    !DL,QQQ/rc@|jjd|jd|jdSrv)r!rfr rr,s rrzz Leaf.__repr__Ys,#~666#yyy#zzz+ +rc:|jt|jzS)r|)r\rcrr,s rrzLeaf.__unicode___s {S__,,rr_c>|j|jf|j|jfkSr)r rr$s rr#zLeaf._eqjs 4:&5:u{*CCCrclt|j|j|j|j|jff|jS)rr)rBr rr\rCrrrr,s rr-z Leaf.clonens:DItz[4; "<=#'#6888 8rc#K|VdSrUrr,s rrVz Leaf.leavests rc#K|VdSrrr,s rr0zLeaf.post_orderw rc#K|VdSrrr,s rr2zLeaf.pre_order{rrc|jS)zP The whitespace and comments preceding this token in the input. )rr,s rr\z Leaf.prefixs |rc<|||_dSrU)r:rrs rr\z Leaf.prefixs  r)rfrgrhrirrCrrtrzrrmrnrer#r-rVr0r2rlr\rrrrrBrB=s11G F F "0000&+++ --- &  DDD888 X  ]]rrBc|\}}}}|s ||jvr-t|dkr|dSt|||St|||S)z Convert raw node information to a Node or Leaf instance. This is passed to the parser driver which calls it whenever a reduction of a grammar rule produces a new complete node, so that the tree is build strictly bottom-up. rr)rs) number2symbollenrprB)grraw_noder rrsr7s rconvertrsn&."D%(242+++ x==A  A; D(G4444D%1111rcFeZdZdZdZdZdZdZdZdZ d dZ d dZ dZ dS) BasePatterna A pattern is a tree matching pattern. It looks for a specific node type (token or symbol), and optionally for a specific content. This is an abstract base class. There are three concrete subclasses: - LeafPattern matches a single leaf node; - NodePattern matches a single node (usually non-leaf); - WildcardPattern matches a sequence of nodes of variable length. Nc6t|S)z>Constructor that prevents BasePattern from being instantiated.rrs rrzBasePattern.__new__rrct|j|j|jg}|r|d |d=|r|d |jjddtt|dS)Nrwrxry) rr contentrr!rfr}r~repr)r%rs rrzzBasePattern.__repr__sw$)$$dlDI> tBx'R tBx'>222DIIc$oo4N4N4N4NOOrc|S)z A subclass can define this as a hook for optimizations. Returns either self or another node with the same effect. rr,s roptimizezBasePattern.optimizes  rc|j|j|jkrdS|j5d}|i}|||sdS|r||||jr |||j<dS)a# Does this pattern exactly match a node? Returns True if it matches, False if not. If results is not None, it must be a dict which will be updated with the nodes matching named subpatterns. Default implementation for non-wildcard patterns. NFT)r r _submatchupdater)r%rEresultsrs rmatchzBasePattern.matchs 9 TY$)%;%;5 < #A">>$** u "q!!!  49 !%GDI trcdt|dkrdS||d|S)z Does this pattern exactly match a sequence of nodes? Default implementation for non-wildcard patterns. rFr)rr)r%nodesrs r match_seqzBasePattern.match_seqs0 u::??5zz%(G,,,rc#^Ki}|r$||d|r d|fVdSdSdS)z} Generator yielding all matches for this pattern. Default implementation for non-wildcard patterns. 
rrN)r)r%rrs rgenerate_matcheszBasePattern.generate_matchessS   TZZa!,, Q$JJJJJ    rrU) rfrgrhrir rrrrzrrrrrrrrrs   DG D### PPP 2----rrc&eZdZddZddZddZdS) LeafPatternNc8||||_||_||_dS)ap Initializer. Takes optional type, content, and name. The type, if given must be a token type (< 256). If not given, this matches any *leaf* node; the content may still be required. The content, if given, must be a string. If a name is given, the matching node is stored in the results dict under that key. N)r rr)r%r rrs rrtzLeafPattern.__init__s)       rcht|tsdSt|||S)z*Override match() to insist on a leaf node.F)r4rBrrr%rErs rrzLeafPattern.match s1$%% 5  tW555rc"|j|jkS) Match the pattern's content to the node's children. This assumes the node type matches and self.content is not None. Returns True if it matches, False if not. If results is not None, it must be a dict which will be updated with the nodes matching named subpatterns. When returning False, the results dict may still be updated. )rrrs rrzLeafPattern._submatchs|tz))rrrU)rfrgrhrtrrrrrrrsP(6666 * * * * * *rrc"eZdZdZddZddZdS) NodePatternFNc||@t|}t|D]!\}}t|trd|_"||_||_||_dS)ad Initializer. Takes optional type, content, and name. The type, if given, must be a symbol type (>= 256). If the type is None this matches *any* single node (leaf or not), except if content is not None, in which it only matches non-leaf nodes that also match the content pattern. The content, if not None, must be a sequence of Patterns that must match the node's children exactly. If the content is given, the type must not be None. If a name is given, the matching node is stored in the results dict under that key. NT)r5rJr4WildcardPattern wildcardsr rr)r%r rrrKitems rrtzNodePattern.__init__$si    7mmG$W-- * *4dO44*%)DN   rc|jrTt|j|jD]7\}}|t |jkr|||dS8dSt |jt |jkrdSt |j|jD]\}}|||sdSdS)rNTF)rrrr7rrzipr)r%rErcr subpatternrPs rrzNodePattern._submatchAs > (t}EE  1DM*****q)))44+5 t|  DM 2 2 2 25!$T\4=!A!A   J##E733 uu trrrU)rfrgrhrrtrrrrrr sAI:rrcPeZdZdZddedfdZdZd dZd dZdZ d Z d Z d Z dS) ra A wildcard pattern can match zero or more nodes. This has all the flexibility needed to implement patterns like: .* .+ .? .{m,n} (a b c | d e | f) (...)* (...)+ (...)? (...){m,n} except it always uses non-greedy matching. Nrc|'ttt|}|D]}||_||_||_||_dS)a Initializer. Args: content: optional sequence of subsequences of patterns; if absent, matches one node; if present, each subsequence is an alternative [*] min: optional minimum number of times to match, default 0 max: optional maximum number of times to match, default HUGE name: optional name assigned to this match [*] Thus, if content is [[a, b, c], [d, e], [f, g, h]] this is equivalent to (a b c | d e | f g h); if content is None, this is equivalent to '.' in regular expression terms. The min and max parameters work as follows: min=0, max=maxint: .* min=1, max=maxint: .+ min=0, max=1: .? min=1, max=1: . If content is not None, replace the dot with the parenthesized list of alternatives, e.g. 
(a b c | d e | f g h)* N)tupler~rminmaxr)r%rrrralts rrtzWildcardPattern.__init__ksT0  Cw//00G + +  rc<d}|jIt|jdkr1t|jddkr|jdd}|jdkrM|jdkrB|jt |jS|$|j|jkr|S|jdkrft|trQ|jdkrF|j|jkr6t|j|j|jz|j|jz|jS|S)z+Optimize certain stacked wildcard patterns.Nrr)r) rrrrrrrr4r)r%rs rrzWildcardPattern.optimizes L $    " "s4<?';';q'@'@a+J 8q==TX]]|#" 2222%49 +G+G!**,,, HMMj_EEM Na  DI$@$@":#5#'8JN#:#'8JN#:#-?44 4 rc0||g|S)z'Does this pattern exactly match a node?)rrs rrzWildcardPattern.matchs~~tfg...rc||D]P\}}|t|kr8|3|||jrt |||j<dSQdS)z4Does this pattern exactly match a sequence of nodes?NTF)rrrrr5)r%rrrrs rrzWildcardPattern.match_seqsy))%00  DAqCJJ&NN1%%%y9-1%[[ *tt  urc #.K|j^t|jdtt||jzD]#}i}|jr|d|||j<||fV$dS|jdkr||VdSttdr$tj }tt_ | |dD]$\}}|jr|d|||j<||fV%nJ#t$r=| |D]$\}}|jr|d|||j<||fV%YnwxYwttdr|t_ dSdS#ttdr |t_ wxYw)a" Generator yielding matches for a sequence of nodes. Args: nodes: sequence of nodes Yields: (count, results) tuples where: count: the match comprises nodes[:count]; results: dict containing named submatches. Nr bare_name getrefcountr)rrangerrrr_bare_name_matcheshasattrrmstderrr_recursive_matches RuntimeError_iterative_matches)r%rcountr save_stderrs rrz WildcardPattern.generate_matchess < txSUTX-F-F)FGG  91#(%=AdiLQh    Y+ % %))%00 0 0 0 0 0 sM** (!j %ZZ  - $ 7 7q A A##HE1y5',VeV}$) (NNNN#  # # #!% 7 7 > >##HE1y5',VeV}$) (NNNN## #3 ..-!,CJJJ--73 ..-!,CJ,,,,s+;DE1AE E1E  E11#Fc#Kt|}d|jkrdifVg}|jD]5}t||D]"\}}||fV|||f#6|rg}|D]\}} ||kr||jkr}|jD]u}t|||dD]Z\} } | dkrOi}|| || || z|fV||| z|f[v|}|dSdS)z(Helper to iteratively yield the matches.rN)rrrrr9rr) r%rnodelenrrrr new_resultsc0r0c1r1s rrz"WildcardPattern._iterative_matchesse** ==R%KKK< ' 'C(e44 ' '1d 1v&&&& '  "K! A AB<K||jkrdifV||jkr||jD]v}t||D]a\}}|||d|dzD]:\}}i}||||||z|fV;budSdS)z(Helper to recursively yield the matches.rNr)rrrrrr) r%rrrrrrrrs rrz"WildcardPattern._recursive_matches s DH  R%KKK 48  | ) ).sE::))FB"&"9"9%*eAg"N"N))B   2gqj(((( ))   ) )rrU) rfrgrhriHUGErtrrrrrrrrrrrr]s   $4!!!!F&////    +-+-+-Z""": ) ) ) ) )rrc(eZdZddZdZdZdZdS)NegatedPatternNc|||_dS)a Initializer. The argument is either a pattern or None. If it is None, this only matches an empty sequence (effectively '$' in regex lingo). If it is not None, this matches whenever the argument pattern doesn't have any matches. N)r)r%rs rrtzNegatedPattern.__init__s   rcdS)NFrrDs rrzNegatedPattern.match(surc(t|dkSr)r)r%rs rrzNegatedPattern.match_seq,s5zzQrc#K|jt|dkrdifVdSdS|j|D]\}}dSdifVdSr)rrr)r%rrrs rrzNegatedPattern.generate_matches0sr < 5zzQe  55e<<  1R%KKKKKrrU)rfrgrhrtrrrrrrrrsU         rrc#0K|sdifVdS|d|dd}}||D]a\}}|s||fVt|||dD]:\}}i}||||||z|fV;bdS)aR Generator yielding matches for a sequence of patterns and nodes. Args: patterns: a sequence of patterns nodes: a sequence of nodes Yields: (count, results) tuples where: count: the entire sequence of patterns matches nodes[:count]; results: dict containing named submatches. 
rrN)rr) patternsrprestrrrrrs rrr<s  %e 1+x|4((// % %FB %"f .tU233Z@@%%FBAHHRLLLHHRLLLr'1*$$$$ %  % %r)ri __author__rmiorrrrrrrprBrrrrrrrrrrrs3  666n-n-n-n-n-6n-n-n-`kkkkk4kkk\LLLLL4LLL\222&SSSSS&SSSl)*)*)*)*)*+)*)*)*X:::::+:::zy)y)y)y)y)ky)y)y)x     [   F%%%%%r__pycache__/__init__.cpython-311.opt-1.pyc000064400000000565151027012300014151 0ustar00 !A?h4ddlZejdeddS)NzGlib2to3 package is deprecated and may not be able to parse Python 3.10+) stacklevel)warningswarnDeprecationWarning)/usr/lib64/python3.11/lib2to3/__init__.pyr s> Mr __pycache__/btm_utils.cpython-311.opt-1.pyc000064400000026374151027012300014422 0ustar00 !A?h&dZddlmZddlmZmZddlmZmZeZ eZ ej Z eZ dZdZdZGdd eZdd Zd Zd Zd S)z0Utility functions used by the btm_matcher module)pytree)grammartoken)pattern_symbolspython_symbolsc2eZdZdZddZdZdZdZdZdS) MinNodezThis class serves as an intermediate representation of the pattern tree during the conversion to sets of leaf-to-root subpatternsNch||_||_g|_d|_d|_g|_g|_dS)NF)typenamechildrenleafparent alternativesgroup)selfrrs */usr/lib64/python3.11/lib2to3/btm_utils.py__init__zMinNode.__init__s8      cZt|jdzt|jzS)N )strrr)rs r__repr__zMinNode.__repr__s"49~~#c$)nn44rc|}g}|r^|jtkrr|j|t |jt |jkr$t |jg}g|_|j}{|j}d}n|jtkrq|j |t |j t |jkr#t|j }g|_ |j}|j}d}n[|jtj kr"|j r||j n||j|j}|^|S)zInternal method. Returns a characteristic path of the pattern tree. This method must be run for all leaves until the linear subpatterns are merged into a singleN)rTYPE_ALTERNATIVESrappendlenrtupler TYPE_GROUPrget_characteristic_subpattern token_labelsNAMEr)rnodesubps r leaf_to_rootzMinNode.leaf_to_root!sZ! y---!((...t())S-?-???!$"3445D(*D%;D;DDyJ&& !!$'''tz??c$-&8&8888DDD!#DJ;D;DDyL---$)- DI&&&& DI&&&;DC! D rch|D]}|}|r|cSdS)aDrives the leaf_to_root method. The reason that leaf_to_root must be run multiple times is because we need to reject 'group' matches; for example the alternative form (a | b c) creates a group [b c] that needs to be matched. Since matching multiple linear patterns overcomes the automaton's capabilities, leaf_to_root merges each group into a single choice based on 'characteristic'ity, i.e. (a|b c) -> (a|b) if b more characteristic than c Returns: The most 'characteristic'(as defined by get_characteristic_subpattern) path for the compiled pattern tree. N)leavesr()rlr's rget_linear_subpatternzMinNode.get_linear_subpatternKsJ   A>>##D     rc#lK|jD]}|Ed{V|js|VdSdS)z-Generator that returns the leaves of the treeN)rr*)rchilds rr*zMinNode.leaves`s[] & &E||~~ % % % % % % % %} JJJJJ  r)NN) __name__ __module__ __qualname____doc__rrr(r,r*rrr r so555(((T*rr Nc d}|jtjkr |jd}|jtjkrt |jdkrt |jd|}nstt}|jD]L}|j |dzr t ||}||j |Mn|jtj krt |jdkrVtt}|jD].}t ||}|r|j |/|jsd}nt |jd|}nh|jtj krRt|jdtjr1|jdjdkrt |jd|St|jdtjr|jdjdksIt |jdkr3t%|jddr|jdjdkrdSd }d}d}d }d} d } |jD]j}|jtjkrd }|}n1|jtjkrd }|} n|jtjkr|}t%|dr |jd krd } k| r6|jd} t%| dr| jdkr |jd } n |jd} | jt*jkr| jd krtt.}nt%t*| jr)tt1t*| j}ntt1t2| j}n| jt*jkr[| jd} | t8vrtt8| }nAtt*j| }n%| jtjkrt ||}|r7| jdjdkrd}n| jdjdkrnt:|r@|>|jddD].}t ||}||j |/|r||_|S)z Internal function. Reduces a compiled pattern tree to an intermediate representation suitable for feeding the automaton. This also trims off any optional pattern elements(like [a], a*). 
N)rr([valueTF=any')rr*+r)rsymsMatcherr Alternativesr reduce_treer rindexr Alternativer"Unit isinstancerLeafr9hasattrDetailsRepeaterr$r%TYPE_ANYgetattrpysymsSTRINGstriptokensNotImplementedErrorr) r&rnew_noder.reducedr details_nodealternatives_node has_repeater repeater_nodehas_variable_name name_leafrs rrCrCgsH yDL  }Q yD%%% t}   " ""4=#3V<\. . .%''"111<99N&GL)/,R,RSSSHH&GFIO,L,LMMMHH ^|2 2 2?((--Dv~~"t 555" (9EEE ^t0 0 0"#4f==H  *%a(.#55'*0C77*)  6H0%.qt4 6 6%eX66&%,,W555!  Orct|ts|St|dkr|dSg}g}gdg}d|D]}tt |drtt |fdr||Vtt |fdr|||||r|}n |r|}n|r|}t |tS) zPicks the most characteristic from a list of linear patterns Current order used is: names > common_names > common_chars rr5)inforifnotNonez[]().,:c.t|tuSN)rr)xs rz/get_characteristic_subpattern..sd1ggnrc6t|to|vSrbrGr)rc common_charss rrdz/get_characteristic_subpattern..sjC&8&8&NQ,=Nrc6t|to|vSrbrf)rc common_namess rrdz/get_characteristic_subpattern..s 1c(:(:(PqL?Pr)key)rGlistr r<rec_testrmax) subpatternssubpatterns_with_namessubpatterns_with_common_namessubpatterns_with_common_chars subpatternrgris @@rr#r#so k4 ( ( ;1~ $&!666L$&!L! : : x $<$<== > > :8JNNNNPPQQ :-44Z@@@@XjPPPPRRSS :-44Z@@@@'--j9994, &43 &43 { $ $ $$rc#K|D]B}t|ttfrt||Ed{V5||VCdS)zPTests test_func on all items of sequence and items of included sub-iterablesN)rGrkr!rl)sequence test_funcrcs rrlrlss a$ ' ' 9-- - - - - - - - -)A,,     rrb)r2rpgen2rrpygramrrr@rNopmaprQr$rLrr"objectr rCr#rlr3rrr{s22!!!!!!!!33333333     UUUUUfUUUnBBBBJ#%#%#%Jr__pycache__/__init__.cpython-311.opt-2.pyc000064400000000565151027012300014152 0ustar00 !A?h4ddlZejdeddS)NzGlib2to3 package is deprecated and may not be able to parse Python 3.10+) stacklevel)warningswarnDeprecationWarning)/usr/lib64/python3.11/lib2to3/__init__.pyr s> Mr __pycache__/__main__.cpython-311.pyc000064400000000475151027012300013173 0ustar00 !A?hCLddlZddlmZejeddS)N)mainz lib2to3.fixes)sysrexit)/usr/lib64/python3.11/lib2to3/__main__.pyr sB o  r__pycache__/fixer_util.cpython-311.opt-2.pyc000064400000047602151027012300014570 0ustar00 !A?hf; ddlmZddlmZmZddlmZddlm Z dZ dZ dZ dZ d,d Zd Zd Zd Ze e fdZd-dZdZdZd,dZdZd,dZd,dZdZdZdZdZdZhdZdZ da!da"da#d a$d!Z%d"Z&d#Z'd$Z(d%Z)d&Z*d'Z+d(Z,ej-ej.hZ/d,d)Z0ej.ej-ej1hZ2d*Z3d,d+Z4d S).)token)LeafNode)python_symbols)patcompclttj|ttjd|gS)N=)rsymsargumentrrEQUAL)keywordvalues +/usr/lib64/python3.11/lib2to3/fixer_util.py KeywordArgrs.  $u{C00%8 : ::c6ttjdS)N()rrLPARrrLParenr  C  rc6ttjdS)N))rrRPARrrrRParenrrrc  t|ts|g}t|ts d|_|g}ttj|t tjddgz|zS)N r prefix) isinstancelistrrr atomrrr )targetsources rAssignr%sx' fd # # fd # #   $u{C<<<==F H HHrNc< ttj||SNr)rrNAME)namers rNamer*$s  D 0 0 00rcX |ttjt|ggSN)rr trailerDot)objattrs rAttrr1(s$# dlSUUDM22 33rc8 ttjdS)N,)rrCOMMArrrCommar5,s  S ! !!rc8 ttjdS)N.)rrDOTrrrr.r.0s  3  rc ttj||g}|r.|dttj||S)Nr)rr r-clone insert_childarglist)argslparenrparennodes rArgListrA4sZ7  v||~~v||~~> ? ?D 7 !T$,55666 Krcl ttj|t|g}|||_|Sr,)rr powerrAr) func_namer=rr@s rCallrE;s3  Y 6 7 7D  Krc8 ttjdS)N rrNEWLINErrrNewlinerJBs  t $ $$rc8 ttjdS)NrHrrr BlankLinerMFs  r " ""rc:ttj||Sr')rrNUMBER)nrs rNumberrQJs  a / / //rc  ttjttjd|ttjdgS)N[])rr r-rrLBRACERBRACE) index_nodes r SubscriptrXMs@'  tEL#66)#EL#668 9 99rc< ttj||Sr')rrSTRING)stringrs rStringr\Ss  fV 4 4 44rc r d|_d|_d|_ttjd}d|_ttjd}d|_||||g}|rWd|_ttjd}d|_|t t j||gt t j|t t j |g}t t j ttj d|ttj dgS)NrLrforinifrSrT) rrrr(appendrr comp_if listmakercomp_forr"rUrV) xpfpittestfor_leafin_leaf inner_argsif_leafinners rListComprnWsBIBIBIEJ&&HHO5:t$$GGNB,J ? 
uz4(($t|gt_==>>> "d4=*&E&E!F G GE  U\3//U\3//1 2 22rcB |D]}|ttjdttj|dttjddt t j|g}t t j|}|S)Nfromrrimport)removerrr(rr import_as_names import_from) package_name name_leafsleafchildrenimps r FromImportrzos* UZ((UZc:::UZ#666T):668H t * *C Jrc n |d}|jtjkr|}n-t tj|g}|d}|r d|D}t tjt t|dt|dt tj|d||dggz|z}|j |_ |S)Nr/afterc6g|]}|Sr)r:).0rPs r z!ImportAndCall..s ***q***rrlparrpar) r:typer r<rrCr1r*r-r)r@resultsnamesr/ newarglistr|news r ImportAndCallrs %.   C x4<YY[[ $, 66 G E +**E*** tzDqNNDqNN33T\fo++-- fo++--/001149 9 : :C CJ Jrc t|tr'|jtt gkrdSt|tot |jdkot|jdt okt|jdtoKt|jdt o+|jdjdko|jdjdkS)NTrrrr)r rrxrrlenrrr@s ris_tuplers2$$-FHHfhh3G"G"Gt tT " " .DM""a' .4=+T22 .4=+T22 .4=+T22  .  a &#-  .  a &#- /rc6 t|tot|jdkokt|jdtoKt|jdto+|jdjdko|jdjdkS)NrrrSrT)r rrrxrrrs ris_listrs1 tT " " /DM""Q& /4=+T22 /4=,d33 / a &#-  /  b!'3. 0rclttjt|t gSr,)rr r"rrrs r parenthesizers#  FHHdFHH5 6 66r> allanymaxminsetsumr!tuplesorted enumeratec#`K t||}|r|Vt||}|dSdSr,)getattr)r/r0nexts r attr_chainrsZ  3  D # tT"" #####rzefor_stmt< 'for' any 'in' node=any ':' any* > | comp_for< 'for' any 'in' node=any any* > z power< ( 'iter' | 'list' | 'tuple' | 'sorted' | 'set' | 'sum' | 'any' | 'all' | 'enumerate' | (any* trailer< '.' 'join' >) ) trailer< '(' node=any ')' > any* > z` power< ( 'sorted' | 'enumerate' ) trailer< '(' arglist ')' > any* > Fcj tsMtjtatjtatjt adattt g}t |t|dD]*\}}i}|||r |d|urdS+dS)NTparentr@F) pats_builtrcompile_patternp0p1p2ziprmatch)r@patternspatternrrs rin_special_contextrs   $R ( (  $R ( (  $R ( ( B|HxD()C)CDD == ) ) gfo.E.E44 5rc |j}||jtjkrdS|j}|jt jt jfvrdS|jt jkr|j d|urdS|jt j ks;|jt j kr(||jtj ks|j d|urdSdS)NFrT) prev_siblingrrr8rr funcdefclassdef expr_stmtrx parameters typedargslistr4)r@prevrs ris_probably_builtinrs  D DI22u [F {t|T]333u {dn$$);t)C)Cu {do%% [D. . .  $)u{":": OA $ & &u 4rc |_|jtjkrAt|jdkr)|jd}|jt jkr|jS|j}|_dS)NrrrL) rr suiterrxrINDENTrr)r@indents rfind_indentationrsg)   9 " "s4='9'9A'='=]1%F{el**|#{   2rc|jtjkr|S|}|jdc}|_t tj|g}||_|Sr,)rr rr:rr)r@rrs r make_suitersR yDJ ::<.is_import_stmt>s4 T--,$-,$-*++ -rrrrqrr)rrrrxrr rrrZrrrr(rzrJr;) rr)r@rroot insert_posoffsetidxnode2import_rxs r touch_importr;s$--- T??Dt,,Jt}-- T~d##  &t}STT':;;  MFE!>%((  6\  Q"4=11  IC T---$--}Q$ 44 1W t' X & & T# . . .*    WtEJS'I'I'I&JKK#Hj$t'7"B"BCCCCCrc R |jD]}d}|jtjkrNt ||jdr|cSt |t |jd|}|r|}n|jtjtjfvr/t |t |jd|}|r|}nK|jtj krt |t |jd|}|r|}nt|jddD]U\}}|jtj kr;|j dkr0t |t |j|dz|}|r|}Vn|jtvr|jdj |kr|}nmt|||r|}nY|jtjkrt |||}n2|jtjkrt ||jdr|}|r|s|cSt%|r|cSdS)Nrrrr:r)rxrr for_stmt_findrrif_stmt while_stmttry_stmtrrCOLONr _def_syms_is_import_bindingrrr)r)r@rchildretrPikids rrris8("" : & &T5>!,--  T:enR.@#A#A7KKAM# ZDL$/: : :T:enR.@#A#A7KKAM# Z4= ( (T:enQ.?#@#@'JJA &'qrr(:;;&&FAsx5;..393C3C(z%.1:M/N/NPWXX Ac & Z9 $ $):)@D)H)HCC tW 5 5 CC Z4+ + +tUG44CC Z4> ) )T5>!,--     ~~  4rc|g}|rl|}|jdkr)|jtvr||jn"|jt jkr |j|kr|S|ldS)N)popr _block_symsextendrxrr(r)r)r@nodess rrrs} FE yy{{ 9s??ty ;; LL ' ' ' ' Y%* $ $t););K  4rc0 |jtjkr|s|jd}|jtjkr`|jD]V}|jtjkr|jdj|kr|cS2|jtjkr|j|kr|cSWn{|jtjkr1|jd}|jtjkr |j|kr|Sn5|jtjkr |j|kr|Sn|jtj kr|r2t|jd |krdS|jd}|rtd|rdS|jtj krt||r|S|jtjkr0|jd}|jtjkr |j|kr|Sn;|jtjkr |j|kr|S|r|jtjkr|SdS)Nrrrras)rr rrxdotted_as_namesdotted_as_namerrr(rtstrstriprrsimport_as_nameSTAR)r@r)rryrlastrPs rrrs) yD$$$W$mA 8t+ + +  :!444~a(.$66# 7Z5:--%+2E2EKKK  X, , ,<#DyEJ&&4:+=+= X # # T(9(9K d& & &  s4=+,,2244??4 M!   uT1~~ 4 Vt+ + +dA +K Vt* * *JqMEzUZ''EK4,?,? Vuz ! 
!agooK  5:--K 4rr,)NN)5pgen2rpytreerrpygramrr rLrrrrr%r*r1r5r.rArErJrMrQrXr\rnrzrrrrconsuming_callsrrrrrrrrrrrrrrrrrr-rrrrrrrs7******:::!!!!!! H H H1111444"""    &&((%%%###0000999 555522220&8 / / /000777...###&  &.===*D*D*DZ]DL ) ((((T|T]DL9 ''''''r__pycache__/pygram.cpython-311.pyc000064400000004010151027012300012737 0ustar00 !A?hdZddlZddlmZddlmZddlmZejej e dZ ejej e dZ Gd d e Zejd e ZeeZeZejd =eZejd =ejd e ZeeZdS)z&Export the Python grammar and symbols.N)token)driver)pytreez Grammar.txtzPatternGrammar.txtceZdZdZdS)Symbolscf|jD]\}}t|||dS)zInitializer. Creates an attribute for each grammar symbol (nonterminal), whose value is the symbol's type (an int >= 256). N) symbol2numberitemssetattr)selfgrammarnamesymbols '/usr/lib64/python3.11/lib2to3/pygram.py__init__zSymbols.__init__sE $17799 ( (LD& D$ ' ' ' ' ( (N)__name__ __module__ __qualname__rrrrrs#(((((rrlib2to3printexec)__doc__ospgen2rrrpathjoindirname__file__ _GRAMMAR_FILE_PATTERN_GRAMMAR_FILEobjectrload_packaged_grammarpython_grammarpython_symbolscopy!python_grammar_no_print_statementkeywords*python_grammar_no_print_and_exec_statementpattern_grammarpattern_symbolsrrrr/sO-,  RW__X66 FF  RW__X%>%>%9;; ( ( ( ( (f ( ( (.-iGG(($2$7$7$9$9!%.w7-N-S-S-U-U*.7?.&.y:OPP'/**r__pycache__/patcomp.cpython-311.opt-2.pyc000064400000023002151027012300014045 0ustar00 !A?h dZddlZddlmZmZmZmZmZmZddl m Z ddl m Z Gdde Z d ZGd d eZejejejdd Zd ZdZdZdS)z#Guido van Rossum N)driverliteralstokentokenizeparsegrammar)pytree)pygramceZdZdS)PatternSyntaxErrorN)__name__ __module__ __qualname__(/usr/lib64/python3.11/lib2to3/patcomp.pyr r sDrr c#K tjtjtjh}t jt j|j}|D]}|\}}}}}||vr|VdSN) rNEWLINEINDENTDEDENTrgenerate_tokensioStringIOreadline) inputskiptokens quintupletypevaluestartend line_texts rtokenize_wrapperr&sv@ M5< 6D  %bk%&8&8&A B BF -6*eUC t  OOOrc2eZdZddZd dZdZddZdZdS) PatternCompilerNcN |#tj|_tj|_n7t j||_tj|j|_tj|_ tj |_ t j |jt|_dS)N)convert)r pattern_grammarr pattern_symbolssymsr load_grammarSymbolspython_grammar pygrammarpython_symbolspysymsDriverpattern_convert)self grammar_files r__init__zPatternCompiler.__init__(sz   !1DL.DII!.|<.0chr6s r z0PatternCompiler.compile_node..Os'GGGbD%%b))GGGrrcg|]}|gSrr)rGas rrIz0PatternCompiler.compile_node..Rs':':':':':':rminmaxc:g|]}|SrrErFs rrIz0PatternCompiler.compile_node..Vs'CCCrT&&r**CCCr)r!r-Matcherchildren Alternativeslenr WildcardPatternoptimize Alternative NegatedUnit compile_basicNegatedPatternrEQUALr"RepeaterSTARHUGEPLUSLBRACEget_intname) r6nodealtspunitspatternrenodesrepeatrUchildrNrOs ` rr>zPatternCompiler.compile_nodeCs 9 ) ) )=#D 9 . . 
.GGGGDM##A#4FGGGD4yyA~~Aw&':':T':':':qIIIA::<<  9 - - -CCCCT]CCCE5zzQQx&wA1===A::<<  9 - - -((qrr):;;G%g..A::<<   u::??uQx} ;;8>D!""IE u::??uRy~1CCC2YF#2#JE$$UF33  HQKEzUZ''kuz))ku|++!LL!555cx==A%%,,x{33Caxx3!88!**,, 07)#3OOO  GL!!!rc|d}|jtjkrHtt j|j}tjt||S|jtj kr|j}| rS|tvrtd|z|ddrtdtjt|S|dkrd}n?|ds*t|j|d}|td|z|ddr(||djdg}nd}tj||S|jdkr||dS|jd kr4||d}tj|ggdd SdS) NrzInvalid token: %rrzCan't have details for tokenany_zInvalid symbol: %r([rM)r!rSTRINGr=r evalStringr"r LeafPattern_type_of_literalNAMEisupper TOKEN_MAPr startswithgetattrr3r>rU NodePatternrX)r6rkrlrfr"r!content subpatterns rr\zPatternCompiler.compile_basicsQx 9 $ $+DJ7788E%&6u&=&=uEE E Y%* $ $JE}} 9 )),-@5-HIII9M,-KLLL))E*:;;;E>>DD))#..O"4;t<r\rdrrrr(r(&sw K K K K + + + +E"E"E"N!!!!Frr()rwrsNUMBERTOKENc|dr tjS|tjvrtj|SdS)Nr)isalpharrwr opmap)r"s rrvrvsA Qxz '-  }U##trc |\}}}}|s ||jvrtj|||Stj|||S)N)context) number2symbolr NodeLeaf)r raw_node_infor!r"rrUs rr5r5sVC%2"D%(947000{47;;;;{48888rcDt|Sr)r(rB)rjs rrBrBs    , ,W 5 55r) __author__rpgen2rrrrrr r r Exceptionr r&objectr(rwrsrryrvr5rBrrrrs83  EDDDDDDDDDDDDDDD        IIIIIfIIIZZ||   99966666r__pycache__/pygram.cpython-311.opt-2.pyc000064400000003503151027012300013705 0ustar00 !A?h ddlZddlmZddlmZddlmZejeje dZ ejeje dZ Gdd e Z ejd e Ze eZeZejd =eZejd =ejd e Ze eZdS) N)token)driver)pytreez Grammar.txtzPatternGrammar.txtceZdZdZdS)Symbolsch |jD]\}}t|||dS)N) symbol2numberitemssetattr)selfgrammarnamesymbols '/usr/lib64/python3.11/lib2to3/pygram.py__init__zSymbols.__init__sJ $17799 ( (LD& D$ ' ' ' ' ( (N)__name__ __module__ __qualname__rrrrrs#(((((rrlib2to3printexec)ospgen2rrrpathjoindirname__file__ _GRAMMAR_FILE_PATTERN_GRAMMAR_FILEobjectrload_packaged_grammarpython_grammarpython_symbolscopy!python_grammar_no_print_statementkeywords*python_grammar_no_print_and_exec_statementpattern_grammarpattern_symbolsrrrr.sL-  RW__X66 FF  RW__X%>%>%9;; ( ( ( ( (f ( ( (.-iGG(($2$7$7$9$9!%.w7-N-S-S-U-U*.7?.&.y:OPP'/**r__pycache__/__main__.cpython-311.opt-1.pyc000064400000000475151027012300014132 0ustar00 !A?hCLddlZddlmZejeddS)N)mainz lib2to3.fixes)sysrexit)/usr/lib64/python3.11/lib2to3/__main__.pyr sB o  r__pycache__/refactor.cpython-311.opt-1.pyc000064400000113036151027012300014215 0ustar00 !A?hsk@dZdZddlZddlZddlZddlZddlZddlZddlZddl m Z ddl m Z m Z mZddlmZddlmZmZdd lmZdd ZGd d eZdZdZdZdZdZGddeZGddeZ GddeZ!Gdde Z"dS)zRefactoring framework. Used as a main program, this can refactor any number of files and/or recursively descend down directories. Imported as a module, this provides infrastructure to write your own refactoring tool. z#Guido van Rossum N)chain)drivertokenizetoken) find_root)pytreepygram) btm_matcherTct|ggdg}g}tj|jD]<\}}}|dr!|r |dd}||=|S)zEReturn a sorted list of all available fix names in the given package.*fix_N) __import__pkgutil iter_modules__path__ startswithappend) fixer_pkg remove_prefixpkg fix_namesfindernameispkgs )/usr/lib64/python3.11/lib2to3/refactor.pyget_all_fix_namesrs YB . .CI&3CLAA##e ??6 " " # ABBx   T " " " ceZdZdS) _EveryNodeN__name__ __module__ __qualname__rrr!r!+Drr!ct|tjtjfr|jt |jhSt|tjr"|jrt|jSt t|tj rAt}|jD])}|D]$}| t|%*|Std|z)zf Accepts a pytree Pattern Node and returns a set of the pattern types which will match first. Nz$Oh no! I don't understand pattern %s) isinstancer NodePattern LeafPatterntyper!NegatedPatterncontent_get_head_typesWildcardPatternsetupdate Exception)patrpxs rr/r//s#*F,>?@@ 8  z#v,-- ; 0"3;// /#v-.. 
EE - -A - -++,,,, - :SA B BBrcZtjt}g}|D]}|jr[ t |j}|D]}|||?#t $r||Y`wxYw|j!||j|||ttj j tj j D]}|||t|S)z^ Accepts a list of fixers and returns a dictionary of head node type --> fixer list. ) collections defaultdictlistpatternr/rr! _accept_typerr python_grammar symbol2numbervaluestokensextenddict) fixer_list head_nodeseveryfixerheads node_types r_get_headnode_dictrJKsL(..J E $ $ = $ 8' 66"'88Iy)0077778 $ $ $ U##### $ !-5-.55e<<<< U####60>EEGG!0799,, 9$$U++++   sAA?>A?c<fdtdDS)zN Return the fully qualified names for fixers in the package pkg_name. c g|] }dz|z S.r&).0fix_namepkg_names r z+get_fixers_from_package..hs8 @ @ @ sNX % @ @ @rF)r)rQs`rget_fixers_from_packagerSds@ @ @ @ @-h>> @ @ @@rc|SNr&)objs r _identityrWks Jrchd}tjtj|jfd}t t jtjt j h}t} |\}}||vr|t j kr|rnd}n|t j kr|dkr|\}}|t j ks|dkrn|\}}|t j ks|dkrn|\}}|t j kr|dkr |\}}|t j krV|||\}}|t j ks|dkrn|\}}|t j kVnn n#t$rYnwxYwt |S) NFcBt}|d|dfS)Nrr)next)tokgens radvancez(_detect_future_features..advancers 3ii1vs1v~rTfrom __future__import(,)rgenerate_tokensioStringIOreadline frozensetrNEWLINENLCOMMENTr1STRINGNAMEOPadd StopIteration)sourcehave_docstringr]ignorefeaturestpvaluer\s @r_detect_future_featuresrvosN  "2;v#6#6#? @ @C x{EMB C CFuuH   IBV||u|##!!%uz!!evoo#GII E##u '<'<#GII E##u'8'8#GII E>>esll ' IBEJ&&LL''' ' IBUX~~# ' IB EJ&&3 4      X  s3D!F F"!F"ceZdZdZdS) FixerErrorzA fixer could not be loaded.N)r#r$r%__doc__r&rrrxrxs&&&&rrxceZdZddddZdZdZddZdZdZd Z d Z d Z dd Z dd Z dZddZdZd dZdZdZ d!dZd"dZdZdZdZdZdZdZdZdZdS)#RefactoringToolF)print_function exec_functionwrite_unchanged_filesFixrNcP||_|pg|_|j|_||j|t j|_|jdr|jj d=n|jdr |jj d=|j d|_ g|_ tjd|_g|_d|_t%j|jt(j|j |_|\|_|_g|_t5j|_g|_g|_t?|j|jD]k}|j r|j!|$||jvr|j"|H||jvr|j"|ltG|j|_$tG|j|_%dS) zInitializer. Args: fixer_names: a list of fixers to import options: a dict with configuration. explicit: a list of fixers to run even if they are explicit. Nr|printr}execr~r{F)convertlogger)&fixersexplicit_default_optionscopyoptionsr2r r>grammarkeywordsgetr~errorslogging getLoggerr fixer_logwroterDriverr r get_fixers pre_order post_orderfilesbm BottomMatcherBM bmi_pre_orderbmi_post_orderr BM_compatible add_fixerrrJbmi_pre_order_headsbmi_post_order_heads)self fixer_namesrrrGs r__init__zRefactoringTool.__init__s"  B ,1133   L   ( ( (,1133 <( ) . %g.. \/ * . %f- &*\%5%56M%N%N" '(9::  mDL,2N+/;888 +///*;*;' "$$ 4?DN;; 2 2E" 2!!%(((($.(("))%0000$/))#**5111#5d6H#I#I $6t7J$K$K!!!rcg}g}|jD]}t|iidg}|ddd}||jr|t |jd}|d}|jdd|Dz} t||}n$#t$rtd |d|dwxYw||j |j } | jr*|jd ur!||jvr|d |!|d || jd kr|| Y| jdkr|| {td| jzt'jd} || || ||fS)aInspects the options to load the requested patterns and handlers. Returns: (pre_order, post_order), where pre_order is the list of fixers that want a pre-order AST traversal, and post_order is the list that want post-order traversal. r rNrN_c6g|]}|Sr&)title)rOr6s rrRz.RefactoringTool.get_fixers..s 5O5O5OAaggii5O5O5Orz Can't find TzSkipping optional fixer: %szAdding transformation: %sprepostzIllegal fixer order: %r run_orderkey)rrrsplitr FILE_PREFIXlensplit CLASS_PREFIXjoingetattrAttributeErrorrxrrr log_message log_debugorderroperator attrgettersort) rpre_order_fixerspost_order_fixers fix_mod_pathmodrPparts class_name fix_classrGkey_funcs rrzRefactoringTool.get_fixerss" K J JL\2rC599C#**32226H""4#344 <#C(8$9$9$:$:;NN3''E*RWW5O5O5O5O5O-P-PPJ X#C44 ! 
X X X jxxx!LMMSWW XIdlDN;;E~ $-t";";  55  !>III NN6 A A A{e## ''....&&!((//// !:U[!HIII&{33(+++8,,, "344s 1C!C#c)zCalled when an error occurs.r&)rmsgargskwdss r log_errorzRefactoringTool.log_errors rcH|r||z}|j|dS)zHook to log a message.N)rinforrrs rrzRefactoringTool.log_messages/  *C rcH|r||z}|j|dSrU)rdebugrs rrzRefactoringTool.log_debug s/  *C #rcdS)zTCalled with the old version, new version, and filename of a refactored file.Nr&)rold_textnew_textfilenameequals r print_outputzRefactoringTool.print_outputs  rc|D]P}tj|r||||9||||QdS)z)Refactor a list of files and directories.N)ospathisdir refactor_dir refactor_file)ritemswrite doctests_only dir_or_files rrefactorzRefactoringTool.refactorsn! F FKw}}[)) F!!+umDDDD"";}EEEE  F Frctjdz}tj|D]\}}}|d||||D]w}|ds`tj|d|kr7tj||} | | ||xd|D|dd<dS)zDescends down a directory and refactor every Python file found. Python files are assumed to have a .py extension. Files and subdirectories starting with '.' are skipped. pyzDescending into %srNrc<g|]}|d|SrM)r)rOdns rrRz0RefactoringTool.refactor_dir..2s)KKK" c8J8JK2KKKrN) rextsepwalkrrrrsplitextrr) rdir_namerrpy_extdirpathdirnames filenamesrfullnames rrzRefactoringTool.refactor_dir sT!,.GH,=,= L L (GXy NN/ 9 9 9 MMOOO NN   ! G G,,GG$$T**1-77!w||GT::H&&x FFFKKKKKHQQQKK L Lrc t|d}n/#t$r"}|d||Yd}~dSd}~wwxYw tj|jd}|n#|wxYwtj|d|d5}||fcdddS#1swxYwYdS) zG Do our best to decode a Python source file correctly. rbzCan't open %s: %sNNNrr5rencodingnewline) openOSErrorrrdetect_encodingrfcloserdread)rrferrrs r_read_python_sourcez#RefactoringTool._read_python_source4s" Xt$$AA    NN.# > > >:::::  / ;;A>H GGIIIIAGGIIII WXsXr B B B &a6688X% & & & & & & & & & & & & & & & & & &s. ?:?A77B (C  CCc||\}}|dS|dz }|rl|d||||}|js||kr||||||dS|d|dS|||}|js |r7|jr0|t|dd|||dS|d|dS)zRefactors a file.N zRefactoring doctests in %szNo doctest changes in %sr)rrzNo changes in %s)rrrefactor_docstringr~processed_filerefactor_string was_changedstr)rrrrinputroutputtrees rrzRefactoringTool.refactor_fileDs@228<<x = F    = NN7 B B B,,UH==F) EVu__##FHeUHMMMMM98DDDDD''x88D) =d =t7G =##CIIcrcNH*/($DDDDD18<<<<}|d||jj |Yd}~|j|j_dSd}~wwxYw |j|j_n#|j|j_wxYw||_ | d|| |||S)aFRefactor a given input string. Args: data: a string holding the code to be refactored. name: a human-readable name for use in error/log messages. Returns: An AST corresponding to the refactored input stream; None if there were errors during the parse. r|zCan't parse %s: %s: %sNzRefactoring %s) rvr !python_grammar_no_print_statementrr parse_stringr3r __class__r#future_featuresr refactor_tree)rdatarrsrrs rrzRefactoringTool.refactor_string[s +400 x ' '"("JDK  /;++D11DD    NN3!7 > > > FFF"&,DK       #',DK  $,DK  . . . .' '... 4&&& s/AB$ B"B 2B$ BB$$B7ctj}|rh|d||d}|js||kr||d|dS|ddS||d}|js |r-|jr&|t|d|dS|ddS)NzRefactoring doctests in stdinzzNo doctest changes in stdinzNo changes in stdin) sysstdinrrrr~rrrr)rrrrrs rrefactor_stdinzRefactoringTool.refactor_stdinvs     6 NN: ; ; ;,,UI>>F) >Vu__##FIu=====<=====''y99D) 6d 6t7G 6##CIIy%@@@@@455555rct|j|jD]}|||||j|||j||j| }t| r|jj D]}||vr||r|| tjjd|jr+|| tjjt'||D]8}|||vr||| t+|n#t,$rYEwxYw|jr ||jvrZ||}|r|||}||||D]*}|jsg|_|j|+|j| }|D],} | |vrg|| <|| || -:t| t|j|jD]}||||jS)aRefactors a parse tree (modifying the tree in place). For compatible patterns the bottom matcher module is used. Otherwise the tree is traversed node-to-node for matches. Args: tree: a pytree.Node instance representing the root of the tree to be refactored. name: a human-readable name for this tree. Returns: True if the tree was modified, False otherwise. 
T)rreverser)rrr start_tree traverse_byrrrrunleavesanyr@rrr Basedepthkeep_line_order get_linenor;remover ValueErrorfixers_appliedmatch transformreplacerrB finish_treer) rrrrG match_setnoderesultsnew new_matchesfxrs rr zRefactoringTool.refactor_trees% 4>4?;; ) )E   T4 ( ( ( ( 14>>3C3CDDD 2DOO4E4EFFFGKK .. )""$$%%/ L. L. LI%%)E*:%e$))fk.?)NNN,J"%(--&+2H-III $Yu%5 6 6$L$L9U#333%e,33D999%%dOOOO)%%%%H%  .%5D%A>@(;$($7$>$>u$E$E$E$E/3gkk#**,,.G.G +6!L!LC+.)+;+;79 #$-cN$9$9+c:J$K$K$K$K_)""$$%%/ Lb4>4?;; * *E   dD ) ) ) )sF%% F21F2c|sdS|D]X}||jD]H}||}|r/|||}||||}IYdS)aTraverse an AST, applying a set of fixers to each node. This is a helper method for refactor_tree(). Args: fixers: a list of fixer instances. traversal: a generator that yields AST nodes. Returns: None N)r,rrr)rr traversalr"rGr#r$s rrzRefactoringTool.traverse_bys  F # #D * # #++d++#//$88C S)))"  # # #rc^|j||||d}|dS||k}||||||r|d||jsdS|r|||||dS|d|dS)zR Called when a file has been refactored and there may be changes. NrzNo changes to %szNot writing changes to %s)rrrrrr~ write_file)rrrrrrrs rrzRefactoringTool.processed_files (###  //99!>>   NN-x 8 8 8-   B OOHh( C C C C C NN6 A A A A Arc tj|d|d}n/#t$r"}|d||Yd}~dSd}~wwxYw|5 ||n.#t$r!}|d||Yd}~nd}~wwxYwdddn #1swxYwY|d|d|_dS) zWrites a string to a file. It first shows a unified diff between the old text and the new text, and then rewrites the file; the latter is only done if the write option is set. wrrzCan't create %s: %sNzCan't write %s: %szWrote changes to %sT)rdrrrrrr)rrrrrfprs rr*zRefactoringTool.write_filesX 32FFFBB    NN0(C @ @ @ FFFFF  D D D"""" D D D3XsCCCCCCCC D D D D D D D D D D D D D D D D ,h777 sP AAA BA$#B$ B.B B BBB"%B"z>>> z... c g}d}d}d}d}|dD])}|dz }||jrW|+|||||||}|g}||j} |d| }|V|||jzs#|||jzdzkr| ||+||||||d}d}| |+|+||||||d |S)aRefactors a docstring, looking for doctests. This returns a modified version of the input string. It looks for doctests, which start with a ">>>" prompt, and may be continued with "..." prompts, as long as the "..." is indented the same as the ">>>". (Unfortunately we can't use the doctest module's parser, since, like most parsers, it is not geared towards preserving the original source.) NrTkeependsrrr) splitlineslstriprPS1rBrefactor_doctestfindPS2rstriprr) rrrresultblock block_linenoindentlinenolineis rrz"RefactoringTool.refactor_docstrings $$d$33 $ $D aKF{{}}''11 $$MM$"7"7|8>#J#JKKK% IIdh''bqb$??6DH#455%6DHOO$5$55<<< T""""$MM$"7"7|8>#J#JKKK d####   MM$//|06BB C C Cwwvrc ||}n#t$r}jtjr.|D]+}d|d,d|||j j ||cYd}~Sd}~wwxYw ||rt| d}|d|dz ||dz d}} |dds|dxxdz cc<jz|d zg}|r|fd |Dz }|S) zRefactors one doctest. A doctest is given as a block of lines, the first of which starts with ">>>" (possibly indented), while the remaining lines start with "..." (identically indented). 
z Source: %srz+Can't parse docstring in %s line %s: %s: %sNTr/rrrc*g|]}jz|zSr&)r6)rOr=r;rs rrRz4RefactoringTool.refactor_doctest..^s%CCCt&48+d2CCCr) parse_blockr3r isEnabledForrDEBUGrr7rrr#r rr1endswithr3pop) rr9r<r;rrrr=r$clippeds ` ` rr4z RefactoringTool.refactor_doctestDs ##E66::DD   {'' 66 D!DDDNN<T1B1BCCCC NNH#VS]-CS J J JLLLLLL     dH - - Dd))&&&55Cyqy>3vaxyy>SGr7##D)) B4dh&34E DCCCCCsCCCC s B'A6B"B'"B'c6|jrd}nd}|js|d|n5|d||jD]}|||jr4|d|jD]}|||jrut |jdkr|dn(|dt |j|jD]\}}}|j|g|Ri|dSdS) Nwerez need to bezNo files %s modified.zFiles that %s modified:z$Warnings/messages while refactoring:rzThere was 1 error:zThere were %d errors:)rrrrrr)rrHfilemessagerrrs r summarizezRefactoringTool.summarizeask : DDDz '   4d ; ; ; ;   6 = = =  ' '  &&&& > *   C D D D> * *  )))) ; 54;1$$  !56666  !8#dk:J:JKKK#'; 5 5T4  4t444t4444  5 5  5 5rc|j||||}t|_|S)zParses a block into a tree. This is necessary to get correct line number / offset information in the parser diagnostics and embedded into the parse tree. )r parse_tokens wrap_toksrgr)rr9r<r;rs rrAzRefactoringTool.parse_blockxs: {''uff(M(MNN({{ rc#Ktj|||j}|D]+\}}\}}\} } } ||dz z }| |dz z } ||||f| | f| fV,dS)z;Wraps a tokenize stream to systematically modify start/end.rN)rrc gen_lines__next__) rr9r<r;rAr,ruline0col0line1col1 line_texts rrNzRefactoringTool.wrap_tokss)$..*G*G*PQQDJ G G @D%% y VaZ E VaZ E t}udmYF F F F F G Grc#K||jz}||jz}|}|D]h}||r|t|dVn5||dzkrdVnt d|d||}i dV)zGenerates lines as expected by tokenize from a list of lines. This strips the first len(indent + self.PS1) characters off each line. Nrzline=z , prefix=Tr)r3r6rrr7AssertionError)rr9r;prefix1prefix2prefixr=s rrPzRefactoringTool.gen_liness 48#48#  Dv&& L3v;;<<(((((4/// $nTTT66%JKKKFF HHH rr)FF)F)NFNrU)r#r$r%rrrrrrrrrrrrrrrr rrr*r3r6rr4rKrArNrPr&rrr{r{s+0).2799LK3L3L3L3Ln&5&5&5P     FFFFLLLL(&&& ====.66666 M M M ^###.GL $BBBB** C C)))V:555. G G Grr{ceZdZdS)MultiprocessingUnsupportedNr"r&rrr]r]r'rr]cBeZdZfdZ dfd ZfdZfdZxZS)MultiprocessRefactoringToolcdtt|j|i|d|_d|_dSrU)superr_rqueue output_lockrrkwargsrs rrz$MultiprocessRefactoringTool.__init__s;9)40094J6JJJ rFrc|dkr*tt|||S ddln#t$rt wxYwjtd_ _ fdt|D} |D]}| tt|||j t|D]}jd|D]*}|r| +d_dS#j t|D]}jd|D]*}|r| +d_wxYw)Nrrz already doing multiple processescFg|]}jS))target)Process_child)rOr>multiprocessingrs rrRz8MultiprocessRefactoringTool.refactor..s<444%,,DK,@@444r)rar_rrk ImportErrorr]rb RuntimeError JoinableQueueLockrcrangestartrputis_alive) rrrr num_processes processesr6r>rkrs ` @rrz$MultiprocessRefactoringTool.refactors/ A  4d;;DDum-- - - " " " " " - - -, , - : !ABB B$2244 *//1144444#M22444     -t 4 4 = =eU>K M M M JOO   =)) % % t$$$$  ::<<FFHHHDJJJ JOO   =)) % % t$$$$  ::<<FFHHHDJ    s:A 4AE22A;G-c4|j}|{|\}} tt|j|i||jn#|jwxYw|j}|ydSdSrU)rbrrar_r task_done)rtaskrrers rrjz"MultiprocessRefactoringTool._childsz~~LD& 'F1488F%#%%% $$&&&& $$&&&&:>>##Ds AA8c|j|j||fdStt|j|i|SrU)rbrrrar_rrds rrz)MultiprocessRefactoringTool.refactor_filesV : ! JNND&> * * * * *I54d;;I!!! !r)FFr)r#r$r%rrrjr __classcell__)rs@rr_r_s     :? : $ $ $ $ $!!!!!!!!!rr_)T)#ry __author__rdrrr rrr9 itertoolsrpgen2rrr fixer_utilrrr r r rrr3r!r/rJrSrWrvrxobjectr{r]r_r&rrrs3   +*********!!!!!!            
CCC82@@@%%%P''''''''FFFFFfFFFR        4!4!4!4!4!/4!4!4!4!4!r__pycache__/pytree.cpython-311.opt-2.pyc000064400000064337151027012300013732 0ustar00 !A?hFm dZddlZddlmZdZiadZGddeZGdd eZ Gd d eZ d Z Gd deZ Gdde Z Gdde ZGdde ZGdde ZdZdS)z#Guido van Rossum N)StringIOictsGddlm}|jD]'\}}t |t kr |t|<(t||S)N)python_symbols) _type_reprspygramr__dict__itemstypeint setdefault)type_numrnamevals '/usr/lib64/python3.11/lib2to3/pytree.py type_reprrsq 9******(06688 9 9ID#CyyCDS!1  ! !(H 5 55ceZdZ dZdZdZdZdZdZdZ dZ dZ dZ dZ d Zd Zd Zd Zd ZedZedZdZdZdZejdkrdZdSdS)BaseNFc8 t|SNobject__new__clsargskwdss rrz Base.__new__1sE~~c"""rcX |j|jurtS||Sr) __class__NotImplemented_eqselfothers r__eq__z Base.__eq__6s. > 0 0! !xxrc trNotImplementedErrorr$s rr#zBase._eqBs "!rc trr)r%s rclonez Base.cloneM "!rc trr)r,s r post_orderzBase.post_orderUr.rc trr)r,s r pre_orderzBase.pre_order]r.rc> t|ts|g}g}d}|jjD]5}||ur|||d} ||6|j||j_|D]}|j|_d|_dSNFT) isinstancelistparentchildrenextendappendchanged)r%new l_childrenfoundchxs rreplacez Base.replacees=#t$$ %C +& & &BTzz?%%c***!!"%%%% )  # #A{AHH rc |}t|ts+|jsdS|jd}t|t+|jSNr)r5Leafr8linenor%nodes r get_linenozBase.get_lineno|sUGT4(( $= =#DT4(( ${rcT|jr|jd|_dSNT)r7r; was_changedr,s rr;z Base.changeds. ; " K   ! ! !rc |jrTt|jjD]<\}}||ur1|j|jj|=d|_|cS;dSdSr)r7 enumerater8r;)r%irGs rremovez Base.removes  ; $T[%9::  44<<K''))) ,Q/"&DKHHH      rc |jdSt|jjD]3\}}||ur* |jj|dzcS#t$rYdSwxYw4dS)Nr)r7rMr8 IndexErrorr%rNchilds r next_siblingzBase.next_siblings  ; 4"$+"677  HAu}} ;/!4444!   444   sA AAc |jdSt|jjD])\}}||ur |dkrdS|jj|dz cS*dSNrr)r7rMr8rRs r prev_siblingzBase.prev_siblingsz  ; 4"$+"677 1 1HAu}}6644{+AaC0000 1 1rc#RK|jD]}|Ed{VdSr)r8leavesr%rSs rrYz Base.leavessD] & &E||~~ % % % % % % % % & &rcL|jdSd|jzSrV)r7depthr,s rr\z Base.depths( ; 14;$$&&&&rc( |j}|dS|jSN)rTprefix)r%next_sibs r get_suffixzBase.get_suffixs$ $  2rrcFt|dS)Nascii)strencoder,s r__str__z Base.__str__st99##G,, ,r)__name__ __module__ __qualname__r r7r8rK was_checkedrr'__hash__r#r-r0r2rArHr;rOpropertyrTrWrYr\rbsys version_inforirrrrrs[ D FHKK### H " " """""""""".       X  1 1X 1&&&'''  &   - - - - -! rrceZdZ ddZdZdZejdkreZdZ dZ dZ d Z e d Zejd Zd Zd ZdZdS)NodeNc ||_t||_|jD] }||_ |||_|r|dd|_dSd|_dSr)r r6r8r7r`fixers_applied)r%r r8contextr`rur?s r__init__z Node.__init__sq  X -  BBII   DK  '"0"3D   "&D   rc\ |jjdt|jd|jdSN(, ))r!rjrr r8r,s r__repr__z Node.__repr__s97#~666(3333#}}}. .rc^ dtt|jSr^)joinmaprgr8r,s r __unicode__zNode.__unicode__s' wws3 ..///rrcc@ |j|jf|j|jfkSr)r r8r$s rr#zNode._eqs"- 4=)ej%.-IIIrcZ t|jd|jD|jS)Nc6g|]}|Sr)r-).0r?s r zNode.clone..s CCCr CCCrru)rsr r8rur,s rr-z Node.clones92DICCT]CCC#'#6888 8rc#\K |jD]}|Ed{V|VdSr)r8r0rZs rr0zNode.post_ordersN8] * *E'')) ) ) ) ) ) ) ) ) rc#\K |V|jD]}|Ed{VdSr)r8r2rZs rr2zNode.pre_order sR7 ] ) )E(( ( ( ( ( ( ( ( ( ) )rc: |jsdS|jdjS)Nr_rr8r`r,s rr`z Node.prefixs( } 2}Q&&rc<|jr||jd_dSdSrCrr%r`s rr`z Node.prefixs+ = -&,DM!  # # # - -rcv ||_d|j|_||j|<|dSr)r7r8r;rRs r set_childzNode.set_child s<  "& a  a rct ||_|j|||dSr)r7r8insertr;rRs r insert_childzNode.insert_child*s9   Q&&& rcr ||_|j||dSr)r7r8r:r;rZs r append_childzNode.append_child3s7   U### rNNN)rjrkrlrwr}rrprqrir#r-r0r2ror`setterrrrrrrrsrss 5 $''''2... 
000 &  JJJ888  ))) ''X' ]--]-rrsceZdZ dZdZdZddgfdZdZdZe j dkreZ dZ d Z d Zd Zd Zed ZejdZdS)rDr_rNc ||\|_\|_|_||_||_|||_|dd|_dSr)_prefixrEcolumnr valueru)r%r rrvr`rus rrwz Leaf.__init__FsV   7> 4DL44;    !DL,QQQ/rcB |jjd|jd|jdSry)r!rjr rr,s rr}z Leaf.__repr__Ys/7#~666#yyy#zzz+ +rc< |jt|jzSr)r`rgrr,s rrzLeaf.__unicode___s {S__,,rrcc@ |j|jf|j|jfkSr)r rr$s rr#zLeaf._eqjs"- 4:&5:u{*CCCrcn t|j|j|j|j|jff|jS)Nr)rDr rr`rErrur,s rr-z Leaf.clonens=2DItz[4; "<=#'#6888 8rc#K|VdSrrr,s rrYz Leaf.leavests rc#K |VdSrrr,s rr0zLeaf.post_orderws8 rc#K |VdSrrr,s rr2zLeaf.pre_order{s7 rc |jSr)rr,s rr`z Leaf.prefixs |rc<|||_dSr)r;rrs rr`z Leaf.prefixs  r)rjrkrlrrErrwr}rrprqrir#r-rYr0r2ror`rrrrrDrD=s1G F F "0000&+++ --- &  DDD888 X  ]]rrDc |\}}}}|s ||jvr-t|dkr|dSt|||St|||S)Nrr)rv) number2symbollenrsrD)grraw_noder rrvr8s rconvertrss&."D%(242+++ x==A  A; D(G4444D%1111rcDeZdZ dZdZdZdZdZdZddZ ddZ dZ dS) BasePatternNc8 t|Srrrs rrzBasePattern.__new__sL~~c"""rct|j|j|jg}|r|d |d=|r|d |jjddtt|dS)Nrzr{r|) rr contentrr!rjrrrepr)r%rs rr}zBasePattern.__repr__sw$)$$dlDI> tBx'R tBx'>222DIIc$oo4N4N4N4NOOrc |Srrr,s roptimizezBasePattern.optimizes  rc |j|j|jkrdS|j5d}|i}|||sdS|r||||jr |||j<dSr4)r r _submatchupdater)r%rGresultsrs rmatchzBasePattern.matchs  9 TY$)%;%;5 < #A">>$** u "q!!!  49 !%GDI trcf t|dkrdS||d|S)NrFr)rr)r%nodesrs r match_seqzBasePattern.match_seqs5 u::??5zz%(G,,,rc#`K i}|r$||d|r d|fVdSdSdSrV)r)r%rrs rgenerate_matcheszBasePattern.generate_matchessX   TZZa!,, Q$JJJJJ    rr) rjrkrlr rrrr}rrrrrrrrrs  DG D### PPP 2----rrc&eZdZddZddZddZdS) LeafPatternNc: ||||_||_||_dSr)r rr)r%r rrs rrwzLeafPattern.__init__s.        rcj t|tsdSt|||SNF)r5rDrrr%rGrs rrzLeafPattern.match s48$%% 5  tW555rc$ |j|jkSr)rrrs rrzLeafPattern._submatchs |tz))rrr)rjrkrlrwrrrrrrrsP(6666 * * * * * *rrc"eZdZdZddZddZdS) NodePatternFNc ||@t|}t|D]!\}}t|trd|_"||_||_||_dSrJ)r6rMr5WildcardPattern wildcardsr rr)r%r rrrNitems rrwzNodePattern.__init__$sn     7mmG$W-- * *4dO44*%)DN   rc |jrTt|j|jD]7\}}|t |jkr|||dS8dSt |jt |jkrdSt |j|jD]\}}|||sdSdSNTF)rrrr8rrzipr)r%rGrcr subpatternrSs rrzNodePattern._submatchAs  > (t}EE  1DM*****q)))44+5 t|  DM 2 2 2 25!$T\4=!A!A   J##E733 uu trrr)rjrkrlrrwrrrrrr sAI:rrcNeZdZ ddedfdZdZd dZd dZdZdZ d Z d Z dS) rNrc |'ttt|}|D]}||_||_||_||_dSr)tuplerrminmaxr)r%rrrralts rrwzWildcardPattern.__init__ksY .  Cw//00G + +  rc> d}|jIt|jdkr1t|jddkr|jdd}|jdkrM|jdkrB|jt |jS|$|j|jkr|S|jdkrft|trQ|jdkrF|j|jkr6t|j|j|jz|j|jz|jS|S)Nrr)r) rrrrrrrr5r)r%rs rrzWildcardPattern.optimizes9 L $    " "s4<?';';q'@'@a+J 8q==TX]]|#" 2222%49 +G+G!**,,, HMMj_EEM Na  DI$@$@":#5#'8JN#:#'8JN#:#-?44 4 rc2 ||g|Sr)rrs rrzWildcardPattern.matchs5~~tfg...rc ||D]P\}}|t|kr8|3|||jrt |||j<dSQdSr)rrrrr6)r%rrrrs rrzWildcardPattern.match_seqs|B))%00  DAqCJJ&NN1%%%y9-1%[[ *tt  urc #0K |j^t|jdtt||jzD]#}i}|jr|d|||j<||fV$dS|jdkr||VdSttdr$tj }tt_ | |dD]$\}}|jr|d|||j<||fV%nJ#t$r=| |D]$\}}|jr|d|||j<||fV%YnwxYwttdr|t_ dSdS#ttdr |t_ wxYw)Nr bare_name getrefcountr)rrangerrrr_bare_name_matcheshasattrrpstderrr_recursive_matches RuntimeError_iterative_matches)r%rcountr save_stderrs rrz WildcardPattern.generate_matchess  < txSUTX-F-F)FGG  91#(%=AdiLQh    Y+ % %))%00 0 0 0 0 0 sM** (!j %ZZ  - $ 7 7q A A##HE1y5',VeV}$) (NNNN#  # # #!% 7 7 > >##HE1y5',VeV}$) (NNNN## #3 ..-!,CJJJ--73 ..-!,CJ,,,,s+;DE2AE E2 E  E22#Fc# K t|}d|jkrdifVg}|jD]5}t||D]"\}}||fV|||f#6|rg}|D]\}} ||kr||jkr}|jD]u}t|||dD]Z\} } | dkrOi}|| || || z|fV||| z|f[v|}|dSdSrC)rrrrr:rr) r%rnodelenrrrr new_resultsc0r0c1r1s rrz"WildcardPattern._iterative_matchess6e** ==R%KKK< ' 'C(e44 ' '1d 1v&&&& '  "K! 
A AB<rs3  666n-n-n-n-n-6n-n-n-`kkkkk4kkk\LLLLL4LLL\222&SSSSS&SSSl)*)*)*)*)*+)*)*)*X:::::+:::zy)y)y)y)y)ky)y)y)x     [   F%%%%%r__pycache__/fixer_base.cpython-311.pyc000064400000020341151027012300013554 0ustar00 !A?h"ndZddlZddlmZddlmZddlmZGddeZ Gd d e Z dS) z2Base class for fixers (optional, but recommended).N)PatternCompiler)pygram)does_tree_importceZdZdZdZdZdZdZdZe j dZ e Z dZdZdZdZdZdZejZdZdZd Zd Zd Zdd ZdZddZdZdZ dZ!dS)BaseFixaOptional base class for fixers. The subclass name must be FixFooBar where FooBar is the result of removing underscores and capitalizing the words of the fix name. For example, the class name for a fixer named 'has_key' should be FixHasKey. NrpostFcJ||_||_|dS)aInitializer. Subclass may override. Args: options: a dict containing the options passed to RefactoringTool that could be used to customize the fixer through the command line. log: a list to append warnings and other messages to. N)optionslogcompile_pattern)selfr r s +/usr/lib64/python3.11/lib2to3/fixer_base.py__init__zBaseFix.__init__/s*  c|j9t}||jd\|_|_dSdS)zCompiles self.PATTERN into self.pattern. Subclass may override if it doesn't want to use self.{pattern,PATTERN} in .match(). NT) with_tree)PATTERNrrpattern pattern_tree)rPCs rrzBaseFix.compile_pattern;sS < # ""B.0.@.@KO/A/Q/Q +DL$+++ $ #rc||_dS)zOSet the filename. The main refactoring tool should call this. N)filename)rrs r set_filenamezBaseFix.set_filenameFs ! rcDd|i}|j||o|S)aReturns match for a given parse tree node. Should return a true or false object (not necessarily a bool). It may return a non-empty dict of matching sub-nodes as returned by a matching pattern. Subclass may override. node)rmatchrrresultss rrz BaseFix.matchMs*4.|!!$00 B"DN HOO04=@ A A A      rc|}|}d|_d}||||fz|r||dSdS)aWarn the user that a given chunk of code is not valid Python 3, but that it cannot be converted automatically. First argument is the top-level node for the code in question. Optional second argument is why it can't be converted. zLine %d: could not convert: %sN) get_linenocloneprefixr2)rrreasonlineno for_outputmsgs rcannot_convertzBaseFix.cannot_convertzsw""ZZ\\  .  33444  %   V $ $ $ $ $ % %rcb|}|d||fzdS)zUsed for warning the user about possible uncertainty in the translation. First argument is the top-level node for the code in question. Optional second argument is why it can't be converted. z Line %d: %sN)r5r2)rrr8r9s rwarningzBaseFix.warnings7"" &&)99:::::rc|j|_||tjd|_d|_dS)zSome fixers need to maintain tree-wide state. This method is called once, at the start of tree fix-up. tree - the root node of the tree to be processed. filename - the name of the file the tree came from. rTN)r&r itertoolscountr)r/rtreers r start_treezBaseFix.start_trees=/ (### q)) rcdS)zSome fixers need to maintain tree-wide state. This method is called once, at the conclusion of tree fix-up. tree - the root node of the tree to be processed. filename - the name of the file the tree came from. NrBs r finish_treezBaseFix.finish_trees  r)r$N)"__name__ __module__ __qualname____doc__rrrr rr@rAr)setr&orderexplicit run_order _accept_typekeep_line_order BM_compatiblerpython_symbolssymsrrrrr#r-r2r<r>rDrGrFrrrrs1GGLGHioa  GJ EHILOM  D    Q Q Q!!! = = =$$$    !!! % % % %;;;        rrc,eZdZdZdZfdZdZxZS)ConditionalFixz@ Base class for fixers which not execute if an import is found. 
NcPtt|j|d|_dSrH)superrWrD _should_skip)rargs __class__s rrDzConditionalFix.start_trees+.nd##.55 rc|j|jS|jd}|d}d|dd}t ||||_|jS)N.)rZskip_onsplitjoinr)rrpkgr,s r should_skipzConditionalFix.should_skipsh   ($ $l  %%2whhs3B3x  ,S$==  r)rIrJrKrLr`rDrd __classcell__)r\s@rrWrWsTJJG!!!!!!!!!!!!rrW) rLr@patcomprr4r fixer_utilrobjectrrWrFrrris98%$$$$$((((((X X X X X fX X X v!!!!!W!!!!!r__pycache__/pygram.cpython-311.opt-1.pyc000064400000004010151027012300013676 0ustar00 !A?hdZddlZddlmZddlmZddlmZejej e dZ ejej e dZ Gd d e Zejd e ZeeZeZejd =eZejd =ejd e ZeeZdS)z&Export the Python grammar and symbols.N)token)driver)pytreez Grammar.txtzPatternGrammar.txtceZdZdZdS)Symbolscf|jD]\}}t|||dS)zInitializer. Creates an attribute for each grammar symbol (nonterminal), whose value is the symbol's type (an int >= 256). N) symbol2numberitemssetattr)selfgrammarnamesymbols '/usr/lib64/python3.11/lib2to3/pygram.py__init__zSymbols.__init__sE $17799 ( (LD& D$ ' ' ' ' ( (N)__name__ __module__ __qualname__rrrrrs#(((((rrlib2to3printexec)__doc__ospgen2rrrpathjoindirname__file__ _GRAMMAR_FILE_PATTERN_GRAMMAR_FILEobjectrload_packaged_grammarpython_grammarpython_symbolscopy!python_grammar_no_print_statementkeywords*python_grammar_no_print_and_exec_statementpattern_grammarpattern_symbolsrrrr/sO-,  RW__X66 FF  RW__X%>%>%9;; ( ( ( ( (f ( ( (.-iGG(($2$7$7$9$9!%.w7-N-S-S-U-U*.7?.&.y:OPP'/**r__pycache__/fixer_base.cpython-311.opt-1.pyc000064400000020341151027012300014513 0ustar00 !A?h"ndZddlZddlmZddlmZddlmZGddeZ Gd d e Z dS) z2Base class for fixers (optional, but recommended).N)PatternCompiler)pygram)does_tree_importceZdZdZdZdZdZdZdZe j dZ e Z dZdZdZdZdZdZejZdZdZd Zd Zd Zdd ZdZddZdZdZ dZ!dS)BaseFixaOptional base class for fixers. The subclass name must be FixFooBar where FooBar is the result of removing underscores and capitalizing the words of the fix name. For example, the class name for a fixer named 'has_key' should be FixHasKey. NrpostFcJ||_||_|dS)aInitializer. Subclass may override. Args: options: a dict containing the options passed to RefactoringTool that could be used to customize the fixer through the command line. log: a list to append warnings and other messages to. N)optionslogcompile_pattern)selfr r s +/usr/lib64/python3.11/lib2to3/fixer_base.py__init__zBaseFix.__init__/s*  c|j9t}||jd\|_|_dSdS)zCompiles self.PATTERN into self.pattern. Subclass may override if it doesn't want to use self.{pattern,PATTERN} in .match(). NT) with_tree)PATTERNrrpattern pattern_tree)rPCs rrzBaseFix.compile_pattern;sS < # ""B.0.@.@KO/A/Q/Q +DL$+++ $ #rc||_dS)zOSet the filename. The main refactoring tool should call this. N)filename)rrs r set_filenamezBaseFix.set_filenameFs ! rcDd|i}|j||o|S)aReturns match for a given parse tree node. Should return a true or false object (not necessarily a bool). It may return a non-empty dict of matching sub-nodes as returned by a matching pattern. Subclass may override. node)rmatchrrresultss rrz BaseFix.matchMs*4.|!!$00 B"DN HOO04=@ A A A      rc|}|}d|_d}||||fz|r||dSdS)aWarn the user that a given chunk of code is not valid Python 3, but that it cannot be converted automatically. First argument is the top-level node for the code in question. Optional second argument is why it can't be converted. zLine %d: could not convert: %sN) get_linenocloneprefixr2)rrreasonlineno for_outputmsgs rcannot_convertzBaseFix.cannot_convertzsw""ZZ\\  .  33444  %   V $ $ $ $ $ % %rcb|}|d||fzdS)zUsed for warning the user about possible uncertainty in the translation. First argument is the top-level node for the code in question. 
refactor.py

# Copyright 2006 Google, Inc. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.

"""Refactoring framework.

Used as a main program, this can refactor any number of files and/or
recursively descend down directories.  Imported as a module, this
provides infrastructure to write your own refactoring tool.
"""

__author__ = "Guido van Rossum <guido@python.org>"


# Python imports
import io
import os
import pkgutil
import sys
import logging
import operator
import collections
from itertools import chain

# Local imports
from .pgen2 import driver, tokenize, token
from .fixer_util import find_root
from . import pytree, pygram
from . import btm_matcher as bm


def get_all_fix_names(fixer_pkg, remove_prefix=True):
    """Return a sorted list of all available fix names in the given package."""
    pkg = __import__(fixer_pkg, [], [], ["*"])
    fix_names = []
    for finder, name, ispkg in pkgutil.iter_modules(pkg.__path__):
        if name.startswith("fix_"):
            if remove_prefix:
                name = name[4:]
            fix_names.append(name)
    return fix_names


class _EveryNode(Exception):
    pass


def _get_head_types(pat):
    """ Accepts a pytree Pattern Node and returns a set
        of the pattern types which will match first. """

    if isinstance(pat, (pytree.NodePattern, pytree.LeafPattern)):
        # NodePatterns must either have no type and no content
        #   or a type and content -- so they don't get any farther
        # Always return leafs
        if pat.type is None:
            raise _EveryNode
        return {pat.type}

    if isinstance(pat, pytree.NegatedPattern):
        if pat.content:
            return _get_head_types(pat.content)
        raise _EveryNode  # Negated Patterns don't have a type

    if isinstance(pat, pytree.WildcardPattern):
        # Recurse on each node in content
        r = set()
        for p in pat.content:
            for x in p:
                r.update(_get_head_types(x))
        return r

    raise Exception("Oh no! I don't understand pattern %s" % (pat))
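# Illustrative sketch (editorial addition, not part of the original module):
# for a pattern that compiles to a NodePattern with a concrete type,
# _get_head_types() returns that single type, which lets _get_headnode_dict()
# below index the fixer under exactly one grammar symbol.
#
#     from lib2to3.patcomp import PatternCompiler
#     pc = PatternCompiler()
#     pattern, tree = pc.compile_pattern(
#         "power< 'eval' trailer< '(' any ')' > >", with_tree=True)
#     _get_head_types(pattern)  # -> {pygram.python_symbols.power}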
""" head_nodes = collections.defaultdict(list) every = [] for fixer in fixer_list: if fixer.pattern: try: heads = _get_head_types(fixer.pattern) except _EveryNode: every.append(fixer) else: for node_type in heads: head_nodes[node_type].append(fixer) else: if fixer._accept_type is not None: head_nodes[fixer._accept_type].append(fixer) else: every.append(fixer) for node_type in chain(pygram.python_grammar.symbol2number.values(), pygram.python_grammar.tokens): head_nodes[node_type].extend(every) return dict(head_nodes) def get_fixers_from_package(pkg_name): """ Return the fully qualified names for fixers in the package pkg_name. """ return [pkg_name + "." + fix_name for fix_name in get_all_fix_names(pkg_name, False)] def _identity(obj): return obj def _detect_future_features(source): have_docstring = False gen = tokenize.generate_tokens(io.StringIO(source).readline) def advance(): tok = next(gen) return tok[0], tok[1] ignore = frozenset({token.NEWLINE, tokenize.NL, token.COMMENT}) features = set() try: while True: tp, value = advance() if tp in ignore: continue elif tp == token.STRING: if have_docstring: break have_docstring = True elif tp == token.NAME and value == "from": tp, value = advance() if tp != token.NAME or value != "__future__": break tp, value = advance() if tp != token.NAME or value != "import": break tp, value = advance() if tp == token.OP and value == "(": tp, value = advance() while tp == token.NAME: features.add(value) tp, value = advance() if tp != token.OP or value != ",": break tp, value = advance() else: break except StopIteration: pass return frozenset(features) class FixerError(Exception): """A fixer could not be loaded.""" class RefactoringTool(object): _default_options = {"print_function" : False, "exec_function": False, "write_unchanged_files" : False} CLASS_PREFIX = "Fix" # The prefix for fixer classes FILE_PREFIX = "fix_" # The prefix for modules with a fixer within def __init__(self, fixer_names, options=None, explicit=None): """Initializer. Args: fixer_names: a list of fixers to import options: a dict with configuration. explicit: a list of fixers to run even if they are explicit. """ self.fixers = fixer_names self.explicit = explicit or [] self.options = self._default_options.copy() if options is not None: self.options.update(options) self.grammar = pygram.python_grammar.copy() if self.options['print_function']: del self.grammar.keywords["print"] elif self.options['exec_function']: del self.grammar.keywords["exec"] # When this is True, the refactor*() methods will call write_file() for # files processed even if they were not changed during refactoring. If # and only if the refactor method's write parameter was True. 
class RefactoringTool(object):

    _default_options = {"print_function" : False,
                        "exec_function": False,
                        "write_unchanged_files" : False}

    CLASS_PREFIX = "Fix"   # The prefix for fixer classes
    FILE_PREFIX = "fix_"   # The prefix for modules with a fixer within

    def __init__(self, fixer_names, options=None, explicit=None):
        """Initializer.

        Args:
            fixer_names: a list of fixers to import
            options: a dict with configuration.
            explicit: a list of fixers to run even if they are explicit.
        """
        self.fixers = fixer_names
        self.explicit = explicit or []
        self.options = self._default_options.copy()
        if options is not None:
            self.options.update(options)
        self.grammar = pygram.python_grammar.copy()

        if self.options['print_function']:
            del self.grammar.keywords["print"]
        elif self.options['exec_function']:
            del self.grammar.keywords["exec"]

        # When this is True, the refactor*() methods will call write_file() for
        # files processed even if they were not changed during refactoring. If
        # and only if the refactor method's write parameter was True.
        self.write_unchanged_files = self.options.get("write_unchanged_files")
        self.errors = []
        self.logger = logging.getLogger("RefactoringTool")
        self.fixer_log = []
        self.wrote = False
        self.driver = driver.Driver(self.grammar,
                                    convert=pytree.convert,
                                    logger=self.logger)
        self.pre_order, self.post_order = self.get_fixers()

        self.files = []  # List of files that were or should be modified

        self.BM = bm.BottomMatcher()
        self.bmi_pre_order = []  # Bottom Matcher incompatible fixers
        self.bmi_post_order = []

        for fixer in chain(self.post_order, self.pre_order):
            if fixer.BM_compatible:
                self.BM.add_fixer(fixer)
                # remove fixers that will be handled by the bottom-up
                # matcher
            elif fixer in self.pre_order:
                self.bmi_pre_order.append(fixer)
            elif fixer in self.post_order:
                self.bmi_post_order.append(fixer)

        self.bmi_pre_order_heads = _get_headnode_dict(self.bmi_pre_order)
        self.bmi_post_order_heads = _get_headnode_dict(self.bmi_post_order)

    def get_fixers(self):
        """Inspects the options to load the requested patterns and handlers.

        Returns:
          (pre_order, post_order), where pre_order is the list of fixers that
          want a pre-order AST traversal, and post_order is the list that want
          post-order traversal.
        """
        pre_order_fixers = []
        post_order_fixers = []
        for fix_mod_path in self.fixers:
            mod = __import__(fix_mod_path, {}, {}, ["*"])
            fix_name = fix_mod_path.rsplit(".", 1)[-1]
            if fix_name.startswith(self.FILE_PREFIX):
                fix_name = fix_name[len(self.FILE_PREFIX):]
            parts = fix_name.split("_")
            class_name = self.CLASS_PREFIX + "".join([p.title() for p in parts])
            try:
                fix_class = getattr(mod, class_name)
            except AttributeError:
                raise FixerError("Can't find %s.%s" %
                                 (fix_name, class_name)) from None
            fixer = fix_class(self.options, self.fixer_log)
            if fixer.explicit and self.explicit is not True and \
                    fix_mod_path not in self.explicit:
                self.log_message("Skipping optional fixer: %s", fix_name)
                continue

            self.log_debug("Adding transformation: %s", fix_name)
            if fixer.order == "pre":
                pre_order_fixers.append(fixer)
            elif fixer.order == "post":
                post_order_fixers.append(fixer)
            else:
                raise FixerError("Illegal fixer order: %r" % fixer.order)

        key_func = operator.attrgetter("run_order")
        pre_order_fixers.sort(key=key_func)
        post_order_fixers.sort(key=key_func)
        return (pre_order_fixers, post_order_fixers)
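    # Illustrative sketch (editorial addition; the fixer below is
    # hypothetical): get_fixers() derives the class name from the module name
    # by stripping FILE_PREFIX and title-casing the parts, so a module named
    # "myfixes/fix_foo_bar.py" must define a class named FixFooBar:
    #
    #     # myfixes/fix_foo_bar.py
    #     from lib2to3 import fixer_base
    #     from lib2to3.fixer_util import Name
    #
    #     class FixFooBar(fixer_base.BaseFix):
    #         PATTERN = "'foo_bar'"   # match any leaf spelled foo_bar
    #
    #         def transform(self, node, results):
    #             return Name("foo_bar3", prefix=node.prefix)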
""" py_ext = os.extsep + "py" for dirpath, dirnames, filenames in os.walk(dir_name): self.log_debug("Descending into %s", dirpath) dirnames.sort() filenames.sort() for name in filenames: if (not name.startswith(".") and os.path.splitext(name)[1] == py_ext): fullname = os.path.join(dirpath, name) self.refactor_file(fullname, write, doctests_only) # Modify dirnames in-place to remove subdirs with leading dots dirnames[:] = [dn for dn in dirnames if not dn.startswith(".")] def _read_python_source(self, filename): """ Do our best to decode a Python source file correctly. """ try: f = open(filename, "rb") except OSError as err: self.log_error("Can't open %s: %s", filename, err) return None, None try: encoding = tokenize.detect_encoding(f.readline)[0] finally: f.close() with io.open(filename, "r", encoding=encoding, newline='') as f: return f.read(), encoding def refactor_file(self, filename, write=False, doctests_only=False): """Refactors a file.""" input, encoding = self._read_python_source(filename) if input is None: # Reading the file failed. return input += "\n" # Silence certain parse errors if doctests_only: self.log_debug("Refactoring doctests in %s", filename) output = self.refactor_docstring(input, filename) if self.write_unchanged_files or output != input: self.processed_file(output, filename, input, write, encoding) else: self.log_debug("No doctest changes in %s", filename) else: tree = self.refactor_string(input, filename) if self.write_unchanged_files or (tree and tree.was_changed): # The [:-1] is to take off the \n we added earlier self.processed_file(str(tree)[:-1], filename, write=write, encoding=encoding) else: self.log_debug("No changes in %s", filename) def refactor_string(self, data, name): """Refactor a given input string. Args: data: a string holding the code to be refactored. name: a human-readable name for use in error/log messages. Returns: An AST corresponding to the refactored input stream; None if there were errors during the parse. """ features = _detect_future_features(data) if "print_function" in features: self.driver.grammar = pygram.python_grammar_no_print_statement try: tree = self.driver.parse_string(data) except Exception as err: self.log_error("Can't parse %s: %s: %s", name, err.__class__.__name__, err) return finally: self.driver.grammar = self.grammar tree.future_features = features self.log_debug("Refactoring %s", name) self.refactor_tree(tree, name) return tree def refactor_stdin(self, doctests_only=False): input = sys.stdin.read() if doctests_only: self.log_debug("Refactoring doctests in stdin") output = self.refactor_docstring(input, "") if self.write_unchanged_files or output != input: self.processed_file(output, "", input) else: self.log_debug("No doctest changes in stdin") else: tree = self.refactor_string(input, "") if self.write_unchanged_files or (tree and tree.was_changed): self.processed_file(str(tree), "", input) else: self.log_debug("No changes in stdin") def refactor_tree(self, tree, name): """Refactors a parse tree (modifying the tree in place). For compatible patterns the bottom matcher module is used. Otherwise the tree is traversed node-to-node for matches. Args: tree: a pytree.Node instance representing the root of the tree to be refactored. name: a human-readable name for this tree. Returns: True if the tree was modified, False otherwise. 
""" for fixer in chain(self.pre_order, self.post_order): fixer.start_tree(tree, name) #use traditional matching for the incompatible fixers self.traverse_by(self.bmi_pre_order_heads, tree.pre_order()) self.traverse_by(self.bmi_post_order_heads, tree.post_order()) # obtain a set of candidate nodes match_set = self.BM.run(tree.leaves()) while any(match_set.values()): for fixer in self.BM.fixers: if fixer in match_set and match_set[fixer]: #sort by depth; apply fixers from bottom(of the AST) to top match_set[fixer].sort(key=pytree.Base.depth, reverse=True) if fixer.keep_line_order: #some fixers(eg fix_imports) must be applied #with the original file's line order match_set[fixer].sort(key=pytree.Base.get_lineno) for node in list(match_set[fixer]): if node in match_set[fixer]: match_set[fixer].remove(node) try: find_root(node) except ValueError: # this node has been cut off from a # previous transformation ; skip continue if node.fixers_applied and fixer in node.fixers_applied: # do not apply the same fixer again continue results = fixer.match(node) if results: new = fixer.transform(node, results) if new is not None: node.replace(new) #new.fixers_applied.append(fixer) for node in new.post_order(): # do not apply the fixer again to # this or any subnode if not node.fixers_applied: node.fixers_applied = [] node.fixers_applied.append(fixer) # update the original match set for # the added code new_matches = self.BM.run(new.leaves()) for fxr in new_matches: if not fxr in match_set: match_set[fxr]=[] match_set[fxr].extend(new_matches[fxr]) for fixer in chain(self.pre_order, self.post_order): fixer.finish_tree(tree, name) return tree.was_changed def traverse_by(self, fixers, traversal): """Traverse an AST, applying a set of fixers to each node. This is a helper method for refactor_tree(). Args: fixers: a list of fixer instances. traversal: a generator that yields AST nodes. Returns: None """ if not fixers: return for node in traversal: for fixer in fixers[node.type]: results = fixer.match(node) if results: new = fixer.transform(node, results) if new is not None: node.replace(new) node = new def processed_file(self, new_text, filename, old_text=None, write=False, encoding=None): """ Called when a file has been refactored and there may be changes. """ self.files.append(filename) if old_text is None: old_text = self._read_python_source(filename)[0] if old_text is None: return equal = old_text == new_text self.print_output(old_text, new_text, filename, equal) if equal: self.log_debug("No changes to %s", filename) if not self.write_unchanged_files: return if write: self.write_file(new_text, filename, old_text, encoding) else: self.log_debug("Not writing changes to %s", filename) def write_file(self, new_text, filename, old_text, encoding=None): """Writes a string to a file. It first shows a unified diff between the old text and the new text, and then rewrites the file; the latter is only done if the write option is set. """ try: fp = io.open(filename, "w", encoding=encoding, newline='') except OSError as err: self.log_error("Can't create %s: %s", filename, err) return with fp: try: fp.write(new_text) except OSError as err: self.log_error("Can't write %s: %s", filename, err) self.log_debug("Wrote changes to %s", filename) self.wrote = True PS1 = ">>> " PS2 = "... " def refactor_docstring(self, input, filename): """Refactors a docstring, looking for doctests. This returns a modified version of the input string. It looks for doctests, which start with a ">>>" prompt, and may be continued with "..." 
    def processed_file(self, new_text, filename, old_text=None, write=False,
                       encoding=None):
        """
        Called when a file has been refactored and there may be changes.
        """
        self.files.append(filename)
        if old_text is None:
            old_text = self._read_python_source(filename)[0]
            if old_text is None:
                return
        equal = old_text == new_text
        self.print_output(old_text, new_text, filename, equal)
        if equal:
            self.log_debug("No changes to %s", filename)
            if not self.write_unchanged_files:
                return
        if write:
            self.write_file(new_text, filename, old_text, encoding)
        else:
            self.log_debug("Not writing changes to %s", filename)

    def write_file(self, new_text, filename, old_text, encoding=None):
        """Writes a string to a file.

        It first shows a unified diff between the old text and the new text, and
        then rewrites the file; the latter is only done if the write option is
        set.
        """
        try:
            fp = io.open(filename, "w", encoding=encoding, newline='')
        except OSError as err:
            self.log_error("Can't create %s: %s", filename, err)
            return

        with fp:
            try:
                fp.write(new_text)
            except OSError as err:
                self.log_error("Can't write %s: %s", filename, err)
        self.log_debug("Wrote changes to %s", filename)
        self.wrote = True

    PS1 = ">>> "
    PS2 = "... "

    def refactor_docstring(self, input, filename):
        """Refactors a docstring, looking for doctests.

        This returns a modified version of the input string.  It looks
        for doctests, which start with a ">>>" prompt, and may be
        continued with "..." prompts, as long as the "..." is indented
        the same as the ">>>".

        (Unfortunately we can't use the doctest module's parser,
        since, like most parsers, it is not geared towards preserving
        the original source.)
        """
        result = []
        block = None
        block_lineno = None
        indent = None
        lineno = 0
        for line in input.splitlines(keepends=True):
            lineno += 1
            if line.lstrip().startswith(self.PS1):
                if block is not None:
                    result.extend(self.refactor_doctest(block, block_lineno,
                                                        indent, filename))
                block_lineno = lineno
                block = [line]
                i = line.find(self.PS1)
                indent = line[:i]
            elif (indent is not None and
                  (line.startswith(indent + self.PS2) or
                   line == indent + self.PS2.rstrip() + "\n")):
                block.append(line)
            else:
                if block is not None:
                    result.extend(self.refactor_doctest(block, block_lineno,
                                                        indent, filename))
                block = None
                indent = None
                result.append(line)
        if block is not None:
            result.extend(self.refactor_doctest(block, block_lineno,
                                                indent, filename))
        return "".join(result)

    def refactor_doctest(self, block, lineno, indent, filename):
        """Refactors one doctest.

        A doctest is given as a block of lines, the first of which starts
        with ">>>" (possibly indented), while the remaining lines start
        with "..." (identically indented).

        """
        try:
            tree = self.parse_block(block, lineno, indent)
        except Exception as err:
            if self.logger.isEnabledFor(logging.DEBUG):
                for line in block:
                    self.log_debug("Source: %s", line.rstrip("\n"))
            self.log_error("Can't parse docstring in %s line %s: %s: %s",
                           filename, lineno, err.__class__.__name__, err)
            return block
        if self.refactor_tree(tree, filename):
            new = str(tree).splitlines(keepends=True)
            # Undo the adjustment of the line numbers in wrap_toks() below.
            clipped, new = new[:lineno-1], new[lineno-1:]
            assert clipped == ["\n"] * (lineno-1), clipped
            if not new[-1].endswith("\n"):
                new[-1] += "\n"
            block = [indent + self.PS1 + new.pop(0)]
            if new:
                block += [indent + self.PS2 + line for line in new]
        return block
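    # Illustrative example (editorial addition, not part of the original
    # module): given a docstring containing the doctest block
    #
    #     >>> print 'x'
    #     x
    #
    # refactor_docstring() rewrites only the prompt lines, leaving the
    # expected-output line untouched:
    #
    #     >>> print('x')
    #     x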
    def summarize(self):
        if self.wrote:
            were = "were"
        else:
            were = "need to be"
        if not self.files:
            self.log_message("No files %s modified.", were)
        else:
            self.log_message("Files that %s modified:", were)
            for file in self.files:
                self.log_message(file)
        if self.fixer_log:
            self.log_message("Warnings/messages while refactoring:")
            for message in self.fixer_log:
                self.log_message(message)
        if self.errors:
            if len(self.errors) == 1:
                self.log_message("There was 1 error:")
            else:
                self.log_message("There were %d errors:", len(self.errors))
            for msg, args, kwds in self.errors:
                self.log_message(msg, *args, **kwds)

    def parse_block(self, block, lineno, indent):
        """Parses a block into a tree.

        This is necessary to get correct line number / offset information
        in the parser diagnostics and embedded into the parse tree.
        """
        tree = self.driver.parse_tokens(self.wrap_toks(block, lineno, indent))
        tree.future_features = frozenset()
        return tree

    def wrap_toks(self, block, lineno, indent):
        """Wraps a tokenize stream to systematically modify start/end."""
        tokens = tokenize.generate_tokens(self.gen_lines(block, indent).__next__)
        for type, value, (line0, col0), (line1, col1), line_text in tokens:
            line0 += lineno - 1
            line1 += lineno - 1
            # Don't bother updating the columns; this is too complicated
            # since line_text would also have to be updated and it would
            # still break for tokens spanning lines.  Let the user guess
            # that the column numbers for doctests are relative to the
            # end of the prompt string (PS1 or PS2).
            yield type, value, (line0, col0), (line1, col1), line_text

    def gen_lines(self, block, indent):
        """Generates lines as expected by tokenize from a list of lines.

        This strips the first len(indent + self.PS1) characters off each line.
        """
        prefix1 = indent + self.PS1
        prefix2 = indent + self.PS2
        prefix = prefix1
        for line in block:
            if line.startswith(prefix):
                yield line[len(prefix):]
            elif line == prefix.rstrip() + "\n":
                yield "\n"
            else:
                raise AssertionError("line=%r, prefix=%r" % (line, prefix))
            prefix = prefix2
        while True:
            yield ""


class MultiprocessingUnsupported(Exception):
    pass


class MultiprocessRefactoringTool(RefactoringTool):

    def __init__(self, *args, **kwargs):
        super(MultiprocessRefactoringTool, self).__init__(*args, **kwargs)
        self.queue = None
        self.output_lock = None

    def refactor(self, items, write=False, doctests_only=False,
                 num_processes=1):
        if num_processes == 1:
            return super(MultiprocessRefactoringTool, self).refactor(
                items, write, doctests_only)
        try:
            import multiprocessing
        except ImportError:
            raise MultiprocessingUnsupported
        if self.queue is not None:
            raise RuntimeError("already doing multiple processes")
        self.queue = multiprocessing.JoinableQueue()
        self.output_lock = multiprocessing.Lock()
        processes = [multiprocessing.Process(target=self._child)
                     for i in range(num_processes)]
        try:
            for p in processes:
                p.start()
            super(MultiprocessRefactoringTool, self).refactor(items, write,
                                                              doctests_only)
        finally:
            self.queue.join()
            for i in range(num_processes):
                self.queue.put(None)
            for p in processes:
                if p.is_alive():
                    p.join()
            self.queue = None

    def _child(self):
        task = self.queue.get()
        while task is not None:
            args, kwargs = task
            try:
                super(MultiprocessRefactoringTool, self).refactor_file(
                    *args, **kwargs)
            finally:
                self.queue.task_done()
            task = self.queue.get()

    def refactor_file(self, *args, **kwargs):
        if self.queue is not None:
            self.queue.put((args, kwargs))
        else:
            return super(MultiprocessRefactoringTool, self).refactor_file(
                *args, **kwargs)
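# Illustrative usage (editorial addition, not part of the original module):
# fan the work out over several worker processes; refactor() takes the
# single-process path when num_processes == 1 and raises
# MultiprocessingUnsupported when multiprocessing cannot be imported.
#
#     from lib2to3.refactor import (MultiprocessRefactoringTool,
#                                   get_fixers_from_package)
#     mrt = MultiprocessRefactoringTool(
#         get_fixers_from_package("lib2to3.fixes"))
#     mrt.refactor(["src/"], write=True, num_processes=4)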
main.py

"""
Main program for 2to3.
"""

from __future__ import with_statement, print_function

import sys
import os
import difflib
import logging
import shutil
import optparse

from . import refactor


def diff_texts(a, b, filename):
    """Return a unified diff of two strings."""
    a = a.splitlines()
    b = b.splitlines()
    return difflib.unified_diff(a, b, filename, filename,
                                "(original)", "(refactored)",
                                lineterm="")


class StdoutRefactoringTool(refactor.MultiprocessRefactoringTool):
    """
    A refactoring tool that can avoid overwriting its input files.
    Prints output to stdout.

    Output files can optionally be written to a different directory and or
    have an extra file suffix appended to their name for use in situations
    where you do not want to replace the input files.
    """

    def __init__(self, fixers, options, explicit, nobackups, show_diffs,
                 input_base_dir='', output_dir='', append_suffix=''):
        """
        Args:
            fixers: A list of fixers to import.
            options: A dict with RefactoringTool configuration.
            explicit: A list of fixers to run even if they are explicit.
            nobackups: If true no backup '.bak' files will be created for those
                files that are being refactored.
            show_diffs: Should diffs of the refactoring be printed to stdout?
            input_base_dir: The base directory for all input files.  This class
                will strip this path prefix off of filenames before substituting
                it with output_dir.  Only meaningful if output_dir is supplied.
                All files processed by refactor() must start with this path.
            output_dir: If supplied, all converted files will be written into
                this directory tree instead of input_base_dir.
            append_suffix: If supplied, all files output by this tool will have
                this appended to their filename.  Useful for changing .py to
                .py3 for example by passing append_suffix='3'.
        """
        self.nobackups = nobackups
        self.show_diffs = show_diffs
        if input_base_dir and not input_base_dir.endswith(os.sep):
            input_base_dir += os.sep
        self._input_base_dir = input_base_dir
        self._output_dir = output_dir
        self._append_suffix = append_suffix
        super(StdoutRefactoringTool, self).__init__(fixers, options, explicit)

    def log_error(self, msg, *args, **kwargs):
        self.errors.append((msg, args, kwargs))
        self.logger.error(msg, *args, **kwargs)

    def write_file(self, new_text, filename, old_text, encoding):
        orig_filename = filename
        if self._output_dir:
            if filename.startswith(self._input_base_dir):
                filename = os.path.join(self._output_dir,
                                        filename[len(self._input_base_dir):])
            else:
                raise ValueError('filename %s does not start with the '
                                 'input_base_dir %s' % (
                                     filename, self._input_base_dir))
        if self._append_suffix:
            filename += self._append_suffix
        if orig_filename != filename:
            output_dir = os.path.dirname(filename)
            if not os.path.isdir(output_dir) and output_dir:
                os.makedirs(output_dir)
            self.log_message('Writing converted %s to %s.', orig_filename,
                             filename)
        if not self.nobackups:
            # Make backup
            backup = filename + ".bak"
            if os.path.lexists(backup):
                try:
                    os.remove(backup)
                except OSError:
                    self.log_message("Can't remove backup %s", backup)
            try:
                os.rename(filename, backup)
            except OSError:
                self.log_message("Can't rename %s to %s", filename, backup)
        # Actually write the new file
        write = super(StdoutRefactoringTool, self).write_file
        write(new_text, filename, old_text, encoding)
        if not self.nobackups:
            shutil.copymode(backup, filename)
        if orig_filename != filename:
            # Preserve the file mode in the new output directory.
            shutil.copymode(orig_filename, filename)

    def print_output(self, old, new, filename, equal):
        if equal:
            self.log_message("No changes to %s", filename)
        else:
            self.log_message("Refactored %s", filename)
            if self.show_diffs:
                diff_lines = diff_texts(old, new, filename)
                try:
                    if self.output_lock is not None:
                        with self.output_lock:
                            for line in diff_lines:
                                print(line)
                            sys.stdout.flush()
                    else:
                        for line in diff_lines:
                            print(line)
                except UnicodeEncodeError:
                    warn("couldn't encode %s's diff for your terminal" %
                         (filename,))
                    return


def warn(msg):
    print("WARNING: %s" % (msg,), file=sys.stderr)
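# Illustrative path mapping (editorial addition, not part of the original
# module): with input_base_dir='src/' and output_dir='build/py3',
# StdoutRefactoringTool.write_file() above rewrites 'src/pkg/mod.py' to
# 'build/py3/pkg/mod.py'; adding append_suffix='3' makes the result
# 'build/py3/pkg/mod.py3'.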
""" # Set up option parser parser = optparse.OptionParser(usage="2to3 [options] file|dir ...") parser.add_option("-d", "--doctests_only", action="store_true", help="Fix up doctests only") parser.add_option("-f", "--fix", action="append", default=[], help="Each FIX specifies a transformation; default: all") parser.add_option("-j", "--processes", action="store", default=1, type="int", help="Run 2to3 concurrently") parser.add_option("-x", "--nofix", action="append", default=[], help="Prevent a transformation from being run") parser.add_option("-l", "--list-fixes", action="store_true", help="List available transformations") parser.add_option("-p", "--print-function", action="store_true", help="Modify the grammar so that print() is a function") parser.add_option("-e", "--exec-function", action="store_true", help="Modify the grammar so that exec() is a function") parser.add_option("-v", "--verbose", action="store_true", help="More verbose logging") parser.add_option("--no-diffs", action="store_true", help="Don't show diffs of the refactoring") parser.add_option("-w", "--write", action="store_true", help="Write back modified files") parser.add_option("-n", "--nobackups", action="store_true", default=False, help="Don't write backups for modified files") parser.add_option("-o", "--output-dir", action="store", type="str", default="", help="Put output files in this directory " "instead of overwriting the input files. Requires -n.") parser.add_option("-W", "--write-unchanged-files", action="store_true", help="Also write files even if no changes were required" " (useful with --output-dir); implies -w.") parser.add_option("--add-suffix", action="store", type="str", default="", help="Append this string to all output filenames." " Requires -n if non-empty. " "ex: --add-suffix='3' will generate .py3 files.") # Parse command line arguments refactor_stdin = False flags = {} options, args = parser.parse_args(args) if options.write_unchanged_files: flags["write_unchanged_files"] = True if not options.write: warn("--write-unchanged-files/-W implies -w.") options.write = True # If we allowed these, the original files would be renamed to backup names # but not replaced. 
    if options.output_dir and not options.nobackups:
        parser.error("Can't use --output-dir/-o without -n.")
    if options.add_suffix and not options.nobackups:
        parser.error("Can't use --add-suffix without -n.")

    if not options.write and options.no_diffs:
        warn("not writing files and not printing diffs; that's not very useful")
    if not options.write and options.nobackups:
        parser.error("Can't use -n without -w")
    if options.list_fixes:
        print("Available transformations for the -f/--fix option:")
        for fixname in refactor.get_all_fix_names(fixer_pkg):
            print(fixname)
        if not args:
            return 0
    if not args:
        print("At least one file or directory argument required.",
              file=sys.stderr)
        print("Use --help to show usage.", file=sys.stderr)
        return 2
    if "-" in args:
        refactor_stdin = True
        if options.write:
            print("Can't write to stdin.", file=sys.stderr)
            return 2
    if options.print_function:
        flags["print_function"] = True

    if options.exec_function:
        flags["exec_function"] = True

    # Set up logging handler
    level = logging.DEBUG if options.verbose else logging.INFO
    logging.basicConfig(format='%(name)s: %(message)s', level=level)
    logger = logging.getLogger('lib2to3.main')

    # Initialize the refactoring tool
    avail_fixes = set(refactor.get_fixers_from_package(fixer_pkg))
    unwanted_fixes = set(fixer_pkg + ".fix_" + fix
                         for fix in options.nofix)
    explicit = set()
    if options.fix:
        all_present = False
        for fix in options.fix:
            if fix == "all":
                all_present = True
            else:
                explicit.add(fixer_pkg + ".fix_" + fix)
        requested = avail_fixes.union(explicit) if all_present else explicit
    else:
        requested = avail_fixes.union(explicit)
    fixer_names = requested.difference(unwanted_fixes)
    input_base_dir = os.path.commonprefix(args)
    if (input_base_dir and not input_base_dir.endswith(os.sep)
        and not os.path.isdir(input_base_dir)):
        # One or more similar names were passed, their directory is the base.
        # os.path.commonprefix() is ignorant of path elements, this corrects
        # for that weird API.
        input_base_dir = os.path.dirname(input_base_dir)
    if options.output_dir:
        input_base_dir = input_base_dir.rstrip(os.sep)
        logger.info('Output in %r will mirror the input directory %r layout.',
                    options.output_dir, input_base_dir)
    rt = StdoutRefactoringTool(
        sorted(fixer_names), flags, sorted(explicit),
        options.nobackups, not options.no_diffs,
        input_base_dir=input_base_dir,
        output_dir=options.output_dir,
        append_suffix=options.add_suffix)

    # Refactor all files and directories passed as arguments
    if not rt.errors:
        if refactor_stdin:
            rt.refactor_stdin()
        else:
            try:
                rt.refactor(args, options.write, options.doctests_only,
                            options.processes)
            except refactor.MultiprocessingUnsupported:
                assert options.processes > 1
                print("Sorry, -j isn't supported on this platform.",
                      file=sys.stderr)
                return 1
        rt.summarize()

    # Return error status (0 if rt.errors is zero)
    return int(bool(rt.errors))
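# Illustrative command lines (editorial addition, not part of the original
# module); every flag used here is defined by the option parser above:
#
#     2to3 example.py                      # preview the diff only
#     2to3 -w example.py                   # rewrite in place (keeps a .bak)
#     2to3 -f print -f has_key -w pkg/     # run only the selected fixers
#     2to3 -n -w -o build/py3 src/         # write the converted tree elsewhere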
__init__.py

import warnings


warnings.warn(
    "lib2to3 package is deprecated and may not be able to parse Python 3.10+",
    DeprecationWarning,
    stacklevel=2,
)
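# Illustrative note (editorial addition, not part of the original module):
# consumers who must keep importing lib2to3 can silence the import-time
# warning explicitly:
#
#     import warnings
#     with warnings.catch_warnings():
#         warnings.simplefilter("ignore", DeprecationWarning)
#         import lib2to3.refactor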