
robot.parsing.lexer.tokens

Token

Token(
    type: str | None = None,
    value: str | None = None,
    lineno: int = -1,
    col_offset: int = -1,
    error: str | None = None,
)

Token representing a piece of Robot Framework data.

Each token has a type, value, line number, column offset and end column offset in the `type`, `value`, `lineno`, `col_offset` and `end_col_offset` attributes, respectively. Tokens representing an error also have their error message in the `error` attribute.

Token types are declared as class attributes such as `SETTING_HEADER` and `EOL`. Values of these constants have changed slightly in Robot Framework 4.0, and they may change again in the future. It is thus safer to use the constants, not their values, when types are needed. For example, use `Token(Token.EOL)` instead of `Token('EOL')` and `token.type == Token.EOL` instead of `token.type == 'EOL'`.

If `value` is not given and `type` is a special marker like `IF` or `EOL`, the value is set automatically.

Source code in src/robot/parsing/lexer/tokens.py
def __init__(self, type: 'str|None' = None, value: 'str|None' = None,
             lineno: int = -1, col_offset: int = -1, error: 'str|None' = None):
    self.type = type
    if value is None:
        value = {
            Token.IF: 'IF', Token.INLINE_IF: 'IF', Token.ELSE_IF: 'ELSE IF',
            Token.ELSE: 'ELSE', Token.FOR: 'FOR', Token.WHILE: 'WHILE',
            Token.TRY: 'TRY', Token.EXCEPT: 'EXCEPT', Token.FINALLY: 'FINALLY',
            Token.END: 'END', Token.VAR: 'VAR', Token.CONTINUE: 'CONTINUE',
            Token.BREAK: 'BREAK', Token.RETURN_STATEMENT: 'RETURN',
            Token.CONTINUATION: '...', Token.EOL: '\n', Token.WITH_NAME: 'AS',
            Token.AS: 'AS'
        }.get(type, '')    # type: ignore
    self.value = cast(str, value)
    self.lineno = lineno
    self.col_offset = col_offset
    self.error = error
    # Used internally by the lexer to indicate that EOS is needed before/after.
    self._add_eos_before = False
    self._add_eos_after = False
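The value-defaulting logic above can be exercised in isolation. The following is a minimal, self-contained sketch, not the real `Token` class: it uses plain strings in place of the class-level type constants (whose actual values differ and have changed between releases), but the fallback behaviour is the same.

```python
# Stand-in illustrating Token.__init__'s value defaulting: marker types
# get their marker text as the value, everything else defaults to ''.
_MARKER_VALUES = {
    'IF': 'IF', 'INLINE IF': 'IF', 'ELSE IF': 'ELSE IF',
    'ELSE': 'ELSE', 'FOR': 'FOR', 'WHILE': 'WHILE',
    'TRY': 'TRY', 'EXCEPT': 'EXCEPT', 'FINALLY': 'FINALLY',
    'END': 'END', 'VAR': 'VAR', 'CONTINUE': 'CONTINUE',
    'BREAK': 'BREAK', 'RETURN STATEMENT': 'RETURN',
    'CONTINUATION': '...', 'EOL': '\n', 'AS': 'AS',
}

class SketchToken:
    def __init__(self, type=None, value=None, lineno=-1, col_offset=-1):
        self.type = type
        # No explicit value: fall back to the marker text for the type,
        # or '' for non-marker types.
        self.value = value if value is not None else _MARKER_VALUES.get(type, '')
        self.lineno = lineno
        self.col_offset = col_offset

print(repr(SketchToken('EOL').value))          # marker type gets its text
print(repr(SketchToken('ARGUMENT').value))     # non-marker type defaults to ''
```

With the real class, this is why `Token(Token.EOL)` produces a token whose value is already a newline without passing the value explicitly.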

tokenize_variables

tokenize_variables() -> Iterator[Token]

Tokenizes possible variables in token value.

Yields the token itself if the token does not allow variables (see `Token.ALLOW_VARIABLES`) or its value does not contain variables. Otherwise, yields variable tokens as well as tokens before, after, or between variables so that they have the same type as the original token.

Source code in src/robot/parsing/lexer/tokens.py
def tokenize_variables(self) -> 'Iterator[Token]':
    """Tokenizes possible variables in token value.

    Yields the token itself if the token does not allow variables (see
    :attr:`Token.ALLOW_VARIABLES`) or its value does not contain
    variables. Otherwise, yields variable tokens as well as tokens
    before, after, or between variables so that they have the same
    type as the original token.
    """
    if self.type not in Token.ALLOW_VARIABLES:
        return self._tokenize_no_variables()
    matches = VariableMatches(self.value)
    if not matches:
        return self._tokenize_no_variables()
    return self._tokenize_variables(matches)
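The splitting behaviour can be illustrated without Robot Framework itself. This sketch uses a simple regular expression in place of the real `VariableMatches` class (which also handles escaping and nested variable syntax), and yields `(type, value)` pairs the way `tokenize_variables` yields tokens:

```python
import re

# Illustrative stand-in: split a token value into same-typed text pieces
# and VARIABLE pieces. Matches ${...}, @{...}, &{...} and %{...} without
# the escaping/nesting rules the real VariableMatches implements.
VARIABLE_RE = re.compile(r'[$@&%]\{[^}]*\}')

def split_variables(token_type, value):
    pos = 0
    for match in VARIABLE_RE.finditer(value):
        if match.start() > pos:
            # Text before (or between) variables keeps the original type.
            yield (token_type, value[pos:match.start()])
        yield ('VARIABLE', match.group())
        pos = match.end()
    if pos < len(value) or pos == 0:
        # Trailing text, or the whole value if it had no variables.
        yield (token_type, value[pos:])

print(list(split_variables('ARGUMENT', 'Hello, ${name}!')))
# → [('ARGUMENT', 'Hello, '), ('VARIABLE', '${name}'), ('ARGUMENT', '!')]
```

Note how the surrounding pieces (`'Hello, '` and `'!'`) keep the original `ARGUMENT` type, exactly as the docstring describes for the real method.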

EOS

EOS(lineno: int = -1, col_offset: int = -1)

Bases: Token

Token representing end of a statement.

Source code in src/robot/parsing/lexer/tokens.py
def __init__(self, lineno: int = -1, col_offset: int = -1):
    super().__init__(Token.EOS, '', lineno, col_offset)

tokenize_variables

tokenize_variables() -> Iterator[Token]

Tokenizes possible variables in token value.

Yields the token itself if the token does not allow variables (see `Token.ALLOW_VARIABLES`) or its value does not contain variables. Otherwise, yields variable tokens as well as tokens before, after, or between variables so that they have the same type as the original token.

Source code in src/robot/parsing/lexer/tokens.py
def tokenize_variables(self) -> 'Iterator[Token]':
    """Tokenizes possible variables in token value.

    Yields the token itself if the token does not allow variables (see
    :attr:`Token.ALLOW_VARIABLES`) or its value does not contain
    variables. Otherwise, yields variable tokens as well as tokens
    before, after, or between variables so that they have the same
    type as the original token.
    """
    if self.type not in Token.ALLOW_VARIABLES:
        return self._tokenize_no_variables()
    matches = VariableMatches(self.value)
    if not matches:
        return self._tokenize_no_variables()
    return self._tokenize_variables(matches)

END

END(
    lineno: int = -1,
    col_offset: int = -1,
    virtual: bool = False,
)

Bases: Token

Token representing END token used to signify block ending.

Virtual END tokens have `''` as their value; "real" END tokens have the value `'END'`.

Source code in src/robot/parsing/lexer/tokens.py
def __init__(self, lineno: int = -1, col_offset: int = -1, virtual: bool = False):
    value = 'END' if not virtual else ''
    super().__init__(Token.END, value, lineno, col_offset)
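The virtual/real distinction amounts to choosing the value. A minimal stand-in (not the real `END` class, which inherits from `Token`) makes the difference concrete:

```python
class SketchEND:
    """Stand-in mirroring END.__init__: virtual END tokens carry ''."""
    def __init__(self, lineno=-1, col_offset=-1, virtual=False):
        self.type = 'END'
        # A real END was written in the data; a virtual one is only
        # implied by the block structure, so it has no source text.
        self.value = 'END' if not virtual else ''
        self.lineno = lineno
        self.col_offset = col_offset

print(repr(SketchEND().value))              # END written in the data
print(repr(SketchEND(virtual=True).value))  # END implied by structure
```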

tokenize_variables

tokenize_variables() -> Iterator[Token]

Tokenizes possible variables in token value.

Yields the token itself if the token does not allow variables (see `Token.ALLOW_VARIABLES`) or its value does not contain variables. Otherwise, yields variable tokens as well as tokens before, after, or between variables so that they have the same type as the original token.

Source code in src/robot/parsing/lexer/tokens.py
def tokenize_variables(self) -> 'Iterator[Token]':
    """Tokenizes possible variables in token value.

    Yields the token itself if the token does not allow variables (see
    :attr:`Token.ALLOW_VARIABLES`) or its value does not contain
    variables. Otherwise, yields variable tokens as well as tokens
    before, after, or between variables so that they have the same
    type as the original token.
    """
    if self.type not in Token.ALLOW_VARIABLES:
        return self._tokenize_no_variables()
    matches = VariableMatches(self.value)
    if not matches:
        return self._tokenize_no_variables()
    return self._tokenize_variables(matches)