Server: LiteSpeed
System: Linux php-prod-1.spaceapp.ru 5.15.0-157-generic #167-Ubuntu SMP Wed Sep 17 21:35:53 UTC 2025 x86_64
User: xnsbb3110 (1041)
PHP: 8.1.33
Disabled: NONE
File: //usr/local/CyberPanel/lib64/python3.10/site-packages/sqlparse/__pycache__/lexer.cpython-310.pyc
# sqlparse/lexer.py (module source corresponding to the cached bytecode referenced above)
"""SQL Lexer"""
import re
from io import TextIOBase
from threading import Lock

from sqlparse import tokens, keywords
from sqlparse.utils import consume


class Lexer:
    """The Lexer supports configurable syntax.
    To add support for additional keywords, use the `add_keywords` method."""

    _default_instance = None
    _lock = Lock()

    @classmethod
    def get_default_instance(cls):
        """Returns the lexer instance used internally
        by the sqlparse core functions."""
        with cls._lock:
            if cls._default_instance is None:
                cls._default_instance = cls()
                cls._default_instance.default_initialization()
        return cls._default_instance

    def default_initialization(self):
        """Initialize the lexer with default dictionaries.
        Useful if you need to revert custom syntax settings."""
        self.clear()
        self.set_SQL_REGEX(keywords.SQL_REGEX)
        self.add_keywords(keywords.KEYWORDS_COMMON)
        self.add_keywords(keywords.KEYWORDS_ORACLE)
        self.add_keywords(keywords.KEYWORDS_MYSQL)
        self.add_keywords(keywords.KEYWORDS_PLPGSQL)
        self.add_keywords(keywords.KEYWORDS_HQL)
        self.add_keywords(keywords.KEYWORDS_MSACCESS)
        self.add_keywords(keywords.KEYWORDS_SNOWFLAKE)
        self.add_keywords(keywords.KEYWORDS_BIGQUERY)
        self.add_keywords(keywords.KEYWORDS)

    def clear(self):
        """Clear all syntax configurations.
        Useful if you want to load a reduced set of syntax configurations.
        After this call, regexps and keyword dictionaries need to be loaded
        to make the lexer functional again."""
        self._SQL_REGEX = []
        self._keywords = []

    def set_SQL_REGEX(self, SQL_REGEX):
        """Set the list of regex that will parse the SQL."""
        FLAGS = re.IGNORECASE | re.UNICODE
        self._SQL_REGEX = [
            (re.compile(rx, FLAGS).match, tt)
            for rx, tt in SQL_REGEX
        ]

    def add_keywords(self, keywords):
        """Add keyword dictionaries. Keywords are looked up in the same order
        that dictionaries were added."""
        self._keywords.append(keywords)

    def is_keyword(self, value):
        """Checks for a keyword.

        If the given value is in one of the KEYWORDS_* dictionary
        it's considered a keyword. Otherwise, tokens.Name is returned.
        """
        val = value.upper()
        for kwdict in self._keywords:
            if val in kwdict:
                return kwdict[val], value
        return tokens.Name, value

    def get_tokens(self, text, encoding=None):
        """
        Return an iterable of (tokentype, value) pairs generated from
        `text`. If `unfiltered` is set to `True`, the filtering mechanism
        is bypassed even if filters are defined.

        Also preprocess the text, i.e. expand tabs and strip it if
        wanted and applies registered filters.

        Split ``text`` into (tokentype, text) pairs.

        ``stack`` is the initial stack (default: ``['root']``)
        """
        if isinstance(text, TextIOBase):
            text = text.read()

        if isinstance(text, str):
            pass
        elif isinstance(text, bytes):
            if encoding:
                text = text.decode(encoding)
            else:
                try:
                    text = text.decode('utf-8')
                except UnicodeDecodeError:
                    text = text.decode('unicode-escape')
        else:
            raise TypeError("Expected text or file-like object, got {!r}"
                            .format(type(text)))

        iterable = enumerate(text)
        for pos, char in iterable:
            for rexmatch, action in self._SQL_REGEX:
                m = rexmatch(text, pos)

                if not m:
                    continue
                elif isinstance(action, tokens._TokenType):
                    yield action, m.group()
                elif action is keywords.PROCESS_AS_KEYWORD:
                    yield self.is_keyword(m.group())

                # Skip past the rest of the match so enumerate() does not
                # re-scan characters already consumed by this token.
                consume(iterable, m.end() - pos - 1)
                break
            else:
                yield tokens.Error, char


def tokenize(sql, encoding=None):
    """Tokenize sql.

    Tokenize *sql* using the :class:`Lexer` and return a 2-tuple stream
    of ``(token type, value)`` items.
    """
    return Lexer.get_default_instance().get_tokens(sql, encoding)
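
A minimal usage sketch of the module above, assuming the standard sqlparse package layout (sqlparse.lexer, sqlparse.tokens); the SQL string and the MYKEYWORD entry are illustrative only, not part of the original file:

# usage_sketch.py -- not part of sqlparse; illustrates the API shown above.
from sqlparse import tokens
from sqlparse.lexer import Lexer, tokenize

# Stream (tokentype, value) pairs for a statement.
for ttype, value in tokenize("SELECT id FROM users WHERE id = 1"):
    print(ttype, repr(value))

# Register an extra keyword dictionary on the shared default instance;
# dictionaries are consulted in the order they were added.
Lexer.get_default_instance().add_keywords({'MYKEYWORD': tokens.Keyword})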