HEX
Server: Apache
System: Linux srv1.prosuiteplus.com 5.4.0-216-generic #236-Ubuntu SMP Fri Apr 11 19:53:21 UTC 2025 x86_64
User: prosuiteplus (1001)
PHP: 8.3.20
Disabled: NONE
File: /usr/lib/python3/dist-packages/pygments/lexers/__pycache__/special.cpython-38.pyc
Decompiled module source recovered from the bytecode (pygments/lexers/special.py):

"""
    pygments.lexers.special
    ~~~~~~~~~~~~~~~~~~~~~~~

    Special lexers.

    :copyright: Copyright 2006-2017 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""

import re

from pygments.lexer import Lexer
from pygments.token import Token, Error, Text
from pygments.util import get_choice_opt, text_type, BytesIO

__all__ = ['TextLexer', 'RawTokenLexer']


class TextLexer(Lexer):
    """
    "Null" lexer, doesn't highlight anything.
    """
    name = 'Text only'
    aliases = ['text']
    filenames = ['*.txt']
    mimetypes = ['text/plain']
    priority = 0.01

    def get_tokens_unprocessed(self, text):
        # Emit the whole input as a single Text token.
        yield 0, Text, text

    def analyse_text(text):
        return TextLexer.priority


_ttype_cache = {}

line_re = re.compile(b'.*?\n')


class RawTokenLexer(Lexer):
    """
    Recreate a token stream formatted with the `RawTokenFormatter`.  This
    lexer raises exceptions during parsing if the token stream in the
    file is malformed.

    Additional options accepted:

    `compress`
        If set to ``"gz"`` or ``"bz2"``, decompress the token stream with
        the given compression algorithm before lexing (default: ``""``).
    """
    name = 'Raw token data'
    aliases = ['raw']
    filenames = []
    mimetypes = ['application/x-pygments-tokens']

    def __init__(self, **options):
        self.compress = get_choice_opt(options, 'compress',
                                       ['', 'none', 'gz', 'bz2'], '')
        Lexer.__init__(self, **options)

    def get_tokens(self, text):
        if isinstance(text, text_type):
            # a raw token stream never contains non-ASCII characters
            text = text.encode('ascii')
        if self.compress == 'gz':
            import gzip
            gzipfile = gzip.GzipFile('', 'rb', 9, BytesIO(text))
            text = gzipfile.read()
        elif self.compress == 'bz2':
            import bz2
            text = bz2.decompress(text)

        # do not go through Lexer.get_tokens(): no Unicode decoding here,
        # and stripping the trailing newline is not optional
        text = text.strip(b'\n') + b'\n'
        for i, t, v in self.get_tokens_unprocessed(text):
            yield t, v

    def get_tokens_unprocessed(self, text):
        length = 0
        for match in line_re.finditer(text):
            try:
                # each record is "Token.Type<TAB>repr(value)"
                ttypestr, val = match.group().split(b'\t', 1)
            except ValueError:
                val = match.group().decode('ascii', 'replace')
                ttype = Error
            else:
                ttype = _ttype_cache.get(ttypestr)
                if not ttype:
                    ttype = Token
                    ttypes = ttypestr.split('.')[1:]
                    for ttype_ in ttypes:
                        if not ttype_ or not ttype_[0].isupper():
                            raise ValueError('malformed token name')
                        ttype = getattr(ttype, ttype_)
                    _ttype_cache[ttypestr] = ttype
                val = val[2:-2].decode('unicode-escape')
            yield length, ttype, val
            length += len(val)
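The stream RawTokenLexer consumes is line oriented: each record is a token type name, a tab, and a Python repr of the token value. Below is a minimal sketch of how such a record maps back to a Pygments token type; the stream literal and variable names are hypothetical, it assumes only that Pygments is importable, and it rebuilds the type by attribute access on Token the same way get_tokens_unprocessed does before caching the result in _ttype_cache.

from pygments.token import Token

# Hypothetical raw token stream in the "Token.Type<TAB>repr(value)" layout
# produced by RawTokenFormatter and expected by RawTokenLexer.
raw = "Token.Keyword\t'def'\nToken.Text\t' '\nToken.Name.Function\t'main'\n"

for record in raw.splitlines():
    ttypestr, val = record.split('\t', 1)
    # Walk attribute access on Token to turn "Token.Name.Function" back into
    # the Token.Name.Function type object, as the lexer does for each record.
    ttype = Token
    for part in ttypestr.split('.')[1:]:
        ttype = getattr(ttype, part)
    print(ttype, val)   # e.g. Token.Keyword 'def'

With a recent Pygments release, the output of RawTokenFormatter can be fed back through RawTokenLexer to recover the same (token, value) pairs end to end; the compress option on both the formatter and the lexer applies gzip or bz2 to the stream in between.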