log-surgeon-ffi
Python FFI bindings for log-surgeon, a high-performance library for parsing unstructured log messages into structured data.
Overview
log-surgeon-ffi provides a Pythonic interface to the log-surgeon C++ library, enabling efficient extraction of structured information from unstructured log files.
Why Log Surgeon?
Traditional regex engines are brittle: slow to execute, prone to errors, and demanding complex pattern maintenance. For instance, Meta uses RE2 (a state-of-the-art regex engine) to parse its logs, but it faces scalability and maintenance challenges; as a result, it can only afford to extract a limited set of patterns (timestamp, log level, and component name).
Log Surgeon streamlines the entire process by (1) identifying, extracting, and labeling variable values with semantic context and (2) inferring a log template, all in a single, efficient pass.
Log Surgeon is built to accommodate structural variability: values may shift position, appear multiple times, or change order entirely. You simply define the variable patterns, optionally enriched with surrounding text or sequences of variables. Log Surgeon then JIT-compiles a tagged-DFA state machine to drive the full pipeline.
Key Capabilities
- Extract variables from log messages using regex patterns with named capture groups
- Generate log types (templates) automatically for log analysis
- Parse streams efficiently for large-scale log processing
- Export data to pandas DataFrames and PyArrow Tables
Installation
pip install log-surgeon-ffi
Note: pandas and pyarrow are included as dependencies for DataFrame/Arrow support.
⚠️ IMPORTANT: READ BEFORE USING
log-surgeon uses token-based parsing and has different regex behavior than traditional engines.
You MUST read the Key Concepts section and understand it fully before writing patterns, or you will encounter unexpected behavior and pain.
Critical differences:
- .* only matches within a single token (not across delimiters)
- abc|def requires grouping: use (abc)|(def) instead
- Use {0,1} for optional patterns, NOT ?

Tip: Use raw f-strings (rf"...") for regex patterns; see Using Raw F-Strings for details.
Quick Start
Basic Parsing
from log_surgeon import Parser, PATTERN
# Parse a sample log event
log_line = "16/05/04 04:24:58 INFO Registering worker with 1 core and 4.0 GiB ram\n"
# Create a parser and define extraction patterns
parser = Parser()
parser.add_var("resource", rf"(?<memory_gb>{PATTERN.FLOAT}) GiB ram")
parser.compile()
# Parse a single event
event = parser.parse_event(log_line)
# Access extracted data
print(f"Message: {event.get_log_message().strip()}")
print(f"LogType: {event.get_log_type().strip()}")
print(f"Parsed Logs: {event}")
Output:
Message: 16/05/04 04:24:58 INFO Registering worker with 1 core and 4.0 GiB ram
LogType: 16/05/04 04:24:58 INFO Registering worker with 1 core and <memory_gb> GiB ram
Parsed Logs: {
"memory_gb": "4.0"
}
The parser extracted structured data from the unstructured log line:
- Message: The original log line
- LogType: Template with the variable placeholder <memory_gb> showing the pattern structure
- Parsed variables: Successfully extracted the memory_gb value "4.0" from the pattern match
Multiple Capture Groups
from log_surgeon import Parser, PATTERN
# Parse a sample log event
log_line = """16/05/04 12:22:37 WARN server.TransportChannelHandler: Exception in connection from spark-35/192.168.10.50:55392
java.io.IOException: Connection reset by peer
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:192)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:313)
at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:750)
"""
# Create a parser and define extraction patterns
parser = Parser()
# Add timestamp pattern
parser.add_timestamp("TIMESTAMP_SPARK_1_6", rf"\d{{2}}/\d{{2}}/\d{{2}} \d{{2}}:\d{{2}}:\d{{2}}")
# Add variable patterns
parser.add_var("SYSTEM_LEVEL", rf"(?<level>(INFO)|(WARN)|(ERROR))")
parser.add_var("SPARK_HOST_IP_PORT", rf"(?<spark_host>spark\-{PATTERN.INT})/(?<system_ip>{PATTERN.IPV4}):(?<system_port>{PATTERN.PORT})")
parser.add_var(
    "SYSTEM_EXCEPTION",
    rf"(?<system_exception_type>({PATTERN.JAVA_PACKAGE_SEGMENT})+[{PATTERN.JAVA_IDENTIFIER_CHARSET}]*Exception): "
    rf"(?<system_exception_msg>{PATTERN.LOG_LINE})"
)
parser.add_var(
    "SYSTEM_STACK_TRACE",
    rf"\s{{1,4}}at (?<system_stack>{PATTERN.JAVA_STACK_LOCATION})"
)
parser.compile()
# Parse a single event
event = parser.parse_event(log_line)
# Access extracted data
print(f"Message: {event.get_log_message().strip()}")
print(f"LogType: {event.get_log_type().strip()}")
print(f"Parsed Logs: {event}")
Output:
Message: 16/05/04 12:22:37 WARN server.TransportChannelHandler: Exception in connection from spark-35/192.168.10.50:55392
java.io.IOException: Connection reset by peer
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:192)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:313)
at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:750)
LogType: <timestamp> <level> server.TransportChannelHandler: Exception in connection from <spark_host>/<system_ip>:<system_port>
<system_exception_type>: <system_exception_msg><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine>
Parsed Logs: {
"timestamp": "16/05/04 12:22:37",
"level": "WARN",
"spark_host": "spark-35",
"system_ip": "192.168.10.50",
"system_port": "55392",
"system_exception_type": "java.io.IOException",
"system_exception_msg": "Connection reset by peer",
"system_stack": [
"sun.nio.ch.FileDispatcherImpl.read0(Native Method)",
"sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)",
"sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)",
"sun.nio.ch.IOUtil.read(IOUtil.java:192)",
"sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)",
"io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:313)",
"io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)",
"io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)",
"io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)",
"io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)",
"io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)",
"io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)",
"io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)",
"io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)",
"java.lang.Thread.run(Thread.java:750)"
]
}
The parser extracted multiple named capture groups from a complex multi-line Java stack trace:
- Scalar fields: timestamp, level, spark_host, system_ip, system_port, system_exception_type, system_exception_msg
- Array field: system_stack contains all 15 stack trace locations (demonstrates automatic aggregation of repeated capture groups)
- LogType: Template shows the structure with <newLine> markers indicating line boundaries in the original log
Stream Parsing
When parsing log streams or files, timestamps are required to perform contextual anchoring. Timestamps act as delimiters that separate individual log events, enabling the parser to correctly group multi-line entries (like stack traces) into single events.
from log_surgeon import Parser, PATTERN
# Parse from string (automatically converted to io.StringIO)
SAMPLE_LOGS = """16/05/04 04:31:13 INFO master.Master: Registering app SparkSQL::192.168.10.76
16/05/04 12:32:37 WARN server.TransportChannelHandler: Exception in connection from spark-35/192.168.10.50:55392
java.io.IOException: Connection reset by peer
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:192)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:313)
at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:750)
16/05/04 04:37:53 INFO master.Master: 192.168.10.76:41747 got disassociated, removing it.
"""
# Define parser with patterns
parser = Parser()
# REQUIRED: Timestamp acts as contextual anchor to separate individual log events in the stream
parser.add_timestamp("TIMESTAMP_SPARK_1_6", rf"\d{{2}}/\d{{2}}/\d{{2}} \d{{2}}:\d{{2}}:\d{{2}}")
parser.add_var("SYSTEM_LEVEL", rf"(?<level>(INFO)|(WARN)|(ERROR))")
parser.add_var("SPARK_APP_NAME", rf"(?<spark_app_name>SparkSQL::{PATTERN.IPV4})")
parser.add_var("SPARK_HOST_IP_PORT", rf"(?<spark_host>spark\-{PATTERN.INT})/(?<system_ip>{PATTERN.IPV4}):(?<system_port>{PATTERN.PORT})")
parser.add_var(
    "SYSTEM_EXCEPTION",
    rf"(?<system_exception_type>({PATTERN.JAVA_PACKAGE_SEGMENT})+[{PATTERN.JAVA_IDENTIFIER_CHARSET}]*Exception): "
    rf"(?<system_exception_msg>{PATTERN.LOG_LINE})"
)
parser.add_var(
    "SYSTEM_STACK_TRACE", rf"\s{{1,4}}at (?<system_stack>{PATTERN.JAVA_STACK_LOCATION})"
)
parser.add_var("IP_PORT", rf"(?<system_ip>{PATTERN.IPV4}):(?<system_port>{PATTERN.PORT})")
parser.compile()
# Stream parsing: iterate over multi-line log events
for idx, event in enumerate(parser.parse(SAMPLE_LOGS)):
print(f"log-event-{idx} log template type:{event.get_log_type().strip()}")
Output:
log-event-0 log template type:<timestamp> <level> master.Master: Registering app <spark_app_name>
log-event-1 log template type:<timestamp> <level> server.TransportChannelHandler: Exception in connection from <spark_host>/<system_ip>:<system_port>
<system_exception_type>: <system_exception_msg><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack><newLine> at <system_stack>
log-event-2 log template type:<timestamp> <level> master.Master: <system_ip>:<system_port> got disassociated, removing it.<newLine>
The parser successfully separated the log stream into 3 distinct events using timestamps as contextual anchors:
- Event 0: Single-line app registration log
- Event 1: Multi-line exception with 15 stack trace lines (demonstrates how timestamps bind multi-line events together)
- Event 2: Single-line disassociation log
Each log type shows the template structure with variable placeholders (<level>, <system_ip>, etc.), enabling pattern-based log analysis and grouping.
Using Pattern Constants
from log_surgeon import Parser, Pattern
parser = Parser()
parser.add_var("network", rf"IP: (?<ip>{Pattern.IPV4}) UUID: (?<id>{Pattern.UUID})")
parser.add_var("metrics", rf"value=(?<value>{Pattern.FLOAT})")
parser.compile()
log_line = "IP: 192.168.1.1 UUID: 550e8400-e29b-41d4-a716-446655440000 value=42.5"
event = parser.parse_event(log_line)
print(f"IP: {event['ip']}")
print(f"UUID: {event['id']}")
print(f"Value: {event['value']}")
Export to DataFrame
from log_surgeon import Parser, Query
parser = Parser()
parser.add_var(
"metric",
rf"metric=(?<metric_name>\w+) value=(?<value>\d+)"
)
parser.compile()
log_data = """
2024-01-01 INFO: metric=cpu value=42
2024-01-01 INFO: metric=memory value=100
2024-01-01 INFO: metric=disk value=7
"""
# Create a query and export to DataFrame
query = (
Query(parser)
.select(["metric_name", "value"])
.from_(log_data)
.validate_query()
)
df = query.to_dataframe()
print(df)
Filtering Events
from log_surgeon import Parser, Query
parser = Parser()
parser.add_var("metric", rf"metric=(?<metric_name>\w+) value=(?<value>\d+)")
parser.compile()
log_data = """
2024-01-01 INFO: metric=cpu value=42
2024-01-01 INFO: metric=memory value=100
2024-01-01 INFO: metric=disk value=7
2024-01-01 INFO: metric=cpu value=85
"""
# Filter events where value > 50
query = (
Query(parser)
.select(["metric_name", "value"])
.from_(log_data)
.filter(lambda event: int(event['value']) > 50)
.validate_query()
)
df = query.to_dataframe()
print(df)
# Output:
# metric_name value
# 0 memory 100
# 1 cpu 85
Including Log Template Type and Log Message
Use the special fields @log_type and @log_message to include the log template and the original message alongside extracted variables:
from log_surgeon import Parser, Query
parser = Parser()
parser.add_var("metric", rf"value=(?<value>\d+)")
parser.compile()
log_data = """
2024-01-01 INFO: Processing value=42
2024-01-01 WARN: Processing value=100
"""
# Select log type, message, and all variables
query = (
Query(parser)
.select(["@log_type", "@log_message", "*"])
.from_(log_data)
.validate_query()
)
df = query.to_dataframe()
print(df)
# Output:
# @log_type @log_message value
# 0 <timestamp> INFO: Processing <metric> 2024-01-01 INFO: Processing value=42 42
# 1 <timestamp> WARN: Processing <metric> 2024-01-01 WARN: Processing value=100 100
The "*" wildcard expands to all variables defined in the schema and can be combined with other fields like @log_type and @log_message.
Analyzing Log Types
Discover and analyze log patterns in your data using log type analysis methods:
from log_surgeon import Parser, Query
parser = Parser()
parser.add_var("metric", rf"value=(?<value>\d+)")
parser.add_var("status", rf"status=(?<status>\w+)")
parser.compile()
log_data = """
2024-01-01 INFO: Processing value=42
2024-01-01 INFO: Processing value=100
2024-01-01 WARN: System status=degraded
2024-01-01 INFO: Processing value=7
2024-01-01 ERROR: System status=failed
"""
query = Query(parser).from_(log_data)
# Get all unique log types
print("Unique log types:")
for log_type in query.get_log_types():
print(f" {log_type}")
# Reset stream for next analysis
query.from_(log_data)
# Get log type occurrence counts
print("\nLog type counts:")
counts = query.get_log_type_counts()
for log_type, count in sorted(counts.items(), key=lambda x: -x[1]):
print(f" {count:3d} {log_type}")
# Reset stream for next analysis
query.from_(log_data)
# Get sample messages for each log type
print("\nLog type samples:")
samples = query.get_log_type_with_sample(sample_size=2)
for log_type, messages in samples.items():
print(f" {log_type}")
for msg in messages:
print(f" - {msg.strip()}")
Output:
Unique log types:
<timestamp> INFO: Processing <metric>
<timestamp> WARN: System <status>
<timestamp> ERROR: System <status>
Log type counts:
3 <timestamp> INFO: Processing <metric>
1 <timestamp> WARN: System <status>
1 <timestamp> ERROR: System <status>
Log type samples:
<timestamp> INFO: Processing <metric>
- 2024-01-01 INFO: Processing value=42
- 2024-01-01 INFO: Processing value=100
<timestamp> WARN: System <status>
- 2024-01-01 WARN: System status=degraded
<timestamp> ERROR: System <status>
- 2024-01-01 ERROR: System status=failed
API Reference
Parser
High-level parser for extracting structured data from unstructured log messages.
Constructor
Parser(delimiters: str = r" \t\r\n:,!;%@/\(\)\[\]")
- Initialize a parser with optional custom delimiters
- Default delimiters include space, tab, newline, and common punctuation
Methods
- add_var(name: str, regex: str, hide_var_name_if_named_group_present: bool = True) -> Parser
  - Add a variable pattern to the parser's schema
  - Supports named capture groups using (?<name>) syntax
  - Use raw f-strings (rf"...") for regex patterns (see Using Raw F-Strings)
  - Returns self for method chaining
- add_timestamp(name: str, regex: str) -> Parser
  - Add a timestamp pattern to the parser's schema
  - Returns self for method chaining
- compile(enable_debug_logs: bool = False) -> None
  - Build and initialize the parser with the configured schema
  - Must be called after adding variables and before parsing
  - Set enable_debug_logs=True to output debug information to stderr
- load_schema(schema: str, group_name_resolver: GroupNameResolver) -> None
  - Load a pre-built schema string to configure the parser
- parse(input: str | TextIO | BinaryIO | io.StringIO | io.BytesIO) -> Generator[LogEvent, None, None]
  - Parse all log events from a string, file object, or stream
  - Accepts strings, text/binary file objects, StringIO, or BytesIO
  - Yields LogEvent objects for each parsed event
- parse_event(payload: str) -> LogEvent | None
  - Parse a single log event from a string (convenience method)
  - Wraps parse() and returns the first event
  - Returns a LogEvent, or None if no event is found
LogEvent
Represents a parsed log event with extracted variables.
Methods
- get_log_message() -> str
  - Get the original log message
- get_log_type() -> str
  - Get the generated log type (template) with logical group names
- get_capture_group(logical_capture_group_name: str, raw_output: bool = False) -> str | list | None
  - Get the value of a capture group by its logical name
  - If raw_output=False (default), single values are unwrapped from lists
  - Returns None if the capture group is not found
- get_capture_group_str_representation(field: str, raw_output: bool = False) -> str
  - Get the string representation of a capture group value
- get_resolved_dict() -> dict[str, str | list]
  - Get a dictionary with all capture groups using logical (user-defined) names
  - Physical names (CGPrefix*) are converted to logical names
  - Timestamp fields are consolidated under the "timestamp" key
  - Single-value lists are unwrapped to scalar values
  - "@LogType" is excluded from the output
- __getitem__(key: str) -> str | list
  - Access capture group values by name (e.g., event['field_name'])
  - Shorthand for get_capture_group(key, raw_output=False)
- __str__() -> str
  - Get a formatted JSON representation of the log event with logical group names
  - Uses get_resolved_dict() internally
Query
Query builder for parsing log events into structured data formats.
Constructor
Query(parser: Parser)
- Initialize a query with a configured parser
Methods
- select(fields: list[str]) -> Query
  - Select fields to extract from log events
  - Supports variable names, "*" for all variables, "@log_type" for the log type, and "@log_message" for the original message
  - The "*" wildcard can be combined with other fields (e.g., ["@log_type", "*"])
  - Returns self for method chaining
- filter(predicate: Callable[[LogEvent], bool]) -> Query
  - Filter log events using a predicate function
  - The predicate receives a LogEvent and returns True to include it, False to exclude it
  - Returns self for method chaining
  - Example: query.filter(lambda event: int(event['value']) > 50)
- from_(input: str | TextIO | BinaryIO | io.StringIO | io.BytesIO) -> Query
  - Set the input source to parse
  - Accepts strings, text/binary file objects, StringIO, or BytesIO
  - Strings are automatically wrapped in StringIO
  - Returns self for method chaining
- select_from(input: str | TextIO | BinaryIO | io.StringIO | io.BytesIO) -> Query
  - Alias for from_()
  - Returns self for method chaining
- validate_query() -> Query
  - Validate that the query is properly configured
  - Returns self for method chaining
- to_dataframe() -> pd.DataFrame
  - Convert parsed events to a pandas DataFrame
- to_df() -> pd.DataFrame
  - Alias for to_dataframe()
- to_arrow() -> pa.Table
  - Convert parsed events to a PyArrow Table
- to_pa() -> pa.Table
  - Alias for to_arrow()
- get_rows() -> list[list]
  - Extract rows of field values from parsed events
- get_vars() -> KeysView[str]
  - Get all variable names (logical capture group names) defined in the schema
- get_log_types() -> Generator[str, None, None]
  - Get all unique log types from parsed events
  - Yields log types in the order they are first encountered
  - Useful for discovering log patterns in your data
- get_log_type_counts() -> dict[str, int]
  - Get the count of occurrences for each unique log type
  - Returns a dictionary mapping log types to their counts
  - Useful for analyzing log type distribution
- get_log_type_with_sample(sample_size: int = 3) -> dict[str, list[str]]
  - Get sample log messages for each unique log type
  - Returns a dictionary mapping log types to lists of sample messages
  - Useful for understanding what actual messages match each template
SchemaCompiler
Compiler for constructing log-surgeon schema definitions.
Constructor
SchemaCompiler(delimiters: str = DEFAULT_DELIMITERS)
- Initialize a schema compiler with optional custom delimiters
Methods
- add_var(name: str, regex: str, hide_var_name_if_named_group_present: bool = True) -> SchemaCompiler
  - Add a variable pattern to the schema
  - Returns self for method chaining
- add_timestamp(name: str, regex: str) -> SchemaCompiler
  - Add a timestamp pattern to the schema
  - Returns self for method chaining
- remove_var(var_name: str) -> SchemaCompiler
  - Remove a variable from the schema
  - Returns self for method chaining
- get_var(var_name: str) -> Variable
  - Get a variable by name
- compile() -> str
  - Compile the final schema string
- get_capture_group_name_resolver() -> GroupNameResolver
  - Get the resolver for mapping logical to physical capture group names
GroupNameResolver
Bidirectional mapping between logical (user-defined) and physical (auto-generated) group names.
Constructor
GroupNameResolver(physical_name_prefix: str)
- Initialize with a prefix for auto-generated physical names
Methods
- create_new_physical_name(logical_name: str) -> str
  - Create a new unique physical name for a logical name
  - Each call generates a new physical name
- get_physical_names(logical_name: str) -> set[str]
  - Get all physical names associated with a logical name
- get_logical_name(physical_name: str) -> str
  - Get the logical name for a physical name
- get_all_logical_names() -> KeysView[str]
  - Get all logical names that have been registered
Pattern
Collection of common regex patterns for log parsing.
Class Attributes
- Pattern.UUID - Pattern for UUID (Universally Unique Identifier) strings
- Pattern.IP_OCTET - Pattern for a single IPv4 octet (0-255)
- Pattern.IPV4 - Pattern for IPv4 addresses
- Pattern.INT - Pattern for integers (with optional negative sign)
- Pattern.FLOAT - Pattern for floating-point numbers (with optional negative sign)
Example Usage
from log_surgeon import Parser, Pattern
parser = Parser()
parser.add_var("ip", rf"IP: (?<ip_address>{Pattern.IPV4})")
parser.add_var("id", rf"ID: (?<uuid>{Pattern.UUID})")
parser.add_var("value", rf"value=(?<val>{Pattern.FLOAT})")
parser.compile()
Key Concepts
⚠️ CRITICAL: You must understand these concepts to use log-surgeon correctly.
log-surgeon works fundamentally differently from traditional regex engines such as Python's re module, PCRE, or JavaScript regex. Skipping this section will lead to patterns that don't work as expected.
Token-Based Parsing and Delimiters
CRITICAL: log-surgeon uses token-based parsing, not character-based regex matching like traditional regex engines. This is the most important difference that affects how patterns work.
How Tokenization Works
Delimiters are characters used to split log messages into tokens. The default delimiters include:
- Whitespace: space, tab (\t), newline (\n), carriage return (\r)
- Punctuation: : , ! ; % @ / ( ) [ ]
For example, with default delimiters, the log message:
"abc def ghi"
is tokenized into three tokens: ["abc", "def", "ghi"]
You can customize delimiters when creating a Parser:
parser = Parser(delimiters=r" \t\n,:") # Custom delimiters
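To build intuition for how delimiters drive tokenization, the splitting step can be approximated in plain Python. This is a sketch only, not the library's actual tokenizer:

```python
import re

# Approximation of the default delimiter set (space, tab, newline, punctuation).
# Illustrative only: this mimics, not reproduces, log-surgeon's tokenizer.
DELIMITERS = " \t\r\n:,!;%@/()[]"

def tokenize(message: str) -> list[str]:
    # Split on runs of delimiter characters and drop empty pieces.
    return [tok for tok in re.split("[" + re.escape(DELIMITERS) + "]+", message) if tok]

print(tokenize("abc def ghi"))  # ['abc', 'def', 'ghi']
print(tokenize("a=1, b=2"))     # ['a=1', 'b=2']
```

Note how "a=1" survives as a single token: "=" is not a delimiter, so patterns can match across it, while the comma and space split the message.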
Token-Based Pattern Matching
Critical: Patterns like .* only match within a single token, not across multiple tokens or delimiters.
from log_surgeon import Parser
parser = Parser() # Default delimiters include space
parser.add_var("token", rf"(?<match>d.*)")
parser.compile()
# With "abc def ghi" tokenized as ["abc", "def", "ghi"]
event = parser.parse_event("abc def ghi")
# ✅ Matches only "def" (single token starting with 'd')
# ❌ Does NOT match "def ghi" (would cross token boundary)
print(event['match']) # Output: "def"
In a traditional regex engine, d.* would match "def ghi" (everything from 'd' to end).
In log-surgeon, d.* matches only "def" because patterns cannot cross delimiter boundaries.
Why Token-Based?
Token-based parsing enables:
- Faster parsing by reducing search space
- Predictable behavior aligned with log structure
- Efficient log type generation for analytics
Working with Token Boundaries
To match across multiple tokens, you must use character classes like [a-z ]* instead of .*:
from log_surgeon import Parser
parser = Parser() # Default delimiters include space
# ❌ Using .* - only matches within a single token
parser.add_var("wrong", rf"(?<match>d.*)") # Matches only "def"
# ✅ Using character classes - matches across tokens
parser.add_var("correct", rf"(?<match>d[a-z ]*i)") # Matches "def ghi"
parser.compile()
event = parser.parse_event("abc def ghi")
print(event['match']) # Output: "def ghi"
Key Rule: Character classes like [a-zA-Z]*, [a-z ]*, or [\w\s]* can match across token boundaries, but .* cannot.
Alternation Requires Grouping
CRITICAL: Alternation (|) works differently in log-surgeon compared to traditional regex engines. You must use parentheses to group alternatives.
from log_surgeon import Parser
parser = Parser()
# ❌ WRONG: Without grouping - matches "ab" AND ("c" OR "d") AND "ef"
parser.add_var("wrong", rf"(?<word>abc|def)")
# In log-surgeon, this is interpreted as: "ab" + "c|d" + "ef"
# Matches: "abcef" or "abdef" (NOT "abc" or "def")
# ✅ CORRECT: With grouping - matches "abc" OR "def"
parser.add_var("correct", rf"(?<word>(abc)|(def))")
# Matches: "abc" or "def"
parser.compile()
In traditional regex engines, abc|def means "abc" OR "def".
In log-surgeon, abc|def means "ab" + ("c" OR "d") + "ef".
Key Rule: Always use (abc)|(def) syntax for alternation to match complete alternatives.
# More examples:
parser.add_var("level", rf"(?<level>(ERROR)|(WARN)|(INFO))") # ✅ Correct
parser.add_var("status", rf"(?<status>(success)|(failure))") # ✅ Correct
parser.add_var("bad", rf"(?<status>success|failure)") # ❌ Wrong - unexpected behavior
Optional Patterns
For optional patterns, use {0,1} instead of *:
from log_surgeon import Parser
parser = Parser()
# ❌ Avoid using * for optional patterns (matches 0 or more)
parser.add_var("avoid", rf"(?<level>(ERROR)|(WARN))*") # Can match empty string or multiple repetitions
# ❌ Do not use ? for optional patterns
parser.add_var("avoid2", rf"(?<level>(ERROR)|(WARN))?") # May not work as expected
# ✅ Use {0,1} for optional patterns (matches 0 or 1)
parser.add_var("optional", rf"(?<level>(ERROR)|(WARN)){{0,1}}")  # Matches 0 or 1 occurrence (braces doubled inside the f-string)
parser.compile()
Best Practice: Use {0,1} for optional elements. Avoid * (0 or more) and ? for optional matching.
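Because these patterns are written as f-strings, remember that a bare {0,1} inside an rf"..." literal is evaluated as a Python expression rather than kept as regex text. You can check this in plain Python:

```python
# Inside an f-string, {0,1} is evaluated as the Python tuple (0, 1) -- a silent bug.
buggy = rf"(ERROR)|(WARN){0,1}"
print(buggy)    # (ERROR)|(WARN)(0, 1)

# Doubling the braces keeps them literal, which is what the regex engine needs.
correct = rf"(ERROR)|(WARN){{0,1}}"
print(correct)  # (ERROR)|(WARN){0,1}
```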
You can also explicitly include delimiters in your pattern:
# To match "def ghi", explicitly include the space delimiter
parser.add_var("multi", rf"(?<match>d\w+\s+\w+)")
# This matches "def " as one token segment, followed by "ghi"
Or adjust your delimiters to change tokenization behavior:
# Use only newline as delimiter to treat entire lines as tokens
parser = Parser(delimiters=r"\n")
Named Capture Groups
Use named capture groups in regex patterns to extract specific fields:
parser.add_var("metric", rf"metric=(?<metric_name>\w+) value=(?<value>\d+)")
The syntax (?<name>pattern) creates a capture group that can be accessed as event['name'].
Note: See Using Raw F-Strings for best practices on writing regex patterns.
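Note also that the (?<name>pattern) syntax shown here is log-surgeon's; Python's built-in re module spells named groups (?P<name>pattern), so patterns are not directly interchangeable between the two:

```python
import re

# Python's re module requires the (?P<name>...) spelling for named groups;
# log-surgeon's (?<name>...) spelling would raise re.error here.
match = re.search(r"value=(?P<value>\d+)", "metric=cpu value=42")
print(match.group("value"))  # 42
```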
Using Raw F-Strings for Regex Patterns
⚠️ STRONGLY RECOMMENDED: Use raw f-strings (rf"...") for all regex patterns.

While not absolutely required, using regular strings will likely cause escaping issues and pattern failures. Raw f-strings prevent these problems.
Raw f-strings combine the benefits of:
- Raw strings (r"..."): No need to double-escape regex special characters like \d, \w, \n
- F-strings (f"..."): Easy interpolation of variables and pattern constants
Why Use Raw F-Strings?
# ❌ Without raw strings - requires double-escaping
parser.add_var("metric", "value=(\\d+)") # Hard to read, error-prone
# ✅ With raw f-strings - single escaping, clean and readable
parser.add_var("metric", rf"value=(?<value>\d+)")
Watch Out for Braces in F-Strings
When using f-strings, literal { and } characters must be escaped by doubling them:
from log_surgeon import Parser, Pattern
parser = Parser()
# ✅ Correct: Escape literal braces in regex
parser.add_var("json", rf"data={{(?<content>[^}}]+)}}") # Matches: data={...}
parser.add_var("range", rf"range={{(?<min>\d+),(?<max>\d+)}}") # Matches: range={10,20}
# ✅ Using Pattern constants with interpolation
parser.add_var("ip", rf"IP: (?<ip>{Pattern.IPV4})")
parser.add_var("float", rf"value=(?<val>{Pattern.FLOAT})")
# ✅ Common regex patterns
parser.add_var("digits", rf"\d+ items") # No double-escaping needed
parser.add_var("word", rf"name=(?<name>\w+)")
parser.add_var("whitespace", rf"split\s+by\s+spaces")
parser.compile()
Examples: Raw F-Strings vs Regular Strings
# Regular string - requires double-escaping
parser.add_var("path", "path=(?<path>\\w+/\\w+)") # Hard to read
# Raw f-string - natural regex syntax
parser.add_var("path", rf"path=(?<path>\w+/\w+)") # Clean and readable
# With interpolation
log_level = "INFO|WARN|ERROR"
parser.add_var("level", rf"(?<level>{log_level})") # Easy to compose
Recommendation: Consistently use rf"..." for all regex patterns. This approach:
- Avoids double-escaping mistakes that break patterns
- Makes patterns more readable
- Allows easy use of Pattern constants and variables
- Only requires watching for literal braces { and } in f-strings (escape them as {{ and }})
Using regular strings ("...") requires double-escaping (e.g., "\\d+"), which is error-prone and hard to read.
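The string forms above are equivalent at the Python level; only readability differs. A quick self-contained demonstration (plain Python, no log-surgeon calls):

```python
# A raw string and a double-escaped regular string produce the exact
# same pattern text; the raw form is simply easier to read and write.
raw_pattern = rf"value=(?<value>\d+)"
escaped_pattern = "value=(?<value>\\d+)"
assert raw_pattern == escaped_pattern

# F-string interpolation works the same way inside a raw f-string.
log_level = "(INFO)|(WARN)|(ERROR)"
level_pattern = rf"(?<level>{log_level})"
assert level_pattern == "(?<level>(INFO)|(WARN)|(ERROR))"

# Doubled braces in an f-string yield single literal braces.
brace_pattern = rf"data={{(?<content>[^}}]+)}}"
assert brace_pattern == "data={(?<content>[^}]+)}"
```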
Logical vs Physical Names
Internally, log-surgeon uses "physical" names (e.g., CGPrefix0, CGPrefix1) for capture groups, while you work with "logical" names (e.g., user_id, thread). The GroupNameResolver handles this mapping automatically.
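Conceptually, the resolver is a bidirectional name mapping. A minimal sketch with plain dicts, purely illustrative (the names and capture values below are hypothetical; the real GroupNameResolver handles this internally):

```python
# Hypothetical logical -> physical mapping, as described above.
logical_to_physical = {"user_id": "CGPrefix0", "thread": "CGPrefix1"}
physical_to_logical = {v: k for k, v in logical_to_physical.items()}

# A capture keyed by physical names can be translated back to the
# logical names you wrote in your patterns.
raw_capture = {"CGPrefix0": "alice", "CGPrefix1": "worker-3"}
resolved = {physical_to_logical[k]: v for k, v in raw_capture.items()}
assert resolved == {"user_id": "alice", "thread": "worker-3"}
```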
Schema Format
The schema defines delimiters, timestamps, and variables for parsing:
// schema delimiters
delimiters: \t\r\n:,!;%@/\(\)\[\]
// schema timestamps
timestamp:<timestamp_regex>
// schema variables
variable_name:<variable_regex>
When using the fluent API (Parser.add_var() and Parser.compile()), the schema is built automatically.
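To make the correspondence concrete, here is a minimal sketch of how variable definitions map onto schema lines. The helper below is hypothetical (not part of the library's API); it only illustrates the `name:<regex>` line format shown above:

```python
# Hypothetical helper mirroring what the fluent API does for you:
# each variable becomes a `name:<regex>` line under a delimiters line.
def build_schema(delimiters: str, variables: dict[str, str]) -> str:
    lines = [f"delimiters: {delimiters}"]
    lines += [f"{name}:{regex}" for name, regex in variables.items()]
    return "\n".join(lines)

schema = build_schema(
    r"\t\r\n:,!;%@/\(\)\[\]",  # example delimiters, as in the format above
    {"level": r"(INFO)|(WARN)|(ERROR)", "ip": r"\d+\.\d+\.\d+\.\d+"},
)
assert schema.splitlines()[0].startswith("delimiters:")
assert "level:(INFO)|(WARN)|(ERROR)" in schema
```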
Development
Building from Source
# Clone the repository
git clone https://github.com/y-scope/log-surgeon-ffi-py.git
cd log-surgeon-ffi-py
# Install the project in editable mode
pip install -e .
# Build the extension
cmake -S . -B build
cmake --build build
Running Tests
# Install test dependencies
pip install pytest
# Run tests
python -m pytest tests/
Requirements
- Python >= 3.9
- pandas
- pyarrow
Build Requirements
- C++20 compatible compiler
- CMake >= 3.15
License
Apache License 2.0 - See LICENSE for details.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.