Python FFI bindings for log-surgeon: high-performance parsing of unstructured logs into structured data

log-surgeon-ffi

log-surgeon-ffi provides Python foreign function interface (FFI) bindings for log-surgeon.


Quick navigation

Overview

Getting started

Key concepts

Reference

Development


Overview

log-surgeon is a high-performance C++ library that enables efficient extraction of structured information from unstructured log files.

Why log-surgeon?

Traditional regex engines are often slow to execute, prone to errors, and costly to maintain.

log-surgeon streamlines the process by identifying, extracting, and labeling variable values with semantic context, and then inferring a log template in a single pass. log-surgeon is also built to accommodate structural variability. Values may shift position, appear multiple times, or change order entirely, but with log-surgeon, you simply define the variable patterns, and log-surgeon JIT-compiles a tagged-DFA state machine to drive the full pipeline.

Key capabilities

  • Extract variables from log messages using regex patterns with named capture groups
  • Generate log types (templates) automatically for log analysis
  • Parse streams efficiently for large-scale log processing
  • Export data to pandas DataFrames and PyArrow Tables

Structured output and downstream capabilities

Unstructured log data is automatically transformed into structured semantic representations.

  • Log types (templates): Variables are replaced with placeholders to form reusable templates. For example, roughly 200,000 Spark log messages can reduce to about 55 distinct templates, which supports pattern analysis and anomaly detection.

  • Semantic Variables: Extracted key-value pairs with semantic context (e.g., app_id, app_name, worker_id) can be used directly for analysis.

This structured output unlocks powerful downstream capabilities:

  • Knowledge graph construction. Build relationship graphs between entities extracted from logs (e.g., linking app_id → app_name → worker_id).

  • Template-based summarization. Compress massive datasets into compact template sets for human and agent consumption. Templates act as natural tokens for LLMs. Instead of millions of raw lines, provide a small number of distinct templates with statistics.

  • Hybrid search. Combine free-text search with structured queries. Log types enable auto-completion and query suggestions on large datasets. Instead of searching through millions of raw log lines, search across a compact set of templates first. Then project and filter on structured variables (e.g., status == "ERROR", response_time > 1000), and aggregate for analysis.

  • Agentic automation. Agents can query by template, analyze variable distributions, identify anomalies, and automate debugging tasks using structured signals rather than raw text.

When to use log-surgeon

Good fit

  • Large-scale log processing (millions of lines)
  • Extracting structured data from semi-structured logs
  • Generating log templates for analytics
  • Multi-line log events (stack traces, JSON dumps)
  • Performance-critical parsing

Not ideal

  • Simple one-off text extraction (use Python re module)
  • Highly irregular text without consistent delimiters
  • Patterns requiring full PCRE features (lookahead, backreferences)

Getting started

Follow the instructions below to get started with log-surgeon-ffi.

System requirements

  • Python >= 3.9
  • pandas
  • pyarrow

Build requirements

  • C++20 compatible compiler
  • CMake >= 3.15

Installation

To install the library with pandas and PyArrow support for DataFrame/Arrow table exports, run the following command:

pip install log-surgeon-ffi

To verify your installation, run the following command:

python -c "from log_surgeon import Parser; print('Installation successful.')"

Note: If you only need core parsing without DataFrame or Arrow exports, you can install a minimal environment, although pandas and PyArrow are included by default for convenience.

First steps

After installation, follow these steps:

  1. Read Key Concepts. Token-based parsing differs from traditional regex.
  2. Run a Quick start example to see how it works.
  3. Use rf"..." for patterns to avoid escaping issues. See Using Raw f-strings.
  4. Check out examples/ to study some complete working examples.

Important prerequisites

log-surgeon uses token-based parsing, and its regex behavior differs from traditional engines. Read the Key Concepts section before writing patterns.

Critical differences between token-based parsing and traditional regex behavior:

  • .* only matches within a single token (not across delimiters)
  • abc|def requires grouping: use (abc)|(def) instead
  • Use {0,1} for optional patterns, NOT ?

Tip: Use raw f-strings (rf"...") for regex patterns. See Using Raw f-strings for more details.


Quick start examples

Use the following examples to get started.

Basic parsing

The following code parses a simple log event with log-surgeon.

from log_surgeon import Parser, PATTERN

# Parse a sample log event
log_line = "16/05/04 04:24:58 INFO Registering worker with 1 core and 4.0 GiB ram\n"

# Create a parser and define extraction patterns
parser = Parser()
parser.add_var("resource", rf"(?<memory_gb>{PATTERN.FLOAT}) GiB ram")
parser.compile()

# Parse a single event
event = parser.parse_event(log_line)

# Access extracted data
print(f"Message: {event.get_log_message().strip()}")
print(f"LogType: {event.get_log_type().strip()}")
print(f"Parsed Logs: {event}")

Output:

Message: 16/05/04 04:24:58 INFO Registering worker with 1 core and 4.0 GiB ram
LogType: 16/05/04 04:24:58 INFO Registering worker with 1 core and <memory_gb> GiB ram
Parsed Logs: {
  "memory_gb": "4.0"
}

We can see that the parser extracted structured data from the unstructured log line:

  • Message: The original log line
  • LogType: Template with variable placeholder <memory_gb> showing the pattern structure
  • Parsed variables: Successfully extracted memory_gb value of "4.0" from the pattern match

Try it yourself

Copy this code and modify the pattern to extract both memory_gb AND cores:

from log_surgeon import Parser, PATTERN

log_line = "16/05/04 04:24:58 INFO Registering worker with 1 core and 4.0 GiB ram\n"
parser = Parser()
# TODO: Add pattern to capture both "1" (cores) and "4.0" (memory_gb)
parser.add_var("resource", rf"...")
parser.compile()

event = parser.parse_event(log_line)
print(f"Cores: {event['cores']}, Memory: {event['memory_gb']}")
Solution
parser.add_var("resource", rf"(?<cores>\d+) core and (?<memory_gb>{PATTERN.FLOAT}) GiB ram")

Multiple capture groups

The following code parses a more complex log event.

from log_surgeon import Parser, PATTERN

# Parse a sample log event
log_line = """16/05/04 12:22:37 WARN server.TransportChannelHandler: Exception in connection from spark-35/192.168.10.50:55392
java.io.IOException: Connection reset by peer
        at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:313)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:750)
"""

# Create a parser and define extraction patterns
parser = Parser()

# Add timestamp pattern
parser.add_timestamp("TIMESTAMP_SPARK_1_6", rf"\d{{2}}/\d{{2}}/\d{{2}} \d{{2}}:\d{{2}}:\d{{2}}")

# Add variable patterns
parser.add_var("SYSTEM_LEVEL", rf"(?<level>(INFO)|(WARN)|(ERROR))")
parser.add_var("SPARK_HOST_IP_PORT", rf"(?<spark_host>spark\-{PATTERN.INT})/(?<system_ip>{PATTERN.IPV4}):(?<system_port>{PATTERN.PORT})")
parser.add_var(
  "SYSTEM_EXCEPTION",
  rf"(?<system_exception_type>({PATTERN.JAVA_PACKAGE_SEGMENT})+[{PATTERN.JAVA_IDENTIFIER_CHARSET}]*Exception): "
  rf"(?<system_exception_msg>{PATTERN.LOG_LINE})"
)
parser.add_var(
  "SYSTEM_STACK_TRACE",
  rf"\s{{1,4}}at (?<system_stack>{PATTERN.JAVA_STACK_LOCATION})"
)
parser.compile()

# Parse a single event
event = parser.parse_event(log_line)

# Access extracted data
print(f"Message: {event.get_log_message().strip()}")
print(f"LogType: {event.get_log_type().strip()}")
print(f"Parsed Logs: {event}")

Output:

Message: 16/05/04 12:22:37 WARN server.TransportChannelHandler: Exception in connection from spark-35/192.168.10.50:55392
java.io.IOException: Connection reset by peer
        at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:313)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:750)
LogType: <timestamp> <level> server.TransportChannelHandler: Exception in connection from <spark_host>/<system_ip>:<system_port>
<system_exception_type>: <system_exception_msg><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>
Parsed Logs: {
  "timestamp": "16/05/04 12:22:37",
  "level": "WARN",
  "spark_host": "spark-35",
  "system_ip": "192.168.10.50",
  "system_port": "55392",
  "system_exception_type": "java.io.IOException",
  "system_exception_msg": "Connection reset by peer",
  "system_stack": [
    "sun.nio.ch.FileDispatcherImpl.read0(Native Method)",
    "sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)",
    "sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)",
    "sun.nio.ch.IOUtil.read(IOUtil.java:192)",
    "sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)",
    "io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:313)",
    "io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)",
    "io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)",
    "io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)",
    "io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)",
    "io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)",
    "io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)",
    "io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)",
    "io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)",
    "java.lang.Thread.run(Thread.java:750)"
  ]
}

The parser extracted multiple named capture groups from a complex multi-line Java stack trace:

  • Scalar fields: timestamp, level, spark_host, system_ip, system_port, system_exception_type, system_exception_msg
  • Array field: system_stack contains all 15 stack trace locations (demonstrates automatic aggregation of repeated capture groups)
  • LogType: Template shows the structure with <newLine> markers indicating line boundaries in the original log

Stream parsing

When parsing log streams or files, timestamps are required to perform contextual anchoring. Timestamps act as delimiters that separate individual log events, enabling the parser to correctly group multi-line entries (like stack traces) into single events.

from log_surgeon import Parser, PATTERN

# Parse from string (automatically converted to io.StringIO)
SAMPLE_LOGS = """16/05/04 04:31:13 INFO master.Master: Registering app SparkSQL::192.168.10.76
16/05/04 12:32:37 WARN server.TransportChannelHandler: Exception in connection from spark-35/192.168.10.50:55392
java.io.IOException: Connection reset by peer
        at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:313)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:750)
16/05/04 04:37:53 INFO master.Master: 192.168.10.76:41747 got disassociated, removing it.
"""

# Define parser with patterns
parser = Parser()
# REQUIRED: Timestamp acts as contextual anchor to separate individual log events in the stream
parser.add_timestamp("TIMESTAMP_SPARK_1_6", rf"\d{{2}}/\d{{2}}/\d{{2}} \d{{2}}:\d{{2}}:\d{{2}}")
parser.add_var("SYSTEM_LEVEL", rf"(?<level>(INFO)|(WARN)|(ERROR))")
parser.add_var("SPARK_APP_NAME", rf"(?<spark_app_name>SparkSQL::{PATTERN.IPV4})")
parser.add_var("SPARK_HOST_IP_PORT", rf"(?<spark_host>spark\-{PATTERN.INT})/(?<system_ip>{PATTERN.IPV4}):(?<system_port>{PATTERN.PORT})")
parser.add_var(
    "SYSTEM_EXCEPTION",
    rf"(?<system_exception_type>({PATTERN.JAVA_PACKAGE_SEGMENT})+[{PATTERN.JAVA_IDENTIFIER_CHARSET}]*Exception): "
    rf"(?<system_exception_msg>{PATTERN.LOG_LINE})"
)
parser.add_var(
    "SYSTEM_STACK_TRACE", rf"\s{{1,4}}at (?<system_stack>{PATTERN.JAVA_STACK_LOCATION})"
)
parser.add_var("IP_PORT", rf"(?<system_ip>{PATTERN.IPV4}):(?<system_port>{PATTERN.PORT})")
parser.compile()

# Stream parsing: iterate over multi-line log events
for idx, event in enumerate(parser.parse(SAMPLE_LOGS)):
    print(f"log-event-{idx} log template type:{event.get_log_type().strip()}")

Output:

log-event-0 log template type:<timestamp> <level> master.Master: Registering app <spark_app_name>
log-event-1 log template type:<timestamp> <level> server.TransportChannelHandler: Exception in connection from <spark_host>/<system_ip>:<system_port>
<system_exception_type>: <system_exception_msg><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack><newLine>        at <system_stack>
log-event-2 log template type:<timestamp> <level> master.Master: <system_ip>:<system_port> got disassociated, removing it.<newLine>

The parser successfully separated the log stream into three distinct events using timestamps as contextual anchors:

  • Event 0: Single-line app registration log
  • Event 1: Multi-line exception with 15 stack trace lines (demonstrates how timestamps bind multi-line events together)
  • Event 2: Single-line disassociation log

Each log type shows the template structure with variable placeholders (<level>, <system_ip>, etc.), enabling pattern-based log analysis and grouping.


Using PATTERN constants

The PATTERN class provides pre-built regex patterns for common log elements like IP addresses, UUIDs, numbers, and file paths. See the PATTERN reference for the complete list of available patterns.

from log_surgeon import Parser, PATTERN

parser = Parser()
parser.add_var("network", rf"IP: (?<ip>{PATTERN.IPV4}) UUID: (?<id>{PATTERN.UUID})")
parser.add_var("metrics", rf"value=(?<value>{PATTERN.FLOAT})")
parser.compile()

log_line = "IP: 192.168.1.1 UUID: 550e8400-e29b-41d4-a716-446655440000 value=42.5"
event = parser.parse_event(log_line)

print(f"IP: {event['ip']}")
print(f"UUID: {event['id']}")
print(f"Value: {event['value']}")

Output:

IP: 192.168.1.1
UUID: 550e8400-e29b-41d4-a716-446655440000
Value: 42.5

Export to DataFrame

from log_surgeon import Parser, Query

parser = Parser()
parser.add_var(
  "metric",
  rf"metric=(?<metric_name>\w+) value=(?<value>\d+)"
)
parser.compile()

log_data = """
2024-01-01 INFO: metric=cpu value=42
2024-01-01 INFO: metric=memory value=100
2024-01-01 INFO: metric=disk value=7
"""

# Create a query and export to DataFrame
query = (
  Query(parser)
  .select(["metric_name", "value"])
  .from_(log_data)
  .validate_query()
)

df = query.to_dataframe()
print(df)

Filtering events

from log_surgeon import Parser, Query

parser = Parser()
parser.add_var("metric", rf"metric=(?<metric_name>\w+) value=(?<value>\d+)")
parser.compile()

log_data = """
2024-01-01 INFO: metric=cpu value=42
2024-01-01 INFO: metric=memory value=100
2024-01-01 INFO: metric=disk value=7
2024-01-01 INFO: metric=cpu value=85
"""

# Filter events where value > 50
query = (
  Query(parser)
  .select(["metric_name", "value"])
  .from_(log_data)
  .filter(lambda event: int(event['value']) > 50)
  .validate_query()
)

df = query.to_dataframe()
print(df)
# Output:
#   metric_name  value
# 0      memory    100
# 1         cpu     85

Including log template type and log message

Use the special fields @log_type and @log_message to include the log template and original message alongside extracted variables:

from log_surgeon import Parser, Query

parser = Parser()
parser.add_var("metric", rf"value=(?<value>\d+)")
parser.compile()

log_data = """
2024-01-01 INFO: Processing value=42
2024-01-01 WARN: Processing value=100
"""

# Select log type, message, and all variables
query = (
  Query(parser)
  .select(["@log_type", "@log_message", "*"])
  .from_(log_data)
  .validate_query()
)

df = query.to_dataframe()
print(df)
# Output:
#                          @log_type                         @log_message value
# 0  <timestamp> INFO: Processing <metric>  2024-01-01 INFO: Processing value=42    42
# 1  <timestamp> WARN: Processing <metric>  2024-01-01 WARN: Processing value=100  100

The "*" wildcard expands to all variables defined in the schema and can be combined with other fields like @log_type and @log_message.


Analyzing Log Types

Discover and analyze log patterns in your data using log type analysis methods:

from log_surgeon import Parser, Query

parser = Parser()
parser.add_var("metric", rf"value=(?<value>\d+)")
parser.add_var("status", rf"status=(?<status>\w+)")
parser.compile()

log_data = """
2024-01-01 INFO: Processing value=42
2024-01-01 INFO: Processing value=100
2024-01-01 WARN: System status=degraded
2024-01-01 INFO: Processing value=7
2024-01-01 ERROR: System status=failed
"""

query = Query(parser).from_(log_data)

# Get all unique log types
print("Unique log types:")
for log_type in query.get_log_types():
  print(f"  {log_type}")

# Reset stream for next analysis
query.from_(log_data)

# Get log type occurrence counts
print("\nLog type counts:")
counts = query.get_log_type_counts()
for log_type, count in sorted(counts.items(), key=lambda x: -x[1]):
  print(f"  {count:3d}  {log_type}")

# Reset stream for next analysis
query.from_(log_data)

# Get sample messages for each log type
print("\nLog type samples:")
samples = query.get_log_type_with_sample(sample_size=2)
for log_type, messages in samples.items():
  print(f"  {log_type}")
  for msg in messages:
    print(f"    - {msg.strip()}")

Output:

Unique log types:
  <timestamp> INFO: Processing <metric>
  <timestamp> WARN: System <status>
  <timestamp> ERROR: System <status>

Log type counts:
    3  <timestamp> INFO: Processing <metric>
    1  <timestamp> WARN: System <status>
    1  <timestamp> ERROR: System <status>

Log type samples:
  <timestamp> INFO: Processing <metric>
    - 2024-01-01 INFO: Processing value=42
    - 2024-01-01 INFO: Processing value=100
  <timestamp> WARN: System <status>
    - 2024-01-01 WARN: System status=degraded
  <timestamp> ERROR: System <status>
    - 2024-01-01 ERROR: System status=failed

Key concepts

CRITICAL: You must understand these concepts to use log-surgeon correctly.

log-surgeon works fundamentally differently from traditional regex engines like Python's re module, PCRE, or JavaScript regex. Skipping this section may lead to patterns that don't work as expected.

Token-based parsing and delimiters

CRITICAL: log-surgeon uses token-based parsing, not character-based regex matching like traditional regex engines. This is the most important difference that affects how patterns work.

How tokenization works

Delimiters are characters used to split log messages into tokens. The default delimiters include:

  • Whitespace: space, tab (\t), newline (\n), carriage return (\r)
  • Punctuation: :, ,, !, ;, %, @, /, (, ), [, ]

For example, with default delimiters, the log message:

"abc def ghi"

is tokenized into three tokens: ["abc", "def", "ghi"]
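To build intuition, delimiter splitting can be approximated in plain Python. This is a rough sketch only: the DELIMITERS character class below is our stand-in for the defaults listed above, and the real library uses a compiled state machine rather than re.split.

```python
import re

# Stand-in for the default delimiter set (illustration only,
# not log-surgeon's actual implementation).
DELIMITERS = r"[ \t\r\n:,!;%@/()\[\]]"

def tokenize(message: str) -> list[str]:
    """Split a log message into tokens, dropping empty pieces."""
    return [tok for tok in re.split(DELIMITERS, message) if tok]

print(tokenize("abc def ghi"))  # ['abc', 'def', 'ghi']
print(tokenize("a:b,c"))        # ['a', 'b', 'c']
```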

You can customize delimiters when creating a Parser:

parser = Parser(delimiters=r" \t\n,:")  # Custom delimiters

Token-Based Pattern Matching

Critical: Patterns like .* only match within a single token, not across multiple tokens or delimiters.

from log_surgeon import Parser

parser = Parser()  # Default delimiters include space
parser.add_var("token", rf"(?<match>d.*)")
parser.compile()

# With "abc def ghi" tokenized as ["abc", "def", "ghi"]
event = parser.parse_event("abc def ghi")

# Matches only "def" (single token starting with 'd')
# Does NOT match "def ghi" (would cross token boundary)
print(event['match'])  # Output: "def"

In a traditional regex engine, d.* would match "def ghi" (everything from 'd' to end). In log-surgeon, d.* matches only "def" because patterns cannot cross delimiter boundaries.

Why token-based?

Token-based parsing enables:

  • Faster parsing by reducing search space
  • Predictable behavior aligned with log structure
  • Efficient log type generation for analytics

Working with token boundaries

To match across multiple tokens, you must use character classes like [a-zA-Z]* instead of .:

from log_surgeon import Parser

parser = Parser()  # Default delimiters include space

# WRONG: .* only matches within a single token
parser.add_var("wrong", rf"(?<match>d.*)")  # Matches only "def"

# CORRECT: character classes can match across tokens
parser.add_var("correct", rf"(?<match>d[a-z ]*i)")  # Matches "def ghi"
parser.compile()

event = parser.parse_event("abc def ghi")
print(event['match'])  # Output: "def ghi"

Key Rule: Character classes like [a-zA-Z]*, [a-z ]*, or [\w\s]* can match across token boundaries, but .* cannot.

Alternation requires grouping

CRITICAL: Alternation (|) works differently in log-surgeon compared to traditional regex engines. You must use parentheses to group alternatives.

from log_surgeon import Parser

parser = Parser()

# WRONG: Without grouping - matches "ab" AND ("c" OR "d") AND "ef"
parser.add_var("wrong", rf"(?<word>abc|def)")
# In log-surgeon, this is interpreted as: "ab" + "c|d" + "ef"
# Matches: "abcef" or "abdef" (NOT "abc" or "def")

# CORRECT: With grouping - matches "abc" OR "def"
parser.add_var("correct", rf"(?<word>(abc)|(def))")
# Matches: "abc" or "def"
parser.compile()

In traditional regex engines, abc|def means "abc" OR "def". In log-surgeon, abc|def means "ab" + ("c" OR "d") + "ef".

Key Rule: Always use (abc)|(def) syntax for alternation to match complete alternatives.

# More examples:
parser.add_var("level", rf"(?<level>(ERROR)|(WARN)|(INFO))")  # Correct
parser.add_var("status", rf"(?<status>(success)|(failure))")  # Correct
parser.add_var("bad", rf"(?<status>success|failure)")         # Wrong: unexpected behavior

Optional patterns

For optional patterns, use {0,1} instead of *:

from log_surgeon import Parser

parser = Parser()

# AVOID: * matches 0 or more (can match an empty string or multiple repetitions)
parser.add_var("avoid", rf"(?<level>(ERROR)|(WARN))*")

# AVOID: ? may not work as expected
parser.add_var("avoid2", rf"(?<level>(ERROR)|(WARN))?")

# CORRECT: {0,1} matches 0 or 1 occurrence (note the doubled braces inside the f-string)
parser.add_var("optional", rf"(?<level>(ERROR)|(WARN)){{0,1}}")
parser.compile()

Best practice: Use {0,1} for optional elements. Avoid * (0 or more) and ? for optional matching.

You can also explicitly include delimiters in your pattern:

# To match "def ghi", explicitly include the space delimiter
parser.add_var("multi", rf"(?<match>d\w+\s+\w+)")
# This matches "def " as one token segment, followed by "ghi"

Or adjust your delimiters to change tokenization behavior:

# Use only newline as delimiter to treat entire lines as tokens
parser = Parser(delimiters=r"\n")

Named capture groups

Use named capture groups in regex patterns to extract specific fields:

parser.add_var("metric", rf"metric=(?<metric_name>\w+) value=(?<value>\d+)")

The syntax (?<name>pattern) creates a capture group that can be accessed as event['name'].

Note: See Using Raw f-strings for best practices on writing regex patterns.
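For comparison, Python's built-in re module expresses the same idea with (?P<name>…) syntax, whereas log-surgeon uses (?<name>…); the extraction concept is identical.

```python
import re

# Python's re uses (?P<name>...) where log-surgeon uses (?<name>...).
match = re.search(
    r"metric=(?P<metric_name>\w+) value=(?P<value>\d+)",
    "2024-01-01 INFO: metric=cpu value=42",
)
print(match.group("metric_name"))  # cpu
print(match.group("value"))        # 42
```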

Using raw f-strings for regex patterns

⚠️ STRONGLY RECOMMENDED: Use raw f-strings (rf"...") for all regex patterns.

While not absolutely required, using regular strings will likely cause escaping issues and pattern failures. Raw f-strings prevent these problems.

Raw f-strings combine the benefits of:

  • Raw strings (r"..."): No need to double-escape regex special characters like \d, \w, \n
  • f-strings (f"..."): Easy interpolation of variables and pattern constants

Why use raw f-strings?

# WRONG: Without raw strings - requires double-escaping
parser.add_var("metric", "value=(?<value>\\d+)")  # Hard to read, error-prone

# CORRECT: With raw f-strings - single escaping, clean and readable
parser.add_var("metric", rf"value=(?<value>\d+)")

Watch out for braces in f-strings

When using f-strings, literal { and } characters must be escaped by doubling them:

from log_surgeon import Parser, PATTERN

parser = Parser()

# Correct: Escape literal braces in regex
parser.add_var("json", rf"data={{(?<content>[^}}]+)}}")  # Matches: data={...}
parser.add_var("range", rf"range={{(?<min>\d+),(?<max>\d+)}}")  # Matches: range={10,20}

# Using PATTERN constants with interpolation
parser.add_var("ip", rf"IP: (?<ip>{PATTERN.IPV4})")
parser.add_var("float", rf"value=(?<val>{PATTERN.FLOAT})")

# Common regex patterns
parser.add_var("digits", rf"\d+ items")  # No double-escaping needed
parser.add_var("word", rf"name=(?<name>\w+)")
parser.add_var("whitespace", rf"split\s+by\s+spaces")

parser.compile()
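The brace rules can be verified directly in plain Python. Note in particular that a single-braced {0,1} inside an f-string is evaluated as the tuple (0, 1), which silently corrupts the quantifier:

```python
# Doubled braces survive as literal braces in an f-string.
assert rf"(ERROR){{0,1}}" == r"(ERROR){0,1}"

# Single braces are interpolated: {0,1} is the Python tuple (0, 1),
# so the quantifier is silently replaced by its string form.
assert rf"(ERROR){0,1}" == "(ERROR)(0, 1)"
```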

Examples: raw f-strings vs regular strings

# Regular string - requires double-escaping
parser.add_var("path", "path=(?<path>\\w+/\\w+)")  # Hard to read

# Raw f-string - natural regex syntax
parser.add_var("path", rf"path=(?<path>\w+/\w+)")  # Clean and readable

# With interpolation
log_level = "INFO|WARN|ERROR"
parser.add_var("level", rf"(?<level>{log_level})")  # Easy to compose

Recommendation: Consistently use rf"..." for all regex patterns. This approach:

  • Avoids double-escaping mistakes that break patterns
  • Makes patterns more readable
  • Allows easy use of PATTERN constants and variables
  • Only requires watching for literal braces { and } in f-strings (escape as {{ and }})

Using regular strings ("...") will require double-escaping (e.g., "\\d+") which is error-prone and can be hard to read.

Logical vs. physical names

Internally, log-surgeon uses "physical" names (e.g., CGPrefix0, CGPrefix1) for capture groups, while you work with "logical" names (e.g., user_id, thread). The GroupNameResolver handles this mapping automatically.

Schema Format

The schema defines delimiters, timestamps, and variables for parsing:

// schema delimiters
delimiters: \t\r\n:,!;%@/\(\)\[\]

// schema timestamps
timestamp:<timestamp_regex>

// schema variables
variable_name:<variable_regex>

When using the fluent API (Parser.add_var() and Parser.compile()), the schema is built automatically.
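As an illustration of that mapping (a sketch of the schema text format above, not the library's exact generated output), fluent calls correspond roughly to schema lines like:

```python
# Hypothetical rendering of fluent API calls into schema text.
timestamp_regex = r"\d{2}/\d{2}/\d{2} \d{2}:\d{2}:\d{2}"   # add_timestamp(...)
metric_regex = r"value=(?<value>\d+)"                       # add_var("metric", ...)

schema = "\n".join([
    r"delimiters: \t\r\n:,!;%@/\(\)\[\]",
    f"timestamp:{timestamp_regex}",
    f"metric:{metric_regex}",
])
print(schema)
```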

Common Pitfalls

Pattern doesn't match anything

  • Check: Are you using .* to match across tokens? Use [a-zA-Z ]* instead
  • Check: Did you forget to call parser.compile()?
  • Check: Are your delimiters splitting tokens unexpectedly?

Alternation not working (abc|def)

  • Problem: (?<name>abc|def) doesn't match "abc" or "def" as expected
  • Solution: Use (?<name>(abc)|(def)) with explicit grouping

Pattern works in regex tester but not here

  • Remember: log-surgeon is token-based, not character-based
  • Traditional regex engines match across entire strings
  • log-surgeon matches within token boundaries (delimited by spaces, colons, etc.)
  • Read: Token-Based Parsing

Escape sequence errors in Python

  • Problem: parser.add_var("digits", "(?<num>\d+)") triggers an invalid-escape-sequence warning (and will become an error in future Python versions)
  • Solution: Use rf"..." (raw f-string) instead of "..." or f"..."
  • Example: parser.add_var("digits", rf"(?<num>\d+)")
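The underlying difference is easy to check in plain Python: escape sequences are interpreted in regular strings but kept verbatim in raw strings.

```python
# "\n" is interpreted as a newline (one character)...
assert len("\n") == 1
# ...while r"\n" keeps the backslash and the 'n' (two characters).
assert len(r"\n") == 2

# Unrecognized escapes like \d happen to survive in regular strings,
# but recent Python versions warn about them; raw strings avoid the ambiguity.
assert r"\d+" == "\\d+"
```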

Optional pattern matching incorrectly

  • Problem: Using ? or * for optional patterns
  • Solution: Use {0,1} for optional elements
  • Example: (?<level>(ERROR)|(WARN)){0,1} for optional log level

Reference

Quick syntax reference:

  • Named capture: (?<name>pattern)
  • Alternation: (?<name>(opt1)|(opt2)), NOT (opt1|opt2)
  • Optional: {0,1} (NOT ? or *)
  • Match across tokens: use [a-z ]* (NOT .*)
  • Pattern string: rf"..." (raw f-string recommended)
  • Log type: .select(["@log_type"])
  • Original message: .select(["@log_message"])

Parser

High-level parser for extracting structured data from unstructured log messages.

Constructor

  • Parser(delimiters: str = r" \t\r\n:,!;%@/\(\)\[\]")
    • Initialize a parser with optional custom delimiters
    • Default delimiters include space, tab, newline, and common punctuation

Methods

  • add_var(name: str, regex: str, hide_var_name_if_named_group_present: bool = True) -> Parser

    • Add a variable pattern to the parser's schema
    • Supports named capture groups using (?<name>) syntax
    • Use raw f-strings (rf"...") for regex patterns (see Using Raw f-strings)
    • Returns self for method chaining
  • add_timestamp(name: str, regex: str) -> Parser

    • Add a timestamp pattern to the parser's schema
    • Returns self for method chaining
  • compile(enable_debug_logs: bool = False) -> None

    • Build and initialize the parser with the configured schema
    • Must be called after adding variables and before parsing
    • Set enable_debug_logs=True to output debug information to stderr
  • load_schema(schema: str, group_name_resolver: GroupNameResolver) -> None

    • Load a pre-built schema string to configure the parser
  • parse(input: str | TextIO | BinaryIO | io.StringIO | io.BytesIO) -> Generator[LogEvent, None, None]

    • Parse all log events from a string, file object, or stream
    • Accepts strings, text/binary file objects, StringIO, or BytesIO
    • Yields LogEvent objects for each parsed event
  • parse_event(payload: str) -> LogEvent | None

    • Parse a single log event from a string (convenience method)
    • Wraps parse() and returns the first event
    • Returns LogEvent or None if no event found

LogEvent

Represents a parsed log event with extracted variables.

Methods

  • get_log_message() -> str

    • Get the original log message
  • get_log_type() -> str

    • Get the generated log type (template) with logical group names
  • get_capture_group(logical_capture_group_name: str, raw_output: bool = False) -> str | list | None

    • Get the value of a capture group by its logical name
    • If raw_output=False (default), single values are unwrapped from lists
    • Returns None if capture group not found
  • get_capture_group_str_representation(field: str, raw_output: bool = False) -> str

    • Get the string representation of a capture group value
  • get_resolved_dict() -> dict[str, str | list]

    • Get a dictionary with all capture groups using logical (user-defined) names
    • Physical names (CGPrefix*) are converted to logical names
    • Timestamp fields are consolidated under "timestamp" key
    • Single-value lists are unwrapped to scalar values
    • "@LogType" is excluded from the output
  • __getitem__(key: str) -> str | list

    • Access capture group values by name (e.g., event['field_name'])
    • Shorthand for get_capture_group(key, raw_output=False)
  • __str__() -> str

    • Get formatted JSON representation of the log event with logical group names
    • Uses get_resolved_dict() internally
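
Some of the resolution rules above (physical-to-logical name mapping, unwrapping single-value lists, excluding "@LogType") can be illustrated with a small stand-alone sketch. This is not the library's code, and it omits timestamp consolidation; the CGPrefix0/CGPrefix1 keys mimic auto-generated physical names:

```python
# Stand-alone sketch of the documented resolution rules for
# get_resolved_dict(); not the library's implementation.
def resolve(raw: dict, physical_to_logical: dict) -> dict:
    out = {}
    for key, value in raw.items():
        if key == "@LogType":
            continue  # "@LogType" is excluded from the output
        name = physical_to_logical.get(key, key)
        # Single-value lists are unwrapped to scalar values
        out[name] = value[0] if isinstance(value, list) and len(value) == 1 else value
    return out

raw = {"CGPrefix0": ["42"], "CGPrefix1": ["a", "b"], "@LogType": "..."}
mapping = {"CGPrefix0": "value", "CGPrefix1": "tags"}
print(resolve(raw, mapping))  # {'value': '42', 'tags': ['a', 'b']}
```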

Query

Query builder for parsing log events into structured data formats.

Constructor

  • Query(parser: Parser)
    • Initialize a query with a configured parser

Methods

  • select(fields: list[str]) -> Query

    • Select fields to extract from log events
    • Supports variable names, "*" for all variables, "@log_type" for log type, and "@log_message" for original message
    • The "*" wildcard can be combined with other fields (e.g., ["@log_type", "*"])
    • Returns self for method chaining
  • filter(predicate: Callable[[LogEvent], bool]) -> Query

    • Filter log events using a predicate function
    • Predicate receives a LogEvent and returns True to include it, False to exclude
    • Returns self for method chaining
    • Example: query.filter(lambda event: int(event['value']) > 50)
  • from_(input: str | TextIO | BinaryIO | io.StringIO | io.BytesIO) -> Query

    • Set the input source to parse
    • Accepts strings, text/binary file objects, StringIO, or BytesIO
    • Strings are automatically wrapped in StringIO
    • Returns self for method chaining
  • select_from(input: str | TextIO | BinaryIO | io.StringIO | io.BytesIO) -> Query

    • Alias for from_()
    • Returns self for method chaining
  • validate_query() -> Query

    • Validate that the query is properly configured
    • Returns self for method chaining
  • to_dataframe() -> pd.DataFrame

    • Convert parsed events to a pandas DataFrame
  • to_df() -> pd.DataFrame

    • Alias for to_dataframe()
  • to_arrow() -> pa.Table

    • Convert parsed events to a PyArrow Table
  • to_pa() -> pa.Table

    • Alias for to_arrow()
  • get_rows() -> list[list]

    • Extract rows of field values from parsed events
  • get_vars() -> KeysView[str]

    • Get all variable names (logical capture group names) defined in the schema
  • get_log_types() -> Generator[str, None, None]

    • Get all unique log types from parsed events
    • Yields log types in the order they are first encountered
    • Useful for discovering log patterns in your data
  • get_log_type_counts() -> dict[str, int]

    • Get count of occurrences for each unique log type
    • Returns dictionary mapping log types to their counts
    • Useful for analyzing log type distribution
  • get_log_type_with_sample(sample_size: int = 3) -> dict[str, list[str]]

    • Get sample log messages for each unique log type
    • Returns dictionary mapping log types to lists of sample messages
    • Useful for understanding what actual messages match each template
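
The log-type aggregation helpers boil down to grouping events by their template string. The behavior can be sketched in plain Python; this is a stand-alone illustration with made-up templates, not the library's code:

```python
from collections import Counter, defaultdict

# Stand-in data: (template, original message) pairs, as get_log_type()
# and get_log_message() might return them. Templates are hypothetical.
events = [
    ("Connected to <ip>:<port>", "Connected to 10.0.0.1:8080"),
    ("Connected to <ip>:<port>", "Connected to 10.0.0.2:9090"),
    ("Disk usage at <pct>%", "Disk usage at 91%"),
]

# get_log_type_counts(): template -> occurrence count
counts = Counter(template for template, _ in events)

# get_log_type_with_sample(): template -> up to sample_size messages
samples = defaultdict(list)
for template, message in events:
    if len(samples[template]) < 3:
        samples[template].append(message)

print(counts["Connected to <ip>:<port>"])  # 2
```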

SchemaCompiler

Compiler for constructing log-surgeon schema definitions.

Constructor

  • SchemaCompiler(delimiters: str = DEFAULT_DELIMITERS)
    • Initialize a schema compiler with optional custom delimiters

Methods

  • add_var(name: str, regex: str, hide_var_name_if_named_group_present: bool = True) -> SchemaCompiler

    • Add a variable pattern to the schema
    • Returns self for method chaining
  • add_timestamp(name: str, regex: str) -> SchemaCompiler

    • Add a timestamp pattern to the schema
    • Returns self for method chaining
  • remove_var(var_name: str) -> SchemaCompiler

    • Remove a variable from the schema
    • Returns self for method chaining
  • get_var(var_name: str) -> Variable

    • Get a variable by name
  • compile() -> str

    • Compile the final schema string
  • get_capture_group_name_resolver() -> GroupNameResolver

    • Get the resolver for mapping logical to physical capture group names

GroupNameResolver

Bidirectional mapping between logical (user-defined) and physical (auto-generated) group names.

Constructor

  • GroupNameResolver(physical_name_prefix: str)
    • Initialize with a prefix for auto-generated physical names

Methods

  • create_new_physical_name(logical_name: str) -> str

    • Create a new unique physical name for a logical name
    • Each call generates a new physical name
  • get_physical_names(logical_name: str) -> set[str]

    • Get all physical names associated with a logical name
  • get_logical_name(physical_name: str) -> str

    • Get the logical name for a physical name
  • get_all_logical_names() -> KeysView[str]

    • Get all logical names that have been registered
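
The mapping described above is many-to-one: a logical name may own several auto-generated physical names, while each physical name resolves back to exactly one logical name. A minimal sketch of that behavior (an assumption-laden illustration, not the library's implementation):

```python
# Stand-alone sketch of the bidirectional mapping GroupNameResolver
# describes; not the library's code. The "CGPrefix" prefix mimics the
# auto-generated physical names mentioned elsewhere in these docs.
class MiniResolver:
    def __init__(self, prefix: str):
        self._prefix = prefix
        self._counter = 0
        self._logical_to_physical: dict[str, set[str]] = {}
        self._physical_to_logical: dict[str, str] = {}

    def create_new_physical_name(self, logical: str) -> str:
        # Each call generates a new unique physical name.
        physical = f"{self._prefix}{self._counter}"
        self._counter += 1
        self._logical_to_physical.setdefault(logical, set()).add(physical)
        self._physical_to_logical[physical] = logical
        return physical

    def get_physical_names(self, logical: str) -> set[str]:
        return self._logical_to_physical[logical]

    def get_logical_name(self, physical: str) -> str:
        return self._physical_to_logical[physical]

r = MiniResolver("CGPrefix")
p0 = r.create_new_physical_name("ip")
p1 = r.create_new_physical_name("ip")  # same logical name, new physical name
print(r.get_logical_name(p1))  # ip
```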

PATTERN

Collection of pre-built regex patterns optimized for log parsing. These patterns follow log-surgeon's syntax requirements and are ready to use with named capture groups.

Available Patterns

Network Patterns

  • PATTERN.UUID: UUID (universally unique identifier); e.g., 550e8400-e29b-41d4-a716-446655440000
  • PATTERN.IP_OCTET: single IPv4 octet (0-255); e.g., 192, 10, 255
  • PATTERN.IPV4: IPv4 address; e.g., 192.168.1.1, 10.0.0.1
  • PATTERN.PORT: network port number (1-5 digits); e.g., 80, 8080, 65535

Numeric Patterns

  • PATTERN.INT: integer with optional negative sign; e.g., 42, -123, 0
  • PATTERN.FLOAT: float with optional negative sign; e.g., 3.14, -123.456, 0.5

File System Patterns

  • PATTERN.LINUX_FILE_NAME_CHARSET: character set for Linux file names (a-zA-Z0-9 ._-)
  • PATTERN.LINUX_FILE_NAME: Linux file name; e.g., app.log, config-2024.yaml
  • PATTERN.LINUX_FILE_PATH: relative Linux file path; e.g., logs/app.log, var/log/system.log

Character Sets and Word Patterns

  • PATTERN.JAVA_IDENTIFIER_CHARSET: Java identifier character set (a-zA-Z0-9_)
  • PATTERN.JAVA_IDENTIFIER: Java identifier; e.g., myVariable, $value, Test123
  • PATTERN.LOG_LINE_CHARSET: common log line characters (alphanumeric, symbols, and whitespace)
  • PATTERN.LOG_LINE: general log line content; e.g., Error: connection timeout
  • PATTERN.LOG_LINE_NO_WHITE_SPACE_CHARSET: log line characters without whitespace (alphanumeric and symbols only)
  • PATTERN.LOG_LINE_NO_WHITE_SPACE: log content without spaces; e.g., ERROR, /var/log/app.log

Java-Specific Patterns

  • PATTERN.JAVA_LITERAL_CHARSET: Java literal character set (a-zA-Z0-9_$)
  • PATTERN.JAVA_PACKAGE_SEGMENT: single Java package segment; e.g., com., example.
  • PATTERN.JAVA_CLASS_NAME: Java class name; e.g., MyClass, ArrayList
  • PATTERN.JAVA_FULLY_QUALIFIED_CLASS_NAME: fully qualified class name; e.g., java.util.ArrayList
  • PATTERN.JAVA_LOGGING_CODE_LOCATION_HINT: Java logging location hint; e.g., ~[MyClass.java:42?]
  • PATTERN.JAVA_STACK_LOCATION: Java stack trace location; e.g., java.util.ArrayList.add(ArrayList.java:123)

Example usage

from log_surgeon import Parser, PATTERN

parser = Parser()

# Network patterns
parser.add_var("network", rf"IP: (?<ip>{PATTERN.IPV4}) Port: (?<port>{PATTERN.PORT})")

# Numeric patterns
parser.add_var("metrics", rf"value=(?<value>{PATTERN.FLOAT}) count=(?<count>{PATTERN.INT})")

# File system patterns
parser.add_var("file", rf"Opening (?<filepath>{PATTERN.LINUX_FILE_PATH})")

# Java patterns
parser.add_var("exception", rf"at (?<stack>{PATTERN.JAVA_STACK_LOCATION})")

parser.compile()

Composing Patterns

PATTERN constants can be composed to build more complex patterns:

from log_surgeon import Parser, PATTERN

parser = Parser()

# Combine multiple patterns
parser.add_var(
    "server_info",
    rf"Server (?<name>{PATTERN.JAVA_IDENTIFIER}) at (?<ip>{PATTERN.IPV4}):(?<port>{PATTERN.PORT})"
)

# Use character sets to build custom patterns
parser.add_var(
    "custom_id",
    rf"ID-(?<id>[{PATTERN.JAVA_IDENTIFIER_CHARSET}]+)"
)

parser.compile()

Development

Building from source

# Clone the repository
git clone https://github.com/y-scope/log-surgeon-ffi-py.git
cd log-surgeon-ffi-py

# Install the project in editable mode
pip install -e .

# Build the extension
cmake -S . -B build
cmake --build build

Running tests

# Install test dependencies
pip install pytest

# Run tests
python -m pytest tests/

License

Apache License 2.0 - See LICENSE for details.


Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release.See tutorial on generating distribution archives.

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_x86_64.whl (1.3 MB view details)

Uploaded CPython 3.13musllinux: musl 1.2+ x86-64

log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_i686.whl (1.4 MB view details)

Uploaded CPython 3.13musllinux: musl 1.2+ i686

log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_aarch64.whl (1.3 MB view details)

Uploaded CPython 3.13musllinux: musl 1.2+ ARM64

log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (350.8 kB view details)

Uploaded CPython 3.13manylinux: glibc 2.17+ x86-64

log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl (368.0 kB view details)

Uploaded CPython 3.13manylinux: glibc 2.17+ i686

log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (338.3 kB view details)

Uploaded CPython 3.13manylinux: glibc 2.17+ ARM64

log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_x86_64.whl (1.3 MB view details)

Uploaded CPython 3.12musllinux: musl 1.2+ x86-64

log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_i686.whl (1.4 MB view details)

Uploaded CPython 3.12musllinux: musl 1.2+ i686

log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_aarch64.whl (1.3 MB view details)

Uploaded CPython 3.12musllinux: musl 1.2+ ARM64

log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (350.8 kB view details)

Uploaded CPython 3.12manylinux: glibc 2.17+ x86-64

log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl (368.0 kB view details)

Uploaded CPython 3.12manylinux: glibc 2.17+ i686

log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (338.3 kB view details)

Uploaded CPython 3.12manylinux: glibc 2.17+ ARM64

log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_x86_64.whl (1.3 MB view details)

Uploaded CPython 3.11musllinux: musl 1.2+ x86-64

log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_i686.whl (1.4 MB view details)

Uploaded CPython 3.11musllinux: musl 1.2+ i686

log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_aarch64.whl (1.3 MB view details)

Uploaded CPython 3.11musllinux: musl 1.2+ ARM64

log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (350.7 kB view details)

Uploaded CPython 3.11manylinux: glibc 2.17+ x86-64

log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl (368.0 kB view details)

Uploaded CPython 3.11manylinux: glibc 2.17+ i686

log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (338.2 kB view details)

Uploaded CPython 3.11manylinux: glibc 2.17+ ARM64

log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_x86_64.whl (1.3 MB view details)

Uploaded CPython 3.10musllinux: musl 1.2+ x86-64

log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_i686.whl (1.4 MB view details)

Uploaded CPython 3.10musllinux: musl 1.2+ i686

log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_aarch64.whl (1.3 MB view details)

Uploaded CPython 3.10musllinux: musl 1.2+ ARM64

log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (350.7 kB view details)

Uploaded CPython 3.10manylinux: glibc 2.17+ x86-64

log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl (368.0 kB view details)

Uploaded CPython 3.10manylinux: glibc 2.17+ i686

log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (338.2 kB view details)

Uploaded CPython 3.10manylinux: glibc 2.17+ ARM64

log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_x86_64.whl (1.3 MB view details)

Uploaded CPython 3.9musllinux: musl 1.2+ x86-64

log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_i686.whl (1.4 MB view details)

Uploaded CPython 3.9musllinux: musl 1.2+ i686

log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_aarch64.whl (1.3 MB view details)

Uploaded CPython 3.9musllinux: musl 1.2+ ARM64

log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (350.7 kB view details)

Uploaded CPython 3.9manylinux: glibc 2.17+ x86-64

log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl (368.0 kB view details)

Uploaded CPython 3.9manylinux: glibc 2.17+ i686

log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (338.2 kB view details)

Uploaded CPython 3.9manylinux: glibc 2.17+ ARM64

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_x86_64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.13, musllinux: musl 1.2+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_x86_64.whl
Algorithm Hash digest
SHA256 b936267438a6d03fa8a4582b44a97df6c67b8402b1d731c7ef53e47f8c961533
MD5 7186a93b2372c5ae70cdbbdb23461794
BLAKE2b-256 35d11a201beb9afbe1a92a56b76fbf42630298a3284f5ca5d80c9f648e1063c2

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_i686.whl
  • Upload date:
  • Size: 1.4 MB
  • Tags: CPython 3.13, musllinux: musl 1.2+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_i686.whl
Algorithm Hash digest
SHA256 5a7c7f622dfb6df54819b65cf812cbf06dde0005feaec060fec16380a94063dc
MD5 c5e44db465964d0493592dc2f0d002c0
BLAKE2b-256 6d50280e6bb76d849232dee57ccff01aca71dfa0383011faba344b8e86873322

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_aarch64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.13, musllinux: musl 1.2+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp313-cp313-musllinux_1_2_aarch64.whl
Algorithm Hash digest
SHA256 6581f64d3a3e7b7ee223b7eef82626e4b7185d8bc73d660c8d349c1ea163e0d1
MD5 1268fa823e80de1c90ec88948f597bed
BLAKE2b-256 02311339054fefae7c63df3e1a1f8d4aa5a4e31c083ae592500687ffe641e4a9

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • Upload date:
  • Size: 350.8 kB
  • Tags: CPython 3.13, manylinux: glibc 2.17+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 92651dda0705b47b53993c53e8e07dced3e89f33c3ef5917bec7cac477d3572c
MD5 fa1fbc11c6b347958078173365ed4d52
BLAKE2b-256 17e8564adc4cf08290d2af401eb0f7eb9bad02dccb9751f187d6047d9d123930

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl
  • Upload date:
  • Size: 368.0 kB
  • Tags: CPython 3.13, manylinux: glibc 2.17+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl
Algorithm Hash digest
SHA256 bd6b596e5e59d9fda941b012e1a0de0f2059335602745d660b9a4653f8a381e8
MD5 5fa5e308301a5ee2377178bce1afd880
BLAKE2b-256 80bb505833c76bb6162ea94fe2d648ce57b00d9b3a6eefff4986f5ccb3b2236e

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • Upload date:
  • Size: 338.3 kB
  • Tags: CPython 3.13, manylinux: glibc 2.17+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 d1b799e0d74c0f3b423d3b2343ecdd219e7ea5611d1023c7f8ee30431f45f7d8
MD5 057b7a240a15e4601d290e3b19c06713
BLAKE2b-256 c00891b311cce11ba9276bbef7803ea9db2863581671f933dfea6023aff51566

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_x86_64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.12, musllinux: musl 1.2+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_x86_64.whl
Algorithm Hash digest
SHA256 0a0498a5be3ac889d84b9e7b9f2485e39318d8581bd8dabcda88ebb7e96af10b
MD5 b491e502a8c5a6c85a70c6d984aeb65d
BLAKE2b-256 54adb86ab94c6b71c82a0a88498c5344966dc054e64b47f806cbe29413a0b450

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_i686.whl
  • Upload date:
  • Size: 1.4 MB
  • Tags: CPython 3.12, musllinux: musl 1.2+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_i686.whl
Algorithm Hash digest
SHA256 e7693b6b2cf22e394e6c1e75324767e0a18beefe4852dd7f33d17323b0d10e8d
MD5 c332e995c1b50cf83777eef827729310
BLAKE2b-256 d2238eb626cf5e67600e110c647a7e793bbcd61373741f3b6d0dd5392a495186

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_aarch64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.12, musllinux: musl 1.2+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp312-cp312-musllinux_1_2_aarch64.whl
Algorithm Hash digest
SHA256 4d5d3755ff89d279785fa132abda371529a97f5210e0883338ab71ddc25dc569
MD5 bdb9528fc00a231c3dd61641df2d8917
BLAKE2b-256 53d80712bb30bc13d4e9df2b3059b3528a4c911a03965156f0a105b93e2d4669

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • Upload date:
  • Size: 350.8 kB
  • Tags: CPython 3.12, manylinux: glibc 2.17+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 be7bd883ed68a1fcf685c5e1cbec442d4a57e7133e50a78533d48a5b10072429
MD5 d1b5172a7c7f87095ee9d08020ad4298
BLAKE2b-256 b9e3d11536510d7d7c03e9adbc332b0be087475e45a1abf6b97cb3cfba2b465a

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl
  • Upload date:
  • Size: 368.0 kB
  • Tags: CPython 3.12, manylinux: glibc 2.17+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl
Algorithm Hash digest
SHA256 78ff10a9464b00c5c2c57cd541063c5c2279ffeda6d380e8a052596413aa04f9
MD5 cdfddb92fceb40f2310507b9adb30c51
BLAKE2b-256 e7d7e72743c3da8a51b1e7049fc7ac1d2ca894e07b239fe35a47924d15c4fca8

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • Upload date:
  • Size: 338.3 kB
  • Tags: CPython 3.12, manylinux: glibc 2.17+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 2883f7ed1d334a442194e12c49e4e788cf13bef1717b0f9de59fbc78918e7695
MD5 7afa7b55fb7959d4b9e0778f9c85d306
BLAKE2b-256 43572dd916b70354cbb048bbcca263c3ddb4efa7d43d66f61a7fdcb8e93d6bb1

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_x86_64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.11, musllinux: musl 1.2+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_x86_64.whl
Algorithm Hash digest
SHA256 d3fd745b0f1fc1e7e397d1a0bedecfe9610e26d0c9d2f27d8fbfea85b1871d48
MD5 67a9d47dc3ae1f2ede55c58da9f9231f
BLAKE2b-256 ac25cc105d565107512d4725856a695b1a8e548ff0a4fdc7224dcc05501ede55

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_i686.whl
  • Upload date:
  • Size: 1.4 MB
  • Tags: CPython 3.11, musllinux: musl 1.2+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_i686.whl
Algorithm Hash digest
SHA256 7a21d5b9ce647967bc60c1d4aa781d3660b3759d46a22d0190ecdf01c994ff86
MD5 ed11302a0250a108d4610c91d500c1fb
BLAKE2b-256 324c93d64fa5ea77c7c320666fc8a533f5636bef6e32089063a27ff88e9b415c

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_aarch64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.11, musllinux: musl 1.2+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp311-cp311-musllinux_1_2_aarch64.whl
Algorithm Hash digest
SHA256 613f10a9da783bb116baa4aeb604751d5054c4b22778b77adbdffc27f9db3d49
MD5 6876442dac5bd13a6b19bed08de25f07
BLAKE2b-256 d8035e9101a2e29c89eb8619a44f9ed106c4f32b202cf6f27c07af6ab70677c7

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • Upload date:
  • Size: 350.7 kB
  • Tags: CPython 3.11, manylinux: glibc 2.17+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 5a91a6f1182b37beaed075a05b1d5ee57a07ed3bf07e6bbd89b347d2a25235a0
MD5 4b0c3b4d6f316a50796097308ddb0b70
BLAKE2b-256 94947734d437a0c0c22bca211460b673e3ef17715db8160b6a60ad241bb33a8d

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl
  • Upload date:
  • Size: 368.0 kB
  • Tags: CPython 3.11, manylinux: glibc 2.17+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl
Algorithm Hash digest
SHA256 91b80fa0cca8dbfc62e0ab408c8d59ced90c18bd6b7a8ab0c196ea36fd8d914e
MD5 3123dcd7997d0d1b8e0557620fe0884c
BLAKE2b-256 83e6e6628ace8cb42817bafadbef88f50d0027b2610f1e98d66ff1cb7352ee3d

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • Upload date:
  • Size: 338.2 kB
  • Tags: CPython 3.11, manylinux: glibc 2.17+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 3eaedf9f315fe0a6fffa20d6f265848bc85f4f6ca44dbef1da07810639c834eb
MD5 a3b346d3bad452174cd8378fce657cc7
BLAKE2b-256 f945322935ddcb8b4c78d047d4d68d9126815da62da6e865341be9fce506433f

See more details on using hashes here.

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_x86_64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.10, musllinux: musl 1.2+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_x86_64.whl
Algorithm Hash digest
SHA256 6e968613704a55fdf064c81dd2a1c9f78b1f14af8d794bc867f830f1166b5e55
MD5 7dacded700fbfd2f9a7440ec2e5b3ed0
BLAKE2b-256 a1dc578d126e9d67033fbb7308d82651e9f18e34dd9c3533f3756d9541b8ded5

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_i686.whl
  • Upload date:
  • Size: 1.4 MB
  • Tags: CPython 3.10, musllinux: musl 1.2+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_i686.whl
Algorithm Hash digest
SHA256 3503641177b8c7f7544077fd57f8cb4fe283cd7f1538a1fd36207db98196ed25
MD5 c2b859a91381c81934dc58fd578f194a
BLAKE2b-256 920337739cdc656a896e1bfb2ca68b21921743b50efc8ae3adc42ce34470d8fa

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_aarch64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.10, musllinux: musl 1.2+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp310-cp310-musllinux_1_2_aarch64.whl
Algorithm Hash digest
SHA256 736ba58d03baad6ad576e1576d38a513c719bba5be5b34bf708e1b5cb0a9dd6c
MD5 b6fc44973d77961dc0b4243b06a39d7d
BLAKE2b-256 a3a2ae58999bae52ee08350c5fe1b5c7a5cadfd6403be72bb1768bae98c40c8d

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • Upload date:
  • Size: 350.7 kB
  • Tags: CPython 3.10, manylinux: glibc 2.17+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 f621ac4dce07060b4b2550a4732470f2b69b4414b72531903447158e79a399b2
MD5 ce5d7bfb89ca692fafa4d1595c278d29
BLAKE2b-256 c4136ba240cdb55e1be5a6e31c153bfe97f8189fa7c22f309eb06f88062cf9b4

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl
  • Upload date:
  • Size: 368.0 kB
  • Tags: CPython 3.10, manylinux: glibc 2.17+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl
Algorithm Hash digest
SHA256 ce1728e7b149fef640bf9e442a991d24b7d7932e1fb36c88b58254d1d3c6f100
MD5 98660044ee0837af934b464f35015b79
BLAKE2b-256 b8e15968c0fd600d80c226b3d47d247a1d449219f7d7d169f07f8daeee6ecc8d

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • Upload date:
  • Size: 338.2 kB
  • Tags: CPython 3.10, manylinux: glibc 2.17+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 bdc7838de4aa9e27eb7672d9cf098b59f44c98bc16957752cc9576f08ab8dbd8
MD5 5061e1fc15556b5dde336c7a5ba72835
BLAKE2b-256 9003b4d27635cbd45bf35366bdaaef00dd6e2b6be82361f524bb093a780588f3

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_x86_64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.9, musllinux: musl 1.2+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_x86_64.whl
Algorithm Hash digest
SHA256 9f960f68d4f19dddc9e1a6f5126be09c88b73cba76a869a0baf8ee1e6d5c9918
MD5 abeb416899c776ccf732a97c0a813871
BLAKE2b-256 bce6b160a6ac6d78d87ebb41cfa18f333b9403fe95b8e0a26831d99d40851187

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_i686.whl
  • Upload date:
  • Size: 1.4 MB
  • Tags: CPython 3.9, musllinux: musl 1.2+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_i686.whl
Algorithm Hash digest
SHA256 2550f0cecd2467133e5efa98eaa17775a91d1418152c8a3d5ef9ac3b407d4475
MD5 c61f4e10f230fe06b80e4f171f95aa1b
BLAKE2b-256 84e6ca483aa321a32b4fdbcd109f5440de24910d14401b6f7f0ecda04ed0f474

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_aarch64.whl
  • Upload date:
  • Size: 1.3 MB
  • Tags: CPython 3.9, musllinux: musl 1.2+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp39-cp39-musllinux_1_2_aarch64.whl
Algorithm Hash digest
SHA256 c594a9f3ea24af7be17ae63666fef6d2ea6403b595afe6bce70e07be9452e723
MD5 49be22fd54398f3e24a11a8d624838f4
BLAKE2b-256 6e0ca6292ca2b0c66205e2e404ad352bab0e8f0c391b932738b420831ef4c0d2

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • Upload date:
  • Size: 350.7 kB
  • Tags: CPython 3.9, manylinux: glibc 2.17+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 4ca23ea6c41d0f5600fe14cd5a86b007010dff9e0d19f77d8cb151293926dba9
MD5 8f3cc79682b1c83bdccb1b4624d616f3
BLAKE2b-256 b9b83ef7d2e694b7b3c0dc5c279d3829539e82c16b2f775809eae14d1d7133ce

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl
  • Upload date:
  • Size: 368.0 kB
  • Tags: CPython 3.9, manylinux: glibc 2.17+ i686
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl
Algorithm Hash digest
SHA256 27a61dc65e8ca1c88b39d29af88c00f5daf46c906b10a3ec9a3f974eb633e4b1
MD5 dcac3dd369ca93de2ea9f064bdb3f9ad
BLAKE2b-256 1978c4188304933017f1ae242a122d0aae2cf97a3c25d0952398a2eab3063a27

File details

Details for the file log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

  • Download URL: log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • Upload date:
  • Size: 338.2 kB
  • Tags: CPython 3.9, manylinux: glibc 2.17+ ARM64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.32.5 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for log_surgeon_ffi-0.1.0b6-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 e63928d0474383d7b18a8fa0af8733c9a6ca05e85fdd81b8c460d65fbb877430
MD5 1019f1fdcb8f3fdf2371089e848448f8
BLAKE2b-256 dd26cfcd2de3f1439ff67b8d3c939c6ee74fe14f17707071db169155e2bd49e6
