
spaCy-PyThaiNLP


This package wraps the PyThaiNLP library to add Thai language support for spaCy.

Features

  • Word segmentation (tokenization)
  • Part-of-speech tagging
  • Named entity recognition (NER)
  • Sentence segmentation
  • Dependency parsing
  • Word vectors


Installation

Prerequisites

  • Python 3.9 or higher
  • spaCy 3.0 or higher
  • PyThaiNLP 3.1.0 or higher

Install via pip

pip install spacy-pythainlp

Quick Start

import spacy
import spacy_pythainlp.core

# Create a blank Thai language model
nlp = spacy.blank("th")

# Add the PyThaiNLP pipeline component
nlp.add_pipe("pythainlp")

# Process text
doc = nlp("ผมเป็นคนไทย แต่มะลิอยากไปโรงเรียนส่วนผมจะไปไหน ผมอยากไปเที่ยว")

# Access sentences
for sent in doc.sents:
    print(sent)
# Output:
# ผมเป็นคนไทย แต่มะลิอยากไปโรงเรียนส่วนผมจะไปไหน
# ผมอยากไปเที่ยว

Usage Examples

Basic Sentence Segmentation

import spacy
import spacy_pythainlp.core

nlp = spacy.blank("th")
nlp.add_pipe("pythainlp")

doc = nlp("ผมเป็นคนไทย แต่มะลิอยากไปโรงเรียนส่วนผมจะไปไหน ผมอยากไปเที่ยว")

# Get sentences
sentences = list(doc.sents)
print(f"Number of sentences: {len(sentences)}")
for i, sent in enumerate(sentences, 1):
    print(f"Sentence {i}: {sent.text}")

Part-of-Speech Tagging

import spacy
import spacy_pythainlp.core

nlp = spacy.blank("th")
nlp.add_pipe("pythainlp", config={"pos": True})

doc = nlp("ผมเป็นคนไทย")

# Print tokens with POS tags
for token in doc:
    print(f"{token.text}: {token.pos_}")

Named Entity Recognition

import spacy
import spacy_pythainlp.core

nlp = spacy.blank("th")
nlp.add_pipe("pythainlp", config={"ner": True})

doc = nlp("วันที่ 15 กันยายน 2564 ทดสอบระบบที่กรุงเทพ")

# Print named entities
for ent in doc.ents:
    print(f"{ent.text}: {ent.label_}")
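Under the hood, PyThaiNLP's NER engines typically emit BIO-tagged tokens, which a wrapper like this one must merge into character-offset spans before spaCy can expose them as doc.ents. Below is a minimal, self-contained sketch of that conversion (illustrative only, not the package's actual code; the sample tags are hypothetical):

```python
def bio_to_spans(tagged):
    """Convert (token, BIO-tag) pairs into (start, end, label) spans.

    `tagged` mimics tokenized NER output, e.g. [("15", "B-DATE"), ...].
    Offsets are character positions in the concatenated token text.
    """
    spans, pos = [], 0
    start = label = None
    for token, tag in tagged:
        if tag.startswith("B-"):
            if label is not None:           # close the previous entity
                spans.append((start, pos, label))
            start, label = pos, tag[2:]     # open a new entity
        elif tag.startswith("I-") and label == tag[2:]:
            pass                            # continue the current entity
        else:                               # "O" tag or inconsistent "I-"
            if label is not None:
                spans.append((start, pos, label))
            start = label = None
        pos += len(token)
    if label is not None:                   # close an entity at end of text
        spans.append((start, pos, label))
    return spans
```

For example, `bio_to_spans([("John", "B-PER"), ("Smith", "I-PER"), ("works", "O")])` yields a single PER span covering the first two tokens.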

Dependency Parsing

import spacy
import spacy_pythainlp.core

nlp = spacy.blank("th")
nlp.add_pipe("pythainlp", config={"dependency_parsing": True})

doc = nlp("ผมเป็นคนไทย")

# Print dependency relations
for token in doc:
    print(f"{token.text}: {token.dep_} <- {token.head.text}")

Word Vectors

import spacy
import spacy_pythainlp.core

nlp = spacy.blank("th")
nlp.add_pipe("pythainlp", config={"word_vector": True, "word_vector_model": "thai2fit_wv"})

doc = nlp("แมว สุนัข")

# Access word vectors
for token in doc:
    print(f"{token.text}: vector shape = {token.vector.shape}")
    
# Calculate similarity
token1 = doc[0]  # แมว
token2 = doc[1]  # สุนัข
print(f"Similarity: {token1.similarity(token2)}")
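The Token.similarity score above is the cosine similarity between the two static word vectors. For clarity, here is the same computation done by hand in pure Python, so you can see exactly what the number means:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity: dot(u, v) / (|u| * |v|).

    This is the quantity spaCy reports from Token.similarity when
    static word vectors are available. Ranges from -1 to 1; higher
    means the vectors point in more similar directions.
    """
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0 or norm_v == 0:
        return 0.0  # an all-zero vector has no direction
    return dot / (norm_u * norm_v)
```

Identical vectors score 1.0; orthogonal vectors score 0.0.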

Configuration

You can customize the PyThaiNLP pipeline component by passing a configuration dictionary to nlp.add_pipe():

nlp.add_pipe(
    "pythainlp",
    config={
        "pos_engine": "perceptron",
        "pos": True,
        "pos_corpus": "orchid_ud",
        "sent_engine": "crfcut",
        "sent": True,
        "ner_engine": "thainer",
        "ner": True,
        "tokenize_engine": "newmm",
        "tokenize": False,
        "dependency_parsing": False,
        "dependency_parsing_engine": "esupar",
        "dependency_parsing_model": None,
        "word_vector": True,
        "word_vector_model": "thai2fit_wv"
    }
)

Configuration Options

| Parameter | Type | Default | Description |
|---|---|---|---|
| tokenize | bool | False | Enable/disable word tokenization (spaCy uses PyThaiNLP's newmm by default) |
| tokenize_engine | str | "newmm" | Tokenization engine |
| sent | bool | True | Enable/disable sentence segmentation |
| sent_engine | str | "crfcut" | Sentence segmentation engine |
| pos | bool | True | Enable/disable part-of-speech tagging |
| pos_engine | str | "perceptron" | POS tagging engine |
| pos_corpus | str | "orchid_ud" | Corpus for POS tagging |
| ner | bool | True | Enable/disable named entity recognition |
| ner_engine | str | "thainer" | NER engine |
| dependency_parsing | bool | False | Enable/disable dependency parsing |
| dependency_parsing_engine | str | "esupar" | Dependency parsing engine |
| dependency_parsing_model | str | None | Dependency parsing model |
| word_vector | bool | True | Enable/disable word vectors |
| word_vector_model | str | "thai2fit_wv" | Word vector model |

For the engines and models available for each option, see the PyThaiNLP documentation.
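For reference, here are the defaults from the table gathered into one Python dict. Passing it unchanged as the config should be equivalent to passing no config at all; in practice you override only the keys you need:

```python
# Defaults from the table above, collected into a single dict that can
# be passed to nlp.add_pipe("pythainlp", config=PYTHAINLP_DEFAULTS).
PYTHAINLP_DEFAULTS = {
    "tokenize": False,
    "tokenize_engine": "newmm",
    "sent": True,
    "sent_engine": "crfcut",
    "pos": True,
    "pos_engine": "perceptron",
    "pos_corpus": "orchid_ud",
    "ner": True,
    "ner_engine": "thainer",
    "dependency_parsing": False,
    "dependency_parsing_engine": "esupar",
    "dependency_parsing_model": None,
    "word_vector": True,
    "word_vector_model": "thai2fit_wv",
}

# Override only what you need, e.g. turn on dependency parsing:
custom = {**PYTHAINLP_DEFAULTS, "dependency_parsing": True}
```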

Important Notes:

  • When dependency_parsing is enabled, word segmentation and sentence segmentation are automatically disabled to use the tokenization from the dependency parser.
  • All configuration options are optional and have sensible defaults.
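The first note can be expressed as a small rule. This is an illustrative sketch of the documented behavior, not the package's internal code:

```python
def effective_config(cfg):
    """Sketch of the documented rule: when dependency parsing is
    enabled, word and sentence segmentation come from the parser,
    so their standalone switches are forced off."""
    cfg = dict(cfg)  # don't mutate the caller's dict
    if cfg.get("dependency_parsing"):
        cfg["tokenize"] = False
        cfg["sent"] = False
    return cfg
```

So a config with both `dependency_parsing` and `sent` set to True effectively runs with sentence segmentation handled by the parser.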


Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

License

   Copyright 2016-2026 PyThaiNLP Project

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.



Download files

Source Distribution

  • spacy_pythainlp-1.0.tar.gz (14.4 kB)

Built Distribution

  • spacy_pythainlp-1.0-py3-none-any.whl (11.5 kB)

File details

spacy_pythainlp-1.0.tar.gz

  • Size: 14.4 kB
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7
  • SHA256: b1624e981d83399fe304e57ab769b038ac1a16c1b6af73dfe0ae1c6cb97d3aee
  • MD5: 91a4df9b57285477db64dc0184f13c2d
  • BLAKE2b-256: 94da92bf67a4a7029aa5d04f8178e6f3275660fdaf0c925b5e29e762026f20de

spacy_pythainlp-1.0-py3-none-any.whl

  • Size: 11.5 kB
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7
  • SHA256: 5a42db44e094f9fe7cea0d9b8311be1459da75b721e25ed40439d78d6eb8d4fd
  • MD5: 8b6331699adea2d48e0dbb7d796481d1
  • BLAKE2b-256: bf9293e384f06e716f3c9d115ebd381470c68755326b5061eb80880b1044a13b
