
QAGeneratorLLM

A Python package for generating educational questions and answers using various LLM providers.

Features

  • Support for multiple LLM providers (Anthropic, Ollama, OpenAI, XAI)
  • Generate both Multiple Choice Questions (MCQ) and Open-Ended Questions
  • Choose between structured (Pydantic model) or raw JSON output formats
  • Batch processing support
  • File-based context input

Installation

pip install qageneratorllm

Usage

from qageneratorllm import QuestionGenerator, LLMProviderType, QuestionType, OutputType

# Initialize with default settings (Ollama + Open-Ended Questions)
generator = QuestionGenerator()

# Generate open-ended questions from text
result = generator.invoke("Your context text here")

# Generate Multiple Choice Questions using OpenAI
generator = QuestionGenerator(
    provider_type=LLMProviderType.OPENAI,
    question_type=QuestionType.MCQ
)
result = generator.invoke("Your context text here")

# Generate from file
result = generator.invoke_from_file("path/to/your/file.txt")

# Generate with JSON output instead of dataclass
generator = QuestionGenerator(
    question_type=QuestionType.MCQ,
    output_type=OutputType.JSON
)
result = generator.invoke("Your context text here")
# result will be a dictionary instead of a Pydantic model
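Batch processing (listed under Features) can also be driven from Python by looping over a folder with `invoke_from_file`. A minimal sketch — the `texts_folder` path and the `collect_text_files` helper are illustrative, not part of the package API:

```python
from pathlib import Path

def collect_text_files(folder):
    """Return the .txt files in a folder in sorted order,
    mirroring what the CLI's --batch mode iterates over."""
    return sorted(Path(folder).glob("*.txt"))

if __name__ == "__main__":
    from qageneratorllm import QuestionGenerator, QuestionType

    # One generator instance is reused across all files
    generator = QuestionGenerator(question_type=QuestionType.MCQ)
    for path in collect_text_files("texts_folder"):
        result = generator.invoke_from_file(str(path))
        print(path.name, result)
```

For larger folders, the CLI's `--batch` flag (shown below) does the same loop and also handles writing one output file per input.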

Command Line Usage

# Basic usage
python -m qageneratorllm.generator --input input.txt --output questions.json

# Generate MCQs in batch mode from a folder of text files
python -m qageneratorllm.generator --input texts_folder/ --output results/ --batch --questions 5 --output-type json

Gradio Interface

The package includes a Gradio web interface for interactive question generation:

# Launch the Gradio app
python -m qageneratorllm.gradio_app

With the Gradio interface, you can:

  • Upload documents (.txt, .md, .pdf)
  • View document chunks separated by headers
  • Filter chunks by header level
  • Select specific chunks for question generation
  • Choose question type (MCQ or open-ended)
  • Select LLM provider
  • Generate questions interactively

Environment Variables

  • ANTHROPIC_MODEL_NAME: Anthropic model name (default: claude-3-sonnet-20240229)
  • OLLAMA_MODEL_NAME: Ollama model name (default: qwen2.5)
  • OPENAI_MODEL_NAME: OpenAI model name (default: gpt-4o)
  • XAI_MODEL_NAME: XAI model name (default: grok-beta)
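These variables can be exported in the shell before running the generator; a minimal sketch (the values shown are simply the documented defaults):

```shell
# Select models per provider; unset variables fall back to the defaults above
export OLLAMA_MODEL_NAME=qwen2.5
export OPENAI_MODEL_NAME=gpt-4o
python -m qageneratorllm.generator --input input.txt --output questions.json
```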

License

MIT

