
QAGeneratorLLM

A Python package for generating educational questions and answers using various LLM providers.

Features

  • Support for multiple LLM providers (Anthropic, Ollama, OpenAI, xAI)
  • Generate both Multiple Choice Questions (MCQ) and Open-Ended Questions
  • Choose between structured (Pydantic model) or raw JSON (dict) output formats
  • Batch processing support
  • File-based context input

Installation

pip install qageneratorllm

Usage

from qageneratorllm import QuestionGenerator, LLMProviderType, QuestionType, OutputType

# Initialize with default settings (Ollama + Open-Ended Questions)
generator = QuestionGenerator()

# Generate open-ended questions from text
result = generator.invoke("Your context text here")

# Generate Multiple Choice Questions using OpenAI
generator = QuestionGenerator(
    provider_type=LLMProviderType.OPENAI,
    question_type=QuestionType.MCQ
)
result = generator.invoke("Your context text here")

# Generate from file
result = generator.invoke_from_file("path/to/your/file.txt")

# Generate with JSON output instead of a structured model
generator = QuestionGenerator(
    question_type=QuestionType.MCQ,
    output_type=OutputType.JSON
)
result = generator.invoke("Your context text here")
# result will be a dictionary instead of a Pydantic model
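Because `OutputType.JSON` yields a plain dictionary, the result can be written straight to disk with the standard `json` module. The keys shown below are purely illustrative — the actual schema is defined by the package:

```python
import json

# Illustrative structure only; the real keys depend on qageneratorllm's schema.
result = {
    "questions": [
        {
            "question": "What is photosynthesis?",
            "answer": "The process plants use to convert light into chemical energy.",
        }
    ]
}

# Persist the generated questions for later use.
with open("questions.json", "w", encoding="utf-8") as f:
    json.dump(result, f, indent=2, ensure_ascii=False)
```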

Command Line Usage

# Basic usage
python -m qageneratorllm.generator --input input.txt --output questions.json

# Generate MCQs in batch mode from a folder of text files
python -m qageneratorllm.generator --input texts_folder/ --output results/ --batch --questions 5 --output-type json

Gradio Interface

The package includes a Gradio web interface for interactive question generation:

# Launch the Gradio app
python -m qageneratorllm.gradio_app

With the Gradio interface, you can:

  • Upload documents (.txt, .md, .pdf)
  • View document chunks separated by headers
  • Filter chunks by header level
  • Select specific chunks for question generation
  • Choose question type (MCQ or open-ended)
  • Select LLM provider
  • Generate questions interactively

Environment Variables

  • ANTHROPIC_MODEL_NAME: Anthropic model name (default: claude-3-sonnet-20240229)
  • OLLAMA_MODEL_NAME: Ollama model name (default: qwen2.5)
  • OPENAI_MODEL_NAME: OpenAI model name (default: gpt-4o)
  • XAI_MODEL_NAME: XAI model name (default: grok-beta)
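These variables can be exported in the shell before launching the generator or the Gradio app. The values below are only examples; any variable left unset falls back to the documented default:

```shell
# Example overrides; unset variables fall back to the defaults listed above.
export OLLAMA_MODEL_NAME="qwen2.5"
export OPENAI_MODEL_NAME="gpt-4o"
export ANTHROPIC_MODEL_NAME="claude-3-sonnet-20240229"
```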

License

MIT
