# QAGeneratorLLM

A Python package for generating educational questions and answers using various LLM providers.
## Features
- Support for multiple LLM providers (Anthropic, Ollama, OpenAI, XAI)
- Generate both Multiple Choice Questions (MCQ) and Open-Ended Questions
- Choose between structured dataclass or raw JSON output formats
- Batch processing support
- File-based context input
## Installation

```bash
pip install qageneratorllm
```
## Usage

```python
from qageneratorllm import QuestionGenerator, LLMProviderType, QuestionType, OutputType

# Initialize with default settings (Ollama + open-ended questions)
generator = QuestionGenerator()

# Generate open-ended questions from text
result = generator.invoke("Your context text here")

# Generate multiple choice questions using OpenAI
generator = QuestionGenerator(
    provider_type=LLMProviderType.OPENAI,
    question_type=QuestionType.MCQ,
)
result = generator.invoke("Your context text here")

# Generate from a file
result = generator.invoke_from_file("path/to/your/file.txt")

# Generate with JSON output instead of a dataclass
generator = QuestionGenerator(
    question_type=QuestionType.MCQ,
    output_type=OutputType.JSON,
)
result = generator.invoke("Your context text here")
# result will be a dictionary instead of a Pydantic model
```
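In JSON output mode the result is a plain dictionary, which makes it easy to post-process or persist. The exact schema is defined by the package's question models and may differ from this sketch; the `questions`/`choices`/`answer` keys below are illustrative assumptions only:

```python
import json

# Hypothetical JSON result shape (illustrative only -- the real schema
# comes from qageneratorllm's question models and may differ).
result = {
    "questions": [
        {
            "question": "Which planet is known as the Red Planet?",
            "choices": ["Venus", "Mars", "Jupiter", "Saturn"],
            "answer": "Mars",
        }
    ]
}

# Serialize for storage, then print a quick summary of each question.
serialized = json.dumps(result, indent=2)

for i, q in enumerate(result["questions"], start=1):
    print(f"Q{i}: {q['question']} -> {q['answer']}")
```

Because the dictionary is ordinary JSON-compatible data, it can be written straight to disk or loaded into any downstream tool without depending on the package's Pydantic models.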
## Command Line Usage

```bash
# Basic usage
python -m qageneratorllm.generator --input input.txt --output questions.json

# Generate MCQs in batch mode from a folder of text files
python -m qageneratorllm.generator --input texts_folder/ --output results/ --batch --questions 5 --output-type json
```
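The batch invocation can also be assembled from Python, e.g. when embedding the CLI in a larger pipeline. This sketch only builds the argument list (flag names taken from the examples above); actually running it via `subprocess.run(cmd)` requires the package to be installed:

```python
import sys

def build_batch_command(input_dir: str, output_dir: str, n_questions: int = 5) -> list[str]:
    """Assemble the CLI invocation for batch MCQ generation with JSON output."""
    return [
        sys.executable, "-m", "qageneratorllm.generator",
        "--input", input_dir,
        "--output", output_dir,
        "--batch",
        "--questions", str(n_questions),
        "--output-type", "json",
    ]

cmd = build_batch_command("texts_folder/", "results/")
print(" ".join(cmd[1:]))
```

Using `sys.executable` keeps the invocation pinned to the interpreter of the current environment, which avoids picking up a different Python where the package is not installed.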
## Gradio Interface

The package includes a Gradio web interface for interactive question generation:

```bash
# Launch the Gradio app
python -m qageneratorllm.gradio_app
```
With the Gradio interface, you can:
- Upload documents (.txt, .md, .pdf)
- View document chunks separated by headers
- Filter chunks by header level
- Select specific chunks for question generation
- Choose question type (MCQ or open-ended)
- Select LLM provider
- Generate questions interactively
## Environment Variables

- `ANTHROPIC_MODEL_NAME`: Anthropic model name (default: `claude-3-sonnet-20240229`)
- `OLLAMA_MODEL_NAME`: Ollama model name (default: `qwen2.5`)
- `OPENAI_MODEL_NAME`: OpenAI model name (default: `gpt-4o`)
- `XAI_MODEL_NAME`: XAI model name (default: `grok-beta`)
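Since these are ordinary process environment variables, they can be exported in the shell or set from Python before a generator is constructed. A minimal sketch, assuming the package reads them at generator-construction time (the values shown are the documented defaults):

```python
import os

# Select the model each provider should use; these must be set before
# constructing a QuestionGenerator so the package can pick them up.
os.environ["OLLAMA_MODEL_NAME"] = "qwen2.5"
os.environ["OPENAI_MODEL_NAME"] = "gpt-4o"

print(os.environ["OLLAMA_MODEL_NAME"])
```

Use `os.environ.setdefault(...)` instead of direct assignment if values already exported in the shell should take precedence.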
## License

MIT
## File Details

Details for the file `qageneratorllm-0.1.3.tar.gz`:

- Download URL: qageneratorllm-0.1.3.tar.gz
- Size: 209.1 kB
- Tags: Source
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | d7eaaddb5bca0c428af65b1d701089b529a75c6fa71258484801fc7da36cfa89 |
| MD5 | 768b244864a2deb133f756e8fc2b4023 |
| BLAKE2b-256 | 5b75573563056b5ec67e73663a653695c6d06803661730be9f8123e0996ee228 |

Provenance: the following attestation bundle was made for `qageneratorllm-0.1.3.tar.gz`.

Publisher: python-publish.yml on KameniAlexNea/create-dataset

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: qageneratorllm-0.1.3.tar.gz
- Subject digest: d7eaaddb5bca0c428af65b1d701089b529a75c6fa71258484801fc7da36cfa89
- Sigstore transparency entry: 219357603
- Permalink: KameniAlexNea/create-dataset@d6edcb47aadaa4a0f3f705319e788d9783137294
- Branch / Tag: refs/tags/qageneratorllm-v0.1.3
- Owner: https://github.com/KameniAlexNea
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@d6edcb47aadaa4a0f3f705319e788d9783137294
- Trigger Event: release
Details for the file `qageneratorllm-0.1.3-py3-none-any.whl`:

- Download URL: qageneratorllm-0.1.3-py3-none-any.whl
- Size: 13.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | a3a97b3e84d4189ee6de72c3d630da12047a34c977ee8f1bb7f9099ee0c2eb28 |
| MD5 | a7d4d660097c1122bdda66ba67299c6e |
| BLAKE2b-256 | 7fa5685950afaa238b03fd38d7ca5db9544d2bb1285063eacee00d03210e5264 |

Provenance: the following attestation bundle was made for `qageneratorllm-0.1.3-py3-none-any.whl`.

Publisher: python-publish.yml on KameniAlexNea/create-dataset

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: qageneratorllm-0.1.3-py3-none-any.whl
- Subject digest: a3a97b3e84d4189ee6de72c3d630da12047a34c977ee8f1bb7f9099ee0c2eb28
- Sigstore transparency entry: 219357606
- Permalink: KameniAlexNea/create-dataset@d6edcb47aadaa4a0f3f705319e788d9783137294
- Branch / Tag: refs/tags/qageneratorllm-v0.1.3
- Owner: https://github.com/KameniAlexNea
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@d6edcb47aadaa4a0f3f705319e788d9783137294
- Trigger Event: release