Dialektik

Merge. Synthesize. Create. Dialektik generates new content by fusing ideas from diverse sources, revealing unexpected insights and perspectives through a dialectical process.

Features

  • Loads and processes datasets from multiple sources
  • Summarizes text into concise bullet points
  • Generates thesis, antithesis, and synthesis from summarized content
  • Supports various AI models for text generation
  • Model-agnostic design allows easy swapping of different LLMs

Requirements

  • Required: datasets, huggingface_hub
  • Optional: phi-3-vision-mlx (needed only if you want to create a new dataset with the provided setup() function)

Installation

To install Dialektik with core dependencies only:

pip install dialektik

To install Dialektik with all dependencies, including those required for the setup() function:

pip install dialektik[setup]

Note: Install the full version if you plan to process custom datasets using the setup() function.

Usage

Command Line Interface

After installation, Dialektik can be used from the command line. Some examples:

  1. Generate a synthesis with default settings:

    dialektik
    
  2. Specify sources:

    dialektik --source arxiv
    
  3. Set the number of bullet points per book and choose a different model:

    dialektik --per-book 5 --model "your-preferred-model"
    
  4. Run the setup function:

    dialektik --setup
    
  5. For a full list of options, use:

    dialektik --help
    

Python API

You can also use Dialektik in your Python scripts:

from dialektik import synthesize

# Generate a synthesis with default settings
thesis, antithesis, synthesis = synthesize()

# Customize the synthesis process
output = synthesize(
   list_source=['your_source'],
   per_book=3,
   api_model="mistralai/Mistral-Nemo-Instruct-2407"
)

Accessing the Dataset

The default dataset, 'JosefAlbers/StampyAI-alignment-research-dataset', is publicly available. You don't need to set up any environment variables or run the setup() function to use Dialektik with it.
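
If you want to inspect the source material yourself, the default dataset can be loaded directly with the datasets library. This is a minimal sketch; the split name and the printed fields are assumptions, not part of Dialektik's API:

from datasets import load_dataset

# Load the public default dataset from the Hugging Face Hub (no token required).
ds = load_dataset("JosefAlbers/StampyAI-alignment-research-dataset", split="train")

print(ds)     # number of rows and column names
print(ds[0])  # first record; the exact fields depend on the dataset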

(Optional) Using Custom Datasets

If you want to use your own dataset:

  1. Prepare your dataset according to the required format.
  2. Modify the PATH_DS variable in the code to point to your dataset.
  3. If your dataset is private or requires authentication, set up the following environment variables:
    • HF_WRITE_TOKEN: Hugging Face write token (for pushing datasets)
    • HF_READ_TOKEN: Hugging Face read token (for accessing private datasets)

Note: The setup() function provided in the code is a demonstration of how you might process a custom dataset. Different datasets may require different processing steps, so you'll need to adapt this function to your specific needs.
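
As a rough illustration, here is how a private custom dataset might be wired up. Importing setup() directly and the token placeholders are assumptions; adapt this to however you modify the code:

import os

# Tokens are only needed if your dataset is private or you push processed data.
os.environ["HF_READ_TOKEN"] = "<your-hf-read-token>"
os.environ["HF_WRITE_TOKEN"] = "<your-hf-write-token>"

# With PATH_DS in the source pointed at your dataset, run the demonstration setup().
from dialektik import setup
setup()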

Customizing the LLM

Dialektik is designed to be model-agnostic. The default model is "mistralai/Mistral-Nemo-Instruct-2407", but you can easily change this by passing a different api_model parameter to the synthesize() function.
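
For example (the model name below is only a placeholder; use any instruct-tuned model your inference setup can reach):

from dialektik import synthesize

# Pass a different model identifier to swap out the default LLM.
output = synthesize(api_model="your-preferred-model")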

Output

The synthesize() function generates three outputs:

  1. Thesis: An article exploring the main themes and insights from the selected sources.
  2. Antithesis: A text presenting alternative perspectives and counterarguments to the thesis.
  3. Synthesis: A reconciliation of the thesis and antithesis, presenting a new, unified viewpoint.

All outputs are saved in the 'syntheses' folder with timestamps for easy reference.
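
If you want to pick up the saved files programmatically afterwards, something like the following sketch works. Only the 'syntheses' folder name comes from the documentation; the file naming and extension are assumptions:

from pathlib import Path

# Files in 'syntheses' are timestamped, so sorting by modification time finds the latest run.
saved = sorted(Path("syntheses").glob("*"), key=lambda p: p.stat().st_mtime)
if saved:
    latest = saved[-1]
    print(f"Most recent output: {latest.name}")
    print(latest.read_text()[:500])  # preview the first 500 characters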

License

This project is licensed under the MIT License.

Citation

DOI

Contributing

Contributions to Dialektik are always welcome! Please feel free to submit a Pull Request.


