Project description
Don't know how to do something in your terminal?
howto print my system information?
And you'll be told!
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ You can use the system_profiler command on macOS to print detailed system information. For example: │
│ │
│ │
│ system_profiler SPHardwareDataType │
│ │
│ │
│ This command will display detailed information about your Mac's hardware, including the model name, │
│ processor, number of processors, total number of cores, memory, and other hardware specifics. For more │
│ comprehensive system information, you can run: │
│ │
│ │
│ system_profiler │
│ │
│ │
│ This command provides an extensive report on hardware, software, and network configurations. It may take a │
│ few moments to complete due to the amount of information it gathers. │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
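The question goes straight on the command line, as in the example above. A minimal sketch of the invocation, assuming the howto entry point from the install step below; quoting is only needed if your shell treats characters like ? specially:
howto print my system information      # the words after the command are taken as the question
howto "print my system information?"   # quote it if your shell chokes on ? or !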
Installing
howto is available on PyPI and can be installed in any number of ways:
pip install howto-ai
pipx install howto-ai
uvx --from howto-ai howto
Note: pre-1.0.0 releases are assumed to be unstable and might break or do unexpected things; use wisely.
Backends
howto uses 3rd party services as backends. By default it is configured to use Ollama, which is expected to be running locally. If it is, it should just work.
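If you want to point it at a specific local model rather than the default, the config follows the same model = "..." pattern as the hosted backends below; the ollama/ prefix and the model name here are assumptions, so match them to whatever you have pulled locally:
model = "ollama/llama3" # assumes you've run: ollama pull llama3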
howto can also be used against popular local, private and public AI APIs, including ChatGPT, Gemini and others. To configure this, you need to edit the config file. Locating the config file can be done with:
howto --config-path
Edit this in your favourite text editor. If this is difficult, please open an Issue and we can help.
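For example, to open it straight from the shell (assuming your $EDITOR environment variable is set):
$EDITOR "$(howto --config-path)" # e.g. EDITOR=vim or EDITOR=nano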
OpenAI
Set the model you want by updating the config file
model = "gpt-4o"
# or
model = "gpt-4"
# or
model = "gpt-3.5-turbo"
# ...
then export your OPENAI_API_KEY for that session (in your .envrc or .bashrc or wherever):
export OPENAI_API_KEY="<your key here>" # https://platform.openai.com/api-keys
and you should be good to go
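If you'd rather not keep the key in your shell profile, plain shell environment scoping also works for a one-off call (this is standard shell behaviour, nothing howto-specific):
OPENAI_API_KEY="<your key here>" howto print my system information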
Hugging Face
Set the model you want by updating the config file
model = "huggingface/<hugging_face_model>"
# for example
model = "huggingface/facebook/blenderbot-400M-distill" # https://huggingface.co/facebook/blenderbot-400M-distill
# or
model = "Qwen/Qwen2.5-Coder-32B-Instruct" # https://huggingface.co/Qwen/Qwen2.5-Coder-32B-Instruct
# ...
then export your HUGGINGFACE_API_KEY for that session (in your .envrc or .bashrc or wherever):
export HUGGINGFACE_API_KEY="<your key here>" # https://huggingface.co/docs/hub/security-tokens
and you should be good to go
Anthropic
Set the model you want by updating the config file
model = "claude-3-5-sonnet-20241022" # https://docs.anthropic.com/en/docs/about-claude/models
# or
model = "claude-3-5-haiku-latest"
# or
model = "claude-3-opus-latest"
# ...
then export your ANTHROPIC_API_KEY for that session (in your .envrc or .bashrc or wherever):
export ANTHROPIC_API_KEY="<your key here>" # https://console.anthropic.com/settings/keys
and you should be good to go
Remember: AI can produce rubbish code, so research the commands howto spits out before you run them.
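For instance, before running the system_profiler suggestion from the example above, it's worth skimming its manual page:
man system_profiler # check what the command does and which flags it takes before you paste it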
File details
Details for the file howto_ai-0.1.0.tar.gz.
File metadata
- Download URL: howto_ai-0.1.0.tar.gz
- Upload date:
- Size: 257.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | a656ca62e212c50b7c89f6269c64bd1bafbb2b2834edbf18da52f71aacc4926e
MD5 | fd37fde5da5650121f427cc334bec4aa
BLAKE2b-256 | a0e9230416cb8707fe9ed01fc9b83a06724131ca4a04bd13da82fb66e1de0c89
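If you download the sdist by hand, you can check it against the SHA256 digest above; a small sketch, assuming the file sits in your current directory:
shasum -a 256 howto_ai-0.1.0.tar.gz   # macOS / BSD
sha256sum howto_ai-0.1.0.tar.gz       # Linux
# both should print a656ca62e212c50b7c89f6269c64bd1bafbb2b2834edbf18da52f71aacc4926e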
Provenance
The following attestation bundles were made for howto_ai-0.1.0.tar.gz:
Publisher: build-and-publish.yaml on GitToby/howto-ai
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: howto_ai-0.1.0.tar.gz
- Subject digest: a656ca62e212c50b7c89f6269c64bd1bafbb2b2834edbf18da52f71aacc4926e
- Sigstore transparency entry: 150810184
- Sigstore integration time:
File details
Details for the file howto_ai-0.1.0-py3-none-any.whl.
File metadata
- Download URL: howto_ai-0.1.0-py3-none-any.whl
- Upload date:
- Size: 6.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | fed66b6d569ce37f30b9a5196f35ca0598fa2efeefd5f9c3b4663f1cf90ceadf
MD5 | ec05d3d0e3019fcd7859f478a0f2dbf4
BLAKE2b-256 | b5656bc8c9f02f70b90fe7005c736d669df8f8bfa037c256d01ec98425431a6c
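You can also ask pip to enforce this digest at install time with its hash-checking mode; a sketch using the wheel's SHA256 from the table above (note that in --require-hashes mode every dependency must be hash-pinned too, so a real requirements file would list those as well):
echo 'howto-ai==0.1.0 --hash=sha256:fed66b6d569ce37f30b9a5196f35ca0598fa2efeefd5f9c3b4663f1cf90ceadf' > requirements.txt
pip install --require-hashes -r requirements.txt # pip refuses the install if the digest doesn't match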
Provenance
The following attestation bundles were made for howto_ai-0.1.0-py3-none-any.whl:
Publisher: build-and-publish.yaml on GitToby/howto-ai
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: howto_ai-0.1.0-py3-none-any.whl
- Subject digest: fed66b6d569ce37f30b9a5196f35ca0598fa2efeefd5f9c3b4663f1cf90ceadf
- Sigstore transparency entry: 150810186
- Sigstore integration time: