concinno-skills-cloud
Cloud object-storage skills for Concinno.
AWS S3 + Google Cloud Storage + Azure Blob Storage via their official
Python SDKs — thin call(**kwargs) adapters for agents that need to
upload / download / list / delete objects without owning cloud boilerplate.
Status
MVP (0.1.0) — three tools covering the canonical "talk to cloud object storage" agent need. Vendor SDKs are behind optional extras so installing this package does not pull ~65MB of cloud SDK dependencies unless you ask.
| Tool | Library | License | Install | Purpose |
|---|---|---|---|---|
| `S3Object` | boto3 (v1.34+) | Apache-2.0 | `[aws]` | S3 upload/download/list/delete |
| `GcsObject` | google-cloud-storage (v2.18+) | Apache-2.0 | `[gcp]` | GCS upload/download/list/delete |
| `AzureBlobObject` | azure-storage-blob (v12.24+) | MIT | `[azure]` | Azure Blob upload/download/list/delete |
Install
# Just AWS:
pip install 'concinno-skills-cloud[aws]'
# Just GCP:
pip install 'concinno-skills-cloud[gcp]'
# Just Azure:
pip install 'concinno-skills-cloud[azure]'
# All three:
pip install 'concinno-skills-cloud[all]'
# Core only (no SDKs — tool calls will return missing_driver errors):
pip install concinno-skills-cloud
The core wheel is ~5KB of Python. Each extra adds its vendor SDK plus its transitive deps. Most agents only need one cloud path and install exactly that extra.
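Before wiring an agent to one of the tools, it can be useful to check which vendor SDKs are actually present — the same condition that decides whether a call succeeds or returns a missing_driver error. A minimal stdlib-only sketch (the `EXTRAS` mapping and helper names here are illustrative, not part of the package API):

```python
# Sketch: detect which optional extras' vendor SDKs are importable.
# Stdlib only — runs even with no extras installed.
from importlib.util import find_spec

EXTRAS = {
    "aws": "boto3",
    "gcp": "google.cloud.storage",
    "azure": "azure.storage.blob",
}

def sdk_available(module: str) -> bool:
    """True if the module can be located without importing it fully."""
    try:
        return find_spec(module) is not None
    except ModuleNotFoundError:
        # Parent package (e.g. "google") is absent entirely.
        return False

def available_extras() -> set:
    """Names of extras whose vendor SDK is installed."""
    return {extra for extra, mod in EXTRAS.items() if sdk_available(mod)}
```

`find_spec` avoids importing the heavy SDK just to probe for it; the `try/except` is needed because probing a dotted name like `google.cloud.storage` imports its parent packages first.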
Safety
Every tool routes through a shared _safety module before touching
the vendor:
- `action` enum — reject bogus actions early with a clear list.
- Content size cap — upload/download payloads hard-capped at 1 MiB. The agent context window is precious; dumping 100 MiB files into it is never the right call. Use the vendor SDK directly for larger artifacts.
- `max_keys` cap — list results capped at 1000 (default 100). Paginate with prefix narrowing for bigger buckets.
- Type checks — non-empty string bucket/blob names; `content` must be `bytes` or `str` (UTF-8 encoded internally).
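The rules above can be sketched as a single pre-flight check. This is an illustration of the documented behavior, not the package's actual `_safety` code — the function name and exact error strings are assumptions:

```python
# Sketch of the documented safety checks, assuming the error-envelope
# convention {"error": "..."} used by the real tools.
MAX_CONTENT_BYTES = 1 * 1024 * 1024   # 1 MiB payload cap
MAX_KEYS_CEILING = 1000               # hard cap on list results
ACTIONS = {"upload", "download", "list", "delete"}

def validate(action, bucket, content=None, max_keys=100):
    """Return None if the call looks safe, else an error envelope."""
    if action not in ACTIONS:
        return {"error": f"unknown action {action!r}; expected one of {sorted(ACTIONS)}"}
    if not isinstance(bucket, str) or not bucket:
        return {"error": "bucket must be a non-empty string"}
    if content is not None:
        if not isinstance(content, (bytes, str)):
            return {"error": "content must be bytes or str"}
        data = content.encode("utf-8") if isinstance(content, str) else content
        if len(data) > MAX_CONTENT_BYTES:
            return {"error": f"content exceeds the {MAX_CONTENT_BYTES}-byte cap"}
    if max_keys > MAX_KEYS_CEILING:
        return {"error": f"max_keys is capped at {MAX_KEYS_CEILING}"}
    return None
```

Note that `str` content is measured after UTF-8 encoding, so multi-byte characters count at their encoded size.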
Credentials
No credentials are stored. Each call accepts the vendor-specific
credential kwargs:
- AWS: `aws_access_key_id`, `aws_secret_access_key`, `aws_region`, `aws_session_token` (optional STS token). If omitted, boto3's default credential chain applies (env vars, shared config, IMDSv2, …).
- GCS: `gcs_credentials_json` (parsed service-account dict) OR `GOOGLE_APPLICATION_CREDENTIALS` env var pointing at a JSON file. Omit both for Application Default Credentials (Workload Identity).
- Azure: `azure_storage_connection_string` (preferred, single string) OR `azure_storage_account_url` + `azure_storage_sas_token` (scoped SAS flow).
Agents that want indirection can resolve credentials upstream, via
environment-variable expansion or Concinno's CredentialStore:
from concinno.core.credentials import CredentialStore
cs = CredentialStore()
creds = {
"aws_access_key_id": cs.get("AWS_ACCESS_KEY_ID"),
"aws_secret_access_key": cs.get("AWS_SECRET_ACCESS_KEY"),
"aws_region": "us-east-1",
}
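The environment-expansion alternative looks much the same, just sourced from process env vars. A stdlib-only sketch (the env var names follow AWS convention; they are not required by the package):

```python
# Sketch: build the same per-call kwargs from environment variables.
import os

creds = {
    "aws_access_key_id": os.environ.get("AWS_ACCESS_KEY_ID"),
    "aws_secret_access_key": os.environ.get("AWS_SECRET_ACCESS_KEY"),
    "aws_region": os.environ.get("AWS_REGION", "us-east-1"),
}
# Drop unset entries so boto3's default credential chain can still
# apply to whatever is missing.
creds = {k: v for k, v in creds.items() if v is not None}
```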
Usage via Concinno ToolRegistry
When the consumer sets CONCINNO_LOAD_PLUGINS=1, the default
registry auto-mounts all three tools:
import os
os.environ["CONCINNO_LOAD_PLUGINS"] = "1"
from concinno.tools.registry import get_default_registry
reg = get_default_registry()
expected = {"S3Object", "GcsObject", "AzureBlobObject"}
assert expected.issubset(set(reg.list_deferred()))
Direct Python usage
from concinno_skills_cloud import S3Object, GcsObject, AzureBlobObject
# ── AWS S3 ───────────────────────────────────────────────────────
S3Object().call(
action="upload",
bucket="mybucket",
key="reports/q3.json",
content=b'{"revenue": 1234}',
aws_access_key_id="AKIA...",
aws_secret_access_key="...",
aws_region="us-east-1",
)
# → {"ok": True, "bucket": "mybucket", "key": "reports/q3.json", "size": 18}
S3Object().call(
action="download",
bucket="mybucket",
key="reports/q3.json",
aws_access_key_id="AKIA...",
aws_secret_access_key="...",
)
# → {"ok": True, "content": "<base64>", "content_type": "...", "size": 18}
S3Object().call(
action="list",
bucket="mybucket",
prefix="reports/",
max_keys=100,
aws_access_key_id="AKIA...",
aws_secret_access_key="...",
)
# → {"ok": True, "keys": ["reports/q1.json", ...], "truncated": False}
# ── GCS ──────────────────────────────────────────────────────────
GcsObject().call(
action="upload",
bucket="mybucket",
blob_name="reports/q3.json",
content=b'{"revenue": 1234}',
content_type="application/json",
gcs_credentials_json={"client_email": "...", "private_key": "...", ...},
)
# → {"ok": True, "bucket": "mybucket", "blob_name": "reports/q3.json", "size": 18}
# ── Azure Blob ───────────────────────────────────────────────────
AzureBlobObject().call(
action="upload",
container="mycontainer",
blob_name="reports/q3.json",
content=b'{"revenue": 1234}',
content_type="application/json",
azure_storage_connection_string="DefaultEndpointsProtocol=https;...",
)
# → {"ok": True, "container": "mycontainer", "blob_name": "...", "size": 18}
All tools return either {"ok": True, ...} on success or
{"error": "..."} on any validation or vendor-SDK failure — same
envelope as other Concinno built-in tools.
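Because every tool shares this envelope, callers can funnel all three through one small handler. A hypothetical helper, assuming only the `ok`/`error` convention stated above:

```python
# Sketch: uniform handling of the shared {"ok": True, ...} / {"error": ...}
# envelope. `unwrap` is an illustrative name, not a package API.
def unwrap(resp: dict) -> dict:
    """Return the success payload, or raise on an error envelope."""
    if resp.get("ok"):
        return {k: v for k, v in resp.items() if k != "ok"}
    raise RuntimeError(resp.get("error", "unknown tool error"))
```

This keeps vendor-specific branching out of agent code: a failed S3 call and a failed Azure call surface identically.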
Base64 round-trip
download responses base64-encode the object body so the JSON envelope
stays clean. Decode with:
import base64
resp = S3Object().call(action="download", ...)
data = base64.b64decode(resp["content"])
What this package is NOT
- Not a file-system abstraction. Use `fsspec`/`s3fs`/`gcsfs` if you want POSIX-style paths across cloud providers.
- Not a sync/migration tool. Bucket/container creation, IAM, lifecycle rules, replication, and encryption-at-rest policies go through each vendor's admin API.
- Not a large-object pipeline. 1 MiB cap per call. Use the vendor SDK directly (or `rclone` / `aws s3 cp`) for video, archive, or ML-checkpoint objects.
- Not a secrets store. Credentials are per-call kwargs.
- Not a presigned-URL generator. Scope is limited to direct object operations in 0.1.0; presigned URLs may come in a later release.
Caveats
- `boto3` IMDSv2 fallback: On EC2 hosts with an attached IAM instance profile, omitting explicit AWS creds silently succeeds via IMDSv2. Agents crossing environments should always pass explicit creds or set `AWS_EC2_METADATA_DISABLED=true`.
- GCS ADC complexity: Workload Identity + GKE metadata + service-account keyfile + `GOOGLE_APPLICATION_CREDENTIALS` all interact via `google.auth.default()`. If an agent in production authenticates unexpectedly, the vendor's auth-chain docs are the source of truth.
- Azure connection string carries the account key: Prefer SAS tokens scoped to the specific container + permissions for agent use. Connection strings should be reserved for admin/migration paths.
License
Apache-2.0. See LICENSE.