AI Agent as a Service (Clients Bring Their Own LLM)

Scenario
Offer custom-built AI agents plus ongoing retainer support, deployed to clients’ own channels, while clients pay for their LLM usage directly to the providers.

Key Requirements

  • Billing model: AgenticFlow (AF) credits cover the AF platform/orchestration layer only. Clients supply their own LLM API key(s) (OpenAI, Gemini, etc.) and are billed directly by the provider.

  • BYO-LLM keys UX:

    • Prompt users to enter an API key before first use (or per session).

    • Options: one-time workspace-level storage (encrypted) or session-only (not stored).

    • Validate the key with a test call, surface a per-channel model picker, and let clients set usage caps.

  • Security & storage:

    • Encrypted at rest, redaction in logs, role-based access, key rotation, and easy revoke/reset.

  • Deployments:

    • “Build → Install/Publish” to client-owned Slack/Discord/Telegram (and web widget if available).

    • Self-contained installer or guided setup with health checks and webhooks.

  • Admin & analytics:

    • Dashboard for model selection, usage meters, error logs, and cost split (AF credits vs. LLM provider).

    • Alerting on invalid/expired keys.

  • Support model:

    • Custom build + install, plus retainer for updates, fine-tuning, and monitoring.
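The key-validation and log-redaction requirements above can be sketched as two small helpers. This is a minimal illustration, not an AF API: the function names, the provider prefix table, and the length threshold are all assumptions; a real flow would follow the local format check with one cheap billable test call to the provider.

```python
# Hedged sketch of BYO-LLM key handling. validate_key_format and mask_key
# are hypothetical helper names; the prefixes below are the providers'
# publicly documented key prefixes, used only as a cheap local sanity
# check before spending a billable test call.

def validate_key_format(provider: str, key: str) -> bool:
    """Local sanity check; a real validation would also make a test call."""
    prefixes = {"openai": "sk-", "gemini": "AIza"}
    prefix = prefixes.get(provider)
    return bool(prefix) and key.startswith(prefix) and len(key) > 20

def mask_key(key: str) -> str:
    """Redacted form safe for logs, error reports, and dashboards."""
    return key[:4] + "…" + key[-4:] if len(key) > 8 else "…"
```

The masked form is what should appear everywhere outside the encrypted store, which also makes revoke/reset safe to surface in the admin dashboard.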
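The cost-split and usage-cap requirements can likewise be sketched as a single meter that tracks AF credits and client-paid LLM spend separately. Everything here is an assumption for illustration: the class name, the per-run credit rate, and the per-token price are placeholders, not real AF or provider pricing.

```python
# Hedged sketch of the "cost split" meter: AF credits vs. client-paid LLM
# spend accrue in separate buckets, and an optional client-set cap blocks
# runs that would exceed it. All rates and names are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class UsageMeter:
    af_credit_rate: float = 1.0            # AF credits per orchestrated run (assumed)
    llm_usd_per_1k_tokens: float = 0.002   # illustrative provider rate, not real pricing
    usage_cap_usd: Optional[float] = None  # optional client-set cap on LLM spend
    af_credits: float = 0.0
    llm_usd: float = 0.0

    def record_run(self, tokens: int) -> None:
        cost = tokens / 1000 * self.llm_usd_per_1k_tokens
        if self.usage_cap_usd is not None and self.llm_usd + cost > self.usage_cap_usd:
            # Surfaced to the dashboard as an alert, alongside invalid-key alerts.
            raise RuntimeError("LLM usage cap reached; ask the client to raise it")
        self.af_credits += self.af_credit_rate
        self.llm_usd += cost

    def cost_split(self) -> dict:
        """The two numbers the dashboard shows side by side."""
        return {"af_credits": self.af_credits,
                "llm_provider_usd": round(self.llm_usd, 6)}
```

Keeping the two buckets separate is what lets the dashboard show clients exactly which charges come from AF and which from their own provider account.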

Reference UX
Gated usage requiring API key first (similar to this flow): https://prompt-enhancer.streamlit.app/
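The gated flow in that reference can be sketched as a session object that refuses to run until a key is provided, and that drops session-only keys on teardown. `AgentSession` and its methods are hypothetical names for illustration, not an AF or Streamlit API; the `run` body stands in for the real LLM call.

```python
# Minimal sketch of the gated, key-first flow: no agent call succeeds
# until a key is present, and a session-only key is never persisted.
# AgentSession, provide_key, and run are hypothetical names.
from typing import Optional

class MissingKeyError(Exception):
    """Raised to trigger the 'enter your API key' prompt in the UI."""

class AgentSession:
    def __init__(self, stored_key: Optional[str] = None):
        self._key = stored_key       # workspace-level key (encrypted at rest)
        self._session_only = False

    def provide_key(self, key: str, session_only: bool = True) -> None:
        self._key = key
        self._session_only = session_only  # if True, the key is never stored

    def run(self, prompt: str) -> str:
        if not self._key:
            raise MissingKeyError("Enter your LLM API key to continue")
        return f"[agent reply to: {prompt}]"  # placeholder for the LLM call

    def end(self) -> None:
        if self._session_only:
            self._key = None  # session-only keys are dropped, not persisted
```

In a real UI the `MissingKeyError` would render the key-entry form rather than an exception, mirroring the gated Streamlit flow linked above.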

Deliverables

1) Custom agent build
2) Channel installation
3) BYO-LLM key flow
4) Admin dashboard + docs
5) Retainer plan

Status: In Review
Board: 💡 Feature Request
Date: 7 months ago
Author: Alvin Alvelino
