Doku is an open-source observability tool engineered for Large Language Models (LLMs). Designed for ease of integration into existing LLM applications, Doku offers unparalleled insights into usage, performance, and overhead, allowing you to analyze, optimize, and scale your AI applications and LLM usage effectively.
Whether you're working with OpenAI, Cohere, or Anthropic, Doku tracks all your LLM requests transparently and surfaces the insights needed to make data-driven decisions. From monitoring usage and understanding latencies to managing costs and collaborating effortlessly, Doku grants you the lens to view your models in high definition.
With Doku, you get:
- In-depth LLM Monitoring: Track every request to LLM platforms like OpenAI with precision, ensuring comprehensive visibility into your model's interactions.
- Granular Usage Insights for your LLM Applications: Assess your LLM's performance and costs with fine-grained control, breaking down metrics by environment (such as staging or production) or application, to optimize for efficiency and scalability.
- Connect to Observability Platforms: Export LLM observability data and insights from Doku to popular observability platforms such as Grafana Cloud or Datadog.
- Team-centric Workflow: Enhance team collaboration with seamless data-sharing capabilities, creating an integrated environment for observability-driven teamwork.
And this is only the beginning. As we grow, so will our list of supported LLM platforms. We're dedicated to continually refining our features to enhance your LLM and Generative AI observability experience.
Jumpstart your journey with Doku by deploying it via our Helm chart, designed to simplify the installation process on any Kubernetes cluster.
To install the Doku Helm chart, follow these steps:
- Add the Doku Helm repository to your Helm setup:

  ```shell
  helm repo add dokulabs https://dokulabs.github.io/helm/
  ```

- Update your Helm repositories to fetch the latest chart information:

  ```shell
  helm repo update
  ```

- Install the Doku chart with the release name `doku`:

  ```shell
  helm install doku dokulabs/doku
  ```
NOTE: As Doku does not yet have a built-in visualization UI, we recommend setting the `observabilityPlatform` configuration in the Helm chart's `values.yaml` file. Doing so lets you visualize the LLM observability data processed by Doku in an external observability platform.

For a detailed list of configurable parameters for the Helm chart, refer to the `values.yaml` file in the Helm chart.
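As a purely illustrative sketch, an `observabilityPlatform` override in `values.yaml` might look like the following. The key names below are placeholders of ours, not the chart's actual schema, so consult the chart's `values.yaml` for the real parameter names:

```yaml
# Illustrative only -- the authoritative key names live in the chart's values.yaml
observabilityPlatform:
  enabled: true
  provider: grafanacloud            # e.g. grafanacloud or datadog
  endpoint: https://<your-stack>.grafana.net
  apiKey: YOUR_PLATFORM_API_KEY
```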
Once Doku is up and running, proceed to generate your first API key:
- Make a POST request to the `/api/keys` endpoint of the Doku service:

  ```shell
  curl -X POST http://<Doku-URL>/api/keys \
    -H 'Authorization: ""' \
    -H 'Content-Type: application/json' \
    -d '{"Name": "YourKeyName"}'
  ```
Note:
- For your initial API call, the `Authorization` header can be set to `""`.
- Store the returned API key securely; a valid API key must be passed in the `Authorization` header for all subsequent API calls.
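The same call can be scripted. Below is a minimal Python sketch that builds the request for the `/api/keys` endpoint; the helper name and the `DOKU_URL` placeholder are ours for illustration, not part of Doku:

```python
import json

DOKU_URL = "http://<Doku-URL>"  # replace with your Doku service URL

def build_key_request(name, api_key=None):
    """Build the URL, headers, and JSON body for POST /api/keys.

    On the very first call no key exists yet, so the Authorization
    header is sent as the literal '""'; afterwards, pass a valid key.
    """
    headers = {
        "Authorization": api_key if api_key else '""',
        "Content-Type": "application/json",
    }
    body = json.dumps({"Name": name})
    return f"{DOKU_URL}/api/keys", headers, body
```

Send the result with any HTTP client, e.g. `requests.post(url, headers=headers, data=body)`.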
With the `dokumetry` SDKs for Python and NodeJS, sending observability data to Doku takes just two lines of code in your application. Once integrated, the SDKs capture and convey LLM usage data directly to your Doku instance, requiring minimal effort on your part.
Install the `dokumetry` Python SDK using pip:

```shell
pip install dokumetry
```
Add the following two lines to your application code:

```python
import dokumetry

dokumetry.init(llm=client, doku_url="YOUR_DOKU_URL", api_key="YOUR_DOKU_TOKEN")
```
```python
from openai import OpenAI
import dokumetry

client = OpenAI(
    api_key="YOUR_OPENAI_KEY"
)

# Pass the `client` object along with your Doku URL and API key;
# all OpenAI calls are then tracked automatically.
dokumetry.init(llm=client, doku_url="YOUR_DOKU_URL", api_key="YOUR_DOKU_TOKEN")

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "What is LLM Observability",
        }
    ],
    model="gpt-3.5-turbo",
)
```
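Rather than embedding `YOUR_DOKU_URL` and `YOUR_DOKU_TOKEN` as string literals, you may prefer to load them from the environment. A minimal sketch; the variable names `DOKU_URL` and `DOKU_API_KEY` are our own convention, not something the SDK requires:

```python
import os

def doku_settings():
    """Return Doku connection settings read from the environment.

    DOKU_URL and DOKU_API_KEY are illustrative variable names; letting
    a missing variable raise KeyError makes misconfiguration obvious
    at startup rather than at the first tracked request.
    """
    return {
        "doku_url": os.environ["DOKU_URL"],
        "api_key": os.environ["DOKU_API_KEY"],
    }

# Then, instead of string literals:
# dokumetry.init(llm=client, **doku_settings())
```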
Refer to the `dokumetry` Python SDK repository for more advanced configurations and use cases.
Install the `dokumetry` NodeJS SDK using npm:

```shell
npm install dokumetry
```
Add the following two lines to your application code:

```javascript
import DokuMetry from 'dokumetry';

DokuMetry.init({llm: openai, dokuURL: "YOUR_DOKU_URL", apiKey: "YOUR_DOKU_TOKEN"})
```
```javascript
import OpenAI from 'openai';
import DokuMetry from 'dokumetry';

const openai = new OpenAI({
  apiKey: 'My API Key', // defaults to process.env["OPENAI_API_KEY"]
});

// Pass the `openai` object along with your Doku URL and API key;
// all OpenAI calls are then tracked automatically.
DokuMetry.init({llm: openai, dokuURL: "YOUR_DOKU_URL", apiKey: "YOUR_DOKU_TOKEN"})

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    messages: [{ role: 'user', content: 'What are the keys to effective observability?' }],
    model: 'gpt-3.5-turbo',
  });
}

main();
```
Refer to the `dokumetry` NodeJS SDK repository for more advanced configurations and use cases.
Doku uses a key-based authentication mechanism to ensure the security of your data. Keep your API keys confidential and manage permissions diligently. Refer to our Security Policy for more details.
We welcome contributions to the Doku project. Please refer to CONTRIBUTING for detailed guidelines on how you can participate.
Doku is available under the Apache-2.0 license.
For support, issues, or feature requests, please open an issue in this repository's GitHub issue tracker.
Join us on this voyage to reshape the future of AI insights. Share your thoughts, suggest features, and explore contributions. Engage with us on GitHub and be part of Doku's community-led innovation.