# SDK Quickstart

Integrate NjiraAI into your AI agent in minutes with the Python and TypeScript SDKs.
## Installation

### Python

```bash
pip install njiraai
```

### TypeScript / JavaScript

```bash
npm install @njiraai/sdk
```
## Quick Start

### Python

```python
from njiraai import NjiraAI

# Initialize the SDK
njira = NjiraAI(
    api_key="your-api-key",
    project_id="your-project-id",
    mode="active",  # or "shadow", "dry_run"
)

# Enforce before the LLM call
pre_decision = await njira.enforce_pre(
    input_data=user_message,
    metadata={"endpoint": "/chat"},
)
if pre_decision["verdict"] == "block":
    raise RuntimeError("Blocked by policy")

# Trace the LLM call
span_id = njira.start_span(
    name="llm-call",
    span_type="llm",
    input_data=prompt,
)
response = await call_llm(prompt)
njira.end_span(span_id, output=response)

# Enforce after the LLM call
post_decision = await njira.enforce_post(output=response)
if post_decision["verdict"] == "block":
    raise RuntimeError("Output blocked")
```
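Both enforcement checks above follow the same decide-then-raise pattern. A small helper (hypothetical, not part of the SDK) can consolidate it, assuming decisions are plain dicts with a `verdict` key as shown:

```python
def require_allowed(decision: dict, context: str = "request") -> dict:
    """Raise if a decision blocks; otherwise pass it through unchanged.

    Assumes the dict shape used in the quickstart:
    {"verdict": "allow" | "block", ...}.
    """
    if decision.get("verdict") == "block":
        raise RuntimeError(f"Blocked by policy ({context})")
    return decision
```

With this in place, `require_allowed(pre_decision, "input")` and `require_allowed(post_decision, "output")` replace the two `if` blocks.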
### TypeScript

```typescript
import { NjiraAI } from "@njiraai/sdk";

// Initialize the SDK
const njira = new NjiraAI({
  apiKey: process.env.NJIRA_API_KEY!,
  projectId: process.env.NJIRA_PROJECT_ID!,
  mode: "active", // or "shadow", "dry_run"
});

// Enforce before the LLM call
const preDecision = await njira.enforcePre({
  input: userMessage,
  metadata: { endpoint: "/chat" },
});
if (preDecision.verdict === "block") {
  throw new Error("Blocked by policy");
}

// Trace the LLM call
const spanId = njira.trace.startSpan({
  name: "llm-call",
  type: "llm",
  input: prompt,
});
const response = await callLLM(prompt);
njira.trace.endSpan(spanId, { output: response });

// Enforce after the LLM call
const postDecision = await njira.enforcePost({ output: response });
if (postDecision.verdict === "block") {
  throw new Error("Output blocked");
}
```
## Framework Integrations

### LangChain (Python)

```python
from njiraai import NjiraAI
from njiraai_langchain import NjiraCallbackHandler

njira = NjiraAI(api_key="...", project_id="...")
handler = NjiraCallbackHandler(njira)

llm.invoke("Hello", config={"callbacks": [handler]})
```
### LangChain.js (TypeScript)

```typescript
import { NjiraAI } from "@njiraai/sdk";
import { NjiraCallbackHandler } from "@njiraai/langchain";

const njira = new NjiraAI({ apiKey: "...", projectId: "..." });
const handler = new NjiraCallbackHandler(njira);

await llm.invoke("Hello", { callbacks: [handler] });
```
### CrewAI (Python)

```python
from njiraai import NjiraAI
from njiraai_crewai import NjiraToolHooks

njira = NjiraAI(api_key="...", project_id="...")
hooks = NjiraToolHooks(njira)

# Attach hooks to your crew/agent tooling.
# Hooks provide before_tool_call and after_tool_call methods.
```
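To show where those two hook methods fit, here is a rough sketch of wrapping a tool invocation. The method names come from the comment above; the argument shapes (tool name, positional/keyword args, result) are illustrative assumptions, not the integration's documented signatures:

```python
def run_tool_with_hooks(hooks, tool_name, tool_fn, *args, **kwargs):
    """Invoke a tool with before/after hooks around it.

    `hooks` is any object exposing before_tool_call and after_tool_call;
    the argument shapes here are assumptions for illustration.
    """
    hooks.before_tool_call(tool_name, args, kwargs)  # enforce/trace before the call
    result = tool_fn(*args, **kwargs)
    hooks.after_tool_call(tool_name, result)         # enforce/trace the result
    return result
```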
## Web Framework Middleware

### FastAPI (Python)

```python
from fastapi import FastAPI
from njiraai.middleware import create_middleware

app = FastAPI()
app.add_middleware(create_middleware(njira))
```
### Express (TypeScript)

```typescript
import express from "express";

const app = express();
app.use(njira.middleware.express());
```
### Next.js (TypeScript)

```typescript
export const POST = njira.nextRoute(async (request: Request) => {
  const { message } = await request.json();
  const decision = await njira.enforcePre({ input: message });
  if (decision.verdict === "block") {
    return Response.json({ ok: false }, { status: 403 });
  }
  return Response.json({ ok: true });
});
```
## Modes

- `shadow`: compute decisions but never block (ideal for testing policies before enforcing them)
- `active`: enforce blocks and modifications
- `dry_run`: no network calls; local allow decisions
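In practice, the difference between modes comes down to whether a `block` verdict is honored. A minimal sketch of those semantics (an illustration of the list above, not the SDK's actual implementation):

```python
def should_block(mode: str, verdict: str) -> bool:
    """Return True only when a block verdict should actually be enforced."""
    # shadow still computes verdicts but never blocks;
    # dry_run skips the network and locally allows everything.
    if mode in ("shadow", "dry_run"):
        return False
    # active: honor the verdict
    return verdict == "block"
```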