Create a chat API endpoint that handles streaming responses from the C1 model.
1. Install required dependencies

Install the necessary packages for your backend API:
npm install openai @crayonai/stream
2. Create the message store

First, create a simple in-memory message store to manage conversation history. The store keeps the full message list for a given threadId, including messages that are never sent to the client, such as tool call messages.
app/api/chat/messageStore.ts
import OpenAI from "openai";

export type DBMessage = OpenAI.Chat.ChatCompletionMessageParam & {
  id?: string;
};

const messagesStore: {
  [threadId: string]: DBMessage[];
} = {};

export const getMessageStore = (threadId: string) => {
  if (!messagesStore[threadId]) {
    messagesStore[threadId] = [];
  }
  const messageList = messagesStore[threadId];
  return {
    addMessage: (message: DBMessage) => {
      messageList.push(message);
    },
    getOpenAICompatibleMessageList: () => {
      return messageList.map((m) => {
        const message = { ...m };
        delete message.id;
        return message;
      });
    },
  };
};
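To illustrate how the store behaves, here is a standalone sketch. The store logic from above is repeated inline (with the OpenAI type replaced by a minimal DBMessage shape) so the snippet runs on its own; in the app you would simply import getMessageStore from the module above.

```typescript
// Standalone sketch of the message store. The `id` field is kept for
// server-side bookkeeping but stripped before the list goes to the model.
type DBMessage = { role: string; content: string; id?: string };

const messagesStore: { [threadId: string]: DBMessage[] } = {};

const getMessageStore = (threadId: string) => {
  if (!messagesStore[threadId]) {
    messagesStore[threadId] = [];
  }
  const messageList = messagesStore[threadId];
  return {
    addMessage: (message: DBMessage) => {
      messageList.push(message);
    },
    getOpenAICompatibleMessageList: () =>
      messageList.map((m) => {
        // Drop the `id` field so the payload matches what the API expects.
        const { id, ...rest } = m;
        return rest;
      }),
  };
};

const store = getMessageStore("demo-thread");
store.addMessage({ role: "user", content: "Hello", id: "msg-1" });
store.addMessage({ role: "assistant", content: "Hi there!", id: "resp-1" });

console.log(JSON.stringify(store.getOpenAICompatibleMessageList()));
```

Note that both messages come back without their `id` fields, which is exactly what `getOpenAICompatibleMessageList` is for in the route handler below.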
3. Create the chat endpoint

Create the main API endpoint that handles incoming chat requests with streaming:
app/api/chat/route.ts
import { NextRequest, NextResponse } from "next/server";
import OpenAI from "openai";
import { transformStream } from "@crayonai/stream";
import { DBMessage, getMessageStore } from "./messageStore";

export async function POST(req: NextRequest) {
  const { prompt, threadId, responseId } = (await req.json()) as {
    prompt: DBMessage;
    threadId: string;
    responseId: string;
  };

  // Initialize the OpenAI client
  const client = new OpenAI({
    baseURL: "https://api.thesys.dev/v1/embed/",
    apiKey: process.env.THESYS_API_KEY,
  });

  // Get message store and add user message
  const messageStore = getMessageStore(threadId);
  messageStore.addMessage(prompt);

  // Create streaming chat completion
  const llmStream = await client.chat.completions.create({
    model: "c1/anthropic/claude-sonnet-4/v-20251230",
    messages: messageStore.getOpenAICompatibleMessageList(),
    stream: true,
  });

  // Transform the response stream
  const responseStream = transformStream(
    llmStream,
    (chunk) => {
      return chunk.choices[0].delta.content;
    },
    {
      onEnd: ({ accumulated }) => {
        const message = accumulated.filter((message) => message).join("");
        messageStore.addMessage({
          role: "assistant",
          content: message,
          id: responseId,
        });
      },
    }
  ) as ReadableStream;

  // Return the streaming response
  return new NextResponse(responseStream, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache, no-transform",
      Connection: "keep-alive",
    },
  });
}
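On the client side, the endpoint's response body arrives as a plain text stream. A minimal sketch of consuming it is shown below; the `readStreamToText` helper and the hand-built demo stream are illustrative only, standing in for `response.body` from a real `fetch("/api/chat", { method: "POST", ... })` call so the snippet runs standalone.

```typescript
// Accumulate a streamed text response body into a single string.
async function readStreamToText(
  stream: ReadableStream<Uint8Array>
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // `stream: true` handles multi-byte characters split across chunks.
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

// Demo: a fake streamed response emitting three chunks, in place of a
// real network response from the chat endpoint.
const demoStream = new ReadableStream<Uint8Array>({
  start(controller) {
    const encoder = new TextEncoder();
    for (const chunk of ["Hello", ", ", "world!"]) {
      controller.enqueue(encoder.encode(chunk));
    }
    controller.close();
  },
});

readStreamToText(demoStream).then((text) => {
  console.log(text); // → Hello, world!
});
```

In a real client you would typically render each chunk as it arrives rather than waiting for the full string, but the reader/decoder loop is the same.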
4. Set your API key

Make sure to set your Thesys API key as an environment variable.

Shell:
export THESYS_API_KEY=<your-api-key>

Next.js (.env.local):
THESYS_API_KEY=<your-api-key>
Your API endpoint is now ready to handle streaming chat conversations with the C1 model!
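To sanity-check the endpoint, you can send it a request with the JSON shape that route.ts expects (prompt, threadId, responseId). This is an illustrative invocation, assuming the Next.js dev server is running on localhost:3000:

```shell
# -N disables curl's output buffering so streamed chunks print as they arrive.
curl -N http://localhost:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": { "role": "user", "content": "Hello!" },
    "threadId": "thread-1",
    "responseId": "resp-1"
  }'
```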