As the title says, I have a custom REST API that handles LLM calls. Everything runs inside EC2 containers because we don't want to expose company data.
I want to connect the BlockNote AI component to my endpoint so that I can route the calls through my backend to our internally hosted LLMs. The AI SDK has no clear way of doing this. I tried using createBlockNoteAIClient, but how can I plug this AI client into the editor?
export const customBlocknoteAIClient = createBlockNoteAIClient({
  baseURL: `${API_URL}/canvas/block_level_call`,
  apiKey: "",
});
const editor = useCreateBlockNote({
  initialContent: placeholderText,
  dropCursor: ({ editor }) => createCustomMDImagePlugin(editor, chatSessionId!),
  schema: customSchema,
  codeBlock,
  dictionary: {
    ...en,
    ai: aiEn, // add default translations for the AI extension
  },
  // Register the AI extension
  extensions: [
    createAIExtension({
      executor: async (opts) => {
        console.log({ opts });
        // You may need to extract blockId, sessionId, prompt, etc. from options or context
        const { blockId, sessionId, blockType, user, prompt } = opts;
        // Call your backend
        const response = await apiService.blockLevelCanvasCall(sessionId, prompt);
        // BlockNote expects a Block or array of Blocks as the result
        return response;
      },
    }),
  ],
});
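For reference, the closest I can piece together from the docs is the sketch below, which skips createBlockNoteAIClient entirely and points an AI SDK provider straight at my backend via createAIExtension({ model }). This only works if my endpoint speaks the OpenAI chat-completions protocol, which it currently doesn't; the route, provider name, model id and the CanvasEditor component are placeholders I made up for illustration.

  import { useCreateBlockNote } from "@blocknote/react";
  import { BlockNoteView } from "@blocknote/mantine";
  import { en } from "@blocknote/core/locales";
  import { createAIExtension } from "@blocknote/xl-ai";
  import { en as aiEn } from "@blocknote/xl-ai/locales";
  import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

  // Placeholder: assumes my backend exposes an OpenAI-compatible
  // chat-completions route under this base URL (the /v1 path is a guess).
  const provider = createOpenAICompatible({
    name: "internal-llm", // arbitrary provider name for the AI SDK
    baseURL: `${API_URL}/v1`,
    apiKey: "unused", // auth is handled inside our VPC, so this is a dummy value
  });

  // Placeholder model id: whatever the internal backend expects.
  const model = provider("internal-model");

  export function CanvasEditor() {
    const editor = useCreateBlockNote({
      dictionary: {
        ...en,
        ai: aiEn,
      },
      extensions: [
        createAIExtension({
          model, // the AI SDK model defined above, routed through my backend
        }),
      ],
    });

    return <BlockNoteView editor={editor} />;
  }

Is something like this the intended way to use a self-hosted model, or is createBlockNoteAIClient required somewhere in the chain?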