
Commit

feat: lafClaude
c121914yu committed May 4, 2023
1 parent 3c8f387 commit 0d6897e
Showing 22 changed files with 325 additions and 229 deletions.
1 change: 0 additions & 1 deletion package.json
@@ -26,7 +26,6 @@
"axios": "^1.3.3",
"crypto": "^1.0.1",
"dayjs": "^1.11.7",
"delay": "^5.0.0",
"eventsource-parser": "^0.1.0",
"formidable": "^2.1.1",
"framer-motion": "^9.0.6",
7 changes: 0 additions & 7 deletions pnpm-lock.yaml

Generated lockfile; diff not rendered.

32 changes: 17 additions & 15 deletions public/docs/intro.md
@@ -3,39 +3,41 @@
[Git repository](https://github.com/c121914yu/FastGPT)

### Community group / feedback

The group QR code is full; add the alternate account below and you will be pulled into the group periodically.
WeChat ID: fastgpt123
![](/imgs/wx300.jpg)


### Quick start

1. Register an account with your phone number.
2. Go to the account page and add a linked account. Currently only openai accounts can be linked: go to the openai website and paste in your API Key.
3. If you fill in your own openai account, your key is used directly; otherwise you pay to use the platform's account.
4. Go to the model page and create a model; plain ChatGPT is recommended.
5. Click 【对话】 (Chat) in the model list to start chatting via the API.

### Pricing

If you use your own API Key, you are not billed; the detailed bill is visible on the account page. Conversations that use the chatGPT model alone have a single billing item. When a knowledge base is used, there are two billing items: **conversation** and **index** generation.

| Billing item | Price: CNY / 1K tokens (context included) |
| --- | --- |
| claude - conversation | Free |
| chatgpt - conversation | 0.03 |
| Knowledge base - conversation | 0.03 |
| Knowledge base - index | 0.004 |
| File splitting | 0.03 |


### Custom prompts

1. Open the model edit page.
2. Adjust the temperature and the prompt.
3. Chat with that model. The prompt and temperature are injected automatically on every conversation, which makes it easy to manage your personal models. Presetting the 5–10 scenarios you use most often is recommended.

### Knowledge base

1. Select 【知识库】 (Knowledge base) when creating the model.
2. Open the model edit page.
3. Import data, either manually or from a file. File import automatically calls chatGPT to understand the file content and build the knowledge base.
4. Chat with that model.

Note: conversations with a knowledge base model consume tokens faster.
1 change: 1 addition & 0 deletions src/api/chat.ts
@@ -24,6 +24,7 @@ export const delChatHistoryById = (id: string) => GET(`/chat/removeHistory?id=${
*/
export const postSaveChat = (data: {
modelId: string;
newChatId: '' | string;
chatId: '' | string;
prompts: ChatItemType[];
}) => POST<string>('/chat/saveChat', data);
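For orientation, a hedged sketch of how a caller might use the extended `postSaveChat` (only `modelId`, `newChatId`, `chatId`, and `prompts` come from the diff; the import paths are assumptions):

```ts
import type { ChatItemType } from '@/types/chat'; // assumed type path
import { postSaveChat } from '@/api/chat';        // assumed alias path

// Illustrative only: persist one finished exchange.
async function saveExchange(
  modelId: string,
  chatId: string,    // '' when this is the first message of a new conversation
  newChatId: string, // id pre-allocated by the server and returned via response header
  prompts: ChatItemType[]
) {
  // postSaveChat resolves with the chat id the record was saved under
  const savedChatId = await postSaveChat({ modelId, newChatId, chatId, prompts });
  return savedChatId;
}
```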
98 changes: 47 additions & 51 deletions src/api/fetch.ts
@@ -1,5 +1,5 @@
import { getToken } from '../utils/user';
import { SYSTEM_PROMPT_PREFIX } from '@/constants/chat';
import { SYSTEM_PROMPT_HEADER, NEW_CHATID_HEADER } from '@/constants/chat';

interface StreamFetchProps {
url: string;
@@ -8,60 +8,56 @@ interface StreamFetchProps {
abortSignal: AbortController;
}
export const streamFetch = ({ url, data, onMessage, abortSignal }: StreamFetchProps) =>
new Promise<{ responseText: string; systemPrompt: string }>(async (resolve, reject) => {
try {
const res = await fetch(url, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: getToken() || ''
},
body: JSON.stringify(data),
signal: abortSignal.signal
});
const reader = res.body?.getReader();
if (!reader) return;
new Promise<{ responseText: string; systemPrompt: string; newChatId: string }>(
async (resolve, reject) => {
try {
const res = await fetch(url, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: getToken() || ''
},
body: JSON.stringify(data),
signal: abortSignal.signal
});
const reader = res.body?.getReader();
if (!reader) return;

const decoder = new TextDecoder();
let responseText = '';
let systemPrompt = '';
const decoder = new TextDecoder();

const read = async () => {
try {
const { done, value } = await reader?.read();
if (done) {
if (res.status === 200) {
resolve({ responseText, systemPrompt });
} else {
const parseError = JSON.parse(responseText);
reject(parseError?.message || '请求异常');
}
const systemPrompt = decodeURIComponent(res.headers.get(SYSTEM_PROMPT_HEADER) || '');
const newChatId = decodeURIComponent(res.headers.get(NEW_CHATID_HEADER) || '');

return;
}
let text = decoder.decode(value).replace(/<br\/>/g, '\n');
// check system prompt
if (text.includes(SYSTEM_PROMPT_PREFIX)) {
const arr = text.split(SYSTEM_PROMPT_PREFIX);
systemPrompt = arr.pop() || '';
let responseText = '';

text = arr.join('');
}
responseText += text;
onMessage(text);
const read = async () => {
try {
const { done, value } = await reader?.read();
if (done) {
if (res.status === 200) {
resolve({ responseText, systemPrompt, newChatId });
} else {
const parseError = JSON.parse(responseText);
reject(parseError?.message || '请求异常');
}

read();
} catch (err: any) {
if (err?.message === 'The user aborted a request.') {
return resolve({ responseText, systemPrompt });
return;
}
const text = decoder.decode(value).replace(/<br\/>/g, '\n');
responseText += text;
onMessage(text);
read();
} catch (err: any) {
if (err?.message === 'The user aborted a request.') {
return resolve({ responseText, systemPrompt, newChatId });
}
reject(typeof err === 'string' ? err : err?.message || '请求异常');
}
reject(typeof err === 'string' ? err : err?.message || '请求异常');
}
};

read();
} catch (err: any) {
console.log(err, '====');
reject(typeof err === 'string' ? err : err?.message || '请求异常');
};
read();
} catch (err: any) {
console.log(err, '====');
reject(typeof err === 'string' ? err : err?.message || '请求异常');
}
}
});
);
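The rewritten `streamFetch` no longer scans the stream for a `SYSTEM_PROMPT:` prefix; it reads the system prompt and a pre-allocated chat id from URI-encoded response headers. A minimal sketch of the server-side counterpart this implies, assuming a Next.js API route (the helper name and handler shape are illustrative, not the repository's actual code):

```ts
import type { NextApiResponse } from 'next';
import { SYSTEM_PROMPT_HEADER, NEW_CHATID_HEADER } from '@/constants/chat';

// Illustrative: publish metadata as ASCII-safe headers before streaming,
// so the body can carry nothing but the model's answer text.
function writeStreamHeaders(res: NextApiResponse, systemPrompt: string, newChatId: string) {
  res.setHeader(SYSTEM_PROMPT_HEADER, encodeURIComponent(systemPrompt));
  res.setHeader(NEW_CHATID_HEADER, encodeURIComponent(newChatId));
  res.setHeader('Content-Type', 'text/event-stream;charset=utf-8');
  // subsequent res.write(chunk) calls then stream only the response text
}
```

Headers must be ASCII-safe, which is why both sides round-trip the values through `encodeURIComponent`/`decodeURIComponent`.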
3 changes: 2 additions & 1 deletion src/constants/chat.ts
@@ -1,4 +1,5 @@
export const SYSTEM_PROMPT_PREFIX = 'SYSTEM_PROMPT:';
export const SYSTEM_PROMPT_HEADER = 'System-Prompt-Header';
export const NEW_CHATID_HEADER = 'Chat-Id-Header';

export enum ChatRoleEnum {
System = 'System',
22 changes: 20 additions & 2 deletions src/constants/model.ts
@@ -8,13 +8,17 @@ export enum OpenAiChatEnum {
'GPT4' = 'gpt-4',
'GPT432k' = 'gpt-4-32k'
}
export enum ClaudeEnum {
'Claude' = 'Claude'
}

export type ChatModelType = `${OpenAiChatEnum}`;
export type ChatModelType = `${OpenAiChatEnum}` | `${ClaudeEnum}`;

export type ChatModelItemType = {
chatModel: ChatModelType;
name: string;
contextMaxToken: number;
systemMaxToken: number;
maxTemperature: number;
price: number;
};
@@ -24,26 +28,40 @@ export const ChatModelMap = {
chatModel: OpenAiChatEnum.GPT35,
name: 'ChatGpt',
contextMaxToken: 4096,
systemMaxToken: 3000,
maxTemperature: 1.5,
price: 3
},
[OpenAiChatEnum.GPT4]: {
chatModel: OpenAiChatEnum.GPT4,
name: 'Gpt4',
contextMaxToken: 8000,
systemMaxToken: 4000,
maxTemperature: 1.5,
price: 30
},
[OpenAiChatEnum.GPT432k]: {
chatModel: OpenAiChatEnum.GPT432k,
name: 'Gpt4-32k',
contextMaxToken: 32000,
systemMaxToken: 4000,
maxTemperature: 1.5,
price: 30
},
[ClaudeEnum.Claude]: {
chatModel: ClaudeEnum.Claude,
name: 'Claude(免费体验)',
contextMaxToken: 9000,
systemMaxToken: 2500,
maxTemperature: 1,
price: 0
}
};

export const chatModelList: ChatModelItemType[] = [ChatModelMap[OpenAiChatEnum.GPT35]];
export const chatModelList: ChatModelItemType[] = [
ChatModelMap[OpenAiChatEnum.GPT35],
ChatModelMap[ClaudeEnum.Claude]
];

export enum ModelStatusEnum {
running = 'running',
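Each model entry now carries a `systemMaxToken` budget alongside `contextMaxToken`. A hedged sketch of how such a budget might be applied when assembling a system prompt (the tokenizer and the truncation strategy are stand-ins, not the project's actual logic):

```ts
import { ChatModelMap, type ChatModelType } from '@/constants/model';

// Illustrative: shrink a system prompt until it fits the model's systemMaxToken
// budget. countTokens stands in for whatever tokenizer the project really uses.
function clampSystemPrompt(
  model: ChatModelType,
  systemPrompt: string,
  countTokens: (text: string) => number
): string {
  const { systemMaxToken } = ChatModelMap[model];
  let clamped = systemPrompt;
  while (clamped.length > 0 && countTokens(clamped) > systemMaxToken) {
    clamped = clamped.slice(0, Math.floor(clamped.length * 0.9)); // drop ~10% per pass
  }
  return clamped;
}
```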
22 changes: 12 additions & 10 deletions src/pages/api/chat/chat.ts
@@ -42,11 +42,12 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
await connectToDatabase();
let startTime = Date.now();

const { model, showModelDetail, content, userApiKey, systemApiKey, userId } = await authChat({
modelId,
chatId,
authorization
});
const { model, showModelDetail, content, userOpenAiKey, systemAuthKey, userId } =
await authChat({
modelId,
chatId,
authorization
});

const modelConstantsData = ChatModelMap[model.chat.chatModel];

Expand All @@ -56,8 +57,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
// a knowledge base search is used
if (model.chat.useKb) {
const { code, searchPrompt } = await searchKb({
userApiKey,
systemApiKey,
userOpenAiKey,
prompts,
similarity: ModelVectorSearchModeMap[model.chat.searchMode]?.similarity,
model,
@@ -86,10 +86,12 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)

// send the request
const { streamResponse } = await modelServiceToolMap[model.chat.chatModel].chatCompletion({
apiKey: userApiKey || systemApiKey,
apiKey: userOpenAiKey || systemAuthKey,
temperature: +temperature,
messages: prompts,
stream: true
stream: true,
res,
chatId
});

console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
@@ -108,7 +110,7 @@

// only bill when the platform's key is used
pushChatBill({
isPay: !userApiKey,
isPay: !userOpenAiKey,
chatModel: model.chat.chatModel,
userId,
chatId,
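`modelServiceToolMap` is keyed by `model.chat.chatModel`, so the Claude model added in `constants/model.ts` needs an entry whose `chatCompletion` accepts the new `res` and `chatId` arguments and can stream straight to the response. A rough sketch of the shape this implies; the type names and the `ChatItemType` import path are assumptions, and only the parameter names and `streamResponse` come from the diff:

```ts
import type { NextApiResponse } from 'next';
import type { ChatItemType } from '@/types/chat'; // assumed path

// Illustrative shape only: one entry per ChatModelType, each exposing a
// chatCompletion that can stream directly to the Next.js response.
type ChatCompletionParams = {
  apiKey: string;
  temperature: number;
  messages: ChatItemType[];
  stream: boolean;
  res: NextApiResponse;
  chatId?: string;
};

type ModelServiceTool = {
  chatCompletion: (params: ChatCompletionParams) => Promise<{ streamResponse: unknown }>;
};

type ModelServiceToolMap = Record<string, ModelServiceTool>;
```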
4 changes: 3 additions & 1 deletion src/pages/api/chat/saveChat.ts
@@ -9,7 +9,8 @@ import mongoose from 'mongoose';
/* Store chat content */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { chatId, modelId, prompts } = req.body as {
const { chatId, modelId, prompts, newChatId } = req.body as {
newChatId: '' | string;
chatId: '' | string;
modelId: string;
prompts: ChatItemType[];
@@ -35,6 +36,7 @@
// no chatId: create a new conversation
if (!chatId) {
const { _id } = await Chat.create({
_id: newChatId ? new mongoose.Types.ObjectId(newChatId) : undefined,
userId,
modelId,
content,
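`newChatId` arrives in the request body and is passed straight to `new mongoose.Types.ObjectId(...)`, which throws on malformed input. A small hedged sketch of a guard that could sit in front of that call (an illustration, not part of the commit):

```ts
import mongoose from 'mongoose';

// Illustrative: use the client-supplied id only when it is a valid ObjectId,
// otherwise let Mongoose generate _id itself.
function toChatObjectId(newChatId: string): mongoose.Types.ObjectId | undefined {
  return newChatId && mongoose.Types.ObjectId.isValid(newChatId)
    ? new mongoose.Types.ObjectId(newChatId)
    : undefined;
}
```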
1 change: 0 additions & 1 deletion src/pages/api/openapi/chat/chat.ts
@@ -65,7 +65,6 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
const similarity = ModelVectorSearchModeMap[model.chat.searchMode]?.similarity || 0.22;

const { code, searchPrompt } = await searchKb({
systemApiKey: apiKey,
prompts,
similarity,
model,
1 change: 0 additions & 1 deletion src/pages/api/openapi/chat/lafGpt.ts
@@ -116,7 +116,6 @@

// get the prompt matched by the vector search
const { searchPrompt } = await searchKb({
systemApiKey: apiKey,
similarity: ModelVectorSearchModeMap[model.chat.searchMode]?.similarity,
prompts,
model,
4 changes: 2 additions & 2 deletions src/pages/chat/index.tsx
@@ -206,7 +206,7 @@ const Chat = ({ modelId, chatId }: { modelId: string; chatId: string }) => {
};

// streaming request to fetch the data
const { responseText, systemPrompt } = await streamFetch({
let { responseText, systemPrompt, newChatId } = await streamFetch({
url: '/api/chat/chat',
data: {
prompt,
@@ -234,10 +234,10 @@
return;
}

let newChatId = '';
// save chat record
try {
newChatId = await postSaveChat({
newChatId, // if newChatId is provided, the conversation is created with this id
modelId,
chatId,
prompts: [