[.NET] Return ChatCompletions instead of ChatResponseMessage for token usage. (microsoft#2545)

* update

* update

* update

* update

* update

* add sample project

* revert notebook change back

* update

* update interactive version

* add nuget package

* refactor Message

* update example

* add azure nightly build pipeline

* Set up CI with Azure Pipelines

[skip ci]

* Update nightly-build.yml for Azure Pipelines

* add dotnet interactive package

* add dotnet interactive package

* update pipeline

* add nuget feed back

* remove dotnet-tool feed

* remove dotnet-tool feed comment

* update pipeline

* update build name

* Update nightly-build.yml

* Delete .github/workflows/dotnet-ci.yml

* update

* add working_dir to use step

* add initiateChat api

* update oai package

* Update dotnet-build.yml

* Update dotnet-run-openai-test-and-notebooks.yml

* update build workflow

* update build workflow

* update nuget feed

* update nuget feed

* update aoai and sk version

* Update InteractiveService.cs

* add support for GPT 4V

* add DalleAndGPT4V example

* update example

* add user proxy agent

* add readme

* bump version

* update example

* add dotnet interactive hook

* update

* update tests

* add website

* update index.md

* add docs

* update doc

* move sk dependency out of core package

* update doc

* Update Use-function-call.md

* add type safe function call document

* update doc

* update doc

* add doc

* Update Use-function-call.md

* add GenerateReplyOptions

* remove IChatLLM

* update version

* update doc

* update website

* add sample

* fix link

* add middleware agent

* clean up doc

* bump version

* update doc

* update

* add Other Language

* remove warnings

* add sign.props

* add sign step

* fix pipeline

* auth

* real sign

* disable PR trigger

* update

* disable PR trigger

* use microbuild machine

* update build pipeline to add publish to internal feed

* add internal feed

* fix build pipeline

* add dotnet prefix

* update ci

* add build number

* update run number

* update source

* update token

* update

* remove adding source

* add publish to github package

* try again

* try again

* ask for package write permission

* disable package when branch is not main

* update

* implement streaming agent

* add test for streaming function call

* update

* fix microsoft#1588

* enable PR check for dotnet branch

* add website readme

* only publish to dotnet feed when pushing to dotnet branch

* remove openai-test-and-notebooks workflow

* update readme

* update readme

* update workflow

* update getting-start

* upgrade test and sample projects to use .NET 8

* fix global.json format && make loadFromConfig API internal only before implementing

* update

* add support for LM studio

* add doc

* Update README.md

* add push and workflow_dispatch trigger

* disable PR for main

* add dotnet env

* Update Installation.md

* add nuget

* refer to newtonsoft 13

* update branch to dotnet in docfx

* Update Installation.md

* pull out HumanInputMiddleware and FunctionCallMiddleware

* fix tests

* add link to sample folder

* refactor message

* refactor over IMessage

* add more tests

* add more test

* fix build error

* rename header

* add semantic kernel project

* update sk example

* update dotnet version

* add LMStudio function call example

* rename LLaMAFunction

* remove dotnet run openai test and notebook workflow

* add FunctionContract and test

* update doc

* add documents

* add workflow

* update

* update sample

* fix warning in test

* result length can be less than maximumOutputToKeep (microsoft#1804)

* merge with main

* add option to retrieve inner agent and middlewares from MiddlewareAgent

* update doc

* adjust namespace

* update readme

* fix test

* use IMessage

* more updates

* update

* fix test

* add comments

* use FunctionContract to replace FunctionDefinition

* move AutoGen contract to AutoGen.Core

* update installation

* refactor streamingAgent by adding StreamingMessage type

* update sample

* update samples

* update

* update

* add test

* fix test

* bump version

* add openaichat test

* update

* Update Example03_Agent_FunctionCall.cs

* [.Net] improve docs (microsoft#1862)

* add doc

* add doc

* add doc

* add doc

* add doc

* add doc

* update

* fix test error

* fix some error

* fix test

* fix test

* add more tests

* edits

---------

Co-authored-by: ekzhu <[email protected]>

* [.Net] Add fill form example (microsoft#1911)

* add form filler example

* update

* fix ci error

* [.Net] Add using AutoGen.Core in source generator (microsoft#1983)

* fix using namespace bug in source generator

* remove using in sourcegenerator test

* disable PR test

* Add .idea to .gitignore (microsoft#1988)

* [.Net] publish to nuget.org feed (microsoft#1987)

* publish to nuget

* update ci

* update dotnet-release

* update release pipeline

* add source

* remove empty symbol package

* update pipeline

* remove tag

* update installation guide

* [.Net] Rename some classes && APIs based on doc review (microsoft#1980)

* rename sequential group chat to round robin group chat

* rename to sendInstruction

* rename workflow to graph

* rename some api

* bump version

* move Graph to GroupChat folder

* rename fill application example

* [.Net] Improve package description (microsoft#2161)

* add discord link and update package description

* Update getting-start.md

* [.Net] Fix document comment from the most recent AutoGen.Net engineer sync (microsoft#2231)

* update

* rename RegisterPrintMessageHook to RegisterPrintMessage

* update website

* update update.md

* fix link error

* [.Net] Enable JsonMode and deterministic output in AutoGen.OpenAI OpenAIChatAgent (microsoft#2347)

* update openai version && add sample for json output

* add example in web

* update update.md

* update image url

* [.Net] Add AutoGen.Mistral package (microsoft#2330)

* add mistral client

* enable streaming support

* add mistralClientAgent

* add test for function call

* add extension

* add support for toolcall and toolcall result message

* add support for aggregate message

* implement streaming function call

* track (microsoft#2471)

* [.Net] add mistral example (microsoft#2482)

* update existing examples to use messageConnector

* add overview

* add function call document

* add example 14

* add mistral token count usage example

* update version

* Update dotnet-release.yml (microsoft#2488)

* update

* revert gitattributes

* Return ChatCompletions instead of ChatResponseMessage for token usage.

---------

Co-authored-by: XiaoYun Zhang <[email protected]>
Co-authored-by: Xiaoyun Zhang <[email protected]>
Co-authored-by: mhensen <[email protected]>
Co-authored-by: ekzhu <[email protected]>
Co-authored-by: Krzysztof Kasprowicz <[email protected]>
Co-authored-by: luongdavid <[email protected]>
7 people authored May 2, 2024
1 parent f4a07ff commit 3e69357
Showing 3 changed files with 20 additions and 9 deletions.
dotnet/src/AutoGen.OpenAI/Agent/OpenAIChatAgent.cs (4 changes: 2 additions & 2 deletions)
@@ -84,7 +84,7 @@ public async Task<IMessage> GenerateReplyAsync(
         var settings = this.CreateChatCompletionsOptions(options, messages);
         var reply = await this.openAIClient.GetChatCompletionsAsync(settings, cancellationToken);

-        return new MessageEnvelope<ChatResponseMessage>(reply.Value.Choices.First().Message, from: this.Name);
+        return new MessageEnvelope<ChatCompletions>(reply, from: this.Name);
     }

     public Task<IAsyncEnumerable<IStreamingMessage>> GenerateStreamingReplyAsync(
@@ -101,7 +101,7 @@ private async IAsyncEnumerable<IStreamingMessage> StreamingReplyAsync(
         [EnumeratorCancellation] CancellationToken cancellationToken = default)
     {
         var settings = this.CreateChatCompletionsOptions(options, messages);
-        var response = await this.openAIClient.GetChatCompletionsStreamingAsync(settings);
+        var response = await this.openAIClient.GetChatCompletionsStreamingAsync(settings, cancellationToken);
         await foreach (var update in response.WithCancellation(cancellationToken))
         {
             if (update.ChoiceIndex > 0)
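With this change, callers that talk to OpenAIChatAgent directly (without a message connector) receive the full ChatCompletions payload, so token usage is no longer dropped. A minimal sketch of reading it, assuming the same setup as the test further below; names are illustrative, not part of this commit:

```csharp
// Hedged sketch: `openAIChatAgent` and `chatMessageContent` are assumed
// to be created as in OpenAIChatAgentTest below.
var reply = await openAIChatAgent.SendAsync(chatMessageContent);

if (reply is MessageEnvelope<ChatCompletions> envelope)
{
    var completions = envelope.Content;

    // The assistant text is still reachable through the first choice.
    var text = completions.Choices.First().Message.Content;

    // Token usage is the new information this commit surfaces.
    var usage = completions.Usage;
    Console.WriteLine($"prompt={usage.PromptTokens}, completion={usage.CompletionTokens}, total={usage.TotalTokens}");
}
```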
@@ -98,6 +98,7 @@ public IMessage PostProcessMessage(IMessage message)
             Message => message,
             AggregateMessage<ToolCallMessage, ToolCallResultMessage> => message,
             IMessage<ChatResponseMessage> m => PostProcessMessage(m),
+            IMessage<ChatCompletions> m => PostProcessMessage(m),
             _ => throw new InvalidOperationException("The type of message is not supported. Must be one of TextMessage, ImageMessage, MultiModalMessage, ToolCallMessage, ToolCallResultMessage, Message, IMessage<ChatRequestMessage>, AggregateMessage<ToolCallMessage, ToolCallResultMessage>"),
         };
     }
@@ -129,15 +130,24 @@ public IMessage PostProcessMessage(IMessage message)

     private IMessage PostProcessMessage(IMessage<ChatResponseMessage> message)
     {
-        var chatResponseMessage = message.Content;
+        return PostProcessMessage(message.Content, message.From);
+    }
+
+    private IMessage PostProcessMessage(IMessage<ChatCompletions> message)
+    {
+        return PostProcessMessage(message.Content.Choices[0].Message, message.From);
+    }
+
+    private IMessage PostProcessMessage(ChatResponseMessage chatResponseMessage, string? from)
+    {
         if (chatResponseMessage.Content is string content)
         {
-            return new TextMessage(Role.Assistant, content, message.From);
+            return new TextMessage(Role.Assistant, content, from);
         }

         if (chatResponseMessage.FunctionCall is FunctionCall functionCall)
         {
-            return new ToolCallMessage(functionCall.Name, functionCall.Arguments, message.From);
+            return new ToolCallMessage(functionCall.Name, functionCall.Arguments, from);
         }

         if (chatResponseMessage.ToolCalls.Where(tc => tc is ChatCompletionsFunctionToolCall).Any())
@@ -148,7 +158,7 @@ private IMessage PostProcessMessage(IMessage<ChatResponseMessage> message)

         var toolCalls = functionToolCalls.Select(tc => new ToolCall(tc.Name, tc.Arguments));

-        return new ToolCallMessage(toolCalls, message.From);
+        return new ToolCallMessage(toolCalls, from);
     }

     throw new InvalidOperationException("Invalid ChatResponseMessage");
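Because the connector's pattern match now accepts IMessage<ChatCompletions> as well, agents that register it keep receiving plain TextMessage / ToolCallMessage replies after this change. A hedged sketch of the usual registration, following the AutoGen.Net samples; constructor arguments are illustrative:

```csharp
// Sketch only; argument values are illustrative, not from this commit.
var agent = new OpenAIChatAgent(
        openAIClient: openAIClient,   // an existing Azure.AI.OpenAI client
        name: "assistant",
        modelName: "gpt-3.5-turbo")
    .RegisterMessageConnector()       // applies the connector patched above
    .RegisterPrintMessage();          // pretty-prints replies to the console

// Downstream code still sees TextMessage / ToolCallMessage.
var answer = await agent.SendAsync("Hello");
```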
dotnet/test/AutoGen.Tests/OpenAIChatAgentTest.cs (7 changes: 4 additions & 3 deletions)
@@ -41,9 +41,10 @@ public async Task BasicConversationTestAsync()
         var chatMessageContent = MessageEnvelope.Create(new ChatRequestUserMessage("Hello"));
         var reply = await openAIChatAgent.SendAsync(chatMessageContent);

-        reply.Should().BeOfType<MessageEnvelope<ChatResponseMessage>>();
-        reply.As<MessageEnvelope<ChatResponseMessage>>().From.Should().Be("assistant");
-        reply.As<MessageEnvelope<ChatResponseMessage>>().Content.Role.Should().Be(ChatRole.Assistant);
+        reply.Should().BeOfType<MessageEnvelope<ChatCompletions>>();
+        reply.As<MessageEnvelope<ChatCompletions>>().From.Should().Be("assistant");
+        reply.As<MessageEnvelope<ChatCompletions>>().Content.Choices.First().Message.Role.Should().Be(ChatRole.Assistant);
+        reply.As<MessageEnvelope<ChatCompletions>>().Content.Usage.TotalTokens.Should().BeGreaterThan(0);

         // test streaming
         var streamingReply = await openAIChatAgent.GenerateStreamingReplyAsync(new[] { chatMessageContent });
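Separately, the streaming path now forwards the caller's CancellationToken to GetChatCompletionsStreamingAsync (second hunk of OpenAIChatAgent.cs above), so cancellation can interrupt an in-flight stream. A small hedged sketch reusing the test's names, under the assumption that GenerateStreamingReplyAsync keeps the IStreamingAgent signature (messages, options, cancellationToken):

```csharp
// Illustrative timeout; not part of this commit.
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));

var streamingReply = await openAIChatAgent.GenerateStreamingReplyAsync(
    new[] { chatMessageContent },
    cancellationToken: cts.Token);

await foreach (var update in streamingReply.WithCancellation(cts.Token))
{
    // Each update is a partial IStreamingMessage; cancelling cts now
    // also aborts the underlying OpenAI streaming request.
}
```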
