[9.0] [Obs Ai Assistant] Add system message (#209773) #211397

Merged 1 commit into elastic:9.0 on Feb 17, 2025

Conversation

kibanamachine (Contributor) commented:

Backport

This will backport the following commits from main to 9.0:

Questions? Please refer to the Backport tool documentation.

Fix: System Message Missing in Inference Plugin
Closes elastic#209548
## Summary

A regression was introduced in 8.18
([elastic#199286](elastic#199286)), where the
system message is no longer passed to the inference plugin and,
consequently, the LLM.

Currently, only user messages are being sent, which impacts conversation
guidance and guardrails. The system message is crucial for steering
responses and maintaining contextual integrity.

The filtering of the system message happens here:

https://github.com/elastic/kibana/blob/771a080ffa99e501e72c9cb98c833795769483ae/x-pack/platform/plugins/shared/observability_ai_assistant/server/service/client/index.ts#L510-L512
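
For context, the effect of that code is roughly the following (a minimal illustrative sketch, not the exact source; `MessageRole` is the assistant's message role enum, and the surrounding variable names are assumptions):

```typescript
// Sketch of the regression: the system message is stripped from the conversation
// before the remaining messages are converted and handed to the inference plugin,
// so the LLM never receives the system prompt.
const messagesWithoutSystem = messages.filter(
  (message) => message.message.role !== MessageRole.System
);
```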

## Fix Approach

- Ensure the `system` message is included as a parameter in `inferenceClient.chatComplete`.
```typescript
// Build the shared options once; `system` is now forwarded so the system prompt
// reaches the inference plugin (and therefore the LLM).
const options = {
  connectorId,
  system,
  messages: convertMessagesForInference(messages),
  toolChoice,
  tools,
  functionCalling: (simulateFunctionCalling ? 'simulated' : 'native') as FunctionCallingMode,
};

if (stream) {
  // Streaming: wrap the inference call in an Observable and convert its events
  // into the assistant's streaming event format.
  return defer(() =>
    this.dependencies.inferenceClient.chatComplete({
      ...options,
      stream: true,
    })
  ).pipe(
    convertInferenceEventsToStreamingEvents(),
    instrumentAndCountTokens(name),
    failOnNonExistingFunctionCall({ functions }),
    tap((event) => {
      if (
        event.type === StreamingChatResponseEventType.ChatCompletionChunk &&
        this.dependencies.logger.isLevelEnabled('trace')
      ) {
        this.dependencies.logger.trace(`Received chunk: ${JSON.stringify(event.message)}`);
      }
    }),
    shareReplay()
  ) as TStream extends true
    ? Observable<ChatCompletionChunkEvent | TokenCountEvent | ChatCompletionMessageEvent>
    : never;
} else {
  // Non-streaming: a single chatComplete call resolving to the full response.
  return this.dependencies.inferenceClient.chatComplete({
    ...options,
    stream: false,
  }) as TStream extends true ? never : Promise<ChatCompleteResponse>;
}
```
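
With this change, both the streaming and non-streaming branches spread the same `options` object, so the system message reaches `chatComplete` regardless of whether the call is made with `stream: true` or `stream: false`.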
- Add an API test to verify that the system message is correctly passed to the LLM.
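
For illustration, the assertion such a test needs to make looks roughly like this (a hypothetical Jest-style sketch, not the FTR API test added by the PR; `createTestClient` and the exact option names are placeholders/assumptions):

```typescript
// Hypothetical sketch: stub the inference client and assert that the system
// message is forwarded as the `system` option of chatComplete.
it('forwards the system message to the inference plugin', async () => {
  const chatComplete = jest.fn().mockResolvedValue({ content: '', toolCalls: [] });
  const client = createTestClient({ inferenceClient: { chatComplete } }); // placeholder helper

  await client.chat('test_operation', {
    systemMessage: 'You are a helpful assistant for Elastic Observability.',
    messages: [],
    connectorId: 'test-connector',
    signal: new AbortController().signal,
    stream: false,
  });

  expect(chatComplete).toHaveBeenCalledWith(
    expect.objectContaining({ system: 'You are a helpful assistant for Elastic Observability.' })
  );
});
```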

(cherry picked from commit 0ae28aa)
kibanamachine enabled auto-merge (squash) on February 17, 2025 at 10:18
botelastic bot added the ci:project-deploy-observability and Team:Obs AI Assistant labels on Feb 17, 2025
elasticmachine (Contributor) commented:

Pinging @elastic/obs-ai-assistant (Team:Obs AI Assistant)


🤖 GitHub comments


Just comment with:

  • /oblt-deploy : Deploy a Kibana instance using the Observability test environments.
  • run docs-build : Re-trigger the docs validation. (use unformatted text in the comment!)

elasticmachine (Contributor) commented on Feb 17, 2025

💛 Build succeeded, but was flaky

Failed CI Steps

Test Failures

  • [job] [logs] FTR Configs #58 / Visualize smoke telemetry tests should trigger render event for "agg based" legacy_metric visualization

Metrics [docs]

Public APIs missing comments

Total count of every public API that lacks a comment. Target amount is 0. Run `node scripts/build_api_docs --plugin [yourplugin] --stats comments` for more detailed information.

| id | before | after | diff |
| --- | --- | --- | --- |
| observabilityAIAssistant | 378 | 383 | +5 |

Async chunks

Total size of all lazy-loaded chunks that will be downloaded as the user navigates the app

| id | before | after | diff |
| --- | --- | --- | --- |
| observabilityAIAssistant | 19.7KB | 19.6KB | -39.0B |
| observabilityAIAssistantApp | 249.9KB | 249.6KB | -281.0B |
| searchAssistant | 146.3KB | 146.0KB | -281.0B |
| total | | | -601.0B |

Page load bundle

Size of the bundles that are downloaded on every page load. Target size is below 100kb

| id | before | after | diff |
| --- | --- | --- | --- |
| observabilityAIAssistant | 38.4KB | 38.3KB | -146.0B |

Unknown metric groups

API count

| id | before | after | diff |
| --- | --- | --- | --- |
| observabilityAIAssistant | 380 | 385 | +5 |


cc @arturoliduena

kibanamachine merged commit 522b0d2 into elastic:9.0 on Feb 17, 2025
17 checks passed
Labels: backport, ci:project-deploy-observability, Team:Obs AI Assistant