Description
Expected Behavior
When injecting a ChatClient into a service, I want to log the actual underlying model being used. Because we have multiple ChatClients backed by different models, we want to log which model is in use in the service that consumes the ChatClient. E.g.:
public class QcrIntentAgent {

    @Autowired
    private ChatClient chatClient;

    public void doSomething() {
        log.debug("LLM being used: {}", chatClient.getChatModel().getModel());
        // E.g.:
        // LLM being used: gpt-4.1-nano
        // LLM being used: gpt-4o-mini
        // LLM being used: mistral
        // LLM being used: llama3
        chatClient.prompt(...);
    }
}
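Roughly, the request is for an accessor on ChatClient itself. A minimal sketch of what such an accessor could look like (hypothetical, not part of Spring AI 1.0.0; the method name is a placeholder):

// Hypothetical addition to the ChatClient interface (NOT part of Spring AI 1.0.0).
public interface ChatClient {

    // ... existing factory methods and prompt(...) entry points ...

    /**
     * Placeholder accessor: expose the ChatModel the client was built with,
     * so callers can resolve the configured model name, e.g. via
     * getChatModel().getDefaultOptions().getModel().
     */
    ChatModel getChatModel();
}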
Current Behavior
There is no clean way to get this information other than injecting the ChatModel itself. This results in either injecting both the ChatModel and the ChatClient, or injecting just the model and creating a new ChatClient on the spot.
@Autowired
private ChatClient chatClient;

@Autowired
private ChatModel chatModel;

public void doSomething() {
    log.debug("LLM being used: {}", chatModel.getDefaultOptions().getModel());
    chatClient.prompt(...);
}

// OR

@Autowired
private ChatModel chatModel;

public void doSomething() {
    log.debug("LLM being used: {}", chatModel.getDefaultOptions().getModel());
    ChatClient.builder(chatModel).build().prompt(...);
}
Context
For alternatives, see Current Behavior above. Another workaround is creating a utility class:
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.client.DefaultChatClient;

import lombok.experimental.UtilityClass;

@UtilityClass
public class ChatClientUtils {

    /**
     * Returns the model configured for the ChatClient.
     * This workaround is needed because Spring AI 1.0.0 does not expose the model via a public API.
     * WARNING: This depends on the internal structure of DefaultChatClient.
     */
    public static String getModel(ChatClient chatClient) {
        if (chatClient instanceof DefaultChatClient defaultClient) {
            // Build a throwaway request spec just to read the options the client was configured with.
            DefaultChatClient.DefaultChatClientRequestSpec requestSpec =
                    (DefaultChatClient.DefaultChatClientRequestSpec) defaultClient.prompt("Hello");
            return requestSpec.getChatOptions().getModel();
        } else {
            throw new IllegalArgumentException("Unsupported ChatClient implementation: " + chatClient.getClass().getName());
        }
    }
}
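A usage sketch for the workaround above (assumes the ChatClientUtils class from this issue is on the classpath; still tied to DefaultChatClient internals):

@Autowired
private ChatClient chatClient;

public void doSomething() {
    // Delegates to the internal-API-dependent workaround above.
    log.debug("LLM being used: {}", ChatClientUtils.getModel(chatClient));
    chatClient.prompt(...);
}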