
[AiBundle] Use model FQCN for indexer config, to allow any Symfony\AI\Platform\Model child class #88


Open · wants to merge 1 commit into main

Conversation

welcoMattic (Member) commented on Jul 11, 2025:

| Q            | A       |
|--------------|---------|
| Bug fix?     | yes     |
| New feature? | no      |
| Docs?        | no      |
| Issues       | Fix #59 |
| License      | MIT     |

Before:

ai:
    indexer:
        default:
            platform: 'symfony_ai.platform.mistral'
            model:
                name: 'Embeddings'
                version: 'mistral-embed'

After:

ai:
    indexer:
        default:
            platform: 'symfony_ai.platform.mistral'
            model:
                name: 'Symfony\AI\Platform\Bridge\Mistral\Embeddings'
                version: 'mistral-embed'

This way, any class extending Symfony\AI\Platform\Model can be used as the embedder model, including classes written by users themselves.
The Embeddings model classes of the different providers (OpenAI, Mistral, Gemini, ...) can all be named Embeddings without conflict.
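To illustrate, with FQCN-based lookup a user-land model class could be referenced the same way. The `App\AI\MyEmbeddings` class name below is hypothetical, just a sketch of what a user-defined model extending Symfony\AI\Platform\Model might look like in this config:

```yaml
ai:
    indexer:
        default:
            platform: 'symfony_ai.platform.mistral'
            model:
                # hypothetical user-land class extending Symfony\AI\Platform\Model
                name: 'App\AI\MyEmbeddings'
                version: 'mistral-embed'
```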


Replaces php-llm/llm-chain-bundle#99

welcoMattic force-pushed the fix/config-embeddings-model-name branch from c6d0eef to 77b48a2 on Jul 11, 2025, 13:05
welcoMattic (Member, Author) commented:

Repost original comment from @chr-hertel (php-llm/llm-chain-bundle#99 (comment)):

Yup, that first implementation was def too stupid, true. And I guess the className thing is something that should be supported - especially when you think about user-land model classes, for example.

However, looking at this config:

llm_chain:
    embedder:
        default:
            platform: 'llm_chain.platform.mistral'
            model:
                name: 'PhpLlm\LlmChain\Platform\Bridge\Mistral\Embeddings'
                version: 'mistral-embed'

This is a bit redundant, right? The service llm_chain.platform.mistral would anyway only support one embeddings class. I think we could also solve that by having the Platform expose which models (and their names) it supports, so that we would basically only choose from a subset of those classes.

WDYT?

welcoMattic added the Bug (Something isn't working) and AI Bundle (Issues & PRs about the AI integration bundle) labels on Jul 11, 2025
chr-hertel added the BC Break (Breaking the Backwards Compatibility Promise) label on Jul 12, 2025
chr-hertel (Contributor) commented:

Yes, let's do this to unblock the usage for now - we can work on a slimmer config as a next step.

chr-hertel (Contributor) left a review comment:


Let's break it once for both parameters, and also for the agent config - same issue there.

What do you think about:

  • class instead of name or className
  • name instead of version (version is rather deprecated anyway)

And I guess the docs and demo need an update here as well.
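Under that proposal, the indexer config from the PR description might look something like the following. This is purely illustrative; the final key names were still under discussion at this point:

```yaml
ai:
    indexer:
        default:
            platform: 'symfony_ai.platform.mistral'
            model:
                # proposed rename: `class` replaces `name` (the FQCN),
                # and `name` replaces `version` (the provider model id)
                class: 'Symfony\AI\Platform\Bridge\Mistral\Embeddings'
                name: 'mistral-embed'
```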

chr-hertel (Contributor) commented:

Conflicts after merging #94, but rebase should do the trick hopefully

Labels: AI Bundle · BC Break · Bug · Status: Needs Work
Development

Successfully merging this pull request may close these issues.

[AiBundle] OpenAI Embeddings model used no matter the platform defined on the indexed