diff --git a/README.md b/README.md index 023d0117..56f0ed9a 100644 --- a/README.md +++ b/README.md @@ -6,856 +6,34 @@ Symfony AI -Symfony AI is a set of packages that integrate AI capabilities into PHP applications. +Symfony AI is a set of components that integrate AI capabilities into PHP applications. -## Requirements +## Components -* PHP 8.2 or higher +Symfony AI consists of several lower and higher level **components** and the respective integration **bundles**: -## Installation - -The recommended way to install Symfony AI components is through [Composer](http://getcomposer.org/): - -```bash -composer require symfony/ai-agent -composer require symfony/ai-platform -composer require symfony/ai-store -``` - -When using Symfony Framework, check out the integration bundle symfony/ai-bundle. +* **Components** + * **[Platform](src/platform/README.md)**: A unified interface to various AI platforms like OpenAI, Anthropic, Azure, Google, and more. + * **[Agent](src/agent/README.md)**: Framework for building AI agents that can interact with users and perform tasks. + * **[Store](src/store/README.md)**: Data storage abstraction with indexing and retrieval for AI applications. + * **[MCP SDK](src/mcp-sdk/README.md)**: SDK for [Model Context Protocol](https://modelcontextprotocol.io) enabling communication between AI agents and tools. +* **Bundles** + * **[AI Bundle](src/ai-bundle/README.md)**: Symfony integration for AI Platform, Store and Agent components. + * **[MCP Bundle](src/mcp-bundle/README.md)**: Symfony integration for MCP SDK, allowing them to act as MCP servers or clients. ## Examples -See [the examples folder](examples) to run example implementations using this library. -Depending on the example you need to export different environment variables -for API keys or deployment configurations or create a `.env.local` based on `.env` file. - -To run all examples, use `make run-examples` or `php example` - to run a subgroup like all HuggingFace related examples -use `php example huggingface`. - -For a more sophisticated demo, see the [Symfony Demo Application](https://github.com/php-llm/symfony-demo). - -## Basic Concepts & Usage - -### Models & Platforms - -Symfony AI categorizes two main types of models: **Language Models** and **Embeddings Models**. On top of that, there are -other models, like text-to-speech, image generation, or classification models that are also supported. - -Language Models, like GPT, Claude, and Llama, as essential centerpiece of LLM applications -and Embeddings Models as supporting models to provide vector representations of a text. - -Those models are provided by different **platforms**, like OpenAI, Azure, Google, Replicate, and others. 
- -#### Example Instantiation - -```php -use Symfony\AI\Platform\Bridge\OpenAI\Embeddings; -use Symfony\AI\Platform\Bridge\OpenAI\GPT; -use Symfony\AI\Platform\Bridge\OpenAI\PlatformFactory; - -// Platform: OpenAI -$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']); - -// Language Model: GPT (OpenAI) -$model = new GPT(GPT::GPT_4O_MINI); - -// Embeddings Model: Embeddings (OpenAI) -$embeddings = new Embeddings(); -``` - -#### Supported Models & Platforms - -* Language Models - * [OpenAI's GPT](https://platform.openai.com/docs/models/overview) with [OpenAI](https://platform.openai.com/docs/overview) and [Azure](https://learn.microsoft.com/azure/ai-services/openai/concepts/models) as Platform - * [Anthropic's Claude](https://www.anthropic.com/claude) with [Anthropic](https://www.anthropic.com/) and [AWS](https://aws.amazon.com/bedrock/) as Platform - * [Meta's Llama](https://www.llama.com/) with [Azure](https://learn.microsoft.com/azure/machine-learning/how-to-deploy-models-llama), [Ollama](https://ollama.com/), [Replicate](https://replicate.com/) and [AWS](https://aws.amazon.com/bedrock/) as Platform - * [Google's Gemini](https://gemini.google.com/) with [Google](https://ai.google.dev/) and [OpenRouter](https://www.openrouter.com/) as Platform - * [DeepSeek's R1](https://www.deepseek.com/) with [OpenRouter](https://www.openrouter.com/) as Platform - * [Amazon's Nova](https://nova.amazon.com) with [AWS](https://aws.amazon.com/bedrock/) as Platform - * [Mistral's Mistral](https://www.mistral.ai/) with [Mistral](https://www.mistral.ai/) as Platform -* Embeddings Models - * [OpenAI's Text Embeddings](https://platform.openai.com/docs/guides/embeddings/embedding-models) with [OpenAI](https://platform.openai.com/docs/overview) and [Azure](https://learn.microsoft.com/azure/ai-services/openai/concepts/models) as Platform - * [Voyage's Embeddings](https://docs.voyageai.com/docs/embeddings) with [Voyage](https://www.voyageai.com/) as Platform - * [Mistral Embed](https://www.mistral.ai/) with [Mistral](https://www.mistral.ai/) as Platform -* Other Models - * [OpenAI's Dall·E](https://platform.openai.com/docs/guides/image-generation) with [OpenAI](https://platform.openai.com/docs/overview) as Platform - * [OpenAI's Whisper](https://platform.openai.com/docs/guides/speech-to-text) with [OpenAI](https://platform.openai.com/docs/overview) and [Azure](https://learn.microsoft.com/azure/ai-services/openai/concepts/models) as Platform - * All models provided by [HuggingFace](https://huggingface.co/) can be listed with `make huggingface-models` - And more filtered with `php examples/huggingface/_model-listing.php --provider=hf-inference --task=object-detection` - -See [issue #28](https://github.com/php-llm/llm-chain/issues/28) for planned support of other models and platforms. - -### Agent & Messages - -The core feature of the Symfony AI is to interact with language models via messages. This interaction is done by sending -a **MessageBag** to an **Agent**, which takes care of LLM invocation and response handling. - -Messages can be of different types, most importantly `UserMessage`, `SystemMessage`, or `AssistantMessage`, and can also -have different content types, like `Text`, `Image` or `Audio`. 
- -#### Example Agent call with messages - -```php -use Symfony\AI\Agent\Agent; -use Symfony\AI\Platform\Message\Message; -use Symfony\AI\Platform\Message\MessageBag; - -// Platform & LLM instantiation - -$agent = new Agent($platform, $model); -$messages = new MessageBag( - Message::forSystem('You are a helpful chatbot answering questions about LLM agent.'), - Message::ofUser('Hello, how are you?'), -); -$response = $agent->call($messages); - -echo $response->getContent(); // "I'm fine, thank you. How can I help you today?" -``` - -The `MessageInterface` and `Content` interface help to customize this process if needed, e.g. additional state handling. - -#### Options - -The second parameter of the `call` method is an array of options, which can be used to configure the behavior of the -agent, like `stream`, `output_structure`, or `response_format`. This behavior is a combination of features provided by -the underlying model and platform, or additional features provided by processors registered to the agent. - -Options designed for additional features provided by Symfony AI can be found in this documentation. For model- and -platform-specific options, please refer to the respective documentation. - -```php -// agent and MessageBag instantiation - -$response = $agent->call($messages, [ - 'temperature' => 0.5, // example option controlling the randomness of the response, e.g. GPT and Claude - 'n' => 3, // example option controlling the number of responses generated, e.g. GPT -]); -``` - -#### Code Examples - -1. [Anthropic's Claude](examples/anthropic/chat.php) -1. [OpenAI's GPT with Azure](examples/azure/chat-gpt.php) -1. [OpenAI's GPT](examples/openai/chat.php) -1. [OpenAI's o1](examples/openai/chat-o1.php) -1. [Meta's Llama with Azure](examples/azure/chat-llama.php) -1. [Meta's Llama with Ollama](examples/ollama/chat-llama.php) -1. [Meta's Llama with Replicate](examples/replicate/chat-llama.php) -1. [Google's Gemini with Google](examples/google/chat.php) -1. [Google's Gemini with OpenRouter](examples/openrouter/chat-gemini.php) -1. [Mistral's Mistral with Mistral](examples/mistral/chat-mistral.php) - -### Tools - -To integrate LLMs with your application, Symfony AI supports [tool calling](https://platform.openai.com/docs/guides/function-calling) out of the box. -Tools are services that can be called by the LLM to provide additional features or process data. - -Tool calling can be enabled by registering the processors in the agent: - -```php -use Symfony\AI\Agent\Agent; -use Symfony\AI\Agent\Toolbox\AgentProcessor; -use Symfony\AI\Agent\Toolbox\Toolbox; - -// Platform & LLM instantiation - -$yourTool = new YourTool(); - -$toolbox = Toolbox::create($yourTool); -$toolProcessor = new AgentProcessor($toolbox); - -$agent = new Agent($platform, $model, inputProcessors: [$toolProcessor], outputProcessors: [$toolProcessor]); -``` - -Custom tools can basically be any class, but must configure by the `#[AsTool]` attribute. - -```php -use Symfony\AI\Toolbox\Attribute\AsTool; - -#[AsTool('company_name', 'Provides the name of your company')] -final class CompanyName -{ - public function __invoke(): string - { - return 'ACME Corp.'; - } -} -``` - -#### Tool Return Value - -In the end, the tool's response needs to be a string, but Symfony AI converts arrays and objects, that implement the -`JsonSerializable` interface, to JSON strings for you. So you can return arrays or objects directly from your tool. 
- -#### Tool Methods - -You can configure the method to be called by the LLM with the `#[AsTool]` attribute and have multiple tools per class: - -```php -use Symfony\AI\Toolbox\Attribute\AsTool; - -#[AsTool( - name: 'weather_current', - description: 'get current weather for a location', - method: 'current', -)] -#[AsTool( - name: 'weather_forecast', - description: 'get weather forecast for a location', - method: 'forecast', -)] -final readonly class OpenMeteo -{ - public function current(float $latitude, float $longitude): array - { - // ... - } - - public function forecast(float $latitude, float $longitude): array - { - // ... - } -} -``` - -#### Tool Parameters - -Symfony AI generates a JSON Schema representation for all tools in the `Toolbox` based on the `#[AsTool]` attribute and -method arguments and param comments in the doc block. Additionally, JSON Schema support validation rules, which are -partially support by LLMs like GPT. - -To leverage this, configure the `#[With]` attribute on the method arguments of your tool: - -```php -use Symfony\AI\Agent\Toolbox\Attribute\AsTool; -use Symfony\AI\Platform\Contract\JsonSchema\Attribute\With; - -#[AsTool('my_tool', 'Example tool with parameters requirements.')] -final class MyTool -{ - /** - * @param string $name The name of an object - * @param int $number The number of an object - */ - public function __invoke( - #[With(pattern: '/([a-z0-1]){5}/')] - string $name, - #[With(minimum: 0, maximum: 10)] - int $number, - ): string { - // ... - } -} -``` - -See attribute class [With](src/agent/JsonSchema/Attribute/With.php) for all available options. - -> [!NOTE] -> Please be aware, that this is only converted in a JSON Schema for the LLM to respect, but not validated by Symfony AI. - -#### Third-Party Tools - -In some cases you might want to use third-party tools, which are not part of your application. Adding the `#[AsTool]` -attribute to the class is not possible in those cases, but you can explicitly register the tool in the `MemoryFactory`: - -```php -use Symfony\AI\Agent\Toolbox\Toolbox; -use Symfony\AI\Agent\Toolbox\ToolFactory\MemoryToolFactory; -use Symfony\Component\Clock\Clock; - -$metadataFactory = (new MemoryToolFactory()) - ->addTool(Clock::class, 'clock', 'Get the current date and time', 'now'); -$toolbox = new Toolbox($metadataFactory, [new Clock()]); -``` - -> [!NOTE] -> Please be aware that not all return types are supported by the toolbox, so a decorator might still be needed. - -This can be combined with the `ChainFactory` which enables you to use explicitly registered tools and `#[AsTool]` tagged -tools in the same chain - which even enables you to overwrite the pre-existing configuration of a tool: - -```php -use Symfony\AI\Agent\Toolbox\Toolbox; -use Symfony\AI\Agent\Toolbox\ToolFactory\ChainFactory; -use Symfony\AI\Agent\Toolbox\ToolFactory\MemoryToolFactory; -use Symfony\AI\Agent\Toolbox\ToolFactory\ReflectionToolFactory; - -$reflectionFactory = new ReflectionToolFactory(); // Register tools with #[AsTool] attribute -$metadataFactory = (new MemoryToolFactory()) // Register or overwrite tools explicitly - ->addTool(...); -$toolbox = new Toolbox(new AgentFactory($metadataFactory, $reflectionFactory), [...]); -``` - -> [!NOTE] -> The order of the factories in the `ChainFactory` matters, as the first factory has the highest priority. - -#### Agent uses Agent 🤯 - -Similar to third-party tools, an agent can also use an different agent as a tool. 
This can be useful to encapsulate -complex logic or to reuse an agent in multiple places or hide sub-agents from the LLM. - -```php -use Symfony\AI\Agent\Toolbox\ToolFactory\MemoryToolFactory; -use Symfony\AI\Agent\Toolbox\Toolbox; -use Symfony\AI\Agent\Toolbox\Tool\Agent; - -// agent was initialized before - -$agentTool = new Agent($agent); -$metadataFactory = (new MemoryToolFactory()) - ->addTool($agentTool, 'research_agent', 'Meaningful description for sub-agent'); -$toolbox = new Toolbox($metadataFactory, [$agentTool]); -``` - -#### Fault Tolerance - -To gracefully handle errors that occur during tool calling, e.g. wrong tool names or runtime errors, you can use the -`FaultTolerantToolbox` as a decorator for the `Toolbox`. It will catch the exceptions and return readable error messages -to the LLM. - -```php -use Symfony\AI\Agent\Agent; -use Symfony\AI\Agent\Toolbox\AgentProcessor; -use Symfony\AI\Agent\Toolbox\FaultTolerantToolbox; - -// Platform, LLM & Toolbox instantiation - -$toolbox = new FaultTolerantToolbox($innerToolbox); -$toolProcessor = new AgentProcessor($toolbox); - -$agent = new Agent($platform, $model, inputProcessor: [$toolProcessor], outputProcessor: [$toolProcessor]); -``` - -#### Tool Filtering - -To limit the tools provided to the LLM in a specific agent call to a subset of the configured tools, you can use the -`tools` option with a list of tool names: - -```php -$this->agent->call($messages, ['tools' => ['tavily_search']]); -``` - -#### Tool Result Interception - -To react to the result of a tool, you can implement an EventListener or EventSubscriber, that listens to the -`ToolCallsExecuted` event. This event is dispatched after the `Toolbox` executed all current tool calls and enables -you to skip the next LLM call by setting a response yourself: - -```php -$eventDispatcher->addListener(ToolCallsExecuted::class, function (ToolCallsExecuted $event): void { - foreach ($event->toolCallResults as $toolCallResult) { - if (str_starts_with($toolCallResult->toolCall->name, 'weather_')) { - $event->response = new StructuredResponse($toolCallResult->result); - } - } -}); -``` - -#### Keeping Tool Messages - -Sometimes you might wish to keep the tool messages (`AssistantMessage` containing the `toolCalls` and `ToolCallMessage` containing the response) in the context. -Enable the `keepToolMessages` flag of the toolbox' `AgentProcessor` to ensure those messages will be added to your `MessageBag`. - -```php -use Symfony\AI\Agent\Toolbox\AgentProcessor; -use Symfony\AI\Agent\Toolbox\Toolbox; - -// Platform & LLM instantiation -$messages = new MessageBag( - Message::forSystem(<<call($messages); -// $messages will now include the tool messages -``` - -#### Code Examples (with built-in tools) - -1. [Brave Tool](examples/toolbox/brave.php) -1. [Clock Tool](examples/toolbox/clock.php) -1. [Crawler Tool](examples/toolbox/brave.php) -1. [SerpAPI Tool](examples/toolbox/serpapi.php) -1. [Tavily Tool](examples/toolbox/tavily.php) -1. [Weather Tool with Event Listener](examples/toolbox/weather-event.php) -1. [Wikipedia Tool](examples/anthropic/toolcall.php) -1. [YouTube Transcriber Tool](examples/openai/toolcall.php) - -### Document Embedding, Vector Stores & Similarity Search (RAG) - -Symfony AI supports document embedding and similarity search using vector stores like ChromaDB, Azure AI Search, MongoDB -Atlas Search, or Pinecone. 
- -For populating a vector store, Symfony AI provides the service `Indexer`, which requires an instance of an -`EmbeddingsModel` and one of `StoreInterface`, and works with a collection of `Document` objects as input: - -```php -use Symfony\AI\Platform\Bridge\OpenAI\Embeddings; -use Symfony\AI\Platform\Bridge\OpenAI\PlatformFactory; -use Symfony\AI\Store\Bridge\Pinecone\Store; -use Symfony\AI\Store\Indexer; -use Probots\Pinecone\Pinecone; - -$indexer = new Indexer( - PlatformFactory::create($_ENV['OPENAI_API_KEY']), - new Embeddings(), - new Store(Pinecone::client($_ENV['PINECONE_API_KEY'], $_ENV['PINECONE_HOST']), -); -$indexer->index($documents); -``` - -The collection of `Document` instances is usually created by text input of your domain entities: - -```php -use Symfony\AI\Store\Document\Metadata; -use Symfony\AI\Store\Document\TextDocument; - -foreach ($entities as $entity) { - $documents[] = new TextDocument( - id: $entity->getId(), // UUID instance - content: $entity->toString(), // Text representation of relevant data for embedding - metadata: new Metadata($entity->toArray()), // Array representation of an entity to be stored additionally - ); -} -``` -> [!NOTE] -> Not all data needs to be stored in the vector store, but you could also hydrate the original data entry based -> on the ID or metadata after retrieval from the store.* - -In the end the agent is used in combination with a retrieval tool on top of the vector store, e.g. the built-in -`SimilaritySearch` tool provided by the library: - -```php -use Symfony\AI\Agent\Agent; -use Symfony\AI\Agent\Toolbox\AgentProcessor; -use Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch; -use Symfony\AI\Agent\Toolbox\Toolbox; -use Symfony\AI\Platform\Message\Message; -use Symfony\AI\Platform\Message\MessageBag; - -// Initialize Platform & Models +To get started with Symfony AI, check out the following [examples/](./examples) folder. -$similaritySearch = new SimilaritySearch($model, $store); -$toolbox = Toolbox::create($similaritySearch); -$processor = new Agent($toolbox); -$agent = new Agent($platform, $model, [$processor], [$processor]); - -$messages = new MessageBag( - Message::forSystem(<<call($messages); -``` - -#### Code Examples - -1. [MongoDB Store](examples/store/mongodb-similarity-search.php) -1. [Pinecone Store](examples/store/pinecone-similarity-search.php) - -#### Supported Stores - -* [ChromaDB](https://trychroma.com) (requires `codewithkyrian/chromadb-php` as additional dependency) -* [Azure AI Search](https://azure.microsoft.com/en-us/products/ai-services/ai-search) -* [MongoDB Atlas Search](https://mongodb.com/products/platform/atlas-vector-search) (requires `mongodb/mongodb` as additional dependency) -* [Pinecone](https://pinecone.io) (requires `probots-io/pinecone-php` as additional dependency) - -See [issue #28](https://github.com/php-llm/llm-chain/issues/28) for planned support of other models and platforms. - -## Advanced Usage & Features - -### Structured Output - -A typical use-case of LLMs is to classify and extract data from unstructured sources, which is supported by some models -by features like **Structured Output** or providing a **Response Format**. - -#### PHP Classes as Output - -Symfony AI supports that use-case by abstracting the hustle of defining and providing schemas to the LLM and converting -the response back to PHP objects. 
- -To achieve this, a specific agent processor needs to be registered: - -```php -use Symfony\AI\Agent\Agent; -use Symfony\AI\Agent\StructuredOutput\AgentProcessor; -use Symfony\AI\Agent\StructuredOutput\ResponseFormatFactory; -use Symfony\AI\Platform\Message\Message; -use Symfony\AI\Platform\Message\MessageBag; -use Symfony\AI\Fixtures\StructuredOutput\MathReasoning; -use Symfony\Component\Serializer\Encoder\JsonEncoder; -use Symfony\Component\Serializer\Normalizer\ObjectNormalizer; -use Symfony\Component\Serializer\Serializer; - -// Initialize Platform and LLM - -$serializer = new Serializer([new ObjectNormalizer()], [new JsonEncoder()]); -$processor = new AgentProcessor(new ResponseFormatFactory(), $serializer); -$agent = new Agent($platform, $model, [$processor], [$processor]); - -$messages = new MessageBag( - Message::forSystem('You are a helpful math tutor. Guide the user through the solution step by step.'), - Message::ofUser('how can I solve 8x + 7 = -23'), -); -$response = $agent->call($messages, ['output_structure' => MathReasoning::class]); - -dump($response->getContent()); // returns an instance of `MathReasoning` class -``` - -#### Array Structures as Output - -Also PHP array structures as `response_format` are supported, which also requires the agent processor mentioned above: - -```php -use Symfony\AI\Platform\Message\Message; -use Symfony\AI\Platform\Message\MessageBag; - -// Initialize Platform, LLM and agent with processors and Clock tool - -$messages = new MessageBag(Message::ofUser('What date and time is it?')); -$response = $agent->call($messages, ['response_format' => [ - 'type' => 'json_schema', - 'json_schema' => [ - 'name' => 'clock', - 'strict' => true, - 'schema' => [ - 'type' => 'object', - 'properties' => [ - 'date' => ['type' => 'string', 'description' => 'The current date in the format YYYY-MM-DD.'], - 'time' => ['type' => 'string', 'description' => 'The current time in the format HH:MM:SS.'], - ], - 'required' => ['date', 'time'], - 'additionalProperties' => false, - ], - ], -]]); - -dump($response->getContent()); // returns an array -``` - -#### Code Examples - -1. [Structured Output with PHP class)](examples/openai/structured-output-math.php) -1. [Structured Output with array](examples/openai/structured-output-clock.php) - -### Response Streaming - -Since LLMs usually generate a response word by word, most of them also support streaming the response using Server Side -Events. Symfony AI supports that by abstracting the conversion and returning a Generator as content of the response. - -```php -use Symfony\AI\Agent\Agent; -use Symfony\AI\Message\Message; -use Symfony\AI\Message\MessageBag; - -// Initialize Platform and LLM - -$agent = new Agent($model); -$messages = new MessageBag( - Message::forSystem('You are a thoughtful philosopher.'), - Message::ofUser('What is the purpose of an ant?'), -); -$response = $agent->call($messages, [ - 'stream' => true, // enable streaming of response text -]); - -foreach ($response->getContent() as $word) { - echo $word; -} -``` - -In a terminal application this generator can be used directly, but with a web app an additional layer like [Mercure](https://mercure.rocks) -needs to be used. - -#### Code Examples - -1. [Streaming Claude](examples/anthropic/stream.php) -1. [Streaming GPT](examples/openai/stream.php) -1. 
[Streaming Mistral](examples/mistral/stream.php) - -### Image Processing - -Some LLMs also support images as input, which Symfony AI supports as `Content` type within the `UserMessage`: - -```php -use Symfony\AI\Platform\Message\Content\Image; -use Symfony\AI\Platform\Message\Message; -use Symfony\AI\Platform\Message\MessageBag; - -// Initialize Platform, LLM & agent - -$messages = new MessageBag( - Message::forSystem('You are an image analyzer bot that helps identify the content of images.'), - Message::ofUser( - 'Describe the image as a comedian would do it.', - Image::fromFile(dirname(__DIR__).'/tests/fixtures/image.jpg'), // Path to an image file - Image::fromDataUrl('data:image/png;base64,...'), // Data URL of an image - new ImageUrl('https://foo.com/bar.png'), // URL to an image - ), -); -$response = $agent->call($messages); -``` - -#### Code Examples - -1. [Binary Image Input with GPT](examples/openai/image-input-binary.php) -1. [Image URL Input with GPT](examples/openai/image-input-url.php) - -### Audio Processing - -Similar to images, some LLMs also support audio as input, which is just another `Content` type within the `UserMessage`: - -```php -use Symfony\AI\Platform\Message\Content\Audio; -use Symfony\AI\Platform\Message\Message; -use Symfony\AI\Platform\Message\MessageBag; - -// Initialize Platform, LLM & agent - -$messages = new MessageBag( - Message::ofUser( - 'What is this recording about?', - Audio::fromFile(dirname(__DIR__).'/tests/fixtures/audio.mp3'), // Path to an audio file - ), -); -$response = $agent->call($messages); -``` - -#### Code Examples - -1. [Audio Input with GPT](examples/openai/audio-input.php) - -### Embeddings - -Creating embeddings of word, sentences, or paragraphs is a typical use case around the interaction with LLMs, and -therefore Symfony AI implements a `EmbeddingsModel` interface with various models, see above. - -The standalone usage results in an `Vector` instance: - -```php -use Symfony\AI\Platform\Bridge\OpenAI\Embeddings; - -// Initialize Platform - -$embeddings = new Embeddings($platform, Embeddings::TEXT_3_SMALL); - -$vectors = $platform->request($embeddings, $textInput)->getContent(); - -dump($vectors[0]->getData()); // Array of float values -``` - -#### Code Examples - -1. [OpenAI's Emebddings](examples/openai/embeddings.php) -1. [Voyage's Embeddings](examples/voyage/embeddings.php) -1. [Mistral's Embed](examples/mistral/embeddings.php) - -### Parallel Platform Calls - -Platform supports multiple model calls in parallel, which can be useful to speed up the processing: - -```php -// Initialize Platform & Model - -foreach ($inputs as $input) { - $responses[] = $platform->request($model, $input); -} - -foreach ($responses as $response) { - echo $response->getContent().PHP_EOL; -} -``` - -> [!NOTE] -> This requires cURL and the `ext-curl` extension to be installed. - -#### Code Examples - -1. [Parallel GPT Calls](examples/misc/parallel-chat-gpt.php) -1. [Parallel Embeddings Calls](examples/misc/parallel-embeddings.php) - -> [!NOTE] -> Please be aware that some embedding models also support batch processing out of the box. - -### Input & Output Processing - -The behavior of the agent is extendable with services that implement `InputProcessor` and/or `OutputProcessor` -interface. 
They are provided while instantiating the agent instance: - -```php -use Symfony\AI\Agent\Agent; - -// Initialize Platform, LLM and processors - -$agent = new Agent($platform, $model, $inputProcessors, $outputProcessors); -``` - -#### InputProcessor - -`InputProcessor` instances are called in the agent before handing over the `MessageBag` and the `$options` array to the LLM and are -able to mutate both on top of the `Input` instance provided. - -```php -use Symfony\AI\Agent\Input; -use Symfony\AI\Agent\InputProcessorInterface; -use Symfony\AI\Platform\Message\AssistantMessage; - -final class MyProcessor implements InputProcessorInterface -{ - public function processInput(Input $input): void - { - // mutate options - $options = $input->getOptions(); - $options['foo'] = 'bar'; - $input->setOptions($options); - - // mutate MessageBag - $input->messages->append(new AssistantMessage(sprintf('Please answer using the locale %s', $this->locale))); - } -} -``` - -#### OutputProcessor - -`OutputProcessor` instances are called after the LLM provided a response and can - on top of options and messages - -mutate or replace the given response: - -```php -use Symfony\AI\Agent\Output; -use Symfony\AI\Agent\OutputProcessorInterface; - -final class MyProcessor implements OutputProcessorInterface -{ - public function processOutput(Output $out): void - { - // mutate response - if (str_contains($output->response->getContent, self::STOP_WORD)) { - $output->reponse = new TextReponse('Sorry, we were unable to find relevant information.') - } - } -} -``` - -#### Agent Awareness - -Both, `Input` and `Output` instances, provide access to the LLM used by the agent, but the agent itself is only -provided, in case the processor implemented the `AgentAwareInterface` interface, which can be combined with using the -`AgentAwareTrait`: - -```php -use Symfony\AI\Agent\AgentAwareInterface; -use Symfony\AI\Agent\AgentAwareTrait; -use Symfony\AI\Agent\Output; -use Symfony\AI\Agent\OutputProcessorInterface; - -final class MyProcessor implements OutputProcessorInterface, AgentAwareInterface -{ - use AgentAwareTrait; - - public function processOutput(Output $out): void - { - // additional agent interaction - $response = $this->agent->call(...); - } -} -``` - -## HuggingFace - -Symfony AI comes out of the box with an integration for [HuggingFace](https://huggingface.co/) which is a platform for -hosting and sharing all kinds of models, including LLMs, embeddings, image generation, and classification models. - -You can just instantiate the Platform with the corresponding HuggingFace bridge and use it with the `task` option: - -```php -use Symfony\AI\Bridge\HuggingFace\Model; -use Symfony\AI\Platform\Bridge\HuggingFace\PlatformFactory; -use Symfony\AI\Platform\Bridge\HuggingFace\Task; -use Symfony\AI\Platform\Message\Content\Image; - -$platform = PlatformFactory::create($apiKey); -$model = new Model('facebook/detr-resnet-50'); - -$image = Image::fromFile(dirname(__DIR__, 2).'/tests/fixtures/image.jpg'); -$response = $platform->request($model, $image, [ - 'task' => Task::OBJECT_DETECTION, // defining a task is mandatory for internal request & response handling -]); - -dump($response->getContent()); -``` - -#### Code Examples - -1. [Audio Classification](examples/huggingface/audio-classification.php) -1. [Automatic Speech Recognition](examples/huggingface/automatic-speech-recognition.php) -1. [Chat Completion](examples/huggingface/chat-completion.php) -1. 
[Feature Extraction (Embeddings)](examples/huggingface/feature-extraction.php) -1. [Fill Mask](examples/huggingface/fill-mask.php) -1. [Image Classification](examples/huggingface/image-classification.php) -1. [Image Segmentation.php](examples/huggingface/image-segmentation.php) -1. [Image-to-Text](examples/huggingface/image-to-text.php) -1. [Object Detection](examples/huggingface/object-detection.php) -1. [Question Answering](examples/huggingface/question-answering.php) -1. [Sentence Similarity](examples/huggingface/sentence-similarity.php) -1. [Summarization](examples/huggingface/summarization.php) -1. [Table Question Answering](examples/huggingface/table-question-answering.php) -1. [Text Classification](examples/huggingface/text-classification.php) -1. [Text Generation](examples/huggingface/text-generation.php) -1. [Text-to-Image](examples/huggingface/text-to-image.php) -1. [Token Classification](examples/huggingface/token-classification.php) -1. [Translation](examples/huggingface/translation.php) -1. [Zero-shot Classification](examples/huggingface/zero-shot-classification.php) - -## TransformerPHP - -With installing the library `codewithkyrian/transformers` it is possible to run [ONNX](https://onnx.ai/) models locally -without the need of an extra tool like Ollama or a cloud service. This requires [FFI](https://www.php.net/manual/en/book.ffi.php) -and comes with an extra setup, see [TransformersPHP's Getting Starter](https://transformers.codewithkyrian.com/getting-started). - -The usage with Symfony AI is similar to the HuggingFace integration, and also requires the `task` option to be set: - -```php -use Codewithkyrian\Transformers\Pipelines\Task; -use Symfony\AI\Bridge\TransformersPHP\Model; -use Symfony\AI\Platform\Bridge\TransformersPHP\PlatformFactory; - -$platform = PlatformFactory::create(); -$model = new Model('Xenova/LaMini-Flan-T5-783M'); - -$response = $platform->request($model, 'How many continents are there in the world?', [ - 'task' => Task::Text2TextGeneration, -]); - -echo $response->getContent().PHP_EOL; -``` - -#### Code Examples - -1. [Text Generation with TransformersPHP](examples/transformers/text-generation.php) ## Sponsor -Help Symfony by [sponsoring][2] its development! +Help Symfony by [sponsoring](https://symfony.com/sponsor) its development! ## Contributing Thank you for considering contributing to Symfony AI! You can find the [contribution guide here](CONTRIBUTING.md). -[1]: https://symfony.com/backers -[2]: https://symfony.com/sponsor - ## Fixture Licenses For testing multi-modal features, the repository contains binary media content, with the following owners and licenses: diff --git a/examples/README.md b/examples/README.md index 15cca731..5b95e493 100644 --- a/examples/README.md +++ b/examples/README.md @@ -20,7 +20,7 @@ composer install #### Configuration Depending on the examples you want to run, you may need to configure the needed API keys. Therefore, you need to create a -`.env.local` file in the root of the examples directory. This file should contain the environment variables for the +`.env.local` file in the root of the examples' directory. This file should contain the environment variables for the corresponding example you want to run. 
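
For instance, a minimal `.env.local` for the OpenAI-based examples could look like this (the variable names are the ones
used by the examples in this repository; the values are placeholders):

```bash
# examples/.env.local
OPENAI_API_KEY=your-openai-api-key

# Only needed for the respective examples:
ANTHROPIC_API_KEY=your-anthropic-api-key
PINECONE_API_KEY=your-pinecone-api-key
PINECONE_HOST=your-pinecone-host
```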
_Now you can run examples standalone or via the example runner._ diff --git a/src/agent/README.md b/src/agent/README.md new file mode 100644 index 00000000..431c2f0b --- /dev/null +++ b/src/agent/README.md @@ -0,0 +1,25 @@ +# Symfony AI - Agent Component + +The Agent component provides a framework for building AI agents that, sits on top of the Platform and Store components, +allowing you to create agents that can interact with users, perform tasks, and manage workflows. + +**This Component is experimental**. +[Experimental features](https://symfony.com/doc/current/contributing/code/experimental.html) +are not covered by Symfony's +[Backward Compatibility Promise](https://symfony.com/doc/current/contributing/code/bc.html). + +## Installation + +```bash +composer require symfony/ai-agent +``` + +**This repository is a READ-ONLY sub-tree split**. See +https://github.com/symfony/ai to create issues or submit pull requests. + +## Resources + +- [Documentation](doc/index.rst) +- [Report issues](https://github.com/symfony/ai/issues) and + [send Pull Requests](https://github.com/symfony/ai/pulls) + in the [main Symfony AI repository](https://github.com/symfony/ai) diff --git a/src/agent/doc/index.rst b/src/agent/doc/index.rst new file mode 100644 index 00000000..e7dcf709 --- /dev/null +++ b/src/agent/doc/index.rst @@ -0,0 +1,481 @@ +Symfony AI - Agent Component +============================ + +The Agent component provides a framework for building AI agents that, sits on top of the Platform and Store components, +allowing you to create agents that can interact with users, perform tasks, and manage workflows. + +Installation +------------ + +Install the component using Composer: + +.. code-block:: terminal + + $ composer require symfony/ai-agent + +Basic Usage +----------- + +To instantiate an agent, you need to pass a ``Symfony\AI\Platform\PlatformInterface`` and a +``Symfony\AI\Platform\Model`` instance to the ``Symfony\AI\Agent\Agent`` class:: + + use Symfony\AI\Agent\Agent; + use Symfony\AI\Platform\Bridge\OpenAI\GPT; + use Symfony\AI\Platform\Bridge\OpenAI\PlatformFactory; + + $platform = PlatformFactory::create($apiKey); + $model = new GPT(GPT::GPT_4O_MINI); + + $agent = new Agent($platform, $model); + +You can then run the agent with a ``Symfony\AI\Platform\Message\MessageBagInterface`` instance as input and an optional +array of options:: + + use Symfony\AI\Agent\Agent; + use Symfony\AI\Platform\Message\Message; + use Symfony\AI\Platform\Message\MessageBag; + + // Platform & LLM instantiation + + $agent = new Agent($platform, $model); + $input = new MessageBag( + Message::forSystem('You are a helpful chatbot answering questions about LLM agent.'), + Message::ofUser('Hello, how are you?'), + ); + $response = $agent->call($messages); + + echo $response->getContent(); // "I'm fine, thank you. How can I help you today?" + + +The structure of the input message bag is flexible, see `Platform Component`_ for more details on how to use it. + +**Options** + +As with the Platform component, you can pass options to the agent when running it. These options configure the agent's +behavior, for example available tools to execute, or are forwarded to the underlying platform and model. + +Tools +----- + +To integrate LLMs with your application, Symfony AI supports tool calling out of the box. Tools are services that can be +called by the LLM to provide additional features or process data. 
+ +Tool calling can be enabled by registering the processors in the agent:: + + use Symfony\AI\Agent\Agent; + use Symfony\AI\Agent\Toolbox\AgentProcessor; + use Symfony\AI\Agent\Toolbox\Toolbox; + + // Platform & LLM instantiation + + $yourTool = new YourTool(); + + $toolbox = Toolbox::create($yourTool); + $toolProcessor = new AgentProcessor($toolbox); + + $agent = new Agent($platform, $model, inputProcessors: [$toolProcessor], outputProcessors: [$toolProcessor]); + +Custom tools can basically be any class, but must configure by the ``#[AsTool]`` attribute:: + + use Symfony\AI\Toolbox\Attribute\AsTool; + + #[AsTool('company_name', 'Provides the name of your company')] + final class CompanyName + { + public function __invoke(): string + { + return 'ACME Corp.'; + } + } + +**Tool Return Value** + +In the end, the tool's response needs to be a string, but Symfony AI converts arrays and objects, that implement the +JsonSerializable interface, to JSON strings for you. So you can return arrays or objects directly from your tool. + +**Tool Methods** + +You can configure the method to be called by the LLM with the #[AsTool] attribute and have multiple tools per class:: + + use Symfony\AI\Toolbox\Attribute\AsTool; + + #[AsTool( + name: 'weather_current', + description: 'get current weather for a location', + method: 'current', + )] + #[AsTool( + name: 'weather_forecast', + description: 'get weather forecast for a location', + method: 'forecast', + )] + final readonly class OpenMeteo + { + public function current(float $latitude, float $longitude): array + { + // ... + } + + public function forecast(float $latitude, float $longitude): array + { + // ... + } + } + +**Tool Parameters** + +Symfony AI generates a JSON Schema representation for all tools in the Toolbox based on the #[AsTool] attribute and +method arguments and param comments in the doc block. Additionally, JSON Schema support validation rules, which are +partially support by LLMs like GPT. + +To leverage this, configure the ``#[With]`` attribute on the method arguments of your tool:: + + use Symfony\AI\Agent\Toolbox\Attribute\AsTool; + use Symfony\AI\Platform\Contract\JsonSchema\Attribute\With; + + #[AsTool('my_tool', 'Example tool with parameters requirements.')] + final class MyTool + { + /** + * @param string $name The name of an object + * @param int $number The number of an object + */ + public function __invoke( + #[With(pattern: '/([a-z0-1]){5}/')] + string $name, + #[With(minimum: 0, maximum: 10)] + int $number, + ): string { + // ... + } + } + +See attribute class ``Symfony\AI\Platform\Contract\JsonSchema\Attribute\With`` for all available options. + +.. note:: + + Please be aware, that this is only converted in a JSON Schema for the LLM to respect, but not validated by Symfony AI. + +**Third-Party Tools** + +In some cases you might want to use third-party tools, which are not part of your application. Adding the ``#[AsTool]`` +attribute to the class is not possible in those cases, but you can explicitly register the tool in the MemoryFactory:: + + use Symfony\AI\Agent\Toolbox\Toolbox; + use Symfony\AI\Agent\Toolbox\ToolFactory\MemoryToolFactory; + use Symfony\Component\Clock\Clock; + + $metadataFactory = (new MemoryToolFactory()) + ->addTool(Clock::class, 'clock', 'Get the current date and time', 'now'); + $toolbox = new Toolbox($metadataFactory, [new Clock()]); + +.. note:: + + Please be aware that not all return types are supported by the toolbox, so a decorator might still be needed. 
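
As a rough sketch (not part of the component), such a decorator could wrap the service and normalize its return value
to a string. Here a hypothetical ``ClockTool`` wraps ``Symfony\Component\Clock\Clock``::

    use Symfony\Component\Clock\Clock;

    final class ClockTool
    {
        public function now(): string
        {
            // Convert the date object returned by Clock::now() into a plain string the LLM can consume
            return (new Clock())->now()->format('Y-m-d H:i:s');
        }
    }

The decorator is then registered in the ``MemoryToolFactory`` instead of the original service.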
+ +This can be combined with the ChainFactory which enables you to use explicitly registered tools and ``#[AsTool]`` tagged +tools in the same chain - which even enables you to overwrite the pre-existing configuration of a tool:: + + use Symfony\AI\Agent\Toolbox\Toolbox; + use Symfony\AI\Agent\Toolbox\ToolFactory\ChainFactory; + use Symfony\AI\Agent\Toolbox\ToolFactory\MemoryToolFactory; + use Symfony\AI\Agent\Toolbox\ToolFactory\ReflectionToolFactory; + + $reflectionFactory = new ReflectionToolFactory(); // Register tools with #[AsTool] attribute + $metadataFactory = (new MemoryToolFactory()) // Register or overwrite tools explicitly + ->addTool(...); + $toolbox = new Toolbox(new AgentFactory($metadataFactory, $reflectionFactory), [...]); + +.. note:: + + The order of the factories in the ChainFactory matters, as the first factory has the highest priority. + +**Agent uses Agent 🤯** + +Similar to third-party tools, an agent can also use an different agent as a tool. This can be useful to encapsulate +complex logic or to reuse an agent in multiple places or hide sub-agents from the LLM:: + + use Symfony\AI\Agent\Toolbox\Tool\Agent; + use Symfony\AI\Agent\Toolbox\Toolbox; + use Symfony\AI\Agent\Toolbox\ToolFactory\MemoryToolFactory; + + // agent was initialized before + + $agentTool = new Agent($agent); + $metadataFactory = (new MemoryToolFactory()) + ->addTool($agentTool, 'research_agent', 'Meaningful description for sub-agent'); + $toolbox = new Toolbox($metadataFactory, [$agentTool]); + +**Fault Tolerance** + +To gracefully handle errors that occur during tool calling, e.g. wrong tool names or runtime errors, you can use the +``FaultTolerantToolbox`` as a decorator for the Toolbox. It will catch the exceptions and return readable error messages +to the LLM:: + + use Symfony\AI\Agent\Agent; + use Symfony\AI\Agent\Toolbox\AgentProcessor; + use Symfony\AI\Agent\Toolbox\FaultTolerantToolbox; + + // Platform, LLM & Toolbox instantiation + + $toolbox = new FaultTolerantToolbox($innerToolbox); + $toolProcessor = new AgentProcessor($toolbox); + + $agent = new Agent($platform, $model, inputProcessor: [$toolProcessor], outputProcessor: [$toolProcessor]); + +**Tool Filtering** + +To limit the tools provided to the LLM in a specific agent call to a subset of the configured tools, you can use the +tools option with a list of tool names:: + + $this->agent->call($messages, ['tools' => ['tavily_search']]); + +**Tool Result Interception** + +To react to the result of a tool, you can implement an EventListener or EventSubscriber, that listens to the +``ToolCallsExecuted`` event. This event is dispatched after the Toolbox executed all current tool calls and enables you +to skip the next LLM call by setting a response yourself:: + + $eventDispatcher->addListener(ToolCallsExecuted::class, function (ToolCallsExecuted $event): void { + foreach ($event->toolCallResults as $toolCallResult) { + if (str_starts_with($toolCallResult->toolCall->name, 'weather_')) { + $event->response = new StructuredResponse($toolCallResult->result); + } + } + }); + +**Keeping Tool Messages** + +Sometimes you might wish to keep the tool messages (AssistantMessage containing the toolCalls and ToolCallMessage +containing the response) in the context. 
Enable the keepToolMessages flag of the toolbox' AgentProcessor to ensure those +messages will be added to your MessageBag:: + + use Symfony\AI\Agent\Toolbox\AgentProcessor; + use Symfony\AI\Agent\Toolbox\Toolbox; + + // Platform & LLM instantiation + $messages = new MessageBag( + Message::forSystem(<<call($messages); + // $messages will now include the tool messages + +**Code Examples (with built-in tools)** + +* `Brave Tool`_ +* `Clock Tool`_ +* `Crawler Tool`_ +* `SerpAPI Tool`_ +* `Tavily Tool`_ +* `Weather Tool with Event Listener`_ +* `Wikipedia Tool`_ +* `YouTube Transcriber Tool`_ + +Retrieval Augmented Generation (RAG) +------------------------------------ + +In combination with the `Store Component`_, the Agent component can be used to build agents that perform Retrieval +Augmented Generation (RAG). This allows the agent to retrieve relevant documents from a store and use them to generate +more accurate and context-aware responses. Therefore, the component provides a built-in tool called +``Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch``:: + + use Symfony\AI\Agent\Agent; + use Symfony\AI\Agent\Toolbox\AgentProcessor; + use Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch; + use Symfony\AI\Agent\Toolbox\Toolbox; + use Symfony\AI\Platform\Message\Message; + use Symfony\AI\Platform\Message\MessageBag; + + // Initialize Platform & Models + + $similaritySearch = new SimilaritySearch($model, $store); + $toolbox = Toolbox::create($similaritySearch); + $processor = new Agent($toolbox); + $agent = new Agent($platform, $model, [$processor], [$processor]); + + $messages = new MessageBag( + Message::forSystem(<<call($messages); + +**Code Examples** + +* `RAG with MongoDB`_ +* `RAG with Pinecone`_ + +Structured Output +----------------- + +A typical use-case of LLMs is to classify and extract data from unstructured sources, which is supported by some models +by features like **Structured Output** or providing a **Response Format**. + +**PHP Classes as Output** + +Symfony AI supports that use-case by abstracting the hustle of defining and providing schemas to the LLM and converting +the response back to PHP objects. + +To achieve this, a specific agent processor needs to be registered:: + + use Symfony\AI\Agent\Agent; + use Symfony\AI\Agent\StructuredOutput\AgentProcessor; + use Symfony\AI\Agent\StructuredOutput\ResponseFormatFactory; + use Symfony\AI\Fixtures\StructuredOutput\MathReasoning; + use Symfony\AI\Platform\Message\Message; + use Symfony\AI\Platform\Message\MessageBag; + use Symfony\Component\Serializer\Encoder\JsonEncoder; + use Symfony\Component\Serializer\Normalizer\ObjectNormalizer; + use Symfony\Component\Serializer\Serializer; + + // Initialize Platform and LLM + + $serializer = new Serializer([new ObjectNormalizer()], [new JsonEncoder()]); + $processor = new AgentProcessor(new ResponseFormatFactory(), $serializer); + $agent = new Agent($platform, $model, [$processor], [$processor]); + + $messages = new MessageBag( + Message::forSystem('You are a helpful math tutor. 
Guide the user through the solution step by step.'), + Message::ofUser('how can I solve 8x + 7 = -23'), + ); + $response = $agent->call($messages, ['output_structure' => MathReasoning::class]); + + dump($response->getContent()); // returns an instance of `MathReasoning` class + +**Array Structures as Output** + +Also PHP array structures as response_format are supported, which also requires the agent processor mentioned above:: + + use Symfony\AI\Platform\Message\Message; + use Symfony\AI\Platform\Message\MessageBag; + + // Initialize Platform, LLM and agent with processors and Clock tool + + $messages = new MessageBag(Message::ofUser('What date and time is it?')); + $response = $agent->call($messages, ['response_format' => [ + 'type' => 'json_schema', + 'json_schema' => [ + 'name' => 'clock', + 'strict' => true, + 'schema' => [ + 'type' => 'object', + 'properties' => [ + 'date' => ['type' => 'string', 'description' => 'The current date in the format YYYY-MM-DD.'], + 'time' => ['type' => 'string', 'description' => 'The current time in the format HH:MM:SS.'], + ], + 'required' => ['date', 'time'], + 'additionalProperties' => false, + ], + ], + ]]); + + dump($response->getContent()); // returns an array + +**Code Examples** + +* `Structured Output with PHP class`_ +* `Structured Output with array`_ + +Input & Output Processing +------------------------- + +The behavior of the agent is extendable with services that implement InputProcessor and/or OutputProcessor interface. +They are provided while instantiating the agent instance:: + + use Symfony\AI\Agent\Agent; + + // Initialize Platform, LLM and processors + + $agent = new Agent($platform, $model, $inputProcessors, $outputProcessors); + +**InputProcessor** + +InputProcessor instances are called in the agent before handing over the MessageBag and the $options array to the LLM +and are able to mutate both on top of the Input instance provided:: + + use Symfony\AI\Agent\Input; + use Symfony\AI\Agent\InputProcessorInterface; + use Symfony\AI\Platform\Message\AssistantMessage; + + final class MyProcessor implements InputProcessorInterface + { + public function processInput(Input $input): void + { + // mutate options + $options = $input->getOptions(); + $options['foo'] = 'bar'; + $input->setOptions($options); + + // mutate MessageBag + $input->messages->append(new AssistantMessage(sprintf('Please answer using the locale %s', $this->locale))); + } + } + +**OutputProcessor** + +OutputProcessor instances are called after the LLM provided a response and can - on top of options and messages - mutate +or replace the given response:: + + use Symfony\AI\Agent\Output; + use Symfony\AI\Agent\OutputProcessorInterface; + + final class MyProcessor implements OutputProcessorInterface + { + public function processOutput(Output $out): void + { + // mutate response + if (str_contains($output->response->getContent, self::STOP_WORD)) { + $output->reponse = new TextReponse('Sorry, we were unable to find relevant information.') + } + } + } + +**Agent Awareness** + +Both, Input and Output instances, provide access to the LLM used by the agent, but the agent itself is only provided, +in case the processor implemented the AgentAwareInterface interface, which can be combined with using the +AgentAwareTrait:: + + use Symfony\AI\Agent\AgentAwareInterface; + use Symfony\AI\Agent\AgentAwareTrait; + use Symfony\AI\Agent\Output; + use Symfony\AI\Agent\OutputProcessorInterface; + + final class MyProcessor implements OutputProcessorInterface, AgentAwareInterface + { + use 
AgentAwareTrait; + + public function processOutput(Output $out): void + { + // additional agent interaction + $response = $this->agent->call(...); + } + } + +.. _`Platform Component`: https://github.com/symfony/ai-platform +.. _`Brave Tool`: https://github.com/symfony/ai/blob/main/examples/toolbox/brave.php +.. _`Clock Tool`: https://github.com/symfony/ai/blob/main/examples/toolbox/clock.php +.. _`Crawler Tool`: https://github.com/symfony/ai/blob/main/examples/toolbox/brave.php +.. _`SerpAPI Tool`: https://github.com/symfony/ai/blob/main/examples/toolbox/serpapi.php +.. _`Tavily Tool`: https://github.com/symfony/ai/blob/main/examples/toolbox/tavily.php +.. _`Weather Tool with Event Listener`: https://github.com/symfony/ai/blob/main/examples/toolbox/weather-event.php +.. _`Wikipedia Tool`: https://github.com/symfony/ai/blob/main/examples/openai/toolcall-stream.php +.. _`YouTube Transcriber Tool`: https://github.com/symfony/ai/blob/main/examples/openai/toolcall.php +.. _`Store Component`: https://github.com/symfony/ai-store +.. _`RAG with MongoDB`: https://github.com/symfony/ai/blob/main/examples/store/mongodb-similarity-search.php +.. _`RAG with Pinecone`: https://github.com/symfony/ai/blob/main/examples/store/pinecone-similarity-search.php +.. _`Structured Output with PHP class`: https://github.com/symfony/ai/blob/main/examples/openai/structured-output-math.php +.. _`Structured Output with array`: https://github.com/symfony/ai/blob/main/examples/openai/structured-output-clock.php diff --git a/src/ai-bundle/README.md b/src/ai-bundle/README.md index 14121490..63e89929 100644 --- a/src/ai-bundle/README.md +++ b/src/ai-bundle/README.md @@ -2,175 +2,23 @@ Integration bundle for Symfony AI components. +**This Bundle is experimental**. +[Experimental features](https://symfony.com/doc/current/contributing/code/experimental.html) +are not covered by Symfony's +[Backward Compatibility Promise](https://symfony.com/doc/current/contributing/code/bc.html). + ## Installation ```bash composer require symfony/ai-bundle ``` -## Configuration - -### Simple Example with OpenAI - -```yaml -# config/packages/ai.yaml -ai: - platform: - openai: - api_key: '%env(OPENAI_API_KEY)%' - agent: - default: - model: - name: 'GPT' -``` - -### Advanced Example with Anthropic, Azure, Google and multiple agents -```yaml -# config/packages/ai.yaml -ai: - platform: - anthropic: - api_key: '%env(ANTHROPIC_API_KEY)%' - azure: - # multiple deployments possible - gpt_deployment: - base_url: '%env(AZURE_OPENAI_BASEURL)%' - deployment: '%env(AZURE_OPENAI_GPT)%' - api_key: '%env(AZURE_OPENAI_KEY)%' - api_version: '%env(AZURE_GPT_VERSION)%' - google: - api_key: '%env(GOOGLE_API_KEY)%' - agent: - rag: - platform: 'symfony_ai.platform.azure.gpt_deployment' - structured_output: false # Disables support for "output_structure" option, default is true - model: - name: 'GPT' - version: 'gpt-4o-mini' - system_prompt: 'You are a helpful assistant that can answer questions.' 
# The default system prompt of the agent - include_tools: true # Include tool definitions at the end of the system prompt - tools: - # Referencing a service with #[AsTool] attribute - - 'Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch' - - # Referencing a service without #[AsTool] attribute - - service: 'App\Agent\Tool\CompanyName' - name: 'company_name' - description: 'Provides the name of your company' - method: 'foo' # Optional with default value '__invoke' - - # Referencing a agent => agent in agent 🤯 - - service: 'symfony_ai.agent.research' - name: 'wikipedia_research' - description: 'Can research on Wikipedia' - is_agent: true - research: - platform: 'symfony_ai.platform.anthropic' - model: - name: 'Claude' - tools: # If undefined, all tools are injected into the agent, use "tools: false" to disable tools. - - 'Symfony\AI\Agent\Toolbox\Tool\Wikipedia' - fault_tolerant_toolbox: false # Disables fault tolerant toolbox, default is true - store: - # also azure_search, mongodb and pinecone are supported as store type - chroma_db: - # multiple collections possible per type - default: - collection: 'my_collection' - indexer: - default: - # platform: 'symfony_ai.platform.anthropic' - # store: 'symfony_ai.store.chroma_db.default' - model: - name: 'Embeddings' - version: 'text-embedding-ada-002' -``` - -## Usage - -### Agent Service - -Use the `Agent` service to leverage models and tools: -```php -use Symfony\AI\Agent\AgentInterface; -use Symfony\AI\Platform\Message\Message; -use Symfony\AI\Platform\Message\MessageBag; - -final readonly class MyService -{ - public function __construct( - private AgentInterface $agent, - ) { - } - - public function submit(string $message): string - { - $messages = new MessageBag( - Message::forSystem('Speak like a pirate.'), - Message::ofUser($message), - ); - - return $this->agent->call($messages); - } -} -``` - -### Register Tools - -To use existing tools, you can register them as a service: -```yaml -services: - _defaults: - autowire: true - autoconfigure: true - - Symfony\AI\Agent\Toolbox\Tool\Clock: ~ - Symfony\AI\Agent\Toolbox\Tool\OpenMeteo: ~ - Symfony\AI\Agent\Toolbox\Tool\SerpApi: - $apiKey: '%env(SERP_API_KEY)%' - Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch: ~ - Symfony\AI\Agent\Toolbox\Tool\Tavily: - $apiKey: '%env(TAVILY_API_KEY)%' - Symfony\AI\Agent\Toolbox\Tool\Wikipedia: ~ - Symfony\AI\Agent\Toolbox\Tool\YouTubeTranscriber: ~ -``` - -Custom tools can be registered by using the `#[AsTool]` attribute: - -```php -use Symfony\AI\Agent\Toolbox\Attribute\AsTool; - -#[AsTool('company_name', 'Provides the name of your company')] -final class CompanyName -{ - public function __invoke(): string - { - return 'ACME Corp.' - } -} -``` - -The agent configuration by default will inject all known tools into the agent. - -To disable this behavior, set the `tools` option to `false`: -```yaml -ai: - agent: - my_agent: - tools: false -``` - -To inject only specific tools, list them in the configuration: -```yaml -ai: - agent: - my_agent: - tools: - - 'Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch' -``` - -### Profiler +**This repository is a READ-ONLY sub-tree split**. See +https://github.com/symfony/ai to create issues or submit pull requests. 
-The profiler panel provides insights into the agent's execution: +## Resources -![Profiler](./profiler.png) +- [Documentation](doc/index.rst) +- [Report issues](https://github.com/symfony/ai/issues) and + [send Pull Requests](https://github.com/symfony/ai/pulls) + in the [main Symfony AI repository](https://github.com/symfony/ai) diff --git a/src/ai-bundle/doc/index.rst b/src/ai-bundle/doc/index.rst new file mode 100644 index 00000000..0a5a2f0f --- /dev/null +++ b/src/ai-bundle/doc/index.rst @@ -0,0 +1,194 @@ +AI Bundle +========= + +Symfony integration bundle for Symfony AI components. + +Integrating: + +* `Symfony AI Agent`_ +* `Symfony AI Platform`_ +* `Symfony AI Store`_ + +Installation +------------ + +.. code-block:: terminal + + $ composer require symfony/ai-bundle + +Configuration +------------- + +**Simple Example with OpenAI** + +.. code-block:: yaml + + # config/packages/ai.yaml + ai: + platform: + openai: + api_key: '%env(OPENAI_API_KEY)%' + agent: + default: + model: + name: 'GPT' + +**Advanced Example with Anthropic, Azure, Google and multiple agents** + +.. code-block:: yaml + + # config/packages/ai.yaml + ai: + platform: + anthropic: + api_key: '%env(ANTHROPIC_API_KEY)%' + azure: + # multiple deployments possible + gpt_deployment: + base_url: '%env(AZURE_OPENAI_BASEURL)%' + deployment: '%env(AZURE_OPENAI_GPT)%' + api_key: '%env(AZURE_OPENAI_KEY)%' + api_version: '%env(AZURE_GPT_VERSION)%' + google: + api_key: '%env(GOOGLE_API_KEY)%' + agent: + rag: + platform: 'symfony_ai.platform.azure.gpt_deployment' + structured_output: false # Disables support for "output_structure" option, default is true + model: + name: 'GPT' + version: 'gpt-4o-mini' + system_prompt: 'You are a helpful assistant that can answer questions.' # The default system prompt of the agent + include_tools: true # Include tool definitions at the end of the system prompt + tools: + # Referencing a service with #[AsTool] attribute + - 'Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch' + + # Referencing a service without #[AsTool] attribute + - service: 'App\Agent\Tool\CompanyName' + name: 'company_name' + description: 'Provides the name of your company' + method: 'foo' # Optional with default value '__invoke' + + # Referencing a agent => agent in agent 🤯 + - service: 'symfony_ai.agent.research' + name: 'wikipedia_research' + description: 'Can research on Wikipedia' + is_agent: true + research: + platform: 'symfony_ai.platform.anthropic' + model: + name: 'Claude' + tools: # If undefined, all tools are injected into the agent, use "tools: false" to disable tools. 
+ - 'Symfony\AI\Agent\Toolbox\Tool\Wikipedia' + fault_tolerant_toolbox: false # Disables fault tolerant toolbox, default is true + store: + # also azure_search, mongodb and pinecone are supported as store type + chroma_db: + # multiple collections possible per type + default: + collection: 'my_collection' + indexer: + default: + # platform: 'symfony_ai.platform.anthropic' + # store: 'symfony_ai.store.chroma_db.default' + model: + name: 'Embeddings' + version: 'text-embedding-ada-002' + +Usage +----- + +**Agent Service** + +Use the `Agent` service to leverage models and tools:: + + use Symfony\AI\Agent\AgentInterface; + use Symfony\AI\Platform\Message\Message; + use Symfony\AI\Platform\Message\MessageBag; + + final readonly class MyService + { + public function __construct( + private AgentInterface $agent, + ) { + } + + public function submit(string $message): string + { + $messages = new MessageBag( + Message::forSystem('Speak like a pirate.'), + Message::ofUser($message), + ); + + return $this->agent->call($messages); + } + } + +**Register Tools** + +To use existing tools, you can register them as a service: + +.. code-block:: yaml + + services: + _defaults: + autowire: true + autoconfigure: true + + Symfony\AI\Agent\Toolbox\Tool\Clock: ~ + Symfony\AI\Agent\Toolbox\Tool\OpenMeteo: ~ + Symfony\AI\Agent\Toolbox\Tool\SerpApi: + $apiKey: '%env(SERP_API_KEY)%' + Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch: ~ + Symfony\AI\Agent\Toolbox\Tool\Tavily: + $apiKey: '%env(TAVILY_API_KEY)%' + Symfony\AI\Agent\Toolbox\Tool\Wikipedia: ~ + Symfony\AI\Agent\Toolbox\Tool\YouTubeTranscriber: ~ + +Custom tools can be registered by using the ``#[AsTool]`` attribute:: + + use Symfony\AI\Agent\Toolbox\Attribute\AsTool; + + #[AsTool('company_name', 'Provides the name of your company')] + final class CompanyName + { + public function __invoke(): string + { + return 'ACME Corp.'; + } + } + +The agent configuration by default will inject all known tools into the agent. + +To disable this behavior, set the ``tools`` option to ``false``: + +.. code-block:: yaml + + ai: + agent: + my_agent: + tools: false + +To inject only specific tools, list them in the configuration: + +.. code-block:: yaml + + ai: + agent: + my_agent: + tools: + - 'Symfony\AI\Agent\Toolbox\Tool\SimilaritySearch' + +Profiler +-------- + +The profiler panel provides insights into the agent's execution: + +.. image:: profiler.png + :alt: Profiler Panel + + +.. _`Symfony AI Agent`: https://github.com/symfony/ai-agent +.. _`Symfony AI Platform`: https://github.com/symfony/ai-platform +.. _`Symfony AI Store`: https://github.com/symfony/ai-store diff --git a/src/ai-bundle/profiler.png b/src/ai-bundle/doc/profiler.png similarity index 100% rename from src/ai-bundle/profiler.png rename to src/ai-bundle/doc/profiler.png diff --git a/src/mcp-bundle/README.md b/src/mcp-bundle/README.md index adc1ed98..8c23a9db 100644 --- a/src/mcp-bundle/README.md +++ b/src/mcp-bundle/README.md @@ -1,63 +1,27 @@ -# MCP Bundle [WIP] +# MCP Bundle Symfony integration bundle for [Model Context Protocol](https://modelcontextprotocol.io/) using the Symfony AI MCP SDK [symfony/mcp-sdk](https://github.com/symfony/mcp-sdk). **Currently only supports tools as server via Server-Sent Events (SSE) and STDIO.** +**This Bundle is experimental**. +[Experimental features](https://symfony.com/doc/current/contributing/code/experimental.html) +are not covered by Symfony's +[Backward Compatibility Promise](https://symfony.com/doc/current/contributing/code/bc.html). 
+ ## Installation ```bash composer require symfony/mcp-bundle ``` -## Usage - -At first, you need to decide whether your application should act as a MCP server or client. Both can be configured -in the `mcp` section of your `config/packages/mcp.yaml` file. - -### Act as Server - -**Currently only supports tools.** - -To use your application as an MCP server, exposing tools to clients like [Claude Desktop](https://claude.ai/download), -you need to configure in the `client_transports` section the transports you want to expose to clients. -You can use either STDIO or SSE. - -### Act as Client - -**Not implemented yet.** +**This repository is a READ-ONLY sub-tree split**. See +https://github.com/symfony/ai to create issues or submit pull requests. -To use your application as an MCP client, integrating other MCP servers, you need to configure the `servers` you want to -connect to. You can use either STDIO or Server-Sent Events (SSE) as transport methods. +## Resources -You can find a list of example Servers in the [MCP Server List](https://modelcontextprotocol.io/examples). - -Tools of those servers are available in your [AI Bundle](https://github.com/symfony/ai-bundle) -configuration and usable in your agents. - -## Configuration - -```yaml -mcp: - app: 'app' # Application name to be exposed to clients - version: '1.0.0' # Application version to be exposed to clients - - # Configure this application to act as an MCP server - # Currently exposes tools registered in Symfony AI Bundle - client_transports: - stdio: true # Enable STDIO via command - sse: true # Enable Server-Sent Event via controller - - # Configure MCP servers to be used by this application - # Not implemented yet - servers: - name: - transport: 'stdio' # Transport method to use, either 'stdio' or 'sse' - stdio: - command: 'php /path/bin/console mcp' # Command to execute to start the client - arguments: [] # Arguments to pass to the command - sse: - url: 'http://localhost:8000/sse' # URL to SSE endpoint of MCP server - -``` +- [Documentation](doc/index.rst) +- [Report issues](https://github.com/symfony/ai/issues) and + [send Pull Requests](https://github.com/symfony/ai/pulls) + in the [main Symfony AI repository](https://github.com/symfony/ai) diff --git a/src/mcp-bundle/doc/index.rst b/src/mcp-bundle/doc/index.rst new file mode 100644 index 00000000..2e8ecb08 --- /dev/null +++ b/src/mcp-bundle/doc/index.rst @@ -0,0 +1,70 @@ +MCP Bundle +========== + +Symfony integration bundle for `Model Context Protocol`_ using the Symfony AI MCP SDK `symfony/mcp-sdk`_. + +**Currently only supports tools as server via Server-Sent Events (SSE) and STDIO.** + +Installation +------------ + +.. code-block:: terminal + + $ composer require symfony/mcp-bundle + +Usage +----- + +At first, you need to decide whether your application should act as a MCP server or client. Both can be configured in +the ``mcp`` section of your ``config/packages/mcp.yaml`` file. + +**Act as Server** + +.. warning:: + + Currently only supports tools. Support for prompts, resources, and other features coming soon. + +To use your application as an MCP server, exposing tools to clients like `Claude Desktop`_, you need to configure in the +``client_transports`` section the transports you want to expose to clients. You can use either STDIO or SSE. + +**Act as Client** + +.. warning:: + + Not implemented yet, but planned for the future. + +To use your application as an MCP client, integrating other MCP servers, you need to configure the ``servers`` you want +to connect to. 
You can use either STDIO or Server-Sent Events (SSE) as transport methods. + +You can find a list of example Servers in the `MCP Server List`_. + +Tools of those servers are available in your `AI Bundle`_ configuration and usable in your agents. + +Configuration +------------- + +.. code-block:: yaml + + # config/packages/mcp.yaml + mcp: + app: 'app' # Application name to be exposed to clients + version: '1.0.0' # Application version to be exposed to clients + + client_transports: + stdio: true # Enable STDIO via command + sse: true # Enable Server-Sent Event via controller + + servers: + name: + transport: 'stdio' # Transport method to use, either 'stdio' or 'sse' + stdio: + command: 'php /path/bin/console mcp' # Command to execute to start the client + arguments: [] # Arguments to pass to the command + sse: + url: 'http://localhost:8000/sse' # URL to SSE endpoint of MCP server + +.. _`Model Context Protocol`: https://modelcontextprotocol.io/ +.. _`symfony/mcp-sdk`: https://github.com/symfony/mcp-sdk +.. _`Claude Desktop`: https://claude.ai/download +.. _`MCP Server List`: https://modelcontextprotocol.io/examples +.. _`AI Bundle`: https://github.com/symfony/ai-bundle diff --git a/src/mcp-sdk/README.md b/src/mcp-sdk/README.md index c7637bed..b646e208 100644 --- a/src/mcp-sdk/README.md +++ b/src/mcp-sdk/README.md @@ -2,6 +2,11 @@ Model Context Protocol SDK for Client and Server applications in PHP. +**This Component is experimental**. +[Experimental features](https://symfony.com/doc/current/contributing/code/experimental.html) +are not covered by Symfony's +[Backward Compatibility Promise](https://symfony.com/doc/current/contributing/code/bc.html). + ## Installation ```bash @@ -17,10 +22,7 @@ https://github.com/symfony/ai to create issues or submit pull requests. ## Resources -- [Documentation](doc/index.rst) -- [Report issues](https://github.com/symfony/ai/issues) and - [send Pull Requests](https://github.com/symfony/ai/pulls) - in the [main Symfony AI repository](https://github.com/symfony/ai) - -[1]: https://symfony.com/backers -[3]: https://symfony.com/sponsor +- [Documentation](doc/index.rst) +- [Report issues](https://github.com/symfony/ai/issues) and + [send Pull Requests](https://github.com/symfony/ai/pulls) + in the [main Symfony AI repository](https://github.com/symfony/ai) diff --git a/src/mcp-sdk/doc/index.rst b/src/mcp-sdk/doc/index.rst index 847944b2..36606601 100644 --- a/src/mcp-sdk/doc/index.rst +++ b/src/mcp-sdk/doc/index.rst @@ -7,7 +7,7 @@ a PHP application and an LLM model. Installation ------------ -Install the bundle using Composer: +Install the SDK using Composer: .. code-block:: terminal @@ -59,9 +59,7 @@ the server supports is defined in the ``Symfony\AI\McpSdk\Server\RequestHandler\ When the client connects, it sees the capabilities and will ask the server to list the tools/resource/prompts etc. When you want to add a new capability, example a **Tool** that can tell the current time, you need to provide some metadata to the -``Symfony\AI\McpSdk\Server\RequestHandler\ToolListHandler``. - -.. code-block: php +``Symfony\AI\McpSdk\Server\RequestHandler\ToolListHandler``:: namespace App; @@ -95,16 +93,14 @@ the tools/resource/prompts etc. When you want to add a new capability, example a } } -We would also need a class to actually execute the tool. - -.. 
code-block: php +We would also need a class to actually execute the tool:: namespace App; - use Symfony\AI\McpSdk\Capability\Tool\ToolExecutorInterface; use Symfony\AI\McpSdk\Capability\Tool\IdentifierInterface; use Symfony\AI\McpSdk\Capability\Tool\ToolCall; use Symfony\AI\McpSdk\Capability\Tool\ToolCallResult; + use Symfony\AI\McpSdk\Capability\Tool\ToolExecutorInterface; class CurrentTimeToolExecutor implements ToolExecutorInterface, IdentifierInterface { @@ -123,9 +119,7 @@ We would also need a class to actually execute the tool. } } -If you have multiple tools, you can put them in a ToolChain. - -.. code-block: php +If you have multiple tools, you can put them in a ToolChain:: $tools = new ToolChain([ new CurrentTimeToolMetadata(), diff --git a/src/platform/README.md b/src/platform/README.md new file mode 100644 index 00000000..1b7e03da --- /dev/null +++ b/src/platform/README.md @@ -0,0 +1,24 @@ +# Symfony AI - Platform Component + +The Platform component provides an abstraction for interacting with different models, their providers and contracts. + +**This Component is experimental**. +[Experimental features](https://symfony.com/doc/current/contributing/code/experimental.html) +are not covered by Symfony's +[Backward Compatibility Promise](https://symfony.com/doc/current/contributing/code/bc.html). + +## Installation + +```bash +composer require symfony/ai-platform +``` + +**This repository is a READ-ONLY sub-tree split**. See +https://github.com/symfony/ai to create issues or submit pull requests. + +## Resources + +- [Documentation](doc/index.rst) +- [Report issues](https://github.com/symfony/ai/issues) and + [send Pull Requests](https://github.com/symfony/ai/pulls) + in the [main Symfony AI repository](https://github.com/symfony/ai) diff --git a/src/platform/doc/index.rst b/src/platform/doc/index.rst new file mode 100644 index 00000000..bdbb65ef --- /dev/null +++ b/src/platform/doc/index.rst @@ -0,0 +1,296 @@ +Symfony AI - Platform Component +=============================== + +The Platform component provides an abstraction for interacting with different models, their providers and contracts. + +Installation +------------ + +Install the component using Composer: + +.. code-block:: terminal + + $ composer require symfony/ai-platform + +Purpose +------- + +The Platform component provides a unified interface for working with various AI models, hosted and run by different +providers. It allows developers to easily switch between different AI models and providers without changing their +application code. This is particularly useful for applications that require flexibility in choosing AI models based on +specific use cases or performance requirements. + +Usage +----- + +The instantiation of the ``Symfony\AI\Platform\Platform`` class is usually delegated to a provider-specific factory, +with a provider being OpenAI, Azure, Google, Replicate, and others. 
+
+For example, to use the OpenAI provider, you would typically do something like this::
+
+    use Symfony\AI\Platform\Bridge\OpenAI\Embeddings;
+    use Symfony\AI\Platform\Bridge\OpenAI\GPT;
+    use Symfony\AI\Platform\Bridge\OpenAI\PlatformFactory;
+
+    // Platform
+    $platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
+
+    // Embeddings Model
+    $embeddings = new Embeddings();
+
+    // Language Model, here in its gpt-4o-mini version
+    $model = new GPT(GPT::GPT_4O_MINI);
+
+With a ``Symfony\AI\Platform\PlatformInterface`` instance and a ``Symfony\AI\Platform\Model`` instance, you can now use
+the platform to interact with the AI model::
+
+    use Symfony\AI\Platform\Message\Message;
+    use Symfony\AI\Platform\Message\MessageBag;
+
+    // Generate a vector embedding for a text, returns a Symfony\AI\Platform\Response\VectorResponse
+    $vectorResponse = $platform->request($embeddings, 'What is the capital of France?');
+
+    // Generate a text completion with GPT, returns a Symfony\AI\Platform\Response\TextResponse
+    $textResponse = $platform->request($model, new MessageBag(Message::ofUser('What is the capital of France?')));
+
+Depending on the model and its capabilities, different types of inputs and outputs are supported, which results in a
+very flexible and powerful interface for working with AI models.
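+
+Because ``PlatformFactory::create()`` returns a provider-agnostic ``Symfony\AI\Platform\PlatformInterface``, switching
+providers only changes this bootstrap code. The following is a minimal sketch for Anthropic; it assumes the Anthropic
+bridge exposes a ``PlatformFactory`` with the same ``create()`` signature as the OpenAI bridge above, which is an
+assumption of this example rather than documented API::
+
+    use Symfony\AI\Platform\Bridge\Anthropic\Claude;
+    use Symfony\AI\Platform\Bridge\Anthropic\PlatformFactory;
+
+    // Factory signature and no-argument Claude constructor are assumed by analogy with the OpenAI bridge
+    $platform = PlatformFactory::create($_ENV['ANTHROPIC_API_KEY']);
+    $model = new Claude();
+
+The rest of the application keeps calling ``$platform->request($model, $input)`` unchanged.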
+
+Models
+------
+
+The component provides a model base class ``Symfony\AI\Platform\Model``, which is a combination of a model name, a set
+of capabilities, and additional options. Usually, bridges to specific providers extend this base class to provide a
+quick start for vendor-specific models and their capabilities; see ``Symfony\AI\Platform\Bridge\Anthropic\Claude`` or
+``Symfony\AI\Platform\Bridge\OpenAI\GPT``.
+
+**Capabilities** are a list of strings defined by ``Symfony\AI\Platform\Capability``, which can be used to check if a
+model supports a specific feature, like ``Capability::INPUT_AUDIO`` or ``Capability::OUTPUT_IMAGE``.
+
+**Options** are additional parameters that can be passed to the model, like ``temperature`` or ``max_tokens``, and are
+usually defined by the specific models and their documentation.
+
+**Supported Models & Platforms**
+
+* **Language Models**
+  * `OpenAI's GPT`_ with `OpenAI`_ and `Azure`_ as Platform
+  * `Anthropic's Claude`_ with `Anthropic`_ and `AWS Bedrock`_ as Platform
+  * `Meta's Llama`_ with `Azure`_, `Ollama`_, `Replicate`_ and `AWS Bedrock`_ as Platform
+  * `Google's Gemini`_ with `Google`_ and `OpenRouter`_ as Platform
+  * `DeepSeek's R1`_ with `OpenRouter`_ as Platform
+  * `Amazon's Nova`_ with `AWS Bedrock`_ as Platform
+  * `Mistral's Mistral`_ with `Mistral`_ as Platform
+* **Embeddings Models**
+  * `OpenAI's Text Embeddings`_ with `OpenAI`_ and `Azure`_ as Platform
+  * `Voyage's Embeddings`_ with `Voyage`_ as Platform
+  * `Mistral Embed`_ with `Mistral`_ as Platform
+* **Other Models**
+  * `OpenAI's Dall·E`_ with `OpenAI`_ as Platform
+  * `OpenAI's Whisper`_ with `OpenAI`_ and `Azure`_ as Platform
+  * All models provided by `HuggingFace`_ can be listed with a command in the examples folder,
+    and also filtered, e.g. ``php examples/huggingface/_model-listing.php --provider=hf-inference --task=object-detection``
+
+See `GitHub`_ for planned support of other models and platforms.
+
+Options
+-------
+
+The third parameter of the ``request`` method is an array of options, which wraps the options of the corresponding
+model and platform, like ``temperature`` or ``stream``::
+
+    $response = $platform->request($model, $input, [
+        'temperature' => 0.7,
+        'max_tokens' => 100,
+    ]);
+
+.. note::
+
+    For model- and platform-specific options, please refer to the respective documentation.
+
+Language Models and Messages
+----------------------------
+
+One central feature of the Platform component is its support for language models and for easing the interaction with
+them. This is supported by providing an extensive set of data classes around the concept of messages and their content.
+
+Messages can be of different types, most importantly ``UserMessage``, ``SystemMessage``, or ``AssistantMessage``, can
+have different content types, like ``Text``, ``Image`` or ``Audio``, and can be grouped into a ``MessageBag``::
+
+    use Symfony\AI\Platform\Message\Content\Image;
+    use Symfony\AI\Platform\Message\Message;
+    use Symfony\AI\Platform\Message\MessageBag;
+
+    // Create a message bag with a system and a user message
+    $messageBag = new MessageBag(
+        Message::forSystem('You are a helpful assistant.'),
+        Message::ofUser('Please describe this picture?', Image::fromFile('/path/to/image.jpg')),
+    );
+
+Response Streaming
+------------------
+
+Since LLMs usually generate a response word by word, most of them also support streaming the response using
+Server-Sent Events (SSE). Symfony AI supports that by abstracting the conversion and returning a ``Generator`` as the
+content of the response::
+
+    use Symfony\AI\Agent\Agent;
+    use Symfony\AI\Platform\Message\Message;
+    use Symfony\AI\Platform\Message\MessageBag;
+
+    // Initialize Platform and LLM
+
+    $agent = new Agent($platform, $model);
+    $messages = new MessageBag(
+        Message::forSystem('You are a thoughtful philosopher.'),
+        Message::ofUser('What is the purpose of an ant?'),
+    );
+    $response = $agent->call($messages, [
+        'stream' => true, // enable streaming of response text
+    ]);
+
+    foreach ($response->getContent() as $word) {
+        echo $word;
+    }
+
+In a terminal application, this generator can be used directly; in a web application, an additional layer like
+`Mercure`_ is needed.
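+
+For example, with `Mercure`_ each streamed chunk can be forwarded to the hub as soon as it arrives. This is only a
+rough sketch: it assumes the ``symfony/mercure`` package with an autowired ``HubInterface``, and the topic name and
+payload format are arbitrary choices::
+
+    use Symfony\Component\Mercure\HubInterface;
+    use Symfony\Component\Mercure\Update;
+
+    /** @var HubInterface $hub */
+    foreach ($response->getContent() as $word) {
+        // Publish every chunk to an arbitrary "chat" topic; the client subscribes to the same topic
+        $hub->publish(new Update('chat', json_encode(['chunk' => $word])));
+    }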
+
+**Code Examples**
+
+* `Streaming Claude`_
+* `Streaming GPT`_
+* `Streaming Mistral`_
+
+Image Processing
+----------------
+
+Some LLMs also support images as input, which Symfony AI supports as a content type within the ``UserMessage``::
+
+    use Symfony\AI\Platform\Message\Content\Image;
+    use Symfony\AI\Platform\Message\Content\ImageUrl;
+    use Symfony\AI\Platform\Message\Message;
+    use Symfony\AI\Platform\Message\MessageBag;
+
+    // Initialize Platform, LLM & agent
+
+    $messages = new MessageBag(
+        Message::forSystem('You are an image analyzer bot that helps identify the content of images.'),
+        Message::ofUser(
+            'Describe the image as a comedian would do it.',
+            Image::fromFile(dirname(__DIR__).'/tests/fixtures/image.jpg'), // Path to an image file
+            Image::fromDataUrl('data:image/png;base64,...'), // Data URL of an image
+            new ImageUrl('https://foo.com/bar.png'), // URL to an image
+        ),
+    );
+    $response = $agent->call($messages);
+
+**Code Examples**
+
+* `Binary Image Input with GPT`_
+* `Image URL Input with GPT`_
+
+Audio Processing
+----------------
+
+Similar to images, some LLMs also support audio as input, which is just another content type within the
+``UserMessage``::
+
+    use Symfony\AI\Platform\Message\Content\Audio;
+    use Symfony\AI\Platform\Message\Message;
+    use Symfony\AI\Platform\Message\MessageBag;
+
+    // Initialize Platform, LLM & agent
+
+    $messages = new MessageBag(
+        Message::ofUser(
+            'What is this recording about?',
+            Audio::fromFile('/path/audio.mp3'), // Path to an audio file
+        ),
+    );
+    $response = $agent->call($messages);
+
+**Code Examples**
+
+* `Audio Input with GPT`_
+
+Embeddings
+----------
+
+Creating embeddings of words, sentences, or paragraphs is a typical use case around the interaction with LLMs.
+
+The standalone usage results in ``Vector`` instances::
+
+    use Symfony\AI\Platform\Bridge\OpenAI\Embeddings;
+
+    // Initialize Platform
+
+    $embeddings = new Embeddings(Embeddings::TEXT_3_SMALL);
+
+    $vectors = $platform->request($embeddings, $textInput)->getContent();
+
+    dump($vectors[0]->getData()); // returns something like: [0.123, -0.456, 0.789, ...]
+
+**Code Examples**
+
+* `Embeddings with OpenAI`_
+* `Embeddings with Voyage`_
+* `Embeddings with Mistral`_
+
+Parallel Platform Calls
+-----------------------
+
+Since the ``Platform`` sits on top of Symfony's HttpClient component, it supports multiple model calls in parallel,
+which can be useful to speed up the processing::
+
+    // Initialize Platform & Model
+
+    foreach ($inputs as $input) {
+        $responses[] = $platform->request($model, $input);
+    }
+
+    foreach ($responses as $response) {
+        echo $response->getContent().PHP_EOL;
+    }
+
+.. note::
+
+    This requires the cURL library and the ``ext-curl`` extension to be installed.
+
+**Code Examples**
+
+* `Parallel GPT Calls`_
+* `Parallel Embeddings Calls`_
+
+.. note::
+
+    Please be aware that some embedding models also support batch processing out of the box.
+
+.. _`OpenAI's GPT`: https://platform.openai.com/docs/models/overview
+.. _`OpenAI`: https://platform.openai.com/docs/overview
+.. _`Azure`: https://learn.microsoft.com/azure/ai-services/openai/concepts/models
+.. _`Anthropic's Claude`: https://www.anthropic.com/claude
+.. _`Anthropic`: https://www.anthropic.com/
+.. _`AWS Bedrock`: https://aws.amazon.com/bedrock/
+.. _`Meta's Llama`: https://www.llama.com/
+.. _`Ollama`: https://ollama.com/
+.. _`Replicate`: https://replicate.com/
+.. _`Google's Gemini`: https://gemini.google.com/
+.. _`Google`: https://ai.google.dev/
+.. _`OpenRouter`: https://www.openrouter.com/
+..
_`DeepSeek's R1`: https://www.deepseek.com/ +.. _`Amazon's Nova`: https://nova.amazon.com +.. _`Mistral's Mistral`: https://www.mistral.ai/ +.. _`Mistral`: https://www.mistral.ai/ +.. _`OpenAI's Text Embeddings`: https://platform.openai.com/docs/guides/embeddings/embedding-models +.. _`Voyage's Embeddings`: https://docs.voyageai.com/docs/embeddings +.. _`Voyage`: https://www.voyageai.com/ +.. _`Mistral Embed`: https://www.mistral.ai/ +.. _`OpenAI's Dall·E`: https://platform.openai.com/docs/guides/image-generation +.. _`OpenAI's Whisper`: https://platform.openai.com/docs/guides/speech-to-text +.. _`HuggingFace`: https://huggingface.co/ +.. _`GitHub`: https://github.com/symfony/ai/issues/16 +.. _`Mercure`: https://mercure.rocks/ +.. _`Streaming Claude`: https://github.com/symfony/ai/blob/main/examples/anthropic/stream.php +.. _`Streaming GPT`: https://github.com/symfony/ai/blob/main/examples/openai/stream.php +.. _`Streaming Mistral`: https://github.com/symfony/ai/blob/main/examples/mistral/stream.php +.. _`Binary Image Input with GPT`: https://github.com/symfony/ai/blob/main/examples/openai/image-input-binary.php +.. _`Image URL Input with GPT`: https://github.com/symfony/ai/blob/main/examples/openai/image-input-url.php +.. _`Audio Input with GPT`: https://github.com/symfony/ai/blob/main/examples/openai/audio-input.php +.. _`Embeddings with OpenAI`: https://github.com/symfony/ai/blob/main/examples/openai/embeddings.php +.. _`Embeddings with Voyage`: https://github.com/symfony/ai/blob/main/examples/voyage/embeddings.php +.. _`Embeddings with Mistral`: https://github.com/symfony/ai/blob/main/examples/mistral/embeddings.php +.. _`Parallel GPT Calls`: https://github.com/symfony/ai/blob/main/examples/misc/parallel-chat-gpt.php +.. _`Parallel Embeddings Calls`: https://github.com/symfony/ai/blob/main/examples/misc/parallel-embeddings.php diff --git a/src/store/README.md b/src/store/README.md new file mode 100644 index 00000000..c90ff018 --- /dev/null +++ b/src/store/README.md @@ -0,0 +1,24 @@ +# Symfony AI - Store Component + +The Store component provides a low-level abstraction for storing and retrieving documents in a vector store. + +**This Component is experimental**. +[Experimental features](https://symfony.com/doc/current/contributing/code/experimental.html) +are not covered by Symfony's +[Backward Compatibility Promise](https://symfony.com/doc/current/contributing/code/bc.html). + +## Installation + +```bash +composer require symfony/ai-store +``` + +**This repository is a READ-ONLY sub-tree split**. See +https://github.com/symfony/ai to create issues or submit pull requests. 
+
+## Resources
+
+- [Documentation](doc/index.rst)
+- [Report issues](https://github.com/symfony/ai/issues) and
+  [send Pull Requests](https://github.com/symfony/ai/pulls)
+  in the [main Symfony AI repository](https://github.com/symfony/ai)
diff --git a/src/store/composer.json b/src/store/composer.json
index 2ec1c653..122838b5 100644
--- a/src/store/composer.json
+++ b/src/store/composer.json
@@ -1,7 +1,7 @@
 {
     "name": "symfony/ai-store",
     "type": "library",
-    "description": "PHP library for abstracting interaction with data stores in AI applications.",
+    "description": "Low-level abstraction for storing and retrieving documents in a vector store.",
     "keywords": [
         "ai",
         "mongodb",
diff --git a/src/store/doc/index.rst b/src/store/doc/index.rst
new file mode 100644
index 00000000..f2dfa71a
--- /dev/null
+++ b/src/store/doc/index.rst
@@ -0,0 +1,89 @@
+Symfony AI - Store Component
+============================
+
+The Store component provides a low-level abstraction for storing and retrieving documents in a vector store.
+
+Installation
+------------
+
+Install the component using Composer:
+
+.. code-block:: terminal
+
+    $ composer require symfony/ai-store
+
+Purpose
+-------
+
+A typical use case in agentic applications is dynamically extending the context with similar and useful information,
+known as `Retrieval Augmented Generation`_ (RAG). The Store component defines low-level interfaces that can be
+implemented by concrete, vendor-specific implementations, so-called bridges.
+On top of those bridges, the Store component provides higher-level features to populate those stores with documents
+and to query them for documents.
+
+Indexing
+--------
+
+One of those higher-level features is the ``Symfony\AI\Store\Indexer``. The purpose of this service is to populate a
+store with documents. It accepts one or more ``Symfony\AI\Store\Document\TextDocument`` objects, converts them into
+embeddings, and stores them in the configured vector store::
+
+    use Symfony\AI\Store\Document\TextDocument;
+    use Symfony\AI\Store\Indexer;
+
+    $indexer = new Indexer($platform, $model, $store);
+    $document = new TextDocument('This is a sample document.');
+    $indexer->index($document);
+
+You can find more advanced usage, in combination with an agent that uses the store for RAG, in the examples folder:
+
+* `Similarity Search with MongoDB (RAG)`_
+* `Similarity Search with Pinecone (RAG)`_
+
+Supported Stores
+----------------
+
+* `Azure AI Search`_
+* `Chroma`_
+* `MongoDB Atlas`_
+* `Pinecone`_
+
+.. note::
+
+    See `GitHub`_ for planned stores.
+
+Implementing a Bridge
+---------------------
+
+The main extension points of the Store component are:
+
+* ``Symfony\AI\Store\StoreInterface`` - Takes care of adding documents to the store.
+* ``Symfony\AI\Store\VectorStoreInterface`` - Takes care of querying the store for documents.
+
+This leads to a store implementing two methods (imports of the document and vector classes are omitted for brevity)::
+
+    use Symfony\AI\Store\StoreInterface;
+    use Symfony\AI\Store\VectorStoreInterface;
+
+    class MyStore implements StoreInterface, VectorStoreInterface
+    {
+        public function add(VectorDocument ...$documents): void
+        {
+            // Implementation to add a document to the store
+        }
+
+        public function query(Vector $vector, array $options = [], ?float $minScore = null): array
+        {
+            // Implementation to query the store for documents
+            return [];
+        }
+    }
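+
+Once a store is populated, it can be queried with the vector embedding of a search text. The following is a minimal
+sketch that combines the ``MyStore`` class from above with the OpenAI ``Embeddings`` bridge of the Platform component;
+it assumes the store has already been filled by the ``Indexer`` and that ``symfony/ai-platform`` is installed::
+
+    use Symfony\AI\Platform\Bridge\OpenAI\Embeddings;
+    use Symfony\AI\Platform\Bridge\OpenAI\PlatformFactory;
+
+    // Embed the search text with the Platform component ...
+    $platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
+    $vectors = $platform->request(new Embeddings(), 'Which documents mention France?')->getContent();
+
+    // ... and look up similar documents in the store
+    $store = new MyStore();
+    $documents = $store->query($vectors[0]);
+
+What the returned array contains, and how results are hydrated into documents, is up to the concrete store
+implementation.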
+
+.. _`Retrieval Augmented Generation`: https://de.wikipedia.org/wiki/Retrieval-Augmented_Generation
+.. _`Similarity Search with MongoDB (RAG)`: https://github.com/symfony/ai/blob/main/examples/store/mongodb-similarity-search.php
+.. _`Similarity Search with Pinecone (RAG)`: https://github.com/symfony/ai/blob/main/examples/store/pinecone-similarity-search.php
+.. _`Azure AI Search`: https://azure.microsoft.com/products/ai-services/ai-search
+.. _`Chroma`: https://www.trychroma.com/
+.. _`MongoDB Atlas`: https://www.mongodb.com/atlas
+.. _`Pinecone`: https://www.pinecone.io/
+.. _`GitHub`: https://github.com/symfony/ai/issues/16