An intelligent search assistant powered by LangChain and Streamlit that helps you find and summarize information from the web in real time.
- Real-time Web Search: Utilizes Google Search to find the most relevant information
- Dynamic Content Fetching: Automatically retrieves and processes webpage content
- Interactive UI: Built with Streamlit for a smooth user experience
- Tool Use Transparency: See exactly how the agent processes your query with an expandable tools log
- Streaming Responses: Get information as it's being processed, no need to wait for complete responses
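The streaming behavior can be sketched with a plain Python generator; in the Streamlit UI such a generator can be rendered incrementally (e.g. via `st.write_stream`). The `fake_llm_stream` function below is a hypothetical stand-in for the model's token stream, not the repository's actual code:

```python
from typing import Iterator

def fake_llm_stream(answer: str) -> Iterator[str]:
    """Hypothetical stand-in for an LLM token stream: yields the
    response word by word instead of all at once."""
    for word in answer.split():
        yield word + " "

# In the app this generator would be consumed incrementally, e.g.:
#   st.write_stream(fake_llm_stream(answer))
# Here we just collect it to show that chunks arrive one at a time.
chunks = list(fake_llm_stream("Streaming lets users read partial answers early"))
full_response = "".join(chunks)
```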
- Frontend: Streamlit
- Backend: Python, LangChain
- AI/ML: GPT-4o-mini for natural language processing
- Tools:
  - Google Search Integration
  - Web Content Fetching
  - Real-time Response Streaming
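The two tools in the stack can be sketched as plain callables in a name-keyed registry. The function names and return shapes below are illustrative assumptions, not the repository's actual implementation:

```python
from typing import Callable, Dict, List

def google_search(query: str) -> List[dict]:
    """Illustrative search tool: in the real app this would call the
    Google Custom Search API and return {title, link} results."""
    return [{"title": f"Result for {query}", "link": "https://example.com"}]

def fetch_page(url: str) -> str:
    """Illustrative fetch tool: in the real app this would download the
    page at `url` and return its extracted text."""
    return f"<text content of {url}>"

# The agent looks tools up by name from a registry like this one.
TOOLS: Dict[str, Callable] = {
    "google_search": google_search,
    "fetch_page": fetch_page,
}

results = TOOLS["google_search"]("langchain streaming")
page = TOOLS["fetch_page"](results[0]["link"])
```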
1. Clone the repository

   ```bash
   git clone https://github.com/Xphi310302/google-search-agent-assistant.git
   cd search-agent
   ```

2. Install Poetry (if not already installed)

   ```bash
   pip install poetry
   ```

3. Install dependencies using Poetry

   ```bash
   poetry install
   source $(poetry env info --path)/bin/activate  # Activate the virtual environment
   ```

4. Set up environment variables

   Create a `.env` file with your API keys:

   ```bash
   OPENAI_API_KEY=your_openai_api_key
   GOOGLE_SEARCH_API_KEY=your_google_search_api_key
   GOOGLE_SEARCH_ENGINE_ID=your_google_search_engine_id
   ```

5. Run the application

   ```bash
   streamlit run main.py
   ```
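After creating the `.env` file, it can help to fail fast if a key is missing. A minimal check, assuming the three variable names from the setup step (the app itself may load them differently, e.g. via python-dotenv):

```python
import os

REQUIRED_KEYS = [
    "OPENAI_API_KEY",
    "GOOGLE_SEARCH_API_KEY",
    "GOOGLE_SEARCH_ENGINE_ID",
]

def check_env() -> list:
    """Return the names of required API keys missing from the environment."""
    return [key for key in REQUIRED_KEYS if not os.environ.get(key)]

missing = check_env()
if missing:
    print("Missing environment variables:", ", ".join(missing))
```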
- Enter your search query in the chat input
- Watch as the agent:
  - Searches Google for relevant information
  - Fetches and processes webpage content
  - Provides a comprehensive response
- View the Tools Use expander to see how the agent processes your query
- Query Processing: Your input is processed by the LangChain agent
- Tool Selection: The agent decides which tools to use:
  - Google Search for finding relevant sources
  - Web Fetching for retrieving detailed content
- Response Generation: Information is synthesized into a coherent response
- Real-time Updates: See the process through the Tools Use expander
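The four steps can be sketched as a single loop. In the real app an LLM decides which tools to call; the keyword heuristic and stub tools below are deliberately simple stand-ins, and every name here is illustrative:

```python
def google_search(query):
    # Stub search tool for the sketch (the real one calls Google Search).
    return [{"title": f"Result for {query}", "link": "https://example.com"}]

def fetch_page(url):
    # Stub fetch tool for the sketch (the real one downloads the page).
    return f"<content of {url}>"

def run_agent(query: str) -> str:
    """Illustrative agent loop: search, optionally fetch, then synthesize."""
    log = []  # would feed the "Tools Use" expander in the UI

    # 1. Query processing + 2. Tool selection: always search first.
    results = google_search(query)
    log.append(("google_search", query))

    # Fetch full content only for deeper queries (heuristic stand-in
    # for the LLM's tool-selection decision).
    if "detail" in query.lower():
        page = fetch_page(results[0]["link"])
        log.append(("fetch_page", results[0]["link"]))
    else:
        page = results[0]["title"]

    # 3. Response generation: an LLM call in the real app.
    return f"Based on {len(results)} source(s): {page}"
```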
```
search-agent/
├── main.py        # Main Streamlit application
├── agent/         # Agent implementation
│   ├── base.py    # Base agent class
│   └── tools/     # Tool implementations
├── config/        # Configuration files
├── utils/         # Utility functions
└── README.md      # Project documentation
```
- Experiment with other crawler tools for more reliable content fetching
This project is licensed under the MIT License - see the LICENSE file for details.
Phi Nguyen Xuan
- LinkedIn: Phinx
- Website: https://phinx.vercel.app
Built with ❤️ using LangChain and Streamlit