Velum Labs embodies the ambition to democratize AI, making it accessible, understandable, and applicable to real-world scenarios. At its core, Velum Labs hosts the Thor Framework, a sophisticated, high-performance foundation for building AI agents, designed to thrive in environments where speed, security, and scalability are paramount. Thor leverages the strengths of both Rust and Golang, combining Rust's safety and concurrency with Golang's simplicity and robustness to create a unique platform for AI development.
Our mission at Velum Labs is to:
- Empower Developers: Provide tools and frameworks that allow developers of all levels to create, test, and deploy AI agents with ease.
- Enhance Social Interaction: Utilize AI to analyze, engage, and enhance social media interactions, notably on platforms like Twitter, with agents that can understand and react to the nuances of human communication.
- Promote Ethical AI: Ensure that our technology is developed with ethical considerations at the forefront, focusing on privacy, security, and responsible AI practices.
- Drive Innovation: Continuously explore new methodologies, algorithms, and integrations to keep our tech stack at the cutting edge of AI development.
Our Vision is a world where AI agents are seamlessly integrated into everyday life, augmenting human capabilities and enriching digital experiences. We envision a future where AI not only responds but anticipates, learns, and adapts in real-time to the ever-evolving landscape of social media and beyond.
Thor - An Advanced Conversational LLM Framework
Thor is a modular, open-source conversational AI framework designed for the decentralized era. It combines the power of advanced AI with flexible, extensible system design.
Components:
- A plugin-based architecture with hot-swappable components
- Multi-provider LLM support, including OpenAI and bespoke connectors
- Semantic storage backed by PostgreSQL and pgvector
- An extensible manager system for customizing conversational behavior
Explore Thor in its repository.
Kit - LLM Function Calling Framework
Toolkit is a Go and Rust package for constructing and managing LLM function calling tools. It offers a simple interface for developing AI-compatible functions and grouping them into toolkits.
Key features:
- An abstract tool interface for implementing LLM-compatible functions
- Toolkits for grouping related tools
- A functional options pattern for configuration
- JSON Schema support for function parameters and return values
Explore Toolkit in its repository.
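To make the shape of these features concrete, here is a minimal sketch of what a tool implementation could look like in Go. The `Tool` interface and the `addTool` type below are hypothetical illustrations, not Toolkit's actual API; consult the repository for the real interface.

```go
package example

import (
	"context"
	"encoding/json"
)

// Tool is a hypothetical rendering of Toolkit's abstract tool interface:
// a name, a JSON Schema for the parameters, and an execute function.
type Tool interface {
	Name() string
	ParametersSchema() json.RawMessage
	Execute(ctx context.Context, args json.RawMessage) (json.RawMessage, error)
}

// addTool adds two numbers; it exists only to show the shape of a tool.
type addTool struct{}

func (addTool) Name() string { return "add" }

// ParametersSchema describes the tool's arguments as a JSON Schema object,
// which is the format LLM function-calling APIs consume.
func (addTool) ParametersSchema() json.RawMessage {
	return json.RawMessage(`{
		"type": "object",
		"properties": {
			"a": {"type": "number"},
			"b": {"type": "number"}
		},
		"required": ["a", "b"]
	}`)
}

// Execute decodes the model-supplied arguments, runs the function, and
// returns a JSON result for the model to read.
func (addTool) Execute(ctx context.Context, args json.RawMessage) (json.RawMessage, error) {
	var in struct{ A, B float64 }
	if err := json.Unmarshal(args, &in); err != nil {
		return nil, err
	}
	return json.Marshal(map[string]float64{"sum": in.A + in.B})
}
```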
- LLM Providers: Add new AI providers by implementing the LLM provider interface:

```go
type Provider interface {
	GenerateCompletion(context.Context, CompletionRequest) (string, error)
	GenerateJSON(context.Context, JSONRequest, interface{}) error
	EmbedText(context.Context, string) ([]float32, error)
}
```
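For illustration, a stub provider satisfying this interface might look like the sketch below. The import path and the `CompletionRequest`/`JSONRequest` types are assumed to come from the Thor repository; the `echoProvider` itself is hypothetical.

```go
package example

import (
	"context"
	"encoding/json"

	// Hypothetical import path; the real package location is defined
	// by the Thor repository.
	"github.com/velumlabs/thor/llm"
)

// echoProvider is a stub used to illustrate the Provider interface;
// a real implementation would call out to an external LLM API.
type echoProvider struct{}

// Compile-time check that echoProvider satisfies llm.Provider.
var _ llm.Provider = (*echoProvider)(nil)

// GenerateCompletion would normally send the request to a model and
// return its text; the stub returns a fixed string.
func (p *echoProvider) GenerateCompletion(ctx context.Context, req llm.CompletionRequest) (string, error) {
	return "stub completion", nil
}

// GenerateJSON would normally ask the model for structured output and
// decode it into out; the stub decodes an empty object.
func (p *echoProvider) GenerateJSON(ctx context.Context, req llm.JSONRequest, out interface{}) error {
	return json.Unmarshal([]byte(`{}`), out)
}

// EmbedText would normally return the model's embedding for text;
// the stub returns a zero vector (1536 dimensions assumed).
func (p *echoProvider) EmbedText(ctx context.Context, text string) ([]float32, error) {
	return make([]float32, 1536), nil
}
```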
- Managers: Create new behaviors by implementing the Manager interface:

```go
type Manager interface {
	GetID() ManagerID
	GetDependencies() []ManagerID
	Process(state *state.State) error
	PostProcess(state *state.State) error
	Context(state *state.State) ([]state.StateData, error)
	Store(fragment *db.Fragment) error
	StartBackgroundProcesses()
	StopBackgroundProcesses()
	RegisterEventHandler(callback EventCallbackFunc)
	triggerEvent(eventData EventData)
}
```
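A custom manager can then be sketched as follows. Note that `triggerEvent` is unexported, so in Go a type outside the framework's package cannot satisfy this interface directly; the sketch assumes the framework exposes an embeddable base type (called `manager.Base` here, a hypothetical name) that supplies the event plumbing. The import paths are likewise assumptions.

```go
package example

import (
	// Hypothetical import paths; the real ones are defined in the Thor repository.
	"github.com/velumlabs/thor/db"
	"github.com/velumlabs/thor/manager"
	"github.com/velumlabs/thor/state"
)

// moodManager is a hypothetical custom manager that could tag each
// conversation turn with a sentiment label.
type moodManager struct {
	manager.Base // assumed embeddable base providing RegisterEventHandler/triggerEvent
}

// GetID assumes ManagerID has a string underlying type.
func (m *moodManager) GetID() manager.ManagerID { return "mood" }

// GetDependencies lists managers that must run before this one; none here.
func (m *moodManager) GetDependencies() []manager.ManagerID { return nil }

// Process runs before a response is generated; analysis would go here.
func (m *moodManager) Process(s *state.State) error { return nil }

// PostProcess runs after the response has been generated.
func (m *moodManager) PostProcess(s *state.State) error { return nil }

// Context contributes this manager's data to the shared state.
func (m *moodManager) Context(s *state.State) ([]state.StateData, error) {
	return nil, nil
}

// Store persists a conversation fragment for this manager.
func (m *moodManager) Store(fragment *db.Fragment) error { return nil }

// No background work in this sketch.
func (m *moodManager) StartBackgroundProcesses() {}
func (m *moodManager) StopBackgroundProcesses()  {}
```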
- Manager System: Extend functionality through custom managers
  - Insight Manager: Extracts and maintains conversation insights
  - Personality Manager: Handles response behavior and style
  - Custom Managers: Add your own specialized behaviors
- Shared State System: Centralized state management across components
  - Manager-specific data storage
  - Custom data injection
  - Cross-manager communication
- Provider Abstraction: Support for multiple LLM providers
  - Built-in OpenAI support
  - Extensible provider interface for custom LLMs
  - Configurable model selection per operation
  - Automatic fallback and retry handling
- Platform-Agnostic Core:
  - Abstract conversation engine independent of platforms
  - Built-in support for CLI chat and Twitter
  - Extensible platform manager interface
  - Example implementations for new platform integrations
- Flexible Data Storage (see the storage sketch after this list):
  - PostgreSQL with pgvector for semantic search
  - GORM-based data models
  - Customizable fragment storage
  - Vector embedding support
- Pluggable Tool/Function Integration:
  - Support for custom tool implementations
  - Built-in toolkit management
  - Function calling capabilities
  - Automatic tool response handling
  - State-aware tool execution
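As a concrete illustration of the storage layer, the sketch below pairs a GORM model with a pgvector embedding column and a nearest-neighbor query. The `Fragment` schema, table name, and embedding dimension are assumptions for illustration; the framework's real models live in the repository.

```go
package example

import (
	"github.com/pgvector/pgvector-go"
	"gorm.io/gorm"
)

// Fragment is a hypothetical GORM model illustrating fragment storage:
// raw content alongside its embedding in a pgvector column.
type Fragment struct {
	ID        uint `gorm:"primaryKey"`
	Content   string
	Embedding pgvector.Vector `gorm:"type:vector(1536)"` // dimension assumed
}

// nearestFragments returns the k stored fragments closest to the query
// embedding, using pgvector's cosine-distance operator (<=>).
func nearestFragments(db *gorm.DB, query []float32, k int) ([]Fragment, error) {
	var out []Fragment
	err := db.Raw(
		"SELECT * FROM fragments ORDER BY embedding <=> ? LIMIT ?",
		pgvector.NewVector(query), k,
	).Scan(&out).Error
	return out, err
}
```

pgvector also offers `<->` (Euclidean distance) and `<#>` (negative inner product) if a different similarity metric fits the embedding model better.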
Decentralization and cryptocurrency are the foundation of everything we create. These principles ensure that our technology remains equitable, accessible, and free from control by any single entity. By fusing AI with the cryptocurrency ecosystem, we build solutions that embrace transparency, security, and autonomy.
We are building the decentralized AI of the future, in the cryptocurrency industry and beyond. Whether you are an open-source enthusiast, a researcher, or a developer, we would love to work with you. To join us, see our contribution guidelines.
Stay updated and engage with us:
Velum Labs