Stars
The World's First AI-Enabled Multi-Modality Native Search Engine
🔍 An LLM-based multi-agent framework for a web search engine (like Perplexity.ai Pro and SearchGPT)
Implementations of Embedding-based methods for Knowledge Base Completion tasks
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
RWKV (pronounced RwaKuv) is an RNN with great LLM performance that can also be trained directly like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". It combines the best of RNN and transformer.
Generate embeddings from large-scale graph-structured data.
A collection of modern/faster/saner alternatives to common unix commands.
Notes on a CLI terminal built by upcycling an old Super 8 film viewer
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
Model parallel transformers in JAX and Haiku
The simplest, fastest repository for training/finetuning medium-sized GPTs.
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms.
A lightweight yet powerful audio-to-MIDI converter with pitch bend detection
Pronounced as "musician", musicnn is a set of pre-trained deep convolutional neural networks for music audio tagging.
[CVPR 2021] Official PyTorch implementation for Transformer Interpretability Beyond Attention Visualization, a novel method to visualize classifications by Transformer based networks.
Data and code for the paper "Future is not One-dimensional: Complex Event Schema Induction via Graph Modeling".
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
Documents from a live coding session by Christopher Wolfram related to content from the 2016 film Arrival
Python async ORM designed with FastAPI in mind, with Pydantic validation