Maximum Software - New Jersey, USA
Stars
Quick implementation of nGPT (learning entirely on the hypersphere) from NVIDIA AI
The Python <-> Objective-C Bridge with bindings for macOS frameworks
see github.com/understanding-search/maze-transformer
Exploring the minimal architecture required for coherent English language generation.
A mini GPT-2 model built from scratch and trained on data from TinyStories
A 2M parameter neural language model trained on the TinyStories corpus.
Character-level small language model trained on the TinyStories dataset
Collection of experiments related to small language models, mostly seq2seq models
Reproduction of TinyStories: How Small Can Language Models Be and Still Speak Coherent English?
Code to train a GPT-2 model on the TinyStories dataset, following the TinyStories paper
Mathematica and MATLAB toolboxes for Clifford algebras of n-dimensional Euclidean vector spaces
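Several of the starred projects above revolve around nGPT's core idea of keeping parameters on the unit hypersphere during training. As a rough illustration (not the actual nGPT implementation), the constraint can be sketched as a plain gradient step followed by a retraction that renormalizes each row vector back to unit L2 norm; all names here are illustrative.

```python
import numpy as np

def project_to_sphere(w, eps=1e-8):
    # Renormalize each row vector to unit L2 norm, so the
    # parameter rows stay on the hypersphere after an update.
    norms = np.linalg.norm(w, axis=-1, keepdims=True)
    return w / np.maximum(norms, eps)

# Toy update: one gradient step, then retraction onto the sphere.
rng = np.random.default_rng(0)
w = project_to_sphere(rng.normal(size=(4, 8)))
grad = rng.normal(size=(4, 8))
w = project_to_sphere(w - 0.1 * grad)

# Every row of w now has unit norm again.
print(np.allclose(np.linalg.norm(w, axis=-1), 1.0))
```

The renormalization after each step is the simplest possible retraction; nGPT itself builds the constraint into the architecture rather than applying it as a post-hoc projection.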