Embed circle packing chart for model summary (huggingface#20791)
* embed circle packing chart

* trim whitespace from bottom

* explain bubble sizes
stevhliu authored Dec 20, 2022
1 parent bd1a43b commit 3be028b
Showing 1 changed file with 6 additions and 1 deletion.
7 changes: 6 additions & 1 deletion docs/source/en/model_summary.mdx
@@ -12,7 +12,12 @@ specific language governing permissions and limitations under the License.

 # Summary of the models
 
-This is a summary of the models available in 🤗 Transformers. It assumes you’re familiar with the original [transformer
+This is a summary of the most downloaded models in 🤗 Transformers. Click on the large outermost bubble of each pretrained model category (encoder, decoder, encoder-decoder) to zoom in and out to see the most popular models within a modality. The size of each bubble corresponds to the number of downloads of each model.
+
+<iframe width="100%" height="900" frameborder="0"
+src="https://observablehq.com/embed/eafbe39385aaf8f2?cells=chart"></iframe>
+
+It assumes you're familiar with the original [transformer
 model](https://arxiv.org/abs/1706.03762). For a gentle introduction check the [annotated transformer](http://nlp.seas.harvard.edu/2018/04/03/attention.html). Here we focus on the high-level differences between the
 models. You can check them more in detail in their respective documentation. Also check out [the Model Hub](https://huggingface.co/models) where you can filter the checkpoints by model architecture.
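
The added text says bubble sizes reflect per-model download counts. The chart itself lives in the embedded Observable notebook, but as a rough, hedged sketch (not part of this commit) of where such numbers could come from, the `huggingface_hub` library can list models sorted by downloads; this assumes a recent `huggingface_hub` where `list_models` accepts `sort`/`direction`/`limit` and returns objects exposing `id` and `downloads`.

```python
# Hypothetical sketch: fetch the download counts a circle packing chart
# could use to size its bubbles. Assumes huggingface_hub's list_models
# supports sort="downloads" and returns ModelInfo objects with .downloads.
from huggingface_hub import list_models

# Most downloaded models on the Hub, largest first.
top_models = list_models(sort="downloads", direction=-1, limit=20)

for model in top_models:
    # `id` is the repo name (e.g. "bert-base-uncased"); `downloads` is the
    # count a bubble's size would be derived from.
    print(f"{model.id}: {model.downloads} downloads")
```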

