File: supervised-models.md

---
id: supervised-models
title: Supervised models
---

This page gathers several supervised models pre-trained on various datasets.

### Description

The regular models are trained using the procedure described in [1]. They can be reproduced with the classification-results.sh script in our GitHub repository. The quantized models are built using the same supervised settings, with the following flags added to the quantize subcommand.

```bash
-qnorm -retrain -cutoff 100000
```

### Table of models

Each entry gives the test accuracy and the file size of the corresponding model (accuracy / size). You can click on a table cell to download the model.

| dataset   | ag news               | amazon review full    | amazon review polarity | dbpedia                |
|-----------|-----------------------|-----------------------|------------------------|------------------------|
| regular   | [0.924 / 387MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/ag_news.bin) | [0.603 / 462MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/amazon_review_full.bin) | [0.946 / 471MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/amazon_review_polarity.bin) | [0.986 / 427MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/dbpedia.bin) |
| compressed | [0.92 / 1.6MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/ag_news.ftz) | [0.599 / 1.6MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/amazon_review_full.ftz) | [0.93 / 1.6MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/amazon_review_polarity.ftz) | [0.984 / 1.7MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/dbpedia.ftz) |

| dataset   | sogou news           | yahoo answers          | yelp review polarity | yelp review full       |
|-----------|----------------------|------------------------|----------------------|------------------------|
| regular   | [0.969 / 402MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/sogou_news.bin) | [0.724 / 494MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/yahoo_answers.bin)| [0.957 / 409MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/yelp_review_polarity.bin)| [0.639 / 412MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/yelp_review_full.bin)|
| compressed | [0.968 / 1.4MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/sogou_news.ftz)   | [0.717 / 1.6MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/yahoo_answers.ftz)       | [0.957 / 1.5MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/yelp_review_polarity.ftz) | [0.636 / 1.5MB](https://dl.fbaipublicfiles.com/fasttext/supervised-models/yelp_review_full.ftz)  |

### References

If you use these models, please cite the following papers:

[1] A. Joulin, E. Grave, P. Bojanowski, T. Mikolov, [*Bag of Tricks for Efficient Text Classification*](https://arxiv.org/abs/1607.01759)

```markup
@article{joulin2016bag,
  title={Bag of Tricks for Efficient Text Classification},
  author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Mikolov, Tomas},
  journal={arXiv preprint arXiv:1607.01759},
  year={2016}
}
```

[2] A. Joulin, E. Grave, P. Bojanowski, M. Douze, H. Jégou, T. Mikolov, [*FastText.zip: Compressing text classification models*](https://arxiv.org/abs/1612.03651)

```markup
@article{joulin2016fasttext,
  title={FastText.zip: Compressing text classification models},
  author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Douze, Matthijs and J{\'e}gou, Herv{\'e} and Mikolov, Tomas},
  journal={arXiv preprint arXiv:1612.03651},
  year={2016}
}
```