- Introduction
- Maven Dependency
- Building
- Quick Application - Language Identification
- Detailed Examples
- API
- FastText's Command Line
- License
- References
- Changelog
## Introduction

JFastText is a Java wrapper for Facebook's fastText, a library for efficient learning of word embeddings and fast sentence classification. The JNI interface is built using javacpp.

The library provides fastText's full command line interface. It also provides an API for loading a trained model from file and performing label prediction in memory. Model training and quantization are supported via the command line interface.

JFastText is ideal for building fast text classifiers in Java.
## Maven Dependency

<dependency>
  <groupId>io.github.carschno</groupId>
  <artifactId>jfasttext</artifactId>
  <version>0.9.1</version>
</dependency>
The JAR package on Maven Central is bundled with a precompiled fastText library, but currently only for Linux (64 bit), not for Windows or Mac OS X. In order to use JFastText on Windows or Mac OS X (or any other system), you need to build it yourself (see below).
## Building

A C++ compiler (g++ on Mac/Linux or cl.exe on Windows) is required to compile fastText's code.
git clone --recursive https://github.com/carschno/JFastText
cd JFastText
git submodule init
git submodule update
mvn package
The (automatic) build seems to fail on some Windows systems/C++ compilers. See this issue:

> I used MS's developer tools, not the full-blown Visual Studio. If I run `cl` directly, the compilation fails with the same error. I was able to build on Windows by changing the call to `cl.exe` and running it outside the Maven build. I changed one parameter in the call to `cl`: I use `/MT` (whereas Maven uses `/MD`). Bundling the generated DLLs works fine.
## Quick Application - Language Identification

JFastText can use fastText's pretrained models directly. Language identification models can be downloaded here. In this quick example, we will use the quantized model, which is much smaller and only slightly less accurate than the original model.
$ wget -q https://s3-us-west-1.amazonaws.com/fasttext-vectors/supervised_models/lid.176.ftz \
&& { echo "This is English"; echo "Xin chào"; echo "Привет"; } \
| java -jar target/jfasttext-*-jar-with-dependencies.jar predict lid.176.ftz -
__label__en
__label__vi
__label__ru
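
The same language identification can also be done through the Java API. The sketch below assumes that the quantized `lid.176.ftz` model has been downloaded to the working directory as in the command above, and that this build can load quantized (`.ftz`) models in memory; it uses the `loadModel` and `predictProba` methods shown in the API section (the class name is just for illustration):

```java
import com.github.jfasttext.JFastText;

public class LanguageIdExample {
    public static void main(String[] args) {
        JFastText jft = new JFastText();
        // Assumes lid.176.ftz was downloaded to the working directory (see the wget command above)
        jft.loadModel("lid.176.ftz");

        String[] samples = {"This is English", "Xin chào", "Привет"};
        for (String text : samples) {
            JFastText.ProbLabel prediction = jft.predictProba(text);
            System.out.printf("%s -> %s (p=%.3f)%n",
                    text, prediction.label, Math.exp(prediction.logProb));
        }
    }
}
```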
## Detailed Examples

Examples of how to use JFastText can be found in examples/api and examples/cmd.
## API

import com.github.jfasttext.JFastText;
...
JFastText jft = new JFastText();
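// Train a skipgram word embedding model (unsupervised learning)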
jft.runCmd(new String[] {
        "skipgram",
        "-input", "src/test/resources/data/unlabeled_data.txt",
        "-output", "src/test/resources/models/skipgram.model",
        "-bucket", "100",
        "-minCount", "1"
});
// Train supervised model
jft.runCmd(new String[] {
        "supervised",
        "-input", "src/test/resources/data/labeled_data.txt",
        "-output", "src/test/resources/models/supervised.model"
});
// Load model from file
jft.loadModel("src/test/resources/models/supervised.model.bin");
// Do label prediction
String text = "What is the most popular sport in the US ?";
JFastText.ProbLabel probLabel = jft.predictProba(text);
System.out.printf("\nThe label of '%s' is '%s' with probability %f\n",
        text, probLabel.label, Math.exp(probLabel.logProb));
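
Assuming the wrapper also exposes a word-vector lookup such as `getVector(String word)` (this method is not shown above, so check the `JFastText` class for the exact name and signature), the skipgram model trained earlier can be queried for embeddings. A sketch:

```java
// A sketch: look up a word embedding from the skipgram model trained above.
// getVector is assumed here to return the vector as a List<Float>; verify the
// exact API in the JFastText class before relying on it.
jft.loadModel("src/test/resources/models/skipgram.model.bin");
java.util.List<Float> vector = jft.getVector("sport");
System.out.println("Vector dimension: " + vector.size());
```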
## FastText's Command Line

FastText's command line interface can be accessed as follows:
$ java -jar target/jfasttext-*-jar-with-dependencies.jar
usage: fasttext <command> <args>
The commands supported by fasttext are:
supervised train a supervised classifier
quantize quantize a model to reduce the memory usage
test evaluate a supervised classifier
predict predict most likely labels
predict-prob predict most likely labels with probabilities
skipgram train a skipgram model
cbow train a cbow model
print-word-vectors print word vectors given a trained model
print-sentence-vectors print sentence vectors given a trained model
print-ngrams print ngrams given a trained model and word
nn query for nearest neighbors
analogies query for analogies
dump dump arguments,dictionary,input/output vectors
For example:
$ java -jar target/jfasttext-*-jar-with-dependencies.jar quantize -h
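
Since `runCmd` takes the same arguments as the command line, any of these commands can also be invoked programmatically from Java. For example, a sketch that evaluates the supervised model trained in the API example above (reusing the labeled training file as test data purely for illustration):

```java
import com.github.jfasttext.JFastText;

JFastText jft = new JFastText();
// Equivalent to: fasttext test <model> <test-data>
jft.runCmd(new String[] {
        "test",
        "src/test/resources/models/supervised.model.bin",
        "src/test/resources/data/labeled_data.txt"
});
```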
## License

BSD
## References

(From fastText's references)

Please cite [1] if using this code for learning word representations, or [2] if using it for text classification.
[1] P. Bojanowski*, E. Grave*, A. Joulin, T. Mikolov, Enriching Word Vectors with Subword Information
@article{bojanowski2016enriching,
  title={Enriching Word Vectors with Subword Information},
  author={Bojanowski, Piotr and Grave, Edouard and Joulin, Armand and Mikolov, Tomas},
  journal={arXiv preprint arXiv:1607.04606},
  year={2016}
}
[2] A. Joulin, E. Grave, P. Bojanowski, T. Mikolov, Bag of Tricks for Efficient Text Classification
@article{joulin2016bag,
  title={Bag of Tricks for Efficient Text Classification},
  author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Mikolov, Tomas},
  journal={arXiv preprint arXiv:1607.01759},
  year={2016}
}
[3] A. Joulin, E. Grave, P. Bojanowski, M. Douze, H. Jégou, T. Mikolov, FastText.zip: Compressing text classification models
@article{joulin2016fasttext,
  title={FastText.zip: Compressing text classification models},
  author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Douze, Matthijs and J{\'e}gou, Herv{\'e} and Mikolov, Tomas},
  journal={arXiv preprint arXiv:1612.03651},
  year={2016}
}
(* These authors contributed equally.)