Package: sacRebleu 0.2.0
sacRebleu: Metrics for Assessing the Quality of Generated Text
Implementation of the BLEU score in 'C++' to evaluate the quality of generated text. The BLEU score, introduced by Papineni et al. (2002) <doi:10.3115/1073083.1073135>, measures the n-gram overlap between a generated text and one or more reference texts. The package also provides the smoothing methods described in Chen and Cherry (2014) <doi:10.3115/v1/W14-3346>.
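For reference, the standard definition from Papineni et al. (2002) combines modified n-gram precisions with a brevity penalty; whether the package uses exactly these defaults (e.g. N = 4 with uniform weights) is not stated here and should be checked in the manual:

$$
\text{BLEU} = \mathrm{BP} \cdot \exp\!\left( \sum_{n=1}^{N} w_n \log p_n \right),
\qquad
\mathrm{BP} =
\begin{cases}
1 & \text{if } c > r \\
e^{\,1 - r/c} & \text{if } c \le r
\end{cases}
$$

where $p_n$ is the modified n-gram precision, $w_n$ the n-gram weights (uniform $w_n = 1/N$ in the original paper), $c$ the candidate length, and $r$ the effective reference length.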
Authors:
sacRebleu_0.2.0.tar.gz
sacRebleu_0.2.0.tar.gz (r-4.5-noble), sacRebleu_0.2.0.tar.gz (r-4.4-noble)
sacRebleu_0.2.0.tgz (r-4.4-emscripten), sacRebleu_0.2.0.tgz (r-4.3-emscripten)
sacRebleu.pdf | sacRebleu.html
sacRebleu/json (API)
NEWS
# Install 'sacRebleu' in R:
install.packages('sacRebleu', repos = c('https://cran.r-universe.dev', 'https://cloud.r-project.org'))
Bug tracker: https://github.com/lazerlambda/sacrebleu/issues
Last updated 2 hours ago from: 53df1ec189. Checks: 2 OK. Indexed: yes.
Target | Result | Latest binary |
---|---|---|
Doc / Vignettes | OK | Jan 22 2025 |
R-4.5-linux-x86_64 | OK | Jan 22 2025 |
Exports: bleu_corpus_ids, bleu_sentence_ids, validate_arguments, validate_references
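A minimal usage sketch of the two BLEU exports, assuming (based on the *_ids suffix) that candidates and references are passed as integer token-id vectors; the argument order and the exact shape of the corpus-level inputs shown here are assumptions, so consult the help pages for the actual signatures:

library(sacRebleu)

# Sentence-level BLEU: one candidate (integer token ids) scored against
# a list of reference token-id vectors.
cand <- c(1, 2, 3, 4, 5, 6, 7)
refs <- list(c(1, 2, 3, 4, 5, 6, 7), c(1, 2, 3, 4, 5, 6, 8))
bleu_sentence_ids(refs, cand)

# Corpus-level BLEU: a list of candidates, each paired with its own
# list of reference token-id vectors.
cands <- list(c(1, 2, 3), c(4, 5, 6, 7))
refs_per_cand <- list(list(c(1, 2, 3), c(1, 2, 4)), list(c(4, 5, 6, 8)))
bleu_corpus_ids(refs_per_cand, cands)

The smoothing methods of Chen and Cherry (2014) mentioned in the description are also available; the corresponding argument names are documented in the help pages.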
Readme and manuals
Help Manual
Help page | Topics |
---|---|
Computes corpus-level BLEU score (Papineni et al., 2002). | bleu_corpus_ids |
Computes sentence-level BLEU score (Papineni et al., 2002). | bleu_sentence_ids |
Validate Arguments | validate_arguments |
Validate References | validate_references |