Source: https://arxiv.org/abs/2401.17377
[Submitted on 30 Jan 2024 (v1), last revised 4 Apr 2024 (this version, v3)]

Infini-gram: Scaling Unbounded n-gram Language Models to a Trillion Tokens


Are n-gram language models still relevant in this era of neural large language models (LLMs)? Our answer is yes, and we showcase their value both in text analysis and in improving neural LLMs. We do so by modernizing n-gram LMs in two aspects. First, we train them at the same data scale as neural LLMs: 5 trillion tokens. This is the largest n-gram LM ever built. Second, existing n-gram LMs use a small n, which hinders their performance; we instead allow n to be arbitrarily large by introducing a new ∞-gram LM with backoff. Instead of pre-computing n-gram count tables (which would be prohibitively expensive), we develop an engine named infini-gram, powered by suffix arrays, that can compute ∞-gram probabilities (as well as n-gram probabilities for arbitrary n) with millisecond-level latency. The ∞-gram framework and the infini-gram engine enable many novel and interesting analyses of human-written and machine-generated text: we find that the ∞-gram LM achieves fairly high next-token prediction accuracy (47%) and can complement neural LLMs to greatly reduce their perplexity. When analyzing machine-generated text, we also observe irregularities in how well machine text agrees with the ∞-gram LM as the suffix length varies, which indicates deficiencies in neural LLM pretraining and in the positional embeddings of Transformers.
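
The ∞-gram estimate described in the abstract is simple to state: given a context, find the longest suffix of that context that occurs in the corpus, then set P(next | context) = cnt(suffix + next) / cnt(suffix). A suffix array makes each count two binary searches over the sorted corpus suffixes. The following is a minimal Python sketch of that idea, not the authors' engine or its API; the function names and toy corpus are illustrative, and it assumes Python 3.10+ for bisect's key parameter.

    from bisect import bisect_left, bisect_right

    def build_suffix_array(tokens):
        # Naive O(n^2 log n) construction for illustration; a real engine
        # would use a linear-time builder over the full token stream.
        return sorted(range(len(tokens)), key=lambda i: tokens[i:])

    def count(tokens, sa, pattern):
        # Occurrences of `pattern` = width of the contiguous run of
        # suffixes that start with `pattern` (two binary searches).
        key = lambda i: tokens[i:i + len(pattern)]
        return bisect_right(sa, pattern, key=key) - bisect_left(sa, pattern, key=key)

    def infgram_prob(tokens, sa, context, next_token):
        # Backoff: use the longest suffix of `context` found in the corpus;
        # the empty suffix always matches, yielding the unigram estimate.
        for start in range(len(context) + 1):
            suffix = context[start:]
            denom = count(tokens, sa, suffix)
            if denom > 0:
                return count(tokens, sa, suffix + [next_token]) / denom

    corpus = "the cat sat on the mat . the cat sat on the sofa .".split()
    sa = build_suffix_array(corpus)
    print(infgram_prob(corpus, sa, "a dog sat on the".split(), "mat"))  # 0.5

On this toy corpus, the query backs off past the unseen prefix "a dog" to the longest suffix that does occur, "sat on the" (count 2, followed by "mat" once), returning 0.5.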
Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI); Information Retrieval (cs.IR)
Cite as: arXiv:2401.17377 [cs.CL]
  (or arXiv:2401.17377v3 [cs.CL] for this version)
  https://doi.org/10.48550/arXiv.2401.17377
