Llemma: An Open Language Model For Mathematics. Azerbayev, Z., Schoelkopf, H., Paster, K., Dos Santos, M., McAleer, S., Jiang, A. Q., Deng, J., Biderman, S., & Welleck, S. November, 2023. arXiv:2310.10631 [cs]
@misc{azerbayev_llemma_2023,
	title = {Llemma: {An} {Open} {Language} {Model} {For} {Mathematics}},
	shorttitle = {Llemma},
	url = {http://arxiv.org/abs/2310.10631},
	doi = {10.48550/arXiv.2310.10631},
	abstract = {We present Llemma, a large language model for mathematics. We continue pretraining Code Llama on the Proof-Pile-2, a mixture of scientific papers, web data containing mathematics, and mathematical code, yielding Llemma. On the MATH benchmark Llemma outperforms all known open base models, as well as the unreleased Minerva model suite on an equi-parameter basis. Moreover, Llemma is capable of tool use and formal theorem proving without any further finetuning. We openly release all artifacts, including 7 billion and 34 billion parameter models, the Proof-Pile-2, and code to replicate our experiments.},
	urldate = {2024-01-16},
	publisher = {arXiv},
	author = {Azerbayev, Zhangir and Schoelkopf, Hailey and Paster, Keiran and Dos Santos, Marco and McAleer, Stephen and Jiang, Albert Q. and Deng, Jia and Biderman, Stella and Welleck, Sean},
	month = nov,
	year = {2023},
	note = {arXiv:2310.10631 [cs]},
	keywords = {artificial intelligence, computation and language, large language models, mentions sympy},
}
