An index-free sparse neural network using two-dimensional semiconductor ferroelectric field-effect transistors. Ning, H., Wen, H., Meng, Y., Yu, Z., Fu, Y., Zou, X., Shen, Y., Luo, X., Zhao, Q., Zhang, T., Liu, L., Zhu, S., Li, T., Li, W., Li, L., Gao, L., Shi, Y., & Wang, X. Nature Electronics, January 2025. Publisher: Nature Publishing Group.
Abstract: The fine-grained dynamic sparsity in biological synapses is an important element in the energy efficiency of the human brain. Emulating such sparsity in an artificial system requires off-chip memory indexing, which has a considerable energy and latency overhead. Here, we report an in-memory sparsity architecture in which index memory is moved next to individual synapses, creating a sparse neural network without external memory indexing. We use a compact building block consisting of two non-volatile ferroelectric field-effect transistors acting as a digital sparsity and an analogue weight. The network is formulated as the Hadamard product of the sparsity and weight matrices, and the hardware, which is comprised of 900 ferroelectric field-effect transistors, is based on wafer-scale chemical-vapour-deposited molybdenum disulfide integrated through back-end-of-line processes. With the system, we demonstrate key synaptic processes—including pruning, weight update and regrowth—in an unstructured and fine-grained manner. We also develop a vectorial approximate update algorithm and optimize training scheduling. Through this software–hardware co-optimization, we achieve 98.4% accuracy in an EMNIST letter recognition task under 75% sparsity. Simulations on large neural networks show a tenfold reduction in latency and a ninefold reduction in energy consumption when compared with a dense network of the same performance.
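As a rough illustration of the Hadamard-product formulation described in the abstract, the NumPy sketch below masks an analogue weight matrix W with a binary sparsity matrix S and applies simple pruning and regrowth rules. This is not the authors' code: the layer size, the magnitude-based pruning criterion and the random regrowth rule are assumptions made purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer size; not the paper's array dimensions.
n_in, n_out = 32, 16

# Analogue weight matrix W (FeFET conductances) and binary sparsity/index matrix S.
W = rng.normal(size=(n_in, n_out))
S = (rng.random((n_in, n_out)) > 0.75).astype(W.dtype)   # ~75% of synapses pruned (S = 0)

def forward(x, S, W):
    # Effective weights are the Hadamard product S * W, so pruned synapses
    # contribute nothing and no external index memory is consulted.
    return x @ (S * W)

def prune(W, sparsity):
    # Unstructured, fine-grained pruning by weight magnitude (an assumed criterion).
    k = int(sparsity * W.size)
    thresh = np.partition(np.abs(W).ravel(), k)[k]
    return (np.abs(W) >= thresh).astype(W.dtype)

def regrow(S, frac=0.02):
    # Regrowth: re-enable a small random fraction of pruned synapses (illustrative rule).
    off = np.argwhere(S == 0)
    pick = off[rng.choice(len(off), size=max(1, int(frac * len(off))), replace=False)]
    S = S.copy()
    S[pick[:, 0], pick[:, 1]] = 1.0
    return S

x = rng.normal(size=(1, n_in))
y = forward(x, S, W)          # masked matrix-vector product
S = prune(W, sparsity=0.75)   # pruning rewrites only the digital sparsity bits
S = regrow(S)                 # regrowth flips a few sparsity bits back on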
@article{ning_index-free_2025,
title = {An index-free sparse neural network using two-dimensional semiconductor ferroelectric field-effect transistors},
copyright = {2025 The Author(s), under exclusive licence to Springer Nature Limited},
issn = {2520-1131},
url = {https://www.nature.com/articles/s41928-024-01328-4},
doi = {10.1038/s41928-024-01328-4},
abstract = {The fine-grained dynamic sparsity in biological synapses is an important element in the energy efficiency of the human brain. Emulating such sparsity in an artificial system requires off-chip memory indexing, which has a considerable energy and latency overhead. Here, we report an in-memory sparsity architecture in which index memory is moved next to individual synapses, creating a sparse neural network without external memory indexing. We use a compact building block consisting of two non-volatile ferroelectric field-effect transistors acting as a digital sparsity and an analogue weight. The network is formulated as the Hadamard product of the sparsity and weight matrices, and the hardware, which is comprised of 900 ferroelectric field-effect transistors, is based on wafer-scale chemical-vapour-deposited molybdenum disulfide integrated through back-end-of-line processes. With the system, we demonstrate key synaptic processes—including pruning, weight update and regrowth—in an unstructured and fine-grained manner. We also develop a vectorial approximate update algorithm and optimize training scheduling. Through this software–hardware co-optimization, we achieve 98.4\% accuracy in an EMNIST letter recognition task under 75\% sparsity. Simulations on large neural networks show a tenfold reduction in latency and a ninefold reduction in energy consumption when compared with a dense network of the same performance.},
language = {en},
urldate = {2025-01-10},
journal = {Nature Electronics},
author = {Ning, Hongkai and Wen, Hengdi and Meng, Yuan and Yu, Zhihao and Fu, Yuxiang and Zou, Xilu and Shen, Yilin and Luo, Xiai and Zhao, Qiyue and Zhang, Tao and Liu, Lei and Zhu, Shitong and Li, Taotao and Li, Weisheng and Li, Li and Gao, Li and Shi, Yi and Wang, Xinran},
month = jan,
year = {2025},
note = {Publisher: Nature Publishing Group},
keywords = {Electronic devices, Two-dimensional materials},
pages = {1--13},
}