Efficient quantum circuits for machine learning activation functions including constant T-depth ReLU

In recent years, Quantum Machine Learning (QML) has increasingly captured the interest of researchers. Among the components in this domain, activation functions hold a fundamental and indispensable role. Our research focuses on the development of quantum circuits for activation functions, for integration into fault-tolerant quantum computing architectures, with an emphasis on minimizing T-depth. Specifically, we present novel implementations of the ReLU and leaky ReLU activation functions, achieving constant T-depths of 4 and 8, respectively. Leveraging quantum lookup tables, we extend our exploration to other activation functions such as the sigmoid. This approach enables us to customize precision and T-depth by adjusting the number of qubits, making our results adaptable to various application scenarios. This study represents a significant advancement towards enhancing the practicality and application of quantum machine learning.
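
As a reader aid, the following is a minimal classical sketch (in Python) of the fixed-point semantics that such activation-function circuits compute; it is not the authors' quantum construction. The register width n, the power-of-two leak factor 2^-k for leaky ReLU, and the sigmoid input range [-8, 8] are illustrative assumptions, not values taken from the paper.

    import math

    # Illustrative fixed-point reference semantics on n-bit two's-complement
    # values, as a quantum arithmetic circuit would process them.

    def relu_fixed_point(x: int, n: int = 8) -> int:
        """ReLU: pass x through if the sign bit is 0, otherwise output 0."""
        sign = (x >> (n - 1)) & 1      # the single sign-bit control
        return 0 if sign else x

    def leaky_relu_fixed_point(x: int, n: int = 8, k: int = 3) -> int:
        """Leaky ReLU with a power-of-two leak 2^-k (an assumption made here
        for illustration): negative inputs are arithmetically right-shifted
        instead of zeroed."""
        sign = (x >> (n - 1)) & 1
        if not sign:
            return x
        signed = x - (1 << n)              # interpret the n-bit pattern as negative
        return (signed >> k) & ((1 << n) - 1)  # arithmetic shift, back to n bits

    def sigmoid_lookup_table(n_in: int, n_out: int, lo: float = -8.0, hi: float = 8.0):
        """Lookup-table discretization of the sigmoid: 2^n_in input points
        mapped to n_out-bit outputs, mirroring a quantum lookup table."""
        scale = (1 << n_out) - 1
        return [
            round(scale / (1.0 + math.exp(-(lo + (hi - lo) * i / ((1 << n_in) - 1)))))
            for i in range(1 << n_in)
        ]

    print(relu_fixed_point(0b11110000))        # negative input -> 0
    print(leaky_relu_fixed_point(0b11110000))  # -16 -> -2, as an 8-bit pattern (254)
    print(sigmoid_lookup_table(4, 8)[:4])      # first entries of a 16-point table

The lookup-table sketch makes the stated trade-off concrete: each additional input qubit doubles the number of table entries, and each additional output qubit halves the output quantization step.
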

Bibliographic Details
Main Authors: Zi, Wei, Wang, Siyi, Kim, Hyunji, Sun, Xiaoming, Chattopadhyay, Anupam, Rebentrost, Patrick
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2024
Subjects: Computer and Information Science; Activation functions; Fault-tolerant quantum computing
Online Access:https://hdl.handle.net/10356/182156
Institution: Nanyang Technological University
Citation: Zi, W., Wang, S., Kim, H., Sun, X., Chattopadhyay, A. & Rebentrost, P. (2024). Efficient quantum circuits for machine learning activation functions including constant T-depth ReLU. Physical Review Research, 6(4), 043048. https://dx.doi.org/10.1103/PhysRevResearch.6.043048
Journal: Physical Review Research
ISSN: 2643-1564
DOI: 10.1103/PhysRevResearch.6.043048
Scopus: 2-s2.0-85210979532
Handle: https://hdl.handle.net/10356/182156
Version: Published version (application/pdf)
Funding Agencies: Agency for Science, Technology and Research (A*STAR); Ministry of Education (MOE); National Research Foundation (NRF)
Funding Statement: This work is supported in part by the National Research Foundation, Singapore, and A*STAR under its CQT Bridging Grant and its Quantum Engineering Programme under Grant No. NRF2021-QEP2-02-P05; in part by an MoE Tier-1 grant; in part by the National Natural Science Foundation of China Grant No. 62325210 and the Strategic Priority Research Program of the Chinese Academy of Sciences Grant No. XDB28000000; and by the China Scholarship Council.
Rights: © The Author(s). Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.