A derivative-free method for quantum perceptron training in multi-layered neural networks

Khan, Tariq M and Robles-Kelly, Antonio 2020, A derivative-free method for quantum perceptron training in multi-layered neural networks, in ICONIP 2020 : Proceedings of the 27th International Conference on Neural Information Processing, Springer, Cham, Switzerland, pp. 241-250, doi: 10.1007/978-3-030-63823-8_29.


Title A derivative-free method for quantum perceptron training in multi-layered neural networks
Author(s) Khan, Tariq M (ORCID: orcid.org/0000-0002-7477-1591)
Robles-Kelly, Antonio (ORCID: orcid.org/0000-0002-2465-5971)
Conference name Neural Information Processing. International Conference (27th : 2020 : Online from Bangkok, Thailand)
Conference location Bangkok, Thailand
Conference dates 2020/11/18 - 2020/11/22
Title of proceedings ICONIP 2020 : Proceedings of the 27th International Conference on Neural Information Processing
Editor(s) Yang, H
Pasupa, K
Leung, AC-S
Kwok, JT
Chan, JH
King, I
Publication date 2020
Series Neural Information Processing International Conference
Start page 241
End page 250
Total pages 10
Publisher Springer
Place of publication Cham, Switzerland
Keyword(s) Quantum perceptron
Derivative-free training methods for quantum-inspired neural networks
Measurable operators
Summary In this paper, we present a gradient-free approach for training multi-layered neural networks based upon quantum perceptrons. Here, we depart from the classical perceptron and the elemental operations on quantum bits, i.e. qubits, so as to formulate the problem in terms of quantum perceptrons. We then make use of measurable operators to define the states of the network in a manner consistent with a Markov process. This yields a Dirac–von Neumann formulation consistent with quantum mechanics. Moreover, the formulation presented here has the advantage that its computational cost is independent of the number of layers in the network. This, paired with the natural efficiency of quantum computing, can yield a significant improvement in efficiency, particularly for deep networks. Last but not least, the developments here are quite general in nature, since the approach presented here can also be used for quantum-inspired neural networks implemented on conventional computers.
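To illustrate the closing claim of the abstract — that such models can be run as quantum-inspired networks on conventional computers — the sketch below simulates a single-qubit perceptron and trains it without derivatives. This is not the paper's algorithm: the rotation-angle encoding, the hill-climbing trainer, and all function names are illustrative assumptions, chosen only to show what "qubit activation plus derivative-free training" can look like in practice.

```python
import numpy as np

def qubit_perceptron(x, theta):
    # Encode the weighted input as a rotation angle on one simulated qubit:
    # R_y(phi)|0> has amplitude sin(phi/2) on |1>, so the measurement
    # probability P(|1>) = sin^2(phi/2) serves as the activation.
    phi = np.dot(theta[:-1], x) + theta[-1]  # weighted sum plus bias
    return np.sin(phi / 2.0) ** 2

def mse(theta, X, y):
    preds = np.array([qubit_perceptron(x, theta) for x in X])
    return float(np.mean((preds - y) ** 2))

def train_derivative_free(X, y, n_inputs, restarts=10, iters=500,
                          step=0.4, seed=0):
    # Gradient-free random hill climbing: perturb the parameters and keep
    # a candidate only when the loss improves. No derivatives are needed,
    # so the activation may be any measurable (even non-smooth) function.
    rng = np.random.default_rng(seed)
    best_theta, best_loss = None, np.inf
    for _ in range(restarts):
        theta = rng.normal(size=n_inputs + 1)
        loss = mse(theta, X, y)
        for _ in range(iters):
            cand = theta + step * rng.normal(size=theta.shape)
            cand_loss = mse(cand, X, y)
            if cand_loss < loss:
                theta, loss = cand, cand_loss
        if loss < best_loss:
            best_theta, best_loss = theta, loss
    return best_theta, best_loss

# XOR is linearly inseparable for a classical perceptron, but the periodic
# qubit activation represents it exactly (e.g. theta = [pi, pi, 0]).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
theta, final_loss = train_derivative_free(X, y, n_inputs=2)
```

Because the search only ever accepts improving candidates, the trainer is a simple stochastic local search; the multiple restarts hedge against local minima in the periodic loss landscape.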
ISBN 9783030638221
ISSN 1865-0929
1865-0937
Language eng
DOI 10.1007/978-3-030-63823-8_29
Indigenous content off
HERDC Research category E1 Full written paper - refereed
Persistent URL http://hdl.handle.net/10536/DRO/DU:30146509

Unless expressly stated otherwise, the copyright for items in DRO is owned by the author, with all rights reserved.

Citation counts: Web of Science: 0; Scopus: 1
Created: Thu, 07 Jan 2021, 14:16:51 EST

Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO. If you believe that your rights have been infringed by this repository, please contact drosupport@deakin.edu.au.