Deakin University
File(s) under permanent embargo

A derivative-free method for quantum perceptron training in multi-layered neural networks

conference contribution
posted on 2020-01-01, 00:00 authored by Tariq Khan, Antonio Robles-Kelly
In this paper, we present a gradient-free approach for training multi-layered neural networks based upon quantum perceptrons. Here, we depart from the classical perceptron and the elemental operations on quantum bits, i.e. qubits, so as to formulate the problem in terms of quantum perceptrons. We then make use of measurable operators to define the states of the network in a manner consistent with a Markov process. This yields a Dirac–von Neumann formulation consistent with quantum mechanics. Moreover, the formulation presented here has the advantage of a computational cost that is independent of the number of layers in the network. This, paired with the natural efficiency of quantum computing, can yield a significant improvement in efficiency, particularly for deep networks. Last but not least, the developments here are quite general in nature, since the approach presented here can also be used for quantum-inspired neural networks implemented on conventional computers.
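The abstract gives no algorithm listing, so the following is only a minimal quantum-inspired sketch of the general idea on a conventional computer, not the authors' formulation. It assumes (as an illustration) that each weight is a rotation angle, that a neuron's activation is the probability sin²(phase) of measuring its qubit in state |1⟩, and that training is a derivative-free greedy random search over the angles instead of backpropagation. All names and hyperparameters here are hypothetical.

```python
import math
import random


def forward(x, theta, n_hidden):
    """Quantum-inspired forward pass: every neuron accumulates a phase
    from its angle-valued weights and outputs sin^2(phase), interpreted
    as the probability of measuring its qubit in state |1>."""
    idx = 0
    hidden = []
    for _ in range(n_hidden):
        phase = theta[idx] + theta[idx + 1] * x[0] + theta[idx + 2] * x[1]
        idx += 3
        hidden.append(math.sin(phase) ** 2)
    phase = theta[idx] + sum(theta[idx + 1 + j] * h
                             for j, h in enumerate(hidden))
    return math.sin(phase) ** 2


def train_xor(n_hidden=4, steps=20000, sigma=0.4, seed=1):
    """Derivative-free training on XOR: perturb all angles at once and
    keep the candidate only if the squared error improves. No gradients
    are ever computed."""
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    rng = random.Random(seed)
    n_params = 3 * n_hidden + n_hidden + 1  # hidden angles + output angles
    theta = [rng.uniform(-math.pi, math.pi) for _ in range(n_params)]

    def loss(t):
        return sum((forward(x, t, n_hidden) - y) ** 2 for x, y in data)

    best = loss(theta)
    for _ in range(steps):
        cand = [t + rng.gauss(0, sigma) for t in theta]
        err = loss(cand)
        if err < best:  # greedy accept: the only training signal is the loss
            theta, best = cand, err
    return theta, best
```

Because the update rule only compares loss values, the same loop applies unchanged however many layers the network has, which loosely mirrors the paper's point that a derivative-free formulation avoids layer-by-layer gradient propagation.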

History

Event

Neural Information Processing. International Conference (27th : 2020 : Online from Bangkok, Thailand)

Volume

1333

Series

Neural Information Processing International Conference

Pagination

241-250

Publisher

Springer

Location

Bangkok, Thailand

Place of publication

Cham, Switzerland

Start date

2020-11-18

End date

2020-11-22

ISSN

1865-0929

eISSN

1865-0937

ISBN-13

9783030638221

Language

eng

Publication classification

E1 Full written paper - refereed

Editor/Contributor(s)

H Yang, K Pasupa, A Leung, J Kwok, J Chan, I King

Title of proceedings

ICONIP 2020 : Proceedings of the 27th International Conference on Neural Information Processing
