Record Detail
Signal Perceptron: On the Identifiability of Boolean Function Spaces and Beyond
In a seminal book, Minsky and Papert define the perceptron as a limited implementation of what they called “parallel machines.” They showed that some binary Boolean functions, including XOR, cannot be represented by a single-layer perceptron, which is limited to learning linearly separable functions. In this work, we propose a new, more powerful implementation of such parallel machines. This new mathematical tool is defined using analytic sinusoids, instead of linear combinations, to form an analytic signal representation of the function that we want to learn. We show that this reformulated parallel mechanism can learn, with a single layer, any non-linear k-ary Boolean function. Finally, as an example of its practical applications, we show that it outperforms the single-hidden-layer multilayer perceptron on both Boolean function learning and image classification tasks, while also being faster and requiring fewer parameters.
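To make the representational claim in the abstract concrete, here is a minimal sketch, not the paper's exact construction: a single "signal" layer of the form f(x) = Σ_w a_w · cos(π · w·x) over fixed frequency vectors w ∈ {0,1}^k, with the amplitudes a_w found by least squares. On Boolean inputs, cos(π · w·x) = (−1)^(w·x), so the 2^k cosine terms span every k-ary Boolean function, and even XOR is fitted exactly by one layer. The helper names (`fit_signal_layer`, `predict`) and the choice of frequency grid are illustrative assumptions.

```python
# Minimal illustration (not the paper's exact construction): a single sinusoidal layer
# f(x) = sum_w a_w * cos(pi * w.x) over fixed frequency vectors w in {0,1}^k.
# On Boolean inputs, cos(pi * w.x) = (-1)^(w.x), so the 2^k basis functions span
# every k-ary Boolean function and the amplitudes can be found by least squares.
import itertools
import numpy as np

def fit_signal_layer(truth_table, k):
    """Fit amplitudes a_w so that sum_w a_w * cos(pi * w.x) matches the truth table."""
    xs = np.array(list(itertools.product([0, 1], repeat=k)), dtype=float)  # all k-bit inputs
    ws = np.array(list(itertools.product([0, 1], repeat=k)), dtype=float)  # frequency vectors
    basis = np.cos(np.pi * xs @ ws.T)                   # (2^k, 2^k) design matrix
    amps, *_ = np.linalg.lstsq(basis, np.asarray(truth_table, dtype=float), rcond=None)
    return ws, amps

def predict(ws, amps, x):
    return float(np.cos(np.pi * np.asarray(x, dtype=float) @ ws.T) @ amps)

# XOR, the classic function a single-layer (linear) perceptron cannot learn:
xor_table = [0, 1, 1, 0]                                # outputs for inputs 00, 01, 10, 11
ws, amps = fit_signal_layer(xor_table, k=2)
for x in itertools.product([0, 1], repeat=2):
    print(x, round(predict(ws, amps, x)))               # recovers 0, 1, 1, 0
```

The least-squares solve here stands in for whatever training procedure the paper uses; the point it demonstrates is purely representational: a single sinusoidal layer suffices for any k-ary Boolean function, where a single linear layer does not.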
Availability
No copy data
Detail Information
Series Title | -
---|---
Call Number | -
Publisher | Frontiers in Artificial Intelligence : Switzerland, 2022
Collation | 006
Language | English
ISBN/ISSN | 2624-8212
Classification | NONE
Content Type | -
Media Type | -
Carrier Type | -
Edition | -
Subject(s) | -
Specific Detail Info | -
Statement of Responsibility | -
Other Information
Accreditation | Scopus Q3
Other version/related
No other version available