Yoshifusa Ito. A Weak Condition on Linear Independence of Unscaled Shifts of a Function and Finite Mappings by Neural Networks. ICANN, 2002, pp. 337-343. [Conf]
Yoshifusa Ito. Surface-Tracing Approximation by Basis Functions and Its Application to Neural Networks. IJCNN (4), 2000, pp. 227-231. [Conf]
Yoshifusa Ito. Activation Functions Defined on Higher-Dimensional Spaces for Approximation on Compact Sets with and without Scaling. Neural Computation, 2003, 15(9):2199-2226. [Journal]
Yoshifusa Ito. Representation of functions by superpositions of a step or sigmoid function and their applications to neural network theory. Neural Networks, 1991, 4(3):385-394. [Journal]
Yoshifusa Ito. Approximation of functions on a compact set by finite sums of a sigmoid function without scaling. Neural Networks, 1991, 4(6):817-826. [Journal]
Yoshifusa Ito. Approximation of continuous functions on R^d by linear combinations of shifted rotations of a sigmoid function with and without scaling. Neural Networks, 1992, 5(1):105-115. [Journal]
Yoshifusa Ito. Multi-category Bayesian Decision by Neural Networks.
Yoshifusa Ito. Learning of Bayesian Discriminant Functions by a Layered Neural Network.
Yoshifusa Ito. Learning of Mahalanobis Discriminant Functions by a Neural Network.
Yoshifusa Ito. A Neural Network having Fewer Inner Constants to be Trained and Bayesian Decision.