Katsuyuki Hagiwara
Publications
- Katsuyuki Hagiwara, Kazuhiro Kuno, Shiro Usui: Upper Bounds on the Expected Training Errors of Neural Networks Regressions for a Gaussian Noise. ICONIP, 1998, pp. 502-505 [Conf]
- Katsuyuki Hagiwara, Kazuhiro Kuno: Regularization Learning and Early Stopping in Linear Networks. IJCNN (4), 2000, pp. 511-516 [Conf]
- Katsuyuki Hagiwara, Kazuhiro Kuno, Shiro Usui: On the Problem in Model Selection of Neural Network Regression in Overrealizable Scenario. IJCNN (6), 2000, pp. 461-466 [Conf]
- Katsuyuki Hagiwara: Regularization learning, early stopping and biased estimator. Neurocomputing, 2002, v. 48, n. 1-4, pp. 937-955 [Journal]
- Katsuyuki Hagiwara: On the Problem in Model Selection of Neural Network Regression in Overrealizable Scenario. Neural Computation, 2002, v. 14, n. 8, pp. 1979-2002 [Journal]
- Katsuyuki Hagiwara, Taichi Hayasaka, Naohiro Toda, Shiro Usui, Kazuhiro Kuno: Upper bound of the expected training error of neural network regression for a Gaussian noise sequence. Neural Networks, 2001, v. 14, n. 10, pp. 1419-1429 [Journal]
- Qi Jia, Katsuyuki Hagiwara, Naohiro Toda, Shiro Usui: Equivalence relation between the back propagation learning process of an FNN and that of an FNNG. Neural Networks, 1994, v. 7, n. 2, pp. 411- [Journal]
- Orthogonal Shrinkage Methods for Nonparametric Regression under Gaussian Noise.
- Orthogonalization and Thresholding Method for a Nonparametric Regression Problem.