Abstract: This paper presents two models of complex-valued neurons (CVNs) for real-valued classification problems, incorporating two newly proposed activation functions, and examines their abilities and the differences between them on benchmark problems. In both models, each real-valued input is encoded into a phase between 0 and π of a complex number of unit magnitude and multiplied by a complex-valued weight. The weighted sum of inputs is fed to an activation function. The activation functions of both models map complex values to real values; their role is to divide the net-input (weighted-sum) space into multiple regions representing the classes of input patterns. A gradient-based learning rule is derived for each activation function. The abilities of such CVNs are discussed and tested on two-class problems, such as two- and three-input Boolean problems and symmetry detection in binary sequences. We show that both models can form proper decision boundaries for these linear and nonlinear problems. For solving n-class problems, a complex-valued neural network (CVNN) consisting of n CVNs is also considered. We tested such single-layered CVNNs on several real-world benchmark problems. The results show that the classification and generalization abilities of single-layered CVNNs are comparable to those of conventional real-valued neural networks (RVNNs) with one hidden layer. Moreover, CVNNs converge much faster than RVNNs in most cases.
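The phase encoding and forward pass described in the abstract can be sketched as below. This is a minimal illustration, not the paper's implementation: the activation used here (the cosine of the net input's phase) is a hypothetical placeholder, since the abstract does not specify the two newly proposed activation functions.

```python
import cmath
import math

def encode(x):
    """Encode a real input x, normalized to [0, 1], as a unit-magnitude
    complex number whose phase lies in [0, pi], as the abstract describes."""
    return cmath.exp(1j * math.pi * x)

def cvn_forward(inputs, weights, bias=0j):
    """Forward pass of a single complex-valued neuron (CVN).

    Each encoded input is multiplied by a complex-valued weight and the
    weighted sum (net input) is mapped to a real value.  NOTE: the
    activation below is an assumed placeholder; the paper's actual
    activation functions are not given in this abstract.
    """
    net = sum(w * encode(x) for w, x in zip(weights, inputs)) + bias
    return math.cos(cmath.phase(net))  # real-valued output in [-1, 1]

# Example: a two-input CVN with arbitrary complex weights.
out = cvn_forward([0.2, 0.7], [1.0 + 0.5j, 0.3 - 0.2j])
```

Because the activation depends only on the phase of the net input, such a neuron partitions the complex net-input plane into angular regions, which matches the abstract's description of dividing the net-input space into class regions.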
Primary source URL: https://u-fukui.repo.nii.ac.jp/?action=repository_action_common_download&item_id=22350&item_no=1&attribute_id=22&file_no=1
DOI: 10.1016/j.neucom.2008.04.006
Source database: National Institute of Informatics: Academic Institutional Repositories Database (IRDB) (institutional repository)