Incremental extreme learning machine.
Date of Issue: 2007
School of Electrical and Electronic Engineering
This new theory shows that, in order for SLFNs to work as universal approximators, one may simply choose the input-to-hidden nodes randomly; only the output weights linking the hidden layer to the output layer then need to be adjusted. In such SLFN implementations, the activation functions for additive nodes can be any bounded nonconstant piecewise continuous functions, and the activation functions for RBF nodes can be any integrable piecewise continuous functions. We propose two incremental algorithms: 1) the incremental extreme learning machine (I-ELM) and 2) the convex I-ELM (CI-ELM).
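The I-ELM idea described above can be sketched as follows: at each step a hidden node with randomly chosen parameters is added to the network, and only that node's output weight is computed analytically from the current residual error. This is a minimal sketch for single-output regression with sigmoid additive nodes; the function names and interface are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def i_elm(X, y, max_nodes=50, rng=0):
    """Sketch of I-ELM: grow the network one random hidden node at a time.

    Only the output weight beta of each new node is fitted; the node's
    input weights and bias are drawn at random and never adjusted.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    e = y.astype(float).copy()                 # residual error, initially the target
    nodes = []
    for _ in range(max_nodes):
        w = rng.uniform(-1.0, 1.0, d)          # random input weights
        b = rng.uniform(-1.0, 1.0)             # random bias
        h = 1.0 / (1.0 + np.exp(-(X @ w + b))) # sigmoid additive node output
        beta = (e @ h) / (h @ h)               # output weight minimizing ||e - beta*h||
        e = e - beta * h                       # update residual with the new node
        nodes.append((w, b, beta))
    return nodes

def predict(nodes, X):
    """Evaluate the grown SLFN: sum of beta-weighted hidden-node outputs."""
    out = np.zeros(X.shape[0])
    for w, b, beta in nodes:
        out += beta / (1.0 + np.exp(-(X @ w + b)))
    return out
```

Because each `beta` is the least-squares coefficient of the residual onto the new node's output, the training error is non-increasing as nodes are added, which is the property the convergence analysis builds on.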
DRNTU::Engineering::Electrical and electronic engineering::Computer hardware, software and systems
Nanyang Technological University