nu-Anomica algorithm
An algorithm shared by Santanu Das, updated on Sep 10, 2010
Summary
The one-class nu-Support Vector Machine (nu-SVM) learning technique maps the
input data into a much higher dimensional space and then uses a small
portion of the training data (the support vectors) to parametrize a
decision surface that linearly separates a nu fraction of the training
points (labeled as anomalies) from the rest. The exact solution of the
standard one-class nu-SVM assigns at least a nu fraction of the training
points as support vectors. However, some of these support vectors may be
unnecessary or redundant, so the computational cost becomes prohibitive,
especially when SVM-based novelty detectors with nonlinear kernels are
trained on very large data sets. The proposed nu-Anomica algorithm
addresses this problem. The idea is to train the machine so that it
closely approximates the exact decision plane using far fewer training
points, without losing much of the generalization performance of the
classical approach. The developed procedure closely preserves the accuracy
of the standard one-class nu-SVM while reducing both training time and
test time by several factors.
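For context, the sketch below uses scikit-learn's standard one-class nu-SVM to illustrate the baseline behaviour described above: with a nonlinear (RBF) kernel, at least roughly a nu fraction of the training points end up as support vectors, and prediction cost grows with that count. This shows the classical approach only, not nu-Anomica itself; the synthetic data and parameter values are illustrative assumptions.

```python
# Minimal sketch of the standard one-class nu-SVM baseline (not nu-Anomica).
# It illustrates the property discussed above: at least a nu fraction of the
# training points become support vectors, which drives up training and test
# cost. The synthetic data and parameter choices are illustrative assumptions.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)

# Nominal data: a dense Gaussian cloud; a few scattered points act as anomalies.
X_nominal = 0.3 * rng.randn(2000, 2)
X_outliers = rng.uniform(low=-4, high=4, size=(20, 2))
X_train = np.vstack([X_nominal, X_outliers])

nu = 0.05  # upper bound on training errors, lower bound on support-vector fraction
clf = OneClassSVM(kernel="rbf", nu=nu, gamma="scale")
clf.fit(X_train)

n_sv = clf.support_vectors_.shape[0]
print(f"training points: {X_train.shape[0]}")
print(f"support vectors: {n_sv} ({n_sv / X_train.shape[0]:.1%}, nu = {nu})")

# Test-time cost scales with the number of support vectors, since prediction
# evaluates the kernel between each test point and every support vector.
X_test = 0.3 * rng.randn(500, 2)
pred = clf.predict(X_test)  # +1 = inlier, -1 = anomaly
print(f"flagged anomalies in test set: {(pred == -1).sum()}")
```

Approximation schemes such as nu-Anomica aim to reach a comparable decision boundary while keeping the support-vector count (and hence training and prediction time) much smaller.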
Source Files
Support/Documentation
- 193.3 KB (432 downloads)
- 9.2 MB (525 downloads)
- 542.9 MB (16 downloads)
For any questions, contact this resource's administrator: NDC-sdas