SOFTWARE TOOLS
LNKnet
Documentation includes a short QuickStart Guide, a 200-page User's Guide, and UNIX man pages for the many LNKnet commands.
Downloading LNKnet
The version of LNKnet that includes support vector machines is not yet in the public domain but will be released in the future. LNKnet software was originally developed under the Sun Microsystems Solaris 2.5.1 (SunOS 5.5.1) UNIX operating system and then ported to Solaris 2.6 (SunOS 5.6) under Sun OpenWindows. It was recently ported to Red Hat Linux and to Cygwin to run under the Windows operating system.
Source code is available both for individual programs (e.g. mlp_lnk, svm_lnk) and for the graphical user interface.
The following three gzipped tar files contain executables, man pages, documentation (the QuickStart Guide and the LNKnet User's Guide), and sample data sets for Solaris, Red Hat Linux, and Cygwin:
LNKnet executables for Solaris with Sample Data and Man Pages (gzipped tar file, 22 Mbytes)
LNKnet executables for Red Hat Linux with Sample Data and Man Pages (gzipped tar file, 22 Mbytes)
LNKnet executables for Cygwin with Sample Data and Man Pages (gzipped tar file, 23 Mbytes)
LNKnet now uses the GNU Autoconf tool to simplify compiling LNKnet on different host platforms. The following gzipped tar file uses Autoconf and contains source code for all LNKnet programs, man pages, documentation, and sample data sets:
Source Code (gzipped tar file, 9 Mbytes)
The following two PDF files contain the QuickStart Guide and the LNKnet User's Guide:
QuickStart Guide (PDF file, 0.2 Mbytes)
User's Guide (PDF file, 4.3 Mbytes)
Methods Used to Access LNKnet Classifiers
The above image shows the three methods used to access LNKnet pattern classification software. The first and most popular approach is to use the high-level graphical user interface (GUI) to run experiments. The GUI allows a non-programmer who knows at least a little about pattern classification to run experiments by pressing buttons and making selections with a mouse. A second approach, used to create automatic batch jobs, is to run LNKnet classification driver programs from the UNIX command line or in shell scripts. A third popular approach, used by C programmers to embed LNKnet classifiers in application programs, is to have the LNKnet GUI automatically produce C source code that implements a trained classifier. This C code can be copied into an application program and used with little knowledge of the details of the classifier involved. These three interfaces have been used successfully for speech recognition, talker identification, medical risk prediction, chemical analysis, and other problems at MIT Lincoln Laboratory and at many other government laboratories and universities. The GUI has substantially reduced the time required to run classification experiments and explore new data sets.
Algorithms Included in LNKnet
Neural Network Algorithms
  Supervised Training: Back-Propagation (MLP), Hypersphere Classifier, Perceptron, Committee
  Combined Unsupervised/Supervised Training: Radial Basis Function (RBF), Incremental RBF, Learning Vector Quantizer (LVQ), Nearest Cluster Classifier, Fuzzy ARTMap Classifier

Statistical and Machine Learning Algorithms
  Supervised Training: Gaussian Linear Discriminant, Gaussian Quadratic, K-Nearest Neighbor, Binary Decision Tree, Parzen Window, Histogram, Naive Bayes, Support Vector Machine
  Combined Unsupervised/Supervised Training: Gaussian Mixture Classifier (Diagonal/Full Covariance, Tied/Per-Class Centers)
  Unsupervised Training (Clustering): K-Means Clustering, EM Clustering, Leader Clustering, Random Clustering

Feature Selection Algorithms
  Linear Discriminant (LDA), Principal Components (PCA), Forward/Backward Search
The above table shows the most commonly used LNKnet neural network, statistical, and machine learning classification, clustering, and feature selection algorithms. Classification algorithms include: support vector machine classifiers; naive Bayes classifiers; multilayer sigmoid nets (multilayer perceptrons) with squared-error, cross-entropy, maximum-likelihood, and classification-figure-of-merit cost functions; hypersphere classifiers (sometimes called RCE classifiers); unimodal Gaussian classifiers with full/diagonal and grand/per-class covariance matrices (linear discriminant and quadratic discriminant classifiers); k-nearest neighbor and condensed k-nearest neighbor classifiers; binary decision trees (similar to CART); radial basis function classifiers; an incremental adaptive-center-and-variance radial basis function classifier with four cost functions; four versions of learning vector quantizer classifiers; nearest cluster classifiers; the perceptron convergence procedure; Parzen window classifiers; histogram classifiers; and Gaussian mixture classifiers with full/diagonal and tied/per-class covariance matrices. Clustering algorithms include: leader clustering (similar to ART); k-means clustering with binary-splitting initialization; and Gaussian mixture clustering with binary-splitting initialization. Feature reduction algorithms include principal components analysis and canonical linear discriminant analysis. Forward and backward search feature selection is available with all classifiers. In addition, n-fold cross validation is available with all classifiers, and C source code can be produced that replicates any trained classifier.
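To make one of the listed algorithms concrete, the sketch below implements a minimal nearest cluster classifier: k-means clustering is run separately on each class's training patterns, and a test pattern receives the class of its nearest cluster center. This is an independent illustrative sketch under simplified assumptions (2-D inputs, Euclidean distance, toy data), not LNKnet's implementation; all function names and the example data are invented for illustration.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: return k cluster centers for a list of 2-D points."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        buckets = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            buckets[i].append(p)
        # Move each center to the mean of its assigned points.
        for i, b in enumerate(buckets):
            if b:
                centers[i] = tuple(sum(coord) / len(b) for coord in zip(*b))
    return centers

def train_nearest_cluster(train, k=2):
    """Cluster each class separately; keep (center, label) pairs."""
    by_class = {}
    for x, y in train:
        by_class.setdefault(y, []).append(x)
    model = []
    for label, pts in by_class.items():
        for c in kmeans(pts, min(k, len(pts))):
            model.append((c, label))
    return model

def classify(model, x):
    """Return the label of the nearest cluster center."""
    center, label = min(model, key=lambda cl: math.dist(x, cl[0]))
    return label

# Toy two-class problem: class 0 near the origin, class 1 near (5, 5).
train = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0),
         ((5, 5), 1), ((6, 5), 1), ((5, 6), 1)]
model = train_nearest_cluster(train, k=1)
print(classify(model, (0.5, 0.5)))  # nearest center belongs to class 0
print(classify(model, (5.5, 5.5)))  # nearest center belongs to class 1
```

With k greater than one per class, the same rule approximates more complex decision regions, which is the idea behind the nearest cluster classifier in the table above.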
LNKnet Graphics and Plots 
LNKnet produces a wide variety of plots and tables. The plot on the above left shows decision regions for a vowel classification problem, and the plot on the right shows decision regions for a checkerboard classification problem solved with a support vector machine. Plots produced include 2D decision region plots, output profile and histogram plots, structure plots describing the computing architecture, and plots showing how the error rate and other cost functions vary over time. In addition, after an experiment is run, outputs include confusion matrices, tables of the error rate for each class, and overall error rates.
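The idea behind a 2D decision region plot can be sketched in a few lines: evaluate a trained classifier at every point of a grid over the input plane and record the winning class at each grid point; coloring the grid by class then shows the decision regions. The sketch below does this for a simple nearest-centroid rule standing in for a trained classifier; it is an illustrative assumption, not LNKnet's plotting code, and the centroid values are invented.

```python
import math

# Invented toy "trained" model: one centroid per class.
centroids = {0: (0.25, 0.25), 1: (0.75, 0.75)}

def predict(x, y):
    """Class whose centroid is closest to the point (x, y)."""
    return min(centroids, key=lambda c: math.dist((x, y), centroids[c]))

def decision_grid(n=20):
    """n-by-n grid of predicted class labels over the unit square."""
    step = 1.0 / (n - 1)
    return [[predict(col * step, row * step) for col in range(n)]
            for row in range(n)]

grid = decision_grid()
# For this model the decision boundary is the line x + y = 1:
# points below it map to class 0, points above it to class 1.
print(grid[0][0], grid[-1][-1])  # opposite corners, opposite classes
```

A real decision region plot replaces the print with a colored image of the grid, overlaid with the training patterns; the grid computation itself is the same regardless of the classifier evaluated.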
Questions and Comments
email: KUKOLICH@LL.MIT.EDU