Techniques in support vector classification
| dc.contributor.author | Martin, Shawn, author | |
| dc.contributor.author | Kirby, Michael, advisor | |
| dc.date.accessioned | 2026-05-07T18:04:10Z | |
| dc.date.issued | 2001 | |
| dc.description.abstract | This work falls into the field of Pattern Classification and, more generally, Artificial Intelligence. Classification is the problem of assigning a "pattern" z to be a member of a finite set ("class") X or a member of a disjoint finite set Y. In the case that z ∈ Rn and X, Y ⊂ Rn, we can solve this problem using Support Vector Machines. Support Vector Machines are functions of the form ƒ(z) = sign (∑i αi k(xi, z) + ∑j βj k(yj, z) + b), (*) where k : Rn × Rn → R and z is classified as a member of X = {xi} if ƒ(z) > 0 and as a member of Y = {yj} otherwise. We consider three problems in classification, two of which concern Support Vector Machines. Our first problem concerns feature selection for classification. Feature selection is the problem of identifying properties which distinguish between the two classes X and Y. Color, for example, distinguishes between apples and oranges, while shape may not. Our method of feature selection uses a novel combination of a linear classifier known as Fisher's discriminant and a nonlinear (polynomial) map known as the Veronese map. We apply our method to a problem in materials design. Our second problem concerns the selection of the kernel k : Rn × Rn → R in (*). For kernel selection we use a kernel version of the classical Gram-Schmidt orthonormalization procedure, again coupled with Fisher's discriminant. We apply our method to the materials design problem and to a handwritten digit recognition problem. Finally, we consider the problem of training Support Vector Machines. Specifically, we develop a fast method for obtaining the coefficients αi and βj in (*). Traditionally, these coefficients are found by solving a constrained quadratic programming problem. We present a geometric reformulation of the SVM quadratic programming problem. We then present, using this reformulation, a modified version of Gilbert's Algorithm for obtaining the coefficients αi and βj. We compare our algorithm with the Nearest Point Algorithm and with Sequential Minimal Optimization. | |
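The decision function (*) in the abstract can be sketched in a few lines of Python. This is a minimal illustration of evaluating ƒ(z) = sign(∑i αi k(xi, z) + ∑j βj k(yj, z) + b), not the dissertation's algorithm for *finding* the coefficients; the linear kernel, the toy support points, and the coefficient values below are illustrative assumptions only.

```python
def linear_kernel(u, v):
    """k(u, v) = <u, v>, the simplest admissible kernel k : Rn x Rn -> R."""
    return sum(a * b for a, b in zip(u, v))

def svm_decision(z, X, alphas, Y, betas, b, k=linear_kernel):
    """Evaluate f(z) = sign(sum_i alpha_i k(x_i, z) + sum_j beta_j k(y_j, z) + b).

    Returns +1 (z assigned to class X) if the argument of sign is > 0,
    and -1 (z assigned to class Y) otherwise, matching the convention in (*).
    """
    s = sum(a * k(x, z) for a, x in zip(alphas, X))
    s += sum(be * k(y, z) for be, y in zip(betas, Y))
    return 1 if s + b > 0 else -1

# Toy example (hypothetical values): one support point per class.
X = [(2.0, 2.0)]        # class X support vectors
alphas = [1.0]          # their coefficients alpha_i
Y = [(0.0, 0.0)]        # class Y support vectors
betas = [-1.0]          # their coefficients beta_j
b = -3.0                # bias term

print(svm_decision((2.0, 2.0), X, alphas, Y, betas, b))   # near the X point
print(svm_decision((0.0, 0.5), X, alphas, Y, betas, b))   # near the Y point
```

With these made-up coefficients the point (2, 2) lands on the positive side of the hyperplane and (0, 0.5) on the negative side; a trained SVM would instead obtain the αi and βj from the quadratic program the abstract describes.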
| dc.format.medium | doctoral dissertations | |
| dc.identifier.uri | https://hdl.handle.net/10217/244311 | |
| dc.identifier.uri | https://doi.org/10.25675/3.026906 | |
| dc.language | English | |
| dc.language.iso | eng | |
| dc.publisher | Colorado State University. Libraries | |
| dc.relation.ispartof | 2000-2019 | |
| dc.rights | Copyright and other restrictions may apply. User is responsible for compliance with all applicable laws. For information about copyright law, please see https://libguides.colostate.edu/copyright. | |
| dc.rights.license | Per the terms of a contractual agreement, all use of this item is limited to the non-commercial use of Colorado State University and its authorized users. | |
| dc.subject | mathematics | |
| dc.title | Techniques in support vector classification | |
| dc.type | Text | |
| dcterms.rights.dpla | This Item is protected by copyright and/or related rights (https://rightsstatements.org/vocab/InC/1.0/). You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s). | |
| thesis.degree.discipline | Mathematics | |
| thesis.degree.grantor | Colorado State University | |
| thesis.degree.level | Doctoral | |
| thesis.degree.name | Doctor of Philosophy (Ph.D.) |
Files
Original bundle (1 of 1)
- Name: ETDF_PQ_2001_3013850.pdf
- Size: 3.8 MB
- Format: Adobe Portable Document Format
