Survey of Model Selection Criteria for Large Margin Classifiers
Title | Survey of Model Selection Criteria for Large Margin Classifiers |
Author | Takashi Onoda |
Pages | 19 |
Release | 2002 |
Advances in Large Margin Classifiers
Title | Advances in Large Margin Classifiers |
Author | Alexander J. Smola |
Publisher | MIT Press |
Pages | 436 |
Release | 2000 |
Genre | Computers |
ISBN | 9780262194488 |
The book provides an overview of recent developments in large margin classifiers, examines connections with other methods (e.g., Bayesian inference), and identifies strengths and weaknesses of the method, as well as directions for future research. The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The fact that it is the margin, or confidence level, of a classification--that is, a scale parameter--rather than the raw training error that matters has become a key tool for dealing with classifiers. This book shows how this idea applies to both the theoretical analysis and the design of algorithms. Among the contributors are Manfred Opper, Vladimir Vapnik, and Grace Wahba.
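The margin idea described in this blurb is easy to make concrete. The sketch below computes the functional margin y(w·x + b) of a linear classifier on a few labeled points; the weight vector, bias, and data are purely hypothetical examples, not taken from the book.

```python
# Minimal sketch: functional margins of a linear classifier.
# The hyperplane (w, b) and the data points are illustrative assumptions.

def functional_margin(w, b, x, y):
    """Margin y * (w . x + b): positive iff x is classified correctly;
    its magnitude measures the confidence of the decision."""
    return y * (sum(wi * xi for wi, xi in zip(w, x)) + b)

w, b = [1.0, -1.0], 0.0                  # hypothetical separating hyperplane
data = [([2.0, 0.5], +1),                # labeled examples (x, y)
        ([0.5, 2.0], -1),
        ([1.1, 1.0], +1)]

margins = [functional_margin(w, b, x, y) for x, y in data]
print(margins)
# All margins are positive, so every point is classified correctly; the
# third point has by far the smallest margin, and it is exactly such
# low-confidence points that a large margin classifier tries to push
# further from the decision boundary.
```

This is the sense in which the margin acts as a scale parameter: two classifiers with zero training error can still differ greatly in the size of their smallest margin.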
Analysis of Large and Complex Data
Title | Analysis of Large and Complex Data |
Author | Adalbert F.X. Wilhelm |
Publisher | Springer |
Pages | 640 |
Release | 2016-08-03 |
Genre | Computers |
ISBN | 3319252267 |
This book offers a snapshot of the state-of-the-art in classification at the interface between statistics, computer science and application fields. The contributions span a broad spectrum, from theoretical developments to practical applications; they all share a strong computational component. The topics addressed are from the following fields: Statistics and Data Analysis; Machine Learning and Knowledge Discovery; Data Analysis in Marketing; Data Analysis in Finance and Economics; Data Analysis in Medicine and the Life Sciences; Data Analysis in the Social, Behavioural, and Health Care Sciences; Data Analysis in Interdisciplinary Domains; Classification and Subject Indexing in Library and Information Science. The book presents selected papers from the Second European Conference on Data Analysis, held at Jacobs University Bremen in July 2014. This conference unites diverse researchers in the pursuit of a common topic, creating truly unique synergies in the process.
The Oxford Handbook of Applied Nonparametric and Semiparametric Econometrics and Statistics
Title | The Oxford Handbook of Applied Nonparametric and Semiparametric Econometrics and Statistics |
Author | Jeffrey Racine |
Publisher | Oxford University Press |
Pages | 562 |
Release | 2014-04 |
Genre | Business & Economics |
ISBN | 0199857946 |
This volume, edited by Jeffrey Racine, Liangjun Su, and Aman Ullah, contains the latest research on nonparametric and semiparametric econometrics and statistics. Chapters by leading international econometricians and statisticians highlight the interface between econometrics and statistical methods for nonparametric and semiparametric procedures.
Proceedings of the 1st International Conference on Neural Networks and Machine Learning 2022 (ICONNSMAL 2022)
Title | Proceedings of the 1st International Conference on Neural Networks and Machine Learning 2022 (ICONNSMAL 2022) |
Author | Ika Hesti Agustin |
Publisher | Springer Nature |
Pages | 346 |
Release | 2023-05-20 |
Genre | Computers |
ISBN | 9464631740 |
This is an open access book. The 1st ICONNSMAL 2022 was held by the CGANT Research Group at the University of Jember, Jember, East Java, Indonesia.
Discrepancy-based Algorithms for Best-subset Model Selection
Title | Discrepancy-based Algorithms for Best-subset Model Selection |
Author | Tao Zhang |
Pages | 142 |
Release | 2013 |
Genre | Akaike Information Criterion |
The selection of a best-subset regression model from a candidate family is a common problem that arises in many analyses. In best-subset model selection, we consider all possible subsets of regressor variables; thus, numerous candidate models may need to be fit and compared. One of the main challenges of best-subset selection arises from the size of the candidate model family: specifically, the probability of selecting an inappropriate model generally increases as the size of the family increases. For this reason, it is usually difficult to select an optimal model when best-subset selection is attempted based on a moderate to large number of regressor variables. Model selection criteria are often constructed to estimate discrepancy measures used to assess the disparity between each fitted candidate model and the generating model. The Akaike information criterion (AIC) and the corrected AIC (AICc) are designed to estimate the expected Kullback-Leibler (K-L) discrepancy. For best-subset selection, both AIC and AICc are negatively biased, and the use of either criterion will lead to overfitted models. To correct for this bias, we introduce a criterion AICi, which has a penalty term evaluated from Monte Carlo simulation. A multistage model selection procedure AICaps, which utilizes AICi, is proposed for best-subset selection.
In the framework of linear regression models, the Gauss discrepancy is another frequently applied measure of proximity between a fitted candidate model and the generating model. Mallows' conceptual predictive statistic (Cp) and the modified Cp (MCp) are designed to estimate the expected Gauss discrepancy. For best-subset selection, Cp and MCp exhibit negative estimation bias. To correct for this bias, we propose a criterion CPSi that again employs a penalty term evaluated from Monte Carlo simulation. We further devise a multistage procedure, CPSaps, which selectively utilizes CPSi.
In this thesis, we consider best-subset selection in two different modeling frameworks: linear models and generalized linear models. Extensive simulation studies are compiled to compare the selection behavior of our methods and other traditional model selection criteria. We also apply our methods to a model selection problem in a study of bipolar disorder.
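For reference, the classical criteria this abstract builds on have simple closed forms. The sketch below computes the usual Gaussian-likelihood form of AIC, the small-sample correction AICc, and Mallows' Cp for a few hypothetical subset fits; the sample size, residual sums of squares, and variance estimate are all illustrative assumptions, and the thesis's simulation-based criteria (AICi, CPSi) are not reproduced here.

```python
import math

# Standard criteria for a Gaussian linear model with n observations,
# k estimated parameters, and residual sum of squares rss.
# sigma2_full is the error-variance estimate from the full model,
# as used in Mallows' Cp. All numbers below are illustrative.

def aic(n, k, rss):
    # AIC = n*log(rss/n) + 2k (Gaussian likelihood, up to a constant)
    return n * math.log(rss / n) + 2 * k

def aicc(n, k, rss):
    # Corrected AIC: AIC plus the small-sample penalty 2k(k+1)/(n-k-1)
    return aic(n, k, rss) + 2 * k * (k + 1) / (n - k - 1)

def mallows_cp(n, k, rss, sigma2_full):
    # Cp = rss/sigma2_full - n + 2k; roughly k for a good subset model
    return rss / sigma2_full - n + 2 * k

# Hypothetical subset fits: (k, rss) pairs for n = 50 observations.
n, sigma2_full = 50, 1.0
candidates = {"x1": (2, 80.0), "x1+x2": (3, 55.0), "x1+x2+x3": (4, 54.0)}

for name, (k, rss) in candidates.items():
    print(name,
          round(aic(n, k, rss), 2),
          round(aicc(n, k, rss), 2),
          round(mallows_cp(n, k, rss, sigma2_full), 2))
# All three criteria favor "x1+x2" here: adding x3 barely reduces the
# RSS, so the complexity penalty outweighs the improvement in fit.
```

The bias problem the abstract describes arises because, over a large candidate family, the subset minimizing such a criterion tends to look better than it truly is, which is what the Monte Carlo penalty terms in AICi and CPSi are designed to correct.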
Pattern Recognition and Artificial Intelligence
Title | Pattern Recognition and Artificial Intelligence |
Author | Yue Lu |
Publisher | Springer Nature |
Pages | 752 |
Release | 2020-10-09 |
Genre | Computers |
ISBN | 3030598306 |
This book constitutes the proceedings of the Second International Conference on Pattern Recognition and Artificial Intelligence, ICPRAI 2020, which took place in Zhongshan, China, in October 2020. The 49 full and 14 short papers presented were carefully reviewed and selected for inclusion in the book. The papers were organized in topical sections as follows: handwriting and text processing; features and classifiers; deep learning; computer vision and image processing; medical imaging and applications; and forensic studies and medical diagnosis.