Kernel methods are among the most popular techniques in machine learning. From a regularization theory perspective, they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a probabilistic perspective, they are key in the context of Gaussian processes, where the kernel function is known as the covariance function. The theory of kernel methods for scalar-valued functions is by now well established, and considerable work has been devoted to designing and learning kernels. More recently there has been increasing interest in methods that deal with multiple outputs, motivated in part by frameworks such as multitask learning. Applications of kernels for vector-valued functions include sensor networks, geostatistics, computer graphics, and several other fields. Kernels for Vector-Valued Functions surveys different methods for designing or learning valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and regularization approaches. It is aimed at researchers interested in the theory and application of kernels for vector-valued functions in areas such as statistics, computer science, and engineering. One of its goals is to provide a unified framework and a common terminology for researchers working in machine learning and statistics.
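As a brief illustration of the idea the blurb refers to (not taken from the book itself): one standard way to build a valid kernel for vector-valued functions is the separable, or intrinsic coregionalization, form, in which a scalar kernel on inputs is combined with a positive semidefinite matrix coupling the outputs. The sketch below assumes this separable form with an RBF input kernel; all names are illustrative.

```python
import numpy as np

def rbf(x, xp, lengthscale=1.0):
    """Scalar RBF kernel k(x, x') between 1-D inputs."""
    return np.exp(-0.5 * (x - xp) ** 2 / lengthscale ** 2)

def separable_mo_kernel(X, B, lengthscale=1.0):
    """Covariance matrix of a separable multi-output kernel,
    K = B (Kronecker) k(X, X), for D outputs at N inputs."""
    Kx = np.array([[rbf(x, xp, lengthscale) for xp in X] for x in X])
    return np.kron(B, Kx)  # shape (D*N, D*N)

# Coregionalization matrix for D = 2 outputs, PSD by construction.
W = np.array([[1.0], [0.5]])
B = W @ W.T + 0.1 * np.eye(2)

X = np.linspace(0.0, 1.0, 5)
K = separable_mo_kernel(X, B)
print(K.shape)               # (10, 10)
print(np.allclose(K, K.T))   # True: a valid covariance is symmetric
```

Because both `B` and the input Gram matrix are positive semidefinite, their Kronecker product is too, so `K` is a valid covariance function over all output–input pairs, which is exactly the property a multi-output Gaussian process requires.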
Specifications
Dimensions
Width: 5 mm
Height: 234 mm
Length: 156 mm
Weight: 135 g
Series & Set Details
Series Name: Foundations and Trends in Machine Learning
Book Details
Title: Kernels for Vector-Valued Functions
Imprint: now publishers Inc
Product Form: Paperback
Publisher: now publishers Inc
Genre: Computers
ISBN13: 9781601985583
Book Category: Higher Education and Professional Books
BISAC Subject Heading: COM094000
Book Subcategory: Computing and Information Technology Books
ISBN10: 1601985584
Language: English