himalaya.kernel_ridge.WeightedKernelRidge
- class himalaya.kernel_ridge.WeightedKernelRidge(alpha=1.0, deltas='zeros', kernels=['linear', 'polynomial'], kernels_params=None, solver='conjugate_gradient', solver_params=None, random_state=None, force_cpu=False)
Weighted kernel ridge regression.
Solve the kernel ridge regression:
w* = argmin_w ||K @ w - Y||^2 + alpha (w.T @ K @ w)
where the kernel K is a weighted sum of multiple kernels:
K = sum_i exp(deltas[i]) Ks[i]
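As a rough NumPy illustration of this weighted sum (array names are illustrative, not part of the estimator API), with Ks a stack of kernels of shape (n_kernels, n_samples, n_samples):
>>> import numpy as np
>>> n_kernels, n_samples = 2, 10
>>> Ks = np.stack([np.eye(n_samples)] * n_kernels)  # toy stack of kernels
>>> deltas = np.zeros(n_kernels)                    # log kernel-weights (the "zeros" default)
>>> K = np.einsum('i,ijk->jk', np.exp(deltas), Ks)  # K = sum_i exp(deltas[i]) Ks[i]
>>> K.shape
(10, 10)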
Contrary to MultipleKernelRidgeCV, this model does not optimize the log kernel-weights deltas. However, it is not equivalent to KernelRidge, since the log kernel-weights deltas can be different for each target, so the kernel sum is not precomputed.
- Parameters
- alpha : float, or array of shape (n_targets, )
L2 regularization parameter.
- deltas : array of shape (n_kernels, ) or (n_kernels, n_targets)
Log kernel-weights. Defaults to “zeros”, an array of shape (n_kernels, ) filled with zeros.
- kernels : list of (str or callable), default=[“linear”, “polynomial”]
List of kernel mappings. Available kernels are: ‘linear’, ‘polynomial’, ‘poly’, ‘rbf’, ‘sigmoid’, ‘cosine’. Set to ‘precomputed’ in order to pass a precomputed kernel matrix to the estimator methods instead of samples. A callable should accept two arguments and the keyword arguments passed to this object as kernel_params, and should return a floating point number (see the callable-kernel sketch after this parameter list).
- kernels_params : list of dict, or None
Additional parameters for the kernel functions. See more details in the docstring of the function:
WeightedKernelRidge.ALL_KERNELS[kernel]
- solver : str
Algorithm used during the fit: “conjugate_gradient” or “gradient_descent”.
- solver_params : dict or None
Additional parameters for the solver. See more details in the docstring of the function:
WeightedKernelRidge.ALL_SOLVERS[solver]
- random_state : int, or None
Random generator seed. Use an int for deterministic search.
- force_cpu : bool
If True, computations will be performed on CPU, ignoring the current backend. If False, use the current backend.
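As a hedged sketch of the callable option described above for kernels (function and argument names are illustrative; the contract is the one stated in the parameter description), a custom kernel and its extra keyword arguments could look like:
>>> # illustrative callable kernel: a scaled linear kernel between two samples;
>>> # `scale` is passed through kernels_params as an extra keyword argument
>>> def my_kernel(x, y, scale=1.0):
...     return scale * float(x @ y)
>>> model = WeightedKernelRidge(
...     kernels=[my_kernel, my_kernel],
...     kernels_params=[dict(scale=1.0), dict(scale=0.5)])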
Examples
>>> from himalaya.kernel_ridge import WeightedKernelRidge
>>> from himalaya.kernel_ridge import ColumnKernelizer
>>> from himalaya.kernel_ridge import Kernelizer
>>> from sklearn.pipeline import make_pipeline
>>> # create a dataset
>>> import numpy as np
>>> n_samples, n_features, n_targets = 10, 5, 3
>>> X = np.random.randn(n_samples, n_features)
>>> Y = np.random.randn(n_samples, n_targets)
>>> # Kernelize separately the first three columns and the last two
>>> # columns, creating two kernels of shape (n_samples, n_samples).
>>> ck = ColumnKernelizer(
...     [("kernel_1", Kernelizer(kernel="linear"), [0, 1, 2]),
...      ("kernel_2", Kernelizer(kernel="polynomial"), slice(3, 5))])
>>> # A model with precomputed kernels, as output by ColumnKernelizer
>>> model = WeightedKernelRidge(kernels="precomputed")
>>> pipe = make_pipeline(ck, model)
>>> pipe.fit(X, Y)
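The fitted pipeline can then be used for prediction, and the fitted attributes listed under Attributes below can be inspected. This continuation is an illustrative sketch rather than verbatim library output; the shapes in comments follow the toy dataset above.
>>> # predict and score on the training data (illustrative only)
>>> Y_pred = pipe.predict(X)      # predictions, shape (n_samples, n_targets)
>>> scores = pipe.score(X, Y)     # R^2 of the prediction
>>> # attributes of the WeightedKernelRidge step after fitting
>>> deltas = model.deltas_        # log kernel-weights (not optimized by this model)
>>> dual_coef = model.dual_coef_  # dual coefficients in kernel space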
- Attributes
- dual_coef_ : array of shape (n_samples, ) or (n_samples, n_targets)
Representation of weight vectors in kernel space.
- deltas_ : array of shape (n_kernels, n_targets) or (n_kernels, )
Log of kernel weights.
- X_fit_ : array of shape (n_samples, n_features)
Training data. If kernels == “precomputed” this is None.
- n_features_in_ : int
Number of features (or number of samples if kernels == “precomputed”) used during the fit.
- dtype_ : str
Dtype of input data.
Methods
- fit(X[, y, sample_weight]) : Fit kernel ridge regression model.
- get_params([deep]) : Get parameters for this estimator.
- get_primal_coef(Xs_fit) : Returns the primal coefficients, assuming all kernels are linear (see the sketch after this list).
- predict(X[, split]) : Predict using the model.
- score(X, y[, split]) : Return the coefficient of determination R^2 of the prediction.
- set_params(**params) : Set the parameters of this estimator.
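For linear kernels K_i = X_i @ X_i.T, the dual solution maps back to per-kernel feature-space weights, which is what get_primal_coef exposes. The sketch below states the underlying identity and an assumed call; Xs_fit is taken here to be the list of per-kernel training feature matrices, and since the example above uses a polynomial kernel for its second feature space, this recovery only applies when every kernel is linear.
>>> # Underlying identity for a linear kernel K_i = X_i @ X_i.T:
>>> # K @ dual_coef_ == sum_i X_i @ (exp(deltas_[i]) * X_i.T @ dual_coef_),
>>> # so feature-space weights can be recovered per kernel from the dual solution.
>>> Xs_fit = [X[:, :3], X[:, 3:]]                # assumed: per-kernel training feature spaces
>>> primal_coef = model.get_primal_coef(Xs_fit)  # one weight array per kernel (sketch)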