Journal of Systems Science and Mathematical Sciences (系统科学与数学) ›› 2020, Vol. 40 ›› Issue (3): 389-409. DOI: 10.12341/jssms13826
Paper

The Convergence Rate for Kernel-Based Regularized Pair Learning Algorithm with a Quasiconvex Loss

WANG Shuhua1,2, WANG Yingjie3, CHEN Zhenlong1, SHENG Baohuai2

Abstract

Regularized ranking algorithms based on kernels have recently gained much attention in machine learning theory, and pairwise learning is a generalization of the ranking problem. In this paper, a kernel-based regularized pairwise learning algorithm with a quasiconvex loss function is proposed, an error estimate is derived using quasiconvex analysis theory, and an explicit learning rate is obtained. The analysis shows that the sample error is influenced by the parameters in the loss function. Numerical experiments show that the proposed method is more robust than the ranking algorithm based on the least-squares loss.
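
For orientation, the general scheme behind kernel-regularized pairwise learning can be sketched as follows; the notation is a standard reconstruction from the abstract, not a formula quoted from the paper. Given a sample z = {(x_i, y_i)}_{i=1}^m, a reproducing kernel Hilbert space (H_K, ||·||_K) induced by a Mercer kernel K, a pairwise loss V and a regularization parameter λ > 0, the estimator has the form

\[
  f_{\mathbf{z}} \;=\; \arg\min_{f \in \mathcal{H}_K}\;
  \frac{1}{m(m-1)} \sum_{i \neq j}
  V\bigl((y_i - y_j) - (f(x_i) - f(x_j))\bigr)
  \;+\; \lambda \|f\|_K^2 .
\]

Taking V(t) = t^2 recovers the least-squares ranking baseline used in the experiments; the paper instead allows V to be quasiconvex (all sublevel sets convex), which admits bounded losses that are less sensitive to outlying pairs, consistent with the robustness reported above.

As a concrete, purely illustrative instance, the sketch below minimizes this objective by gradient descent over the coefficients of f(x) = Σ_k α_k K(x_k, x), using a Gaussian kernel and the bounded quasiconvex Welsch-type loss V(t) = 1 - exp(-t^2/σ^2); the specific loss, kernel and optimizer are assumptions made here for illustration and are not taken from the paper.

import numpy as np

def gaussian_kernel(X, Z, width=1.0):
    # Gram matrix G[i, j] = exp(-||X_i - Z_j||^2 / (2 * width^2)).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_pairwise(X, y, lam=0.05, sigma=1.0, lr=0.2, n_iter=1000, width=1.0):
    # Gradient descent on
    #   (1/(m(m-1))) * sum_{i != j} V((y_i - y_j) - (f(x_i) - f(x_j))) + lam * ||f||_K^2
    # with f(x) = sum_k alpha_k K(x_k, x) and the Welsch-type loss
    # V(t) = 1 - exp(-t^2 / sigma^2), which is bounded and quasiconvex.
    m = len(y)
    K = gaussian_kernel(X, X, width)
    alpha = np.zeros(m)
    Dy = y[:, None] - y[None, :]                     # y_i - y_j
    for _ in range(n_iter):
        f = K @ alpha                                # f(x_1), ..., f(x_m)
        t = Dy - (f[:, None] - f[None, :])           # pairwise residuals
        g = (2.0 * t / sigma**2) * np.exp(-t**2 / sigma**2)   # V'(t), vanishes for large |t|
        grad_emp = (K @ (g.sum(axis=0) - g.sum(axis=1))) / (m * (m - 1))
        grad = grad_emp + 2.0 * lam * (K @ alpha)    # + gradient of lam * alpha^T K alpha
        alpha -= lr * grad
    return alpha, K

# Toy usage: a few gross outliers in y have limited influence on the learned scores.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 1))
y = X[:, 0] + 0.1 * rng.normal(size=30)
y[:3] += 5.0                                         # contaminate three labels
alpha, K = fit_pairwise(X, y)
scores = K @ alpha                                   # pairwise ranking scores on the sample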

Keywords

Pairwise learning / quasiconvex function / kernel-regularized algorithm / convergence rate

Cite this article

WANG Shuhua, WANG Yingjie, CHEN Zhenlong, SHENG Baohuai. The Convergence Rate for Kernel-Based Regularized Pair Learning Algorithm with a Quasiconvex Loss. Journal of Systems Science and Mathematical Sciences, 2020, 40(3): 389-409. https://doi.org/10.12341/jssms13826