Fav

Favourite Collections: a collection of good articles on machine learning and NLP.

SVM

  • https://www.cnblogs.com/dreamvibe/p/4349886.html (the best-written and most complete explanation; a minimal sketch follows below)
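
As a quick companion to the article above, here is a minimal linear-SVM sketch using scikit-learn (my own illustrative example, not from the article; the toy dataset and C value are assumptions):

```python
# Minimal linear SVM with hinge loss (toy data and hyperparameters are assumptions).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Toy binary classification data standing in for a real problem.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Primal soft-margin SVM: hinge loss plus an L2 penalty on the weights;
# C trades margin width against margin violations.
clf = LinearSVC(C=1.0, loss="hinge", max_iter=10000)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```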

XGBoost

  • https://arxiv.org/pdf/1603.02754v1.pdf (Tianqi Chen's paper)
  • https://homes.cs.washington.edu/~tqchen/pdf/BoostedTree.pdf (Tianqi Chen's UW slides)
  • http://odjt9j2ec.bkt.clouddn.com/xgboost-xgboost%E5%AF%BC%E8%AF%BB%E5%92%8C%E5%AE%9E%E6%88%98.pdf (an XGBoost primer and walkthrough of the distributed source code; see the usage sketch after this list)
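
Alongside the paper and slides, a minimal training sketch with the xgboost Python package (the objective, depth, learning rate, and toy data are my own illustrative assumptions):

```python
# Minimal XGBoost training run (toy data and parameters are assumptions).
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# DMatrix is XGBoost's internal data container.
dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)

# Gradient boosting over regression trees: each round fits a new tree to the
# first- and second-order gradient statistics of the loss (see the paper).
params = {"objective": "binary:logistic", "max_depth": 3, "eta": 0.1}
booster = xgb.train(params, dtrain, num_boost_round=100)

preds = booster.predict(dtest)           # predicted probabilities
accuracy = ((preds > 0.5) == y_test).mean()
print("test accuracy:", accuracy)
```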

L1/L2 Regularization
Explained from a Bayesian viewpoint by introducing priors:

https://yiyang186.github.io/2017/10/13/%E6%AD%A3%E5%88%99%E5%8C%96%E5%AF%B9%E5%81%87%E8%AE%BE%E7%A9%BA%E9%97%B4%E7%9A%84%E5%BD%B1%E5%93%8D/#more
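
To make the Bayesian reading concrete, here is the standard MAP argument (a textbook sketch, not quoted from the post): a zero-mean Gaussian prior on the weights yields the L2 penalty, and a Laplace prior yields the L1 penalty.

```latex
% MAP estimation: maximize log-likelihood plus log-prior.
\hat{w}_{\mathrm{MAP}} = \arg\max_{w}\; \log p(D \mid w) + \log p(w)

% Gaussian prior  p(w) \propto \exp(-\lambda \lVert w \rVert_2^2)  -> L2 regularization:
\hat{w}_{L_2} = \arg\min_{w}\; -\log p(D \mid w) + \lambda \lVert w \rVert_2^2

% Laplace prior  p(w) \propto \exp(-\lambda \lVert w \rVert_1)  -> L1 regularization:
\hat{w}_{L_1} = \arg\min_{w}\; -\log p(D \mid w) + \lambda \lVert w \rVert_1
```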

TextCNN: classic introductions to the basic concepts

http://www.wildml.com/2015/11/understanding-convolutional-neural-networks-for-nlp/

http://www.wildml.com/2015/12/implementing-a-cnn-for-text-classification-in-tensorflow/
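
As a companion to the two WildML posts (which implement the model in raw TensorFlow), here is a minimal TextCNN sketch in Keras; the vocabulary size, sequence length, filter widths, and class count are illustrative assumptions:

```python
# Minimal TextCNN: parallel n-gram convolutions over word embeddings,
# max-pooled over time and concatenated (hyperparameters are assumptions).
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 10000   # assumed vocabulary size
SEQ_LEN = 100        # assumed (padded) sentence length
EMBED_DIM = 128
NUM_CLASSES = 2

tokens = layers.Input(shape=(SEQ_LEN,), dtype="int32")
emb = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(tokens)

# One 1-D convolution per filter width (n-gram size), each max-pooled over time.
pooled = []
for width in (3, 4, 5):
    conv = layers.Conv1D(filters=100, kernel_size=width, activation="relu")(emb)
    pooled.append(layers.GlobalMaxPooling1D()(conv))

features = layers.Dropout(0.5)(layers.Concatenate()(pooled))
probs = layers.Dense(NUM_CLASSES, activation="softmax")(features)

model = Model(tokens, probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```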