Logistic Regression

Machine Learning 2018-05-26 admin


Hypothesis Representation
 
The sigmoid function, also called the logistic function:
g(z) = \frac{1}{1 + e^{-z}}

The hypothesis function:
h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}
What does this mean?
It is the probability that y = 1 for a given x, parameterized by θ:
h_\theta(x) = P(y = 1 \mid x;\, \theta)
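
As a quick illustration (a minimal sketch with made-up numbers, not taken from the article): the hypothesis feeds \theta^T x through the sigmoid and returns the modelled probability that y = 1, and a class prediction is then usually made by thresholding at 0.5.

import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

# Made-up parameters theta and a single example x, where x[0] = 1 is the intercept term.
theta = np.array([-1.0, 0.5, 0.25])
x = np.array([1.0, 2.0, 4.0])

p = sigmoid(np.dot(theta, x))   # h_theta(x), read as P(y = 1 | x; theta)
print(p)                        # ~0.73, so this example would be classified as y = 1 at the 0.5 threshold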


Cost Function

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \bigl(1 - y^{(i)}\bigr) \log\bigl(1 - h_\theta(x^{(i)})\bigr) \right]


Vectorized form:

J(\theta) = \frac{1}{m} \Bigl( -y^T \log\bigl(g(X\theta)\bigr) - (1 - y)^T \log\bigl(1 - g(X\theta)\bigr) \Bigr)
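
The vectorized cost translates into a single NumPy expression. The sketch below uses a made-up three-example dataset stored one example per row (the full listing at the end of the post stores examples as columns instead); with \theta = 0 every prediction is 0.5, so the cost should come out to log 2 ≈ 0.693, which makes a handy sanity check.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Made-up toy set: 3 examples, one feature plus the intercept column of ones.
X = np.array([[1.0, 0.5],
              [1.0, 2.0],
              [1.0, 3.5]])          # shape (m, n+1), one example per row
y = np.array([0.0, 1.0, 1.0])
theta = np.zeros(2)

m = len(y)
h = sigmoid(np.dot(X, theta))       # g(X theta)
J = (-np.dot(y, np.log(h)) - np.dot(1.0 - y, np.log(1.0 - h))) / m
print(J)                            # log(2) ~ 0.693 when theta is all zeros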


Gradient

The gradient-descent update is

\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta)

and the partial derivative of the cost works out to

\frac{\partial}{\partial \theta_j} J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\, x_j^{(i)}


Combining the two equations above:

\theta_j := \theta_j - \frac{\alpha}{m} \sum_{i=1}^{m} \bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\, x_j^{(i)}

 

Vectorized form, with regularization

If an L2 regularization term \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2 is added to the cost (this is the lamb argument in the code below), the intercept \theta_0 is not regularized, so the update splits into two cases.

When j = 0:
\theta_0 := \theta_0 - \frac{\alpha}{m} \sum_{i=1}^{m} \bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\, x_0^{(i)}

When j ≠ 0:
\theta_j := \theta_j - \alpha \left[ \frac{1}{m} \sum_{i=1}^{m} \bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\, x_j^{(i)} + \frac{\lambda}{m}\, \theta_j \right]

Putting the two cases together in vector form (this is what ComputeCostAndGradient in the listing below computes):

\theta := \theta - \alpha \left[ \frac{1}{m} X^T \bigl(g(X\theta) - \vec{y}\bigr) + \frac{\lambda}{m} \begin{bmatrix} 0 \\ \theta_1 \\ \vdots \\ \theta_n \end{bmatrix} \right]
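
As a quick consistency check before the full listing, the sketch below compares the regularized gradient formula above with a central-difference numerical gradient; the two should agree to within roughly 1e-8. The data is made up and the helper names reg_cost and reg_grad are mine, not from the article.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def reg_cost(theta, X, y, lamb):
    # Regularized cost; theta[0] (the intercept) carries no penalty.
    m = len(y)
    h = sigmoid(np.dot(X, theta))
    reg = lamb * np.dot(theta[1:], theta[1:]) / (2.0 * m)
    return (-np.dot(y, np.log(h)) - np.dot(1.0 - y, np.log(1.0 - h))) / m + reg

def reg_grad(theta, X, y, lamb):
    m = len(y)
    h = sigmoid(np.dot(X, theta))
    g = np.dot(X.T, h - y) / m
    g[1:] += lamb * theta[1:] / m       # no regularization on theta_0
    return g

rng = np.random.default_rng(0)
X = np.hstack([np.ones((5, 1)), rng.normal(size=(5, 2))])   # made-up data, intercept column first
y = np.array([0.0, 1.0, 1.0, 0.0, 1.0])
theta = rng.normal(size=3)
lamb = 1.0

# Central-difference numerical gradient, one coordinate at a time.
eps = 1e-6
numeric = np.array([
    (reg_cost(theta + eps * e, X, y, lamb) - reg_cost(theta - eps * e, X, y, lamb)) / (2 * eps)
    for e in np.eye(3)
])
print(np.max(np.abs(numeric - reg_grad(theta, X, y, lamb))))  # should be very small, below ~1e-8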
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
"""
Created on Fri May 13 17:23:29 2016

@author: ronny
"""

import numpy as np
import matplotlib.pyplot as plt


def ComputeCostAndGradient(X, y, theta, lamb):
    """
    Compute the regularized logistic regression cost and gradient.

    X has shape (n+1, m) with a leading row of ones (one example per
    column), y has shape (m, 1) and theta has shape (n+1, 1).
    """
    m = X.shape[1]
    h = sigmoid(np.dot(X.T, theta))
    # Cross-entropy cost plus the L2 penalty; theta[0] (the intercept)
    # is excluded from the regularization term.
    cost = -(np.dot(y.T, np.log(h)) + np.dot(1 - y.T, np.log(1 - h))) / m \
        + lamb * np.dot(theta[1:].T, theta[1:]) / (2 * m)
    grad = np.dot(X, h - y) / m
    grad[1:] = grad[1:] + lamb * theta[1:] / m
    return cost, grad


def GradientDescend(X, y, init_theta, lamb, max_iter, alpha):
    """Batch gradient descent; returns the final theta and the cost history."""
    cost = np.zeros([max_iter, 1])
    for i in range(max_iter):
        c, grad = ComputeCostAndGradient(X, y, init_theta, lamb)
        cost[i] = np.squeeze(c)
        init_theta = init_theta - alpha * grad
    return init_theta, cost


def sigmoid(z):
    return 1 / (1 + np.exp(-z))


def VisualizingData(X, y):
    """Scatter plot of the raw data; here X has shape (m, 2) and y has shape (m, 1)."""
    pos = y == 1
    neg = y == 0
    # The original listing is truncated from this point on; the remaining
    # lines are a minimal completion of the plotting code.
    plt.plot(X[pos[:, 0], 0], X[pos[:, 0], 1], 'k+', label='y = 1')
    plt.plot(X[neg[:, 0], 0], X[neg[:, 0], 1], 'yo', label='y = 0')
    plt.legend()
    plt.show()
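
The listing above only defines the functions and never calls them. The following driver is a hypothetical usage example meant to run in the same file as the listing: it generates synthetic two-cluster data, reshapes it to the (n+1, m) layout that ComputeCostAndGradient expects (a row of ones plus one column per example), and trains with GradientDescend using arbitrary illustrative values for alpha, lamb and max_iter.

import numpy as np
import matplotlib.pyplot as plt

np.random.seed(0)
m = 200

# Synthetic two-class data: two Gaussian blobs, one per class.
X_raw = np.vstack([np.random.randn(m // 2, 2) + 2.0,
                   np.random.randn(m // 2, 2) - 2.0])
y = np.vstack([np.ones((m // 2, 1)), np.zeros((m // 2, 1))])

VisualizingData(X_raw, y)

# Reshape to the (n+1, m) layout expected by ComputeCostAndGradient.
X = np.vstack([np.ones((1, m)), X_raw.T])

init_theta = np.zeros((3, 1))
theta, cost = GradientDescend(X, y, init_theta, lamb=0.1, max_iter=500, alpha=0.1)

plt.plot(cost)                     # J(theta) should decrease steadily for a small enough alpha
plt.xlabel('iteration')
plt.ylabel('cost')
plt.show()
print(theta.ravel())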
                   