How to Understand and Master Logistic Regression in Python - 創(chuàng)新互聯(lián)
This article explains how to understand and master logistic regression in Python. The explanations are simple and easy to follow; work through the steps below to study the topic.
Define the sigmoid function:

```python
import numpy as np

def sigmoid(x):
    # Map any real-valued input into the interval (0, 1)
    return 1 / (1 + np.exp(-x))
```
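A few spot checks make clear why this function can be read as a probability: it is exactly 0.5 at zero and saturates toward 0 and 1 at the extremes. This is a quick illustrative check, not part of the original article:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

print(sigmoid(0))           # 0.5 exactly: 1 / (1 + exp(0))
print(sigmoid(10) > 0.99)   # True: large positive inputs saturate toward 1
print(sigmoid(-10) < 0.01)  # True: large negative inputs saturate toward 0
```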
Set up the parameters for logistic regression and run the iterative updates:
```python
def weights(x, y, alpha, threshold):
    # Initialize parameters (the original read x_train.shape here,
    # which referenced a global instead of the argument x)
    m, n = x.shape
    theta = np.random.rand(n)  # parameters
    cnt = 0                    # iteration counter
    max_iter = 50000
    # Start iterating
    while cnt < max_iter:
        cnt += 1
        diff = np.full(n, 0)
        for i in range(m):
            diff = (y[i] - sigmoid(theta.T @ x[i])) * x[i]
            theta = theta + alpha * diff
        # Stop once the last update falls below the threshold
        # (the stopping test was truncated in the source; this is a
        # plausible reconstruction based on the threshold parameter)
        if (abs(diff) < threshold).all():
            break
    return theta
```
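The inner loop above updates `theta` once per sample, in the style of stochastic gradient ascent. As a hedged sketch, the same gradient can also be computed over the whole batch at once with matrix operations; `weights_batch`, its deterministic zero initialization, and the gradient-based stopping test are additions for illustration, not part of the original article:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def weights_batch(x, y, alpha=0.001, threshold=0.01, max_iter=50000):
    m, n = x.shape
    theta = np.zeros(n)  # deterministic init, for reproducibility
    for _ in range(max_iter):
        # Gradient of the log-likelihood over all m samples at once
        grad = x.T @ (y - sigmoid(x @ theta))
        theta = theta + alpha * grad
        if np.all(np.abs(grad) < threshold):
            break
    return theta
```

The per-sample loop and the batch form climb the same log-likelihood surface; the batch form trades more work per step for far fewer Python-level iterations.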
Prediction function:

```python
def predict(x_test, theta):
    # Classify as 1 when the predicted probability exceeds 0.5
    if sigmoid(theta.T @ x_test) > 0.5:
        return 1
    else:
        return 0
```
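A quick sanity check of `predict` with hand-picked weights (these `theta` values are illustrative assumptions, not trained output): `theta = [-4, 0, 1]` labels a point 1 exactly when its third feature exceeds 4, since the sign of `theta.T @ x` decides which side of 0.5 the sigmoid lands on.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def predict(x_test, theta):
    return 1 if sigmoid(theta.T @ x_test) > 0.5 else 0

theta = np.array([-4.0, 0.0, 1.0])  # hypothetical weights for illustration
print(predict(np.array([1, 2.0, 6.0]), theta))  # 1: -4 + 6 = 2 > 0
print(predict(np.array([1, 2.0, 2.0]), theta))  # 0: -4 + 2 = -2 < 0
```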
Calling the functions:

```python
x_train = np.array([[1, 2.697, 6.254],
                    [1, 1.872, 2.014],
                    [1, 2.312, 0.812],
                    [1, 1.983, 4.990],
                    [1, 0.932, 3.920],
                    [1, 1.321, 5.583],
                    [1, 2.215, 1.560],
                    [1, 1.659, 2.932],
                    [1, 0.865, 7.362],
                    [1, 1.685, 4.763],
                    [1, 1.786, 2.523]])
y_train = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1])
alpha = 0.001      # learning rate
threshold = 0.01   # threshold for checking the change between two iterations
print(weights(x_train, y_train, alpha, threshold))
```

Thank you for reading. That covers "How to Understand and Master Logistic Regression in Python". After working through this article you should have a deeper grasp of the topic, though the details are best confirmed through hands-on practice. This is 創(chuàng)新互聯(lián); we will keep publishing articles on related topics, so stay tuned!
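One practical caveat about the run above: `theta` is initialized with `np.random.rand`, so each run prints different weights. A minimal reproducible variant is sketched below, assuming an arbitrary fixed seed and a reduced `max_iter` (5000 here, versus the article's 50000) to keep the demo fast; both choices are additions, not part of the original:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def weights(x, y, alpha, threshold, max_iter=5000):
    # Same per-sample update scheme as above, with max_iter reduced
    m, n = x.shape
    theta = np.random.rand(n)
    for _ in range(max_iter):
        for i in range(m):
            diff = (y[i] - sigmoid(theta.T @ x[i])) * x[i]
            theta = theta + alpha * diff
        if (np.abs(diff) < threshold).all():
            break
    return theta

np.random.seed(0)  # arbitrary seed; any fixed value makes runs repeatable
x_train = np.array([[1, 2.697, 6.254], [1, 1.872, 2.014],
                    [1, 2.312, 0.812], [1, 1.983, 4.990],
                    [1, 0.932, 3.920], [1, 1.321, 5.583],
                    [1, 2.215, 1.560], [1, 1.659, 2.932],
                    [1, 0.865, 7.362], [1, 1.685, 4.763],
                    [1, 1.786, 2.523]])
y_train = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1])
theta = weights(x_train, y_train, alpha=0.001, threshold=0.01)
print(theta.shape)  # (3,): one weight per feature column
```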