
Underfitting, Overfitting, and Regularization

Contents

Fitting

Underfitting

Overfitting

A good fit

Fixing overfitting: regularization


Both linear regression and logistic regression models can suffer from underfitting and overfitting.

Fitting

As Baidu Baike explains:

Data fitting, also known as curve fitting, is a way of representing existing data with a mathematical expression. In science and engineering, we often obtain a number of discrete data points through sampling or experiments; from these, we usually want a continuous function (a curve), or a denser discrete equation, that agrees with the known data. This process is called fitting.

My own understanding: fitting builds a mathematical model from the available data such that the model accounts for that data as fully as possible, so that its predictions match the observed situation as closely as possible.
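As a minimal illustration (my own sketch, not from the original post, using NumPy's `np.polyfit`): fit a quadratic to noisy samples of a known quadratic, and check that the fitted curve recovers the underlying coefficients.

```python
import numpy as np

# Noisy samples of the known quadratic y = 2x^2 + 1
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 20)
y = 2 * x**2 + 1 + rng.normal(0, 0.1, size=x.shape)

# Fit a degree-2 polynomial: returns coefficients [a, b, c] of a*x^2 + b*x + c
coeffs = np.polyfit(x, y, deg=2)
y_hat = np.polyval(coeffs, x)

print(coeffs)                    # close to [2, 0, 1]
print(np.mean((y - y_hat)**2))   # small mean squared error
```

The fitted coefficients land near the true values, so predictions on new x values match the underlying pattern rather than the noise.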

Underfitting

The model matches the existing data poorly; for a classification model, the decision boundary does not separate the data well.

Underfitting tends to occur when the training data has too few features (the model is too simple).

Overfitting

The model matches the existing data too closely, so it cannot generalize to new data. This tends to happen when the training data has too many features.
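Both failure modes can be seen in one experiment (my own sketch, not from the original post): fit polynomials of low and high degree to noisy quadratic data. The degree-1 fit is too simple (underfits, high training error); the degree-8 fit is too flexible (it chases the noise, so its training error is tiny while its test error need not be).

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples of a quadratic: the true pattern is simple
x_train = np.linspace(-1, 1, 15)
x_test = np.linspace(-0.95, 0.95, 15)

def true_f(x):
    return 2 * x**2 + 1

y_train = true_f(x_train) + rng.normal(0, 0.2, x_train.shape)
y_test = true_f(x_test) + rng.normal(0, 0.2, x_test.shape)

def mse(deg):
    """Train/test mean squared error of a degree-`deg` polynomial fit."""
    c = np.polyfit(x_train, y_train, deg)
    return (np.mean((np.polyval(c, x_train) - y_train) ** 2),
            np.mean((np.polyval(c, x_test) - y_test) ** 2))

train_lo, test_lo = mse(1)   # too simple: underfits
train_hi, test_hi = mse(8)   # too flexible: fits the noise
print(f"deg 1: train={train_lo:.4f}  test={test_lo:.4f}")
print(f"deg 8: train={train_hi:.4f}  test={test_hi:.4f}")
```

The high-degree model always achieves lower training error (its hypothesis space contains the low-degree one), which is exactly why training error alone cannot diagnose overfitting.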

A good fit

A good fit lies between underfitting and overfitting: the model captures the underlying pattern without memorizing the noise.

Fixing overfitting: regularization

Fixing overfitting means regularizing the model: drive the weights w_j of non-essential features toward zero, then retrain to find the best model. The catch is that it is hard to tell in advance which features are essential, so in practice all weights are regularized. The regularized formulas are:

Regularized linear regression cost function:

J(w,b) = (1/(2m)) Σᵢ (f_{w,b}(x⁽ⁱ⁾) − y⁽ⁱ⁾)² + (λ/(2m)) Σⱼ wⱼ²

Regularized logistic regression cost function:

J(w,b) = −(1/m) Σᵢ [ y⁽ⁱ⁾ log(f_{w,b}(x⁽ⁱ⁾)) + (1 − y⁽ⁱ⁾) log(1 − f_{w,b}(x⁽ⁱ⁾)) ] + (λ/(2m)) Σⱼ wⱼ²

Gradient terms used by gradient descent, with the same form for both linear and logistic regression (only f_{w,b} differs):

∂J/∂wⱼ = (1/m) Σᵢ (f_{w,b}(x⁽ⁱ⁾) − y⁽ⁱ⁾) xⱼ⁽ⁱ⁾ + (λ/m) wⱼ
∂J/∂b  = (1/m) Σᵢ (f_{w,b}(x⁽ⁱ⁾) − y⁽ⁱ⁾)
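Before the full lab code, a quick numerical sketch (my own, not from the post) of what the λ penalty does. For linear regression the regularized optimum has a closed form, w = (XᵀX + λI)⁻¹Xᵀy (ridge regression), which makes the shrinkage effect easy to see: larger λ means smaller weights.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic data where only the first three features matter
X = rng.normal(size=(50, 6))
y = X @ np.array([3.0, -2.0, 0.5, 0.0, 0.0, 0.0]) + rng.normal(0, 0.1, 50)

def ridge_weights(lam):
    """Closed-form regularized solution: w = (X^T X + lam*I)^{-1} X^T y."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

for lam in (0.0, 1.0, 100.0):
    w = ridge_weights(lam)
    print(f"lambda={lam:>6}: ||w|| = {np.linalg.norm(w):.3f}")
```

The weight norm shrinks monotonically as λ grows; with λ too large the model is pushed back toward underfitting, so λ is a dial between the two failure modes.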

Implementation code (from a Jupyter notebook; `plt_overfit` is the course's lab helper module):

```python
import numpy as np
%matplotlib inline
import matplotlib.pyplot as plt
from plt_overfit import overfit_example, output  # course lab helper

np.set_printoptions(precision=8)


def sigmoid(z):
    """Compute the sigmoid of z.

    Args:
      z (ndarray): a scalar or numpy array of any size.
    Returns:
      g (ndarray): sigmoid(z), with the same shape as z.
    """
    g = 1 / (1 + np.exp(-z))
    return g


def compute_cost_linear_reg(X, y, w, b, lambda_=1):
    """Computes the regularized linear regression cost over all examples.

    Args:
      X (ndarray (m,n)): data, m examples with n features
      y (ndarray (m,)) : target values
      w (ndarray (n,)) : model parameters
      b (scalar)       : model parameter
      lambda_ (scalar) : controls the amount of regularization
    Returns:
      total_cost (scalar): cost
    """
    m = X.shape[0]
    n = len(w)
    cost = 0.
    for i in range(m):
        f_wb_i = np.dot(X[i], w) + b            # (n,)·(n,) = scalar
        cost = cost + (f_wb_i - y[i]) ** 2
    cost = cost / (2 * m)

    reg_cost = 0
    for j in range(n):
        reg_cost += w[j] ** 2
    reg_cost = (lambda_ / (2 * m)) * reg_cost   # regularization term

    total_cost = cost + reg_cost
    return total_cost


np.random.seed(1)
X_tmp = np.random.rand(5, 6)
y_tmp = np.array([0, 1, 0, 1, 0])
w_tmp = np.random.rand(X_tmp.shape[1]).reshape(-1,) - 0.5
b_tmp = 0.5
lambda_tmp = 0.7
cost_tmp = compute_cost_linear_reg(X_tmp, y_tmp, w_tmp, b_tmp, lambda_tmp)
print("Regularized cost:", cost_tmp)


def compute_cost_logistic_reg(X, y, w, b, lambda_=1):
    """Computes the regularized logistic regression cost over all examples.

    Args: same as compute_cost_linear_reg.
    Returns:
      total_cost (scalar): cost
    """
    m, n = X.shape
    cost = 0.
    for i in range(m):
        z_i = np.dot(X[i], w) + b
        f_wb_i = sigmoid(z_i)
        cost += -y[i] * np.log(f_wb_i) - (1 - y[i]) * np.log(1 - f_wb_i)
    cost = cost / m

    reg_cost = 0
    for j in range(n):
        reg_cost += w[j] ** 2
    reg_cost = (lambda_ / (2 * m)) * reg_cost   # regularization term

    total_cost = cost + reg_cost
    return total_cost


np.random.seed(1)
X_tmp = np.random.rand(5, 6)
y_tmp = np.array([0, 1, 0, 1, 0])
w_tmp = np.random.rand(X_tmp.shape[1]).reshape(-1,) - 0.5
b_tmp = 0.5
lambda_tmp = 0.7
cost_tmp = compute_cost_logistic_reg(X_tmp, y_tmp, w_tmp, b_tmp, lambda_tmp)
print("Regularized cost:", cost_tmp)


def compute_gradient_linear_reg(X, y, w, b, lambda_):
    """Computes the gradient for regularized linear regression.

    Returns:
      dj_db (scalar)      : gradient of the cost w.r.t. b
      dj_dw (ndarray (n,)): gradient of the cost w.r.t. w
    """
    m, n = X.shape      # (number of examples, number of features)
    dj_dw = np.zeros((n,))
    dj_db = 0.
    for i in range(m):
        err = (np.dot(X[i], w) + b) - y[i]
        for j in range(n):
            dj_dw[j] = dj_dw[j] + err * X[i, j]
        dj_db = dj_db + err
    dj_dw = dj_dw / m
    dj_db = dj_db / m

    for j in range(n):  # add the regularization term
        dj_dw[j] = dj_dw[j] + (lambda_ / m) * w[j]

    return dj_db, dj_dw


np.random.seed(1)
X_tmp = np.random.rand(5, 3)
y_tmp = np.array([0, 1, 0, 1, 0])
w_tmp = np.random.rand(X_tmp.shape[1])
b_tmp = 0.5
lambda_tmp = 0.7
dj_db_tmp, dj_dw_tmp = compute_gradient_linear_reg(X_tmp, y_tmp, w_tmp, b_tmp, lambda_tmp)
print(f"dj_db: {dj_db_tmp}")
print(f"Regularized dj_dw:\n {dj_dw_tmp.tolist()}")


def compute_gradient_logistic_reg(X, y, w, b, lambda_):
    """Computes the gradient for regularized logistic regression.

    Returns:
      dj_db (scalar)      : gradient of the cost w.r.t. b
      dj_dw (ndarray (n,)): gradient of the cost w.r.t. w
    """
    m, n = X.shape
    dj_dw = np.zeros((n,))
    dj_db = 0.0
    for i in range(m):
        f_wb_i = sigmoid(np.dot(X[i], w) + b)
        err_i = f_wb_i - y[i]
        for j in range(n):
            dj_dw[j] = dj_dw[j] + err_i * X[i, j]
        dj_db = dj_db + err_i
    dj_dw = dj_dw / m
    dj_db = dj_db / m

    for j in range(n):  # add the regularization term
        dj_dw[j] = dj_dw[j] + (lambda_ / m) * w[j]

    return dj_db, dj_dw


np.random.seed(1)
X_tmp = np.random.rand(5, 3)
y_tmp = np.array([0, 1, 0, 1, 0])
w_tmp = np.random.rand(X_tmp.shape[1])
b_tmp = 0.5
lambda_tmp = 0.7
dj_db_tmp, dj_dw_tmp = compute_gradient_logistic_reg(X_tmp, y_tmp, w_tmp, b_tmp, lambda_tmp)
print(f"dj_db: {dj_db_tmp}")
print(f"Regularized dj_dw:\n {dj_dw_tmp.tolist()}")

plt.close("all")
display(output)
ofit = overfit_example(True)
```
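These gradient functions are the inner loop of gradient descent. As a sketch of how they plug in (my own vectorized version of the same regularized logistic gradient, not part of the lab code), here is a full training loop on toy data; the data, learning rate, and iteration count are all made up for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def gradient_descent_logistic_reg(X, y, alpha=0.5, lambda_=0.1, iters=500):
    """Regularized logistic regression trained by batch gradient descent."""
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(iters):
        err = sigmoid(X @ w + b) - y                # (m,) prediction errors
        dj_dw = X.T @ err / m + (lambda_ / m) * w   # regularized gradient w.r.t. w
        dj_db = err.mean()                          # gradient w.r.t. b (no penalty)
        w -= alpha * dj_dw
        b -= alpha * dj_db
    return w, b

# Toy linearly separable data: label is 1 when x0 + x1 > 1
rng = np.random.default_rng(0)
X = rng.random((100, 2))
y = (X[:, 0] + X[:, 1] > 1).astype(float)

w, b = gradient_descent_logistic_reg(X, y)
pred = (sigmoid(X @ w + b) >= 0.5).astype(float)
print("train accuracy:", (pred == y).mean())
```

Note that b is not regularized, matching the cost functions above, where the penalty sums only over the wⱼ.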

The logistic regression output is the interactive overfitting plot produced by `overfit_example` (figure not reproduced in this copy).

