Date: 2021-05-22
The adaptive linear neuron (Adaline, Adaptive Linear Neuron) is an entry-level neural network.
Unlike the perceptron, it uses the identity activation f(z) = z, a continuous (and differentiable) function, so the cost can be minimized by gradient methods.
The cost function is the least-mean-squares (LMS) criterion.
This implementation trains with stochastic gradient descent (SGD); because samples are drawn at random, repeated runs produce different results.
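Written out, for a sample $(x, y)$ with net input $z = w^\top x + b$, the single-sample LMS cost and the SGD update that the code below applies are:

$$J(w, b) = \frac{1}{2}\big(y - (w^\top x + b)\big)^2$$
$$w \leftarrow w + \eta\,\big(y - (w^\top x + b)\big)\,x, \qquad b \leftarrow b + \eta\,\big(y - (w^\top x + b)\big)$$

where $\eta$ is the learning rate (eta in the code, with the bias $b$ stored as w[0]).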
'''
Adaline classifier

created on 2019.9.14
author: vince
'''
import math
import numpy
import logging
import random
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

class Adaline:
    '''
    Adaline classifier

    Attributes
    w: 1d-array = weights after training
    l: 1d-array = sum-of-squared-errors cost recorded in each iteration
    '''
    def __init__(self, eta=0.001, iter_num=500, batch_size=1):
        '''
        eta: float = learning rate (between 0.0 and 1.0).
        iter_num: int = iterations over the training dataset.
        batch_size: int = gradient descent batch size;
            batch_size == 1 means SGD, batch_size == 0 means BGD,
            anything else means MBGD. (Only the SGD path is
            implemented below; see the sketch after this listing.)
        '''
        self.eta = eta
        self.iter_num = iter_num
        self.batch_size = batch_size

    def train(self, X, Y):
        '''
        Train on the training data.
        X: {array-like}, shape=[n_samples, n_features] = training vectors,
            where n_samples is the number of training samples and
            n_features is the number of features.
        Y: {array-like}, shape=[n_samples] = target values.
        '''
        self.w = numpy.zeros(1 + X.shape[1])
        self.l = numpy.zeros(self.iter_num)
        for iter_index in range(self.iter_num):
            # draw random samples until one is misclassified,
            # then apply a single SGD update for it
            for rand_time in range(X.shape[0]):
                sample_index = random.randint(0, X.shape[0] - 1)
                if self.activation(X[sample_index]) == Y[sample_index]:
                    continue
                output = self.net_input(X[sample_index])
                errors = Y[sample_index] - output
                self.w[0] += self.eta * errors
                self.w[1:] += self.eta * errors * X[sample_index]
                break
            # record the sum-of-squared-errors cost for this iteration
            for sample_index in range(X.shape[0]):
                self.l[iter_index] += (Y[sample_index] - self.net_input(X[sample_index])) ** 2 * 0.5
            logging.info("iter %s: w0(%s), w1(%s), w2(%s), l(%s)" %
                    (iter_index, self.w[0], self.w[1], self.w[2], self.l[iter_index]))
            # stop early once the cost has converged
            if iter_index > 1 and math.fabs(self.l[iter_index - 1] - self.l[iter_index]) < 0.0001:
                break

    def activation(self, x):
        # quantizer: thresholds the linear output into a class label
        return numpy.where(self.net_input(x) >= 0.0, 1, -1)

    def net_input(self, x):
        # linear activation f(z) = z, with w[0] as the bias
        return numpy.dot(x, self.w[1:]) + self.w[0]

    def predict(self, x):
        return self.activation(x)

def main():
    logging.basicConfig(level=logging.INFO,
            format='%(asctime)s %(filename)s[line:%(lineno)d] %(levelname)s %(message)s',
            datefmt='%a, %d %b %Y %H:%M:%S')

    iris = load_iris()
    features = iris.data[:99, [0, 2]]

    # standardize each feature to zero mean and unit variance
    features_std = numpy.copy(features)
    for i in range(features.shape[1]):
        features_std[:, i] = (features_std[:, i] - features[:, i].mean()) / features[:, i].std()

    labels = numpy.where(iris.target[:99] == 0, -1, 1)

    # 2/3 of the data for training, 1/3 for testing
    train_features, test_features, train_labels, test_labels = train_test_split(
            features_std, labels, test_size=0.33, random_state=23323)
    logging.info("train set shape:%s" % (str(train_features.shape)))

    classifier = Adaline()
    classifier.train(train_features, train_labels)

    test_predict = numpy.array([])
    for feature in test_features:
        predict_label = classifier.predict(feature)
        test_predict = numpy.append(test_predict, predict_label)

    score = accuracy_score(test_labels, test_predict)
    logging.info("The accuracy score is: %s" % (str(score)))

    # plot the training samples and the learned decision boundary
    x_min, x_max = train_features[:, 0].min() - 1, train_features[:, 0].max() + 1
    y_min, y_max = train_features[:, 1].min() - 1, train_features[:, 1].max() + 1
    plt.xlim(x_min, x_max)
    plt.ylim(y_min, y_max)
    plt.xlabel("width")
    plt.ylabel("height")
    plt.scatter(train_features[:, 0], train_features[:, 1], c=train_labels, marker='o', s=10)
    k = - classifier.w[1] / classifier.w[2]
    d = - classifier.w[0] / classifier.w[2]
    plt.plot([x_min, x_max], [k * x_min + d, k * x_max + d], "go-")
    plt.show()

if __name__ == "__main__":
    main()
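Note that the batch_size parameter in the listing above is stored but never used: the training loop always follows the SGD path. A minimal sketch of the full-batch (BGD) update under the same weight layout, as a hypothetical helper that is not part of the original code:

import numpy

def bgd_step(w, X, Y, eta):
    # Hypothetical full-batch (BGD) update for the same layout:
    # w[0] is the bias, w[1:] are the feature weights.
    output = numpy.dot(X, w[1:]) + w[0]    # net input for every sample
    errors = Y - output                    # per-sample LMS errors
    w = w.copy()
    w[0] += eta * errors.sum()             # bias moves by the summed error
    w[1:] += eta * numpy.dot(X.T, errors)  # weights move by X^T * errors
    return w

Calling this once per iteration on the whole training set, in place of the random-sample loop in train, would give the batch_size == 0 behavior the docstring describes.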
That's all for this article; I hope it helps with your studies.