[ELM Classification] Data Classification with a Harris Hawks Optimization-Tuned ELM Neural Network, with MATLAB Code
1 Introduction
To improve the classification accuracy of the kernel extreme learning machine (ELM), the Harris hawks optimization (HHO) algorithm is used to tune its two parameters, the penalty coefficient and the kernel width. First, the kernel ELM is trained on the training set of a benign/malignant breast-tumor database and optimized with HHO; then both HHO-ELM and the plain ELM classify the test set; finally, the classification performance of HHO-ELM and ELM is compared. The test results show that the overall diagnostic accuracy of HHO-ELM is 10% higher than that of ELM, and its accuracy on malignant tumors is clearly better than ELM's. A sketch of how this coupling can be wired up is given below.
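To make the coupling concrete, the following sketch shows one way the hho routine listed in Section 2 could drive the KELM training step. It is only an illustrative outline, not code from the original post: kelm_train and kelm_predict are hypothetical placeholders for whichever KELM implementation is used, Xtrain/Ytrain/Xval/Yval are an assumed train/validation split of the breast-tumor data, and the search ranges for the penalty coefficient C and the kernel width gamma are assumptions.

% Illustrative sketch only -- kelm_train/kelm_predict are hypothetical helpers,
% and Xtrain/Ytrain/Xval/Yval are an assumed split of the breast-tumor data.
% hho (Section 2) expects the objective to accept a d-by-n matrix of candidate
% solutions (one column per hawk) and return a 1-by-n row of fitness values.
objective = @(P) arrayfun( @(k) ...
    mean( kelm_predict( kelm_train(Xtrain, Ytrain, P(1,k), P(2,k)), Xval ) ~= Yval ), ...
    1:size(P,2) );                      % validation error rate of each candidate [C; gamma]

d   = 2;                                % two decision variables: penalty C, kernel width gamma
lmt = [1e-2 1e3; 1e-2 1e2];             % assumed search ranges for [C; gamma]
[fbst, xbst] = hho(objective, d, lmt, 30, 100, 1);
C_best     = xbst(1);                   % parameters handed to the final KELM classifier
gamma_best = xbst(2);

The fitness here is the validation error rate, so minimizing it with HHO directly targets the classification accuracy reported above.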
2 Code (Excerpt)
function [fbst, xbst, performance] = hho( objective, d, lmt, n, T, S )
% Harris hawks optimization algorithm
% inputs:
%   objective - function handle, the objective function
%   d         - scalar, dimension of the optimization problem
%   lmt       - d-by-2 matrix, lower and upper constraints of the decision variable
%   n         - scalar, swarm size
%   T         - scalar, maximum iteration
%   S         - scalar, times of independent runs
% date: 2021-05-09
% author: elkman, github.com/ElkmanY/

%% Levy flight
beta = 1.5;
sigma = ( gamma(1+beta)*sin(pi*beta/2)/gamma((1+beta)/2)*beta*2^((beta-1)/2) ).^(1/beta);
Levy = @(x) 0.01*normrnd(0,1,d,x)*sigma./abs(normrnd(0,1,d,x)).^(1/beta);

%% algorithm procedure
tic;
for s = 1:S
    %% Initialization
    X = lmt(:,1) + (lmt(:,2) - lmt(:,1)).*rand(d,n);
    for t = 1:T
        F = objective(X);
        [f_rabbit(s,t), i_rabbit] = min(F);     % best (rabbit) position in this iteration
        x_rabbit(:,t,s) = X(:,i_rabbit);
        xr = x_rabbit(:,t,s);
        J = 2*(1-rand(d,1));                    % random jump strength of the rabbit
        E0 = 2*rand(1,n)-1;
        E(t,:) = 2*E0*(1-t/T);                  % escaping energy of each hawk
        absE = abs(E(t,:));                     % |E| per hawk, used to select the phase
        p1 = absE>=1;                                       %eq(1)
        r = rand(1,n);
        p2 = (r>=0.5) & (absE>=0.5) & (absE<1);             %eq(4)
        p3 = (r>=0.5) & (absE<0.5);                         %eq(6)
        p4 = (r<0.5) & (absE>=0.5) & (absE<1);              %eq(10)
        p5 = (r<0.5) & (absE<0.5);                          %eq(11)
        %% update locations
        rh = randi([1,n],1,n);
        flag1 = rand(1,n)>=0.5;
        Y = xr - E(t,:).*abs( J.*xr - X );
        Z = Y + rand(d,n).*Levy(n);
        flag2 = (objective(Y)<objective(Z)) & (objective(Y)<F);
        flag3 = (objective(Y)>objective(Z)) & (objective(Z)<F);
        flag4 = (~flag2) & (~flag3);
        X_ = p1.*( (X(:,rh) - rand(1,n).*abs( X(:,rh) - 2*rand(1,n).*X )).*flag1 + ...
                   ((X(:,rh) - mean(X)) - rand(1,n).*( lmt(:,1) + (lmt(:,2) - lmt(:,1)).*rand(d,n) )).*(~flag1) ) ...
           + p2.*( xr - X - E(t,:).*abs( J.*xr - X ) ) ...
           + p3.*( xr - E(t,:).*abs( xr - X ) ) ...
           + p4.*( Y.*flag2 + Z.*flag3 + ( lmt(:,1) + (lmt(:,2) - lmt(:,1)).*rand(d,n) ).*flag4 ) ...
           + p5.*( Y.*flag2 + Z.*flag3 + ( lmt(:,1) + (lmt(:,2) - lmt(:,1)).*rand(d,n) ).*flag4 );
        X_(:,i_rabbit) = xr;                    % keep the rabbit (elitism)
        X = X_;
    end
end

%% outputs
performance = [min(f_rabbit(:,T)); mean(f_rabbit(:,T)); std(f_rabbit(:,T))];
timecost = toc;
[fbst, ibst] = min(f_rabbit(:,T));
xbst = x_rabbit(:,T,ibst);

%% plot data
% Convergence Curve
figure('Name','Convergence Curve');
box on
semilogy(1:T,mean(f_rabbit,1),'b','LineWidth',1.5);
xlabel('Iteration','FontName','Arial');
ylabel('Fitness/Score','FontName','Arial');
title('Convergence Curve','FontName','Arial');
if d == 2
    % Trajectory of Global Optimal
    figure('Name','Trajectory of Global Optimal');
    x1 = linspace(lmt(1,1),lmt(1,2));
    x2 = linspace(lmt(2,1),lmt(2,2));
    [X1,X2] = meshgrid(x1,x2);
    V = reshape(objective([X1(:),X2(:)]'),[size(X1,1),size(X1,1)]);
    contour(X1,X2,log10(V),100);    % notice log10(V)
    hold on
    plot(x_rabbit(1,:,1),x_rabbit(2,:,1),'r-x','LineWidth',1);
    hold off
    xlabel('\it{x}_1','FontName','Times New Roman');
    ylabel('\it{x}_2','FontName','Times New Roman');
    title('Trajectory of Global Optimal','FontName','Arial');
end
end
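As a quick sanity check of the optimizer on its own, it can be run on a simple benchmark before plugging in the KELM fitness. The snippet below is a minimal usage sketch, not part of the original post; it only assumes the calling convention visible in the code above: the objective takes a d-by-n matrix (one column per hawk) and returns a 1-by-n row of fitness values.

% Minimal usage sketch (not from the original post): sphere function as a stand-in objective.
objective = @(X) sum(X.^2, 1);          % each column of X is one candidate; returns 1-by-n fitness row
d   = 2;                                % problem dimension
lmt = [-10 10; -10 10];                 % d-by-2 lower/upper bounds
n   = 30;                               % swarm size
T   = 200;                              % maximum iterations
S   = 5;                                % independent runs
[fbst, xbst, performance] = hho(objective, d, lmt, n, T, S);
% fbst        - best fitness found across the S runs
% xbst        - corresponding solution (d-by-1)
% performance - [min; mean; std] of the final-iteration best fitness over the runs

Because hho evaluates the whole swarm with a single call to objective, keeping the objective vectorized over columns (as above) avoids a per-hawk loop and keeps the run fast.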
3 Simulation Results




