I have consistently used R, so I am quite new to MATLAB and am running into some troubleshooting issues. I am running some code for a tensor factorization method (available here: https://github.com/caobokai/tBNE). To start, I tried to run the demo code, which generates simulated data to run the method on, and it produces the following error:
Error using feval
Undefined function or variable 'Sfun'.

Error in OptStiefelGBB (line 199)
    [F, G] = feval(fun, X, varargin{:}); out.nfe = 1;

Error in tbne_demo>tBNE_fun (line 124)
    S, @Sfun, opts, B, P, X, L, D, W, Y, alpha, beta);
Here is the block of code I am running:
clear
clc
addpath(genpath('./tensor_toolbox'));
addpath(genpath('./FOptM'));
rng(5489, 'twister');
m = 10;
n = 10;
k = 10; % rank for tensor
[X, Z, Y] = tBNE_data(m, n, k); % generate the tensor, guidance and label
[T, W] = tBNE_fun(X, Z, Y, k);
[~, y1] = max(Y, [], 2);
[~, y2] = max(T{3} * W, [], 2);
fprintf('accuracy %3.2e\n', sum(y1 == y2) / n);
function [X, Z, Y] = tBNE_data(m, n, k)
B = randn(m, k);
S = randn(n, k);
A = {B, B, S};
X = ktensor(A);
Z = randn(n, 4);
Y = zeros(n, 2);
l = ceil(n / 2);
Y(1 : l, 1) = 1;
Y(l + 1 : end, 2) = 1;
X = tensor(X);
end
function [T, W] = tBNE_fun(X, Z, Y, k)
% t-BNE computes brain network embedding based on constrained tensor factorization
%
% INPUT
% X: brain networks stacked in a 3-way tensor
% Z: side information
% Y: label information
% k: rank of CP factorization
%
% OUTPUT
% T is the factor tensor containing
% vertex factor matrix B = T{1} and
% subject factor matrix S = T{3}
% W is the weight matrix
%
% Example: see tBNE_demo.m
%
% Reference:
% Bokai Cao, Lifang He, Xiaokai Wei, Mengqi Xing, Philip S. Yu,
% Heide Klumpp and Alex D. Leow. t-BNE: Tensor-based Brain Network Embedding.
% In SDM 2017.
%
% Dependency:
% [1] Matlab tensor toolbox v 2.6
% Brett W. Bader, Tamara G. Kolda and others
% http://www.sandia.gov/~tgkolda/TensorToolbox
% [2] A feasible method for optimization with orthogonality constraints
% Zaiwen Wen and Wotao Yin
% http://www.math.ucla.edu/~wotaoyin/papers/feasible_method_matrix_manifold.html
%% set algorithm parameters
printitn = 10;
maxiter = 200;
fitchangetol = 1e-4;
alpha = 0.1; % weight for guidance
beta = 0.1; % weight for classification loss
gamma = 0.1; % weight for regularization
u = 1e-6;
umax = 1e6;
rho = 1.15;
opts.record = 0;
opts.mxitr = 20;
opts.xtol = 1e-5;
opts.gtol = 1e-5;
opts.ftol = 1e-8;
%% compute statistics
dim = size(X);
normX = norm(X);
numClass = size(Y, 2);
m = dim(1);
n = dim(3);
l = size(Y, 1);
D = [eye(l), zeros(l, n - l)];
L = diag(sum(Z * Z')) - Z * Z';
%% initialization
B = randn(m, k);
P = B;
S = randn(n, k);
S = orth(S);
W = randn(k, numClass);
U = zeros(m, k); % Lagrange multipliers
%% main loop
fit = 0;
for iter = 1 : maxiter
fitold = fit;
% update B
ete = (S' * S) .* (P' * P); % compute E'E
b = 2 * ete + u * eye(k);
c = 2 * mttkrp(X, {B, P, S}, 1) + u * P + U;
B = c / b;
% update P
ftf = (S' * S) .* (B' * B); % compute F'F
b = 2 * ftf + u * eye(k);
c = 2 * mttkrp(X, {B, P, S}, 2) + u * B - U;
P = c / b;
% update U
U = U + u * (P - B);
% update u
u = min(rho * u, umax);
% update S
tic;
[S, out] = OptStiefelGBB(...
S, @Sfun, opts, B, P, X, L, D, W, Y, alpha, beta);
tsolve = toc;
fprintf(...
['[S]: obj val %7.6e, cpu %f, #func eval %d, ', ...
'itr %d, |ST*S-I| %3.2e\n'], ...
out.fval, tsolve, out.nfe, out.itr, norm(S' * S - eye(k), 'fro'));
% update W
H = D * S;
W = (H' * H + gamma * eye(k)) \ H' * Y;
% compute the fit
T = ktensor({B, P, S});
normresidual = sqrt(normX ^ 2 + norm(T) ^ 2 - 2 * innerprod(X, T));
fit = 1 - (normresidual / normX);
fitchange = abs(fitold - fit);
if mod(iter, printitn) == 0
fprintf(' Iter %2d: fitdelta = %7.1e\n', iter, fitchange);
end
% check for convergence
if (iter > 1) && (fitchange < fitchangetol)
break;
end
end
%% clean up final results
T = arrange(T); % columns are normalized
fprintf('factorization error %3.2e\n', fit);
end
I know there is little context here, but my suspicion is that I need Simulink, since Sfun appears to be a Simulink-related function(?). The script requires two toolboxes: tensor_toolbox and FOptM.
They are available here:
https://www.sandia.gov/~tgkolda/TensorToolbox/index-2.6.html
https://github.com/andland/FOptM
Thanks so much for your help,
Paul
Answer (score: 1)
Although SFun is a common abbreviation for a Simulink S-Function, in this case the error has nothing to do with Simulink, and the name is a coincidence. (There is no Simulink-related function specifically called Sfun; it is just a general term.) Your error message contains @Sfun, which is the MATLAB syntax for creating a function handle to an (m-code) function called Sfun. Judging from the code you have shown, that handle is the cost function used in the optimization.
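As a minimal, hypothetical illustration of a function handle (the name myCost is made up and is not part of the tBNE code), a script like the following passes a handle to a local function and invokes it via feval, just as OptStiefelGBB does with @Sfun:

% @myCost creates a handle to the local function defined below; the handle
% can be passed to another routine and invoked later, e.g. via feval
h = @myCost;
val = feval(h, [1 2 3])    % returns 14, the same as calling myCost([1 2 3])

function f = myCost(x)     % made-up toy cost: squared norm of x
f = sum(x .^ 2);
end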
If you look at the code that yours is based on (tBNE_fun.m in that repository), you will see that at the end of the file there is a local function called Sfun. That is what you are missing.
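For reference, OptStiefelGBB evaluates the cost function as [F, G] = feval(fun, X, varargin{:}) (visible in your error trace), so the missing Sfun must accept S plus the extra arguments passed after opts and return the objective value and its gradient with respect to S. The skeleton below only shows that expected signature; the F and G assignments are placeholders, and the actual objective and gradient are defined in the original tBNE_fun.m:

function [F, G] = Sfun(S, B, P, X, L, D, W, Y, alpha, beta)
% Placeholder only: the real objective/gradient live in the original tBNE_fun.m.
% OptStiefelGBB calls this as [F, G] = feval(@Sfun, S, B, P, X, L, D, W, Y, alpha, beta).
F = 0;               % scalar objective value at S
G = zeros(size(S));  % gradient of the objective with respect to S
end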