I have a binary label that I'm trying to predict using 13 variables.
I used an SVM in R (ksvm, to be precise) and I'd like to obtain the function of the separating hyperplane (i.e., the weights on the variables) so that I can use it in other data systems. Any ideas?
Thanks!
Answer 0 (score: 1)
Here is a demonstration with the kernlab package for a two-class case:
library(kernlab)
# set seed to make this reproducible
set.seed(101)
# create a matrix of inputs: two 2-column clusters of 60 points each
x <- rbind(matrix(rnorm(120), ncol = 2), matrix(rnorm(120, mean = 3), ncol = 2)); head(x)
# create an output...something simple and contrived
y <- matrix(c(rep(1,60),rep(-1,60))); head(y); tail(y)
# train svm model
our_svm <- ksvm(x,y,type="C-svc")
# if you want to plot classification results, run plot(our_svm,data=x)
# get the weights of the classifier
(w <- colSums(coef(our_svm)[[1]] * x[unlist(alphaindex(our_svm)),]))
# get the intercept
(b <- b(our_svm))
# our classifier takes the form g(x) = sign(f(x)),
# where f(x) = w*x + b, and input variables x are SCALED AND TRANSPOSED
# scale it
x_sc <- scale(x); head(x_sc)
# get the f(x) (don't forget to transpose x!)
(f_x <- colSums(t(x_sc)*w) + b)
# get the sign, which is the class of the inputs
(g_x <- sign(f_x))
# if we run fitted(our_svm), we'll see it came up with the same results
# as our manual calculations
table(Manual_Calc = g_x, From_Model = fitted(our_svm))
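If the goal is to carry this function into another data system, it may help to fold the centering/scaling that scale() applied into the coefficients, so the same f(x) can be evaluated directly on raw, unscaled inputs. A minimal sketch, assuming the w, b, x_sc and f_x computed above:
# recover the training means/sds that scale() used
ctr <- attr(x_sc, "scaled:center")
scl <- attr(x_sc, "scaled:scale")
# fold them into the coefficients: w * ((x - ctr)/scl) + b == w_raw * x + b_raw
w_raw <- w / scl
b_raw <- b - sum(w * ctr / scl)
# evaluate on the RAW inputs and confirm it matches f_x from above
f_x_raw <- colSums(t(x) * w_raw) + b_raw
all.equal(unname(f_x), unname(f_x_raw))
w_raw and b_raw are then the plain linear coefficients you can copy into another system.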
Now, if you have any new inputs, just scale them, transpose, and plug them into f(x), then take g(x) to get their class; these two functions are your SVM classifier.
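As a concrete illustration of that last step, here is a small sketch scoring a made-up matrix of new observations (new_x is hypothetical), following the same f(x)/g(x) as above. Note the new inputs must be scaled with the training set's centering and scaling values, not their own:
# hypothetical new inputs with the same two columns as the training x
new_x <- matrix(c(0.5, 2.8, -1.0, 3.2), ncol = 2, byrow = TRUE)
# scale using the TRAINING centers/scales stored on x_sc
new_sc <- scale(new_x,
                center = attr(x_sc, "scaled:center"),
                scale  = attr(x_sc, "scaled:scale"))
# same f(x) and g(x) as above
(f_new <- colSums(t(new_sc) * w) + b)
(g_new <- sign(f_new))
# compare with the model's own predictions
predict(our_svm, new_x)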
Answer 1 (score: 0)