emf-rbm
Omniglot test does not use RBM features
We use the RBM to generate features for the SVM or Logistic Regression, so we need to transform X_train into features (F_train). Example:
    from sklearn.neural_network import BernoulliRBM
    from sklearn.svm import LinearSVC
    from sklearn.metrics import accuracy_score

    # Fit the RBM on the raw training data, then use transform() to get hidden-unit features
    rbm = BernoulliRBM()
    rbm = rbm.fit(X_train)
    F_train = rbm.transform(X_train)
    F_test = rbm.transform(X_test)

    # Train a linear classifier on the RBM features and score it on the test set
    classifier = LinearSVC()
    classifier.fit(F_train, train_t)
    Y_test_rbm_pred = classifier.predict(F_test)
    emf_accuracy = accuracy_score(y_pred=Y_test_rbm_pred, y_true=test_t)
For the EMF RBM, we need to implement a transform() method based on sig_means():
    from sklearn.utils.fixes import expit
    from sklearn.utils.extmath import safe_sparse_dot

    def sig_means(x, b, W):
        # Hidden-unit means: sigmoid(x W^T + b)
        a = safe_sparse_dot(x, W.T) + b
        return expit(a, out=a)
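A minimal sketch of what such a transform() could look like, assuming the EMF RBM stores its weights and hidden biases in sklearn-style attributes (components_ and intercept_hidden_ are assumed names here, not necessarily the actual emf-rbm attributes):

    # Sketch only: attribute names follow sklearn's BernoulliRBM conventions and are assumptions
    def transform(self, X):
        # Latent representation of X: the hidden-unit means computed by sig_means()
        return sig_means(X, self.intercept_hidden_, self.components_)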
See
RBM_Baseline_Ominglot.ipynb
    rbm = BernoulliRBM()
    rbm = rbm.fit(X_train)
    F_train = rbm.transform(X_train)
    F_test = rbm.transform(X_test)

    classifier = LinearSVC()
    classifier.fit(F_train, train_t)
I do this following the sklearn example, using a Pipeline. The statement

    classifier = Pipeline(steps=[('rbm', B_rbm), ('logistic', logistic)])

does the same.
Followed by:

    classifier.fit(X_train, Y_train)
    Y_test_emf_pred = classifier.predict(X_test)
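For context, here is a self-contained sketch of that pipeline, modeled on the scikit-learn RBM-features-for-digit-classification example; the hyperparameters and the dummy data are assumptions for illustration, not values taken from the notebook:

    import numpy as np
    from sklearn.neural_network import BernoulliRBM
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline

    # Dummy binary data standing in for the Omniglot arrays used above
    rng = np.random.RandomState(0)
    X_train = rng.randint(2, size=(200, 64)).astype(float)
    Y_train = rng.randint(10, size=200)
    X_test = rng.randint(2, size=(50, 64)).astype(float)

    # RBM feature extractor followed by a logistic-regression classifier
    B_rbm = BernoulliRBM(n_components=256, learning_rate=0.01, n_iter=20, random_state=0)
    logistic = LogisticRegression(max_iter=1000)
    classifier = Pipeline(steps=[('rbm', B_rbm), ('logistic', logistic)])

    # Pipeline.fit() calls B_rbm.fit_transform(X_train) internally, so the logistic
    # regression is trained on the RBM's hidden-unit features rather than on X_train itself
    classifier.fit(X_train, Y_train)
    Y_test_emf_pred = classifier.predict(X_test)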
Why do we need the sig_means transform only for emf-rbm and not for the BernoulliRBM?
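For reference, scikit-learn's BernoulliRBM already provides a transform() that returns the hidden-unit conditional means, i.e. the same quantity sig_means() computes; roughly:

    from scipy.special import expit
    from sklearn.utils.extmath import safe_sparse_dot

    # Roughly what BernoulliRBM.transform(X) returns for a fitted rbm:
    # the hidden-unit means sigmoid(X W^T + b_hidden)
    def bernoulli_hidden_means(rbm, X):
        return expit(safe_sparse_dot(X, rbm.components_.T) + rbm.intercept_hidden_)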