002-Experiments with MNIST, Logistic regression
1. Setting
• Normalize data to [0, 0.25] and apply a Gaussian blur, to simulate spectrum addition.
• Model: a CNN whose last layer compresses each channel to a single value, followed by a linear layer.
• Logistic regression, BCE loss.
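The preprocessing in the Setting (scale into [0, 0.25], then blur) can be sketched in plain NumPy. The blur width `sigma` and kernel `radius` are assumed values, since the note does not specify them:

```python
import numpy as np

def preprocess(img, sigma=1.0, radius=2):
    """Scale a uint8 image into [0, 0.25], then apply a Gaussian blur.

    sigma/radius are assumed values; the note gives no blur width.
    """
    x = img.astype(np.float64) / 255.0 * 0.25   # normalize to [0, 0.25]
    t = np.arange(-radius, radius + 1)
    k = np.exp(-(t ** 2) / (2.0 * sigma ** 2))
    k /= k.sum()                                # kernel is non-negative and sums
                                                #   to 1, so output stays in [0, 0.25]
    x = np.pad(x, radius, mode="edge")          # separable blur: rows, then columns
    x = np.apply_along_axis(np.convolve, 1, x, k, mode="valid")
    x = np.apply_along_axis(np.convolve, 0, x, k, mode="valid")
    return x

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(28, 28), dtype=np.uint8)  # stand-in MNIST image
out = preprocess(img)                                      # shape (28, 28), values in [0, 0.25]
```

Because the kernel is normalized and non-negative, blurred mixtures such as w1*x1 + w2*x2 with w1 + w2 = 1.0 also stay within [0, 0.25].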
2. Experiment 1: MNIST split
2.1 Config
• MNIST data, official split into training and test sets.
• Classes 0-7 as the recognition target.
• Full CNN: 3 CNN layers, each with LayerNorm + ReLU activation + max pooling; the last CNN layer compresses each channel to a single value, followed by a linear layer.
• Check EER by pooling all the positive and negative recognition scores over the 8 classes.
• Compute the BCE over the positive samples (target 1) and the negative samples (target 0).
• x: image; y: label; p1 = f(w1*x1), p2 = f(w2*x2), p3 = f(w1*x1 + w2*x2).
• Training: weight w sampled from 0.2-1.0; for mixtures, w1 and w2 sampled from 0.2-1.0 with w1 + w2 = 1.0.
• Test: sample weights and mix 0-3 images from classes 0-7.
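The recognition pipeline in the bullets above can be sketched as follows. `score` is a stand-in random linear scorer on flattened pixels (the note uses a 3-layer CNN plus a linear layer), `f` is the logistic sigmoid, and `eer` is a simple threshold sweep over pooled scores; all of these names are illustrative, not from the note:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(z):
    """Logistic sigmoid: the f in p = f(w*x)."""
    return 1.0 / (1.0 + np.exp(-z))

def bce(p, y, eps=1e-7):
    """Binary cross-entropy; target y is 1 for positives, 0 for negatives."""
    p = np.clip(p, eps, 1.0 - eps)
    return float(-(y * np.log(p) + (1.0 - y) * np.log(1.0 - p)).mean())

def eer(pos, neg):
    """Equal error rate over pooled positive/negative scores (simple sweep)."""
    thr = np.sort(np.concatenate([pos, neg]))
    far = np.array([(neg >= t).mean() for t in thr])  # false accept rate
    frr = np.array([(pos < t).mean() for t in thr])   # false reject rate
    i = int(np.argmin(np.abs(far - frr)))
    return float((far[i] + frr[i]) / 2.0)

# Stand-in for the CNN + linear head: a fixed linear scorer on flattened pixels.
W = rng.normal(size=28 * 28)
def score(x):
    return float(W @ x.ravel())

# Two preprocessed images and a training-style mixture (w1 + w2 = 1.0,
# with each weight kept inside the note's 0.2-1.0 sampling range):
x1 = rng.random((28, 28)) * 0.25
x2 = rng.random((28, 28)) * 0.25
w1 = rng.uniform(0.2, 0.8)
w2 = 1.0 - w1
p1 = f(score(w1 * x1))              # p1 = f(w1*x1)
p2 = f(score(w2 * x2))              # p2 = f(w2*x2)
p3 = f(score(w1 * x1 + w2 * x2))    # p3 = f(w1*x1 + w2*x2)

loss = bce(np.array([p1, p2, p3]), np.array([1.0, 1.0, 1.0]))  # all three as positives
e = eer(np.array([0.9, 0.8, 0.7]), np.array([0.1, 0.2, 0.3]))  # separable toy scores
```

In a real run, `score` would be the trained CNN, the positive/negative scores fed to `eer` would be pooled over all 8 target classes, and negatives (target 0) would enter `bce` with y = 0.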
2.2 Main results