logistic-regressions
August 6, 2024
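The truncated array below is the kind of output printed for the raw 8x8 pixel
images in scikit-learn's handwritten digits dataset. A minimal sketch of a cell
that would produce it, assuming the standard load_digits loader (the print call
is illustrative, not recovered from the original notebook):

from sklearn.datasets import load_digits

# Load the bundled digits dataset; digits.images has shape (1797, 8, 8),
# one 8x8 grid of pixel intensities (0..16) per sample.
digits = load_digits()
print(digits.images)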
[[ 0., 0., 0., …, 12., 0., 0.],
[ 0., 0., 3., …, 14., 0., 0.],
[ 0., 0., 8., …, 16., 0., 0.],
…,
[ 0., 9., 16., …, 0., 0., 0.],
[ 0., 3., 13., …, 11., 5., 0.],
[ 0., 0., 0., …, 16., 9., 0.]],
…,
is an integer in the range 0..16. This reduces dimensionality and gives
invariance to small distortions.

For info on NIST preprocessing routines, see M. D. Garris, J. L. Blue,
G. T. Candela, D. L. Dimmick, J. Geist, P. J. Grother, S. A. Janet, and
C. L. Wilson, NIST Form-Based Handprint Recognition System, NISTIR 5469, 1994.

.. topic:: References

  - C. Kaynak (1995) Methods of Combining Multiple Classifiers and Their
    Applications to Handwritten Digit Recognition, MSc Thesis, Institute of
    Graduate Studies in Science and Engineering, Bogazici University.
  - E. Alpaydin, C. Kaynak (1998) Cascading Classifiers, Kybernetika.
  - Ken Tang, Ponnuthurai N. Suganthan, Xi Yao, and A. Kai Qin. Linear
    dimensionality reduction using relevance weighted LDA. School of
    Electrical and Electronic Engineering, Nanyang Technological University.
    2005.
  - Claudio Gentile. A New Approximate Maximal Margin Classification
    Algorithm. NIPS. 2000.
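The accuracy and confusion matrix printed next refer to variables (accuracy,
confusion_mat, and later X_test, y_test, y_pred) whose defining cell did not
survive the export. A minimal sketch of a pipeline that would produce them,
assuming scikit-learn's LogisticRegression on a held-out split; the test_size,
random_state, and max_iter values here are assumptions, not the author's
confirmed settings:

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix

# Assumed reconstruction of the missing training cell: use the flattened
# 64-feature vectors (digits.data) and hold out 20% of samples for testing.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.2, random_state=42
)

# max_iter raised so the default lbfgs solver converges on this data.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
confusion_mat = confusion_matrix(y_test, y_pred)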
print("Accuracy:", accuracy)
print("Confusion Matrix:\n", confusion_mat)
import matplotlib.pyplot as plt

# Visualize some of the test images with their true and predicted labels
fig, axes = plt.subplots(nrows=3, ncols=4, figsize=(11, 11))
for i, ax in enumerate(axes.flat):
    ax.imshow(X_test[i].reshape(8, 8), cmap='binary')
    ax.set(title=f"True: {y_test[i]}, Predicted: {y_pred[i]}")
plt.show()
Accuracy: 0.9722222222222222
Confusion Matrix:
[[33 0 0 0 0 0 0 0 0 0]
[ 0 28 0 0 0 0 0 0 0 0]
[ 0 0 33 0 0 0 0 0 0 0]
[ 0 0 0 33 0 1 0 0 0 0]
[ 0 1 0 0 45 0 0 0 0 0]
[ 0 0 0 0 0 44 1 0 0 2]
[ 0 0 0 0 0 1 34 0 0 0]
[ 0 0 0 0 0 0 0 33 0 1]
[ 0 0 0 0 0 1 0 0 29 0]
[ 0 0 0 1 0 0 0 0 1 38]]
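Rows of the matrix are the true digits 0-9 and columns are the predictions;
the ten off-diagonal counts out of 360 test samples match the reported 97.2%
accuracy. For a labeled rendering of the same matrix, a short sketch using
scikit-learn's ConfusionMatrixDisplay (available in scikit-learn >= 1.0; an
optional extra, not part of the original notebook):

import matplotlib.pyplot as plt
from sklearn.metrics import ConfusionMatrixDisplay

# Plot the confusion matrix with labeled axes from the predictions above.
ConfusionMatrixDisplay.from_predictions(y_test, y_pred)
plt.show()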