Recognizing hand-drawn numbers with Peckis

My tiny experiment playing with the MNIST dataset of handwritten digits. Sadly, the end result can no longer be viewed live :( It guesses numbers pretty terribly, because my focus was more on how to set up the whole infrastructure and connect all the moving parts together.

The backend is built with Flask and TensorFlow: https://github.com/mseimys/peckis. The frontend was crafted using plain React: https://github.com/mseimys/peckis-ui. Here is a small screenshot in case the site is offline:
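To give an idea of how the moving parts connect, here is a minimal sketch of what the Flask side could look like: one endpoint that takes pixel data from the React frontend and returns the predicted digit. The route name, model file, and JSON shape are assumptions for illustration, not necessarily what the repo actually uses.

```python
# Hedged sketch of a Flask + TensorFlow prediction endpoint.
# "mnist_model.h5" and the "pixels" payload field are hypothetical.
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)
model = tf.keras.models.load_model("mnist_model.h5")  # previously trained model

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a flat list of 784 grayscale pixel values from the frontend canvas.
    pixels = np.array(request.get_json()["pixels"], dtype="float32")
    image = pixels.reshape(1, 28, 28, 1) / 255.0
    probabilities = model.predict(image)[0]
    return jsonify({"digit": int(np.argmax(probabilities))})

if __name__ == "__main__":
    app.run(debug=True)
```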

Attempt #1

The MNIST dataset is nice, but on its own it is not enough to pull this project off. Firstly, all the digits in it are fairly small, slightly blurred, consistently oriented and centered, and you have to take that into account. So my first rough attempt to improve accuracy: take the MNIST training set and augment it by zooming each image in or out by a random amount. This made a visible improvement!
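A minimal sketch of that random-zoom augmentation, assuming the standard Keras MNIST loader; the zoom range and batch size here are illustrative guesses, not the values used in the repo.

```python
# Hedged sketch: augment MNIST training images with a random zoom.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

# MNIST images are 28x28 grayscale; add a channel axis and scale to [0, 1].
x_train = x_train.reshape(-1, 28, 28, 1).astype("float32") / 255.0

# Randomly zoom each training image in or out by up to 30% on the fly.
augment = tf.keras.preprocessing.image.ImageDataGenerator(zoom_range=0.3)
train_batches = augment.flow(x_train, y_train, batch_size=64)

# model.fit(train_batches, epochs=5)  # feed the augmented batches to the model
```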

Sample images from the MNIST dataset

Future attempt #2

I will probably work on a more complicated model if I have time. Ideas that come to mind:

  • adjust the "pencil width".
  • heavily blur/add noise to the incoming image.
  • use exactly the same preprocessing when training as when serving predictions (see the sketch after this list).
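A hedged sketch of what such a shared preprocessing step might look like, run identically on MNIST images during training and on the incoming drawing at prediction time. The dilation size, blur sigma, and noise level are guesses for illustration, not tuned values.

```python
# Hedged sketch: one preprocessing function used for both training and inference.
import numpy as np
from scipy import ndimage

def preprocess(img: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """img is a 28x28 float array with values in [0, 1]."""
    # Thicken strokes slightly to mimic a wider "pencil".
    img = ndimage.grey_dilation(img, size=(2, 2))
    # Blur heavily so fine detail from the browser canvas matters less.
    img = ndimage.gaussian_filter(img, sigma=1.0)
    # Add a little noise so the model never sees perfectly clean pixels.
    img = img + rng.normal(scale=0.05, size=img.shape)
    return np.clip(img, 0.0, 1.0)

rng = np.random.default_rng(0)
# Apply identically to every training image and to each incoming drawing, e.g.:
# x_train_pre = np.stack([preprocess(x, rng) for x in x_train])
```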