Explain to me in depth how ML number recognizers work

I’ve coded genetic-algorithm-based permutation ML data models before, so I know the technical terms. What I’m wondering is this: say we have a grid where each cell of the matrix holds a value from 0 to 1. With that as our input, how can we create a deep Q-learning network that recognizes digits from previously fed examples that we labeled as digits? It feels a lot like the XOR problem to me, but I’m not sure. After all, how do you even define the target for something like this?
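To make the setup concrete, here's a minimal sketch of what I mean by the input and target, assuming an MNIST-style 28x28 grid and a plain supervised (classification) framing rather than Q-learning; the shapes and the digit label are just illustrative:

```python
import numpy as np

# Hypothetical 28x28 grid; each cell is a pixel intensity in [0, 1].
rng = np.random.default_rng(0)
grid = rng.random((28, 28))

# Flatten the grid into one input vector for the network.
x = grid.reshape(-1)  # shape (784,)

# Supervised target: a one-hot vector over the 10 digit classes.
label = 7             # assume this grid was labeled as the digit "7"
y = np.zeros(10)
y[label] = 1.0
```

So the "target model" question, as I understand it, reduces to: the network outputs 10 scores and is trained to push the score of the labeled class toward 1 and the rest toward 0. Is that the right way to think about it, or does the Q-learning formulation change things?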
