Hi Nick, I changed your code so that the JIT compiles only once instead of every time when there are multiple samples in the .dat file, and added a result CSV log. The code is available at
https://github.com/leonfg/epiphanyANN.git.
In addition, I wrote a MATLAB project to train an MLP, export the trained weights and biases in .nn format, and generate test samples in .dat format. I used the UCI Iris data set to train a 4/10/3 sigmoid BP network and generated several sets of .nn and .dat files. The file sets differ only in the ratio of samples selected for training, which leads to different prediction accuracy in MATLAB. The strange thing is that on the Parallella, some file sets produce exactly the same predictions as MATLAB, while others give totally different, wrong results. I cannot explain this: the file sets are all generated by the same MATLAB code, and the only difference is the training-sample selection ratio, so technically they should all produce the same results on the Parallella as they do in MATLAB.
I just want to find a faster way to train; importing the trained network model into the Epiphany would be enough for most applications. But right now I am stuck here.