How much data can clmsync write?

PostPosted: Thu Jun 04, 2015 10:39 am
by nickoppen

I'm progressing slowly with my neural network program and I'm at the stage where I want to pass a lot of data to my kernel to calculate the weight adjustments from the network. To start off with I thought I'd load a large number of paired input/output vectors into global memory using clmsync and then iterate through them.

The thing I'm wondering about is where does clmsync put the data and how big is that space. I've read the eSDK reference and I'm assuming that clmsync uses e_write and e_read to write to and read from an "external memory buffer" but I don't know where that is and how big it is.

Can anyone give me a clue?


Re: How much data can clmsync write?

PostPosted: Thu Jun 04, 2015 11:43 am
by dar
When targeting the Epiphany device, clmsync copies data from the platform memory your host application uses into the ~32 MB of shared DRAM, from which your kernels can read it. There is some redundancy in the current design, left over from GPU targets: in principle it should be possible to allocate (clmalloc) the shared DRAM directly and use it from the host, though that raises concerns about collisions. Instead, the original accelerator semantics were retained and the implementation is kept safe; there are other problems with trying to reclaim the memory used by the redundancy as well. If you follow the defined semantics of clmsync you will not have any problems with the redundant allocation, it's just an implementation detail.
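For context, the typical host-side pattern looks something like the sketch below. It follows the STDCL-style API from COPRTHR (clmalloc, clmsync, clwait on the stdacc context); the buffer size and flags are illustrative only, and this is untested here.

```c
/* Sketch of the clmalloc/clmsync pattern discussed above.
 * Assumes the COPRTHR STDCL API; sizes and flags are illustrative. */
#include <stdcl.h>

int main(void)
{
	size_t n = 1024;

	/* clmalloc allocates a buffer that can be synchronized with the
	   shared DRAM visible to the Epiphany; flag 0 gives the default. */
	float* vecs = (float*)clmalloc(stdacc, n * sizeof(float), 0);

	/* ... fill vecs[] with training data on the host ... */

	/* Push the host-side copy out to shared DRAM so kernels can read it. */
	clmsync(stdacc, 0, vecs, CL_MEM_DEVICE | CL_EVENT_NOWAIT);

	/* ... fork kernels here, then pull results back to the host: */
	clmsync(stdacc, 0, vecs, CL_MEM_HOST | CL_EVENT_NOWAIT);
	clwait(stdacc, 0, CL_ALL_EVENT);

	clfree(vecs);
	return 0;
}
```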

As for the maximum size, I do not know off-hand. Some of the DRAM is reserved for other uses, so my guess is you have "most" of the 32 MB. If you check the source code for dmalloc you can work out how to calculate the exact bound.


Re: How much data can clmsync write?

PostPosted: Thu Jun 04, 2015 12:08 pm
by nickoppen
Hi David,

Thanks once again for your detailed response.

32 MB, even if I can't use all of it, means that the real memory constraint is the local memory on the Epiphany for the data it needs to hold.

It will be ample for the training data sets I'll use in this phase. My biggest training sets to date are 1-2 MB.