
A version of Python for Epiphany

PostPosted: Fri Oct 16, 2015 9:22 am
by polas
Hi all,

I have written a version of Python for the Epiphany, called ePython. It implements a subset of the language and comes with documentation and a number of examples that illustrate using the language. It supports running over physical cores (i.e. on the Epiphany) and virtual cores (on the ARM host), which look identical, and code can run in a hybrid fashion across both with all cores communicating.

The code is at , and to get started do a make, then sudo make install, then bash to start a new bash environment (required as PYTHONPATH is updated to point to the install location). Alternatively you can run it without installing via the ./ script.

For instance, a simple hello world:

Code: Select all
print "Hello World"

If you save this, then run it with epython, each physical core will print Hello World to stdout. By doing something like epython -h 8 -d 5, it will run over 8 virtual cores (on the host) and 5 physical cores (on the device).

Code: Select all
import parallel
a=bcast(random%100, 0)
print "The random number from core 0 is "+a

The above code broadcasts a random number from core 0 to all other (physical and virtual) cores, which then display this number. This example uses the parallel functions provided (via import parallel) and is a good test that PYTHONPATH is correctly set.
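As an aside, the semantics of bcast can be pictured in plain Python. This is an illustrative sketch, not ePython's actual implementation: given one value per core, a broadcast from the root core leaves every core holding the root's value.

```python
# Illustrative plain-Python sketch of bcast semantics (not the ePython
# implementation): per_core_values holds one value per core, and the
# broadcast replaces every core's value with the root core's value.
def bcast(per_core_values, root):
    return [per_core_values[root]] * len(per_core_values)

# Core 0 holds 42; after the broadcast all four cores see 42.
print(bcast([42, 7, 13, 99], 0))  # → [42, 42, 42, 42]
```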

This is still very much a beta, but it allows one to write simple codes very quickly and get them running with no fuss. I think even a novice programmer could get the examples here running within a minute and then tweak them to explore what the Epiphany is capable of.


Re: A version of Python for Epiphany

PostPosted: Fri Oct 16, 2015 3:54 pm
by aolofsson
This is brilliant! Well done! I will spread the word, hopefully it will help some folks get over the "fear of parallel" hurdle.

Re: A version of Python for Epiphany

PostPosted: Fri Oct 16, 2015 4:19 pm
by polas
Great. I have also put up a guide to writing and running your own code on the Epiphany cores, taking a novice to running code in less than 60 seconds.

This is at viewtopic.php?f=49&t=3249

Re: A version of Python for Epiphany

PostPosted: Thu Nov 05, 2015 2:39 pm
by aolofsson

Same request I made to Jan :D

Do you think you can place a small "hello world" example in the parallella_examples directory for epython?
It should include a build/run script and indicate any prerequisites (like git clone and building your epython repo).


Re: A version of Python for Epiphany

PostPosted: Mon Nov 16, 2015 2:20 pm
by MiguelTasende
This looks totally amazing!
I am testing it now, and I can't believe how easy it is to use.

I would like to know a bit more about how it works.
That is, to guess how good the performance would be in different scenarios and to learn the limitations of the library (and possibly enhance or improve it, if I can). I will look at the source code and try to find out, but if you could give any "big picture" ideas about the "hidden magic", that would be great.

Anyway (performance or not), congratulations! It looks like great work!

In fact I've been developing some software in C with the eSDK, but now I will try ePython and consider switching... (I may keep both tools at hand too, we'll see...).

Re: A version of Python for Epiphany

PostPosted: Tue Nov 17, 2015 1:47 pm
by MiguelTasende
Some details...

Broadcast and send don't seem to be working for host-to-device communications.

Also, I didn't find a good way to get the "host coreid" from a "device" (so as to call recv). I need to know it from "outside" (code below):

(this is the command...)
>> epython -h 1 -d 16

(and the code...)
Code: Select all
import parallel
import util

if ishost():
  print "I am the host"
elif coreid()==4:
  x=recv(16)  # the host core's id (16 here) has to be known in advance
  print "Received "+x

Edit: I thought there was another problem with printing, but it seems to be elsewhere. OK, below is the code for a simple parallel matrix multiplication with ePython, to which you have to pass the values one by one via input (not too practical). Just for fun :)

I run it with:

>>epython -d 2

Code: Select all
import parallel
import util

N=2  # matrix dimension (assumed; it was not set in the original post)

dim a[N]
dim b[N]
dim c[N*N]

i=0
while i<N:
  print "Core: "+coreid()+" a("+i+","+coreid()+")= "
  a[i]=input("Enter a value: ")
  print "Core: "+coreid()+" b("+coreid()+","+i+")= "
  b[i]=input("Enter a value: ")
  i=i+1

l = 0
for j in range(N-1):
  for i in range(N-1):
    c[l] = a[i]*b[j]
    reducido = reduce(c[l],"sum")
    if coreid()==0:
      print "c("+i+","+j+")= "+reducido
    l=l+1

Re: A version of Python for Epiphany

PostPosted: Wed Nov 18, 2015 11:15 pm
by polas
Great, thanks for trying it, posting bugs and especially posting some code :) The code is very much beta and I will look into and fix the host-to-device communication issue you mentioned for p2p and bcast. Please let me know of any other issues you find.

Yes, the host id from a device is a good point. I suppose the solution would be for me to add num_device_cores and num_host_cores calls (which should be quite trivial) and then a mapping function which translates the relative host id to the absolute core id (host ids always follow the device ids).
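Since the host cores' absolute ids follow directly on from the device cores', such a mapping could be as simple as an offset. A hypothetical sketch in plain Python (the function names are invented for illustration and are not part of ePython's API):

```python
# Hypothetical sketch of the proposed mapping (names invented, not ePython
# API): device cores occupy absolute ids 0..num_device_cores-1, and host
# (virtual) cores follow on after them.
def host_to_absolute_id(relative_host_id, num_device_cores):
    return num_device_cores + relative_host_id

def absolute_to_relative_host_id(absolute_id, num_device_cores):
    return absolute_id - num_device_cores

# With "epython -h 1 -d 16" there are 16 device cores, so the single
# host core has absolute id 16.
print(host_to_absolute_id(0, 16))  # → 16
```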

Due to the memory limits per core on the Epiphany, ePython does the lexing and parsing on the host (using Flex and Bison) to "compile" the source into a byte code representation (which can be written out via command line arguments). This is designed to have as small a memory footprint as possible and is transferred onto the device via a memory copy. The device cores (and host threads, if selected) run an interpreter which then executes the byte code itself.

An additional thread on the host acts as a "monitor"; device cores communicate with it via a memory copy to do I/O, maths functions such as cos, tan etc. (as we don't want to put a maths library onto the cores), string handling and so on.

By default it tries to put the variable values in core memory too, but for big arrays this is not always possible. There is some logic to switch the variables and/or byte code into shared memory (which can also be done explicitly via command line switches) if they don't fit into device memory, but this comes at a performance penalty (quite a noticeable one actually; ePython provides a timing command line option where you can see the impact of this).
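To make the "compile to byte code, then interpret on each core" idea concrete, here is a toy sketch in plain Python. The byte code format and op names are invented purely for illustration; ePython's real representation is different and far more compact:

```python
# Toy sketch of the execution model described above (invented byte code,
# not ePython's actual format): the host "compiles" source into a compact
# instruction stream, and each core runs a small interpreter loop over it.
def interpret(bytecode, memory):
    stack, pc = [], 0
    while pc < len(bytecode):
        op = bytecode[pc]
        if op == "PUSH":          # push a constant onto the stack
            pc += 1
            stack.append(bytecode[pc])
        elif op == "LOAD":        # load a variable from core memory
            pc += 1
            stack.append(memory[bytecode[pc]])
        elif op == "ADD":         # add the top two stack values
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "STORE":       # store top of stack into core memory
            pc += 1
            memory[bytecode[pc]] = stack.pop()
        pc += 1
    return memory

# "a = 2 + 3" compiled to the invented byte code:
mem = interpret(["PUSH", 2, "PUSH", 3, "ADD", "STORE", "a"], {})
print(mem["a"])  # → 5
```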

In terms of performance, it is currently quite slow - but the whole idea is education, getting people writing parallel code really quickly and running it on this architecture. I could imagine that, based on this, they would then explore some other tools and build on the initial knowledge gained. I think there are plenty of places where the code could be sped up, and lots of additional functionality which could be added too.


Re: A version of Python for Epiphany

PostPosted: Fri Dec 11, 2015 7:30 pm
by gartor
Sorry, this is taking a lot longer than 60 seconds.

What should EPIPHANY_DEV be set to?
Is the Epiphany Shared Memory Manager only available on the headless kernel?
I'm not using the headless kernel. How do I start it?

linaro@linaro-nano:~/epython$ ./ -d 5
epython-host: e_init(): EPIPHANY_DEV file open failure.
epython-host: e_init(): Failed to initialize the Epiphany Shared Memory Manager.
Error on initialisation
epython-host: ee_write_esys(): EPIPHANY_DEV file open failure.
epython-host: e_reset_system(): Failed

Error on system reset
epython-host: e_open(): Platform was not initialized. Use e_init().
Error opening Epiphany
epython-host: e_alloc(): Platform was not initialized. Use e_init().
Error allocating memory


Re: A version of Python for Epiphany

PostPosted: Tue Dec 15, 2015 8:22 am
by polas
Hi Gartor,

I am sorry to hear you are having issues. From looking at this, it seems more an issue finding/connecting to the underlying Epiphany. What version of the OS image are you using - the latest one or an older one? Especially if it is an older image, try running this as root (via sudo) - older versions of the API required you to be root in order to connect to the Epiphany, but this was fixed later on. The .sh script *should* handle this but has only been tested on my headless image - just edit the .sh script and put a hash (#) before lines 23, 24, 25, 26 and 28, as it might not be picking up the correct OS version.

If this doesn't work, do other examples (such as those in the test directory on the image) all work OK? The wrapper script requires that the EPIPHANY_HOME environment variable is set, so it is worth checking that this is correct (via the export command in bash). The script also makes some assumptions about the location of your library directory and HDF file; it is worth checking that these are correct on your system - for instance, does /opt/adapteva/esdk/bsps/current/parallella_E16G3_1GB.hdf exist for you?

Let me know how you get on,

Re: A version of Python for Epiphany

PostPosted: Tue Jan 05, 2016 11:40 am
by ioiomi
I want to install it, but:
Code: Select all
parallella@parallella:~/epython-master$ sudo make
device-support.c:35:19: fatal error: e-hal.h: No such file or directory
 #include "e-hal.h"
compilation terminated.
make[1]: *** [device-support.o] Error 1
make[1]: Leaving directory `/home/parallella/epython-master/host'
make: *** [host-build] Error 2

What can I do? Thanks.