graphics assist/FPGA.. feasible?

Using Zynq Programmable Logic and Xilinx tools to create custom board configurations

graphics assist/FPGA.. feasible?

Postby dobkeratops » Sat Jun 06, 2015 9:18 pm

Is the FPGA theoretically capable of implementing hardware that could assist the Epiphany chip in running graphics (shaders, etc.) - especially texture sampling (access to standard compressed formats), Z-buffering (GPUs have tricks for compression), and maybe more?

I'm guessing a full-on FPGA GPU would be out of the question, and slow anyway.

I'd expect the Epiphany to be pretty good at vertex processing, but that alone might not use its full capabilities.
dobkeratops
 
Posts: 189
Joined: Fri Jun 05, 2015 6:42 pm
Location: uk

Re: graphics assist/FPGA.. feasible?

Postby piotr5 » Sun Jun 07, 2015 7:58 am

I don't think a full on-FPGA GPU is out of the question, and it wouldn't be slow, because FPGA designs are completely parallel. So only the size is a limitation - and of course an FPGA GPU would run much hotter than a real GPU and waste much more energy. AFAIK someone has already implemented a standard framebuffer on the FPGA, for the self-contained SDR project shown in the Tokyo talk. At least for me, a GPU is a major reason for buying the embedded Parallella model; if anything, only this model has the FPGA capacity required for full 3D acceleration...
piotr5
 
Posts: 230
Joined: Sun Dec 23, 2012 2:48 pm

Re: graphics assist/FPGA.. feasible?

Postby dobkeratops » Sun Jun 07, 2015 9:56 am

full GPU in FPGA... that's rather interesting to hear.

Would there be virtue in the hybrid approach - leveraging the Parallella ASIC more (executing the shader code), then relying on the FPGA for the details of memory addressing?

I suppose FPGA GPUs could implement specific common shaders directly, rather than having to run shader software, if you knew which shaders you wanted to run most often.

Does the FPGA have hardware multipliers & RAM?

Re: graphics assist/FPGA.. feasible?

Postby piotr5 » Sun Jun 07, 2015 11:27 am

I can't say much about what the FPGA is capable of, but just do the math: a resolution of 1280x1024 is 1280 kilo-pixels; at 32-bit colour that's 5 MB per frame, and if each pixel gets read, modified and written back, about 15 MB of traffic per frame. Epiphany can only transfer about 1 GB/s, so that's about 68 frames per second. So while you'd be able to use Epiphany to write the displayed frames directly, I'm afraid textures have a higher total resolution, since the GPU scales them down. I suppose it's possible to write a game running on Parallella with lots of detail and physics and such, but that won't suffice to port an existing game over to Epiphany. As you said, some stuff like lighting and shading needs to be implemented in the FPGA. So if you plan to release a game for Parallella, you'd need to distribute it on an SD card which boots into the game and uploads a custom FPGA image - you'd need to master programming all of these things. Of course you can turn valuable FPGA space into memory, but why not use the 1 GB chip provided?
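The arithmetic above can be sanity-checked in a few lines (a sketch; the ~1 GiB/s link figure and the assumption that each pixel is touched three times come from the post, not from measurements on hardware):

```python
# Frame-buffer bandwidth estimate for 1280x1024 at 32-bit colour,
# assuming each pixel is read, modified and written back (3 touches).
WIDTH, HEIGHT = 1280, 1024
BYTES_PER_PIXEL = 4            # 32-bit colour
TOUCHES_PER_PIXEL = 3          # read-modify-write (assumption from the post)
LINK_BYTES_PER_SEC = 1 << 30   # ~1 GiB/s off-chip bandwidth

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL       # 5 MiB per frame
traffic_per_frame = frame_bytes * TOUCHES_PER_PIXEL  # ~15 MiB of traffic
fps = LINK_BYTES_PER_SEC / traffic_per_frame

print(f"frame: {frame_bytes / 2**20:.0f} MiB, "
      f"traffic: {traffic_per_frame / 2**20:.0f} MiB, fps: {fps:.0f}")
# prints: frame: 5 MiB, traffic: 15 MiB, fps: 68
```

Note that dropping the read-modify-write assumption (one write per pixel) would triple the figure to roughly 200 fps, so the touch count is the whole argument.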

Re: graphics assist/FPGA.. feasible?

Postby dobkeratops » Sun Jun 07, 2015 3:00 pm

Heh, I don't think it would be wise to consider the Parallella for gamedev; we've seen Cell and Ageia PhysX come and go in the same niche.

However, it would be nice to see how far it could get as a jack-of-all-trades: could it achieve passable performance in various areas, even outside the niches in which it excels? That would get more use out of the devices that have been sold, and give people more incentive to buy them..
Last edited by dobkeratops on Sun Jun 07, 2015 10:39 pm, edited 2 times in total.

Re: graphics assist/FPGA.. feasible?

Postby 9600 » Sun Jun 07, 2015 7:23 pm

Antmicro implemented a graphics IP core for driving an LCD.

Regards,

Andrew
Andrew Back (a.k.a. 9600 / carrierdetect)
9600
 
Posts: 997
Joined: Mon Dec 17, 2012 3:25 am

Re: graphics assist/FPGA.. feasible?

Postby dobkeratops » Mon Jun 08, 2015 10:31 am

piotr5 wrote:Epiphany can only transfer about 1 GB/s, so that's about 68 frames per second. So while you'd be able to use Epiphany to write the displayed frames directly, I'm afraid textures have a higher total resolution, since the GPU scales them down.


(a) textures are usually read in 'DXT'/'S3TC' compressed/mipmapped form (typically 4 bpp) - the GPU can write out more than it reads in, expanding the colour data and enriching it with shading calculations. (Admittedly, I have suggested using the FPGA to assist with texture decompression, so that idea would probably have to go - but perhaps it could assist by decompressing into palettized 256-colour images, which are easier for a CPU to use.) I suppose you might be referring to the potential for 'scaling down' if you use the highest-quality sampling modes. Texture resolution is one of the most easily tweakable quality settings.

(b) it might be possible to render tiles on-chip - 'Tile-Based Deferred Rendering' as per PowerVR (not to be confused with deferred lighting/shading) - and compress these before writing them out, e.g. to 16-bit colour with dithering. Perhaps a custom video mode could be used with some on-the-fly decompression (how about something a little like DXT compression, but on 8x1 chunks of scanlines?)

but overall.. 1 GB/s does seem limiting. I wonder how far the 64-core version could get.
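On point (a), the 4 bpp figure comes from the S3TC/DXT1 layout: each 4x4 texel block is packed into 8 bytes - two RGB565 endpoint colours plus sixteen 2-bit palette indices. A minimal software decode of the opaque (four-colour) mode might look like this; it's a sketch of the published format, not Parallella-specific code:

```python
import struct

def rgb565_to_rgb888(v):
    """Expand a 16-bit 5:6:5 colour to 8 bits per channel."""
    r, g, b = (v >> 11) & 0x1F, (v >> 5) & 0x3F, v & 0x1F
    # Replicate the high bits into the low bits to fill the 0..255 range.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1 block (opaque mode) into 16 RGB texels."""
    c0, c1, indices = struct.unpack("<HHI", block)
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    palette = [
        p0,                                               # index 0
        p1,                                               # index 1
        tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),  # 2/3 p0 + 1/3 p1
        tuple((a + 2 * b) // 3 for a, b in zip(p0, p1)),  # 1/3 p0 + 2/3 p1
    ]
    # Two index bits per texel, least-significant bits first.
    return [palette[(indices >> (2 * i)) & 3] for i in range(16)]
```

Once the four palette entries are built, each texel costs only a shift and a lookup - the sort of narrow fixed-function stage that could plausibly sit in the FPGA between DRAM and Epiphany's local memory. (The transparent three-colour mode, selected when c0 <= c1, is omitted here for brevity.)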

Re: graphics assist/FPGA.. feasible?

Postby tnt » Tue Jun 09, 2015 9:22 am

I'm not so convinced by the 3D capabilities - this might end up being pretty big in the FPGA.

However, simple blitting and colorspace-conversion/scaling overlays would go a long way towards making the desktop experience much more usable. Currently, when playing a video, the scaling and colorspace conversion are a very significant part of the CPU usage, and these could easily be handled by hardware.
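The per-pixel math behind the colorspace conversion mentioned above is a handful of constant multiplies - exactly what FPGA DSP slices are good at. As a reference point, here is the standard BT.601 full-range YCbCr-to-RGB transform in software (a sketch of the well-known equations, not an existing Parallella overlay core):

```python
def ycbcr_to_rgb(y, cb, cr):
    """BT.601 full-range YCbCr -> RGB; all channels are 0..255."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# Black and white pass through unchanged:
print(ycbcr_to_rgb(0, 128, 128), ycbcr_to_rgb(255, 128, 128))
# prints: (0, 0, 0) (255, 255, 255)
```

In hardware the constants would become fixed-point multiplies (e.g. 1.402 is roughly 359/256), about one multiplier per term, with the scaler built from the same kind of multiply-accumulate structure.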
tnt
 
Posts: 408
Joined: Mon Dec 17, 2012 3:21 am

Re: graphics assist/FPGA.. feasible?

Postby over9000 » Mon Aug 31, 2015 2:48 pm

I just came across this article about a new open-source proof-of-concept GPU implemented on an FPGA. Could it be implemented on the Parallella, perhaps?

http://www.theregister.co.uk/2015/08/31 ... ource_gpu/
over9000
 
Posts: 98
Joined: Tue Aug 06, 2013 1:49 am

Re: graphics assist/FPGA.. feasible?

Postby aolofsson » Tue Sep 01, 2015 1:49 am

It doesn't do graphics yet. Until it does, I'm not sure I see the point.
aolofsson
 
Posts: 1005
Joined: Tue Dec 11, 2012 6:59 pm
Location: Lexington, Massachusetts,USA

