  From: Hartmut Niemann <niemann@cip.e-technik.uni-erlangen.de>
  To  : ggi-develop@eskimo.com
  Date: Tue, 15 Sep 1998 17:37:22 +0200 (MESZ)

Re: Graphics newbie

> 
> Hi-
Hi,
welcome onboard!
> 
> I've been looking into the ggi project a bit and have a few questions.
> Keep in mind that I have little to no experience with graphics or ggi, so
> be gentle.  BTW, some of these questions are not specific to ggi, so
> pointers to other documentation is welcomed.
> 
> 1)  I've compiled and run the stars demo on libggi (since it shows frames
> per second), under X with GGI_AUTO sizing, it shows 30 fps, when I increase
> the size to say, 1024x768, it slows down dramatically to somewhere around 4
> or 8 fps.  What is the major contributor to this slowdown?  Is it:
> a)  Slow graphics card?  I have a TGA card running on a DEC UDB (multia)
> b)  X is just slow?  Would running kgi instead change this?  What sort of
> speed up?
> c)  The X target is slow because it copies a whole XImage to the window
> each time ggiFlush is called?
> d)  My X target is slow because it doesn't use X shared memory or the X
> double buffering extension?  I don't know anything about these other than
> they are supposed to speed up graphics under X
> e)  Other?
As far as I know: (c). The X target draws into a memory area and then sends
that area to the X server. You could try the Xlib target, which IIRC issues
individual X drawing primitives instead.
Normally this update happens every 50 ms or so (we call that 'synchronous'
mode), but you can (and should) update 'manually' with ggiFlush as often as
*you* need to (that is: after you have finished drawing).
See http://cip8.e-technik.uni-erlangen.de:8080/hyplan/niemann/ggidoc/libggi/libggi-4.html#ss4.15
I don't know how the stars demo handles this.
> 
> 2)  How do you directly access the data in a visual?  Is there some way to
> do something like memcpy(visual_data_ptr,image_data_ptr,count) ?  A more
> general question:  Is there a  way to copy data directly into video memory?
> The reason I ask is, I'm attempting to write a game with run length encoded
> sprites.  That is, instead of having a rectangular image that is
> widthxheight, the images are stored in segments like:  x,y,length,data
I would probably unpack the sprites into memory and then use ggiPutBox.
You probably won't understand how it is supposed to work without asking
*lots of questions*, because this function has been discussed far more
than it has been used. But PLEASE try it out and ask all your questions.
We (at least I) want this to be really usable.

The easiest way I can think of:
You write a routine that draws the sprite onto the screen, say into the
upper left corner. Then you ggiGetBox it into some memory buffer. From then
on you can use ggiPutBox to draw the sprite anywhere on the screen. But be
warned that there is currently no notion of 'transparency' or a 'sprite
mask'.

On many targets you can directly access the frame buffer (aka video
memory), but not on all of them.
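To make the "unpack into memory" idea concrete, here is a minimal sketch in
plain C. It assumes an 8 bpp mode and operates on ordinary memory buffers;
the struct and function names are made up for illustration and are not GGI
API. rle_unpack() turns the x,y,length,data segments from the question into
a rectangular buffer (the kind of thing you would hand to ggiPutBox), and
blit_keyed() is a software stand-in for the missing 'transparent'/'sprite
mask' notion, skipping pixels that match a key color:

```c
#include <stdint.h>
#include <string.h>

/* One run-length segment, matching the x,y,length,data layout
 * from the question. Illustrative names, not a GGI API. */
struct rle_segment {
    int x, y, len;
    const uint8_t *data;
};

/* Unpack segments into a w*h rectangular 8 bpp buffer, filling
 * untouched pixels with `bg`. */
static void rle_unpack(const struct rle_segment *segs, int nsegs,
                       uint8_t *buf, int w, int h, uint8_t bg)
{
    memset(buf, bg, (size_t)w * h);
    for (int i = 0; i < nsegs; i++) {
        const struct rle_segment *s = &segs[i];
        memcpy(buf + s->y * w + s->x, s->data, (size_t)s->len);
    }
}

/* Copy `sprite` (sw*sh) onto `dest` (row stride dw) at (dx,dy),
 * skipping pixels equal to `key` -- a software transparency mask. */
static void blit_keyed(uint8_t *dest, int dw,
                       const uint8_t *sprite, int sw, int sh,
                       int dx, int dy, uint8_t key)
{
    for (int y = 0; y < sh; y++)
        for (int x = 0; x < sw; x++) {
            uint8_t p = sprite[y * sw + x];
            if (p != key)
                dest[(dy + y) * dw + (dx + x)] = p;
        }
}
```

If the target gives you direct frame buffer access, blit_keyed() could write
into video memory itself; otherwise you blit into a background buffer and
push the result with ggiPutBox.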
> 
> 3)  Where did you guys find information on programming graphics cards?
vgadoc has been mentioned very often. Many graphics companies publish their
programming guides: S3, for example, mails one to you if you ask politely;
Matrox and SiS require some sort of developer registration and email you a
password, after which you download a 4 MB PDF or Word file and print it.
Texas Instruments has its PDF files ready for download by anyone.
Short answer: search, search, and have luck. It depends.
> 
> Thanks for your time,
> Brett
> 
> 
You are welcome!
Hartmut.
> 

--  
Hartmut Niemann   --   niemann(a)cip.e-technik.uni-erlangen.de
http://cip2.e-technik.uni-erlangen.de:8080/hyplan/niemann/index_en.html [/ggi]
