  From: Marcus Sundberg <e94_msu@elixir.e.kth.se>
  To  : andreas.beck@ggi-project.org
  Date: Wed, 01 Jul 1998 20:44:54 +0200

Re: GGI_AUTO (was: Re: O.K. - backported LibGGImisc to devel rep ... ARGH

> > > Not needed. The thing to change is the KGI drivers, so they recognize GGI_AUTO.
> > > They know.
> > 
> > They do? I thought the monitor, chipset and ramdac drivers
> > contained separate check_mode functions that don't
> > know anything about each other, or am I wrong here?
> 
> No - you are right, but they can check. The chipset can simply try a mode 
> it can handle and it thinks is fast.

Try "a mode" ?
Have a look at lib/libggi/display/svgalib/mode.c
to see what it can take to implement proper GGI_AUTO handling.
And the new graphtype scheme will make it even more complicated.
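
To make concrete what that kind of checkmode code has to do, here is a
rough sketch only (the ggi_mode fields and GGI_AUTO/GT_AUTO are the
current LibGGI names; target_supports() and pick_nearest_supported()
are invented helpers, not actual svgalib code):

/* Rough sketch of GGI_AUTO handling in a target's checkmode.
   target_supports() and pick_nearest_supported() are invented. */
#include <ggi/ggi.h>

static int do_checkmode(ggi_visual_t vis, ggi_mode *mode)
{
    int err = 0;

    /* Fill in every field the caller left as GGI_AUTO/GT_AUTO
       with the best value this target can actually do. */
    if (mode->visible.x == GGI_AUTO) mode->visible.x = 640;
    if (mode->visible.y == GGI_AUTO) mode->visible.y = 480;
    if (mode->virt.x    == GGI_AUTO) mode->virt.x    = mode->visible.x;
    if (mode->virt.y    == GGI_AUTO) mode->virt.y    = mode->visible.y;
    if (mode->frames    == GGI_AUTO) mode->frames    = 1;
    if (mode->graphtype == GT_AUTO)  mode->graphtype = GT_8BIT;

    /* Then check the now fully specified mode.  If something is
       still off, correct it to the nearest supported value and
       flag failure, so the corrected mode comes back as the
       suggestion. */
    if (!target_supports(vis, mode)) {
        pick_nearest_supported(vis, mode);
        err = -1;
    }
    return err;
}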

And if we have a programmable chipset, but a timelist monitor
and ramdac, the chipset could have to try a _lot_ of modes before
it finds one that fits.

> > But the real reason not to put this into KGI drivers is that
> > it's not necessary. 
> 
> Hmm - how would you handle it in usermode (other than relying on 
> LIBGGI_DEFMODE?)

Like I suggested, KGI exports a structure that describes what modes
the driver can handle.
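
Something along these lines, just to make the idea concrete (the
struct, its fields and the ioctl are invented for illustration, this
is not an existing KGI interface):

/* Purely illustrative - not an existing KGI structure or ioctl. */
struct kgi_modeinfo {
    int min_width,  max_width;      /* visible resolution range     */
    int min_height, max_height;
    int max_virt_x, max_virt_y;     /* virtual resolution limits    */
    unsigned long graphtypes;       /* bitmask of supported depths  */
    int max_dotclock;               /* kHz, bounds the refresh rate */
};

The libggi KGI target would read one (or an array) of these, e.g. via
an ioctl, and could then do all GGI_AUTO resolution and mode
suggestion in user-space, without any guessing inside the kernel
driver.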

> In kernelmode I think it is easy: Use some reasonable default mode, check it 
> and return the suggested mode.

Hmmm, I'm not talking about a default mode, I'm talking about
implementing the proper GGI_AUTO/suggest scheme, which all
targets should conform to. And I think this has no business
in kernel mode, as it can easily be implemented in user-space
if KGI exports info about possible modes.
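
With that in place the whole suggest scheme lives in the libggi
target, roughly like this (kgi_modeinfo, kgi_get_modeinfo() and
highest_set_graphtype() are the invented names from above, not real
KGI calls):

/* Sketch of user-space GGI_AUTO resolution against exported info. */
static int kgi_checkmode(ggi_visual_t vis, ggi_mode *mode)
{
    struct kgi_modeinfo info;
    int err = 0;

    if (kgi_get_modeinfo(vis, &info) != 0)      /* invented call   */
        return -1;

    /* Resolve GGI_AUTO from the exported capabilities. */
    if (mode->visible.x == GGI_AUTO) mode->visible.x = info.max_width;
    if (mode->visible.y == GGI_AUTO) mode->visible.y = info.max_height;
    if (mode->graphtype == GT_AUTO)             /* invented helper */
        mode->graphtype = highest_set_graphtype(info.graphtypes);

    /* Clamp anything out of range, so the corrected mode is what
       gets returned as the suggestion. */
    if (mode->visible.x > info.max_width) {
        mode->visible.x = info.max_width;  err = -1;
    }
    if (mode->visible.y > info.max_height) {
        mode->visible.y = info.max_height; err = -1;
    }
    return err;
}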

//Marcus
