  From: Martin Eli Erhardsen <mee@daimi.aau.dk>
  To  : ggi-develop@eskimo.com
  Date: Wed, 01 Jul 1998 09:55:04 +0200

Re: GGI_AUTO (was: Re: O.K. - backported LibGGImisc to devel rep ... ARGH !)

Marcus Sundberg wrote:
> 
> > > > This happens only on the KGI target - right ? It hasn't been updated to
> > > > recognize GGI_AUTO. Someone should finally do that ... C'mon it's easy ...
> > > That will require the KGI drivers to export information about
> > > what modes they can handle. Do we currently have an interface for
> > > this?
> >
> > Not needed. The thing to change is the KGI drivers, so that they
> > recognize GGI_AUTO. They know.
> 
> They do? I thought the monitor, chipset and ramdac drivers
> contain separate check_mode functions that don't
> know anything about each other, or am I wrong here?
> 
> But the real reason not to put this into the KGI drivers is that
> it's not necessary. Why put something in the kernel if it
> doesn't need to be there for security or performance reasons?
> 

Another problem is that, to suggest a mode, the kernel drivers must know
about each other: the maximum supported resolution depends on the
monitor, the ramdac, the clock and the graphics chipset together.
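
To make that concrete, here is a minimal sketch of the intersection a
mode suggester has to compute (all struct and function names are made
up for illustration, they are not real KGI interfaces):

  struct limits {
      unsigned long max_dotclock;   /* Hz */
      unsigned int  max_width;      /* pixels */
      unsigned int  max_height;     /* pixels */
  };

  /* Clamp dst to the tighter of the two components' limits. */
  static void limits_intersect(struct limits *dst, const struct limits *src)
  {
      if (src->max_dotclock < dst->max_dotclock)
          dst->max_dotclock = src->max_dotclock;
      if (src->max_width < dst->max_width)
          dst->max_width = src->max_width;
      if (src->max_height < dst->max_height)
          dst->max_height = src->max_height;
  }

  /* Whoever suggests a mode needs all four limit sets at once. */
  static void suggest_mode(struct limits *out,
                           const struct limits *chipset,
                           const struct limits *monitor,
                           const struct limits *ramdac,
                           const struct limits *clock)
  {
      *out = *chipset;
      limits_intersect(out, monitor);
      limits_intersect(out, ramdac);
      limits_intersect(out, clock);
  }

Whether this lives in the kernel or in libggi, it needs every
component's limits in one place - which is exactly the information the
separate check_mode functions don't share today.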

I think it is silly to require a chipset driver to be split across 4
directories when there is only one chip that does everything.
It would be more logical to have one directory for integrated chipsets,
and maybe another for external ramdacs.
Nobody uses external clock chips any more, and the graphics is done on
the main chip; even the old Amiga 1000 did it that way, though it did
have an external ramdac.
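
Purely for illustration (the directory names are made up), the layout I
mean would look something like:

  kgi/chipset/  - one driver per integrated chip, handling clock,
                  ramdac and acceleration in the same driver
  kgi/ramdac/   - only for the few boards with an external ramdac
  kgi/monitor/  - monitor limits really are separate hardware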

Matrox is releasing a G200 card with an external ramdac, but the general
trend is toward higher integration; even 3dfx is doing this in their
Banshee chip with an integrated ramdac. Most monitors don't need a
dotclock higher than 250 MHz anyway.
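
As a rough sanity check (assuming VESA-style blanking): the dotclock is
about htotal * vtotal * refresh, so 1600x1200 at 85 Hz with a total
raster of roughly 2160x1250 needs 2160 * 1250 * 85 = ~229.5 MHz, still
comfortably inside 250 MHz.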
