*BSD News Article 19486


Return to BSD News archive

Path: sserve!newshost.anu.edu.au!munnari.oz.au!constellation!convex!convex!convex!darwin.sura.net!gatech!prism!gt8134b
From: gt8134b@prism.gatech.EDU (Howlin' Bob)
Newsgroups: comp.os.386bsd.development
Subject: Re: V86 mode & the BIOS (was Need advice: Which OS to port to?)
Message-ID: <108738@hydra.gatech.EDU>
Date: 14 Aug 93 00:31:06 GMT
References: <hastyCBLnIF.Cyq@netcom.com> <1993Aug11.164429.6015@fcom.cc.utah.edu> <kjb.745145142@manda.cgl.citri.edu.au> <1993Aug13.042831.15754@fcom.cc.utah.edu>
Organization: Georgia Institute of Technology
Lines: 239

In <1993Aug13.042831.15754@fcom.cc.utah.edu> terry@cs.weber.edu (A Wizard of Earth C) writes:

>Microsoft failing at something is not a sterling proof of impossibility;

Agreed.

>>I also don't see that this is that much of a problem. Given the fact that
>>you may be on a different arhictecture, you can simply have two compilable
>>versions - one that uses the dosemu stuff to access the BIOS, and another
>>that goes directly to the hardware for new architectures like the MIPS and
>>DEC Alpha.

I have to agree with Terry; if we're talking about the same card here,
then why not just use the direct knowledge of the card instead of
the dosemu/BIOS solution?  I think the dosemu/BIOS solution for mode
setting is most valuable for those cards whose programming information
we simply don't have, or could only get by signing an NDA.

>WHAT I DID OVER MY SUMMER VACATION
>Copyright (c) 1993, Terry Lambert
>All rights reserved

Er, I'm going to respond to this.  I think quoting for review
purposes is allowed.  Please mail me if I have violated your rights
in any way, and I will cancel my post in all due haste. :-)

>THE ARGUMENT AGAINST BIOS IF YOU ALREADY NEED A HARDWARE LEVEL DRIVER

Or already have one... I think a hardware-level driver is always the
best solution.

>There should be a video driver in the kernel... X is not the only consumer
>of graphics display services.  A DOS emulator must also consume these
>resources, as must console and virtual console implementations.

I agree, there *should*.  I also happen to feel that most applications
should keep their grubby hands off the hardware.  X and dosemu are
exceptions.  

A windowing system is the proper place for UNIX graphics.


>	b)	The DOS emulation can not use this model for anything more
>		complicated than a MDA/CDA (text only) emulation.  If an
>		application such as AutoCAD uses incestuous knowledge of
>		the real adapter because it is running its own driver,
>		there is no way to have the application restore the adapter
>		to the correct state, nor is there a way to restore the

Bzzzt.  Try again.  The DOS emulator *does* use this model for graphics 
on VGA cards.  It can only reliably save and restore the screen state
for standard VGA and ET4000 or Trident 8900c SVGA.  Again, we obviously
have some duplication of logic here: the DOS emulator must know enough
about the card to save and restore the state.  But knowing which registers
to cut and paste on a SuperVGA is trivial compared to knowing the
niggling details of setting every mode.

Also, have you heard of the "Save Video State" function of the VGA+ video 
BIOS?  It allows an application to get an actual video state "image"
placed in a buffer.  I haven't used this function yet, but I plan to
adapt dosemu to use it.  The VGA-aware console switching code will
be disabled, and this function will be used instead.  Theoretically,
(i.e. if VGA manufacturers get it right), this will allow console
switching on any card.


            INT 10,1C - Save/Restore Video State  (VGA only)

        AH = 1C

        AL = 0  get save buffer size
           CX = requested states
                bit 0: video hardware state
                bit 1: video BIOS data areas
                bit 2: video DAC state

        on return:
        AL = 1C
        BX = buffer size in 64-byte blocks

        AL = 1  save requested states
           CX = requested states (see AL = 0)
           ES:BX = pointer to buffer

        returns nothing

        AL = 2  restore requested states
           CX = requested states (see AL = 0)
           ES:BX = pointer to buffer

        returns nothing
 
This information doesn't help with the kernel, though: this video state
buffer has no defined format.  

>		adapter to the current state instantiated by AutoCAD when
>		the DOS emulation resumes control of the adapter, as would
>		happen if you were to switch from the VC running the DOS
>		emulator to another, and then back.  THE DOS APPLICATION
>		DOES NOT EXPECT NOTIFICATION (AND DOES NOT RECEIVE IT), NOR
>		IS IT ACCUSTOMED TO SHARING THE VIDEO HARDWARE.  THE DOS
>		EMULATOR (THE APPLICATION BEING ASKED TO RESTORE THE STATE)
>		CAN NOT DO SO -- IT IS NOT THE APPLICATION WHICH CHANGED
>		THE STATE, IT SIMPLY RAN THE APPLICATION.

Aside from the yelled and conflicting use of "APPLICATION" here, I have
to agree.  If AutoCAD knows more about the FooVGA card than dosemu does,
it has the advantage in setting an unrestorable mode.  There's no good
way around that.  Sure, a completely virtualized VGA card would be
one way around it, but it would be too slow to be usable and, in all
honesty, would gain you nothing.  If the application can be satisfied
with a vanilla VGA card, then why not tell it to use one?  dosemu
already has save/restore code which works well with vanilla VGA.

The best solution would be the INT 10h,1C function described above.
If this really does a complete video state save and restore, then
many of the problems are solved.  Of course, you still have to
know how to save all of the video memory...

>2)	To support internationalization.  Internationalization requires
>	the ability to render character sets other than the default PC
>	character set.  Good internationalization requires the ability
>	to render large-glyph-set character sets (like Kanji) and to
>	render ligatured characters (like Hebrew/Tamil/Devangari/Arabic,
>	etc., etc.).

Good internationalization requires applications which can actually
use this character set, and I'll bet most of those will be written
for X (and Windows NT, sadly enough).  I doubt there will be too
many programs written with the 386BSD console in mind (no insult
intended).

>3)	To prevent having to wire mode switching code into every application
>	that requires the ability to render directly to console hardware.

This is a good point.  Believe me, I don't have any urge to write
said switching code.  I would have been happy if I could have relied on
the kernel to do it for me.  But someone's gotta write the switching
code, and it's gotta go somewhere.

>	Despite the wonderful efforts of the X gods, it is still much slower
>	than direct video I/O for some applications.  The mode handling code
>	is available for use by all applications.  This will become more
>	important as commercial applications become available.

I get better performance with fractint under dosemu than I do with
xfractint.  That's not surprising; X is bound to be slower for some
tasks.  (Of course, some of the slowdown is because xfractint doesn't
use its fast assembly integer math code under Linux.)

>The default video driver should support detection of MDA/CDA/MGA/CGA/EGA/VGA
>and default mode handling for modes of the device it detects.  You might add
>HGA if you get ambitious.

Ok, sounds good.  That can be done fairly easily.  The harder part is
defining a kernel service that's flexible enough to serve all needs.

>With such an interface, I can write an X server, GSS/CGI, MGR, PostScript,
>Display PostScript, HPGL, a DOS emulator, or any other consumer of the
>adapter driver without a single line of device dependent code, and without

I wouldn't be so sure.  Rather, I wouldn't be so sure you'd *want* to.
Remember, there's more to graphics than setting the mode.  How will you
implement actual drawing?  What if the X server wants to use the
2048x1536 mode offered by your new ZGA card?  Well, you can 
(hopefully) get the programming information from the vendor, put together
an LKM with the mode setting magic, and simply add a new mode to your X
server's mode list.  Now, let's say it's a command-oriented card,
like the S3.  In fact, it's worse than that: you have no access
to the video memory except through commands.  Insert any other
VGA-incompatible card architecture here; the point is that
by the time you've added all the primitives needed to efficiently use
it (get/put pixel, get block of video RAM, put block of video RAM,
circle, square, palette, etc.) you've written a fairly large kernel
service.  Don't forget that these operations must be supported
on *all* the cards.  So, you have to write the circle, square, etc.
code for all the cards.

That's a little too complex for my kernel, thank you.

>duplicating the effort required to produce device specific code for each
>application.  This is nearly a postage-stamp description of Intel's iBCSII
>standard, but is more flexible in that it does not limit the modes which

Linux has the stubs for SVR4 mode setting, as well as mouse control, in
the console code, by the way.  It looks all right, I guess, but I think
the X server shouldn't sacrifice the speed it gains from its knowledge
of specific video cards and their features.

>Part of the DOS emulation would be a layer to emulate a generic CGA/EGA/VGA
>card on top of the driver.  Since this interface has a well-defined upper

Shudder.  People, this is not easy, and it is not fast.  I implemented
virtual UARTs for dosemu, and they're not pretty.

>and lower bounds, it could be easily replaced with a more complex bleed
>through of the actual card capabilities, up to and including "emulating"
>the actual card in the machine by allowing all of the commands supported by

This is such a loss I can't put my feelings into words.   I'm not
about to invest money in a local bus video card only to see it
reduced to the speed of the original IBM CGA.

>the card itself.  Generally, this emulation would take the form of write-only
>register caching so that full state information could be tagged to the virtual

This is an important point for MDA/CGA/EGA cards: the registers cannot
be saved by reading them.  Instead, shadow copies must be kept up to date
as the originals are changed.  dosemu would allow this functionality
by trapping all register accesses.

>There is another significant advantage, which is the ability to provide a
>user space daemon that translates the CGA or VGA calls into X calls to a
>particular server connection.  With such a "shim", I can easily run DOS
>many applications on remote X servers.  Generally, I will not want to do

I really want to scream.  Let me yell for a second: FOR ANY PROFESSIONAL
DOS APPLICATION, THERE ARE **NO** VGA CALLS EXCEPT ONE: SET VIDEO
MODE.  Real applications bang on the hardware in the worst way, and
there is no such thing as "simply" translating this into X calls.
Do you realize that the VGA memory address space (0xa0000-0xbffff)
can actually address two different 128k banks, one for writes, one
for reads?  Do you realize that the VGA memory access hardware lets
you specify that the data be rotated and masked before it gets
from the card to the CPU or before it gets from the CPU to the card?
If you can suggest a "simple" way to handle this, I'll be indebted
to you.  There are ways, but they're neither efficient nor simple.

>All in all, there is no reason why an architecture can not be arrived at
>that provides the best of both worlds without precluding the capabilities
>of either.

I can see why the *BSD groups get more arguing than development done.

-- 
Robert Sanders
Georgia Institute of Technology, Atlanta Georgia, 30332
uucp:	  ...!{decvax,hplabs,ncar,purdue,rutgers}!gatech!prism!gt8134b
Internet: gt8134b@prism.gatech.edu