*BSD News Article 19508



Newsgroups: comp.os.386bsd.development
Path: sserve!newshost.anu.edu.au!munnari.oz.au!constellation!osuunx.ucc.okstate.edu!moe.ksu.ksu.edu!hobbes.physics.uiowa.edu!math.ohio-state.edu!caen!hellgate.utah.edu!fcom.cc.utah.edu!cs.weber.edu!terry
From: terry@cs.weber.edu (A Wizard of Earth C)
Subject: Re: V86 mode & the BIOS (was Need advice: Which OS to port to?)
Message-ID: <1993Aug15.062620.6503@fcom.cc.utah.edu>
Sender: news@fcom.cc.utah.edu
Organization: Weber State University, Ogden, UT
References: <kjb.745145142@manda.cgl.citri.edu.au> <1993Aug13.042831.15754@fcom.cc.utah.edu> <108738@hydra.gatech.EDU>
Date: Sun, 15 Aug 93 06:26:20 GMT
Lines: 262

In article <108738@hydra.gatech.EDU> gt8134b@prism.gatech.EDU (Howlin' Bob) writes:
>In <1993Aug13.042831.15754@fcom.cc.utah.edu> terry@cs.weber.edu (A Wizard of Earth C) writes:
>
>>WHAT I DID OVER MY SUMMER VACATION
>>Copyright (c) 1993, Terry Lambert
>>All rights reserved
>
>Er, I'm going to respond to this.  I think quoting for review
>purposes is allowed.  Please mail me if I have violated your rights
>in any way, and I will cancel my post in all due haste. :-)

This was a stab at humor.  Apparently the stab left it alive and well, and I
will have to stab again.  8-).

>>There should be a video driver in the kernel... X is not the only consumer
>>of graphics display services.  A DOS emulator must also consume these
>>resources, as must console and virtual console implementations.
>
>I agree, there *should*.  I also happen to feel that most applications
>should keep their grubby hands off the hardware.  X and dosemu are
>exceptions.  

I don't think they are exceptions unless you pound a device-as-a-resource
wedge into the kernel.  The fact that you supported UART emulation at all
is an indicator that you don't buy dosemu as an exception.  X is an exception
because of tradition, not need.

>A windowing system is the proper place for UNIX graphics.

I'll quote this again a little later.  8-).

>>	b)	The DOS emulation can not use this model for anything more
>>		complicated than a MDA/CDA (text only) emulation.  If an
>>		application such as AutoCAD uses incestuous knowledge of
>>		the real adapter because it is running its own driver,
>>		there is no way to have the application restore the adapter
>>		to the correct state, nor is there a way to restore the
>
>Bzzzt.  Try again.  The DOS emulator *does* use this model for graphics 
>on VGA cards.  It can only reliably save and restore the screen state
>for standard VGA and ET4000 or Trident 8900c SVGA.  Again, we obviously
>have some duplication of logic here: the DOS emulator must know enough
>about the card to save and restore the state.  But knowing which registers
>to cut and paste on a SuperVGA is trivial compared to knowing the niggly
>details of setting every mode.  
>
>Also, have you heard of the "Save Video State" function of the VGA+ video 
>BIOS?  It allows an application to get an actual video state "image"
>placed in a buffer.  I haven't used this function yet, but I plan to
>adapt dosemu to use it.  The VGA-aware console switching code will
>be disabled, and this function will be used instead.  Theoretically,
>(i.e. if VGA manufacturers get it right), this will allow console
>switching on any card.

The "save video state" function applies only to a class of cards; other
save mechanisms, such as write-only register shadowing are insufficient
if the mode is set by "operation A followed by operation B" and a different
mode is set by "operation B followed by operation A".  I can think of at
least one ATI card where this is true.  But you later admit that this model
is flawed:

[ quoted text out of sequence in article, but in context of reply ]
>>the card itself.  Generally, this emulation would take the form of write-only
>>register caching so that full state information could be tagged to the virtual
>
>This is an important point for MDA/CGA/EGA cards: the registers cannot
>be saved by reading them.  Instead, shadow copies must be kept up to date
>as the originals are changed.  dosemu would allow this functionality
>by trapping all register accesses.

[ ... ]
>>		adapter to the current state instantiated by AutoCAD when
>>		the DOS emulation resumes control of the adapter, as would
>>		happen if you were to switch from the VC running the DOS
>>		emulator to another, and then back.  THE DOS APPLICATION
>>		DOES NOT EXPECT NOTIFICATION (AND DOES NOT RECEIVE IT), NOR
>>		IS IT ACCUSTOMED TO SHARING THE VIDEO HARDWARE.  THE DOS
>>		EMULATOR (THE APPLICATION BEING ASKED TO RESTORE THE STATE)
>>		CAN NOT DO SO -- IT IS NOT THE APPLICATION WHICH CHANGED
>>		THE STATE, IT SIMPLY RAN THE APPLICATION.
>
>Aside from the yelled and conflicting use of "APPLICATION" here, I have
>to agree.  If AutoCAD knows more about the FooVGA card than dosemu does,
>it has the advantage in setting an unrestorable mode.  There's no good
>way around that.  Sure, a completely virtualized VGA card would be
>one way around it.  It would be too slow to be usable, and in all honesty,
>would gain you nothing.  If the application can be satisfied with a
>vanilla VGA card, then why not tell it to use one?  dosemu already
>has save/restore code which works well with vanilla VGA.

Any VGA card can emulate a "vanilla" VGA card, by definition, with nearly
zero overhead in anything but the mode switching operations, which have
to be trapped (as do all BIOS calls affecting hardware state) anyway.

PS:	It was meant as italics, and was done for emphasis, since a 5 line
	sentence with *this* type of "italics" would seem to refer to a
	footnote (the asterisks would be "lost" by the reader).

>The best solution would be the int10h,1c function described above.
>If this really does a complete video state save and restore, then
>many of the problems are solved.  Of course, you still have to
>know how to save all of the video memory...

This is perhaps the most speed-effective solution (I would argue the point,
and have); however, the standard does not require that states outside of the
standard be saved.  When we talk about applications like AutoCAD, we are not
constraining the vendor-supplied device driver to the use of modes in the
standard (otherwise, the driver would not be necessary).

>>2)	To support internationalization.  Internationalization requires
>>	the ability to render character sets other than the default PC
>>	character set.  Good internationalization requires the ability
>>	to render large-glyph-set character sets (like Kanji) and to
>>	render ligatured characters (like Hebrew/Tamil/Devangari/Arabic,
>>	etc., etc.).
>
>Good internationalization requires applications which can actually
>use this character set, and I'll bet most of those will be written
>for X (and Windows NT, sadly enough).  I doubt there will be too
>many programs written with the 386BSD console in mind (no insult
>intended).

None taken.  I am more interested in applications written to take advantage
of a Unicode (or similar) rendering engine, in providing the applications
with such an engine, and in providing the engine with an interface to use in
the form of a console.  NT embodies a rendering engine; applications written
to the NT interface should require minimal modification for other Unicode
(or other rendering standard) interfaces.  This includes the OS and all
its utilities, error messages, input, output, and storage mechanisms.

>>3)	To prevent having to wire mode switching code into every application
>>	that requires the ability to render directly to console hardware.
>
>This is a good point.  Believe me, I don't have any urge to write
>said switching code.  I would have been happy if I could have relied on
>the kernel to do it for me.  But someone's gotta write the switching
>code, and it's gotta go somewhere.

Better to put it in the driver and abstract the interface.  My argument in
a nutshell.

>>	Despite the wonderful efforts of the X gods, it is still much slower
>>	than direct video I/O for some applications.  The mode handling code
>>	is available for use by all applications.  This will become more
>>	important as commercial applications become available.
>
>I get better performance with fracting under dosemu than I do with
>xfractint.  That's not surprising.  X is bound to be slower for some
>tasks.  (of course, some of the slowdown is because xfractint doesn't
>use its fast assembly integer math code under Linux).

I know some of the X servers take advantage of acceleration not embodied in
even the DOS fractint.  Again:

>A windowing system is the proper place for UNIX graphics.

>>With such an interface, I can write an X server, GSS/CGI, MGR, PostScript,
>>Display PostScript, HPGL, a DOS emulator, or any other consumer of the
>>adapter driver without a single line of device dependent code, and without
>
>I wouldn't be so sure.  Rather, I wouldn't be so sure you'd *want* to.
>Remember, there's more to graphics than setting the mode.  How will you
>implement actual drawing?  What if the X server wants to use the
>2048x1536 mode offered by your new ZGA card?  Well, you can 
>(hopefully) get the programming information from the vendor, put together
>an LKM with the mode setting magic, and simply add a new mode to your X
>server's mode list.  Now, let's say it's a command-oriented card,
>like the S3.  IN fact, it's worse than that: you have no access
>to the video memory except through commands.  Insert any other
>VGA-incompatible card architecture here, but the point is that
>by the time you've added all the primitives needed to effiently use 
>it (get/put pixel, get block of video ram, put block of video ram,
>circle, square, palette, etc) you've written a fairly large kernel
>service.  Don't forget that these operations must be supported
>on *all* the cards.  So, you have to write the circle,square, etc.
>code for all the cards.  
>
>That's a little too complex for my kernel, thank you.

OK; here's the re-quote I warned you about:

>A windowing system is the proper place for UNIX graphics.

The DOS emulator runs in a UNIX environment.

Direct video I/O can be supported, and some intelligence about display
memory geometry will have to be supported as well.  Write and read faulting
can translate to the "generic" model if we want to run in a window in X
(or another distributed display environment).

Note that Phoenix was able to match the speed of a 4MHz PC on a 7MHz 68000
with little trouble.  I have a hard time believing I can't do a 33MHz PC
on a 50MHz 486 -- especially since there is much less that needs emulation.

>Linux has the stubs for SYSVR4 mode setting, as well as mouse control, in
>the console code, by the way.  It looks alright, I guess, but I think
>that the X server should not sacrifice the speed its knowledge of specific
>video cards and features gives it.  

I'm not asking it to; the state information need not take into account
operations in progress on accelerated cards... only the display memory
contents and the video card state at the time of the switch.

>>Part of the DOS emulation would be a layer to emulate a generic CGA/EGA/VGA
>>card on top of the driver.  Since this interface has a well-defined upper
>
>Shudder.  People, this is not easy, and it is not fast.  I implemented
>virtual UARTs for dosemu, and they're not pretty.

But this very implementation implies it is possible (as do all versions of
SoftPC 8-)).  It also implies that there exist sound reasons to do such things.

>>and lower bounds, it could be easily replaced with a more complex bleed
>>through of the actual card capabilities, up to and including "emulating"
>>the actual card in the machine by allowing all of the commands supported by
>
>This is such a loss I can't put my feelings into words.   I'm not
>about to invest money in a local bus video card only to see it
>reduced to the speed of the original IBM CGA.

I am suggesting trapping mode switch calls and the ability to dump and restore
screen memory.  The only reason for emulating the actual card is to "go
remote"... something that is accomplished with adequate speed even with serial
port/modem technology in products like "PC Anywhere" and "Carbon Copy".

>>There is another significant advantage, which is the ability to provide a
>>user space daemon that translates the CGA or VGA calls into X calls to a
>>particular server connection.  With such a "shim", I can easily run DOS
>>many applications on remote X servers.  Generally, I will not want to do
>
>I really want to scream.  Let me yell for a second: FOR ANY PROFESSIONAL
>DOS APPLICATION, THERE ARE **NO** VGA CALLS EXCEPT ONE: SET VIDEO
>MODE.  Real applications bang on the hardware in the worst way, and
>there is no such thing as "simply" translating this into X calls.
>Do you realize that the VGA memory address space (0xa0000-0xbffff)
>can actually address two different 128k banks, one for writes, one
>for reads?  Do you realize that the VGA memory access hardware lets
>you specify that the data be rotated and masked before it gets
>from the card to the CPU or before it gets from the CPU to the card?
>If you can suggest a "simple" way to handle this, I'll be indebted
>to you.  There are ways, but they're neither efficient nor simple.

Yep.  Copy memory deltas at reasonable time intervals, or fault accesses to
a virtual display's "video memory".  This is what "Carbon Copy" does (and
makes so much money from).

>>All in all, there is no reason why an architecture can not be arrived at
>>that provides the best of both worlds without precluding the capabilities
>>of either.
>
>I can see why the *BSD groups get more arguing than development done.

But even with 90% argument and only 10% development, as long as you have
100 times as much arguing as other groups have development, you end up with
10 times the development.  A fair trade, wouldn't you say?  8-).


					Terry Lambert
					terry@icarus.weber.edu
---
Any opinions in this posting are my own and not those of my present
or previous employers.