*BSD News Article 12347



Newsgroups: comp.os.386bsd.questions
Path: sserve!manuel.anu.edu.au!munnari.oz.au!news.Hawaii.Edu!ames!haven.umd.edu!uunet!mnemosyne.cs.du.edu!nyx!smace
From: smace@nyx.cs.du.edu (Scott Mace)
Subject: Re: gcc - large arrays, out of vm - any way to avoid?
Message-ID: <1993Mar4.211900.1782@mnemosyne.cs.du.edu>
X-Disclaimer: Nyx is a public access Unix system run by the University
	of Denver for the Denver community.  The University has neither
	control over nor responsibility for the opinions of users.
Sender: usenet@mnemosyne.cs.du.edu (netnews admin account)
Organization: Nyx, Public Access Unix at U. of Denver Math/CS dept.
References: <9303022137.AA04169@pizzabox.demon.co.uk>
Date: Thu, 4 Mar 93 21:19:00 GMT
Lines: 65

In article <9303022137.AA04169@pizzabox.demon.co.uk> gtoal@gtoal.com (Graham Toal) writes:
>I'm writing a program which has very little source code, but a whapping
>big initialised char array at the head of it.  Well, I say 'whapping big',
>but in fact it's only 50K yet its running out of virtual memory during the
>compile (with the error: "prog.c:2683: Virtual memory exhausted.")
>
>I've tried making the array static, or putting it inside main as an auto.
>No help.  Any suggestions how to get round this?  Do I have to split it
>up into lots of separate arrays? :-(  If it's a solution like that that's
>needed, I can hack it myself - I'm really more looking for some life-saving
>flag I can give that'll just make everything work magically... (or even just
>an explanation of why gcc can't cope with this, to satisfy my curiosity...)
>
>This is the gcc that first came out with 386bsd; the machine has 16Mb of
>Ram and I think 8Mb swap space.
>
>Thanks.
>
>#include <stdio.h>
>#include <stdlib.h>
>char prog[] = {
>     /* The 50915 elements of this array have been removed for brevity */
>};
>
>int main(int argc, char **argv)
>{
>  /* Prog deleted for brevity too - still went wrong with a null main */
>  return(1);
>}

I had similar problems compiling the TIFF support in xv.  There is
700K worth of arrays in one .h file.  I had to raise the datasize
limit in my shell.

In csh the limits look like this:
cputime         unlimited
filesize        unlimited
datasize        6144 kbytes
stacksize       512 kbytes
coredumpsize    unlimited
memoryuse       6916 kbytes
memorylocked    unlimited
maxproc         40 
openfiles       64 

I set the datasize to 30000 kbytes and it worked.  Note that you might
even want to switch into single-user mode if your memory is really tight.
Otherwise your system will crawl as it tries to swap.  I have 16 megs,
so my system didn't really crawl, but if you have, say, 4 megs, you'll
want all the real RAM you have free.

Depending on the size of the array you may not need to set it that high.
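For reference, a minimal sketch of the limit change described above.  The
csh command is `limit datasize 30000` (matching the listing earlier); the
Bourne-shell equivalent shown here uses `ulimit -d`, which also takes
kilobytes on most systems.  The 30000-kbyte figure is just the value that
worked for me; adjust it to your array size.

```shell
# In csh/tcsh (as in the listing above) you would run:
#     limit datasize 30000
# The sh/ksh equivalent uses ulimit -d (kbytes):
ulimit -d 30000 2>/dev/null || true   # ignore failure if the hard limit is lower
ulimit -d                             # show the data segment limit now in effect
```

Then run gcc from that same shell, since the limit only applies to
processes started from it.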

PS.  It may have been stacksize that I changed. (it was some time ago)

If you don't think your system can handle it, then I would suggest
trying ref.tfs.com late at night, when few people are on.

Scott Mace

--
*********************************************************************
*    Scott Mace                internet:    smace@nyx.cs.du.edu     *
*                                           emace@tenet.edu         *
*********************************************************************