*BSD News Article 12259


Return to BSD News archive

Path: sserve!manuel.anu.edu.au!munnari.oz.au!news.Hawaii.Edu!ames!haven.umd.edu!uunet!pipex!demon!gtoal.com!gtoal
Newsgroups: comp.os.386bsd.questions
From: gtoal@gtoal.com (Graham Toal)
Subject: gcc - large arrays, out of vm - any way to avoid?
Date: Tue, 2 Mar 1993 21:37:15 +0000
Message-ID: <9303022137.AA04169@pizzabox.demon.co.uk>
Sender: usenet@demon.co.uk
Lines: 28

I'm writing a program which has very little source code, but a whopping
big initialised char array at the head of it.  Well, I say 'whopping big',
but in fact it's only 50K, yet it's running out of virtual memory during the
compile (with the error: "prog.c:2683: Virtual memory exhausted.")

I've tried making the array static, and putting it inside main as an auto.
No help.  Any suggestions on how to get round this?  Do I have to split it
up into lots of separate arrays? :-(  If that's the kind of solution that's
needed, I can hack it myself - I'm really more looking for some life-saving
flag I can give that'll just make everything work magically... (or even just
an explanation of why gcc can't cope with this, to satisfy my curiosity...)

This is the gcc that first came out with 386bsd; the machine has 16Mb of
RAM and I think 8Mb of swap space.

Thanks.

#include <stdio.h>
#include <stdlib.h>
char prog[] = {
     /* The 50915 elements of this array have been removed for brevity */
};

int main(int argc, char **argv)
{
  /* Prog deleted for brevity too - still went wrong with a null main */
  return(1);
}
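
[A workaround worth trying, sketched below - this is an editorial suggestion,
not from the original post.  Old gccs built a separate parse-tree node for
every element of a brace-enclosed initialiser, so a 50915-element list could
exhaust virtual memory; a string literal, by contrast, is handled as a single
object.  Converting the byte list into concatenated string literals (with hex
escapes for non-printable bytes) often got such arrays through the compiler.
The data shown is a made-up stand-in for the real 50K:]

#include <stdio.h>

/* Concatenated string literals instead of { 0x7f, 'E', 'L', 'F', ... }.
   Note the hex escape is kept in its own literal: in "\x7fELF" the 'E'
   and 'F' would be parsed as further hex digits, so we write "\x7f" "ELF"
   and let the compiler concatenate the adjacent literals. */
static const char prog[] =
    "\x7f" "ELF"      /* placeholder bytes; the real 50K would continue */
    "\x01\x02\x03";

int main(void)
{
  /* sizeof includes the terminating NUL the string form appends,
     so the array is one byte longer than the raw data. */
  printf("%d bytes\n", (int) sizeof prog);
  return 0;
}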