
Too much global data defined in file

    I'm developing an application for DOS (standard) and I'm running into
the following error when I compile:

        "Too much global data defined in file"

    When I go to the help topic on this error it gives some tips, which I
have tried (to the best of my ability) but to no avail. I pulled all my
string literals out and put them in their own header file and even pulled
some of my functions out of my CPP file with my main function in it (leaving
a lot of them, but removing what I would think is enough to resolve the
issue). I continue to get the same error no matter what. The one thing I'm
not too sure on, and maybe this is key, is how to place my globals and code
segments into far segments. I tried to follow the examples I found in the
help files and on the net, but maybe I'm doing something wrong?
    If anyone has seen this issue and knows of some things I can try to
resolve it, it would be much appreciated.

                 Thank you,
                            Patrick

 

Re:Too much global data defined in file


It helps to know what version of the compiler you are using.  I am assuming
that you are using BC++ version 4.0 or above.

When you put something in a header file it doesn't move it.  The header file
gets included into the source file and the compiler sees the combined header
and source as one file.

You can put the keyword far into a declaration for an array (or char string)
and that will move it to another segment, out of the default data segment
that is overflowing.  This only works if you have a few arrays since each
creates its own segment and the linker has a limit to how many segments it
can take (255 I think).

I assume that you are compiling in Large model.  If not, change to Large
model.

Here are some things you can do:

arrays of uninitialized data:
  Declare them as pointers and allocate them in main with new or malloc

arrays of initialized data:
  Create a source file that has nothing in it.
  Add the file to the project
  Move the arrays with their initializations to that source file.
  Right click on the file name in the project window, select
    TargetExpert and use it to set the memory model to huge
  If you have too much data for that, you might need to use
     more than one huge model file for the data.
  Copy the declarations of these arrays with the initializations
     removed and extern added to the left into a header file which
     you will then include into any source file which uses the
     arrays
for instance:
  char abc[] = "def";
will be placed in the header file as:
  extern char abc[];

This works because each huge model file can accept a segment (up to 64K) of
data.  Don't push it.  If it's close to 64K, go ahead and use a second file.
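Collapsed into a single listing for illustration (the file names strings_data.cpp and strings.h are made up; in the real project each part lives in its own file, with the definition in the huge-model source file), the pattern looks like this:

```cpp
// strings_data.cpp -- the data-only source file (with the huge-model
// override in the real project).  The one and only definition:
char PermissionDenied[] = "Permission denied";

// strings.h -- included by every file that uses the array:
//   extern char PermissionDenied[];

// main.cpp -- a user of the array (the extern line normally comes
// from including strings.h):
#include <cstdio>
extern char PermissionDenied[];

int main() {
    std::printf("%s\n", PermissionDenied);
    return 0;
}
```

The key point is that the initialized bytes exist in exactly one object file, while every other file sees only the extern declaration, which costs no data-segment space.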

If it still doesn't work, please tell what compiler you are using and some
more details about the initialized data.

.  Ed

Re:Too much global data defined in file


Ed,
    First off, I would like to thank you for getting back to me so quickly
with a response! The version of Borland C++ I'm using is 5.02, sorry for
leaving it out of the original message. I understand what you mean about how
to put the initialized strings into their own file and use "extern" in a
header file with the declarations. I'm going to try that this evening. I
also understand how header files worked, so when I tried using those, I
figured it probably wasn't going to fix my issue.
    I have some other questions though about things you mentioned I should
try, as follows:

> You can put the keyword far into a declaration for an array (or char string)
> and that will move it to another segment, out of the default data segment
> that is overflowing.  This only works if you have a few arrays since each
> creates its own segment and the linker has a limit to how many segments it
> can take (255 I think).

    I actually created two classes for some common functions I use (or will
be using, since it doesn't compile at this point) and created pointers to
strings to handle all the output I have for these functions. By output I
mean things like: cout << "Hello world", etc. Basically any literals that
I've used. But how many is a "few arrays" and how do I know when I've
crossed the threshold? I also have a couple structures that are relatively
big, because they have a number of character arrays in them, for example:

  /* Structure used to store a call record */
  struct CallData {
      char date[15];
      char time[10];
      float duration;
      char extension[10];
      char phone_num[30];
      char trunk[10];
      char calltype[20];
      char group[20];
      char type[20];
      char location[20];
      float cost;
      float price;
      char special;
  };

could this also be causing a problem? If so, how would I go about resolving
it? Should I declare the structures as far?

> I assume that you are compiling in Large model.  If not, change to Large
> model.

    I was using the "Large" model, but I've been flipping through them like
crazy to try and make something work... I've also noticed you can change
them for the project and each individual node (through the "local options").
Which should I make "Large", and which should I make "Huge", as you
mentioned later in your response?

> arrays of initialized data:
>   Create a source file that has nothing in it.
>   Add the file to the project
>   Move the arrays with their initializations to that source file.
>   Right click on the file name in the project window, select
>     TargetExpert and use it to set the memory model to huge
>   If you have too much data for that, you might need to use
>      more than one huge model file for the data. ... <removed text>
> This works because each huge model file can accept a segment (up to 64K) of
> data.  Don't push it.  If it's close to 64K, go ahead and use a second file.

    Again, how do I know when I have too much data and need to use an
additional file? Also, will this work for those string literals I mentioned
earlier as well?

> If it still doesn't work, please tell what compiler you are using and some
> more details about the initialized data.

    Most of my initialized data is strings, at least now that I've turned
all those literals into character strings declared as follows:

        const far char* far PermissionDenied = "Permission denied";

These are the strings that I mentioned I had put into a header file. I
thought that there was an outside chance of this solving my problem because
they are declared as "far", but to no avail... I also tried playing around
with the data, far, and code segment names, but I honestly didn't really
understand how that works, so naturally, it didn't work...
    Anyway, thank you for your help, and hopefully some of the additional
information I provided in here will help you provide more tips to help me
out. I've been working on this one problem for days and I'm at the end of my
rope with trying to find a solution, so every little bit helps!

                    Thanks again,
                                 Patrick

Re:Too much global data defined in file


I wasn't talking about moving individual items, such as a structure with
only one instance.  I was talking about moving arrays of things such as
arrays of structures or arrays of characters.

You shouldn't use huge model for anything other than what I mentioned, a way
to declare initialized global data while not taking from the limited area
available in the data segment.

  const far char* far PermissionDenied = "Permission denied";

I assumed that you would use something like this:

  // in the huge model source file
  char PermissionDenied[] = "Permission denied";
  // where the 'far' doesn't matter since you are in Large or Huge model
  // where far is the default pointer type and you've already moved the
  // declaration to the Huge model file so have already moved it out of
  // the default data segment

  // in the header file
  extern char PermissionDenied[];

The struct CallData that you show is only a definition of the structure.
While I wonder about using floats instead of doubles to handle currency
because of the smaller number of significant figures, it is otherwise benign
for this because the definition does not use memory.

Declaring arrays of such structures is what I am talking about.  If the
arrays are not initialized, then declare them as pointers and allocate them
in main.  If they are initialized then they are part of what you should
consider moving to that other file.

And one more time: DO NOT declare arrays in header files.  Only declare them
in source files.  Put extern declarations for arrays in header files.

I would prefer that you try doing what I suggested rather than interpreting
and modifying it.  When you get something working you can modify to your
heart's content.  Until then it doesn't work and there is no value to
modifying to your taste, only to the compiler's taste.

.  Ed

Re:Too much global data defined in file


Ed,
    Thank you again for the help. I finally resolved the "Too much global
data defined in file" error, so thank you for helping me work through that.
I can compile each module separately, then
compile the project, but when I go to build the project I get two new
errors:

Error: Segment _BSS exceeds 64K
Error: Segment DGROUP exceeds 64K

    Obviously, I'm overshooting the default data segment limit when I
compile the whole project now, not just one file. The weird thing is that
each source file builds fine, but the project as a whole doesn't. I thought
when you use different source files the data gets stored in a group for each
of the source files?
    Thanks in advance for any help you can give me with this latest twist of
the problem. It appears that we're progressing toward a solution, so
hopefully this will be the last hurdle.

            Patrick

Ed Mulroy (TeamB) <e...@mulroy.org> wrote in message news:3bec04a2_1@dnews...

> I wasn't talking about moving individual items, such as a structure with
> only one instance.  I was talking about moving arrays of things such as
> arrays of structures or arrays of characters.

> You shouldn't use huge model for anything other than what I mentioned, a way
> to declare initialized global data while not taking from the limited area
> available in the data segment.

>   const far char* far PermissionDenied = "Permission denied";

> I assumed that you would use something like this:

>   // in the huge model source file
>   char PermissionDenied[] = "Permission denied";
>   // where the 'far' doesn't matter since you are in Large or Huge model
>   // where far is the default pointer type and you've already moved the
>   // declaration to the Huge model file so have already moved it out of
>   // the default data segment

>   // in the header file
>   extern char PermissionDenied[];

> The struct CallData that you show is only a definition of the structure.
> While I wonder about using floats instead of doubles to handle currency
> because of the smaller number of significant figures, it is otherwise benign
> for this because the definition does not use memory.

> Declaring arrays of such structures is what I am talking about.  If the
> arrays are not initialized, then declare them as pointers and allocate them
> in main.  If they are initialized then they are part of what you should
> consider moving to that other file.

> And one more time: DO NOT declare arrays in header files.  Only declare them
> in source files.  Put extern declarations for arrays in header files.

> I would prefer that you try doing what I suggested rather than interpreting
> and modifying it.  When you get something working you can modify to your
> heart's content.  Until then it doesn't work and there is no value to
> modifying to your taste, only to the compiler's taste.

> .  Ed

Re:Too much global data defined in file


DGROUP is a segment containing all the data.  BSS is the logical segment
which is part of DGROUP and which contains the uninitialized global and
static data.  A segment has a hardware limit of 64K in size.  You have more
than 64K of uninitialized global and static data.  I would expect maybe as
much as 4K on a large application (for instance, in a project for something
such as MS Word).  Either get rid of your uninitialized global and static
data or move it to one of the huge model files containing only data.

You don't have any single array (initialized or uninitialized) that has over
64K by itself do you?

.  Ed

Re:Too much global data defined in file


On Fri, 9 Nov 2001 13:27:05 -0500, "Patrick" <bluewater_1...@yahoo.com> wrote:
>Error: Segment _BSS exceeds 64K
>Error: Segment DGROUP exceeds 64K

Go to http://www.vmlinux.org/~jakov/community.borland.com/
and look for:

Resolving "Segment or Group xxxx Exceeds 64K" msgs

Managing Global Variables

--
Wayne A. King
(ba...@torfree.net, wayne.k...@ablelink.org,
 wak...@idirect.com, Wayne_A_K...@compuserve.com)

Re:Too much global data defined in file


Ed,
    I would assume that I'm fine on my individual array size, since I
haven't added any new ones to my program since the last time this program
worked. I do have a question though about *.LIB files. Could creating
libraries out of my separate source files help this problem out? I came
across something that I thought said that when you create LIB files and
link them with your program they are automatically placed in separate
segments, but I obviously could be mistaken.
    I created a MAP file of my project as well, but I'm not exactly sure
what I should be looking for in it as far as resolving my problem. Some of
the literature I found on the internet suggested looking at the MAP file,
but they didn't really say how to interpret it...
    The thing that I'm concerned with is that I only get this error when I
try and create my EXE for the whole project. Each CPP file compiles
completely fine on its own, and even the whole project compiles (or so I
think that's what's happening), but it tells me "Make Failed" when I try
to create my EXE...
    Thank you again for the help, hopefully this will be my last post and
the problem will be solved. I appreciate the time you've put into giving me
tips on how to get through this.

                Thanks,
                    Patrick

> DGROUP is a segment containing all the data.  BSS is the logical segment
> which is part of DGROUP and which contains the uninitialized global and
> static data.  A segment has a hardware limit of 64K in size.  You have more
> than 64K of uninitialized global and static data.  I would expect maybe as
> much as 4K on a large application (for instance, in a project for something
> such as MS Word).  Either get rid of your uninitialized global and static
> data or move it to one of the huge model files containing only data.

> You don't have any single array (initialized or uninitialized) that has over
> 64K by itself do you?

> .  Ed

Re:Too much global data defined in file


Wayne,
    Thank you for the link, it provided some good insight, unfortunately it
didn't help me solve the problem, yet...

        Thanks,
            Patrick

Wayne A. King <wak...@idirect.com> wrote in message
news:3bec340f.35004251@newsgroups.borland.com...

> On Fri, 9 Nov 2001 13:27:05 -0500, "Patrick" <bluewater_1...@yahoo.com> wrote:

> >Error: Segment _BSS exceeds 64K
> >Error: Segment DGROUP exceeds 64K

> Go to http://www.vmlinux.org/~jakov/community.borland.com/
> and look for:

> Resolving "Segment or Group xxxx Exceeds 64K" msgs

> Managing  Global Variables

> --
> Wayne A. King
> (ba...@torfree.net, wayne.k...@ablelink.org,
>  wak...@idirect.com, Wayne_A_K...@compuserve.com)

Re:Too much global data defined in file


The reason that you only get the error when building the exe and not when
compiling individual files is that the linker is what checks the size of the
segments.  Their size is not knowable before the link phase as the size is
the combined size from all the source files that are in the project.

When you compile a source file it results in an object (*.OBJ) file being
created which has code and information about the data and location of data.

When you use a library (*.LIB) file it is nothing more than a collection of
object files.  There is little difference between linking with an object
file and linking with a library which contains the object file.  The global
and static data in all object files still default to being placed into the
DGROUP segment.

The only exceptions to this have nothing to do with object versus library
files.

Global data declared with the far keyword will be placed into its own
segment but that is only useful if there are few global data items as the
DOS linker has a limited number of segments that it can handle.

If a source file is compiled as huge model it will have its own segment for
global data so can handle nearly 64K of global data outside of the default
DGROUP.  That is why I recommended that you move the large amount of data
that is causing problems to a source file which has a project override of
huge model.  Do not move code to that file.  Code in huge model is larger,
and function calls to/from it take more time to execute.

.  Ed
