
64-bit memory consumption


2006-05-17 04:25:32 PM
delphi94
64-bit is coming in Delphi "Highlander", both native and via .NET 2.0.
Obviously 64-bit apps consume more memory. However I was surprised to find
that a do-nothing WinForm app in VS 2005 actually consumes around double the
amount when running as 64-bit:
www.itwriting.com/blog/
Important to bear in mind if you are contemplating the switch.
Tim
 
 

Re:64-bit memory consumption

"Tim Anderson" <XXXX@XXXXX.COM>writes
Quote
64-bit is coming in Delphi "Highlander", both native and via .NET 2.0.
Not natively. Unless you have some new info?
 

Re:64-bit memory consumption

"Uffe Kousgaard" <XXXX@XXXXX.COM>writes
Quote
"Tim Anderson" <XXXX@XXXXX.COM>writes
news:446addfc$XXXX@XXXXX.COM...
>64-bit is coming in Delphi "Highlander", both native and via .NET 2.0.

Not natively. Unless you have some new info?
Actually old info :-)
"In addition, 64 bit code generation will be added to the Delphi native code
compilers to support native 64 bit development and debugging."
bdn.borland.com/article/0,1410,33383,00.html
But you are correct - in later remarks DavidI says 2008 for native 64-bit.
Apologies.
Tim
 

Re:64-bit memory consumption

Tim Anderson writes:
Quote
Obviously 64-bit apps consume more memory. However I was surprised to find
that a do-nothing WinForm app in VS 2005 actually consumes around double the
amount when running as 64-bit:

www.itwriting.com/blog/

Important to bear in mind if you are contemplating the switch.
Well, the other side of the coin is that you might add a lot more than
2GB of RAM to a 64-bit system. If you are currently using a 32-bit
system with 2GB of RAM that *still* uses the swap file every now and
then, you should probably contemplate switching to a 64-bit system with
more than 4GB of memory.
 

Re:64-bit memory consumption

Tim Anderson writes:
Quote
in later remarks DavidI says 2008 for native
64-bit.
And they don't even know if they will still be around by 2008. :)
 

Re:64-bit memory consumption

Quote
Well, the other side of the coin is that you might add a lot more than
2GB of RAM to a 64-bit system. If you are currently using a 32-bit
system with 2GB of RAM that *still* uses the swap file every now and
then, you should probably contemplate switching to a 64-bit system with
more than 4GB of memory.
Or maybe the users will get tired of being constantly forced to upgrade
their hardware just so they can run the new and greatly improved .NET notepad.
 

Re:64-bit memory consumption

Kostya writes:
Quote
>Well, the other side of the coin is that you might add a lot more than
>2GB of RAM to a 64-bit system. If you are currently using a 32-bit
>system with 2GB of RAM that *still* uses the swap file every now and
>then, you should probably contemplate switching to a 64-bit system
>with more than 4GB of memory.

Or maybe the users will get tired of being constantly forced to upgrade
their hardware just so they can run the new and greatly improved .NET notepad.
...or maybe the users get tired of their borg implants and switch to
Linux or OSX. ;)
Seriously, though, why would a user purchase a Win64 system without
packing it with decent amounts of RAM from the start? Even if he did
that, he would probably consider upgrading the RAM simply to be able to
run more 32-bit processes without having them swapped out all of the
time. Hint: It is nice to be able to run four instances each of BDS2006
and VS2005 and actually be able to work with them. ;)
 

Re:64-bit memory consumption

Quote
...or maybe the users get tired of their borg implants and switch to
Linux or OSX. ;)
Switching to Linux will not solve the 64 bit memory consumption problem.
M
 

Re:64-bit memory consumption

"Henrick Hellström [StreamSec]" <XXXX@XXXXX.COM>writes
Quote
Seriously, though, why would a user purchase a Win64 system without
packing it with decent amounts of RAM from the start? Even if he did that,
he would probably consider upgrading the RAM simply to be able to run more
32-bit processes without having them swapped out all of the time. Hint: It
is nice to be able to run four instances each of BDS2006 and VS2005 and
actually being able to work with them. ;)
I agree, that is the main benefit I find. I fitted 4GB in mine, though 1GB
was stolen by Intel :-(
Tim
 

Re:64-bit memory consumption

"Tim Anderson" wrote
Quote
... a do-nothing WinForm app in VS 2005 actually
consumes around double the amount when running
as 64-bit:
www.itwriting.com/blog/
And can we assume that it would consume about
half the amount of memory if running as a 16-bit
application?
All kinds of effects come into play here. It would
be interesting to know whether the net result is linear.
--JohnH
 

Re:64-bit memory consumption

One has to wonder what the point of using .NET to target 64-bit would be,
since several important benefits are negated by design:
- GC already has trouble coping with hundreds of megabytes worth of
objects, and with gigabytes, it is not really improving (euphemism).
- IL doesn't support vector floating-point, so the benefits of
standardized SSE2 support are lost.
- JITter (so far) doesn't seem able to make good use of the
extra registers. This may improve, but there are still quite a few
roundtrips to memory and stack juggling in there, unless you're lucky
enough to hit one of the few peephole optimizations.
A side effect of larger pointers in .NET is that as objects get bigger,
they also become more likely to end up in generation 1+, which is
collected less often, and thus memory usage is inflated even more.
Eric
 

Re:64-bit memory consumption

Moni writes:
Quote
>...or maybe the users get tired of their borg implants and switch to
>Linux or OSX. ;)

Switching to Linux will not solve the 64 bit memory consumption problem.
Probably not, but that comment was just ironic, since I didn't find
Kostya's reply entirely relevant.
The fact that there are differences between .NET 2.0 32-bit and .NET 2.0
64-bit does not logically entail that .NET 2.0 forces users to upgrade
their hardware. It is of course true that .NET 2.0 to some extent forces
users to upgrade their hardware, but that isn't a consequence of the
question under discussion in this thread.
 

Re:64-bit memory consumption

Tim Anderson writes:
Quote
I agree, that is the main benefit I find. I fitted 4GB in mine, though 1GB
was stolen by Intel :-(
Stolen "by" or "from" Intel?
--Jesper
 

Re:64-bit memory consumption

"Eric Grange" writes:
Quote

One has to wonder what the point of using .NET to target 64-bit would be,
since several important benefits are negated by design:
- GC already has trouble coping with hundreds of megabytes worth of
objects, and with gigabytes, it is not really improving (euphemism).
- IL doesn't support vector floating-point, so the benefits of
standardized SSE2 support are lost.
- JITter (so far) doesn't seem able to make good use of the
extra registers. This may improve, but there are still quite a few
roundtrips to memory and stack juggling in there, unless you're lucky
enough to hit one of the few peephole optimizations.

A side effect of larger pointers in .NET is that as objects get bigger,
they also become more likely to end up in generation 1+, which is
collected less often, and thus memory usage is inflated even more.
Have you looked into Microsoft's Phoenix project? It is the code name for
a software optimization and analysis framework that is the basis for all
their future compiler technologies. If not, may I suggest you take a minute
and read about it here:
research.microsoft.com/phoenix/
If interested there is an interview with the program manager:
channel9.msdn.com/ShowPost.aspx
A somewhat related interview about the backend C++ compiler:
channel9.msdn.com/ShowPost.aspx
 

Re:64-bit memory consumption

Eric Grange <XXXX@XXXXX.COM>writes:
Quote
One has to wonder what the point of using .NET to target 64-bit would be,
since several important benefits are negated by design:
- GC already has trouble coping with hundreds of megabytes worth of
objects, and with gigabytes, it is not really improving (euphemism).
A nice side effect of future developments is that with more cores, the
scanning of gen 2 (i.e. the big bit) can be done in parallel. As long as
CPU cores and memory keep pace with each other, this shouldn't really
get slower.
Speaking for my own development on .NET, I have found it to be extremely
fast - although apps require 2-3x more memory than in the past. Of
course, my application domain was application servers, so I didn't need
SSE etc. Time spent in GC never exceeded 2% or so.
Quote
- IL doesn't support vector floating-point, so the benefits of
standardized SSE2 support are lost.
.NET is the wrong tool for fast vector SIMD code, and that isn't its
target market.
I can think of three possible ways this could be improved in the future:
1) Supply .NET valuetypes that the CLR has intrinsic knowledge about.
Operators on these valuetypes can be internal calls, so the JIT can
exercise extra information on them.
2) Give the JIT extensive knowledge of certain idioms produced by the C#
compiler for vector-style code, so it can extract possible SIMD
implementations. This is a much longer shot than option (1).
3) A "SIMD sublanguage" in MSIL, perhaps designed along the lines of
current shader assembly languages that we see these days in GPUs. See
microsoft.cs.msu.su/projects/ilshaders/ilshaders.pdf for example.
Quote
- JITter (so far) doesn't seem able to make good use of the
extra registers. This may improve, but there are still quite a few
roundtrips to memory and stack juggling in there, unless you're lucky
enough to hit one of the few peephole optimizations.
I definitely expect register usage to improve - MS are hardly going to
stand still.
The "peephole" optimizations in .NET are directly related to common C#
idioms and the IL the current MS C# compiler emits for those idioms. So,
whether one is implementing an algorithm in C# or implementing a code
generator for MSIL, the code pattern to stick to is the "natural" one
for C#. I think this is only to be expected.
Quote
A side effect of larger pointers in .NET is that as objects get bigger,
they also become more likely to end up in generation 1+, which is
collected less often, and thus memory usage is inflated even more.
The size of Gen0 is directly proportional to the CPU cache size, so
again, as CPU caches get bigger, this won't make a huge difference.
Speaking from personal experience, GC has never been a problem on
multi-CPU machines in a server environment, under heavy throughput for
many hours (ASP.NET performs appdomain recycling, so "days" doesn't
apply).
-- Barry