
Re: C++Builder future live chat replay


2005-02-04 10:37:02 AM
cppbuilder61
Leroy Casterline wrote:
Quote
It's quite refreshing to be able to communicate back.
Definitely!
--
John Kaster blogs.borland.com/johnk
Features and bugs: qc.borland.com
Get source: cc.borland.com
What's going on? calendar.borland.com
 
 

Re:Re: C++Builder future live chat replay

Jonathan Neve wrote:
Quote
I suspected it wasn't merely the delay before compiling, but since I
haven't done any systematic testing, I wasn't sure. Currently, for my
big 700-unit project, I'm using precompiled headers, which helps a lot.
Yet most of my projects (somewhere between 150 and 300 units usually)
don't use precompiled headers, and they compile disproportionately
faster. As I said, I haven't done any thorough experimenting, so I
can't give any figures, but what you say definitely matches my experience.
One way to handle large projects is to break them into smaller LIBs (or
DLLs). I just took out all the files that I don't modify or debug too
often, and put them in several LIB files. This tremendously decreases
compile time. Not only because you have to compile fewer files, but
because the IDE handles smaller projects incomparably faster. So it's
better to create a project group and compile the project in smaller
portions at a time. The problem is not that there are too many files to
compile, but that large projects are disproportionately
slower. In other words, the exact same files that compile fast in a
small project are considerably slower in a large one. The compiler goes
into slow motion when there are too many units in a BPR. I'll try to use
make instead of the IDE, but after breaking it into LIBs, I'm quite
happy now. I also use #pragma link to link the LIB files, as opposed to
adding them to the project.
One more observation. If I delete all the obj and csm files before a
full rebuild, the build is noticeably faster than without doing this
cleanup.
A RAM disk helped for me, but only a very little bit, and only if the
entire project, everything is in the RAM disk. It's not worth the hassle
to me, compile times are comparable to the hard disk times, at least on
my computer. However, if only the obj and csm files are on a RAM disk
while everything else is on the hard disk, it's slower than if everything
is on the hard disk. It seems it's best when all files are on the same
drive, but not in the same directory; temporary files should always go
in a separate subdirectory. I have a Western Digital Raptor SATA
drive, which is one of the fastest hard disks, and it's worth it. It
doesn't have to be a very large one. Other people told me that they had
very good results with a good RAM disk. It's worth doing some tests with
a stop watch (yes, a stop watch, because my compiler sometimes stops in
the middle, and I have to start it again, and thus the compile time
reported in the main caption is inaccurate).
Nothing would be more important to me than fixing this very steep
compile time increase that I experience with growing projects. I don't
want to spread false information by saying that it's exponential, but I
suspect it's quite bad. At one point I added 1 unit to my project and it
took 10 seconds. I added 1 more and it took a minute. Then I added one
more and it crashed after several minutes. We just have to avoid these
insanely large projects. Take the time and break them up, and you can
ease the pain that way.
Tom
 

Re:Re: C++Builder future live chat replay

Tamas Demjen wrote:
Quote
One way to handle large projects is to break them into smaller LIBs (or
DLLs). I just took out all the files that I don't modify or debug too
often, and put them in several LIB files. This tremendously decreases
compile time. Not only because you have to compile fewer files, but
because the IDE handles smaller projects incomparably faster. So it's
better to create a project group and compile the project in smaller
portions at a time. The problem is not that there are too many files to
compile, but that large projects are disproportionately
slower. In other words, the exact same files that compile fast in a
small project are considerably slower in a large one. The compiler goes
into slow motion when there are too many units in a BPR. I'll try to use
make instead of the IDE, but after breaking it into LIBs, I'm quite
happy now. I also use #pragma link to link the LIB files, as opposed to
adding them to the project.
I might try that. I'm not really very fond of the idea, as it sounds
like considerable extra hassle (especially considering that my project
is continually being worked on, so...). But I guess it might help.
Still, that's just a workaround: there's still some basic BCB bug in
there somewhere, it's not normal to have to do that!
Quote
One more observation. If I delete all the obj and csm files before a
full rebuild, the build is noticeably faster than without doing this
cleanup.
I haven't tried that; I'll give it a try.
Quote
A RAM disk helped for me, but only a very little bit, and only if the
entire project, everything is in the RAM disk. It's not worth the hassle
to me, compile times are comparable to the hard disk times, at least on
my computer. However, if only the obj and csm files are on a RAM disk
while everything else is on the hard disk, it's slower than if everything
is on the hard disk. It seems it's best when all files are on the same
drive, but not in the same directory; temporary files should always go
in a separate subdirectory. I have a Western Digital Raptor SATA
drive, which is one of the fastest hard disks, and it's worth it. It
doesn't have to be a very large one. Other people told me that they had
very good results with a good RAM disk. It's worth doing some tests with
a stop watch (yes, a stop watch, because my compiler sometimes stops in
the middle, and I have to start it again, and thus the compile time
reported in the main caption is inaccurate).
Glad to hear you say so. I had great expectations for using a RAM disk,
but my experience was similar to yours...
However, I currently have one directory with all the source files and
all the obj files, etc., in the same place. I'll give what you say a
try. Thanks for the tip!
Quote
Nothing would be more important to me than fixing this very steep
compile time increase that I experience with growing projects. I don't
want to spread false information by saying that it's exponential, but I
suspect it's quite bad. At one point I added 1 unit to my project and it
took 10 seconds. I added 1 more and it took a minute. Then I added one
more and it crashed after several minutes.
I entirely agree. It's insane to have to work around this kind of BCB
bug/misbehaviour.
Regards,
Jonathan.
 
