
Memory Leaks w/ ADO

Hi All,
 I have an application that runs every half hour, reading and updating
databases.  After about a day the server would slow down due to dwindling
free memory.  After several days of code revision I tracked it down to the
simple act of connecting the ADOConnection object!  I lose 3-4 MB of memory
on the connection and DO NOT get it back after the connection is closed or
the connection object is freed.  I cannot recover the memory until the
application is shut down.

You can easily reproduce this.

1. Create a new project and drop an ADOConnection object on the form.  Set it
to connect to a database (I tried both Access and SQL Server).
2. Drop a button and in the onClick put:

ADOConnection1.Connected := True;   // <=== huge memory jump!
Button1.Caption := 'Connected';
Sleep(3000);
ADOConnection1.Connected := false;
Button1.Caption := 'Disconnected';
Sleep(3000);
ADOConnection1.Free;
Button1.Caption := 'Freed';

3. Next, fire up Task Manager and check the memory, then run your program.
Click the button and watch the memory go up.  After the connection object
disconnects, you will see that the memory is not recovered.

I also tried this connecting directly with a TADOQuery object and saw the
same results.  This leads me to believe the bug lies either in the Borland
ADO components or in the Microsoft ADO objects themselves.  We are checking
on the Microsoft side now.

Does anyone have any experience with this?  I've been researching but do not
see how simply connecting to the datasource would cause such a huge memory
jump nor why it would not be freed until the application is stopped.

Thanks!
 - Brent

 

Re:Memory Leaks w/ ADO


1) Pooled connections will increase memory when the first connection is
made.
2) Several ADO drivers for OLE DB and ODBC leak some memory, but only a
small amount, not 4 MB.  To work around the problem you can run your code in
a virtual directory as a separate process and set the timeout of the process
in the Microsoft Transaction Server (MTS) Explorer: properties, advanced, of
the corresponding package (look for the name of the virtual directory).  If
the process shuts down once in a while, your leaks will probably disappear
with the shutdown.

Martijn Houtman

Quote
"Brent Williams" <bwilli...@netmerchants.net> wrote in message

news:8velr5$qj83@bornews.inprise.com...
Quote
> Hi All,
>  I have an application that runs every half hour reading and updating
> databases.  After about a day the server would
> slow down due to decreased memory.  After several days of code revision I
> tracked it down to the act of simply connecting the ADOConnection object!  I
> lose from 3-4M of memory for the connection and DO NOT get it back after the
> connection is closed or the connection object is freed.  I cannot get the
> memory back until the application is shut down.

> You can easily reproduce this.

> 1. Create a new project and drop an ADOConnection object.  Set it to connect
> to a database (I used Access and SQL)
> 2. Drop a button and in the onClick put:

> ADOConnection1.Connected := true;   <=== huge memory jump!
> Button1.Caption := 'Connected';
> Sleep(3000);
> ADOConnection1.Connected := false;
> Button1.Caption := 'Disconnected';
> Sleep(3000);
> ADOConnection1.Free;
> Button1.Caption := 'Freed';

> 3. Next, fire up Taskmanager and check the memory.  Then run your program.
> Click the connect button and watch the memory go up.  Then after the
> connection object disconnects you will see that the memory is not recovered.

> I also tried this connecting directly with a TADOQuery object and the same
> results were seen.  This leads me to believe that
> this bug is either common to the Borland ADO library or the Microsoft ADO
> objects.  We are checking on the Microsoft side now.

> Does anyone have any experience with this?  I've been researching but do not
> see how simply connecting to the datasource would cause such a huge memory
> jump nor why it would not be freed until the application is stopped.

> Thanks!
>  - Brent

Re:Memory Leaks w/ ADO


Quote
> 1) Pooled connections will increase memory when the first connection is
> made.

Take a look at SQL Profiler.  If you open the connection a second time you
will see that you are reusing an existing connection, so that is where your
memory has gone.  Also, SQL Server itself can be greedy with memory.

If you don't want pooling you can use another connection string.
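For example, the OLE DB core services honor an "OLE DB Services" attribute
in the connection string (the provider, server, database, and login below
are placeholders for your own):

```text
Provider=SQLOLEDB.1;Data Source=MyServer;Initial Catalog=MyDb;
User ID=MyUser;Password=MyPass;OLE DB Services=-2
```

As I recall, -2 enables all services except resource pooling, and -4 also
turns off automatic transaction enlistment.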

"Martijn Houtman"

Re:Memory Leaks w/ ADO


Are you changing the query and reusing it (140,000 times)?

Martijn Houtman

Quote
"Steve Fields" <sfield...@earthlink.net> wrote in message

news:3A21FC52.7BB26018@earthlink.net...
Quote
> I'm still waiting for an answer to my question,
> much the same as this one.  Does this also explain why
> the server will eventually run out of virtual memory
> after simply inserting numerous (140,000) records
> serially using ADO components?  After a short time
> the program uses 20-plus MB on the client and has pushed
> the server program (MS SQL 7) into using over 600 MB
> of space (on a 512 MB machine, at that).  (No grids
> used; just a TADOQuery reading the records from a
> flat file and writing them to the MS SQL table.)

> (D5.1 Ent, MS SQL7 w/updates,
> Compaq client 450mhz w/192 meg ram 10 gig,
> Dell server w/512 meg ram and 36 gig RAID)

> (I am still waiting for a nibble on my question from
> 11/20)

> Steve Fields

Re:Memory Leaks w/ ADO


So you are using a live dataset (server-side cursor?).  I don't think this
is a good idea for 140,000 inserts.  It will be much faster to use a
prepared INSERT statement with parameters that you execute directly.  Also,
if the transaction stays open until you finish all 140,000 inserts you will
have a lot of overhead, so it is better to commit the transaction after each
insert (or after each block of inserts).
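A sketch of what that looks like in Delphi (purely illustrative: the
component names, the BatchSize constant, the TotalRecords count, and the
ReadNextRecord helper are made up; assumes a TADOQuery whose SQL is a
parameterized INSERT and a TADOConnection called Conn):

```pascal
const
  BatchSize = 500;  // commit every 500 inserts; tune to taste
var
  i: Integer;
begin
  Conn.BeginTrans;
  try
    for i := 1 to TotalRecords do
    begin
      ReadNextRecord(Recd);  // hypothetical: fetch one flat-file record
      Qry.Parameters.ParamByName('property_id').Value := Recd.Id;
      Qry.Parameters.ParamByName('property_guid').Value := Recd.Guid;
      Qry.ExecSQL;  // executes the prepared INSERT; no result set, no cursor
      if i mod BatchSize = 0 then
      begin
        Conn.CommitTrans;  // flush this block of inserts
        Conn.BeginTrans;
      end;
    end;
    Conn.CommitTrans;  // commit the final partial block
  except
    Conn.RollbackTrans;
    raise;
  end;
end;
```

Keeping each transaction short keeps the server's lock and log overhead
bounded instead of letting it pile up across the whole load.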

Martijn Houtman

Quote
"Steve Fields" <sfield...@earthlink.net> wrote in message

news:3A226E07.2D8B43FA@earthlink.net...
Quote
> No. Simply using a query to give access to the
> dataset.
>   ...
>   DSet : TADOQuery;
>   l : integer;
>   fp : integer; // File Pointer
>   Recd : TFlatRecord; // defined record for access to flat file
>   ...
>   l := FileRead(fp, Recd, SizeOf(Recd));
>   while Not Done do begin
>     DSet.Append;
>     DSetField1.AsString := Trim(Recd.SomeThing);
>     DSetField2.AsInteger := StrToInt(Recd.SomeThingElse);
>     DSet.Post;
>     l := FileRead(fp, Recd, SizeOf(Recd));
>   end;
>   ...
>   FileClose(fp);

> Martijn Houtman wrote:
> > Are you changing the query and reusing it (140,000 times)?

Re:Memory Leaks w/ ADO


I remember something about a 4 KB memory leak, so that could be 32 KB for 8
posts if it happens on each post.

What I mean by an insert SQL is just something like

INSERT INTO data.object (id, guid) VALUES (:property_id, :property_guid)

You can use it with TADOCommand or TADOQuery and just execute it without
getting data back.  This will speed up the process because you don't need a
cursor; if you use a cursor, ADO unnecessarily caches a lot of data.
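In Delphi that might look like the following (a sketch only; the parameter
names follow the SQL above, the values are placeholders, and ADOConnection1
is assumed to already exist and be connected):

```pascal
var
  Cmd: TADOCommand;
begin
  Cmd := TADOCommand.Create(nil);
  try
    Cmd.Connection := ADOConnection1;
    Cmd.CommandText :=
      'INSERT INTO data.object (id, guid) VALUES (:property_id, :property_guid)';
    Cmd.Prepared := True;  // prepare once so repeated executes are cheap
    Cmd.Parameters.ParamByName('property_id').Value := 42;         // example value
    Cmd.Parameters.ParamByName('property_guid').Value := '{...}';  // example value
    Cmd.Execute;  // runs the INSERT without opening a recordset
  finally
    Cmd.Free;
  end;
end;
```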

Martijn Houtman

Quote
"Steve Fields" <sfield...@earthlink.net> wrote in message

news:3A228F0A.211C6CE4@earthlink.net...
Quote
> Martijn Houtman wrote:

> > So you are using a live data set (server-side cursor?).
> > AND
> > .... So, it is better to commit the transaction after each
> > insert (or a block of inserts).

> Actually I used BeginTrans and CommitTrans around
> each post, along with an Application.ProcessMessages
> after the commit.  I also used client and server
> cursors to try to find out what was happening.
> With both, the program used more and more memory in
> blocks of about 32k every 8 or 9 posts, and the server
> climbed to over 650 MB used by the SQL Server program.
> The virtual-memory overload on the server remained
> until I had to reboot the server.

> > It will be much faster to use a
> > prepared insert SQL with parameters that you execute. Also if the
> > transaction stays open until you finish the 140.000

> I am not familiar with an _insert sql_. (I am filling in over
> 60 fields, 170,000 records in the code, for the dataset, so i
> don't know if you mean use params and paste in the fields and
> execute the SQL for each record? It already takes over an hour
> to append the records. This could be a problem as I have to
> add another 80,000 records in the next few days and I have about
> 8 more datasets I am having to do this to. _Data conversion blues_)

> Steve Fields


Re:Memory Leaks w/ ADO


I'm still waiting for an answer to my question,
much the same as this one.  Does this also explain why
the server will eventually run out of virtual memory
after simply inserting numerous (140,000) records
serially using ADO components?  After a short time
the program uses 20-plus MB on the client and has pushed
the server program (MS SQL 7) into using over 600 MB
of space (on a 512 MB machine, at that).  (No grids
used; just a TADOQuery reading the records from a
flat file and writing them to the MS SQL table.)

(D5.1 Ent, MS SQL7 w/updates,
Compaq client 450mhz w/192 meg ram 10 gig,
Dell server w/512 meg ram and 36 gig RAID)

(I am still waiting for a nibble on my question from
11/20)

Steve Fields


Re:Memory Leaks w/ ADO


No.  Simply using a query to give access to the
dataset.
  ...
  DSet : TADOQuery;
  l : integer;
  fp : integer;       // file handle from FileOpen
  Recd : TFlatRecord; // record type matching the flat-file layout
  ...
  l := FileRead(fp, Recd, SizeOf(Recd));
  while l = SizeOf(Recd) do begin  // stop on a short or empty read
    DSet.Append;
    DSetField1.AsString := Trim(Recd.SomeThing);
    DSetField2.AsInteger := StrToInt(Recd.SomeThingElse);
    DSet.Post;
    l := FileRead(fp, Recd, SizeOf(Recd));
  end;
  ...
  FileClose(fp);
Quote
Martijn Houtman wrote:
> Are you changing the query and reusing it (140,000 times)?

Re:Memory Leaks w/ ADO


Quote
Martijn Houtman wrote:

> So you are using a live data set (server-side cursor?).
> AND
> .... So, it is better to commit the transaction after each
> insert (or a block of inserts).

Actually I used BeginTrans and CommitTrans around
each post, along with an Application.ProcessMessages
after the commit.  I also used client and server
cursors to try to find out what was happening.
With both, the program used more and more memory in
blocks of about 32k every 8 or 9 posts, and the server
climbed to over 650 MB used by the SQL Server program.
The virtual-memory overload on the server remained
until I had to reboot the server.

Quote
> It will be much faster to use a
> prepared insert SQL with parameters that you execute. Also if the
> transaction stays open until you finish the 140.000

I am not familiar with an _insert sql_.  (I am filling in over
60 fields across 170,000 records in the code for the dataset, so I
don't know if you mean use params, paste in the fields, and
execute the SQL for each record?  It already takes over an hour
to append the records.  This could be a problem, as I have to
add another 80,000 records in the next few days and I have about
8 more datasets I am having to do this to.  _Data conversion blues_)

Steve Fields

Re:Memory Leaks w/ ADO


Also, just for information: when the
server, SQL Server, is running with no
users and no tables open, should it
be using 365 MB just running in the
background as a service on the (Dell)
server?

Thanks!!!

Steve Fields

Re:Memory Leaks w/ ADO


How many databases do you have?
Do you have any Extended Stored Procs?

-Euan

Quote
"Steve Fields" <sfield...@earthlink.net> wrote in message

news:3A228F88.16F9DAA@earthlink.net...
Quote
> Also, just for information: when the
> server, SQL Server, is running with no
> users and no tables open, should it
> be using 365 MB just running in the
> background as a service on the (Dell)
> server?

> Thanks!!!

> Steve Fields

Re:Memory Leaks w/ ADO


One database, no procs, one index (for
uniqueness).  Just a simple append of records
filled in by assigning to FieldDefs and a
Post statement (with a transaction for
each read/write loop).

Steve Fields

Quote
Euan Garden wrote:

> How many databases do you have?
> Do you have any Extended Stored Procs?

> -Euan

Re:Memory Leaks w/ ADO


I will try that.
Isn't assigning via FieldDefs much
the same thing, though?

Steve Fields

Quote
Martijn Houtman wrote:

> I remember something about a 4k memory leak, so that could be 32k for 8
> posts if it happens for each post.

> What I mean with an insert sql is just something like

> INSERT data.object (id, guid) VALUES (:property_id, :property_guid)

> You can use it with TADOCommand or TADOQuery and just execute it without
> getting data back.  This will speed up the process because you don't need a
> cursor; if you use a cursor, ADO unnecessarily caches a lot of data.

> Martijn Houtman

Re:Memory Leaks w/ ADO


SQL Server 7.0 will, by default, appear to use most of the memory in the
machine.
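If you would rather cap it, SQL Server 7.0 lets you set an upper bound on
its dynamic memory via sp_configure (the 256 MB figure below is just an
example; pick a limit that suits your server):

```sql
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'max server memory (MB)', 256  -- cap the buffer pool
RECONFIGURE
GO
```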