BDE with NT very slow on first insert using Delphi Cached Update

I hope someone out there can assist me with this problem.  I just
switched my system to NT Server in preparation for getting clients up
and running on the server and the client/server application.  I am still
learning NT Server... so this may have been one of my blunders.

I am using Delphi 3 C/S and InterBase 4.2.

What I am doing is using a cached-update query to get the information
requested by the user.  When the user wants to insert new data, the
TUpdateSQL is used, which for the insert just calls a stored procedure
in IB.  That IB stored procedure then handles the inserts, changes and
updates to various tables.

So what happens on NT Server, which did not occur on Win95, is that the
first insert takes FOREVER!  At least a minute... the cursor is just a
pointer, then the cursor changes to the SQL cursor and the data is
committed.  After this first insert, performance is back to what it was
with Win95.

I did a trace with SQL Monitor and it looked like there were a lot of
fetches of all records for the TQuery.  Hmm.  But this must have been
true with Win95 as well, since the code was not changed.

So what do I do next?

Is this a NT configuration blunder?
Did I reinstall IB 4.2 incorrectly?
Do I need to raise the IB settings and OS settings for IB 4.2?  What
would be acceptable values, or where would I get the information to
determine the best fit for my needs?
Do I need to change some settings with the BDE?
Is there something wrong with the approach I chose, using
TQuery/TUpdateSQL to insert/update/delete via an IB stored procedure?

Thanks for your help and guidance.

Barb  ;)

 

Re:BDE with NT very slow on first insert using Delphi Cached Update


Barb,

I have these questions:

1. Where in your code are you putting the TUpdateSQL.Apply method?

2. How are you using TUpdateSQL to start a stored procedure? What does
the code look like?

3. Have you tried using a stored procedure component instead of a
TUpdateSQL?

Phil Cain

Re:BDE with NT very slow on first insert using Delphi Cached Update


Philip,

Thanks for your response.

I think what I am doing is pretty standard; if not, please advise.  The
problem occurred in working software when I upgraded to NT Server 4.0
SP3.

But below are answers to your questions... any pointers would be
appreciated.  The query actually links 3 tables together.  The TUpdateSQL
calls a stored procedure in the database which determines what needs to
be updated, inserted, or deleted in which tables.  The specific stored
procedure called depends on the UpdateKind.

Quote
Philip Cain wrote:
> Barb,

> I have these questions:

> 1. Where in your code are you putting the TUpdateSQL.Apply method?

In some code which gets called when the user or the system needs to post
the data.  I call a routine I call ApplyUpdates.  Here is some sample
code:

  if MyQuery.UpdatesPending then
  begin
    MyQuery.Database.TransIsolation := tiDirtyRead;
    MyQuery.Database.StartTransaction;
    try
      MyQuery.ApplyUpdates;
      MyQuery.Database.Commit;
    except
      MyQuery.Database.Rollback;
      raise;
    end;
    MyQuery.CommitUpdates; {on success, clear the internal cache}
  end; {if MyQuery.UpdatesPending then}
  {refresh the query}

Quote
> 2. How are you using TUpdateSQL to start a stored procedure? What does
> the code look like?

Yes, I am executing a stored procedure in the TUpdateSQL.  I have run
SQL Monitor and it just seems to loop for a while.  After the first
insert or record move, everything goes very smoothly.

Here is the OnUpdateRecord code on the query and the execute line in the
TUpdateSQL for an insert.

  with MyUpdateSQL, Query[UpdateKind] do
  begin
    SetParams(UpdateKind);
    MyUpdateSQL.Query[UpdateKind].ParamByName('ID').AsInteger := gCurrentId;
    MyUpdateSQL.Query[UpdateKind].ParamByName('CUSTOMER_ID').AsInteger :=
      MyQuerySource.DataSet.FieldByName('CUSTOMER_ID').AsInteger;
    MyUpdateSQL.Query[UpdateKind].ParamByName('OWNER_ID').AsInteger :=
      MyOwnerQuerySource.DataSet.FieldByName('OWNER_ID').AsInteger;
    {other params}
    ExecSQL;
    UpdateAction := uaApplied;
  end; {with ...}

The line for the insert might look like this:

  execute procedure insert_my_information(:id, :customer_id, :owner_id,
  .....);

Quote
> 3. Have you tried using a stored procedure component instead of a
> TUpdateSQL?

I really like the convenience of using a query for grid
information/edit/modify etc.  Then, when it is time to do the updates,
it gets handled via the connected TUpdateSQL.  I use TStoredProc for
other items such as getting generated values.  My TUpdateSQL may execute
various stored procedures in the database based on the action provided,
or insert/delete/update to a specific table.  In general I like the
TUpdateSQL because all the activities for the updates are coordinated in
one area with one component.

Is there some performance/network issue I am not aware of?

Quote
> Phil Cain

Thanks for your help in resolving this performance hit.

Barb  ;)

Re:BDE with NT very slow on first insert using Delphi Cached Update


Barb,

Sorry for the questions. I just had never seen TUpdateSQL used with
stored procedures. Interesting to know that it works.

Since your code does work, the first obvious conclusion is that the
logic is valid. I'm not familiar with the nuances of NT and so I can't
comment directly on how that might be affecting your performance, but I
have these comments (from the ridiculous to....):

First, the apply updates. You wrote:

Quote
>  if MyQuery.UpdatesPending then
>  begin
>    MyQuery.Database.TransIsolation := tiDirtyRead;
>    MyQuery.Database.StartTransaction;
>    try
>      MyQuery.ApplyUpdates;
>      MyQuery.Database.Commit;
>    except
>      MyQuery.Database.Rollback;
>      raise;
>    end;
>    MyQuery.CommitUpdates; {on success, clear the internal cache}
>  end; {if MyQuery.UpdatesPending then}

The test for UpdatesPending is probably redundant. If there are no
updates pending, then OnUpdateRecord won't fire. The only performance
hit here is the overhead for starting and ending a transaction. But
since you only start a transaction when updates are pending, there is
no effect on your performance.

The dirty read declaration is a candidate. NT may take that as a
signal to refresh the local dataset, and that may be causing a
performance problem at the start. It is also a potential problem for
cached updates, and you may want to consider eliminating this
setting. Cached updates depend on a stable dataset. If a dirty
read introduces a new record or deletes one, for example, cached
updates may lose their place in your dataset and you may wind up with
the dreaded "unexpected results." You're not likely to see this in
testing unless you test in a heavy multi-user environment, but it
could be a problem if dirty read changes the dataset underlying the
cache.
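
Eliminating the dirty read might look like the sketch below, assuming
the BDE's read-committed isolation level is acceptable for the
application (sketch only, using the names from the posted code):

```pascal
{ Sketch: use read committed instead of dirty read, so the transaction
  only sees changes other users have committed and the dataset
  underlying the cache stays stable. }
MyQuery.Database.TransIsolation := tiReadCommitted; { was tiDirtyRead }
MyQuery.Database.StartTransaction;
```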

Not a performance problem, just a suggestion, but think about putting
the CommitUpdates statement directly behind the Database.Commit
statement. That way, the cache is preserved if you get an exception.
As it is, you're flushing the cache whether or not there is an
exception. On exception, you will lose all the changes in the cache
and have to do all the entries again.
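
That reordering might look like this (a sketch only, based on the
posted routine):

```pascal
MyQuery.Database.StartTransaction;
try
  MyQuery.ApplyUpdates;
  MyQuery.Database.Commit;
  MyQuery.CommitUpdates; { flush the cache only after a successful commit }
except
  MyQuery.Database.Rollback;
  raise; { the cache still holds the edits, so nothing is lost }
end;
```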

In your OnUpdateRecord:

Quote
>Here is the OnUpdateRecord code on the query and the execute line in the
>TUpdateSQL for an insert.
>
>  with MyUpdateSQL, Query[UpdateKind] do
>  begin
>    SetParams(UpdateKind);
>    MyUpdateSQL.Query[UpdateKind].ParamByName('ID').AsInteger := gCurrentId;
>    MyUpdateSQL.Query[UpdateKind].ParamByName('CUSTOMER_ID').AsInteger :=
>      MyQuerySource.DataSet.FieldByName('CUSTOMER_ID').AsInteger;
>    MyUpdateSQL.Query[UpdateKind].ParamByName('OWNER_ID').AsInteger :=
>      MyOwnerQuerySource.DataSet.FieldByName('OWNER_ID').AsInteger;
>    {other params}
>    ExecSQL;
>    UpdateAction := uaApplied;
>  end; {with ...}

The only performance candidate I can see here is the SetParams method,
which you call before you actually set the param values. (Please don't
take what I say here as gospel, just speculation. I use Apply
instead of SetParams and ExecSQL, and so don't have any street
experience with it.) My impression is that setting the param values
should come first, then SetParams, then ExecSQL, but I could be way off
base here.

I'm assuming that you chose the SetParams/ExecSQL route because the
params don't exist anywhere else except in the UpdateSQL. If so, then
you're stuck with this construct. But for completeness, I'll mention
another possibility.

TUpdateSQL has a DataSet property (set at run time; it's not in the
Object Inspector). If that is set to the TQuery or TTable you are
updating, then you don't have to set the params explicitly: TUpdateSQL
fills in any params whose names match field names of that dataset (or
OLD_ plus a field name, for the original values). In this case, you can
skip the SetParams/ExecSQL and the explicit assignments in
OnUpdateRecord and simply say MyUpdateSQL.Apply(UpdateKind), which runs
the SetParams and ExecSQL automatically.
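
A sketch of that alternative, using the component names from the posted
code (the form, handler name, and FormCreate wiring are assumptions;
note that params which don't match a field, such as :ID from gCurrentId,
would still need a manual assignment before Apply):

```pascal
procedure TForm1.FormCreate(Sender: TObject);
begin
  { Run-time only; the DataSet property is not in the Object Inspector. }
  MyUpdateSQL.DataSet := MyQuery;
end;

procedure TForm1.MyQueryUpdateRecord(DataSet: TDataSet;
  UpdateKind: TUpdateKind; var UpdateAction: TUpdateAction);
begin
  { Params that don't correspond to a field still need setting by hand. }
  MyUpdateSQL.Query[UpdateKind].ParamByName('ID').AsInteger := gCurrentId;
  { Apply does the SetParams and ExecSQL in one call, binding any params
    whose names match fields of MyQuery. }
  MyUpdateSQL.Apply(UpdateKind);
  UpdateAction := uaApplied;
end;
```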

Just as a housekeeping note, you start this construct with:
  with MyUpdateSQL, Query[UpdateKind] do
and then don't seem to use those qualifiers. That is, you repeat them
in the long statements. I imagine that you repeat them to avoid some
ambiguity, but I don't understand the reference to ", Query[UpdateKind]"
so I can't offer suggestions. (I hate those long statements, don't
you?)
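
In case it helps, if the with is only meant to address the query for the
current UpdateKind, a shorter form (a sketch only, reusing the names
from the posted code) might be:

```pascal
with MyUpdateSQL.Query[UpdateKind] do
begin
  ParamByName('ID').AsInteger := gCurrentId;
  ParamByName('CUSTOMER_ID').AsInteger :=
    MyQuerySource.DataSet.FieldByName('CUSTOMER_ID').AsInteger;
  ParamByName('OWNER_ID').AsInteger :=
    MyOwnerQuerySource.DataSet.FieldByName('OWNER_ID').AsInteger;
  {other params}
  ExecSQL;
end;
UpdateAction := uaApplied;
```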

That's all I can say at this point. Perhaps some NT expert can add
some wisdom.

Hope this helps.

Phil Cain
