
Speed of "large" databases question

Hello !

I have a database in which data is stored per person ID per date:

  field 1:    ID Integer
  field 2:    Date TDateTime
  field n-m: .... ....

The primary index is 'ID;Date'.

The application generates a certain report every day (for all persons) from
the data gathered that day. I use a filter to show only that day's data. I
also use a master-detail relation between this DB and a Personnel DB. The
data is accessed through a TTable component.

If the database is small (a couple of megabytes), the report is generated
within 10 seconds. If the database is bigger (around 15 megabytes), it takes
several minutes to generate the same report (same date, same data for that
date).

What can be done to speed things up? Should I first run a query and then
build the report from that query, or does the query take just as long to
generate?

Thanks,

Roy

 

Re:Speed of "large" databases question


"Roy Rutten" <roy.rut...@xarmac.nl.remove.nospam> wrote in message
news:9o6psp$2h1l$1@reader02.wxs.nl...

Quote
> What can be done to speed things up, should I first make a query and then
> make the report with this query, or does this query take just as long to
> generate ??

I've got an application with exactly the same fields and the same problem. I
believe I use filters in my application, but I just tested a TQuery: it's
quicker, no doubt. I don't really know why; it should be possible to
implement a filter in exactly the same way. Maybe it's a BDE problem -- the
BDE is supposed to work with a wide range of databases, including SQL
servers that don't support live queries...

regards,
--
Bjoerge Saether
Consultant / Developer
http://www.itte.no
Asker, Norway
bjorge@takethisaway_itte.no (remove the obvious)

Re:Speed of "large" databases question


Filters are processed locally, so all records have to be transferred from
the server to the local PC before they are filtered. Queries are processed
remotely (at least on SQL servers) and return only the matching records.
So if transfer speed is the limiting factor, queries are faster.
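
A minimal sketch of the query approach (the BDE alias 'MyData', the table
name 'Visits', and the variable names here are hypothetical, not from the
original post):

```pascal
var
  Q: TQuery;
begin
  Q := TQuery.Create(nil);
  try
    Q.DatabaseName := 'MyData';                  { hypothetical BDE alias }
    Q.SQL.Text := 'SELECT * FROM Visits ' +      { hypothetical table }
                  'WHERE "Date" = :ReportDate ORDER BY ID';
    Q.ParamByName('ReportDate').AsDate := Date;  { today's records only }
    Q.Open;  { only the matching rows are returned to the client }
    { ...feed Q to the report instead of the filtered TTable... }
  finally
    Q.Free;
  end;
end;
```

The WHERE clause lets the engine restrict the rows before they reach the
application, which is exactly what a client-side filter cannot do.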

Quote
"Bjørge Sæther" <REMOVE_bsaether@THIS_online.no> wrote in message
news:B4Ip7.19432$1T5.226569@news1.oke.nextra.no...
> I believe I've used filters in my application, but I tested a TQuery right
> now. It's quicker, no doubt. It should be possible to implement a filter
> in exactly the same way.

Re:Speed of "large" databases question


Quote
"Roy Rutten" <roy.rut...@xarmac.nl.remove.nospam> wrote in message
news:9o6psp$2h1l$1@reader02.wxs.nl...
> The primary index is 'ID;Date'. [...] I use a filter to show only the data
> of that day. [...] What can be done to speed things up?

Have you tried adding an index with Date as the primary segment? If you only
want to show data for one day, this should speed things up.
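
If the TTable stays in the picture, this suggestion can be combined with a
range instead of a filter; a sketch (the index and component names are
hypothetical):

```pascal
{ One-time setup: a secondary index with Date as the primary segment. }
Table1.AddIndex('ByDate', 'Date;ID', []);

{ At report time: only the rows for the requested day are fetched,
  located via the index rather than by scanning every record. }
Table1.IndexName := 'ByDate';
Table1.SetRange([ReportDate], [ReportDate]);
```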

Re:Speed of "large" databases question


"M.H. Avegaart" <avega...@NOSPAMmccomm.nl> wrote in message
news:9o8143$18qe$1@scavenger.euro.net...

Quote
> Filters are processed locally, so all records have to be transferred from
> the server to the local PC before they are filtered. Queries are processed
> remotely (at least on SQL servers) and only return the valid records.
> So, if transfer speed is the limiting factor queries are faster.

I tested this with TQuery vs. TTable w/filter, and a local Paradox database.
The query was noticeably faster even then.


--
Bjoerge Saether
Consultant / Developer
http://www.itte.no
Asker, Norway
bjorge@takethisaway_itte.no (remove the obvious)

Re:Speed of "large" databases question


I had the best results when I loaded the whole query result into memory,
created an index in memory, and generated the reports/processes by searching
memory. The difference was astonishing: it dropped from almost 4000 ms to
700 ms, and most of that 700 ms was spent loading the query result into
memory.
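
One way to sketch this in-memory approach (assuming a TClientDataSet wired
to the query through a TDataSetProvider; the component names are
hypothetical):

```pascal
{ DataSetProvider1.DataSet := Query1 is set up beforehand. }
ClientDataSet1.ProviderName := 'DataSetProvider1';
ClientDataSet1.Open;                        { pulls the result set into RAM }
ClientDataSet1.AddIndex('ByID', 'ID', []);  { index built in memory }
ClientDataSet1.IndexName := 'ByID';
{ The report now searches memory only, e.g.
  ClientDataSet1.FindKey([APersonID]) hits the in-memory index. }
```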

In article <9o6psp$2h1...@reader02.wxs.nl>,
roy.rut...@xarmac.nl.remove.nospam says...

Quote
> What can be done to speed things up, should I first make a query and then
> make the report with this query, or does this query take just as long to
> generate ??

--
Best regards,
Emil Beli
http://www.greenhousemm.com

Re:Speed of "large" databases question


Quote
> Have you tried adding an index with Date as the primary segment? If you
> only want to show data for one day, this should speed things up.

I thought about this too; however, the reports need to be generated not only
by day but by period as well. I use a master-detail relationship to process
only the data of persons who have been selected in a list. A master-detail
relation must include the first field of the index, so switching the Date
and ID fields requires making a secondary index with ID as the first field.
I tried this, but it takes even longer than the original method (I didn't
expect that!).

I've also tried making a query per person, which is very slow, as I've just
seen in a test app.
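
Instead of one query per person, a range per person on the existing
'ID;Date' primary index might be worth trying; a sketch (the component and
variable names are hypothetical):

```pascal
{ For each selected person, fetch only that person's rows for the period;
  the (ID;Date) primary index serves both the person and the date bounds. }
Table1.IndexFieldNames := 'ID;Date';
Table1.SetRange([PersonID, StartDate], [PersonID, EndDate]);
{ ...process the ranged rows for this person... }
Table1.CancelRange;
```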

Thanks anyway !

Re:Speed of "large" databases question


In article <9o9npc$sj...@reader05.wxs.nl>,
roy.rut...@xarmac.nl.remove.nospam says...

Quote
> I tried this, but it takes even longer than the original method (I didn't
> expect that!)

Just remember that indexing a small table will probably make it slower than
leaving it unindexed. For example, if you have 4 indexed fields on a
relatively small number of records and the table is updated often, it can
take three times longer to update the indexes than to update the table
itself.

--
Best regards,
Emil Beli
http://www.greenhousemm.com
