SQL Slow Downs?
Doug Easterbrook
doug at artsman.com
Wed Aug 12 19:54:46 EDT 2020
hi Das:
here’s the likely explanation for you on ‘insert fatigue’
Unlike the Omnis df1, SQLite is ACID-compliant.
one of the good effects is that a transaction makes changed data all or nothing. So an insert of many, many records also has to be able to create the rollback state. If there are a lot of records to roll back, SQL engines tend to slow down because they are writing out a big temp file to allow the rollback. It also increases the size of the database.
so, as Dan pointed out, doing mass inserts in chunks really means:
inserting a bunch of records, then issuing a commit. Inserting more records, issuing another commit. Repeat until done.
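That chunk-then-commit loop can be sketched in Python's sqlite3 module (the table name, chunk size, and generated rows here are illustrative, not from the original import):

```python
import sqlite3

# Commit every CHUNK rows so the pending transaction (and the rollback
# state the engine must maintain for it) never grows too large.
CHUNK = 10_000

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cities (id INTEGER PRIMARY KEY, name TEXT)")

rows = ((i, f"city-{i}") for i in range(50_000))  # stand-in for the big import list
batch = []
total = 0
for row in rows:
    batch.append(row)
    if len(batch) >= CHUNK:
        conn.executemany("INSERT INTO cities VALUES (?, ?)", batch)
        conn.commit()          # end the transaction; rollback state is discarded
        total += len(batch)
        batch.clear()
if batch:                      # final partial chunk, if any
    conn.executemany("INSERT INTO cities VALUES (?, ?)", batch)
    conn.commit()
    total += len(batch)

print(total)  # 50000
```

Each commit lets the engine throw away the rollback bookkeeping for that chunk, which is the point of the exercise.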
Shutting down the app probably means Omnis is issuing a commit for you.
When we do mass inserts in postgres, we do similar things. Our procedure is:
start transaction
insert some data
commit
vacuum
Then repeat the above.
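The same loop works against SQLite, which also has a VACUUM command (it rebuilds the database file to reclaim free pages). A minimal sketch, assuming a throwaway table and made-up chunk sizes:

```python
import os
import sqlite3
import tempfile

# Use a real file so VACUUM has an actual database file to rebuild.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE t (v INTEGER)")

for chunk in range(5):
    # sqlite3 opens a transaction implicitly on the first INSERT
    conn.executemany("INSERT INTO t VALUES (?)",
                     [(chunk * 1000 + i,) for i in range(1000)])
    conn.commit()              # commit the chunk
    conn.execute("VACUUM")     # reclaim free pages; must run outside a transaction

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 5000
```

Note that VACUUM cannot run inside an open transaction, which is why the commit comes first.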
why a vacuum at the end of each chunk? In postgres it helps manage disk space usage by reclaiming dead rows, and (run as VACUUM ANALYZE) it also refreshes planner statistics, which is a performance thing.
I believe SQLite has some similar characteristics
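It does: the statistics half of the job is SQLite's separate ANALYZE command, which populates the sqlite_stat1 table that the query planner reads. A small sketch (table and index names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (v INTEGER)")
conn.execute("CREATE INDEX idx_v ON t (v)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])
conn.commit()

conn.execute("ANALYZE")   # refresh planner statistics into sqlite_stat1
stats = conn.execute(
    "SELECT tbl, idx FROM sqlite_stat1 WHERE idx = 'idx_v'").fetchone()
print(stats)  # ('t', 'idx_v')
```

Re-running ANALYZE after a big load keeps the planner's row estimates in line with the new data.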
hopefully that gives you an idea why.
Doug Easterbrook
Arts Management Systems Ltd.
mailto:doug at artsman.com
http://www.artsman.com
Phone (403) 650-1978
> On August 12, 2020, at 2:16 PM, Das Goravani <goravanis at gmail.com> wrote:
>
>
> Well I figured out with the help of Dan’s comment that I should import the records in chunks, using the big list to import with simply do 10,000 then shut off the session and remake it, log back on, pick up where left off..
>
> I found that this totally works to cure what I learned about today, insert fatigue.. never heard of that til now and it totally answers my distress from lately.. noticed that if I shut down the app and restart it and start back up where left off it’s fast again like at first.. then it slows down.. this was quite distressing as I thought there’s no way big data can deal with the kind of speeds it was coming down to but now in code I simply log off and log back on and pick up and this keeps it very peppy.. I’ve done half a million records in about an hour which is totally acceptable for my needs..
>
> This cities database is the biggest I have to deal with.. only have to bring it in once, so this will be a one time thing.. I gotta work it a lot after it’s in, massage it for various reasons, speed then will be interesting.. I might have a multi day experience to go through
>
> I REALLY APPRECIATE THE fact that Dan got on here and mentioned “Insert Fatigue”.. the fact of existence of the expression taught tons and then that all get it and that you want to refresh your session every so many records to keep up speed.. it really worked for me for sure.. no slowdown on this one.. it’s cooking as I would expect from the software and gear that I’m running..
>
>