Document Processing question

Doug Easterbrook doug at
Wed Sep 7 21:02:51 UTC 2022

hi Paul.

I’ll leave others to talk tech on the solutions that have been provided - all of which assume you need to archive the data.

there is also the out-of-the-box answer, which is: 'do you really need to archive the data to another drive?' I'm speaking tongue-in-cheek here, because there are many unstated factors that could say 'yes, archiving must be done'.

I used to worry about that at one time with our application, back when a huge drive was 160 KB, SCSI, direct-connected, and everything was managed in an Omnis data file.

nowadays, if a venue needs more space to track data, I suggest they just buy it. It is far cheaper to buy more storage than to write code to optimize space, on top of the run-time hassles of making sure archiving processes run.

a 16 TB drive is about $500 CAD. If I spend more than a day trying to save space on it, then I'll just buy a second one and add it to our array.

again, I have no clue about the volume of data you are dealing with or its growth rate, but if you are archiving (meaning keeping) the data, why not just add more/larger disks and avoid the problem?

as an example, I was running out of space on my FreeNAS home server, so I bought a bunch of 12 TB drives and replaced the RAID array's disks one by one. When I was done, I'd tripled my space.
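For anyone wanting to follow the same route, the one-disk-at-a-time replacement is a standard ZFS procedure on FreeNAS/TrueNAS: the pool grows automatically once every disk in the vdev has been swapped for a larger one, provided autoexpand is enabled. A sketch, with hypothetical pool and device names (`tank`, `ada0`, `ada4` are placeholders, not from the original post):

```shell
# Allow the pool to grow once all disks have been replaced.
zpool set autoexpand=on tank

# Swap one old disk for one new, larger disk. Let the resilver
# finish completely before touching the next disk.
zpool replace tank ada0 ada4
zpool status tank          # watch until resilvering completes

# ...repeat "zpool replace" for each remaining disk in the vdev...

zpool list tank            # extra capacity appears after the last swap
```

The pool stays online and redundant throughout, which is why the replacements tripled the space with no downtime.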

if I do this again in 5 years, it's still cheaper (than paying for manpower) than any sort of document management and offloading.
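On Paul's original batching question below, a sketch in Python rather than Omnis (I don't have the FileOps code in front of me; directory and file names here are placeholders): gathering the archive candidates first and moving them in one pass avoids per-record overhead. One caveat worth stating plainly: a "move" to a separate drive is really a copy plus a delete, so the copy time dominates no matter how the calls are batched.

```python
import shutil
from pathlib import Path

def archive_batch(work_dir, archive_dir, names):
    """Move the named files from work_dir to archive_dir in one pass.

    On the same filesystem this is a cheap rename; across drives
    shutil.move falls back to copy-then-delete, so expect it to take
    roughly as long as copying the data.
    """
    src = Path(work_dir)
    dst = Path(archive_dir)
    dst.mkdir(parents=True, exist_ok=True)

    moved = []
    for name in names:
        s = src / name
        if s.exists():                       # skip records whose file is already gone
            shutil.move(str(s), str(dst / name))
            moved.append(name)
    return moved
```

If the deletes happen in a later pass anyway, an alternative is to bulk-copy with an OS tool (rsync, robocopy) and only record which files were archived, which is usually faster than thousands of individual per-file calls.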

Doug Easterbrook
Arts Management Systems Ltd.
mailto:doug at
Phone (403) 650-1978

> On Sep 7, 2022, at 10:40 AM, Paul Manning via omnisdev-en <omnisdev-en at> wrote:
> Background. I am working with a company that handles a massive amount of documents.  The system is set up to store these documents on a drive on the network. They are wanting to start archiving/deleting old records.  Due to time requirements they will need to archive to a separate drive then come back later and delete those.
> It presently goes through data records to determine which have docs to archive and does a fileOps.$move… to move files from the working directory to the archive directory. Well this takes a bit of time when you are dealing with thousands of files.  Might there be a way to batch these moves?
> Paul
> ************************************************
> Paul R. Manning
> paulrmanning58 at
> _____________________________________________________________
> Manage your list subscriptions at
> Start a new message -> mailto:omnisdev-en at 
