RE: Document Processing question

Rudolf Bargholz rudolf at
Thu Sep 8 04:32:03 UTC 2022

Hi Paul,

Just thought of another "solution": why not create a batch process that copies all files from the source to the destination? I would use FreeFileSync and create a job that syncs a folder from left to right without deleting any files at the destination. This job can be run as a batch. Then all your "archive" process would need to do is call $doesfileexist for the source and the destination, and if this returns true for both files, delete the file at the source. FreeFileSync is really good at this type of process, and it would take a lot of load off your Omnis workflow.
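The verify-then-delete step described above can be sketched in Python. This is a minimal illustration, not Omnis code: the function name and the list-of-pairs input are hypothetical, and it simply mirrors the "$doesfileexist at source and destination, then delete at source" logic.

```python
import os

def delete_verified_sources(pairs):
    """For each (source, destination) pair, delete the source file only
    when both copies exist -- mirroring the $doesfileexist check on both
    sides before removing anything."""
    deleted = []
    for src, dst in pairs:
        if os.path.isfile(src) and os.path.isfile(dst):
            os.remove(src)
            deleted.append(src)
    return deleted
```

Files whose destination copy is missing are left untouched at the source, so a failed or partial sync never causes data loss.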



-----Original Message-----
From: omnisdev-en <omnisdev-en-bounces at> On behalf of Rudolf Bargholz
Sent: Wednesday, 7 September 2022 22:04
To: OmnisDev List - English <omnisdev-en at>
Subject: RE: Document Processing question

Hi Paul,

To be honest, Andrew's solution will probably be the most performant one for your problem. However, you mention the starting point of your problem and the end result, but are glossing over the important middle part that would really help us give you decent information: what are the rules that determine whether a file needs to be archived? Using robocopy will likely only work if you can tell robocopy what those rules are. If the archiving rules cannot be mapped to the robocopy command, perhaps an easier solution would be to write the source and destination paths into a table in your database, then have an asynchronous process use the data from this table to create a batch file that does the moving of the files:

move /Y "c:\source\file1.txt" "z:\destination\file1.txt"
move /Y "c:\source\file2.txt" "z:\destination\file2.txt"

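Generating such a batch file from the database table is straightforward. A minimal Python sketch, assuming the rows are (source, destination) pairs already fetched from the table (the function name is hypothetical):

```python
def build_move_batch(rows):
    """Turn (source, destination) path pairs -- e.g. rows fetched from
    the database table described above -- into the text of a Windows
    batch file of 'move /Y' commands."""
    lines = ["@echo off"]
    for src, dst in rows:
        lines.append(f'move /Y "{src}" "{dst}"')
    return "\n".join(lines) + "\n"
```

Write the returned text to a .bat file and run it outside Omnis; the /Y switch suppresses overwrite prompts so the batch never blocks waiting for input.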
Depending on your rules for determining which files need to be batched, you could even create a simple SQL statement to mark the files to archive, copy the marked records into the archive table, and then delete the old file records, e.g.

begin transaction;

merge into ARCHIVE dest
using (
  select SEQ, SOURCEFILE
  from DMS
  -- plus whatever condition marks a record for archiving
) as source (SEQ, SOURCEFILE)
on source.SEQ = dest.SEQ
when not matched then
  insert (SEQ, SOURCEFILE)
  values (source.SEQ, source.SOURCEFILE);

commit; -- only if everything above succeeded

A second async process could then process the ARCHIVE table to copy the files from the old path to the archive path, with whatever workflow you decide on.
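That second async step can also be sketched in Python. This is only an illustration of the shape of the workflow: the column names (seq, sourcefile, archivepath) are hypothetical, and here the ARCHIVE rows are plain dicts rather than a real database cursor.

```python
import os
import shutil

def process_archive_rows(rows):
    """Sketch of the second async process: for each ARCHIVE row, copy
    the file from its old path to the archive path, then collect the
    sequence numbers of the rows that were processed so the source
    files can be deleted later."""
    processed = []
    for row in rows:
        os.makedirs(os.path.dirname(row["archivepath"]), exist_ok=True)
        shutil.copy2(row["sourcefile"], row["archivepath"])  # preserves timestamps
        processed.append(row["seq"])
    return processed
```

Because this step only copies and records success, the deletion of the originals can run as yet another pass, keeping each stage short and restartable.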

By splitting the workflow into two steps, you take away a lot of the pressure to make this process run as fast as possible.


Rudolf Bargholz

-----Original Message-----
From: omnisdev-en <omnisdev-en-bounces at> On behalf of Paul Manning via omnisdev-en
Sent: Wednesday, 7 September 2022 19:41
To: OmnisDev List - English <omnisdev-en at>
Cc: Paul Manning <paulrmanning58 at>
Subject: Document Processing question

Background: I am working with a company that handles a massive number of documents. The system is set up to store these documents on a network drive. They want to start archiving/deleting old records. Due to time requirements, they will need to archive to a separate drive first and come back later to delete.

It presently goes through data records to determine which have docs to archive and does a fileOps.$move… to move files from the working directory to the archive directory. This takes a bit of time when you are dealing with thousands of files. Might there be a way to batch these moves?


Paul R. Manning
paulrmanning58 at
