"Data files are full" error

todd aron, senior developer, software development taron at
Fri Apr 3 11:02:44 EDT 2009


Have you checked your data file for corruption?

We do a "quick check" twice a day (the data file is on an EMC Celerra, and we
take snapshots on the hour and check those); although it's not definitive,
it's served its purpose for us.

Some also swear by walking up and down the indices of every slot in the
data file. This can be done while in multi-user; I recently wrote a
procedure that does this and tracks inconsistencies (for now, it compares
the number of records in the data file per
$cdata.$slots.<tablename>.$recordcount vs. the count via find first/next
vs. the count via find last/previous). If you want the proc, contact me
off-list. It's O7 code but should translate to $o3.3.x.

Oh, and it looks at EVERY index, but it can be adjusted to only do selected
ones, e.g. SEQs or other primary keys. Our data file has 10 segments, and
it was still running on Monday a.m. after I set it off Friday evening; it
looked to be about half of the way through. (The data file is on the EMC, my
machine is reasonably fast, and the network is OK.)

Brian wrote:
> Hello All,
> I'm re-inserting some records in a native data file while users are
> accessing it.  The data file has 9 segments and 80 MB of free blocks.
> Periodically, the routine I created to re-insert the records will stop and
> report a "data files are full" error.  I have double-checked and the data
> files are not full...this includes the hard drives as well.
> Does anyone have a clue about this?  I'm using Studio 3.3.3 on this project.


be well,

todd ARON                    p 718.482.4206
Competitrack                 f 718.482.4286
36-36 33rd st, suite 501     <taron at>
long island city, NY 11106

