1. - for each keynum in name.key file (each current record)
   - get the history
   - for each preamble in the history
     - seen{ preamble }++  (name.tmp.seen dbm file)
     - write to name.tmp.history
       - transnum keynum status [date reclen user]?

2. - for each record in name.n.dat files (each transaction)
   - die "seen too many times" if seen{ preamble }++ > 1
   - write to name.tmp.transactions
     - transnum keynum status [date reclen user]?
   - if read line from name.md5
     - compare transnum keynum user md5 -- die if not equal
   - else write to name.md5
     - transnum keynum user md5 (of record data) [date reclen]?
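
A rough Perl sketch of steps 1 and 2: the accessors current_keynums(),
history_preambles(), and transactions() are hypothetical placeholders for
however the datastore is actually read, and the lines written are
abbreviated to the raw preamble rather than the field lists noted above.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw( O_CREAT O_RDWR );
    use SDBM_File;
    use Digest::MD5 qw( md5_hex );

    # hypothetical accessors -- stand-ins for the real datastore reads
    sub current_keynums   { return () }  # keynums listed in name.key
    sub history_preambles { return () }  # preamble strings for one keynum
    sub transactions      { return () }  # [ preamble, record data ] pairs

    my %seen;
    tie %seen, 'SDBM_File', 'name.tmp.seen', O_CREAT|O_RDWR, 0666
        or die "can't tie name.tmp.seen: $!";

    # step 1: count every preamble reachable from the key file
    open my $hist, '>', 'name.tmp.history' or die "name.tmp.history: $!";
    for my $keynum ( current_keynums() ) {
        for my $preamble ( history_preambles( $keynum ) ) {
            $seen{ $preamble }++;
            print {$hist} "$preamble\n";
        }
    }
    close $hist;

    # step 2: walk every transaction stored in the name.n.dat files
    my $have_md5 = -e 'name.md5';
    my @md5_lines;
    my $md5_out;
    if( $have_md5 ) {
        open my $in, '<', 'name.md5' or die "name.md5: $!";
        chomp( @md5_lines = <$in> );
        close $in;
    }
    else {
        open $md5_out, '>', 'name.md5' or die "name.md5: $!";
    }

    open my $trans, '>', 'name.tmp.transactions'
        or die "name.tmp.transactions: $!";
    my $line_no = 0;
    for my $tr ( transactions() ) {
        my( $preamble, $record_data ) = @$tr;

        # a preamble already counted twice is a duplicate transaction
        die "seen too many times: $preamble" if $seen{ $preamble }++ > 1;

        print {$trans} "$preamble\n";

        # simplified md5 line (the real one: transnum keynum user md5)
        my $md5_line = "$preamble " . md5_hex( $record_data );
        if( $have_md5 ) {
            my $old = $md5_lines[ $line_no++ ];
            die "md5 mismatch: $md5_line"
                unless defined $old and $old eq $md5_line;
        }
        else {
            print {$md5_out} "$md5_line\n";
        }
    }
    close $trans;
    close $md5_out if $md5_out;
    untie %seen;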
result:
    name.tmp.seen dbm file(s) - of no use after step 2
    name.tmp.history flat file - can compare to new after migrate;
        of no use after that
    name.tmp.transactions flat file - can compare to new after migrate;
        of no use after that
    name.md5 [sha]? - can compare to new after migrate;
        keep around for next validation
Note: we plan to add an optional 'digest' preamble field for one of
crc, md5, sha (of the record data)
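
For instance, such a field could be computed with the core Digest modules;
a rough sketch (the helper name is made up, and crc would need something
like Digest::CRC from CPAN):

    use Digest::MD5 qw( md5_hex );
    use Digest::SHA qw( sha1_hex );

    # hypothetical helper for the planned 'digest' preamble field
    sub record_digest {
        my( $type, $record_data ) = @_;
        return md5_hex(  $record_data ) if $type eq 'md5';
        return sha1_hex( $record_data ) if $type eq 'sha';
        die "unsupported digest type: $type";  # crc: see Digest::CRC
    }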
data scanning procedure:
    read each data record in from_ds:
        read first preamble
        get reclen, read record, skip recsep
        read next preamble
        repeat until end of file
    repeat for every datafile
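
A minimal sketch of that scan; the layout constants below are made-up
stand-ins for whatever the datastore's configuration actually specifies,
and reclen is assumed to be a plain decimal field.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $preamble_len = 60;     # assumed fixed preamble length
    my $reclen_pos   = 40;     # assumed offset of reclen in the preamble
    my $reclen_len   = 6;      # assumed width of the reclen field
    my $recsep       = "\n";   # assumed record separator

    for my $datafile ( glob 'name.*.dat' ) {   # repeat for every datafile
        open my $fh, '<', $datafile or die "$datafile: $!";
        binmode $fh;
        while( 1 ) {
            # read next preamble; stop at end of file
            my $got = read $fh, my $preamble, $preamble_len;
            last unless $got;
            die "short preamble in $datafile" if $got < $preamble_len;

            # get reclen, read record, skip recsep
            my $reclen = substr( $preamble, $reclen_pos, $reclen_len ) + 0;
            read $fh, my $record, $reclen;
            read $fh, my $recsep_read, length( $recsep );

            # ... validate $preamble and $record here ...
        }
        close $fh;
    }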