Name that concept

Many moons ago I came across a piece of software that would let you split large files into much smaller files. The same utility was also capable of rebuilding the initial file as well.

Software like this is easy to get hold of, but this particular piece of software had another very useful tool that I am looking to replicate in my own project.

When the initial file is split down, the software also creates a check file. This means that if, for example, the initial file split down into 50 pieces plus the check file, and for whatever reason I only had 47 of the split files, I could use the check file to rebuild the missing data. I assume this works in a similar way to RAID hard drives.

Does anybody know what this "technology" is called, and as an extra bonus, does anybody know of any tutorials that might point me in the right direction?
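The RAID comparison is a good instinct. As a minimal sketch (illustrative only, not what the actual utility does), here's the simplest version of the idea: a single XOR parity block, the same trick RAID 5 uses, which can rebuild any one missing piece as long as you know which piece is gone.

```python
# Minimal XOR-parity sketch: one "check file" can rebuild ONE known-missing
# piece. (Hypothetical toy blocks; real tools work on file chunks.)
from functools import reduce

def xor_blocks(a: bytes, b: bytes) -> bytes:
    # Byte-wise XOR of two equal-size blocks.
    return bytes(x ^ y for x, y in zip(a, b))

data_blocks = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]  # equal-size pieces
parity = reduce(xor_blocks, data_blocks)            # the "check file"

# Lose block 2, then rebuild it from the survivors plus the parity block.
survivors = [blk for i, blk in enumerate(data_blocks) if i != 2]
rebuilt = reduce(xor_blocks, survivors, parity)
assert rebuilt == data_blocks[2]
```

Rebuilding 3 missing pieces out of 50, as described above, needs more redundancy than one parity block can provide.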
 
I know WinRAR can split files like that, and I believe it creates a check file, but I don't think the check file can be used to actually recreate data. AFAIK it basically acts like a hash: it verifies that what you have is the same as the original. If it could recreate data, that would be an awesome form of compression. :D Split the files, then throw a few of them away to save a few more megs.
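For contrast, here's a quick sketch of the hash-style check described above (hypothetical chunk, using SHA-256 as the digest): a digest can confirm a piece is intact, but it carries no redundancy, so it can never rebuild missing bytes.

```python
# Hash-style "check file": verification only, no data recreation possible.
import hashlib

piece = b"some chunk of the original file"          # hypothetical chunk
checksum = hashlib.sha256(piece).hexdigest()        # stored alongside the piece

received = piece
# If the digests match, the piece is intact; if they don't, the hash
# tells us nothing about what the correct bytes were.
assert hashlib.sha256(received).hexdigest() == checksum
```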
 
I'm with jpmkm on "no data re-creation." Other than that, it just sounds like a generic version of WinZip's split-for-floppy option. If you like, it's also like TCP.
 
BuddhistPunk said:
Many moons ago I came across a piece of software that would let you split large files into much smaller files. The same utility was also capable of rebuilding the initial file as well.

Software like this is easy to get hold of, but this particular piece of software had another very useful tool that I am looking to replicate in my own project.

When the initial file is split down, the software also creates a check file. This means that if, for example, the initial file split down into 50 pieces plus the check file, and for whatever reason I only had 47 of the split files, I could use the check file to rebuild the missing data. I assume this works in a similar way to RAID hard drives.

Does anybody know what this "technology" is called, and as an extra bonus, does anybody know of any tutorials that might point me in the right direction?

QuickPar for Windows does exactly that, and much more :)

-E

 
Error detection and correction.

If you've got 47 of 50 files and can still recreate the data, you have more than just simple parity (even if it's a scheme that spreads groups of parity across some other set of blocks).
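To see why multiple missing pieces need multiple independent checks, here's an illustrative sketch (a toy Vandermonde-style erasure code over a prime field, in the same spirit as the Reed-Solomon codes real recovery tools use, but not any tool's actual format): with m check values, you can solve a small linear system for up to m known-missing data values.

```python
# Toy erasure code: check value j is sum_i (i+1)**j * d_i mod P.
# m independent checks let us solve for m missing data values.
P = 2**31 - 1  # a prime modulus (toy choice, not a real tool's field)

def make_checks(data, n_checks):
    return [sum(pow(i + 1, j, P) * d for i, d in enumerate(data)) % P
            for j in range(n_checks)]

def recover(known, missing_idx, checks):
    """known: dict index -> surviving value; solve for the missing ones."""
    m = len(missing_idx)
    # Right-hand side: each check minus the contribution of surviving values.
    rhs = [(checks[j] - sum(pow(i + 1, j, P) * v for i, v in known.items())) % P
           for j in range(m)]
    # Coefficient matrix A[j][k] = (missing_idx[k]+1)**j, then
    # Gauss-Jordan elimination mod P (inverses via Fermat's little theorem).
    A = [[pow(idx + 1, j, P) for idx in missing_idx] for j in range(m)]
    for col in range(m):
        piv = next(r for r in range(col, m) if A[r][col] % P)
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        inv = pow(A[col][col], P - 2, P)
        A[col] = [a * inv % P for a in A[col]]
        rhs[col] = rhs[col] * inv % P
        for r in range(m):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [(a - f * b) % P for a, b in zip(A[r], A[col])]
                rhs[r] = (rhs[r] - f * rhs[col]) % P
    return dict(zip(missing_idx, rhs))

data = [11, 22, 33, 44, 55]
checks = make_checks(data, 3)                        # 3 check values
known = {i: v for i, v in enumerate(data) if i not in (1, 3, 4)}
assert recover(known, [1, 3, 4], checks) == {1: 22, 3: 44, 4: 55}
```

This only works for erasures (missing pieces whose positions are known), which is exactly the split-file situation: you know which of the 50 files you don't have.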
 
Yeah, there's got to be something more there than just a parity check. A single parity block can detect an error, and can rebuild at most one piece whose position is known (that's essentially RAID 5), but it can't recreate three missing files on its own.
 