What actually happens on the physical drive surface is subject to many, many variables: file system type, available drive space, current level of fragmentation, bad drive blocks, cluster size, etc.
The notion that copying or moving a file compromises the data in it can quickly be tested:
1. Record some audio data in Sonar.
2. Copy the file to another location.
3. Move the new file as many times as you like.
4. Open a command window and run the utility fc.exe with the /? switch to display its help. The /b switch performs a full binary comparison of the two files (e.g. fc /b original.wav copy.wav).
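The same test can be scripted. This is a minimal sketch in Python standing in for fc /b; the file names are hypothetical, and random bytes stand in for the recorded audio:

```python
import filecmp
import os
import shutil
import tempfile

workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "take1.wav")

# Stand-in for the recorded audio: 64 KB of arbitrary bytes.
with open(original, "wb") as f:
    f.write(os.urandom(1 << 16))

# Copy the file, then "move" the copy a few times.
copy_path = shutil.copy(original, os.path.join(workdir, "take1_copy.wav"))
for i in range(3):
    new_path = os.path.join(workdir, f"take1_moved_{i}.wav")
    shutil.move(copy_path, new_path)
    copy_path = new_path

# Byte-for-byte comparison, like fc /b. shallow=False forces
# filecmp to compare contents rather than just size and timestamps.
identical = filecmp.cmp(original, copy_path, shallow=False)
print(identical)  # True: copying and moving did not alter a single bit
```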
The OS file system has some built-in mechanisms to control and eliminate data errors. As SteveD points out, it's all just ones and zeroes to the PC (and the OS). In terms of raw storage, there is no difference between your audio and, say, an MS Word document. The PC knows what to do with a particular file only because of higher levels of the OS. Associations, as they are called, are not fundamental to writing data on a disk; they simply tell the OS what to do with a particular file based on its file name. The file system relies on a database (or set of databases) containing "pointers" that refer the OS to the location of data based upon file name, folder name, etc.
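The "it's all ones and zeroes" point can be demonstrated directly. In this sketch (file names are illustrative), renaming a .wav to a .doc changes only the file system's name record; a hash of the contents proves the stored bits are untouched:

```python
import hashlib
import os
import tempfile

workdir = tempfile.mkdtemp()
wav_name = os.path.join(workdir, "take1.wav")

payload = os.urandom(4096)  # stand-in for recorded audio data
with open(wav_name, "wb") as f:
    f.write(payload)

before = hashlib.sha256(open(wav_name, "rb").read()).hexdigest()

# "Re-associating" the data with Word by renaming it updates only the
# file system's name-to-location record, not one byte of the data.
doc_name = os.path.join(workdir, "take1.doc")
os.rename(wav_name, doc_name)

after = hashlib.sha256(open(doc_name, "rb").read()).hexdigest()
print(before == after)  # True: same bits, different association
```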
An application program can manage this data in a couple of ways:
1. Use the standard API to interact with the file system and let it handle the manipulation of the bits on the disk (usually the most reliable method).
2. Use built-in code to write data directly to the disk, using the absolute sector value to access the hardware.
The absolute sector method was frequently used back in the 16-bit days as a sort of shortcut for placing application-specific binary files on disks. The application code could then order a read of sector XXX rather than asking the OS for the data. There is a speed advantage to this, but the advent of the Hardware Abstraction Layer in Windows NT (and its descendants, 2000 and XP) makes direct interaction with hardware substantially more difficult. Also, the Windows APIs for FAT32 and NTFS are readily available to developers, so there is no need to "reinvent the wheel".
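Method 1 is a conceptual sketch away. Here Python's standard file calls stand in for the Windows file APIs; the point is that the application only names the file and supplies the bytes, while cluster allocation, bad-block remapping, and the actual sector layout are all handled below the API:

```python
import os
import tempfile

# Method 1: let the OS file system place the bits on the disk.
fd, path = tempfile.mkstemp(suffix=".dat")
data = b"application-specific binary payload"
os.write(fd, data)
os.fsync(fd)   # ask the OS to flush its buffers to the device
os.close(fd)

# Reading back goes through the same file-system pointers; the
# application never knows (or cares) which sectors were used.
readback = open(path, "rb").read()
print(readback == data)  # True: the API round-trips the bytes exactly
```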
All of this is to say that you are using best practices when you:
1. Partition your drive appropriately for its use.
2. Perform regular housekeeping and defragmentation.
3. Don't try to "micro-manage" the data. There's not a lot you can do that has a specific and direct impact at the drive surface.
< Message edited by Jimtoonz -- 8/21/2004 4:33:48 PM >