If I had a hammer

Since I’m a bit-player rather than a bit-worker, I generally stick to toy-sized problems. However, in recent weeks I’ve been fooling around with multimegabyte elevation maps, and I’ve had trouble scaling up. What I’ve found most challenging is not writing programs to digest lots of megabytes; instead, it’s the trivial-seeming data-preparation tasks that have given me fits.

At one point I had five plain text files of about 30 megabytes each that I needed to edit in minor ways (e.g., removing headers) and then concatenate. What is the right tool for that chore? I tried opening the files in various text editors, but the programs weren’t up to the job. Some quit when I attempted to load a big file. Others opened the file but then quit when I tried to scroll through the text. Some programs didn’t quit, but they became so lethargic and unresponsive that eventually I quit.
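For the record, one hammer that never minds the size of the nail is a short script that streams the text rather than loading it whole. Here is a minimal sketch in Python, assuming the header in each file occupies a fixed number of lines at the top; the six-line count and the file names in the usage comment are invented for illustration:

    import sys

    def concatenate(input_paths, output_path, header_lines=6):
        """Stream each input file into one output file, skipping the
        first `header_lines` lines of each. Nothing is held in memory
        beyond a single line, so file size is no obstacle."""
        with open(output_path, "w") as out:
            for path in input_paths:
                with open(path) as src:
                    for lineno, line in enumerate(src):
                        if lineno >= header_lines:  # past the header block
                            out.write(line)

    if __name__ == "__main__":
        # usage: python concat.py merged.txt tile1.txt tile2.txt ...
        concatenate(sys.argv[2:], sys.argv[1])

The same streaming principle is why classic Unix tools such as tail, sed, and cat handle files of this size without complaint; the editors that choked were presumably trying to build much richer in-memory structures than the raw bytes.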

Note that I’m talking about big files, not huge ones. At most they run to a few hundred megabytes, a volume that ought to fit in memory on a machine with a few gigabytes of RAM.

Surely this is a common problem. Is there some obvious answer that everyone but me has always known?

Eventually I did manage to finish what needed doing. I discovered that a very modest Macintosh hex editor called 0xED—meant for editing binary files more than text files—would do the trick instantly. 0xED opened the largest files, let me scroll through and make changes, and saved the new version back to disk—all without fuss. But I still have the feeling that I’m pounding nails with a monkey wrench, and I’d like to know if there’s a hammer designed just for this task.
