In the last week I have been working on two projects that seem unrelated, but are actually quite similar and, in my opinion, may point to a general deficiency in our existing software solutions.

The first project was writing a synchronization script for TiddlyWiki, a self-contained wiki implementation. This was (after some false starts in Visual Basic) fairly easily accomplished in JavaScript within the webpage itself. The code is fairly simple and has limitations appropriate to anything of that minute size.

The second project was to create a tracking spreadsheet from a test plan stored in a database. There was already a utility to export the test plan from the database as an .xml file. From there, I created a Perl script that read in the .xml file (using the XML::Simple module), traversed the data, and created an Excel spreadsheet (using the Spreadsheet::WriteExcel module).

Both projects were basically reformatting existing data into different formats, enabling different programs to access the same data to meet different (or multiple) customer needs. Frequently this is more efficient than trying to meet all the usage models with a single solution. But it is also not within "Joe User's" abilities to go out and hack together 200-400 lines of JavaScript or Perl. If it were, our computing environment would certainly look different than it does, and I suspect it would be much easier to get those "glue scripts" working, thanks to more conscious API design and certainly better writing and debugging tools.

I suspect that it's not in software companies' best interests to let data created in their application be easily migrated to other, potentially superior apps. They wish to grow the capabilities of their app to continue to generate revenue. Hence the observation that all applications grow until they can read email (or surf the web, or both). They are economically disincentivized from letting data out of their app's format, a format that is usually optimized for that application.