Back in 2006, Dutch weblog Sargasso started following the activity of about 260 Dutch blogs that were active at the time, mainly by looking at the frequency of new postings.
While browsing ArchiveTeam’s File Formats Wiki earlier this week, I came across some entries I created there on Quattro Pro spreadsheets two years ago. At the time I had also contributed some old Quattro Pro for DOS spreadsheets (here and here) from my personal archives to the OPF format corpus. Seeing those files again, I decided to spend an afternoon trying to access them using modern-day software. This turned out to be more challenging than expected. It even made me wonder whether, at long last, I had finally run into a case of the much-discussed (but rarely observed) phenomenon of format obsolescence. Yes, big words indeed, and if anyone would like to prove me wrong, the comments section below is your friend!
Earlier this week I had a discussion with some colleagues about the archiving of mobile phone and tablet apps (iPhone/Android), and, equally important, about ways to provide long-term access to them. The immediate incentive for this was an announcement by a Dutch publisher, who recently published a children’s book that is accompanied by its own app. Also, there are already several examples of e-books that are published exclusively as mobile apps. So, even though we’re not receiving any apps in our collections yet, we’ll have to address this at some point, and it’s useful to have an initial idea of the challenges that may lie ahead.
Some time ago Will Palmer, Peter May and Peter Cliff of the British Library published a really interesting paper that investigated three different JPEG 2000 codecs, and their effects on image quality in response to lossy compression. Most remarkably, their analysis revealed differences not only in the way these codecs encode (compress) an image, but also in the decoding phase. In other words: reading the same lossy JP2 produced different results depending on which implementation was used to decode it.
A limitation of the paper’s methodology is that it obscures the individual effects of the encoding and decoding components, since both are lumped together in the analysis. Thus, it’s not clear how much of the observed degradation in image quality is caused by the compression, and how much by the decoding. This made me wonder how similar the decoding results of different codecs really are.
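One straightforward way to isolate the decoding stage is to decode the same lossy JP2 with two different implementations (for example, to uncompressed TIFF), load both results, and compare them pixel by pixel. As a rough sketch of that comparison step, here is a small Python function using NumPy; the metric choices (peak absolute error and PSNR) and the function name are my own assumptions, not something from the paper:

```python
import numpy as np

def compare_decodes(img_a, img_b):
    """Compare two decoded versions of the same source image.

    Both inputs are expected to be uint8 NumPy arrays of identical
    shape (e.g. obtained by loading the decoded TIFFs with Pillow
    and calling np.asarray on them).

    Returns (peak_abs_error, psnr):
      - peak_abs_error: largest per-pixel absolute difference
      - psnr: peak signal-to-noise ratio in dB (inf if identical)
    """
    if img_a.shape != img_b.shape:
        raise ValueError("decoded images have different dimensions")
    # Work in a wider signed type so the subtraction cannot wrap around.
    diff = img_a.astype(np.int64) - img_b.astype(np.int64)
    peak_abs_error = int(np.max(np.abs(diff)))
    mse = float(np.mean(diff.astype(np.float64) ** 2))
    psnr = float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)
    return peak_abs_error, psnr
```

If two decoders were fully equivalent, this would return `(0, inf)` for every image; any nonzero peak error would confirm that the decode phase itself contributes to the differences observed in the paper.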
Today the British Digital Preservation Coalition announced the finalists in the running for the Digital Preservation Awards 2014. This prize was established in 2004 to draw attention to initiatives that make an important contribution to keeping digital heritage accessible.