Blog

Unless otherwise stated, content is shared under CC-BY-NC Licence

Freeze Frame: Preservation Partnerships

Sara Day Thomson

Last updated on 13 December 2016

Partnership is a critical success factor for long-term access to data from small or short-lived projects. This depends on a thoughtful dialogue between the project team and their preservation partner. Thorough documentation will be required.

In this case note we examine the relationship between the relatively short-lived Freeze Frame project at the Scott Polar Research Institute and the institutional repository which offered to provide long-term preservation services to ensure ongoing access at the end of the project. This study shows that small organisations don't necessarily need to establish a sophisticated preservation infrastructure when they embark on digitisation, and that partnerships need to be managed but can bring unexpected benefits to both parties.

iPRES 2016 Blog - Digital Art Preservation

Sharon McMeekin

Last updated on 12 October 2018

Following on from Sabine Himmelsbach’s excellent introduction to digital art conservation in the keynote, the theme was continued in the morning’s first session.

Towards a Risk Model for Emulation-based Preservation Strategies: A Case Study from the Software-based Art Domain

The first session was presented by Klaus Rechert from the University of Freiburg in Germany.

The British Library has worked with Freiburg and the Emulation as a Service product, so it was interesting to have the opportunity to hear more about the workings and developments of emulation from their point of view. After a brief history of emulation, Klaus confirmed that despite great progress, emulation and virtualisation aren't quite there yet.

Of the major issues to resolve, scalability is now mostly solved, but sustainability and long-term plans are not yet in place.

He went on to discuss a case study of software-based artworks, the goal of the project being to highlight the preservation risks of the emulation strategy. The computer system is the part that changes, and that is where emulation comes in.

External dependencies include artefact description & configuration, software environment & configuration, and hardware environment. The key preservation risk is when the hardware (or equivalent) becomes obsolete, which is where the emulation strategy is focussed. Acquisition and preparation is an analysis of what is available: you need to determine the environment and its factors, and if you don't have the environment, build one!

It is also important to identify dependencies, and whilst the dependent software may not change, the licensing does!
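The dependency analysis described above can be sketched as a small data model. The class and field names and the example dependencies below are illustrative assumptions for the purpose of the sketch, not the Freiburg risk model itself:

```python
# Illustrative sketch: an emulation-based preservation strategy records an
# artwork's external dependencies (software, hardware, configuration) and
# flags the ones threatened by obsolescence or licensing change.

from dataclasses import dataclass, field

@dataclass
class Dependency:
    name: str
    kind: str                      # "software", "hardware" or "configuration"
    obsolete: bool = False         # hardware (or equivalent) no longer available?
    licence_changed: bool = False  # software unchanged, but licensing has moved on

@dataclass
class ArtworkEnvironment:
    artwork: str
    dependencies: list = field(default_factory=list)

    def preservation_risks(self):
        """Names of dependencies that threaten the emulation strategy."""
        return [d.name for d in self.dependencies
                if d.obsolete or d.licence_changed]

# Hypothetical software-based artwork and its environment.
env = ArtworkEnvironment("example software-based artwork", [
    Dependency("Windows 98", "software"),
    Dependency("Voodoo2 graphics card", "hardware", obsolete=True),
    Dependency("QuickTime codec", "software", licence_changed=True),
])
print(env.preservation_risks())  # ['Voodoo2 graphics card', 'QuickTime codec']
```

A real risk model would of course weight and date these risks rather than treat them as booleans; the sketch only shows where the dependency inventory fits.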

[Image: what the setup looks like]

iPRES 2016 Blog - Panel: Software Sustainability and Preservation

Paul Young

Last updated on 11 September 2018

Paul Young has been Digital Archivist at the National Archives for just over a year, dealing with the ingest of born-digital records and undertaking file format research for PRONOM.

Paul attended iPRES 2016 with support from the DPC's Leadership Programme. This blog is part of a series produced by scholarship recipients who attended iPRES 2016.

Panel Discussion: Software Sustainability and Preservation: Implications for the Long-Term Access to Digital Heritage

Fear of the executables: who is going to preserve software in the UK?

Paul Wheatley

Last updated on 30 January 2017

I was reminded this week about the issue of software preservation from a couple of different quarters. First by a slightly random Twitter conversation about reading lists, and secondly by the latest blog post from David Rosenthal. The former took me back to one of the first pieces of digital preservation literature I ever read. It was originally recommended to me by former colleague, friend and mentor, David Holdsworth. It helped me to really understand, for the first time, what the challenges of preserving digital stuff were all about. It's a short piece in the Computer Conservation Society bulletin called "The Problems of Software Conservation" by Doron Swade. It delves into what it means to preserve something interactive, where the function is (largely) more important than the physical form. Looking back, what strikes me about this writing is the date of publication: 1993. Despite many advances in digital preservation, so much so that someone touting the existence of a digital dark age provokes a backlash, we still haven't nailed the software preservation problem 22 years later.

Social Media for Good: the Series

Sara Day Thomson

Last updated on 27 January 2017

This year, DPC's Research and Practice team has been working on two studies commissioned by the UK Data Service as part of their Big Data Network Support. Both Preserving Social Media and Preserving Transactional Data will address the issues facing long-term access to this big, fast-moving data and will be published as Technology Watch reports. As part of Preserving Social Media, this series of posts examines some of the points of tension in the efforts of research and collecting institutions to preserve this valuable record of life in the 21st century.

I'm Sara Day Thomson, Project Officer for the DPC, specialising in the pursuit of new ideas in digital preservation. 

If you want to get involved, follow me on Twitter @sdaythomson and the DPC account @DPC_chatter to get the scoop on upcoming DPC events and activities!

Social Media for Good: the Series, Episode 2

Sara Day Thomson

Last updated on 27 January 2017

This year, DPC's Research and Practice team has been working on two studies commissioned by the UK Data Service as part of their Big Data Network Support. Both Preserving Social Media and Preserving Transactional Data will address the issues facing long-term access to this big, fast-moving data and will be published as Technology Watch reports. As part of Preserving Social Media, this series of posts examines some of the points of tension in the efforts of research and collecting institutions to preserve this valuable record of life in the 21st century.

Cloudy Culture: Preserving digital culture in the cloud

Lee Hibberd

Last updated on 27 January 2017

By now you’ll have heard of The Cloud. The big amorphous space out there that is the answer to anything digital. You want more storage? You need the cloud. You want a back-up copy of all of your treasured photos? You need the cloud. You want to undertake large-scale, high-performance number crunching? You guessed it…you need the cloud. So it’s no surprise that the cloud is featuring more and more in the cultural heritage sector too. Tate Gallery, the Parliamentary Archives and the Bodleian Library have all dipped their toes, or their heads, into cloud technology. The National Library of Scotland has also been thinking about the role of the cloud, which is essentially a service that stores and manages digital information, as part of its continuing mission to preserve the nation’s digital culture. Is the cloud the answer to all our digital problems? And if it is, surely there’s a price tag attached. To find out, the National Library of Scotland is about to embark on a journey of discovery with the Edinburgh Parallel Computing Centre, National Galleries of Scotland and the Digital Preservation Coalition. It doesn’t matter if you haven’t heard of these organisations; just be assured that we are all interested in preserving digital culture for current and future generations. Our journey starts at a project called EUDAT…

Screening the Future 2013

Angela Dappert

Last updated on 8 March 2023

London, 7-8 May 2013

About the event

From 7 to 8 May 2013 the 3rd annual conference “Screening the Future”, organised by PrestoCentre and focusing on the latest trends in audio-visual preservation, took place at the Tate Modern in London. It covered topics related to digitisation and digital preservation in the creative and cultural industries, including broadcast, post-production, motion picture, sound and music recording, and the visual and performing arts. The programme can be found at https://prestocentre.org/calendar/screening-future-2013-conference. This conference focused on strategies for preserving audio-visual materials for stakeholders from different backgrounds, also discussing some technological issues. It is worth noting that for audio-visual materials the term “digital preservation” is used more broadly than in other sectors of the digital preservation community: it includes digitisation, which leads to the production of digital materials, and mixes curation and digital preservation issues more than elsewhere. Angela Dappert represented the DPC; DPC members from the BBC, the British Library, Jisc, King’s College London, the National Archives and the Tate were present in their own right. These notes are intended to provide an informal briefing for members of the DPC not able to attend the event. For an authoritative and comprehensive report, readers are encouraged to contact the organisers of the event and the speakers directly.

Presentations and discussion

Mark Schubin from the New York Metropolitan Opera media department narrated the history of the recording of opera over time, showing the evolution of technology, but also the evolution of user response and user requests. User response is acculturated, and new technologies have a different effect on users than accustomed technologies do. Early consumers judged the experience of consuming a recording to be “just like being there”, in spite of the fact that recordings had distortions and that essential parts of the whole experience were missing: librettos provide only the text, audio recordings provide a changed experience of the actual music, whereas the stage settings, the presence of the audience, and the acting and actors were wholly missing. For example, early listeners of recorded sound, in the dark, could not tell the difference between a recording and a live production, since they had not developed a notion of the qualities of recorded sound. Users’ requests with respect to technical features and preserved features change over time. For example, requirements for colour (black and white, 3-colour-primaries recording systems, full colouration), resolution and high definition have changed over time. Some of the recommended settings are based on visual acuity tests (Snellen), but young people actually have better than 20/20 vision and benefit from richer settings than the theoretical values suggest. The slides for the presentation are available at bit.ly/stf13schubin.

Neil Beagrie, Charles Beagrie Ltd., discussed how multi-media content is permeating previously existing organisational or regional boundaries. He illustrated this by showing the great diversity of DPC members’ sectors and the fact that information provided by the DPC, such as the Technology Watch reports, is being consumed widely by audiences that go far beyond the traditional ones. These changes require organisations to respond in new ways: by forming alliances and partnerships, by creating shared services, by outsourcing (Jisc, MIMAS and EDINA for UK HE/FE), by offering cloud services for preservation, and by performing mergers for storage, skills and cost savings (Canada, the Netherlands, New Zealand). There were questions from the audience on how to vet suppliers of services if your organisation has an obligation to ensure content preservation. There were also questions about the extent to which alliances are actually creating technical solutions. It was remarked that organisations’ remits change over time, but also vary from country to country, and are not necessarily clear to the outside public; it is not clear, for example, whether the BBC’s role encompasses archiving.

Michael Moon from GISTICS presented on “Beyond cost-based preservation strategies”. When preserving assets for a future world in which things are many times faster and cheaper than now, there is a likelihood of missing emergent opportunities for lack of observation. Michael asserted that our model of planning for the future has become obsolete, since we cannot even imagine the future. Instead, he proposes stepping into pure imagination of the future and describing how we got there, using the “red dot” procedure described in his book. Michael made three underlying observations:

1) A business case takes place in the context of an organisation. It is an investment analysis to justify a decision. The conscious business case tries to achieve ROI. The motivation of the corporate political game (of not looking bad) is somewhat less conscious but has greater impact. The business model of how to make money is even less conscious but has even greater impact; and, finally, the very unconscious cultural norms of criteria, beliefs and values have an even bigger impact on the purchase decision than the previous elements. When you want to derive a business plan, it is important to talk to the culturally experienced person who knows the organisation and its history.

2) Arguments vary in how much they convince management. Starting with the most strategic and powerful and in decreasing order they are share prices; balance sheet improvement; increased revenue; cost reduction; process improvement; and finally intangible opportunity. Those latter tactical arguments are the weakest. One must target one’s value proposition through the top-level, strategic arguments, rather than through lower-level, tactical arguments.

3) Brand loyalty has a similar model to the first point: rationalisation of decisions is a conscious approach but has lower impact on purchase decisions. Fads and then trends are less conscious, but have even higher impact. The unconscious but high-impact self-identity that is fuelled by brands as “the tribal mind that lives in the limbic system” is the most powerful motivator of them all. Brands invite you into a desirable tribe. Digital preservation activities should strive to create brands.

Based on these three arguments, Michael suggests an approach of transmediation. Transmediation focuses on output: for example, how do you take something from a film and transmediate it into a 3D entertainment object? In the process of transmediation, metadata is added to the dark, undescribed initial object, including provenance and storage. When you add policy-managed routes and storage governance, it becomes a collaborative object. A mastered object is vetted, packaged for provision, linked to CRM, DRM and ERP, and finally made into a digital cultural asset. Metadata are crucial in order to enable transmediation; in this view, content is just attached to the metadata, which is the primary object of our creation and curation. Neuro-computational imaging provides real-time feedback on how we experience the world. When we transmediate, we can create the raw materials that “fuel the dream factories”. A lot of audio-visual preservation has been sold as a business case, without much success. But cultural assets play a pivotal role in creating place brands and public diplomacy, which drive exports, investment, tourism and hospitality. Audio-visual preservation is about how our collections contribute to cultural heritage and self-identity (branding). We need to aim for transmediation into objects that we cannot even imagine yet. Part of this is to bring the essence of humanity into the hyper-reality world.

David Giaretta, STFC, spoke about “Psychology and Digital Preservation”. Digital preservation is motivated by fear of loss of items that are special, personally or societally. This is partially hoarding behaviour, but the material can also be very useful (“data is the new gold”, though unlike gold neither rare nor non-reactive). Maslow’s hierarchy of needs, in which physiological needs are the basis, followed by safety, love and belonging, esteem and self-actualisation, had self-transcendence added at a later point. This is where digital preservation fits: future generations cannot vote or pay taxes at this point and would have no claim to representation without self-transcendence. When things change we need to, amongst other things, know what has changed, identify the implications, hand over to another repository, and ensure that the material remains usable. Digital preservation requires reliance on others (trust). Trust applies when we do not have certainty, can be altered by hormones, and is affected by the presence of technology. Our understanding of risk is not sufficiently based on an understanding of likelihoods, and our perception varies from reality: experts are particularly prone to misperception; we detect patterns where there are none; we are overly optimistic; and we don’t react to non-immediate risks. One question is whom we trust. We have an innate sense of fairness and reciprocate others’ behaviour. When we need to rely on others’ judgement, the factors that matter are authenticity, curation by others, audits and the certification of auditors. Over time digital materials become unfamiliar to societies, so the capture of tacit knowledge is important. An interesting audience question was whether one might apply psychometric tests to ensure that people given responsibility for valuable information have the personality and attitude to care about things that go beyond their own time and employment. If not, how can one instil the right values during training?

Richard Wright, one of the driving forces behind the PrestoCentre, spoke about understanding why different communities need different digital preservation approaches. The digital problem is rooted in the fact that digital data storage has enormous information density and very short data-carrier stability. As storage capacity increases, more information is being produced. Network services outside our control are the latest response to increasing storage availability. With them, storage is a service, a file is a performance, and media is stored without media concerns by utilising managed services. One possible traditional taxonomy of communication technology is a matrix of media (real-time and non-real-time) against one-to-one or one-to-many scenarios. Digital technology breaks this matrix, but breaking out of the box also provides new opportunities. Digital objects require different institutional responses. For example, we now use different access approaches via streaming, without scheduled programming, in non-real-time. This is a process of publishing rather than broadcasting: the archive becomes the centre of a TV organisation, and the rest of the organisation just produces for the archive. In the remainder of the talk, Richard analysed the different organisational responses of different communities. Video and post-production communities are a service industry for broadcasting, cinema and advertising. Capital investment is problematic if production is run as a project and does not include it. They need to respond to the technology change of having to hold files and provide mass storage; files now become assets. Film collections and film makers are at both ends of the business life-cycle. They depend on subsidies. Their technology change is also manifested through the disappearance of film. Richard thinks that all film will have to be digitised, requiring the purchase of more and more storage.
For sound and music archives, unlike for video, there are very clear audio standards. The technological change is a great opportunity for independents. Sound and music archives’ mission is to support research: they collect published items and research items, and also make their own recordings as part of their collections. Their holdings will have to be digitised. Access for audio is more difficult since you cannot subtitle it, and metadata to deal with this is an unsolved problem. Digital preservation of personal collections is addressed by, amongst others, the Library of Congress, and intersects with genealogy as a stakeholder. Richard does not see a digital black hole, but opportunities.

Kara Van Malssen, from AV Preservation Solutions, spoke about the disaster response to hurricane Sandy’s flooding at Eyebeam, a non-profit art and technology centre dedicated to exposing broad and diverse audiences to new technologies and media arts, while simultaneously establishing and demonstrating new media as a significant genre of cultural production. Kara illustrated and described the salvage of digital data carriers from salt water by organising volunteers through social media. In the absence of power, they had to establish workflows and non-destructive procedures for cleaning a mix of 1,500 items across all types of data carrier. This included such concerns as ensuring that containers and media were kept together and that records of the workflow steps were kept (in a shareable fashion, on Google spreadsheets). A positive side-effect was that a catalogue was created to manage the materials, which introduced archival processes to the organisation. A paper was written about the recovery, detailing the cleaning procedures, supplies, supervision, and working with volunteers; it can be found at http://bit.ly/11F3vuO. This was accompanied by a video by Jonathan Minard: http://eyebeam.org/press/media/videos/recovering-eyebeams-archive-as-told-by-resident-jonathan-minard.

In the second half of the presentation Kara addressed the issue of using return on investment (ROI) to motivate investment in digital preservation. Kara’s team believes that the ROI argument is not effective and instead suggests a COI (cost of inaction) metric. They have developed a Google Docs spreadsheet on avpreserve.com to calculate the COI from collection size, investment in the media to date, annual cost, how long you have held the content, and so on, giving a rough investment to date. This offsets the digital preservation cost against the ongoing investment saved. Inspired by the book Files that Last (Gary McGath, self-published, April 2013), Kara stated that only instantaneous disasters provoke an immediate (heroic) response, but that slow deterioration and obsolescence have the same effect and do not elicit the same visceral response. In disaster recovery it is important not to get hung up on detail, and the COI calculator has the same goal. There was some criticism that the COI only contains the cost of digitisation and not the cost of digital preservation.
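The published calculator is a spreadsheet, but the arithmetic it was described as performing can be sketched in a few lines of Python. The function names, formula and figures below are illustrative assumptions for the purpose of the example, not the actual avpreserve tool:

```python
# Hypothetical cost-of-inaction (COI) sketch, loosely following the inputs
# described above: investment in the media to date, annual cost, years held,
# and collection size.

def cost_of_inaction(invested_to_date, annual_cost, years_held):
    """Rough investment sunk into a collection so far: if the collection
    is lost through inaction, this is (at minimum) the value written off."""
    return invested_to_date + annual_cost * years_held

def coi_per_item(items, invested_to_date, annual_cost, years_held):
    """Spread the sunk investment across the items in the collection."""
    if items == 0:
        return 0.0
    return cost_of_inaction(invested_to_date, annual_cost, years_held) / items

# Illustrative figures only: 1,500 carriers, £50,000 invested to date,
# £4,000 per year of ongoing cost, held for 10 years.
print(cost_of_inaction(50_000, 4_000, 10))     # 90000
print(coi_per_item(1_500, 50_000, 4_000, 10))  # 60.0
```

The point of the metric is the comparison: a preservation programme costing less than this sunk value is cheaper than writing the collection off.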

Panels:

The afternoons were taken up by two panel discussions each, on practical aspects of AV preservation. The following descriptions are taken from the conference website https://prestocentre.org/calendar/screening-future-2013-conference:

  • Preserving Objects, Telling Stories This session concentrates on the transmediation of cultural, commercial, and personal narratives and its impact on multimedia preservation. The session will make a case for preserving the potential of transmediated narratives, i.e. exploiting the creative potential of any one medium and media format.
  • Making it Now, Keeping it Forever Media production, from broadcasting and advertising to computer games to feature films, is a high-pressure environment. Decisions and processing during production determine the quality -- or possibility -- of the preservation and reuse of content. How can production processes be made 'preservation ready'?

  • Understanding Differences, Discovering Similarities This session draws upon case studies to examine the business case for the development of in-house solutions and asks when and what to outsource. The session looks at different types of scenario, considers the potential for greater collaboration, and asks whether the trend is away from the development of bespoke solutions. It addresses the different preservation solutions being developed for audio-visual material and asks what the impact of scale and mission is on these key decisions. How do we manage cost? What are the appropriate solutions for different environments?

  • Developing Solutions, Building Value This session identifies the changing business models which are affecting product development for digital audio-visual preservation and asks how this is impacting archives' IT infrastructure and research and development. The session will explore which AV sub-sectors are attracted to open-source solutions and why, and what types of services or models provide viable value chains for preservation vendors.

Vendor presentations:

In addition, there were a number of very informative small- and large-group vendor presentations from Memnon Archiving Services, Cambridge Imaging Systems, Oracle, and many others.

Conference Report: Curating Research: e-Merging New Roles and Responsibilities in the European Landscape

William Kilbride

Last updated on 30 September 2016

17 April 2009, The Hague, Koninklijke Bibliotheek, The Netherlands

1. Summary of issues relevant for DPC members 

  • Training is popular but what sort of training will be most effective: what will drive down costs and support our work best?
  • Considerations of scale: what is the right size solution to our digital preservation challenges? Do we want lots of small DP facilities or a small number of large ones?
  • How do we collaborate without undermining institutions?
  • There would appear to be a lot of policy development which is an important change from a decade ago: but how do we assess the value of these emerging policies and how do we know if they are being applied?
  • There is still a policy gap. There are some high-level aspirations in the UNESCO Charter and some very detailed guides, but a gap in between. What would be our golden rules for creating digital data?
Report on IS & T Archiving 2005 Conference, Washington, 26 - 29 April 2005

Sarah Middleton

Last updated on 30 September 2016

By Hugh Campbell, PRONI

1. I attended the Imaging Science & Technology (IS&T) Archiving 2005 conference at the Washington Hilton. This is my report on the conference.

2. Washington is quite a long way away: home to hotel took about 20 hours, and hotel to home about 18. This needs to be borne in mind when planning travel to such a conference and the return to work; the body needs time to recover.

3. The conference itself started on Tuesday, 26 April with a number of tutorials. I attended the Long-Term Archiving of Digital Images tutorial – see attached summary. The conference proper ran from Wednesday 27 April to Friday 29 April, kicking off at 0830 each morning and finishing at 1700 on Wednesday and Thursday and 1500 on Friday. Wednesday featured a 40-minute keynote address and fifteen 20-minute sessions; Thursday featured a 40-minute keynote address, ten 20-minute sessions and approximately twenty 90-second poster previews, followed by the opportunity to visit the poster presentations. Friday featured a 40-minute keynote address and ten 20-minute sessions. I felt that there were too many sessions, cramming too much into a short space of time.
