Digital images or video with no analogue equivalent, where the only copy is held online by a social media platform or cloud image hosting service.

Group: Social Media
Trend in 2021:
Trend in 2022:
Consensus Decision
Added to List: 2018
Previous classification: Endangered
Imminence of Action: Action is recommended within three years; detailed assessment within one year.

Significance of Loss: The loss of tools, data or services within this group would impact on people and sectors around the world.

Effort to Preserve: It would require a major effort to address losses in this group, possibly requiring the development of new preservation tools or techniques.

Examples: Flickr; Vimeo; YouTube; Instagram; Periscope; Dropbox; Facebook; Twitter.
‘Critically Endangered’ in the Presence of Aggravating Conditions: Lack of preservation capacity in provider; lack of explicit preservation commitment or incentive from provider to preserve; lack of storage replication by provider; dependence on proprietary products or formats; poor management of data protection; inaccessibility to automated web crawlers; political or commercial interference; lack of offline equivalent; over-abundance; poorly managed intellectual property rights; lossy compression applied in upload scripts.
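One of the aggravating conditions above, inaccessibility to automated web crawlers, can be partially tested before a harvest is attempted. The Python sketch below checks a platform's robots.txt using the standard-library robotparser module; the crawler identity and URLs are illustrative assumptions, and robots.txt is only one of several possible barriers (logins, rate limits, dynamic rendering).

```python
"""Minimal sketch: check whether a platform's robots.txt permits an
archival crawler to fetch a given page. Illustrative only; the user
agent and URLs below are assumptions, not real services."""

from urllib import robotparser

# Hypothetical archival crawler identity and target page.
USER_AGENT = "ExampleArchiveBot"
TARGET_URL = "https://example.org/photos/IMG_0001"

parser = robotparser.RobotFileParser()
parser.set_url("https://example.org/robots.txt")
parser.read()  # fetch and parse the live robots.txt

if parser.can_fetch(USER_AGENT, TARGET_URL):
    print("robots.txt permits crawling this URL")
else:
    print("robots.txt disallows crawling this URL for this agent")
```

A negative result would strengthen the case for pursuing an export pathway or a direct agreement with the provider instead of web harvesting.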
‘Vulnerable’ in the Presence of Good Practice: Offline backup; lossless compression; good documentation; accessible to web harvesting; clarity of intellectual property rights which enable preservation; credible preservation commitment from service provider; export pathway.
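To make the "offline backup" and "export pathway" practices concrete, the following sketch downloads a hypothetical list of exported image URLs and records a SHA-256 fixity checksum for each in a simple manifest. The URL list, directory name, and manifest layout are illustrative assumptions rather than anything specified by this entry; where a platform offers its own takeout or export tool, that would normally be the starting point.

```python
"""Minimal sketch of an offline backup with basic fixity information for
images held only in a cloud or social media service. Illustrative only:
the URLs, output directory, and manifest format are assumptions."""

import csv
import hashlib
import pathlib
import urllib.request

# Hypothetical list of direct image URLs obtained via the provider's
# export pathway or API; replace with real export output.
IMAGE_URLS = [
    "https://example.org/photos/IMG_0001.jpg",
    "https://example.org/photos/IMG_0002.jpg",
]

BACKUP_DIR = pathlib.Path("offline_backup")
BACKUP_DIR.mkdir(exist_ok=True)

with open(BACKUP_DIR / "manifest.csv", "w", newline="") as manifest:
    writer = csv.writer(manifest)
    writer.writerow(["url", "local_file", "sha256"])
    for url in IMAGE_URLS:
        local_file = BACKUP_DIR / url.rsplit("/", 1)[-1]
        # Download the original (ideally lossless) copy.
        with urllib.request.urlopen(url) as response:
            data = response.read()
        local_file.write_bytes(data)
        # Record a fixity checksum so later copies can be verified.
        writer.writerow([url, local_file.name, hashlib.sha256(data).hexdigest()])
```

Keeping the checksums alongside the files allows the offline copy to be re-verified after transfer to replicated storage.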
2021 Jury Review: This entry was added in 2018 within a wider social media group, sharing common risk profiles and challenges with other social media materials, such as dependency on global service providers whose business models can only be presumed and who tie users into asymmetrical contracts that favour the supplier. In 2019 it became a standalone entry to reflect the distinct preservation challenges of images and videos held in cloud services. The 2020 Jury noted a trend towards increased risk: the low barrier to entry of cloud services leads to their use by the agencies and individuals least able to respond to closure or loss.
Additional Comments: The vast majority of content may remain accessible for as long as the platform hosting it is popular (and has a viable business model); however, more insidious content (such as malicious misinformation or hate speech) may be deleted by content creators (potentially backed by hostile governments) to avoid prosecution or tracing. It is unclear to what extent these platform providers are compelled to provide access to servers, deleted content, or private content for evidential purposes in the course of legal or criminal investigations. The lack of transparency and of standardized international regulation of these platforms makes their content vulnerable to exploitation and malicious use by individuals, corporations, and hostile governments. Museums, libraries, and archives have begun to pay attention to this content through projects like Collecting Social Photo (CoSoPho), but no breakthroughs have been made. See: https://www.collectingsocialphoto.org/en/home

Case Studies or Examples: