Digital scholarship blog

Enabling innovative research with British Library digital collections


26 November 2020

Using British Library Cultural Heritage Data for a Digital Humanities Research Course at the Australian National University

Posted on behalf of Terhi Nurmikko-Fuller, Senior Lecturer, Centre for Digital Humanities Research, Australian National University by Mahendra Mahey, Manager of BL Labs.

The teaching philosophy and pedagogy of the Centre for Digital Humanities Research (CDHR) at the Australian National University (ANU) focus on research-fuelled, practice-led, object-orientated learning. We value collaboration, experimentation, and individual growth, rather than adhering to the standardised evaluation metrics of exams or essays. Instead, students enrolled in jointly-taught undergraduate and postgraduate courses are given a task: to innovate at the intersection of digital technologies and cultural heritage sector institutions. They are given a great degree of autonomy, and are trusted to deliver. Their aim is to create digital prototypes that open up GLAM sector material to a new audience.

HUMN2001: Digital Humanities Theories and Projects, and its postgraduate equivalent HUMN6001, are core courses for the programs delivered from the CDHR. HUMN2001 is a compulsory course for both the Minor and the Major in Digital Humanities within the Bachelor of Arts; HUMN6001 is a core, compulsory course in the Master of Digital Humanities and Public Culture. Initially the course structure was quite different: experts would be invited to guest lecture on their Digital Humanities projects, and the students were tasked with carrying out critical evaluations of digital resources of various kinds. What quickly became apparent was that without experience of digital projects, the students struggled to evaluate the projects they encountered meaningfully and thoughtfully. Many focused exclusively on the user interface; too often critical factors such as funding sources were ignored; and the evaluative context in which the students operated was greatly skewed by their experiences of tools such as Google and platforms such as Facebook.

The solution to the problem became clear: students would have to experience the process of developing digital projects themselves before they could reasonably be expected to evaluate those of others. This realisation brought on a paradigm shift in the way in which the CDHR engages with students, projects, and their cultural heritage sector collaborators.

In 2018, we reached out to colleagues at the ANU for small-scale projects for the students to complete. The chosen project was the digitisation of, and the creation of metadata records for, a collection of glass slides that form part of the Heritage in the Limelight project. The enthusiasm, diligence, and care that the students applied to working with this external dataset (external only to the course, since this was an ANU-internal project) gave us confidence to pursue collaborations outside of our own institution. In Semester 1 of 2019, Dr Katrina Grant’s course HUMN3001/6003: Digital Humanities Methods and Practices ran in collaboration with the National Museum of Australia (NMA), with a degree of success we could hardly have foreseen: the NMA granted five of the top students a one-off stipend of $1,000 each, and continued working with the students on their projects, which were then added to the NMA’s Defining Moments Digital Classroom, launched in November 2020. This collaboration was featured in a piece in the ANU Reporter, the University’s internal circular.

Encouraged by the success of Dr Grant’s course, and presented with a serendipitous opportunity to meet at the Australasian Association for Digital Humanities (aaDH) conference in 2018, where Mahendra Mahey was giving the keynote, I reached out to him to propose a similar collaboration. In Semester 2, 2019 (July to November), HUMN2001/6001 ran in collaboration with the British Library.

Our experiences of working with students and cultural heritage institutions in the earlier semester had highlighted some important heuristics. As a result, the delivery of HUMN2001/6001 in 2019 was much more structured than that of HUMN3001/6003 (which had offered the students more freedom and opportunity for independent research). Rather than focus on a theoretical framework per se, HUMN2001/6001 focused on providing transferable skills that improved the delivery and reporting of the projects, and that could be cited directly as a skills base when applying for future employment. These included project planning and time management (such as Gantt charts and Scrum as a form of agile project management), and each project was to be completed in groups.

The demographic make-up of each group had to follow three immutable rules:

  • First, each team had to be interdisciplinary, with students from more than one degree program.
  • Second, the groups had to be multilingual: not every member could share the same first language or be monolingual in the same language.
  • Third, each group had to represent more than one gender.

Although not all groups strictly implemented these rules, those that did benefited from the diversity and critical lens this richness of perspective afforded, and produced the top projects.

Three examples that best showcase the diversity (and the creative genius!) of these groups and their approach to the British Library’s collection include a virtual reality (VR) concert hall, a Choose-Your-Own-Adventure game travelling through Medieval manuscripts, and an interactive treasure hunt mobile app.

Examples of student projects

(VR)2: Virtuoso Rachmaninoff in Virtual Reality

Research Team: Angus Harden, Noppakao (Angel) Leelasorn, Mandy McLean, Jeremy Platt, and Rachel Watson

Figure 1: Angel Leelasorn testing out (VR)2
Figure 2: Snapshots documenting the construction of (VR)2

This project is a VR experience of the grand auditorium of the Bolshoi Theatre in Moscow. It has an audio accompaniment of Sergei Rachmaninoff’s Prelude in C# Minor, Op.3, No.2, the score for which forms part of the British Library’s collection. Reflective of the personal experiences of some of the group members, the project was designed to increase awareness of mental health, and throughout the experience the user can encounter notes written by Rachmaninoff during bouts of depression. The sense of isolation is achieved by the melody playing in an empty auditorium. 

The VR experience was built using Autodesk Maya and Unreal Engine 4. The music was produced using MIDI data, with each note individually entered into Logic Pro X and played through the Addictive Keys Studio Grand virtual instrument.

The project is available through a website with a disclosure, and links to various mental health helplines, accessible at: https://virtuosorachmaninoff.wixsite.com/vrsquared

Fantastic Bestiary

Research Team: Jared Auer, Victoria (Vick) Gwyn, Thomas Larkin, Mary (May) Poole, Wen (Raven) Ren, Ruixue (Rachel) Wu, Qian (Ariel) Zhang

Figure 3: Homepage of A Fantastic Bestiary

This project is a bilingual Choose-Your-Own-Adventure hypertext game that engages with the British Library’s collection of Medieval manuscripts (such as Royal MS 12 C. xix, folios 12v-13, based on the Greek Physiologus and the Etymologiae of St Isidore of Seville), first discovered through the Turning the Pages digital feature. The project workflow included design and background research, resource development, narrative writing, animation, translation, audio recording, and web development. Not only does it open up the Medieval manuscripts to the public in an engaging and innovative way through five fully developed narratives (~2,000-3,000 words each), all the content is also available in Mandarin Chinese.

The team used a plethora of different tools, including Adobe Animate, Photoshop, Illustrator and Audition, as well as Audacity. The website was developed using HTML, CSS, and JavaScript in the Microsoft Visual Studio integrated development environment.

The project is accessible at: https://thomaslarkin7.github.io/hypertextStory/

ActionBound

Research Team: Adriano Carvalho-Mora, Conor Francis Flannery, Dion Tan, Emily Swan

Figure 4: (Left) Testing the app at the Australian National Botanic Gardens; (Middle) An example of one of the tasks to complete in ActionBound; (Right) Example of a sound file from the British Library (a dingo)

This project is a mobile application, designed as a location-based authoring tool inspired by the Pokémon Go augmented reality mobile game. This scavenger hunt aims to educate players about endangered animals. Using sounds of endangered or extinct animals from the British Library’s collection, but geo-locating the app at the Australian National Botanic Gardens, this project is a perfect manifestation of truly global information sharing and enrichment.

The team used a range of available tools and technologies to build this Serious Game, or Game-With-A-Purpose. These include GPS and other geo-location (and geo-caching) features; QR codes that they created to be scanned during the hunt; and locations mapped using OpenStreetMap.

The app can be downloaded from: https://en.actionbound.com/bound/BotanicGardensExtinctionHunt

Course Assessment

Such a diverse and dynamic learning environment presented some pedagogical challenges and required a new approach to student evaluation and assessment. The obvious question here is how to fairly, objectively, and comprehensively grade such vastly different projects, especially since they differ not only in methodology and data, but also in the existing level of skills within each group. The approach I took to grading these assignments is one that I believe will have longevity and, to some extent, scalability. Indeed, I have successfully applied the same rubric in the evaluation of similarly diverse projects created for the course in 2020, when it ran in collaboration with the National Film and Sound Archive of Australia.

The assessment rubric for this course rewards students on two axes: ambition and completeness. This means that projects that were not quite completed, due to their scale or complexity, are still rewarded for their vision and for the students’ willingness to push boundaries, do new things, and take on a challenge. The grading system allows for four possible outcomes: a High Distinction (for 80% or higher), Distinction (70-79%), Credit (60-69%), and Pass (50-59%). Projects which are ambitious and completed to a significant extent land in the 80s; projects that are either ambitious but not fully developed, or relatively simple but completed, receive marks in the 70s; those that engaged very literally with the material and implemented a technologically straightforward solution (such as building a website using WordPress or Wix, or using one of the suite of tools from Northwestern University’s Knight Lab) were awarded marks in the 60s. Students were also rewarded for engaging with tools and technologies of which they had no prior knowledge. Furthermore, in week 10 of the 12-week course, we ran a Digital Humanities Expo! event, in which the students showcased their projects and received user feedback from staff and students at the ANU. Students able to factor these evaluations into their final project exegeses were also rewarded by the marking scheme.

Notably, the vast majority of the students completed the course with marks of 70 or higher (in the two top grade brackets). Undoubtedly, the unconventional nature of the course is one of its greatest assets. Engaging with a genuine cultural heritage institution acted as motivation for the students. The autonomy and trust placed in them was empowering. The freedom to pursue the projects that they felt best reflected their passions and interests, in response to a national collection of international fame, resulted, almost invariably, in the students rising to the challenge and even exceeding expectations.

This was a learning experience beyond the rubric. To succeed, students had to develop the transferable skills of project planning, time management and client interaction that would support a future employment portfolio. The most successful groups were also the most diverse groups. Combining voices from different degree programs, languages, cultures, genders, and interests helped promote internal critical evaluations throughout the design process, and helped the students engage with the materials, the projects, and each other in a more thoughtful way.

Figure 5: Two groups discussing their projects with Mahendra Mahey
Figure 6: National Museum of Australia curator Dr Lily Withycombe user-testing a digital project built using British Library data, 2019.
Figure 7: User-testing feedback! Staff and students came to see the projects and support our students in the Digital Humanities Expo in 2019.

Terhi Nurmikko-Fuller Biography

Dr. Terhi Nurmikko-Fuller

Terhi Nurmikko-Fuller is a Senior Lecturer in Digital Humanities at the Australian National University. She examines the potential of computational tools and digital technologies to support and diversify scholarship in the Humanities. Her publications cover the use of Linked Open Data with musicological information, library metadata, narrative in ancient Mesopotamian literary compositions, and the role of gamification and informal online environments in education. She has created 3D digital models of cuneiform tablets, carved boab nuts, animal skulls, and the Black Rod of the Australian Senate. She is a British Library Labs Researcher in Residence and a Fellow of the Software Sustainability Institute, UK; an eResearch South Australia (eRSA) HASS DEVL (Humanities Arts and Social Sciences Data Enhanced Virtual Laboratory) Champion; an iSchool Research Fellow at the University of Illinois at Urbana-Champaign, USA (2019-2021); a member of the Australian Government Linked Data Working Group; and, since September 2020, a member of the Territory Records Advisory Council for the Australian Capital Territory Government.

BL Labs Public Awards 2020 - REMINDER - Entries close NOON (GMT) 30 November 2020

Inspired by this work that uses the British Library's digitised collections? Have you done something innovative using the British Library's digital collections and data? Why not consider entering your work for a BL Labs Public Award 2020 and win fame, glory and even a bit of money?

This year's Public Awards 2020 are open for submission; the deadline for entry is NOON (GMT), Monday 30 November 2020.

Whilst we welcome projects on any use of our digital collections and data (especially in research, artistic, educational and community categories), we are particularly interested in entries in our public awards that have focused on anti-racist work, about the pandemic or that are using computational methods such as the use of Jupyter Notebooks.

Work will be showcased at the online BL Labs Annual Symposium between 14:00 and 17:00 on Tuesday 15 December; for more information and a booking form, please visit the BL Labs Symposium 2020 webpage.

11 November 2020

BL Labs Online Symposium 2020: Book your place for Tuesday 15-Dec-2020

Posted by Mahendra Mahey, Manager of BL Labs

The BL Labs team are pleased to announce that the eighth annual British Library Labs Symposium 2020 will be held online on Tuesday 15 December 2020, from 13:45 to 16:55* (see note below). The event is FREE, but you must book a ticket in advance to reserve your place. Last year's event was the largest we have ever held, so please don't miss out and book early; see more information here!

*Please note, that directly after the Symposium, we are organising an experimental online mingling networking session between 16:55 and 17:30!

The British Library Labs (BL Labs) Symposium is an annual event and awards ceremony showcasing innovative projects that use the British Library's digital collections and data. It provides a platform for highlighting and discussing the use of the Library’s digital collections for research, inspiration and enjoyment. The awards this year will recognise outstanding use of British Library's digital content in the categories of Research, Artistic, Educational, Community and British Library staff contributions.

This is our eighth annual symposium and you can see previous Symposia videos from 2019, 2018, 2017, 2016, 2015 and 2014, and our launch event in 2013.

Ruth Ahnert, Professor of Literary History and Digital Humanities at Queen Mary University of London and Principal Investigator on 'Living with Machines' at The Alan Turing Institute, will be giving the BL Labs Symposium 2020 keynote this year.

We are very proud to announce that this year's keynote will be delivered by Ruth Ahnert, Professor of Literary History and Digital Humanities at Queen Mary University of London, and Principal Investigator on 'Living With Machines' at The Alan Turing Institute.

Her work focuses on Tudor culture, book history, and digital humanities. She is author of The Rise of Prison Literature in the Sixteenth Century (Cambridge University Press, 2013), editor of Re-forming the Psalms in Tudor England, a special issue of Renaissance Studies (2015), and co-author of two further books: The Network Turn: Changing Perspectives in the Humanities (Cambridge University Press, 2020) and Tudor Networks of Power (forthcoming with Oxford University Press). Recent collaborative work has taken place through the AHRC-funded projects ‘Living with Machines’ and 'Networking the Archives: Assembling and analysing a meta-archive of correspondence, 1509-1714’. With Elaine Treharne she is series editor of Stanford University Press’s Text Technologies series.

Ruth's keynote is entitled: Humanists Living with Machines: reflections on collaboration and computational history during a global pandemic

You can follow Ruth on Twitter.

There will be Awards announcements throughout the event for the Research, Artistic, Community, Teaching & Learning and Staff categories, and this year we are going to ask the audience to vote for their favourite shortlisted project: a people's BL Labs Award!

There will be a final talk near the end of the conference and we will announce the speaker for that session very soon.

So don't forget to book your place for the Symposium today: we predict it will be another full house (our first one online) and we don't want you to miss out. See more detailed information here.

We look forward to seeing new faces and meeting old friends again!

For any further information, please contact [email protected]

23 October 2020

BL Labs Public Award Runner Up (Research) 2019 - Automated Labelling of People in Video Archives

People 'automatically' identified in digital TV news-related programme clips.

Guest blog post by Andrew Brown (PhD researcher), Ernesto Coto (Research Software Engineer) and Andrew Zisserman (Professor) of the Visual Geometry Group, Department of Engineering Science, University of Oxford, and BL Labs Public Award Runner-up for Research, 2019. Posted on their behalf by Mahendra Mahey, Manager of BL Labs.

In this work, we automatically identify and label (tag) people in large video archives without the need for any manual annotation or supervision. The project was carried out with the British Library on a sample of 106 videos from their “Television and radio news” archive, a large collection of news programmes from the last 10 years. This archive serves as an important and fascinating resource for researchers and the general public alike. However, the sheer scale of the data, coupled with a lack of relevant metadata, makes indexing, analysing and navigating this content an increasingly difficult task. Relying on human annotation is no longer feasible, and without an effective way to navigate these videos, this bank of knowledge is largely inaccessible.

As users, we are typically interested in human-centric queries such as:

  • “When did Jeremy Corbyn first appear in a Newsnight episode?” or
  • “Show me all of the times when Hugh Grant and Shirley Williams appeared together.”

Currently this is nigh on impossible without trawling through hundreds of hours of content. 

We posed the following research question:

Is it possible to enable automatic person-search capabilities such as this in the archive, without the need for any manual supervision or labelling?

The answer is “yes”, and the method is described next.

Video Pre-Processing

The basic unit which enables person labelling in videos is the face-track: a group of consecutive face detections within a shot that correspond to the same identity. Face-tracks are extracted from all of the videos in the archive. The task of labelling the people in the videos is then to assign a label to each one of these extracted face-tracks. The video below gives an example of two face-tracks found in a scene.


Two face-tracks found in British Library digital news footage by Visual Geometry Group - University of Oxford.
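As an illustration only, a face-track can be pictured as a very small data structure like the sketch below; the field names are assumptions chosen for exposition, not the project's actual schema.

```python
# Minimal, illustrative representation of a face-track: consecutive per-frame
# face detections within one shot, assumed to belong to the same identity.
# Field names are hypothetical, chosen only to mirror the description above.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class FaceDetection:
    frame_index: int
    bounding_box: Tuple[int, int, int, int]  # (x, y, width, height) in pixels

@dataclass
class FaceTrack:
    shot_id: int
    detections: List[FaceDetection] = field(default_factory=list)
    label: Optional[str] = None  # identity name, assigned by the later labelling stages
```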

Techniques at Our Disposal

The base technology used for this work is a state-of-the-art convolutional neural network (CNN), trained for facial recognition [1]. The CNN extracts feature-vectors (a list of numbers) from face images, which indicate the identity of the depicted person. To label a face-track, the distance between the feature-vector for the face-track, and the feature-vector for a face-image with known identity is computed. The face-track is labelled as depicting that identity if the distance is smaller than a certain threshold (i.e. they match). We also use a speaker recognition CNN [2] that works in the same way, except it labels speech segments from unknown identities using speech segments from known identities within the video.
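As a rough, illustrative sketch of that matching step (not the project's actual code), the snippet below compares L2-normalised feature-vectors against known identities and accepts the nearest one only if it lies within a distance threshold. The vector size, the 0.8 threshold and the random "features" are assumptions standing in for the output and tuning of the face-recognition CNN.

```python
# Illustrative nearest-neighbour matching of feature-vectors with a distance
# threshold. The 256-d random vectors and the 0.8 threshold are assumptions,
# standing in for the output of the face-recognition CNN.
import numpy as np

def normalise(vector):
    """Scale a feature-vector to unit length so distances are comparable."""
    return vector / np.linalg.norm(vector)

def match_identity(track_vector, known_identities, threshold=0.8):
    """Return the closest known name if it lies within the threshold, else None."""
    track_vector = normalise(track_vector)
    best_name, best_distance = None, float("inf")
    for name, reference in known_identities.items():
        distance = np.linalg.norm(track_vector - normalise(reference))
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance < threshold else None

# Toy usage with random vectors standing in for CNN features.
rng = np.random.default_rng(0)
known = {"Person A": rng.normal(size=256), "Person B": rng.normal(size=256)}
query = known["Person A"] + 0.05 * rng.normal(size=256)  # a noisy view of Person A
print(match_identity(query, known))  # -> Person A
```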

Labelling the Face-Tracks

Our method for automatically labelling the people in the video archive is divided into three main stages:

(1) Our first labelling method uses what we term a “celebrity feature-vector bank”, which consists of names of people that are likely to appear in the videos, and their corresponding feature-vectors. The names are automatically sourced from IMDB cast lists for the programmes (the titles of the programmes are freely available in the meta-data). Face-images for each of the names are automatically downloaded from image-search engines. Incorrect face-images and people with no images of themselves on search engines are automatically removed at this stage. We compute the feature-vectors for each identity and add them to the bank alongside the names. The face-tracks from the video archives are then simply labelled by finding matches in the feature-vector bank.

Face-tracks from the video archives are labelled by finding matches in the feature-vector bank.
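A hedged sketch of that bank-building stage follows. The helper callables (fetch_cast_names, download_face_images, extract_face_feature) are hypothetical placeholders for the IMDb lookup, image-search download and face CNN described above, and averaging several images into one reference vector per name is an assumption.

```python
# Sketch of the "celebrity feature-vector bank" stage. fetch_cast_names,
# download_face_images and extract_face_feature are hypothetical placeholders
# for the IMDb lookup, image-search download and face CNN described above.
import numpy as np

def build_feature_bank(programme_titles, fetch_cast_names,
                       download_face_images, extract_face_feature):
    """Map names likely to appear in the videos to one reference feature-vector each."""
    bank = {}
    for title in programme_titles:
        for name in fetch_cast_names(title):        # names from the cast list
            images = download_face_images(name)     # top image-search results
            if not images:                          # drop names with no usable images
                continue
            features = [extract_face_feature(image) for image in images]
            bank[name] = np.mean(features, axis=0)  # one averaged reference vector
    return bank

def label_face_tracks(face_track_vectors, bank, match_identity):
    """Label every face-track by matching it against the bank (None if no match)."""
    return {track_id: match_identity(vector, bank)
            for track_id, vector in face_track_vectors.items()}
```

With a bank in place, labelling simply reuses a threshold matcher like the one sketched earlier; any face-track whose nearest bank entry is too far away is left unlabelled.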

(2) Our second labelling method uses the idea that if a name is spoken, or found displayed in a scene, then that person is likely to be found within that scene. The task is then to automatically determine whether there is a correspondence or not. Text is automatically read from the news videos using Optical Character Recognition (OCR), and speech is automatically transcribed using Automatic Speech Recognition (ASR). Names are identified and they are searched for on image search engines. The top ranked images are downloaded and the feature-vectors are computed from the faces. If any are close enough to the feature-vectors from the face-tracks present in the scene, then that face-track is labelled with that name. The video below details this process for a written name.


Using text or spoken word and face recognition to identify a person in a news clip.
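The correspondence check in this second stage can be pictured as below. It is an illustrative sketch under the same assumptions as before (hypothetical download and feature-extraction helpers, an assumed threshold), not the project's actual implementation.

```python
# Illustrative sketch of stage (2): confirm a name found by OCR or ASR by
# matching freshly downloaded reference images against the face-tracks that
# appear in the same scene. Helper functions remain hypothetical placeholders.
import numpy as np

def confirm_name_in_scene(name, scene_track_vectors, download_face_images,
                          extract_face_feature, threshold=0.8):
    """Return the id of the face-track matching `name`, or None if none is close enough."""
    images = download_face_images(name)            # top-ranked image-search results
    if not images:
        return None
    reference = np.mean([extract_face_feature(image) for image in images], axis=0)
    reference = reference / np.linalg.norm(reference)
    for track_id, vector in scene_track_vectors.items():  # face-tracks in this scene
        vector = vector / np.linalg.norm(vector)
        if np.linalg.norm(vector - reference) < threshold:
            return track_id                        # this face-track is labelled with `name`
    return None
```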

(3) For our third labelling method, we use speaker recognition to identify any non-labelled speaking people. We use the labels from the previous two stages to automatically acquire labelled speech segments from the corresponding labelled face-tracks. For each remaining non-labelled speaking person, we extract the speech feature-vector and compute the distance of it to the feature-vectors of the labelled speech segments. If one is close enough, then the non-labelled speech segment and corresponding face-track is assigned that name. This process manages to label speaking face-tracks with visually challenging faces, e.g. deep in shadow or at an extremely non-frontal pose.
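The propagation in this third stage follows the same nearest-neighbour pattern, only in the audio domain. The sketch below is illustrative: the threshold is assumed, and the voice feature-vectors stand in for the output of the speaker-recognition CNN.

```python
# Illustrative sketch of stage (3): speech segments already tied to labelled
# face-tracks provide reference voice vectors; unlabelled speaking segments
# inherit the nearest name if it lies within an assumed threshold.
import numpy as np

def propagate_speaker_labels(unlabelled_segments, labelled_voices, threshold=0.7):
    """unlabelled_segments: {segment_id: voice_vector}; labelled_voices: {name: voice_vector}."""
    labels = {}
    for segment_id, vector in unlabelled_segments.items():
        vector = vector / np.linalg.norm(vector)
        best_name, best_distance = None, float("inf")
        for name, reference in labelled_voices.items():
            distance = np.linalg.norm(vector - reference / np.linalg.norm(reference))
            if distance < best_distance:
                best_name, best_distance = name, distance
        labels[segment_id] = best_name if best_distance < threshold else None
    return labels
```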

Indexing and Searching Identities

The results of our work can be browsed via a web search engine of our own design. A search bar allows users to specify the person or group of people that they would like to search for. People’s names are efficiently indexed so that the complete list of names can be filtered as the user types in the search bar. The search results are returned instantly with their associated metadata (programme name, date and time) and can be displayed in multiple ways. The video associated with each search result can be played, visualising the location and the name of all identified people in the video. See the video below for more details. This allows for the archive videos to be easily navigated using person-search, thus opening them up for use by the general public.


Archive videos easily navigated using person-search.
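To illustrate the kind of index that makes the name filtering and instant results possible, here is a small, self-contained sketch; the field names and example entries are invented for exposition and do not reflect the actual search engine.

```python
# Minimal sketch of a person-search index: each name maps to its appearances
# (programme, date, time), and the sorted name list supports prefix filtering
# as the user types. All names, fields and values below are illustrative.
from bisect import bisect_left, insort
from collections import defaultdict

class PersonIndex:
    def __init__(self):
        self._appearances = defaultdict(list)  # name -> list of appearance records
        self._sorted_names = []                # kept sorted for prefix filtering

    def add(self, name, programme, date, time):
        if name not in self._appearances:
            insort(self._sorted_names, name)
        self._appearances[name].append({"programme": programme, "date": date, "time": time})

    def suggest(self, prefix):
        """Names starting with whatever the user has typed so far."""
        lo = bisect_left(self._sorted_names, prefix)
        hi = bisect_left(self._sorted_names, prefix + "\uffff")
        return self._sorted_names[lo:hi]

    def search(self, name):
        return self._appearances.get(name, [])

# Toy usage
index = PersonIndex()
index.add("Jeremy Corbyn", "Newsnight", "2015-09-14", "22:43:10")
index.add("Jeremy Hunt", "News at Ten", "2018-03-02", "22:10:05")
print(index.suggest("Jere"))           # ['Jeremy Corbyn', 'Jeremy Hunt']
print(index.search("Jeremy Corbyn"))   # appearance metadata for playback
```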

For examples of more of our Computer Vision research and open-source software, visit the Visual Geometry Group website.

This work was supported by the EPSRC Programme Grant Seebibyte EP/M013774/1.

[1] Qiong Cao, Li Shen, Weidi Xie, Omkar M. Parkhi, and Andrew Zisserman. VGGFace2: A dataset for recognising faces across pose and age. In Proc. International Conference on Automatic Face & Gesture Recognition, 2018.

[2] Joon Son Chung, Arsha Nagrani and Andrew Zisserman. VoxCeleb2: Deep Speaker Recognition. INTERSPEECH, 2018.

BL Labs Public Awards 2020

Inspired by this work that uses the British Library's digital archived news footage? Have you done something innovative using the British Library's digital collections and data? Why not consider entering your work for a BL Labs Public Award 2020 and win fame, glory and even a bit of money?

This year's public and staff awards 2020 are open for submission, the deadline for entry for both is Monday 30 November 2020.

Whilst we welcome projects on any use of our digital collections and data (especially in research, artistic, educational and community categories), we are particularly interested in entries in our public awards that have focused on anti-racist work, about the pandemic or that are using computational methods such as the use of Jupyter Notebooks.

19 October 2020

The 2020 British Library Labs Staff Award - Nominations Open!

Looking for entries now!

Nominate an existing British Library staff member or a team that has done something exciting, innovative and cool with the British Library’s digital collections or data.

The 2020 British Library Labs Staff Award, now in its fifth year, gives recognition to current British Library staff who have created something brilliant using the Library’s digital collections or data.

Perhaps you know of a project that developed new forms of knowledge, or an activity that delivered commercial value to the library. Did the person or team create an artistic work that inspired, stimulated, amazed and provoked? Do you know of a project developed by the Library where quality learning experiences were generated using the Library’s digital content? 

You may nominate a current member of British Library staff, a team, or yourself (if you are a member of staff), for the Staff Award using this form.

The deadline for submission is NOON (GMT), Monday 30 November 2020.

Nominees will be highlighted on Tuesday 15 December 2020 at the online British Library Labs Annual Symposium where some (winners and runners-up) will also be asked to talk about their projects (everyone is welcome to attend, you just need to register).

You can see the projects submitted by members of staff and public for the awards in our online archive.

In 2019, last year's winner focused on the brilliant work of the Imaging Team for the 'Qatar Foundation Partnership Project Hack Days', which were sessions organised for the team to experiment with the Library's digital collections. 

The runner-up for the BL Labs Staff Award in 2019 was the Heritage Made Digital team, for their social media campaign promoting the British Library's digital collections one language a week, from letters 'A' to 'U' (#AToUnknown).

In the public Awards, last year's winners (2019) drew attention to artistic, research, teaching & learning, and community activities that used our data and / or digital collections.

British Library Labs is a project within the Digital Scholarship department at the British Library that supports and inspires the use of the Library's digital collections and data in exciting and innovative ways. It was previously funded by the Andrew W. Mellon Foundation and is now solely funded by the British Library.

If you have any questions, please contact us at [email protected].

11 September 2020

BL Labs Public Awards 2020: enter before NOON GMT Monday 30 November 2020! REMINDER

The sixth BL Labs Public Awards 2020 formally recognise outstanding and innovative work that has been carried out using the British Library’s data and / or digital collections by researchers, artists, entrepreneurs, educators, students and the general public.

The closing date for entering the Public Awards is NOON GMT on Monday 30 November 2020 and you can submit your entry any time up to then.

Please help us spread the word! We want to encourage anyone interested to submit over the next few months; who knows, you could even win fame and glory, priceless! We really hope to have another year of fantastic projects, inspired by our digital collections and data, to showcase at our annual online awards symposium on 15 December 2020 (which is open for registration too).

This year, BL Labs is commending work in four key areas that have used or been inspired by our digital collections and data:

  • Research - A project or activity that shows the development of new knowledge, research methods, or tools.
  • Artistic - An artistic or creative endeavour that inspires, stimulates, amazes and provokes.
  • Educational - Quality learning experiences created for learners of any age and ability that use the Library's digital content.
  • Community - Work that has been created by an individual or group in a community.

What kind of projects are we looking for this year?

Whilst we are really happy for you to submit your work on any subject that uses our digital collections, in this significant year we are particularly interested in entries that focus on anti-racist work or on projects about lockdown and the global pandemic. We are also curious and keen to have submissions that have used Jupyter Notebooks to carry out computational work on our digital collections and data.

After the submission deadline has passed, entries will be shortlisted and selected entrants will be notified via email by midnight on Friday 4th December 2020. 

A prize of £150 in British Library online vouchers will be awarded to the winner and £50 in the same format to the runner-up in each Awards category at the Symposium. Of course, if you enter, it will at the very least be a chance to showcase your work to a wide audience, and in the past this has often resulted in major collaborations.

The talent of the BL Labs Awards winners and runners-up over the last five years has led to the production of a remarkable and varied collection of innovative projects, described in our 'Digital Projects Archive'. In 2019, the Awards commended work in four main categories – Research, Artistic, Community and Educational:

BL Labs Award Winners for 2019:
(Top-Left) Full-Text search of Early Music Prints Online (F-TEMPO) - Research; (Top-Right) Emerging Formats: Discovering and Collecting Contemporary British Interactive Fiction - Artistic; (Bottom-Left) John Faucit Saville and the theatres of the East Midlands Circuit - Community commendation; (Bottom-Right) The Other Voice - Learning and Teaching.

For further detailed information, please visit BL Labs Public Awards 2020, or contact us at [email protected] if you have a specific query.

Posted by Mahendra Mahey, Manager of British Library Labs.

01 September 2020

Taking a Virtual Walk on the Wild Side

For those of us in the northern hemisphere, summer is drawing to a close and autumn feels hot on its heels. On recent walks I’ve noticed blackberries ripening in the hedgerows, tree leaves turning colour and bats darting through the air.

Thinking of nature and the senses, today is the first day of Sound Walk September, the yearly global festival celebrating sound walks. If you want to check some of these out, there is a comprehensive list of walking pieces on their website, as well as many interesting events planned, including one about virtual walks: exploring how we can enjoy the great outdoors by using digital technology to experience virtual nature while staying indoors.

Sound Walk September, 1-30 September 2020

We'd love for you to join us for this online Virtual Walks panel discussion on Wednesday 16th September at 7pm (BST), booking details are here.

This event will be chaired by Sue Thomas, author of “Nature and Wellbeing in the Digital Age”, who champions how we can use technology to feel better without logging off.

Sue will be joined by cultural geographer and digital media artist Jack Lowe, who will talk about a genre of video games known as ‘walking simulators’ and his research in developing location-based online games as a method of place-based digital storytelling.

Virtual Whitby Abbey, one of the British Library’s “Off the Map” gothic winning entries. Created by Team Flying Buttress, i.e. six students from De Montfort University, Ben Mowson, Elliott Pacel, Ewan Couper, Finn McAvinchey, Kit Grande and Katie Hallaron.

Use of atmospheric sound recordings is very much part of the ambience of virtual walking simulators and videogames. Completing the panel will be British Library Wildlife and Environmental Sounds Curator, Cheryl Tipp and myself discussing how digitised sound recordings from the Library’s sound archive have been innovatively used in videogames made by UK students, as part of the "Off the Map" initiative.

If you are inspired to make your own digital sound walk, then you may want to take a read of this previous blog post, which has lots of practical advice. Furthermore, if you use any openly licensed British Library sound recordings in your walk, such as ones on the "Off the Map" SoundCloud Gothic, Alice or Shakespeare sets, or these ones on Wikimedia Commons, then please do let us know by emailing digitalresearch(at)bl(dot)uk, as we always love to share and showcase what people have done with our digital collections.

This post is by Digital Curator Stella Wisdom (@miss_wisdom).

04 August 2020

Having a Hoot for International Owl Awareness Day

Who doesn’t love owls? Here at the British Library we certainly do.

Often used as a symbol of knowledge, they are the perfect library bird. A little owl is associated with, and frequently depicted alongside, the Greek goddess of wisdom, Athena. The University of Bath even awarded Professor Yoda the European eagle owl a library card in recognition of his valuable service deterring seagulls from nesting on their campus.

The British Library may not have issued a reader pass to an owl (as far as I am aware!), but we do have a wealth of owl sound recordings in our wildlife and environmental sounds collection; you can read about and listen to some of these here.

Little Owl calls recorded by Nigel Tucker in Somerset, England (BL ref 124857)

Owls can also be discovered in our UK Web Archive. Our UK Web Archivists recently examined the Shine dataset to explore which UK owl species is the most popular on the archived .uk domain. Read here to find out which owl is the winner.

They also curate an Online Enthusiast Communities in the UK collection, which features bird watching and some owl related websites in the Animal related hobbies subsection. If you know of websites that you think should be included in this collection, then please fill in their online nomination form.

Here in Digital Scholarship I recently found many fabulous illustrations of owls in our Mechanical Curator Flickr image collection of over a million Public Domain images. So to honour owls on International Owl Awareness Day, I put together an owl album.

These owl illustrations are freely available, without copyright restrictions, for all types of creative projects, including digital collages. My colleague Hannah Nagle blogged about making collages recently and provided this handy guide. For finding more general images of nature for your collages, you may find it useful to browse other Mechanical Curator themed albums, such as Flora & Fauna, as these are rich resources for finding illustrations of trees, plants, animals and birds.

If you creatively use our Mechanical Curator Flickr images, please do share them with us on Twitter using the hashtag #BLdigital; we always love to see what people have done with them. Plus, if you use any of our owls today, remember to include the #InternationalOwlAwarenessDay hashtag too!

We also urge you to be eagle-eyed (sorry, wrong bird!) and look out for some special animated owls during the 4th August, like the one below, which uses both sounds and images taken from our collections. These have been created by Carlos Rarugal, our arty Assistant Web Archivist, and will be shared from the Wildlife, Web Archive and Digital Scholarship Twitter accounts.


Video created by Carlos Rarugal,  using Tawny Owl hoots recorded by Richard Margoschis in Gloucestershire, England (BL ref 09647) and British Library digitised image from page 79 of "Woodland Wild: a selection of descriptive poetry. From various authors. With ... illustrations on steel and wood, after R. Bonheur, J. Bonheur, C. Jacque, Veyrassat, Yan Dargent, and other artists"

One of the benefits of making digital art is that there is no risk of spilling paint or glue on your furniture! As noted in this tweet from Damyanti Patel: "Thanks for the instructions, my kids were entertained & I had no mess to clean up after their art so a clear win win, they really enjoyed looking through the albums". I honestly did not ask them to do this, but it is really cool that her children included this fantastic owl in the centre of one of their digital collages:

I quite enjoy it when my library life and goth life connect! During the covid-19 lockdown I have attended several online club nights. A few months ago I was delighted to see that one of these, How Did I Get Here? Alternative 80s Night!, regularly uses the British Library Flickr images to create their event flyers, using illustrations of people in strange predicaments to complement the name of their club, like this sad lady sitting inside a bird cage in the flyer below.

Their next online event is Saturday 22nd August and you can tune in here. If you are a night owl, you could even make some digital collages, while listening to some great tunes. Sounds like a great night in to me!

Flyer image for How Did I Get Here? Alternative 80s Night!

This post is by Digital Curator Stella Wisdom (@miss_wisdom).

24 April 2020

BL Labs Learning & Teaching Award Winners - 2019 - The Other Voice - RCA

Innovations in sound and art

Dr Matt Lewis, Tutor of Digital Direction, and Dr Eleanor Dare, Reader of Digital Media, both at the School of Communication at the Royal College of Art, and Mary Stewart, Curator of Oral History and Deputy Director of National Life Stories at the British Library, reflect on an ongoing and award-winning collaboration (posted on their behalf by Mahendra Mahey, BL Labs Manager).

In spring 2019, based in both the British Library and the Royal College of Art School of Communication, seven students from the MA Digital Direction course participated in an elective module entitled The Other Voice. After listening in-depth to a selection of oral history interviews, the students learnt how to edit and creatively interpret oral histories, gaining insight into the complex and nuanced ethical and practical implications of working with other people’s life stories. The culmination of this collaboration was a two-day student-curated showcase at the British Library, where the students displayed their own creative and very personal responses to the oral history testimonies.

The module was led by Eleanor Dare (Head of Programme for MA Digital Direction, RCA), Matt Lewis (Sound Artist and Musician and RCA Tutor) and Mary Stewart (British Library Oral History Curator). We were really pleased that over 100 British Library staff took the time to come to the showcase, engage with the artwork and discuss their responses with the students.

Eleanor reflects:

“The students have benefited enormously from this collaboration, gaining a deeper understanding of the ethics of editing, the particular power of oral history and of course, the feedback and stimulation of having a show in the British Library.”

We were all absolutely delighted that the Other Voice group were the winners of the BL Labs Teaching and Learning Award 2019, presented in November 2019 at a ceremony at the British Library Knowledge Centre.  Two students, Karthika Sakthivel and Giulia Brancati, also showcased their work at the 2019 annual Oral History Society Regional Network Event at the British Library - and contributed to a wide ranging discussion reflecting on their practice and the power of oral history with a group of 35 oral historians from all over the UK.  The collaboration has continued as Mary and Matt ran ‘The Other Voice’ elective in spring 2020, where the students adapted to the Covid-19 Pandemic, producing work under lockdown, from different locations around the world. 

Here is just a taster of the amazing works the students created in 2019, which made them worthy winners of the BL Labs Teaching and Learning Award 2019.

Karthika Sakthivel and Giulia Brancati were both inspired by the testimony of Irene Elliot, who was interviewed by Dvora Liberman in 2014 for an innovative project on Crown Court Clerks. They were both moved by Irene’s rich description of her mother’s hard work bringing up five children in 1950s Preston.

On the way back by Giulia Brancati

Giulia created On the way back, an installation featuring two audio points: one with excerpts of Irene’s testimony and another with an audio collage inspired by Irene’s description. Two old-fashioned telephones played the audio, which the listener absorbed while curled up in an armchair in a fictional front room. It was a wonderfully immersive experience.

Irene Elliot's testimony interwoven with the audio collage (C1674/05)
Audio collage and photography © Giulia Brancati.
Listen here

Giulia commented:

“In a world full of noise and overwhelming information, to sit and really pay attention to someone’s personal story is an act of mindful presence. This module has been a continuous learning experience in which ‘the other voice’ became a trigger for creativity and personal reflection.”

Memory Foam by Karthika Sakthivel

Inspired by Irene’s testimony Karthika created a wonderful sonic quilt, entitled Memory Foam.

Karthika explains,

“There was power in Irene’s voice, enough to make me want to sew - something I’d never really done on my own before. But in her story there was comfort, there was warmth and that kept me going.”

Illustrated with objects drawn from Irene's memories, each square of the patchwork quilt encased conductive fabric that triggered audio clips. Upon touching each square, the corresponding story would play.

Karthika further commented,

“The initial visitor interactions with the piece gave me useful insights that enabled me to improve the experience in real time by testing alternate ways of hanging and displaying the quilt. After engaging with the quilt, guests walked up to me with recollections of their own mothers and grandmothers – and these emotional connections were deeply rewarding.”

Karthika, Giulia and the whole group were honoured that Irene and her daughter Jayne travelled from Preston to come to the exhibition. Karthika reflected:

"It was the greatest honour to have her experience my patchwork of her memories. This project for me unfurled yards of possibilities, the common thread being - the power of a voice.”

Irene and her daughter Jayne experiencing Memory Foam © Karthika Sakthivel.
Irene's words activated by touching the lime green patch with lace and a zip (top left of the quilt) (C1674/05)
Listen here

Meditations in Clay by James Roadnight and David Sappa

Listening to ceramicist Walter Keeler's memories of making a pot inspired James Roadnight and David Sappa to travel to Cornwall and record new oral histories to create Meditations in Clay, an immersive documentary exploring what we, as members of this modern society, can learn from the craft of pottery - a technology as old as time itself. The film combines interviews conducted at the Bernard Leach pottery with audio-visual documentation of the St Ives studio and its rugged Cornish surroundings.


Meditations in Clay, video montage © James Roadnight and David Sappa.

Those attending the showcase were bewitched as they watched the landscape documentary on the large screen and engaged with the selection of listening pots, which when held to the ear played excerpts of the oral history interviews.

James and David commented,

“This project has taught us a great deal about the deep interview techniques involved in Oral History. Seeing visitors at the showcase engage deeply with our work, watching the film and listening to our guided meditation for 15, 20 minutes at a time was more than we could have ever imagined.”

Beyond Form

Raf Martins responded innovatively to Jonathan Blake’s interview describing his experiences as one of the first people in the UK to be diagnosed with HIV. In Beyond Form, Raf created an audio soundscape of environmental sounds and excerpts from the interview, which played alongside a projected 3D hologram based on the cellular structure of the HIV virus. The hologram changed form and shape when activated by the audio – an intriguing visual artefact that translated the vibrant individual story into a futuristic medium.

Jonathan Blake's testimony interwoven with environmental soundscape (C456/104) Soundscape and image © Raf Martins.
Listen here

Stiff Upper Lip

Also inspired by Jonathan Blake’s interview was Stiff Upper Lip by Kingsley Tao, a short film that used clips of the interview to explore sexuality, identity and reactions to health and sickness.

Donald in Wonderland

Donald Palmer’s interview with Paul Merchant contained a wonderful and warm description of the front room that his Jamaican-born parents ‘kept for best’ in 1970s London. Alex Remoleux created a virtual reality tour of the reimagined space, entitled Donald in Wonderland, where the viewer could point to various objects in the virtual space and launch the corresponding snippet of audio.

Alex commented,

“I am really happy that I provided a Virtual Reality experience, and that Donald Palmer himself came to see my work. In the picture below you can see Donald using the remote in order to point and touch the objects represented in the virtual world.”

Donald Palmer describes his parents' front room (C1379/102)
Interviewee Donald Palmer wearing the virtual reality headset, exploring the virtual reality space (pictured) created by Alex Remoleux.
Listen here

Showcase at the British Library

The reaction to the showcase from the visitors and British Library staff was overwhelmingly positive, as shown by this small selection of comments. We were incredibly grateful to interviewees Irene and Donald for attending the showcase too. This was an excellent collaboration: RCA students and staff alike gained new insights into the significance and breadth of the British Library Oral History collection and the British Library staff were bowled over by the creative responses to the archival collection.

Examples of feedback from British Library showcase of 'The Other Voice' by Royal College of Art

With thanks to the MA Other Voice cohort Giulia Brancati, Raf Martins, Alexia Remoleux, James Roadnight, Karthika Sakthivel, David Sappa and Kingsley Tao, RCA staff Eleanor Dare and Matt Lewis & BL Oral History Curator Mary Stewart, plus all the interviewees who recorded their stories and the visitors who took the time to attend the showcase.
