Hub contributors’ reflections on the current and future state of the Hub



The Archives Hub is what the contributors make it, and with over 170 institutions now contributing, we want to continue to ensure that we listen to them and develop in accordance with their needs. This week we brought together a number of Archives Hub contributors for a workshop session. The idea was to think about where the Hub is now and where it could go in the future.
We started off by giving a short overview of the new Hub strategy, and updating contributors on the latest service developments. We then spent the rest of the morning asking them to look at three questions: What are the benefits of being part of the Hub? What are the challenges and barriers to contributing? What sort of future developments would you like to see?
Probably the strongest benefit was exposure – as a national service with an international user base, the Hub helps to expose archival content, and we also engage in a great deal of promotional work across the country and abroad. Other benefits that were emphasised included the ability to search for archives without knowing which repository holds them, and the pan-disciplinary approach that a service like the Hub facilitates. Many contributors also felt that the Hub provides them with credibility, a useful source of expertise and support, and sometimes ‘a sympathetic ear’, which can be invaluable for lone archivists struggling to make their archives available to researchers. The network effect was also raised – the value of having a focus for collaboration and the exchange of ideas.
A major barrier to contributing is the backlog of data, which archivists are all familiar with, and the time required to deal with this, especially given the lack of funding opportunities for cataloguing and retro-conversion. The challenges of data exchange were cited, along with the need to make this a great deal easier. For some, getting the effective backing of senior managers is an issue. For those institutions that host their own descriptions (Spokes), the problems surrounding the software, particularly in the earlier days of the distributed system, were highlighted, as was the requirement for technical support. One of the main barriers here may be the relationship with the institution’s own IT department. It was also felt that the use of Encoded Archival Description (EAD) may be off-putting to those who feel a little intimidated by the tags and attributes.
People would like to see easy export routines to contribute to the Hub from other systems, particularly from CALM, a more user-friendly interface for the search results, and maybe more flexibility with display, as well as the ability to display images and seamless integration of other types of files. ‘More like Google’ was one suggestion, and certainly exposure to Google was considered to be vital. It would be useful for researchers to be able to search a Spoke (institution) and then run the same search on the central Hub automatically, which would create closer links between Spokes and the Hub. Routes through to other services would add to our profile, and more interoperability with digital repositories would be well received. Similarly, the ability to search across archival networks, and maybe other systems, would benefit users and enable more people to find archival material of relevance. Influencing the right people and lobbying on behalf of contributors were also listed as things the Hub could do.
After a very good lunch at Christie’s Bistro we returned to look at three particular developments that we all want to see, and each group took one issue and thought about the drivers that move it forward and the restraining forces that stop it from happening. We thought about usability, which is strongly driven by the need to be inclusive and to de-mystify archival descriptions for those not familiar with archives, and in particular archival hierarchies. It is also driven by the need to (at least in some sense) compete with Google, the need to be up to date, and the need to think about exposing the data to mobile devices. On the other side, the unrealistic expectations that people have and, fundamentally, the need to be clear about who our users are and to understand their needs are hugely important. The quality and consistency of the data and markup also come into play here, as does the recognition that this sort of thing requires a great deal of expert software development.
The need for data export, the second issue that we looked at, is driven by the huge backlogs of data and the big impact that this should have on the Hub in terms of quantity of descriptions. It should be a selling point for vendors of systems, with stakeholders expecting good export routines. It should save time, prove to be good value for money and be easily accommodated into the workflow of an archive office. However, complications arise with the variety of systems out there, the number of standards, and variance in the application of standards. There may be issues about the quality of the data, and people may be resistant to changing their work habits.
Our final issue, increased access to digital content, is driven by growing expectations for accessing content, the wish to make the interface more visually attractive (with embedded images), the drive towards digitisation and possibly the funding opportunities that exist around this area. But there are the expense and time to consider, issues surrounding copyright, the question of where the digital content is stored, and concerns around preservation and future-proofing.
The day ended with a useful discussion on measuring impact. We got some ideas from contributors that we will be looking at and sharing with you through our blog. But the challenges of understanding the whole research life-cycle and the way that primary sources fit into this are certainly a major barrier to measuring the impact that the Hub may have in the context of research outputs.

Web 2.0 for teaching: wishy-washy or nitty-gritty?

A useful report, summarising Web 2.0 and some of the perspectives in literature about Web 2.0 and teaching, was recently produced by Susan A. Brown of the School of Education at the University of Manchester: The Potential of Web 2.0 in Teaching: a study of academics’ perceptions and use. The findings were based on a questionnaire (74 respondents across 4 Faculties) and interviews (8 participants) with teaching staff from the University of Manchester. It is available on request, so let us know if you would like a copy.
Some of the points that came out of the report:
  • It is the tutors’ own beliefs about teaching that are the main influence on their perceptions of Web 2.0
  • There is little discussion about Web 2.0 amongst colleagues and the use of it is generally a personal decision
  • Top-down goals and initiatives do not play a major part in use of Web 2.0
  • It may be that a bottom-up experimental approach is the most appropriate, especially given the relative ease with which Web 2.0 tools can be deployed, although there were interviewees who argued for a more considered and maybe more strategic approach, which suggests something that is more top-down
  • There is little evidence that students’ awareness of Web 2.0 is a factor, or that students are actively arguing in favour of its use:
“This absence of a ‘student voice’ in tutors’ comments on Web 2.0 is interesting given the perceptions of ‘digital natives’ – the epithet often ascribed to 21st Century students – as drivers for the greater inclusion of digital technologies. It may shore up the view that epithets such as ‘digital natives’ and ‘Millennials’ to describe younger students over-simplify a complex picture where digital/Web technology users do not necessarily see the relevance of Web 2.0 in education.”
  • The use of and familiarity with Web 2.0 tools (personal use or use for research) was not a particularly influential factor in whether the respondents judged them to have potential for teaching.
  • In terms of the general use of Web 2.0 tools, mobile social networking (e.g. Twitter) and bookmarking were the tools used least amongst respondents. Wikis, blogs and podcasting had higher use.
  • In terms of using these tools for teaching, the data was quite complex, and rather more qualitative than quantitative, so it is worth looking at the report for the full analysis. There were interviewees who felt that Web 2.0 is not appropriate for teaching, where the role of a teacher is to lay down the initial building blocks of knowledge, implying that discussion can only follow understanding, not be used to achieve understanding. There was also a notion that Web 2.0 facilitates more surface, social interactions, rather than real cognitive engagement.
“A number of…respondents expressed the view that Web 2.0 is largely socially orientated, facilitating surface ‘wishy-washy’ discussion that cannot play a role in tackling the ‘nitty-gritty’ of ‘hard’ subject matter”.
Three interviewees saw a clear case for the use of Web 2.0: they referred to honing research skills, taking a more inquiry-based and informal approach, and tapping into a broader range of expertise.
In conclusion, “The study indicates that there are no current top-down and bottom-up influences operating that are likely to spread Web 2.0 use beyond individuals/pockets of users at the UoM [University of Manchester]”. The study recommends working with a small group of academics to get a clearer understanding of the issues they face in teaching and how Web 2.0 might offer opportunities, as well as providing an opportunity for more detailed discussion about teaching practices and thinking about how to tailor Web 2.0 for this context.

Archival Context: entities and multiple identities


I recently took part in a Webinar (Web seminar) on the new EAC-CPF standard. This is a standard for the encoding of information about record creators: corporate bodies, persons and families. This information can add a great deal to the context of archives, supporting a more complete understanding of the records and their provenance.

We were given a brief overview of the standard by Kathy Wisser, one of the Working Group members, and then the session was open to questions and discussion.

The standard is very new, and archivists are still working out how it fits into the landscape and how it relates to various other standards. It was interesting to note how many questions essentially involved the implementation of EAC-CPF: who creates the records? Where are they kept? How are they searched? Who decides what?
These questions are clearly very important, but the standard is just a standard for the encoding of ISAAR(CPF) information. It will not help us to figure out how to work together to create and use EAC-CPF records effectively.
In general, archivists use EAD to include a biographical history of the record creator, and may not necessarily create or link to a whole authority record for them. The idea behind EAC-CPF is that providing separate descriptions for different entities is more logical and efficient. The principle of separation of entities is well put: “Because relations occur between the descriptive nodes [i.e. between archive collections, creators, functions, activities], they are most efficiently created and maintained outside of each node.” So if you have a collection description and a creator description, the relationship between the two is maintained separately from the actual descriptions. If only EAD itself were a little more data-centric (database-friendly, you might say), this would facilitate a relational approach.
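To make that principle a little more concrete, here is a minimal sketch in Python – entirely illustrative, not anything the Hub actually runs – of what it means to maintain relations outside of each descriptive node: the collection description, the creator description and the relationship between them are three separate records, so any one can change without touching the others. All of the identifiers and field names are made up for the example.

```python
# A minimal sketch of the relational principle quoted above:
# descriptions and the relations between them are held separately.

# Descriptions of the two kinds of 'node': collections and creators.
collections = {
    "gb100mss": {"title": "Example collection description"},
}
creators = {
    "creator-001": {"name": "Example person or corporate body"},
}

# Relations live outside either description, so each side can be
# created, updated or replaced without touching the other.
relations = [
    {"collection": "gb100mss", "creator": "creator-001", "type": "origination"},
]

def creators_for(collection_id):
    """Return the creator descriptions related to a given collection."""
    return [creators[r["creator"]]
            for r in relations
            if r["collection"] == collection_id]

print(creators_for("gb100mss"))
```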
I am interested in how we will effectively link descriptions of the same person, because I cannot see us managing to create one single authoritative record for each creator. This is enabled via the ‘identities’: a record creator can have two or more identities with each represented by a distinct EAC-CPF instance. I think the variety of identity relationships that the standard provides for is important, although it inevitably adds a level of complexity. It is something we have implemented in our use of the tag to link to related descriptions. Whilst this kind of semantic markup is a good thing, there is a danger that the complexity will put people off.
I’m quite hung-up on the whole issue of identifiers at the moment. This may be because I’ve been looking at Linked Data and the importance of persistent URLs to identify entities (e.g. I have a URL, you have a URL, places have URLs, things have URLs, and that way we can define all these things and then provide links between them). The Archives Hub is going to be providing persistent URLs for all our descriptions, using a unique identifier made up of the country code, repository code and local reference for the collection (e.g. http://www.archiveshub.ac.uk/search/record.html?id=gb100mss, where 100 is the repository code and MSS is the local reference).
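As a quick illustration of how such a URL might be put together, here is a hedged Python sketch. The base address and the gb + 100 + MSS example come straight from the paragraph above; the lower-casing of the local reference is an assumption based on that example, and none of this is the Hub’s actual code.

```python
# A sketch of assembling a persistent URL of the kind described above,
# from the country code, repository code and local collection reference.
# The base URL and concatenation follow the example in the text; the
# lower-casing of the reference is an assumption.

BASE = "http://www.archiveshub.ac.uk/search/record.html?id="

def persistent_url(country_code: str, repository_code: str, local_ref: str) -> str:
    """Build a record URL from the three parts of the unique identifier."""
    identifier = f"{country_code}{repository_code}{local_ref}".lower()
    return BASE + identifier

# Reproduces the example from the text:
print(persistent_url("gb", "100", "MSS"))
# -> http://www.archiveshub.ac.uk/search/record.html?id=gb100mss
```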
I feel that it will be important for ISAAR(CPF) records to have persistent URLs, and these will come from the recordID and the agencyCode. Part of me thinks the agency responsible for the EAC-CPF instance should not be part of the identifier, because the record should exist apart from the institution that created it, but realistically we’re not going to get consensus on some kind of independent, stand-alone ISAAR(CPF) record. One of the questions I’m currently asking myself is: if two different bodies have EAC-CPF records, does it matter what the identifiers/URLs are for those records, even if they are for the same person? Is the important thing to relate them as representing the same thing? I’m sure it’s very important to have a persistent URL for all EAC-CPF instances, because that is how they will be discoverable; that is their online identity. But the question of providing one unique identifier for each person or corporate body is not something I have quite made my mind up about.
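To show what I mean by relating records rather than sharing one identifier, here is a purely hypothetical sketch: two agencies each hold their own EAC-CPF record for the same person, identified by their own agencyCode and recordID, and a separately maintained link asserts that the two records describe the same entity. The agency codes, record IDs and names are all invented for the example, and this is not a description of how any service actually works.

```python
# Two agencies each hold their own EAC-CPF record, keyed here by an
# (agencyCode, recordID) pair. The codes and IDs are invented.
records = {
    ("GB-100", "person-42"): {"name": "Example, Person (1900-1980)"},
    ("GB-200", "p0042"):     {"name": "Example, Person, writer"},
}

# 'Same entity' assertions kept apart from the records themselves.
same_entity = [
    (("GB-100", "person-42"), ("GB-200", "p0042")),
]

def co_references(record_id):
    """Find other records asserted to describe the same person or body."""
    found = []
    for a, b in same_entity:
        if a == record_id:
            found.append(b)
        elif b == record_id:
            found.append(a)
    return found

print(co_references(("GB-100", "person-42")))  # -> [('GB-200', 'p0042')]
```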
It will be interesting to see how the standard is assessed by archivists, and to see more examples of implementation. The Archives Hub would be very interested to hear from anyone using it.

Designs on Delivery: GPO Posters from 1930 to 1960: Online extras

 Mail Coach A.D. 1784

University of the Arts London Archives and Special Collections Centre, in collaboration with The British Postal Museum & Archive, presents Designs on Delivery: GPO Posters from 1930 to 1960. The exhibition at the Well Gallery – and online here on the Archives Hub – focuses on a period when the Post Office was at the cutting edge of poster design and mass communication. It explores how the GPO translated often complex messages to the public in order to educate them about the services offered, using text, image and colour.

The Archives Hub website now has online extras: exclusively online, an additional eight posters representing the range of themes adopted by the General Post Office in their advertising.

Illustration: John Armstrong (1893-1973) ‘Mail Coach A.D. 1784’ (1935) reference The Royal Mail Archive POST 110/3175; copyright © Royal Mail Group Ltd and courtesy of The British Postal Museum & Archive.

Sustainable content: visits to contributors

I recently visited two of the contributors to the Archives Hub sustainable content development project. The archivists at Queen Mary, University of London (QMUL) and the BT Archives were nice enough to let me drink their tea, and see how they used CALM.

Axiell, developers of the CALM software, have kindly let us have access to a trial version of CALM to help with this project.

Designs on Delivery: GPO Posters from 1930 to 1960

NIGHT MAIL

University of the Arts London Archives and Special Collections Centre, in collaboration with The British Postal Museum & Archive, presents Designs on Delivery: GPO Posters from 1930 to 1960. The exhibition at the Well Gallery – and online here on the Archives Hub – focuses on a period when the Post Office was at the cutting edge of poster design and mass communication. It explores how the GPO translated often complex messages to the public in order to educate them about the services offered, using text, image and colour.

As part of the exhibition, the Well Gallery will be showing on loop Night Mail (1936) which the British Film Institute calls "one of the most popular and instantly recognised films in British film history … one of the most critically acclaimed films … [of the] documentary film movement".

Illustration: poster designed by Pat Keely (died 1970) for the film Night Mail, reference The Royal Mail Archive POST 109/377; copyright © Royal Mail Group Ltd and courtesy of The British Postal Museum & Archive.