A talk delivered in the Collections strand of the Museums + Heritage Show 2016, Olympia, 19 May 2016.
Introduction
This talk relates to work that was done from April 2012 to March 2015, when I was still Documentation Manager at the Horniman Museum. But since I’m now working at the National Gallery, I’d like to start by thanking my former colleagues at the Horniman for their help in correcting my dodgy memory and digging out pictures for my slides.
Even though I’ve left, part of me will forever belong to the Horniman, so when I say ‘we’ during this talk, that’s who I mean. For those of you who don’t know us, the Horniman Museum and Gardens is a surprising, eccentric, family-friendly attraction in Forest Hill in south east London.
It has been open since the late nineteenth century, when the tea trader and philanthropist Frederick John Horniman first opened his house and extraordinary collection of objects to the local community. Since then, our collection has grown significantly and includes internationally important collections of anthropology and musical instruments, as well as an acclaimed aquarium and natural history gallery – all surrounded by 16 acres of beautiful Gardens offering breathtaking views across London.
We aim to use our worldwide collections and the Gardens to encourage a wider appreciation of the world, its peoples and their cultures, and its environments.
The Collections People Stories project set out to uncover the range, scale and importance of the Horniman’s anthropology collections. This included reviewing at least four key areas of our collection, using both academic expertise and community response; identifying star objects; and significantly developing our collections online and digital engagement. A further outcome of the project was a proposal for a redisplay of our two existing anthropology galleries, the Centenary Gallery and African Worlds.
I will say now that this project was very well funded through ACE’s major partnership scheme. But before you say ‘that’s all very well for him, with all his money’, I hope and believe that the basic procedures we followed and the lessons we learned are transferable to museums of more or less any size.
Before we began
Many of the decisions we took at the beginning of the project were based on our experience with a six-month review of our Oceanian collections, which took place from September 2011 to March 2012 and served as a pilot for the main Collections People Stories review. So here’s my first top tip for the day:
Run a pilot on a subset of the collection you plan to review. This will resolve many questions before the main project begins, help you identify where you need to focus your attention, and show you where problems may lie before you start work.
Reviewing the registers
The Horniman’s documentation began in the 1890s, and since then has been migrated between a bewildering variety of systems.
By the time the review began, we had allocated 19,339 temporary numbers to objects in the anthropology collections which had become detached from their original register entries. The first stage of work was therefore to go through the accession registers and make sure everything recorded in them was also entered in our collections management system, Mimsy.
Work began in April 2012, with the fitting out of a room for the project staff to work in. This was in the Horniman’s Study Collections Centre in Greenwich – the SCC – where the Horniman’s objects are stored, about six miles from the main site at Forest Hill. The registers had already been digitised, so we could print off disposable working copies. We also appointed the staff we needed:
- 2 × Collections/Documentation Assistant
- 2 × Documentation/Collections Assistant
- 0.6 × Collections Assistant
- 1 × Photographer
- 1 × Conservator
We created a new data-entry screen for the project in Mimsy, making a number of changes so that the system mirrored the team’s work as closely as possible, and allowed them to flag progress record-by-record within the system:
- Important fields at top
- Unimportant fields hidden
- Open repeating fields automatically
- Set up default values or shortcuts
- Create flagging system
- Draw up progress report templates
And then we made sure that the team was trained in what they had to do, and how to do it.
The process itself was straightforward: checking the register against the database and flagging any discrepancies, entering any records not yet in the database into the system, recording provenance systematically, and linking the images of the register pages to the relevant object records so they could easily be checked during the object review.
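For those who like to see the mechanics, here is a minimal sketch of the kind of check involved, assuming the transcribed register and a database export are both available as CSV files; every file and field name here is hypothetical:

```python
import csv

# Hypothetical exports: a transcribed register and a dump of database records,
# each keyed on an object/accession number. All names are illustrative only.
def load(path, key):
    with open(path, newline="", encoding="utf-8") as f:
        return {row[key].strip(): row
                for row in csv.DictReader(f) if row[key].strip()}

register = load("register_1898.csv", "accession_number")
database = load("mimsy_export.csv", "object_number")

# Register entries with no database record must be entered into the system.
missing = sorted(set(register) - set(database))
print(f"{len(missing)} register entries to enter into the database")

# Entries present in both are compared field by field; any discrepancy is
# flagged in the database for the object review to resolve.
for number in sorted(set(register) & set(database)):
    if register[number].get("description", "").strip() \
            != database[number].get("description", "").strip():
        print(f"{number}: register and database descriptions differ – flag")
```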
We set up a reporting system for this phase of the project which we continued during the main review, circulating a weekly ‘checkpoint’ report summarising the project’s status, work done, key achievements and problems, work planned for the following week, and graphs summarising progress. Similar monthly ‘highlight’ reports went to the project’s steering committee.
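The weekly figures themselves need not be laborious to produce. Here is a minimal sketch of pulling a checkpoint summary from a simple work log, assuming one CSV row per entry checked; the file and field names are invented:

```python
import csv
from collections import Counter
from datetime import date

# Hypothetical work log: one CSV row per register entry checked, with an ISO
# date. A real system would draw these figures from the database itself.
with open("work_log.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Tally entries checked per ISO week for the weekly 'checkpoint' report.
per_week = Counter(date.fromisoformat(r["date_checked"]).isocalendar()[:2]
                   for r in rows)

for (year, week), n in sorted(per_week.items()):
    print(f"{year} week {week:02d}: {n} register entries checked")
print(f"Total to date: {len(rows)}")
```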
From these reports, we can see how much progress varied depending on the quality of the different registers. The extent of the work completed by the team by the time checking finished in mid-November 2012 is remarkable:
Over
17 weeks
40,015 register entries
were checked against
73,832 database records
Digitise your registers before starting work with the objects, and make sure all registered objects are entered in your database. While this will add records for objects which cannot currently be located, it also means that you can easily search the unlocated objects when you are confronted with something in the store that has lost its identity, making it easier to reconcile undocumented objects with their register entries.
Reviewing the history files
The work on the registers revealed how much information lay outside the database, and we decided we should also make sure we knew what was in the object history files, which were kept away from the objects at the main site in Forest Hill. From November 2012 to April 2013, we employed an Archive Officer to list them in a spreadsheet, which we later uploaded into Mimsy – although we were unable to link the individual history files directly to the objects they documented. The process we followed was:
- Describe contents
- List related:
  - People and organisations
  - Places
  - Cultures
  - Objects
- Write biographies
- Identify duplicate people and organisations
Over
25 weeks
1,719 new archive records
were created
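To give a flavour of the final step, identifying duplicate people and organisations, here is a minimal sketch of grouping variant spellings for manual checking, assuming the listing spreadsheet records related people in a semicolon-separated column; the file and column names are hypothetical:

```python
import csv
from collections import defaultdict

def normalise(name):
    # Collapse case, punctuation and spacing so that 'F. J. Horniman',
    # 'FJ Horniman' and 'f j horniman' fall into the same group.
    return "".join(ch for ch in name.casefold() if ch.isalnum())

groups = defaultdict(set)
with open("history_file_listing.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        for name in row.get("related_people", "").split(";"):
            if name.strip():
                groups[normalise(name)].add(name.strip())

# Any group with more than one spelling is a candidate duplicate to review.
for variants in groups.values():
    if len(variants) > 1:
        print("Possible duplicates:", " / ".join(sorted(variants)))
```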
Go through the archives before starting work in the stores, and, if possible, link archive records to object records. Knowing what records were available helped the review teams, curators and external experts while they were working in the stores, and when organising visits.
Try to bring all your documentation together alongside the objects. We spent a fair amount of time travelling from the stores to the main museum to check the history files for information like lists of objects acquired during particular fieldwork projects. We would have been able to work much more quickly and resolve more problems if everything had been immediately to hand: the difference made by having the register pages scanned and attached to the object records made that very clear.
Reviewing the collection
The register review showed us the degree of discrepancy between what we had recorded in the registers and what was in the database. The only way to resolve this was to carry out the project’s main aim: reviewing 30,000 items, approximately one third of our entire anthropology collection, and confirming the basic object information:
- What we have
- Where it is
- ‘Tombstone’ information
- Brief description
- What it looks like (image)
We began by making sure we had the necessary infrastructure in place, starting with an overhaul of the SCC’s network – although this continued to cause us problems, leading to the installation of new network switches and Wi-Fi access points during the project.
We also needed to create space for the project to happen in, clearing out the large objects which filled our main ground floor hall into external storage for the duration of the project.
Make sure your infrastructure is as good as it can be from the beginning: in particular, if your documentation is computerised, fix the network before you start. This will save a great deal of time and your teams will be much happier.
Be realistic about space and resources; in particular, it’s much easier if you don’t do two different collection reviews at once in the same space. If necessary, you may need to restrict access to the stores whilst you’re carrying out the review. This may be frustrating, but should lead to greater access after the project, because you’ll then actually know what objects you have.
We reviewed the equipment in the stores, and bought additional items for the review teams, and supplies to help us keep track of progress:
- Trolleys
- Tables
- Lamps
- Laptops etc.
- Measuring equipment
- UV torches
- Plier staplers
- Tool belts
- Marking equipment
- Portable extractor fan
- Labels
- PPE
We also reconfigured Mimsy again to match the work that would be done in this phase of the project, revising the main screen and adding new terms and reports for flagging objects. We set up the system’s ‘Actions’ area to record problems with records and whether they had been resolved, and the ‘Instructions’ area to record particularly sensitive objects.
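To illustrate the kind of flag we kept in the ‘Actions’ area, here is an illustrative structure only, not Mimsy’s actual schema: a problem record needs little more than the object number, the problem, who raised it and when, and a sign-off date.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ReviewAction:
    """One problem flagged against an object record, and its sign-off."""
    object_number: str
    problem: str              # e.g. 'needs marking', 'suspected duplicate'
    raised_by: str
    raised_on: date
    resolved_on: Optional[date] = None

    @property
    def open(self) -> bool:
        return self.resolved_on is None

actions = [  # invented example
    ReviewAction("nn123", "not found at recorded location",
                 "Team Quick", date(2013, 5, 2)),
]
print(f"{sum(a.open for a in actions)} action(s) still open")
```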
The project was already up to strength:
- 4 × Collections/Documentation Assistants and Documentation/Collections Assistants
- 0.6 × Collections Assistant
- 1 × Photographer
- 1 × Conservator
But I’ll say here:
Staff turnover is inevitable in a long project, particularly as your staff should be getting really good at their jobs; use the appointment of new staff as an opportunity to retrain and refresh everybody’s knowledge and make sure everyone is still working to the same standards.
We decided that the project would work best if the four project assistants were divided into two teams of two; but we also made sure that we did this:
Move the review teams around, so that they work with different people and different objects each week. This keeps the teams interested, prevents friction from building up in a relatively claustrophobic setting, and helps ensure a consistent approach. If you have a tame campanologist, use them to draw up the schedules: the rotations you will need to create are quite similar to bell-ringing changes.
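If no campanologist is to hand, the rotation is easy enough to script. A minimal sketch for four assistants split into two pairs (the names, borrowed from the credits below, are purely illustrative): with four people there are exactly three distinct pairings, so cycling through them pairs everyone with everyone.

```python
assistants = ["Alix", "Clare", "Jack", "Lizzie"]  # any four names will do

# Fix the first assistant and rotate their partner; the other two form the
# second pair. Three weeks covers every possible pairing, then it repeats.
pairings = []
for partner in assistants[1:]:
    pair1 = (assistants[0], partner)
    pair2 = tuple(p for p in assistants if p not in pair1)
    pairings.append((pair1, pair2))

for week in range(6):
    haddon, quick = pairings[week % len(pairings)]
    print(f"Week {week + 1}: Haddon = {haddon}, Quick = {quick}")
```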
The part-time collections assistant provided roving support, covered staff absences, and followed up problems raised by the teams which they didn’t have time to address. After some discussion, we named the teams Haddon and Quick, after two of our early curatorial staff:
- Alfred Cort Haddon, advisory curator 1902-15
- Richard Quick, curator 1891-1904
Perhaps inevitably, the ‘floating’ team member became known as Team Haddock.
After all this preparation, we had to be sure that everyone knew what they were doing, so we ran a week of training for all project members, and anyone else who needed a refresher. The syllabus was intensive:
- Health & safety and PPE
- Introduction to equipment
- Packing and storage
- Reviewing objects
- Using Mimsy
- Pest control
- Marking and labelling
- Photography
The training culminated in us testing our packing skills with an egg-dropping competition. We held a tea party at the SCC at the end of the week to mark the new phase of the review project.
We chose the objects we would review according to how they were stored, which at the Horniman is mostly by object type, so that each team would be working with objects of similar size and materials together. Object types were prioritised by the curators, taking account of ease of handling and space available. Towards the end of the project, we also reviewed our monocultural collections which, as rich and extensive representations of particular cultures, were strong candidates for display. We also took the opportunity to sort out a group of objects which had not been properly documented since being removed from storage at the main museum during our last redevelopment around the Millennium.
The review process itself was quite simple:
- Retrieve objects
- Unpack object
- Inspect object and update database
- Photograph if necessary
- Repack object
- Return objects
We quickly found that the teams worked best as pairs, doing some tasks in parallel but working together on others, rather than each member working entirely on their own.
We were initially ambitious in what we recorded about the objects:
- Object number
- Hazards
- Whole/Part & hierarchy
- Collection
- Object name
- Materials
- Place
- Culture
- Maker
- Item count
- Measurements
- Inscriptions
- Description
- Condition
- Location
- Sensitivities
But we gradually realised that we were being over-ambitious, and over the course of the project we reduced the depth of what we recorded in many areas:
- Object number
- Whole/Part & hierarchy
- Object name
- Culture
- Measurements
- Inscriptions
- Description
- Condition
However, there were a few aspects where we were able to take advantage of access to the objects to systematically record information that could be updated quickly and had previously been omitted from the review:
- Place
- Culture
- Collector
- Date made
- Date collected
- Pest control
During the review, we learnt a series of lessons:
Be flexible. We all want to record everything we can about the objects, but we may not be able to. There’s no harm in starting with an ambitious programme, but be prepared to cut back on some aspects of the work if they’re stopping you from getting through the material as quickly as you need to.
Be proactive in looking for hazards in the collections. Think about what the teams might find in the collections, and make sure you have the procedures and vocabularies in place to identify them clearly in the stores and the collections management system. Make sure your curators think of possible dangers and alert the teams before they reach them. We had to delay the review of the weapons once the teams realised they were encountering objects that might have been poisoned, so that we could understand which objects might be affected, what toxins might have been used, and how we should manage the risks.
Trust your teams to understand what they’re looking at. They’ll very quickly get their eye in: the only other place they are likely to see so many different objects in such a short space of time would be in the trade. They’ll also be aware of the limits of their knowledge.
Our photographic setup was quite basic, using old 6-megapixel DSLRs with macro and zoom lenses, a single studio flash and softbox, and a table-top light-diffusing tent.
Our photographer set up a workflow with automated scripts for colour management and lens aberrations:
- Shoot images and record subjects (assistants – daily)
- Download images from camera cards (assistants – daily)
- Review images for quality, and rephotograph if necessary (photographer and assistants – weekly)
- Correct colour and lens aberrations using Photoshop scripts (photographer – weekly)
- Add manual metadata and upload images (assistants – weekly)
We also had Mimsy’s developers, at that point Selago Design, develop an image upload tool that allowed us to record much richer metadata than the system’s standard image upload scripts, as well as producing and renaming all the derivative files we needed. This made a huge difference to the speed with which we could process the photographs the teams had taken. With the photographer’s training, weekly checks of image quality, and practical help with awkward objects and tweaking the setup for different object types, the teams quickly learnt to take very good reference images with this basic equipment.
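Selago’s tool was specific to Mimsy, but the derivative-and-rename step it automated is generic. Here is a minimal sketch using the Pillow imaging library; the sizes, folder names and file-naming convention are all assumptions:

```python
from pathlib import Path
from PIL import Image  # Pillow

# Derivative names and maximum long-edge sizes in pixels (assumed values).
SIZES = {"thumb": 150, "web": 800}

def make_derivatives(master: Path, out_dir: Path) -> None:
    """Produce renamed, resized JPEG derivatives of one master image."""
    for suffix, long_edge in SIZES.items():
        with Image.open(master) as im:
            im.thumbnail((long_edge, long_edge))   # preserves aspect ratio
            target = out_dir / f"{master.stem}_{suffix}.jpg"
            im.convert("RGB").save(target, "JPEG", quality=85)

for master in sorted(Path("masters").glob("*.tif")):
    make_derivatives(master, Path("derivatives"))
```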
Non-photographers can take great photos – but they need to be taught what makes a good photograph by a professional photographer. The equipment is less important than developing a good eye. In fact, we found curators were taking very acceptable photographs with the same basic setup and a newish digital compact camera.
Which is not to say that you don’t need a professional photographer: you do need a trained photographer to set up workflows and systems, and quality-check the teams’ work – though you could employ one as a consultant to set systems up and come in every so often to review quality, if that’s all you can afford.
Image checking and post-processing takes time – but is worth it as it significantly improves the quality and usefulness of the images.
The developer of your collections management system is your friend: they may well be able to help you with tools and tweaks which will make your life much easier. Even if you have to pay for a tool, it may more than recoup its costs in the time it saves during the project. If there’s enough demand for a tool, you may be able to share development costs with other users – or ask the developers to add it to the next release.
The photographer continued to take professional-quality photographs of anthropology objects throughout the review, assisted by a series of photography volunteers. We also took the opportunity provided by clearing out the ground floor hall to retrieve our largest objects (including a 7 metre canoe) from store and photograph them. With the help of external art handlers, we dismantled several shelves to make space, set up a large temporary studio in the hall, retrieved the objects, photographed them, reviewed them, returned them to storage, dismantled the studio and rebuilt the shelves. Our photographer also produced a time-lapse video of the process.
The Collections People Stories project aimed to open up our anthropology collections, and one way we did this was by simply putting every object that had been reviewed on the web, increasing our online collections from about 2,500 objects to well over 30,000.
The teams tweeted their work using the @HornimanReviews account, but they were also encouraged by our digital media team to use Tumblr to post images of interesting or unusual objects. This was a runaway success: our ‘In the Horniman’ account had secured 37,000 followers by the end of the project – as well as winning the social media category of the Museums and the Web conference’s Best of the Web Awards in 2014.
(Of course, we put it on our mantelpiece.)
Be brave about putting objects online. An adequate record is better than no record at all, and people will let you know if they find a mistake in it. We also decided our default position would be to publish wherever possible, so unless it was very clear that publishing an object would cause offence, we put it online. We had no complaints.
Advertise what you’re doing: social media loves a cheese horse – or anything that relates to work in museum stores. Trust your review teams and give them free rein: they’re closest to the objects, they’ll be sensible about what they post, and if they’re enthusiastic about them, this will lead to great social media, which can only help improve your museum’s profile.
We followed the same basic reporting regime as we used for the register review. The graphs we produced showed the extent of the other problems which the review was revealing, including objects needing to be marked, measured and repacked. We did our best to resolve these using our regular SCC student placements.
We used project budget surpluses to appoint various short-term members of staff to work on data backlogs: correcting our authority files for cultures and object names, and entering a mass of information about our objects returned by external experts in response to questionnaires sent out by the curators.
Get your authorities straight first. It will save the teams’ time and make your data better and more consistent sooner; your searches will work better; you won’t have to spend time during the project dealing with problems caused by ambiguous termlists; and it’ll be much easier to add the new terms which the project will inevitably need.
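In practice, much of this tidying boils down to mapping variant or legacy terms onto a single preferred term before the review starts. A minimal sketch, with invented terms:

```python
# Map variant and legacy terms onto a single preferred term (terms invented).
PREFERRED = {
    "frame-drum": "frame drum",
    "Frame Drum": "frame drum",
    "drum (frame)": "frame drum",
}

def standardise(term: str) -> str:
    term = term.strip()
    return PREFERRED.get(term, term)

assert standardise("frame-drum ") == "frame drum"
assert standardise("slit gong") == "slit gong"  # unknown terms pass through
```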
Plan how you will retrieve and enter information that is captured outside the collections management system. So, if you’re asking external experts for information about objects, think about how you can get them to structure their responses so the new information can be extracted, standardised and uploaded as easily as possible.
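For example, if experts return a pre-formatted spreadsheet with one row per object and fixed columns, the responses can be harvested mechanically. A minimal sketch, with hypothetical file and field names:

```python
import csv

# Harvest a pre-formatted questionnaire: one row per object, fixed columns.
# All file and field names are hypothetical.
updates = []
with open("expert_response.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        for fieldname in ("place_made", "culture", "date_collected"):
            value = row.get(fieldname, "").strip()
            if value:
                updates.append((row["object_number"], fieldname, value))

# 'updates' can now be checked against the authorities and bulk-loaded.
print(f"{len(updates)} field updates extracted")
```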
We used project staff to address some of the other problems: the photographer dealt with image problems (completing the largest task, renumbering erroneously-numbered images, just after the review formally ended). Our biggest success was in reconciling duplicate records, to which the teams devoted several weeks over the winter of 2014-15. After a slow start caused by technical problems, and with the assistance of a newly-developed object record reconciler delivered by Selago, the teams made major inroads into the backlog which had built up during the project.
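Selago’s reconciler was their own tool, but the first step of any such exercise, finding candidate duplicates, can be sketched generically: group records whose object numbers match once punctuation and leading zeros are stripped, then review each group by hand. The records below are invented:

```python
from collections import defaultdict

def number_key(object_number: str) -> tuple:
    # Keep only digit runs, dropping punctuation and leading zeros, so that
    # '1976.123' and '1976-0123' produce the same key.
    runs = "".join(c if c.isdigit() else " " for c in object_number).split()
    return tuple(run.lstrip("0") or "0" for run in runs)

records = [  # invented examples
    {"id": 1, "number": "1976.123"},
    {"id": 2, "number": "1976-0123"},  # same object, transcribed differently
    {"id": 3, "number": "nn1402"},     # temporary number, no match expected
]

groups = defaultdict(list)
for record in records:
    groups[number_key(record["number"])].append(record["id"])

for ids in groups.values():
    if len(ids) > 1:
        print(f"Candidate duplicates {ids}: review and merge by hand")
```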
However, there were other areas where we failed to follow up properly on the work identified by the review:
- Curatorial guidance
- Data corrections
- Top and part records
- Incorrect records (deletions)
There was also a great deal of follow-up work that we had made some inroads into, but not completed:
- 227 general data corrections to be made
- 151 images to be reorientated
- 124 incorrect images to be removed
- 137 duplicate records to be reconciled
- 197 reconciliations to be researched
- 419 incorrect records to be deleted
- 1,530 objects to be conserved
- 6,238 experts’ reports to be entered
- 4,967 objects to be marked
- 4,292 objects to be measured
- 146 objects to be given part numbers
- 250 objects to be rehoused
- 978 objects to be repacked
These two lists teach us an important lesson:
Be realistic about the project’s impact on your documentation and collections management teams: the review will need a significant amount of technical support, whether it’s tweaking the database’s configuration, drawing up report templates, creating new authority records, following up on problems identified during the review, supporting experts’ visits to the stores, helping retrieve awkward objects, or following up on the problems with badly packed or stored objects. If your staff are already just about managing to keep up with business as usual, make sure you have the additional resources you need to support the project.
But not all these figures are entirely accurate: we primarily recorded whether objects had been reviewed and photographed, or needed follow-up work, so we weren’t particularly rigorous about ensuring that everyone flagged other work done. In particular, the review teams carried out a lot of marking and repacking as they went along, and this wasn’t captured.
Monitor progress and report on it regularly, to the teams themselves and to the project more widely. Plan the figures you will need to collect to monitor progress at the beginning of the project, and set up a system to record them, extract them, and draw up the necessary reports at regular intervals. The weekly reports took the Documentation Manager about an hour to create; with some more thought and a bit more time investigating Crystal Reports, they could quite possibly have been drawn up even more quickly.
The same is true for identifying additional work that may be needed: set up a system for logging problems and further work that’s needed – and for signing them off once they’re completed. This will help reporting, and lead to a pleasing feeling of achievement as you see how much work is actually being done.
The graphs we produced for the weekly reports meant that we could keep a good eye on progress, adjusting as we went along the baseline figure the teams would need to keep to if we were to meet our target. We also used them to keep track of photography.
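The baseline arithmetic itself is trivial, which is exactly why it is worth automating and graphing rather than recalculating by hand. A minimal sketch, using the project’s published target and duration (the halfway figures are invented for illustration):

```python
# Recompute the weekly baseline from the project's target and duration.
TARGET_ITEMS = 30_000   # items to review (the project's stated aim)
TOTAL_WEEKS = 122       # the review's overall duration

def weekly_baseline(reviewed_so_far: int, weeks_elapsed: int) -> float:
    """Items per week the teams must now average to hit the target."""
    return (TARGET_ITEMS - reviewed_so_far) / (TOTAL_WEEKS - weeks_elapsed)

# e.g. at the halfway point, with 13,000 items reviewed:
print(f"{weekly_baseline(13_000, 61):.0f} items per week needed")
```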
More importantly, the whole review team, other collections management and documentation staff, and the project co-ordinator and sponsor met every fortnight to review progress, hear about work done in the larger project, plan future work, and raise and resolve any problems as they occurred. A member of the review team, chosen in rotation, also attended the monthly steering group meetings, ensuring that the team’s voice was heard at the meeting and that the team kept abreast of the project and its priorities.
The regular reports meant that we were aware of significant milestones before we reached them – for example, the review of our 10,000th or 20,000th object – allowing us to publicise them to the rest of the Museum, and to celebrate, whether with cake (an SCC favourite) or with a trip out to visit another museum, see what they were doing, and enjoy lunch and perhaps the pub afterwards.
Hold meetings for all review staff and the project’s managers regularly, and make sure everyone has a chance to speak. Frequent communication means that problems will be quickly identified and resolved, and all participants in the project understand each other’s concerns, priorities and constraints.
Celebrate your achievements frequently, marking notable milestones so that the teams feel their work is appreciated.
So: did it work? In brief, yes: between the teams and curators, we reviewed or signed off over 32,000 items by the end of the project:
Over 122 weeks
28,925 items were reviewed by the teams
(average 237 items per week for 4 people)
3,869 items were signed off by curators
955 star objects were identified
746 two-star objects were identified
450 potential disposals were identified
We took over 69,000 photographs:
Over 122 weeks
the review teams took
51,363 photographs of 29,719 items
the curators took
7,578 photographs of 3,634 items
the photographer took
10,159 photographs of 3,899 items
We did a huge amount of follow-up work:
Over 122 weeks
741 objects were conserved
2,572 experts’ reports were entered
2,988 objects were marked
1,947 objects were measured
over 183 objects were rehoused
over 288 objects were repacked
582 incorrect numbers were corrected
1,599 duplicate records were reconciled
And we significantly increased our online presence and attracted a significant social media following:
More than
30,000 records online
1,000 Twitter followers
37,000 Tumblr followers
and 90,000 Tumblr page views
– and won an award along the way.
And what was the secret of our success? If I were to leave you with two tips above all, they would be these:
Plan and prepare; plan and prepare; plan and prepare. Time spent planning is never time wasted and, in retrospect, there were areas of the review that we didn’t anticipate but perhaps should have done. We hope that these tips will help other museums when they come to plan their own projects.
Trust, value and support your staff. The people on the ground, doing the reviewing, are your greatest asset: they are knowledgeable and dedicated, and have the best interests of the collection at heart. Give them the chance to express this in the project, and the results will speak for themselves.
And having said that, I think it’s only fitting that I finish with the names of the volunteers and placements who helped us, and the teams that were directly employed to work on the project:
- Adam Woodfield – Placement/Volunteer
- Alexandra Cantrill – Conservator
- Alexandra Skelton – Placement/Volunteer
- Alison South – Documentation Assistant
- Alix Dove – Collections/Documentation Assistant
- Ann Wallace – Placement/Volunteer
- Bronwen Roberts – Conservator
- Carolyn Carlins – Placement/Volunteer
- Cecilia Vascotto – Placement/Volunteer
- Charles Kottke – Placement/Volunteer
- Charlotte Mayhew – Placement/Volunteer
- Chris Olver – Archive Assistant
- Clare Plascow – Collections/Documentation Assistant
- Dani Tagen – Photographer
- Dominique Russell – Placement/Volunteer
- Elizabeth Baxter – Placement/Volunteer
- Gilberto Martinez – Photographer
- Gina Nam – Placement/Volunteer
- Heather Osborne – Placement/Volunteer
- Jack Mitchell – Collections/Documentation Assistant
- Jonathan Pascal – Placement/Volunteer
- Kitty Clarke – Placement/Volunteer
- Kitty Wong – Placement/Volunteer
- Laura Cronin – Collections Assistant
- Lizzie Cooper – Collections/Documentation Assistant
- Louise Bascombe – Placement/Volunteer
- Mariana Garcia – Placement/Volunteer
- Naomi Russell – Placement/Volunteer, Data-entry Assistant, Documentation Assistant, Collections/Documentation Assistant
- Natasha Logan – Documentation Assistant
- Nicholas Crowe – Documentation/Collections Assistant
- Rachael Utting – Documentation/Collections Assistant
- Rachel Jennings – Documentation/Collections Assistant
- Samantha Hadley – Placement/Volunteer
- Sarah Mahood – Collections/Documentation Assistant
- Steven Evans – Placement/Volunteer
- Yasmin Downie – Placement/Volunteer
Further information
The Horniman produced a short video about the review and its role in the broader context of the Collections People Stories project in July 2013.
In addition:
- The Collections People Stories project has a page on the Horniman’s website.
- The redisplay project that draws upon the review is also called Collections People Stories; it, too, has a page on the Horniman’s website.
- There are a large number of blog posts on the Horniman’s website relating to the larger Collections People Stories project, including several that are specifically about the collection review.
- Chris Olver, the archive officer, gave a talk about the history file review to Horniman staff.
- Dani Tagen, the photographer, gave a talk about achieving good photographs with non-expert photographers [PDF] at the AHFAP annual conference at Tate in November 2013.
- Rachel Jennings, one of the Documentation/Collections Assistants, wrote about the project’s use of Tumblr on the NatSCA blog in June 2015.
- The project’s Tumblr is In the Horniman.