Refining training of non-human primates using automated home room training systems

Why did we fund this project?

This award aims to refine behavioural neuroscience research using monkeys by installing Mymou training systems at eight research groups, allowing animals to be trained in their home environment.

Monkeys are used to study the brain activity underlying behaviour and cognition. Experiments are performed in which brain activity is recorded or imaged while animals respond to a sensory stimulus by pressing a button on a touchscreen or moving a joystick. Each animal requires training before recordings can begin. Training typically requires the animal to be restrained in a chair within the laboratory, away from its home environment and social group. Each of these factors can increase the animal’s stress levels. Dr Steve Kennerley and colleagues have developed Mymou – a bespoke, low-cost, open-source system that allows monkeys to be trained in their home environment, improving welfare. The Mymou device is a wireless touchscreen system that runs continuously and uses primate facial recognition to identify individual monkeys in the social group with over 99% accuracy. Monkeys are free to interact with the system at will and there is no need to restrict their movement for training. The Mymou system switches itself on and off automatically and can provide food/fluid rewards throughout the day and night. Using Mymou, two macaques completed on average over 1,200 trials per session between them, rising to over 4,000 trials in some sessions.

Steve and colleagues have used Mymou to successfully train macaques on an associative learning task with 48 different associations in three weeks. Using NC3Rs funding, Mymou will be installed in eight research groups and tailored with cognitive tasks specific to each laboratory. Additionally, a suite of standardised training tasks will be developed based on the requirements of the research community and made available through an online repository.

Traditionally, the process of training non-human primates (NHPs) on cognitive tasks takes place in the laboratory. This involves removing the NHP from the comfort of their home room and social group for training each day. NHPs then perform tasks in a confined testing chair, which restricts their mobility for the duration of the session and also limits the types of behaviour that can be explored. This process can increase stress levels for the NHP. Furthermore, the laboratory-based training schedule is dictated by the experimenter, with NHPs typically only trained for 1-2 hours/day, 4-5 days/week, prolonging the training process and exacerbating these welfare concerns.

To overcome these issues, we recently developed the Mymou (Greek for "monkey", pronounced my-moo) system, a low-cost home room training system for NHPs (Butler & Kennerley, 2018). The wireless device runs continuously and automatically all day, including weekends, allowing NHPs to perform tasks in their own home room at their leisure and in comfort. This eliminates any need to restrict the NHP's movement, including head restraint, or to remove them from the security of their home room for training. Furthermore, the system uses a camera to snap a 'selfie' of the NHP for each trial they start. This selfie is run through a custom monkey facial recognition algorithm we developed that identifies individual NHPs with >99% accuracy. This enables NHPs in the same cage to be trained without needing to be separated from one another. Mymou is therefore a significant refinement of the training process.
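
To make the per-trial flow concrete, the sketch below outlines the kind of loop such a system runs each time a trial is started: capture an image, pass it to a facial recognition classifier, and only credit the trial (and any reward) to the identified individual. This is a minimal illustrative sketch in Python, not the actual Mymou implementation (which is open source and described in Butler & Kennerley, 2018); all function names and the confidence threshold are hypothetical stand-ins.

```python
# Illustrative sketch only -- not the Mymou codebase. All names are hypothetical.
import datetime
import random

CONFIDENCE_THRESHOLD = 0.99  # hypothetical cut-off mirroring the >99% accuracy figure


def capture_selfie():
    """Stand-in for grabbing a camera frame when a monkey starts a trial."""
    return b"frame_bytes"


def identify_monkey(frame):
    """Stand-in for the facial recognition classifier: returns (subject_id, confidence)."""
    return random.choice([("monkey_A", 0.997), ("monkey_B", 0.995)])


def run_task_trial(subject_id):
    """Stand-in for presenting the touchscreen task and recording the response."""
    return {"subject": subject_id, "correct": random.random() < 0.8}


def deliver_reward(subject_id):
    """Stand-in for triggering the food/fluid reward dispenser."""
    print(f"{datetime.datetime.now().isoformat()} reward delivered to {subject_id}")


def on_trial_start():
    frame = capture_selfie()
    subject_id, confidence = identify_monkey(frame)
    if confidence < CONFIDENCE_THRESHOLD:
        return  # identity uncertain: skip rather than credit the wrong animal
    result = run_task_trial(subject_id)
    if result["correct"]:
        deliver_reward(subject_id)


if __name__ == "__main__":
    for _ in range(5):
        on_trial_start()
```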

Furthermore, the constant availability of Mymou in the home room enables NHPs to complete many more trials per week relative to training in the lab, refining the training process by reducing the overall time the NHP spends under protocol. This enables researchers to train NHPs on more diverse and sophisticated tasks, thereby allowing the experimenter to obtain higher quality data whilst potentially reducing the total number of NHPs needed to address their scientific aims.

Having now successfully refined NHP training procedures at University College London, this project will extend the approach to the other UK NHP neuroscience research centres (Oxford, Cambridge, and Newcastle Universities). For each of the eight end users we will develop personalised cognitive tasks relevant to their research, spanning attention, memory, learning and decision-making processes. We will then install these personalised systems in each centre and provide training on how to use the device. This will enable home room training, and the refinements it provides, for 26-36 NHPs across the three centres.

An important aim of this project is also to encourage adoption of home room training by the wider NHP community. The project will produce a suite of standardised training and cognitive tasks used across the varied domains of NHP behavioural research, giving users the flexibility to explore how their NHPs perform on different tasks. This will help optimise individual training regimens and also aid in assigning NHPs to the projects they are best suited for.

All of these resources will be made publicly available online in our actively maintained repository, providing a rich foundation of information to help NHP researchers get started with home room training. Furthermore, encouraging home room training across the UK will generate a critical mass of users to help encourage the wider community. The results from this project will be publicised through many channels (e.g. conferences, press releases, scientific papers), helping to drive the adoption of home room training worldwide and thereby providing both a refinement and a reduction for NHP research across the entire community.

Skills and Knowledge Transfer grant

Status: Active
Principal investigator: Dr Steven Kennerley
Institution: University College London
Co-investigator: Dr James Butler
Grant reference number: NC/T001291/1
Award date: Sep 2019 - Apr 2021
Grant amount: £73,326