The Fundamentals of Laboratory Automation

ALTEN International Tech Week 

From 25 to 29 November 2024, ALTEN hosted the second edition of its International Tech Week, featuring a programme of seven webinars on the theme Automation Across Borders. One of these webinars was Lab Automation, presented by Pascal Roiné, a Computerised System Validation Engineer at ALTEN Switzerland.

How is automation contributing to life sciences? 

Beyond the purely technological aspect of automating a laboratory environment, implementing automation requires consideration of various factors:

  • Patient Safety
  • Product Quality
  • Data Integrity
  • Legacy

Lab automation refers to the utilisation of various automated systems and devices to execute numerous laboratory tasks. These tasks can range from simple repetitive actions to highly complex procedures. By deploying robotic systems or automated instruments, lab automation endeavours to perform these operations with unmatched precision, consistency, and efficiency.

The key aims of lab automation are:

  • Boosting productivity by increasing analysis volumes
  • Improving accuracy and reproducibility by limiting human error
  • Enabling various sample analyses and supporting the miniaturisation of assays, which involves handling smaller volumes of samples and reagents with precision, all while reducing costs and waste
  • Operating overnight or during weekends
  • Ensuring easy traceability and reducing the likelihood of data loss or errors in data recording

Automation is widespread across multiple sectors within the life sciences industry, such as pharmaceutical research, biotechnology, clinical diagnostics, and academic research laboratories. Each field benefits uniquely from automation, whether it’s speeding up drug discovery processes in pharmaceuticals or improving the accuracy of diagnostic tests in clinical settings.

Lab automation can range from deploying standalone automated instruments to fully integrated systems controlled by sophisticated lab orchestration software. Automation can be customised to handle specific tasks such as pipetting, sample handling, reformatting, incubation, centrifugation, and data analysis.

The journey towards the automation of a laboratory involves careful planning and execution. Before any implementation of automation technology, specific laboratory needs must be considered, whether it be budgetary constraints or long-term objectives.

Computerised Systems Categorisation 

Primarily intended for the pharmaceutical industry, GAMP 5®, a risk-based approach to compliant GxP computerised systems, has also been adopted as guidance by the medical devices industry and other highly regulated industries. According to the ISPE (International Society for Pharmaceutical Engineering), it takes the regulatory environment into account by:

  • Defining principles and procedures that help ensure that the products have the required quality
  • Detailing recognised standards for the validation of computer systems
  • Establishing a set of directives for manufacturers and users of automated or IT systems


Computerised systems can be divided into four GAMP software categories:

  • GAMP category 1 for infrastructure software
  • GAMP category 3 for non-configurable software
  • GAMP category 4 for configurable software
  • GAMP category 5 for custom or bespoke software (tailored software)
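As a minimal sketch, the categories above can be mapped to the degree of validation effort they imply. The category descriptions come from the list above; the effort summaries are illustrative paraphrases, not official GAMP wording.

```python
# Sketch: GAMP software categories and the validation effort they imply.
# Effort strings are illustrative summaries, not official GAMP text.
from enum import Enum

class GampCategory(Enum):
    INFRASTRUCTURE = 1    # operating systems, middleware, databases
    NON_CONFIGURABLE = 3  # software used as supplied
    CONFIGURABLE = 4      # configured to the business process, e.g. a LIMS
    BESPOKE = 5           # custom or tailored software

def validation_rigour(category: GampCategory) -> str:
    """Higher-numbered categories carry more risk and demand more validation."""
    return {
        GampCategory.INFRASTRUCTURE: "record version; qualify the platform",
        GampCategory.NON_CONFIGURABLE: "verify against requirements",
        GampCategory.CONFIGURABLE: "test the configuration against specifications",
        GampCategory.BESPOKE: "full lifecycle validation, from URS to PQ",
    }[category]

print(validation_rigour(GampCategory.BESPOKE))
# prints: full lifecycle validation, from URS to PQ
```

The risk-based idea is simply that the more a system deviates from off-the-shelf software, the more of the lifecycle must be verified.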

A closer look at lab automation in practice 

The main objective of Pascal’s laboratory project was to automate the part of the laboratory that was carrying out cytotoxicity measurements. Cytotoxicity is the degree to which a substance can cause damage to a cell.

This process automation had to consider regulations and user specifications, such as:

  • The business risks
  • The risks for patient health
  • The system architecture and its components
  • Critical Quality Attributes and Critical Process Parameters

The instruments already present in the laboratory were directly integrated into the measurement chain, while others had to be purchased with the measurement's characteristics, compatibility with other systems, and compliance with applicable regulations (21 CFR Part 11) in mind.

All of these considerations led to the most suitable choice for the automated system: a manipulator arm mounted on a linear rail. This robotic arm was able to move with precision and in a reduced volume, to effectively move microplates from one instrument to another in complete safety.

Supervision software controlled the robotic arm by running a computer programme specially adapted to these measurements, while also facilitating the exchange of data and enforcing the analysis method.

Implementing such an elaborate computerised system called for taking the following necessary steps:

  • Taking the local requirements into account
  • After the precise installation of the robotic arm, the initialisation and the position learning of different instruments
  • A debugging session to verify the correct communication between instruments, the chronology of movements, and data transfers
  • Validation process following GAMP 5’s driven approach for instruments and software
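The position learning and orchestration described above can be pictured with a small sketch. This is not the project's actual code: the class and instrument sequence are invented for illustration, loosely following the instruments named in the webinar.

```python
# Illustrative sketch of an orchestrator stepping a microplate through taught
# instrument positions while keeping an audit trail. API names are invented.
from dataclasses import dataclass, field

@dataclass
class Plate:
    barcode: str
    history: list = field(default_factory=list)  # traceability log

class ArmScheduler:
    """Moves a microplate through an ordered list of instrument stations."""
    def __init__(self, stations):
        self.stations = stations  # positions learned during initialisation

    def run(self, plate: Plate) -> Plate:
        for station in self.stations:
            plate.history.append(f"moved to {station}")  # log each transfer
        return plate

sequence = ["Tempest dispenser", "sealer", "incubator", "peeler", "plate reader"]
plate = ArmScheduler(sequence).run(Plate(barcode="PLT-0001"))
print(plate.history[-1])  # prints: moved to plate reader
```

Logging every transfer against the plate barcode is what makes the debugging session, and later the traceability requirement, tractable.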

Validation, the key to ensuring quality and reliability 

Validation is an essential process in the pharmaceutical industry. It is how products are shown to be consistently produced, controlled, and compliant with quality standards.

The V-model is a graphical representation of a system development lifecycle following a risk-based approach, which can be characterised by:

  • A downstream flow of activities listing the major stages of a product defined by the requirements/specifications up until its realisation.
    • User Requirement Specifications (URS) describe the business needs for what users require from the system.
    • Functional specifications incorporate a formal document used to describe the capacities, appearance, and interactions of a product with users in detail.
    • Design specifications are recorded into a document that explains exactly what criteria a product or process should comply with.

  • The junction between the downstream flow and the upward flow reflects the implementation and installation of the solution. At this stage, the solution is checked with:
    • The Factory Acceptance Test (FAT), which is performed by the supplier on their own site. It makes it possible to verify that the solution meets the requirements.
    • The Site Acceptance Test (SAT), also performed by the supplier, but on the client's site. It makes it possible to verify that the solution still meets the requirements once installed in the client's environment.

  • An upward flow, which lists the verification steps of activities within a qualitative framework.
    • The installation qualification is carried out to ensure that the elements are properly installed in the right place and that the information provided by the supplier is correct and consistent with the design specifications.
    • The operational qualification takes place to verify that the solution operates as expected, through the implementation of tests and documented evidence. This checks agreement with the functional specifications.
    • The performance qualification allows us to check that the process put in place functions properly and guarantees correct operation in routine use.

At each stage, it is imperative that deliverables are finalised and validated with a formalised quality assurance review. A routine-use authorisation, consolidated in a report, confirms conformity to the URS.
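The stage gating can be expressed as a toy rule: no stage starts before its predecessors are approved. The stage names follow the V-model description above; the function itself is invented for illustration.

```python
# Toy illustration of V-model stage gating: each deliverable must be approved
# before the next stage can begin. Stage order follows the text above.
STAGES = ["URS", "FS", "DS", "FAT", "SAT", "IQ", "OQ", "PQ"]

def next_allowed_stage(approved):
    """Return the first stage not yet approved (stages run in order)."""
    for stage in STAGES:
        if stage not in approved:
            return stage
    return None  # everything approved: routine use can be authorised

print(next_allowed_stage({"URS", "FS", "DS"}))  # prints: FAT
```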


Deep diving back into the project intricacies 

As shown in the flowchart above, Pascal’s laboratory automation project was divided into several subparts.

  1. The automated system, at the heart of the project, materialised by the robotic arm and its associated software, GBG Scheduler.
  2. A set of instruments:
    • A Tempest® Liquid Dispenser, allowing the injection of a defined quantity of live cells into the selected wells of the microplate.
    • A microplate sealer to glue a film over the microplate wells, avoiding any liquid spillage during the different movements.
    • A peeler, used afterwards to remove the glued film.
    • A Multidrop Reagent Dispenser, used as a liquid manager to add chemicals to each well, and then a second time to eliminate dead cells from the wells.
    • Incubators, providing a controlled, contaminant-free environment by regulating conditions such as temperature, humidity, and CO2.
    • A microplate reader to deliver DNA/RNA/protein quantification.
    • A MACS® Flow Cytometry Analyser, used for analysing individual suspension cells.

Each instrument went through a similar validation phase as a standalone instrument.

  3. The Laboratory Information Management System (LIMS). The communication and transfer of data through the LIMS was made possible by a dedicated Application Programming Interface (API) layer, MuleSoft.

A LIMS is a business application that manages and tracks analysis requests and the associated samples, test results, and other information required to issue the final certificate of analysis. Laboratory digitalisation relies on standardised interfaces that enable all devices to fully communicate with each other and to network with management systems, other software and hardware, as well as all human actors within the value chain.

Without communication protocols, it would be close to impossible to communicate effectively on a network. Data could be misinterpreted or completely lost. Protocols ensure that all devices “speak the same language” and can correctly interpret the data they receive. Once the technical constraints are overcome, the data follows a complex calculation pathway, with particular sequencing or even conversions, before it can be exploited during processing.
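The "speak the same language" point can be made concrete with a minimal sketch: the sender frames each message with a length and a checksum, so the receiver can detect corrupted or truncated data instead of silently misinterpreting it. The framing scheme here is invented for the example, not a protocol from the project.

```python
# Minimal illustration of message framing: a length prefix plus a short
# SHA-256 checksum lets the receiver detect corruption in transit.
import hashlib

def frame(payload: bytes) -> bytes:
    digest = hashlib.sha256(payload).digest()[:4]          # short checksum
    return len(payload).to_bytes(4, "big") + digest + payload

def unframe(message: bytes) -> bytes:
    length = int.from_bytes(message[:4], "big")
    digest, payload = message[4:8], message[8:8 + length]
    if hashlib.sha256(payload).digest()[:4] != digest:
        raise ValueError("checksum mismatch: data corrupted in transit")
    return payload

assert unframe(frame(b"OD600=0.42")) == b"OD600=0.42"      # round-trip intact
```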

The integration of the LIMS with automated systems allows for a transparent data flow and improves data traceability and reproducibility. The main difficulty in this project was coding the interface between the orchestrator software and the LIMS for:

  • Initialisation, setting of devices or acknowledgement,
  • Instructions,
  • Recipes,
  • Results

The API acted as the bridge here. An API is a set of rules or protocols that enables software applications to communicate with each other to exchange data, features, and functionality. However, this exchange is very sensitive to error, loss, or even corruption, which is unacceptable in a laboratory and, more generally, in the biotech and pharmaceutical industries.
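A hypothetical sketch of that exchange might look as follows. The message kinds are the four listed above; the class, field names, and acknowledgement shape are invented for illustration (the real project used MuleSoft as the API layer).

```python
# Hypothetical sketch of the orchestrator-to-LIMS exchange: each of the four
# message kinds is acknowledged and logged for traceability. Names invented.
import json

class LimsClient:
    def __init__(self):
        self.log = []  # audit trail of every message sent

    def send(self, kind: str, body: dict) -> dict:
        message = {"kind": kind, "body": body}
        self.log.append(json.dumps(message))
        return {"status": "ack", "kind": kind}  # LIMS acknowledges receipt

client = LimsClient()
client.send("initialisation", {"device": "plate-reader-01"})
reply = client.send("results", {"sample": "S-123", "viability_pct": 87.5})
print(reply["status"])  # prints: ack
```

Keeping an append-only log of every message is one simple way the interface can support traceability even when an individual transfer fails.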

Data validation was carried out to ensure one of the fundamental pillars of information management: data integrity.

Data integrity refers to the accuracy and consistency of data throughout its entire lifecycle. It ensures that information is not altered in an unauthorised way and that it remains reliable and accurate, regardless of when it is consulted or used.

Data integrity encompasses not only the prevention of accidental errors, but also protection against intentional manipulation.

1. Physical integrity ensures that data is physically stored without material corruption, with the help of:

  • Redundancy systems
  • Non-erasable media
  • Regular backups
  • Protections set in case of natural disasters
  • Environmental controls

2. Logical integrity ensures that data is correct and consistent within a logical model, in line with several important aspects, notably the ALCOA principles:

  • Database constraints to maintain data correctness (Uniqueness, respect of defined rules or conditions, …)
  • User data integrity ensuring that the data entered by users is valid and complies with expectations (Format, Value Range, Consistency check, Code injection protection, …).
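The user-data checks listed above can be sketched in a few lines. The field names, accepted formats, and value ranges are invented for illustration; the point is the categories of check: format, value range, consistency, and a crude injection guard.

```python
# Illustrative user-input checks: format, value range, consistency, and a
# crude guard against code injection. Field names and limits are invented.
import re

def validate_sample_entry(entry: dict) -> list:
    errors = []
    # Format: sample IDs like "S-0042"
    if not re.fullmatch(r"S-\d{4}", entry.get("sample_id", "")):
        errors.append("sample_id: bad format")
    # Value range: plausible incubation temperature for cell culture
    temp = entry.get("temp_c")
    if not isinstance(temp, (int, float)) or not 4 <= temp <= 45:
        errors.append("temp_c: out of range")
    # Consistency: end time must not precede start time
    if entry.get("start") is not None and entry.get("end") is not None:
        if entry["end"] < entry["start"]:
            errors.append("end before start")
    # Injection protection: reject quoting characters in free text
    if any(ch in entry.get("comment", "") for ch in ";'\""):
        errors.append("comment: disallowed characters")
    return errors

ok = {"sample_id": "S-0042", "temp_c": 37, "start": 1, "end": 2, "comment": "ok"}
print(validate_sample_entry(ok))  # prints: []
```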

Understanding and implementing robust practices maintain this integrity. Organisations must not only comply with regulatory requirements such as the General Data Protection Regulation (GDPR) but also prioritise the protection of their digital assets and maintain the trust of their customers and partners. Constant vigilance and continuous improvement of data management processes are key to meeting the current and future challenges in this crucial area of laboratory automation.

Conclusions 

In today’s world, laboratories have become an integral part of many industries. From biotech and pharmaceuticals to environmental sciences, all rely on lab research and data. Accuracy, speed, and efficiency have become the primary goals of every lab, which manual procedures can no longer guarantee.

Apart from ensuring high-quality outcomes, automated laboratories save time, implement error-reducing mechanisms, promote consistency, create safer work environments, and offer advanced data management capabilities. Let us simply not forget that they also have their limitations, such as requiring large investments and the need to stay constantly up to date with the latest developments.