A GC-MS and LC-MS Open Access Software Solution – User experiences


Dr Peter Howe1, Dr Jackie Mosely2 and John H Moncur3

1Syngenta Research and Development, Jealott’s Hill, UK

2University of Durham, Durham, UK

3SpectralWorks Ltd, The Heath Business & Technical Park, Runcorn, UK

First published in Chromatography Today, Volume 11, Issue 4, Buyers Guide 2019

Advances in high-throughput techniques, chemistries and analyses have revolutionised the research landscape. The use of automated systems underpins the majority of chromatography-mass spectrometry approaches. This approach has been adopted in the pharmaceutical industry and throughout the ‘omics’ fields, but it has yet to be fully accepted within some of the chemistry communities. In many cases the high-throughput approach means that the vast majority of users are not specifically trained chromatographers or mass spectrometrists.

They don’t need to be, but sample submission, data acquisition and data presentation now demand a user-friendly, robust client interface – the so-called open access software. This allows GC-MS and LC-MS to be fully utilised as research tools in wider application fields that have their own specialists, who no longer need the day-to-day, hands-on capability of running their own GC-MS and LC-MS samples. The expert users can control the access that users have to particular techniques or instruments, and because no direct interaction with the instrument is required from the user, common mistakes and problems with sample analysis can be reduced.
This article provides unbiased, in-depth insights from end users into their software of choice: the challenges they face, how it performs in their hands and the features they employ.
Firstly, an industrial user, Dr Peter Howe, Senior Technical Expert at Syngenta in Reading, UK, provides his insights into the use of their open access solution for GC-MS, LC-MS and NMR.

“Walk-up analysis is an integral part of modern chemistry laboratories. Thanks to improvements in technology and data processing, non-specialist users can rely on today’s NMR and mass spectrometry instruments to deliver high quality results. Even 2D NMR experiments and high-resolution mass spectra can be routinely delivered in minutes. Sample submission is facilitated by using sample submission templates such as that shown in Figure 1.”

Figure 1: User Sample Submission Page for Accurate Mass Determination. Additional user input can be captured in the relevant template depending on experimental requirements.

He continues; “Although this revolution has greatly enhanced the quality of analytical data used within laboratories, it poses several challenges for facility managers. The first is that instrument software is not always designed with non-specialist users in mind; even when walk-up interfaces are included, they can differ significantly between different manufacturers, increasing the training overhead. The second major challenge is coordinating methods and data files between instruments. Users want to be able to access their data quickly and easily while the facility manager needs to ensure user accounts are kept up-to-date
on different instruments, and that there is no risk of users over-writing each other’s data files.”
“We chose SpectralWorks RemoteAnalyzer for our laboratory because it addresses all these problems. Our users have the same interface for submitting samples to NMR, GC-MS, LC-MS and high-resolution LC-MS (Figure 1), and the data is delivered back to them through a single web interface which lists all the samples they’ve submitted. The interface includes search capabilities to find samples easily and a basic MS
data viewer as shown in Figure 2. As facility managers, we value only needing to create new user accounts on one system, rather than on seven different instruments, and we can be sure that every data file created within the system has a unique name. We have also noticed improved reliability, because users submit samples via the web interface rather than needing to interact with the instrument operating software.”

Figure 2: Interactive Data File Viewer. Allows MS data processing.

“We did have initial concerns about the scalability of the system, because our laboratory records almost 50,000 sample submissions every year from about 70 different users. However, we now have
over 200,000 samples in the database with no significant change in performance, and we recently moved the system to a cloud provider, giving the potential for even more scalability. We are looking forward to the planned enhancements that will see improved LC-MS data processing capabilities and integration with our electronic laboratory notebook (ELN), both for sample submission and data delivery.”

Cloud-based solutions, such as Microsoft Azure or Amazon Web Services (AWS), are becoming more acceptable in all industries. Initial fears over data security have not materialised, and the integrity and redundancy options within cloud-based solutions are very appealing. Both Microsoft and Amazon provide extensive details and information regarding their cloud services.
Secondly, the University of Durham (UK) provides extensive open access MS capabilities, with Dr Jackie Mosely, Dr David Parker and Peter Stokes overseeing a system which runs over 30,000 samples per year.
Dr Jackie Mosely comments as follows:

“We are a busy research-focused facility configured to provide every mass spectrometry-based requirement for a world-leading University. With 8-plus mass spectrometers at any one time, our asset base is designed to be flexible enough to support changing instrument platforms and configurations, changing instrument providers and, perhaps more critically, changes in local research priorities. The open access solution was able to slide seamlessly into our existing instrument portfolio and was adaptable enough that we could use it to fully manage all our operations. Specifically:
1. we use barcodes to label vials
2. users can then choose from a range of analyses including:
• GC-MS (polar or non-polar column)
• LC-MS (flow injection or LC with a choice of solvents; positive ion, negative ion and photodiode array data delivered by default)
• MALDI (collected in a tray and run by staff)
• ASAP (a probe technique run by staff)
• high resolution accurate mass analysis (performed by using the ‘resubmit’ function following successful low-resolution analysis)
• ‘by arrangement’ (information can be uploaded, or a user can discuss with staff any requirements for unique analyses)
3. users remotely log their sample information against a barcode number
4. a barcode scanner associated with a touch screen monitor in our open access lab completes submission when samples are dropped off
5. samples can be submitted to open access instruments, in which case the user is directed to a position in an autosampler, or alternatively samples can be submitted to a tray for staff to run more bespoke or advanced analyses”
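At its core, the submission workflow just described amounts to a record keyed on a barcode that moves from remote logging to confirmation at the instrument. The minimal sketch below illustrates that idea only; the field names, the confirm step and the example values are assumptions made for illustration and do not represent RemoteAnalyzer’s actual data model or API.

from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class SampleSubmission:
    """One sample logged against a barcode (illustrative fields only)."""
    barcode: str                              # printed on the vial label
    user: str                                 # submitting researcher
    experiment: str                           # e.g. "GC-MS", "LC-MS", "MALDI", "ASAP"
    notes: str = ""                           # free-text sample information
    logged_at: datetime = field(default_factory=datetime.now)
    confirmed_at: Optional[datetime] = None   # set when the vial is scanned at drop-off

    def confirm(self) -> None:
        """Barcode scan at the open access lab completes the submission."""
        self.confirmed_at = datetime.now()

# A user remotely logs sample information against a barcode number...
submission = SampleSubmission(
    barcode="DUR-0012345",
    user="a.student",
    experiment="LC-MS",
    notes="crude reaction mixture, expect [M+H]+ at m/z 415.2",
)

# ...and scanning the vial at the drop-off touchscreen confirms it.
submission.confirm()
print(submission.barcode, submission.experiment, submission.confirmed_at)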

Management Comments on the software (Dr Jackie Mosely, Senior Research Officer in the Department of Chemistry)
The introduction of a purely digital, web-based solution has had an impact in many areas from a management point of view, from cost savings and space savings through to health and safety benefits. Our sample throughput has nearly tripled since installation. The installation of a walk-up system offers these features and benefits:
Cost saving:
1) Now operating with this paperless system means there are no printers to maintain and replace, no paper and no toner, lower power consumption and lower heat output into busy, noisy laboratories. This also meant we were unaffected by a recent University decision to centralise printing.
2) In moving over to a virtual machine (VM) we have no requirements to replace the computing hardware, operating system or server software every 3 to 5 years.
3) Greenspace is the University’s environmental initiative, so any change in practice that reduces the carbon footprint, by being more energy efficient and requiring fewer consumable items, is helping the University achieve its environmental targets.
4) Users have direct access to instruments and so our sample capacity has increased. The ability to handle more samples means we get better value for money out of our instruments.
5) The system has enabled us to widen our user base beyond the confines of our immediate locale, reach new communities and thereby further expand the accessibility and understanding of mass spectrometry.
Space saving (this is a crucial element as space is at a premium, so we need to manage what we have more effectively):
1) From a facility management point of view, we no longer need to house printers and store printer paper, spectra or sample submission documents. This has the knock-on benefit of greatly reducing a potential fire hazard in the lab, thereby satisfying local health and safety officials.
2) Users no longer need to store their paper results. Hot-desking is more achievable and remote users can access their results effectively and immediately.
Time saving:
1) Recently, Research Councils UK (RCUK) decreed that all publications and theses produced from RCUK-funded projects must have the original data and appropriate metadata made available publicly. Some peer-reviewed journals are now following suit. This is easily managed by the user, or their supervisor, who may be publishing work produced by a group of contributors or former students, as they can collect all the necessary mass spectrometry-based data and metadata in one place.
2) Our users now choose from a list of possible experiments. Simply having to make this choice means they must read and understand what that list contains. At Durham this meant more exposure to LC separation prior to MS detection. As this experiment rapidly became the most popular, it became impossible for staff to report all necessary information in a printed spectrum. The only solution was for users to interact directly with raw data. Such interaction has had many positive outcomes, including increased understanding of mass spectrometry and its data interpretation by users; users taking much more ownership of their analysis and subsequent data interpretation; and users now often interpreting much more of their data, rather than just looking for what they expect or hope to see, and understanding more about their chemistries. All of this has dramatically reduced the amount of basic data processing required by staff.
3) RemoteAnalyzer takes care of the whole workflow from sample logging (Figure 3) to processing, obviating the need for an operator at all stages and thereby releasing staff for non-routine or specialist tasks. As a direct consequence, staff spend a much larger percentage of their working day performing tasks more appropriate to their pay grade.
4) As staff time has moved from performing routine analyses to getting more involved in complete research projects, their roles have become more fulfilling.

Figure 3: Administrator Reports. Simple Sample Usage reports. Other reports include reporting by Experiments, Groups, Instruments, Projects and Users. This allows detailed utilisation information to be generated quickly and easily.
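The reports in Figure 3 are, in effect, submission records grouped and counted by experiment, group, instrument, project or user. As a rough illustration of that kind of tally (and nothing more: the CSV export and its column headings are assumptions for the example, not RemoteAnalyzer’s reporting engine), the sketch below counts submissions per chosen field.

import csv
from collections import Counter

def usage_report(export_path: str, group_by: str) -> Counter:
    """Tally sample submissions by one field of a hypothetical CSV export.

    Assumes one row per submission, with column headings such as
    'experiment', 'group', 'instrument', 'project' and 'user'.
    """
    counts = Counter()
    with open(export_path, newline="") as handle:
        for row in csv.DictReader(handle):
            counts[row[group_by]] += 1
    return counts

# Example: samples per instrument over the exported period.
if __name__ == "__main__":
    for instrument, n in usage_report("submissions.csv", "instrument").most_common():
        print(f"{instrument}: {n} samples")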

Technical support comments
(Dr David Parker and Mr Peter Stokes)
Administration:
1) Setup is easy and intuitive. It provides one customisable system that administers and coordinates the use of all our instruments, allowing staff to oversee and monitor the use of multiple instruments simultaneously. It is particularly easy to set up and manage multiple users and research groups, granting different levels of access as needs arise. We use this to develop bespoke experiments for new research applications and assign them to select groups, thereby making advanced application-specific solutions appear routine to the end user.
2) Record keeping is done without human intervention. This provides a searchable electronic register with date- and time-stamped information.
3) User training. All training material is accessed through the web-based system itself; collating it all in one place makes it easier for staff to update and for users to locate at the relevant point of access.
4) Improved ability to organise workload. Everyone, from managers to MS staff to users, benefits from real-time monitoring of instrument queues to optimise their workload. MS staff are able to see what is coming and plan accordingly. This is particularly beneficial for analyses performed by staff, who can then group appropriate samples accordingly.
5) Communication. The landing page provides a very convenient place for a message board. Emails are seen as transient information; the landing page can hold relevant information for longer and is clearly visible at the point that it is relevant.
IT support comments (Mr Alan Harland)
We made the decision to involve a University IT representative from the outset. This meant that the University IT group had a say in designing the infrastructure, thus ensuring it complied with the University policy of the time and was as future-proof as possible, so that it could easily be supported and maintained. Six years of IT history:
1) Initial setup: Xeon-based server with 8 GB RAM and 1 TB storage set up as RAID 1 (mirrored), running Microsoft Windows Server 2012 with Microsoft SQL Server 2012. This was the minimum specification from SpectralWorks and was perfectly suited to the task at hand.
2) Initially we used Active Directory to manage the user base, which was the preferred solution, but moved to local authentication to comply with a change in University policy. It was easy to make this change and RemoteAnalyzer makes it easy for the MS staff who now manage user groups.
3) As hardware became outdated and University IT strategy changed, we migrated to a virtual machine, firstly to VMware and then more recently to Microsoft Hyper-V. Moving to a virtual machine means that we have ‘future-proofed’ a core component of our facility and can flip between platforms as any need arises with no disruption to the users.
4) Virtual machines have enabled us to be part of a larger entity yet retain local control. We can expand as we need to, with the exception of memory, but 8 GB has always been plenty.
5) The IT management side is very easy; IT support staff handling back-end management will already be familiar with the interfaces required to log on.
6) University policy now requires us to back-up data off-site. This is much more easily facilitated and maintained at an IT level when integrated with a larger entity, and much more cost-effective.
7) The web-based, multi-platform browser interface makes it very easy for IT to support.
Academic User Comments
(Dr Elizabeth Grayson)
By her own admission, Dr Grayson is the least computer-savvy person in the Department, but she found the system very easy to learn and very straightforward to use in support of her research in organic synthesis. The ability for MS staff to place written instructions alongside the point of submission is helpful and comforting. Everything is in one place. Dr Grayson also manages student researchers, and as a supervisor she has found that her student cohort learn and operate the open access solution without necessitating any input from her. The software is logical, and the training provided and accessible instructions mean the students are always up and running very quickly. Whilst Dr Grayson can view her students’ accounts, this is only really used to support them, not to monitor them.
Academic User Comments
(Professor Ian Baxendale)
Professor Baxendale’s main interaction is as a user himself. He found it very easy to set up and then track his MS work. He does use it as a portal to student data to answer some queries, but not to monitor progress or review data (many students are international or visiting for short periods, and so more internationally accessible software solutions make data and data processing more portable).
Student User Comments (Alice Harnden)
Alice, a PhD student in the group of Professor David Parker FRS uses the facility through RemoteAnalyzer several times every working day. Easy access to MS analysis is essential to Alice’s research involving
synthetic organic chemistry.
‘RemoteAnalyzer was very easy to learn and is very easy and obvious to use. When the resubmit capability became available and the Facility changed high resolution accurate mass submissions to make use of it, it took a little adjusting, but ultimately the solution is much to the user’s benefit.’
She explained her daily interactions and how they now direct her workflow:
1) Generally, she starts the day by running fast low-resolution LC-MS on a crude reaction mixture to see if it is worth working up for high resolution accurate mass analysis.
2) By using RemoteAnalyzer to monitor the queue on instruments, Alice will schedule her day accordingly, either continuing to run further low-resolution analyses or resubmitting for high resolution accurate mass measurement. If an instrument is busy, she may collect a set of samples and submit them in a group during a quiet period. If an analysis is time-critical, Alice will ask a member of staff to adjust the sample queue to accommodate a change in priority.

3) Reaction monitoring can be important, and by monitoring the queues or speaking to staff such requirements are easily accommodated.
4) If time is critical then the system provides an immediate view of the data.
Longer term interactions with the system and how that influences her sample and data management:
1) Electronic records, such as an electronic sample receipt (Figure 4), mean that nothing can get lost.
2) It is very easy to look back at the history of what has been done and check for any gaps or missing experiments; a simple resubmission to a different experiment can then easily solve this, and the sample is easily located thanks to the barcode (barcodes are more robust, as pen can rub off in time, and each barcode is unique). Alice does this every month or two as a matter of housekeeping.
3) The flexibility to include a lot of information about the sample is very helpful; previously one would have to rationalise spectra in one location against a lab book entry and then open ChemDraw to obtain the mass. Now the system contains all that information at the click of a button, so this is the first place to look.

Figure 4: Sample Submission e-receipt. The e-receipt can be read from a mobile device (e.g. tablet or phone) to allocate and confirm the sample at the instrument.

Student User (Alexandra Webster)
‘If it were not for batch submission I would not have achieved half of the work that I did over the summer’.

Alex collects large numbers of fractions from an HPLC, and so the introduction of batch submission really changed her working practices. No longer does Alex have to select a few examples from one HPLC run at a time to check, and then iteratively narrow in on the correct fraction(s). Now an Excel spreadsheet can be quickly and easily populated with all the fractions to be run in one go, so it takes just a couple of minutes to submit a batch, regardless of batch size. These large batches can be confirmed on the instrument through a simple touchscreen application (Figure 5) and are run overnight, so Alex is not worried about tying up a shared asset for hours on end during a busy working day, and her work continues long after she goes home for the day. In the morning the data is ready to be checked and the relevant fractions freeze-dried. It is really obvious how to set up and run batches, such that the students Alex oversees in her lab easily learn what to do and become self-sufficient very quickly.

Figure 5: Simple Sample Confirmation Touchscreen application. Located on or near the instrument, the user simply scans the sample barcode, places the sample in the autosampler and confirms the submission. Batches or plates can also be used.
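The batch spreadsheet described above is simply one row per HPLC fraction. The sketch below shows one way such a file might be generated; the column headings, the experiment name and the use of the openpyxl library are assumptions for illustration, as the actual template accepted by a facility’s RemoteAnalyzer installation is defined locally.

from openpyxl import Workbook  # third-party library for writing .xlsx files

def write_fraction_batch(run_id: str, n_fractions: int, path: str) -> None:
    """Write one row per HPLC fraction for a batch submission (illustrative columns)."""
    wb = Workbook()
    ws = wb.active
    ws.append(["Sample Name", "Vial Position", "Experiment", "Notes"])
    for i in range(1, n_fractions + 1):
        ws.append([f"{run_id}-F{i:03d}", i, "LC-MS", "overnight batch"])
    wb.save(path)

# Populate a whole run of fractions in one go, then submit the file as a single batch.
write_fraction_batch("HPLC-2019-042", n_fractions=96, path="batch_submission.xlsx")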

Undergraduate teaching
(Dr Jonathan Sellars)
The undergraduate laboratories were greatly enhanced by using the RemoteAnalyzer software. Students use the data to produce their reports for practical laboratory classes, which involve the interpretation of the data and identification of the products that they have made. Also, for some experiments the students use the data to monitor reaction progress. All of this enables the students to access a research-level piece of equipment and its results at an early stage of their undergraduate training, reaffirming the research-led nature of the chemistry course.
Notwithstanding this, Jon believes that the students gain a great deal more from this early interaction with the RemoteAnalyzer software, as these skills stand them in good stead for when they undertake final year MChem research projects. It was a very easy process to set up. All the students needed was a computer and an account to register their samples against. Jon recorded a video tutorial that they could watch on the computer, which meant it was very easy to manage and very little intervention was required by staff. Staff did, however, find RemoteAnalyzer a particularly easy means by which to monitor who was submitting samples.

User Conclusions
The user experiences reported for the open access solutions are examples from both academia and industry. In both cases the system facilitates the maximum utilisation of core instruments by all levels of user.
Academic users with their own research groups are able to use the system to support their students effectively and provide answers to queries that they may have. Allowing access to external visiting researchers means that this support can carry on after international collaborators have returned home. Undergraduate teaching is also extended to allow the use of modern, expensive analytical equipment and techniques that might not usually be made available to early career researchers or to students from disciplines other than core analytical chemistry.
Industry users have been able to offer biologists and synthetic chemists access to well-supported analytical chemistry techniques on a simple open access basis. This allows the expert users to concentrate on the more problematic samples rather than having to deal with the day-to-day provision of their chromatography services. Accurate utilisation and capacity data at a group, project and instrument level allows for improved service provision. This can identify bottlenecks within a workflow or support the development of further course material at the student level.