Standardized workflows for increasing efficiency and productivity in discovery stage bioanalysis

Kevin P Bateman*1, Lucinda Cohen2, Bart Emary1 & Vincenzo Pucci3
1 Department of Pharmacokinetics, Pharmacodynamics & Drug Metabolism, Merck Research Laboratories, West Point, PA 19486, USA
2 Department of Pharmacokinetics, Pharmacodynamics & Drug Metabolism, Merck Research Laboratories, Rahway, NJ, USA
3 Department of Pharmacokinetics, Pharmacodynamics & Drug Metabolism, Merck Research Laboratories, Boston, MA, USA
*Author for correspondence: Email: [email protected]

Merck consolidated discovery stage bioanalytical functions into the Department of Pharmacokinetics, Pharmacodynamics & Drug Metabolism in 2007. Since then, procedures and equipment used to provide important quantitative data to project teams have been harmonized and, in many cases, standardized. This approach has enabled movement of work across the network of laboratories and has resulted in a lean, flexible and efficient organization. The overall goal was to reduce the time and resources spent on routine activities while creating time to perform research in new areas and technologies to support future scientific needs. The current state of discovery bioanalysis at Merck is discussed, including hardware and software platforms, workflow procedures and performance metrics. Examples of improved processes are discussed for compound tuning, LC method development, analytical acceptance criteria, automated sample preparation, sample analysis platforms, data processing and data reporting.

Background

Driving efficiency for bioanalysis in drug discovery has been an ongoing theme for many years in the pharmaceutical industry [1–4]. The reasons for this include improved sample throughput, in terms of both increased numbers of samples analyzed and reduced cycle times and resources for studies. The push for improved bioanalysis has been driven, to some extent, by the need to keep up with medicinal chemistry groups that employ new and creative ways of making larger numbers of compounds to screen against biological targets. Added to this is the desire to build large datasets in order to construct structure–activity relationship models to support discovery activities [5,6]. The final goal is to better support higher quality compounds advancing through discovery into development, and eventually the approval of new therapeutic agents that improve human health. Discovery bioanalytical groups continuously balance analytical risk against the effort required to execute studies. This is especially relevant given that less than one percent of the work is performed on compounds that will eventually become marketed drugs.

Improvements in instrumentation and software provided many of the efficiency gains reported in the literature. These include liquid handlers for sample preparation [7], chromatographic systems with faster injection-to-injection cycle times [8], more robust atmospheric pressure ionization sources for MS [9], and data processing tools for handling large datasets [10]. Bringing one or more of these new tools together helped drive overall sample throughput in many bioanalytical groups within pharmaceutical companies.

Starting in 2007, Merck globalized its drug metabolism and pharmacokinetics groups under one leadership team. With this new organization, a goal was set to implement harmonized, and where possible standardized, approaches to all aspects of bioanalysis across the laboratories. This included compound optimization, liquid handling for sample preparation, sample introduction technology, mass spectrometer platform, analytical acceptance criteria, data processing tools and data reporting templates. A metrics system was also implemented to track the volume of requests and cycle times for data reporting. This manuscript describes these efforts, the approaches used and the impact of a globalized and harmonized analytical capability. Real metrics on sample volumes and turnaround times are used to illustrate the efficiencies gained through these activities in support of in vitro and in vivo bioanalysis at Merck.


Standardized workflows for bioanalysis

Much discussion has been focused on the push for standardized processes in bioanalysis and the need for high-throughput analysis [11]. On the surface, standardized approaches can be seen as an impediment to innovation and creativity. However, much of the small molecule bioanalysis performed in a discovery environment is well defined in terms of analytical requirements and performance. As such, standard process designs and platforms reduce data variability and result in more consistent outcomes. Standardization increases confidence in the data and, in turn, is the foundation of speed. To quote Henry Ford, "Today's standardization… is the necessary foundation on which tomorrow's improvement will be based. If you think of 'standardization' as the best you know today, but which is to be improved tomorrow – you get somewhere. But if you think of standards as confining, then progress stops." Bioanalytical groups need to standardize and streamline wherever possible in order to create the opportunities to innovate in new areas, such as microsampling, biotherapeutics, biomarkers, microdosing studies and general improvements in quantitative analysis.

Before the globalization of Pharmacokinetics, Pharmacodynamics and Drug Metabolism (PPDM), each bioanalytical group provided support to the programs at its local research site. The deliverables, quantitative analysis of study samples, were the same as depicted in Figure 1. Investigation of the details of each step provided insight into best practices and highlighted areas for harmonization to drive efficiencies across the different groups. The focus was on providing common platforms and approaches that would work for the majority of compounds. In instances where processes were over-engineered, simpler approaches were instituted, and in cases where gaps were identified, new processes were established. Examples of improved processes are discussed below for compound tuning, LC method development, analytical acceptance criteria, automated sample preparation, sample analysis platforms, data processing and data reporting. The general approaches described below have mainly focused on PK studies, plasma protein binding samples, and P-glycoprotein and permeability assays, but the principles have also been applied to other in vitro and in vivo studies.

Compound tuning

After study compounds have been retrieved, the first step in building an LC–MS method is to determine the m/z values (Q1 and Q3) and voltages (i.e., declustering potential and collision energy) required for the SRM method. When only a few compounds are received, it is straightforward for an analyst to infuse the compounds into the mass spectrometer using a syringe pump to build methods. The downside to this approach is that each compound must be reoptimized for every mass spectrometer it is run on. For example, if one system is running in vitro studies and another system is running in vivo studies, the analyst would need to spend time on both systems building the methods. As the number of compounds increases into the thousands, the need for automated tuning becomes imperative. Groups that provided centralized ADME screening were the first to implement centralized compound optimization [12]. Merck developed a process for automated compound tuning, built using the Advion Nanomate chip-based infusion technology on an API3000 mass spectrometer [13]. Initially this was based on the Automaton software from AB Sciex and required that each group maintain its own local database of compounds.

Figure 1. Bioanalytical discovery workflow in Merck Research Laboratories: (1) study request, (2) compound tuning, (3) LC method development, (4) standards and QCs, (5) sample preparation, (6) sample analysis, (7) data processing and (8) data reporting.

With some assays centralized at only one site, a global database was generated to avoid re-tuning of compounds across the research sites. The current system uses AB Sciex DiscoveryQuant™ software running on API4000 mass spectrometers with Advion Nanomate devices. A single shared database of tuned compounds is now available to all scientists conducting quantitative analysis. Each site can read and write to the database, and common best practices are used for ensuring data quality within the database (common test compound, calibration testing and mass accuracy acceptance criteria for the test compound). The software allows rules to be set for the number of transitions collected per compound, the minimum fragment loss and the acceptance thresholds. The settings for each of these parameters are as follows: three fragments, 20 amu minimum loss, precursor minimum of 5.0E4 cps, fragment minimum of 1.0E4 cps and saturation threshold of 1.0E7 cps.
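For illustration, these selection rules are simple enough to capture in a few lines of code. The following is a minimal sketch in Python, assuming a hypothetical candidate-transition record for the infusion results; it is not the DiscoveryQuant implementation, only an illustration of the rules listed above.

```python
from dataclasses import dataclass

# Thresholds as listed above (intensities in cps).
PRECURSOR_MIN = 5.0e4
FRAGMENT_MIN = 1.0e4
SATURATION = 1.0e7
MIN_LOSS_AMU = 20.0
MAX_TRANSITIONS = 3

@dataclass
class Candidate:
    """Hypothetical record for one precursor/fragment pair from an infusion run."""
    precursor_mz: float
    fragment_mz: float
    precursor_cps: float
    fragment_cps: float

def select_transitions(candidates: list[Candidate]) -> list[Candidate]:
    """Apply the rules: minimum 20 amu loss, intensity windows, keep the top three."""
    passing = [
        c for c in candidates
        if c.precursor_mz - c.fragment_mz >= MIN_LOSS_AMU
        and c.precursor_cps >= PRECURSOR_MIN
        and FRAGMENT_MIN <= c.fragment_cps < SATURATION
    ]
    # Rank by fragment intensity and keep the three strongest transitions.
    passing.sort(key=lambda c: c.fragment_cps, reverse=True)
    return passing[:MAX_TRANSITIONS]
```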

LC method development

For >80% of compounds, samples are successfully analyzed using a common set of conditions. Typically the assay range is 1–5000 ng/ml, although a tenfold lower range is not uncommon. Ballistic gradients tend to provide sharp peaks; however, perfect peak shape is not required to generate acceptable analytical results. Compounds that elute prior to labetalol are more likely to be impacted by ion suppression from the dosing vehicle when using electrospray ionization, especially with polyethylene glycol. Efforts are currently underway to develop standard approaches for the analysis of polar compounds using techniques such as HILIC.

Table 2. Internal standard compound mixture used to support in vitro and in vivo pharmacokinetics, pharmacodynamics & drug metabolism assays.

Compound ID | Formula | CAS# | Predicted LogD | Formula MW (g/mol) | Q1→Q3 | DP/CE
Labetalol† | C19H24N2O3•HCl | 32780-64-6 | 0.73 (pH 4.5), 0.83 (pH 7.4) | 364.87 | 329.2→162.1 (+); 327.2→176.1 (−) | 50/50 (+); −60/−35 (−)
Imipramine | C19H24N2•HCl | 113-52-0 | 1.31 (pH 5.5), 2.35 (pH 7.4) | 316.87 | 281.3→193.1 | 70/50
Diclofenac† | C14H10Cl2NNaO2 | 15307-79-6 | 3.6 (pH 4.5), 1.44 (pH 7.4) | 318.13 | 296.1→215.1 (+); 294.1→250.1 (−) | 40/30 (+); −30/−20 (−)

†Can be used in positive or negative ionization mode.
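To show how a shared database entry for this mixture might be represented, the Table 2 internal standards can be written as simple records carrying both polarities where applicable. The field names below are hypothetical, not the schema of the actual database; the values are taken from Table 2.

```python
# Hypothetical records for the Table 2 internal standards.
# DP = declustering potential, CE = collision energy; negative values indicate negative ion mode.
INTERNAL_STANDARDS = [
    {"name": "labetalol",  "q1": 329.2, "q3": 162.1, "dp": 50,  "ce": 50},
    {"name": "labetalol",  "q1": 327.2, "q3": 176.1, "dp": -60, "ce": -35},
    {"name": "imipramine", "q1": 281.3, "q3": 193.1, "dp": 70,  "ce": 50},
    {"name": "diclofenac", "q1": 296.1, "q3": 215.1, "dp": 40,  "ce": 30},
    {"name": "diclofenac", "q1": 294.1, "q3": 250.1, "dp": -30, "ce": -20},
]

def is_transitions(positive: bool = True) -> list[dict]:
    """Return the internal standard transitions for the requested ionization mode."""
    return [t for t in INTERNAL_STANDARDS if (t["ce"] > 0) == positive]
```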


Figure 2. Chromatogram of the internal standards (labetalol, diclofenac and imipramine; intensity in cps versus time over a 2.1 min window) using gradient one on an LX-2 system.

Analytical acceptance criteria

A key aspect of ensuring consistency in data quality across sites is a common approach to analytical acceptance criteria. Acceptance criteria, with increasing rigor and effort depending on how the assay data are to be used, were written to provide analysts with guidance on assay requirements. For routine quantitative analyses, assay criteria were fixed as minimum requirements (Box 1). Within this framework, the number of standards was defined by the required assay dynamic range. A relatively wide acceptance window of ±25% was set for standards and QCs in order to reduce the need for re-assays, since data volume is high and the data are typically used for the relative ranking of compounds. Differences in calculated parameters of greater than two- to ten-fold would be significant; therefore an analytical variability of 25% is reasonable for a discovery screening environment.

Automated sample preparation

The use of liquid handlers for sample preparation is an important part of consistent high-volume analytical workflows [14]. To this end, an automation platform based on Hamilton liquid handlers was implemented as part of the global bioanalytical process. An advantage of using a consistent hardware platform across all groups is that one site could be responsible for developing the main automation programs, with testing carried out at the other sites. Hamilton liquid handlers were chosen as the standard platform for a variety of reasons, including the ease of moving automation programs from one system to another without the need for recalibration or recoding of the program. Flexible programs were created for all aspects of sample handling, from the initial pipetting of plasma samples to standard curve preparation. Typical volumes pipetted are in the 10–50 µl range for plasma samples, but depend on the assay needs, and most assay volumes are user selectable. High-volume in vitro assays, including metabolic stability and plasma protein binding, are automated on the same platform.

LC–MS platforms

As mentioned above, a global database of SRM transitions was implemented as part of the drive towards more efficient workflows. For this to be successful, the MS platform must be standardized. The vast majority of the instruments used in PPDM for quantitative analysis were Sciex mass spectrometers. The few Waters, Thermo and Agilent systems were repurposed for other workflows, sent to other groups or decommissioned. The SRM transitions with optimized declustering potential and collision energy voltages were found to work equally well on API3000, API4000, API5000 and API5500 systems without the need for instrument-specific optimization. In this way an analyst can use any available instrument rather than waiting for a specific instrument to become free. One common platform offers practical benefits in terms of training analysts, moving work across the laboratories/sites, troubleshooting, maintenance and data reporting. The core LC–MS groups typically have 10–20% of the latest generation mass spectrometers for the most demanding assays.


Box 1. Analytical acceptance criteria for discovery studies.

Calibration standard
- Dimethylsulfoxide solution or solid material may be used for the calibration standards.
- The calibration standard curve will be prepared in the same matrix (i.e., species, tissue) as the unknown samples. Alternative or synthetic matrices are allowed if the sample types are unusual (e.g., CSF, tissues).
- The calibration standard curve will contain a minimum of 2n + 1 levels for each log unit change (n = log unit change).
- The mean accuracy should be within ±25% of the nominal value. The mean calibration curve will have a minimum of six concentrations.
- Standards outside the selected calibration range are not included in the statistical calculations.
- Linear or quadratic fits are acceptable, with weighting of 1 (unweighted), 1/x or 1/x².
- Zero calibration standards will not be used in the regression.
- The LLOQ is defined as the lowest calibration standard that is within ±25% of the nominal concentration and within fivefold of the lowest passing QC concentration. The LLOQ must be greater than three times the mean of the back-calculated value of the zero standards.
- Values calculated to be below the lower limit of quantitation will be reported as 'BLOQ'.
- The ULOQ is defined as the highest standard that meets the ±25% accuracy criteria.
- Unknown sample concentrations may be calculated by extrapolating to a value 25% above the ULOQ (i.e., ULOQ × 1.25). No extrapolation is permitted if a quadratic regression fit is used.
- Extrapolation below the LLOQ is not allowed.

Quality control
- QCs will be prepared in the same matrix as the calibration standards.
- QCs can be prepared using the same stock solution as used for the calibration standards, but a separate serial dilution must be used to prepare the QCs.
- At least two of three QCs must be within ±25% of the nominal concentration in the calibration range. QCs outside the calibration range are not included in the statistical calculations.
- Precision (%CV) should be within 25% for all replicates at each QC concentration. For this calculation a minimum of three QCs is needed, and QCs cannot be rejected unless a clear analytical cause can be assigned.
- QCs should be prepared at a minimum of two concentration levels, and the minimum number of replicates for each QC set is n = 3.
- One of the accepted QC concentrations must be between one- and five-times the nominal value of the LLOQ standard. One of the QC concentrations should be prepared at between 75 and 100% of the ULOQ.
- QCs do not need to be frozen; they may be prepared on the day of analysis.
- Several QC levels should be prepared at or near the expected LLOQ and ULOQ in order to enable standard curve truncation and avoid re-assays.
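As a rough illustration of how machine-checkable these criteria are, the sketch below encodes two of the Box 1 rules in Python. The data structures are invented for the example; in practice this logic lives in the rule scripting described under Data processing.

```python
TOLERANCE = 0.25  # the +/-25% acceptance window from Box 1

def check_standards(standards: dict[float, float]) -> tuple[float, float]:
    """Check back-calculated standards (nominal -> calculated concentration).

    Returns the implied (LLOQ, ULOQ) from the passing standards.
    """
    passing = {
        nominal: calc
        for nominal, calc in standards.items()
        if abs(calc - nominal) / nominal <= TOLERANCE
    }
    if len(passing) < 6:  # the curve must retain at least six concentrations
        raise ValueError("curve fails: fewer than six passing standards")
    return min(passing), max(passing)

def check_qcs(qcs: dict[float, list[float]]) -> bool:
    """At least two of three QC replicates within tolerance at each level."""
    for nominal, replicates in qcs.items():
        n_pass = sum(abs(r - nominal) / nominal <= TOLERANCE for r in replicates)
        if len(replicates) < 3 or n_pass < 2:
            return False
    return True
```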

The current discovery groups have a combined footprint of >40 mass spectrometers. Managing these assets is an important aspect of platform standardization, ensuring that new systems are brought in and older systems are transitioned out; an approximate lifetime of 7–10 years across >40 systems requires that four to six new instruments are purchased each year, with the same number being retired. Having a global standard platform has enabled a much more proactive approach to capital planning and asset management.

Sample introduction systems were also standardized across the laboratories. In order to maximize throughput and reduce the number of instruments required, parallel UHPLC systems were adopted for most instruments [8]. A Thermo LX-2 Transcend system with Aria software and a dual-arm CTC PAL autosampler is the common workhorse platform for quantitative analysis. The system offers great flexibility and the ability to run as two independent UHPLCs, doubling the capacity of the mass spectrometer. The system can be set up to run one assay through both channels or to run two separate assays at the same time. Carryover and needle washes can become rate-limiting on a fast gradient; however, with a dual-arm system one arm is washing while the other is loading and injecting. Additionally, while one arm is running one assay, the other can be used to test analytical conditions prior to launching a second assay.
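The throughput benefit of staggering two channels is easy to see with a toy timing model. The sketch below uses invented timings and assumes the off-line steps (loading, washing) take no longer than the MS detection window, so one channel is always ready when the other finishes.

```python
def run_times(n_samples: int, load_min: float = 1.0,
              detect_min: float = 1.5) -> tuple[float, float]:
    """Toy model: total run time for serial versus dual-channel staggered operation.

    Each injection needs `load_min` of loading/washing (MS idle) and `detect_min`
    of MS detection. With two staggered channels, loading overlaps detection on
    the other channel, so only the first load adds to the total time
    (valid when load_min <= detect_min).
    """
    serial = n_samples * (load_min + detect_min)
    staggered = load_min + n_samples * detect_min
    return serial, staggered

serial, staggered = run_times(96)
print(f"serial: {serial:.0f} min, staggered: {staggered:.0f} min")  # 240 vs 145 min
```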

The overall workflow around data collection employs acquisition to a server-based environment. Method templates for the various instruments (API4000, API5000 and API5500) are stored on the server with internal standard transitions preloaded. Data are collected to the server to enable ease of data processing and future archiving.

Analysts process the overnight run data using their office PCs, while other analysts use the instrument PC to set up the next day's experiments. When used in this manner, network acquisition eases analyst movement between LC–MS systems.

Data processing

The ability to efficiently transform raw data into concentration values and, subsequently, into information used for decision making in discovery programs is an important aspect of the overall bioanalytical workflow. Data processing from the standard MS platform is achieved using MultiQuant™ software from AB Sciex. The software enables the scripting of 'rules' for data acceptance according to Box 1. With well-defined acceptance criteria, the software can be programmed to provide end-users with automated output on analyte carryover, internal standard response and interference testing, as well as the standard accuracy and precision measurements. This approach ensures that departmental guidelines are being followed and releases the analyst from manually reviewing data when this adds no value.
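A hedged sketch of the kind of unattended checks this enables is shown below; the function name and thresholds are placeholders for illustration, not the departmental rule set.

```python
def review_flags(blank_area: float, lloq_area: float, is_areas: list[float],
                 carryover_limit: float = 0.2, is_tolerance: float = 0.5) -> list[str]:
    """Return reasons a run needs manual review; an empty list means auto-release.

    blank_area: analyte response in the blank injected after the highest standard.
    lloq_area: analyte response at the LLOQ.
    is_areas: internal standard responses across the run.
    """
    flags = []
    if blank_area > carryover_limit * lloq_area:  # carryover check
        flags.append("carryover above limit")
    mean_is = sum(is_areas) / len(is_areas)
    if any(abs(a - mean_is) / mean_is > is_tolerance for a in is_areas):
        flags.append("internal standard response drift")  # IS response check
    return flags
```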


Data reporting

Standardization of the final step in the bioanalytical process, report generation, is as necessary as standardizing the steps focused on the execution of laboratory tasks. At the time of PPDM globalization, the six PPDM sites had at least six different versions of reports for the highest volume in vivo assay, animal PK studies. Although the final derived PK parameters, or 'metadata', such as clearance, volume of distribution, and so on, were generally (but not universally) uploaded to an internal corporate database, the Excel-based reports, including individual animal concentration–time profiles in tabulated and graphical format, existed in several diverse versions. Diversity in reports not only consumes a significant amount of time to execute, but also makes moving experiments from one group/site to another difficult. A Lean Six Sigma Green Belt project was undertaken in 2009 to standardize the report format and to implement process improvements globally [15,16]. Initial data collection focused on voice-of-the-customer interviews, collating feedback into critical-to-quality parameters, and conducting a Kaizen event to reach consensus decisions with clear sponsorship across the organization. The ultimate solution included leveraging the Thermo Watson LIMS tool as well as minor modifications to custom-built in-house tools to generate Excel-based reports. A sample report is shown in Figure 3 for a single compound IV/PO PK study, and a cassette PK report is shown in Figure 4. Initially the PK reports were manually deposited to individual project teams' document sharing folders. This process has evolved considerably, with the goal of allowing report visualization directly from the corporate database rather than manual creation of a separate report file distributed to the requestor. The report generation and review process was streamlined from a task consuming 45–60 min to one of 15 min or less, and across the organization this was estimated to have liberated two to three full-time employees (FTEs)/year. The implementation of a standardized report subsequently enabled integration activities between the Merck and Schering-Plough bioanalytical groups, as well as externalization of PK experiments to multiple contract research organizations.

An additional aspect of data reporting is the management of records and study information as part of a laboratory notebook. Merck adopted an electronic lab notebook (ELN) environment in 2007, prior to globalization of the bioanalysis groups. As part of overall workflow process improvements, shared notebooks were established with templates to enable efficient entry of study information into the ELN for routine studies. Scientific data management through Agilent's OpenLAB, with direct links embedded into the ELN, ensures consistent approaches to data backup and retrieval.

Efficiency, metrics & Lean Six Sigma

Metrics are used to monitor performance and measure the effect of process changes, and are an integral part of Lean Six Sigma projects. Lean Six Sigma projects combine Lean's waste-elimination aspects with Six Sigma's critical-to-quality characteristics. In 2007, Merck sponsored Six Sigma projects to understand and optimize workflows in support of in vitro and in vivo assays. Across the various workflows supported by the discovery bioanalytics groups, the first, and one of the most effectively implemented, in vitro assays was P-glycoprotein (P-gp) uptake and permeability screening. This assay, centralized at the Rahway facility in 2008, included cell culture, automated incubations, bioanalysis, data analysis and reporting. At the time of centralization, the assay required approximately six FTEs to support ~100–120 compounds a week.


Figure 3. Standardized report for a single-compound intravenous/oral dose PK study.

Since then, workflow improvements have driven efficiency gains, and the assay now supports 150–200 compounds a week with three to four FTEs. Many of the gains were a result of the standardized platforms implemented as part of the globalization of discovery stage bioanalysis.

PK screening studies are the first in vivo experiments and are run on a daily basis. Medicinal chemists select compounds for a standard study design (e.g., number of animals, time points, dose and vehicle), and compounds may be dosed as a single compound or as part of a cassette. Fast turnaround time is the most important performance factor in PK screening, and the average turnaround time target is 30

Figure 6. Monthly metrics for Group A demonstrating a process that is (A) out of control versus (B) in control; number of studies versus turnaround time (days), with an upper specification limit of 5 business days. Legend categories: meets specification; extra large study (25%); poor analytical behavior (22%); long delivery time (9%); no analysts ready to analyze (9%); notified after study completion (7%); late sample receipt (4%); other – specified in comments (18%); lower priority for project team; analytical method not established; reasons not specified.
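The roll-up behind charts such as Figure 6 is straightforward to compute. A minimal sketch follows, assuming a plain list of per-study turnaround times in business days and the 5-day upper specification limit from the figure.

```python
from collections import Counter

USL_DAYS = 5  # upper specification limit (business days), per Figure 6

def turnaround_metrics(days: list[int]) -> tuple[Counter, float]:
    """Histogram the turnaround times and report the fraction meeting specification."""
    histogram = Counter(min(d, 30) for d in days)  # everything above 30 bins as '>30'
    in_spec = sum(1 for d in days if d <= USL_DAYS) / len(days)
    return histogram, in_spec

hist, frac = turnaround_metrics([2, 3, 4, 6, 5, 1, 12, 3])
print(f"{frac:.0%} of studies met the {USL_DAYS}-day specification")  # 75%
```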

Current & future state

As the pharmaceutical industry adapts to external pressures to change, so does the bioanalytical laboratory. Our efforts to establish standard workflows across the laboratories and measure their efficiency have had tangible benefits. With common platforms and workflows comes the ability to shift work from one internal laboratory to another. This has been especially valuable when sites cannot meet capacity requirements locally. We have been able to move work to other sites, and the internal customers are oblivious as to where the work was done, since the reported results are of the same quality and format. Outsourcing has been a major theme for all areas of drug discovery, and bioanalytical groups have been the most affected. We have been able to work with our external partners to share our standard platforms and processes to help them achieve higher efficiencies.

This in turn reduces costs for our outsourcing activities. We can demonstrate our internal costs based on real numbers collected using our standardized approaches. Our external partners have been quick to adopt our standards and provide better pricing for the studies they perform. The angst associated with outsourcing has been somewhat offset by the new challenges faced internally by the bioanalytical scientist. We will likely never see the same volume of studies conducted internally as in the past. However, new opportunities in the areas of peptide quantification, microdosing studies, biomarker measurements and other yet-to-be-discovered demands will continue to offer both opportunity and satisfaction for highly skilled bioanalytical scientists. It has not always

been easy, and progress feels painfully slow at times. However, when we look back on all the changes, clearly we are far better off and have enjoyed solving many challenging problems along the way. It will be exciting to see how far we progress in the next 5–10 years.

Conclusions

The Merck bioanalytical groups have been significantly transformed over the past 5–10 years. What is standard now was the result of novel and creative efforts by many Merck scientists. Examples of processes for compound tuning, LC method development, analytical acceptance criteria, automated sample preparation, sample analysis platforms, data processing and data reporting were demonstrated, and their impact on the overall workflow was discussed. The implementation of standard approaches and the capture of hard metrics have been valuable and necessary in order to improve the business of bioanalysis. What was once considered innovative has become routine and/or outsourced, and that transition has been difficult in some cases. For example, choosing a column, picking a gradient and tuning a mass spectrometer are no longer considered innovative activities for the vast majority of discovery stage bioanalytical work. However, with change comes opportunity, and new science, especially in bioanalysis, will be needed to support the discovery and development programs of the future.

Acknowledgements


Any attempt to name all of the individuals who contributed to the efforts described in the manuscript will undoubtedly fall short. The authors are in the lucky position to work with many talented individuals on a daily basis and we wish to thank all of the members of the bioanalytical groups at Merck in Boston, Rahway and West Point. We also thank former colleagues who are no longer at Merck but made significant contributions to this work.

Financial & competing interests disclosure

The authors have no relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript. This includes employment, consultancies, honoraria, stock ownership or options, expert testimony, grants or patents received or pending, or royalties. No writing assistance was utilized in the production of this manuscript.

Executive summary

Standardized workflows for bioanalysis
- The LC–MS experiment for small drug-like molecules is routine for >80% of compounds and, as such, should be standardized as much as possible to free resources for more challenging research.

Areas for standardization
- Compound tuning: all compounds have their precursor and product ions written to a shared database using a common set of parameters to allow consistent SRM method building.
- LC methods: generic gradients, internal standards and a common C18 column can be used for most studies.
- Platforms: common hardware platforms (liquid handlers, UHPLC and MS) are required to increase efficiency and enable movement of work across multiple sites.
- Data processing and acceptance criteria: software can automate much of the data review process, and a global set of defined acceptance criteria ensures consistent data quality across multiple sites.
- Reporting and record keeping: common report formats and electronic laboratory notebooks with templates for recording experimental details are key to efficient workflows.
- Metrics: success can only be determined by measurement against desired outcomes. Metrics enable constructive discussion around collaborators' and customers' requirements for bioanalysis.

Future perspective
- Improved efficiency will remain a requirement for the bioanalytical laboratories within the pharmaceutical industry and at their outsourcing partners. Bioanalytical scientists will once again need to show the value of core bioanalytical capabilities by delivering in challenging new areas such as biologics, microdosing and microsampling, biomarkers and other non-traditional areas.


References

1. Janiszewski J, Schneider R, Kapinos B et al. Development of a high-speed, multiplexed sample-delivery instrument for LC–MS/MS bioanalysis. Bioanalysis 4(9), 1039–1056 (2012).
2. Murphy K, Bennett PK, Duczak N Jr. High-throughput quantitation of large molecules using multiplexed chromatography and high-resolution/accurate mass LC–MS. Bioanalysis 4(9), 1013–1024 (2012).
3. Larson B, Banks P, Shultz S, Sobol M, Cali JJ. The utility of semi-automating multiplexed assays for ADME/Tox applications. Comb. Chem. High Throughput Screen. 14(8), 658–668 (2011).
4. Luippold AH, Arnhold T, Jorg W, Kruger B, Sussmuth RD. Application of a rapid and integrated analysis system (RIAS) as a high-throughput processing tool for in vitro ADME samples by liquid chromatography/tandem mass spectrometry. J. Biomol. Screen. 16(3), 370–377 (2011).
5. Ekins S, Williams AJ. Precompetitive preclinical ADME/Tox data: set it free on the web to facilitate computational model building and assist drug development. Lab Chip 10(1), 13–22 (2010).
6. Hop CE, Cole MJ, Davidson RE et al. High throughput ADME screening: practical considerations, impact on the portfolio and enabler of in silico ADME models. Curr. Drug Metab. 9(9), 847–853 (2008).
7. Cohen LH. Surrendering to the robot army: why we resist automation in drug discovery and development. Bioanalysis 4(9), 985–987 (2012).
8. King RC, Miller-Stein C, Magiera DJ, Brann J. Description and validation of a staggered parallel high performance liquid chromatography system for good laboratory practice level quantitative analysis by liquid chromatography/tandem mass spectrometry. Rapid Commun. Mass Spectrom. 16(1), 43–52 (2002).
9. Bruins A, Covey T, Henion J. Ion spray interface for combined liquid chromatography/atmospheric pressure ionization mass spectrometry. Anal. Chem. 59(22), 2642–2646 (1987).
10. Shou WZ, Zhang J. Recent development in software and automation tools for high-throughput discovery bioanalysis. Bioanalysis 4(9), 1097–1109 (2012).
11. Janiszewski JS, Liston TE, Cole MJ. Perspectives on bioanalytical mass spectrometry and automation in drug discovery. Curr. Drug Metab. 9(9), 986–994 (2008).
12. Whalen K, Gobey J, Janiszewski J. A centralized approach to tandem mass spectrometry method development for high-throughput ADME screening. Rapid Commun. Mass Spectrom. 20(10), 1497–1503 (2006).
13. Geddes K, Adamson G, Dube N, Crathern S, King RC. Semi-automated tandem mass spectrometric (MS/MS) triple quadrupole operating parameter optimization for high-throughput MS/MS detection workflows. Rapid Commun. Mass Spectrom. 23(9), 1303–1312 (2009).
14. Tweed JA, Gu Z, Xu H et al. Automated sample preparation for regulated bioanalysis: an integrated multiple assay extraction platform using robotic liquid handling. Bioanalysis 2(6), 1023–1040 (2010).
15. Gooding G. Implementing continuous improvement. Med. Device Technol. 17(2), 50–51 (2006).
16. Johnson CW, Chatterjee M, Kubala S, Helm D, Houston J, Banks M. Efficient and effective compound management to support lead optimization. J. Biomol. Screen. 14(5), 523–530 (2009).