Article pubs.acs.org/ac
Two-Site Evaluation of the Repeatability and Precision of an Automated Dual-Column Hydrogen/Deuterium Exchange Mass Spectrometry Platform

David J. Cummins,† Alfonso Espada,‡ Scott J. Novick,§ Manuel Molina-Martin,‡ Ryan E. Stites,† Juan Felix Espinosa,‡ Howard Broughton,‡ Devrishi Goswami,§ Bruce D. Pascal,⊥ Jeffrey A. Dodge,† Michael J. Chalmers,*,† and Patrick R. Griffin*,§

†Lilly Research Laboratories, Eli Lilly and Company, Lilly Corporate Center, Indianapolis, Indiana 46285, United States
‡Analytical Technologies Department, Centro de Investigación Lilly, S.A., Avenida de la Industria 30, 28108 Alcobendas, Spain
§The Scripps Research Institute, Department of Molecular Therapeutics, 130 Scripps Way, Jupiter, Florida 33458, United States
⊥The Scripps Research Institute, Informatics Core, 130 Scripps Way, Jupiter, Florida 33458, United States
ABSTRACT: Hydrogen/deuterium exchange coupled with mass spectrometry (HDX-MS) is an information-rich biophysical method for the characterization of protein dynamics. Successful applications of differential HDX-MS include the characterization of protein−ligand binding. A single differential HDX-MS data set (protein ± ligand) often comprises more than 40 individual HDX-MS experiments. To eliminate laborious manual processing of samples, and to minimize random and gross errors, automated systems for HDX-MS analysis have become routine in many laboratories. However, an automated system, while less prone to random errors introduced by human operators, may have systematic errors that go unnoticed without proper detection. Although the application of automated (and manual) HDX-MS has become common, there are only a handful of studies reporting the systematic evaluation of the performance of HDX-MS experiments, and no reports have been published describing a cross-site comparison of HDX-MS experiments. Here, we describe an automated HDX-MS platform that operates with a parallel, two-trap, two-column configuration that has been installed in two remote laboratories. To understand the performance of the system both within and between laboratories, we designed and completed a test−retest repeatability study for differential HDX-MS experiments implemented at each of two laboratories, one in Florida and the other in Spain. This study provided sufficient data to assess variability both within and between laboratories. Initial results revealed a systematic run-order effect within one of the two systems. Therefore, the study was repeated, and this time the conclusion was that the experimental conditions were successfully replicated with minimal systematic error.
© 2016 American Chemical Society. Received: April 26, 2016. Accepted: May 25, 2016. Published: May 25, 2016. DOI: 10.1021/acs.analchem.6b01650. Anal. Chem. 2016, 88, 6607−6614.

Hydrogen/deuterium exchange coupled with mass spectrometry (HDX-MS) has become a widely applied biophysical method for the characterization of protein conformational dynamics.1−5 When configured to acquire data as a pairwise differential experiment, HDX-MS has been used to profile a range of protein−ligand interactions6−8 including peptides, proteins, antigens, DNA, RNA, and small molecules.8−25 In recent years, HDX has been adopted by an increasing number of pharmaceutical companies for characterization of therapeutic proteins, antibody−epitope mapping, and as a biophysical technique in support of small molecule drug discovery. Given the emerging importance of the HDX experiment to industrial research, there is renewed interest in the characterization of the basic analytical performance of the method, including key parameters such as precision and repeatability within and across laboratories. As discussed in a recent review, the uptake of HDX in an industrial setting relies heavily on efficient software for data analysis, the development of robust automation, and a careful examination of the performance of the experiment over time.26 Herein, we address the latter two issues.

A single differential HDX data set contains a large number of individual HDX-MS experiments for both the ligand-free and ligand-bound protein samples. For example, our typical differential experiment requires six time periods of exchange for each sample, each of which is acquired with three independent replicates. In addition, we acquire Dmin and Dmax time points (n = 2) and several blank injections to maintain system performance. In total, 40 HDX-MS experiments are combined into a single differential HDX data set (excluding blank injections). To minimize gross and random errors associated with laborious manual sample preparation, we employ an automated HDX-MS system based around a LEAP Technologies CTC HTS PAL autosampler.27−29 This system has proven exceptionally robust in our laboratory, and we recently described a retrospective analysis of two large differential data sets, including a 40-experiment study acquired over an eight-month period.30 The data set contained differential analyses for 37 peptides, each of which included 127 replicate measurements of the dimethyl sulfoxide (DMSO) control sample. For one peptide of interest (VDR LBD 134−150), the mean and standard deviation of the 127 measurements were calculated to be 51% and 3.7%, respectively. The data from this retrospective analysis indicated that our system was capable of generating reproducible HDX data over the time period associated with large protein−ligand interaction studies; however, several limitations of the automation platform became evident. The time required to complete the experiment is approximately 12 h, which precludes the analysis of two samples per 24 h period. This can lead to problems when studying proteins that are not stable over extended time intervals. Also, the platform as configured did not allow for temperature control of the syringes or the high-performance liquid chromatography (HPLC) mobile phases.

Burkitt and O'Connor described a prospective interrogation of the performance of a dual-column automated system over a period of two months, within which they monitored key experimental parameters, such as temperature and pH.31 During this study, the repeatability and reproducibility of a 30 s on-exchange experiment were assessed for seven peptides of interest, although no temporal HDX data or differential HDX data were provided. In spite of this, we adopted a similar approach to the O'Connor system31 with the use of dual-column parallel chromatography; however, the system described here employs a different digestion strategy and a unique six-valve flow path employing three HPLC pumps (load, condition, gradient). When compared to our earlier configuration, the parallel HPLC platform eliminates time associated with column conditioning, sample digestion, and desalting and provides a significant reduction in total experiment duration. The optimized system maintains the use of a TWIN PAL liquid handler to allow for separate syringes for protein aspiration and the dilution/quench events. Temperature control is obtained by housing the entire system (including liquid handler, HPLC pumps, and solvents) within a refrigerated chromatography cabinet.

The Burkitt and O'Connor report remains the only published description of a prospective study designed to interrogate the performance of the HDX-MS experiment (manual or automated) over a period of time. As mentioned within their manuscript, the concept of applying HDX to assess batch-to-batch comparability of recombinant proteins requires a close examination of the analytical performance of these systems. The application of HDX-MS as a biophysical method for the comparison of biopharmaceutical protein samples was the topic of a subsequent study by Houde et al.32 In this work, the authors describe an increased variability between data acquired on different days and highlight the possibility of day-to-day experimental errors within interday HDX-MS experiments. Further commentary from Houde et al. has highlighted the need for increased replication within HDX experimental design.33 Given the complexity of the experiment and differences between automated (and manual) protocols, it was projected that individual systems/operators would benefit from prospective evaluation as to the consistency of the data generated, if multiday/multiligand studies are to be conducted. A detailed comparison of HDX data acquired with different mass analyzers has been published by Burns et al.34 Significant differences were identified between data acquired with Fourier transform (FT) MS instruments and data acquired with time-of-flight (ToF) MS systems, again highlighting the need for careful design and control of the HDX-MS experiment. Taken together, the studies described above highlight the need for a thorough evaluation of the analytical performance of the HDX-MS experiment, which we address in part with the results presented here from a two-site comparison of the HDX-MS method. Our examination of the analytical stability of the HDX-MS method was made possible by completing a two-site test−retest protocol designed to evaluate both interlaboratory and intralaboratory performance.

In this study, experimental conditions from a system in Jupiter, Florida, were treated as the "gold standard," since that system was assembled first and had been operating for over a year (herein referred to as the Scripps HDX system). Following completion of our test−retest study, we performed a detailed statistical analysis to assess whether the data obtained within the Analytical Technologies Department at Lilly S.A., Alcobendas, Spain (herein referred to as the Lilly HDX system), were an acceptable replication of data obtained with the Scripps HDX system.
■ EXPERIMENTAL SECTION
HDX Automation. The Scripps HDX system is comprised of a LEAP Technologies CTC HTS-XT-PAL liquid handler (LEAP Technologies, Carrboro, NC) coupled with two gradient HPLC pump modules (Agilent Technologies, San Jose, CA), an isocratic HPLC pump (Knauer Wellchrom K1001, LEAP Technologies, Carrboro, NC), and an Orbitrap mass spectrometer (Exactive, Thermo Scientific, San Jose, CA). The two-rail LEAP system is configured with a four-plate tray holder, six HPLC valves, four wash stations, and five reagent vials. Both the liquid handler and all three HPLC pumps are located within a refrigerated chromatography cabinet held at 3.5−4.0 °C (Thermo REC500, Thermo Fisher Scientific, Waltham, MA). Communication between CTC PAL arms is accomplished via TTL commands embedded within the control macros. Connectivity between the LEAP and the MS system was accomplished via a contact closure signal from the gradient HPLC pump to the MS input (Figure S1). The Lilly HDX system is essentially identical to the Scripps HDX system; however, the mass spectrometer is a Q-Exactive (Thermo
Figure 1. Run-order effect. Average % deuterium values for representative peptides across six time points (10, 30, 60, 300, 900, 3600 s). Data are from the "apo" sample and are therefore the same in all 10 days of the study. (A−C) Data from the first study performed with the Lilly system showed evidence of a run-day effect (p < 0.0001). The study was therefore repeated. (D−F) No run-day effect was observed for the second Lilly study or for the Scripps study (p = 0.215 overall), and the data are essentially identical. Therefore, the study was declared a success.

Scientific, San Jose, CA). All MS data were processed with HDX Workbench software.35 Both HDX systems are controlled with two instances of CTC Cycle Composer software (Version 1.6.0, CTC Analytics AG, Switzerland), each running (essentially) identical in-house instrument control macros (one macro for each PAL "head"). A complete list of parts and consumables is provided in Table S1. A picture of the Lilly HDX system is shown in Figure S2A. The HPLC flow paths and connectivity between pumps and valves are shown in Figure S2B and are identical in both systems. The automated HDX experiment can be separated into two distinct steps: (step 1) incubation with D2O followed by sample quenching and (step 2) digestion and analysis with LC-MS. Detailed descriptions of each of these two steps are provided in the Supporting Information.

HDX Sample Preparation. For the on-exchange experiments, the vitamin D receptor ligand binding domain (VDR-LBD) protein (130 μL, 10 μM in 20 mM Hepes, 150 mM NaCl, 10 mM DL-methionine, and 5 mM dithiothreitol (DTT) prepared in H2O and adjusted to pH = 7.5) was equilibrated for 1 h in the presence of 1.3 μL of DMSO solution ± ligand at 3.5 °C. The protein/ligand molar ratio was 1:10. The deuterium buffer for protein labeling was 20 mM Hepes, 150 mM NaCl, 10 mM DL-methionine, and 5 mM DTT prepared in D2O and adjusted to pH = 7.5. The quench solution was 3 M urea and 1% trifluoroacetic acid (TFA) prepared in H2O adjusted to pH = 2.5. For the Dmax samples, the VDR-LBD protein (20 μL, 5 μM in 20 mM Hepes, 150 mM NaCl, 10 mM DL-methionine, and 5 mM DTT prepared in H2O and adjusted to pH = 7.5) was mixed with 4 volumes of D2O containing 0.1 M phosphate buffer and 50 mM TCEP adjusted to pH = 4.5, resulting in a protein concentration of 1 μM.

Study Design and Statistical Tests. Statistical analysis was performed with JMP 9.0.0 (SAS Institute, Cary, NC) and was applied to study modulators of the vitamin D receptor (so-called VDRMs).9,36 The experiment was designed to alternate between testing a ligand (A) and the reference compound, 25-OH vitamin D3 (VD3), over a two-week period. Each ligand was run a total of five times. The design, shown in Table S2, was used at both sites. The design allows testing for a "run-day" effect as each ligand is run at the beginning of a week, and, by interleaving the ligands every other day, we can also test for any effect of duration since the start of the experiment. For example, on Wednesday of week two, ligand A was run. This is the third day of the week, but it is also the eighth day of the study. Although it is true that only one ligand was tested on day 8, the other ligand was tested on days 7 and 9. One can therefore estimate any effect of duration from this study while avoiding a doubling of the experimental units needed to test each ligand on each of the 10 days of the study.
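As an illustration, the interleaved schedule just described can be reconstructed in a few lines of code. The actual assignments are those of Table S2; the `build_design` helper and the strict every-other-day ordering below are our assumptions, chosen to be consistent with the worked example in the text (ligand A on study day 8, the reference on days 7 and 9, five runs per ligand).

```python
# Illustrative reconstruction of the interleaved two-week design described in
# the text (the actual design is given in Table S2). Ligands strictly
# alternate by day, so each ligand opens one of the two five-day weeks.
LIGANDS = ("VD3", "A")  # reference compound and test ligand

def build_design(n_days=10):
    """Return a list of (study_day, week, day_of_week, ligand) tuples."""
    design = []
    for day in range(1, n_days + 1):
        week = (day - 1) // 5 + 1
        day_of_week = (day - 1) % 5 + 1
        ligand = LIGANDS[(day - 1) % 2]  # strict every-other-day interleave
        design.append((day, week, day_of_week, ligand))
    return design

design = build_design()
# Sanity checks matching the worked example in the text: ligand A falls on
# study day 8 (the third day of week two), flanked by VD3 on days 7 and 9.
assert design[7][3] == "A" and design[6][3] == "VD3" and design[8][3] == "VD3"
# Each ligand is run exactly five times over the 10-day study.
assert sum(1 for d in design if d[3] == "A") == 5
```

Because each day carries one ligand, the design halves the number of experimental units relative to running both ligands on all 10 days, while still letting day-of-week and duration effects be estimated.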
■ RESULTS AND DISCUSSION
Automated Dual Column HDX System. A photograph of one of the two systems used is shown in Figure S2A. The HPLC flow paths and valve configuration are shown in Figure S2B. The combined HDX system operates under the control of a pair of CTC PAL instrument "Macros" and "Methods" (one for each PAL arm) assembled within the Cycle Composer software (CTC Analytics; the connectivity between components is detailed in Figure S1). Operating together, the Cycle Composer methods allow for the parallel processing of the Dmin, 10 s, 30 s, 60 s, 300 s, 900 s, 3600 s, and Dmax samples. In total, the entire differential HDX experiment, comprising 44 individual HDX-MS samples, can be completed within 7 h (a reduction of 5 h when compared to our previous system). After the initial cooling to 3.5 °C, each LEAP system requires minor adjustments to the positional (X, Y, Z) mapping of target objects, along with a reduction in the maximum acceleration of the PAL head units; however, no deleterious effects have been observed related to long-term operation of the LEAP and HPLC components within the chromatography cabinet. To date, the systems have proven robust, and we estimate that we have acquired a combined total in excess of 600 differential HDX experiments across both laboratories. To assess the data obtained with the new system, we performed a differential HDX experiment for VDR LBD ± 25-OH VD3. These data were then compared with data from
Figure 2. Distribution of (A) %D ligand, (B) %D apo, and the difference (C) delta values. These are measured across all peptide fragments and over both laboratories. For ligand, values range from 4 to 112. For apo, values range from 4 to 133. For delta = ligand − apo, values range from −87 to +15.
an identical experiment performed with an earlier version of our automated system.27 Deuterium exchange (%D) versus time plots for three regions of VDR LBD are shown in Figure S3. The upper plots (A, B, C) represent data from the earlier single column system, and the lower plots (D, E, F) represent data from the new dual column system. Although some minor differences are observed between experiments, it is clear that the trends for each peptide are similar and that the variances within each system appear comparable. The earlier single column HDX system relied on sample cooling with thermal sample trays (set to 4 °C), and no control of syringe temperature was possible. For the new system, all experimental steps are carried out at 3.5 °C. Therefore, an identical match between data obtained from each of these two different instrument configurations is not possible. This new platform configuration offers several practical improvements over the earlier system, including reduced cost and increased access to HPLC valves and columns. Experimental improvements include a 5 h reduction in experiment time and reduced back exchange because of improved cooling. The reduction in back exchange is reflected in the higher average %D values for the peptides shown in Figure S3 (A vs D, B vs E, C vs F). Reduced back exchange will provide improved differential HDX data by expanding the signal window between the protected and nonprotected states of each peptide, which improves the signal-to-noise ratio (S/N). One limitation of performing the experiment within a chromatography cabinet is a lack of flexibility in the control of the exchange temperature.

Two-Site Test−Retest Study. After installation and testing of both systems, we proceeded to undertake and complete the two-week test−retest study. Upon close inspection of the data, we identified a potential run-day trend in the data from the Lilly HDX system, as illustrated in Figure 1A−C.
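A run-day trend of the kind seen in Figure 1A−C can be screened for with a one-way ANOVA of replicate %D values grouped by run day. The study itself used JMP for all statistics; the stdlib-only sketch below, with hypothetical replicate values, only illustrates the form of such a test (it computes the F statistic; the p value would come from the F distribution with the corresponding degrees of freedom).

```python
from statistics import mean

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA: between-day mean square over
    within-day mean square. 'groups' is a list of replicate lists, one per
    run day. (Illustrative only; the study's tests were run in JMP.)"""
    k = len(groups)                               # number of run days
    n = sum(len(g) for g in groups)               # total observations
    grand = mean(x for g in groups for x in g)    # grand mean
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical %D replicates for one peptide over three run days: a steady
# downward drift across days yields a large F (evidence of a run-day
# effect), whereas stable days would give F near 1.
drifting = [[51.2, 50.8, 51.5], [48.9, 49.3, 48.6], [46.1, 46.7, 45.9]]
assert one_way_anova_F(drifting) > 10.0
```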
We therefore proceeded with the statistical comparison of the two data sets as described later in the Test−Retest Analysis and Test−Retest Discussion sections. The outcome of the statistical analysis, covered in detail later, was a conclusion that the two experiments were not equivalent. There was a statistically significant run-day effect in the Lilly data, but not in the Scripps data. We therefore decided to repeat the experiment at the Lilly
site. Following a careful investigation of all experimental conditions and remaking of all samples, solutions, buffers, and mobile phases, the 10-day study was repeated (Figure 1D−F). The repeat results with the Lilly HDX system showed no statistically significant run-day effect. The data aligned with those from Scripps and were therefore deemed a success, as defined later. The final data set generated from both the Lilly HDX system and the Scripps HDX system is provided in Table S3. A recent editorial from Moroco and Engen highlighted the importance of running replicates within the HDX experiment workflow.33 Five areas of replication were cited: (1) biological, (2) manipulation, (3) labeling, (4) analysis, and (5) processing. The study defined here represents the first systematic evaluation of the variability of the HDX experiment across four of the five areas cited (2−5). Furthermore, Moroco and Engen highlighted the importance of being able to understand the performance of the HDX experiment between different laboratories.

Test−Retest Data Analysis. Figure 2 shows the distribution of %D across all 35 peptide fragments for the "ligand" data (A), the "apo" data (B), and the difference between the two, which we define as "delta" (C). In this study, we focus on the value of delta, which ranges from −87 to +15 (Figure 2C). A large negative delta indicates strong protection from solvent exchange for the respective polypeptide segment. A positive delta indicates reduced protection from solvent exchange. Delta values near zero indicate no difference between apo and ligand. A transformation of the delta value was then performed to define a new response, "signal". The signal is defined as 5 minus delta, truncated at 1. For example, if the delta value is −20, the signal is computed as +25, and any delta value greater than 4 will therefore have a signal of 1 (e.g., a delta value of 9 is computed as maximum(5 − 9, 1) = maximum(−4, 1) = 1).
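The truncation and log transform just described reduce to two one-liners; a minimal sketch follows (the function names are ours, not the authors'):

```python
import math

def signal(delta):
    """Map delta = (%D ligand − %D apo) onto the truncated 'signal' scale:
    signal = 5 − delta, floored at 1, so strong protection (a large negative
    delta) gives a large signal and unprotected peptides collapse to 1."""
    return max(5.0 - delta, 1.0)

def log_signal(delta):
    """log10 of signal: the variance-stabilizing response used in the ANOVA."""
    return math.log10(signal(delta))

# Worked examples from the text:
assert signal(-20) == 25.0                     # delta −20 → signal +25
assert signal(9) == 1.0                        # any delta > 4 truncates to 1
assert abs(log_signal(-87) - 1.964) < 1e-3     # most protected delta observed
```

Note that the most protective delta observed (−87) maps to log signal ≈ 1.96, matching the 0 to 1.97 range of log signal quoted in the Figure 3 caption.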
Finally, in our analyses, the log (base 10) of the signal is the response of interest. The log transformation has a stabilizing effect on the variance of the response. This is a common transformation for analyzing data in which the variation is related to the magnitude of the values (standard deviation versus mean). For these data, the standard deviation
Figure 3. (A) Distribution of the response delta, followed by its truncation and transformation to compute log signal. Delta (ligand − apo) is truncated as follows: let W = 5 − delta, and if W < 1, then let W = 1. This is the Y-axis of the upper right plot (B), called "signal". (C) The middle right-hand plot takes the log base 10 transform of the Y-axis from (B). (D) The middle left plot shows the distribution of log signal. (E) Run-order effects for the first run of the experiment: least-squares means for data from Lilly in red and for Scripps in blue. The decreasing red line shows a strong run-order effect for the Lilly site (P < 0.0001); however, there is no run-order effect for the Scripps data (P = 0.51). (F) Run-order effects for the rerun experiment. Variability in the repeat Lilly data (red line) is still higher than in the Scripps data; however, the strong run-day trend is no longer present (P = 0.29). In addition, the entire range of the Y-axis is 0.23, which is relatively small considering that log signal ranges from 0 to 1.97. The Y-axis is forced to the same limits as for (E).
of signal has a strong positive relationship with the mean of signal (R2 = 0.9). Statistical tests are more valid if this trend is removed. The standard deviation of the log of signal has a much weaker relationship with its mean (R2 = 0.3). Figure 3A−D shows the transformation and the distribution of the log signal for data in this study.

Test−Retest Discussion. The test−retest study focused on comparison of the HDX signal for each peptide fragment derived from on-exchange of VDR in complex with the same ligand obtained at two different laboratories. Observing similar results was predetermined as the mark of success. While there are numerous criteria that could be used to declare successful replication between laboratories, we performed an analysis of variance (ANOVA) with log signal as the response. An ANOVA tests the hypothesis that the means of two or more groups are equal and also assesses the importance of one or more factors by comparing the response variable means at the different effect levels. The key effects captured were (1) laboratory, (2) ligand, (3) run order, (4) the interaction between laboratory and run order, (5) peptide fragment, and (6) all interactions between laboratory, ligand, and peptide fragment. We loosely define "interaction" as an unexpected effect observed when two parameters are changed at the same time, relative to changing only one at a time. For example, if there is an interaction between laboratory and run order (effect 4 in the list above), this means that there is a run-order effect, but it is not the same at both laboratories. This is in fact what we saw
anecdotally, confirmed statistically. The primary interest was in the lab-to-lab comparison (the first effect listed), where the goal was to have a difference of less than 5% in the least-squares means (LS means) from the ANOVA. If we take the estimated mean log signal for one laboratory (accounting for peptide and other effects) and divide it by the mean log signal for the other laboratory, this ratio should be within 5%. For the first experiment, the ratio of LS means for the lab-to-lab comparison was above the desired 5% threshold, with a calculated value of 5.4%. In the variance components model, we would like lab-to-lab differences to be less than 1% of the total variation in the response. For the first experiment, this was calculated to be 0.88%, which is acceptable. In unraveling the significant lab by run-order interaction (p < 0.0001), we saw that there was a significant run-order effect in the Lilly data (p < 0.0001) that did not exist in the Scripps data (p = 0.51). On the basis of the lab-to-lab comparison (5.4%) and the run-day effect seen in the Lilly data, the entire 10-day test−retest study was rerun at the Lilly site. Table 1 shows results for both the original experiment and the rerun, highlighting the laboratory-to-laboratory differences.

Table 1. Results of the ANOVA Showing Laboratory Effect(a)

measure                                              first experiment   rerun
percent difference in laboratory effect                    5.4           2.5
percent total variation, laboratory                        0.88          0.12
percent total variation, laboratory by peptide             4.7           0.54
percent total variation, laboratory by run order           0.16          0.00

(a) Row 1 comes from a ratio of LS means for the laboratory effect, Scripps divided by Alcobendas, and reports the percentage difference in these two effects. The remaining rows give the percentage of total variation from a variance components analysis. For example, in the first experiment, 5.4% of the total variability in log signal can be explained by laboratory differences.

The magnitude of the laboratory differences was small enough in the second experiment as to not be meaningful. This is seen in the lower difference in laboratory effect LS means, reduced from 5.4% to 2.5%, and more dramatically in the lowering of the percent total variation in the variance components. In the first experiment, there were a few peptide fragments for which the differences between laboratories were very large; however, for other peptide fragments, the two laboratories were more aligned, leading to a value of 4.7% (Table 1). For the second experiment, all the peptides were aligned, leading to a value of 0.54%. The reduction from 4.7% in the first experiment to 0.54% in the rerun experiment was notable. The last column of Table 1 was, in part, the basis for declaring that the rerun experiment was a success. The magnitude of laboratory-to-laboratory differences reported in the ANOVA and the percent total variation from the variance components analysis were both low enough that any qualitative conclusions in a project work context are highly likely to be the same, regardless of which lab generated the data. This is an important result. Figure 3E shows the run-order effect detected in the first study. The run-order effect was only significant for the Lilly HDX system (p < 0.0001 for Lilly versus p = 0.51 for Scripps, or p = 0.0125 overall), and the effect seemed to exist for all peptide fragments (run order by peptide interaction p = 0.82). Figure 3F shows that the run-order effect was no longer significant in the rerun experiment (p = 0.215 overall, and the interaction, lab by run order, had p = 0.25). There was actually a reversal of trend: in the original experiment, the Scripps data had a signal that was 0.06 log units higher than for Lilly overall. In the rerun experiment, the Lilly results had a higher signal overall, but in this case the difference was only 0.025 log units. Comparative examination of Figure 4A and B gives a graphical illustration of the improvement achieved in aligning the experimental results from the two laboratories. This is taken as additional evidence of successful replication of one laboratory's experimental conditions by another, in lieu of a more thorough component-by-component comparison. Although the source of the systematic error detected in the first data set from the Lilly HDX system could not be unambiguously defined, hypotheses included changes in ambient temperature within the laboratory and a change in pH of reagents over time. This reduction in deuterium incorporation was not mitigated with the use of our Dmax control. Regardless of the source of the error, the results highlight the potential for systematic errors within HDX-MS data sets and the importance of data interrogation to monitor for such errors over time.

Several observations led to the need for the experiments described here and also influenced the design of the study. Variability of the response (whether it be log signal or raw percent protection) seems to differ between peptides. There are numerous sources of variation in the data generation process, one of which is laboratory-to-laboratory variability. These sources of variation should be examined to determine whether there is a systematic bias introduced at one step and whether the variability in the response can be reduced by optimizing the experimental conditions. After identifying the largest sources of variation, we can address optimization of the data generation process, and finally, we may develop a quality control process that allows us to track the conditions that have the largest impact on variability. Numerous opportunities naturally result from a deeper understanding of all the steps of the data-generating process and each step's contribution to total variability. One of these is a normalization procedure that can adjust for any (unavoidable) systematic bias in the data, once it is known. The steps outlined here take months to years to accomplish; therefore, the scope of this work was to examine the very first step, which is to take an established system for HDX-MS and attempt to replicate, at a new site, the experimental conditions of that laboratory. In addition to setting up the same conditions as closely as possible, we compared the response vector between runs within the same laboratory, and between laboratories, for each peptide fragment. This study leaves some questions unanswered, and follow-up studies are planned to address these; however, the simplicity of this study allowed an exploration of variability within and between laboratories and identification of any run-day or run-order effect.
■ CONCLUSIONS
We have designed and constructed a pair of automated systems for HDX-MS experiments that are configured to operate with a dual column, parallel HPLC configuration. The introduction of the parallel column configuration allowed us to reduce our differential HDX experiment from over 12 h to around 7 h (44 samples). We then performed a prospective evaluation of the cross-site performance of the HDX-MS experiment. In general, both systems provided equivalent data; however, careful control of experimental conditions was required to eliminate a slight systematic drift in one of the two systems.
Figure 4. (A) Log signal by peptide, initial experiment: LS means for log signal from a full ANOVA model, showing the effect of peptide for each laboratory. The red line is almost always lower than the blue line, showing a systematic shift between laboratories. (B) Log signal by peptide, rerun experiment: here, the alignment between laboratories is better, with lines closer together and crossing so that the red line is sometimes above, sometimes below, the blue line. There is no apparent (or statistical) shift between laboratories.
These observations highlight the need to carefully evaluate HDX data over the time scale required to complete the study. The results described here provide the first systematic evaluation of the stability of the HDX-MS experiment. Further and more detailed studies are planned, including a full assessment of all key steps of the data generation process, to be used as input to a more thorough variance components analysis. After identifying the greatest sources of variation, efforts will be made to optimize the experimental conditions to reduce the total variation. Finally, a quality control process will be set up to monitor these key components as a standard procedure; thus, we will establish early detection of problems. With this in place, we can detect and prevent problems rather than hope to detect them and correct them after they have occurred.
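The quality control process envisioned above can be as simple as charting a reference peptide's %D against historical control limits. The sketch below uses the historical values reported earlier in the text for peptide VDR LBD 134−150 (mean 51%, standard deviation 3.7% over 127 DMSO-control measurements); the new readings and the mean ± 3σ rule are our illustrative assumptions, not the authors' procedure.

```python
# A minimal sketch of the kind of QC monitoring proposed in the conclusions:
# flag a reference peptide's %D reading that falls outside historical
# control limits (mean ± 3 sigma), so systematic drift is caught as it
# appears rather than after a study is complete.
HIST_MEAN, HIST_SD = 51.0, 3.7  # VDR LBD 134−150 DMSO control, from the text

def out_of_control(percent_d, center=HIST_MEAN, sd=HIST_SD, n_sigma=3.0):
    """True if a new %D reading falls outside center ± n_sigma * sd."""
    return abs(percent_d - center) > n_sigma * sd

# Hypothetical new readings: one within the 51 ± 11.1 limits, one drifted low.
assert not out_of_control(53.5)   # in control
assert out_of_control(38.0)       # flag for investigation
```

A rule of this form would have flagged the downward drift seen in the first Lilly data set (Figure 1A−C) as it developed, rather than at the post hoc analysis stage.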
■ ASSOCIATED CONTENT
*S Supporting Information
The Supporting Information is available free of charge on the ACS Publications website at DOI: 10.1021/acs.analchem.6b01650. HDX automation steps; schematic showing connections between PCs, PAL, MS, and HPLC; picture of the Lilly HDX system; HPLC flow paths and connectivity between pumps and valves; deuterium exchange (%D) vs time plots for three regions of the VDR LBD; complete list of parts and consumables; design of the test−retest study; complete data set from both laboratories. (PDF)

■ AUTHOR INFORMATION
Corresponding Authors
*E-mail: [email protected] (M.J.C.).
*E-mail: [email protected] (P.R.G.).
Notes
The authors declare no competing financial interest.

■ ACKNOWLEDGMENTS
The authors would like to acknowledge Eduardo Harguindey and Jonathan Diaz (Eli Lilly and Company, Alcobendas, Spain) for IT support. Funding was provided in part by National Institutes of Health grant GM084041.