DDN Editorial Roundtable: Current and Emerging Trends in Automation and Detection

At ALA's LabAutomation2009 in January, Drug Discovery News assembled an expert panel to discuss assay automation: how their operations run, as well as some of the hurdles they face in automating assays.

Participating in the discussion were: Jeff Paslay, vice president of screening sciences at Wyeth Pharmaceuticals; Marcie Glicksman, senior director of leads discovery at Brigham and Women's Hospital and the Harvard NeuroDiscovery Center; Ricardo Macarron, vice president of sample management technologies at GlaxoSmithKline; Adam Hill, director of hits discovery at Novartis; and Nance Hall, vice president and general manager of the Automation and Detection Solutions business at PerkinElmer.

DDN: There is no shortage of debate on 384- versus 1536-well formats. Which formats do you use in your labs?

Marcie Glicksman:
We do 99.9 percent of our screening in 384-well format. There have been one, maybe two, cases where we have moved down to 96, but the vast majority is 384. Our compound library is 150,000 compounds, so we are not really looking to move to any higher density.

Ricardo Macarron:
At GSK, we had a major move toward 1536 for everything that started in 2000, when we were anticipating running HTS for more than 100 targets a year. We are running HTS for around 50 targets per year these days. It is really a matter of the scale of the compound collection and the number of targets that dictates your need to go one way or the other. Right now, we are fifty-fifty 384 and 1536, and with low-volume 384, the push to go all 1536 has diminished. Our current emphasis is that we need to be not just efficient, but effective, and focused on the right thing. So, depending on the needs for each target and assay, we may go down to 96.

Glicksman: So you find some assays you can't do in 1536, otherwise you'd do everything 1536?

Macarron: Yes. Sometimes it is the platform, too. We had a lot of FLIPR 384s and we waited a while to move away from that big investment. We are now slowly moving to 1536 using aequorin technology.

Jeff Paslay: At Wyeth, we are 100 percent 384. All of our detection equipment and all of our automation equipment is capable of 1536, but within our organization, the process of setting up an assay and developing that assay and modifying that for high-throughput screening doesn't occur in a core group. So trying to equip all of the research areas with the capability of doing 1536, which they would transfer into our HTS group, was not something we could afford to implement, so we decided to go to 384.

We have gone to small volumes, so the cost savings per well for reagents is not that much greater [with 1536], and our library is smaller than GSK's and Novartis'. It is about 800,000 compounds. We don't have the need to try to save compound, because of the way we built the compound collection over time.

I do believe there is a place for 1536, and if it happens in Wyeth, then it will probably happen in the research areas where there are primary cells, or some stem cells.

Adam Hill: We tend to have a mixture. Almost everything starts off in 384 and if we can, we rapidly transition to 1536; if we can't, we are pragmatic and run the assay in 384. Screeners, on the whole, like the logistics of the 1536—they don't have as many things on their cart to move around. It has been our decision to have flexibility: All of our readers can read all densities, and our liquid handling can address 384- and 1536-well plates.

DDN: What are some of the specific challenges you have in automating your assays?

I think the five of us could end up agreeing on this: The person who ends up screening is usually not the person who had the original idea and developed the initial assay, the benchtop assay. Usually, there is at least one transfer, one hand-off. So the issue is you have the transfer from the original person who developed the assay to somebody in HTS assay development, and then they pass that to the screening group. To me, that is the real challenge: Trying to find out how to have the right processes and procedures and SOPs in place so that there is a good transfer of information and all knowledge about that assay.

Macarron: In [GSK's] case, I think we can say that has been solved, through so many years of iterations. Even though there could be different people doing assay development and doing the screening, chances are those people have had different roles in the past with GSK, so they understand each other pretty well. Now and then, you do encounter problems with a particular assay, but in general, it is pretty much a done deal.

Our bottleneck is really downstream from screening. Getting into the platform may take a month, more or less, but once you get there, you get results. Then getting chemists to act on the results, that is the bottleneck. Presenting the information on secondary screening with enough confidence that these are real hits and not artifacts from the assay, that is where we encounter most of our problems. So it is hit-to-lead, it is not so much getting to hits.

Hill: In assay development, there is the initial transition, as you mentioned. We really try to get in early with project teams to be sure that we are thinking along the same lines. One of the difficulties we frequently encounter is the transition of reagent handling from the person at the bench to the robotic system. We spend a lot of time tweaking assays, adding adjuvants and so on, to make sure that things aren't sticking to tubing or disappearing into plasticware, over time, on the system. Trying to find a plastic that has the same properties in these situations could be of great help to us.

Glicksman: Some of the issues of assay development can't be solved with reagents or automation, because even though you are running kinases and you might be using all the same platform for them, you still have to work out the best parameters for each kinase, and there are not a lot of shortcuts for that.

Macarron: It is pretty much a case-by-case basis.

Glicksman: Even with the same reagents?

Macarron: Yes, in general. So with kinases, as an example, maybe 90 percent of the knowledge from the last target can be applied to the new one, but you do encounter specific problems with every target. Going into other areas, it is even more diverse in terms of the need to characterize and tailor the solution to the problem.

Glicksman: I think the area that is still most challenging, and we are new at this, is in imaging assays. In the past, we would do imaging assays as secondary assays, but we are starting to use them for primary screens. I think there are bottlenecks in terms of the data collection and the analysis. If you could have one machine that would do it faster….

Hall: So is it the speed of the instrumentation? Or is it the capabilities of the instrumentation? Or is it the data?

Glicksman: All of those things.

Hall: So if you could have anything, what would it be?

Glicksman: A faster imaging reader. I haven't investigated different machines, because we have a GE InCell Analyzer 1000, but the ability to do high-content imaging that is faster at collecting data and analysis would be helpful.

Hall: Multi-mode readers continue to be viewed as a key growth area in the life science instrument market, as there still is a continued desire for more flexibility, and for instruments that can perform more than one task. We increasingly see customers asking for imaging capabilities, so this is definitely an area where we not only have leading products and technologies, but it's also an area we continue to explore. 

DDN: What can be done to ease some of the challenges of moving to an automated system?

Hill: It is one of those areas that can cause frustration; for example, when the assay suddenly stops working because you have made what you judge to be a very subtle change. I don't know whether it is a plastics issue or something else, but perhaps surface chemists could look at that to see if they can come up with a good solution. Obviously, you've got increased residence time in tubing relative to pipettes that you have to consider. One of the things that I keep thinking is, why not just do everything with 384-well pipettors? Maybe that would accelerate assay adaptation.

Paslay: My organization is probably behind Novartis and GSK in ramping up high-throughput screening. We started a bit late. When we were doing this five years ago, to get around some of the issues mentioned, we put 384-well pipettors that are very close to what is on the automated system into our assay development groups, which are scattered across four or five sites, and we put in similar detectors. We have developed SOPs and protocols for how people should develop assays, which include testing plastics, so all of the reservoirs we use are in the assay development labs, and they do the stability studies. We give them the tubing, and we calculated volumes lost on the platform so they could factor that into costs.
I think Adam is right. If you can put some of the same tools and materials sciences expertise in the hands of the assay developers, it helps get over some of the issues.

Hall: Do you see automation as a challenge because of the level of complexity of the instrumentation, making it difficult to be able to bring it earlier into the game?

Hill: I think part of the challenge is need. If you are a disease area with a limited budget, would you want to spend money on something that you are going to use two or three times a year, versus buying the next gadget, which will help you further downstream in your lead development?

Another challenge, even if the disease area does have the machinery, is making sure it is being used properly to be able to translate to HTS. Novartis Cambridge ramped up very quickly, bringing a lot of new talent into the organization over the last five years. Initially, it was very hard to get all the right pieces in place. We have now built a body of knowledge where people are developing their second and third assays. The disease areas now really understand what they need to do to bring a good assay forward.

Macarron: In our case, we have been playing with different models. If there is a new target, it goes directly to the assay development group that is central and very connected with the screening group. So you don't have the therapeutic areas starting a new assay that may encounter the problem of translation of a tube assay to a plate assay. We just start with the group doing the plate assay.

Paslay: What you are hearing is that the five of us are operating on different scales. It would be nice to have a centralized assay development group side by side with the screening group as they took an assay from just a concept and did all of the reagent prep. I know other organizations do that, but we are too small.

So I think Ricardo's model is one that I would prefer, especially on the assay development if it was central. I think the body of knowledge that Adam is talking about would grow faster and there would be much more synergy and critical mass—it's the way to do it.

We have the assay developer run the screen and carry the project all the way through to the medicinal chemistry start, so we don't lose too much in translation. The other thing we try to do is make sure that for every project we run a design of experiment (DOE), using technology not typically available in the disease areas. This allows us to discuss with the disease area all possible factors affecting the assay up front and, after the DOE, to understand which factors are important for assay robustness and sensitivity.

The DOE sets the tone for a good collaboration. Everyone contributes to the design; you run the DOE and discussions then become data-driven. It works very well. It then becomes less important who does the work for the initial development. That gets divvied out among the disease area and screening resources.

Hall: I think where I was going with this was: if you are running an assay a few times a year, and you are aware of the potential instrumentation that may be used downstream, because it is already in the organization, would you consider some sort of outside service as part of your process? I know we've had conversations with some of you regarding the expertise that's required to run some of the instruments—where perhaps a service providing not only the people, but also the instrumentation, could help accelerate development, or help you get up to speed faster.

Would there be a transfer issue?

Hall: There still might be a transfer issue, but I am asking in regard to assisting with capital budget constraints and education. Would this type of service help?

Hill: We just introduced what we call the FAST Lab (Facilitated Access to Screening Technologies). We set it up with the disease areas. They provide a sponsor to bring a project into the FAST lab. The sponsor is guided in the use of all logistics, liquid handling, and data handling that is in the screening lab. They learn what it means to run a small-scale screen (less than 100,000 data points), and the FAST Lab enables them to bring forward a full HTS, identify tool compounds, or expand SAR.

Macarron: We mentioned earlier the problem of transferring an assay from tube to plate, and the friction between teams it may create. Another aspect of this is quality: even though the perception from the "low-throughput biochemists" is that doing screening at high scale means you lose quality, our high-throughput groups have actually put a lot of emphasis on quality, much more than anyone running small-scale tests.
So that has been a philosophical debate with these groups. I'm glad that at GSK it is over, in terms of the set-up; it is just an organizational change. Instead of starting at small scale and moving to high scale, the assay goes straight to a group that is going to do everything.

DDN: Which labeling technologies would you use for particular assays?

I like luminescence if it's an option, for either reporter-gene-type assays or even readouts from ELISAs, but they have their drawbacks, too. For kinases, we tend to use LANCE or HTRF.

Hill: We tend to use them with reporter-gene assays. We use luciferin as the substrate, and then there is no lysis or anything like that. You can just get the readout straight when you want. The assay development for that is easy as well. I think for kinases, again, HTRF is probably first choice.

We do screen with mass spectrometry, so that's label-free, and that actually is quite a powerful tool. It is as close as you can get to artifact-free; it is pretty good on that front.

DDN: What kind of future do you envision for label-free technologies?

I'm attracted to that. But for some of the label-free methods, even though you can measure a change, it is not clear what that change is.

Macarron: I agree, I think it comes with a price, as well.

Glicksman: Yes. But it certainly is attractive.

Macarron: I remember the days when we started tying fluorophores to substrates, or ligands, and everyone was horrified: "Oh, that is going to change the biochemistry." Actually, that is not a major problem. Sure, there are artifacts, but in general, labeled ligands mimic unlabeled ones very well. Label-free assays are a good addition to the toolbox, but I don't think they are a panacea for addressing fundamental problems, because quite often, you are still left with questions. For instance, in a whole-cell assay looking at antagonism of a membrane receptor, is the observed activity related to the receptor of interest or to other receptors?

Hall: Ricardo, to address your "it comes with a price" observation—if someone could produce an instrument that was more in, say, a multi-mode type of price range, would that allow you to evaluate label-free technologies more, to determine their capabilities, or to address the challenges of the technology? Or is that not the bottleneck?

Hill: I agree with Ricardo.  I think in label-free cell-based detection systems, it is not so much the detection system, but convincing yourself that the pharmacology is correct in the cell. This has been my reservation for the past five or six years.

We are now looking at label-free cell-based detection, not for screening, but for use in profiling. You may spend six months examining the pharmacology and convincing yourself that the detector is giving you the right answer, but if you are going to use that for three or four years, the up-front investment is worth it.

I think detection technologies in general have become very robust. You can very easily weed out the artifacts based on interference with the detection technology. It is much harder to answer "is my compound truly interacting with my target?" This is where label-free technology could play a part. I'm not convinced that there is one machine out there that can really give you the clear-cut answer. If you have compounds that are promiscuous, that are aggregators, you know you want to get rid of them. Simply looking to see if they interfere with the detection technology isn't good enough. If a compound is denaturing the enzyme, and that is its mode of inhibition, as opposed to truly binding at an active site, this approach would not work.

What I am seeing is a renaissance in biochemistry, going back to real old-fashioned biochemistry, where people are starting to look at mechanism of action by doing multivariate enzyme experiments to make sure compounds are acting in a relevant way. Coupling biochemical information with label-free biophysics, we can gain further insight to be able to say, "Yes, I have real compounds here."

Paslay: It does offer an advantage if you have primary cells, stem cells, etc., so I can see people using it in that case, but I still think there is a risk of not being able to fully understand all the pharmacology behind the answer.

Macarron: I think it is a stepwise process: you can start with a simpler assay to screen 2 million compounds, and then get into more specific assays for some thousands of hits. So I look at label-free as a complement to a primary assay that may use labels of some sort.

Paslay: I think Adam brought up a good point about mass spec. I think that is something that will evolve to be a little higher throughput, even for cell-based, looking at metabolites, that's a nice …

Macarron: The ultimate answer.

Glicksman: It's direct.

Paslay: Very direct. So I think it is going to take some evolution.

Hill: I'm hoping that the proteomics revolution is going to be driving it.
