Multiplexed imaging and how it compares to other imaging techniques
Multiplexed imaging plays a crucial role in cancer research by enabling comprehensive characterization of the tumor microenvironment. It allows researchers to simultaneously visualize many cell types and their interactions within the tumor, providing valuable insights into the tumor ecosystem. This information can help in understanding the tumor's response to therapy, predicting treatment outcomes, and identifying potential targets for immunotherapy. Multiplexed imaging also facilitates the analysis of biomarker expression patterns in cancer samples: by examining multiple biomarkers at once, researchers can assess the expression levels and spatial distribution of the molecules involved in cancer progression. This aids in identifying molecular signatures associated with different tumor subtypes, stratifying patients, and predicting therapeutic response. Such comprehensive analysis empowers researchers to unravel the complexities of cancer biology, paving the way for personalized medicine and the development of targeted therapies.
In this interview, Dr. Luke Gammon, Screening Core Facility Manager at Queen Mary University of London, and Dr. David Pointu, Advanced Workflow Specialist at Leica Microsystems, discuss multiplexed imaging and how it compares to other imaging techniques. They cover the main goals of the facility, the importance of multiplexing in accessing and reusing precious samples, and the benefits it provides over traditional imaging methods. Dr. Pointu also provides insights into how multiplexing with Leica Microsystems' Cell DIVE helps produce clear tissue images at scale, using 60+ biomarkers. They also discuss their pain points and the future potential of multiplexing.
Listen to the interview
Transcript of the interview
David: Luke, can you tell us a little bit about your activities in the facility and what are the main questions that your users are currently looking to answer?
Luke: Yeah, sure. Well, the facility was actually set up as an siRNA screening facility after some purchasing of liquid-handling robotics, automated microscopes, and some high-content software back in, I think it was 2006. It was all really designed to do high-throughput multi-well compound screens of 96- and 384-well plates, that sort of thing. But then in 2013, Cleo Bishop, the academic facility lead, was successful in purchasing new equipment. We then updated the facility a couple of years later, and the facility's new equipment was really flexible and easy to use. And so we started doing all sorts of assays, with all sorts of projects coming through, and we now go from general well- and slide-based imaging right up to large 20,000-compound screens and everything in between, really. These are generally monolayer cultures with 2-, 3-, or 4-channel imaging, but we also do a lot of 3D imaging now for invasion assays, migration assays, and the sort of organ-on-a-chip type of work.
David: That’s really great. And in your view, what would you see as the main goal of the facility? How do you see your role?
Luke: It’s a good question. I guess it depends on whom you ask, but I would say it’s probably to ensure that we do really high-quality research and stay ahead of the game in terms of technology and techniques. Like most core facilities, it just makes sense to have all the equipment managed centrally and supported, rather than an independent research group having to do that – having to find funds and manage it. It also means that the equipment gets used more. You know, it’s expensive, so we just don’t want it sitting there gathering dust.
David: OK, cool, so that means that you also have to stay state-of-the-art, and multiplexed imaging is one of the new techniques that you recently got in the facility. Could you also tell us about the context that drives this interest and this need in your facility?
Luke: I would say possibly all the projects are always there, they’re in the back of the mind of the PIs (principal investigators) that have this archive material – they’ve got blocks, they’ve got slides which are kind of just hiding in cupboards somewhere. They’d be very precious, quite rare, and they’ve just been waiting for the technology to catch up. We’ve always been able to do small-scale multiplexing. It’s been around for a while, but a really robust method and workflow just hasn’t been there for people to be able to do that sort of larger-scale multiplexing assay. If you look at, say, rare skin disorders, something like that, where the biopsy material will be quite hard to come by, you need to really make the most out of every slide and every block that you have. Even if you go to the organ-on-a-chip type of work, researchers spend such a lot of time optimizing that assay and they can spend a lot of money developing those models. They just don’t have an endless supply, so being able to reuse essentially the same material over and over again is just such a big step change for projects that have always been there. As I said, they’ve just lacked that technology jump, and I think we’ve finally got it now.
…, you need to make the most out of every slide and every block that you have … being able to reuse essentially the same material over and over again is just such a big step change for projects.
David: OK, I see. So they were already waiting for a major solution to start multiplexing – could you also tell us why it was so important for their work to access the multiplexing technology, and ultimately, what insights does it provide that you cannot get from the other techniques, the other microscopes that you already had in the facility?
Luke: I mean the biggest one, I would say, is the true single-cell data at the protein level. You can look at marker expression, you can look at marker profiles. Even if you were to stain serial sections, for example – which would be, I’d say, the closest to what multiplexing does – even with the best image registration around, you just can’t do this accurately. And, of course, you can look at the transcription level, which can be on a per-cell basis, but transcription levels don’t always translate to the protein level. Again, you need to see what’s actually there. You also don’t destroy the material like other methods out there do. You can go back and do follow-up markers, say, from your initial findings. So you could do some level of work and then take a pause, have a look at what you’ve got, and then that can drive where you go from then onwards. As I said, with other technologies that would be completely impossible. You know, the sample would be gone. Plus I guess everyone loves pretty pictures, right?
David: Yes, I see, this is great. Is there a specific type of sample that you typically examine, or do you work with various types of tissues and also cancer specimens at the end?
Luke: Well, it’s kind of hard to say at the moment, because the technology is still quite new to us, but so far we’ve had all kinds of things like organotypics, large tumor sections, and tissue microarrays. So the vast majority of the work that’s coming through is with those materials, but I can see it being spread out to a lot more.
David: OK, I see. And because you said so, now you’ve got pretty images. We know that this kind of technology also generates a lot of data. So I would like now to talk about the analysis. How do you do the analysis and, also very importantly, how do you share the results of this analysis with the end users in the facility?
Luke: Well, the images generated from the Cell DIVE system are just pyramidal TIFF files, so they can be analyzed with a number of programs. I’ve tried a few of the open-source ones, like QuPath and ImageJ, and they do have their place, but from a workflow perspective, we went with Halo from Indica Labs. We agreed that it’s a lot simpler for a user new to the facility to just jump straight in and analyze their data. They don’t have to learn any coding, and it’s very easy to replicate for other users. We purchased a number of modules with essentially this as the base bit of software, and the main one we use is HighPlex, where you can look at an unlimited number of biomarkers or combinations of markers. From there, the researchers either take the top-level information out – that is, the number of cells with a particular marker or profile. They might go on and do a little bit more analysis within Halo, so they might generate some spatial analysis, and then they can take that and that would essentially be their end result; or they export the single-cell target data and then go off and plug it into other software packages.
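To illustrate the kind of downstream step Luke describes – exporting single-cell data and plugging it into other software – here is a minimal sketch in Python with pandas. The column names, values, and thresholds are hypothetical; a real Halo or Cell DIVE export would have different columns and many more measurements.

```python
import pandas as pd

# Hypothetical single-cell export: one row per cell, one column per
# marker's mean intensity. Real exports differ in naming and content.
cells = pd.DataFrame({
    "cell_id": [1, 2, 3, 4, 5],
    "CD3_mean": [12.0, 85.0, 90.0, 7.0, 60.0],
    "CD8_mean": [5.0, 70.0, 10.0, 3.0, 55.0],
})

# Example per-marker positivity thresholds (in practice these come
# from careful per-antibody optimization).
thresholds = {"CD3_mean": 30.0, "CD8_mean": 30.0}

# Call each cell positive/negative per marker...
for marker, t in thresholds.items():
    cells[marker.replace("_mean", "_pos")] = cells[marker] > t

# ...then count cells per marker profile, the "top-level information"
# (number of cells with a particular marker or profile).
profile_counts = cells.groupby(["CD3_pos", "CD8_pos"]).size().rename("n_cells")
print(profile_counts)
```

The same table can then be handed on to any other analysis package for spatial or statistical follow-up.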
They provide a manual about multiplexing, it’s something like 300 pages, it covers every single component of multiplexing, and it’s in step-by-step detail, so that’s worth its weight in gold.
David: I was wondering, in fact, do you do the analysis for the end users or do they do the analysis themselves?
Luke: There’s a bit of both really. Where we have users that are students, PhD students, postdocs, it depends on what they want to get and how much time they have. It could be that I would provide some training and then they would just jump straight to it and they would be fine. Or we can build assays together or I can do the analysis for them if they give me the questions that they want answered from that data.
David: OK, very good. And even if it’s still quite recent, what do you see as the main challenges of multiplexing for the activities that you carry out in the facility?
Luke: Well, there are a few for sure. I think the main one is probably antibodies. There are so many times you can purchase a new antibody and it either just doesn’t work or it’s not specific, and you can spend a lot of time optimizing. Antibodies are just really time-consuming, and that has a cost; just purchasing the reagents and the antibody itself can be quite expensive. But I think the main challenge is really just to make sure that the whole process is robust and that anyone can come along with the right training and repeat it.
David: OK, I see. I think it’s a good transition to ask you why did you choose to use the Cell DIVE system and what challenges does it help you overcome.
Luke: Yes, that goes back to the previous question: Leica have spent so much time making sure that what they put out works. The first thing I’d mention is the manual. They provide a manual about multiplexing, it’s something like 300 pages, it covers every single component of multiplexing, and it’s in step-by-step detail, so that’s worth its weight in gold. That right there takes out all the work that someone would have to do in the lab, optimizing a new system for new antibodies, all those sorts of things. They’re just taking those steps out of the equation, and it just speeds the process up for any new projects, and then the whole process is really scalable and flexible. And that’s really important for a facility like ours where we’ll get all of those types of projects coming through. You don’t need any kind of special slides. You don’t need to be tied into any particular labeling technology. On the broad spectrum, those are the things that we would find challenging as a core facility, and the flexibility of the Cell DIVE system is exactly what we needed.
David: Yeah, very good. It’s true that we have spent an enormous amount of time and effort validating the entire workflow prior to commercializing it. That means at the end you get a very mature system. Validation means we validated the imaging, the workflow, and the antibodies – as you said, that’s a very important part. Even if it’s still recent, did you already get new insights or some good results thanks to this technology?
Luke: Yeah, it is a little bit too early to say. I can see already that I am definitely confident in the method itself, as it’s very highly reproducible. Having discussions with people coming along with their projects, you can see how flexible we’re going to be able to make it for our users. So although we haven’t got any insights yet, we can see that when a project comes through that’s got 50 slides and they only want to look at 8 markers, it can be done. Or the other end of it, where you could have 8 slides and 50 markers. So there’s something for everyone. I think the insights will come, it just takes a long time, and the users need to be able to get to grips with the technology themselves, and that leads back into how they design their protocol and what they’re trying to answer. It’s an iterative process where they come along, you train them initially, and then they go, wow, it can do XY&Z, and then they go back to the drawing board, because they think “well, if it can do this, then maybe I design my protocol or my project a little bit differently”.
David: Yeah, this is great. And I know that you really also support the end users to get ready to use this technology and that’s great. The last question for you: what’s next, and how do you see these multiplexing activities evolving at the facility? What do you have in mind?
Luke: Well, I guess the next jump – and you can already see it happening – is probably using some form of AI for the analysis in projects. The equipment is probably going to get better over the years. It’s going to get faster, and you may well be able to automate more parts of the process. That, again, goes back to being more robust, more streamlined, better for everyone really. But even where we are now, we’re drowning in data: even if we only consider two markers, e.g. a nuclear marker and a cell marker, we get hundreds of morphological measurements per cell, and you need something to really make sense of that. We’re quite good at picking up patterns in images. The end users are looking at that and saying to themselves “I can see something”, and they’ll maybe use the analysis to confirm what they’re seeing. Or they may be doing it in a completely unbiased way, but when it comes to the data, thousands of lines of single-cell analysis is something that you can’t normally spot patterns in very easily, so I think that’s where we’re heading, really.
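Luke’s point about thousands of single-cell rows with hundreds of morphological measurements is exactly where unsupervised methods can help spot patterns. As a minimal sketch – entirely synthetic data, not the facility’s pipeline – here is how cells might be clustered by their measurements with scikit-learn:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic single-cell table: 1000 cells x 50 morphological
# measurements, drawn from two artificial populations. Real data
# would come from the facility's single-cell exports.
pop_a = rng.normal(0.0, 1.0, size=(500, 50))
pop_b = rng.normal(3.0, 1.0, size=(500, 50))
X = np.vstack([pop_a, pop_b])

# Standardize each measurement so no single feature dominates,
# then cluster the cells into two groups.
X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)

# Cells from the same synthetic population should mostly share a label.
print(np.bincount(labels))
```

An unbiased grouping like this can then be inspected against what the researcher thinks they can see in the images.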
David: Very interesting. And it’s also a good transition for me to say that we have just released a complete automation solution that automates the wet-lab workflow. So you can really walk away after defining the experiment that you want to carry out on the Cell DIVE. That can really help the researchers adopt this technique.
Luke: So that’s how quickly the technology moves on! We’ve had it for less than 12 months and new technology is already available.
David: Yeah, exactly! So thank you very much, Luke. It was a great discussion and looking forward to the next evolution of multiplexing with the Cell DIVE system. Thank you.
Luke: No, thank you very much for the opportunity. Great to have a chat and we are very happy with the system. Looking forward to continuing collaborations.