The roots of image analysis
The invention of the first microscopes inevitably led to the question of the size of the objects viewed through the eyepiece and their relationship to each other. Another equally important question was how to deduce the volumes of the objects from a 2D section. These quantitative microscopy questions were addressed initially by stereology, which in turn contributed to the principles on which image analysis is based.
The first experiments and results are attributed to Achille Delesse, who demonstrated that the area fraction of a phase on a section equals its volume fraction. In 1930, Thompson and Glagolev established the pioneering formula PP (point fraction) = LL (linear fraction) = AA (area fraction) = VV (volume fraction).
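The point-count idea can be illustrated with a minimal Python sketch (not from the original article): random test points are overlaid on a section, and the fraction of points landing inside a phase (PP) estimates that phase's area fraction (AA), which by Delesse's principle estimates the volume fraction (VV). The disc used as a test phase is invented for illustration.

```python
import random

def point_count_fraction(in_phase, n_points=100_000, seed=0):
    """Estimate the area fraction of a phase on a unit-square section
    by overlaying random test points and counting hits (PP)."""
    rng = random.Random(seed)
    hits = sum(in_phase(rng.random(), rng.random()) for _ in range(n_points))
    return hits / n_points

# Invented test phase: a disc of radius 0.3 centred at (0.5, 0.5),
# e.g. the equatorial section of a spherical inclusion.
# Its true area fraction is pi * 0.3**2, roughly 0.283.
in_disc = lambda x, y: (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.3 ** 2

estimate = point_count_fraction(in_disc)
print(round(estimate, 3))  # close to the true area fraction of about 0.283
```

With 100,000 points the statistical error is well under one percentage point, which is why point counting remained a practical manual method long before automation.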
The first TV-based image analyser
Image analysis as we know it today was only made possible by the development of television technology: 50 years ago, in 1962, the first television-based image analyzer for microscopic images was developed by Metals Research – a Cambridge-based company that later became part of the Leica Group. Using a television camera as the input device, the so-called QTM A (Quantitative Television Microscope) was entirely analogue in operation: a threshold was applied to the video signal, the result was displayed as a binary image on a second TV screen, and the area measurement was read from a meter. Nevertheless, it marked the beginning of automation in image analysis – and at 20 msec per image it was quite fast, even compared with today's instruments.
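In software terms, what the QTM A did in analogue electronics amounts to thresholding and area measurement. The sketch below shows the principle in Python; the tiny 4 × 4 "image" and the threshold value are invented for illustration.

```python
# Invented 4x4 grey-level image standing in for a video frame.
image = [
    [ 12,  40, 200, 210],
    [ 15,  35, 220, 205],
    [ 10, 180, 190,  30],
    [  8,  20,  25,  22],
]

THRESHOLD = 128  # grey level above which a pixel counts as "detected"

# Thresholding yields the binary image the QTM A showed on its second screen.
binary = [[1 if px > THRESHOLD else 0 for px in row] for row in image]

# The "meter reading": the area of the detected phase, as a fraction.
detected = sum(sum(row) for row in binary)
total = len(image) * len(image[0])
print(f"area fraction: {detected / total:.2f}")  # 6 of 16 pixels -> 0.38
```

The same two steps – threshold, then count – are still the core of many routine area-fraction measurements today.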
The initial acceptance of the concept spurred the development of the QTM B in 1963, which proved to be the first commercially successful automated instrument in this field. These early image analysis systems were mainly used for scientific purposes. The users came from the steel industry or from metallurgical and mineralogical research institutes, but the potential in the life sciences was quickly recognized.
The digital era
The digital era of image analysis began in 1969, when the Quantimet 720 was brought to market. This instrument used highly modular, hardware-based image processing logic. Thanks to special tube cameras, a resolution of 869 × 704 pixels was achieved and the entire image was digitized. The system was much more flexible than its predecessors and offered automated control of the microscope stage and focus. When combined with accessories, these systems could fill a small room. They could also perform some clever processing functions, such as image erosion and dilation, and a "light pen" allowed the image to be edited directly – just a few years ahead of Photoshop.
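Erosion and dilation are simple neighbourhood operations, which is why they could be wired into hardware logic at the time. A minimal pure-Python sketch of both, using a 3 × 3 structuring element and an invented test image:

```python
def neighbours(img, y, x):
    """3x3 neighbourhood values, treating out-of-bounds pixels as 0."""
    h, w = len(img), len(img[0])
    return [img[y + dy][x + dx] if 0 <= y + dy < h and 0 <= x + dx < w else 0
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)]

def erode(img):
    # A pixel survives only if its entire 3x3 neighbourhood is set.
    return [[1 if all(neighbours(img, y, x)) else 0
             for x in range(len(img[0]))] for y in range(len(img))]

def dilate(img):
    # A pixel is set if any pixel in its 3x3 neighbourhood is set.
    return [[1 if any(neighbours(img, y, x)) else 0
             for x in range(len(img[0]))] for y in range(len(img))]

# Invented test image: a 3x3 blob of foreground pixels.
blob = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]

print(erode(blob)[2])  # only the centre pixel survives: [0, 0, 1, 0, 0]
```

Chaining the two (erode then dilate, or the reverse) gives the classic opening and closing operations used to remove noise or bridge gaps.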
Of course, other optical companies entered the new market, and Metals Research – known as IMANCO from 1969 onwards – was not the only supplier. "Interestingly, some of the original competitors are now united in the Leica Microsystems family," recalls Geoff Jenkinson, Product Manager, who has been with Leica Microsystems for 40 years. "The toughest competition came from the US company Bausch & Lomb. They had better access to the American market, but never managed to gain a significant foothold in Europe. Of course, Leitz had sophisticated image analysis systems in their portfolio as well."
The first computers
From 1980 onwards, the increasing power and declining cost of computers led to computer control of the image analysis system. Although the actual image processing was still done in hardware, the new microprocessors enabled images and results to be stored. Remarkably, the introduction of computers initially slowed down the image analysis process, because early computers were unable to digest and process the vast amount of data in an image. The Quantimet 800 was developed using the first Apple II computer imported into the UK. Further innovations incorporated in the Q900 included complete digital image storage at near-megapixel resolution, skeletonization, and the first use of the famous Quips imaging language.
Important foundations for the further development of image analysis were laid in the 1960s by Jean Serra and Georges Matheron of the École des Mines in Paris. Their concept of mathematical morphology treated the image as a numerical array of pixel values. First applied to binary images, the concept was later extended to grayscale images. Thanks to the group's work, new features such as distance transforms and the watershed transform were introduced to image analysis in the following years. The Quantimet 570 of 1989 and the Quantimet 600 of 1994 were a direct result of cooperation with the École des Mines.
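A distance transform labels each foreground pixel with its distance to the nearest background pixel; its local maxima are the usual seeds for watershed segmentation. A hedged sketch of the city-block (4-connected) variant, implemented here with a breadth-first search from all background pixels, with an invented test image:

```python
from collections import deque

def distance_transform(img):
    """City-block distance from each foreground pixel to the background."""
    h, w = len(img), len(img[0])
    INF = h + w  # larger than any possible city-block distance in the image
    dist = [[0 if img[y][x] == 0 else INF for x in range(w)] for y in range(h)]
    # Start the breadth-first search from every background pixel at once.
    queue = deque((y, x) for y in range(h) for x in range(w) if img[y][x] == 0)
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and dist[ny][nx] > dist[y][x] + 1:
                dist[ny][nx] = dist[y][x] + 1
                queue.append((ny, nx))
    return dist

# Invented test image: a 3x3 blob of foreground pixels.
blob = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]

print(distance_transform(blob)[2])  # [0, 1, 2, 1, 0]
```

In a watershed segmentation, "flooding" would proceed downhill from the maxima of this map (here the single centre pixel) to split touching objects apart – which is why these two operations arrived in image analysis together.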
The software generation
The 1990s marked the beginning of the software generation in image analysis with the PC-based Quantimet 520. This led to the Leica QWin image analysis software, which was completely independent of special hardware. The latest incarnation is the Leica Application Suite (LAS), built around an optimized user interface. The LAS software integrates the latest advances in automated microscopy, computing and digital image analysis. For microscope documentation, it provides control of microscope and camera, giving calibrated and annotated digital images. From 1995 onwards, once image analysis had become fully software-based, companies that sold only software, not microscopes, entered the market. "We have always stressed the advantages of buying the entire and integrated system from a single-source supplier," reports Jenkinson. "And we seem to have been successful, as the competition from software companies has somewhat declined."
Today, computational microscopy plays an essential role in modern image analysis systems. It provides capabilities that extend the microscope image beyond its optical limits: the user can see elements that cannot be seen in a single microscope image – and as they can be seen, they can also be measured. Among the key features are extended depth of focus (results of this feature can be seen on www.antweb.org; see also the Science Lab article "AntWeb Documents the World of Ants"), extended field of view, which is vital for geological sections, particle identification applications (Leica Cleanliness Expert), and customization by macro in almost any quantitative microscope application.
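The idea behind extended depth of focus can be reduced to a very small sketch (not Leica's actual algorithm): given a stack of images taken at different focal planes, keep for each pixel the value from the plane that is locally sharpest there. Sharpness is scored here crudely as the contrast with the right-hand neighbour; real systems use far more robust focus measures and work in 2D. The two 1-D "scan lines" are invented test data.

```python
def focus_score(line, x):
    """Crude local-contrast focus measure: |pixel - right neighbour|."""
    return abs(line[x] - line[x + 1]) if x + 1 < len(line) else 0

def extended_focus(stack):
    """Per pixel, take the value from the sharpest plane in the stack."""
    width = len(stack[0])
    return [max(stack, key=lambda line: focus_score(line, x))[x]
            for x in range(width)]

# Invented focal stack: plane 0 is sharp (high contrast) on the left and
# defocused (flat) on the right; plane 1 is the reverse. The composite
# keeps the in-focus half of each plane.
plane0 = [10, 200, 10, 200, 100, 100, 100, 100]
plane1 = [100, 100, 100, 100, 10, 200, 10, 200]

print(extended_focus([plane0, plane1]))
```

The composite reproduces the high-contrast values from whichever plane is in focus at each position, which is exactly what makes every level of a rough specimen – such as an ant on AntWeb – appear sharp at once.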
With LAS, Leica Microsystems also maintains its focus on industrial materials applications and converts the understanding of specialists in the materials field into "expert" applications dedicated to these tasks. Currently there are ten expert applications available, among them the proven Steel Expert for measuring steel inclusions and the Cleanliness Expert for measuring and classifying particles on filters. The latest additions to the expert range are the Leica Dendrite Expert and the Leica Decarburisation Expert. LAS is also used in forensic laboratories. "To be able to distinguish various printing patterns from each other, we use the stereomicroscopes Leica M125 and MZ165 together with the LAS Measurement Module," says the Technical Director of Questioned Document Examinations at the Italian Forensic Police Service. "Through the microscope I observe tiny ink dots, details in handwriting and printed features. The LAS Measurement Module is irreplaceable for making measurements on the objects, quantifying all the data acquired and transferring the results to an Excel sheet." Of course, image analysis has long been established in the life sciences across the whole spectrum of applications on Leica widefield and confocal systems, from simple cell culture observation to sophisticated imaging tasks in biomedical research. "Innovations in pathology and cytogenetics, like the digital image hub and the slide scanner, have their roots in the long and wide experience with image analysis," concludes Jenkinson, "and there is definitely more to come."