In the previous entry of the newly started series "These Technologies Do Not Exist," I mentioned the National Academy of Sciences report on the Mathematics and Physics of Emerging Biomedical Imaging that came out in 1996. I am going to continue using this report as a basis for at least one more entry. Today we focus on the instrumentation and technology associated with X-ray CT (Computed Tomography) scans.
The technology is explained here. It is pretty simple compared to PET and SPECT. Roughly speaking, an X-ray source shines on a human subject. X-ray photons go through the body, and the different tissues provide different attenuation. On the other side of the body, detectors collect the X-rays and transform them into electrical currents. The difference between what was sent and what was received is the subject of the computation in Computed Tomography. Today, that computation matters less to us than how the conversion from X-ray photons to electrons is made. Let us recall, though, that I previously mentioned that CT ought to be investigated with the Coded Aperture concept to see if there is a way to improve the resolution or even estimate scattered fluxes; that is not the subject of this entry, as we look at the process that happens afterward (the coded aperture could be used on top of what we describe here).
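To make "the difference between what was sent and what was received" concrete, here is a minimal sketch in Python of the Beer-Lambert attenuation along a single ray; the tissue values and path lengths are purely illustrative and not taken from the report. The log-ratio it prints is the line integral of attenuation coefficients that the tomographic computation inverts, ray by ray.

# Minimal sketch: Beer-Lambert attenuation along one ray (illustrative numbers only).
import numpy as np

I0 = 1.0e6                                 # photons sent by the source (assumed)
mu = np.array([0.02, 0.19, 0.17, 0.02])    # attenuation coefficients, 1/cm (illustrative tissues)
dx = np.array([1.0, 3.0, 3.0, 1.0])        # path length through each tissue, cm

line_integral = np.sum(mu * dx)            # the quantity CT reconstruction actually recovers
I = I0 * np.exp(-line_integral)            # photons reaching the detector

# The measured projection value fed to the reconstruction is the log-ratio:
p = np.log(I0 / I)
print(f"detected photons: {I:.0f}, projection value: {p:.3f}")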
As one can see from the graph above (page 27 of the report), the 140 keV photons that have traveled through the body are either converted down into visible light (about a four order of magnitude change in photon energy, see below) in a scintillation chamber (akin to what is performed in the Anger Coding Scheme mentioned earlier), or they ionize elements in a chamber that provides the location of where the atoms were ionized (the negative and positive electrodes are location sensitive).
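As a quick sanity check on that orders-of-magnitude figure (the ~2.5 eV visible-photon energy below is my assumption, not a number from the report):

# Quick check of the "four orders of magnitude" claim.
E_xray_eV = 140e3        # 140 keV X-ray photon
E_visible_eV = 2.5       # typical visible/scintillation photon energy, roughly (assumed)
print(E_xray_eV / E_visible_eV)   # ~5.6e4, i.e. between four and five orders of magnitude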
A compressive scheme can be implemented in both cases. For the scintillation chamber, one could use, as in the case of the Anger Coding Scheme, a scintillation material that is incoherent, i.e. one that has occlusions and different light scatterers. One could also conceive of embedding some material within the scintillator that also scatters/reflects X-rays (by employing multi-layer materials, for instance). In the second case of the ionization chamber, we are confronted with a technology similar to the one used in the A2I very-high-frequency converter: a compressive sensing scheme could entail the ability to switch the polarity of the electrodes at high frequency.
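To give a flavor of what such a scheme would look like numerically, here is a small Python sketch, purely illustrative and not a detector model: a sparse signal is measured through a random +/-1 pattern (a stand-in for rapidly switched electrode polarities) and recovered with orthogonal matching pursuit. The signal size, number of measurements and sparsity level are all assumptions.

# Compressive measurement with a random +/-1 pattern, recovered by orthogonal matching pursuit.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                    # signal length, measurements, sparsity (assumed)

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # sparse "ionization" signal

A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)         # random polarity pattern
y = A @ x                                                     # compressive measurements

# Orthogonal matching pursuit
support, r = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
print("recovery error:", np.linalg.norm(x - x_hat))

The point is only that far fewer measurements than unknowns can suffice when the underlying signal is sparse; the open question is whether the physics of the chamber can realize such a measurement operator.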
Both of these schemes should be investigated by evaluating the different efficiencies of each solution. Instead of building detectors, one can start a trade study process by using appropriate Monte Carlo codes such as MCNP, Fluka or Geant4.
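Before reaching for MCNP, Fluka or Geant4, the kind of number such a trade study would produce can be previewed with a toy Monte Carlo of photon interaction in a slab of detector material; the attenuation coefficient, thickness and absence of real scattering physics below are all simplifications of mine, not results from those codes.

# Toy Monte Carlo of photon interaction in a slab (illustrative stand-in for a real trade study).
import numpy as np

rng = np.random.default_rng(1)
n_photons = 100_000
mu = 0.5          # total attenuation coefficient of the detector material, 1/cm (assumed)
thickness = 3.0   # slab thickness, cm (assumed)

free_paths = rng.exponential(1.0 / mu, n_photons)   # distance to first interaction
interacted = free_paths < thickness                 # photons that deposit something in the slab

efficiency = interacted.mean()
print(f"fraction of photons interacting in the slab: {efficiency:.3f}")
print(f"analytic value 1 - exp(-mu*t):               {1 - np.exp(-mu*thickness):.3f}")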
Why would we want to do this? In PET, for instance, the positron mean free path bounds the resolution to no smaller than about 2 mm. Here, our concern is not only to collect X-ray fluxes; the approach given here could also provide additional information about the direction of that flux. X-rays do scatter in the body, and with the current collection system we do not know very well whether a photon has collided several times in the body (and changed direction or energy) or not. The current approach uses different systems in the collection of the flux to make sure the right flux is known, with the added help of image processing. In the proposed approach, the need for ad hoc image processing would probably be decreased, while a better 3-D description could be obtained.
December 13, 2009 from Benson, Arizona
Dear Igor,
Compressive imaging was brought to my attention by:
Pan, X., E.Y. Sidky & M. Vannier (2009). Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction? Inverse Problems 25, 36p. #123009.
I suppose I’ve been using compressive imaging since I invented the ART algorithm. Its relationship to the Kaczmarz algorithm is discussed in:
Gordon, R. (2010). Stop breast cancer now! Imagining imaging pathways towards search, destroy, cure and watchful waiting of premetastasis breast cancer [invited]. In: Breast Cancer - A Lobar Disease. Eds.: T. Tot, Springer: in press.
which I’ll send on request to gordonr@cc.umanitoba.ca. You’ll find many ideas in there, including Wiener filtration, which might for example substantially improve single pixel imaging, although the first example of that was actually proposed and implemented by Eric Harth:
Harth, E. & E. Tzanakou (1974). Alopex: a stochastic method for determining visual receptive fields. Vision Res 14(12), 1475-1482.
Tzanakou, E., R. Michalak & E. Harth (1979). The Alopex process: visual receptive fields by response feedback. Biol Cybern 35(3), 161-174.
Harth, E. (1982). Windows of the Mind, Reflections on the Physical Basis of Consciousness. New York, William Morrow and Co.
My point in writing is to get some of the keen minds working on compressive imaging to come up with ways of making mass screening for premetastasis breast cancer at very low dose a reality. Yes, these technologies do not exist. They should. Thanks.
Yours, -Dick Gordon
--
Dr. Richard Gordon, Professor, Radiology, University of Manitoba GA216, HSC, 820 Sherbrook Street, Winnipeg R3A 1R9 Canada E-mail: DickGordonCan@gmail.com Skype: DickGordonCan, Second Life: Paleo Darwin, Fax: 1-(204) 787-2080 Embryo Physics Course: http://embryophysics.org/
http://bookswithwings.ca
http://www.umanitoba.ca/faculties/medicine/radiology/stafflist/rgordon.html