During MIA09, I talked to some of you, and it may have looked like I was ranting about the need to be closer to the instrumentation. Then I received unexpected feedback from none other than Richard Gordon, one of the people who devised ART for Computed Tomography. Dick sent me the following in the comment section of one of the "These Technologies Do Not Exist" entries. Here is what he has to say:
Dear Igor,
Compressive imaging was brought to my attention by:
Xiaochuan Pan, Emil Sidky and Michael Vannier (2009). Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction? Inverse Problems 25, 36p, #123009.
I suppose I’ve been using compressive imaging since I invented the ART algorithm. Its relationship to the Kaczmarz algorithm is discussed in:
Gordon, R. (2010). Stop breast cancer now! Imagining imaging pathways towards search, destroy, cure and watchful waiting of premetastasis breast cancer [invited]. In: Breast Cancer - A Lobar Disease. Ed.: T. Tot. Springer: in press.
which I’ll send on request to gordonr@cc.umanitoba.ca. You’ll find many ideas in there, including Wiener filtration, which might for example substantially improve single pixel imaging, although the first example of that was actually proposed and implemented by Eric Harth:
Harth, E. & E. Tzanakou (1974). Alopex: a stochastic method for determining visual receptive fields. Vision Res 14(12), 1475-1482.
Tzanakou, E., R. Michalak & E. Harth (1979). The Alopex process: visual receptive fields by response feedback. Biol Cybern 35(3), 161-174.
Harth, E. (1982). Windows of the Mind, Reflections on the Physical Basis of Consciousness. New York, William Morrow and Co.
My point in writing is to get some of the keen minds working on compressive imaging to come up with ways of making mass screening for premetastasis breast cancer at very low dose a reality. Yes, these technologies do not exist. They should. Thanks.
Yours, -Dick Gordon
--
Dr. Richard Gordon, Professor, Radiology, University of Manitoba GA216, HSC, 820 Sherbrook Street, Winnipeg R3A 1R9 Canada E-mail: DickGordonCan@gmail.com Skype: DickGordonCan, Second Life: Paleo Darwin, Fax: 1-(204) 787-2080
Embryo Physics Course: http://embryophysics.org/
http://bookswithwings.ca
http://www.umanitoba.ca/faculties/medicine/radiology/stafflist/rgordon.html
This is interesting for many reasons. On Tuesday afternoon, Patrick Louis Combettes presented an interesting tutorial on proximal splitting methods for signal recovery, where we learned that we should not be using the wording "Split Bregman method," as these methods really are proximal splitting methods in the first place. While I see much aggravation arising from this somewhat incendiary statement, from a compressive sensing point of view it may become a quaint little souvenir, as the recent Approximate Message Passing algorithm seems to wash them all away. Anyway, during Patrick Louis Combettes' presentation, we also learned that neither ART nor the Gerchberg-Papoulis or Gerchberg-Saxton (GS) algorithms handle well solutions that require projections onto convex sets (if I understood correctly). ART is the algorithm developed by Dick, while the GS algorithm was originally used by the ghost imaging folks. As it turns out, the authors of the ghost imaging paper went ahead and used a more adequate solver, GPSR, which minimizes the l_1 norm, and obtained much better results. Similarly, using any of the reconstruction solvers devised in the compressive sensing setting with random projections obtained from a potential compressive CT scanner (however it may be implemented) will yield good results, because these solvers make an assumption on the solution.
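To make that last point concrete, here is a toy sketch in Python (my own illustration, not GPSR itself): on an underdetermined system, an l_1-regularized solve via plain ISTA recovers a sparse signal, while a Tikhonov-style l_2 solution smears the energy across all unknowns. All sizes and regularization weights below are illustrative assumptions.

```python
# Toy comparison: Tikhonov (l2) vs. l1-regularized recovery on an
# underdetermined system A x = b with a sparse ground truth.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                       # unknowns, measurements, nonzeros
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true

# Tikhonov-style solution: min ||A x - b||^2 + eps ||x||^2  (dense answer)
eps = 1e-3
x_l2 = np.linalg.solve(A.T @ A + eps * np.eye(n), A.T @ b)

# ISTA for min 0.5 ||A x - b||^2 + lam ||x||_1  (sparsity-promoting answer)
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L, L = Lipschitz const. of gradient
x_l1 = np.zeros(n)
for _ in range(2000):
    z = x_l1 - step * (A.T @ (A @ x_l1 - b))           # gradient step
    x_l1 = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

print("recovery error, Tikhonov:", np.linalg.norm(x_l2 - x_true))
print("recovery error, l1/ISTA :", np.linalg.norm(x_l1 - x_true))
```

The l_1 solve is the "assumption on the solution" in action: with far fewer equations than unknowns, it is the sparsity prior, not the data alone, that singles out the right answer.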
The issue at hand is that these new solvers solve a different problem: an underdetermined linear system of equations with the prior knowledge that the solution is sparse or compressible in some representation. This is a much stronger statement than trying to solve an underdetermined system with some type of regularization à la Tikhonov and hoping for the best. Anyway, the article is worth reading from beginning to end; those of you who have published in compressive sensing will notice eerily similar parallels. Let me take some "morceaux choisis" (preferred highlights):
On page 15
Underdetermined Equations
My undergraduate degree was in Mathematics from the University of Chicago, where I had taken a course on linear algebra from Alberto P. Calderón (Christ et al., 1998). Thus I was keenly aware that in CT with ART we were taking the unusual step of solving for many more unknowns than we had equations. It may have been his belief that this was “impossible” (personal communication, 1974) that constrained Hounsfield’s use of his ART-like algorithm to the overdetermined case, i.e., many more projections than necessary, and this attitude may be the original cause of the high dose of CT to patients, because commercial medical scanners until recently have not deviated in concept from EMI’s lead in turning away from ART to FBP.
In terms of dose for equivalent image quality: ART with fewer views << overdetermined ART ≈ FBP. The consequences have been a significant increase in dose to populations due to CT (Huda, 2002; Linton & Mettler Jr, 2003; Dawson, 2004; Prokop, 2005; Bertell, Ehrle & Schmitz-Feuerhake, 2007; Colang, Killion & Vano, 2007). I did my own "back of the envelope" calculation a few years ago, given in the next section. Breast CT may end up being the leader in correcting this situation, because no one has advocated any higher dose for 3D screening than the 2 mGy that has become the practice in ordinary mammography (Spelic, 2009).
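As an aside for readers who have not seen it, the ART/Kaczmarz iteration Dick refers to is simple enough to sketch in a few lines of Python: each step projects the current estimate onto the hyperplane defined by a single equation. The toy system and relaxation parameter below are my own illustrative assumptions, not Dick's original implementation.

```python
# Bare-bones ART/Kaczmarz sketch: cycle through the rows of A, projecting
# the current estimate x onto the hyperplane a_i . x = b_i for each row.
import numpy as np

def kaczmarz(A, b, sweeps=50, relax=1.0):
    """Row-action iteration for A x = b; relax is the relaxation parameter."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x += relax * (b[i] - a @ x) / (a @ a) * a
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 100))   # far fewer equations than unknowns
x_true = rng.standard_normal(100)
b = A @ x_true
x = kaczmarz(A, b)
print("residual ||Ax - b||:", np.linalg.norm(A @ x - b))
```

Started from zero on a consistent underdetermined system, this converges to the minimum-norm solution compatible with the data, which is precisely where a sparsity prior would pick a different, and often better, answer.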
on page 17
It will be interesting to compare this highly nonlinear pattern recognition approach with recently developed statistical inverse methods that generate alternative solutions to the equations from different samplings of matrices containing white noise (Kaipio & Somersalo, 2004).
and
Focus on Breast Cancer Detection
I don’t recall exactly what got me started on the application of CT to breast imaging, but when I proposed to a large NIH audience of 500 or so people that we could screen women to detect small tumors, and it would take only a week of computing time per exam, I was greeted with derisive laughter (Gordon, 1976b).
I really wanted to put more of that paper in this entry, but it would be unfair to the rest of the paper, so go read it. I'll most probably talk more about this paper and MIA in the future. Thank you Dick for the paper, thank you Xiaochuan Pan, Emil Sidky and Michael Vannier for the other paper, and thank you to Gabriel Peyré, Laurent Cohen and Frédéric Barbaresco for organizing MIA09.