On Google+, Serguey Ten wonders whether some super TV-L1 could be construed as a low complexity regularizer. Eventually the question became whether some of these solutions could have not low Kolmogorov complexity, but rather high "sophistication" or "emergent complexity" as described in Scott Aaronson's post on The First Law of Complexodynamics.
This is interesting because in Universal MAP Estimation in Compressed Sensing by Dror Baron and Marco Duarte, the empirical entropy is used as a proxy for the Kolmogorov sampler regularizer, which may not yield the same type of solutions as those described in Scott Aaronson's post. I have already talked about TV and L-infinity being low complexity solvers, although if I were to guess, those solutions would fall on the low complexity, high entropy side of things.
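To make that proxy a bit more concrete, here is a minimal Python sketch, entirely my own illustration and not the estimator of the Baron-Duarte paper: a plug-in (zeroth-order) empirical entropy of a quantized signal, which scores a piecewise-constant signal noticeably lower than white noise of the same length.

```python
import numpy as np

def empirical_entropy(x, n_bins=32):
    """Plug-in empirical entropy (in bits) of a signal quantized onto n_bins levels.
    A crude complexity proxy for illustration only; the universal MAP estimator
    in the paper is considerably more sophisticated."""
    edges = np.linspace(x.min(), x.max() + 1e-12, n_bins + 1)
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

# A piecewise-constant ("low complexity") signal vs. white noise of the same length.
rng = np.random.default_rng(0)
flat = np.repeat(rng.standard_normal(8), 128)   # only 8 distinct values
noise = rng.standard_normal(flat.size)          # spread over many bins
print(empirical_entropy(flat), empirical_entropy(noise))
```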
If we want to go for some interesting complexity, then I wonder about enforcing self-similarity using two different regularizing terms. Using two regularizers is not new: take for instance Constructal theory, which boils down to an equilibrium between two constraints. The result of this equilibrium yields a fascinating set of figures that seems to be as rich as fractals, without the issue of infinite recursion that fractals have. From the presentation of the concept on the website:
Constructal theory is this mental viewing:
(i) The generation of design (configuration, pattern, geometry) in nature is a physics phenomenon that unites all animate and inanimate systems, and (ii) This phenomenon is covered by the Constructal Law: "For a finite-size (flow) system to persist in time (to live), its configuration must evolve such that it provides easier and easier access to its currents." (Bejan, 1996)
The Constructal Law is about the time direction of the "movie" of design generation and evolution. It is not about optimality (min, max), end design, destiny or entropy. The concept that the Constructal Law defines in Physics is "design" (configuration) as a phenomenon in time.
This is rather abstract, but let us go further and see what is behind this concept in one of Adrian Bejan's latest papers:
S. Kim, S. Lorente, A. Bejan, W. Miller and J. Morse, The emergence of vascular design in three dimensions, Journal of Applied Physics, vol. 103, no. 12 (June 2008).
Abstract:
Nature shows that fluids bathe the animal body as trees matched canopy to canopy. The entering streams invade the body as river deltas and the reconstituted streams sweep and exit the body as river basins. Why should this be so? Why is animal vascularization not based on arrays of parallel channels, as in modern heat exchangers? In this paper, we rely on constructal theory to show that the flow architecture that provides greatest access from point to volume and from volume to point is the three-dimensional compounding of trees matched canopy to canopy. This three-dimensional tree architecture is deduced, not assumed. Its flow performance is evaluated at every step relative to the performance of equivalent architectures with parallel channels. This paper also shows that the dendritic design must become more complex (with more levels of branching) as the volume inhabited by the flow design increases. The transition from designs with p branching levels to p+1 levels occurs abruptly as the available flow volume increases. This fundamental development has implications not only in evolutionary animal design but also in animal tissue modeling and the design of new vascular (smart) materials with volumetric functionalities such as self-cooling and self-healing. (c) 2008 American Institute of Physics.
The folks who are designing sensors are focused on investigating some of these natural embodiments through CT/X-ray/MRI... These instruments can look at pulmonary branching and not see much. The question is why? Why are our reconstructions not good enough to reproduce some of the intricacies of the systems that have different scales? What if, during the reconstruction of solutions, two kinds of regularization terms were used and their Lagrange multipliers changed at each scale? What if the change came from the physics? When Bejan specifically talks about two antagonizing functions shaping the system with finite resources, shouldn't the l_0/l_1 term be connected to a finite resource of material?
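To make that speculation concrete, here is a small, purely hypothetical sketch, not a method from Bejan's work or from any paper mentioned here: a compressed sensing reconstruction with two antagonizing regularizers, an l1 penalty on Haar wavelet coefficients whose multiplier grows with scale (standing in for the finite material budget) and a total variation penalty (standing in for the smooth plumbing). The Haar basis, the sizes and all the weights are arbitrary assumptions, and CVXPY is used only for convenience.

```python
import numpy as np
import cvxpy as cp

def haar_matrix(n):
    """Orthonormal Haar transform matrix; n must be a power of 2."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                    # coarse part, built recursively
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])   # finest-scale details
    return np.vstack([top, bottom]) / np.sqrt(2.0)

rng = np.random.default_rng(0)
n, m = 128, 48
x_true = np.zeros(n)
x_true[30:60] = 1.0        # coarse-scale feature
x_true[90:93] = 2.0        # fine-scale feature
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

W = haar_matrix(n)
# One multiplier per Haar coefficient: the approximation row is left unpenalized and
# the weight grows geometrically toward finer scales. These numbers are hand-picked;
# the question in the post is whether physics should dictate them instead.
weights = np.zeros(n)
for i in range(1, n):
    level = int(np.floor(np.log2(i)))  # rows [2^level, 2^(level+1)) share a scale
    weights[i] = 0.01 * (1.5 ** level)

x = cp.Variable(n)
data_fit = cp.sum_squares(A @ x - y)
material = cp.norm1(cp.multiply(weights, W @ x))   # scale-weighted "finite material" term
plumbing = 0.05 * cp.tv(x)                         # second, antagonizing regularizer
cp.Problem(cp.Minimize(data_fit + material + plumbing)).solve()
print("relative error:", np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))
```

The open question raised above is precisely whether those per-scale multipliers should stay hand-tuned, as in this toy example, or be supplied by the physics of the flow system.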
Here is a recent instructive discussion on the LinkedIn Compressive Sensing group on performing reconstruction with multiple regularizers. You can read it only if you are one of the 1170 members of that group; if you are not, you may consider joining.
2 comments:
Had a crack at constructal approaches too:
Sterrenburg, F.A.S., R. Gordon, M.A. Tiffany & S.S. Nagy (2007). Diatoms: living in a constructal environment. In: Algae and Cyanobacteria in Extreme Environments. Series: Cellular Origin, Life in Extreme Habitats and Astrobiology, Vol. 11. Ed.: J. Seckbach. Dordrecht, The Netherlands, Springer: 141-172.
“why are our reconstructions not good enough to reproduce some of the intricacies of the systems that have different scales?”
The answer is primarily because we throw away the data at the collection step, by using detectors that are too wide. See “Future 1: The Intelligently Steered X-Ray Microbeam” in:
Gordon, R. (2011). Stop breast cancer now! Imagining imaging pathways towards search, destroy, cure and watchful waiting of premetastasis breast cancer [invited]. In: Breast Cancer - A Lobar Disease. Ed.: T. Tot. London, Springer: 167-203.
By the way, Bejan has at least 53 papers published after 2008.
Yours, -Dick Gordon, DickGordonCan@gmail.com
53 papers? Wow.
Igor.