If you recall, I was looking for a tablet-like device so I could read most of the PDFs mentioned here off-line. See, I dread having to sit in front of a computer to read something when I could be out on the beach reading the same paper while drinking a margarita. After first going for a netbook, I was still not too happy with its ergonomics, as the keyboard meant using it the same way I would use a laptop. Furthermore, sand would get stuck in the keyboard rather fast, so using it at the beach was out of the question. Not being an Apple fanboi, I reluctantly went for an iPod Touch and, aside from a 1604 error on the very first day, I have not looked back. It looks like I am not the only one enjoying two-column articles anymore. Being off-line is good: I connect only when there is free wifi (a now rare occurrence), and I don't have to pay a monthly fee as I would with an iPhone. As a result, I certainly don't have to care about outrageous international roaming fees. For what it's worth, I use Instapaper to read web articles off-line and a pdf viewer app to read the papers I download from the blog.

Why do I say this? No, I am not subsidized by Apple; rather, the device has changed some of my habits. I now spend more time reading a paper after it has been mentioned on the blog. This may explain why I sometimes comment on an article after it has been cited here rather than at the time it came out. I value a timely release of a paper on the blog rather than having it sit in my reading queue, which, after two days, is a euphemism for a never-gonna-read-it list. For instance, I am bound to write more about Low-Dimensional Models for Dimensionality Reduction and Signal Recovery: A Geometric Perspective by Richard Baraniuk, Volkan Cevher, and Michael Wakin.
On a different subject, I am stunned! I looked at the recent stats for the These Technologies Do Not Exist page I set up last week and found the following distribution:
I really was not expecting that many countries (25) to be represented in the first week. This is truly a good illustration of the long tail argument for ArXiv or blogs focused on specialized technical content. Could technical journals truly reach out to individuals interested in a specific subject area on such a global scale so rapidly? I doubt it.
On a different note, some entries like the These Technologies Do Not Exist or Imaging With Nature posts are a direct result of my being off-line (and one wonders if this is a good thing!). What is true, though, is that while compressive sensing investigations could be in the 10% improvement business, others also seem to think it is a blue-sky development worth investing in. For this movement to be more transformational we need, for instance, more review articles like the one below. I had a discussion with Emil Sidky, one of the authors, and will write about it later, but in the meantime the review is available for free for the next 30 days: Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction? by Xiaochuan Pan, Emil Sidky and Michael Vannier. The abstract reads:
Despite major advances in x-ray sources, detector arrays, gantry mechanical design and especially computer performance, one component of computed tomography (CT) scanners has remained virtually constant for the past 25 years: the reconstruction algorithm. Fundamental advances have been made in the solution of inverse problems, especially tomographic reconstruction, but these works have not been translated into clinical and related practice. The reasons are not obvious and seldom discussed. This review seeks to examine the reasons for this discrepancy and provides recommendations on how it can be resolved. We take the example of the field of compressive sensing (CS), summarizing this new area of research from the eyes of practical medical physicists and explaining the disconnection between theoretical and application-oriented research. Using a few issues specific to CT, which engineers have addressed in very specific ways, we try to distill the mathematical problem underlying each of these issues with the hope of demonstrating that there are interesting mathematical problems of general importance that can result from in-depth analysis of specific issues. We then sketch some unconventional CT-imaging designs that have the potential to impact CT applications, if the link between applied mathematicians and engineers/physicists were stronger. Finally, we close with some observations on how the link could be strengthened. There is, we believe, an important opportunity to rapidly improve the performance of CT and related tomographic imaging techniques by addressing these issues.

This is a good piece to read when one wants to think about how compressive sensing has the ability to be a real disruptive technology.
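To make the review's point a little more concrete, here is a toy sketch in Python/NumPy of why a sparsity-exploiting iterative reconstruction can beat a plain least-squares inversion when measurements are scarce. To be clear, nothing here is real CT: the random matrix stands in for an undersampled system matrix (not actual Radon/fan-beam geometry), the piecewise-constant profile stands in for the object, and plain ISTA stands in for the more sophisticated solvers the review has in mind.

import numpy as np

rng = np.random.default_rng(1)

# Toy "object": a piecewise-constant 1-D profile, i.e. sparse in its differences.
N = 128
jumps = np.zeros(N)
jumps[[10, 40, 90]] = [2.0, -1.5, 1.0]
f = np.cumsum(jumps)

# Undersampled linear measurements; a random matrix stands in for a few-view
# system matrix (this is NOT actual CT geometry).
M = 40
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ f

# 1) Minimum-norm least-squares inversion: ignores sparsity, badly underdetermined.
f_ls = np.linalg.pinv(A) @ y

# 2) Sparsity-exploiting reconstruction: write f = C @ s with C the cumulative-sum
#    operator, so the jump sequence s is sparse, and solve an l1 problem with ISTA.
C = np.tril(np.ones((N, N)))
B = A @ C
L = np.linalg.norm(B, 2) ** 2          # Lipschitz constant of the data-fit gradient
lam = 0.01
s = np.zeros(N)
for _ in range(3000):                  # iterative soft-thresholding (ISTA)
    s = s - (B.T @ (B @ s - y)) / L
    s = np.sign(s) * np.maximum(np.abs(s) - lam / L, 0.0)
f_cs = C @ s

print("relative error, least squares  :", np.linalg.norm(f_ls - f) / np.linalg.norm(f))
print("relative error, sparsity-based :", np.linalg.norm(f_cs - f) / np.linalg.norm(f))

With only 40 measurements for 128 unknowns, the least-squares answer should come out essentially useless while the l1 reconstruction should land very close to the true profile; that, in miniature, is the gap between what is deployed on scanners and what the inverse-problems literature can do.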
In a different direction, I just found the following on the interwebs and on arxiv:
Compressive Nonstationary Spectral Estimation Using Parsimonious Random Sampling of the Ambiguity Function by Alexander Jung, Georg Taubock, and Franz Hlawatsch. The abstract reads:
We propose a compressive estimator for the discrete Rihaczek spectrum (RS) of a time-frequency sparse, underspread, nonstationary random process. The new estimator uses a compressed sensing technique to achieve a reduction of the number of measurements. The measurements are randomly located samples of the ambiguity function of the observed signal. We provide a bound on the mean-square estimation error and demonstrate the performance of the estimator by means of simulation results. The proposed RS estimator can also be used for estimating the Wigner-Ville spectrum (WVS) since for an underspread process the RS and WVS are almost equal.
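For readers wondering what "randomly located samples of the ambiguity function" look like as measurements, here is a small Python/NumPy sketch of that acquisition step only. The signal, the sizes and the sampling pattern are arbitrary choices of mine, and the paper's actual RS estimator (the sparse recovery performed on these samples and its error bound) is not reproduced here.

import numpy as np

def discrete_ambiguity_function(x):
    # Cyclic discrete ambiguity function:
    # A[k, l] = sum_n x[n] * conj(x[n-k]) * exp(-2j*pi*l*n/N), indices mod N.
    N = len(x)
    A = np.empty((N, N), dtype=complex)
    for k in range(N):                        # k: delay (lag) index
        r = x * np.conj(np.roll(x, k))        # instantaneous correlation at lag k
        A[k, :] = np.fft.fft(r)               # DFT over n gives the Doppler axis
    return A

rng = np.random.default_rng(0)
N, M = 128, 256                               # signal length, number of compressive samples
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # stand-in for the observed signal

A = discrete_ambiguity_function(x)
idx = rng.choice(N * N, size=M, replace=False)   # randomly located sample positions
measurements = A.ravel()[idx]                    # these are the compressive measurements
print(measurements.shape)                        # 256 complex samples instead of 128*128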
Compressed Sensing Pulse-Echo Mode THz Reflectance Tomography by Kyung Hwan Jin, Youngchan Kim, Dae-Su Yee, Ok Kyun Lee and Jong Chul Ye. The abstract reads:
We demonstrate a pulse-echo mode terahertz (THz) reflectance tomography, where scattered THz waveforms are measured using a high resolution asynchronous-optical-sampling THz time domain spectroscopy (AOS THz-TDS) technique, and 3-D tomographic reconstruction is accomplished using a compressed sensing approach. One of the main advantages of the proposed system is a significant reduction of acquisition time without sacrificing the reconstruction quality thanks to the sufficient incoherency in the pulse-echo mode sensing matrix and the fast sampling scheme in AOS THz-TDS.
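The phrase "sufficient incoherency in the ... sensing matrix" is doing a lot of work in that abstract, so here is a tiny Python/NumPy sketch of the quantity usually invoked to talk about it, the mutual coherence of a matrix. The matrices below are stand-ins I made up (a Gaussian one and a deliberately bad one); the actual pulse-echo sensing matrix of the paper is not reproduced.

import numpy as np

def mutual_coherence(Phi):
    # Largest absolute inner product between distinct, unit-normalized columns.
    G = Phi / np.linalg.norm(Phi, axis=0, keepdims=True)
    gram = np.abs(G.conj().T @ G)
    np.fill_diagonal(gram, 0.0)
    return gram.max()

rng = np.random.default_rng(0)
M, N = 64, 256
Phi_random = rng.standard_normal((M, N)) / np.sqrt(M)   # incoherent stand-in
Phi_bad = np.repeat(np.eye(M), N // M, axis=1)          # duplicated columns: worst case

print("coherence of the random matrix   :", mutual_coherence(Phi_random))
print("coherence with duplicated columns:", mutual_coherence(Phi_bad))   # equals 1.0

Roughly speaking, the lower the coherence, the fewer measurements one can get away with before the reconstruction falls apart, which is what buys the reduction in acquisition time claimed above.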
Approximate Sparse Recovery: Optimizing Time and Measurements by Anna Gilbert, Yi Li, Ely Porat, Martin Strauss. The abstract reads:
An approximate sparse recovery system consists of parameters $k,N$, an $m$-by-$N$ measurement matrix, $\Phi$, and a decoding algorithm, $\mathcal{D}$. Given a vector, $x$, the system approximates $x$ by $\widehat x =\mathcal{D}(\Phi x)$, which must satisfy $\| \widehat x - x\|_2\le C \|x - x_k\|_2$, where $x_k$ denotes the optimal $k$-term approximation to $x$. For each vector $x$, the system must succeed with probability at least 3/4. Among the goals in designing such systems are minimizing the number $m$ of measurements and the runtime of the decoding algorithm, $\mathcal{D}$.
In this paper, we give a system with $m=O(k \log(N/k))$ measurements--matching a lower bound, up to a constant factor--and decoding time $O(k\log^c N)$, matching a lower bound up to $\log(N)$ factors.
We also consider the encode time (i.e., the time to multiply $\Phi$ by $x$), the time to update measurements (i.e., the time to multiply $\Phi$ by a 1-sparse $x$), and the robustness and stability of the algorithm (adding noise before and after the measurements). Our encode and update times are optimal up to $\log(N)$ factors.
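To see the kind of guarantee the paper is after in action, here is a small Python/NumPy sketch: take m proportional to k log(N/k) random Gaussian measurements of a compressible vector and decode greedily. I am using plain OMP purely for illustration; it is not the sublinear-time decoder of the paper, the constant in front of the measurement count is arbitrary, and Gaussian matrices are only one possible choice.

import numpy as np

def omp(Phi, y, k):
    # Orthogonal Matching Pursuit: greedy k-term approximation of x from y = Phi @ x.
    M, N = Phi.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(2)
N, k = 4096, 20
M = int(4 * k * np.log(N / k))             # m = O(k log(N/k)) measurements, arbitrary constant
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# Compressible test vector: k large entries sitting on top of a small tail.
x = 0.01 * rng.standard_normal(N)
x[rng.choice(N, size=k, replace=False)] += 5.0 * rng.standard_normal(k)

x_k = np.zeros(N)                          # best k-term approximation of x
top = np.argsort(np.abs(x))[-k:]
x_k[top] = x[top]

x_hat = omp(Phi, Phi @ x, k)
print("||x_hat - x||_2 =", np.linalg.norm(x_hat - x))
print("||x - x_k||_2   =", np.linalg.norm(x - x_k))   # the benchmark in the error bound

The decoded error should come out within a modest constant of the tail term $\|x - x_k\|_2$, which is the l2/l2 guarantee stated above; what the paper adds is achieving it with an optimal number of measurements and near-optimal decoding time.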