Michael Ravnitzky, the chief counsel to the chairman of the Postal Regulatory Commission, sent me an email asking whether I had read his New York Times op-ed entitled The Postman Always Pings Twice. He also sent me a more detailed version of his incredible idea (a first draft of that idea can be found here): use the time-space distribution of the USPS fleet to provide services (data collection for companies, science-based experiments, ...) to a diverse set of stakeholders (federal agencies, states, companies, universities, ...). It is simply amazing to have such a forward-looking concept being articulated. From what Michael tells me, this is the beginning of a (well-known) process: the USPS has to buy into the idea, then Congress has to do the same... All in all, it will eventually require the input and interest of several stakeholders to make it a reality.
In my response, I pointed out that in order to help his project, he should probably use as a model other types of networks set up by the government that cost real money (as opposed to the marginal cost entailed here, since the infrastructure in the case of the USPS fleet is already up and running). A famous example is the GPS constellation, the array of satellites that eventually provides you with directions on your iPhone. Up until President Clinton signed an executive order ending Selective Availability (the ability to deny service being a marginal cost on top of fleet development), GPS was basically scrambled for precision civilian use. Even after that sign-off, I can specifically recall discussions with several people who argued that GPS could never be used for civilian purposes. We know this is no longer true, as it has spawned a large set of services.
Michael also tells me that Popular Science magazine will have an article in its February issue, which hits newsstands around January 15th. He thinks of the idea primarily as a new analytical tool for scientific discovery and pollution reduction.
Finally, here is an "old" paper that just showed up on my radar screen: Disparity-Compensated Compressed-Sensing Reconstruction for Multiview Images by Maria Trocan, Thomas Maugey, James Fowler, and Béatrice Pesquet-Popescu. The abstract reads:
In a multiview-imaging setting, image-acquisition costs could be substantially diminished if some of the cameras operate at a reduced quality. Compressed sensing is proposed to effectuate such a reduction in image quality wherein certain images are acquired with random measurements at a reduced sampling rate via projection onto a random basis of lower dimension. To recover such projected images, compressed-sensing recovery incorporating disparity compensation is employed. Based on a recent compressed-sensing recovery algorithm for images that couples an iterative projection-based reconstruction with a smoothing step, the proposed algorithm drives image recovery using the projection-domain residual between the random measurements of the image in question and a disparity-based prediction created from adjacent, high-quality images. Experimental results reveal that the disparity-based reconstruction significantly outperforms direct reconstruction using simply the random measurements of the image alone.
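Since the abstract packs the whole algorithm into a few sentences, here is a minimal sketch of the residual-driven idea, assuming a Gaussian measurement matrix, a plain Landweber projection step, and a Gaussian blur standing in for the paper's actual smoothing step; the function names and the toy prediction are mine for illustration, not the authors' code:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spl_recover(y, Phi, shape, n_iter=100, sigma=1.0):
    """Iterative projection-based recovery with a smoothing step,
    in the spirit of the algorithm the paper builds on. The
    Gaussian blur is a stand-in for the paper's smoothing."""
    tau = 1.0 / np.linalg.norm(Phi, 2) ** 2    # step size for stability
    x = Phi.T @ y                              # back-projection as initial guess
    for _ in range(n_iter):
        x = x + tau * Phi.T @ (y - Phi @ x)    # projection (Landweber) step
        x = gaussian_filter(x.reshape(shape), sigma).ravel()  # smoothing step
    return x.reshape(shape)

def disparity_compensated_recover(y, Phi, x_pred, n_iter=100):
    """Drive recovery with the projection-domain residual between the
    random measurements y and a disparity-based prediction x_pred built
    from adjacent high-quality views; add the prediction back at the end."""
    y_res = y - Phi @ x_pred.ravel()           # residual in the measurement domain
    r_hat = spl_recover(y_res, Phi, x_pred.shape, n_iter)
    return x_pred + r_hat

# Toy demonstration at a 25% sampling rate.
rng = np.random.default_rng(0)
shape, n = (32, 32), 32 * 32
m = n // 4
Phi = rng.standard_normal((m, n)) / np.sqrt(m)        # random measurement basis
x_true = gaussian_filter(rng.random(shape), 2.0)      # smooth toy "image"
x_pred = x_true + 0.05 * rng.standard_normal(shape)   # stand-in for a disparity-based prediction
y = Phi @ x_true.ravel()                              # reduced-rate measurements
x_hat = disparity_compensated_recover(y, Phi, x_pred)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The design point, per the abstract, is that the residual between an image and its disparity-based prediction is far more compressible than the image itself, which is why recovering the residual and adding the prediction back significantly outperforms direct reconstruction from the same random measurements.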
"It is simply amazing to have such a forward looking concept being articulated. "
And now that it has been articulated, it seems amazing that it hadn't been before.
I think it has to do with privacy concerns.
Your "first draft of the idea" link is not working for me.
Farmboy,
You could try this:
http://goo.gl/ZgR1D
Cheers,
Igor.