
Monday, June 24, 2013

Sunday Morning Insight: Enabling the "Verify" in "Trust but Verify" thanks to Compressive Sensing

In last week's Quick Panorama of Sensing from Direct Imaging to Machine Learning, I made the case that sensing can be seen from different points of view that are embedded in different academic communities, yet all of them are essentially doing the same thing with different approaches. Today, instead of giving a deeper view of these approaches, I'd like to give some perspective on why sensors, and sensing in general, matter economically. First, here is a little-known fact: the betterment of sensors has consistently delivered Nobel Prizes. However, the generic view of the public and policy makers is that we already have CSI-like technology at the tips of our fingers, and that given some human ingenuity generally displayed by the actors (and enough time) we can "solve" problems. I noticed that in most episodes, the fingerprinting of materials seems to be a given. I really don't know where the screenwriters get that impression, because it is arguably the most difficult part of the identification process.



Lack of good sensors: An unacknowledged economic side effect.

I believe this perception issue, which bubbles up all the way to policy makers, is at the root of many problems. There are numerous economic activities that are currently uncontrolled solely because there is an asymmetry between making products with certain materials and checking whether those products are indeed made of said materials. The system works in a "trust" regime rather than an effective "trust but verify" regime. Every country has had that problem. In China, for instance, there have been instances of fraud that have led to deaths and widespread market distortions. In France recently, the complaint of a single person led to the shutdown and recall of a medication that was eventually cleared. In the US, the FDA maintains a freely available database of recalls. All countries have, in some shape or fashion, issues with how their regulations are enforced on local products and foreign imports. The sheer magnitude of world trade makes it a near impossible task to enforce local rules on imported goods. All these cases and the attendant warning systems are really the sometimes ad hoc result of the lengthy process of material fingerprinting, which typically requires long weeks in the lab. In short, CSI stories and Hollywood in general impress on the public -and lawmakers- the notion that the technology behind entire countries' rules, laws and regulations protecting people's health and safety is available, immediate and cheap. Nothing could be further from the current realities of sensing.

Helping Reality: Better Fingerprinting through Compressive Sensing

Since quickly sensing the right elements through some signature is of the utmost importance for world trade, and is probably very important to minimize major market distortions, can new methods help in developing faster and more task-specific sensors?

Maybe.

One of the most important lessons of the compressive sensing adventure, in my view, is that it has allowed randomization to be taken seriously. That randomization has in turn allowed us to devise sensors away from traditional direct imaging and into compressive sensing. And look where this is taking us: just watch some of the CS hardware implementations and some of the start-ups that have used it. And it's only the beginning. To get a sense of the cost reduction enabled by randomization, take the case of hyperspectral imagers. Currently these cameras cost about 100,000 buckarus. Thanks to the multiplexing allowed by compressive sensing, several groups are trying to decrease this cost by one or two orders of magnitude. Randomization is also at the heart of the recent fingerprinting attempts in MRI. In short, a deep mathematical statement on concentration of measure does seem to provide a way to design better and cheaper sensors, or to imagine new ones [1,2].
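To make the multiplexing idea concrete, here is a minimal Python sketch of the compressive sensing pipeline: a few randomized measurements, each mixing all the bands of a sparse spectrum at once, are enough to recover that spectrum by l1 minimization. The dimensions, the Gaussian measurement matrix and the ISTA solver are illustrative choices on my part, not the design of any actual instrument.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: a spectrum with n bands, k nonzero lines,
# observed through only m << n randomized (multiplexed) measurements.
n, m, k = 512, 128, 10

# Ground-truth sparse spectrum.
x = np.zeros(n)
support = rng.choice(n, k, replace=False)
x[support] = rng.normal(size=k)

# Random Gaussian measurement matrix: each row mixes all bands at once,
# the "multiplexing" that replaces one detector element per band.
Phi = rng.normal(size=(m, n)) / np.sqrt(m)
y = Phi @ x

def ista(Phi, y, lam=0.05, n_iter=500):
    """Recover x from y = Phi @ x by l1-regularized least squares (ISTA)."""
    L = np.linalg.norm(Phi, 2) ** 2  # Lipschitz constant of the gradient
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x_hat - y)
        z = x_hat - grad / L
        # Soft thresholding promotes sparsity in the estimate.
        x_hat = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x_hat

x_hat = ista(Phi, y)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

The whole point is the ratio m/n: the hardware takes a quarter of the measurements, and the mathematics, under a sparsity assumption, makes up the difference. That is where the order-of-magnitude cost reduction comes from.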

Compressive Sensing, The Internet of Things, Big Data and Machine Learning.

Cost reduction has two main consequences. The first is a larger footprint in the academic world, yielding a larger sphere of influence in tackling different problems. The second is the ability to build sensor networks the size of a planet. For instance, during the unfolding of the Fukushima Daiichi accident, it became obvious that citizen sensor networks such as SafeCast gave decision makers and the population a more robust view of how events were unfolding. Coupled with computational codes running plume diffusion, this made for a potentially pretty powerful predictive mechanism. All this because of the availability of a tiny, somewhat cheap and undiscriminating Geiger counter. Some of these costs could be further reduced if only one were to surf on steamrollers like Moore's law: I am personally of the opinion that much of the fear related to radiation could be dampened if one had Google Glass-like capabilities to detect the radiation surrounding us. Cable and I showed that, in a highly radiative environment, the radiation field can simply be decoupled from CMOS imagery through a robust deconvolution (It never was noise; Just a different convolution, see also the videos in [3-5]). In an area around Fukushima or elsewhere where the radiation is much lower, a different procedure would have to be used to provide real-time information to the general population, and while I sympathize with the Geiger counter effort of SafeCast, I could see CMOS taking over that detection market in the future. The purists who have read Glenn Knoll's Radiation Detection and Measurement will rightfully argue that silicon is not the best detector material for this type of task. To which I will argue that a combination of better converters (or multiplexers, as we call them in compressive sensing) and the economies of scale of CMOS will, in the end, largely win that fight. And with CMOS come big data and the mechanisms found in Machine Learning to reduce it to human-understandable concepts.
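As a caricature of that decoupling, assume gamma hits show up as sparse, frame-varying spikes on top of a temporally stable scene; a per-pixel temporal median then splits the two. This toy Python sketch only illustrates the idea -it is not the robust deconvolution of the referenced work, and every number in it is made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stack of CMOS frames: a static scene plus sparse, frame-varying
# spikes standing in for gamma-ray hits on the sensor.
n_frames, h, w = 15, 64, 64
scene = rng.uniform(0.2, 0.8, size=(h, w))
frames = np.repeat(scene[None, :, :], n_frames, axis=0)

hit_rate = 0.002  # fraction of pixels struck per frame (illustrative)
hits = rng.random(frames.shape) < hit_rate
frames[hits] += rng.uniform(1.0, 5.0, size=hits.sum())

# The per-pixel temporal median rejects the sparse hits and keeps the
# scene; the residual isolates the radiation-induced component.
scene_est = np.median(frames, axis=0)
radiation = frames - scene_est[None, :, :]

print("max scene error:", np.abs(scene_est - scene).max())
print("hits flagged per frame:", (radiation > 0.5).sum(axis=(1, 2)))
```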

To come back to SafeCast, the project is now embarking on a larger worldwide air pollution quantification effort. In the home, there is a similar effort, AirBoxLab, now featured on IndieGoGo -a Kickstarter-like platform- that aims at quantifying indoor air pollution. The kit features the following sensors:
  • VOC: Volatile organic compounds (formaldehyde, benzene, ethylene glycol, acetone)
  • CO2: Carbon dioxide
  • CO: Carbon monoxide
  • PM: Particulate Matter
  • T: Temperature
  • RH: Relative Humidity




AirBoxLab has the potential to produce large amounts of data thanks to its capability of sampling not just ambient air but also surface effluents. This is interesting as it is clearly a way to build a large database of products and their attendant effluents, an undertaking that states can seldom afford (check out how small NASA's or ESA's Outgassing databases are [6]) and that even traditional NGOs rarely attempt. A large database like this one would clearly be a treasure trove, not just for machine learners or for enforcement purposes, but one that could eventually yield virtuous economic cycles.

Better sensors are always needed

In the case of the air quality efforts of SafeCast or AirBoxLab, one realizes that there is a dearth of sensors that ought to be researched in light of the developments in compressive sensing [2]. A total VOC sensor is a needed first step in order to know when to ventilate your home, but eventually one wants to fingerprint the toxic VOCs and tell them apart from the non-toxic ones. Recently, at one of the Paris meetups, I was told of a device that never went to market because, while it partially destroyed VOCs, some of the byproducts of the destruction process included smaller quantities of other VOCs within the family of sarin gas. The total concentration of VOCs was reduced at the expense of increasing the potential lethality of the byproducts. In short, while it is always a good thing to have an idea of total VOCs, it is also a good idea to know exactly what types of VOCs are being measured. Here again we witness better, more discriminating sensing being the engine behind other technology developments (VOC processing and disposition) and eventual economic growth.
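Here is what the computational side of such discriminating sensing could look like. Assuming one had a library of per-compound response signatures for an array of cross-sensitive sensor channels -a hypothetical library; nothing like it ships with today's kits- a multiplexed reading could be unmixed into individual compound concentrations with nonnegative least squares. The channel count, the library and the compound list in this Python sketch are all placeholders.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)

# Hypothetical library: response of s cross-sensitive sensor channels
# to a unit concentration of each candidate VOC.
compounds = ["formaldehyde", "benzene", "ethylene_glycol", "acetone"]
s = 12
library = rng.uniform(0.0, 1.0, size=(s, len(compounds)))

# Simulated reading: a mixture of benzene and acetone, plus sensor noise.
true_conc = np.array([0.0, 0.3, 0.0, 0.7])
reading = library @ true_conc + 0.01 * rng.normal(size=s)

# Nonnegative least squares attributes the reading to compounds;
# concentrations cannot be negative, which already enforces a crude
# form of selectivity.
conc, residual = nnls(library, reading)
for name, c in zip(compounds, conc):
    print(f"{name:>16s}: {c:.3f}")
```

When the candidate library grows to hundreds of compounds, a sparsity-promoting solver like the one sketched earlier in this post would be the natural next step.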


For those of you in Paris, I'll be attending the Meetup Internet des Objets n°1 this coming Tuesday night.
