Thursday, September 23, 2010

Orphan Technologies ...

There is an entry in this blog that keeps getting attention; it features two technologies that, because they are complex, have become orphans (at least in the U.S.). Some would call them rudderless at the policy level:
While reading the interview of James Bagian, I was reminded that both of these technologies use Probabilistic Risk Assessments (PRA); maybe it's a sign that when you begin to use Monte Carlo techniques, you are doomed. Both sets of technologies are in that state because most political stakeholders have no interest in developing them. Take, for instance, the case of reprocessing nuclear materials after they have gone through one core use: the Left, which has for years built its political credentials on fighting earlier projects, cannot but be extremely negative about any development on ideological grounds, while the Right cannot conceive of a market where building one plant means parking billions of dollars in a government escrow account for the years of the plant's construction while the stock market returns 2 to 15%. On a longer time scale, reprocessing is the only way we can buy some time, but the political landscape does not allow for this type of vision. It just boggles the mind that we ask people to waste energy in creating and recycling glass [see note in the comment section] but throw away perfectly usable plutonium. On the other end of the time spectrum, maybe we should worry less about nonproliferation and more about ETFs. Maybe a way to make ETFs less of a problem is getting people to use Monte Carlo techniques there... oh wait, they do; we're screwed.

The interview also pointed to issues in health care, where even getting people to follow a checklist is a matter that seems to involve lawyers. As it turns out, John Langford just had a posting, Regretting the dead, about making tiny changes in our way of thinking to better map the expectations of the health care sector. It would seem to me that we ought to have a similar undertaking from the Theoretical Computer Science community as well.
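Since both orphan technologies lean on PRA, and PRA in turn leans on Monte Carlo, here is a minimal sketch of what a Monte Carlo risk estimate looks like; the two-component system and every failure probability below are made up purely for illustration:

```python
import random

# Hypothetical system: it fails only if BOTH a pump and its backup fail
# (an AND gate in a fault tree). Both probabilities are invented.
P_PUMP_FAIL = 0.01
P_BACKUP_FAIL = 0.05

def system_fails(rng):
    """One Monte Carlo trial: sample each component failure independently."""
    return rng.random() < P_PUMP_FAIL and rng.random() < P_BACKUP_FAIL

def estimate_failure_probability(n_trials=1_000_000, seed=0):
    """Estimate the system failure probability as a frequency over trials."""
    rng = random.Random(seed)
    failures = sum(system_fails(rng) for _ in range(n_trials))
    return failures / n_trials

if __name__ == "__main__":
    # The analytic answer for this toy case is 0.01 * 0.05 = 5e-4;
    # the Monte Carlo estimate should hover near it.
    print(estimate_failure_probability())
```

Real PRA fault trees simply stack many more gates of this kind, which is why the sampling approach takes over once the tree stops being tractable by hand.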
For instance, the Stable Marriage Problem could be extended a little further so that we could eventually learn why certain traits, conditions, or diseases stay endemic at a low level in the population. I certainly don't hold my breath for algorithms changing the landscape, as exemplified by the inability of industry to include ART or better algorithms in current CT technology, even with the stick of the radiation burden [1]. In a somewhat different direction, Jason Moore has a post on Machine Learning Prediction of Cancer Susceptibility. From his grant renewal, I note:
This complex genetic architecture has important implications for the use of genome-wide association studies for identifying susceptibility genes. The assumption of a simple architecture supports a strategy of testing each single-nucleotide polymorphism (SNP) individually using traditional univariate statistics followed by a correction for multiple tests. However, a complex genetic architecture that is characteristic of most types of cancer requires analytical methods that specifically model combinations of SNPs and environmental exposures. While new and novel methods are available for modeling interactions, exhaustive testing of all combinations of SNPs is not feasible on a genome-wide scale because the number of comparisons is effectively infinite.
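To see why "the number of comparisons is effectively infinite", a quick back-of-the-envelope count helps; the one-million-SNP figure below is just a typical order of magnitude for a genome-wide panel, not a number from the grant:

```python
from math import comb

n_snps = 1_000_000  # typical order of magnitude for a genome-wide SNP panel

# Count the k-way SNP combinations an exhaustive interaction search
# would have to test.
for k in (2, 3, 4):
    print(f"{k}-way combinations: {comb(n_snps, k):.3e}")
```

Pairwise interactions alone already number about 5e11, and three-way combinations exceed 1e17, which is exactly why methods that model combinations cleverly, rather than exhaustively, are needed.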
I am sure that some of you already see the connection between this problem and some of the literature featured here. As Seth Godin stated recently:
You can add value in two ways:

* You can know the answers.
* You can offer the questions.

Relentlessly asking the right questions is a long term career, mostly because no one ever knows the right answer on a regular basis.
I certainly don't know the answers, but if you think this blog offers the right questions, please support it by ordering through the Amazon - Nuit Blanche Reference Store.

Credit: NASA


Alejandro Weinstein said...

"It just buggles the mind that we ask people to waste energy in recycling glass..."

Do you have any source for this statement? At least Wikipedia says:

"Every metric ton of waste glass recycled into new items saves 315 additional kilograms of carbon dioxide from being released into the atmosphere during the creation of new glass"

Igor said...

I should have been more accurate. Instead of saying

"It just buggles the mind that we ask people to waste energy in recycling glass..."

Implicit in that statement is that glass production is OK. I should have said

"It just buggles the mind that we ask people to waste energy in creating and recycling glass..."

I am taking the viewpoint that much energy is wasted on the wrong emphasis. Glass is probably necessary for wine bottles (though maybe not for all wines), but it should not even compete with biodegradable materials for most applications.

Plutonium produces electricity even as it is being produced in the reactor (yes, you read that right: about a third of a nuclear reactor core's power comes from plutonium at the end of an 18-month cycle). Reprocessing it requires much less energy than the plutonium will eventually produce in subsequent reuse.

In other words, the creation and recycling of plutonium is a win-win situation whereas the creation and recycling of glass is a lose-lose situation energetically speaking.

This situation is not tenable in the long run. Yet we accept the recycling (and implicitly the use) of glass as a given but demonize plutonium.

Alejandro Weinstein said...

Thanks for the clarification.

Maybe you will find this book interesting:

Sustainable Energy - Without the Hot Air, by David MacKay. There is a free PDF available.

He also has a book on information theory, Information Theory, Inference, and Learning Algorithms (also available as a free PDF).

Interestingly, it mentions the 12 balls puzzle.

Igor said...


Thanks for the tip, I know David, he is a friend of mine ....on Facebook :-)

In any case, I think his chapter on Nuclear Energy is far more eloquent than anything I could ever say: