I mentioned a K-T extinction level event yesterday when in fact a smaller asteroid can produce a much larger plume than the one from this Icelandic volcano, named either by somebody who slept on his keyboard or as a way for Iceland to tell us in a more direct manner: "Forgive Our Debt or We Ground Your Planes." To give a sense of the underlying reason why traffic over Europe is stalled, take a look at the inside of the turbines of an F-18 Hornet flown on purpose through that cloud by the Finnish Air Force (thanks Mikko Hypponen). Raising the temperature of sand inside a jet engine combustion chamber does not produce a pretty picture.
Coming back to the K-T extinction level event, the Near Earth Object Program that aims at discovering unfriendly rocks in the sky has the following stats of its discovery program:
As one can see, not only is the trend not really decreasing for the km-size asteroids, but a single survey seems to be finding most of the new objects: the Catalina survey effort.
Which leads us to the AMD competition and my failed bid to get a 48-core machine.
Back in January 2006 I was talking to David McKay, who seemed to know people who wanted to use fast star-tracking algorithms to analyze a large set of photos taken by the Royal Astronomical Observatory over the past fifty years. They wanted to find traces of close encounters we may have had in that period that have still not been identified (because the object has not come back in front of a current survey). The issue was that they had _LOTS_ of photos, and even if they found the time to digitize them, they were still looking for a way to figure out where each photo was pointing, so as to infer whether every little bright spot on the photo was a normal star or something a little more threatening. I knew people who work on star-tracking algorithms and he knew the folks who had the photos, but this discussion never went anywhere. Instead of evaluating the Donoho-Tanner phase transition, a subject that induces much less fear indeed, here is what I should have proposed:
48 Cores to Save Earth from K-T level Extinction Events.
The purpose of this project is to use a large number of old photos of the sky, as well as photos taken by amateur astronomers all around the world, to enhance our ability to detect large asteroids (km size or more) potentially threatening life on Earth. These large asteroids represent a clear and present danger to humanity, as witnessed by the Cretaceous–Tertiary extinction event that occurred 65 million years ago. The most recent survey capability of the current Near Earth Object Program has found a large number of these asteroids over the past six years. However, many objects photographed in the past fifty years have still not been identified because of the lack of adequate processing power (before the 1990s, that meant manpower). These photographed but still unidentified objects have probably not crossed our path again. Similarly, many objects are currently being photographed by members of the amateur astronomy community, but identification relies on each person's knowledge of whether an object is worth paying attention to. We propose to adopt an unstructured, distributed approach to this problem. It is unstructured in that we do not need attitude determination information, just the photos. It is distributed because we will rely on diverse sources of sky photographs taken at different latitudes. We propose to use the 48-core machine to run a centralized algorithm dedicated to:
- receiving digitized photos (old or new) over the web
- processing these photos to determine attitude using common star tracker algorithms (this answers the question of where in the sky the photo is pointing)
- processing these photos to flag "things that should not be there" by comparing them against a +18 magnitude star catalog
- providing results to the person requesting the evaluation and to specialists for further identification.
Additional processing could also be performed. The server would be hosted at a University.
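The catalog-comparison step above can be sketched in a few lines. This is only a toy illustration: the catalog entries, coordinates, and matching tolerance below are made-up placeholders, and a real service would first plate-solve each photo and then query a full +18 magnitude catalog rather than a hard-coded list.

```python
import math

# Hypothetical mini-catalog of known star positions (RA, Dec in degrees).
# A real system would query a deep all-sky catalog instead.
catalog = [(10.0, 20.0), (10.5, 20.2), (11.2, 19.8)]

def angular_sep(p, q):
    """Small-angle approximation of the separation between two
    (RA, Dec) positions, in degrees."""
    dra = (p[0] - q[0]) * math.cos(math.radians((p[1] + q[1]) / 2))
    ddec = p[1] - q[1]
    return math.hypot(dra, ddec)

def flag_unknowns(detections, catalog, tol_deg=0.01):
    """Return detected positions with no catalog star within tol_deg.

    These are the "things that should not be there": candidate
    asteroids, satellites, or plate defects needing follow-up."""
    return [d for d in detections
            if all(angular_sep(d, c) > tol_deg for c in catalog)]

# Three bright spots found on a (hypothetical) plate; one is unknown.
detections = [(10.0, 20.0), (10.7, 20.5), (11.2, 19.8)]
print(flag_unknowns(detections, catalog))  # -> [(10.7, 20.5)]
```

The per-photo comparisons are independent, which is what makes the problem embarrassingly parallel across 48 cores (or, as a commenter notes below, a GPU).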
4 comments:
If you really want to do this project, it sounds like the sort of algorithm that could be implemented in OpenCL and run on a high-end graphics card to get about as much performance as you would get from the 48-core machine.
Oh! Yeah?
What do 48 cores do against a Gamma Ray Burst or a Supervolcano?
Have you been reading too much on Singularitarian websites recently?
Relax...
Absalon
I understand what you are saying, but a 48-core system provides a way to avoid dabbling in the conversion of matlab/octave code into GPU-specific code.
Kev
Your argument is bizarre. There is currently no known digital data stored anywhere showing past supervolcanoes or gamma-ray bursts.
With regards to GRBs, you should read the badastronomy blog.