Saturday, December 22, 2007

Asteroid impacts might be beneficial (in the long term)

The image on the left is an artist's impression (from http://universe-review.ca/R10-19-animals.htm) of the first moments of the K-T impact, the asteroid impact that probably caused the mass extinction event at the end of the Cretaceous period about 65 million years ago.

Although mass extinctions wipe out many of the species that are present on Earth at the time of the extinction event, the number and diversity of animal and plant species ultimately increases after mass extinctions. Thus, although mass extinctions are destructive in the short term (several million years), on timescales of hundreds of millions of years they may actually be beneficial. Carl Zimmer discusses this, more specifically asteroid-related extinctions, in an article at Wired (NB, only a few of the documented mass extinctions have been plausibly shown to be due to asteroid impact).

The ambiguous and poorly-understood long-term effect of destructive astronomical events (such as asteroid impacts, supernovae, gamma-ray bursts, AGN, etc) on biological life is something that makes assessing the size of the galactic habitable zone difficult, if not impossible, given our current understanding of these astronomical events and of the fossil record.

In order to arrive at a conception of a very small Galactic Habitable Zone (visible in the images from the 2001 Gonzalez et al Scientific American article), i.e. that the Solar system and the Earth were very unusual (a "privileged planet" that was then evidence of divine favor [or in code: Intelligent Design]), Guillermo Gonzalez assumed that anything that increases the chances of asteroid impacts, or nearby supernovae, or high UV or cosmic ray fluxes, was negative and harmful to life.

But those assumptions are by no means robustly justified by existing data, nor are they unique (the opposite effect, as discussed in Zimmer's article, could plausibly be true), as other astrobiologists pointed out to him at the time. Indeed, it could be that external events are ultimately responsible for driving greater evolutionary diversity and hence for increasing the chances of complex multi-cellular life evolving.

While on the subject of asteroid impacts, SPACE.com has two asteroid-related articles.

The first article, by Charles Choi, discusses new simulations by researchers at Sandia National Labs that suggest that the 1908 Tunguska explosion could have been caused by a meteorite only 20 meters in diameter, smaller than previously thought. As the number of asteroids of a given size is a strongly decreasing function of their size, this implies that Tunguska-level events might be more common than previously thought. However, the actual destruction caused by the Tunguska explosion is also probably less than previously estimated, so rest easy!
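To make that "strongly decreasing function" concrete, here is a minimal sketch of how the expected impact rate scales if a smaller body suffices. The power-law slope and the "previously thought" diameter are illustrative assumptions of mine, not numbers from the Sandia work:

```python
def relative_impact_rate(d_new_m, d_old_m, slope=2.7):
    """Rough factor by which the expected impact rate rises if a smaller body
    suffices, assuming a cumulative power-law size distribution
    N(>D) ~ D**(-slope). The slope is an assumed, illustrative value."""
    return (d_new_m / d_old_m) ** (-slope)

# If a Tunguska-level blast needs only a ~20 m body instead of an assumed ~35 m:
print("impact rate larger by a factor of ~%.1f" % relative_impact_rate(20.0, 35.0))
```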


The second article, by Alicia Chang, discusses an asteroid that has a 1 in 75 chance of hitting Mars (not us) this coming January 30th. This asteroid, 2007 WD5, would also cause a Tunguska-level explosion, equivalent to about 15 Megatons of TNT.

Norman Levitt Deconstructs Steve Fuller’s Postmodernist Critique of Evolution

Steve Fuller, a self-proclaimed leftist sociologist at the University of Warwick (just south of Birmingham, which is where I did my undergraduate and graduate degrees), was a witness for the defense in the Dover School board trial (i.e. for the pro-ID school board). He has a new book out ("Science v. Religion? Intelligent Design and the Problem of Evolution") defending himself and critiquing evolutionary science.

Skeptic magazine has a review of the book by Norman Levitt, a mathematician and author with Paul Gross of several books on the academic culture wars (e.g. "The Flight from Science and Reason", "Higher Superstition" and "Prometheus Bedeviled"), which is worth a read (more for the insight into the unlikely alliance between the leftist and rightist anti-science groups than for a detailed critique of Fuller's book).

http://www.skeptic.com/eskeptic/07-12-19.html#feature

Tuesday, December 18, 2007

The Death Star Galaxy


Being in a galaxy in the path of a jet emanating from a supermassive black hole in a neighboring galaxy might be quite catastrophic:


A powerful jet from a super massive black hole is blasting a nearby galaxy, according to new findings from NASA observatories. This never-before witnessed galactic violence may have a profound effect on planets in the jet's path and trigger a burst of star formation in its destructive wake.

Known as 3C321, the system contains two galaxies in orbit around each other. Data from NASA's Chandra X-ray Observatory show both galaxies contain super massive black holes at their centers, but the larger galaxy has a jet emanating from the vicinity of its black hole. The smaller galaxy apparently has swung into the path of this jet.

This NASA press release is getting quite a bit of press (e.g. BBC, Yahoo news), and some of the quotes in the press articles certainly spin up the catastrophic angle.

From the BBC article:
The combined effects of this radiation and particles travelling at almost the speed of light could have disastrous consequences for the atmospheres of any Earth-like planets lying in the path of the jet.

For example, protective layers of ozone in the planet's upper atmosphere could be destroyed, which could result in the mass extinction of any life that had evolved on the planet.
From the Yahoo news article:
The larger galaxy has a multi-digit name but is called the "death star galaxy" by one of the researchers who discovered the galactic bullying, Daniel Evans of the Harvard-Smithsonian Center for Astrophysics.

Tens of millions of stars, including those with orbiting planets, are likely in the path of the deadly jet, said study co-author Martin Hardcastle of the University of Hertfordshire in the United Kingdom.

If Earth were in the way — and it's not — the high-energy particles and radiation of the jet would in a matter of months strip away the planet's protective ozone layer and compress the protective magnetosphere, said Evans. That would then allow the sun and the jet itself to bombard the planet with high-energy particles.

And what would that do to life on the planet?

"Decompose it," Tyson said.

"Sterilize it," Evans piped in.
However, the actual scientific paper (by Evans et al, still undergoing peer review but available here: astro-ph/0712.2669) wisely ignores the stuff that gets the press excited. None of the speculation regarding zapping planets or star formation in the press stories is in the scientific paper at all. Indeed, it's rather sad that the straight facts are not considered exciting enough for the press stories, and instead the speculative stuff gets pushed to the top.

Indeed I have to wonder, given the lack of any discussion in the preprint, whether the quotes regarding the effect on planetary atmospheres are based on actual calculations of the estimated particle flux in the target galaxy, or whether they are pure speculation based on the hard-to-resist gut feeling that "black holes are powerful, so they must be capable of anything".

Speaking in general (as I don't know the answer to the questions above) I feel this is a real problem with press releases and articles that deviate from the actual scientific paper - the public is potentially being fed speculation rather than the actual peer-reviewed science.

Now it is possible that the quotes in the press articles are based on quantitative comparisons to other peer-reviewed literature on the effect astrophysical events may have on planetary atmospheres or biological activity - the trouble is we don't and can't easily know whether this is the case.

There have been calculations of the effect of nearby supernovae and gamma-ray bursts on the atmospheres of Earth-like planets (e.g. Smith et al, 2004, Icarus, 171, 229; Scalo & Wheeler, 2002, ApJ, 566, 723; Hunt, 1978, Nature, 271, 430), and also of the effects of neutron star mergers (Dar et al, 1998, Phys Rev Letters, 80, 5813), but I am not aware of equivalent publications regarding jets from AGN.

The image shown comes from the press release, and is described in the accompanying caption:

This composite image shows the jet from a black hole at the center of a galaxy striking the edge of another galaxy, the first time such an interaction has been found. In the image, data from several wavelengths have been combined. X-rays from Chandra (colored purple), optical and ultraviolet (UV) data from Hubble (red and orange), and radio emission from the Very Large Array (VLA) and MERLIN (blue) show how the jet from the main galaxy on the lower left is striking its companion galaxy to the upper right. The jet impacts the companion galaxy at its edge and is then disrupted and deflected, much like how a stream of water from a hose will splay out after hitting a wall at an angle.

Wednesday, December 05, 2007

Winds, winds everywhere

Blogging on Peer-Reviewed Research
With the NSF and Suzaku proposal deadlines safely behind me I've been spending this week catching up on the astrophysical literature on galactic winds that has appeared in the last few months.

There is actually quite a bit of it, which I thought I would share with you as evidence that galactic winds are quite common, potentially very close to home, and also potentially quite important in understanding the nature of galaxies and galaxy formation (blogging is also a useful way of making notes for myself, of course).

The following is a list of galaxies for which observational evidence of galactic-scale winds has recently been obtained:

Leon et al (2007, A&A, 473, 747) find evidence for outflow from NGC 6764 [NED link]. This galaxy is a LINER (often classified as a Seyfert 2) and a known Wolf-Rayet galaxy (a class of starburst galaxy). A limb-brightened outflow cone or elongated bubble is visible in the optical. The galaxy itself is pretty faint (IRAS f60 flux 6.6 Jy, D ~ 34 Mpc) so it's not surprising that no Chandra or XMM-Newton data exist on this object.

NGC 4460 [NED link] in the Canes Venatici I cloud of galaxies (which also hosts classic starbursts such as NGC 4449 and NGC 4631) is an edge-on spiral (lenticular) that shows a classic limb-brightened nuclear outflow cone, see Kaisin & Karachentsev 2007 (astro-ph/0710.0711). They note a "compact emission disk in its core, from which diffuse emission protruberances originated along the minor axis." Distance ~ 9.6 Mpc. I was not aware of this object before now, but their continuum-subtracted H-alpha image shows a classic superwind-like morphology. It is IR-warm (f60/f100 ~ 0.48), indicating a genuine starburst, but f60 is only 3.2 Jy, so it is faint because its absolute star formation rate is low.

Jimenez-Vicente et al (2007, MNRAS, 382, L16) find spectroscopic evidence for a low velocity outflow (about 100 km/s) from the center of Messier 100, a beautiful spiral galaxy at a distance of D ~ 16 Mpc (I discussed this result before here, but as it's now been published it's worth mentioning again). Globally M100 is not quite a starburst galaxy, but it is possible that a weaker form of outflow might occur from its central regions.

Good candidates for poorly-collimated, kiloparsec or larger, AGN-driven winds (as opposed to large-scale jet activity or nuclear-scale warm-absorbers) are much rarer than for the "classic" starburst-driven type of galactic wind.

IC 5063 [NED link] has a kiloparsec-scale ~600 km/s outflow of neutral hydrogen and ionized gas, see Morganti et al 2007 (astro-ph/0710.1189). IC 5063 is a bulge-dominated SA galaxy with a prominent dust lane (possibly a minor merger remnant) with Seyfert 2 activity at a distance of D ~ 47 Mpc. [Added 08 Oct 2007]

Theoretical work on galactic winds is rarer than observational work, but some significant new papers related to winds (not necessarily starburst-driven) have also appeared.

Jackie Cooper and colleagues (Cooper et al 2007, ApJ, in press, see astro-ph/0710.5437) present some of the first(*) 3-D simulations of a starburst-driven galactic wind (or superwind). Their main innovation is to model the initial ambient ISM the supernovae explode into as a multi-phase medium with a tenuous inter-cloud component and dense clouds drawn from a Kolmogorov density spectrum (as you would expect to be created by turbulence). Despite this major difference from the previous generation of superwind models that were forced to assume a homogeneous ISM (e.g. Suchkov et al 1994, Strickland & Stevens 2000), Cooper et al confirm our (SS2000) finding that the soft X-ray emission from superwinds arises in low-volume regions where the SN-driven hot wind interacts with dense cool clumps and clouds. An mpg movie of one of their simulations is available here.
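For readers curious what a "dense clouds drawn from a Kolmogorov density spectrum" initial condition might look like in practice, here is a minimal sketch of one common recipe (a lognormal field built from a power-law Gaussian random field). This is my own illustration under assumed parameters, not the actual Cooper et al setup:

```python
import numpy as np

def kolmogorov_density_field(n=64, beta=11.0/3.0, mean_density=1.0,
                             contrast=2.0, seed=42):
    """Lognormal 3-D density cube whose underlying Gaussian field has an
    isotropic power spectrum P(k) ~ k**(-beta) (Kolmogorov-like)."""
    rng = np.random.default_rng(seed)
    noise_k = np.fft.rfftn(rng.normal(size=(n, n, n)))
    kx = np.fft.fftfreq(n)
    ky = np.fft.fftfreq(n)
    kz = np.fft.rfftfreq(n)
    kk = np.sqrt(kx[:, None, None]**2 + ky[None, :, None]**2 + kz[None, None, :]**2)
    kk[0, 0, 0] = np.inf                       # suppress the k = 0 (mean) mode
    field_k = noise_k * kk**(-beta / 2.0)      # amplitude ~ sqrt(P(k))
    field = np.fft.irfftn(field_k, s=(n, n, n))
    field = (field - field.mean()) / field.std()
    return mean_density * np.exp(contrast * field)   # positive, clumpy medium

rho = kolmogorov_density_field()
print("min/mean/max density: %.3g %.3g %.3g" % (rho.min(), rho.mean(), rho.max()))
```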

The main limitation of their models is the small scale they can simulate at high resolution in 3-D - only a cube 1 kpc on a side over a time of only 1 Myr, compared to the 10+ kpc, 10+ Myr scales of real winds.

The idea that our own galaxy, the Milky Way, has some form of weak galactic wind has been kicking around for a while (e.g. Sofue 1989; Bland-Hawthorn & Cohen 2003; Keeney et al 2006). However, although star formation is more vigorous in the center of our Galaxy than it is on average through the disk, the Milky Way is by no means a starburst galaxy, so we would not expect there to be enough mechanical energy released by stellar winds and supernovae to drive a starburst-driven wind. Furthermore the scale of the supposed MW outflow varies from study to study, from a few hundred pc through 1-kpc scales out to 10 to 100 kpc-scale winds, so I'm normally very skeptical of claims that the Milky Way has a galactic wind.

However, a paper worth mentioning by Everett et al (astro-ph/0710.3712) argues for a 1-kpc-scale outflow driven by a mix of cosmic ray and thermal pressure. Cosmic rays typically carry between only 10 and 30% of the kinetic energy released by supernovae, which is why they're often ignored in the theoretical picture of starburst-driven superwinds -- in starburst galaxies we have more than enough energy to drive the observed winds. However, as stated above, the extra energy in the cosmic rays might be important for a small outflow from the center of the Milky Way.

Theoretical models of purely cosmic-ray-driven galactic winds are not new (e.g. Breitschwerdt et al 1991), but it is nice to see mixed thermal plus CR-driven wind models being developed and applied.

The Everett model appears to generate the same form of wind solution as the pure CR-driven winds - an initially low velocity (v less than 200 km/s) flow that slowly accelerates to higher velocity as the height above the plane of the Galaxy increases to several kpc. This behavior is unlike the velocity of the warm neutral and ionized gas observed in classic supernova-driven winds, where the measured velocities reach (typically) 200 - 600 km/s within a few hundred parsecs of the starburst region and then appear roughly constant.

Although interesting from a theoretical standpoint I think it is necessary that more observational evidence accumulate, specifically kinematic evidence of outflow from both absorption and emission-line studies, before we can be confident that the Milky Way galaxy does host some form of galactic wind.

The effect of supernovae and stellar winds from massive stars on the interstellar gas that they themselves formed out of is an example of an astrophysical feedback loop. Feedback from massive stars, and/or AGN, is thought to be important in regulating galaxy formation and evolution, but the exact mechanisms by which this proceeds and the significance of the effects of feedback are by no means clear as yet.

At one point SN feedback was believed to be very powerful, such that it might actually destroy low mass galaxies by blowing all the gas out of them and preventing further star formation, or even by unbinding them and their dark matter halos completely (Dekel & Silk 1986).

More recently the pendulum of opinion has swung in the other direction, as more detailed theoretical work demonstrated that supernova-driven winds could not efficiently blow all the gas out of even the lowest mass dwarf galaxies. Supernova-driven winds could eject metals from galaxies (elements heavier than hydrogen and helium that are created in stellar nucleosynthesis and ejected in supernovae), but the majority of the interstellar gas would remain behind in the galaxy (e.g. Mac Low & Ferrara 1999).

Now, in a result that is sure to attract widespread attention, Mashchenko et al (astro-ph/0711.4803) claim that SN feedback nevertheless can affect the shape of the dark matter halos of dwarf galaxies. The shape of DM halos has been a problem for some time, with theory predicting a different central shape (cuspy) to that inferred from observations of gas and stellar motions (constant density core).

For galaxies that are just forming most of the mass is either gas (rather than stars) or dark matter. Although overall there is thought to be more mass in dark matter than in "normal" baryonic matter, in the centers of these protogalaxies the condensed baryonic gas is a significant contributor to the gravity. Mashchenko et al claim that their simulations show that the first SN explosions perturb and stir the gas around, which then moves the dark matter around purely by gravity, thus smearing out the cuspy DM distribution predicted by standard theory and turning it into the smoother cores observed in real galaxies.

If true this would be a very significant result. While I work on feedback and think it very important for understanding the nature of the galaxies we observe in the Universe today, I am somewhat skeptical of this result. This work relies on accurately modeling star formation, and the hydrodynamics of the interstellar gas, over a wide range of physical and temporal scales. Ultimately it comes down to whether you believe their numerical method (smoothed particle hydrodynamics), and their implementation of SN feedback within the code, is accurate. Are these results real, or merely numerical artifacts? Time will tell.

(*) 3-D simulations of multiple SN explosions in proto-dwarf galaxies, single star clusters, or small regions of a starburst have been done before now. All of these situations are simpler than simulating a wind in a modern, more massive galaxy, so the Cooper simulations can be considered some of the first published 3-D sims of galactic winds. Annoyingly I've been sitting on a set of completed 3-D sims of winds covering larger physical scales than the Cooper models for two years now without publishing them.... argh!

[Update 6:39pm: Replaced paragraphs lost after Blogger decided to delete random paragraphs.]

How not to get tenure

Speaking of the Templeton Foundation, it is claimed that the TF(*) was one of the few sources of funding that Guillermo Gonzalez managed to obtain. Indeed, it seems he only managed to bring in $170,000 spread over 5 years, compared to the average of $1,300,000 other ISU physics and astronomy faculty had brought in while they were on tenure track!

GG's failure to bring in normal levels of funding, in addition to a dramatic drop in publication rate and a lack of grad students and postdocs, were major factors in his being denied tenure. These are, along with undergraduate teaching, major aspects of being a professional scientist on the tenure track, although tenure is not based on a fixed set of rules.

Many good scientists do not get tenure (e.g. Rob Knop of the Galactic Interactions blog), and many good scientists never even get onto the tenure track in the first place. Given that GG appears to have failed to satisfy the requirements in many ways it is totally unsurprising that he was denied tenure at Iowa State University.

That he espoused an unscientific astronomical version of Intelligent Design and had close links to the Discovery Institute was also, quite rightly and quite fairly, another aspect of concern for the faculty in the Physics department he was attempting to get tenure from. The Discovery Institute's anti-secular and anti-scientific agenda, coupled with its manifest dishonesty, is no secret. The recently revealed emails clearly show that GG's DI/ID links were known and were (entirely fairly) viewed negatively, but were not used as a litmus test to "discriminate" against him. Of course, this will not stop Gonzalez and the DI from hijacking a routine and just decision in order to play politics.

After all, the faculty must have been aware of his views when they offered him a tenure-track position in the first place - they were no secret in the astronomical community - and yet he was hired. I have no doubt that while the faculty may have viewed GG's views with distaste they would have given him tenure had he satisfied the standard requirements of all tenure-track faculty: bring in funding, mentor students and postdocs, be scientifically productive. Guillermo Gonzalez has only himself to blame for his current position.

Gonzalez could have used those tenure-track years to engage in peer-reviewed research to develop the concept of galactic habitability and turn it from a poorly-constrained hypothesis into a robust theory. He could easily have applied for grants to pay for several grad students and postdocs to work with him to expand our knowledge of the role galaxies play in habitable planet formation and evolution.

But Guillermo Gonzalez didn't attempt to further science. Instead he decided to present his speculative and religiously-distorted views of Galactic Habitability to the unsuspecting public as scientific fact through his book "The Privileged Planet", bypassing peer review altogether.

In the meantime real science, done by real scientists, went ahead and left Guillermo Gonzalez behind. The ADS abstract service records 440 astronomy-related abstracts with the word habitable in 2006-2007 alone. If we repeat the search requiring the surname Gonzalez to be one of the authors we get 1 abstract, and it's by an M. Gonzalez, not Guillermo. I have to expand the search to 2000-2007 before Guillermo Gonzalez(**) appears, and then only in three abstracts, two of which are reviews rather than new work. Again, by way of comparison, 107 abstracts contained "Galactic" and "Habitable" in the abstract between 2000 and 2007 (1353 with the word Habitable alone).

If Guillermo Gonzalez really believed that astrophysics did indeed provide convincing evidence of a Designer (specifically a Christian God) why would he have abandoned actual research?

(*) I am informed that the TF funds many good studies of the interaction between science and religion, and that the TF is opposed to the "culture war" spin presented by fundamentalists such as the DI. Some of the people they occasionally fund are less rational though.

(**) There is also another Guillermo Gonzalez in professional astronomy.

Anyway, all this politics is tiring. In my next post we'll be back to discussing really interesting stuff. Yup, more on galactic winds!

Monday, December 03, 2007

Bob Park on Paul Davies and the Templeton Prize

Bob Park discusses Paul Davies's infamous Op-Ed (previously mentioned here) and the Templeton Prize in his November 30th "What's New" column.

Protecting NASA research funding

Space News Business Report (at Space.com) discusses Alan Stern's (NASA associate administrator for sciences) changes to restore and protect research funding.

Monday, November 26, 2007

Paul Davies does not know what science is

Paul Davies, best known for exploiting anthropic arguments in the service of religious apologetics and presenting the results to the unsuspecting public as popular "science" books, is at it again: a November 24th Op-Ed piece in the NYT that trots out the old canard that science is based on faith in the same way religion is.


When I was a student, the laws of physics were regarded as completely off limits. The job of the scientist, we were told, is to discover the laws and apply them, not inquire into their provenance. The laws were treated as “given” — imprinted on the universe like a maker’s mark at the moment of cosmic birth — and fixed forevermore. Therefore, to be a scientist, you had to have faith that the universe is governed by dependable, immutable, absolute, universal, mathematical laws of an unspecified origin. You’ve got to believe that these laws won’t fail, that we won’t wake up tomorrow to find heat flowing from cold to hot, or the speed of light changing by the hour.
Of course, almost none of this is true, as almost any scientist of any type will quickly tell you.

First of all, the "laws of science" are observed regularities of the way the Universe works, convenient descriptions for human consumption. Indeed, if there were no such regularities then everything we know, including life, could not exist. But that does not mean that a Universe that works differently could not exist.

That nature is ordered such that laws can be ascertained is not an untested assumption (or "faith" as Davies misleadingly but deliberately words it), but a hypothesis that is effectively tested. Even quantum mechanics, with predictions and behaviour that are extremely unusual by everyday standards, is rational and intelligible. Davies's statement that "you have to believe that those laws won't fail" is an extremely unusual thing to say - physicists I know don't go around reciting "f=ma" for fear that Newton's Laws will stop working. Imagine if physics on the scale of everyday objects (houses, planets, etc) were not ordered and the so-called laws did fail - you'd work out that was happening pretty quickly.

Why would Davies even make such poorly reasoned claims? Why use the word faith for something that is nothing like religious faith (belief without evidence)? Nor is this the only attempt to tie science to religion, specifically Christianity, in the article. The wikipedia article on Paul Davies (as seen at the time of writing on Nov 26th 2007 at 5:30pm EST) is blunt in one (quite probable) interpretation:

Davies recently made his commitment to deism clear in his New York Times Op-Ed, Science on Faith, 24 November, 2007 going so far as to write:

Clearly, then, both religion and science are founded on faith — namely, on belief in the existence of something outside the universe, like an unexplained God or an unexplained set of physical laws, maybe even a huge ensemble of unseen universes, too. For that reason, both monotheistic religion and orthodox science fail to provide a complete account of physical existence.

While coyly refusing to name or identify a particular candidate monotheistic religion, equating faith in the supernatural with a faith that the universe is itself real, testable and knowable is essentially a reductio ad absurdum where science can be faulted for not knowing anything if it can't explain absolutely everything. The only motive for making such an absurd argument is to open a crack where the wedge of deism can be inserted into scientific reason, to divide and dominate it. It is essentially a political, not scientific argument, one that advocates deism in exactly the same way as the "intelligent design" religious movement.


Sadly it is unlikely that the hordes of NYT readers who made this article one of the top emailed articles in the last 48 hours will learn just how wrong Davies is.

Monday, November 12, 2007

The Biggest Eyes in the Sky

NASA's "Great Observatories": Hubble, Chandra, Spitzer and Compton, cost on average $1-2 billion each to build, launch and operate, in a program that has well taken over twenty years to develop. By way of comparison, NASA spends approximately $1.3 billion per year on astrophysics (excluding Solar and Solar system related research), or about $5 billion per year on science in total, out of a total budget of $18 billion per year. This useful PDF describes the Great Observatories program, its aims, and NASA's budget.

Together they covered much of the electromagnetic spectrum inaccessible to ground-based observatories due to atmospheric absorption: Hubble probed ultraviolet as well as optical wavelengths, Spitzer probed lower energy infra-red radiation, while Chandra and Compton were sensitive to X-ray and gamma-ray radiation respectively. In many ways they are the most advanced telescopes ever created by mankind, for those specific wavelengths.

However, for the optical, near-IR and radio wavebands there is no doubt that much better telescopes have been launched into space, and built at greater cost to the public. The difference is that they're not pointed outward but instead inwards, down at the Earth. They are, of course, spy satellites. Quite apart from the vital intelligence-gathering work they do, they're fascinating and impressive technical accomplishments.

The NYT has a fascinating article by Philip Taubman about spy satellites, more specifically about the financial woes affecting latest generation of US spy satellites. Well worth a read when you're having a break from work.

[Edited at Mon Nov 12 11:09 to fix formatting and note article is mainly about the financial aspects of the spy satellite program]

Thursday, November 08, 2007

Comets, X-rays, and Pumpkins



I've been rather busy lately writing an NSF grant proposal, so I haven't written any posts for a while. Despite the time-crunch, I feel like taking a few minutes to briefly mention Comet Holmes, which I managed to see with the naked eye (even from light-polluted Baltimore suburbia) last week and the week before.

As everyone is probably now aware, Comet 17P/Holmes is currently visible to the naked eye (to find it check out this S&T article, or the easier-to-use directions from SPACE.com), after undergoing a dramatic and unexpected increase in luminosity believed to be associated with some form of explosive out-gassing. Note that it is only through binoculars or a telescope that the truly diffuse nature of the comet is apparent, but even with small binoculars it was pretty impressive. Far more impressive to my mind than Halley was in 1986, so if you haven't seen it yet please go look for it.

Indeed, this is one of the rare astronomical phenomena that are more impressive to see yourself than to view as a picture taken with a big telescope (Astronomy Picture of the Day has a whole series of Comet Holmes images: 1, 2, 3, 4, 5).

I have a soft spot for comets: I worked very briefly (for a few hours) on trying to explain the X-ray emission from comets. The material evaporating off a comet is very cold (only about T~50 K), so it was a great surprise when X-ray emission was discovered coming from Comet Hyakutake in 1996 using the ROSAT X-ray telescope (see e.g. Glanz, 1996, Science, 272, 194). Many explanations were advanced at the time, the vast majority of which did not work out. As the Solar wind has a velocity of several hundred km/s, one hypothesis was that a shock wave caused by the interaction of the Solar wind with the cometary halo caused X-ray emission by thermal bremsstrahlung. Ian Stevens, my thesis advisor at the time, performed hydrodynamical simulations of this, and my job was to take the simulations and calculate the expected X-ray luminosity. That turned out to be orders of magnitude less than the observed emission, hence disproving the hypothesis. We didn't even consider publishing the results.
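For anyone curious what such an estimate looks like, here is a minimal back-of-the-envelope sketch using the standard thermal bremsstrahlung emissivity. The density, temperature and size of the emitting region below are illustrative values I have assumed for a shocked solar-wind/coma interaction, not numbers from the original simulations:

```python
import numpy as np

def bremsstrahlung_luminosity(n_e, T, radius_km, gaunt=1.2):
    """Total free-free luminosity (erg/s) of a uniform sphere of ionized
    hydrogen (n_i ~ n_e), using the standard emissivity
    1.4e-27 * sqrt(T) * n_e * n_i * g_B  erg s^-1 cm^-3."""
    volume_cm3 = (4.0 / 3.0) * np.pi * (radius_km * 1.0e5) ** 3
    emissivity = 1.4e-27 * np.sqrt(T) * n_e * n_e * gaunt
    return emissivity * volume_cm3

# Assumed values: a ~1e5 km interaction region, a few particles per cm^3,
# and a post-shock temperature of a few million K.
print("L_ff ~ %.1e erg/s" % bremsstrahlung_luminosity(n_e=5.0, T=3.0e6, radius_km=1.0e5))
```

With numbers of that order the free-free luminosity comes out around 1e8-1e9 erg/s, many orders of magnitude below the observed cometary X-ray emission, consistent with the conclusion that a Solar-wind shock plus bremsstrahlung cannot be the explanation.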

The real explanation for the cometary X-ray emission turned out to be charge exchange with the Solar wind. The material in the Solar wind is highly ionized, while the material out-gassed from the comet is largely neutral. A highly ionized ion interacts with a neutral atom, basically stealing one or more electrons from the neutral atom. The formerly neutral atom is now ionized, and is left in an excited state. This excited state decays to the ground state by the emission of one or more photons. Dennis Bodewits's PhD thesis "Cometary X-rays : solar wind charge exchange in cometary atmospheres" (2007, The University of Groningen) deals with many aspects of X-ray emission from comets, and is available chapter by chapter in PDF form.
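Schematically, one representative reaction (picked by me as an illustration, not drawn from the thesis) looks like this, with a bare solar-wind oxygen ion picking up an electron from cometary water and the excited product decaying by emitting a soft X-ray photon:

```latex
\mathrm{O^{8+} + H_2O \;\longrightarrow\; (O^{7+})^{*} + H_2O^{+}},
\qquad
\mathrm{(O^{7+})^{*} \;\longrightarrow\; O^{7+} + h\nu \;\;(\text{soft X-ray photon})}
```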

Ironically solar wind charge exchange (SWCX) has now been recognized as a process that almost all X-ray astronomers must worry about (see Snowden et al 2004, ApJ, 610, 1182), even those like me who study distant galaxies. SWCX is now recognized as a major contributor to the soft X-ray background that affects all X-ray observations, and which makes observing faint diffuse X-ray emission difficult. Worse still, the SWCX emission can be time variable, further complicating background estimation and removal.


In acknowledgement of this link my Halloween pumpkin this year was a comet, which looked quite good until our local deer ate it.

Friday, October 19, 2007

Component analysis, causal inference, and general intelligence

The aim of astronomy is astrophysics - we observe the Universe with the hope of using the resulting data to understand the fundamental physical processes that give rise to its observed properties.

As with many sciences, the data obtained from observation (or experimentation, in other sciences) does not by itself uniquely tell you the physics or what caused what. Instead one normally looks for correlations between different aspects of the data.

For example, it is known that the surface brightness, effective radius and velocity dispersion of the stars in elliptical galaxies are strongly correlated, a result now called the fundamental plane. Another example is that in starburst galaxies the soft X-ray luminosity is linearly proportional to the galaxy's far-IR luminosity because, causally, the FIR traces the formation rate of massive stars, the same stars that very rapidly die and whose supernovae heat the ISM to X-ray-emitting temperatures.

Various methods of investigating correlations between multiple variables exist (e.g. principal component analysis), now often referred to as "data mining." The problem is that these methods, while useful at recasting the data in ways that aid visualization of any correlations in the data variables, do not necessarily tell you what caused what.

An interesting discussion of these often-forgotten issues and complexities, one that is applicable even to astrophysics, can be found in Cosma Shalizi's article on the myth of g, the so-called general factor of intelligence. Indeed, he argues that while factor analysis is perfectly valid for data exploration or model testing, as a method for finding causal structure it is not reliable (it can be right, but often it is completely wrong and can fool you).
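As a toy illustration of that point (my own sketch, not one of Shalizi's examples), the snippet below builds two observables that are correlated only because both are driven by a hidden third variable. PCA duly finds a dominant component, but exactly the same dominant component would appear if one observable directly caused the other, so the decomposition by itself tells you nothing about the causal structure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two observables correlated only via a hidden common cause plus noise.
hidden = rng.normal(size=1000)
x = 1.0 * hidden + 0.3 * rng.normal(size=1000)
y = 0.5 * hidden + 0.3 * rng.normal(size=1000)
data = np.column_stack([x, y])

# Principal component analysis via the covariance matrix.
eigvals, eigvecs = np.linalg.eigh(np.cov(data, rowvar=False))
print("variance fraction in 1st component: %.2f" % (eigvals[-1] / eigvals.sum()))
print("1st principal component direction:", eigvecs[:, -1])
```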

All very interesting, and rather important to understand in the wake of a certain elderly Nobel-prize winner's recent counter-factual comments.

Wednesday, October 17, 2007

Rumors of the LTSA program returning

Steinn Sigurdsson reports rumors that the NASA LTSA (Long Term Space Astrophysics) funding program may return. The LTSA program has not been offered for several years due to budget cuts.

This would be good news for younger researchers in the space sciences, in particular those using mainly HST, Chandra, XMM-Newton or Spitzer data, who are ineligible for the currently remaining funding programs like the ADP. Let us hope the rumor is true.

Monday, October 15, 2007

LibraryThing

I spent a large part of Sunday adding the non-fiction part of the home and work book collections to LibraryThing, an online book collection web site. You can view my library here (or click the link at the bottom of the blog). A short introductory guide to LibraryThing displays many of its features.

I have wanted an easy-to-use book cataloging tool for some time now, largely to have a list of my books in case of accident or theft, and also to prevent the purchase of duplicate copies of books (which can be an expensive mistake when you're buying high level science textbooks).

Much of the available open-source list- or catalog-making software is not book-specific, and hence leaves much to be desired. In contrast, LibraryThing offers many of the features I've wanted and a few I hadn't even considered. Item entry is very easy: you can pretty much rely on entering the ISBN and LibraryThing recognizing the correct book and edition (although the cover images are occasionally incorrect).

It also offers a variety of tools for blog integration - for example a random collection of covers from my collection should appear in the side bar to the bottom right of this blog. So far it appears random but static, as the covers do not change when I reload the site.

One drawback is that the free accounts are limited to 200 books, rather restrictive. A lifetime membership with unlimited books is $25. So for now I've limited my catalog to non-fiction history or science books, everything else will go in another free public LibraryThing when I get around to it.

Swara deserves thanks for bringing LibraryThing to my attention.

Friday, October 05, 2007

Judge, fearing "Sleeper Cells", rules in favor of universal background checks

Judge Otis Wright has ruled in favor of NASA and against a group of JPL employees in a case regarding invasive background checks that would allow the questioning of employees about their sexuality, medical records and finances. These were ordered by presidential directive in 2004 for all national laboratories, of which JPL is one. "...I want the security of this nation preserved,'' Wright said Monday. "I don't want any sleepers infiltrating NASA or JPL.''

Apparently a bunch of uneducated turbaned guys hanging out in caves in Pakistan might be able to put "sleeper cells" into positions in NASA or JPL that don't even deal with classified information but that will still threaten the "security of this nation", even though this was something that clearly wasn't even a possible threat from the Soviet Union during the Cold War.

Link: SPACE.COM.

Wilkins on Feyerabend


John Wilkins at Evolving Thoughts has an interesting essay-length article on the philosopher most guaranteed to make a scientist roll their eyes in scorn: Feyerabend. Well worth reading, especially regarding the origin of Feyerabend's ideas and their consequences in today's era of special-interest denialist "think tanks".

[Image of Paul Feyerabend from https://webspace.utexas.edu/cokerwr/www/slides/philosophers.html]

Thursday, October 04, 2007

Dark Energy and Straw Men


Among today's set of astro-ph preprints, a conference proceedings article by Andreas Albrecht (astro-ph/0710.0867) entitled "The case for an aggressive program of dark energy probes" caught my eye. Specifically the abstract attracted my attention:

The observed cosmic acceleration presents the physics and cosmology communities with amazing opportunities to make exciting, probably even radical advances in these fields. This topic is highly data driven and many of our opportunities depend on us undertaking an ambitious observational program. Here I outline the case for such a program based on both the exciting science related to the cosmic acceleration and the impressive impact that a strong observational program would have. Along the way, I challenge a number of arguments that skeptics use to question the value of a strong observational commitment to this field.

From the abstract I had hoped this would address and rebut the criticisms raised by Simon White earlier this year in "Fundamentalist physics: why Dark Energy is bad for Astronomy" (astro-ph/0704.2291). Unfortunately Albrecht's article does not address any of these substantive criticisms regarding discovery space, scientific method, whether Dark Energy experiments advance our knowledge of astrophysics, or experiment systematics, so for now I am left to believe that White's criticisms remain valid and significant.

Don't misunderstand me - IF we can confirm and constrain the nature of dark energy it may revolutionize our understanding of particle physics, but its role in explaining the Universe of galaxies, stars and planets that we live in is extremely limited. By all means fund dark energy probes out of the DoE and classical physics funding, but significant progress in astrophysics will be stifled if too much astrophysics money is diverted into DE.

Understanding the nature of the dark matter particle would also advance fundamental physics greatly, and dark matter plays a much larger role than dark energy in shaping the nature of the Universe and the specific objects we live in. Yet the discovery of dark matter wasn't met with a program of dropping successful existing astronomy projects and throwing vast resources at satellites that could only quantify one number about dark matter.

Tuesday, September 25, 2007

The mysterious CK Vul

So what did Pere Dom Anthelme and Hevelius see in 1670? See Hajduk et al (astro-ph/0709.3746).

Monday, September 24, 2007

Black hole finding and who killed the Dinosaurs

NASA has approved the restart of NuSTAR, a SMEX-level mission, which is a focusing hard X-ray telescope with the primary objectives of "conducting a census for black holes on all scales, mapping radioactive material in young supernova remnants, and exposing relativistic jets of particles from the most extreme active galaxies". This is great news! The hard X-ray sky has not been mapped in any systematic way, and NuSTAR promises to live up to its former unofficial name of Black Hole Finder by discovering the obscured black holes that cannot be seen at any other wavelength.

The other primary source of astronomy funding is the National Science Foundation (the NSF), which had been reviewing its priorities under a process called the Senior Review that culminated in a report published in November 2006. The astronomy division of the NSF has just produced a progress update on how it has been seeking to implement the proposals of the Senior Review. It's worth reading, not least to gain an idea of the problems facing Arecibo's future.



While we're close to home, Bottke et al (2007, Nature, 449, 48) present work suggesting that the KT impactor (i.e. the asteroid impact believed to have led to the extinction of the non-avian dinosaurs and many other species about 65 million years ago) may itself have originated in the break-up of a larger asteroid about 170 million years ago. The remaining remnants of this mega asteroid (the bits that didn't later hit the Earth or the Moon) make up the Baptistina asteroid family. If you don't have access to Nature then try reading the press release instead.

[Stylish black and white image from the South West Research Institute web site]

Thursday, September 13, 2007

A galactic wind from Messier 100?

Jiménez-Vicente et al (2007) appear to have discovered evidence for a galactic wind emanating from the nuclear region of the roughly face-on spiral galaxy Messier 100 (Astronomy Picture of the Day image of M100 here). Blue-shifted sodium absorption lines (arising in neutral hydrogen gas at several thousand degrees Kelvin) and blue-shifted hydrogen and nitrogen emission lines (emitted from ionized hydrogen gas at about 10000 Kelvin) are a classic observational signature of a starburst-driven outflow, and their data is pretty persuasive.

These absorption and emission lines arise in gaseous "clouds" that are initially at rest in the host galaxy, but have now been entrained into (and accelerated by the ram pressure of) a much-hotter enveloping wind of merged supernova ejecta (at a temperature of order 10 000 000 Kelvin). In the standard theoretical model for galactic winds the clouds with the lowest column density (i.e. the thinnest, lowest mass clouds) are accelerated up to higher velocity than the higher column density clouds, so that the flow ends up with multiple clouds covering a broad range of outflow velocities.

It is important to realize that the material seen using optical absorption or emission lines (the clouds) might not necessarily ever achieve the same velocity as the hot, metal-enriched gas that drives the wind, so that the velocity of the clouds is not the same as the true wind outflow velocity.
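A crude way to see both effects (lower column density means higher cloud velocity, and clouds never exceeding the hot wind speed) is the ram-pressure acceleration sketch below. The wind density and velocity are illustrative, assumed values of broadly superwind-like magnitude, not measurements of M100:

```python
import numpy as np

KPC = 3.086e21   # cm

def cloud_velocity(N_H, n_wind=0.01, v_wind=2000.0e5, distance_kpc=1.0):
    """Very rough speed (cm/s) of a cloud of hydrogen column density N_H
    (cm^-2) after being pushed over the given distance by the ram pressure
    of a hot wind with particle density n_wind (cm^-3) and speed v_wind
    (cm/s). Constant acceleration a = rho_w v_w^2 / Sigma_cloud; the
    particle masses cancel, leaving a = n_wind * v_wind**2 / N_H."""
    a = n_wind * v_wind**2 / N_H
    v = np.sqrt(2.0 * a * distance_kpc * KPC)
    return min(v, v_wind)   # a cloud cannot overtake the wind pushing it

for N_H in (1.0e19, 1.0e20, 1.0e21):
    print("N_H = %.0e cm^-2  ->  v ~ %4.0f km/s" % (N_H, cloud_velocity(N_H) / 1.0e5))
```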

Constellation-X: There is good news and there is bad news

Last week the National Research Council's Space Studies Board finally published its conclusions assessing the feasibility and priorities for space missions in the NASA "Beyond Einstein" program. Basically this was a match-up to select between the current political and fashionable favorites going after dark energy, and the older-but-surer science missions like Constellation-X (an X-ray observatory, e.g. for understanding black holes, AGN, and anything else extremely hot in the universe) and LISA (gravitational wave detection).


The committee will be charged to address the following tasks:

1. Assess the five proposed Beyond Einstein missions (Constellation-X, Laser Interferometer Space Antenna, Joint Dark Energy Mission, Inflation Probe, and Black Hole Finder probe) and recommend which of these five should be developed and launched first, using a funding wedge that is expected to begin in FY 2009. The criteria for these assessments include:
a. Potential scientific impact within the context of other existing and planned space-based and ground-based missions; and
b. Realism of preliminary technology and management plans, and cost estimates.

2. Assess the Beyond Einstein missions sufficiently so that they can act as input for any future decisions by NASA or the next Astronomy and Astrophysics Decadal Survey on the ordering of the remaining missions. This second task element will assist NASA in its investment strategy for future technology development within the Beyond Einstein Program prior to the results of the Decadal Survey.

You can read the full report online here. Steinn Sigurdsson already offered his take on the predictable outcome (a mission focused on dark energy) in a post last week that can be found here. The Constellation-X team is rather more upbeat in its assessment:

Dear Colleagues:

The BEPAC report provides both bad and good news for the Con-X project. Bad news is that Con-X was not chosen as the highest priority (JDEM was recommended as the first BE mission) or even as the 2nd highest priority (LISA was recommended for enhanced technology investment). Good news can be found in the BEPAC endorsement of the Con-X science and technical readiness. Specifically, "Con-X will make the broadest and most diverse contributions to astronomy of any of the candidate Beyond Einstein missions...the general observer program of Con-X will harness the ingenuity of the entire astronomical community", and "Con-X is one of the best studied and tested of the missions presented to the panel...much of this can be attributed to...strong community support". The panel recognized that "Con-X was ranked second only to the James Webb Space Telescope in the 2001 Decadal Survey".

Looking forward, the BEPAC recommends that "Con-X development activities need to continue aggressively in areas such as achieving the mirror angular resolution, cooling technology and x-ray micro-calorimeter arrays to improve the Con-X missions readiness for the next Astronomy and Astrophysics Decadal Survey." We will continue to pursue technology developments in these areas. All of your assistance in preparing the BEPAC presentations provides us with a very good start for the upcoming Decadal Survey. One of our tasks will be to expand upon the BEPAC material, which was very focused on BE science, in order to capitalize on the breadth and diversity of Con-X science (which was recognized by the BEPAC). We anticipate having the next Con-X FST meeting in mid-Feb 2008, following the release of the NASA budget request for FY2009.

Harvey Tananbaum, Nick White, Michael Garcia, Jay Bookbinder, and Ann Hornschemeier, for the Con-X Project Team.
I leave it as an exercise for the reader to compare the Constellation-X assessment with the officially stated goals of the assessment (1a and b above).

Speaking personally, I'm very disappointed in the selection. Getting the equation of state for dark energy is interesting, but very limited in scope, and this selection basically sacrifices a lot of potential scientific progress and new discoveries in many branches of astrophysics for a very limited advance in one specific area. Indeed, an area that has little practical impact for our understanding of the broader nature of the Universe and how we came to be here (see Simon White's essay regarding dark energy and astronomy here).

Constellation-X is, furthermore, the only proposed instrument in the next 10-20 years with the potential to unambiguously answer the most fundamental unknown regarding starburst-driven winds (my primary area of study): the velocity of the hot and very hot metal-enriched X-ray-emitting gas. If we don't and won't know that for sure, then we'll never be quite sure about interstellar medium and intergalactic medium feedback processes. Forget about getting the full picture on galaxy formation, galactic chemical evolution, mass loss from galaxies, enrichment and heating of the IGM, and so on. But hey, instead we get to know whether a number called w is between -1 and 1. I'm sure the public is eager to know that, I'm positive they'll get very excited about that result.
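For reference (my gloss, not something from the BEPAC report), that number is the dark-energy equation-of-state parameter:

```latex
w \equiv \frac{p_{\rm DE}}{\rho_{\rm DE}\,c^{2}}, \qquad w = -1 \;\;\text{for a cosmological constant.}
```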

Now Con-X isn't officially dead yet, but being third in line is a dangerous place to be. In another few years, if there are further assessments to rationalize and prioritize spending, will Con-X survive? And without Con-X, I feel like I might as well pack up and give in. Game over...

Wednesday, September 05, 2007

Brightest Supernova paper finally published, and it's already been demoted to 2nd place

The popular science press and the science blogs went to town back in May 2007 reporting on SN 2006gy, a supernova explosion in the galaxy NGC 1260 that was detected on Earth in 2006. At the time SN 2006gy was heralded as the most luminous SN explosion ever detected.

It is common, but somewhat unfortunate, with "breaking news" about science that the press attention occurs well before the refereed scientific paper describing the results is actually published.

Well, the full, final, refereed paper "SN 2006gy: Discovery of the Most Luminous Supernova Ever Recorded, Powered by the Death of an Extremely Massive Star like η Carinae" by Smith et al was published just this week, in the September 10th edition of the Astrophysical Journal (Smith et al, 2007, ApJ, 666, 1116).

It is somewhat ironic then that in today's astro-ph preprints SN 2006gy is demoted to being the second brightest SN ever by SN 2005ap (Quimby et al, astro-ph/0709.0302). That's right, a supernova detected in 2005 was about twice as luminous as SN 2006gy. Except it'll be published second. But by the same group of researchers.

Both of these events (and a third unusually luminous but as yet unpublished SN event) were detected by the Texas Supernova Search (Quimby, R. 2006a, Ph.D. thesis, Univ. Texas at Austin).

Pretty impressive work, and evidence that unusual things still lurk out there waiting for the innovative scientist (or large team of innovative scientists and engineers) to find them.

Tuesday, August 28, 2007

The harvestmen of deep time

[image source: Gonzalo Giribet / NY times]

Carl Zimmer has produced yet another fascinating article in the science section of the NYT, featuring Dr Gonzalo Giribet and his group's research on the evolution of mite harvestmen (a relative of the daddy longlegs).

In short, as any given species of mite harvestman has a small physical range (apparently of order 50 miles or so) and hence does not spread or disperse much on its own, these animals provide a great way of tracing continental motion on time scales of hundreds of millions of years.

Monday, August 20, 2007

FUSE observations of O VI in Superwinds


Following up from the previous obituary for the Far Ultraviolet Spectroscopic Explorer (FUSE) I thought I'd start to discuss one way in which FUSE advanced the scientific topic I study most: starburst driven superwinds.

FUSE observations of starburst galaxies and superwinds yielded many important results, although they generated as many questions as they answered. In particular, FUSE gave access to the 1032 and 1038 Angstrom doublet of the O VI ion (five-times-ionized oxygen, O^5+). This coronal-phase gas could be used to measure the velocity and amount of material at a temperature of about 300,000 K (3e5 K), gas far hotter than the 10,000 K gas seen in Hubble observations of superwinds (this latter gas phase is technically termed warm ionized gas to differentiate it from the hotter ionized phases detected with O VI or X-rays).

Although 3e5 K is still much cooler than the temperature of the X-ray emitting gas in superwinds (which has temperatures of several times 1e6 to several times 1e7 K), this is still the hottest gas for which the velocity in a superwind has been measured.

The standard theoretical model for superwinds predicts that the hotter the gas phase the higher its outflow velocity. This is because it is the gas thermal pressure and ram pressure-driven expansion of the hottest material (the merged supernova and stellar wind ejecta) that sweeps up and accelerates the cooler ambient gas clouds that are seen in optical observations of superwinds. As this acceleration process is imperfect the cooler material is never accelerated to the same velocity as the hot gas.

In other theoretical models for superwinds, for example radiation-pressure driven winds or cosmic-ray driven winds, this specific variation of outflow velocity with temperature is not expected. Thus measurements of the outflow velocity of different gas phases at different temperatures can be used as a test of our theories of how superwinds are created and how they work physically.
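To give a feel for the numbers, here is a minimal sketch of my own (not a calculation from any of the papers cited here) evaluating the rough terminal velocity of a freely expanding, thermally driven wind launched from gas of temperature T, v ~ sqrt(5kT/(mu m_H)). It is only meant to show why hotter phases are naturally expected to flow faster; entrained clouds, as discussed in the M100 post above, would lag these values:

```python
import numpy as np

K_B = 1.381e-16   # Boltzmann constant, erg/K
M_H = 1.67e-24    # hydrogen mass, g
MU = 0.6          # assumed mean mass per particle (ionized gas), in units of m_H

def terminal_velocity_kms(T):
    """Rough terminal speed (km/s) of a thermally driven wind from gas at
    temperature T, assuming all the enthalpy (gamma = 5/3) goes into bulk motion."""
    return np.sqrt(5.0 * K_B * T / (MU * M_H)) / 1.0e5

for label, T in [("warm ionized (H-alpha)  ", 1.0e4),
                 ("coronal (O VI)          ", 3.0e5),
                 ("soft X-ray emitting gas ", 3.0e6),
                 ("hottest wind fluid      ", 3.0e7)]:
    print("%s T = %.0e K  ->  v ~ %5.0f km/s" % (label, T, terminal_velocity_kms(T)))
```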

Initial observations with FUSE of the starbursting dwarf galaxy NGC 1705 (Heckman et al, 2001, ApJ, 554, 1021) suggested that this O VI-absorbing gas was flowing outward faster than the warm ionized and neutral gaseous media, in line with expectations from the standard superwind model [and that the amount of O VI absorbing material was inconsistent with thermal conduction in the shell of a superbubble, a slightly different model for what is happening in NGC 1705].

The image shown above is an optical image of NGC 1705 taken from the SINGG survey of Meurer et al 2006, ApJS, 165, 307. The red in this image is H-alpha emission, light from warm ionized hydrogen with a temperature of 8000 - 10000 Kelvin. The filaments, shell and arcs of ionized hydrogen cover a region about 1 kpc (about 3000 light years) in diameter.

Since those early FUSE results on NGC 1705 were published in 2001 many more FUSE observations of a variety of different starbursting galaxies were taken, most of them analyzed by colleagues here at Johns Hopkins such as Charles Hoopes and John Grimes. In a future post I'll summarize what the latest thinking is on O VI in starbursts and the issue of phase-dependent velocities in superwinds.

FUSE is dead, long live FUSE!

The Far Ultraviolet Spectroscopic Explorer, a UV space telescope launched by NASA in June 1999 with a nominal lifetime of 3 years, ceased working for the last time on July 12th 2007.

FUSE Mission Status--Aug. 17, 2007
Dear Colleagues,

As reported previously in the FUSE newsletter, the last operational reaction wheel on FUSE stopped temporarily in early May 2007. It was restarted and science operations resumed on June 12. However, on July 12 the wheel stopped again. This time the stoppage was very abrupt indicating a large braking force. Attempts to restart any of the wheels over the last four weeks have been unsuccessful. Although the instrument remains in excellent condition, the FUSE satellite is currently incapable of the fine pointing control required to continue its science mission, and there is no real prospect for recovering this capability.

Regrettably, we have concluded that the scientific mission of the Far Ultraviolet Spectroscopic Explorer is no longer viable. The NASA Science Mission Directorate has accepted our recommendation to terminate the mission. The FUSE Project has started closeout activities and will complete the final CalFUSE 3.2 reprocessing of the entire science mission data set in mid 2008. The FUSE archive at MAST will be an ongoing legacy of the mission, and an important resource for years to come. Future editions of the FUSE Newsletter will provide details of our plans for the FUSE mission archive at MAST. Also, watch the FUSE web page for updates.

The FUSE mission has been a fantastic success by any measure. 678 science programs (GI, PI team, and discretionary time) have obtained 67 Msec of observing time, over 5100 observations of about 2800 unique targets. There are over 430 peer-reviewed papers based on FUSE data and the number continues to grow. The story is not quite over, though. Twenty five of the 68 programs selected for Cycle 8 obtained data this Spring and Summer before the reaction wheel stopped for the last time. These data have been archived recently and should lead to further exciting results in the near future. Utilization of the FUSE archive will continue the flow of new results.

The Astrophysics Division intends to place special emphasis on FUSE archival research in the 2008 Astrophysics Data Program that will be part of the 2008 ROSES proposal solicitation.

The success of FUSE is a result of the combined efforts of the scientists and engineers who built and operated it plus the scientists who proposed, analyzed, and interpreted the observations. FUSE's legacy is a testament to the creativity, ingenuity, and hard work of all of you. We acknowledge your efforts and enthusiasm with gratitude.


George Sonneborn, Project Scientist, NASA/GSFC
Warren Moos, Principal Investigator, Johns Hopkins University

Over the years scientists and engineers had worked wonders to overcome many hardware failures and software glitches to keep FUSE working beyond its normal lifetime, but this final problem with the reaction wheel appears fatal. FUSE will live on as scientists continue to analyze and re-analyze the data archive accumulated over its many years of successful observing.

Unlike the Hubble Space Telescope, which is primarily an imager, FUSE consisted of a set of four co-aligned UV-sensitive spectrometers (see the FUSE User Guide for a technical description of the instrument) and thus could not create the pretty pictures that have made Hubble famous. FUSE has been very successful scientifically, but without images it is hard to convey to the public why that is, and why it had unique capabilities that Hubble (or its eventual replacement, JWST) cannot match.

Friday, August 17, 2007

Mira's tail, 13 light years long and my favorite color: ultra-violet




The image above may look like a comet, but in fact the stream you see in this image, taken by NASA's ultra-violet space telescope GALEX, is 13 light years long, and is composed of gas thrown off by the red giant star Mira over the last 30,000 years or so as it moves through space at a speed of about 130 km/s.
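
As a quick sanity check on the quoted numbers, a few lines of Python (using only the approximate press-release figures of ~130 km/s and ~30,000 years) show that the speed, age, and tail length are mutually consistent:

```python
# Back-of-the-envelope check that a star moving at ~130 km/s for
# ~30,000 years leaves a trail roughly 13 light years long.
# Figures are the approximate values quoted in the press releases.

speed_km_s = 130.0            # Mira's space velocity
age_years = 30_000.0          # approximate age of the tail
seconds_per_year = 3.156e7
km_per_light_year = 9.461e12

tail_length_km = speed_km_s * age_years * seconds_per_year
tail_length_ly = tail_length_km / km_per_light_year

print(f"Tail length ~ {tail_length_ly:.1f} light years")  # ~13 ly
```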

For more information read the press releases (here, and here with a really nice animation), which do a good job of explaining what is physically happening and how the discovery was made. Some of the images are quite impressive, even from a purely aesthetic point of view.

A BBC story is somewhat less informative, but it does include more quotes from one of the primary scientists involved in this discovery, Mark Siebert (who just happened to be a grad student in the Astro Dept here at Johns Hopkins before moving out west. Hi Mark!).

Mira is a very well studied star and the archetype of a specific class of variable star (Mira variables), but this huge trail of cast-off gas was only discovered in GALEX images taken last year. This kind of discovery is yet another example that illustrates why having telescopes operating at wavelengths other than the optical is vital for advancing our understanding of the Universe. Hubble or ground-based telescopes like Keck and the VLT, with their primarily optical detectors and tiny fields of view, could never have made this discovery.




Almost as impressive as the scientific discovery itself is the effort the JPL and GALEX PR teams put into producing the press releases. You can see six different spins of the story, one after the other. The work on the animations is also first class.

Friday, August 10, 2007

New Homo Erectus and Homo Habilis fossil finds


To get beyond the somewhat distorted press accounts regarding the Spoor et al letter in Nature on "Implications of new early Homo fossils from Ileret, east of Lake Turkana, Kenya" you should read this post at John Hawks' anthropology weblog.

As with much press coverage of scientific issues, something that has been known for many decades (that multiple species of Homo coexisted at the same time, and that the evolutionary tree of the genus Homo is quite bushy and complicated rather than a simple linear ladder of "progress") is being presented as a new discovery. Furthermore, the actual issues discussed in the Spoor letter aren't covered in the press accounts.

The image is a to-scale superposition of the skull of the young adult (or late subadult) Homo Erectus KNM-ER 42700 (one of the subjects of the Spoor et al letter, cranial volume about 700 ml) on top of the largest known Homo Erectus skull (OH 9). A cool image that illustrates the diversity within Homo Erectus.

Just FYI, 90% of modern humans have cranial volumes in the range 1040 to 1595 ml [talk.origins FAQ], with volumes less than 1000 ml being extremely uncommon.

Friday, July 27, 2007

Veto?

Remember that last month the Senate was trying to add a small amount of money to NASA's budget to make up for funding missing from Bush's budget request? That measure passed both the Senate and the House, but do we now have to worry about a Bush veto of H.R. 3093?


The Administration supports the House's full funding for NASA's Exploration Systems and Space Shuttle. However, the Administration does not endorse funding in excess of the request for Aeronautics, Education, and Science...

...
if H.R. 3093 were presented to the President, he would veto the bill.


More on using GPUs to calculate physics

Just to follow up on Monday's post on using PC graphics cards to perform scientific calculations, Ryan Smith at Anandtech has an article on the status of GPU physics and dedicated physics processors for games. Note that "physics" in terms of games is rather limited in scope, largely particle effects and finite element calculations (e.g. the rag-doll motion of dead enemies) to make a game look good.

What comes out of this article is that (unlike the case of the astrophysical N-body calculations discussed previously) the initial attempts by PC games to use physics processors (e.g. the AGEIA PCI card) or GPUs as physics processors (e.g. Havok's separate software package called Havok FX) have difficulty accessing the results of any calculations for purposes other than changing the display:


The second reason, and that which has the greater effect, is a slew of technical details that stem from using Havok FX. Paramount to this is what the GPU camp is calling physics is not what the rest of us would call physics with a straight face. As Havok FX was designed, the physics simulations run on the GPU are not retrievable in a practical manner, as such Havok FX is designed to be used to generate "second-order" physics. Such physics are not related to gameplay and are inserted as eye-candy. A good example of this is Ghost Recon: Advanced Warfighter, which we'll ignore was a PhysX powered title for the moment and focus on the fact that it used the PhysX hardware primarily for extra debris.

The problem with this of course is obvious, and Havok goes through a great deal of trouble in their Havok FX literature to make this clear. The extra eye-candy is nice and it's certainly an interesting solution to bypassing the problem of lots-of-little-things loading down the CPU (although Direct3D 10 has reduced the performance hit of this), but it also means that the GPU can't have any meaningful impact on gameplay. It doesn't make Havok FX entirely useless since eye-candy does serve its purpose, but it's not what most people (ourselves included) envision when we think hardware accelerated physics; we're looking for the next step in interactive physics, not more eye-candy.
Emphasis added.

Note that the CUDA software package used to do the N-body simulations described by Schive et al (astro-ph/0707.2991) is very different from the gaming-related Havok FX discussed in the Anandtech article.

If you've ever worked with parallel processing yourself you'll appreciate that efficiently accessing the results of a calculation performed on a separate processor is the trickiest part of parallel programming, rather than actually getting the calculation to run on that processor. Even if GPU physics processing does take off (and doesn't get forgotten as access to multi-core CPUs increases), I doubt Havok FX will see much real use before approaches closer in nature to CUDA take over.
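
To put some toy numbers on the data-transfer problem, here is a rough sketch in Python of a simple offload cost model (all timings are invented, purely for illustration): shipping work to a co-processor only pays off if the compute speedup outweighs the cost of copying the results back, which is exactly the cost Havok FX avoids by never reading the results back at all.

```python
# Toy model of when offloading work to a co-processor (GPU, physics card)
# actually helps. All numbers are invented, purely for illustration.

def effective_speedup(cpu_time, gpu_time, transfer_time):
    """Speedup of (GPU compute + copying results back) vs. CPU only."""
    return cpu_time / (gpu_time + transfer_time)

cpu_time = 10.0e-3      # 10 ms to do the physics step on the CPU
gpu_time = 0.5e-3       # 0.5 ms to do the same work on the GPU

for transfer_time in (0.1e-3, 1.0e-3, 20.0e-3):
    s = effective_speedup(cpu_time, gpu_time, transfer_time)
    print(f"readback {transfer_time*1e3:5.1f} ms -> speedup {s:5.2f}x")

# With a slow enough readback the "accelerator" is slower than the CPU,
# which is why eye-candy-only physics (never read back) is so tempting.
```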

Just for interest's sake, let us return to the issue of what kind of "physics" is dealt with in games. Here I quote from Havok's own discussion of Havok FX:

What Kinds Of Effects Does Havok FX Enable?

Havok FX enables dynamic in-game effects that are based upon rigid-body collisions and constraints – including debris, smoke, fog, and ultimately simulated liquids - but on a scale that goes well beyond the magnitude and volume of objects normally simulated in CPU-based game-play physics. By performing simulation, collision detection and rendering directly on the GPU, Havok FX avoids the transfer of large amounts of data between the CPU and GPU, enabling a level of performance of object collisions in the 1000’s occurring in real-time, without putting any additional burden on the CPU or otherwise slowing down the game.

How Does Havok FX Work?

Havok FX supports a new type of rigid-body object called a Debris Primitive. A Debris Primitive is a compact representation of a 3D collidable object that can be processed via Shader Model 3.0 (SM3.0) in a very efficient manner. Debris Primitives can be pre-modeled as part of a game's static art content (e.g. custom/textured boulders, space junk, or collateral objects waiting for an explosive charge). They may also be generated on the fly during game play by the CPU, based on the direction and intensity of a force (e.g. brick and stone structure blown apart by a cannon blast). Once generated by the CPU, Debris Primitives can be dispatched fully to the GPU for physical simulation and final rendering - comprising a powerful blending of physics and state-of-the-art shading effects for particulate and large scale phenomenon.

Thursday, July 26, 2007

Sunshine: yet another totally implausible sci-fi film

After reading this review of Danny Boyle and Alex Garland's (28 Days Later) new film Sunshine I will definitely NOT be watching this film, either in the cinema or on DVD, and I recommend you give it a miss too.


Sunshine imagines a near future when the sun is dying and a solar winter has enveloped the earth. To save humanity, an international crew aboard the aptly named Icarus II sets out towards the center of the solar system to deliver a nuclear device to re-ignite the sun.

Leading the expedition is the levelheaded Captain Kaneda (Hiroyuki Sanada), but the Icarus II's secret weapon is astrophysicist Robert Capa (Cillian Murphy), responsible for the ship's payload—a "stellar bomb" containing the earth's remaining supply of uranium and dark matter—whose detonation would create "a big bang on a small scale," as Capa promises, and "a new star born out a dying one."

There are just so many things wrong with this plot that it's impossible to cover them all.

It's also a highly theoretical mission, based on as yet unconfirmed physical theories of supersymmetry. But no matter, Boyle, Garland, and the film's scientific advisor Dr. Brian Cox—who is currently working on the Large Hadron Collider, the world's largest particle accelerator—care less about scientific rigor than scientific metaphor in their film.

Unconfirmed theories of supersymmetry? It sounds like it is based on the random techno-babble of the scientifically illiterate, with absolutely zero theoretical physical basis at all. And are we supposed to believe it's impossible to have metaphors in anything that is actually scientifically accurate?

Forgetting plot machinations for a moment, Sunshine excels most at illuminating the mysterious elemental beauties of the universe.

Argh! How can it illuminate the mysteries of the universe if it's based on absolute nonsense and totally ignores real science? The universe is wondrous, but what we really know about it, how we can work it out, and what is still not known, is the truly amazing stuff.

Making some crap up doesn't illuminate anything; frankly, it obscures the truth and diminishes just how awe-inspiring reality really is. And as a work of fiction it is really lazy. A decent writer could produce a plot that largely adheres to scientific accuracy and is still an interesting work of fiction. Sure, science fiction will go beyond modern science or change aspects of it, but the more implausible and inconsistent you make it, the worse it is.

This is a common problem with most big-budget science fiction films: as science fiction they're really bad. There is a fair bit of really good written science fiction, but it seems like the movie studios want the lowest-common-denominator, special-effects-laden or action-based films their execs can understand. The elegance of either accuracy or a well-crafted story need not apply, it seems.

Monday, July 23, 2007

Using graphics cards to do science

I'm stuck working from home for a second day thanks to a sudden-onset summer cold, but while I back up all my data here is an interesting article from today's astro-ph: astro-ph/0707.2991.

Schive et al, at the National Taiwan University in Taipei, describe using the highly parallel processors on modern GPUs (specifically a cluster of machines with dual nVidia GeForce 8800 GTX graphics cards) to perform astrophysical N-body simulations. A special library and software development kit called CUDA, developed by nVidia, allows programmers to perform computations on the GPU and access the results, instead of just creating graphical output for a monitor.
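
For anyone who hasn't met the problem, the expensive kernel that GRAPE boards and GPUs accelerate is the direct O(N^2) pairwise gravity sum. Here is a minimal pure-Python sketch of that kernel (illustrative only, and certainly not the actual code of Schive et al, who run the equivalent arithmetic across thousands of GPU threads):

```python
# Minimal direct-summation N-body gravity kernel (the O(N^2) loop that
# GRAPE hardware and GPUs accelerate). Purely illustrative, not the
# actual code of Schive et al.
import math
import random

G = 1.0           # gravitational constant in code units
SOFTENING = 1e-2  # Plummer softening to avoid divergent forces

def accelerations(positions, masses):
    """Return the acceleration on each particle from all the others."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            r2 = dx*dx + dy*dy + dz*dz + SOFTENING**2
            inv_r3 = 1.0 / (r2 * math.sqrt(r2))
            acc[i][0] += G * masses[j] * dx * inv_r3
            acc[i][1] += G * masses[j] * dy * inv_r3
            acc[i][2] += G * masses[j] * dz * inv_r3
    return acc

# Tiny example: 100 random particles of equal mass.
random.seed(1)
pos = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(100)]
mass = [1.0 / 100] * 100
a = accelerations(pos, mass)
print("acceleration on particle 0:", a[0])
```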

Compared to the dedicated GRAPE processors that are designed specifically to address this kind of problem in hardware, their PC+GPU cluster performs extremely well (both in performance per unit price and in net teraflops).

This is very exciting, as the increasing capabilities of commodity PC hardware bode well for the future of scientific computing. Back in the mid-1990s, when I started my PhD, the department had a few very expensive Sun SPARC and DEC Alpha workstations that were used by everyone through thin client terminals. By the time I finished my PhD, commodity Pentium and Athlon PCs running Linux offered equivalent or better performance per unit price, and people started having individual machines to themselves* (although the fastest CPUs in absolute terms were still the ultra-expensive Compaq [formerly DEC] Alphas). Beowulf clusters appeared at around the same time, and now clusters based on essentially PC hardware dominate the list of the top 100 supercomputers.

Much of the growth in PC CPU power over the last decade has been driven by multiple factors: PC gaming, the growth of the Internet, the AMD vs. Intel competition, and also ironically the need for fast CPUs to counteract the growing bloat and slothfulness of Microsoft's Windows 98/XP/Vista series.

Now the increasing graphical demands of PC gaming, and the nVidia vs ATI (now part of AMD) market share competition may drive the rate of improvement in scientific computing for the next decade.

* Although the move to semi-individual machines rather than multi-user use was also driven in part by the poor network performance of NFS cross-mounted IDE disks compared to NFS-mounted SCSI drives on the traditional workstations.

Wednesday, July 11, 2007

Missing metals, lost gas, and inadequate dwarfs.

To follow on from the last post, this is another round-up of recent papers and preprints related to galactic winds. This time I'll provide much less commentary.

I will start by noting that it is a commonly held view that if galaxies do lose metals via outflows, then these outflows must be most effective in the lowest mass galaxies, largely based on the plausibility argument that low mass galaxies have shallower gravitational potential wells and hence less energy is required for gas to escape them. In this view metal ejection and winds are only effective in low-mass dwarf galaxies (see e.g. Dekel & Silk 1986, Ferrara and Tolstoy 2000).
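
To put rough numbers on that plausibility argument, here is a small Python sketch using the simplest possible estimate of the escape velocity, v_esc ~ sqrt(2GM/R), with round illustrative masses and radii rather than measurements of any real galaxy:

```python
# Rough illustration of the "shallower potential well" argument:
# escape velocity from a simple point-mass estimate, v_esc = sqrt(2 G M / R).
# The masses and radii below are round illustrative numbers only.
import math

G = 4.30e-6  # gravitational constant in kpc (km/s)^2 / Msun

galaxies = [
    # (label, mass [Msun], characteristic radius [kpc])
    ("dwarf starburst", 1.0e9,   1.0),
    ("M82-like disk",   1.0e10,  3.0),
    ("massive disk",    1.0e11, 10.0),
]

for label, mass, radius in galaxies:
    v_esc = math.sqrt(2.0 * G * mass / radius)
    print(f"{label:16s} M = {mass:8.1e} Msun  v_esc ~ {v_esc:6.0f} km/s")
```

The dwarf does indeed have the lowest escape velocity, which is why the argument sounds plausible; the next paragraph explains why I don't think it actually determines where winds are most effective.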

Although I initially believed this idea while I was a grad student, there really wasn't any good physical reason to believe it. My theoretical work with Ian Stevens confirmed what others (see e.g. De Young and Heckman 1994, or Suchkov et al 1994) had already pointed out: what controls superwind dynamics are the starburst energy and mass injection and the ambient ISM the ejecta interact with, not the galactic gravitational field. Observationally, the most spectacular winds (whether largest, brightest, or fastest) all appear in large, M_galaxy > 1e10 Msun, galaxies and not in the dwarf starbursts. Worse still, the dwarf starbursts have the lowest optical outflow velocities, and in both the optical and X-ray typically show only small bubble- or shell-like structures rather than clear minor-axis outflows.

Having dispensed with background information, let us proceed to the new results:

Bouche et al (2007, Monthly Notices of the Royal Astronomical Society, Volume 378, Issue 2, 525, astro-ph/0703509) find that at redshift z~2.5 about a third of all the heavy elements created by stars up to that point had been ejected from their original birth sites within galaxies. They find that much of this ejection must have occurred in galaxies with blue-band luminosities between 1/10 and 1/3 of the characteristic galaxy blue luminosity (L_B^*) at that redshift.

By way of comparison, L_B^* ~ 6e10 Lsun at z~2 (Lilly et al, 1995, ApJ, 455, 108), and the present-day value is not much different. The classic local superwind galaxies are not dwarf galaxies but small to medium mass late-type galaxies, with L_B upwards of roughly 0.3e10 Lsun. A second recent preprint (astro-ph/0707.1345) uses a very different (and rather more model-dependent) method based on galaxies selected from the SDSS. The authors calculate the fraction of baryons (note: not just metals) lost by galaxies as a function of look-back time and galaxy mass. Although they interpret their results as being consistent with the "classic dwarfs are more important" scenario, these results actually contradict it in several ways. (1) Ejection fractions are both non-zero and non-negligible (10-20%) in galaxies up to 1e12 Msol, (2) the galaxy luminosity function will ensure that the net contribution from all galaxies is largest from objects in the M ~ 1e10 range, not the tiny dwarfs, and (3) the Mac Low and Ferrara simulations of outflows from dwarf galaxies (not actually of starburst strength) showed that gas ejection was difficult and inefficient. Note that metal ejection and gas or mass ejection are distinctly different things.
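
Point (2) is the luminosity (or mass) function weighting argument, which is worth spelling out. A rough Python sketch, using an assumed Schechter-like mass function and made-up ejection fractions that fall with galaxy mass (illustrative values only, not numbers from the preprint), shows that the total ejected mass is dominated by galaxies around 1e10 to 1e11 Msun simply because that is where most of the galaxy mass resides, even though the smallest galaxies eject the largest fraction of their own baryons:

```python
# Rough illustration of the luminosity/mass-function weighting argument.
# The Schechter parameters and the ejection fractions below are assumed,
# illustrative values only, not taken from the papers discussed.
import math

M_STAR = 1.0e11   # Schechter characteristic mass [Msun]
ALPHA = -1.2      # faint-end slope

def mass_per_dex(m):
    """Relative galaxy mass density per decade of galaxy mass."""
    x = m / M_STAR
    return x**(ALPHA + 2.0) * math.exp(-x)

def ejection_fraction(m):
    """Assumed fraction of baryons ejected, falling with galaxy mass."""
    return {1e9: 0.5, 1e10: 0.3, 1e11: 0.15, 1e12: 0.1}[m]

masses = [1e9, 1e10, 1e11, 1e12]
contrib = [mass_per_dex(m) * ejection_fraction(m) for m in masses]
total = sum(contrib)

for m, c in zip(masses, contrib):
    print(f"M = {m:8.1e} Msun : {100 * c / total:5.1f}% of ejected mass")
```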

Finally, still on superwinds, Cooper et al (2007, Ap&SS, currently online only) have reported on their initial 3-D hydrodynamical simulations of a superwind. At present the region covered is only the starburst region itself, although at good resolution. They generate extensive H-alpha filamentation by initially introducing (by hand) a large population of dense clumps within the starburst region. In other respects the models appear to reproduce many of the structures, in particular regions of soft X-ray emission, seen in the older 2-D cylindrically symmetric simulations by myself and others, which is nice.