Friday, July 27, 2007

Veto?

Remember that last month the Senate was trying to add a small amount of money to NASA's budget to make up for funding missing from Bush's budget request? That measure passed both the Senate and the House, but now do we have to worry about a Bush veto of H.R. 3093?


The Administration supports the House's full funding for NASA's Exploration Systems and Space Shuttle. However, the Administration does not endorse funding in excess of the request for Aeronautics, Education, and Science...

...
if H.R. 3093 were presented to the President, he would veto the bill.


More on using GPUs to calculate physics

Just to follow up on Monday's post on using PC graphics cards to perform scientific calculations: Ryan Smith at Anandtech has an article on the status of GPU physics and dedicated physics processors for games. Note that "physics" in terms of games is rather limited in scope, largely particle effects and finite element calculations (e.g. the rag-doll motion of dead enemies) to make a game look good.

What comes out of this article is that (unlike the astrophysical N-body calculations discussed previously) the initial attempts for PC games to use dedicated physics processors (e.g. the AGEIA PCI card) or GPUs as physics processors (e.g. Havok's separate software package, Havok FX) have difficulty accessing the results of any calculation for anything other than updating the display:


The second reason, and that which has the greater effect, is a slew of technical details that stem from using Havok FX. Paramount to this is what the GPU camp is calling physics is not what the rest of us would call physics with a straight face. As Havok FX was designed, the physics simulations run on the GPU are not retrievable in a practical manner, as such Havok FX is designed to be used to generate "second-order" physics. Such physics are not related to gameplay and are inserted as eye-candy. A good example of this is Ghost Recon: Advanced Warfighter, which we'll ignore was a PhysX powered title for the moment and focus on the fact that it used the PhysX hardware primarily for extra debris.

The problem with this of course is obvious, and Havok goes through a great deal of trouble in their Havok FX literature to make this clear. The extra eye-candy is nice and it's certainly an interesting solution to bypassing the problem of lots-of-little-things loading down the CPU (although Direct3D 10 has reduced the performance hit of this), but it also means that the GPU can't have any meaningful impact on gameplay. It doesn't make Havok FX entirely useless since eye-candy does serve its purpose, but it's not what most people (ourselves included) envision when we think hardware accelerated physics; we're looking for the next step in interactive physics, not more eye-candy.
Emphasis added.

Note that the CUDA software package used to do the N-body simulations described by Schive et al (astro-ph/0707.2991) is very different from the gaming-related Havok FX discussed in the Anandtech article.

If you've ever worked with parallel processing yourself you'll appreciate that efficiently accessing the results of a calculation performed on a separate processor is the trickiest part of parallel programming, rather than simply getting a calculation to run on the separate processor. Even if GPU physics processing does take off (and isn't forgotten as access to multi-core CPUs increases), I doubt Havok FX will see much real use before approaches closer in nature to CUDA take over.
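To make that read-back step concrete, here is a minimal sketch in CUDA (assuming nVidia's toolkit; the trivial kernel and array are purely illustrative, not anything from the papers or products discussed above). The final copy back to the CPU is exactly the step that Havok FX-style "second-order" physics avoids, and that CUDA makes routine:

// Minimal sketch: run a calculation on the GPU, then copy the results
// back to the CPU where game logic or science code can actually use them.
// Compile with nvcc; the kernel itself is a placeholder for real physics.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *x, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        x[i] *= factor;    // the "physics" happens in device (GPU) memory
}

int main(void)
{
    const int n = 1024;
    float host_x[n];
    for (int i = 0; i < n; ++i)
        host_x[i] = (float)i;

    float *dev_x;
    cudaMalloc(&dev_x, n * sizeof(float));
    cudaMemcpy(dev_x, host_x, n * sizeof(float), cudaMemcpyHostToDevice);

    scale<<<(n + 255) / 256, 256>>>(dev_x, 2.0f, n);

    // The crucial step: retrieving the results from the GPU. Havok FX-style
    // eye-candy physics never does this; CUDA is built around it.
    cudaMemcpy(host_x, dev_x, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev_x);

    printf("x[10] = %f\n", host_x[10]);   // results now usable on the CPU
    return 0;
}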

Just for interest's sake, let us return to the issue of what kind of "physics" is dealt with in games. Here I quote from Havok's own discussion of Havok FX:

What Kinds Of Effects Does Havok FX Enable?

Havok FX enables dynamic in-game effects that are based upon rigid-body collisions and constraints – including debris, smoke, fog, and ultimately simulated liquids - but on a scale that goes well beyond the magnitude and volume of objects normally simulated in CPU-based game-play physics. By performing simulation, collision detection and rendering directly on the GPU, Havok FX avoids the transfer of large amounts of data between the CPU and GPU, enabling a level of performance of object collisions in the 1000’s occurring in real-time, without putting any additional burden on the CPU or otherwise slowing down the game.

How Does Havok FX Work?

Havok FX supports a new type of rigid-body object called a Debris Primitive. A Debris Primitive is a compact representation of a 3D collidable object that can be processed via Shader Model 3.0 (SM3.0) in a very efficient manner. Debris Primitives can be pre-modeled as part of a game's static art content (e.g. custom/textured boulders, space junk, or collateral objects waiting for an explosive charge). They may also be generated on the fly during game play by the CPU, based on the direction and intensity of a force (e.g. brick and stone structure blown apart by a cannon blast). Once generated by the CPU, Debris Primitives can be dispatched fully to the GPU for physical simulation and final rendering - comprising a powerful blending of physics and state-of-the-art shading effects for particulate and large scale phenomenon.

Thursday, July 26, 2007

Sunshine: yet another totally implausible sci-fi film

After reading this review of Danny Boyle and Alex Garland's (28 Days Later) new film Sunshine, I will definitely NOT be watching it, either in the cinema or on DVD, and I recommend you give it a miss too.


Sunshine imagines a near future when the sun is dying and a solar winter has enveloped the earth. To save humanity, an international crew aboard the aptly named Icarus II sets out towards the center of the solar system to deliver a nuclear device to re-ignite the sun.

Leading the expedition is the levelheaded Captain Kaneda (Hiroyuki Sanada), but the Icarus II's secret weapon is astrophysicist Robert Capa (Cillian Murphy), responsible for the ship's payload—a "stellar bomb" containing the earth's remaining supply of uranium and dark matter—whose detonation would create "a big bang on a small scale," as Capa promises, and "a new star born out a dying one."

There are just so many things wrong with this plot that it's impossible to cover them all.

It's also a highly theoretical mission, based on as yet unconfirmed physical theories of supersymmetry. But no matter, Boyle, Garland, and the film's scientific advisor Dr. Brian Cox—who is currently working on the Large Hadron Collider, the world's largest particle accelerator—care less about scientific rigor than scientific metaphor in their film.

Unconfirmed theories of supersymmetry? It sounds more like it is based on the random techno-babble of the scientifically illiterate, with absolutely zero theoretical basis in physics. And are we supposed to believe it's impossible to have metaphors in anything that is actually scientifically accurate?

Forgetting plot machinations for a moment, Sunshine excels most at illuminating the mysterious elemental beauties of the universe.

Argh! How can it illuminate the mysteries of the universe if it's based on absolute nonsense and totally ignores real science? The universe is wondrous, but what we really know about it, how we worked it out, and what is still unknown, is the truly amazing stuff.

Making some crap up doesn't illuminate anything; frankly, it obscures the truth and diminishes just how awe-inspiring reality really is. And as a work of fiction it is really lazy. A decent writer could produce a plot that largely adheres to scientific accuracy and is still an interesting work of fiction. Sure, science fiction will go beyond modern science or change aspects of it, but the more implausible and inconsistent you make it, the worse it is.

This is a common problem with most big-budget science fiction films: as science fiction, they're really bad. There is a fair bit of really good written science fiction, but it seems the movie studios want the lowest-common-denominator, special-effects-laden or action-based fare their execs can understand. The elegance of either accuracy or a well-crafted story need not apply, it seems.

Monday, July 23, 2007

Using graphics cards to do science

I'm stuck working from home for a second day thanks to a sudden-onset summer cold, but while I back up all my data, here is an interesting article from today's astro-ph: astro-ph/0707.2991.

Schive et al, at the National Taiwan University in Taipei, describe using the highly parallel processors on modern GPUs (specifically a cluster of machines with dual nVidia GeForce 8800 GTX graphics cards) to perform astrophysical N-body simulations. A special library and software development kit called CUDA, developed by nVidia, allows programmers to perform computations on the GPU and access the results, instead of creating graphical output for a monitor.
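For a flavour of what the heart of such a code looks like, here is a hypothetical sketch of the standard direct-summation gravitational kernel in CUDA (the kernel name, the softening value, and the G = 1 units are my own illustrative choices, not taken from the Schive et al paper; real codes also tile the inner loop through fast shared memory for speed):

// Direct-summation N-body kernel: one GPU thread computes the
// gravitational acceleration on one particle by summing over all others.
__global__ void nbody_accel(const float4 *pos,   // x, y, z, mass
                            float3 *acc,
                            int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    const float eps2 = 1.0e-6f;   // softening, avoids divergence as r -> 0
    float4 pi = pos[i];
    float3 ai = make_float3(0.0f, 0.0f, 0.0f);

    // O(N) loop per particle; the GPU runs thousands of these threads
    // concurrently, which is where the O(N^2) sum gets its parallelism.
    for (int j = 0; j < n; ++j) {
        float4 pj = pos[j];
        float dx = pj.x - pi.x;
        float dy = pj.y - pi.y;
        float dz = pj.z - pi.z;
        float r2 = dx*dx + dy*dy + dz*dz + eps2;
        float inv_r = rsqrtf(r2);
        float f = pj.w * inv_r * inv_r * inv_r;   // m_j / r^3, with G = 1
        ai.x += f * dx;
        ai.y += f * dy;
        ai.z += f * dz;
    }
    acc[i] = ai;
}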

Compared to the dedicated GRAPE processors that are designed specifically to address this kind of problem in hardware, their PC+GPU cluster performs extremely well (both in performance per unit price and in net teraflops).

This is very exciting, as the increasing capabilities of commodity PC hardware bode well for the future of scientific computing. Back in the mid-1990s when I started my PhD, the department had a few very expensive Sun Sparc and DEC Alpha workstations that were used by everyone through thin-client terminals. By the time I finished my PhD, commodity Pentium and Athlon PCs running Linux offered equivalent or better performance per unit price, and people started having individual machines to themselves* (although the fastest CPUs in absolute terms were still the ultra-expensive Compaq [formerly DEC] Alphas). Beowulf clusters appeared at the same time, and now clusters based on essentially PC hardware dominate the list of the top 100 supercomputers.

Much of the growth in PC CPU power over the last decade has been driven by multiple factors: PC gaming, the growth of the Internet, the AMD vs. Intel competition, and also, ironically, the need for fast CPUs to counteract the growing bloat and slothfulness of Microsoft's Windows 98/XP/Vista series.

Now the increasing graphical demands of PC gaming, and the nVidia vs. ATI (now part of AMD) market-share competition, may drive the rate of improvement in scientific computing for the next decade.

* Although the move to semi-individual machines rather than multi-user use was also driven in part by the poor network performance of NFS cross-mounted IDE disks compared to the NFS-mounted SCSI drives on the traditional workstations.

Wednesday, July 11, 2007

Missing metals, lost gas, and inadequate dwarfs.

To follow on from the last post, this is another round-up of recent papers and preprints related to galactic winds. This time I'll provide much less commentary.

I will start by noting that it is a commonly held view that if galaxies do lose metals via outflows, then those outflows must be most effective in the lowest mass galaxies, largely based on the plausibility argument that low mass galaxies have shallower gravitational potential wells and hence less energy is required for gas to escape. In this view metal ejection and winds are only effective in dwarf galaxies, below some threshold galaxy mass (Dekel & Silk 1986, Ferrara & Tolstoy 2000).
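To see the scaling behind that plausibility argument (a standard textbook relation, not anything specific to the papers cited here), the escape velocity from radius R in a galaxy of mass M_galaxy is:

% Escape velocity; lower-mass galaxies have shallower potential wells,
% hence lower v_esc, which is the basis of the "dwarfs lose gas most
% easily" argument.
v_{\rm esc} = \sqrt{ \frac{2 \, G \, M_{\rm galaxy}}{R} }

For illustrative round numbers, M_galaxy = 1e9 Msun within R = 1 kpc gives v_esc ~ 90 km/s, while M_galaxy = 1e11 Msun within R = 10 kpc gives v_esc ~ 290 km/s.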

Although I initially believed this idea while I was a grad student, there really wasn't any good physical reason to believe it. My theoretical work with Ian Stevens confirmed what others (see e.g. De Young & Heckman 1994, or Suchkov et al 1994) had already pointed out: what controls superwind dynamics are the starburst's energy and mass injection and the ambient ISM the ejecta interact with, not the galactic gravitational field. Observationally, the most spectacular winds (whether the largest, brightest, or fastest) all appear in large, M_galaxy > 1e10 Msun, galaxies and not in the dwarf starbursts. Worse still, the dwarf starbursts have the lowest optical outflow velocities, and in both the optical and the X-ray they typically show only small bubble- or shell-like structures rather than clear minor-axis outflows.

Having dispensed with background information, let us proceed to the new results:

Bouche et al (2007, MNRAS, 378, 525; astro-ph/0703509) find that by redshift z~2.5 about a third of all the heavy elements created by stars up to that point had been ejected from their original birth sites within galaxies. They find that much of this ejection must have occurred in galaxies with blue-band luminosities between 1/10 and 1/3 of the characteristic galaxy luminosity L_B^* at that redshift.

By way of comparison, L_B^* ~ 7e10 Lsun at z~2 and is not much different now, at L_B^* ~ 6e10 Lsun (Lilly et al 1995, ApJ, 455, 108). The classic local superwind galaxies, which are not dwarf galaxies but small to medium mass late-type galaxies, have L_B > 0.3e10 Lsun.

A second study (astro-ph/0707.1345) uses a very different (and rather more model-dependent) method, based on galaxies selected from the SDSS. They calculate the fraction of baryons (note: not just metals) lost by galaxies as a function of look-back time and galaxy mass. Although they interpret their results as being consistent with the classic "dwarfs are more important" scenario, these results actually contradict it in several ways: (1) the ejected fractions are both non-zero and non-negligible (10-20%) in galaxies up to 1e12 Msun; (2) the galaxy luminosity function ensures that the net contribution from all galaxies will be largest for objects in the M ~ 1e10 Msun range, not the tiny dwarfs; and (3) the Mac Low & Ferrara simulations of outflows from dwarf galaxies (not actually of starburst strength) showed that gas ejection was difficult and inefficient. Note that metal ejection and gas or mass ejection are distinctly different things.
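To make point (2) concrete, here is the weighting involved, in schematic form (my notation, not taken from the preprint):

% phi(M) dM : comoving number density of galaxies of mass M to M + dM
% f_ej(M)   : fraction of a galaxy's gas mass ejected (10-20% up to 1e12 Msun)
M_{\rm ejected,\,total} \propto \int \phi(M) \, f_{\rm ej}(M) \, M \, {\rm d}M

Because phi(M) M peaks near the knee of the galaxy mass function, a roughly constant f_ej means the integral is dominated by ordinary M ~ 1e10 to 1e11 Msun galaxies rather than by the far more numerous but far less massive dwarfs.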

Finally, still on superwinds, Cooper et al (2007, Ap&SS, currently online only) have reported on their initial 3-D hydrodynamical simulations of a superwind. At present the region covered is only the starburst region itself, although at good resolution. They generate extensive H-alpha filamentation by initially introducing (by hand) a large population of dense clumps within the starburst region. In other respects the models appear to reproduce many of the structures, in particular regions of soft X-ray emission, seen in the older 2-D cylindrically symmetric simulations by myself and others, which is nice.