Re: Barnes-Hut treecode (was Re: Galaxy Interaction Simulations)

Steinn Sigurdsson (steinly@topaz.ucsc.edu)
9 Mar 94 12:06:25

In article <1994Mar9.005526.28636@jarvis.csri.toronto.edu> wayne@csri.toronto.edu (Wayne Hayes) writes:

steinly@topaz.ucsc.edu (Steinn Sigurdsson) writes:
>Systems for which "N" approaches the actual number of particles
>do exist. Trivially, N=3-10 are instances, (actually 1-10 is a better
>range :-) - certainly the solar system people use the actual "N"
>by and large.

Yes, and they usually use specially-designed code with *very* high
order integrators (like 12th or 13th order), unlike large-N researchers
who are often content with 4th or even 2nd order integrators. That's

Try leapfrog, it's usually good enough and it's even symplectic :-)

because we're interested in fundamentally different things: solar
system researchers want to integrate paths of specific objects very
accurately over a very long time; useful galactic dynamics research
can be done by just following the essentially "fluid" motion of the
huge number of particles. (This isn't a flame; I just wanted to
make it clear for interested bystanders.)

To paraphrase Jerry Ostriker, we take a particle distribution, approximate
it as a fluid, and then model it by creating a realisation of the
fluid as a set of particles :-)
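
Since leapfrog was mentioned: here is a minimal sketch of one
kick-drift-kick step, in Python, with a placeholder accel() routine
supplying the accelerations. It is just to show how simple a
second-order, symplectic integrator is, not anyone's production code.

    import numpy as np

    def leapfrog_kdk(pos, vel, accel, dt):
        """One kick-drift-kick (KDK) leapfrog step.
        Symplectic, time-reversible, second-order accurate."""
        vel = vel + 0.5 * dt * accel(pos)      # half kick
        pos = pos + dt * vel                   # full drift
        vel = vel + 0.5 * dt * accel(pos)      # half kick, new accelerations
        return pos, vel

    # toy usage: one particle in a harmonic potential, a = -x
    x, v = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
    for _ in range(1000):
        x, v = leapfrog_kdk(x, v, lambda pos: -pos, 0.01)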

>To model globular clusters with unsoftened particles (ie including
>2bod relaxation) is not possible,

You mean not possible with current machines/techniques, or fundamentally
impossible, even in principle? I was under the impression that it's
only a matter of not enough CPU cycles to accurately model large-N
systems without softening.

Well, in a very real sense modeling large-N collisional systems
is intractable. There are two problems: the true trajectories
diverge from the calculated ones, and you can show that if there
are bound subsystems then at any resolution (in the point-mass
approximation) there will be a perturbation of a bound system by an
unbound particle that is significant at your resolution scale but
can't be accurately modeled at that scale. There is, however, good
reason to believe neither issue matters all that much.
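
(To put a toy number on the first of those problems: integrate the
same few-body system twice, with one coordinate perturbed at the
level of roundoff, and watch the two "solutions" separate. The
initial conditions below are made up and the units are G = 1;
nothing here is anyone's production code.)

    import numpy as np

    def accel(pos, mass):
        """Direct-sum point-mass accelerations, G = 1 units."""
        a = np.zeros_like(pos)
        for i in range(len(mass)):
            for j in range(len(mass)):
                if i != j:
                    d = pos[j] - pos[i]
                    a[i] += mass[j] * d / np.dot(d, d) ** 1.5
        return a

    def evolve(pos, vel, mass, dt, nsteps):
        """Kick-drift-kick leapfrog for nsteps steps."""
        for _ in range(nsteps):
            vel += 0.5 * dt * accel(pos, mass)
            pos += dt * vel
            vel += 0.5 * dt * accel(pos, mass)
        return pos, vel

    mass = np.array([1.0, 1.0, 1.0])
    pos = np.array([[1.0, 0.0, 0.0], [-0.5, 0.8, 0.0], [-0.5, -0.8, 0.0]])
    vel = np.array([[0.0, 0.5, 0.0], [-0.4, -0.3, 0.0], [0.4, -0.2, 0.0]])

    pos2 = pos.copy()
    pos2[0, 0] += 1e-10                       # tiny perturbation
    p1, _ = evolve(pos.copy(), vel.copy(), mass, 0.001, 20000)
    p2, _ = evolve(pos2, vel.copy(), mass, 0.001, 20000)
    print("max position difference:", np.max(np.abs(p1 - p2)))
    # the gap between the two runs typically grows by many orders of
    # magnitude over a few tens of dynamical times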

>the HARP project is probably the closest. For those interested in
>collisionless dynamics, which still allows modeling of some
>interesting aspects of globulars, I have a working code that
>has done a full realisation of globular clusters, anything from
>Pal 1 to Omega Cen for, say, 1000 dynamical times - but it doesn't
>include two body relaxation and has limited application.

You lost me here. I thought globulars were highly collisional? How
can a collisionless code model globulars? (For the uninitiated:
``collisional'' essentially means ``high density'', where the paths

Eeek, that's a bit too much of an oversimplification; the velocity
distribution is a rather important factor here! Interested people
go read Binney & Tremaine...

of individual particles can be substantially altered by close passes
of neighbors; this as opposed to ``collisionless'' systems, where the
particle trajectories are governed almost solely by some global
potential. The solar system is an extreme example of a collisionless
system.)

There are perfectly good ways to model collisional systems with
collisionless codes. The gross properties of the system change
due to "collisions" on a "relaxation" time scale, and this time
scale can be longer than the physical time scales of interest
(in fact a "collisionless" system is simply one in which this
time scale is longer than any other physical time scale of
interest). So that leaves a window whereby you can use
"collisionless" models to model the physics of collisional systems.
(A simple example is the tidal shocking of a globular
cluster crossing the galactic disk; the response to
the shock occurs on a time scale much shorter than the
relaxation time scale for nearly all globulars.)
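
(To put rough numbers on that window, using the standard
order-of-magnitude estimate from Binney & Tremaine,
t_relax ~ 0.1 N / ln N * t_cross; the figures below are
illustrative only.)

    import math

    def relaxation_time(n_stars, t_cross):
        """Two-body relaxation time, order-of-magnitude estimate:
        t_relax ~ 0.1 * N / ln(N) * t_cross."""
        return 0.1 * n_stars / math.log(n_stars) * t_cross

    N = 1.0e5           # stars in a typical globular cluster
    t_cross = 1.0e6     # yr, of order a crossing time
    print("t_relax ~ %.1e yr" % relaxation_time(N, t_cross))
    # ~ 1e9 yr: much longer than the ~Myr response to a disk shock,
    # which is why a collisionless code can follow the shock.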

>We're in a race with the cosmology code people for the largest N,
>our (non-interacting) galaxy simulations have hit N=10^7 and we'll
>get to N=10^8 in a matter of months.

Wow! Surely this is *with* softening, though? When I said "ideal"
N-body, I meant that each particle in the simulation is representative
of one particle in the system -- thus you wouldn't use softening.

Well, the particular scheme is not a "particle-particle"
scheme, so softening per se does not enter at all. The pair
interactions are averaged over; it really is a "collisionless"
scheme. The fact that there is a particle per body in the
system to be modeled does not preclude the use of "softening":
if "collisional" encounters are not going to be important to
the dynamics of interest then there is little point in tracking
them at enormous expense (except of course to verify they are
not important :-)

I would disagree with your contention that one shouldn't
use softening for "ideal" N-body. Unsoftened N-body is an
interesting problem, but for physical systems one _should_ deal
separately with "short enough" separations; at least in
astrophysics, most of the interesting systems either have finite
size (eg stellar radius) or other physics becomes important on
short length scales (eg GR).
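
(For concreteness: the simplest, "Plummer", form of softening just
replaces r^2 with r^2 + eps^2 in the pair force, so close encounters
are smoothed over rather than resolved. A generic direct-sum sketch
in G = 1 units; it is not the averaging scheme described above.)

    import numpy as np

    def softened_accel(pos, mass, eps):
        """Direct-sum accelerations with Plummer softening: the pair
        force goes as d / (|d|^2 + eps^2)^(3/2) instead of d / |d|^3.
        Assumes eps > 0, so the i = i term is safely non-singular."""
        a = np.zeros_like(pos)
        for i in range(len(mass)):
            d = pos - pos[i]                   # offsets to all bodies
            r2 = (d * d).sum(axis=1) + eps * eps
            w = mass / r2 ** 1.5
            w[i] = 0.0                         # no self-force
            a[i] = (w[:, None] * d).sum(axis=0)
        return a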

For direct particle-particle interaction N-body simulations,
it is possible to follow a few thousand (unsoftened) particles
for a few hundred thousand dynamical times. This will increase
by a factor of a few in the next year, just about bringing
globular clusters into reach by about 1996.

* Steinn Sigurdsson Lick Observatory *
* steinly@lick.ucsc.edu "standard disclaimer" *
* The laws of gravity are very,very strict *
* And you're just bending them for your own benefit - B.B. 1988*