
Re: [Swarm-Modelling] Floating point arithmetic


From: James Marshall
Subject: Re: [Swarm-Modelling] Floating point arithmetic
Date: Tue, 03 May 2005 14:23:36 +0100
User-agent: Mozilla Thunderbird 1.0.2 (X11/20050317)

Hi Russell,
  define serious production work!
With Repast I think there is quite a lot of Java modelling being done... I made the switch to Repast from Swarm some time ago. A lot of the modelling people want to do is for much smaller-scale models than the kind you probably use at UNSW's HPC unit, i.e. models where the sensitivity and robustness analyses can often be done by running a batch script overnight on a desktop PC.

I even did a larger model (which ran via batch parallelism on Imperial College's HPC machines) for a consultancy project in Java Swarm, my main motivation being ease of model construction and maintenance for the other people on the project, who didn't have strong software engineering, C++, etc. backgrounds. There were some performance issues to work around, but I felt efficiency of modelling was more important than runtime efficiency. (I have to say, if I'd already tried Repast before starting that project, I probably would have chosen it over Java Swarm for the same reason.)

But Gary's work on floating point precision in modelling is very sobering, and he's quite right to point out Java's inadequate support in this area. I should definitely be revising my practices after reading Gary's work, but I have to confess it has slipped my mind!
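For anyone who wants to see the sort of pitfall in question, here is a quick self-contained sketch of my own (not Gary's example, and the class and method names are just made up for illustration): naive floating-point accumulation in Java drifts noticeably over millions of terms, while a compensated (Kahan) summation keeps the error down.

    // Illustration only: naive vs. compensated (Kahan) summation in Java.
    public class FpAccumulationDemo {

        // Naive left-to-right summation: rounding error grows with the number of terms.
        static double naiveSum(double[] xs) {
            double s = 0.0;
            for (double x : xs) s += x;
            return s;
        }

        // Kahan compensated summation: carries a running correction term c.
        static double kahanSum(double[] xs) {
            double sum = 0.0, c = 0.0;
            for (double x : xs) {
                double y = x - c;      // apply the correction from the previous step
                double t = sum + y;    // low-order bits of y may be lost here...
                c = (t - sum) - y;     // ...so recover them into c for the next step
                sum = t;
            }
            return sum;
        }

        public static void main(String[] args) {
            int n = 10_000_000;
            double[] xs = new double[n];
            java.util.Arrays.fill(xs, 0.1);   // 0.1 is not exactly representable in binary
            double reference = n * 0.1;       // reference value, itself rounded once
            System.out.println("naive error: " + Math.abs(naiveSum(xs) - reference));
            System.out.println("kahan error: " + Math.abs(kahanSum(xs) - reference));
        }
    }

Running it shows the naive sum off by orders of magnitude more than the compensated one, which is exactly the kind of thing that can quietly bias long simulation runs.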
  Regards,
        James

Russell Standish wrote:
On Sun, May 01, 2005 at 06:40:45PM -0600, Marcus G. Daniels wrote:

I see the two worlds (ahead-of-time compilation and just-in-time compilation) coming together, as the crucial issue for future performance of complex codes will have as much to do with effective dynamic code partitioning over independent compute engines (finding parallelism) and identification of runtime bottlenecks like out-of-cache memory access. Expensive ahead-of-time analysis and mapping of user code to the CPU architecture(s) in a system will still be important, but since the dynamics of programs can change over time (e.g. as in an agent-based simulation), it will also be useful to have system support to do things like adaptively arrange the access pattern over a working set to fit in cache.


This is a fair comment, and I hope this does come to pass. The current
"two cultures" situation is a bit of a pain.


1) I learnt C++ in 1993/4, a decade after the language came out. At
that time, it was completely unknown in computational science. Even
C was considered a novelty, barely registering in a world of
Fortran 77.


In contrast, Java took off relatively quickly...


In computational science, Java has not taken off at all. I don't know
of a single computational scientist using Java for serious production
work. There are some pedagogical examples written in Java, and
occasionally these might provide a research outcome too, but that is
often incidental. Usually, the models need to be rewritten in C++ or
Fortran to be serious research tools.

Matlab stands in a similar role to Java. It is used for prototyping,
and for some problems the prototype will ultimately provide the
answer being sought. Often, though, the Matlab code needs to be
rewritten, usually in Fortran 90.

A decent optimising, autoparallelising Matlab compiler could be a real
boon. I think the problem is that Matlab is basically a product, not a
language standard.

Cheers




--
Dr James A. R. Marshall
Department of Computer Science
University of Bristol
http://www.cs.bris.ac.uk/home/marshall

