Re: [swarm-hackers] Swarm on MacOS 10.6 Snow Leopard


From: Scott Christley
Subject: Re: [swarm-hackers] Swarm on MacOS 10.6 Snow Leopard
Date: Fri, 18 Sep 2009 12:32:09 -0700

Yes, it looks very interesting. It isn't clear to me how much GNU needs to track the changes. I know, for example, that Apple added new keywords (exceptions, properties) to the ObjC 2.0 language; the parsing is there in GCC, but I don't believe the support in the GNU runtime has been written yet.

OpenCL struck me as well, especially now that it has on-the-fly compilation/linking/loading (I've been wanting this for years!). After SwarmFest, I went head over heels for GPU programming, especially after seeing that I am getting over a 100x speedup with my PDE code. I have my models defined in human-readable form, from which I used to run CPU code, but now I generate GPU code. Though I have to do it in two steps: run a program to generate GPU code from the model, then compile and link everything together. It would be nice to have just one step. Beyond that, generic GPU code is not the most efficient or practical; depending on the size and complexity of the model, different GPU code needs to be produced. Mostly this is to get around limitations of the GPU programming paradigm, since you have to be more careful about the software -> hardware mapping; who knows, maybe someday that stuff will go away. For instance, I wonder how much it would hurt GPU performance if it incorporated a stack for function calls.
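To make that concrete, here is roughly what the one-step path looks like with the OpenCL host API (just a minimal sketch; the kernel name step_pde and its source string are placeholders for whatever the model-to-GPU generator would emit):

#include <OpenCL/opencl.h>
#include <stdio.h>

int main(void)
{
    /* Hypothetical generated kernel source; in practice this string would
       come from the model-to-GPU code generator. */
    const char *src =
        "__kernel void step_pde(__global float *u) { "
        "  size_t i = get_global_id(0); "
        "  u[i] += 0.1f; "
        "}";

    cl_int err;
    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);

    /* This is the on-the-fly step: the driver compiles and links at run time. */
    err = clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    if (err != CL_SUCCESS) {
        char log[4096];
        clGetProgramBuildInfo(prog, device, CL_PROGRAM_BUILD_LOG,
                              sizeof log, log, NULL);
        fprintf(stderr, "build failed:\n%s\n", log);
        return 1;
    }

    cl_kernel kernel = clCreateKernel(prog, "step_pde", &err);
    /* ... set args with clSetKernelArg, enqueue with clEnqueueNDRangeKernel ... */

    clReleaseKernel(kernel);
    clReleaseProgram(prog);
    clReleaseContext(ctx);
    return 0;
}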

I especially like this with tools like MetaABM where code can be generated on the fly. Not so much for strict ABM but more for hybrid models where you have different methods that need to be integrated, to get the most efficient code, you need to intertwine the different methods. For example, I'm working on a model with discrete cells in 3D space that move, each cell has a gene network (ODE) and that network is tied to cell-cell neighbor interaction, so it is not really possible to separate out the two pieces, they need to be integrated together into the same code. I suppose the big question is if these "model integration links" can be determined automatically to know what code to produce. For example, if my gene network wasn't based on cell- cell interaction, cell movement and the gene network could be effectively separate.
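Here is a toy sketch of what I mean by intertwining the two methods; the names (Cell, neighbor_signal, the forward-Euler step) are made up for illustration, but the point is that the ODE right-hand side reads neighbor state inside the same update loop:

#include <stddef.h>

typedef struct Cell {
    double pos[3];           /* position in 3D space */
    double gene[4];          /* gene-network state (ODE variables) */
    struct Cell *nbr[6];     /* current spatial neighbors */
    size_t nnbr;
} Cell;

static double neighbor_signal(const Cell *c, int g)
{
    /* cell-cell coupling term: average of the neighbors' gene g */
    double s = 0.0;
    for (size_t i = 0; i < c->nnbr; i++)
        s += c->nbr[i]->gene[g];
    return c->nnbr ? s / c->nnbr : 0.0;
}

static void step(Cell *cells, size_t n, double dt)
{
    for (size_t i = 0; i < n; i++) {
        Cell *c = &cells[i];
        for (int g = 0; g < 4; g++) {
            /* forward-Euler ODE step whose right-hand side mixes intracellular
               dynamics with the neighbor-dependent coupling term */
            double dgdt = -c->gene[g] + neighbor_signal(c, g);
            c->gene[g] += dt * dgdt;
        }
        /* ... cell movement would update c->pos (and the neighbor lists) here ... */
    }
}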

Marcus, have you looked at Cappuccino at all? What do you think about these web browser frameworks for building high-functionality apps running in the browser? I'm becoming more and more fascinated with the idea of a dynamic GUI experience for constructing, probing, and analyzing models, residing in the web browser, with a connection to a backend server running the simulation code.

cheers
Scott

On Sep 18, 2009, at 6:36 AM, Marcus G. Daniels wrote:

It occurs to me that Grand Central Dispatch is getting close to providing parallelized scheduling. For example, `blocks' are like FCall objects, and the different priority queues are like different schedules. Perhaps it would be worth looking at the GCD library source code to see what it would take to attach priorities to each and every block?
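For reference, a minimal sketch of that analogy with GCD on Snow Leopard: each block is a unit of work in the spirit of an FCall, and the global priority queues stand in for schedules at different priorities (the printf payloads are just placeholders):

#include <dispatch/dispatch.h>
#include <stdio.h>

int main(void)
{
    /* Two global queues play the role of two schedules at different priorities;
       each block is a unit of work, loosely analogous to an FCall. */
    dispatch_queue_t high = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_queue_t low  = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0);
    dispatch_group_t group = dispatch_group_create();

    dispatch_group_async(group, high, ^{ printf("high-priority step\n"); });
    dispatch_group_async(group, low,  ^{ printf("low-priority step\n"); });

    /* Wait for both "schedules" to drain before exiting. */
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    dispatch_release(group);
    return 0;
}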


