
From: Konstantinos Poulios
Subject: Re: [Getfem-users] Using getfem++ with the parallel version of superlu (SUPERLU_MT)
Date: Wed, 25 Jan 2017 16:54:54 +0100

Dear Natanael Lanz,

GetFEM++ can currently use MUMPS as a parallel direct solver if compiled with --enable-paralevel=2; see http://download.gna.org/getfem/html/homepage/userdoc/parallel.html .
I guess an interface for the parallel version of SuperLU would look similar to the existing interface for MUMPS. However, I am wondering why not just use MUMPS? As far as I understand, it is the most mature of the sparse direct solvers that one can get for free.
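For reference, a minimal sketch of the parallel build mentioned above. Only --enable-paralevel=2 comes from this thread; the MPI compiler wrappers are an assumption and depend on the local MPI/MUMPS installation, so check ./configure --help for the exact options your GetFEM++ version provides.

```shell
# Sketch: parallel GetFEM++ build against MUMPS (assumes MPI and MUMPS
# are installed; compiler wrappers mpicc/mpicxx are placeholders for
# the local toolchain).
./configure CC=mpicc CXX=mpicxx --enable-paralevel=2
make && make install
```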

Best regards
Kostas

On Wed, Jan 25, 2017 at 4:29 PM, Lanz Natanael <address@hidden> wrote:

Hello,

 

Has anybody ever tried to use GetFEM with a different version of SuperLU than the included one?

 

I tried to compile GetFEM with the SuperLU_MT version for parallel computation by disabling the bundled SuperLU with the option --disable-superlu and giving the path to the SuperLU_MT libraries in SUPERLU_LIBS before compiling, but that always gives an error.
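Roughly what was attempted, as a sketch: building against an external SuperLU_MT instead of the bundled SuperLU. The install prefix and exact link flags are placeholders for the local setup, not values from this thread.

```shell
# Sketch: disable the bundled SuperLU and point SUPERLU_LIBS at an
# external SuperLU_MT build (paths/flags are hypothetical examples).
./configure --disable-superlu \
    SUPERLU_LIBS="-L/opt/superlu_mt/lib -lsuperlu_mt -lpthread"
make
```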

 

What also doesn’t work is using the two SuperLU versions side by side, as many functions have the same name in both libraries and the symbols clash.

 

And compiling GetFEM++ completely without SuperLU by just passing --disable-superlu also does not seem to work.

 

Does anybody know how this could be done? Parallel computation would be very useful for solving in GetFEM.

 

Thank you for your help!

 

Best regards,

 

Natanael Lanz

MSc Masch. Ing. ETH

 

IWF ETH Zürich

Technoparkstrasse 1

PFA E82

CH-8005 Zürich

 

address@hidden

www.iwf.mavt.ethz.ch

Tel. +41 44 6325714

Mob. +41 79 5862958

 

 


_______________________________________________
Getfem-users mailing list
address@hidden
https://mail.gna.org/listinfo/getfem-users


