getfem-users

Re: [Getfem-users] gmres did not converge!


From: Konstantinos Poulios
Subject: Re: [Getfem-users] gmres did not converge!
Date: Thu, 2 May 2019 21:36:41 +0200

Dear Zhenghuai Guo,

You need to install MUMPS separately. If you install it in one of the standard locations and then reconfigure and recompile GetFEM, the configure script should be able to find and use MUMPS. If not, you have to use the options that you mentioned in your mail.
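
For example, a concrete configure call could look like this (the installation path and the exact library list here are placeholders; they depend on where and how your MUMPS was built, e.g. sequential vs. MPI):

    ./configure --with-mumps-include-dir="-I /path/to/MUMPS/include" \
                --with-mumps="-L /path/to/MUMPS/lib -ldmumps -lmumps_common -lpord -lmpiseq"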

Here is a code sample showing the use of MUMPS in GetFEM:

    // target residual `res`, verbosity level 1, at most 20 Newton iterations
    gmm::iteration iter(res, 1, 20);
    getfem::simplest_newton_line_search ls(-1, /*alpha max ratio*/ 2., /*alpha min*/ 0.36,
                                           /*alpha mult*/ 0.6, /*alpha threshold res*/ 5e3);
    // solve with Newton's method, using MUMPS as the linear solver
    getfem::standard_solve(md, iter,
                           getfem::rselect_linear_solver(md, "mumps"), ls);
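
If you do not need a custom line search, a minimal variant would be the following sketch (assuming GetFEM's default line search is adequate for your problem):

    gmm::iteration iter(1E-9, 1, 100);
    getfem::default_newton_line_search ls;
    getfem::standard_solve(md, iter,
                           getfem::rselect_linear_solver(md, "mumps"), ls);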

BR
Kostas


On Thu, May 2, 2019 at 3:52 PM Zhenghuai Guo <address@hidden> wrote:

Hi Kostas,

Thank you for the advice.

Sorry for the naive question, but I have not used MUMPS before, and I have not found any example showing how to use/call it in a GetFEM code. As I understand it, in order to use it I should:

  1. Add the following options at the ./configure stage when installing GetFEM, as indicated on page 5 of the GetFEM User Documentation, version 5.3:

     --with-mumps-include-dir=" -I /path/to/MUMPS/include "
     --with-mumps=" F90 libraries and libs of MUMPS to be linked "

  2. Then use it in my code.

Is this procedure correct? Or do I need to install MUMPS separately?

Thank you very much.

Regards,
Zhenghuai Guo

From: Konstantinos Poulios <address@hidden>
Sent: Thursday, May 2, 2019 11:04 PM
To: Zhenghuai Guo <address@hidden>
Cc: Yves Renard <address@hidden>; getfem-users <address@hidden>
Subject: Re: [Getfem-users] gmres did not converge!

Dear Zhenghuai Guo,

Why don't you use a direct solver (MUMPS)? Direct solvers are much more robust, especially if your model contains Lagrange multipliers: the multipliers turn the stiffness matrix into an indefinite saddle-point system, which an iterative solver like GMRES handles poorly without a suitable preconditioner.

BR
Kostas

On Thu, May 2, 2019 at 2:57 PM Zhenghuai Guo <address@hidden> wrote:

Hi Andriy, Yves and Konstantinos,

I loaded two meshes, with about 350,000 elements in total.

I only added elasticity bricks:
  getfem::add_isotropic_linearized_elasticity_brick(model, mim_1, "u_1", "lambda_1", "mu_1");
  getfem::add_isotropic_linearized_elasticity_brick(model, mim_2, "u_2", "lambda_2", "mu_2");


and pointwise constraints at four points (two points per mesh), each constrained in both the x and y directions:
  getfem::add_pointwise_constraints_with_multipliers(model,"u_1","point_1_1","point_1_1_unitx", "point_1_1_valuex");
  getfem::add_pointwise_constraints_with_multipliers(model,"u_1","point_1_1","point_1_1_unity", "point_1_1_valuey");
  getfem::add_pointwise_constraints_with_multipliers(model,"u_1","point_1_2","point_1_2_unitx", "point_1_2_valuex");
  getfem::add_pointwise_constraints_with_multipliers(model,"u_1","point_1_2","point_1_2_unity", "point_1_2_valuey");
  getfem::add_pointwise_constraints_with_multipliers(model,"u_2","point_2_1","point_2_1_unitx", "point_2_1_valuex");
  getfem::add_pointwise_constraints_with_multipliers(model,"u_2","point_2_1","point_2_1_unity", "point_2_1_valuey");
  getfem::add_pointwise_constraints_with_multipliers(model,"u_2","point_2_2","point_2_2_unitx", "point_2_2_valuex");
  getfem::add_pointwise_constraints_with_multipliers(model,"u_2","point_2_2","point_2_2_unity", "point_2_2_valuey");

The values for the pointwise constraints are all simply -1, which means the two meshes should just be translated by -1 in both the x and y directions.
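
For reference, the data behind one of these constraints is set up roughly like this (a sketch; the point coordinates here are placeholders):

  // hypothetical coordinates of the first constrained point
  model.add_initialized_fixed_size_data("point_1_1", getfem::base_node(0., 0.));
  // unit vector selecting the x component
  model.add_initialized_fixed_size_data("point_1_1_unitx", getfem::base_small_vector(1., 0.));
  // prescribed displacement value
  model.add_initialized_scalar_data("point_1_1_valuex", -1.);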

My solver is:
  // target residual 1e-9, verbosity level 1, at most 100 iterations
  gmm::iteration iter(1E-9, 1, 100);
  getfem::standard_solve(model, iter);


But I get a warning that the solver did not converge:
iter   0 residual            1
 iter 500 residual  6.17288e-05
Level 2 Warning in getfem/getfem_model_solvers.h, line 127: gmres did not converge!

I am quite sure that my meshes look good and contain no strangely shaped elements. When I use small, simple meshes, the code runs fine.
Could you please advise me how to improve this?

Regards
Zhenghuai Guo

