March 24, 2009, 09:23
trilinos or petsc, which is better?
#1
New Member
leping chen
Join Date: Mar 2009
Location: china
Posts: 19
Rep Power: 17
Recently I have been using PETSc to solve Ax = b, but A is quite ill-conditioned: every iterative method I tried diverged, and only LU factorization performed well. LU, however, runs into a memory bottleneck. Since the preconditioners I have tried in PETSc perform poorly with iterative methods, I am considering switching to the Trilinos package. Do you think that is the right way to go? Thank you! Alternatively, I could use hypre together with PETSc, but I don't know how to call hypre's preconditioners from PETSc, and I have no experience with this. Would I need to modify my original program? As for Trilinos, I cannot download it because the website is extremely slow for me. I am puzzled!
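For reference, hypre's preconditioners can be used from inside PETSc once PETSc has been configured with hypre support (e.g. the --download-hypre configure option); the assembly code does not need to change. Below is a minimal sketch, assuming a matrix A and vectors b, x are already assembled; solve_with_hypre is just an illustrative wrapper name, and exact call signatures vary slightly between PETSc versions.
Code:
/* Minimal sketch: GMRES preconditioned by hypre's BoomerAMG through PETSc.
 * Assumes A, b, x are already assembled; error checking omitted for brevity. */
#include <petscksp.h>

PetscErrorCode solve_with_hypre(Mat A, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);       /* same matrix as operator and preconditioner */
  KSPSetType(ksp, KSPGMRES);

  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCHYPRE);           /* hand preconditioning to hypre */
  PCHYPRESetType(pc, "boomeramg");  /* algebraic multigrid; "euclid", "pilut", ... also exist */

  KSPSetFromOptions(ksp);           /* allow run-time overrides such as -ksp_type bcgs */
  KSPSolve(ksp, b, x);
  KSPDestroy(&ksp);
  return 0;
}
The same choice can also be made without touching the code at all, using runtime options such as -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg.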
March 24, 2009, 13:07
#2
Member
Jed Brown
Join Date: Mar 2009
Posts: 56
Rep Power: 19
Quote:
Trilinos is relatively weaker on flexible object creation; that is, you end up having to do a lot of work to avoid hard-coding the types. Since you want to be able to try out many combinations, especially when using multilevel and/or domain-decomposition preconditioners, this is pretty serious. PETSc is more dynamic in this regard: you should never need to determine the matrix type, preconditioner, etc. at compile time.
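A small sketch of that runtime flexibility (the sizes are placeholders and the option strings in the comments are only examples):
Code:
/* Sketch: nothing about the matrix type or the preconditioner is decided at
 * compile time -- both come from the options database at run time. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  PetscInitialize(&argc, &argv, NULL, NULL);

  Mat A;
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 1000, 1000);
  MatSetFromOptions(A);   /* -mat_type aij, baij, sbaij, ... picked at run time */
  MatSetUp(A);
  /* ... insert values and assemble A, build b and x ... */

  KSP ksp;
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp); /* -ksp_type, -pc_type, and any nested options,
                             e.g. -ksp_type bcgs -pc_type asm -sub_pc_type ilu */
  /* KSPSolve(ksp, b, x); */

  KSPDestroy(&ksp);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}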
March 25, 2009, 06:03
#3
New Member
Join Date: Mar 2009
Posts: 27
Rep Power: 17
Is there any commercial alternative to petsc?
March 31, 2009, 12:56
Trilinos/ML
#4
New Member
Mike Heroux
Join Date: Mar 2009
Posts: 3
Rep Power: 17
Quote:
I am not sure what is meant by the inflexible-object-creation comment. There are several native data classes that are restrictive only for performance reasons. The FECrsMatrix class allows any processor to register any data needed by any other processor. Trilinos ultimately provides complete flexibility because all of its solvers and preconditioners access matrix operations and data via abstract interfaces. If you don't like the native data classes, you can use PETSc's or roll your own. You can also easily compose advanced preconditioners from any combination of other preconditioners, something that is essential for advanced multiphysics.

In addition, the next version of Trilinos will have a new data package called Tpetra that supports templated data types, allowing float, double, and complex types to be mixed at the same time with a simple parameter redefinition. Finally, the right package in Trilinos to use for accessing solvers and preconditioners is Stratimikos. It provides common access to all packages, driven by a parameter list class that can be constructed in code or read from an XML input file.

PETSc is a great package, so I am not discouraging people from using it. I just don't want misconceptions about Trilinos to persist.
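For readers unfamiliar with that style, here is a rough sketch of driving Stratimikos from a Teuchos::ParameterList. The parameter names and values shown ("Linear Solver Type", "AztecOO", "Preconditioner Type", "ML") follow the Stratimikos convention but should be treated as illustrative; the exact names accepted depend on the Trilinos release.
Code:
// Rough sketch of driving Stratimikos from a Teuchos::ParameterList.
// Parameter names/values here are illustrative, not authoritative.
#include <Teuchos_ParameterList.hpp>
#include <Teuchos_RCP.hpp>
#include <Stratimikos_DefaultLinearSolverBuilder.hpp>
#include <Thyra_LinearOpWithSolveFactoryBase.hpp>

void build_solver_from_parameters()
{
  Teuchos::RCP<Teuchos::ParameterList> params =
      Teuchos::rcp(new Teuchos::ParameterList);
  params->set("Linear Solver Type", "AztecOO");  // or "Belos", "Amesos", ...
  params->set("Preconditioner Type", "ML");      // or "Ifpack", "None", ...

  Stratimikos::DefaultLinearSolverBuilder builder;
  builder.setParameterList(params);

  // The concrete solver and preconditioner are chosen entirely by the
  // parameter list, which could equally have been read from an XML file.
  Teuchos::RCP<Thyra::LinearOpWithSolveFactoryBase<double> > lowsFactory =
      builder.createLinearSolveStrategy("");
}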
March 31, 2009, 14:28
#5
Member
Jed Brown
Join Date: Mar 2009
Posts: 56
Rep Power: 19
Quote:
Quote:
The functionality you describe in FECrsMatrix is present in all PETSc parallel matrix formats. Likewise for data-structure neutrality.

Incidentally, I think that C++ inheritance is a bad model for matrices, since there are so many overlapping sets of possible functionality. You either end up forcing people to implement pure virtual functions that would be unreasonably expensive (so they just throw an error), or you give everything a default implementation. What ends up happening is an attempt to partition the interface into orthogonal subsets, but this means either massively multiple inheritance or a class hierarchy into which the user's matrix doesn't fit. With OO in C using pimpl (PETSc's model), you can have a flat hierarchy and simply check whether a function pointer is NULL to find out whether a given type implements an interface.

Being able to try fairly exotic preconditioning strategies on the command line is a strength of PETSc that I haven't seen available from Trilinos. Experimentation is essential, and the overhead of recompiling, or even of editing an XML file, is significant.

Tpetra has been around for a long time, so I'm skeptical that it will be usable from all of Trilinos at the next release. It will offer a significant advantage for people who need complex-valued matrices for a small part of a multiphysics problem where the bulk of the computation is real-valued (if the complex-valued part makes up most of the computation, there is little penalty in just using complex-valued matrices throughout). PETSc has had support for lower-precision matrices (usually used for the preconditioning matrix with a matrix-free Jacobian) for ages, but you cannot use real- and complex-valued matrices in the same code.

Trilinos is also quite good, but I have frequently run into the attitude that "it's C++, therefore more OO/flexible/better than C" (maybe it's just particularly strong at my university). On multiple occasions I have talked with people at conferences who are using Trilinos (or even Matlab) and been told that they "are working on implementing this feature", when it turns out to be possible on the command line with PETSc. I'm sure that a sufficiently experienced Trilinos user could have implemented these features in very few lines of code, so don't take this anecdote too seriously.
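A generic sketch of that flat-hierarchy, function-pointer pattern (illustrative only; these are not PETSc's actual internal structures):
Code:
/* Generic sketch of the flat-hierarchy, function-pointer approach described
 * above.  These structs are illustrative only; they are not PETSc's actual
 * internal data structures. */
#include <stddef.h>
#include <stdio.h>

typedef struct Mat_s Mat;

struct MatOps {
  int (*mult)(Mat *A, const double *x, double *y);            /* y = A x, required  */
  int (*mult_transpose)(Mat *A, const double *x, double *y);  /* optional           */
  int (*solve)(Mat *A, const double *b, double *x);           /* optional "inverse" */
};

struct Mat_s {
  struct MatOps ops;  /* per-type implementation table            */
  void *data;         /* type-specific storage (the "pimpl" part) */
};

/* Callers can ask whether a capability exists instead of calling a stub that
 * throws or aborts. */
int MatHasSolve(const Mat *A) { return A->ops.solve != NULL; }

int MatSolveWrapper(Mat *A, const double *b, double *x)
{
  if (!A->ops.solve) {
    fprintf(stderr, "solve: not supported for this matrix type\n");
    return 1;  /* error code instead of a dummy implementation */
  }
  return A->ops.solve(A, b, x);
}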
March 31, 2009, 16:10
#6
New Member
Mike Heroux
Join Date: Mar 2009
Posts: 3
Rep Power: 17
Quote:
Quote:
Quote:
Quote:
Tpetra is for real now. It is intended to be the workhorse for Trilinos starting with the 10.0 release in September. In addition to being able to support all common precisions, it will allow mixed-precision computations and extended precision. This will be very useful in many settings.
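As a generic illustration of what a templated scalar type makes possible (this is not Tpetra's actual interface), a kernel can store the matrix in low precision while keeping the vectors and the accumulation in higher precision:
Code:
// Generic sketch (not Tpetra's API) of a mixed-precision kernel enabled by
// templated scalar types: the matrix is stored in float to save memory,
// while the vectors and the accumulation stay in double.
#include <cstddef>
#include <vector>

template <typename MatScalar, typename VecScalar>
void matvec(const std::vector<MatScalar> &A, const std::vector<VecScalar> &x,
            std::vector<VecScalar> &y)
{
  const std::size_t n = x.size();
  for (std::size_t i = 0; i < n; ++i) {
    VecScalar sum = VecScalar(0);
    for (std::size_t j = 0; j < n; ++j)
      sum += static_cast<VecScalar>(A[i * n + j]) * x[j];
    y[i] = sum;
  }
}

int main()
{
  std::vector<float>  A(4, 1.0f);       // low-precision storage
  std::vector<double> x(2, 1.0), y(2);  // high-precision iteration vectors
  matvec(A, x, y);                      // instantiated as matvec<float, double>
  return 0;
}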
March 31, 2009, 17:58
#7
Member
Jed Brown
Join Date: Mar 2009
Posts: 56
Rep Power: 19
Quote:
A specific example in Trilinos is Epetra_Operator::ApplyInverse(). A user must implement this function, but it doesn't make sense for many matrix types (in principle the operation exists, but it isn't affordable), so they have to implement it anyway and throw an error. A caller then cannot determine whether ApplyInverse is really implemented without calling it. The same applies to applying a transpose. You can add flags indicating whether the matrix supports the operation, but that is working outside the type system, and someone is responsible for keeping the flags consistent.

You could instead split these operations into separate base classes, say Epetra_InvertableOperator and Epetra_TransposableOperator, multiply inherit from these, and dynamic_cast when you need a particular operation (a sketch of this alternative appears at the end of this post). This might be the most "OO" approach, but there are reasons everyone doesn't do it. The choice in Trilinos is to have a small number of matrix interfaces and require the user to provide dummy implementations of all the methods that don't make sense for their specific type. If a method is added to the interface, the user's code won't compile, even though that method may not make any sense for them. The decision to structure the hierarchy this way is certainly reasonable, but I wouldn't claim that the type system is helping, or that it's nicer to work with than a v-table supporting reflection.

Quote:
Code:
mpirun -n 80 ./app -snes_mf_operator -snes_ksp_ew -pc_type mg -pc_mg_type kaskade -mg_levels_2_pc_type hypre -mg_levels_2_pc_hypre_bj TRUE -mg_levels_2_pc_hypre_type euclid -mg_coarse_pc_redundant_number 10 -mg_coarse_redundant_pc_factor_mat_solver_package mumps
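And the split-interface sketch referred to above. All class names here are hypothetical, echoing the Epetra_*-style names in the post; they are not real Trilinos classes.
Code:
// Sketch of the "split the optional operations into separate interfaces and
// dynamic_cast for them" alternative discussed above.  All class names are
// hypothetical; they only echo the Epetra_*-style names in the post.
#include <iostream>

class Operator {                  // core interface: everything can apply itself
public:
  virtual ~Operator() {}
  virtual void Apply(const double *x, double *y) const = 0;
};

class InvertableOperator {        // only for types with an affordable ApplyInverse
public:
  virtual ~InvertableOperator() {}
  virtual void ApplyInverse(const double *x, double *y) const = 0;
};

// A diagonal matrix can afford both operations, so it inherits both interfaces.
class DiagonalMatrix : public Operator, public InvertableOperator {
public:
  virtual void Apply(const double *, double *) const        { /* y = D x      */ }
  virtual void ApplyInverse(const double *, double *) const { /* y = D^{-1} x */ }
};

void apply_inverse_if_possible(const Operator &A, const double *b, double *x)
{
  // Capability check through the type system instead of a flag or a throwing stub.
  if (const InvertableOperator *inv = dynamic_cast<const InvertableOperator *>(&A))
    inv->ApplyInverse(b, x);
  else
    std::cerr << "ApplyInverse is not available for this operator type\n";
}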
March 31, 2009, 22:46
#8
New Member
Mike Heroux
Join Date: Mar 2009
Posts: 3
Rep Power: 17
Quote:
Quote:
Quote:
Thanks for the good exchange. |
April 1, 2009, 06:26
#9
Member
Jed Brown
Join Date: Mar 2009
Posts: 56
Rep Power: 19
Quote:
Quote:
Quote:
Thank you, it's been informative.