PETSc solver breaks when used on different PrimitiveStorages
Consider the following setup:
- `MPIManager::instance()->numProcesses() >= 1080`
- There are 2 different `PrimitiveStorage`s `storage1` and `storage2` where `storage1.getNumberOfGlobalPrimitives() < storage2.getNumberOfGlobalPrimitives() < MPIManager::instance()->numProcesses()`.
- We attempt to solve a system Au = f using the `PETScCGSolver` for both of these storages.
If we do `solve(storage1)` followed by `solve(storage2)`, the function `VecAssemblyBegin` in `PETScVector::createVectorFromFunction` in `PETScCGSolver::solve` never returns. However, if the ordering is reversed, i.e., we first call `solve(storage2)` and then `solve(storage1)`, everything seems to work.
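
For reference, a minimal sketch of the reproduction scenario is given below. It assumes a P1 Laplace system on a regular rectangle mesh whose resolution controls the number of macro primitives; the mesh, operator, refinement level, include paths, and constructor signatures are illustrative assumptions based on typical HyTeG usage, not the actual code from commit f7e7fc6e.

```cpp
#include "core/Environment.h"

#include "hyteg/mesh/MeshInfo.hpp"
#include "hyteg/p1functionspace/P1ConstantOperator.hpp"
#include "hyteg/p1functionspace/P1Function.hpp"
#include "hyteg/petsc/PETScCGSolver.hpp"
#include "hyteg/petsc/PETScManager.hpp"
#include "hyteg/primitivestorage/PrimitiveStorage.hpp"
#include "hyteg/primitivestorage/SetupPrimitiveStorage.hpp"

using namespace hyteg;

// Build a PrimitiveStorage from an n x n rectangle mesh; for small n the number of
// global (macro) primitives stays below the number of MPI processes.
std::shared_ptr< PrimitiveStorage > makeStorage( uint_t n )
{
   MeshInfo meshInfo =
       MeshInfo::meshRectangle( Point2D( { 0.0, 0.0 } ), Point2D( { 1.0, 1.0 } ), MeshInfo::CRISS, n, n );
   SetupPrimitiveStorage setupStorage( meshInfo, uint_c( walberla::MPIManager::instance()->numProcesses() ) );
   return std::make_shared< PrimitiveStorage >( setupStorage );
}

// Assemble and solve A u = f with the PETSc CG solver on the given storage.
void solveOn( const std::shared_ptr< PrimitiveStorage >& storage, uint_t level )
{
   P1ConstantLaplaceOperator A( storage, level, level );
   P1Function< real_t >      u( "u", storage, level, level );
   P1Function< real_t >      f( "f", storage, level, level );

   PETScCGSolver< P1ConstantLaplaceOperator > solver( storage, level );
   solver.solve( A, u, f, level ); // the second call hangs in VecAssemblyBegin for the ordering below
}

int main( int argc, char** argv )
{
   walberla::Environment env( argc, argv );
   walberla::MPIManager::instance()->useWorldComm();
   PETScManager petscManager( &argc, &argv );

   auto storage1 = makeStorage( 1 ); // fewer global primitives ...
   auto storage2 = makeStorage( 2 ); // ... than storage2, both below numProcesses()

   solveOn( storage1, 2 ); // this ordering never returns with >= 1080 processes,
   solveOn( storage2, 2 ); // while solving on storage2 first and storage1 second works

   return 0;
}
```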
Also see the minimal example in commit f7e7fc6e: running `mpirun -n 1080 apps/petscError config -param.n1=1 -param.n2=2` breaks, while the same command with `-param.n1=2 -param.n2=1` works.
The issue also doesn't seem to come up when using 1060 or fewer processes.