1. 29 Sep, 2021 1 commit
  2. 28 Sep, 2021 1 commit
  3. 27 Sep, 2021 2 commits
  4. 28 Apr, 2021 1 commit
  5. 29 Jan, 2021 1 commit
    • Allows selecting output format for PETScSparseMatrix object · e2bb3e97
      Marcus Mohr authored
      * The class's print function now accepts a second, optional argument of
        type PetscViewerFormat. Any valid format is possible, as it is simply
        passed on to PETSc; the default is PETSC_VIEWER_ASCII_MATRIXMARKET
        (see the sketch below).
      * The exportOperator() function now requires the format as an argument;
        it also lives in the petsc namespace now.
      * The exportOperatorMatrix app allows selecting either MATRIXMARKET or
        MATLAB format on the command line.
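      A minimal sketch of what the format selection boils down to on the PETSc side,
      assuming a plain, already assembled Mat; the helper name writeMatrixAscii is
      illustrative and not HyTeG API:

        #include <petscmat.h>
        #include <petscviewer.h>

        // Write an assembled PETSc matrix to an ASCII file in the requested
        // PetscViewerFormat, e.g. PETSC_VIEWER_ASCII_MATRIXMARKET (the default
        // mentioned above) or PETSC_VIEWER_ASCII_MATLAB.
        void writeMatrixAscii( Mat mat, const char* fileName, PetscViewerFormat format )
        {
           PetscViewer viewer;
           PetscViewerASCIIOpen( PETSC_COMM_WORLD, fileName, &viewer );
           PetscViewerPushFormat( viewer, format ); // select the output format
           MatView( mat, viewer );                  // PETSc does the actual formatting
           PetscViewerPopFormat( viewer );
           PetscViewerDestroy( &viewer );
        }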
  6. 12 Nov, 2020 1 commit
    • Adds PETScSparseMatrixInfo class and overloads stream operator << · 32760cda
      Marcus Mohr authored
      The PETScSparseMatrixInfo provides access to the MatInfo struct of a
      matrix from PETSc. It is a child of the SparseMatrixInfo class. Maybe
      we want to implement a similar object for Trilinos?
      
      The PETScSparseMatrix class has a corresponding new getInfo() method
      and we can insert a PETScSparseMatrixInfo into the output stream.
      
      ElementwiseOperatorPetscTest now uses this new approach for its reporting.
      
      Further changes in PETScSparseMatrix:
      * renames assemble to assemble_
      * makes the constructor set assemble_ to false
      * relocates setting assemble_ to true from createMatrixFromOperatorOnce()
        to createMatrixFromOperator(), so that we can sensibly query it in
        getInfo()
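      A minimal sketch of the underlying PETSc call, assuming an assembled Mat; the
      helper printMatrixInfo is hypothetical and only illustrates the kind of data a
      PETScSparseMatrixInfo object would wrap:

        #include <petscmat.h>

        // Query PETSc's MatInfo struct for a matrix and print a few of its fields.
        void printMatrixInfo( Mat mat )
        {
           MatInfo info;
           MatGetInfo( mat, MAT_GLOBAL_SUM, &info ); // sum the values over all processes

           PetscPrintf( PETSC_COMM_WORLD,
                        "nonzeros allocated: %.0f, used: %.0f, mallocs during assembly: %.0f\n",
                        info.nz_allocated, info.nz_used, info.mallocs );
        }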
  7. 22 Sep, 2020 1 commit
  8. 25 Aug, 2020 1 commit
    • Implemented PETSc assembly for the strong free-slip wrapper in 2D (T.-H. only). · 5d7f6771
      Nils Kohl authored
      First, the individual matrices (Stokes, normal projection) are assembled;
      then a PETSc MatMatMult() is performed and the result is stored.
      
      Like the matrix-free operator, the resulting operator is not symmetric.
      However, the PETSc MINRES solver seems to converge anyway.
      
      Added a small test for the assembly (checking that HyTeG and PETSc
      operator application produce the same results).
      
      Extended free-slip test and app.
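      A rough sketch of the product step with plain PETSc calls; the matrix names and
      the multiplication order are assumptions here, only MatMatMult() itself is taken
      from the description above:

        #include <petscmat.h>

        // Form the product of two already assembled matrices, e.g. the normal
        // projection P and the Stokes matrix K, and return the result.
        Mat multiplyAssembledMatrices( Mat P, Mat K )
        {
           Mat result;
           // PETSC_DEFAULT lets PETSc estimate the fill ratio of the product.
           MatMatMult( P, K, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &result );
           return result;
        }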
  9. 16 Jun, 2020 1 commit
  10. 15 Jun, 2020 1 commit
  11. 04 Jun, 2020 1 commit
  12. 26 May, 2020 1 commit
  13. 22 May, 2020 1 commit
    • Using split communicator in PETScLU solver. · 21dd539a
      Nils Kohl authored
      All processes that own primitives perform the solve() call on the assembled problem.
      All remaining processes solve an empty problem.
      
      Intended for agglomeration: by simply re-distributing the primitives to a subset of processes,
      the PETSc solver solves the global problem only on that subset; the remaining processes
      do not know about this and solve an empty problem. The solve() function may then effectively
      only be invoked on the non-empty processes.
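      A minimal sketch of the split-communicator idea with plain MPI; ownsPrimitives
      and buildSolverCommunicator are hypothetical names, not HyTeG API:

        #include <mpi.h>

        // Build a sub-communicator that separates processes owning primitives from
        // those that do not. Processes in the "owning" communicator solve the
        // assembled problem, the others solve an empty one.
        MPI_Comm buildSolverCommunicator( bool ownsPrimitives )
        {
           int rank;
           MPI_Comm_rank( MPI_COMM_WORLD, &rank );

           const int color = ownsPrimitives ? 1 : 0; // same color -> same sub-communicator

           MPI_Comm solverComm;
           MPI_Comm_split( MPI_COMM_WORLD, color, rank, &solverComm );
           return solverComm;
        }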
  14. 20 May, 2020 1 commit
  15. 24 Apr, 2020 1 commit
  16. 03 Apr, 2020 1 commit
  17. 07 Feb, 2020 1 commit
  18. 03 Feb, 2020 1 commit
    • Implements matrix setup for elementwise operators · 28b4888b
      Marcus Mohr authored
      This commit implements the functionality required to assemble the matrix
      associated with a P[12]ElementwiseOperator as a sparse PETSc matrix.
      
      We also have a test that compares the matrix we get for the elementwise
      operator to that of the corresponding P[12]ConstantOperator. This is done
      for 2D and 3D with the operators for the Laplace and Mass forms.
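      A rough sketch of how such a comparison could look with plain PETSc calls,
      assuming both operators were already assembled into matrices A and B with
      identical layout; this is illustrative only, not the actual test code:

        #include <petscmat.h>

        // Check that two assembled matrices agree up to a tolerance by looking at
        // the Frobenius norm of their difference.
        PetscBool matricesAlmostEqual( Mat A, Mat B, PetscReal tol )
        {
           Mat diff;
           MatDuplicate( A, MAT_COPY_VALUES, &diff );
           MatAXPY( diff, -1.0, B, DIFFERENT_NONZERO_PATTERN ); // diff = A - B

           PetscReal norm;
           MatNorm( diff, NORM_FROBENIUS, &norm );
           MatDestroy( &diff );

           return norm <= tol ? PETSC_TRUE : PETSC_FALSE;
        }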
  19. 28 Jan, 2020 1 commit
  20. 14 Nov, 2019 1 commit
  21. 06 Nov, 2019 1 commit
  22. 11 Sep, 2019 1 commit
  23. 30 Aug, 2019 3 commits
  24. 17 Apr, 2019 1 commit
    • Truly eliminates Dirichlet DoFs from Matlab matrix · edd4d07a
      Marcus Mohr authored
      The commit adds code to hhg::exportOperator() that will append a vector
      with the global indices of those DoFs that are fixed by Dirichlet
      boundary conditions.
      
      For this, the variant of applyDirichletBCSymmetrically() that works only
      on the matrix now returns this information to its caller.
      
      Additionally, MATLAB code that truly removes the Dirichlet DoFs from the
      sparse matrix is appended to the script.
      
      The script also becomes a little bit more talkative.
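      The appended code itself is MATLAB; purely for illustration, the equivalent of
      "truly removing" the Dirichlet DoFs can also be expressed with PETSc calls
      (dropDirichletDoFs is a hypothetical helper, not part of the codebase):

        #include <petscis.h>
        #include <petscmat.h>

        // Given the global indices of the free (non-Dirichlet) DoFs, extract the
        // submatrix that drops the Dirichlet rows and columns entirely.
        Mat dropDirichletDoFs( Mat A, PetscInt numFreeDoFs, const PetscInt* freeDoFIndices )
        {
           IS freeDoFs;
           ISCreateGeneral( PETSC_COMM_WORLD, numFreeDoFs, freeDoFIndices, PETSC_COPY_VALUES, &freeDoFs );

           Mat reduced;
           MatCreateSubMatrix( A, freeDoFs, freeDoFs, MAT_INITIAL_MATRIX, &reduced );

           ISDestroy( &freeDoFs );
           return reduced;
        }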
  25. 16 Apr, 2019 1 commit
    • Adds export utility for general operators and demo app · 86c9b059
      Marcus Mohr authored
      - PETScExportOperatorMatrix.hpp contains a templated function that
        allows exporting an arbitrary operator to a file as a matrix via PETSc.
      - PETScSparseMatrix.hpp now contains a variant of the method
        applyDirichletBCSymmetrically() which only modifies the matrix itself,
        not the RHS, so no vectors need to be passed (see the sketch below).
      - exportOperatorMatrix.cpp is a demo app that allows exporting some of
        our constant operators.
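      A minimal sketch of what a matrix-only variant can build on: PETSc's
      MatZeroRowsColumns() accepts NULL for both vectors, so only the matrix is
      modified (the helper name is illustrative):

        #include <petscmat.h>

        // Zero the Dirichlet rows and columns and put 1.0 on the diagonal, without
        // touching any solution or RHS vector.
        void applyDirichletToMatrixOnly( Mat A, PetscInt numDirichletDoFs, const PetscInt* dirichletDoFs )
        {
           MatZeroRowsColumns( A, numDirichletDoFs, dirichletDoFs, 1.0, NULL, NULL );
        }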
  26. 01 Apr, 2019 1 commit
    • Implemented application of Dirichlet BCs to a linear system without losing... · 589d0711
      Nils Kohl authored
      Implemented application of Dirichlet BCs to a linear system without losing symmetry for PETSc solvers.
      
      Uses the PETSc function MatZeroRowsColumns(), which does that automatically.
      Still, we need to think about how to easily integrate this so the more efficient
      solvers in HyTeG can be used, because the RHS is modified depending on the original system.
      
      Possible solutions:
      1. re-assemble the system every time we solve it, since we also need to rebuild the RHS
      2. store a copy of the original system and circumvent re-assembly by copying it and applying
         only MatZeroRowsColumns() before calling the solver (a sketch of this option follows below)
      
      If PETSc is only used as a coarse grid solver, option 2 might be a good solution.
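      A rough sketch of option 2 with plain PETSc/KSP calls; all names are
      illustrative and the actual HyTeG integration is not shown:

        #include <petscksp.h>
        #include <petscmat.h>

        // Keep the originally assembled system untouched; before each solve, copy it
        // and apply MatZeroRowsColumns() to the copy only.
        void solveWithSymmetricBCs( Mat originalA, Vec solutionWithBCValues, Vec rhs,
                                    PetscInt numDirichletDoFs, const PetscInt* dirichletDoFs )
        {
           Mat A;
           MatDuplicate( originalA, MAT_COPY_VALUES, &A ); // avoids re-assembly

           // Zero the Dirichlet rows/columns, put 1.0 on the diagonal and adjust the
           // RHS using the prescribed boundary values stored in the solution vector.
           MatZeroRowsColumns( A, numDirichletDoFs, dirichletDoFs, 1.0, solutionWithBCValues, rhs );

           KSP ksp;
           KSPCreate( PETSC_COMM_WORLD, &ksp );
           KSPSetOperators( ksp, A, A );
           KSPSolve( ksp, rhs, solutionWithBCValues );

           KSPDestroy( &ksp );
           MatDestroy( &A );
        }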
  27. 10 Jan, 2019 1 commit
  28. 09 Jan, 2019 1 commit
  29. 11 Dec, 2018 1 commit
  30. 04 Dec, 2018 1 commit
  31. 19 Nov, 2018 1 commit
  32. 18 Sep, 2018 1 commit
  33. 14 Sep, 2018 1 commit
  34. 19 Apr, 2018 1 commit
  35. 04 Apr, 2018 1 commit
  36. 29 Jan, 2018 1 commit
  37. 18 Jan, 2018 1 commit