Module mpi_domain

MPI-oriented domain decomposition scaffolding.

This module defines deterministic domain partition metadata and halo packing/exchange primitives that can be wired to rsmpi in a later phase.
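As a minimal sketch of these two ideas (a deterministic split along z plus a serial halo exchange over padded slabs), the following is illustrative only: names such as `split_z` and `exchange_halos` are hypothetical, not necessarily this module's API.

```rust
/// Split `nz` rows among `p` ranks, handing the remainder to the lowest ranks.
fn split_z(nz: usize, p: usize) -> Vec<(usize, usize)> {
    let (q, rem) = (nz / p, nz % p);
    let mut out = Vec::with_capacity(p);
    let mut z0 = 0;
    for rank in 0..p {
        let rows = q + usize::from(rank < rem);
        out.push((z0, z0 + rows));
        z0 += rows;
    }
    out
}

/// Serially copy each slab's boundary rows into its neighbours' halo rows.
/// Each slab is stored row-major with width `nr`:
/// [top halo row][owned rows...][bottom halo row].
fn exchange_halos(slabs: &mut [Vec<f64>], nr: usize) {
    for i in 0..slabs.len().saturating_sub(1) {
        let (lo, hi) = slabs.split_at_mut(i + 1);
        let (a, b) = (&mut lo[i], &mut hi[0]);
        let a_rows = a.len() / nr;
        // Last owned row of slab i becomes the top halo of slab i+1.
        b[..nr].copy_from_slice(&a[(a_rows - 2) * nr..(a_rows - 1) * nr]);
        // First owned row of slab i+1 becomes the bottom halo of slab i.
        a[(a_rows - 1) * nr..].copy_from_slice(&b[nr..2 * nr]);
    }
}

fn main() {
    assert_eq!(split_z(10, 3), vec![(0, 4), (4, 7), (7, 10)]);

    // Two slabs, nr = 2, each with one halo row above and below its owned rows.
    let mut slabs = vec![
        vec![0.0, 0.0, 1.0, 1.0, 2.0, 2.0, 0.0, 0.0], // owns rows [1 1], [2 2]
        vec![0.0, 0.0, 3.0, 3.0, 4.0, 4.0, 0.0, 0.0], // owns rows [3 3], [4 4]
    ];
    exchange_halos(&mut slabs, 2);
    assert_eq!(slabs[1][..2], [2.0, 2.0]); // top halo of slab 1 now filled
    assert_eq!(slabs[0][6..], [3.0, 3.0]); // bottom halo of slab 0 now filled
    println!("halo exchange ok");
}
```

Because the exchange is purely copy-based and serial, the same pack/apply split maps directly onto MPI send/receive buffers when wired to rsmpi later.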

Structs§

CartesianTile
2D Cartesian tile descriptor — one per rank in a (pz × pr) topology.
DistributedSolveResult
Result of a distributed Grad-Shafranov (GS) solve.
DistributedSolverConfig
Configuration for the distributed GS solver.
DomainSlice
Contiguous slice of the global domain owned by one rank in a 1D z decomposition.

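To make the tile metadata concrete, here is a hypothetical sketch of the kind of descriptor a type like `CartesianTile` might carry for a (pz × pr) topology; the struct and field names are illustrative, not this module's actual definitions.

```rust
#[derive(Debug, PartialEq)]
struct Tile {
    rank: usize,
    coords: (usize, usize),  // (iz, ir) position in the process grid
    z_range: (usize, usize), // owned global rows [z0, z1)
    r_range: (usize, usize), // owned global cols [r0, r1)
}

/// Even block split of `n` cells into `p` parts; part `i` gets one extra
/// cell while the remainder lasts, so the split is deterministic.
fn block(n: usize, p: usize, i: usize) -> (usize, usize) {
    let (q, rem) = (n / p, n % p);
    let start = i * q + i.min(rem);
    (start, start + q + usize::from(i < rem))
}

/// Map a row-major rank to its tile in a (pz × pr) topology.
fn tile_for_rank(nz: usize, nr: usize, pz: usize, pr: usize, rank: usize) -> Tile {
    let coords = (rank / pr, rank % pr);
    Tile {
        rank,
        coords,
        z_range: block(nz, pz, coords.0),
        r_range: block(nr, pr, coords.1),
    }
}

fn main() {
    let t = tile_for_rank(10, 8, 2, 2, 3); // last rank in a 2 × 2 grid
    assert_eq!(t.coords, (1, 1));
    assert_eq!(t.z_range, (5, 10));
    assert_eq!(t.r_range, (4, 8));
    println!("{:?}", t);
}
```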
Functions§

apply_halo_rows
Apply received halo rows into a tile's ghost region.
auto_distributed_gs_solve
Convenience: solve GS with automatic process-grid selection.
decompose_2d
Decompose a 2D grid of shape (global_nz × global_nr) into a (pz × pr) Cartesian process topology.
decompose_z
Decompose a grid along the z axis into contiguous per-rank slabs.
distributed_gs_solve
Distributed Grad-Shafranov solver using additive Schwarz domain decomposition with Rayon thread-parallelism.
extract_tile
Extract a padded local tile (with halo) from the global array.
gs_residual_l2
Compute the L2 norm of the GS residual: ||LΨ - f||₂.
inject_tile
Write the core (non-halo) region of a local tile back into the global array.
l2_norm_delta
Compute the L2 norm of the element-wise difference between two arrays.
optimal_process_grid
Optimal process-grid factorisation for a given (nz, nr) global grid and total number of available ranks. Minimises the surface-to-volume ratio of each tile (i.e. the halo communication overhead).
pack_halo_rows
Pack a tile's boundary rows into contiguous send buffers.
serial_halo_exchange
Serial 1D halo exchange across all slabs.
serial_halo_exchange_2d
Serial 2D halo exchange across all tiles.
split_with_halo
Split a global array into per-rank slabs padded with halo rows.
stitch_without_halo
Reassemble the global array from per-rank slabs, dropping halo rows.
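The process-grid selection described for `optimal_process_grid` can be sketched as a brute-force search over factor pairs, scoring each by the tile's surface-to-volume ratio (halo cells per interior cell). This is a plausible reconstruction from the summary above; the function name `pick_grid` and the exact cost formula are assumptions, and the real implementation may differ.

```rust
/// Enumerate factor pairs pz·pr == ranks and pick the pair minimising the
/// per-tile surface-to-volume ratio, i.e. the halo communication overhead.
fn pick_grid(nz: usize, nr: usize, ranks: usize) -> (usize, usize) {
    let mut best = (1, ranks);
    let mut best_cost = f64::INFINITY;
    for pz in 1..=ranks {
        if ranks % pz != 0 {
            continue; // not a valid factorisation
        }
        let pr = ranks / pz;
        if pz > nz || pr > nr {
            continue; // tiles would be empty
        }
        let (tz, tr) = (nz as f64 / pz as f64, nr as f64 / pr as f64);
        // Perimeter over area of a (tz × tr) tile: halo cells per interior cell.
        let cost = 2.0 * (tz + tr) / (tz * tr);
        if cost < best_cost {
            best_cost = cost;
            best = (pz, pr);
        }
    }
    best
}

fn main() {
    // A square grid on 4 ranks favours a square 2 × 2 process grid.
    assert_eq!(pick_grid(64, 64, 4), (2, 2));
    // A grid much longer in z favours splitting along z only.
    assert_eq!(pick_grid(1024, 16, 4), (4, 1));
    println!("ok");
}
```

Minimising perimeter-to-area directly minimises bytes exchanged per halo update relative to useful work per tile, which is why square-ish tiles win on square-ish grids.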