Parallel data structure organizing the thick spherical shell metadata for distributed (MPI parallel) simulations.
#include <spherical_shell.hpp>
static DistributedDomain create_uniform_single_subdomain_per_diamond (const int lateral_diamond_refinement_level, const int radial_diamond_refinement_level, const real_t r_min, const real_t r_max, const SubdomainToRankDistributionFunction &subdomain_to_rank = subdomain_to_rank_iterate_diamond_subdomains)
    Creates a DistributedDomain with a single subdomain per diamond and initializes all the subdomain neighborhoods.

static DistributedDomain create_uniform_single_subdomain_per_diamond (const int lateral_diamond_refinement_level, const std::vector< double > &radii, const SubdomainToRankDistributionFunction &subdomain_to_rank = subdomain_to_rank_iterate_diamond_subdomains)
    Creates a DistributedDomain with a single subdomain per diamond and initializes all the subdomain neighborhoods.

static DistributedDomain create_uniform (const int lateral_diamond_refinement_level, const int radial_diamond_refinement_level, const real_t r_min, const real_t r_max, const int lateral_subdomain_refinement_level, const int radial_subdomain_refinement_level, const SubdomainToRankDistributionFunction &subdomain_to_rank = subdomain_to_rank_iterate_diamond_subdomains)
    Creates a DistributedDomain with uniformly refined subdomains per diamond and initializes all the subdomain neighborhoods.

static DistributedDomain create_uniform (const int lateral_diamond_refinement_level, const std::vector< double > &radii, const int lateral_subdomain_refinement_level, const int radial_subdomain_refinement_level, const SubdomainToRankDistributionFunction &subdomain_to_rank = subdomain_to_rank_iterate_diamond_subdomains)
    Creates a DistributedDomain with uniformly refined subdomains per diamond and initializes all the subdomain neighborhoods.

static DistributedDomain create_uniform_on_comm (MPI_Comm comm, const int lateral_diamond_refinement_level, const std::vector< double > &radii, const int lateral_subdomain_refinement_level, const int radial_subdomain_refinement_level, const SubdomainToRankDistributionFunction &subdomain_to_rank)
    Build a DistributedDomain that lives on a sub-communicator.
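The overloads taking a std::vector< double > of radii allow non-uniform radial layering in place of the (r_min, r_max, radial refinement) pair. As a sketch (the shell-count convention is an assumption here, not stated on this page), a uniformly spaced radii vector could be built like this:

```cpp
#include <cassert>
#include <vector>

// Sketch: build a monotonically increasing radii vector spanning [r_min, r_max].
// The number of shells implied by a radial refinement level is an assumption;
// consult the library for the convention create_uniform actually uses.
std::vector<double> uniform_radii(double r_min, double r_max, int n_shells) {
    assert(n_shells >= 2);
    std::vector<double> radii(n_shells);
    const double dr = (r_max - r_min) / (n_shells - 1);
    for (int i = 0; i < n_shells; ++i)
        radii[i] = r_min + i * dr;
    radii.back() = r_max;  // guard against rounding drift at the outer shell
    return radii;
}
```

The result can then be passed as the radii argument of the vector-based overloads.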
This is essentially a wrapper for the DomainInfo and the neighborhood information (SubdomainNeighborhood) for all process-local subdomains.
◆ LocalSubdomainIdx
◆ DistributedDomain()
terra::grid::shell::DistributedDomain::DistributedDomain ()  [default]
◆ comm()
MPI_Comm terra::grid::shell::DistributedDomain::comm () const  [inline]
MPI communicator this domain lives on. Defaults to MPI_COMM_WORLD. For agglomerated multigrid, coarse levels carry a sub-communicator so that their reductions and halo exchanges involve only the participating ranks.
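A common way to obtain such a sub-communicator is to compute a per-rank color and feed it to MPI_Comm_split, where non-participating ranks pass MPI_UNDEFINED and receive MPI_COMM_NULL. The selection rule below is a hypothetical sketch, not the library's own setup code; it is written MPI-free (with -1 standing in for MPI_UNDEFINED) so the logic is self-contained:

```cpp
#include <cassert>

// Hypothetical agglomeration rule: the first n_coarse_ranks world ranks take
// part in a coarse multigrid level. In real code the returned color would be
// passed to MPI_Comm_split(parent_comm, color, world_rank, &sub_comm);
// ranks returning -1 would pass MPI_UNDEFINED and get MPI_COMM_NULL back.
int coarse_level_color(int world_rank, int n_coarse_ranks) {
    return world_rank < n_coarse_ranks ? 0 : -1;
}
```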
◆ create_uniform() [1/2]
Creates a DistributedDomain with uniformly refined subdomains per diamond and initializes all the subdomain neighborhoods.
◆ create_uniform() [2/2]
Creates a DistributedDomain with uniformly refined subdomains per diamond and initializes all the subdomain neighborhoods.
◆ create_uniform_on_comm()
static DistributedDomain terra::grid::shell::DistributedDomain::create_uniform_on_comm (MPI_Comm comm, const int lateral_diamond_refinement_level, const std::vector< double > &radii, const int lateral_subdomain_refinement_level, const int radial_subdomain_refinement_level, const SubdomainToRankDistributionFunction &subdomain_to_rank)  [inline, static]
Build a DistributedDomain that lives on a sub-communicator.
Same geometry as create_uniform but resolves subdomain ownership against the caller's sub-comm rank, and stamps the sub-comm on the resulting domain so downstream reductions/halos run on the restricted comm.
Parameters
| comm | The sub-communicator this domain is distributed over. Pass MPI_COMM_NULL on ranks that are not part of the sub-communicator at this level; the returned domain then contains no local subdomains. |
| subdomain_to_rank | Must return sub-communicator-local rank indices (not parent-communicator ranks). |
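The exact signature of SubdomainToRankDistributionFunction is not shown on this page; assuming it maps a global subdomain index to its owning rank, a round-robin distribution that returns sub-communicator-local ranks might look like this sketch (the callable shape is an assumption):

```cpp
#include <cassert>
#include <functional>

// Hypothetical stand-in for SubdomainToRankDistributionFunction; the real
// signature may differ. Returns ranks local to the sub-communicator, i.e.
// values in [0, n_subcomm_ranks), never parent-communicator ranks.
std::function<int(int)> make_round_robin(int n_subcomm_ranks) {
    return [n_subcomm_ranks](int global_subdomain_idx) {
        return global_subdomain_idx % n_subcomm_ranks;
    };
}
```

Such a callable would be passed as the subdomain_to_rank argument of create_uniform_on_comm.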
◆ create_uniform_single_subdomain_per_diamond() [1/2]
Creates a DistributedDomain with a single subdomain per diamond and initializes all the subdomain neighborhoods.
◆ create_uniform_single_subdomain_per_diamond() [2/2]
Creates a DistributedDomain with a single subdomain per diamond and initializes all the subdomain neighborhoods.
◆ domain_info()
const DomainInfo & terra::grid::shell::DistributedDomain::domain_info () const  [inline]
Returns a const reference to the wrapped DomainInfo.
◆ set_comm()
void terra::grid::shell::DistributedDomain::set_comm (MPI_Comm comm)  [inline]
Override the MPI communicator (non-owning; caller manages lifetime). Used by MG setup to assign each coarse level its own sub-comm after construction.
◆ subdomain_info_from_local_idx()
◆ subdomains()
The documentation for this class was generated from the following file: spherical_shell.hpp