Dune Core Modules (2.5.2)
mpicollectivecommunication.hh
70 std::shared_ptr<MPI_Op> Generic_MPI_Op<Type,BinaryFunction>::op = std::shared_ptr<MPI_Op>(static_cast<MPI_Op*>(0));
156 DUNE_THROW(ParallelError,"You must call MPIHelper::instance(argc,argv) in your main() function before using the MPI CollectiveCommunication!");
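The exception above is avoided by initializing the MPIHelper first. A minimal sketch, not taken from the header itself, assuming an MPI-enabled Dune 2.5.2 build (variable names are illustrative):

#include <dune/common/parallel/mpihelper.hh>
#include <dune/common/parallel/mpicollectivecommunication.hh>

int main(int argc, char** argv)
{
  // MPIHelper::instance must run before any MPI CollectiveCommunication is used,
  // otherwise the ParallelError above is thrown.
  Dune::MPIHelper::instance(argc, argv);

  Dune::CollectiveCommunication<MPI_Comm> cc;   // defaults to MPI_COMM_WORLD
  double local = 1.0;
  double total = cc.sum(local);                 // simple collective call
  return (total > 0.0) ? 0 : 1;
}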
Various helper classes derived from std::binary_function for STL-style functional programming.
int min(T *inout, int len) const
Compute the minimum of the argument over all processes and return the result in every process.
Definition: mpicollectivecommunication.hh:220
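A short usage sketch of the in-place array overload; the function name and values are made up for illustration:

#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: componentwise minimum over all ranks, computed in place.
void arrayMinSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  double vals[3] = { double(cc.rank()), 2.0, 3.0 };
  cc.min(vals, 3);   // every rank now holds the componentwise minimum
  // sum(), max() and prod() provide the same in-place array overloads
}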
int rank() const
Return the rank of this process; it is between 0 and size()-1.
Definition: mpicollectivecommunication.hh:166
int gather(T *in, T *out, int len, int root) const
Gather arrays on root task.
Definition: mpicollectivecommunication.hh:258
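A possible use of gather(), sketched under the assumption of an MPI-enabled build; the function name and buffers are illustrative only:

#include <vector>
#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: collect one value per rank on the root process.
void gatherSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  int mine = cc.rank();
  std::vector<int> all(cc.size());        // only read on the root
  cc.gather(&mine, all.data(), 1, 0);     // root 0 receives one entry per rank
}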
int allgatherv(T *in, int sendlen, T *out, int *recvlen, int *displ) const
Gathers data of variable length from all tasks and distributes it to all.
Definition: mpicollectivecommunication.hh:310
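A sketch of allgatherv() with hand-built count and displacement arrays; all names and the data layout are illustrative, not part of the Dune API:

#include <vector>
#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: each rank contributes rank+1 values; all ranks get the result.
void allgathervSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  std::vector<double> local(cc.rank() + 1, double(cc.rank()));
  std::vector<int> counts(cc.size()), displ(cc.size());
  for (int p = 0; p < cc.size(); ++p) {
    counts[p] = p + 1;                                      // entries sent by rank p
    displ[p]  = (p == 0) ? 0 : displ[p - 1] + counts[p - 1]; // where they land in the result
  }
  std::vector<double> global(displ.back() + counts.back());
  cc.allgatherv(local.data(), int(local.size()),
                global.data(), counts.data(), displ.data());
}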
T max(T &in) const
Compute the maximum of the argument over all processes and return the result in every process.
Definition: mpicollectivecommunication.hh:228
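A sketch of the scalar overload; the function name is illustrative, and sum(), min() and prod() work the same way:

#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: global maximum of one value per rank.
double scalarMaxSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  double local = double(cc.rank());
  return cc.max(local);   // identical result on every rank
}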
int broadcast(T *inout, int len, int root) const
Distribute an array from the process with rank root to all other processes.
Definition: mpicollectivecommunication.hh:250
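A broadcast sketch with illustrative names and values:

#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: rank 0 fills a small array and distributes it.
void broadcastSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  double data[3] = { 0.0, 0.0, 0.0 };
  if (cc.rank() == 0) { data[0] = 1.0; data[1] = 2.0; data[2] = 3.0; }
  cc.broadcast(data, 3, 0);   // afterwards every rank holds {1, 2, 3}
}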
int allreduce(Type *inout, int len) const
Compute something over all processes for each component of an array and return the result in every process.
Definition: mpicollectivecommunication.hh:319
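A sketch of both allreduce() overloads, assuming std::plus<double> is an admissible binary function as suggested by the std::binary_function helpers mentioned above; the function name is illustrative:

#include <functional>
#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: componentwise reduction with an explicit binary function.
void allreduceSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  double inout[2] = { 1.0, 2.0 };
  cc.allreduce<std::plus<double>>(inout, 2);     // in-place componentwise sum

  double in[2] = { 3.0, 4.0 }, out[2];
  cc.allreduce<std::plus<double>>(in, out, 2);   // variant with a separate output buffer
}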
T prod(T &in) const
Compute the product of the argument over all processes and return the result in every process.
Definition: mpicollectivecommunication.hh:195
int scatterv(T *send, int *sendlen, int *displ, T *recv, int recvlen, int root) const
Scatter arrays of variable length from a root to all other tasks.
Definition: mpicollectivecommunication.hh:286
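A scatterv() sketch with per-rank counts and displacements; the data layout and names are made up for illustration:

#include <vector>
#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: the root sends rank+1 values to each rank.
void scattervSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  const int root = 0;
  std::vector<int> counts(cc.size()), displ(cc.size());
  for (int p = 0; p < cc.size(); ++p) {
    counts[p] = p + 1;
    displ[p]  = (p == 0) ? 0 : displ[p - 1] + counts[p - 1];
  }
  std::vector<double> send;                           // only filled on the root
  if (cc.rank() == root)
    send.assign(displ.back() + counts.back(), 42.0);
  std::vector<double> recv(cc.rank() + 1);
  cc.scatterv(send.data(), counts.data(), displ.data(),
              recv.data(), int(recv.size()), root);
}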
int scatter(T *send, T *recv, int len, int root) const
Scatter an array from a root to all other tasks.
Definition: mpicollectivecommunication.hh:277
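A scatter() sketch with illustrative names and values:

#include <vector>
#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: the root hands one entry to every rank.
void scatterSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  std::vector<int> send;                      // only significant on the root
  if (cc.rank() == 0)
    for (int p = 0; p < cc.size(); ++p)
      send.push_back(10 * p);
  int mine = -1;
  cc.scatter(send.data(), &mine, 1, 0);       // rank p receives 10*p
}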
int size() const
Number of processes in the set; it is greater than 0.
Definition: mpicollectivecommunication.hh:172
CollectiveCommunication(const MPI_Comm &c=MPI_COMM_WORLD)
Instantiation using an MPI communicator.
Definition: mpicollectivecommunication.hh:149
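A construction sketch, assuming MPIHelper::instance() has already been called as shown earlier; the function name is illustrative:

#include <iostream>
#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: construct from an explicit communicator and query rank/size.
void infoSketch()
{
  Dune::CollectiveCommunication<MPI_Comm> cc(MPI_COMM_WORLD);  // also the default argument
  if (cc.rank() == 0)
    std::cout << "running on " << cc.size() << " processes\n";
}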
T min(T &in) const
Compute the minimum of the argument over all processes and return the result in every process.
Definition: mpicollectivecommunication.hh:211
int max(T *inout, int len) const
Compute the maximum of the argument over all processes and return the result in every process.
Definition: mpicollectivecommunication.hh:237
int barrier() const
Wait until all processes have arrived at this point in the program.
Definition: mpicollectivecommunication.hh:243
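A barrier() sketch illustrating a typical synchronization point; the file-I/O scenario is hypothetical:

#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: synchronize before rank 0 reads files written by the other ranks.
void barrierSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  // ... every rank writes its own output file here ...
  cc.barrier();              // no rank continues until all have arrived
  // ... rank 0 may now safely read all files ...
}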
int allreduce(Type *in, Type *out, int len) const
Compute something over all processes for each component of an array and return the result in every process.
Definition: mpicollectivecommunication.hh:330
T sum(T &in) const
Compute the sum of the argument over all processes and return the result in every process.
Definition: mpicollectivecommunication.hh:179
int allgather(T *sbuf, int count, T1 *rbuf) const
Gathers data from all tasks and distributes it to all.
Definition: mpicollectivecommunication.hh:301
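An allgather() sketch with illustrative names:

#include <vector>
#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: every rank ends up with one value from every other rank.
void allgatherSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  int mine = cc.rank();
  std::vector<int> all(cc.size());
  cc.allgather(&mine, 1, all.data());   // afterwards all[p] == p on every rank
}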
int prod(T *inout, int len) const
Compute the product of the argument over all processes and return the result in every process.
Definition: mpicollectivecommunication.hh:204
int sum(T *inout, int len) const
Compute the sum of the argument over all processes and return the result in every process.
Definition: mpicollectivecommunication.hh:188
int gatherv(T *in, int sendlen, T *out, int *recvlen, int *displ, int root) const
Gather arrays of variable size on root task.
Definition: mpicollectivecommunication.hh:267
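A gatherv() sketch, analogous to the allgatherv() sketch above but with a receive buffer that only the root needs; all names are illustrative:

#include <vector>
#include <dune/common/parallel/mpicollectivecommunication.hh>

// Illustrative sketch: variable-length contributions collected on the root.
void gathervSketch(const Dune::CollectiveCommunication<MPI_Comm>& cc)
{
  const int root = 0;
  std::vector<double> local(cc.rank() + 1, 1.0);        // rank p sends p+1 values
  std::vector<int> counts(cc.size()), displ(cc.size());
  for (int p = 0; p < cc.size(); ++p) {
    counts[p] = p + 1;
    displ[p]  = (p == 0) ? 0 : displ[p - 1] + counts[p - 1];
  }
  std::vector<double> global;
  if (cc.rank() == root)
    global.resize(displ.back() + counts.back());        // receive buffer on the root only
  cc.gatherv(local.data(), int(local.size()),
             global.data(), counts.data(), displ.data(), root);
}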
Collective communication interface and sequential default implementation.
Definition: collectivecommunication.hh:79
Default exception if an error in the parallel communication of the program occurred.
Definition: exceptions.hh:285
Implements a utility class that provides collective communication methods for sequential programs.
A few common exception classes.
Traits classes for mapping types onto MPI_Datatype.
A traits class describing the mapping of types onto MPI_Datatypes.
Definition: mpitraits.hh:39
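A sketch of the traits class in use, assuming its static getType() accessor; the function name is illustrative:

#include <mpi.h>
#include <dune/common/parallel/mpitraits.hh>

// Illustrative sketch: obtain the MPI_Datatype matching a C++ type,
// e.g. for a hand-written MPI call next to the Dune wrappers.
MPI_Datatype doubleType()
{
  return Dune::MPITraits<double>::getType();   // MPI_DOUBLE for built-in double
}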