<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<HTML>
<HEAD> <link rel="canonical" href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAIJSumSeqAIJ.html" />
<META NAME="GENERATOR" CONTENT="DOCTEXT">
<TITLE>MatCreateMPIAIJSumSeqAIJ</TITLE>
</HEAD>
<BODY BGCOLOR="#FFFFFF">
<div id="version" align=right><b>petsc-3.14.5 2021-03-03</b></div>
<div id="bugreport" align=right><a href="mailto:petsc-maint@mcs.anl.gov?subject=Typo or Error in Documentation &body=Please describe the typo or error in the documentation: petsc-3.14.5 v3.14.5 docs/manualpages/Mat/MatCreateMPIAIJSumSeqAIJ.html "><small>Report Typos and Errors</small></a></div>
<A NAME="MatCreateMPIAIJSumSeqAIJ"><H1>MatCreateMPIAIJSumSeqAIJ</H1></A>
Creates a <A HREF="../Mat/MATMPIAIJ.html#MATMPIAIJ">MATMPIAIJ</A> matrix by adding sequential matrices from each processor
<H3><FONT COLOR="#CC3333">Synopsis</FONT></H3>
<PRE>
#include "petscmat.h"
<A HREF="../Sys/PetscErrorCode.html#PetscErrorCode">PetscErrorCode</A> <A HREF="../Mat/MatCreateMPIAIJSumSeqAIJ.html#MatCreateMPIAIJSumSeqAIJ">MatCreateMPIAIJSumSeqAIJ</A>(<A HREF="../Sys/MPI_Comm.html#MPI_Comm">MPI_Comm</A> comm,<A HREF="../Mat/Mat.html#Mat">Mat</A> seqmat,<A HREF="../Sys/PetscInt.html#PetscInt">PetscInt</A> m,<A HREF="../Sys/PetscInt.html#PetscInt">PetscInt</A> n,<A HREF="../Mat/MatReuse.html#MatReuse">MatReuse</A> scall,<A HREF="../Mat/Mat.html#Mat">Mat</A> *mpimat)
</PRE>
Collective
<P>
<H3><FONT COLOR="#CC3333">Input Parameters</FONT></H3>
<TABLE border="0" cellpadding="0" cellspacing="0">
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>comm </B></TD><TD>- the communicator the parallel matrix will live on
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>seqmat </B></TD><TD>- the input sequential matrix on each processor
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>m </B></TD><TD>- number of local rows (or <A HREF="../Sys/PETSC_DECIDE.html#PETSC_DECIDE">PETSC_DECIDE</A>)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>n </B></TD><TD>- number of local columns (or <A HREF="../Sys/PETSC_DECIDE.html#PETSC_DECIDE">PETSC_DECIDE</A>)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>scall </B></TD><TD>- either <A HREF="../Mat/MatReuse.html#MatReuse">MAT_INITIAL_MATRIX</A> or <A HREF="../Mat/MatReuse.html#MatReuse">MAT_REUSE_MATRIX</A>
</TD></TR></TABLE>
<P>
<H3><FONT COLOR="#CC3333">Output Parameter</FONT></H3>
<TABLE border="0" cellpadding="0" cellspacing="0">
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>mpimat </B></TD><TD>- the parallel matrix generated
</TD></TR></TABLE>
<P>
<H3><FONT COLOR="#CC3333">Notes</FONT></H3>
The dimensions of the sequential matrix on each processor MUST be the same.
The input seqmat is stored in the container "Mat_Merge_SeqsToMPI" and will be
destroyed when mpimat is destroyed. Call <A HREF="../Sys/PetscObjectQuery.html#PetscObjectQuery">PetscObjectQuery</A>() to access seqmat.
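<P>
<H3><FONT COLOR="#CC3333">Example Usage</FONT></H3>
A minimal sketch of one possible usage pattern. The 4x4 size and the
rank-dependent diagonal values are illustrative assumptions, not part of the
interface; note that seqmat is not destroyed explicitly, per the ownership
rule in the Notes above.
<PRE>
#include "petscmat.h"

int main(int argc, char **argv)
{
  Mat            seqmat, mpimat;
  PetscMPIInt    rank;
  PetscInt       i;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&amp;argc, &amp;argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &amp;rank);CHKERRQ(ierr);

  /* Each rank builds its own sequential AIJ matrix; the matrices on all
     ranks must have the same dimensions (here 4 x 4) */
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, 4, 4, 1, NULL, &amp;seqmat);CHKERRQ(ierr);
  for (i = 0; i &lt; 4; i++) {
    ierr = MatSetValue(seqmat, i, i, (PetscScalar)(rank + 1), INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(seqmat, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(seqmat, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Sum the per-rank matrices into a single parallel MATMPIAIJ matrix */
  ierr = MatCreateMPIAIJSumSeqAIJ(PETSC_COMM_WORLD, seqmat, PETSC_DECIDE, PETSC_DECIDE,
                                  MAT_INITIAL_MATRIX, &amp;mpimat);CHKERRQ(ierr);

  /* seqmat now belongs to mpimat and is destroyed along with it;
     do NOT call MatDestroy() on seqmat separately */
  ierr = MatDestroy(&amp;mpimat);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
</PRE>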
<P><H3><FONT COLOR="#CC3333">Level</FONT></H3>advanced<BR>
<H3><FONT COLOR="#CC3333">Location</FONT></H3>
<A HREF="../../../src/mat/impls/aij/mpi/mpiaij.c.html#MatCreateMPIAIJSumSeqAIJ">src/mat/impls/aij/mpi/mpiaij.c</A>
<BR><A HREF="./index.html">Index of all Mat routines</A>
<BR><A HREF="../../index.html">Table of Contents for all manual pages</A>
<BR><A HREF="../singleindex.html">Index of all manual pages</A>
</BODY></HTML>