File: MatCreateMPIAIJSumSeqAIJ.html

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<HTML>
<HEAD> <link rel="canonical" href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAIJSumSeqAIJ.html" />
<META NAME="GENERATOR" CONTENT="DOCTEXT">
<TITLE>MatCreateMPIAIJSumSeqAIJ</TITLE>
</HEAD>
<BODY BGCOLOR="#FFFFFF">
   <div id="version" align=right><b>petsc-3.4.2 2013-07-02</b></div>
<A NAME="MatCreateMPIAIJSumSeqAIJ"><H1>MatCreateMPIAIJSumSeqAIJ</H1></A>
Creates an MPIAIJ matrix by summing the sequential matrices contributed by each process 
<H3><FONT COLOR="#CC3333">Synopsis</FONT></H3>
<PRE>
#include "petscmat.h" 
PetscErrorCode  MatCreateMPIAIJSumSeqAIJ(MPI_Comm comm,Mat seqmat,PetscInt m,PetscInt n,MatReuse scall,Mat *mpimat)
</PRE>
Collective on <A HREF="../Sys/MPI_Comm.html#MPI_Comm">MPI_Comm</A>
<P>
<H3><FONT COLOR="#CC3333">Input Parameters</FONT></H3>
<TABLE border="0" cellpadding="0" cellspacing="0">
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>comm </B></TD><TD>- the communicator on which the parallel matrix will live
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>seqmat </B></TD><TD>- the input sequential matrix (one per process)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>m </B></TD><TD>- number of local rows (or <A HREF="../Sys/PETSC_DECIDE.html#PETSC_DECIDE">PETSC_DECIDE</A>)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>n </B></TD><TD>- number of local columns (or <A HREF="../Sys/PETSC_DECIDE.html#PETSC_DECIDE">PETSC_DECIDE</A>)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>scall </B></TD><TD>- either MAT_INITIAL_MATRIX or MAT_REUSE_MATRIX
</TD></TR></TABLE>
<P>
<H3><FONT COLOR="#CC3333">Output Parameter</FONT></H3>
<DT><B>mpimat </B> - the parallel matrix generated
<br>
<P>
<H3><FONT COLOR="#CC3333">Notes</FONT></H3>
The dimensions of the sequential matrix must be the same on every process.
The input seqmat is attached to the container "Mat_Merge_SeqsToMPI" inside mpimat and is
destroyed when mpimat is destroyed. Call <A HREF="../Sys/PetscObjectQuery.html#PetscObjectQuery">PetscObjectQuery</A>() to access seqmat.
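<P>
<H3><FONT COLOR="#CC3333">Example Usage</FONT></H3>
A minimal sketch (not from the original manual page): each process assembles a small
sequential diagonal matrix, and the per-process matrices are then summed into a single
parallel matrix. The 4x4 size and the values are illustrative only, and error handling
is abbreviated.
<PRE>
#include "petscmat.h"

int main(int argc,char **argv)
{
  Mat            seqmat,mpimat;
  PetscInt       i,n = 4;
  PetscScalar    one = 1.0;
  PetscErrorCode ierr;

  PetscInitialize(&argc,&argv,(char*)0,(char*)0);

  /* every process builds a sequential matrix with the SAME dimensions (see Notes) */
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF,n,n,1,NULL,&seqmat);CHKERRQ(ierr);
  for (i=0; i&lt;n; i++) {
    ierr = MatSetValues(seqmat,1,&i,1,&i,&one,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(seqmat,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(seqmat,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* sum the per-process matrices into one parallel MPIAIJ matrix;
     PETSC_DECIDE lets PETSc choose the local row/column split */
  ierr = MatCreateMPIAIJSumSeqAIJ(PETSC_COMM_WORLD,seqmat,PETSC_DECIDE,PETSC_DECIDE,
                                  MAT_INITIAL_MATRIX,&mpimat);CHKERRQ(ierr);
  ierr = MatView(mpimat,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  /* do NOT destroy seqmat here: per the Notes above it is destroyed
     together with mpimat */
  ierr = MatDestroy(&mpimat);CHKERRQ(ierr);
  PetscFinalize();
  return 0;
}
</PRE>
After the call, mpimat holds the entry-wise sum of the per-process matrices, so with p
processes each diagonal entry above would equal p. If the numerical values of seqmat
change later (with an unchanged nonzero pattern), the summation can be repeated by
calling the routine again with scall set to MAT_REUSE_MATRIX and the existing mpimat.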
<P><B><FONT COLOR="#CC3333">Level:</FONT></B> advanced
<BR><B><FONT COLOR="#CC3333">Location:</FONT></B> <A HREF="../../../src/mat/impls/aij/mpi/mpiaij.c.html#MatCreateMPIAIJSumSeqAIJ">src/mat/impls/aij/mpi/mpiaij.c</A>
<BR><A HREF="./index.html">Index of all Mat routines</A>
<BR><A HREF="../../index.html">Table of Contents for all manual pages</A>
<BR><A HREF="../singleindex.html">Index of all manual pages</A>
</BODY></HTML>