<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<HTML>
<HEAD> <link rel="canonical" href="http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIMatConcatenateSeqMat.html" />
<META NAME="GENERATOR" CONTENT="DOCTEXT">
<TITLE>MatCreateMPIMatConcatenateSeqMat</TITLE>
</HEAD>
<BODY BGCOLOR="#FFFFFF">
   <div id="version" align=right><b>petsc-3.7.5 2017-01-01</b></div>
   <div id="bugreport" align=right><a href="mailto:petsc-maint@mcs.anl.gov?subject=Typo or Error in Documentation &body=Please describe the typo or error in the documentation: petsc-3.7.5 v3.7.5 docs/manualpages/Mat/MatCreateMPIMatConcatenateSeqMat.html "><small>Report Typos and Errors</small></a></div>
<A NAME="MatCreateMPIMatConcatenateSeqMat"><H1>MatCreateMPIMatConcatenateSeqMat</H1></A>
Creates a single large PETSc matrix by concatenating sequential matrices from each processor 
<H3><FONT COLOR="#CC3333">Synopsis</FONT></H3>
<PRE>
#include "petscmat.h" 
PetscErrorCode MatCreateMPIMatConcatenateSeqMat(MPI_Comm comm,Mat seqmat,PetscInt n,MatReuse reuse,Mat *mpimat)
</PRE>
Collective on <A HREF="../Sys/MPI_Comm.html#MPI_Comm">MPI_Comm</A>
<P>
<H3><FONT COLOR="#CC3333">Input Parameters</FONT></H3>
<TABLE border="0" cellpadding="0" cellspacing="0">
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>comm </B></TD><TD>- the communicator the parallel matrix will live on
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>seqmat </B></TD><TD>- the input sequential matrix (one per process)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>n </B></TD><TD>- number of local columns (or <A HREF="../Sys/PETSC_DECIDE.html#PETSC_DECIDE">PETSC_DECIDE</A>)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>reuse </B></TD><TD>- either MAT_INITIAL_MATRIX or MAT_REUSE_MATRIX
</TD></TR></TABLE>
<P>
<H3><FONT COLOR="#CC3333">Output Parameter</FONT></H3>
<DT><B>mpimat </B> - the parallel matrix generated
<br>
<P>

<P>
Notes: The sequential matrix on EACH process MUST have the same number of columns.
<P>
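A minimal usage sketch (illustrative only, not from the PETSc distribution): each process assembles its own small <A HREF="../Mat/MATSEQAIJ.html#MATSEQAIJ">MATSEQAIJ</A> matrix with an identical column count, and the pieces are then stacked row-wise into one parallel matrix.
<PRE>
#include "petscmat.h"

int main(int argc,char **argv)
{
  Mat            seqmat,mpimat;
  PetscInt       i;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&amp;argc,&amp;argv,NULL,NULL);CHKERRQ(ierr);

  /* Each process builds a 2x4 sequential matrix; the column count (4)
     must be the same on every process */
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF,2,4,4,NULL,&amp;seqmat);CHKERRQ(ierr);
  for (i=0; i&lt;2; i++) {
    ierr = MatSetValue(seqmat,i,i,1.0,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(seqmat,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(seqmat,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Concatenate the per-process matrices into one parallel matrix;
     PETSC_DECIDE lets PETSc choose the local column layout */
  ierr = MatCreateMPIMatConcatenateSeqMat(PETSC_COMM_WORLD,seqmat,PETSC_DECIDE,MAT_INITIAL_MATRIX,&amp;mpimat);CHKERRQ(ierr);
  ierr = MatView(mpimat,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  ierr = MatDestroy(&amp;seqmat);CHKERRQ(ierr);
  ierr = MatDestroy(&amp;mpimat);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
</PRE>
With MAT_REUSE_MATRIX, pass a previously generated *mpimat (with the same layout) to refill it in place instead of allocating a new matrix.
<P>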
<P><B><FONT COLOR="#CC3333">Level:</FONT></B> advanced
<BR><B><FONT COLOR="#CC3333">Location:</FONT></B> <A HREF="../../../src/mat/interface/matrix.c.html#MatCreateMPIMatConcatenateSeqMat">src/mat/interface/matrix.c</A>
<BR><A HREF="./index.html">Index of all Mat routines</A>
<BR><A HREF="../../index.html">Table of Contents for all manual pages</A>
<BR><A HREF="../singleindex.html">Index of all manual pages</A>
</BODY></HTML>