File: MatMerge_SeqsToMPI.html

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML3.2 EN">
<HTML>
<HEAD>
<META NAME="GENERATOR" CONTENT="DOCTEXT">
<TITLE>MatMerge_SeqsToMPI</TITLE>
</HEAD>
<BODY BGCOLOR="FFFFFF">
<A NAME="MatMerge_SeqsToMPI"><H1>MatMerge_SeqsToMPI</H1></A>
Creates an MPIAIJ matrix by adding together the sequential matrices from each processor
<H3><FONT COLOR="#CC3333">Synopsis</FONT></H3>
<PRE>
#include "petscmat.h" 
PetscErrorCode  MatMerge_SeqsToMPI(MPI_Comm comm,Mat seqmat,PetscInt m,PetscInt n,MatReuse scall,Mat *mpimat)
</PRE>
Collective on <A HREF="../Sys/MPI_Comm.html#MPI_Comm">MPI_Comm</A>
<P>
<H3><FONT COLOR="#CC3333">Input Parameters</FONT></H3>
<TABLE border="0" cellpadding="0" cellspacing="0">
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>comm </B></TD><TD>- the communicator the parallel matrix will live on
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>seqmat </B></TD><TD>- the input sequential matrix (one per process)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>m </B></TD><TD>- number of local rows (or <A HREF="../Sys/PETSC_DECIDE.html#PETSC_DECIDE">PETSC_DECIDE</A>)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>n </B></TD><TD>- number of local columns (or <A HREF="../Sys/PETSC_DECIDE.html#PETSC_DECIDE">PETSC_DECIDE</A>)
</TD></TR>
<TR><TD WIDTH=40></TD><TD ALIGN=LEFT VALIGN=TOP><B>scall </B></TD><TD>- either MAT_INITIAL_MATRIX or MAT_REUSE_MATRIX
</TD></TR></TABLE>
<P>
<H3><FONT COLOR="#CC3333">Output Parameter</FONT></H3>
<DT><B>mpimat </B> - the parallel matrix generated
<br>
<P>
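<P>
<H3><FONT COLOR="#CC3333">Example Usage</FONT></H3>
A minimal calling sketch (not taken from the PETSc sources; it assumes the synopsis above, uses the hypothetical variable names <TT>Aseq</TT> and <TT>Ampi</TT>, and elides the assembly of the sequential matrix):
<PRE>
#include "petscmat.h"

Mat            Aseq;   /* sequential (MATSEQAIJ) matrix assembled on this process          */
Mat            Ampi;   /* parallel MPIAIJ result                                           */
PetscErrorCode ierr;

/* ... create and assemble Aseq with the same global dimensions on every process ... */

/* sum the per-process sequential matrices into one parallel MPIAIJ matrix */
ierr = MatMerge_SeqsToMPI(PETSC_COMM_WORLD,Aseq,PETSC_DECIDE,PETSC_DECIDE,MAT_INITIAL_MATRIX,&Ampi);CHKERRQ(ierr);

/* do NOT destroy Aseq yourself: it is now owned by Ampi (see Notes below) */
ierr = MatDestroy(&Ampi);CHKERRQ(ierr);   /* destroys Aseq as well */
</PRE>
<P>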

<P>
<H3><FONT COLOR="#CC3333">Notes</FONT></H3>
The dimensions of the sequential matrix MUST be the same on every processor.
The input seqmat is attached to mpimat in the container "Mat_Merge_SeqsToMPI" and will be
destroyed when mpimat is destroyed. Call <A HREF="../Sys/PetscObjectQuery.html#PetscObjectQuery">PetscObjectQuery</A>() to access seqmat.
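<P>
A hedged sketch of querying the container mentioned above (continuing the hypothetical <TT>Ampi</TT> from the example; the container name follows these Notes, but the layout of its payload is internal to the implementation, so only the query itself is shown):
<PRE>
PetscContainer container = PETSC_NULL;
ierr = PetscObjectQuery((PetscObject)Ampi,"Mat_Merge_SeqsToMPI",(PetscObject *)&container);CHKERRQ(ierr);
if (container) {
  void *merge;   /* private merge data that keeps a reference to seqmat */
  ierr = PetscContainerGetPointer(container,&merge);CHKERRQ(ierr);
  /* treat seqmat as owned by Ampi; do not destroy it directly */
}
</PRE>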
<P><B><FONT COLOR="#CC3333">Level:</FONT></B> advanced
<BR><B><FONT COLOR="#CC3333">Location:</FONT></B> <A HREF="../../../src/mat/impls/aij/mpi/mpiaij.c.html#MatMerge_SeqsToMPI">src/mat/impls/aij/mpi/mpiaij.c</A>
<BR><A HREF="./index.html">Index of all Mat routines</A>
<BR><A HREF="../../index.html">Table of Contents for all manual pages</A>
<BR><A HREF="../singleindex.html">Index of all manual pages</A>
</BODY></HTML>