name: build
on:
push:
paths-ignore:
- '**/*.md'
- '**/*.rst'
pull_request:
paths-ignore:
- '**/*.md'
- '**/*.rst'
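# Workflow-wide environment: paths for the generated prerequisites/Spark/Hadoop
# env scripts, install prefixes, and pinned dependency versions.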
env:
PREREQS_ENV: ${{github.workspace}}/prereqs.sh
SPARK_ENV: ${{github.workspace}}/spark_env.sh
HADOOP_ENV: ${{github.workspace}}/hadoop_env.sh
PREREQS_INSTALL_DIR: ${{github.workspace}}/prereqs
PROTOBUF_VERSION: 3.21.7
CMAKE_INSTALL_PREFIX: ${{github.workspace}}/install
GENOMICSDB_BUILD_DIR: ${{github.workspace}}/build
GENOMICSDB_RELEASE_VERSION: x.y.z.test
HADOOP_VER: 3.3.5
SPARK_VER: 3.4.0
SPARK_HADOOP_VER: 3
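# One build job fanned out over an OS matrix: basic builds on Ubuntu 20.04/22.04
# and macOS 13, plus an additional HDFS-enabled build on Ubuntu 22.04.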
jobs:
build:
strategy:
fail-fast: false
matrix:
os: [ubuntu-20.04, ubuntu-22.04, macos-13]
type: [basic]
java: [17]
include:
- os: ubuntu-22.04
type: hdfs
java: 17
env:
OS_TYPE: ${{matrix.os}}
JAVA_VER: ${{matrix.java}}
runs-on: ${{matrix.os}}
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: 3.11
cache: 'pip'
- uses: actions/setup-java@v4
with:
java-version: ${{matrix.java}}
distribution: temurin
java-package: jdk
cache: maven
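# Cache the locally built prerequisites (plus the AWS/GCS SDK and protobuf
# installs), keyed on OS and protobuf version, so they are not rebuilt on every run.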
- name: Cache built prerequisites
uses: actions/cache@v4
with:
path: |
${{env.PREREQS_INSTALL_DIR}}
~/awssdk-install
~/gcssdk-install
~/protobuf-install/${{env.PROTOBUF_VERSION}}
key: ${{matrix.os}}-cache-prereqs-${{env.PROTOBUF_VERSION}}-v1
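# The Hadoop distribution is only needed for the hdfs matrix leg, so it is
# cached (and later installed) only there.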
- name: Cache Distributed FileSystems
if: matrix.type == 'hdfs'
uses: actions/cache@v4
with:
path: ${{runner.workspace}}/hadoop-${{env.HADOOP_VER}}
key: ${{matrix.os}}-dfs-${{env.HADOOP_VER}}
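# install_prereqs.sh runs under sudo on Linux for system package installs, but
# not on macOS, where it presumably relies on Homebrew and must not run as root.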
- name: Install Prerequisites
shell: bash
working-directory: ${{github.workspace}}/scripts/prereqs
run: |
$GITHUB_WORKSPACE/.github/scripts/cleanup_hosts.sh
if [[ ${{matrix.os}} == macos* ]]; then
echo "Installing Prerequistes for MacOS..."
INSTALL_PREFIX=$PREREQS_INSTALL_DIR PREREQS_ENV=$PREREQS_ENV ./install_prereqs.sh
else
echo "Install Prerequisites for Linux.."
sudo INSTALL_PREFIX=$PREREQS_INSTALL_DIR PREREQS_ENV=$PREREQS_ENV ./install_prereqs.sh
fi
echo "cat prereqs env..."
cat $PREREQS_ENV
echo "cat prereqs env DONE"
- name: Install spark and hadoop dependencies
if: matrix.type == 'hdfs'
shell: bash
working-directory: ${{github.workspace}}
run: |
source $GITHUB_WORKSPACE/.github/scripts/install_hadoop.sh
# Spark needs JAVA_HOME to be the jdk path
JAVA_HOME=$(dirname $(dirname $(readlink -f $(which javac)))) SPARK_ENV=$SPARK_ENV source $GITHUB_WORKSPACE/.github/scripts/install_spark.sh
env:
INSTALL_DIR: ${{runner.workspace}}
- name: Create Build Directory
shell: bash
run: mkdir -p $GENOMICSDB_BUILD_DIR
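# Configure an out-of-source Coverage build; BUILD_JAVA, USE_HDFS and
# BUILD_NANOARROW are switched on conditionally per the matrix/branch logic below.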
- name: Configure CMake Build
shell: bash
working-directory: ${{env.GENOMICSDB_BUILD_DIR}}
run: |
source $PREREQS_ENV
# Java tests take a very long time on macOS, so on macOS only enable the Java build (and its tests) for master/develop pushes and PRs targeting those branches
BRANCH=${GITHUB_REF##*/}
if [[ ${{matrix.os}} != macos* || $BRANCH == master || $BRANCH == develop || $GITHUB_BASE_REF == master || $GITHUB_BASE_REF == develop ]]; then
echo "cmake BUILD_JAVA set to 1"
JAVA_BUILD_ARGS="-DBUILD_JAVA=1"
fi
if [[ ${{matrix.type}} == 'hdfs' ]]; then
HDFS_BUILD_ARGS="-DUSE_HDFS=1"
fi
if [[ ${{matrix.os}} != 'ubuntu-20.04' ]]; then
NANOARROW_BUILD_ARGS="-DBUILD_NANOARROW=1"
fi
cmake $GITHUB_WORKSPACE -DCMAKE_BUILD_TYPE=Coverage -DCMAKE_INSTALL_PREFIX=$CMAKE_INSTALL_PREFIX \
-DCMAKE_PREFIX_PATH=$PREREQS_INSTALL_DIR -DGENOMICSDB_PROTOBUF_VERSION=$PROTOBUF_VERSION \
-DGENOMICSDB_RELEASE_VERSION=$GENOMICSDB_RELEASE_VERSION $JAVA_BUILD_ARGS $HDFS_BUILD_ARGS $NANOARROW_BUILD_ARGS
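# Sourcing PREREQS_ENV is assumed to put the prerequisite toolchain and
# libraries installed above into the build environment.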
- name: Build
working-directory: ${{env.GENOMICSDB_BUILD_DIR}}
shell: bash
run: |
source $PREREQS_ENV
make -j4
make install
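# Basic matrix legs run the ctest suite via `make test`; jsondiff is presumably
# required by the Python-based test harness.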
- name: Test
shell: bash
working-directory: ${{env.GENOMICSDB_BUILD_DIR}}
if: matrix.type == 'basic'
run: |
python -m pip install --upgrade pip
python -m pip install jsondiff
make test ARGS=-V
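# hdfs leg only: exercises the Spark/HDFS integration tests against a local
# HDFS at hdfs://localhost:9000 and checks htslib's HDFS support.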
- name: Test - Distributed FileSystems
if: matrix.type == 'hdfs'
shell: bash
working-directory: ${{github.workspace}}
run: |
python -m pip install --upgrade pip
python -m pip install jsondiff
source $SPARK_ENV
python tests/run_spark_hdfs.py $GENOMICSDB_BUILD_DIR $CMAKE_INSTALL_PREFIX local hdfs://localhost:9000/ client $GENOMICSDB_RELEASE_VERSION $GITHUB_WORKSPACE/tests "" Coverage
$GITHUB_WORKSPACE/.github/scripts/test_hdfs_htslib_support.sh
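# Capture C++ coverage with lcov and strip system, dependency, generated
# protobuf, and test sources from the report.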
- name: Coverage
shell: bash
working-directory: ${{env.GENOMICSDB_BUILD_DIR}}
run: |
lcov --directory . --capture --output-file coverage.info
lcov --remove coverage.info '/opt*' '/usr*' '*/dependencies/*' '*/src/test*' '*.pb.h' '*.pb.cc' '*/protobuf-install/*' '*/awssdk-install/*' '*/gcssdk-install/*' 'v1/*' '/Library/*' -o coverage.info
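# Upload the lcov report plus the Java JaCoCo reports (when the Java build ran)
# to Codecov using the repository's CODECOV_TOKEN secret.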
- name: Upload Coverage to CodeCov
uses: codecov/codecov-action@v4
with:
files: build/coverage.info,build/target/jacoco-reports/jacoco-ut/jacoco.xml,build/target/jacoco-reports/jacoco-ci/jacoco-ci.xml
verbose: true
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}