For Software Developers

Hail is an open-source project. We welcome contributions to the repository. If you’re interested in contributing to Hail, you will need to build your own Hail JAR and set up the testing environment.

Requirements

You’ll need:

  • The Java 8 JDK
  • Spark 2.2.0
  • Anaconda for Python 3
  • A C++ compiler and lz4 (see Building a Hail JAR below)

Building a Hail JAR

To build Hail from source, you will need a C++ compiler and lz4. Debian users might try:

sudo apt-get install g++ liblz4-dev

On Mac OS X, you might try:

xcode-select --install
brew install lz4

The Hail source code is hosted on GitHub:

git clone https://github.com/hail-is/hail.git
cd hail/hail

Build a Hail JAR compatible with Spark 2.2.0:

./gradlew -Dspark.version=2.2.0 shadowJar
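
If the build succeeds, the JAR is written to build/libs/hail-all-spark.jar (the same path referenced by SPARK_CLASSPATH below); a quick way to confirm:

ls build/libs/hail-all-spark.jar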

Environment Variables and Conda Environments

You will need to set some environment variables so that Hail can find Spark, Spark can find Hail, and Python can find Hail. Add these lines to your .bashrc or equivalent, setting SPARK_HOME to the root directory of a Spark installation and HAIL_HOME to the root of the Hail repository:

export SPARK_HOME=/path/to/spark
export HAIL_HOME=/path/to/hail/hail
export PYTHONPATH="$PYTHONPATH:$HAIL_HOME/python:$SPARK_HOME/python:`echo $SPARK_HOME/python/lib/py4j*-src.zip`"
export SPARK_CLASSPATH=$HAIL_HOME/build/libs/hail-all-spark.jar
export PYSPARK_SUBMIT_ARGS="--conf spark.driver.extraClassPath=$SPARK_CLASSPATH --conf spark.executor.extraClassPath=$SPARK_CLASSPATH --driver-memory 8G pyspark-shell"
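
After reloading your shell configuration, a quick sanity check (a hypothetical snippet, not part of the official setup) is to confirm that these variables point to paths that actually exist:

# Each command should list an existing path; an error means the
# corresponding variable points to the wrong location.
ls -d "$SPARK_HOME" "$HAIL_HOME"
ls "$SPARK_CLASSPATH"   # present once you have built the Hail JAR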

Hail uses conda environments to manage some of Hail’s Python dependencies. First, create a conda environment for Hail:

conda env create -f ./python/hail/dev-environment.yml

Activate the environment:

source activate hail

Now the shell prompt should include the name of the environment, in this case “hail”.
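
You can also list your conda environments to confirm which one is active; the active environment is marked with an asterisk:

conda env list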

Now you can import Hail from a Python interpreter:

$ python
Python 3.6.5 |Anaconda, Inc.| (default, Mar 29 2018, 13:14:23)
[GCC 4.2.1 Compatible Clang 4.0.1 (tags/RELEASE_401/final)] on darwin
Type "help", "copyright", "credits" or "license" for more information.

>>> import hail as hl

>>> hl.init()
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Running on Apache Spark version 2.2.0
SparkUI available at http://10.1.6.36:4041
Welcome to
     __  __     <>__
    / /_/ /__  __/ /
   / __  / _ `/ / /
  /_/ /_/\_,_/_/_/   version devel-9f866ba
NOTE: This is a beta version. Interfaces may change
  during the beta period. We also recommend pulling
  the latest changes weekly.

>>>

When you are finished developing Hail, deactivate the environment:

source deactivate

The dev-environment.yml file may change without warning; therefore, after pulling new changes from the remote repository, we recommend always updating the conda environment:

conda env update -n hail -f ./python/hail/dev-environment.yml
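
For example, a typical sync with upstream might look like this (assuming the main repository is your origin remote):

git pull origin master
conda env update -n hail -f ./python/hail/dev-environment.yml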

Building the Docs

Within the “hail” environment, run the makeDocs Gradle task:

./gradlew makeDocs

The generated docs are located at ./build/www/docs/0.2/index.html.
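
To view them, open that file in a browser; for example:

open ./build/www/docs/0.2/index.html        # Mac OS X
xdg-open ./build/www/docs/0.2/index.html    # Linux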

Running the tests

Several Hail tests have additional dependencies:

  • PLINK 1.9

  • QCTOOL 1.4

  • R 3.3.4 with the CRAN packages jsonlite, SKAT, and logistf, as well as the pcrelate function from the GENESIS Bioconductor package. These can be installed from within R using:

    install.packages(c("jsonlite", "SKAT", "logistf"))
    source("https://bioconductor.org/biocLite.R")
    biocLite("GENESIS")
    biocLite("SNPRelate")
    biocLite("GWASTools")
    

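One quick way to confirm the external tools are visible to the test suite is to check that they are on your PATH (assuming the binaries are named plink and qctool):

command -v plink qctool Rscript
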
To execute all Hail tests, run:

./gradlew -Dspark.version=${SPARK_VERSION} -Dspark.home=${SPARK_HOME} test
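
For example, to run the tests against the Spark 2.2.0 installation configured earlier:

./gradlew -Dspark.version=2.2.0 -Dspark.home=$SPARK_HOME test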

Contributing

Chat with the dev team on our Zulip chatroom if you have an idea for a contribution. We can help you determine if your project is a good candidate for merging.

Keep in mind the following principles when submitting a pull request:

  • A PR should focus on a single feature. Multiple features should be split into multiple PRs.
  • Before submitting your PR, you should rebase onto the latest master (a sketch of this workflow appears after this list).
  • PRs must pass all tests before being merged. See the Running the tests section above for how to run them locally.
  • PRs require a review before being merged. We will assign someone from our dev team to review your PR.
  • Code in PRs should be formatted according to the style in code_style.xml. This file can be loaded into IntelliJ to automatically format your code.
  • When you make a PR, include a short message that describes the purpose of the PR and any necessary context for the changes you are making.
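
A minimal sketch of the rebase workflow mentioned above, assuming the main repository is your origin remote:

# Fetch the latest master and replay your branch on top of it.
git fetch origin
git rebase origin/master
# Resolve any conflicts, then re-run the tests before updating your PR.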