Automated subtree export at 2020-01-12T19:05:01.189089
[Git export bot] committed Jan 12, 2020
1 parent 1b6df7c commit 37235c9
Showing 5 changed files with 46 additions and 287 deletions.
2 changes: 1 addition & 1 deletion .travis.yml
@@ -8,7 +8,7 @@ install:
- chmod -R 777 $TRAVIS_BUILD_DIR

script:
- docker run -v$TRAVIS_BUILD_DIR:/phd chriscummins/phd_build:latest -c "./configure --noninteractive && ./tools/flaky_bazel.sh run --config=travis //tools:whoami"
- docker run -v$TRAVIS_BUILD_DIR:/phd chriscummins/phd_build:latest -c "./configure --noninteractive && ./tools/flaky_bazel.sh test --config=travis //tools/format/..."

notifications:
email:
File renamed without changes.
270 changes: 44 additions & 226 deletions README.md
@@ -1,242 +1,60 @@
# My PhD

<!-- repo size -->
<a href="https://github.com/ChrisCummins/phd">
<img src="https://img.shields.io/github/repo-size/ChrisCummins/phd.svg">
</a>
<!-- commit counter -->
<a href="https://github.com/ChrisCummins/phd">
<img src="https://img.shields.io/github/commit-activity/y/ChrisCummins/phd.svg?color=yellow">
</a>
<!-- Better code -->
<a href="https://bettercodehub.com/results/ChrisCummins/phd">
<img src="https://bettercodehub.com/edge/badge/ChrisCummins/phd?branch=master">
</a>
<!-- Travis CI -->
<a href="https://github.com/ChrisCummins/phd">
<img src="https://img.shields.io/travis/ChrisCummins/phd/master.svg">
</a>

A monolithic repository for (almost) everything I have done while at the University of Edinburgh. Living an #open life.


## Publications

1. Chris Cummins, Pavlos Petoumenos, Alastair Murray, Hugh Leather.
"**Compiler Fuzzing through Deep Learning**".
ISSTA '18.
[[source code]](/docs/2018_07_issta).
[[pdf]](https://chriscummins.cc/pub/2018-issta.pdf).
Build command: `$ bazel build //docs/2018_07_issta`.
1. Chris Cummins, Pavlos Petoumenos, Alastair Murray, Hugh Leather.
"**DeepSmith: Compiler Fuzzing through Deep Learning**".
ACACES '18.
[[source code]](/docs/2018_07_acaces).
[[pdf]](https://chriscummins.cc/pub/2018-acaces.pdf).
Build command: `$ bazel build //docs/2018_07_acaces`.
1. Chris Cummins, Pavlos Petoumenos, Zheng Wang, Hugh Leather.
"**End-to-end Deep Learning of Optimization Heuristics**".
PACT '17.
[[source code]](https://github.com/ChrisCummins/paper-end2end-dl/).
[[pdf]](https://github.com/ChrisCummins/paper-end2end-dl/raw/master/paper.pdf).
Build command: `$ bazel build //docs/2017_09_pact`.
1. Chris Cummins, Pavlos Petoumenos, Zheng Wang, Hugh Leather.
"**Synthesizing Benchmarks for Predictive Modeling**".
CGO '17.
[[source code]](https://github.com/ChrisCummins/paper-synthesizing-benchmarks/).
[[pdf]](https://github.com/ChrisCummins/paper-synthesizing-benchmarks/raw/master/paper.pdf).
[[acm]](https://dl.acm.org/citation.cfm?id=3049843).
Build command: `$ bazel build //docs/2017_02_cgo`.
1. Chris Cummins, Pavlos Petoumenos, Michel Steuwer, Hugh Leather.
"**Autotuning OpenCL Workgroup Sizes**". ACACES '16.
[[source code]](/docs/2016_07_acaces).
Build command: `$ bazel build //docs/2016_07_acaces`.
1. Chris Cummins, Pavlos Petoumenos, Michel Steuwer, Hugh Leather.
"**Towards Collaborative Performance Tuning of Algorithmic Skeletons**".
HLPGPU '16, HiPEAC.
[[source code]](https://github.com/ChrisCummins/paper-towards-collaborative-performance-tuning).
[[pdf]](https://github.com/ChrisCummins/paper-towards-collaborative-performance-tuning/raw/master/paper.pdf).
Build command: `$ bazel build //docs/2016_01_hlpgpu`.
1. Chris Cummins, Pavlos Petoumenos, Michel Steuwer, Hugh Leather.
"**Autotuning OpenCL Workgroup Size for Stencil Patterns**".
ADAPT '16, HiPEAC.
[[source code]](https://github.com/ChrisCummins/paper-autotuning-opencl-wgsize).
[[pdf]](https://github.com/ChrisCummins/paper-autotuning-opencl-wgsize/raw/master/paper.pdf).
[[arxiv]](https://arxiv.org/abs/1511.02490).
Build command: `$ bazel build //docs/2016_01_adapt`.
1. Chris Cummins. "**Autotuning Stencils Codes with Algorithmic Skeletons**".
MSc Thesis, 2015. The University of Edinburgh.
[[source code]](/docs/2015_08_msc_thesis).
Build command: `$ bazel build //docs/2015_08_msc_thesis`.


## Talks

1. Chris Cummins. "**Compiler Fuzzing through Deep Learning**", 3rd August, 2018.
Codeplay, Edinburgh, Scotland.
[[files]](/talks/2018_08_codeplay).
[[slides]](https://speakerdeck.com/chriscummins/compiler-fuzzing-through-deep-learning-issta-18).
1. Chris Cummins. "**Machine Learning for Compilers**", 20th July, 2018.
Workshop on Introspective Systems for Automatically Generating Tests (ISAGT),
Amsterdam, Netherlands.
[[files]](/talks/2018_07_isagt).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2018_07_isagt/2018_07_isagt.pdf).
1. Chris Cummins. "**Compiler Fuzzing through Deep Learning**", 16th July, 2018.
ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA), Amsterdam, Netherlands.
[[files]](/talks/2018_07_issta).
[[slides]](https://speakerdeck.com/chriscummins/compiler-fuzzing-through-deep-learning-issta-18).
1. Chris Cummins. "**End-to-end Deep Learning of Optimization Heuristics**", 23rd March, 2018.
Facebook, Menlo Park.
[[files]](/talks/2018_03_facebook).
[[slides]](https://speakerdeck.com/chriscummins/end-to-end-deep-learning-of-optimization-heuristics-pact-17).
1. Chris Cummins. "**End-to-end Deep Learning of Optimization Heuristics**", 2nd Feb, 2018.
Google, Mountain View.
[[files]](/talks/2018_02_google).
[[slides]](https://speakerdeck.com/chriscummins/end-to-end-deep-learning-of-optimization-heuristics-pact-17).
1. Chris Cummins. "**Second Year Progression Review**", 18th Dec, 2017.
The University of Edinburgh, Scotland.
[[files]](/talks/2017_12_second_year_review).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2017-12-second-year-review/2017-12-second-year-review.pdf).
1. Chris Cummins. "**End-to-end Deep Learning of Optimization Heuristics**", 4th Oct, 2017.
The University of Edinburgh, Scotland.
[[files]](/talks/2017_10_ppar).
[[slides]](https://speakerdeck.com/chriscummins/end-to-end-deep-learning-of-optimization-heuristics-pact-17).
1. Chris Cummins. "**End-to-end Deep Learning of Optimization Heuristics**", 12th Sep, 2017.
International Conference on Parallel Architectures and Compilation Techniques (PACT), Portland, Oregon, USA.
[[files]](/talks/2017_09_pact).
[[slides]](https://speakerdeck.com/chriscummins/end-to-end-deep-learning-of-optimization-heuristics-pact-17).
1. Chris Cummins. "**Deep Learning in Compilers**", 14th Jun, 2017.
The University of Edinburgh, Scotland.
[[files]](/talks/2017_06_ppar).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2017-06-ppar/2017-06-ppar.pdf). [[transcript]](https://chriscummins.cc/2017/deep-learning-in-compilers/).
1. Chris Cummins. "**Using Deep Learning to Generate Human-like Code**", 22nd April, 2017.
Scottish Programming Languages Seminar, University of St.
Andrews, Scotland.
[[files]](/talks/2017_03_spls).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2017-03-spls/2017-03-spls.pdf).
1. Chris Cummins. "**Synthesizing Benchmarks for Predictive Modeling**", 6th Febuary, 2017.
International Symposium on Code Generationand Optimization (CGO), Austin, Texas, USA.
[[files]](/talks/2017_02_cgo).
[[slides]](https://speakerdeck.com/chriscummins/synthesizing-benchmarks-for-predictive-modelling-cgo-17).
1. Chris Cummins. "**Machine Learning & Compilers**", 9th September, 2016.
Codeplay, Edinburgh, Scotland.
[[files]](/talks/2017_02_cgo).
[[slides]](https://speakerdeck.com/chriscummins/machine-learning-and-compilers).
1. Chris Cummins. "**Building an AI that Codes**", 22nd July, 2016.
Ocado Technology, Hatfield, England.
[[files]](/talks/2016_07_ocado).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2016-07-ocado/2016-07-ocado.pdf).
1. Chris Cummins.
"**All the OpenCL on GitHub: Teaching an AI to code, one character at a time**", 19th May, 2016.
Amazon Development Centre, Edinburgh, Scotland.
[[files]](/talks/2016_05_amazon).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2016-05-amazon/2016-05-amazon.pdf).
1. Chris Cummins. "**Autotuning and Algorithmic Skeletons**", Wed 10th Feb, 2016.
The University of Edinburgh, Scotland.
[[files]](/talks/2016_02_ppar).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2016-02-ppar/2016-02-ppar.pdf).
1. Chris Cummins. "**Towards Collaborative Performance Tuning of Algorithmic Skeletons**", Tues 19th Jan, 2016.
HLPGPU, HiPEAC, Prague.
[[files]](/talks/2016_01_hlpgpu).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2016-01-hlpgpu/2016-01-hlpgpu.pdf).
1. Chris Cummins. "**Autotuning OpenCL Workgroup Size for Stencil Patterns**", Mon 18th Jan, 2016.
ADAPT, HiPEAC, Prague.
[[files]](/talks/2016_01_adapt).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2016-01-adapt/2016-01-adapt.pdf).
1. Chris Cummins.
"**Towards Collaborative Performance Tuning of Algorithmic Skeletons**", Thurs 14th Jan, 2016.
The University of Edinburgh, Scotland.
[[files]](/talks/2016_01_hlpgpu).
[[pdf]](https://github.com/ChrisCummins/phd/raw/master/talks/2016-01-hlpgpu/2016-01-hlpgpu.pdf).


## Misc

1. **Curriculum Vitae**.
[[source code]](/docs/cv).
[[pdf]](https://chriscummins.cc/cv.pdf).
[[html]](https://chriscummins.cc/cv/).
Build command: `$ bazel build //docs/cv`.
1. Chris Cummins, Pavlos Petoumenos, Michel Steuwer, Hugh Leather.
"**Collaborative Autotuning of Algorithmic Skeletons for GPUs and CPUs**".
Incomplete journal version of ADAPT and HLPGPU papers.
[[source code]](/docs/2016_12_wip_taco).
Build command: `$ bazel build //docs/2016_12_wip_taco`.
1. Chris Cummins. "**Deep Learning for Compilers**". PhD First Year Review
Document, 2016.
[[source code]](/docs/2016_11_first_year_review).
Build command: `$ bazel build //docs/2016_11_first_year_review`.
1. Chris Cummins, Hugh Leather. "**Autotuning OpenCL Workgroup Sizes**".
Rejected submission for PACT'16 Student Research Competition.
[[source code]](/docs/2016_07_pact).
Build command: `$ bazel build //docs/2016_07_pact`.
1. Chris Cummins, Pavlos Petoumenos, Michel Steuwer, Hugh Leather.
"**Autotuning OpenCL Workgroup Sizes**".
Submission for PLDI'16 Student Poster Session.
[[source code]](/docs/2016_06_pldi).
Build command: `$ bazel build //docs/2016_06_pldi`.
1. Chris Cummins. "**Autotuning and Skeleton-aware Compilation**".
PhD Progression Review, 2015.
[[source code]](/docs/2015_09_progression-review).
Build command: `$ bazel build //docs/2015_09_progression_review`.


## Building the code

I use [Bazel](https://bazel.build) as my build system of choice, with a
preliminary [configure](/configure) script to set up the build. I'm gradually
working towards a completely hermetic build, but for now there remain a couple
of dependencies on the host C++ toolchain and Python runtime.

This project can only be built on a modern version of Ubuntu Linux or macOS.
This is a requirement I inherit from my dependencies, which eschew Windows and
other Linux distros. Fortunately, you can use a
[Docker](https://www.docker.com/community-edition) image and follow the Ubuntu
instructions:
# Format: Automated Code Formatters

This package implements an opinionated, non-configurable enforcer of code style.

```sh
$ docker run -it ubuntu:18.04 /bin/bash
$ format <path ...>
```

If you have success building this project on other platforms, I'd love to hear
about it and accept patches.
This program enforces a consistent code style on files by modifying them in
place. If a path is a directory, all files inside it are formatted.
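
For example (the paths below are hypothetical):

```sh
# Reformat a single file in place.
$ format script.py

# Recursively reformat every supported file under a directory.
$ format src/
```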

#### Ubuntu/macOS instructions
Features:

Configure the build and answer the yes/no questions. The default answers should
be fine:
* Automated code styling of C/C++, Python, Java, SQL, JavaScript, HTML,
CSS, Go, and JSON files.
* Support for `.formatignore` files to mark files to be excluded from
formatting. The syntax of ignore files is similar to `.gitignore`: a list of
patterns to match, with (recursive) glob expansion; patterns beginning with
`!` are un-ignored. A sketch follows this list.
* A `--pre_commit` mode which formats files that have been staged for commit
and stages the resulting changes. The commit is rejected if a partially-staged
file is modified. Enforce the use of pre-commit mode (and add a "Signed off"
footer to commits) by running `format --install_pre_commit_hook`.
* Persistent caching of "last modified" timestamps for files to minimize the
amount of work done.
* A process lock which prevents races when multiple formatters are launched
simultaneously.
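
As a rough sketch of how these pieces fit together (the `.formatignore`
patterns below are made up for illustration; the flags are the ones listed
above):

```sh
# An illustrative .formatignore excluding generated and vendored code.
# The syntax is gitignore-like; a leading `!` re-includes a path.
$ cat .formatignore
third_party/**
*.min.js
!third_party/patched.cc

# Format only the files currently staged for commit, staging any changes made.
$ format --pre_commit

# Enforce pre-commit mode (and add a "Signed off" footer to commits).
$ format --install_pre_commit_hook
```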

```sh
$ ./configure
```
The type of formatting applied to a file is determined by its suffix. See
`format --print_suffixes` for a list of suffixes which are formatted.

Note that CUDA support requires CUDA to have been installed separately; see
the [TensorFlow build docs](https://www.tensorflow.org/install/) for
instructions. CUDA support has only been tested for Linux builds, not macOS or
Docker containers.
This program uses a filesystem cache to store various attributes, such as a
database of file modification times. See `format --print_cache_path` to print
the path of the cache. Included in the cache is a file lock which prevents
multiple instances of this program from modifying files at the same time,
irrespective of the files being formatted.
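
For reference, the two introspection flags mentioned above can be run on their
own:

```sh
# List the file suffixes that format knows how to handle.
$ format --print_suffixes

# Print the path of the cache holding the modified-time database and the lock.
$ format --print_cache_path
```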

The configure process generates a `bootstrap.sh` script which will install the
required dependent packages. Since installing these packages will affect the
global state of your system, and may require root access, inspect this script
carefully. Once you're happy to proceed, run it using:
## Setup

```sh
$ bash ./bootstrap.sh
```
### Requirements

Finally, we must set up the shell environment for running bazel. The file `.env`
is created by the configure process and must be sourced in every shell in which
we want to use bazel:
1. Python >= 3.6.
1. sqlite

### Install

Download the latest binary from the [Releases page](https://github.com/ChrisCummins/format) and put it in your $PATH.

Or to build from source:

```sh
$ source $PWD/.env
$ bazel run -c opt //tools/format:install
```

Now build or test whatever bazel targets you'd like. Use `bazel query //...` to
list the available targets. For example, to run the entire test suite:
## License

```bash
$ bazel test //...
```
Copyright 2020 Chris Cummins <[email protected]>.

Released under the terms of the Apache 2.0 license. See
`LICENSE` for details.
60 changes: 0 additions & 60 deletions tools/format/README.md

This file was deleted.

1 change: 1 addition & 0 deletions tools/requirements.txt
@@ -0,0 +1 @@
