Merge pull request #14 from AutoResearch/ci/update-pre-commit-hooks-and-add-workflow

ci: update pre-commit hooks and add workflow
benwandrew authored Sep 27, 2023
2 parents a474f8f + 10de126 commit 8d2c9ae
Showing 11 changed files with 805 additions and 209 deletions.
25 changes: 25 additions & 0 deletions .github/workflows/test-pre-commit-hooks.yml
@@ -0,0 +1,25 @@
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: Test pre-commit-hooks

on:
  pull_request:
  merge_group:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.8'
          cache: 'pip'
      - run: pip install pre-commit
      - uses: actions/cache@v3
        with:
          path: ~/.cache/pre-commit
          key: pre-commit-3|${{ env.pythonLocation }}|${{ runner.os }}|${{ hashFiles('.pre-commit-config.yaml') }}
      - run: pre-commit run --all-files --show-diff-on-failure --color=always
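For reference, the `hashFiles('.pre-commit-config.yaml')` part of the cache key produces a SHA-256 digest of the config file, so the pre-commit environment cache is invalidated whenever the hook configuration changes. A rough single-file stand-in (the helper name is illustrative, not an Actions API):

```python
import hashlib


def hash_files(path: str) -> str:
    """Rough stand-in for Actions' hashFiles(): SHA-256 over one file's bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        digest.update(fh.read())
    return digest.hexdigest()
```

The real expression also handles glob patterns over multiple files; this sketch only covers the single-file case used in the workflow above.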
19 changes: 14 additions & 5 deletions .pre-commit-config.yaml
@@ -1,28 +1,37 @@
repos:
  - repo: https://github.com/ambv/black
    rev: 22.12.0
    rev: 23.7.0
    hooks:
      - id: black
  - repo: https://github.com/pycqa/isort
    rev: 5.12.0
    hooks:
      - id: isort
        args:
          - "--profile=black"
          - "--filter-files"
          - "--project=autora"
  - repo: https://github.com/pycqa/flake8
    rev: 6.0.0
    rev: 6.1.0
    hooks:
      - id: flake8
        args:
          - "--max-line-length=100"
          - "--extend-ignore=E203"
          - "--per-file-ignores=__init__.py:F401"
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: "v0.991"
    rev: "v1.5.1"
    hooks:
      - id: mypy
        additional_dependencies: [types-requests]
        additional_dependencies: [types-requests,types-tqdm,autora-core,scipy,pytest]
        language_version: python3.8
        args:
          - "--namespace-packages"
          - "--explicit-package-bases"
  - repo: https://github.com/srstevenson/nb-clean
    rev: 2.4.0
    hooks:
      - id: nb-clean
        args:
          - --preserve-cell-outputs
default_language_version:
  python: python3
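One note on the retained flake8 settings: `--extend-ignore=E203` is the standard companion to black, because black formats slices with complex bounds with a space before the colon, which flake8 would otherwise report as E203 ("whitespace before ':'"). A minimal, self-contained illustration:

```python
# black's slice formatting puts a space before the colon for complex bounds,
# e.g. `data[upper - 2 : upper]`; flake8 flags that spacing as E203,
# hence the `--extend-ignore=E203` argument in the hook config.
data = list(range(10))
upper = 5
chunk = data[upper - 2 : upper]  # semantically an ordinary slice
print(chunk)
```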
93 changes: 29 additions & 64 deletions docs/Basic Usage.ipynb
@@ -3,19 +3,15 @@
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"# Basic Usage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"metadata": {},
"outputs": [],
"source": [
"# Uncomment the following line when running on Google Colab\n",
@@ -24,20 +20,16 @@
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"Let's generate a simple data set with two features $x_1, x_2 \\in [0, 1]$ and a target $y$. We will use the following generative model:\n",
"$y = 2 x_1 - e^{(5 x_2)}$"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
@@ -51,19 +43,15 @@
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"Now let us define the search space, that is, the space of operations to consider when searching over the space of computation graphs."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"collapsed": false
},
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"primitives = [\n",
@@ -79,9 +67,7 @@
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"## Set Up The DARTS Regressor\n",
"\n",
@@ -101,10 +87,8 @@
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"collapsed": false
},
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from autora.theorist.darts import DARTSRegressor\n",
@@ -123,19 +107,15 @@
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"Now we have everything to run differentiable architecture search and visualize the model resulting from the highest architecture weights. Note that the current model corresponds to the model with the highest architecture weights."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": false
},
"execution_count": null,
"metadata": {},
"outputs": [
{
"data": {
@@ -216,7 +196,7 @@
"<graphviz.graphs.Digraph at 0x288b726d0>"
]
},
"execution_count": 4,
"execution_count": null,
"metadata": {},
"output_type": "execute_result"
}
@@ -228,19 +208,15 @@
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"We can refine the fit by running the `fit` method again, after changing the parameters. This allows us to keep the same architecture but refit the parameters in the final sampled model, for example:"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": false
},
"execution_count": null,
"metadata": {},
"outputs": [
{
"data": {
@@ -321,7 +297,7 @@
"<graphviz.graphs.Digraph at 0x174355df0>"
]
},
"execution_count": 5,
"execution_count": null,
"metadata": {},
"output_type": "execute_result"
}
@@ -337,19 +313,15 @@
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"We can also change how the model is sampled from the architecture weight distribution:"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": false
},
"execution_count": null,
"metadata": {},
"outputs": [
{
"data": {
@@ -430,7 +402,7 @@
"<graphviz.graphs.Digraph at 0x1746b7dc0>"
]
},
"execution_count": 6,
"execution_count": null,
"metadata": {},
"output_type": "execute_result"
}
@@ -447,19 +419,15 @@
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"To recover the initial model, we need to return the sampling strategy to the default `\"max\"`:"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"collapsed": false
},
"execution_count": null,
"metadata": {},
"outputs": [
{
"data": {
@@ -540,7 +508,7 @@
"<graphviz.graphs.Digraph at 0x119bba670>"
]
},
"execution_count": 7,
"execution_count": null,
"metadata": {},
"output_type": "execute_result"
}
@@ -557,9 +525,7 @@
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": false
},
"metadata": {},
"source": [
"As long as the architecture has not been refitted in the meantime, the architecture should be identical to the initial result, as the `sampling_strategy=\"max\"` is deterministic. The coefficients of the architecture functions may, however, be different, as they have different starting values compared to when they were initially set.\n"
]
@@ -580,8 +546,7 @@
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.6"
"pygments_lexer": "ipython2"
}
},
"nbformat": 4,
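The data-generation cell of `Basic Usage.ipynb` is collapsed in this diff; per the markdown cell above it, the generative model is $y = 2 x_1 - e^{5 x_2}$ with $x_1, x_2 \in [0, 1]$. A minimal sketch of such a cell (the notebook imports numpy; the stdlib is used here so the sketch is self-contained, and all names are assumptions):

```python
import math
import random

random.seed(0)


def generate_data(n: int = 100):
    """Draw x1, x2 uniformly from [0, 1] and compute y = 2*x1 - exp(5*x2)."""
    rows = []
    for _ in range(n):
        x1, x2 = random.random(), random.random()
        rows.append((x1, x2, 2 * x1 - math.exp(5 * x2)))
    return rows


data = generate_data()
```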
819 changes: 709 additions & 110 deletions docs/Weber Fechner Example.ipynb

Large diffs are not rendered by default.

10 changes: 10 additions & 0 deletions pyproject.toml
@@ -25,6 +25,16 @@ dev = [
"autora-synthetic"
]

[tool.isort]
profile = "black"

[tool.mypy]
mypy_path="./src"

[[tool.mypy.overrides]]
module = ["matplotlib.*", "autora.*", "sklearn.*", "torch.*", "graphviz.*", "mkdocs_gen_files.*", "pandas.*", "numpy.*","scipy.*"]
ignore_missing_imports=true

[project.urls]
homepage = "http://www.empiricalresearch.ai"
repository = "https://github.com/AutoResearch/autora-theorist-darts"
2 changes: 1 addition & 1 deletion src/autora/theorist/darts/__init__.py
@@ -1,3 +1,3 @@
from .model_search import DARTSType
from .operations import PRIMITIVES
from .regressor import DARTSRegressor, DARTSExecutionMonitor
from .regressor import DARTSExecutionMonitor, DARTSRegressor
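The only change in this file is isort alphabetizing the names inside the `from … import` list: `DARTSExecutionMonitor` sorts before `DARTSRegressor`. For these ASCII identifiers, Python's built-in sort reproduces the ordering the hook enforces:

```python
# isort orders the names within a single `from ... import` statement;
# for these identifiers that matches a plain lexicographic sort.
names = ["DARTSRegressor", "DARTSExecutionMonitor"]
print(sorted(names))
```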
3 changes: 1 addition & 2 deletions src/autora/theorist/darts/model_search.py
@@ -505,6 +505,7 @@ def genotype(self, sample: bool = False) -> Genotype:
Returns:
    genotype: genotype describing the current (sampled) architecture
"""

# this function uses the architecture weights to retrieve the
# operations with the highest weights
def _parse(weights):
@@ -613,9 +614,7 @@ def count_parameters(self, print_parameters: bool = False) -> Tuple[int, int, li

tmp_param_list = list()
if isiterable(op._ops[maxIdx[0].item(0)]):  # Zero is not iterable

    for subop in op._ops[maxIdx[0].item(0)]:

        for parameter in subop.parameters():
            tmp_param_list.append(parameter.data.numpy().squeeze())
            if parameter.requires_grad is True:
3 changes: 0 additions & 3 deletions src/autora/theorist/darts/operations.py
@@ -88,15 +88,13 @@ def get_operation_label(
else:
    classifier_str = input_var + " .* ("
for param_idx, param in enumerate(params):

    if param_idx > 0:
        if output_format == "latex":
            classifier_str += " + \\left("
        else:
            classifier_str += " .+ ("

    if isiterable(param.tolist()):

        param_formatted = list()
        for value in param.tolist():
            param_formatted.append(format_string.format(value))
@@ -541,7 +539,6 @@ def forward(self, x: torch.Tensor) -> torch.Tensor:


def operation_factory(name):

if name == "none":
return Zero(1)
elif name == "add":
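The hunk above only drops a stray blank line after the `def`; the function itself dispatches on the primitive name through an `if/elif` chain. A dictionary-based sketch of the same dispatch pattern (`Zero` here is a plain stand-in, not the real torch module from `operations.py`):

```python
# Simplified, hypothetical sketch of name -> operation dispatch via a dict.
class Zero:
    """Stand-in for the real Zero operation (which is a torch.nn.Module)."""

    def __init__(self, stride):
        self.stride = stride

    def __call__(self, x):
        return 0 * x


_FACTORY = {"none": lambda: Zero(1)}


def operation_factory(name):
    try:
        return _FACTORY[name]()
    except KeyError:
        raise ValueError(f"unknown primitive: {name!r}")
```

A dict keeps the mapping from primitive names to constructors in one place, though the `if/elif` chain in the source is equally valid for a fixed, small set of primitives.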
