(#46) Update Docs to add API.
zhangxjohn committed Jul 28, 2022
1 parent 66b7a85 commit a4de5bc
Showing 11 changed files with 600 additions and 30 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -117,7 +117,7 @@ train_data, test_data = train_test_split(data, test_size=0.2)
model = make_experiment(train_data.copy(),
task='classification',
mode='dl',
-dl_gpu_usage_strategy=1,
+tf_gpu_usage_strategy=1,
reward_metric='accuracy',
max_trials=30,
early_stopping_rounds=10).run()
2 changes: 1 addition & 1 deletion README_zh_CN.md
@@ -116,7 +116,7 @@ train_data, test_data = train_test_split(data, test_size=0.2)
model = make_experiment(train_data.copy(),
task='classification',
mode='dl',
-dl_gpu_usage_strategy=1,
+tf_gpu_usage_strategy=1,
reward_metric='accuracy',
max_trials=30,
early_stopping_rounds=10).run()
13 changes: 7 additions & 6 deletions docs/en_US/source/conf.py
@@ -13,7 +13,7 @@
import os
import sys
from datetime import datetime
-sys.path.insert(0, os.path.abspath('./.'))
+sys.path.insert(0, os.path.abspath('../../..'))

# -- Project information -----------------------------------------------------

@@ -23,7 +23,7 @@
author = 'DataCanvas.com'

# The full version, including alpha/beta/rc tags
-release = '0.1.3'
+release = '0.1.4'


# -- General configuration ---------------------------------------------------
@@ -33,12 +33,14 @@
# ones.
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.doctest',
'sphinx.ext.napoleon',
-'sphinx.ext.intersphinx',  # this order is important to make intersphinx work!
+'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.autosummary',
'sphinx.ext.mathjax',
'sphinx.ext.viewcode',
'sphinx_gallery.gen_gallery',
]

autodoc_default_options = {
@@ -88,7 +90,6 @@
# a list of builtin themes.
#
html_theme = "sphinx_rtd_theme"
-extensions = ['recommonmark']

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
@@ -104,4 +105,4 @@
(master_doc, 'HyperTS', 'HyperTS Documentation',
author, 'HyperTS', 'One line description of project.',
'Miscellaneous'),
-]
+]
2 changes: 1 addition & 1 deletion docs/en_US/source/contents/0400_quick_start.rst
@@ -81,7 +81,7 @@ An experiment is firsty created by ``make_experiment`` with several user-defined

.. tip::

-For more advanced performance, you could modify other parameters. Please refer to the instructions of :doc:`Advanced Configurations </contents/0500_0500_advanced_config>`.
+For more advanced performance, you could modify other parameters. Please refer to the instructions of :doc:`Advanced Configurations </contents/0500_advanced_config>`.



4 changes: 2 additions & 2 deletions docs/en_US/source/contents/0500_advanced_config.rst
@@ -71,14 +71,14 @@ The deep learning method is based on the Tensorfolw framework, which processes i

- 0: processing in CPU;
- 1: processing in GPU with increasing memory according to the data scale;
-- 2: processing in GPU with limited memory (2048M). Change the memory limit by the argument ``dl_memory_limit``.
+- 2: processing in GPU with limited memory (2048M). Change the memory limit by the argument ``tf_memory_limit``.


.. code-block:: python
experiment = make_experiment(train_data,
mode='dl',
-dl_gpu_usage_strategy=1,
+tf_gpu_usage_strategy=1,
...)
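For reference, the renamed arguments documented above would plausibly be combined like this. This is a configuration sketch only, not output of this commit: it assumes `train_data` is an already-loaded pandas DataFrame and that `make_experiment` is importable from the `hyperts` package as shown elsewhere in these docs.

```python
# Hedged sketch: using the renamed tf_* arguments together.
# Assumes `train_data` exists and hyperts is installed.
from hyperts import make_experiment

experiment = make_experiment(train_data,
                             mode='dl',                # deep-learning mode (TensorFlow backend)
                             tf_gpu_usage_strategy=2,  # strategy 2: GPU with a fixed memory cap
                             tf_memory_limit=2048)     # cap in MB, per the list above
model = experiment.run()
```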
------------------
10 changes: 5 additions & 5 deletions docs/en_US/source/contents/0700_models.rst
@@ -6,15 +6,15 @@ HyperTS provides three different methods to perform time series analysis, which
---------

Statistical Methods
-********
+********************
Different tasks require different statistical methods, which are introduced in sequence in this subsection.

- Time series forecasting: Prophet | ARIMA | VAR
- Time series classification: TSForest | KNeighbors


Prophet
-=======
+========
Prophet is a procedure for forecasting time series data based on an additive model where non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It works best with time series that have strong seasonal effects and several seasons of historical data. Prophet is robust to missing data and shifts in the trend, and typically handles outliers well.

Prophet is stated as a decomposable model with three main components: trend, seasonality, and holidays.
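As a brief aside, the decomposition named above is usually written (following the Prophet paper) as

.. math::

    y(t) = g(t) + s(t) + h(t) + \varepsilon_{t},

where :math:`g(t)` is the trend, :math:`s(t)` the periodic seasonality, :math:`h(t)` the holiday effects, and :math:`\varepsilon_{t}` the error term.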
@@ -54,7 +54,7 @@ The MA part of ARIMA indicates that the forecast error is a linear combination o
.. math::
X_{t}=\varepsilon _{t}+\beta _{1}\varepsilon _{t-1}+...+\beta _{q}\varepsilon _{t-q},
-where, :math:`\varepsilon _{}` are the errors of the AR models of the respective lags. :math:`\beta_{}`is the coefficient. From the equation, we could see that the past errors impact the current value indirectly.
+where, :math:`\varepsilon _{}` are the errors of the AR models of the respective lags. :math:`\beta_{}` is the coefficient. From the equation, we could see that the past errors impact the current value indirectly.
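As a numerical aside to the MA(q) equation above: a pure-Python sketch (a hypothetical helper, not part of HyperTS) that simulates such a process, showing each value as a weighted sum of recent white-noise errors.

```python
import random

def simulate_ma(betas, n, seed=0):
    """Simulate an MA(q) process: X_t = eps_t + beta_1*eps_{t-1} + ... + beta_q*eps_{t-q}."""
    rng = random.Random(seed)
    q = len(betas)
    # Draw q extra errors so the first observations have a full history.
    eps = [rng.gauss(0.0, 1.0) for _ in range(n + q)]
    return [eps[q + t] + sum(b * eps[q + t - 1 - i] for i, b in enumerate(betas))
            for t in range(n)]

series = simulate_ma([0.6, 0.3], n=500)  # zero-mean by construction
```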

The ARMA(p, q) model is combined with AR and MA models:

@@ -116,7 +116,7 @@ K-nearest-neighbor(KNN) classifiers with dynamic time warping `(DTW) <https://en
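The hunk above touches the KNN-with-dynamic-time-warping passage. As an illustrative aside, a minimal DTW distance in pure Python (a textbook sketch, not HyperTS's implementation):

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-time-warping distance with |x - y| step cost."""
    INF = float("inf")
    n, m = len(a), len(b)
    # dp[i][j] = minimal cost of aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # stretch a
                                  dp[i][j - 1],      # stretch b
                                  dp[i - 1][j - 1])  # advance both
    return dp[n][m]
```

Unlike Euclidean distance, DTW lets one series stretch against the other, so a repeated sample costs nothing when values match.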


Deep Learning Algorithms
-********
+*************************

DeepAR
======
@@ -154,7 +154,7 @@ For more information, please refer to the paper `Modeling Long- and Short-Term T
--------

Neural Architecture Search
-*************
+*****************************
Since AlexNet won the 2012 ImageNet competition, deep learning has made breakthroughs in many challenging tasks and fields. In addition to AlexNet,
e.g., VGG, Inception, ResNet, Transformer, GPT and so on have been proposed and widely used in industry and academia. And then, behind all these great networks,
it is the crystallization of the experience of countless human experts. Consequently, neural architecture search (NAS) has emerged as a promising tool to alleviate human efforts in this trial-and-error design process.