
Commit

Merge remote-tracking branch 'upstream/master' into gg-ExploreViewSplitReducerLogic
Grace Guo committed Aug 7, 2017
2 parents 90502f6 + f68189b commit cf66782
Showing 67 changed files with 1,295 additions and 410 deletions.
2 changes: 1 addition & 1 deletion .coveralls.yml
@@ -1 +1 @@
-repo_token: eESbYiv4An6KEvjpmguDs4L7YkubXbqn1
+repo_token: 4P9MpvLrZfJKzHdGZsdV3MzO43OZJgYFn
4 changes: 2 additions & 2 deletions .travis.yml
@@ -10,7 +10,7 @@ cache:
env:
global:
- TRAVIS_CACHE=$HOME/.travis_cache/
- TRAVIS_NODE_VERSION="6.10.2"
- TRAVIS_NODE_VERSION="7.10.0"
matrix:
- TOX_ENV=javascript
- TOX_ENV=pylint
@@ -19,7 +19,7 @@ env:
- TOX_ENV=py27-mysql
- TOX_ENV=py27-sqlite
before_install:
-    - npm install -g npm@'>=4.5.0'
+    - npm install -g npm@'>=5.0.3'
before_script:
- mysql -e 'drop database if exists superset; create database superset DEFAULT CHARACTER SET utf8 COLLATE utf8_unicode_ci' -u root
- mysql -u root -e "CREATE USER 'mysqluser'@'localhost' IDENTIFIED BY 'mysqluserpassword';"
21 changes: 13 additions & 8 deletions CONTRIBUTING.md
@@ -70,7 +70,7 @@ meets these guidelines:

## Documentation

-The latest documentation and tutorial are available [here](http://airbnb.io/superset).
+The latest documentation and tutorial are available [here](https://superset.incubator.apache.org/).

Contributing to the official documentation is relatively easy, once you've setup
your environment and done an edit end-to-end. The docs can be found in the
@@ -144,7 +144,7 @@ referenced in the rst, e.g.

aren't actually included in that directory. _Instead_, you'll want to add and commit
images (and any other static assets) to the _superset/assets/images_ directory.
-When the docs are being pushed to [airbnb.io](http://airbnb.io/superset/), images
+When the docs are being pushed to [Apache Superset (incubating)](https://superset.incubator.apache.org/), images
will be moved from there to the _\_static/img_ directory, just like they're referenced
in the docs.

@@ -161,12 +161,12 @@ instead.

## Setting up a Python development environment

-Check the [OS dependencies](http://airbnb.io/superset/installation.html#os-dependencies) before follows these steps.
+Check the [OS dependencies](https://superset.incubator.apache.org/installation.html#os-dependencies) before follows these steps.

# fork the repo on GitHub and then clone it
# alternatively you may want to clone the main repo but that won't work
# so well if you are planning on sending PRs
-# git clone git@github.com:airbnb/superset.git
+# git clone git@github.com:apache/incubator-superset.git

# [optional] setup a virtual env and activate it
virtualenv env
@@ -223,8 +223,13 @@ To install third party libraries defined in `package.json`, run the
following within the `superset/assets/` directory which will install them in a
new `node_modules/` folder within `assets/`.

-```
-npm install
+```bash
+# from the root of the repository, move to where our JS package.json lives
+cd superset/assets/
+# install yarn, a replacement for `npm install` that is faster and more deterministic
+npm install -g yarn
+# run yarn to fetch all the dependencies
+yarn
```

To parse and generate bundled files for superset, run either of the
@@ -342,7 +347,7 @@ new language dictionary, run the following command:

pybabel init -i ./babel/messages.pot -d superset/translations -l es

-Then it's a matter of running the statement below to gather all stings that
+Then it's a matter of running the statement below to gather all strings that
need translation

fabmanager babel-extract --target superset/translations/
@@ -374,4 +379,4 @@ to take effect, they need to be compiled using this command:

Here's an example as a Github PR with comments that describe what the
different sections of the code do:
-https://github.com/airbnb/superset/pull/3013
+https://github.com/apache/incubator-superset/pull/3013
17 changes: 8 additions & 9 deletions README.md
@@ -1,17 +1,14 @@
Superset
=========

-[![Build Status](https://travis-ci.org/airbnb/superset.svg?branch=master)](https://travis-ci.org/airbnb/superset)
+[![Build Status](https://travis-ci.org/apache/incubator-superset.svg?branch=master)](https://travis-ci.org/apache/incubator-superset)
[![PyPI version](https://badge.fury.io/py/superset.svg)](https://badge.fury.io/py/superset)
-[![Coverage Status](https://coveralls.io/repos/airbnb/superset/badge.svg?branch=master&service=github)](https://coveralls.io/github/airbnb/superset?branch=master)
-[![JS Test Coverage](https://codeclimate.com/github/airbnb/superset/badges/coverage.svg)](https://codeclimate.com/github/airbnb/superset/coverage)
-[![Code Health](https://landscape.io/github/airbnb/superset/master/landscape.svg?style=flat)](https://landscape.io/github/airbnb/superset/master)
-[![Code Climate](https://codeclimate.com/github/airbnb/superset/badges/gpa.svg)](https://codeclimate.com/github/airbnb/superset)
+[![Coverage Status](https://coveralls.io/repos/apache/incubator-superset/badge.svg?branch=master&service=github)](https://coveralls.io/github/apache/incubator-superset?branch=master)
[![PyPI](https://img.shields.io/pypi/pyversions/superset.svg?maxAge=2592000)](https://pypi.python.org/pypi/superset)
-[![Requirements Status](https://requires.io/github/airbnb/superset/requirements.svg?branch=master)](https://requires.io/github/airbnb/superset/requirements/?branch=master)
-[![Join the chat at https://gitter.im/airbnb/superset](https://badges.gitter.im/airbnb/superset.svg)](https://gitter.im/airbnb/superset?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
+[![Requirements Status](https://requires.io/github/apache/incubator-superset/requirements.svg?branch=master)](https://requires.io/github/apache/incubator-superset/requirements/?branch=master)
+[![Join the chat at https://gitter.im/apache/incubator-superset](https://badges.gitter.im/apache/incubator-superset.svg)](https://gitter.im/apache/incubator-superset?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![Documentation](https://img.shields.io/badge/docs-apache.org-blue.svg)](https://superset.incubator.apache.org)
-[![dependencies Status](https://david-dm.org/airbnb/superset/status.svg?path=superset/assets)](https://david-dm.org/airbnb/superset?path=superset/assets)
+[![dependencies Status](https://david-dm.org/apache/incubator-superset/status.svg?path=superset/assets)](https://david-dm.org/apache/incubator-superset?path=superset/assets)

<img
src="https://cloud.githubusercontent.com/assets/130878/20946612/49a8a25c-bbc0-11e6-8314-10bef902af51.png"
@@ -176,6 +173,7 @@ the world know they are using Superset. Join our growing community!
- [Brilliant.org](https://brilliant.org/)
- [Clark.de](http://clark.de/)
- [Digit Game Studios](https://www.digitgaming.com/)
+- [Douban](https://www.douban.com/)
- [Endress+Hauser](http://www.endress.com/)
- [FBK - ICT center](http://ict.fbk.eu)
- [Faasos](http://faasos.com/)
@@ -187,4 +185,5 @@ the world know they are using Superset. Join our growing community!
- [Tobii](http://www.tobii.com/)
- [Tooploox](https://www.tooploox.com/)
- [Udemy](https://www.udemy.com/)
-- [Yahoo!](www.yahoo.com)
+- [Yahoo!](https://yahoo.com/)
+- [Zalando](https://www.zalando.com)
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -51,7 +51,7 @@
master_doc = 'index'

# General information about the project.
project = "Superset's documentation"
project = "Apache Superset"
copyright = None
author = u'Maxime Beauchemin'

27 changes: 23 additions & 4 deletions docs/faq.rst
@@ -107,7 +107,8 @@ never be affected by any dashboard level filtering.
"filter_immune_slice_fields": {
"177": ["country_name", "__from", "__to"],
"32": ["__from", "__to"]
-}
+},
+"timed_refresh_immune_slices": [324]
}

In the json blob above, slices 324, 65 and 92 won't be affected by any
@@ -124,15 +125,33 @@ But what happens with filtering when dealing with slices coming from
different tables or databases? If the column name is shared, the filter will
be applied, it's as simple as that.


+How to limit the timed refresh on a dashboard?
+----------------------------------------------
+By default, the dashboard timed refresh feature allows you to automatically requery every slice on a dashboard according to a set schedule. Sometimes, however, you won't want all of the slices to be refreshed - especially if some data is slow moving, or run heavy queries.
+To exclude specific slices from the timed refresh process, add the ``timed_refresh_immune_slices`` key to the dashboard ``JSON Metadata`` field:
+
+..code::
+
+{
+"filter_immune_slices": [],
+"expanded_slices": {},
+"filter_immune_slice_fields": {},
+"timed_refresh_immune_slices": [324]
+}
+
+In the example above, if a timed refresh is set for the dashboard, then every slice except 324 will be automatically requeried on schedule.
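
As a purely illustrative aside (not part of the commit, and not a Superset API), the ``JSON Metadata`` blob is ordinary JSON, so it can be inspected or adjusted with a small script before being pasted back into the dashboard editor; the slice ids below are the ones already used in this FAQ:

```python
import json

# The dashboard's ``JSON Metadata`` field, copied from the dashboard editor.
metadata = json.loads("""
{
    "filter_immune_slices": [],
    "expanded_slices": {},
    "filter_immune_slice_fields": {},
    "timed_refresh_immune_slices": [324]
}
""")

# Also exempt slice 65 from the timed refresh, then paste the result back into the UI.
metadata.setdefault("timed_refresh_immune_slices", []).append(65)
print(json.dumps(metadata, indent=4))
```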


Why does fabmanager or superset freezed/hung/not responding when started (my home directory is NFS mounted)?
-----------------------------------------------------------------------------------------
-superset creates and uses an sqlite database at ``~/.superset/superset.db``. Sqlite is known to `don't work well if used on NFS`__ due to broken file locking implementation on NFS.
+By default, superset creates and uses an sqlite database at ``~/.superset/superset.db``. Sqlite is known to `don't work well if used on NFS`__ due to broken file locking implementation on NFS.

__ https://www.sqlite.org/lockingv3.html

One work around is to create a symlink from ~/.superset to a directory located on a non-NFS partition.
You can override this path using the ``SUPERSET_HOME`` environment variable.

-Another work around is to change where superset stores the sqlite database by adding ``SQLALCHEMY_DATABASE_URI = 'sqlite:////new/localtion/superset.db'`` in superset_config.py (create the file if needed), then adding the directory where superset_config.py lives to PYTHONPATH environment variable (e.g. ``export PYTHONPATH=/opt/logs/sandbox/airbnb/``).
+Another work around is to change where superset stores the sqlite database by adding ``SQLALCHEMY_DATABASE_URI = 'sqlite:////new/location/superset.db'`` in superset_config.py (create the file if needed), then adding the directory where superset_config.py lives to PYTHONPATH environment variable (e.g. ``export PYTHONPATH=/opt/logs/sandbox/airbnb/``).
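
To make that second workaround concrete, here is a minimal sketch of what such a ``superset_config.py`` could contain; the ``/var/lib/superset`` path is only an example of a non-NFS location, not something taken from the commit:

```python
# superset_config.py: Superset imports any module with this name found on PYTHONPATH.
import os

# Example non-NFS location for the metadata database; adjust for your host.
LOCAL_DB_DIR = '/var/lib/superset'
if not os.path.exists(LOCAL_DB_DIR):
    os.makedirs(LOCAL_DB_DIR)

# Four slashes end up after "sqlite:" because the joined path is absolute.
SQLALCHEMY_DATABASE_URI = 'sqlite:///' + os.path.join(LOCAL_DB_DIR, 'superset.db')
```

As the FAQ entry notes, the directory holding this file then needs to be on ``PYTHONPATH`` before superset is started.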

How do I add new columns to an existing table
---------------------------------------------
32 changes: 22 additions & 10 deletions docs/index.rst
@@ -1,36 +1,48 @@
.. image:: _static/img/s.png

-Superset's documentation
-''''''''''''''''''''''''
+Apache Superset (incubating)
+''''''''''''''''''''''''''''

-Superset is a data exploration platform designed to be visual, intuitive
-and interactive.
+Apache Superset (incubating) is a modern, enterprise-ready business
+intelligence web application


----------------

.. warning:: This project was originally named Panoramix, was renamed to
Caravel in March 2016, and is currently named Superset as of November 2016

+.. important::
+
+**Disclaimer**: Apache Superset is an effort undergoing incubation at The
+Apache Software Foundation (ASF), sponsored by the Apache Incubator.
+Incubation is required of all newly accepted projects until a further
+review indicates that the infrastructure, communications, and
+decision making process have stabilized in a manner consistent with
+other successful ASF projects. While incubation status is not
+necessarily a reflection of the completeness or stability of
+the code, it does indicate that the project has yet to be fully
+endorsed by the ASF.

Overview
=======================================

Features
---------

-- A rich set of data visualizations, integrated from some of the best
-visualization libraries
-- Create and share simple dashboards
-- An extensible, high-granularity security/permission model allowing
-intricate rules on who can access individual features and the dataset
+- A rich set of data visualizations
+- An easy-to-use interface for exploring and visualizing data
+- Create and share dashboards
+- Enterprise-ready authentication with integration with major authentication
+providers (database, OpenID, LDAP, OAuth & REMOTE_USER through
+Flask AppBuilder)
+- An extensible, high-granularity security/permission model allowing
+intricate rules on who can access individual features and the dataset
- A simple semantic layer, allowing users to control how data sources are
displayed in the UI by defining which fields should show up in which
drop-down and which aggregation and function metrics are made available
to the user
-- Integration with most RDBMS through SqlAlchemy
+- Integration with most SQL-speaking RDBMS through SQLAlchemy
- Deep integration with Druid.io

------
28 changes: 14 additions & 14 deletions docs/installation.rst
@@ -392,13 +392,13 @@ have the same configuration.

.. code-block:: python
-class CeleryConfig(object):
-BROKER_URL = 'redis://localhost:6379/0'
-CELERY_IMPORTS = ('superset.sql_lab', )
-CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
-CELERY_ANNOTATIONS = {'tasks.add': {'rate_limit': '10/s'}}
+class CeleryConfig(object):
+BROKER_URL = 'redis://localhost:6379/0'
+CELERY_IMPORTS = ('superset.sql_lab', )
+CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
+CELERY_ANNOTATIONS = {'tasks.add': {'rate_limit': '10/s'}}
-CELERY_CONFIG = CeleryConfig
+CELERY_CONFIG = CeleryConfig
To setup a result backend, you need to pass an instance of a derivative
of ``werkzeug.contrib.cache.BaseCache`` to the ``RESULTS_BACKEND``
@@ -410,13 +410,13 @@ look something like:

.. code-block:: python
-# On S3
-from s3cache.s3cache import S3Cache
-S3_CACHE_BUCKET = 'foobar-superset'
-S3_CACHE_KEY_PREFIX = 'sql_lab_result'
-RESULTS_BACKEND = S3Cache(S3_CACHE_BUCKET, S3_CACHE_KEY_PREFIX)
+# On S3
+from s3cache.s3cache import S3Cache
+S3_CACHE_BUCKET = 'foobar-superset'
+S3_CACHE_KEY_PREFIX = 'sql_lab_result'
+RESULTS_BACKEND = S3Cache(S3_CACHE_BUCKET, S3_CACHE_KEY_PREFIX)
-# On Redis
+# On Redis
from werkzeug.contrib.cache import RedisCache
RESULTS_BACKEND = RedisCache(
host='localhost', port=6379, key_prefix='superset_results')
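
As an editorial aside (not shown in the diff above), any other ``werkzeug.contrib.cache.BaseCache`` implementation can be wired up the same way; a local-filesystem sketch using werkzeug's built-in ``FileSystemCache``, with example values, would look like:

```python
# On the local filesystem (single-machine setups; values are illustrative)
from werkzeug.contrib.cache import FileSystemCache

RESULTS_BACKEND = FileSystemCache(
    '/tmp/superset_results',       # directory where cached query results are written
    threshold=500,                 # start evicting once ~500 entries accumulate
    default_timeout=60 * 60 * 24,  # keep results for one day (seconds)
)
```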
@@ -444,8 +444,8 @@ your environment.::

# assuming $SUPERSET_HOME as the root of the repo
cd $SUPERSET_HOME/superset/assets
-npm install
-npm run build
+yarn
+yarn run build
cd $SUPERSET_HOME
python setup.py install

4 changes: 2 additions & 2 deletions setup.py
@@ -45,7 +45,7 @@ def get_git_sha():
'boto3==1.4.4',
'celery==3.1.25',
'colorama==0.3.9',
-'cryptography==1.7.2',
+'cryptography==1.9',
'flask-appbuilder==1.9.1',
'flask-cache==0.13.1',
'flask-migrate==2.0.3',
@@ -61,7 +61,7 @@ def get_git_sha():
'pandas==0.20.2',
'parsedatetime==2.0.0',
'pydruid==0.3.1',
-'PyHive>=0.3.0',
+'PyHive>=0.4.0',
'python-dateutil==2.6.0',
'requests==2.17.3',
'simplejson==3.10.0',
6 changes: 6 additions & 0 deletions superset/assets/backendSync.json
@@ -750,6 +750,12 @@
"default": false,
"description": "Sort bars by x labels."
},
"combine_metric": {
"type": "CheckboxControl",
"label": "Combine Metrics",
"default": false,
"description": "Display metrics side by side within each column, as opposed to each column being displayed side by side for each metric."
},
"show_controls": {
"type": "CheckboxControl",
"label": "Extra Controls",