From 557a2fe5f9a42258d2e56a7e113dd64b11ea8e40 Mon Sep 17 00:00:00 2001
From: Jianjie Liu
Date: Thu, 14 Oct 2021 16:56:55 +0000
Subject: [PATCH] update setup.md on default support on Spark3

---
 SETUP.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/SETUP.md b/SETUP.md
index b39283380a..a8df7147b7 100644
--- a/SETUP.md
+++ b/SETUP.md
@@ -64,7 +64,7 @@ If using venv or virtualenv, see [these instructions](#using-a-virtual-environme
 
 **NOTE** the models from Cornac require installation of `libpython` i.e. using `sudo apt-get install -y libpython3.6` or `libpython3.7`, depending on the version of Python.
 
-**NOTE** Spark requires Java version 8 or 11. We support Spark version 3, but versions 2.4+ with Java version 8 may also work.
+**NOTE** We now support Spark version 3 by default, which requires Java version 11.
 
 Install Java on MacOS
@@ -151,7 +151,7 @@ create the file `%RECO_ENV%\etc\conda\deactivate.d\env_vars.bat` and add:
 
 It is straightforward to install the recommenders package within a [virtual environment](https://docs.python.org/3/library/venv.html). However, setting up CUDA for use with a GPU can be cumbersome. We thus recommend setting up [Nvidia docker](https://github.com/NVIDIA/nvidia-docker) and running the virtual environment within a container, as the most convenient way to do this.
 
-In the following `3.6` should be replaced with the Python version you are using and `11` should be replaced with the appropriate Java version.
+In the following, `3.6` should be replaced with the Python version you are using. For users of Spark 2, `11` should be replaced with Java `8`, and `pip install recommenders[all]==0.6.0` should be used instead.
 
     # Start docker daemon if not running
     sudo dockerd &
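
The updated NOTE pins Spark 3 to Java 11 (with Java 8 for the older Spark 2 path). As a minimal sketch, one way to verify which Java major version is installed before following the docker steps — `parse_java_major` is a hypothetical helper for illustration, not part of the recommenders repo:

```shell
# Hypothetical helper: extract the major Java version from a version string.
# Legacy Java reports versions like "1.8.0_292" (major = 8); modern Java
# reports versions like "11.0.12" (major = 11).
parse_java_major() {
  case "$1" in
    1.*) echo "$1" | cut -d. -f2 ;;  # legacy "1.x" scheme
    *)   echo "$1" | cut -d. -f1 ;;  # modern scheme
  esac
}

# Example usage with the two version-string styles:
parse_java_major "11.0.12"    # prints 11 -> suitable for Spark 3
parse_java_major "1.8.0_292"  # prints 8  -> Spark 2 / recommenders 0.6.0
```

In practice the version string would come from the local JVM, e.g. `java -version 2>&1 | awk -F '"' '/version/ {print $2}'`.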