Installing Ax on a Raspberry Pi (Model 3B+) #412

Closed
winf-hsos opened this issue Oct 22, 2020 · 9 comments
Labels
enhancement New feature or request wishlist Long-term wishlist feature requests

Comments

@winf-hsos

Does anyone have experience with installing Ax on a Raspberry Pi, in my case the Model 3B+? I run Raspberry Pi OS 32-bit, and I also have one with the 64-bit beta version. I'd like to install the Ax dependencies manually, but it's difficult to find matching pre-compiled packages for the arm64 or armv7l architecture. Any help or hints (is this even possible?) would be very much appreciated!

Thanks
Nicolas

@stevemandala
Contributor

stevemandala commented Oct 22, 2020

Hey Nicolas, this sounds super exciting! Admittedly I've been getting hyped up about ARM recently, and it's great to hear Ax is finding its way onto Raspberry Pis. Unfortunately, I don't think we've tried to build and run Ax & BoTorch on ARM yet, but if you are able to get it to work, we'd love to hear about it (and are open to PRs). Here are some rough notes that might help:

  • Most of the common dependencies like scipy and pandas appear to have existing builds for aarch64 that you might be able to install directly via pip, or download and install manually.
  • Some, like gpytorch and scikit-learn, may not have pre-built aarch64 binaries, but it should be possible to build these manually (e.g. Wheel support for aarch64 scikit-learn/scikit-learn#17800)
  • PyTorch also appears to support arm64 builds (https://mathinf.eu/pytorch/arm64/)
  • Both Ax and BoTorch also publish wheel builds, which should work on any architecture (though they haven't been tested on ARM).

FWIW, 32-bit ARM support seems shaky, with PyTorch issues like pytorch/pytorch#27040, so arm64 might be your best bet.
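
Along those lines, a rough sketch of the pip route on a 64-bit (aarch64) OS; wheel availability depends on the package version and your Python version, so some of these may fall back to building from source:

uname -m                                    # should print aarch64
python3 -m pip install --upgrade pip
python3 -m pip install numpy scipy pandas   # prebuilt aarch64 wheels are generally available on PyPI
python3 -m pip install torch                # aarch64 wheel availability varies by PyTorch version
python3 -m pip install ax-platform          # pulls in botorch/gpytorch; some deps may need to compile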

@stevemandala stevemandala added the enhancement New feature or request label Oct 22, 2020
@Balandat
Contributor

Note that scikit-learn has a conda package for aarch64, in case you can use conda: https://anaconda.org/conda-forge/scikit-learn

Unfortunately no ARM-compatible pytorch conda package yet though...
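
If conda works for you, that would look something like the following (assuming an aarch64 conda distribution such as Miniforge/Mambaforge is already installed):

conda install -c conda-forge scikit-learn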

@winf-hsos
Author

Thanks for your replies. I suspected that 32-bit would be a problem, so I tried the Raspberry Pi OS 64-bit (beta):

https://www.raspberrypi.org/forums/viewtopic.php?t=275370

I also found the ARM64 build for PyTorch that @stevemandala referred to and successfully installed it on a Model 3B+. That worked fine. I stopped when building scikit-learn didn't finish overnight and instead ran into a bunch of errors. I read somewhere that this is a difficult task, especially if you don't know exactly what you're doing (I am not a Linux expert).

The reason I wanted Ax on RPis is simple: In a project we are building a research platform to automate the execution of experiments with Ax. For this, we need a number of compute instances that carry out the experiments on demand. As we have many RPis at the university, I thought why not give it a spin. I now switched to Google compute VMs for that, which works perfectly fine.

That said, if you should ever find a working solution for Ax on a Raspberry Pi, I would be super interested to hear about it and try it myself!

Nicolas

@lena-kashtelyan lena-kashtelyan added the wishlist Long-term wishlist feature requests label Nov 2, 2020
@lena-kashtelyan
Contributor

We will now be tracking wishlist items / feature requests in a master issue for improved visibility: #566. Of course, please feel free to still open new feature request issues; we'll take care of thinking them through and adding them to the master issue.

@sgbaird
Contributor

sgbaird commented Aug 19, 2022

The reason I wanted Ax on RPis is simple: In a project we are building a research platform to automate the execution of experiments with Ax. For this, we need a number of compute instances that carry out the experiments on demand. As we have many RPis at the university, I thought why not give it a spin. I now switched to Google compute VMs for that, which works perfectly fine.

@winf-hsos curious to hear more about this! Both the automated research experiments and the Google Compute VMs. I'm interested in getting Ax installed on an RPi in the context of this Hackaday project [GitHub], and was thinking about offloading some of the more intensive optimizations to Google Colab, but wasn't entirely sure how I'd go about sending commands to the RPi remotely.

@sgbaird
Contributor

sgbaird commented Aug 19, 2022

I was able to get ax-platform pip-installed onto my RPi 400. The steps I took were:

  1. Use rpi-imager to install the 64-bit Ubuntu 22.04.1 LTS server image, which supports the RPi Zero 2/2/3/4/400 (64-bit so I can use Mambaforge, which isn't supported on Raspberry Pi OS)
  2. Install gcc (necessary to compile C extensions for certain packages, which I think ax-platform is one of)
  3. Install Mambaforge via the Miniforge instructions
  4. Create and activate a new conda environment
  5. Install Ax

Altogether:

sudo apt-get install gcc
curl -L -O "https://github.com/conda-forge/miniforge/releases/latest/download/Mambaforge-$(uname)-$(uname -m).sh"
bash Mambaforge-$(uname)-$(uname -m).sh
exit # <or Ctrl+D, i.e. MUST close and restart terminal for conda to be recognized>

After closing and opening the terminal per the last step above:

conda create -n ax python==3.9.*
conda activate ax
pip install ax-platform
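
As an optional sanity check (not part of the original steps), the install can be verified with:

python -c "import ax; print(ax.__version__)"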

I ran the Service API script, and the output was as expected!

(sdl-demo) pi@raspberrypi:~/self-driving-lab-demo$  cd /home/pi/self-driving-lab-demo ; /usr/bin/env /home/pi/mambaforge/envs/sdl-demo/bin/python /home/pi/.vscode-server/extensions/ms-python.python-2022.12.1/pythonFiles/lib/python/debugpy/adapter/../../debugpy/launcher 45537 -- /home/pi/self-driving-lab-demo/scripts/bayesian_optimization_basic.py 
[INFO 08-18 21:26:10] ax.service.ax_client: Starting optimization with verbose logging. To disable logging, set the `verbose_logging` argument to `False`. Note that float values in the logs are rounded to 6 decimal points.
[INFO 08-18 21:26:10] ax.service.utils.instantiation: Inferred value type of ParameterType.FLOAT for parameter x2. If that is not the expected value type, you can explicity specify 'value_type' ('int', 'float', 'bool' or 'str') in parameter dict.
[INFO 08-18 21:26:10] ax.service.utils.instantiation: Created search space: SearchSpace(parameters=[RangeParameter(name='x1', parameter_type=FLOAT, range=[-5.0, 10.0]), RangeParameter(name='x2', parameter_type=FLOAT, range=[0.0, 10.0])], parameter_constraints=[]).
[INFO 08-18 21:26:10] ax.modelbridge.dispatch_utils: Using Bayesian optimization since there are more ordered parameters than there are categories for the unordered categorical parameters.
[INFO 08-18 21:26:10] ax.modelbridge.dispatch_utils: Using Bayesian Optimization generation strategy: GenerationStrategy(name='Sobol+GPEI', steps=[Sobol for 5 trials, GPEI for subsequent trials]). Iterations after 5 will take longer to generate due to  model-fitting.
[INFO 08-18 21:26:10] ax.service.ax_client: Generated new trial 0 with parameters {'x1': -3.61248, 'x2': 8.722014}.
[INFO 08-18 21:26:11] ax.service.ax_client: Completed trial 0 with data: {'branin': (23.657991, None)}.
[INFO 08-18 21:26:11] ax.service.ax_client: Generated new trial 1 with parameters {'x1': 2.057908, 'x2': 9.03491}.
[INFO 08-18 21:26:11] ax.service.ax_client: Completed trial 1 with data: {'branin': (38.718558, None)}.
...
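
For context, a minimal Service API loop that produces output like the above looks roughly as follows. This is only a sketch: the actual scripts/bayesian_optimization_basic.py may differ, and the exact create_experiment keyword arguments vary a bit between Ax versions.

from ax.service.ax_client import AxClient
from ax.utils.measurement.synthetic_functions import branin

ax_client = AxClient()
ax_client.create_experiment(
    name="branin_experiment",
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-5.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [0.0, 10.0]},
    ],
    objective_name="branin",
    minimize=True,
)

for _ in range(10):
    parameters, trial_index = ax_client.get_next_trial()
    # Evaluate the synthetic Branin function; a real experiment would run hardware here.
    ax_client.complete_trial(
        trial_index=trial_index,
        raw_data={"branin": (branin(parameters["x1"], parameters["x2"]), None)},
    )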

Note that I was running the RPi in headless mode via VS Code's Remote SSH which lets me use debugging among many other things like linting and auto-formatting.

@Balandat
Contributor

This is awesome stuff! I'm going to share this with folks for some karma.

The main problem is that it's probably not going to scale particularly well, since the BO algorithms can be pretty compute-heavy and benefit a lot from HW acceleration such as MKL that you won't have on an RPi, so if you have large search spaces / many trials / many objectives I expect the little berry to burst pretty quickly.

We have a (currently internal) project that might interest you - It's a remote service based on Ax designed to be deployed on some machine (e.g. could be on AWS or some physical machine you've got sitting around) and then serve a client via RPC calls in an ask/tell fashion. The big benefit of course is that you can have a beefy machine doing the compute and then serve the results to super lightweight / embedded hardware that essentially just runs an RPC client (which could be python but also any other language you like). The RPC layer is currently based on Apache Thrift (this is what's well supported at Meta), but we actually do have plans (albeit no timelines yet) to abstract the RPC layer away from the server implementation so that other folks in OSS could easily implement their own.

cc @lena-kashtelyan, @bernardbeckerman, @pcanaran
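
Purely to illustrate the ask/tell split described above (this is not the internal service's API; the endpoint, routes, and helper below are all hypothetical), a lightweight client on the RPi side could look something like:

import requests  # stand-in for whatever RPC client library the service would actually expose

SERVER = "http://ax-server.example:8080"  # hypothetical address of the machine running Ax

def run_trial(parameters):
    # Placeholder: run the physical experiment on the RPi and return the measured objective.
    return sum(parameters.values())  # dummy value for illustration only

for _ in range(20):
    # "ask": request the next suggested parameters from the machine doing the heavy compute
    suggestion = requests.post(f"{SERVER}/ask").json()
    result = run_trial(suggestion["parameters"])
    # "tell": report the observed outcome so the server can update its surrogate model
    requests.post(
        f"{SERVER}/tell",
        json={"trial_index": suggestion["trial_index"], "objective": result},
    )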

@Balandat
Contributor

necessary to compile C extensions for certain packages, which I think ax-platform is one of

ax-platform does not have any compiled code, so this shouldn't be required by Ax itself. Maybe one of its dependencies needs to be compiled on your platform.

@sgbaird
Contributor

sgbaird commented Aug 19, 2022

This is awesome stuff! I'm going to share this with folks for some karma.

Thank you! 😸

The main problem is that it's probably not going to scale particularly well, since the BO algorithms can be pretty compute-heavy and benefit a lot from HW acceleration such as MKL that you won't have on an RPi, so if you have large search spaces / many trials / many objectives I expect the little berry to burst pretty quickly.

Great point about acceleration. Agreed that this quickly gets unwieldy for more complex optimizations. At least in terms of RAM, the RPi 4B can go up to 8 GB, but speed would still be a major issue. Also, I like the "berry to burst" metaphor! 😆

We have a (currently internal) project that might interest you - It's a remote service based on Ax designed to be deployed on some machine (e.g. could be on AWS or some physical machine you've got sitting around) and then serve a client via RPC calls in an ask/tell fashion. The big benefit of course is that you can have a beefy machine doing the compute and then serve the results to super lightweight / embedded hardware that essentially just runs an RPC client (which could be python but also any other language you like). The RPC layer is currently based on Apache Thrift (this is what's well supported at Meta), but we actually do have plans (albeit no timelines yet) to abstract the RPC layer away from the server implementation so that other folks in OSS could easily implement their own.

Definitely of interest to me! Thanks for mentioning this. It aligns with the needs of many autonomous research lab projects and would fit nicely with the autonomous research lab demo I mentioned. Very exciting! I would love to try it out whenever it makes sense on your end.
