
Inference on a remote model #32

Open
gmuraru opened this issue Jul 19, 2020 · 0 comments

gmuraru (Contributor) commented Jul 19, 2020

At the time of creating this issue, I do not know whether Syft already supports this, but it should be fairly easy to implement.
We need a way to send data to a server; the server should run inference (the minimum requirement) on a pre-trained model (one that was trained using FL) and return the result.

More ideas after we have the basic building block:

  • send back the saliency map
  • use HE (homomorphic encryption) when sending data to the model
  • use DP (differential privacy) in case the client wants to "attack" the model
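The basic building block described above can be sketched without any Syft-specific machinery. Below is a minimal, transport-agnostic mock-up: `InferenceServer`, `handle_request`, and `client_infer` are hypothetical names invented for illustration, and the in-process call stands in for a real network round trip. A real implementation would host the FL-trained model behind an actual server and use Syft's own serialization.

```python
import json

# Hypothetical sketch of the remote-inference flow: the client only ever
# exchanges serialized messages with the server and never sees the model
# weights. The "model" here is a trivial linear stand-in for a pre-trained
# (e.g. FL-trained) model.

class InferenceServer:
    """Holds a pre-trained model and answers serialized inference requests."""

    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias

    def predict(self, features):
        # stand-in model: dot(weights, features) + bias
        return sum(w * x for w, x in zip(self.weights, features)) + self.bias

    def handle_request(self, payload: str) -> str:
        # payload is a JSON-serialized feature vector sent by the client;
        # in practice this would arrive over the network
        features = json.loads(payload)
        return json.dumps({"prediction": self.predict(features)})


def client_infer(server: InferenceServer, features) -> float:
    # the client serializes its data, sends it, and reads back the result
    response = server.handle_request(json.dumps(features))
    return json.loads(response)["prediction"]


server = InferenceServer(weights=[0.5, -1.0], bias=2.0)
print(client_infer(server, [4.0, 1.0]))  # 0.5*4 - 1.0*1 + 2.0 = 3.0
```

The HE and DP ideas would slot into this picture at the serialization boundary: HE would encrypt the payload before `handle_request` sees it, and DP would add noise to the prediction before it is returned.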