gradio-example

This repository demonstrates how to dockerize and deploy a Gradio application that uses the GPT-2 model from the Hugging Face Transformers library.

It includes two examples: demo.py, which uses the transformer model, and gr_demo.py, the basic Gradio example from the quickstart guide. To run a specific file, change demo.py in the Dockerfile's CMD ["python", "-u", "demo.py"] instruction to the desired filename.
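For reference, gr_demo.py presumably follows the Gradio quickstart; a minimal sketch (the actual file in this repository may differ) looks like this:

    import gradio as gr

    # Simple function wired to the interface, as in the quickstart guide.
    def greet(name):
        return "Hello " + name + "!"

    # One text input, one text output.
    demo = gr.Interface(fn=greet, inputs="text", outputs="text")

    if __name__ == "__main__":
        demo.launch()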

Prerequisites

  • Docker

Usage

  1. Clone this repository: git clone https://github.com/arora-r/gradio-example.git
  2. Navigate to the repository directory: cd gradio-example
  3. Build the Docker image: docker build -t gradio-example .
  4. Run the Docker container: docker run -p 8080:8080 gradio-example (see the note on ports below)
  5. Open the Gradio app in your web browser at http://localhost:8080
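For the port mapping in step 4 to work, the application inside the container has to listen on all interfaces on port 8080. Gradio binds to 127.0.0.1:7860 by default, so demo.py is assumed to launch roughly as follows; the exact call in this repository may differ:

    # Assumed launch call: Gradio's defaults (127.0.0.1:7860) would not be
    # reachable through the container's 8080 port mapping.
    demo.launch(server_name="0.0.0.0", server_port=8080)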

Customization

To use your own transformer model in the Gradio application, modify demo.py to load and call your model, then rebuild the Docker image and run the container as described above.
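As an illustration, the sketch below swaps the assumed GPT-2 text-generation pipeline for another Hugging Face checkpoint. The model name ("distilgpt2") is only an example and is not part of this repository:

    import gradio as gr
    from transformers import pipeline

    # Swap the model name for any text-generation checkpoint on the Hugging Face Hub.
    generator = pipeline("text-generation", model="distilgpt2")

    def generate(prompt):
        # Return the generated continuation from the first (and only) result.
        return generator(prompt, max_new_tokens=50)[0]["generated_text"]

    demo = gr.Interface(fn=generate, inputs="text", outputs="text")

    if __name__ == "__main__":
        # Same launch settings as assumed above so the Docker port mapping still works.
        demo.launch(server_name="0.0.0.0", server_port=8080)

After editing, rebuild with docker build -t gradio-example . and run the container again.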

Acknowledgments

This repository was inspired by the examples in the Gradio repository.
