This project aims to provide a neat and standard way of providing bin collection data in JSON format from UK councils that have no API to do so.
Why do this? You might want to use this in a home automation setup - for example, if you had an LED bar that lit up on the day of a bin collection in the colour of the bin you need to take out, this repo provides the data for it.
PLEASE respect a council's infrastructure / usage policy and only collect data for your own personal use, at a frequency suited to your collection schedule.
Most scripts make use of Beautiful Soup 4 to scrape data, although others use different approaches, such as emulating web browser behaviour, or reading data from CSV files.
This integration can be installed directly via HACS. To install:
- Ensure you have HACS installed.
- In the Home Assistant UI go to `HACS` > `Integrations` > `⋮` > `Custom repositories`.
- Enter `https://github.com/robbrad/UKBinCollectionData` in the `Repository` field.
- Select `Integration` as the category, then click `ADD`.
- Click `+ Add Integration`, search for and select `UK Bin Collection Data`, then click `Download`.
- Restart your Home Assistant.
- In the Home Assistant UI go to `Settings` > `Devices & Services`, click `+ Add Integration` and search for `UK Bin Collection Data`.
- If you see a "URL of the remote Selenium web driver to use" field when setting up your council, you'll need to provide the URL of a web driver you've set up separately, such as `standalone-chrome`.
- Open the folder for your Home Assistant configuration (where you find `configuration.yaml`).
- If you do not have a `custom_components` folder there, you need to create it.
- Download this repository, then copy the folder `custom_components/uk_bin_collection` into the `custom_components` folder you found/created in the previous step.
- Restart your Home Assistant.
- In the Home Assistant UI go to `Settings` > `Devices & Services`, click `+ Add Integration` and search for `UK Bin Collection Data`.
```
PS G:\Projects\Python\UKBinCollectionData\uk_bin_collection\collect_data.py
usage: collect_data.py [-h] [-p POSTCODE] [-n NUMBER] [-u UPRN] module URL

positional arguments:
  module                Name of council module to use (required)
  URL                   URL to parse (required)

options:
  -h, --help            show this help message (optional)
  -p POSTCODE, --postcode POSTCODE
                        Postcode to parse - should include a space and be
                        wrapped in double quotes (optional)
  -n NUMBER, --number NUMBER
                        House number to parse (optional)
  -u UPRN, --uprn UPRN  UPRN to parse (optional)
```
The basic command to execute a script is:

```bash
python collect_data.py <council_name> "<collection_url>"
```

where `council_name` is the name of the council's .py script (without the .py) and `collection_url` is the URL to scrape. The help documentation refers to these as "module" and "URL", respectively. Supported council scripts can be found in the `uk_bin_collection/uk_bin_collection/councils` folder.

Some scripts require additional parameters, for example when a UPRN is not passed in a URL, or when the script is not scraping a web page. For example, the Leeds City Council script needs two additional parameters - a postcode and a house number. This is done like so:

```bash
python collect_data.py LeedsCityCouncil https://www.leeds.gov.uk/residents/bins-and-recycling/check-your-bin-day -p "LS1 2JG" -n 41
```
- A postcode can be passed with `-p "postcode"` or `--postcode "postcode"`. The postcode must always include a space in the middle and be wrapped in double quotes (due to how command line arguments are handled).
- A house number can be passed with `-n number` or `--number number`.
- A UPRN reference can be passed with `-u uprn` or `--uprn uprn`.
To check which parameters your council's script needs, please see the project wiki.
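If you want to call a scraper from your own Python code rather than the command line, a minimal sketch along these lines may help. It assumes you run it from the directory containing `collect_data.py`, reuses the Leeds example values from above, and relies on the scripts printing their results as JSON to stdout:

```python
# Minimal sketch: run a council script as a subprocess and load its JSON output.
# Uses the Leeds example from above - swap in your own council, URL and arguments.
import json
import subprocess

result = subprocess.run(
    [
        "python",
        "collect_data.py",
        "LeedsCityCouncil",
        "https://www.leeds.gov.uk/residents/bins-and-recycling/check-your-bin-day",
        "-p", "LS1 2JG",
        "-n", "41",
    ],
    capture_output=True,
    text=True,
    check=True,
)

bin_data = json.loads(result.stdout)  # the scripts emit their results as JSON
print(bin_data)
```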
Some scripts rely on external packages to function. A list of dependencies for both development and execution can be found in the project's `pyproject.toml`. They can be installed with `poetry install` from the root of the repo.
Some councils make use of the UPRN (Unique Property Reference Number) to identify your property. You can find yours here or here.
Some councils need Selenium to run the scrape on behalf of Home Assistant. The easiest way to do this is to run Selenium in a Docker container. However you do this, the Home Assistant server must be able to reach the Selenium server.
- Download Docker Desktop for Windows:
  - Go to the Docker website: Docker Desktop for Windows
  - Download and install Docker Desktop.
- Run Docker Desktop:
  - After installation, run Docker Desktop.
  - Follow the on-screen instructions to complete the setup.
  - Ensure Docker is running by checking the Docker icon in the system tray.
- Install Docker:

  Open a terminal and run the following commands:

  ```bash
  sudo apt-get update
  sudo apt-get install \
      apt-transport-https \
      ca-certificates \
      curl \
      gnupg \
      lsb-release
  curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
  echo \
      "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu \
      $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
  sudo apt-get update
  sudo apt-get install docker-ce docker-ce-cli containerd.io
  ```

- Start Docker:

  ```bash
  sudo systemctl start docker
  ```

- Enable Docker to start on boot:

  ```bash
  sudo systemctl enable docker
  ```
- Download Docker Desktop for Mac:
  - Go to the Docker website: Docker Desktop for Mac
  - Download and install Docker Desktop.
- Run Docker Desktop:
  - After installation, run Docker Desktop.
  - Follow the on-screen instructions to complete the setup.
  - Ensure Docker is running by checking the Docker icon in the menu bar.
- Open a terminal or command prompt.
- Pull the Selenium Standalone Chrome image:

  ```bash
  docker pull selenium/standalone-chrome
  ```

- Run the Selenium Standalone Chrome container:

  ```bash
  docker run -d -p 4444:4444 --name selenium-chrome selenium/standalone-chrome
  ```

- Navigate to the Selenium server URL in your web browser:
  - Open a web browser and go to http://localhost:4444
  - You should see the Selenium Grid console.
- Find the UKBinCollectionData project:
  - Go to the GitHub repository: UKBinCollectionData
- Supply the Selenium Server URL:
  - Typically, the URL will be http://localhost:4444/wd/hub
  - You might need to update a configuration file or environment variable in the project to use this URL. Check the project's documentation for specific instructions.
Windows/Linux/Mac:

```bash
docker pull selenium/standalone-chrome
docker run -d -p 4444:4444 --name selenium-chrome selenium/standalone-chrome
```

Selenium Server URL: http://localhost:4444/wd/hub
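Before pointing the integration at your Selenium server, you may want to confirm it is reachable from the machine that will run the scrapes. A minimal sketch, assuming the `selenium` Python package is installed and the container above is running:

```python
# Minimal sketch: connect to the remote Selenium server and load a page to
# confirm it is reachable. The URL below matches the container started above -
# swap it for wherever your Selenium server actually runs.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless")

driver = webdriver.Remote(
    command_executor="http://localhost:4444/wd/hub",
    options=options,
)
try:
    driver.get("https://www.example.com")
    print(driver.title)  # any title printed means the round trip worked
finally:
    driver.quit()
```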
To make a request for your council, first check the Issues page to make sure it has not already been requested. If not, please fill in a new Council Request form, providing as much information as possible, including:
- Name of the council
- URL to bin collections
- An example postcode and/or UPRN (whichever is relevant)
- Any further information
Please be aware that this project is run by volunteer contributors and completion depends on numerous factors - even with a request, we cannot guarantee if/when your council will get a script.
We have created an API for this, located under `uk_bin_collection_api_server`.
- Docker installed on your machine
- Python (if you plan to run the API locally without Docker)
- Clone this repository.
- Navigate to the `uk_bin_collection_api_server` directory of the project.
- Build and run the Docker image:

  ```bash
  docker build -t ukbc_api_server .
  docker run -p 8080:8080 ukbc_api_server
  ```
Once the Docker container is running, you can access the API endpoints:
- API Base URL: http://localhost:8080/api
- Swagger UI: http://localhost:8080/api/ui/
The API documentation can be accessed via the Swagger UI. Use the Swagger UI to explore available endpoints, test different requests, and understand the API functionalities.
`GET /bin_collection/{council}`

Description: Retrieves bin collection information for the specified council.

Parameters:

- `council` (required): Name of the council.
- Other optional parameters: [Specify optional parameters if any]

Example request:

```bash
curl -X GET "http://localhost:8080/api/bin_collection/{council}" -H "accept: application/json"
```
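The same request can be made from Python. A minimal sketch, assuming the `requests` package is installed, the container from the steps above is running on localhost:8080, and `{council}` is replaced with a supported council module name (any extra parameters your council needs are omitted here):

```python
# Minimal sketch: query the API for one council and print the JSON payload.
# "CheshireEastCouncil" is illustrative only - use a council module name the API
# actually supports, plus whatever extra parameters that council requires.
import requests

response = requests.get(
    "http://localhost:8080/api/bin_collection/CheshireEastCouncil",
    headers={"accept": "application/json"},
    timeout=30,
)
response.raise_for_status()  # raise if the API returned an error status
print(response.json())
```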
This includes the Selenium `standalone-chrome` container for Selenium-based councils.
```yaml
version: '3'

services:
  ukbc_api_server:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "8080:8080" # Adjust the ports as needed
    depends_on:
      - selenium

  selenium:
    image: selenium/standalone-chrome:latest
    ports:
      - "4444:4444"
```

```bash
sudo apt-get update
sudo apt-get install docker-compose
docker-compose up
```
Please post in the Home Assistant thread or raise a new (non-council-request) issue.
Contributions are always welcome! See CONTRIBUTING.md to get started. Please adhere to the project's code of conduct.

- If you're new to coding/Python/BeautifulSoup, feel free to check here for issues that are good for newcomers!
- If you would like to try writing your own scraper, feel free to fork this project and use existing scrapers as a base for your approach (or `councilclasstemplate.py`) - a rough sketch of a scraper's shape follows below.
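If it helps to picture a scraper before you start, here is a sketch modelled on the existing council classes. The base class, method signature, output keys, and CSS selectors shown are assumptions drawn from current scrapers and may drift, so treat `councilclasstemplate.py` and the real council scripts as the source of truth:

```python
# Rough sketch of a council scraper - the CSS selectors are placeholders, and the
# exact base class and output keys should be checked against councilclasstemplate.py.
from bs4 import BeautifulSoup

from uk_bin_collection.uk_bin_collection.get_bin_data import AbstractGetBinDataClass


class CouncilClass(AbstractGetBinDataClass):
    def parse_data(self, page, **kwargs) -> dict:
        # `page` is the fetched council page; parse it and return bin data as a dict.
        soup = BeautifulSoup(page.text, features="html.parser")
        data = {"bins": []}

        # Placeholder selectors: pull each collection row's bin type and next date.
        for row in soup.select(".bin-collection-row"):
            data["bins"].append(
                {
                    "type": row.select_one(".bin-type").get_text(strip=True),
                    "collectionDate": row.select_one(".collection-date").get_text(strip=True),
                }
            )

        return data
```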