defining the streamlit rendering component and integration
- adding script from giro3D examples
- defining the wrapper function to render the streamlit component and passing params
dhruvmalik007 committed Aug 28, 2023
1 parent fab0c2b commit 77184c6
Showing 18 changed files with 32,117 additions and 57 deletions.
115 changes: 68 additions & 47 deletions README.md
@@ -1,41 +1,93 @@
# LidarhdBot

Application for running geospatial rendering algorithms on decentralised compute via user prompts sent through tweet / discord messages.
Bot application for running geospatial 3D mesh reconstruction algorithms on decentralised compute via discord messages.


## Status
- [Docker-compose-passing](https://github.com/The-Extra-Project/lidarhdpip/actions/workflows/test-docker-build.yml/badge.svg).
- [Codepipeline-deploy](https://github.com/The-Extra-Project/lidarhdpip/actions/workflows/test-docker-build.yml/badge.svg).


## stack
- **Serverless messaging using**: [upstash]() / [confluent]()
## Stack:

- **application development framework**: [Streamlit]()
- **Serverless messaging using**: [upstash]().

- **compute service on top of** [bacalhau]:
- **rendering visualization using**: [Streamlit]().

- **bot service running on top of** [discord] / [twitter]:
- **compute over data using** [bacalhau]():

- **decentralised storage on top of** [web3.storage]:

- **bot service running on top of** [discord.py]:

## Credits

- [bert lidar rendering pipeline](https://github.com/bertt/nimes): for the description concerning the geospatial data reconstruction algorithms.

- [Dockerised discord bot template]()

- [kafka examples]()

## packages description:

1. [Georender](./georender/): Core package that implements the functions to convert the given pointcloud file to .laz 3D tile format.

2. [Bots](./bots/): Package for discord bot logic and commands.

3. [aws_deployment](./aws_deployment/): Scripts to deploy the necessary infrastructure on registered AWS cloud.

4. [visualization](): The rendering app, written in streamlit, that shows the result once the mesh reconstruction job is completed.


## Build instructions/Setup:

### 1. Locally running the bot:

1. We need to set the following environment variables:
- For the Kafka messaging service: the reference env file is defined in the `.env.example` here.
- Then define the discord configuration in `bots/Discord/config.json`, following the steps defined [here](https://github.com/topics/discord-bot-template).


2. Then build the container application by running the Dockerfile for the streamlit service and subsequently that of the bacalau_sdk service.
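As a sketch of step 1, the env values from `.env.example` can be collected into the keyword arguments shared by the kafka-python producer and consumer. The helper name `build_kafka_config` is ours (an assumption, not part of the repo); the variable names follow `.env.example` and the SASL settings mirror the visualization app's consumer:

```python
# Hypothetical helper: assembles the kafka-python connection settings from
# the values stored in .env (KAFKA_BROKER_URL, SASL_PLAIN_USERNAME,
# SASL_PLAIN_PASSWORD in .env.example).
def build_kafka_config(broker_url: str, username: str, password: str) -> dict:
    """Return the keyword arguments shared by KafkaProducer / KafkaConsumer."""
    return {
        "bootstrap_servers": [broker_url],
        "sasl_mechanism": "SCRAM-SHA-256",
        "security_protocol": "SASL_SSL",
        "sasl_plain_username": username,
        "sasl_plain_password": password,
    }

# Example with placeholder credentials:
config = build_kafka_config("broker.upstash.io:9092", "user", "secret")
```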

### 2. On cloud instance (AWS):

1. Setup the awscli parameters for the account and access token.

2. Then run the scripts as defined below.


```sh
cd aws_deployment/ && cdk synth
# if it's the first time, you will also be prompted to run the following command
cdk bootstrap
```

- Follow the instructions in the [readme]() of the aws_deployment repository in order to grant your account the necessary permissions and resolve errors.

4. Setup the personal [web3.storage]() account and define the access token in the .env file.

5. Store the files (shape / obj) that you want to fetch on web3.storage. We have stored the reference files in [data/](./datas/). Then keep the generated CID as the parameter the user passes for generating the reconstruction job.
- For more details regarding the latest version of the files, check out the following:
    - https://geoservices.ign.fr/lidarhd
    - https://pcrs.ign.fr/version3


1. We need to set environment variables:
- Set up the parameters for the Kafka messaging service. It can be set up on either upstash / confluent, by defining it in utils/kafka/client_properties.py.

2. Then deploy the container application by running the Dockerfile for the streamlit service and subsequently that of the bacalau_sdk service.

### APIs

3. The LiDAR HD tiles that are being processed can be downloaded from:
- https://geoservices.ign.fr/lidarhd
- https://pcrs.ign.fr/version3

Discord bot API:
```markdown
/circum job_point <<Xcoordinate>> <<Ycoordinate>> <<ipfs_shp_file>> <<ipfs_pipeline_template_file>>: runs the reconstruction job for the given point coordinate and returns the job id.

/circum status <<job_id>>: checks status of the reconstruction job with given id.

/circum result <<job_id>>: fetches the result (laz file cid) once the job is completed.

/circum visualize <<ipfs_cid>>: generates the laz file visualization link.
```
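As an illustration of the command grammar above, a small parser sketch. The helper is hypothetical (not part of the bot package); the expected argument counts are taken from the command list:

```python
# Hypothetical helper: splits a "/circum ..." message into its command and
# arguments, validating the argument count against the grammar in the README.
def parse_circum_command(message: str):
    """Return (command, args) for a '/circum ...' message, or None if invalid."""
    parts = message.strip().split()
    if len(parts) < 2 or parts[0] != "/circum":
        return None
    command, args = parts[1], parts[2:]
    # argument counts per command, from the README command list
    expected = {"job_point": 4, "status": 1, "result": 1, "visualize": 1}
    if command not in expected or len(args) != expected[command]:
        return None
    return command, args
```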
# Detailed pipeline of the circum reconstruction process:

```mermaid
---
@@ -65,35 +117,4 @@ flowchart LR
connector:Streaming-results-->user
```

## packages:

1. [Georender](./georender/): core package that implements the functions to convert the given pointcloud file to .laz 3D tile format.

2. [Twitter bot](./twitter_bot/): a wrapper script over the tweepy package, allowing users to interact with their developer account.




# Build and Run instructions





## 2. Then run the images locally

2.1 The steps for the backend are defined in `execute.sh`, but first we need to instantiate the following parameters:


## 2.1 setting up the confluent_kafka pipeline:
There are two ways:

2.1.1 Testing on the free instance:
- Set up the [confluent-cloud]() environment.
- Fetch the API keys and other parameters and store them in the ``.

2.1.2 Setting up a local instance of the [confluent-kafka image](https://hub.docker.com/r/confluentinc/cp-kafka/):
- then add the API parameters in the client.properties file as defined before.
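A minimal `client.properties` sketch with placeholder values (the keys below are the standard Confluent/librdkafka client settings; substitute your own cluster credentials):

```properties
bootstrap.servers=<BOOTSTRAP_SERVER>:9092
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=<API_KEY>
sasl.password=<API_SECRET>
```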

## 3. fetching the files in the given container for processing:

23 changes: 13 additions & 10 deletions docker-compose.yml
@@ -1,10 +1,10 @@
version: "3.9"
services:
georender:
bots:
build:
context: .
dockerfile: ./georender/Dockerfile.pdal
image: lxet/lidar_geocoordinate_bot
context: ./bots/
dockerfile: ./Dockerfile.pdal
image: devextralabs/discord-bot

volumes:
- /usr/app/georender/:/usr/app/georender/
@@ -16,14 +16,17 @@ services:
build:
context: ./bacalau
dockerfile: ./bacalau/Dockerfile.bacalau
image: lxet/las_processing_bacalau
image: devextralabs/bacalau-job-script

ports:
- 8502

circumbot:

visualization:
build:
context: ./app/
dockerfile: ./Docker/Dockerfile
volumes:
- ./app/src/bots/Discord/:/usr/src/app/discordbot/:rw
context: ./visualization
dockerfile: ./Dockerfile.renderer
image: devextralabs/visualization-web

ports:
- 8503
20 changes: 20 additions & 0 deletions visualization/.env.example
@@ -0,0 +1,20 @@
## upstash based kafka parameters.

KAFKA_BROKER_URL=""
SASL_PLAIN_USERNAME=""
SASL_PLAIN_PASSWORD=""



## twitter env parameters

CONSUMER_KEY=""
CONSUMER_KEY_SECRET=""
API_KEY=""
API_KEY_SECRET=""
BEARER_TOKEN=""

## discord env variables



1 change: 1 addition & 0 deletions visualization/.gitignore
@@ -0,0 +1 @@
**/config.json
18 changes: 18 additions & 0 deletions visualization/Dockerfile.renderer
@@ -0,0 +1,18 @@
FROM python:3.8

RUN mkdir -p /usr/src/app/visualization
COPY pyproject.toml /usr/src/app/pyproject.toml
WORKDIR /usr/src/app

ENV PATH="${PATH}:/root/.local/bin"

RUN curl -sSL https://install.python-poetry.org | python3 - \
    && poetry config virtualenvs.create false \
    && poetry install --no-root

COPY . /usr/src/app/visualization
WORKDIR /usr/src/app/visualization

CMD ["streamlit", "run", "app.py"]
29 changes: 29 additions & 0 deletions visualization/README.md
@@ -0,0 +1,29 @@
# Lidarbot/Visualization:

This application package renders a 3D view of the reconstructed mesh from the [bacalhau](../bacalau/) computation result.

## Credits to:
- [discord-bot-template](https://github.com/kkrypt0nn/Python-Discord-Bot-Template).
- [streamlit-component](https://github.com/streamlit/component-template).

## Tech stack:
1. upstash: for integrating the bot with the kafka service.

## build instructions:
1. Define the parameters for the bots:
- For twitter and the kafka provider: provide the parameters as defined by the `.env.example`.
    - `$ cp .env.example .env`
    - then also instantiate a config.json file (as defined by the config-example.json).

- for discord:
    - the details are defined in the [readme](https://github.com/kkrypt0nn/Python-Discord-Bot-Template/blob/main/README.md) of the python discord bot template.

- invite your bot by replacing the generated parameters in the URL given [here](https://discord.com/oauth2/authorize?&client_id=1138054674696650842&scope=bot+applications.commands&permissions=2048):




2. Run the docker container.


For more information regarding deployment on the cloud, check out the [aws_deployment](../aws_deployment/) setup.
76 changes: 76 additions & 0 deletions visualization/app.py
@@ -0,0 +1,76 @@
"""
frontend for users to access the 3D reconstructed tile and then render it on the browser
"""
import time
import os
from typing import List

import streamlit as st
from kafka import KafkaConsumer
from dotenv import dotenv_values
from visualizer_component import run_visualization
st.set_page_config(layout="wide")


config = dotenv_values(dotenv_path='../.env')

topic = ['lidarbot_get_visualization']

consumer = KafkaConsumer(
    bootstrap_servers=[config["KAFKA_BROKER_URL"]],
    sasl_mechanism='SCRAM-SHA-256',
    security_protocol='SASL_SSL',
    sasl_plain_username=config["SASL_PLAIN_USERNAME"],
    sasl_plain_password=config["SASL_PLAIN_PASSWORD"],
    auto_offset_reset='earliest',
    consumer_timeout_ms=1000
)

def parse_submit_visualization_documentation() -> List[str]:
    consumer.subscribe(topic)
    ## poll() returns a dict of {TopicPartition: [ConsumerRecord, ...]}
    records = consumer.poll(timeout_ms=1000, max_records=1)
    ## now parsing the details for the visualization of the tiles
    for messages in records.values():
        for message in messages:
            data = message.value.decode('utf-8').split(',')
            print(f"Received data from kafka: {data}")
            tileIpfs = data[0]
            username = data[1]
            return [tileIpfs, username]
    return []

def render_visualization(Xcoord, Ycoord):
    result = parse_submit_visualization_documentation()
    if not result:
        st.warning("no visualization message received from kafka yet")
        return
    [tileIpfs, username] = result
    st.header(f"Visualization for tile {tileIpfs} submitted by {username}")

    with st.container():
        # render the 3D tile visualization component
        time.sleep(1)
        run_visualization(tilesetIPFS=tileIpfs, Xcoordinate=Xcoord, Ycoordinate=Ycoord)


def main():
    st.title("georender: download 3D geospatial database from algorithms running on web3")
    st.text("add your account user-name, select the required geo-coordinates and then get your shp file ready")

    with st.sidebar:
        with st.expander("User-details", False):
            user_name = st.text_input("account user-name")
        with st.expander("add geo-coordinates", False):
            x_coord = st.text_input("X coordinates")
            y_coord = st.text_input("Y coordinates")

    ## the user needs to set the env variable files beforehand.
    if st.button("submit details"):
        try:
            if user_name and x_coord and y_coord:
                render_visualization(Xcoord=x_coord, Ycoord=y_coord)
                st.success("Job submitted successfully")
            else:
                st.error("please enter the user name and both coordinates")
        except Exception as e:
            st.error(f"parameters not complete: {e}")
            st.stop()

if __name__ == "__main__":
    main()
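The app consumes comma-separated utf-8 payloads of the form `tile_cid,username`. A sketch of the symmetric encode/decode helpers (the helper names are illustrative; the wire format is inferred from the consumer's `split(',')`):

```python
# Sketch of the wire format on the 'lidarbot_get_visualization' topic:
# a comma-separated "tile_cid,username" string, utf-8 encoded.
def encode_visualization_message(tile_cid: str, username: str) -> bytes:
    """Encode the payload the bot side would publish to kafka."""
    return f"{tile_cid},{username}".encode("utf-8")

def decode_visualization_message(value: bytes) -> list:
    """Mirror of the parsing done in the visualization app's consumer."""
    tile_cid, username = value.decode("utf-8").split(",")
    return [tile_cid, username]
```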
