This project is dormant
The goal is to have a search provider that can be deployed with a few commands.
Before going too far down this path, it is worth noting that a deployment solution (most likely docker-compose) will be developed going forward to make the following easier to deploy.
# debian-based system requirements
sudo apt-get install python3-pip git
Install openbazaar-go using the appropriate method, i.e. install-linux, install-pi3, or install-osx.
Secure your openbazaar-go server
openbazaar-go init
openbazaar-go gencerts
openbazaar-go setapicreds
Parameters to connect to your crawling node are stored in postactivate.
# get the code
git clone https://github.com/BazaarDog/bazaar-dog-search.git
cd bazaar-dog-search
# install a python3 virtual environment
python3 -m venv .bazaar_dog_venv
source .bazaar_dog_venv/bin/activate
# Fill out the fields in postactivate
source postactivate
# install project requirements
pip install -r requirements.txt
./manage.py makemigrations ob
./manage.py migrate
./manage.py bootstrap
./manage.py runserver
In order to keep passwords and local configuration out of harm's way, one approach is to store them in environment variables. Currently, configuration variables are stored in postactivate; the base configuration assumes an openbazaar-go server on 127.0.0.1:4003 with no auth or SSL.
Each time you change your configuration, you'll need to rerun:
source postactivate
In the future, these variables will likely move to an ansible-vault type configuration.
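As an illustration, the Django settings can pull these values out of the environment with defaults matching the base configuration. The variable names below are hypothetical, not the actual keys used by postactivate:

```python
import os

# Hypothetical variable names -- check postactivate for the actual keys.
OB_API_HOST = os.environ.get("OB_API_HOST", "127.0.0.1")
OB_API_PORT = int(os.environ.get("OB_API_PORT", "4003"))
OB_API_USE_SSL = os.environ.get("OB_API_USE_SSL", "false").lower() == "true"

# Base URL for the openbazaar-go API (defaults: no auth, no SSL).
OB_API_URL = "{}://{}:{}".format(
    "https" if OB_API_USE_SSL else "http", OB_API_HOST, OB_API_PORT
)
```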
A management command is provided to bootstrap a node. It uses a list of peers stored in bootstrap/known_nodes.py in the variable good_nodes.
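The exact contents of bootstrap/known_nodes.py are up to the operator; conceptually it is just a module-level list of peerID strings, along these lines (the IDs here are placeholders, not real peers):

```python
# bootstrap/known_nodes.py -- illustrative placeholders, not real peerIDs
good_nodes = [
    "QmExamplePeerIdAaAaAaAaAaAaAaAaAaAaAaAaAaAaAa",
    "QmExamplePeerIdBbBbBbBbBbBbBbBbBbBbBbBbBbBbBb",
]
```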
It can be run from the project directory, inside the Python virtual environment, as a management command:
./manage.py bootstrap
It should spew an inordinate amount of text and load the profiles, listings, and reviews of those peers into the database. If it fails, it can be rerun without duplicating the crawling already done.
In addition to the list of 'safe' peerIDs, it also loads their moderators, reviewers, etc., so the resulting database may contain content that is NSFW or illegal in your area.
The OpenBazaar reference client prefers search engines on real domain names rather than 'localhost' or '127.0.0.1', so the easiest approach is to fake it by mapping the development port to a standard port (80/443) and faking the DNS.
On a debian-based system, the following commands should redirect traffic from port 80 to your development server on port 8000:
sudo iptables -t nat -I PREROUTING --src 0/0 --dst 127.0.0.1 -p tcp --dport 80 -j REDIRECT --to-ports 8000
sudo iptables -t nat -I OUTPUT --src 0/0 --dst 127.0.0.1 -p tcp --dport 80 -j REDIRECT --to-ports 8000
For the purposes of testing, admin.bazaar.dog can be pointed at your localhost in your /etc/hosts file or DNS.
If you choose something else for your domain name, be sure to add it to ALLOWED_HOSTS in settings.
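Assuming the standard Django ALLOWED_HOSTS setting, the entry would look something like the following (the host names are examples):

```python
# settings -- host names here are examples, not the project's defaults
ALLOWED_HOSTS = [
    "admin.bazaar.dog",  # test domain pointed at localhost
    "localhost",
    "127.0.0.1",
]
```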
Translations are configured in settings.base, with the translation source files in locale. There are a few django-admin commands that are useful for building international support:
django-admin makemessages -a # make all language po files for all languages in settings
# alternatively
# django-admin makemessages -l de # make just the German po files
# django-admin makemessages -l ko # create po files for Korean
django-admin compilemessages # compile language files for use
OpenBazaar listings may be denominated in any currency, which is an issue when comparing apples to oranges.
For the purposes of sorting by price, it's useful to convert all listings to a common base currency and store that price for speed and usability.
This is done by downloading exchange rates, converting all prices to a common currency (i.e. BCH), and storing the price in a variable called price_value on the listing.
./manage.py values
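The conversion itself is straightforward arithmetic. A minimal sketch, assuming rates are expressed as units of the listing's currency per 1 BCH (the function name and rate values are illustrative, not the project's actual API):

```python
# Illustrative rates: how many units of each currency equal 1 BCH.
RATES_PER_BCH = {"USD": 250.0, "EUR": 230.0, "BCH": 1.0}

def to_price_value(amount, currency, rates=RATES_PER_BCH):
    """Convert a listing price into the common base currency (BCH here)."""
    return amount / rates[currency]

# A 500 USD listing would be stored with price_value = 2.0 BCH.
```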
The following function is provided as a convenience; it simply fetches exchange rates, waits three seconds, then calls update_price_values:
ob.tasks.valuation.update_price_values()
Several functions are not provided, to encourage innovation, for legal reasons, and for security through obfuscation:
- get_listing_rank returns the rank of a listing.
- get_profile_rank returns the rank of a profile.
- mark_scammers will mark all listed profiles as scams, causing them not to appear. The list is excluded for legal reasons.
- mark_dust marks transactions as dust which fall below some percentage fee threshold.
- node_list is a list of 'good nodes', which an operator may choose to change.
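Since ranking is intentionally left to each operator, any implementation is a design choice. A deliberately naive sketch of what a get_listing_rank might weigh (every field name and weight below is hypothetical):

```python
def get_listing_rank(listing):
    """Naive example ranking over a dict of crawled listing fields.

    The fields and weights are hypothetical; a real operator would tune
    these (and keep them private) against their own data.
    """
    score = 2.0 * listing.get("rating_average", 0.0)
    score += 0.5 * listing.get("rating_count", 0)
    if listing.get("nsfw", False):
        score -= 10.0  # push flagged content to the bottom
    return score
```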
This code is not terribly efficient at crawling the IPFS network at the moment. There are several tools which attempt to make it more efficient:
- ob.tasks.ping contains functions to discover which nodes are online.
- profile.should_update() tries to prevent oversampling.
- listing.sync() checks whether the checksum differs when called.
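The oversampling guard can be as simple as a timestamp check. A sketch under the assumption that each profile records when it was last crawled (the field name and interval are assumptions, not the project's actual logic):

```python
from datetime import datetime, timedelta

# Assumed re-crawl threshold; the real interval is an operator choice.
CRAWL_INTERVAL = timedelta(hours=6)

def should_update(last_crawled, now=None):
    """Return True when a peer has never been crawled or is stale enough."""
    now = now or datetime.utcnow()
    return last_crawled is None or (now - last_crawled) >= CRAWL_INTERVAL
```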
There are several commands to aid in management.
Crawl a list of "good nodes":
./manage.py bootstrap
Mark listings as reported for various reasons, i.e. scam:
./manage.py blacklist --reason scam --peerID Qm...
Crawl:
./manage.py crawl
Update online nodes:
This command pings up to 200 nodes marked as online in the database, and a similar number of nodes marked as offline, to see if their status has changed.
./manage.py online
The postgres user must have permission to create a test database on the fly.
postgres=# ALTER USER bazaardog CREATEDB;
coverage run manage_base.py test
./manage.py dumpdata ob --format=json --indent=4 > ob/fixtures/`date +%Y%m%d`.json
Future plans are dependent on gossipsub (ipfs) features coming online, and openbazaard migrating off a fork of go-ipfs.