ProxyMan 🌐

A simple proxy manager for Python with WebShare API support

Created with ❤️ by @MeshMonitors

Features

✨ Proxy Management:

  • Load proxies from a file
  • Support for IP-based and username-password authentication

🔄 Proxy Refresh:

  • Automatically refresh the proxy list from a remote source
  • Filter proxies based on location (US, UK, FR, or all)

🔀 Random Proxy:

  • Retrieve a random proxy from the loaded list

Installation

ProxyMan can be installed using pip:

pip install ProxyMan

Usage

Loading proxies from a file

from ProxyMan import ProxyMan, AuthType
import os

# Resolve the path to proxies.txt (here: one directory above this file)
current_dir = os.path.dirname(os.path.realpath(__file__))
parent_dir = os.path.dirname(current_dir)
proxies_file = os.path.join(parent_dir, "proxies.txt")

# Load IP-authenticated proxies from the file
pm = ProxyMan(file_path=proxies_file, auth=AuthType.IP)
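The layout of proxies.txt is not documented in this README; a common convention (an assumption here, not confirmed by the project) is one proxy per line, ip:port for IP-authenticated proxies and ip:port:username:password when using AuthType.USER_PASS, for example:

192.0.2.10:8080
192.0.2.11:8080:myuser:mypassword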

Configuring ProxyMan

from ProxyMan import ProxyMan, Filter, AuthType

# Create a ProxyMan instance
webshare_api_key: str = "" # Optional (required to use the refresh feature)
scrape_filter: Filter = Filter.ALL # Optional (default: Filter.ALL) Filter proxies based on location
fail_count: int = 3 # Optional (default: 3) Remove a proxy from the list after it fails this many times
proxies_to_scrape: int = 3000 # Optional (default: 3000)
auth_type: AuthType = AuthType.IP # Optional (default: AuthType.IP) The type of authentication to use (IP or USER_PASS)

pm = ProxyMan(
    api_key=webshare_api_key,
    auth=auth_type,
    scrape_filter=scrape_filter,
    fail_count=fail_count,
    proxies_to_scrape=proxies_to_scrape,
)

# Get a random proxy
proxy: dict = pm.random()

print(proxy) # { "http": "http://0.0.0.0:8080", "https": "http://0.0.0.0:8080" }

Failing a proxy

# If a proxy fails somewhere in your code, report it with increment_bad_proxies
# The proxy is removed from the list after it fails 3 times (or whatever fail_count is set to)

pm.increment_bad_proxies(proxy['http'])
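As a usage sketch (assuming only the ProxyMan API shown above plus the third-party requests package; fetch_with_retry is a hypothetical helper), failures can be reported from a simple retry loop:

import requests
from requests.exceptions import RequestException

def fetch_with_retry(pm, url: str, attempts: int = 5) -> requests.Response:
    # Try up to `attempts` random proxies, reporting each failure to ProxyMan
    for _ in range(attempts):
        proxy = pm.random()
        try:
            return requests.get(url, proxies=proxy, timeout=10)
        except RequestException:
            # Count the failure; after fail_count failures ProxyMan drops the proxy
            pm.increment_bad_proxies(proxy['http'])
    raise RuntimeError(f"All {attempts} proxy attempts failed for {url}")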

Refreshing the proxy list

# Refresh the proxy list from the remote source (requires the WebShare API key)
# This overwrites the current list and rewrites proxies.txt with the new proxies
# update_proxies() is a coroutine, so it must be awaited from async code
await pm.update_proxies()
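If the rest of your code is synchronous, a minimal sketch for driving the refresh (assuming update_proxies takes no arguments, as shown above) is to run the coroutine with asyncio:

import asyncio

async def refresh(pm):
    # Pull a fresh proxy list from WebShare and rewrite proxies.txt
    await pm.update_proxies()

asyncio.run(refresh(pm))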

Getting a random proxy

import requests

# Get a random proxy from the list
proxy: dict = pm.random()

# Make a request through the proxy
response = requests.get("https://example.com", proxies=proxy)
