Fast, scalable local business lead generation powered by Decodo Web Scraping API.
Turn a simple keyword and location into a ready-to-use dataset of local businesses from Google Maps. Perfect for freelancers, agencies, and SMBs who need quality leads at scale.
- Visit the Local Leads Finder website to use it instantly.
- Enter your Decodo username and password directly in the UI when prompted.
- Start a search and download results without any local setup.
```bash
# Clone the repository
git clone https://github.com/yourusername/local-leads-finder.git
cd local-leads-finder

# Install dependencies
pip install -r requirements.txt

# Or install as a package
pip install -e .
```

Get your Scraper API credentials from the Decodo Dashboard:
- Navigate to the "Scraper" tab
- Find your username and password
Note: The web interface has its own credential management - users enter credentials directly in the UI.
For CLI usage, create a .env file:
```bash
# Create .env file
cp .env.example .env

# Add your credentials
echo "DECODO_USERNAME=your_username" >> .env
echo "DECODO_PASSWORD=your_password" >> .env
```

Option A: Web Interface
```bash
cd webapp
python app.py
```

Check the terminal output for the exact URL (default is http://localhost:5000, but another free port may be used) and open it in your browser.
Option B: Command Line
```bash
leads-finder --query "dentist" --city "Toronto" --out leads.csv
```

The web interface provides a modern, user-friendly way to find leads:
1. Start the web server:

   ```bash
   cd webapp
   python app.py
   ```

2. Open your browser to the URL printed in the terminal (defaults to http://localhost:5000, but may vary if that port is in use)

3. Fill in the search form:
   - Business Type (e.g., "dentist", "pizza restaurant")
   - Location and search radius (how far around the location to look)
   - City (e.g., "Toronto", "New York")
   - Results Limit (1-1000)
   - Country (optional)

4. Watch real-time progress as leads are collected

5. View results in an interactive table

6. Export to CSV or JSON with one click
```bash
# Find dentists in Toronto
leads-finder --query "dentist" --city "Toronto" --out toronto_dentists.csv

# Find pizza places in New York (200 results)
leads-finder --query "pizza" --city "New York" --limit 200 --out nyc_pizza.csv

# Find plumbers in Montreal
leads-finder --query "plumber" --city "Montreal" --out montreal_plumbers.csv
```

See all available options with `leads-finder --help`:

| Option | Description | Default |
|---|---|---|
| `--query` | Search keyword (e.g., "dentist", "pizza") | Required |
| `--city` | Target city name | Required |
| `--limit` | Max results to collect | 100 |
| `--out` | Output file (CSV or JSON) | `leads.csv` |
| `--rps` | Requests per second rate limit | 1.0 |
| `--username` | Decodo username | `DECODO_USERNAME` env var |
| `--password` | Decodo password | `DECODO_PASSWORD` env var |
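If you prefer to drive the CLI from a Python script (for example as one step in a larger pipeline), here is a minimal sketch that uses only the flags documented above. It assumes `leads-finder` is installed and on your PATH and that credentials are already set in the environment:

```python
import subprocess

# Credentials are read from the DECODO_USERNAME / DECODO_PASSWORD environment
# variables (see the table above), so they don't need to appear on the command line.
subprocess.run(
    [
        "leads-finder",
        "--query", "dentist",
        "--city", "Toronto",
        "--limit", "50",
        "--rps", "0.5",        # keep request volume conservative
        "--out", "toronto_dentists.csv",
    ],
    check=True,  # raise CalledProcessError if the CLI exits with a non-zero status
)
```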
```bash
# Process multiple cities
for city in "Toronto" "Montreal" "Vancouver"; do
  leads-finder --query "dentist" --city "$city" --out "dentists_$city.csv"
done
```

```bash
# Export as JSON instead of CSV
leads-finder --query "gym" --city "Los Angeles" --out gyms.json
```

Results are written to a CSV (or JSON) file with the following columns:
| Field | Description |
|---|---|
| `name` | Business name |
| `category` | Primary category or type |
| `phone` | Contact number |
| `website` | Business website |
| `rating` | Google Maps rating (1-5) |
| `reviews_count` | Number of reviews |
| `address` | Full address |
| `city` | City name |
| `country` | Country code (US, CA, etc.) |
| `lat` | Latitude |
| `lon` | Longitude |
| `source` | Data source (Google Maps) |
| `scraped_at` | ISO timestamp |
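For quick post-processing of an export, here is a minimal sketch using only Python's standard library and the column names documented above; the `leads.csv` file name and the 4.0 rating threshold are illustrative:

```python
import csv

# Load the exported leads; column names follow the output schema above.
with open("leads.csv", newline="", encoding="utf-8") as f:
    leads = list(csv.DictReader(f))

# Keep leads that list a website and have a rating of at least 4.0.
qualified = [
    row for row in leads
    if row.get("website") and row.get("rating") and float(row["rating"]) >= 4.0
]

print(f"{len(qualified)} of {len(leads)} leads have a website and a 4.0+ rating")
```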
Build the image:

```bash
docker build -t leads-finder .
```

Run a search in a container:

```bash
docker run --rm \
  -e DECODO_USERNAME=your_username \
  -e DECODO_PASSWORD=your_password \
  -v "$(pwd)/output:/out" \
  leads-finder \
  --query "dentist" --city "Toronto" --out /out/leads.csv
```

Run the web interface in a container:

```bash
docker run --rm \
  -p 5000:5000 \
  -e DECODO_USERNAME=your_username \
  -e DECODO_PASSWORD=your_password \
  leads-finder \
  web
```

Open your browser to http://localhost:5000. To use a different port, change the mapping (`-p 8080:8080`) and set `-e PORT=8080`.
```python
from leads_finder.core.scraper_api_session import ScraperAPISession

# Authenticate with your Decodo Scraper API credentials
session = ScraperAPISession(username="user", password="pass")

# Search Google Maps for businesses matching the query
results = session.google_maps_search(
    query="dentist",
    geo="Toronto",
    limit=100
)

businesses = results.get("results", [])
```
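If you want to save those results yourself, one option is a small sketch with the standard `csv` module; it assumes each item in `businesses` is a dict whose keys match the output fields documented above (missing keys are simply left blank):

```python
import csv

# Columns follow the documented output schema.
fields = [
    "name", "category", "phone", "website", "rating", "reviews_count",
    "address", "city", "country", "lat", "lon", "source", "scraped_at",
]

with open("toronto_dentists.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for business in businesses:
        # Fill every documented column, defaulting to an empty string.
        writer.writerow({key: business.get(key, "") for key in fields})
```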
Planned features:

- Social media profiles
- Business hours
- Scheduled scraping
Contributions welcome! Please open an issue or PR.
