Automate lead generation by extracting business data from Google Maps and enriching it with contact information from company websites. Perfect for sales teams, marketers, and researchers who need to build targeted contact lists at scale.
- Stage 1: Google Maps Mining - Extract business information (name, address, phone, website, rating, reviews)
- Stage 2: Website Enrichment - Automatically visit websites to find email addresses and owner information
- Intelligent Filtering - Filter by required words (e.g., "GmbH", "Co. KG", "AG") before processing
- Website Requirement - Option to only save entries with websites for Stage 2 processing
- Early Filtering - Checks company names in search results before clicking (massive speed boost)
- Parallel Processing - Configurable workers for Stage 2 (default: 10 concurrent threads)
- Smart Delays - Human-like behavior with randomized delays
- Browser Selection - Choose Safari, Chrome, or Edge
- Apple-Inspired Design - Clean, elegant interface with smooth animations
- Real-Time Updates - Live progress tracking via WebSocket
- Activity Logs - Color-coded logs for easy monitoring
- Statistics Dashboard - Track businesses scraped, emails found, and more
- Responsive Design - Works perfectly on all screen sizes
- English and German translations
- Easy to extend with additional languages
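The "smart delays" above are essentially randomized pauses between browser actions. A minimal sketch (the helper name is illustrative, not the project's actual API):

```python
import random
import time

def smart_delay(min_s: float = 2.0, max_s: float = 5.0) -> float:
    """Sleep for a random, human-like interval and return its length."""
    pause = random.uniform(min_s, max_s)
    time.sleep(pause)
    return pause
```

The same helper can be reused for the longer scroll delays by passing a wider range, e.g. `smart_delay(3.0, 7.0)`.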
- Python 3.8+
- Node.js 20.19+ or 22.12+
- Safari (built-in on Mac) or Chrome/Edge browser
- Clone the repository

  ```bash
  git clone <your-repo-url>
  cd MapMiner
  ```

- Backend Setup

  ```bash
  cd backend
  python3 -m venv venv
  source venv/bin/activate   # On Mac/Linux
  # OR
  venv\Scripts\activate      # On Windows
  pip install -r requirements.txt
  ```

- Frontend Setup

  ```bash
  cd frontend
  npm install
  ```

Option 1: Use the start script (Mac/Linux)

```bash
./start.sh
```

Option 2: Manual start
Terminal 1 - Backend:

```bash
cd backend
source venv/bin/activate
python app.py
```

Terminal 2 - Frontend:

```bash
cd frontend
npm run dev
```

Open your browser to http://localhost:5173
- Search Term - What you're looking for (e.g., "Autohaus", "Restaurant", "Hotel")
- Cities - Comma-separated list (e.g., "Berlin, München, Hamburg")
- Entries per City - How many matching results to collect per city
- Required Words - Filter by company type (e.g., "GmbH, Co. KG, AG")
- Browser - Choose Safari, Chrome, or Edge
- Require Website - Only save entries with websites (recommended for Stage 2)
- Run Stage 2 - Enable/disable website enrichment
- Min/Max Delay - Random delays between actions (2-5s default)
- Scroll Delays - Delays when scrolling results (3-7s default)
- Max Workers - Parallel threads for Stage 2 (10 default, max 50)
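Taken together, the options above might be sent from the UI to the backend as a payload like the following sketch (all field names are assumptions, not the actual request schema):

```python
# Illustrative job payload mirroring the UI options above.
# Field names are assumptions, not the project's real API schema.
DEFAULT_JOB = {
    "search_term": "Autohaus",
    "cities": ["Berlin", "München", "Hamburg"],
    "entries_per_city": 20,
    "required_words": ["GmbH", "Co. KG", "AG"],
    "browser": "safari",          # "safari" | "chrome" | "edge"
    "require_website": True,      # recommended for Stage 2
    "run_stage2": True,
    "min_delay": 2.0,             # seconds
    "max_delay": 5.0,
    "scroll_delay_min": 3.0,
    "scroll_delay_max": 7.0,
    "max_workers": 10,            # Stage 2 parallelism (max 50)
}
```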
Stage 1: Google Maps Mining
- Searches Google Maps for your search term in each city
- Filters results by required words (if specified)
- Only clicks listings that match your criteria (speed optimization!)
- Extracts: name, address, phone, website, rating, reviews
- Saves to CSV in `backend/output/`
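The early required-words filter can be sketched as follows (the function name is an assumption, not the project's actual code); an empty filter matches everything:

```python
def matches_required_words(name: str, required_words: str) -> bool:
    """True if any comma-separated required word appears in the business name.

    An empty filter string matches every name, so filtering is opt-in.
    """
    words = [w.strip().lower() for w in required_words.split(",") if w.strip()]
    return not words or any(w in name.lower() for w in words)
```

Because this check runs on the listing title in the search results, non-matching entries are skipped before the scraper ever clicks into them.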
Stage 2: Website Enrichment
- Visits each website in parallel
- Extracts email addresses using smart patterns
- Finds owner/manager names (German patterns supported)
- Updates CSV with contact information
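A sketch of how Stage 2's parallel email extraction could work, combining a regex with a thread pool (the regex, function names, and worker wiring are illustrative, not the project's actual implementation):

```python
import re
from concurrent.futures import ThreadPoolExecutor

# Simple, permissive email pattern; real-world scrapers often add
# obfuscation handling (e.g. "info [at] firma.de").
EMAIL_RE = re.compile(r"[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}")

def extract_emails(page_text: str) -> list[str]:
    """Return unique email addresses found in a page's text, sorted."""
    return sorted(set(EMAIL_RE.findall(page_text)))

def enrich_all(pages: dict[str, str], max_workers: int = 10) -> dict[str, list[str]]:
    """Extract emails from many already-fetched pages in parallel."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = pool.map(lambda kv: (kv[0], extract_emails(kv[1])), pages.items())
    return dict(results)
```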
Stage 3: Download
- Download your completed CSV file with all data
CSV files are saved in backend/output/ with the format:
{search_term}_{timestamp}.csv
Example: Autohaus_20260203_104530.csv
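Building that timestamped name is a one-liner; a sketch (the function name is illustrative):

```python
from datetime import datetime

def output_filename(search_term: str) -> str:
    """Build the {search_term}_{timestamp}.csv output name."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return f"{search_term}_{stamp}.csv"
```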
- `name` - Business name
- `address` - Full address
- `phone` - Phone number
- `website` - Website URL
- `rating` - Google Maps rating
- `reviews` - Number of reviews
- `email` - Email address(es) found
- `owner` - Owner/manager name found
```
MapMiner/
├── backend/
│   ├── output/                     # CSV output files
│   ├── app.py                      # Flask server + WebSocket
│   ├── scraper_orchestrator.py     # Workflow manager
│   └── requirements.txt            # Python dependencies
├── frontend/
│   ├── src/
│   │   ├── components/             # React components
│   │   ├── App.jsx                 # Main app
│   │   └── translations.js         # i18n
│   └── package.json                # Node dependencies
├── maps_scraper_configurable.py    # Stage 1: MapMiner
├── website_scraper_configurable.py # Stage 2: Enrichment
└── README.md                       # This file
```
Safari (Mac only):

```bash
safaridriver --enable
```

Chrome/Edge:

- Automatically managed by Selenium
- WebDriver downloads happen automatically
None required! Everything is configured through the UI.
- Google Maps structure may have changed
- Try a different browser
- Check your internet connection
- Ensure scraping completed successfully
- Check that the `backend/output/` directory exists
- Restart the backend server
- Check browser console for errors (F12)
- Verify all dependencies are installed: `npm install`
- Clear cache and hard reload (Cmd+Shift+R / Ctrl+Shift+R)
- Try a different browser (Safari → Chrome → Edge)
- Reduce entries per city
- Increase delays in advanced settings
- Use Required Words filter - Dramatically speeds up scraping by filtering before clicking
- Enable "Require Website" - Skips entries without websites (faster Stage 1)
- Increase Max Workers - More parallel processing in Stage 2 (20-30 for fast machines)
- Start small - Test with 1-2 cities and low entry count first
- Monitor logs - Watch for errors or rate limiting
- Respect Google Maps Terms of Service
- Respect robots.txt files
- Use reasonable delays to avoid overwhelming servers
- Use scraped data responsibly and ethically
- Consider privacy laws (GDPR, etc.) when handling contact information
Backend:

```bash
cd backend
source venv/bin/activate
pip install --upgrade -r requirements.txt
```

Frontend:

```bash
cd frontend
npm update
```

This project is for educational purposes. Please ensure you comply with all applicable laws and terms of service when using this tool.
Made with ❤️ for efficient data collection


