SCRAPEGMAPS, SCRAPEWEBS, and VERIFYEMAILS Softwares

SCRAPEGMAPS, SCRAPEWEBS, and VERIFYEMAILS are powerful industrial software for scraping, extracting, and verifying bulk potential contact data, based on input search keywords that target a product or industry name and a target location.
Suggested hardware:

* CPU: Xeon E5-2600 V4 Family (suggested models 2650, 2660, 2680, 2683, 2690, 2695, 2697, 2698, with 24~48 threads per CPU). The Xeon E5-2600 V3 Family or Xeon E5-2600 V2 Family also work well in the models with high thread counts; they are cheaper, but they draw more power and always run at a higher temperature.
* Motherboard: HUANANZHI X99-F8D Plus (supports dual Xeon CPUs). The Supermicro X10DRL-i or X10DAL-i are better options but cost more. HUANANZHI is a motherboard brand from China and the cheapest option, but it is good enough (we have used it for over a year without any problem); Supermicro is a more reputable brand in datacenter hardware, so choose it if you don't care about build cost.
* GPU: Nvidia GeForce GTX-1080/RTX-2080 or higher/newer models.

(*) This suggestion is good enough to run our software while keeping the build cost down. If you want higher speed or run a heavier workload and don't care about build cost, you can use high-end Xeon E7 CPUs, which have the same thread counts as the Xeon E5 CPUs suggested above but can run on motherboards with 4 or 8 CPU sockets, or use newer Intel Xeon Scalable models. More threads mean more workload, so you also need more memory at higher speed, and the GPU needs to be a higher model if the process constantly runs above 80% of its capacity.
* OS: Windows 10 Pro 64bit, Windows 10 Pro for Workstation 64bit, or Windows 11 Pro 64bit. Windows 10 is strongly recommended because we have tested our software mostly on Windows 10 and it is stable. Windows 11 also works, but we have not tested it for long, so stable operation is not confirmed; if you use the software on Windows 11 and find any bugs, please contact us so we can fix them and release an update patch.
Our software uses Redis and Memcached caching technology to optimize its efficiency, so you need to install a caching server, configure the connection from our software to that caching server, and start the caching server before running our software. If you are not an IT professional, don't worry: we have prepared an easier solution for you. You can use our pre-installed caching server, which runs as a Docker Container through the Docker Desktop software; it is very easy to do and needs no reconfiguration of our software. This is a totally free solution; you don't need to pay any more money for it. Here are the detailed instructions for installing and using our pre-installed caching server:
1. Download Docker Desktop from https://www.docker.com/products/docker-desktop: select and download the version for Windows 64bit, then install it.
2. Open Docker Desktop » search for our pre-installed caching server as a Docker Image with the keyword flbox/fl-cache, then download it.
3. Go to the Images menu » run the downloaded flbox/fl-cache Docker Image to create a Docker Container (only needed for the first run). Remember to expose connection ports 11211 and 6379 for the new Docker Container from Optional Settings when you Run a new container (also required only for the first run); no other settings need to be configured. You can name the new container FL-CACHE or any name you want.
4. Start the Docker Container: in Docker Desktop » click run on the created Docker Container from the Containers menu » then run our software.
1. Go to Members Area » Download Software » download file scrapegmaps.zip, then unzip the downloaded file (you can only access it after subscribing to the software).
2. Open & edit file config.ini with any text editor:
   - [general] config section » replace the default cache_server IP with your caching server IP (the machine where the Redis server and Memcached server are installed), and add cache_password if your Redis server requires an authentication password (example: cache_password = Your_Password). Only do this step if you don't use our pre-installed caching server (see details).
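For example, if your own caching server ran at 192.168.1.50 (a made-up address) and your Redis server required a password, the [general] section might look like this (the exact key layout in your config.ini may differ; cache_server and cache_password are the keys named above):

```ini
[general]
; Example IP - replace with the address of the machine running
; your Redis and Memcached servers.
cache_server = 192.168.1.50
; Only needed if your Redis server requires authentication.
cache_password = Your_Password
```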
   - [scrapegmaps] config section » replace the default maximum scrape thread count (threads = 100) with your computer's CPU thread count if the default configuration is lower.
3. Go to the txt folder » open & edit file scrapegmaps-keywords.txt with any text editor, one search keyword per line, for example:
"Hair Salon in London, United Kingdom"
"Hair Salon in Manchester, United Kingdom"
"Hair Salon in Leeds, United Kingdom"
...
"Hair Salon in Barking and Dagenham, London, United Kingdom"
"Hair Salon in Barnet, London, United Kingdom"
"Hair Salon in Bexley, London, United Kingdom"
...
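If you target many cities, you don't have to type every keyword line by hand. A small helper (ours, not part of the software) can generate the file; the city list below is illustrative:

```python
# Generate scrapegmaps-keywords.txt lines from a list of target cities.
# The city list is illustrative - substitute your own target locations.
cities = ["London", "Manchester", "Leeds"]
keywords = [f'"Hair Salon in {city}, United Kingdom"' for city in cities]

with open("scrapegmaps-keywords.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(keywords))

print(keywords[0])  # "Hair Salon in London, United Kingdom"
```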
4. Go to the txt folder » open & edit file proxies_http.txt with any text editor: replace the sample HTTP proxy entries with your own HTTP proxies, one proxy per line. Supported formats:
ipv4:port
user:password@ipv4:port
user:password:ipv4:port (*)
ipv4:port:user:password (*)
(*) The proxy user or proxy password must contain at least one latin character.

Our software only supports HTTP proxies, so please use HTTP proxies only; if you add other proxy types, the software will not work. Proxies protect your real IP address from being blacklisted by Google and make sure our scrape process will not be stopped by Google's bot detector.
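Before starting a large scrape, it can help to sanity-check the proxy file. Here is a small format checker (our helper, not shipped with the software) matching the four formats listed above; note it does not enforce the (*) latin-character rule:

```python
# Quick format check for lines of proxies_http.txt.
# Accepts the four formats listed above; the (*) rule (user/password
# must contain a latin character) is not enforced here.
import re

IPV4 = r"\d{1,3}(?:\.\d{1,3}){3}"
PATTERNS = [
    re.compile(rf"^{IPV4}:\d+$"),                    # ipv4:port
    re.compile(rf"^[^:@\s]+:[^:@\s]+@{IPV4}:\d+$"),  # user:password@ipv4:port
    re.compile(rf"^[^:@\s]+:[^:@\s]+:{IPV4}:\d+$"),  # user:password:ipv4:port
    re.compile(rf"^{IPV4}:\d+:[^:@\s]+:[^:@\s]+$"),  # ipv4:port:user:password
]

def looks_like_http_proxy(line: str) -> bool:
    """True if the stripped line matches one of the supported formats."""
    line = line.strip()
    return any(p.match(line) for p in PATTERNS)

print(looks_like_http_proxy("1.2.3.4:8080"))           # True
print(looks_like_http_proxy("user:password@1.2.3.4:8080"))  # True
print(looks_like_http_proxy("socks5://1.2.3.4:1080"))  # False
```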
5. Start the caching server, or our pre-installed caching server (which can run in a Docker Container - see the details here).
6. Run the software with run.bat (double-click on it).
7. Enter a campaign name (the software will save the scraped data into a database file named campaign_name.db inside the db folder, which is inside the software's unzipped folder).
8. If you are asked for the license key of the SCRAPEGMAPS software for registration, please enter it.
9. You can resume a task (by its campaign name) if the previous task crashed or stopped for any reason before it completed: enter the same campaign name again, and the software will continue writing data to the database file with that campaign name.
1. Go to Members Area » Download Software » download file scrapewebs.zip, then unzip the downloaded file (you can only access it after subscribing to the software).
2. Open & edit file config.ini with any text editor:
   - [general] config section » replace the default cache_server IP with your caching server IP (the machine where the Redis server and Memcached server are installed), and add cache_password if your Redis server requires an authentication password (example: cache_password = Your_Password). Only do this step if you don't use our pre-installed caching server (see details).
   - [scrapewebs] config section » replace the default maximum scrape thread count (threads = 100) with 300% of your computer's CPU thread count if the default configuration is lower.
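As a quick sanity check, the "300% of CPU threads, unless the default 100 is already higher" rule can be computed like this (the variable names are ours, not the software's):

```python
# Compute the suggested [scrapewebs] threads value: 300% of the logical
# CPU thread count, keeping the default 100 as a floor.
import os

cpu_threads = os.cpu_count() or 1      # logical CPU threads on this machine
threads = max(100, cpu_threads * 3)    # 300% of CPU threads, floor of 100
print(f"threads = {threads}")
```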
3. Go to the txt folder » open & edit file scrapewebs-urls.txt with any text editor, one website URL per line, for example:
https://www.myhairandbeautysalon.co.uk/
http://www.elegancesalons.co.uk/
http://www.samolgarhair.co.uk/
...
4. Go to the txt folder » open & edit file proxies_http.txt with any text editor: replace the sample HTTP proxy entries with your own HTTP proxies, one proxy per line. Supported formats:
ipv4:port
user:password@ipv4:port
user:password:ipv4:port (*)
ipv4:port:user:password (*)
(*) The proxy user or proxy password must contain at least one latin character.

Our software only supports HTTP proxies, so please use HTTP proxies only; if you add other proxy types, the software will not work.
5. Start the caching server, or our pre-installed caching server (which can run in a Docker Container - see the details here).
6. Run the software with run.bat (double-click on it).
7. Enter a campaign name (the software will save the scraped data into a database file named campaign_name.db inside the db folder, which is inside the software's unzipped folder).
8. If you are asked for the license key of the SCRAPEWEBS software for registration, please enter it.
9. You can resume a task (by its campaign name) if the previous task crashed or stopped for any reason before it completed: enter the same campaign name again, and the software will continue writing data to the database file with that campaign name.
You can run SCRAPEWEBS with the URL list in txt/scrapewebs-urls.txt as in the instructions above, or with the data scraped by the SCRAPEGMAPS software. The latter requires the entered campaign name to be the same as the file name *.db (without '.db') saved by the SCRAPEGMAPS software, and also requires some steps before running the software:

1. Copy all files extracted from scrapewebs.zip to the folder extracted from scrapegmaps.zip, excluding files config.ini, run.bat, txt/proxies_http.txt, and any *.db files in the db folder that have the same name as *.db files in the db folder inside the destination folder.
2. Open file scrapegmaps/config.ini and file scrapewebs/config.ini with a text editor. From scrapewebs/config.ini, copy the [scrapewebs] section and all configuration lines under it to the end of file scrapegmaps/config.ini. Save and close the files.
3. Rename file scrapegmaps/run.bat to scrapegmaps/run-scrapegmaps.bat and rename file scrapewebs/run.bat to scrapewebs/run-scrapewebs.bat, then copy run-scrapewebs.bat to the scrapegmaps folder.
4. Open file scrapegmaps/txt/proxies_http.txt and file scrapewebs/txt/proxies_http.txt with a text editor. Copy the proxies from scrapewebs/txt/proxies_http.txt that are not duplicates to the end of file scrapegmaps/txt/proxies_http.txt; make sure there are no duplicate proxies and no empty lines in file scrapegmaps/txt/proxies_http.txt. Save and close the files.
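The duplicate-free merge above can also be sketched as a small script (ours, not part of the software):

```python
# Merge proxies from a second file into the first one, skipping
# duplicates and empty lines.
def merge_proxy_files(main_path: str, extra_path: str) -> None:
    with open(main_path, encoding="utf-8") as f:
        seen = [line.strip() for line in f if line.strip()]
    with open(extra_path, encoding="utf-8") as f:
        for line in f:
            proxy = line.strip()
            if proxy and proxy not in seen:
                seen.append(proxy)
    # Rewrite the main file: no duplicates, no empty lines.
    with open(main_path, "w", encoding="utf-8") as f:
        f.write("\n".join(seen) + "\n")

# Example:
# merge_proxy_files("scrapegmaps/txt/proxies_http.txt",
#                   "scrapewebs/txt/proxies_http.txt")
```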
5. Delete the *.db files from the scrapewebs/db folder that have the same name as *.db files in the scrapegmaps/db folder, then move all remaining *.db files from the scrapewebs/db folder to the scrapegmaps/db folder.
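The same delete-then-move step can be sketched in Python (our helper; it removes same-name files from the source folder and moves the rest, as described above):

```python
# Move *.db files from one db folder to another, dropping any file
# whose name already exists in the destination.
import os
import shutil

def move_db_files(src_dir: str, dst_dir: str) -> None:
    for name in os.listdir(src_dir):
        if not name.endswith(".db"):
            continue
        src = os.path.join(src_dir, name)
        if os.path.exists(os.path.join(dst_dir, name)):
            os.remove(src)          # same name already in destination
        else:
            shutil.move(src, dst_dir)

# Example: move_db_files("scrapewebs/db", "scrapegmaps/db")
```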
6. Rename the folder scrapegmaps to scrapedata.
7. Run the SCRAPEWEBS software using file scrapedata/run-scrapewebs.bat.

1. Go to Members Area » Download Software » download file verifyemails.zip, then unzip the downloaded file (you can only access it after subscribing to the software).
with any Text editor:
[general]
config section » replace default config of cache_server
IP with your caching
server IP (which installed Redis
server and Memcached
server), add cache_password
if
your installed Redis
server require an authentication password (example:
cache_password = Your_Password
). Please only do this step if you don't use our pre-installed caching
server (See details).
   - [verifyemails] config section » replace the default maximum thread count (threads = 100) with your computer's CPU thread count if the default configuration is lower.
3. Go to the txt folder » open & edit file verifyemails-list.txt with any text editor, one email address per line, for example:
[email protected]
[email protected]
[email protected]
...
4. Go to the txt folder » open & edit file proxies_socks5.txt with any text editor: replace the sample SOCKS5 proxy entries with your own SOCKS5 proxies, one proxy per line. Supported formats:
ipv4:port
user:password@ipv4:port
user:password:ipv4:port (*)
ipv4:port:user:password (*)
(*) The proxy user or proxy password must contain at least one latin character.

Our software only supports SOCKS5 proxies, so please use SOCKS5 proxies only; if you add other proxy types, the software will not work. Proxies protect your real IP address from being blacklisted by email servers and maximize the number of verified email addresses (if the IP address used to verify email addresses has been blacklisted, we cannot check whether the email addresses exist or not).
When you check a proxy's IP address against blacklists, if the result shows no listings such as SPAM SPAMRATS, SPAMSOURCES FABEL, B BARRACUDACENTRAL, or ALL SPAMRATS, that means the proxy's IP address is clean.
5. Start the caching server, or our pre-installed caching server (which can run in a Docker Container - see the details here).
6. Run the software with run.bat (double-click on it).
7. Enter a campaign name (the software will save the verified data into a database file named campaign_name.db inside the db folder, which is inside the software's unzipped folder).
8. If you are asked for the license key of the VERIFYEMAILS software for registration, please enter it.
9. You can resume a task (by its campaign name) if the previous task crashed or stopped for any reason before it completed: enter the same campaign name again, and the software will continue writing data to the database file with that campaign name.
You can run VERIFYEMAILS with the email list in txt/verifyemails-list.txt as in the instructions above, or with the data extracted by the SCRAPEWEBS software. The latter requires the entered campaign name to be the same as the file name *.db (without '.db') saved by the SCRAPEWEBS software, and also requires some steps before running the software:

1. Copy all files extracted from verifyemails.zip to the folder extracted from scrapewebs.zip, excluding files config.ini, run.bat, and any *.db files in the db folder that have the same name as *.db files in the db folder inside the destination folder.
2. Open file scrapewebs/config.ini and file verifyemails/config.ini with a text editor. From verifyemails/config.ini, copy the [verifyemails] section and all configuration lines under it to the end of file scrapewebs/config.ini. Save and close the files.
3. Rename file scrapewebs/run.bat to scrapewebs/run-scrapewebs.bat and rename file verifyemails/run.bat to verifyemails/run-verifyemails.bat, then copy run-verifyemails.bat to the scrapewebs folder.
4. Delete the *.db files from the verifyemails/db folder that have the same name as *.db files in the scrapewebs/db folder, then move all remaining *.db files from the verifyemails/db folder to the scrapewebs/db folder.
5. Rename the folder scrapewebs to scrapedata.
6. Run the VERIFYEMAILS software using file scrapedata/run-verifyemails.bat.

The SCRAPEGMAPS, SCRAPEWEBS, and VERIFYEMAILS software save the scraped/extracted/verified data into SQLite3 database files: the *.db files inside the db folder. You can easily open the data files using the DB Browser software, then copy the scraped/extracted/verified data to Excel, Google Sheets, or any text editor for further use. You can view this step in the recorded instruction video of the previous steps.
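Besides DB Browser, the *.db files can be read with any SQLite3 client, for example Python's built-in sqlite3 module. The table names inside the files depend on the software, so inspect the schema first; here is a hypothetical helper of ours:

```python
# List the table names inside a campaign .db file using Python's
# built-in sqlite3 module (an alternative to DB Browser).
import sqlite3

def list_tables(db_path: str) -> list[str]:
    """Return the table names stored in an SQLite3 database file."""
    con = sqlite3.connect(db_path)
    try:
        cur = con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")
        return [row[0] for row in cur.fetchall()]
    finally:
        con.close()

# Example: list_tables("db/campaign_name.db")
```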
Download DB Browser from: https://sqlitebrowser.org/dl/

If you need any help, contact us at [email protected] or in our Telegram support group: https://t.me/flboxsoft_support. We are always ready to help you install and use the software.