Crawl URLs that are stored in a DB

(from github.com/rhayun)
Hello

I know that Fess has the option to connect to a DB and index data from it.
I need something similar, but different.
I have a list of people in a table, together with their homepage URLs. I am trying to get Fess to connect to the DB and pull all of these URLs, but I need it to crawl those URLs rather than index the data that already exists in the DB, as in the example at https://fess.codelibs.org/11.4/admin/dataconfig-guide.html

thanks

(from github.com/marevol)
This sounds like a case for CsvListDataStore.
How about exporting the data to a CSV file and then using CsvListDataStore?
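
For example, here is a minimal export sketch in Python. The table name, column names, SQLite connection, and the `event_type,url` row layout are all assumptions for illustration; check the sample file generated by csvlistdatastore.sh for the exact format CsvListDataStore expects.

```python
import csv
import sqlite3  # stand-in for whatever DB driver you actually use

# Hypothetical schema: people(name, homepage_url)
conn = sqlite3.connect("people.db")
rows = conn.execute(
    "SELECT homepage_url FROM people WHERE homepage_url IS NOT NULL"
)

# Assumed row layout: an event type plus the URL to crawl.
# Verify against the sample produced by csvlistdatastore.sh.
with open("url_list.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, quoting=csv.QUOTE_ALL)
    for (url,) in rows:
        writer.writerow(["create", url])

conn.close()
```

The idea is that the data store reads the URLs from the CSV and the crawler fetches those pages, instead of indexing the DB rows themselves.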

(from github.com/rhayun)
I can do that, but how does that change things? Can I map the URL inside the CSV so that the crawler crawls it? Can I get an example? Thank you

(from github.com/marevol)
See csvlistdatastore.sh.
This script generates a sample file.
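
Once the sample file exists, a rough check like this can confirm that your own export uses the same column layout before you create the data config (the file names below are just placeholders):

```python
import csv

def first_rows(path, n=5):
    """Return up to the first n parsed CSV rows of a file."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row for _, row in zip(range(n), csv.reader(f))]

# Compare the generated sample with your own export.
print(first_rows("sample.csv"))
print(first_rows("url_list.csv"))
```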

(from github.com/rhayun)
Thanks, I'll try it.