Crawling debug

(from github.com/clauded)
Is there a way to get a detailed log of crawling? I'm trying to index a site and tried http://www.mysite.org/.* as well as http://www.mysite.org.* as the indexing URL, but nothing gets indexed (it's a SharePoint site with no robots.txt).

Note: the actual log file does not help.

(from github.com/marevol)
To change the log level for the Crawler, open the "Default Crawler" setting in the Scheduler and change .logLevel("info") to .logLevel("debug") in the Script field.
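The edit above is a one-word change inside the job's Script. A minimal sketch of what that looks like (the surrounding script line is an assumption for illustration; only the logLevel argument is the change being described):

```groovy
// Before (in the "Default Crawler" job Script; surrounding calls are illustrative):
// ... .logLevel("info") ...

// After: enable verbose crawler output
// ... .logLevel("debug") ...
```

After saving the change, the next scheduled (or manually started) crawl should write debug-level crawler messages to the log, which makes it easier to see whether the URL pattern is matching anything.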

(from github.com/clauded)
Thanks for the info.