robots.txt support in the default configuration

(from github.com/sashaskr)
Dear Shinsuke,
How does Fess handle robots.txt during crawling? Does it respect robots.txt by default?

Thank you in advance.

(from github.com/marevol)
The Allow, Disallow, and Sitemap directives of robots.txt are supported by default in all distributions.
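For reference, a minimal robots.txt showing the three directive types mentioned above; the paths and sitemap URL here are hypothetical, not taken from Fess itself:

```
# Applies to all crawlers, including Fess
User-agent: *
# Block this subtree from crawling (hypothetical path)
Disallow: /private/
# Re-allow a subpath inside the blocked subtree (hypothetical path)
Allow: /private/docs/
# Advertise the sitemap location (hypothetical URL)
Sitemap: https://example.com/sitemap.xml
```

When such a file is present at the site root, a crawler that respects robots.txt will skip the disallowed paths and can use the Sitemap entry for URL discovery.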

(from github.com/sashaskr)
@marevol Thank you!