File Crawler: No data

(from github.com/RogersNelson)
Hi,

I want to index .pptx and .docx files from a local folder, but it seems the crawler doesn't find the documents.
fess-crawler.log:

2018-08-31 13:42:13,594 [WebFsCrawler] INFO  Target Path: file:/opt/documents/
2018-08-31 13:42:13,737 [Crawler-YzoUkGUBncdSGUTN8m6i-1-4] INFO  Crawling URL: file:/opt/documents/
2018-08-31 13:42:23,621 [IndexUpdater] INFO  Processing no docs (Doc:{access 5ms}, Mem:{used 118MB, heap 194MB, max 494MB})
2018-08-31 13:42:33,609 [IndexUpdater] INFO  Processing no docs (Doc:{access 3ms}, Mem:{used 118MB, heap 194MB, max 494MB})
2018-08-31 13:42:43,609 [IndexUpdater] INFO  Processing no docs (Doc:{access 3ms}, Mem:{used 118MB, heap 194MB, max 494MB})
2018-08-31 13:42:44,907 [WebFsCrawler] INFO  [EXEC TIME] crawling time: 31602ms
2018-08-31 13:42:53,608 [IndexUpdater] INFO  Processing no docs (Doc:{access 2ms}, Mem:{used 118MB, heap 194MB, max 494MB})
2018-08-31 13:42:53,608 [IndexUpdater] INFO  [EXEC TIME] index update time: 25ms
2018-08-31 13:42:53,635 [main] INFO  Finished Crawler
2018-08-31 13:42:53,759 [main] INFO  [CRAWL INFO] CrawlerEndTime=2018-08-31T13:42:53.635+0000,WebFsCrawlExecTime=31602,CrawlerStatus=true,CrawlerStartTime=2018-08-31T13:42:13.231+0000,WebFsCrawlEndTime=2018-08-31T13:42:53.634+0000,WebFsIndexExecTime=25,WebFsIndexSize=0,CrawlerExecTime=40404,WebFsCrawlStartTime=2018-08-31T13:42:13.277+0000

The Crawler:

  • ID: YzoUkGUBncdSGUTN8m6i
  • Name: My name
  • Paths: file:/opt/documents/
  • Depth: 10
  • Number of threads: 5
  • Interval time: 1000 ms
  • Boost: 1.0
  • Permission: {role}guest
  • Status: Enabled

The job:

  • Name: File Crawler - My name
  • Target: all
  • Schedule: * * * * *
  • Executor: groovy
  • Script: return container.getComponent("crawlJob").logLevel("info").sessionId("YzoUkGUBncdSGUTN8m6i").webConfigIds([] as String[]).fileConfigIds(["YzoUkGUBncdSGUTN8m6i"] as String[]).dataConfigIds([] as String[]).jobExecutor(executor).execute();
  • Logging: Enabled
  • Crawler Job: Enabled
  • Status: Enabled
  • Display Order: 0

Can somebody tell me where I went wrong?

Cheers,
rn

(from github.com/marevol)
Are there files in /opt/documents/?
The crawler does not seem to be able to access anything in it.

(from github.com/RogersNelson)
Yes, they are. And they are all -rw-r--r--.
For information, I'm running Fess in a Docker image.

(from github.com/marevol)
Your /opt/documents/ is in a container.
Did you mount it?
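(Editor's note: a quick way to check this is to list the crawl target from inside the running container. The container name `fess` below is a placeholder, not from the thread; find yours with `docker ps`.)

```shell
# List the crawl target from inside the container.
# "fess" is a hypothetical container name -- substitute your own.
docker exec fess ls -l /opt/documents
# If this shows an empty or missing directory, the crawler has
# nothing to index, no matter what exists on the host.
```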

(from github.com/RogersNelson)
I am ashamed to have bothered you with my question :frowning:
It was not mounted.

Thank you very much.
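(Editor's note: for readers hitting the same problem, the fix is to bind-mount the host folder into the container at the path the crawler is configured to scan. A minimal sketch, assuming the `codelibs/fess` image and the default port 8080; the host path is illustrative:)

```shell
# Bind-mount the host documents folder to /opt/documents inside
# the container, read-only, so the file crawler can see it.
# Image tag, port, and host path are assumptions -- adjust to your setup.
docker run -d -p 8080:8080 \
  -v /path/on/host/documents:/opt/documents:ro \
  codelibs/fess:latest
```

Without the `-v` mount, `/opt/documents/` inside the container is empty (or absent), which matches the log above: the crawl finishes with `WebFsIndexSize=0` and "Processing no docs".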