Error: exception.EsAccessException

(from github.com/rafael844)
I'm getting this error in Fess; elasticsearch.log doesn't show any error.

Failure URL:

ID: 9OXA8mwBq93-reUGAAVM
URL: smb://10.200.51.134/setores/LAB/Desenvolvimento/185 Tabelão em 19 06 2018.xlsx
Thread Name: Crawler-20190902000000-1-2
Type: org.codelibs.fess.crawler.exception.EsAccessException
Log: org.codelibs.fess.crawler.exception.EsAccessException: Failed to insert 20190902000000-1.c21iOi8vMTAuMjAwLjUxLjEzNC9zZXRvcmVzL0xBQi9PUEVSQUNPRVMvU0lNQkEgLSAwMDktTVBSUy0wMDAxODUtMDAgKFNBUE9OQUNFTykvRGVzZW52b2x2aW1lbnRvLzE4NSBUYWJlbMOjbyBlbSAxOSAwNiAyMDE4Lnhsc3g
at org.codelibs.fess.crawler.service.impl.AbstractCrawlerService.insert(AbstractCrawlerService.java:229)
at org.codelibs.fess.crawler.service.impl.EsDataService.store(EsDataService.java:59)
at org.codelibs.fess.crawler.service.impl.EsDataService.store(EsDataService.java:40)
at org.codelibs.fess.crawler.processor.impl.DefaultResponseProcessor.processResult(DefaultResponseProcessor.java:139)
at org.codelibs.fess.crawler.processor.impl.DefaultResponseProcessor.process(DefaultResponseProcessor.java:84)
at org.codelibs.fess.crawler.CrawlerThread.processResponse(CrawlerThread.java:330)
at org.codelibs.fess.crawler.FessCrawlerThread.processResponse(FessCrawlerThread.java:240)
at org.codelibs.fess.crawler.CrawlerThread.run(CrawlerThread.java:176)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.codelibs.curl.CurlException: Failed to access to http://localhost:9200/.crawler.data/_doc/20190902000000-1.c21iOi8vMTAuMjAwLjUxLjEzNC9zZXRvcmVzL0xBQi9PUEVSQUNPRVMvU0lNQkEgLSAwMDktTVBSUy0wMDAxODUtMDAgKFNBUE9OQUNFTykvRGVzZW52b2x2aW1lbnRvLzE4NSBUYWJlbMOjbyBlbSAxOSAwNiAyMDE4Lnhsc3g?timeout=1m&refresh=true&op_type=create
at org.codelibs.curl.CurlRequest.lambda$connect$3(CurlRequest.java:184)
at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1426)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:177)
Caused by: org.codelibs.curl.CurlException: Failed to access the response.
at org.codelibs.curl.CurlRequest$RequestProcessor.accept(CurlRequest.java:251)
at org.codelibs.curl.CurlRequest.lambda$execute$4(CurlRequest.java:201)
at org.codelibs.curl.CurlRequest.lambda$connect$3(CurlRequest.java:182)
... 6 more
Caused by: java.io.IOException: Error writing to server
at java.base/sun.net.www.protocol.http.HttpURLConnection.writeRequests(HttpURLConnection.java:717)
at java.base/sun.net.www.protocol.http.HttpURLConnection.writeRequests(HttpURLConnection.java:729)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1602)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1509)
at java.base/java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:527)
at org.codelibs.curl.CurlRequest$RequestProcessor.accept(CurlRequest.java:248)
... 8 more

Most of these errors occur with xlsx and txt files larger than 60 MB, yet Fess is successfully indexing other xlsx and txt files larger than 200 MB.
Elasticsearch is set to:
-Xms10g
-Xmx10g

Fess:

# JVM options
jvm.crawler.options=\
-Djava.awt.headless=true\n\
-Dfile.encoding=UTF-8\n\
-Djna.nosys=true\n\
-Djdk.io.permissionsUseCanonicalPath=true\n\
-Dhttp.maxConnections=20\n\
-server\n\
-Xms4g\n\
-Xmx10g\n\
-XX:MaxMetaspaceSize=256m\n\
-XX:CompressedClassSpaceSize=128m\n\
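
In the trace above, the innermost cause (java.io.IOException: Error writing to server) means the connection was closed while the crawler was still streaming the request body to Elasticsearch, which points at an oversized request being rejected rather than a memory problem. That would also fit the size pattern reported above: xlsx files are compressed, so a 60 MB file can expand to more extracted text than a 200 MB one. As a quick check (a sketch, assuming Elasticsearch is on localhost:9200 as in the trace), the effective HTTP body limit can be read back from the cluster settings API:

curl -s "http://localhost:9200/_cluster/settings?include_defaults=true&flat_settings=true" \
  | grep http.max_content_length
# with the default limit this should show 100mb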

(from github.com/marevol)
I think you need to change settings in Elasticsearch. See Network Settings.
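
For reference, the setting under Network Settings that matches this failure is likely http.max_content_length, which caps the HTTP request body Elasticsearch accepts and defaults to 100mb. A minimal elasticsearch.yml sketch (the 300mb value is only an assumption; size it to the largest extracted document you expect):

# elasticsearch.yml (sketch; 300mb is an assumed value, not from this thread)
http.max_content_length: 300mb

This is a static setting, so Elasticsearch has to be restarted for it to take effect; after that, inserts that currently die with "Error writing to server" should either succeed or fail with an explicit entity-too-large response.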