File crawler - duplicate index items.

Hi, I just managed to set up and run the latest Fess.

I have enabled incremental crawling, but every time it crawls, it adds all documents to the index again, so I get duplicates, even though nothing has changed in that folder. TTL is set to -1, since the files will never be touched; only new files will be added.

I thought incremental crawling would only affect new files, or have I misunderstood how it all works?

Did you check the log files? Fess checks the last-modified timestamp during incremental crawling.

I checked fess-crawler.log, but I didn't see it check the last-modified timestamp anywhere. I changed the TTL to 1 and removed the index. Once it had indexed everything again, I ran the crawler a second time, and this time it checked the modified timestamp and didn't index the files again.

So, if I run the crawler every 12 hours and set the TTL to 1 day with incremental crawling, will it keep all documents in the index at all times, even if I never change anything on the documents, as long as they remain available on the file server?

Although I'm not sure what you want to do, I think it's better to provide more information (e.g., steps to reproduce).