Files larger than 10M still cannot be crawled after defaultMaxLength was changed to 20M

I changed defaultMaxLength to 20971520 (20M) in crawler/contentlength.xml,
but files larger than 10M and smaller than 20M still cannot be crawled.
Is there anything else that needs to be changed besides the contentlength.xml file?
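For reference, the change I made in contentlength.xml was roughly the following (the surrounding element layout may differ between versions; the value is in bytes):

    <property name="defaultMaxLength">20971520</property><!-- 20M -->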

This was solved after I also set max_size=20971520 in the crawler's parameters.
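For reference, the parameter I added to the crawl configuration (value in bytes, matching the contentlength.xml setting):

    max_size=20971520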

So it seems the size limit has to be set in both the contentlength.xml file and the crawler's parameters?