Crawling & Threads

(from github.com/looper976)
I am looking at crawling 100+ sites per crawler configuration. If I set threads = 20, does the crawler work through the list of sites sequentially, finishing one site before moving to the next, or does it allocate those 20 threads across multiple sites concurrently?

Relatedly, is there a way to view the current queue of URLs to be crawled? I see the crawling log, but it does not appear to show this clearly.

(from github.com/marevol)
The total number of crawler threads is Simultaneous Crawler Config × the number of threads per config.
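To illustrate the calculation above (the values below are hypothetical examples, not Fess defaults; both settings are configured in the admin UI):

```python
# Hypothetical values illustrating the total-thread calculation.
simultaneous_crawler_config = 3   # how many crawler configs run at once
threads_per_config = 20           # the "Number of Threads" set per config

total_crawler_threads = simultaneous_crawler_config * threads_per_config
print(total_crawler_threads)  # 60
```

So with threads = 20 on a single config, that config alone contributes 20 crawler threads; the total scales with how many configs run simultaneously.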

There is no UI for checking the URL queue.