Crawling is generally considered suspicious by FGXWeb, but on its own it will not easily trigger a block. An overall "score" is calculated dynamically from many factors before a block is placed on a crawler. For example: not using cookies, persistently requesting the same URLs, generating many abnormal responses (such as 401), crawling multiple sites in a very short time, using a user agent with a bad reputation, triggering security alerts, and hitting URLs that are known DoS targets all contribute to that score.
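To illustrate the general idea of such an additive score (this is a minimal sketch, not FGXWeb's actual implementation; every factor name, weight, and threshold below is hypothetical):

```python
# Hypothetical sketch of an additive suspicion score for a crawler.
# Factor names, weights, and the threshold are invented for illustration;
# FGXWeb's real scoring logic is internal and not documented here.

BLOCK_THRESHOLD = 100

# Each observed behaviour adds a weight to the client's running score.
FACTOR_WEIGHTS = {
    "no_cookies": 10,         # client never sends cookies back
    "repeated_urls": 15,      # persistently retries the same URLs
    "abnormal_response": 5,   # per abnormal status code (e.g. 401)
    "multi_site_burst": 25,   # crawls many sites in a short window
    "bad_ua_reputation": 20,  # user agent with a poor reputation
    "security_alert": 30,     # per security rule triggered
    "dos_target_url": 25,     # hits a URL known as a DoS target
}

def suspicion_score(observed_factors):
    """Sum the weights of every factor observed for one client."""
    return sum(FACTOR_WEIGHTS.get(f, 0) for f in observed_factors)

def should_block(observed_factors):
    """Block only when the accumulated score reaches the threshold."""
    return suspicion_score(observed_factors) >= BLOCK_THRESHOLD

# A crawler that merely skips cookies stays well under the threshold...
print(should_block(["no_cookies"]))  # False (score 10)

# ...but stacking several suspicious behaviours tips it over.
print(should_block(["no_cookies", "repeated_urls", "bad_ua_reputation",
                    "multi_site_burst", "security_alert"]))  # True (score 100)
```

This is why no single behaviour listed above is decisive: a block results only from several factors accumulating at once.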
If you want to use a crawler without worrying about it getting blocked, you can send us its IP address or addresses and we will add an exception. This needs to be done by us, since the (D)DoS protection layer of the WAF, which is responsible for the functionality described above, is not configurable by end users.