Crawler progressively slowing down and crashing

Hello dear community,

I am extracting data with the crawler from around 40k links.

The crawler starts off correctly, pulling data from each link in around 2-3 seconds, but after it's been running for around 10 hours, my RAM usage is over 6 GB and it has only extracted around 2.8k links. That works out to roughly 13 seconds per link on average, so throughput has dropped badly.

Here's a picture:

Is there a way to clear its memory or run it in batches?
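
Something along these lines is what I have in mind, a rough sketch in Python rather than the crawler's own internals, where `links.txt`, `fetch_one`, and the batch size are just placeholders: process the links in fixed-size chunks, flush each chunk's results to disk, and drop them from memory before starting the next, so usage stays bounded instead of growing with every link.

```python
import csv
import gc

import requests

BATCH_SIZE = 500  # assumed chunk size; tune to taste


def fetch_one(url):
    """Pull whatever data is needed from a single link (placeholder logic)."""
    resp = requests.get(url, timeout=10)
    return {"url": url, "status": resp.status_code, "bytes": len(resp.content)}


def crawl_in_batches(link_file, out_file):
    with open(link_file) as f:
        urls = [line.strip() for line in f if line.strip()]

    with open(out_file, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=["url", "status", "bytes"])
        writer.writeheader()

        for start in range(0, len(urls), BATCH_SIZE):
            batch = urls[start:start + BATCH_SIZE]
            rows = []
            for url in batch:
                try:
                    rows.append(fetch_one(url))
                except requests.RequestException:
                    rows.append({"url": url, "status": "error", "bytes": 0})

            writer.writerows(rows)  # flush this batch to disk...
            out.flush()
            del rows                # ...and drop it from memory
            gc.collect()            # nudge the collector between batches


if __name__ == "__main__":
    crawl_in_batches("links.txt", "results.csv")
```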

@nielsbosma any suggestions for overcoming this issue with Excel using up so many resources?

It's hard to believe it actually uses 6 GB+ of RAM on just 2.8k links?

I will look into the performance of the Spider as soon as I've released 5.2.