Hi,
I'm trying to extract metadata from a CSV file which has ~3,000 columns and ~30,000 rows.
This takes upwards of 10 minutes and causes CloverETL to hang while it runs.
I'm running Ubuntu 14.04 on a Lenovo laptop with an 8-core Intel i7 and 16GB of RAM - I notice one of the cores stays maxed out at 100% the whole time.
I appreciate that ~3,000 columns is a rather wide file, but this still seems excessively long.
Am I missing any settings I can change to improve the performance?
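For comparison, a naive type sniffer that samples only the first few rows can be sketched like this (plain Python; the `infer_column_types` helper and the sampling approach are my own illustration, not CloverETL's actual metadata algorithm):

```python
import csv

def infer_column_types(path, sample_rows=100):
    """Guess a coarse type (integer/decimal/string) for each column
    by scanning only the first sample_rows data rows."""
    def classify(value):
        try:
            int(value)
            return "integer"
        except ValueError:
            pass
        try:
            float(value)
            return "decimal"
        except ValueError:
            return "string"

    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        types = [None] * len(header)
        # "wider" types win: string > decimal > integer
        rank = {"integer": 0, "decimal": 1, "string": 2}
        for i, row in enumerate(reader):
            if i >= sample_rows:
                break
            for col, value in enumerate(row):
                t = classify(value)
                # widen the column's type if a later value doesn't fit
                if types[col] is None or rank[t] > rank[types[col]]:
                    types[col] = t
    return dict(zip(header, types))
```

Sampling a bounded number of rows like this keeps the work proportional to columns x sample size rather than the full 30,000 rows, which is why I'd expect metadata extraction to be much faster than 10 minutes.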
Cheers,
Aaron
-
Hi
This is a known issue, which has already been reported here. We plan to fix it in version 4.2.0-M1. Unfortunately, there is nothing else I can do in this matter until the release. I apologize for any inconvenience.
Best regards