Bug when dealing with very large files
6 years 2 months ago #3475
by FlowHeater-Team
Bug when dealing with very large files - Post(3475) was created by FlowHeater-Team
Hello,
I find that when I use FlowHeater to transfer files with over 20,000,000 rows into SQL I always get an error message at the end:
“Execution Timeout Expired: The timeout period elapsed prior to completion of the operation or the server is not responding”
The message seems to appear after the last row has been inserted, so the operation is effectively finished, but instead of completing normally FlowHeater reports this error.
Mike
Request received via Email
Best wishes
Robert Stark
6 years 2 months ago #3476
by FlowHeater-Team
Replied by FlowHeater-Team on topic Bug when dealing with very large files - Post(3476)
Hi Mike,
This error message is related to FlowHeater's default setting for database transactions. By default, the database Adapters use a single database transaction for the entire import. That works fine for small data sets, but with very large ones the database server cannot commit such a huge transaction within the allotted time, so FlowHeater runs into the defined command timeout.
You can change the default transaction setting on the Advanced tab of the Adapter configuration.
In your case, you'll get the best results with the following settings:
-> Use database transactions (YES)
-> Perform AutoCommit after 10,000 written records (YES)
Note: I do not recommend disabling database transactions entirely, because you will lose performance during the import.
These settings should solve your issue.
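To illustrate why committing in batches avoids the timeout, here is a minimal sketch of the same idea written directly against the database. Everything in it is an assumption for illustration only (the pyodbc driver, the connection string, the table name import_target, the file bigfile.csv); FlowHeater's Adapter performs the equivalent batching internally when the AutoCommit setting above is enabled.

```python
# Sketch: batched commits while loading a large CSV into SQL Server.
# Hypothetical table "import_target" and file "bigfile.csv"; not part of FlowHeater.
import csv
import pyodbc

BATCH_SIZE = 10_000  # commit after every 10,000 written records

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=demo;Trusted_Connection=yes;",
    autocommit=False,  # manage transactions ourselves
)
cursor = conn.cursor()

with open("bigfile.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f, delimiter=";")
    for i, row in enumerate(reader, start=1):
        cursor.execute(
            "INSERT INTO import_target (col_a, col_b) VALUES (?, ?)",
            row[0], row[1],
        )
        if i % BATCH_SIZE == 0:
            conn.commit()  # keep each transaction small so the server never times out

conn.commit()  # commit the final partial batch
conn.close()
```

With one transaction per 10,000 rows, the server never has to hold millions of uncommitted inserts, which is exactly what causes the command timeout at the end of a 20,000,000-row import.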
Best wishes
Robert Stark
6 years 2 months ago #3477
by Michael Orr
Replied by Michael Orr on topic Bug when dealing with very large files - Post(3477)
Yes that solved the problem. Thank you, Robert.