ago-assistant

Throttle deep copying to prevent timeout errors

Open ecaldwell opened this issue 7 years ago • 3 comments

Deep copying a large feature service (hundreds of thousands of records) often fails to copy all records because the addFeatures requests start to time out. This can likely be mitigated by throttling the calls to addFeatures, e.g., sending only five requests at a time. That logic would need to be added somewhere near this section.
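The throttling idea above could be sketched by capping how many addFeatures requests are in flight at once. This is a minimal sketch, not ago-assistant's actual code; `post_add_features` is a hypothetical stand-in for the real POST to a layer's addFeatures endpoint:

```python
# Throttle addFeatures calls so only a few run concurrently.
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT_REQUESTS = 5  # e.g., five requests at a time
BATCH_SIZE = 1000            # features per request

def post_add_features(batch):
    """Hypothetical stand-in for POSTing one batch to <layer-url>/addFeatures."""
    return {"addResults": [{"success": True}] * len(batch)}

def throttled_add_features(features):
    batches = [features[i:i + BATCH_SIZE]
               for i in range(0, len(features), BATCH_SIZE)]
    # The executor's worker limit caps how many requests are in flight,
    # so a service with hundreds of batches never sees them all at once.
    with ThreadPoolExecutor(max_workers=MAX_CONCURRENT_REQUESTS) as pool:
        results = list(pool.map(post_add_features, batches))
    return results
```

A semaphore around an async HTTP client would achieve the same cap; the key point is bounding concurrency rather than firing all requests simultaneously.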

ecaldwell avatar Mar 15 '17 15:03 ecaldwell

If you are getting 504 Gateway Timeout errors when deep copying features, it's probably because the payload of features being submitted is too large. This tends to happen when the features are complex polygons or polylines. I ran into this while working on a different deep-copying tool. For example, I might be able to submit 1000 points at a time, but only 100 polygons at a time.
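Picking the batch size from the layer's geometry type, as described above, might look like this. A minimal sketch, assuming the `esriGeometry*` string has already been read from the layer's JSON definition; the numbers are the illustrative ones from the comment, not tuned values:

```python
POINT_BATCH = 1000   # points serialize cheaply, so larger batches are safe
SHAPE_BATCH = 100    # complex polygons/polylines can time out in larger batches

def batch_size_for(geometry_type):
    """Choose a batch size based on the layer's esriGeometry* type string."""
    if geometry_type in ("esriGeometryPoint", "esriGeometryMultipoint"):
        return POINT_BATCH
    return SHAPE_BATCH
```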

apulverizer avatar Apr 14 '17 01:04 apulverizer

That's a great point. I hadn't considered that some of these Hosted Feature Services might contain complex polygons, and that that's what is overloading the endpoint. I still think that a lot of the timeouts come from services with many records (sending tens or hundreds of simultaneous addFeatures requests).

In your other tool, did you experiment with trying to assess the feature complexity before the copy?

ecaldwell avatar Apr 14 '17 01:04 ecaldwell

In your other tool, did you experiment with trying to assess the feature complexity before the copy?

I just added a basic check to see whether the features are points, and limit the batch to 100 features if they aren't. It seems to work but isn't very robust or efficient. I'm thinking of adding a check on how large the JSON of the queried features is, and then determining how many features to add at a time based on that. Another option I might consider is sending X features; if the request times out, sending X/2 features, and so on until it succeeds.

apulverizer avatar Apr 14 '17 01:04 apulverizer