I have a rather unwieldy piece of code that I'm trying to optimize.

Its basic purpose is to calculate a lat/long location using data from around 20 columns spread across 3 different tables. The calculated lat/long is then written back into one of those tables.

The procedure for determining the lat/long is rather complicated, involving 1000+ lines of C# code and several calls to a .DLL that I have no control over.

The number of columns to be calculated/updated is huge.

My current noob-like attempt is as follows:

1) Populate a DataSet from each of the tables.
2) Iterate through them, calculating the lat/long for each row.
3) Update each row individually with a separate UPDATE statement.
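For (3), the most promising lead I've found so far is ODP.NET array binding, where one UPDATE statement is executed against arrays of parameter values so the whole batch goes to the server in a single round trip. A sketch of what I think that would look like (the table and column names here are placeholders for my real schema):

```csharp
using System.Data;
using Oracle.DataAccess.Client;

class BatchUpdateSketch
{
    // Sketch only: "locations", "lat", "lon", "id" are placeholder names.
    static void UpdateLatLong(OracleConnection conn,
                              int[] ids, double[] lats, double[] lons)
    {
        using (OracleCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText =
                "UPDATE locations SET lat = :lat, lon = :lon WHERE id = :id";

            // One statement, N sets of bind values, one round trip.
            cmd.ArrayBindCount = ids.Length;
            cmd.Parameters.Add(":lat", OracleDbType.Double, lats, ParameterDirection.Input);
            cmd.Parameters.Add(":lon", OracleDbType.Double, lons, ParameterDirection.Input);
            cmd.Parameters.Add(":id",  OracleDbType.Int32,  ids,  ParameterDirection.Input);

            cmd.ExecuteNonQuery();
        }
    }
}
```

Is this roughly the right approach, or is there a better way to batch the updates?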

I don't think I can do much about (2), but for (1) I've played with both OracleDataReader and DataSets, and I'd *REALLY* like to improve (3), but I don't really know how.

Currently I often run into an OutOfMemoryException. I'm closing/disposing all connections, transactions, DataSets, readers, etc. after I'm done with them, so I think it is just the sheer volume of data that I iterate through.
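To attack the memory problem, I'm considering replacing the DataSet in (1) with an OracleDataReader and flushing calculated results in fixed-size chunks, so only a bounded number of rows is ever held in memory. A sketch of the shape I have in mind (placeholder table/column names, and my real 1000+ line calculation stubbed out as CalculateLatLong):

```csharp
using System.Collections.Generic;
using System.Data;
using Oracle.DataAccess.Client;

class StreamingSketch
{
    const int ChunkSize = 1000;  // rows held in memory at a time

    static void Process(string connStr)
    {
        using (var readConn = new OracleConnection(connStr))
        using (var writeConn = new OracleConnection(connStr)) // separate connection for updates
        {
            readConn.Open();
            writeConn.Open();

            var ids = new List<int>();
            var lats = new List<double>();
            var lons = new List<double>();

            // Placeholder query; the real one joins my 3 tables / ~20 columns.
            using (var cmd = new OracleCommand("SELECT id, colA, colB FROM source_view", readConn))
            using (OracleDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    double lat, lon;
                    CalculateLatLong(reader, out lat, out lon);

                    ids.Add(reader.GetInt32(0));
                    lats.Add(lat);
                    lons.Add(lon);

                    if (ids.Count == ChunkSize)
                    {
                        Flush(writeConn, ids, lats, lons);
                        ids.Clear(); lats.Clear(); lons.Clear();
                    }
                }
            }

            if (ids.Count > 0)
                Flush(writeConn, ids, lats, lons); // final partial chunk
        }
    }

    // Batched UPDATE for one chunk via array binding.
    static void Flush(OracleConnection conn,
                      List<int> ids, List<double> lats, List<double> lons)
    {
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = "UPDATE locations SET lat = :lat, lon = :lon WHERE id = :id";
            cmd.ArrayBindCount = ids.Count;
            cmd.Parameters.Add(":lat", OracleDbType.Double, lats.ToArray(), ParameterDirection.Input);
            cmd.Parameters.Add(":lon", OracleDbType.Double, lons.ToArray(), ParameterDirection.Input);
            cmd.Parameters.Add(":id",  OracleDbType.Int32,  ids.ToArray(),  ParameterDirection.Input);
            cmd.ExecuteNonQuery();
        }
    }

    // Stub standing in for the real calculation + DLL calls.
    static void CalculateLatLong(OracleDataReader row, out double lat, out double lon)
    {
        lat = 0; lon = 0;
    }
}
```

Would streaming like this actually keep memory flat, or does the reader buffer the whole result set anyway?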

Any thoughts?

Thanks in advance.