.net - What's the best way to transfer a large dataset over an ASMX web service? -


I've inherited a C# .NET application that talks to an ASMX web service, which in turn talks to an Oracle database. I need to add an export function to the UI that creates an Excel spreadsheet from the data.

I have created a web service function that runs a database query, loads the data into a DataTable, and returns it. This works fine for a small number of rows, but a full run returns enough data that the client application locks up for a few minutes and then throws a timeout error. Obviously this isn't the best way to retrieve such a large dataset.

Before I go ahead and come up with some awesome way of chunking up the calls, I'm wondering if there's something out there that already handles this. At the moment I'm thinking of a StartExport-type function, then repeatedly calling a function that returns the next 50 rows until the data runs out. But because the web service is stateless, that means I'd have to keep some kind of ID number around and deal with the associated permissions. It would also mean holding the entire dataset in the web server's memory, which probably isn't a good thing.
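For what it's worth, the paging idea above could be sketched roughly like this (the method names, the ticket scheme, and the `RunQuery` helper are all made up for illustration; real code would also need to expire cached tables and check permissions):

```csharp
using System;
using System.Data;
using System.Web.Services;

public class ExportService : WebService
{
    // First call: run the query once, cache the result under a ticket the
    // client passes back on every subsequent call (works around statelessness).
    [WebMethod(EnableSession = true)]
    public string ExportStart(string queryName)
    {
        DataTable table = RunQuery(queryName);      // assumed existing data-access code
        string ticket = Guid.NewGuid().ToString();
        Session[ticket] = table;                    // held in server memory for the session
        return ticket;
    }

    // Later calls: page through the cached table, e.g. 50 rows at a time.
    [WebMethod(EnableSession = true)]
    public DataTable ExportGetRows(string ticket, int offset, int count)
    {
        DataTable table = (DataTable)Session[ticket];
        DataTable page = table.Clone();             // same schema, no rows
        for (int i = offset; i < offset + count && i < table.Rows.Count; i++)
            page.ImportRow(table.Rows[i]);
        return page;
    }

    private DataTable RunQuery(string queryName)
    {
        // placeholder for the existing Oracle query code
        throw new NotImplementedException();
    }
}
```

The client would loop, incrementing `offset` by `count`, until a call returns fewer rows than requested.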

So if anybody knows a better way to get large amounts of data over an ASMX web service, please tell me!

We had this exact business scenario a few years ago, and I'll tell you what we did.

  1. Try to limit the amount of data transferred in the first place.
  2. If you are moving n tables, split them into n DataSets and transfer one DataSet at a time.
  3. Compress your DataSet/DataTable before transferring, and on the other end inflate the byte stream back into a DataSet/DataTable. It has a huge (huge!) effect. Don't use the built-in .NET compression — use SharpZipLib; it gives very good results.
  4. In addition, make the transfer asynchronous to keep the client from locking up.
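Step 3 above can be sketched like this, using SharpZipLib's GZip streams around the DataSet's XML serialization (a minimal example; serializing as XML with schema is one common approach, not necessarily what we used verbatim):

```csharp
using System.Data;
using System.IO;
using ICSharpCode.SharpZipLib.GZip;  // SharpZipLib, as recommended above

public static class DataSetCompressor
{
    // Server side: serialize the DataSet (schema included so the client can
    // rebuild it) and gzip the result into a byte array for the wire.
    public static byte[] Compress(DataSet ds)
    {
        using (var buffer = new MemoryStream())
        {
            using (var gzip = new GZipOutputStream(buffer))
            {
                ds.WriteXml(gzip, XmlWriteMode.WriteSchema);
            }
            return buffer.ToArray();
        }
    }

    // Client side: inflate the byte stream back into a DataSet.
    public static DataSet Decompress(byte[] bytes)
    {
        var ds = new DataSet();
        using (var gzip = new GZipInputStream(new MemoryStream(bytes)))
        {
            ds.ReadXml(gzip, XmlReadMode.ReadSchema);
        }
        return ds;
    }
}
```

The web method then returns `byte[]` instead of a DataSet, which also avoids the verbose SOAP serialization of DataSet itself.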

Our customers have been using the above solution without problems.
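For step 4, the proxy class that "Add Web Reference" generates for an ASMX service exposes `*Async` methods and `*Completed` events, so the client stays responsive while the transfer runs. A sketch, with hypothetical proxy and method names:

```csharp
// ExportService and GetExportData are placeholder names for your own
// generated proxy and web method.
var proxy = new ExportService();

proxy.GetExportDataCompleted += (sender, e) =>
{
    if (e.Error != null)
    {
        ShowError(e.Error);          // assumed UI helper
        return;
    }
    BindToGrid(e.Result);            // raised on the UI thread, safe to touch controls
};

proxy.GetExportDataAsync();          // returns immediately; UI keeps pumping messages
```

Combine this with the chunked or compressed transfer and the client never blocks long enough to hit the timeout.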

