We recently virtualised our servers. Basically we had two physical
servers and now we have two virtual servers.
We run a couple of web applications on the VM server. Everything ran
fine for a while - until we hit a timeout problem with a search
function. Basically a page makes a search call to SQL Server that
takes longer than the default SqlCommand execution timeout, which is
set to 30 secs by MS.
To resolve that problem, we set the CommandTimeout property of the
SqlCommand object, as well as the Connection Timeout keyword in the
SqlConnection string, to 0 - this makes the timeout unlimited.
The code snippet below shows what we did.
Note: a zero timeout is set at both levels - the connection string
(Connection Timeout) and the SqlCommand (CommandTimeout) !!
>>> code snippet follows ...
' Connection Timeout = 0 means wait indefinitely when opening the connection
aConnection = New SqlConnection("Database=myDB;Trusted_Connection=True;Connection Timeout=0;")
mCommand = New SqlCommand(strStoredProcedureName, aConnection)
mCommand.CommandType = CommandType.StoredProcedure
' CommandTimeout = 0 means wait indefinitely for the command to execute
mCommand.CommandTimeout = 0
The question is: this stuffed up the virtual server's memory. The
connection seems to consume memory really big time - from 4 GB of RAM
down to 1 GB left in no time, whenever we run our application making
use of the above SQL connection statements.
We are not sure what caused the problem - but we decided to roll back
to our old code, remove the above timeout statements, and leave the
timeouts at their defaults. With that, the application runs and RAM
consumption is back to normal and happy.
We suspect the above CommandTimeout code snippet to have caused the
degradation, which is of course only a suspicion - but changes to any
other code did not cause any memory problems.
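For reference, the rolled-back version is essentially the snippet above minus the two timeout overrides, so the defaults apply (15 secs to open the connection, 30 secs per command). Roughly (a sketch - the Using blocks here are our addition to make the disposal explicit, not necessarily what the old code literally looked like):

```vb
' Rolled-back sketch: no Connection Timeout in the string, no
' CommandTimeout on the command, so the MS defaults apply.
Using aConnection As New SqlConnection("Database=myDB;Trusted_Connection=True;")
    Using mCommand As New SqlCommand(strStoredProcedureName, aConnection)
        mCommand.CommandType = CommandType.StoredProcedure
        aConnection.Open()
        ' ... execute the stored procedure and read the results ...
    End Using  ' command disposed here
End Using      ' connection closed and returned to the pool here
```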
==>> Also - the problem was apparent only on the virtual servers
===> That is, the application could run fine when hosted on a local
IIS (personal web server)
===> making calls to the same SQL server ...
Will someone please tell me what exactly happened?
Can the CommandTimeout setting affect RAM so much?
Please also CC me your reply to => billion.mohammed@newsgroup