Bug #40250: Apparent deadlock in pooled connection

Submitted: 22 Oct 2008 13:56
Modified: 13 Feb 2009 15:29
Reporter: Ashley Unitt
Status: Closed
Category: Connector / NET
Severity: S2 (Serious)
Version: 5.2.3
OS: Windows
CPU Architecture: Any
Assigned to: (unassigned)

[22 Oct 2008 13:56] Ashley Unitt
Description:
Running load tests of web pages written in C# on IIS 6.0 (Windows 2003), we can easily reproduce what appear to be deadlocks in the MySQL .NET connector. We've upgraded to 5.2.3 and these deadlocks still appear (although they differ from those seen with previous versions of the connector we have tried, 1.0.7 and 1.0.10).

An example CLR stack trace is:

0faceab0 7c8285ec [HelperMethodFrame_1OBJ: 0faceab0] System.Threading.WaitHandle.WaitOneNative(Microsoft.Win32.SafeHandles.SafeWaitHandle, UInt32, Boolean, Boolean)
0faceb5c 793b03fe System.Threading.WaitHandle.WaitOne(Int64, Boolean)
0faceb74 793b0c1b System.Threading.WaitHandle.WaitOne(Int32, Boolean)
0faceb84 0f878a75 MySql.Data.MySqlClient.MySqlPool.GetConnection()
0facebb8 0f87846e MySql.Data.MySqlClient.MySqlConnection.Open()
0facebf8 0f87546e nvm.callcentre.vcm.DB..ctor(System.String)
0facec10 0f87526c nvm.callcentre.vcm.DB.getInstance(System.String)
0facec98 0f8750ab nvm.callcentre.vcm.DB.getInstance()
0faceca4 00fd0d88 nvm.callcentre.vcm.Groups.getAgentStats()
0faced14 00fd0cb9 nvm.callcentre.vcm.WebService.getAgentStats()
0facef94 79e7c74b [CustomGCFrame: 0facef94] 
0facef78 79e7c74b [GCFrame: 0facef78] 
0facef5c 79e7c74b [GCFrame: 0facef5c] 
0facf164 79e7c74b [HelperMethodFrame_1OBJ: 0facf164] System.RuntimeMethodHandle._InvokeMethodFast(System.Object, System.Object[], System.SignatureStruct ByRef, System.Reflection.MethodAttributes, System.RuntimeTypeHandle)
0facf1d4 793a44bd System.RuntimeMethodHandle.InvokeMethodFast(System.Object, System.Object[], System.Signature, System.Reflection.MethodAttributes, System.RuntimeTypeHandle)
0facf220 793a41e5 System.Reflection.RuntimeMethodInfo.Invoke(System.Object, System.Reflection.BindingFlags, System.Reflection.Binder, System.Object[], System.Globalization.CultureInfo, Boolean)
0facf26c 793a40a2 System.Reflection.RuntimeMethodInfo.Invoke(System.Object, System.Reflection.BindingFlags, System.Reflection.Binder, System.Object[], System.Globalization.CultureInfo)
0facf28c 65d0212c System.Web.Services.Protocols.LogicalMethodInfo.Invoke(System.Object, System.Object[])
0facf2a8 65d17da0 System.Web.Services.Protocols.WebServiceHandler.Invoke()
0facf2e8 65d183b6 System.Web.Services.Protocols.WebServiceHandler.CoreProcessRequest()
0facf318 65d190d6 System.Web.Services.Protocols.SyncSessionlessHandler.ProcessRequest(System.Web.HttpContext)
0facf32c 0f2e18c0 Microsoft.Web.Services.ScriptHandlerFactory+HandlerWrapper.ProcessRequest(System.Web.HttpContext)
0facf338 65fe6bfb System.Web.HttpApplication+CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
0facf36c 65fe3f51 System.Web.HttpApplication.ExecuteStep(IExecutionStep, Boolean ByRef)
0facf3ac 65fe7733 System.Web.HttpApplication+ApplicationStepManager.ResumeSteps(System.Exception)
0facf3fc 65fccbfe System.Web.HttpApplication.System.Web.IHttpAsyncHandler.BeginProcessRequest(System.Web.HttpContext, System.AsyncCallback, System.Object)
0facf418 65fd19c5 System.Web.HttpRuntime.ProcessRequestInternal(System.Web.HttpWorkerRequest)
0facf44c 660167d3 System.Web.RequestQueue.WorkItemCallback(System.Object)
0facf464 79407caa System.Threading._ThreadPoolWaitCallback.WaitCallback_Context(System.Object)
0facf468 793740ab System.Threading.ExecutionContext.runTryCode(System.Object)
0facf890 79e7c74b [HelperMethodFrame_PROTECTOBJ: 0facf890] System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode, CleanupCode, System.Object)
0facf8f8 79373ff7 System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
0facf910 79373ede System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
0facf928 79407e18 System.Threading._ThreadPoolWaitCallback.PerformWaitCallbackInternal(System.Threading._ThreadPoolWaitCallback)
0facf93c 79407d90 System.Threading._ThreadPoolWaitCallback.PerformWaitCallback(System.Object)
0facfacc 79e7c74b [GCFrame: 0facfacc] 
0facfc18 79e7c74b [ContextTransitionFrame: 0facfc18] 

We see this in a heavily used production environment with the default connection string parameters for pooling (i.e. max pool size=100), and we can recreate it easily in a test environment with a max pool size of 20.
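For reference, a representative connection string carrying the pooling parameters mentioned above (host, credentials and database name are placeholders, not taken from the report):

// Placeholder connection string; "Max Pool Size" defaults to 100 and was
// lowered to 20 in the test environment described above.
const string connStr =
    "server=dbhost;user id=appuser;password=secret;database=test;" +
    "Pooling=true;Max Pool Size=20";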

How to repeat:
It appears to happen when we place very heavy load on the web application.

We're going to try to provide a simple test case.
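As a sketch only, a minimal test case along these lines would open more concurrent connections than the pool allows while holding them open, so that the surplus Open() calls block inside MySqlPool.GetConnection() exactly as in the stack trace above (all names and connection details are hypothetical):

using System;
using System.Threading;
using MySql.Data.MySqlClient;

class PoolExhaustionSketch
{
    // Placeholder connection string; only the pooling settings matter here.
    const string ConnStr =
        "server=dbhost;user id=appuser;password=secret;database=test;" +
        "Pooling=true;Max Pool Size=20";

    static void Main()
    {
        // Start twice as many workers as the pool allows; each worker keeps
        // its connection open, so the later Open() calls wait on the pool's
        // WaitHandle, matching the WaitOne frames in the trace above.
        for (int i = 0; i < 40; i++)
            new Thread(Worker) { IsBackground = true }.Start(i);
        Console.ReadLine();
    }

    static void Worker(object id)
    {
        var conn = new MySqlConnection(ConnStr);
        conn.Open();                    // blocks once the pool is exhausted
        Console.WriteLine("worker {0} holds a connection", id);
        Thread.Sleep(Timeout.Infinite); // never release back to the pool
    }
}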
[22 Oct 2008 15:34] Ashley Unitt
I believe this happens when the maximum pool size is reached. To replicate it in our lab we've reduced the max pool size to 20.
[12 Nov 2008 14:05] Moshe Lampert
While the deadlock is occurring, what is the result of
"SHOW FULL PROCESSLIST"?
[15 Nov 2008 20:52] d di
Do the deadlocks go away when pooling is completely disabled?

(Shameless plug: possibly related work in issue #40684.)
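A sketch of the suggested experiment, assuming the same placeholder connection details as above; with pooling disabled, Open() creates a fresh server connection and bypasses MySqlPool.GetConnection() entirely:

// If the hangs persist with this connection string, the pool is not the culprit.
const string noPoolConnStr =
    "server=dbhost;user id=appuser;password=secret;database=test;Pooling=false";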
[17 Nov 2008 15:14] Ashley Unitt
We're trying to recreate the issue again today to give you more information, but unfortunately without success so far.
[13 Feb 2009 14:56] Tonci Grgin
Ashley, without a proper test case that demonstrates this behavior every time it is run, I can't do much...
[13 Feb 2009 15:29] Ashley Unitt
Apologies for not being able to get a good test case for this.
[13 Feb 2009 15:40] Tonci Grgin
Ashley, no apologies necessary. I was just explaining why I have to close this report. Without a good test case we have nothing to work on...