One thing I see quite a bit with ADO examples in particular is a lack of (a) error checking and (b) connection limiting. I'm not talking about connection throttling, but rather: applying some refactoring logic to how you open and close connections to optimize the use of the open pipeline without keeping it open too long (or re-opening it too many times).
As for error checking: the following is a fairly standard/typical piece of VBScript/ASP code for running a SELECT query via ADO against a database. Whether that database is local to the server/host where the code is executed or remote matters somewhat (remote connections fail in more interesting ways), but let's chew and digest slowly here...
[CODE]
Set conn = Server.CreateObject("ADODB.Connection")
Set cmd = Server.CreateObject("ADODB.Command")
Set rs = Server.CreateObject("ADODB.Recordset")

query = "SELECT * FROM dbo.SomeTable WHERE id=" & _
    idNumber & " ORDER BY ItemName"

conn.Open dsnString

rs.CursorLocation = adUseClient
rs.CursorType = adOpenStatic
rs.LockType = adLockReadOnly

Set cmd.ActiveConnection = conn
cmd.CommandType = adCmdText
cmd.CommandText = query

rs.Open cmd

If Not (rs.BOF And rs.EOF) Then
    cols = rs.Fields.Count
    rows = rs.RecordCount
    Do Until rs.EOF
        ' do something stupid here
        rs.MoveNext
    Loop
Else
    Response.Write "oops, no records were returned"
End If

rs.Close
conn.Close
Set rs = Nothing
Set cmd = Nothing
Set conn = Nothing[/CODE]
This looks simple enough. But there are quite a few places that could implode here if not handled explicitly. Granted, error handling with .NET is more robust, but indulge me here for a moment since (A) there's still 100 times more VBScript code strewn about this planet than .NET code, and (B) I'm old. The big three problems that are most likely to occur with this scenario are...
Connection Failure
Connection Delay / Time-Out
Recordset Access Failure (access denied)
Let's handle them one by one...
Connection Failure
Prior to the "conn.Open" statement, we should override the default error handling with On Error Resume Next, then check the Err object after the .Open call to see what happened. If Err.Number is 0, the open succeeded and we continue on; otherwise we should handle the error and exit safely.
[CODE]
On Error Resume Next
conn.Open dsnString
If err.Number <> 0 Then
' an error occurred, so something clever here
Response.Write "oops, cannot open a connection"
Response.End
End If[/CODE]
If you do your connection within a Sub() or Function() block, you should probably exit using Exit Sub or Exit Function, but that's not always true either.
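To make that concrete, here's one way to wrap the open inside a Function so the caller can bail out cleanly. This is just a sketch: "dsnString" is assumed to be defined elsewhere, and the function name is made up for illustration.
[CODE]
Function OpenConnection(conn, dsnString)
    On Error Resume Next
    conn.Open dsnString
    If Err.Number <> 0 Then
        Err.Clear
        OpenConnection = False
        Exit Function
    End If
    On Error GoTo 0   ' restore default error handling
    OpenConnection = True
End Function

Set conn = Server.CreateObject("ADODB.Connection")
If Not OpenConnection(conn, dsnString) Then
    Response.Write "oops, cannot open a connection"
    Response.End
End If[/CODE]
The nice part is that the On Error Resume Next scope is confined to the function, so the rest of your page keeps normal error behavior.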
Connection Time-Out
What if the connection is taking a longer time to resolve than usual? We can handle that too...
[CODE]
On Error Resume Next
conn.ConnectionTimeout = 15 ' allow 15 seconds to establish the connection
conn.Open dsnString
If err.Number <> 0 Then
' an error occurred, so something clever here
Response.Write "oops, cannot open a connection"
Response.End
End If[/CODE]
Recordset Access Failure
Another common issue is when you can successfully open the connection, but cannot read from a table or view because of security permissions.
[CODE]
On Error Resume Next
rs.Open cmd
If Err.Number <> 0 Then
    ' do something awesome here
    Response.Write "oops, unable to access the table or view"
    Response.End
End If
On Error GoTo 0

If Not (rs.BOF And rs.EOF) Then
    cols = rs.Fields.Count
    rows = rs.RecordCount
    Do Until rs.EOF
        ' do something neato here
        rs.MoveNext
    Loop
End If

rs.Close
conn.Close
Set rs = Nothing
Set cmd = Nothing
Set conn = Nothing[/CODE]
Connection Management
I've seen more situations than I can count where a single page of code (script file, web page, etc.) makes repeated requests to a database in sequential order as the page is rendered or the script is executed. Most often it's having to grab data from multiple tables and/or views, or execute multiple stored procedures or functions. In a lot of cases, the code doing the heavy lifting is being pulled in from separate files (using "includes"). That's all nifty and modular, which is a good approach, but be VERY careful that you don't have each module do its own connection open/close management. That not only slows down the processing; it also puts more load on the network and the database server.
A case in point might be a web page that renders a report on an employee, then displays a table of that employee's performance evaluation records, followed by a table of the employees they manage. If those data repositories are all on different database servers, that may be all you can do; but if they happen to be on one server, or even in one database, you should seriously look at minimizing the number of open/close requests on your connections.
A brief sample of this using pseudo code:
open connection1
open recordset1
close connection1
open connection2
open recordset2
close connection2
open connection3
open recordset3
close connection3
might work a lot faster and better as...
open connection
open recordset1
open recordset2
open recordset3
close connection
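In VBScript terms, the second pattern might look something like the sketch below. The table and column names are made up for illustration, and one recordset object is simply reused: with the default server-side cursor, a connection generally supports only one open result set at a time, so each recordset is opened, consumed, and closed in turn over the single connection.
[CODE]
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open dsnString

Set rs = Server.CreateObject("ADODB.Recordset")

' first result set
rs.Open "SELECT * FROM dbo.Employee WHERE id=" & idNumber, conn
' ... render the employee report ...
rs.Close

' second result set, same connection
rs.Open "SELECT * FROM dbo.PerfEval WHERE empid=" & idNumber, conn
' ... render the evaluations table ...
rs.Close

' third result set, same connection
rs.Open "SELECT * FROM dbo.Employee WHERE managerid=" & idNumber, conn
' ... render the direct-reports table ...
rs.Close

conn.Close
Set rs = Nothing
Set conn = Nothing[/CODE]
One connection opened, one closed, three queries served.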
Some people prefer to open a "global" or "session" connection, whereby the connection is opened upon login or initialization of each user session. The connection object itself is stored in the session stack and made available globally to that user throughout their session window, so each concurrent user has their own connection opened and maintained. Granted, this makes it easier to run queries, updates, etc. without the overhead of managing connections at the more granular page/script level, but it definitely taxes the database server with a lot of unnecessary open connections. For a handful of users that may be fine, but with hundreds or thousands of users it can be a mess and make the database server drag.
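For the record, that session pattern usually looks something like this (a sketch, typically in global.asa; the connection-string source is an assumption here). Another reason to avoid it: if ADO is registered apartment-threaded, storing the object in Session can pin that session to a single worker thread in classic ASP.
[CODE]
' global.asa (sketch)
Sub Session_OnStart
    Dim conn
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open Application("dsnString")   ' assumes the DSN lives in an Application variable
    Set Session("conn") = conn
End Sub

Sub Session_OnEnd
    If IsObject(Session("conn")) Then
        Session("conn").Close
        Set Session("conn") = Nothing
    End If
End Sub[/CODE]
Each page then just does Set conn = Session("conn") and runs its queries, but every idle user is now holding a database connection open for the life of their session.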
Just some random thoughts after beer. Have any thoughts you'd like to share?