The Myths and Legends of Prepared Queries  
Description:  A review of the how and when of using prepared ADO commands with the SalesLogix 6.2 OLEDB Provider (Version 6.2)

Category:  SalesLogix OLE DB Provider
Author:  Stuart Carnie
Submitted:  6/11/2004



A prepared query (or command) signals to the database engine that you, the developer, intend to execute the statement more than once. Almost always, the query has one or more parameters that change between executions, while the statement text itself remains the same.
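The idea can be illustrated outside ADO as well. Here is a minimal sketch in Python using sqlite3 (not SalesLogix-specific; the table and values are invented for illustration), where one parameterized statement is executed repeatedly with different values:

```python
import sqlite3

# One statement, many executions: only the parameter values change.
conn = sqlite3.connect(":memory:")
conn.execute("create table account (accountid text, name text)")

sql = "insert into account (accountid, name) values (?, ?)"
for row in [("A1", "Acme"), ("A2", "Globex"), ("A3", "Initech")]:
    conn.execute(sql, row)   # same SQL text, different parameters

print(conn.execute("select count(*) from account").fetchone()[0])  # -> 3
```

The placeholders (`?`) play the same role as the parameter markers in the ADO examples later in this article.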

The When

It is very important to understand when one should use a prepared query. If you don't understand the concept, or don't know why you are using a prepared query, then don't use them. Under heavy load, with many users executing prepared queries, you can bring a database server to its knees. However, if you understand the concepts and know why you are using a prepared query, they can greatly improve the performance and scalability of your application or code.

In my experience with databases, as regards performance and the benefit of prepared queries, you should be executing a query at least four times before bothering to prepare it. The reason is that most native OLE DB providers perform a network round trip for each of the prepare, execute, and finally unprepare steps. The server also does a lot of preparation when it receives this request, so make sure you don't let it down by executing only one or two queries!
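As a back-of-the-envelope sketch (all costs here are invented; only the shape of the trade-off matters), you can model why a prepared query only pays off after a handful of executions:

```python
# Toy cost model.  Assumptions (made up for illustration): every execute costs
# one network round trip, an unprepared execute also pays a server-side parse,
# and a prepared query pays the parse once plus PREPARE/UNPREPARE round trips.
PARSE, TRIP = 1.0, 1.5   # hypothetical relative costs

def unprepared_cost(n):
    return n * (PARSE + TRIP)      # parse + round trip on every execution

def prepared_cost(n):
    return PARSE + (n + 2) * TRIP  # parse once; +2 trips for PREPARE/UNPREPARE

for n in (1, 2, 4, 10):
    print(n, unprepared_cost(n), prepared_cost(n))
# With these made-up costs the two curves cross at n = 4, matching the rule of
# thumb above: don't bother preparing for only one or two executions.
```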

Version 2 of the SalesLogix OLE DB provider (first released in Beta 3 of the 6.2 release) supports prepared commands natively. It does not generate a prepare on the server, but it binds to the underlying provider only once, so the performance increase is very significant.

The How

Okay, this is all great, you say, but how do we use these mysterious prepared queries? This example is in VBScript, but it applies to any language using ADO. First, let's declare our variables and construct a connection:

' Variable section
dim conn      ' ADO connection object
dim dstCMD    ' ADO command object to insert data
dim srcRS     ' source data (ADO recordset)
dim params    ' parameters collection from dstCMD
dim fldout    ' source fields collection from srcRS
dim rows      ' rows affected by executing dstCMD

' ADO constants (these are not defined automatically in plain VBScript,
' so declare them before use)
Const adCmdText          = 1
Const adOpenForwardOnly  = 0
Const adUseClient        = 3
Const adLockReadOnly     = 1
Const adExecuteNoRecords = 128

set conn = CreateObject("ADODB.Connection")
conn.ConnectionString = "File Name=mytest.udl"
conn.Open

Now, let's construct the ADO command object responsible for handling the prepared query. In this case we'll demonstrate a batch of inserts. The important lines are "dstCMD.Prepared = true" and "dstCMD.Parameters.Refresh" - the comments go into further detail.

' construct the ADO command object for inserting the data into the table
set dstCMD              = CreateObject("ADODB.Command")
dstCMD.ActiveConnection = conn
dstCMD.CommandType      = adCmdText    ' it is important this is set to the intended command type

dstCMD.Prepared         = true         ' True indicates to ADO that this command is prepared
                                       ' and will be used multiple times.  It is pointless to do
                                       ' this if you are only executing the command object once.
dstCMD.CommandText      = _
    "insert into account " & _
    "(accountid, name, division, revenue, employees, " & _
    "status, createdate, modifydate, createuser, modifyuser) " & _
    "values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"

dstCMD.Parameters.Refresh              ' Refreshes the parameters collection with the correct
                                       ' number of parameters and data types.
                                       ' Do this AFTER setting the command text

In our example we will copy the data from a source recordset, although it could come from anywhere. The following code constructs this object and opens it. The first point of interest is srcRS.CursorType = adOpenForwardOnly, which indicates you will only traverse this recordset from beginning to end; internally, ADO may be able to optimize for this. Second, srcRS.LockType = adLockReadOnly instructs ADO that you will not be modifying any data in this recordset, so again internal optimizations and lower memory requirements can be realized. It is very important when using ADO to state your intentions on all properties. Never assume the defaults are okay - more often than not they are not. You will greatly reduce your need to troubleshoot code and, as a nice side effect, will potentially gain performance and lower your memory requirements.

' Create the source recordset for the copy.  This could be a text file or
' anything you want.
set srcRS                = CreateObject("ADODB.Recordset")
srcRS.ActiveConnection   = conn
srcRS.CursorLocation     = adUseClient
srcRS.CursorType         = adOpenForwardOnly
srcRS.LockType           = adLockReadOnly
srcRS.Open "Select accountid, account, division, revenue, employees, " & _
           "status, createdate, modifydate, createuser, modifyuser from account"


...but I digress. Next comes the work of the routine, which sets the parameter values and executes the INSERT. There is very little to the actual code, but with logging off I have seen 2-2.5x performance increases when inserting 1000+ rows of data. With logging on, it is about 1.5 times quicker. That is still significant, and the SLX provider may be optimized further, so this will only get better.

Set fldout = srcRS.Fields
Set params = dstCMD.Parameters

While Not srcRS.EOF 
    params(0).Value = fldout("accountid").value
    params(1).Value = fldout("account").Value
    params(2).Value = fldout("division").Value
    params(3).Value = fldout("revenue").Value
    params(4).Value = fldout("employees").Value
    params(5).Value = fldout("status").Value
    params(6).Value = fldout("createdate").Value
    params(7).Value = fldout("modifydate").Value
    params(8).Value = fldout("createuser").Value
    params(9).Value = fldout("modifyuser").Value

    dstCMD.Execute rows, , adExecuteNoRecords    ' use adExecuteNoRecords for commands that do not
                                                 ' return data, as this can increase performance slightly.

    srcRS.MoveNext                               ' advance to the next source row - without this
                                                 ' the loop would never terminate
Wend

srcRS.Close
conn.Close

Set dstCMD = Nothing
Set srcRS = Nothing
Set conn = Nothing
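For comparison, here is the same copy pattern as a hedged sketch in Python's sqlite3 (again not the SLX provider; the src/dest tables are invented). DB-API drivers expose the batch form via executemany, which reuses one parameterized statement across the whole batch:

```python
import sqlite3

# Source and destination tables in one in-memory DB, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("create table src  (accountid text, name text)")
conn.execute("create table dest (accountid text, name text)")
conn.executemany("insert into src values (?, ?)",
                 [("A1", "Acme"), ("A2", "Globex")])

# The batch-insert loop from the article, collapsed: one statement text,
# executed once per source row with different parameter values.
rows = conn.execute("select accountid, name from src").fetchall()
conn.executemany("insert into dest (accountid, name) values (?, ?)", rows)

print(conn.execute("select count(*) from dest").fetchone()[0])  # -> 2
```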

Wrapping up

The performance benefits for large batch updates or inserts are worth the time it takes to look at your code and determine whether it fits the criteria for prepared, parameterized queries. If you are writing code to batch import or update data, chances are it will benefit from the techniques described above.


About the Author

Stuart Carnie
Sage Software, CRM Division


View online profile for Stuart Carnie


Related Articles 
 - The Myths and Legends of Prepared Queries Sample Code - Submitted by: Stuart Carnie


Comments & Discussion
Ryan Farley


Re: The Myths and Legends of Prepared Queries
Posted: 6/11/2004 5:41:57 PM
Great stuff Stuart. It is awesome to see something from you here! Thanks.

Robert C. Levine

Re: The Myths and Legends of Prepared Queries
Posted: 6/14/2004 4:00:09 PM
Question: If the update is to be used multiple times per run and there will be many such runs, then would it not be better to use a custom SQL Stored Procedure (and supply it parameters at run time)?

Bob Levine
Advantageware, Inc.
Stuart Carnie

Re: The Myths and Legends of Prepared Queries
Posted: 6/14/2004 4:12:10 PM

The answer is yes for native database connections; however, when using SalesLogix, we require the provider in order to generate transactions for synchronisation. If we were to use a native database stored proc, the provider could not synchronise the changes, as it would be unaware of the SQL that was executed on the server within the stored proc.

Something we may look at in the future is the possibility of syncing a stored proc that is flagged as a type that modifies data. There are no immediate plans for this, as there are many limitations that would be imposed on a native database proc, which will require considerable testing on our part.
Eric Carr

Re: The Myths and Legends of Prepared Queries
Posted: 6/25/2004 12:40:27 AM
For those of us who do not have multiple servers (and thus do not need to worry about syncing), I'm glad that at least stored procs are supported in the form of sp_proc @param='value'. This is important because the SLX OLE DB Provider can't parse more complex queries (such as insert into .. select from .. order by x).
Jiho Han

Re: The Myths and Legends of Prepared Queries
Posted: 6/28/2004 4:36:19 PM

You can retrieve the native connection string and use that for complex queries - I think, never tried - instead of sps if you want.


This is great info. I am still working with v6.1 and I find there are a lot of issues with the provider. Maybe these are known issues that have been fixed in 6.2 (BTW, when is the ETA for this?):
1. Command.Prepared = True does not work - it has no effect, and furthermore, if you try to execute more than once, it errors out.
2. Maybe related to #1, but every single .Execute is prepared! What's up?
3. Connection.Execute calls should not be prepared, but they are! Instead they should use sp_executesql.
4. There are some issues with varchar and longvarchar type parameters and using null or empty string values.
5. Unless you start a transaction on a given connection, there is no guarantee that the underlying connection to the db (aka spid) is the same one you just used. This is either a consequence of #2 or a design decision made in the provider, which makes me wonder why. BTW, transactions do work, right? I hope so, because if not, man...
I could go on and on ad infinitum.
Jonathan B. Owen

Re: The Myths and Legends of Prepared Queries
Posted: 7/29/2004 2:37:22 PM
I am in the process of developing an external (VBS-based) app for a client. I was wondering if prepared statements are supported in the SLX 6.1 provider. I just gave it a shot and received an error stating that the ADO command object does not support the "Parameters" property. I'm not sure if this is because the provider needs the app to call CreateParameter() on its own, or whether prepared statements are simply not supported at all.

Any concrete information if any support exists or not for SLX 6.1?
Jiho Han

Re: The Myths and Legends of Prepared Queries
Posted: 7/30/2004 2:12:03 PM

As far as I know, the Prepared property is not supported by the SalesLogix OLE DB Provider. Actually, every single query is "prepared", which is terrible.
It simply doesn't work.
Stuart Carnie

Re: The Myths and Legends of Prepared Queries
Posted: 7/30/2004 2:15:25 PM
In 6.1x you will not see as many benefits; however, 6.2 does benefit from using prepared queries, as in the above example.
Rich Buckley

Re: The Myths and Legends of Prepared Queries
Posted: 9/16/2004 4:51:32 AM
Back to SPs for a moment. If I wanted to have Crystal use a stored procedure, would that now work? It wouldn't with the last provider; I had to have the client run the report outside of SLX. This would be huge!
Jiho Han

Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 11:45:45 AM
ok, so I've done some testing on 6.2 and there are some issues which may truly be issues or maybe not.

"Version 2 of the SalesLogix OLE DB provider (first released in Beta 3 of the 6.2 release) supports prepared commands, and natively. It does not generate a prepare on the server, but still binds to the underlying provider only once, so the performance increase is very significant."

So, I missed the part about the provider not generating a prepare on the server and got a little upset that it still wasn't working in 6.2. Indeed, on the server the statements execute as sp_executesql, which isn't half bad; however, it's far from what Prepared is supposed to do. There really may be some performance gains between the SLX provider and the underlying provider. I have a feeling the reason it doesn't generate prepares on the server is the way the SLX provider pools its connections, perhaps - I don't know. In fact, I am not completely sure whether a connection I create and hold on to will still be the same connection later on. It's not so in 6.1. Maybe it is in 6.2.

One thing I've noticed in 6.1 is that if you start a transaction on the connection, it will retain the same underlying connection. By the way, while on the subject of transactions, I've seen situations where a series of statements inside a transaction was committed when an error occurred in the script and the connection was closed or garbage collected without my explicitly committing - which is incorrect behavior. I do not know whether this is still true in 6.2. We'll see.

I really hope that SalesLogix publishes some kind of technical brief/doc on the provider: features supported, not supported, quirks, etc. My having to test out all scenarios - and sometimes finding out in production - is not very nice.

From the quote above, though, what does "natively" in "supports prepared commands, and natively" mean?
Stuart Carnie

Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 12:38:40 PM

On the contrary, 6.1 always did a prepare before executing the statement. The main reason for removing this was scalability and performance. The load (both network and CPU) and memory usage on MSSQL were significantly greater than in 6.2, and preparing the query was a major contributing factor. Some early tests running my in-house load-testing tool showed a significant delta in memory usage for the sqlservr.exe process: 6.2 was about 65MB and 6.1.0 was about 240MB.

Round trips to the SQL server are also massively impacted. Almost always (after internal meta-data is cached), 6.2 performs one round trip to the server to execute the query; 6.1x performs 3, and depending on how the ADO recordset is used (i.e. if it is prepared and then executed), possibly even 5 round trips! Typically a 3-way is 1:PREPARE, 2:EXECUTE, 3:UNPREPARE; a 5-way is 1:PREPARE, 2:UNPREPARE, 3:PREPARE, 4:EXECUTE, 5:UNPREPARE. This was largely to do with the SDK we used in 6.1x - 6.2 removed the SDK completely.

In a disconnected model, prepares rarely make any sense. Given our client is now disconnected, and ADO.NET is also a disconnected model, a prepare (by the SalesLogix provider) is even more unnecessary.

I do not understand why you would be upset by all these improvements, or even upset that it is not doing a prepare on the server. Very rarely should code need to do a 'true' prepare. We have been working on this internally for 3+ years and have seen all the variations in performance. The balance we reached in 6.2 is the best for the SalesLogix application environment, including ADO.NET, and for how our developers, including BPs, typically use it.

By 'natively', I mean the 6.2 provider truly prepares the query in its internal buffers, so it only parses, allocates parameter binding buffers, and sets up many other expensive internal structures once. The 6.1x provider (due to the 3rd party SDK we used) repeats all these steps EVERY time.

SQL Server is also a lot smarter than you give it credit for. Given that the query in the example above is static and only the parameter values change, SQL Server caches the query anyway, so the performance difference of doing a true PREPARE, EXECUTE..EXECUTE..EXECUTE, UNPREPARE is not worth it.

Ryan Farley


Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 1:07:20 PM

As always, great info. Glad to see your comments here.

Jiho Han

Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 1:11:35 PM

On the contrary to what? I am not disagreeing that 6.1 was not performant, with its preparing of every statement, or that 6.2 is an improvement over 6.1 - I am in complete agreement that 6.2 is a vast improvement over 6.1. See my comment of 7/30/2004. I have known about the 6.1 provider's faults since it came out, and I laud SalesLogix for the effort that has been made toward improving the provider for 6.2.

I also acknowledge that the current model employed by the 6.2 provider is indeed efficient. Anyone who's read the SQL BOL can attest to this. Look up sp_executesql in the documentation and it's all spelled out - it optimizes statements heavily and caches the execution plans, etc., such that it sometimes performs as well as sp_prepare. The SQL Server team apparently put tremendous effort into this functionality. These things I've also known for a long time. So if it seemed like I wasn't giving proper credit to SQL Server, let it be duly noted that I do indeed give credit where credit is due.

Now to the issues that I am NOT in agreement:

"In a disconnected model, prepares rarely make any sense. Given our client is now disconnected, and ADO.NET is also a disconnected model, a prepare (by the SalesLogix provider) is even more unnecessary."

ADO.NET is not only a disconnected model; it gives you an option. DataSet is disconnected, while DataReader is not.
And I'd argue that prepares would make sense in some cases in a disconnected model. If you are disconnected, you will have to sync up with the host eventually. When this occurs, chances are the statements that need to run against the host server fit exactly the pattern where preparing a statement before executing a whole bunch of similar queries makes sense. Again, you may argue that the difference between sp_executesql and sp_prepare is minimal if not non-existent. With this I am not in total disagreement. But I'm not going to go as far as you did and say the performance difference is not worth it.

Also, my being "upset" with the fact that it doesn't actually "prepare" a statement is that it's misleading. I expect Command.Prepared = True to either create a prepared query or generate an error. Had SalesLogix provided documentation noting the difference, I might have yielded.

One question though is in your statement that says "Given our client is now disconnected", I assume you mean the SalesLogix client. Has SalesLogix recently changed its architecture that this is now true? Was it a connected client and now it's not?

I don't want to argue for the sake of argument because it's a waste of time. I can live with all these issues - I do it anyway, everyday - but I just wish SalesLogix would provide a documentation where it's seriously needed.
Stuart Carnie

Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 1:55:13 PM

I appreciate your comments. We have spent a significant amount of time improving the performance of this generic data layer, and will continue to do so - it is a primary focus. Again, I will reiterate the point, this data layer is designed for the SalesLogix system, and we focus our optimizations to this environment and the large majority of our BP community.

My point with the prepared statement is that we do implement it, to the level where we have found the greatest performance improvements, without having a negative impact. Let me explain further.

I highlighted in my previous post that internally we 'prepare' the query, saving many expensive steps such as parsing, memory allocation, and the preparation of many internal data structures, which includes the setup and binding to the underlying provider, so it too does not have to parse the query (which the MSSQL provider does to a degree as well). This saves a significant amount of time.
As you are obviously well aware, the whole point of preparing is to avoid doing this expensive work N times for the same query.
So when you set Prepared=True, you are in fact getting what you paid for - a prepared query implementation that greatly improves performance. If you do not, and you execute the same query multiple times, the provider has to repeat the parsing, memory allocation, and internal structure preparation N times, including the bindings to the underlying provider.

The second point of my explanation is key to the design decisions we made, which I hope you can appreciate. A significant portion of core client code forces prepares (due to Borland's default implementation of their ADO wrappers), which would have translated to 3 round trips to the server for our client apps in most cases had we truly prepared the query on the server. This would have had a significant negative impact on prepared query performance and the ultimate scalability of our application. This is why our provider should scale better in conjunction with our application than the native MSSQL provider.

Often, developers do not understand when and how to correctly use a prepared query, and so it ends up having a considerably negative impact on performance and scalability. We have provided all the benefits of prepared queries, and none of the potential drawbacks for 'unprepared' users :)

I believe this forum is continuing to expose a lot of technical details about the provider.


Jiho Han

Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 2:58:30 PM

Thanks for your response. If it seemed that I was speaking outside the context of SalesLogix system regarding the provider, I apologize for misleading you.

I understand the performance enhancements made by your reworking of the provider in 6.2. I had forgotten SalesLogix is built in Delphi (still true, I assume). On to your second point: you mention that Borland forces you to do things a certain way. If you don't mind, can you elaborate on the new provider design? Especially where Borland and its technology fit in - unless you mean your provider is written in Delphi and uses its "ADO wrapper" between your provider and the SQL OLE DB Provider.

"I believe this forum is continuing to expose a lot of technical details about the provider."

Still, the forum is not adequate documentation, although it and Ryan's blog are turning out to be an oasis in the desert that is SalesLogix developer information. :) And until you provide a guide or tutorial on the topic we're discussing, most SalesLogix developers will go on writing static SQL statements conjoined with ampersands and escaping single quotes for every data modification, which is, needless to say, buggy, less maintainable, and less performant.

Oh, and have there been changes to the SQL parser as well? What's allowed and what's not?

Thanks for chiming in. I appreciate it. I know you don't have to, and the questions could have been stranded out here until nobody really cared.

Stuart Carnie

Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 3:26:24 PM

No worries at all - I'm certainly here to help where ever I can.

The OLE DB provider is written in C++ - the ADO wrappers in Delphi are used by our client applications.
In Delphi, Borland exposes ADO via class wrappers to provide a consistent model for writing data-driven applications. This way your app could use ADO, BDE, or any other underlying technology, as long as it is wrapped in the base TDataset class. It is this implementation that by default results in poor performance - we could make changes throughout the code to work more efficiently, but that is a significant amount of work. This is also partly due to our migration from the BDE (Borland Database Engine) to ADO.

Yes, the parser was also rewritten - it is more performant and more generic. It still doesn't allow things like create trigger or create procedure, and this is by design. We are considering a pass-thru mechanism to allow this in a future release, but it still needs to be secure.

Stored procedures are now supported too.
Jiho Han

Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 4:05:58 PM
I see.

So then I assume the migration from BDE to ADO is still in progress as of 6.2?

If possible, can you shed a little light on what version 7 (or the next major upgrade) will bring? I've only found one snippet on the web about the web client being re-architected. Will the client stay Delphi-based with VBScript as the customization language? Any consideration of moving to .NET?

Is there a provider technical bulletin for 6.2 similar to the existing 6.1 one?

Stuart Carnie

Re: The Myths and Legends of Prepared Queries
Posted: 9/22/2004 4:16:52 PM
No, the migration is complete - it happened in 6.0. The BDE used to be a separate install with SalesLogix versions prior to 6.0, and this is no longer necessary.

As for future versions, I'm not sure what I can and cannot say just yet.. ;-) I must admit, I like .NET.




 slxdeveloper.com is brought to you courtesy of Ryan Farley & Customer FX Corporation.
 This site, and all contents herein, are Copyright © 2019 Customer FX Corporation. The information and opinions expressed here are not endorsed by Sage Software.
