slxdeveloper.com
User Profile - Stuart Carnie
Stuart Carnie, Sage Software, CRM Division
Stuart Carnie's Contributions
Stuart Carnie has contributed comments and 2 articles.
Comment: Re: Configuring SalesLogix 6.2 For Microsoft SQL 2005 Express Edition
I hacked the connection string in, and it works thus far. When we qualify SQL 2005, I will make sure we can select that provider for use.
Cheers,
Stu
Author: Stuart Carnie - 2/17/2005
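The comment doesn't include the connection string itself. Purely as a generic illustration (the database name and instance name below are assumptions, not taken from the thread), an OLE DB connection string pointing at a SQL Server 2005 Express instance usually looks something like:

```text
Provider=SQLNCLI;Data Source=.\SQLEXPRESS;Initial Catalog=MyDatabase;Integrated Security=SSPI
```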
Comment: Re: Configuring SalesLogix 6.2 For Microsoft SQL 2005 Express Edition
Ryan, the provider itself doesn't care about the underlying connection and uses very generic OLE DB interfaces (by design, to facilitate easy integration with new back ends). It is purely the configuration utility that limits us, so you might be able to use the underlying COM object the OLE DB Config Manager client uses and create an encrypted string to test. Obviously the DBConfigMgr interface is undocumented and subject to change at any time; however, you are playing in an unsupported area anyhow :)
Author: Stuart Carnie - 2/17/2005
Comment: Re: The Myths and Legends of Prepared Queries
No, the migration is complete - this happened in 6.0. The BDE used to be a separate install with SalesLogix versions prior to 6.0, and this is no longer necessary.
As for future versions, I'm not sure what I can and cannot say just yet... ;-) I must admit, I like .NET.
Cheers,
Stu
Author: Stuart Carnie - 9/22/2004
Comment: Re: The Myths and Legends of Prepared Queries
Jiho,
No worries at all - I'm certainly here to help wherever I can.
The OLE DB provider is written in C++ - the ADO wrappers in Delphi are used by our client applications. In Delphi, Borland has exposed ADO via class wrappers to provide a consistent model for writing data-driven applications. This way your app could use ADO, the BDE, or any other underlying technology, as long as it is wrapped in the base TDataset class. It is this implementation that by default results in bad perf - we can make changes throughout the code to work more efficiently, but this is a significant amount of work. This is also partly due to our migration from the BDE (Borland Database Engine) to ADO.
Yes, the parser was also rewritten - it is more performant and more generic. It still doesn't allow things like CREATE TRIGGER or CREATE PROCEDURE, and this was by design. We are considering a pass-through mechanism to allow this in a future release, but it still needs to be secure.
Stored procedures are now supported too.
Author: Stuart Carnie - 9/22/2004
Comment: Re: The Myths and Legends of Prepared Queries
Jiho,
I appreciate your comments. We have spent a significant amount of time improving the performance of this generic data layer, and will continue to do so - it is a primary focus. Again, I will reiterate the point: this data layer is designed for the SalesLogix system, and we focus our optimizations on this environment and the large majority of our BP community.
My point with the prepared statement is that we do implement it, to the level where we have found the greatest performance improvements, without having a negative impact. Let me explain further.
As I highlighted in my previous post, internally we 'prepare' the query, saving many expensive steps such as parsing, memory allocation, and the preparation of many internal data structures, which includes the setup and binding to the underlying provider, so it too does not have to parse the query (which the MSSQL provider does to a degree also). This saves a significant amount of time. As you are obviously well aware, the whole point of preparing is to avoid doing this expensive work N times for the same query. So when you set Prepared=True, you are in fact getting what you paid for - a prepared query implementation that greatly improves performance. If you do not, and you execute the same query multiple times, the provider has to repeat the parse, memory allocation, and internal structure preparation N times, including the bindings to the underlying provider.
The second point of my explanation is key to the design decisions we made, which I hope you can appreciate. A significant portion of core client code forces prepares (due to Borland's default implementation of their ADO wrappers), which would translate to 3 round trips to the server for our client apps in most cases, had we truly prepared the query on the server. This would have had a significant negative impact on prepared query performance and the ultimate scalability of our application. That is why our provider should scale better in conjunction with our application than the native MSSQL provider.
Often, developers do not understand when and how to correctly use a prepared query, and so it ends up having a considerably negative impact on performance and scalability. We have provided all the benefits of prepared queries, and none of the potential drawbacks for 'unprepared' users :)
I believe this forum is continuing to expose a lot of technical details about the provider.
Cheers,
Stu
Author: Stuart Carnie - 9/22/2004
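The client-side 'prepare' described above can be sketched roughly as follows. This is a hypothetical illustration, not the actual SalesLogix provider code: a small wrapper (over SQLite here, purely for a self-contained demo) that pays the parse/binding-setup cost once and then reuses it across executions, so only the execute step is repeated.

```python
import sqlite3

class ClientPreparedQuery:
    """Hypothetical sketch: parse and set up a parameterized query once,
    then execute it many times, mimicking the provider-side 'prepare'
    described in the comment above."""

    def __init__(self, conn, sql):
        self.conn = conn
        self.sql = sql
        # Stand-in for the one-time work: parsing, memory allocation,
        # and setting up parameter-binding structures.
        self.param_count = sql.count("?")

    def execute(self, params):
        if len(params) != self.param_count:
            raise ValueError("parameter count mismatch")
        # Only the execute step remains; the setup cost is already paid.
        return self.conn.execute(self.sql, params).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO account VALUES (?, ?)",
                 [(1, "Abbott"), (2, "Baker")])

q = ClientPreparedQuery(conn, "SELECT name FROM account WHERE id = ?")
print(q.execute((1,)))  # [('Abbott',)]
print(q.execute((2,)))  # [('Baker',)]
```

Without such a wrapper, executing the same query N times repeats the parse and setup N times, which is exactly the cost the comment says the provider amortizes.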
Comment: Re: The Myths and Legends of Prepared Queries
Jiho,
On the contrary, 6.1 always did a prepare before executing the statement. The main reason for removing this was scalability and performance. The load (both network and CPU) and memory usage on MSSQL were significantly greater than with 6.2, and preparing the query was a major contributing factor. Some early tests running my in-house load testing tool showed a significant delta in memory usage for the sqlservr.exe process: 6.2 was about 65 MB and 6.1.0 was about 240 MB.
Round trips to the SQL server are also massively impacted. Almost always (after internal metadata is cached), 6.2 performs 1 round trip to the server to execute the query; 6.1x performs 3, and depending on how the ADO recordset is used (i.e. if it is prepared and then executed), possibly even 5 round trips! A typical 3-trip sequence is 1:PREPARE, 2:EXECUTE, 3:UNPREPARE; a 5-trip sequence is 1:PREPARE, 2:UNPREPARE, 3:PREPARE, 4:EXECUTE, 5:UNPREPARE. This was largely due to the SDK we used in 6.1x - 6.2 removed the SDK completely.
In a disconnected model, prepares rarely make any sense. Given that our client is now disconnected, and ADO.NET is also a disconnected model, a prepare (by the SalesLogix provider) is even more unnecessary.
I do not understand why you would be upset by all these improvements, or even upset that it is not doing a prepare on the server. Very rarely should code need to do a 'true' prepare. We have been working on this internally for 3+ years and have seen all the variations in performance. The balance we reached in 6.2 is the best for the SalesLogix application environment, including ADO.NET as our developers, including BPs, typically use it.
By 'natively', I mean the 6.2 provider truly prepares the query in its internal buffers, so it parses, allocates parameter binding buffers, and builds many other expensive internal structures only once. The 6.1x provider (due to the 3rd-party SDK we used) does all these steps EVERY time.
SQL Server is also a lot smarter than you give it credit for. Given that the query in the above example is static, and only the parameter values change, SQL Server caches the query anyway, so the performance difference from doing a true PREPARE, EXECUTE..EXECUTE..EXECUTE, UNPREPARE is negligible.
Author: Stuart Carnie - 9/22/2004
Comment: Re: The Myths and Legends of Prepared Queries
In 6.1x you will not see as many benefits; however, 6.2 does benefit from using prepared queries, as in the above example.
Author: Stuart Carnie - 7/30/2004
Comment: Re: The Myths and Legends of Prepared Queries
Bob,
The answer is yes for native database connections; however, when using SalesLogix, we require the provider in order to generate transactions for synchronisation. If we were to use a native database stored proc, the provider could not synchronise the changes, as it would be unaware of the SQL that was executed on the server within the stored proc.
Something we may look at in the future is the possibility of syncing a stored proc that was flagged as a type that modifies data. There are no immediate plans for this, as there are many limitations that would be imposed on a native database proc, which will require considerable testing on our part.
Author: Stuart Carnie - 6/14/2004
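The reason the provider must sit in the statement path can be sketched generically. This is a hypothetical illustration (the class name and SQLite backing are my own, not SalesLogix code): a connection wrapper records every DML statement it sees so the changes can later be replayed to remote databases, and a server-side stored procedure would modify rows without ever passing through the wrapper.

```python
import sqlite3

class SyncRecordingConnection:
    """Hypothetical sketch: capture DML at the provider layer so it can
    be synchronised to remote databases later."""

    def __init__(self, conn):
        self.conn = conn
        self.sync_log = []  # statements to replay on remote databases

    def execute(self, sql, params=()):
        if sql.lstrip().upper().startswith(("INSERT", "UPDATE", "DELETE")):
            self.sync_log.append((sql, params))  # captured for sync
        return self.conn.execute(sql, params)

conn = sqlite3.connect(":memory:")
slx = SyncRecordingConnection(conn)
slx.execute("CREATE TABLE contact (id INTEGER, name TEXT)")
slx.execute("INSERT INTO contact VALUES (?, ?)", (1, "Stu"))
print(len(slx.sync_log))  # 1

# A stored procedure running inside the database server would change rows
# without its SQL ever passing through SyncRecordingConnection, so those
# changes would never reach sync_log - exactly the problem described above.
```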