slxdeveloper.com Community Forums
Crazy idea of mine...
Posted: 24 May 06 2:54 PM
It has long been an idea of mine to encapsulate the relationships within the SalesLogix database in .NET classes, thereby giving the programmer using my classes the ability to quickly navigate from one relationship to another. For example, if the programmer wanted to get the list of all Opportunities for an Account, he would create an instance of the AccountTable object and an AccountAdapter object, use the adapter to fill the table, and assign the row to an Account object. Once done, he would then be able to access the Opportunities by accessing the Opportunities property of the Account. This property would check whether the list is already memory resident and, if not, query the database with the same connection string used to fill the AccountTable, create the necessary objects, and pass back the list.
Code would look something like this (please forgive the fact that this is a lot of pseudo code; I am using VB.Net). This assumes the connection string has already been created...
Dim myAccounts As New AccountTable()
Dim myAdapter As New AccountAdapter("WHERE AccountId = 'ACCT000001'", connectionString)
myAdapter.Fill(myAccounts)

Dim myAccount As New Account(myAccounts.Rows(0))
Dim myOpps As OpportunityTable = myAccount.Opportunities
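The Opportunities property is where the real work would happen. Here is a rough sketch of the lazy-load idea (written in C# just for illustration, even though I am working in VB.Net; the OpportunityTable/OpportunityAdapter names are placeholders from my design, not finished code):

public class Account
{
    private readonly DataRow _row;
    private readonly string _connectionString;   // same connection string used to fill the AccountTable
    private OpportunityTable _opportunities;     // cached child list

    public Account(DataRow row, string connectionString)
    {
        _row = row;
        _connectionString = connectionString;
    }

    public OpportunityTable Opportunities
    {
        get
        {
            // Only hit the database the first time; afterwards the list is memory resident.
            if (_opportunities == null)
            {
                _opportunities = new OpportunityTable();
                OpportunityAdapter adapter = new OpportunityAdapter(
                    "WHERE AccountId = '" + _row["ACCOUNTID"] + "'", _connectionString);
                adapter.Fill(_opportunities);
            }
            return _opportunities;
        }
    }
}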
Has anyone else ever had this idea, and if so, did you find that it was just an impossible mission, or simply not worth the time? I do not mind if anyone here would like to see what I have done so far; hell, I wouldn't even mind making this open source. I just wanted to know if anyone else felt the same way or had the same idea.
Re: Crazy idea of mine...
Posted: 25 May 06 2:20 AM
Carlos,
Your idea is shared by many programmers who are sick of having to work with poorly structured tools. After all, what software does not entail working with data at some point?
Personally I'm all for persisted object models manually written in C# (or VB.NET). Unfortunately, while it's easy to speculate about the Account table, the task of taking all 100+ tables in SalesLogix and wrapping them cleanly in real objects with some core business logic, while keeping it maintainable so that revisions to the product mean only minor adjustments to the object model, is not a one-man job even if that one man worked on it full-time for a year, at least not without some additional code-generation tools.
For the uninitiated (though Carlos seems to already be using dataset nomenclature)...
Visual Studio 2005 does relationship-aware data encapsulation out of the box with datasets. Datasets are strongly-typed schema declarations persisted as XML but compiled as managed code. You can use the diagram designer to drag-and-drop relationships between tables just like the diagram designer in SQL Server Enterprise Manager (except that here the relationships are persisted locally, not on SQL Server). To get started, use the Server Explorer to connect through the SalesLogix OLE-DB provider to your database. You can then create a dataset directly from the tables in this connection.
Datasets are instantiable as objects, almost like strongly-typed recordsets, and can also be used to data-bind forms and controls. Keep in mind, though, that datasets are strictly schema declarations (wrapper "classes") for persisted data. They are not true object models. But when instantiated (and filled) they are certainly strongly-typed objects.
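Filling one table of such a dataset is then just standard ADO.NET. A minimal sketch, assuming you generated a typed dataset named SlxDataSet containing the ACCOUNT table (the SlxDataSet name and the connection string are only illustrative of the SalesLogix OLE DB provider; adjust for your environment):

using System.Data;
using System.Data.OleDb;

// Illustrative connection string for the SalesLogix OLE DB provider.
string connStr = "Provider=SLXOLEDB.1;Data Source=MYSERVER;Initial Catalog=SALESLOGIX;" +
                 "User ID=admin;Password=;";

SlxDataSet ds = new SlxDataSet();   // the typed dataset generated in the designer
using (OleDbConnection conn = new OleDbConnection(connStr))
using (OleDbDataAdapter adapter = new OleDbDataAdapter(
           "SELECT * FROM ACCOUNT WHERE ACCOUNTID = 'ACCT000001'", conn))
{
    adapter.Fill(ds, "ACCOUNT");    // strongly-typed rows, e.g. ds.ACCOUNT[0]
}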
There are caveats with datasets, of course. First of all, you won't want to do this with all the tables unless you want Visual Studio running like molasses. Secondly, you'll need to keep track of your connection context. (You're on your own there.) Third, you will definitely need to validate your relationships, because the relations don't really exist in SalesLogix from a SQL Server / OLE-DB perspective, but strictly through the SalesLogix JOINDATA (and perhaps RESYNCTABLEDEFS and other metadata) tables.
Be aware that the product team may be looking into an object model based system in a future version of the SLX product. Don't reinvent the wheel. On the other hand, I personally most certainly intend to wrap the ACO tables and their main subtables with hand-written object models, just for my own future customizations. I love the idea of myAccount.Opportunities, etc.
However, I think the connection should be static, so you should be able to use static methods to get what you need, filtered. These static methods should encapsulate any such data adapter that you described.
I.e.
Account myAccount = Account.GetAccount(accID); // singular is always by ID or
AccountList myAccounts = Account.GetAccounts("Status='Active'"); // multiple is always by WHERE clause, or overloaded with no params
and
OpportunityList myOpportunities = Account.GetAccount(accID).Opportunities;
Make it a declarative AccountList or AddressList rather than an Account[] or Address[] array, so that you can add a property (e.g. AddressList.AddressIDs) that exports all the IDs of the objects in the list as a comma-delimited, single-quote-encapsulated string. Then just override ToString() in the _List to return the result of that property. Use that pattern everywhere, and now you can do this...
OpportunityList myOppGroup = Opportunity.GetOpportunities("ACCOUNTID in (" + Account.GetAccounts("Status = 'Inactive'") + ")"); or
ContactList myContacts = Contact.GetContacts("ADDRESSID in (" + Address.GetAddresses("City = 'Austin' and State = 'TX'") + ")");
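Under the hood, one of those static factories might look roughly like this (only a sketch; it assumes a shared connection string and an Account constructor that wraps a DataRow, and it does no parameterization or error handling):

using System;
using System.Data;
using System.Data.OleDb;

public class Account
{
    // Shared connection context; see the earlier caveat about tracking this yourself.
    public static string ConnectionString;

    public static AccountList GetAccounts(string whereClause)
    {
        string sql = "SELECT * FROM ACCOUNT" +
            (string.IsNullOrEmpty(whereClause) ? "" : " WHERE " + whereClause);

        DataTable table = new DataTable("ACCOUNT");
        using (OleDbConnection conn = new OleDbConnection(ConnectionString))
        using (OleDbDataAdapter adapter = new OleDbDataAdapter(sql, conn))
        {
            adapter.Fill(table);
        }

        AccountList list = new AccountList();
        foreach (DataRow row in table.Rows)
            list.Add(new Account(row));
        return list;
    }

    public static Account GetAccount(string accountId)
    {
        AccountList accounts = GetAccounts("ACCOUNTID = '" + accountId + "'");
        return accounts.Count > 0 ? accounts[0] : null;
    }
}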
Just brainstorming with you. I'm a big believer in patterns that minimize any coding or efforts beyond simply expressing what we're trying to get at. Due to the multiple hits to the database in each line since we're not using server-side joins, this stuff may not be terribly fast at execution (faster than most VBScript scripts I've seen, though), but it's an incredible time-saver when developing.
HTH, Jon
Re: Crazy idea of mine...
Posted: 25 May 06 4:54 AM
BTW, the entity objects are great candidates for implementations of a common abstract class that might share common access points to session context (i.e. the database connection) or common methods that can be reused across entities. Also look for some interface declarations that follow similar patterns.
Similarly, the List objects are great candidates for generics.
class AccountList : List<Account>
{
    public AccountList() : base() { }

    // ...

    public override string ToString()
    {
        // return the comma-delimited, quoted IDs here
        return string.Empty;
    }
}
The actual List<T> class might be a bad choice to base from, but it's just an example / idea.
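To sketch the common abstract class and the generic list together (the names here are made up, just to show the shape of it):

using System.Collections.Generic;
using System.Text;

public abstract class SlxEntity
{
    // One shared access point to session context (the database connection) for every entity.
    protected static string ConnectionString;

    public static void SetSession(string connectionString)
    {
        ConnectionString = connectionString;
    }

    // Every entity exposes its SLX key ID the same way.
    public abstract string Id { get; }
}

public class EntityList<T> : List<T> where T : SlxEntity
{
    // Comma-delimited, single-quoted IDs, ready to drop into an IN (...) clause.
    public string Ids
    {
        get
        {
            StringBuilder sb = new StringBuilder();
            foreach (T item in this)
            {
                if (sb.Length > 0) sb.Append(",");
                sb.Append("'").Append(item.Id).Append("'");
            }
            return sb.ToString();
        }
    }

    public override string ToString()
    {
        return Ids;
    }
}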
Re: Crazy idea of mine...
Posted: 25 May 06 7:50 AM
Hi Jon,
First off, thanks for all the input; it's really appreciated. The truth is that this whole thing came about when I tried to use the VS2005 DataSet Designer. I quickly came to the realization that it would just be too resource demanding to add all the datatables at once into a single dataset, especially if I was just going to be using one table, like the ACCOUNT table.
That's the reason I came up with the design I was trying to describe in the first post. My basic thought process was to abstract the tables and their relationships by one step, but without forcing them into a dataset. This way I could build the dataset (if I so desired) with any of the tables that I had implemented, all done at run time. The idea seemed sound, at least in my own mind, but I am seriously starting to doubt the feasibility of it now. In either case, here are some implementation details just to get them out there; maybe someone can improve on this.
The singleton classes such as Account consume a DataRow and use it to return and store the member values; they also track whether the row has been modified from its original state (basically the same as what the designer would have created as an AccountDataRow object).
Some examples...
Class AccountTable
    Inherits DataTable
    ' ...
End Class

Class AccountAdapter
    Inherits Component
    Private _adapter As OleDbDataAdapter
    ' ...
End Class

Class SlxRelation
    Shared Overloads Sub AddRelation(ByVal accounts As AccountTable, ByVal contacts As ContactTable)
        ' ...
    End Sub
    Shared Overloads Sub AddRelation(ByVal accounts As AccountTable, ByVal opportunities As OpportunityTable)
        ' ...
    End Sub
    ' ...and so on for each relationship
End Class

Class Account
    Private myRow As DataRow
    Private myTable As DataTable
    Private myInitState As Integer

    ' Default constructor: creates an in-memory AccountTable
    Public Sub New()
        myTable = New AccountTable()
        myRow = myTable.NewRow()
        myInitState = CustomInitState.NewlyCreated
    End Sub

    ' Overloaded constructor: assigns a single row object to this account
    Public Sub New(ByVal row As DataRow)
        myRow = row
        myTable = row.Table
        myInitState = CustomInitState.LoadedFromObject
    End Sub
End Class
Again, thanks for the input, and have a good extended weekend (if you have it, that is).
Re: Crazy idea of mine...
Posted: 25 May 06 1:47 PM
One thing to think about is that SalesLogix stores an XML representation of the entire database. This includes all of the joins in the right places and the whole nine yards. I forget how you access it, but I believe it's now stored in the database, whereas it used to be in the SalesLogix folder.
Think about this example: you wish to insert a contact for a new account. This involves a row in the account, accountsummary, address (one for the contact as well), and contact tables, and possibly others. How do you abstract those relationships into an appropriate business object? It would be a contact object coupled with an account object. Add integrity to that mix, such as "an account must have at least one contact" or "a contact cannot belong to no account", and you come close to what SalesLogix keeps track of internally. I'm beginning to wonder how SalesLogix handles these relationships, because if they have a BO model then we could just ask for that, convert Delphi to C# (not hard), and we'd be done. No need to reinvent the wheel, just use theirs.
I also agree that loading all of the tables into a dataset would be resource intensive. What you are really asking for is a dynamic system that, when you add a particular table, keeps track of all the relationships and the integrity internally. I've built a table tree traversal routine (in Architect, mind you) that recursively checks the joindata table and follows a parent table down through each generation. Technically, I think what you're asking for is possible, at least from a relationship standpoint. The biggest hurdle is the integrity part: making sure everything you do doesn't trip any of the integrity checker's routines and keeps all of the data playing nice. The second biggest hurdle is making all of the stock SalesLogix tables mostly read-only, as some of the BLOB data is encoded and can't be manipulated reliably outside of the client. Plus, since their tables are technically an outside source, depending on the structure from version to version isn't exactly wise (even though v7 lets you manipulate their tables easily).
From the outside looking in, this doesn't look easy when you take in the scope. Working a piece at a time will make it easier to handle, but it still seems like it doesn't deliver as much benefit as the idea should. You're also relying on a company that, unlike Microsoft, doesn't hold backwards compatibility as the highest priority on its list, so maintenance may be a headache between major versions. They've been good so far, so that isn't much of an argument, but it should be in the back of your mind since this isn't coming from the designers of the app themselves.
Re: Crazy idea of mine...
Posted: 25 May 06 6:05 PM
That's the only way to go.
I've been building my own object model for SLX entities for years now, project by project. Working directly with data sucks. Eventually we'll have all that built in (beyond v7), but we've had to work with raw data for much too long now. I don't think it is very realistic to expect that you could have an expanding and growing library of business objects for *everything* in SLX; that's just too much of an undertaking. However, with each project I've had over the last few years I've been expanding my own collection. How could you expect to work any other way? Like you're really going to just work with DataSets, DataReaders, etc. We've been way beyond that for far too long hehe.
I actually started building a tool about 2 years ago that would generate business objects and associated access layers for SLX data. It worked fine, but I ended up abandoning it (although I still brush it off from time to time to generate some base code). It could never gen *everything* you'd need, as SLX just has too many "unenforced rules": things that aren't enforced in the schema but done by the application. You just can't gen that sort of stuff; you end up needing to write it all out yourself.
IMO there is huge value in building these layers. But I just don't know how realistic it is to build a generic, one-size-fits-all layer for everything.
Re: Crazy idea of mine...
Posted: 25 Jul 06 8:37 AM
Guys,
Building the business layer and data layer offers an extreme amount of functionality, but at the cost of churning out a lot of code. I would recommend using a tool like CodeSmith to generate the layer for you. As long as the business layer is separate from the rest of your core code, it is easy to re-gen depending on your end user's system. Rarely should a developer have to create a DAL by hand anymore with all of the gen tools, and really, CRUD is pretty much the same for any table. The business rules that Ryan noted, though, still have to be written by hand. If you take the view that a business object is a data container and create a rules engine, it becomes possible to have that abstracted layer for loading, saving, and validating the business rules. The most difficult BEs (business entities) to create will be the ones around the activity tables, mostly because of the recurring-activity functionality.
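In other words, something along these lines. This is just a sketch of the idea (the interface and class names are made up), not generated code:

using System.Collections.Generic;

// The business object stays a plain data container...
public interface IBusinessEntity
{
    string EntityName { get; }
}

// ...and the rules live outside it, so re-genning the layer never touches them.
public interface IBusinessRule<T> where T : IBusinessEntity
{
    // Return null or empty when the rule passes, otherwise a message describing the violation.
    string Validate(T entity);
}

public class RulesEngine<T> where T : IBusinessEntity
{
    private readonly List<IBusinessRule<T>> _rules = new List<IBusinessRule<T>>();

    public void Register(IBusinessRule<T> rule)
    {
        _rules.Add(rule);
    }

    // Run every registered rule before a load or save goes through.
    public List<string> Validate(T entity)
    {
        List<string> violations = new List<string>();
        foreach (IBusinessRule<T> rule in _rules)
        {
            string message = rule.Validate(entity);
            if (!string.IsNullOrEmpty(message))
                violations.Add(message);
        }
        return violations;
    }
}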
Mark
Re: Crazy idea of mine...
Posted: 25 Jul 06 3:38 PM
Here is some SQL that will generate very simple properties for all the columns in a table. In Query Analyzer, set the output to text rather than grid.
SELECT
    --C.Table_Name, C.Column_Name, C.Data_Type, C.Character_Maximum_Length
    '  public property get ' + Lower(C.Column_Name) + '()' + char(13) + char(10) +
    '    ' + Lower(C.Column_Name) + ' = p_ADO.Recordset.Fields.Item("' + Lower(C.Column_Name) + '").Value' + char(13) + char(10) +
    '  end property' + char(13) + char(10) +
    '  public property let ' + Lower(C.Column_Name) + '(byVal Value)' + char(13) + char(10) +
    '    p_ADO.Recordset.Fields.Item("' + Lower(C.Column_Name) + '").Value = value' + char(13) + char(10) +
    '  end property' + char(13) + char(10) +
    ''' --------------------------------------------------' + char(13) + char(10)
FROM Information_Schema.Columns C
    INNER JOIN Information_Schema.Tables T
        ON T.Table_Name = C.Table_Name
        AND Table_Type = 'Base Table'
WHERE C.Table_Name IN ('account')
    AND C.Table_Schema = 'sysdba'
ORDER BY C.Table_Name, C.Column_Name
/* sample output:

Public Property Get HistoryID
    HistoryID = p_ADO.Recordset.Fields.Item("HistoryID").Value
End Property

Public Property Let HistoryID(byVal Value)
    p_ADO.Recordset.Fields.Item("HistoryID").Value = Value
End Property

*/
Note that p_ADO is our custom ADO factory class, so this code will not work as-is. It is intended as an example of generating your DAL rather than hand coding it in VBScript.
Also note that some column names, such as Type, are not legitimate VBScript property names.
Timmus
Re: Crazy idea of mine...
Posted: 25 Jul 06 8:52 PM
Originally posted by Mark Dykun:
Building the business layer and data layer offers an extreme amount of functionality, but at the cost of churning out a lot of code. I would recommend using a tool like CodeSmith to generate the layer for you. As long as the business layer is separate from the rest of your core code, it is easy to re-gen depending on your end user's system. Rarely should a developer have to create a DAL by hand anymore with all of the gen tools, and really, CRUD is pretty much the same for any table.
AFAICT,
http://www.mygenerationsoftware.com/ + http://www.hibernate.org/343.html = sweet ORM code-gen bliss.
I've played with MyGeneration and dOOdads (dOOdads is not really useful out-of-the-box with SalesLogix just yet, but I love its minimalistic premise), but NHibernate is... uh... well, let's just say it has a reputation of being widely used in the industry lately, and is probably worth betting on. I'm still a bit green on NHibernate, though.
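For anyone who hasn't looked at it, the basic NHibernate usage pattern reads roughly like this (a sketch from memory, assuming you've written an Account class and an .hbm.xml mapping for the ACCOUNT table; the assembly name is illustrative, so check the NHibernate docs for the exact details):

using NHibernate;
using NHibernate.Cfg;

// Build the session factory once, up front.
Configuration cfg = new Configuration();
cfg.AddAssembly("MyCompany.SlxEntities");   // assembly containing the *.hbm.xml mappings
ISessionFactory factory = cfg.BuildSessionFactory();

// Then each unit of work opens a session.
using (ISession session = factory.OpenSession())
{
    Account account = (Account)session.Load(typeof(Account), "ACCT000001");
    // account.Opportunities could be a lazily loaded mapped collection
}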
Jon
Re: Crazy idea of mine...
Posted: 25 Jul 06 10:21 PM
Originally posted by Jon Davis:
NHibernate is... uh... well, let's just say it has a reputation of being widely used in the industry lately, and is probably worth betting on.
Yeah, for anyone looking at a future of working with SLX who wants to get into ORM now, NHibernate would be a pretty safe/great/wise choice (the flavor of the moment that I hope lasts).
Re: Crazy idea of mine...
Posted: 25 Jul 06 10:56 PM
I hope you guys are all also paying close attention to ADO.NET vNext (it would be v3, but that would conflict with .NET Framework v3, which of all things uses v2 of everything from .NET Framework 2, so it would have to be ADO.NET v4 instead; that sounds weird when we're only at v2, so they just say vNext!), which includes eSQL (Entity SQL), an "add-on" language to Transact-SQL, similar to what Linq is to C#/VB.NET, that adds ORM functionality right into SQL queries and ADO.NET objects. Microsoft's goal now is to eliminate the DAL altogether.
For that matter, I hope you're all paying attention to Linq, too. I remember ogling and squealing with excitement (or something) when Microsoft Research started showing off C-Omega. I was rather surprised to see that research project turn into a production project in a matter of months. Makes me wonder how soon we're going to see the Singularity project come into being...
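To give a flavor of it, a Linq query over a collection of accounts reads roughly like this (a sketch using the kinds of entity classes we've been kicking around in this thread; the Status/State/AccountName properties are illustrative, and the namespaces are the ones that eventually shipped):

using System;
using System.Linq;

// Query syntax over an in-memory list of accounts (Linq to Objects style).
AccountList accounts = Account.GetAccounts(null);

var activeInTexas =
    from a in accounts
    where a.Status == "Active" && a.State == "TX"
    orderby a.AccountName
    select a;

foreach (var account in activeInTexas)
    Console.WriteLine(account.AccountName);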
*sigh* That was fun. Back to today...
Re: Crazy idea of mine...
Posted: 27 Jul 06 3:05 PM
I myself am taking a little more pragmatic approach to the DAL battle that is approaching. Generally most of my data-centric code is reused, and the layer of objects I use implements a consistent interface/base class depending on what I am trying to do. I also like using reflection; though it is slower, for most business applications it does not make much of a difference, as other constraints (network latency and the like) usually matter more. Now that generics are widely available within .NET, the amount of code to write is reduced even further. And when it comes to business logic itself, ORM does not really tackle that. Honestly, I have not really bought into the industrial ORM frameworks as of yet.
We use a form of ORM in the mobile product, but in the grand scheme it is really very lightweight. Objects can be mapped and loaded, including their child relationships. Object deltas can also be tracked at the field level; it is not the most elegantly implemented solution, but it works well and will probably evolve now that the developer base out there, SLX and otherwise, is becoming more capable of adopting some of the more advanced concepts. Objects can also be saved or deleted, and synchronization messages can be created. Now, this is mostly possible because of some of the advanced framework features such as reflection. I would also think that any mapper would use reflection, or for performance even Reflection.Emit; otherwise a huge amount of code would have to be churned out to handle the loading and saving of each element.
When you really look at it, managing the CRUD is pretty much the same no matter what object you are working on. Distilling the actual object into the appropriate database-side actions is not really rocket science at this point. Not that I am an expert on ORM, but I understand it pretty well; I just like to keep as much of my code as possible within my domain of change, and a lot of these ORM solutions push you in one direction or another.
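As a trivial example of the reflection approach (nothing like our production code, just the core idea; type conversion and change tracking are left out):

using System;
using System.Data;
using System.Reflection;

public static class SimpleMapper
{
    // Copies matching column values from a DataRow onto an entity's writable properties.
    public static T Load<T>(DataRow row) where T : new()
    {
        T entity = new T();
        foreach (PropertyInfo prop in typeof(T).GetProperties())
        {
            if (!prop.CanWrite) continue;
            if (!row.Table.Columns.Contains(prop.Name)) continue;

            object value = row[prop.Name];
            if (value != DBNull.Value)
                prop.SetValue(entity, value, null);
        }
        return entity;
    }
}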
Now, with Linq, DLinq, XLinq, BLinq and the new ADO.NET vNext: I will play with them, but I will not hold my breath until the rubber meets the road. You really do not know how a technology will impact the marketplace until it is actually applied to real-world scenarios. That said, I do like the fact that I can query against objects with a somewhat easier syntax, although we could do the same with the current iterations of the framework by implementing IComparable and other interfaces.
Either way, it is still cool to see new things coming out of the MS machine; whether they have the intended impact or not is still to be seen. Consider that data binding has been available for Windows Forms in VB for almost 10 years, and only now, in iteration 8.0, do the critics say it actually works as desired.
Let's hope.
Mark
Re: Crazy idea of mine...
Posted: 30 Jul 06 1:23 AM
Originally posted by Mark Dykun:
Now, with Linq, DLinq, XLinq, BLinq and the new ADO.NET vNext: I will play with them, but I will not hold my breath until the rubber meets the road. You really do not know how a technology will impact the marketplace until it is actually applied to real-world scenarios.
Few of us have bet the farm on future v3 Redmond tech. Really, these are just tools. I tend to get excited about upcoming tools because of so many experiences where revised tools have made my life so much easier. Microsoft has also done an amazing job of knowing which tools to endorse as key paths to development success (i.e. the whole .NET thing), versus which tools are just sort of lingering to meet the needs of a few, and possibly to experiment. They do waste a lot of money on marketing their experiments, but I think they've learned from their experiences over the last decade of incredible, month-after-month change at Microsoft.
ADO and ADO.NET together are the key server/workstation data access utensils in the software industry, followed by a handful of others. Now, that may not be because they're any better than other tools for other languages and platforms. In this sense, sure, CRUD is CRUD. But the Redmond flavor is ubiquitous. Java die-hards disagree, but it's generally true. People choose the Microsoft toolset because it's Microsoft that provides the ubiquitous Windows operating system, and it's Microsoft that has managed to win all this Windows revenue and then turn right around and throw that money back into developer support (i.e. Visual Studio, SDKs, and various APIs). And the vNext thing is an extremely important thing to watch, because Microsoft has already laid down the precedent (the ADO/ADO.NET success) as well as the ubiquitous framework (.NET), to say nothing of detailing and spending billions of dollars on the next revision of the operating system, which will itself be the standard around which Redmond toolsets are built.
For that matter, NHibernate seems to me to be a slightly riskier proposition than waiting for ADO.NET v3. NHibernate may be a proven technology and available today, but it simply does not have the backing of hundreds of millions of dollars of initial research and development, nor the billions of dollars that will inevitably follow by way of end users (real businesses around the world) testing the thing and sending feedback to the creative machine that can yet revise it again.
On the other hand, NHibernate is solidified and available today, and for those who need several months to build a framework for next year's beautiful solution, you may as well get started now.
All I was saying is that you guys should be sure to pay attention. SalesLogix is not affected by ADO.NET vNext nor Linq today. But it most certainly will be affected by them tomorrow, whether we or Sage like it or not.
Re: Crazy idea of mine...
Posted: 10 Aug 06 4:18 PM
I want to qualify my last comment with the assumption that the context of the "ubiquitousness" that I refer to is a Windows environment.
I'm a believer that not all tiers necessarily belong on the same architecture. In fact, the larger the system, the less related the architectures will probably be. In large, distributed environments, it is often very appropriate to host the top tiers (the data store and the DAL and some of the DAL synchronization) in a Solaris / Oracle / Java environment. And that can be totally appropriate. In that kind of environment, the discussion of Microsoft toolsets is entirely moot, with the exception of middleware solutions.
But in a Windows environment where you can predict (i.e. by truncating your market) that all of your users will be using Windows XP or Windows Vista, it doesn't make sense to build on Java tools, for instance. And if the workstation software targets only Vista, you should feel free to also take advantage of WPF and WCF, particularly if the WCF components can target a server tier strictly running Longhorn Server. The cost of limiting the market that way is high, and for most people it's prohibitive. But the trade-off is significant; you get a lot of power and flexibility with these toolsets.
By bringing this up I'm not second guessing anyone's common sense, but I just wanted to be sure no one is second-guessing mine in the facet of the discussion where I make mention of the ubiquitous "Windows" or the ubiquitous ".NET" or "ADO and ADO.NET". My point is that the toolset should match the environment, whether OS or API or RDBMS, and try to align itself with the platform's vendor proactively rather than on the basis of necessity, if it is desired for *both* the process and the final product to work together with the environment smoothly. J2ME environments would not benefit much from Linq or ADO.NET vNext, for instance, but I would tend to be willing to be almost a guinea pig for just about anything Sun or RIM throws out there as an ideal new DAL or other API strategy, because they know their own platforms.
Sorry for the tangent.
Re: Crazy idea of mine...
Posted: 13 Aug 06 4:18 PM
Originally posted by Mark Dykun:
Jon, I am trying to understand the statement that you wrote about WCF; however, I believe that WCF (Windows Communication Foundation) uses standards-based connectivity and can be connected to non-Windows endpoints. I think it would be very short-sighted of MS to impose such a restriction.
True, thanks for catching that. By this I meant: WCF uses .NET 3.0, and does more on IIS 7. IIS 7 (which is a component of Vista and of Longhorn Server) adds several features to WCF integration. Namely these include going way beyond HTTP and adding your own network modules for WCF.
Here again, my end of the discussion is that it's not just a matter of interoperability so much as what's the best tool for the task given the platform. It's not that Longhorn is the exclusive target for WCF; rather, if you know all of your users have Longhorn, then you should feel free to use WCF and not try to find some kind of open-source "universal fit" solution that accomplishes the same task. Throwing XP (et al.) into the mix, I'd still recommend WCF, but then you'd have to either tell all your users to install .NET 3.0 (which has a lot of market constraints, as Vista does, if fewer), or else start looking for alternative approaches such as 1.1 remoting or just plain SOAP.
I guess what I was getting at, going back to the DAL discussion, is that there are good tools that work well in many places, and then there are great tools that work (or promise to work) extremely well in a few limited places. And then there are "universal adapters" like Hibernate, which are often very nice and very slick, as is the case with Hibernate. But I for one tend to be prejudiced that the maintainers of any platform know best how to build toolsets for their platform, particularly Microsoft, and that prejudice is not likely to change. There are certainly exceptions (where was Microsoft's NDoc equivalent when VS.NET came out? why do we have to buy Team Edition to get decent unit testing tools from MS?), but I, among a very limited few, am willing to bet the farm on ADO.NET vNext, which will probably have some special features for targeting MS SQL Server 2005 (versus prior versions or third parties), for future planning of ORM solutions, because I am confident of its success and have always been loyal to the Microsoft platforms and toolsets (namely the current Visual Studio, SQL, and Windows technologies... but not Outlook inundated with the load of a CRM package), which have always paid off for me.
But now my bias is showing. (As if it wasn't already...) Really, I'm only speaking for myself. In the "real world," companies are shoving Siebel, Windows Server, Java, Solaris, Linux, DCOM, CORBA, SQL Server, Oracle, and ASP.NET all into the same end-point pot, so you rarely have a choice of a "ubiquitous platform." It's a fantasy world, and I admit it. Even so, I pursue it, and when I see it I am rather adamant when it is not acknowledged.