Reflection in SQL Server 2008?

I want to know: is there any reflection support in SQL Server 2008, similar to the way C# supports reflection? Basically I am curious about how SQL Server implements clauses such as WHERE, ORDER BY, and EXISTS, and how it does all of this behind the scenes.

Not that long ago, if you compared SQL Server to most object-oriented languages, you would have been struck by how much more SQL Server reveals about its inner workings than they do.
It's inherent to the concept of SQL and transactional databases that a lot of the information about how they work is stored in the database itself. Every table, for example, is represented by a row in a system catalog, as is every column, stored procedure, and so on.
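As a minimal sketch (the connection string and database name are placeholders for your own environment), you can read that catalog from .NET just like any other data:

    // Minimal sketch: reading SQL Server's own metadata through the catalog views.
    // The connection string and database are placeholders for your environment.
    using System;
    using System.Data.SqlClient;

    class CatalogDemo
    {
        static void Main()
        {
            const string connectionString =
                "Server=.;Database=AdventureWorks;Integrated Security=true";

            // sys.tables and sys.columns are ordinary queryable views, which is
            // about as close as SQL Server gets to "reflection".
            const string sql = @"
                SELECT t.name AS TableName, c.name AS ColumnName, ty.name AS TypeName
                FROM sys.tables t
                JOIN sys.columns c ON c.object_id = t.object_id
                JOIN sys.types ty ON ty.user_type_id = c.user_type_id
                ORDER BY t.name, c.column_id;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}.{1} ({2})",
                            reader.GetString(0), reader.GetString(1), reader.GetString(2));
                    }
                }
            }
        }
    }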
These days, however, SQL Server does not go as far as C# in this regard, and you may be struck by the opposite conclusion.
An analogy could be drawn to the way that, when you are browsing through reflected information on classes, you will hit "atoms" in the Democritean sense of something that can't be broken down any further. Either it'll be handled by the core IL instructions, or it'll be defined externally, and either way you can't see any further "into" the implementation. SQL Server has more of these features that you can't peer into to see how they work than .NET does.
You might enjoy taking a look at PostgreSQL, which goes a bit further in how visible many of its functions are.

Related

Sharing stored procedures across multiple apps

Team A has an enterprise app that uses ADO.NET for data access and executes stored procedures. The data access is encapsulated in its own project (let's call it DAL.dll).
Team B is creating another unrelated app that's reusing the stored procedures in the enterprise app. This app is currently using the MS application block for data access. The issue we run into is that whenever Team A makes any change to the input/output params in the stored procedures, there is a runtime error in Team B's app, and that app needs to be updated to accommodate the additional params (or params that were removed). Most of these changes go unnoticed until a user complains. At the very least, we would like the app to throw a compilation error so that the build process warns us of the changes made.
One way to do this is to have Team B's project add a reference to the DAL.dll
I'd like to know if there are any other cleaner ways of solving the issue. We are ready to replace Team B's MS Data application block to use a different technology (Entity Framework?) if necessary.
In addition to the other answers, I'd strongly suggest getting those stored procedures into source control, in a Database Project. You can then use the features of your source control system to do several things:
Lock some of the code so that it cannot be changed
Give you notifications if the code is changed
Warn you if the stored procedures change in a way that would prevent them from being called
Branch the stored procedures so that each team can have their own version of changed code, while keeping the unchanged stored procedures common. You of course will need to separate the different versions in the database.
I agree with the other posters on this thread that you should not share stored procedures across different .NET DLLs; that is just a recipe for disaster. I would also shy away from ORMs like Entity Framework if you are doing anything at all complicated with your database schema, because ORMs excel at translating a simple object model from your .NET application classes into SQL tables and SPs, but traditionally do poorly at optimizing them for performance on the database side. There will be people who claim otherwise, and they may have a valid point if they are experts at wrangling an ORM to do what they want, but chances are you are not, and it will cause you headaches in the long run.
A shared data access layer might work, but conceptually you are then just changing the implementation of the dependency from some code that a DBA wrote to some code that a .NET programmer wrote. Yes, you can use integration tests to achieve better verifiability, but the same case could be made for SQL with tools like Red Gate's SQL Test. I would shy away from this approach if the two applications are already experiencing some sort of pain from sharing SPs. That is an indication that the dependency should just be done away with.
If it were up to me, I'd just make a new schema for Team B's app. You can read more about schemas in SQL Server here: MSDN Schema description for 2008 R2. You can think of them as namespaces for SQL Server, but with some additional bells and whistles like permissions and access control. Separating your different applications into separate schemas on the same shared database will probably make for the most flexible implementation in the long run.
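As a rough sketch of what that separation looks like (the schema name, table, database user, and connection string below are made up, and this assumes you have rights to run DDL):

    // Sketch: carving out a separate schema for Team B and limiting its database
    // user to that schema. The schema, table, user, and connection string are
    // made-up names, and this assumes you have rights to run DDL.
    using System.Data.SqlClient;

    class SchemaSetup
    {
        static void Main()
        {
            const string connectionString =
                "Server=.;Database=SharedDb;Integrated Security=true";

            string[] statements =
            {
                // Team B's objects get their own namespace...
                "CREATE SCHEMA TeamB AUTHORIZATION dbo;",
                "CREATE TABLE TeamB.AuditLog (Id int IDENTITY PRIMARY KEY, Message nvarchar(400));",
                // ...and Team B's (hypothetical) database user only gets rights on that schema.
                "GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::TeamB TO TeamBAppUser;"
            };

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                foreach (string sql in statements)
                {
                    using (var command = new SqlCommand(sql, connection))
                    {
                        command.ExecuteNonQuery();
                    }
                }
            }
        }
    }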
unrelated app that's reusing the stored procedures in the enterprise app
If these two applications are really unrelated, why are they sharing stored procedures, or even the same database? I know this is a long read, but I recommend you read this: A Better Path to Enterprise Architectures
The partitioning concept in there relates to the bounded context in Domain-Driven Design:
Multiple models are in play on any large project. Yet when code based on distinct models is combined, software becomes buggy, unreliable, and difficult to understand. Communication among team members becomes confusing. It is often unclear in what context a model should not be applied.
Therefore: Explicitly define the context within which a model applies. Explicitly set boundaries in terms of team organization, usage within specific parts of the application, and physical manifestations such as code bases and database schemas. Keep the model strictly consistent within these bounds, but don’t be distracted or confused by issues outside.
It is to be expected that you end up with problems when you don't explicitly deal with this. You're lucky you're seeing early failures, as this can turn into problems that are much harder to find in the long run.
Analyze the problem again with the above in mind. Consider whether you're missing some explicit context where this common functionality should live.
My question is: which team owns the stored procedures and the shared database? As a matter of good architecture/design, you usually should not have two different apps sharing the same database/procedures.
A better way to share data/functionality between two different applications is through a service or API, so that the team that owns the functionality is responsible for maintaining it.
Also, good communication between both teams is highly recommended.
Depending on the owner of the DAL project, you could host web services and share the API. That way, you separate the Data Access Layer from the business logic, which allows anyone to use the same DAL without having to publish it to each different location.
From my point of view, it looks like both Team A and Team B should share the same core model and look at Multitier architecture as a possible solution.
It sounds like it would make sense to create a shared DAL that both applications can share.
I would add unit tests (or really integration tests) to make sure the DAL remains compatible with the apps after changes. That way your tests will fail if incompatible changes have been made.
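One possible shape for such a test (a sketch only: the procedure name, parameter list, and connection string are invented, and the assertion style is up to your test framework):

    // Integration-test sketch: fail fast if a stored procedure's parameter list
    // drifts from what the app expects. The procedure and parameters are examples.
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Linq;

    class StoredProcContractCheck
    {
        static void Main()
        {
            const string connectionString =
                "Server=.;Database=SharedDb;Integrated Security=true";
            // DeriveParameters always reports @RETURN_VALUE first.
            string[] expected = { "@RETURN_VALUE", "@CustomerId", "@FromDate" };

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.GetCustomerOrders", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                connection.Open();

                // Ask SQL Server for the procedure's actual parameter list.
                SqlCommandBuilder.DeriveParameters(command);

                string[] actual = command.Parameters
                    .Cast<SqlParameter>()
                    .Select(p => p.ParameterName)
                    .ToArray();

                if (!expected.SequenceEqual(actual))
                {
                    throw new Exception("Stored procedure contract broken. Expected: "
                        + string.Join(", ", expected) + " Actual: " + string.Join(", ", actual));
                }

                Console.WriteLine("Stored procedure contract OK.");
            }
        }
    }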
"I'd like to know if there are any other cleaner ways of solving the issue."
The cleanest way is for Team B to sit down with Team A and encapsulate the relevant business logic into a shared API. It doesn't matter so much how you implement that API; what does matter is that the API's interface is documented and versioned so everyone knows what to expect.
One reasonable mechanism for this in a .NET environment is to use Microsoft's WebAPI.
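A minimal sketch of what that could look like with Web API attribute routing; the types, route, and data here are purely illustrative, not an existing contract:

    // Sketch of wrapping the shared logic behind a small, versioned Web API.
    // Types, route, and data are illustrative only, not an existing contract.
    using System.Collections.Generic;
    using System.Web.Http;

    public class Order
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
    }

    // Putting the version in the route makes breaking changes explicit:
    // Team A can ship api/v2/... while Team B keeps calling api/v1/...
    [RoutePrefix("api/v1/orders")]
    public class OrdersV1Controller : ApiController
    {
        [HttpGet, Route("{customerId:int}")]
        public IEnumerable<Order> GetForCustomer(int customerId)
        {
            // Internally this would call Team A's DAL / stored procedures;
            // callers depend only on the documented, versioned contract above.
            return new[] { new Order { Id = 1, Total = 99.50m } };
        }
    }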
In short, the question of "how do we share a stored procedure?" is most likely looking at the wrong level of abstraction.

Should I consider migrating from SQL Server to Oracle for my ASP.NET applications?

We're upgrading our systems to support clustering and auto failover features. Our business runs .NET 4 applications, web apps and services on SQL Server Express. We can upgrade to SQL Server Standard, but the cost has motivated us to consider other options. Is it a legitimate option to integrate our .NET data layer with ODP.NET? After searching, I have seen a tendentious statement or two in the negative (viz) and yet it would seem that people are doing it anyway. What development features in the Visual Studio IDE will we lose? Thanks for your help!
Well, I've now been working with Oracle and MS SQL Server for more than 20 years, and have done a lot of projects. Some of those projects have been running for more than 10 years, with all the updates, maintenance and so on.
My quick answer is: stay with MS SQL Server. Go to Oracle only if you have a really GOOD TECHNICAL reason, or if you are planning a truly ENORMOUS database, and only if you have enough staff to handle all the administration.
The main reason is that SQL Server is much easier to maintain, and it also integrates very well into the Microsoft environment.
Oracle, in contrast, has a steep learning curve. The handling of Oracle is much more "manual" than MS SQL Server. That is also a good thing, because you are in control of every small detail, but it also means that you need to learn a lot, or you need to pay experts. And it is not so easy to find people who really know what they are doing.
I really like both systems, but as a rule of thumb, I normally suggest using MS SQL Server.
I've been using .NET with Oracle for years, and migrate away from it whenever the option is available.
If all your database code is in stored procs, you call it through the code-behind or a library, and you use ANSI SQL, your migration from MS SQL to Oracle will be fairly painless.
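For example, keeping the calls behind the generic ADO.NET interfaces makes the provider a configuration detail. This is only a sketch: the provider string, procedure, and parameter are examples, and for the Oracle case ODP.NET would need to be installed and registered:

    // Sketch: calling a stored procedure through the generic ADO.NET interfaces,
    // so switching providers is mostly a config change. The provider string,
    // procedure, and parameter are examples; the Oracle case needs ODP.NET installed.
    using System;
    using System.Data;
    using System.Data.Common;

    class ProviderAgnosticCall
    {
        static void Main()
        {
            // "System.Data.SqlClient" today, "Oracle.DataAccess.Client" after a migration.
            string providerName = "System.Data.SqlClient";
            string connectionString = "Server=.;Database=AppDb;Integrated Security=true";

            DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

            using (DbConnection connection = factory.CreateConnection())
            {
                connection.ConnectionString = connectionString;

                using (DbCommand command = connection.CreateCommand())
                {
                    command.CommandType = CommandType.StoredProcedure;
                    command.CommandText = "GetActiveCustomers";

                    DbParameter region = command.CreateParameter();
                    // Note: the parameter prefix convention ("@" vs ":") is one of
                    // the provider differences you still have to manage yourself.
                    region.ParameterName = "@RegionId";
                    region.Value = 42;
                    command.Parameters.Add(region);

                    connection.Open();
                    using (IDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            Console.WriteLine(reader[0]);
                        }
                    }
                }
            }
        }
    }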
If you use TableAdapters, they rewrite any SQL you put in into the old-school Oracle 8 join syntax (table1, table2, table3 with a big WHERE clause to do the join conditions). There are also some weird bugs where SQL that runs fine in SQL Developer won't work in the TableAdapters.
If you use Entity Framework, migration should be pretty easy, but the MS SQL driver is much better than the Oracle one. There have been several queries I couldn't run through EF on Oracle because of various errors in the current driver.
If you need more info let me know.
Also, if cost is the main reason to consider migration, why not go with MySQL?
Since you are already working with MS SQL, you must be used to the way it works, whether through Entity Framework or any other data access approach. Yes, of course, Microsoft charges very high license fees for it. But if you want to move to another database, that is perfectly all right. I have personally used both MS SQL and MySQL. Initially you might face some syntax-related issues, but remember that the logic for fetching and saving data remains the same. A further benefit is that you get to learn a new system, and at far less cost.

Entity Framework 4.0 Scaling and Security

I want to use an ORM, and have been looking at EF 4. Is this platform scalable? I see a lot of stuff on the web, but everything looks very biased one way or the other. Does anyone know of benchmarks or non-subjective information?
On that point, does EF prevent SQL injection or XSS? I know that it uses parameterized queries, but is that enough?
Any help is appreciated.
Okay, so I see two questions here.
Is EF Scalable
Very difficult (and subjective) to answer, but IMO yes.
Here's a few reasons why:
Utilizes a common querying language (LINQ)
Allows for multiple providers (SqlServer, Oracle, etc)
Allows bi-directional mapping (code first, model first, database first)
Includes "classic ADO.NET" support (stored procedures, Entity-SQL)
The main benefit for scalability is how the framework is built on LINQ to Entities. When you write queries, you are not writing against SQL Server or Oracle, you are writing against the model. Depending on what provider you have set up (in web.config), EF will translate these model queries into the appropriate T-SQL (or P-SQL).
Therefore (theoretically), you could write code against SQL Server, then change the web.config provider to Oracle, and your code should work. Obviously this isn't the case for Entity-SQL though (as you are writing T-SQL, not LINQ).
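As a sketch of what "writing against the model" looks like (the Post entity here is hypothetical, and the IQueryable parameter stands in for the Posts set that a generated EF 4 context would expose):

    // Sketch: a query written against the model rather than a specific database.
    // The Post entity is hypothetical; "posts" stands in for the Posts set that a
    // generated EF 4 ObjectContext would expose.
    using System;
    using System.Linq;

    public class Post
    {
        public string Title { get; set; }
        public DateTime PublishedOn { get; set; }
    }

    public static class RecentPostsQuery
    {
        public static IQueryable<Post> LastWeek(IQueryable<Post> posts)
        {
            // Evaluated locally and sent to the database as a parameter.
            DateTime cutoff = DateTime.Today.AddDays(-7);

            // The configured provider decides whether this becomes T-SQL or P-SQL.
            return posts
                .Where(p => p.PublishedOn >= cutoff)
                .OrderByDescending(p => p.PublishedOn);
        }
    }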
Does EF prevent SQL injection or XSS
No ORM tool can really "prevent" SQL Injection attacks - they can only provide the developer with the tools to prevent it.
As with classic ADO.NET where you use parameterized queries, Entity Framework has Entity SQL, which allows you to execute pre-generated SQL, stored procedures, etc.
In this scenario, you need to use parameterized queries to prevent SQL injection. For most EF work, you will be writing queries with LINQ, which is a lot safer because the query passes through a lot of stages before it becomes SQL.
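The classic ADO.NET contrast looks like this (a sketch; the table and column are examples, and the caller is assumed to have opened the connection):

    // Sketch of the unsafe/safe contrast, using classic ADO.NET. The table and
    // column names are examples; the caller is assumed to have opened the connection.
    using System.Data.SqlClient;

    static class ParameterizedQueryDemo
    {
        public static int CountUsers(SqlConnection connection, string userName)
        {
            // UNSAFE: concatenation lets input like "'; DROP TABLE Users; --" become SQL.
            // string sql = "SELECT COUNT(*) FROM Users WHERE Name = '" + userName + "'";

            // SAFE: the value travels as a typed parameter, never as SQL text.
            const string sql = "SELECT COUNT(*) FROM Users WHERE Name = @name";
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@name", userName);
                return (int)command.ExecuteScalar();
            }
        }
    }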
XSS is exploited on the client side via things like injected JavaScript, dodgy emails, etc. It has nothing to do with Entity Framework. XSS is prevented when rendering output, with things like HTML encoding.
No. ORMs are not a panacea for scalability. There is such a thing as the impedance mismatch between objects and databases, which has been around for many years. ORMs try to solve this by providing magic code generation/mapping solutions that give the appearance of just working with objects.
In a multi-tier environment with many client programs and a single/many server scenario, for every change that has to be committed to the database, checks need to be performed to make sure that you're not overwriting someone else's change, or trying to update data that has been removed. This is not a new problem introduced by ORMs, but one that has appeared many times throughout the history of updating databases in N-tier environments, and ORMs do not solve it. In some cases, if the ORM is the single entry point to the database, the ORM becomes a bottleneck. Building a scalable architecture around an ORM therefore becomes problematic: if the update-anomaly checks are performed in the ORM, they can be bypassed in an N-tier solution where you have duplicate ORM tiers.
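For example, the usual guard against overwriting someone else's change is an optimistic concurrency check. A rough sketch with a rowversion column (the table, columns, and method here are invented for illustration):

    // Sketch of that check with a rowversion column and plain ADO.NET; the table,
    // columns, and method are invented for illustration.
    using System;
    using System.Data.SqlClient;

    static class OptimisticUpdate
    {
        // originalVersion is the rowversion value read when the row was loaded.
        public static void UpdatePrice(SqlConnection connection, int productId,
                                       decimal newPrice, byte[] originalVersion)
        {
            const string sql = @"
                UPDATE Products
                SET Price = @price
                WHERE Id = @id AND RowVersion = @originalVersion;";

            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@price", newPrice);
                command.Parameters.AddWithValue("@id", productId);
                command.Parameters.AddWithValue("@originalVersion", originalVersion);

                // Zero rows affected means another client changed or deleted the row
                // after we read it - the classic N-tier update anomaly.
                if (command.ExecuteNonQuery() == 0)
                {
                    throw new InvalidOperationException(
                        "Concurrency conflict: reload the row and retry.");
                }
            }
        }
    }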
This is why we use stored procedures. But stored procedures naturally obscure the underlying data structures of the database, which increases the impedance mismatch between objects and database entities. On the other hand, by using stored procedures and relying on table/row locking, some of the update scenarios are solved, as we shift the bottleneck to the performance of the underlying database design.
So what's the answer? Don't use objects for databases. Objects are great for analysis, but bad for code design when interacting with RDBMS databases.
If you really think SQL and RDBMS data solutions are a problem, which in some scenarios they are, take a look at some of the NoSQL solutions out there. They are still not a panacea for all problems, but in some cases they provide a better solution than a straight SQL one.
Objects are not the answer to all problems. Step back from your code, take a look at what you're trying to do, and think about whether an object is the right approach.
As for security: no, ORMs do not aid security, although they do help prevent some forms of injection attacks.

Linq vs Stored Procedure

Which one is preferable for Enterprise CMS development:
LINQ or SP?
Generally what I do is LINQ to views and LINQ to stored procedures. It's not really a question of which is preferred: LINQ solves how to manage the data once it has been queried, whereas stored procedures run on the SQL side and handle query manipulation (or, for me, mostly saving) of data, which avoids doing that work in application code, where it is slower.
I would say you want to use both where necessary. Are you saving entities that require multiple table saves as one entity? If so, use stored procedures with LINQ. If your entities map 1-to-1 to your tables, then just use LINQ.
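To make "stored procedures with LINQ" concrete, here is roughly the shape of what the LINQ to SQL designer generates when you map a procedure onto the DataContext; the entity, procedure name, and parameter are made up:

    // Sketch of "LINQ to stored procedures": roughly what the LINQ to SQL designer
    // generates when you map a procedure. The entity, procedure, and parameter are made up.
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Reflection;

    [Table(Name = "dbo.Pages")]
    public class Page
    {
        [Column(IsPrimaryKey = true)] public int Id { get; set; }
        [Column] public string Title { get; set; }
    }

    public class CmsDataContext : DataContext
    {
        public CmsDataContext(string connectionString) : base(connectionString) { }

        public Table<Page> Pages { get { return GetTable<Page>(); } }

        // The stored procedure mapped as an ordinary method: LINQ handles the
        // result materialization, the SP owns the SQL-side query/save logic.
        [Function(Name = "dbo.GetPublishedPages")]
        public ISingleResult<Page> GetPublishedPages([Parameter(DbType = "Int")] int siteId)
        {
            IExecuteResult result = ExecuteMethodCall(
                this, (MethodInfo)MethodInfo.GetCurrentMethod(), siteId);
            return (ISingleResult<Page>)result.ReturnValue;
        }
    }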
Stored Procedures can be used with Linq2Sql (and Entity Framework), so it isn't a choice of one or another.
I would cache the results from the database for a CMS, as you are likely to get the same data requested over and over again (cache the dataset, use page caching, or cache the objects if using LINQ).
Then it doesn't matter if you use LINQ or an SP, but I would just use LINQ.
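A small sketch of object caching along those lines, using System.Runtime.Caching (the cache key, TTL, and the repository in the usage comment are hypothetical):

    // Sketch of object caching with System.Runtime.Caching; the cache key, TTL,
    // and the repository in the usage comment are hypothetical.
    using System;
    using System.Runtime.Caching;

    public static class CmsCache
    {
        private static readonly MemoryCache Cache = MemoryCache.Default;

        public static T GetOrLoad<T>(string key, Func<T> load, TimeSpan ttl) where T : class
        {
            var cached = Cache.Get(key) as T;
            if (cached != null)
            {
                return cached;
            }

            // Cache miss: hit the database (LINQ query or stored procedure, it makes
            // no difference here) and keep the result for a short while.
            T value = load();
            Cache.Set(key, value, DateTimeOffset.UtcNow.Add(ttl));
            return value;
        }
    }

    // Usage (hypothetical repository):
    // var page = CmsCache.GetOrLoad("page:home", () => repository.LoadPage("home"), TimeSpan.FromMinutes(5));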
For simple CRUD table operations (no joins!) LINQ to SQL is fine; however, for anything more complex (needing joins) I always use stored procedures (you can use LINQ to stored procedures if you wish).
There are numerous debates around this on this site and others. For me, the pro-LINQ camp is largely made up of people who have come into programming more recently and have not had a history of using stored procedures, i.e. have not been heavily involved in the database side of previous projects.
From my experience of working on several projects using pure LINQ, stored procedures, and a mixture of both, these are the two reasons I would stick to LINQ for basic CRUD and stored procedures for anything more complex or performance-critical.
1 - Deployment/Security - Anyone who has worked in the real world knows full well that having the database logic separated into stored procedures, rather than incorporated into the source code and the released DLL, is a massive advantage. You can add a proper security/access layer around each query using roles and SQL Server security, which is imperative for any serious enterprise-level company, and you can also change the SQL of any stored procedure without doing a new release of the main application (DLL). I don't care how good you claim to be: we have all had to fix live issues and performance bottlenecks through stored procedures, and having to do that with a new application release would have been a nightmare.
2 - Performance/Code Smells - I have seen so many applications littered with huge amounts of badly written and inefficient LINQ. Developers get lazy with LINQ: little hidden, lazily evaluated LINQ to SQL queries that make it a nightmare to debug performance issues on an enterprise-level system - the motto 'get it done as quickly as possible' seems prevalent. I have seen more spaghetti code since the advent of LINQ than with any class library/pattern Microsoft has released since COM.

asp.net (mvc) and mysql, what am I getting in to here?

I'm building an ASP.NET MVC app, and I want to know the ramifications of switching from SQL Server 2008 to MySQL.
Apart from some syntax tweaks, what other things should I be taking into consideration (technically speaking, of course) if I want to move over to MySQL?
convert sprocs to inline queries
transactions and locking may be handled differently
others?
There are some differences with how the two treat some kinds of locking and concurrency, etc. but for 95% of web applications those kinds of issues simply never come into play. If you're doing standard CRUD, maybe some transactions, executing a few stored procedures? No difference to speak of except the syntax, a good reference to which can be found here.
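As one example of the kind of syntax difference involved (table and column names are invented), the same "latest ten rows" query differs only in the paging clause:

    // Sketch of the kind of syntax tweak involved: the same "latest ten rows"
    // query on SQL Server vs MySQL. Table and column names are invented.
    static class PagingSyntax
    {
        // SQL Server uses TOP...
        public const string SqlServerQuery =
            "SELECT TOP 10 Id, Title FROM Posts ORDER BY CreatedOn DESC";

        // ...while MySQL uses LIMIT.
        public const string MySqlQuery =
            "SELECT Id, Title FROM Posts ORDER BY CreatedOn DESC LIMIT 10";
    }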
I really recommend checking out DbLinq, which is based on LINQ to SQL but supports lots of different SQL databases. It gets us much closer to making applications truly db-agnostic - you can swap out the SQL Server provider for MySQL, PostgreSQL, Oracle, Firebird, SQLite, Ingres - and all the LINQ expressions stay exactly the same. No need to tweak any queries.
