Writing updates to OLAP cube - olap

What is the easiest way to write user-entered measure values (a sales forecast) to a SQL Server Analysis Services OLAP cube from a .NET client application?
I'm aware that the underlying fact table can be updated with DML statements and that the cube can then be reprocessed, but I'm looking for alternatives.
Regards,
Aleksandar

We use the Ranet OLAP pivot table for editing cube data.
See the sample: Simple PivotTable Widget - PivotTable with Updateable cells (Writing updates to OLAP cube).

I nearly got into a project like this once. It did not go ahead, which I was very grateful for, after looking into the work involved. My advice to you is to run away!!!
You do not necessarily have to update the actual cube data or reprocess, though - depending on how complex your user-entered data is going to be. I believe this is covered in Microsoft's standard MDX course, the notes of which you may be able to find online (sorry, I've since disposed of my copy). This does depend on whether you want to learn MDX, though, which is not easy.

I think you can use ADOMD.NET to do writeback. You can use an AdomdCommand to wrap UPDATE CUBE statements.
ADOMD .Net
http://msdn.microsoft.com/en-us/library/ms123483(v=SQL.100).aspx
The link below talks about some of the issues with this approach if you are doing too many updates together.
http://www.developmentnow.com/g/112_2006_1_0_0_677198/writeback-in-ADOMD-NET.htm
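
To make that concrete, here is a minimal writeback sketch wrapping an UPDATE CUBE statement in an AdomdCommand. It assumes a measure group with writeback enabled; the connection string, cube, measure, and member names are placeholders, not anything from the original question.

    // Minimal writeback sketch using ADOMD.NET (Microsoft.AnalysisServices.AdomdClient).
    // The cube, measure, and member names below are placeholders and assume the
    // measure group has writeback enabled on the SSAS server.
    using Microsoft.AnalysisServices.AdomdClient;

    class ForecastWriteback
    {
        static void Main()
        {
            using (var conn = new AdomdConnection(
                "Data Source=localhost;Catalog=ForecastDB"))
            {
                conn.Open();

                // UPDATE CUBE writes the value into the writeback partition for the
                // given tuple; USE_EQUAL_ALLOCATION spreads it over leaf members.
                var cmd = new AdomdCommand(
                    @"UPDATE CUBE [Sales Forecast]
                      SET ([Measures].[Forecast Amount],
                           [Date].[Calendar Year].&[2011]) = 10000
                      USE_EQUAL_ALLOCATION", conn);

                cmd.ExecuteNonQuery();

                // Depending on your setup you may also need to commit the session
                // transaction for the change to persist beyond this session.
            }
        }
    }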

Related

What cache strategy do I need in this case ?

I have what I consider to be a fairly simple application. A service returns some data based on another piece of data. A simple example, given a state name, the service returns the capital city.
All the data resides in a SQL Server 2008 database. The majority of this "static" data will rarely change. It will occasionally need to be updated and, when it does, I have no problem restarting the application to refresh the cache, if implemented.
Some data, which is more "dynamic", will be kept in the same database. This data includes contacts, statistics, etc. and will change more frequently (anywhere from hourly to daily to weekly). This data will be linked to the static data above via foreign keys (just like a SQL JOIN).
My question is, what exactly am I trying to implement here, and how do I get started doing it? I know the static data will be cached, but I don't know where to start with that. I tried searching but came up with so much stuff that I'm not sure where to begin. Recommendations for tutorials would also be appreciated.
You don't need to cache anything until you have a performance problem. Only once you have a noticeable problem and have measured your application tiers to determine that your database is in fact the bottleneck (which it rarely is) should you start looking into caching data. Caching is always a tradeoff: memory vs. CPU vs. real-time data availability. There is no reason to make your application more complicated than it needs to be.
An extremely simple 'win' (I assume you're using WCF here) would be to use the declarative attribute-based caching mechanism built into the framework. It's easy to set up and manage, but you need to analyze your usage scenarios to make sure it's applied in the right places to really benefit from it. This article is a good starting point.
Beyond that, I'd recommend looking into one of the many WCF books that deal with higher-level concepts like caching and try to figure out if their implementation patterns are applicable to your design.
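
If you would rather start even simpler than the WCF attribute approach, below is a rough in-memory cache sketch using System.Runtime.Caching (.NET 4). It is not the declarative mechanism mentioned above; LoadCapitalFromDatabase and the expiration window are placeholders you would replace with your own data access and refresh policy.

    // Plain in-memory caching of the rarely-changing "static" data using
    // System.Runtime.Caching. LoadCapitalFromDatabase is a placeholder for
    // whatever query you already have against SQL Server 2008.
    using System;
    using System.Runtime.Caching;

    static class CapitalLookup
    {
        private static readonly MemoryCache Cache = MemoryCache.Default;

        public static string GetCapital(string state)
        {
            var cached = Cache.Get(state) as string;
            if (cached != null)
                return cached;                               // served from memory

            string capital = LoadCapitalFromDatabase(state); // hits the database

            // Static data: cache it for a long window; restarting the app
            // (or calling Cache.Remove) refreshes it when the data changes.
            Cache.Set(state, capital, new CacheItemPolicy
            {
                AbsoluteExpiration = DateTimeOffset.Now.AddHours(12)
            });

            return capital;
        }

        private static string LoadCapitalFromDatabase(string state)
        {
            // Placeholder: replace with your SqlCommand / ORM query.
            throw new NotImplementedException();
        }
    }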

Comparing features in a asp.net web application using different database technologies

I have a webstore which sells components (it is an academic project) which looks like this. I have developed the same web application using the following database technologies:
MS SQL Server with stored procedures and SqlDataReader
LINQ to SQL
db4o using LINQ (client/server)
What features can I compare, apart from the technical and theoretical details, between a relational database and an object-oriented database?
It is my graduate/master's thesis final project. I want the features that I compare to be practical and interesting so that I can draw some concrete and meaningful conclusions, rather than abstract comparisons which don't create much interest and are hard to draw inferences from.
Please help me.
Feel free to express your opinions.
Thanks in anticipation.
PS: Don't downvote or flag this post; if someone doesn't like this question, you may delete it after it has been answered.
Here is a site that compares DALs; maybe you can get some ideas from what others think you can compare.
http://ormbattle.net/
Also, here is my first question on Stack Overflow, in which I compared 4 DALs for speed and optimization.
Benchmark Linq2SQL, Subsonic2, Subsonic3 - Any other ideas to make them faster?
What features can I compare apart
In your case I would try to compare speed, and whether converting to a DAL gives you the same or more features than you get without one. For example, can you run all the same queries that you can write directly in SQL, and what are the limitations?
Try creating some performance benchmarks and do a side-by-side comparison of the three different DB technologies (these are technologies, not methodologies) for given types of queries.
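
As a starting point for those benchmarks, here is a hypothetical timing harness; the three query delegates are stubs you would fill in with the same lookup expressed through each of your three data layers.

    // Hypothetical side-by-side timing harness; the query bodies are stubs to be
    // filled with the same lookup expressed through each DAL under test.
    using System;
    using System.Diagnostics;

    class DalBenchmark
    {
        static void Measure(string name, int iterations, Action query)
        {
            query();                                   // warm-up: JIT + connection pool
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
                query();
            sw.Stop();
            Console.WriteLine("{0}: {1} ms for {2} runs",
                              name, sw.ElapsedMilliseconds, iterations);
        }

        static void Main()
        {
            const int runs = 1000;
            Measure("SqlDataReader + stored procedure", runs, () => { /* ADO.NET query here */ });
            Measure("LINQ to SQL",                      runs, () => { /* L2S query here */ });
            Measure("db4o via LINQ",                    runs, () => { /* db4o query here */ });
        }
    }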

Creating Data Access Layer for Small website

I am creating my application in ASP.NET 3.5. I have to build my data access layer, in which I am using the traditional method of fetching/updating data: SqlConnection, then SqlCommand, then SqlDataAdapter.
Is there any other way I can create my DAL easily?
Specification.
My website is small. Approx 7-10 pages.
Database has around 80 tables.
What I know:
LINQ to SQL - I don't want to use it because I am not fully familiar with LINQ statements and I need to develop the application really fast [3 days :-( ]. Also, there is a 100% chance that the table structure will be altered in the future.
Enterprise Library: It will take too much time for me to integrate it into my application.
Any other suggestion to create my data layer, quick ... fast ... and "NOT" dirty.
Thanks in advance.
How about using CodeSmith (free version 2.6) to generate a simple set of data access objects off your database? Given the small number of DB objects that you need to model, I think this would be a quick and easy way of achieving your goal given the time constraints.
I would have recommended using LINQ to SQL. But since that is a no from you, the only other option I would suggest is strongly typed DataSets and TableAdapters generated by Visual Studio. They are old, but decent enough to work in any modern application.
They are fast to create. They provide type safety. They are quite flexible for configuration and customization. Since they are generated by Visual Studio, any changes made to the database can easily be reflected quickly.
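
For illustration, this is roughly what working with a designer-generated typed DataSet looks like; ShopDataSet, ProductsTableAdapter, and the column names are made-up stand-ins for whatever Visual Studio generates from your own tables.

    // Hypothetical usage of a Visual Studio generated typed DataSet; the
    // ShopDataSet / ProductsTableAdapter names are placeholders for whatever
    // the designer produces from your 80 tables.
    using System;

    class TypedDataSetDemo
    {
        static void Main()
        {
            using (var adapter = new ShopDataSetTableAdapters.ProductsTableAdapter())
            {
                ShopDataSet.ProductsDataTable products = adapter.GetData(); // generated SELECT

                foreach (ShopDataSet.ProductsRow row in products)
                    Console.WriteLine("{0}: {1}", row.ProductId, row.Name);  // typed columns

                products[0].Name = "Renamed product";   // change tracked by the DataTable
                adapter.Update(products);               // generated UPDATE/INSERT/DELETE
            }
        }
    }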
Being a LINQ beginner myself, I would recommend taking the plunge and going with LINQ to SQL or Entity Framework. I can't say for certain without knowing your requirements, but there's a good chance that taking the time to learn basic LINQ for this project would speed up development overall.
You may also want to consider SubSonic. It's relatively easy to implement and is fairly intuitive to use. Used it for the first time recently on a small project, and despite some initial configuration problems getting it to work with MySQL, it handled data access pretty well.

ADO.NET Entity Framework or ADO.NET

I'm starting a new project based on ASP.NET and Windows server.
The application is planned to be pretty big and to serve a large number of clients pulling and updating frequently changing data.
I have previously created projects with Linq-To-Sql or with Ado.Net.
My plan for this project is to use VS2010 and the new EF4 framework.
It would be great to hear other programmers' opinions about development with Entity Framework.
Pros and cons from previous experience?
Do you think EF4 is ready for production?
Should I take the risk or just stick with plain old good ADO.NET?
Whether EF4 is really ready for production is a bit hard to say, since it has not officially been released yet... but all the preliminary experiences and reports about it seem to indicate it's quite good.
However: you need to take into consideration what EF is trying to solve; it's a two-layer approach, one layer maps to your physical storage schema in your database (and supports multiple backends), and the second layer is your conceptual model you program against. And of course, there's the need for a mapping between those two layers.
So EF4 is great if you have a large number of tables, if you have multiple backends to support, if you need to be able to map a physical schema to a different conceptual schema, and so forth. It's great for complex enterprise level applications.
BUT that comes at a cost - those extra layers do have an impact on performance, complexity, maintainability. If you need those features, you'll be happy to pay that price, no question. But do you need that??
Sure, you could go back to straight ADO.NET - but do you really want to fiddle around with DataTables, DataRows, and untyped Row["RowName"] constructs again?? REALLY???
So my recommendation would be this:
if you need only SQL Server as your backend
if you have a fairly simple and straightforward mapping of one database table to one entity object in your model
then: use Linq-to-SQL ! Why not?? It's still totally supported by Microsoft in .NET 4 - heck, they even did bugfixes and added a few bits and pieces - it's fast, it's efficient, it's lean and mean - so why not??
My advice is to use both. At first I thought I would only use LINQ to SQL and never have to touch ADO.NET ever again (which made me happy, lol).
Now I am using both, because there are some things LINQ to SQL (and any ORM, like EF) can't do. I had to do some mass inserts, and I did it first with LINQ to SQL; to do 500 records it took over 6 minutes (2 minutes for validation rules, the rest was inserting into the db).
I changed it to SqlBulkCopy and now it is down to 2 minutes and 4 seconds (4 seconds to do all the inserts).
But like marc_s said, I really did not want to fiddle around with DataTables, DataRows, and untyped Row["RowName"].
Say my table was 10 columns long and called TableA. What I did was use LINQ to SQL to make a TableA class (new TableA()) object and populate it with data. I then passed this object to a method that created the DataRow.
So LINQ to SQL saved me some time, because I would not have wanted to pass 10 parameters into the method that makes the DataRow and would probably have written a class anyway. I also feel it gives a bit of type safety back, as you have to pass in the right object to use that method, so there is less chance of passing in the wrong data.
Finally, you can still use LINQ to SQL to call stored procedures, and that is like one line of code.
So I would use both. Whenever you notice that LINQ to SQL (or in your case EF) is slow, just write a stored procedure and call it through EF. If you need to do straight ADO.NET, evaluate what you need to do; maybe you can use EF for most of the code (so you can at least work with objects) and drop to ADO.NET only for that small portion, sort of what I did with SqlBulkCopy.
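
A rough sketch of the bulk-insert approach described above; TableA and its two columns are placeholders for whatever your LINQ to SQL entity actually looks like.

    // Sketch of the described approach: populate LINQ to SQL style entities,
    // copy them into a DataTable, then push them with SqlBulkCopy in one go.
    // TableA and its columns are placeholders.
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    class TableA
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    static class BulkInsertDemo
    {
        static void BulkInsert(IEnumerable<TableA> rows, string connectionString)
        {
            var table = new DataTable("TableA");
            table.Columns.Add("Id", typeof(int));
            table.Columns.Add("Name", typeof(string));

            foreach (var item in rows)
                table.Rows.Add(item.Id, item.Name);     // typed entity -> DataRow

            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = "dbo.TableA";
                bulk.WriteToServer(table);              // one bulk operation instead of N inserts
            }
        }
    }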
EF4 is now more similar to LINQ to SQL, in the good ways; it has the FK keys right in the object, has Add methods right on the object sets, and a lot of other nice features. The designer is much improved, and the major plus is that it works with SQL Server and Oracle, and maybe some others (as long as the provider supports it), whereas LINQ to SQL works only with SQL Server.
EF is the future; ADO.NET Data Services is a web service add-on, plus EF supports POCO and T4 generation, and any new features will target it (LINQ to SQL is in maintenance mode, and DataSets won't be getting any more changes).
HTH.

existing application, can I just start using linq-to-sql? any tips on integration?

I have an existing web app that has a data layer and a bll that calls the data layer. The data layer is ado.net that calls stored procedures.
I created another project in vs.net for linq-to-sql, dragged all my tables over.
Would it be wise to just start using LINQ, or should I spend the time and rewrite all the DB logic in LINQ just so I don't have any issues with having two data layers?
If it ain't broke, don't fix it.
Why would you want to completely rewrite your perfectly working data layer? ADO.NET + stored procedures is a great choice. Keep it. At the same time, you can start playing with LINQ.
In any case, you will need some practice with LINQ to see what it can and cannot do before you will be able to decide on a new data layer architecture. There are some situations that LINQ cannot handle right out of the box, so you will need to use tricks or substitute the default implementation with your own queries. At the end of the day you may decide it was not worth it.
My suggestion is to gain some experience with it separately and not start rewriting everything completely just because LINQ is cool.
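
If you do want to start playing with it on the side, a LINQ to SQL query against the classes you dragged onto the designer looks roughly like this; NorthwindDataContext and Customers are placeholder names for your own generated context and tables.

    // Small illustration of querying through the generated LINQ to SQL classes
    // alongside the existing ADO.NET data layer. NorthwindDataContext and the
    // Customers table are placeholder names for your own generated context.
    using System;
    using System.Linq;

    class LinqPlayground
    {
        static void Main()
        {
            using (var db = new NorthwindDataContext())
            {
                var londonCustomers =
                    from c in db.Customers
                    where c.City == "London"
                    orderby c.CompanyName
                    select new { c.CustomerID, c.CompanyName };

                foreach (var c in londonCustomers)
                    Console.WriteLine("{0} - {1}", c.CustomerID, c.CompanyName);
            }
        }
    }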
Unless your current data layer is broken for some reason, don't just start implementing a new one, just because you can.
Although if the data layer currently consists of stored procedures and that becomes cumbersome to maintain, switching to L2S (or any other O/RM for that matter) might be a valid reason. Just don't think it'll only be a matter of dragging some tables onto a canvas and being done. Depending on whether there's any logic in the sprocs, that logic has to live somewhere...
I'd say that until you can justify the cost of switching your data layer entirely, stick with your current implementation.
Please be clear: there is a major difference between Linq and LinqToSql. Linq is great and you should be using it if at all possible. LinqToSql is not great and has many problems:
Do not use the Visual Studio 2008 LinqToSql O/R Designer
The drawbacks of adopting Linq To Sql
To use Linq against your database, you need an ORM of some sort. You have many options for ORMs in the .NET world. If you like what LinqToSql offers, you may be most comfortable using SubSonic. In the long run, NHibernate is the best choice for a .NET ORM right now. I wrote a lot more on choosing a .NET ORM here:
.NET and ORM - Decisions, decisions
In the end, there is no reason you can't have two or more different data layer technologies in the same application. There are good reasons not to do this however and so it should be avoided if at all possible.
Also, here's a compelling write-up against using stored procedures:
Stored procedures are bad, m'kay?
