OLAP vs Column DB

I am evaluating a choice between Oracle OLAP and Pentaho Mondrian.
At the same time, some people say that using a column DB could simply make OLAP engines redundant, as they are much faster.
Does anyone have experience with this?
Will it help if our OLAP engine sits on a column DB?

OLAP is not only about storage; it is about the MDX language as well. MDX is much more powerful than SQL for handling multi-dimensional queries (with hierarchical dimensions).
You can have a look at www.icCube.com - it's pretty fast - I mean no problem aggregating, say, 30 million facts over 20 dimensions in sub-seconds (without any pre-aggregation computation).
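As a sketch of what MDX buys you (all cube, dimension, and measure names here are hypothetical), a query that would need several self-joins and GROUP BY gymnastics in SQL stays compact in MDX:

```mdx
-- Top 5 product categories by sales, with a year-to-date running total,
-- sliced to one quarter -- one statement, no self-joins.
WITH MEMBER [Measures].[YTD Sales] AS
  SUM( YTD([Date].[Calendar].CurrentMember), [Measures].[Sales Amount] )
SELECT
  { [Measures].[Sales Amount], [Measures].[YTD Sales] } ON COLUMNS,
  TOPCOUNT( [Product].[Category].Members,
            5,
            [Measures].[Sales Amount] ) ON ROWS
FROM [Sales]
WHERE ( [Date].[Calendar].[Quarter].&[Q3-2010] )
```

The hierarchy-aware functions (YTD, TOPCOUNT, CurrentMember) are the point: the engine knows the dimension structure, so the query doesn't have to re-derive it.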

Columnar databases are truly designed for analytics. Beye Network published a series of blogs about columnar databases and when to use them. One of them can be found here: http://www.b-eye-network.com/blogs/mcknight/archives/2010/06/the_value_of_pe.php.
In full disclosure, I am the community manager for Infobright, an open-source columnar database company. Pentaho and Infobright work very well together; in fact, there's a virtual machine with both Pentaho and Infobright installed. That VM can be downloaded from www.infobright.org.
Cheers and good luck,
Jeff

Related

Programs (GUI) for creating, designing, and administering SQLite

Which programs do you know of for this purpose? Quick googling reveals two programs:
sqlite-manager (Firefox extension): not all features are available through the GUI, but it is really easy to use and open source.
SQLite Administrator: the screenshots look pretty, but it is Windows-only.
Please tell me your opinion about the programs, not just links. Thanks.
Well, I'm using Navicat Premium. It is not free, but it is a very nice tool when working with multiple database systems, including SQLite. It has many nice features, such as working with multiple DBs from one window, importing/exporting/synchronizing data and schemas across different databases, etc.
There is also Navicat for SQLite only, which I think costs less.
I found this table, and maybe this information will help someone.
Also, this question is a repeat of this one. Just in time, heh.
You can try SQLiteSpy. I found it very useful. Its GUI makes it very easy to explore, analyze, and manipulate SQLite3 databases.

Comparing features in an ASP.NET web application using different database technologies

I have a webstore which sells components (it is an academic project) which looks like this. I have developed the same web application using the following database technologies:
MS SQL Server with stored procedures and SqlDataReader
LINQ to SQL
DB4o using LINQ (client/server)
What features can I compare, apart from the technical and theoretical details, between a relational database and an object-oriented database?
It is the final project for my graduate/master's thesis. I want the features I compare to be practical and interesting, so that I can draw concrete and meaningful conclusions rather than abstract comparisons, which don't create much interest and are hard to draw inferences from.
Please help me.
Feel free to express your opinions.
Thanks in anticipation.
Here is a site that compares DALs; maybe you can get some ideas from what others think you can compare:
http://ormbattle.net/
Also, here is my first question on Stack Overflow, in which I compare four DALs for speed and optimization:
Benchmark Linq2SQL, Subsonic2, Subsonic3 - Any other ideas to make them faster?
"What features can I compare apart..."
In your case, I would try to compare the speed, and whether moving to a DAL gives you the same or more features than you get without it. For example, can you run all the same queries that you can run directly with SQL, and what are the limitations?
Try creating some performance benchmarks and do a side-by-side comparison of the three different DB technologies (these are technologies, not methodologies) for given types of queries.
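A minimal sketch of such a benchmark harness, using Python's built-in sqlite3 as a stand-in back end (the table, data, and query are all made up); the same shape - a timed function per DAL, best of several runs - applies to each of the three technologies under test:

```python
import sqlite3
import time

def benchmark(label, fn, runs=5):
    """Time fn over several runs and keep the best wall-clock result."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    print(f"{label}: {min(times) * 1000:.2f} ms (best of {runs})")
    return min(times)

# In-memory stand-in database with some rows to query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE components (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
conn.executemany(
    "INSERT INTO components (name, price) VALUES (?, ?)",
    [(f"part-{i}", i * 0.5) for i in range(10_000)],
)
conn.commit()

def raw_sql():
    # Analogue of the stored-procedure / SqlDataReader path: raw rows back.
    return conn.execute(
        "SELECT name, price FROM components WHERE price > ?", (2500.0,)
    ).fetchall()

def per_row_objects():
    # Analogue of a DAL that materialises an object per row.
    return [
        {"name": n, "price": p}
        for n, p in conn.execute(
            "SELECT name, price FROM components WHERE price > ?", (2500.0,)
        )
    ]

benchmark("raw rows", raw_sql)
benchmark("object per row", per_row_objects)
```

Running the same query shapes (point lookups, range scans, joins, bulk inserts) through each DAL and tabulating the timings gives exactly the kind of concrete, practical comparison the question asks for.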

Big database and how to proceed

I'm working on a website running ASP.NET with an MS SQL database. I realized that the number of rows in several tables could become very high (possibly something like a hundred million rows). I think this would make the website run very slowly; am I right?
How should I proceed? Should I base it on a multi-database system, so that users are separated into different databases and each database is smaller? Or is there a different, more effective and easier approach?
Thank you for your help.
Oded's comment is a good starting point, and you may be able to just stop there.
Start by indexing properly and only returning relevant result sets. Consider archiving unused (or rarely accessed) data.
However, if that isn't enough, partitioning or sharding is your next step. This is better than a "multi-database" solution because your logical entities remain intact.
Finally, if that doesn't work, you could introduce caching. Jesper Mortensen gives a nice summary of the options that are out there for SQL Server:
SharedCache -- open source, mature.
AppFabric -- from Microsoft, quite mature despite being "late beta".
NCache -- commercial, I don't know much about it.
StateServer and family -- commercial, mature.
Try partitioning the data. This should make each query faster, and the website shouldn't be as slow.
I don't know what kind of data you'll be displaying, but try to give users the option to filter it. As someone has already commented, partitioned data will make everything faster.

Creating Data Access Layer for Small website

I am creating my application in ASP.NET 3.5. I have to build my data access layer, in which I am using the traditional method of fetching/updating data: SqlConnection, then SqlCommand, then SqlDataAdapter.
Is there any other way I can create my DAL easily?
Specifications:
My website is small: approx. 7-10 pages.
The database has around 80 tables.
What I know:
LINQ to SQL: I don't want to use it, because I am not fully familiar with LINQ statements and I need to develop the application really fast [3 days :-(]. Also, there is a 100% chance that the table structure will be altered in the future.
Enterprise Library: it would take too much time for me to integrate it into my application.
Any other suggestions for creating my data layer quick, fast, and "NOT" dirty?
Thanks in advance.
How about using CodeSmith (free version 2.6) to generate a simple set of data access objects from your database? Given the small number of DB objects that you need to model, I think this would be a quick and easy way of achieving your goal given the time constraints.
I would have recommended LINQ to SQL, but since that is a no from you, the only other option I would suggest is strongly typed DataSets and TableAdapters generated by Visual Studio. They are old, but decent enough to work in any modern application.
They are fast to create, they provide type safety, and they are quite flexible for configuration and customization. Since they are generated by Visual Studio, any changes made to the database can be reflected quickly and easily.
Being a LINQ beginner myself, I would recommend taking the plunge and going with LINQ to SQL or Entity Framework. I can't say for certain without knowing your requirements, but there's a good chance that taking the time to learn basic LINQ for this project would speed up development overall.
You may also want to consider SubSonic. It's relatively easy to implement and is fairly intuitive to use. Used it for the first time recently on a small project, and despite some initial configuration problems getting it to work with MySQL, it handled data access pretty well.
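Rolling your own thin layer is also viable at this size. A minimal sketch of the idea in Python with the built-in sqlite3 module (the table is hypothetical; the ADO.NET version would have the same shape with SqlConnection/SqlCommand): one small class that centralises connection handling, parameterised SQL, and row mapping, so future table changes only touch SQL strings:

```python
import sqlite3

DB_PATH = ":memory:"  # assumption: stand-in for the real connection string

class Dal:
    """A deliberately tiny data access layer: one place for connections,
    parameterised SQL, and row-to-dict mapping."""

    def __init__(self, path=DB_PATH):
        self.conn = sqlite3.connect(path)
        self.conn.row_factory = sqlite3.Row  # rows become name-addressable

    def query(self, sql, params=()):
        # All reads go through here, always parameterised.
        return [dict(row) for row in self.conn.execute(sql, params)]

    def execute(self, sql, params=()):
        # All writes go through here; the context manager commits on success.
        with self.conn:
            return self.conn.execute(sql, params).rowcount

dal = Dal()
dal.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT)")
dal.execute("INSERT INTO pages (title) VALUES (?)", ("Home",))
print(dal.query("SELECT id, title FROM pages"))
```

For 7-10 pages this is about as "quick, fast, and not dirty" as it gets: no code generation, no framework to learn, and schema changes are a find-and-replace in one file.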

Writing updates to OLAP cube

What is the easiest way to write user-entered measure values (a sales forecast) to a SQL Server Analysis Services OLAP cube from a .NET client application?
I'm aware that the underlying fact table can be updated with DML statements and that the cube can then be reprocessed, but I'm looking for alternatives.
Regards,
Aleksandar
We use the Ranet OLAP pivot table for editing cube data.
See the samples "Simple PivotTable Widget - PivotTable with Update" and "Writing updates to OLAP cube".
I nearly got into a project like this once. It did not go ahead, which, after looking into the work involved, I was very grateful for. My advice to you is to run away!!!
You do not necessarily have to update the actual cube data or reprocess, though - depending on how complex your user-entered data is going to be. I believe this is covered in Microsoft's standard MDX course, the notes of which you may be able to find online (sorry, I've since disposed of my copy). This depends on whether you want to learn MDX, though, which is not easy.
I think you can use ADOMD.NET to do writeback. You can use an AdomdCommand to wrap UPDATE CUBE statements.
ADOMD.NET:
http://msdn.microsoft.com/en-us/library/ms123483(v=SQL.100).aspx
The link below discusses some of the issues with this approach if you are doing too many updates together:
http://www.developmentnow.com/g/112_2006_1_0_0_677198/writeback-in-ADOMD-NET.htm
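For reference, the statement such a command wraps looks roughly like this (a sketch: the cube, dimension, and measure names are assumptions, and writeback must be enabled on the target partition):

```mdx
-- Write a forecast value into one cell; the allocation clause tells
-- Analysis Services how to spread the value over the leaf cells below it.
UPDATE CUBE [Sales]
SET (
  [Measures].[Sales Forecast],
  [Date].[Calendar].[Month].&[December 2010],
  [Product].[Category].&[Components]
) = 125000
USE_EQUAL_ALLOCATION
```

The statement is sent through AdomdCommand.ExecuteNonQuery like any other MDX; the writeback rows land in a separate writeback table rather than in the fact table itself.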
