It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center.
Closed 11 years ago.
I would like to know the best practices for building predictive modeling solutions organically.
Some of the questions I have are:
If I have multiple R model files, what are efficient ways of storing them?
Save as .RData files on the file system
Serialize to a database as binary objects
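The two storage options above follow the same pattern in any language. The question concerns R (where the equivalents would be `saveRDS()`/`readRDS()` and a BLOB column), but here is a minimal Python sketch for concreteness; the `model` dict is a hypothetical stand-in for a real fitted model.

```python
import os
import pickle
import sqlite3
import tempfile

model = {"coefficients": [0.5, -1.2], "intercept": 3.0}  # hypothetical model object

# Option 1: serialize to a file on the file system.
path = os.path.join(tempfile.mkdtemp(), "model.pkl")
with open(path, "wb") as f:
    pickle.dump(model, f)

# Option 2: serialize to a database as a binary object (BLOB).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE models (name TEXT PRIMARY KEY, blob BLOB)")
conn.execute("INSERT INTO models VALUES (?, ?)", ("my_model", pickle.dumps(model)))

# Both options round-trip to the same object.
with open(path, "rb") as f:
    from_file = pickle.load(f)
blob, = conn.execute("SELECT blob FROM models WHERE name = 'my_model'").fetchone()
from_db = pickle.loads(blob)
print(from_file == from_db == model)  # True
```

The database option adds query-ability (versioning, metadata columns) at the cost of an extra dependency; the file-system option is simpler and plays well with ordinary backups.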
Since the data is processed into an interim, model-specific format, is it helpful to use standards such as PMML?
Also, should one consider practices such as MVC? (I'm not a trained software developer, so any insights into such development practices would be very helpful.)
I apologize for the open-ended nature of this question. I wish to understand even simple things such as a recommended folder structure for data staging, the model store, the scripts collection, and other elements of a data mining solution.
I would be very grateful to members of the community for sharing their experiences and recommendations.
Thank you for your time.
Closed 9 years ago.
I've heard from many people that R is built for processing petabytes of data; on the other hand, I also hear very often that if you want to process, for example, 8 GB of data, you had better have at least 8 GB of memory, otherwise you'll face problems.
My question is: if I need to process something like 20 GB of data (which I think is fairly common in many projects), how much memory and processing power do I need? If you have any previous experience, I'd also be happy to know what it would take for 2 petabytes of data.
I think you can't process 2 petabytes of data at once with any language (well, maybe with some specific software and/or hardware you could). Parallel solutions, or processing in smaller pieces, are always needed. In R, objects are stored in virtual memory, so there's a clear limit on how much data you can have in R at the same time. Check Memory Limits in R.
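"Processing in smaller pieces" can be sketched quickly. The example below is in Python (in R, `readLines()` with a block size, or packages built for out-of-memory data, play the same role); it streams a CSV and aggregates one column so that memory use stays bounded by the chunk size, not the file size. The column name and data are hypothetical.

```python
import csv
import io

def chunked_column_sum(lines, column, chunk_size=2):
    """Sum `column` over CSV `lines`, holding at most `chunk_size` rows in memory."""
    reader = csv.DictReader(lines)
    total = 0.0
    while True:
        # Pull the next chunk of rows; zip() stops after chunk_size rows.
        chunk = [row for _, row in zip(range(chunk_size), reader)]
        if not chunk:
            break
        total += sum(float(row[column]) for row in chunk)  # per-chunk work
    return total

# Stand-in for a 20 GB file; in practice `lines` would be an open file handle.
data = io.StringIO("price\n1.0\n2.5\n3.5\n4.0\n")
print(chunked_column_sum(data, "price"))  # 11.0
```

Any aggregation that can be expressed as per-chunk work plus a small running state (sums, counts, min/max) fits this shape; operations that need all rows at once (e.g. a global sort) require more machinery.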
Closed 10 years ago.
What happens internally when I mount a file system on UNIX using the following command:
mount -t ext3 /dev/sda1 /home/users
Please give references (articles, books etc.)
Consider your position: do you want to read this? Can you read this?
http://freebsd.active-venture.com/FreeBSD-srctree/newsrc/ufs/ffs/ffs_softdep.c.html
It is McKusick's base code for the FFS file system, which is generally considered the parent of modern UNIX file systems. There is no finer detail than reading the source.
The reason I posted: when I taught this stuff long ago, there was a text, and then I presented example code. Students seemed to get a lot out of it... at least those who actually worked through the material did.
In this case, the FFS code was a kind of de facto model, so it provides a sense of how we got from there to here.
Now all you need to do is get this:
http://www.amazon.com/Linux-Device-Drivers-Jonathan-Corbet/dp/0596005903/ref=sr_1_1?s=books&ie=UTF8&qid=1354930353&sr=1-1&keywords=linux+drivers
Then, ultimately, download the code for ext3 and read it.
Closed 10 years ago.
I've been asked to model my application. I'm not clear what this means. Perhaps something related to the architecture of my project? Does it mean giving them a breakdown of the classes? Or something like building a use-case or class diagram? Or perhaps something else?
EDIT: I cannot ask them!
I'd go with UML (Unified Modelling Language). It allows you to lay out classes, methods, inheritance, etc. in a graphical format.
A quick Google search gives this FOSS option:
Umbrello UML Modeller
EDIT: Just realized that's Linux-only, so here's the Wikipedia page for a whole bunch of other options.
Closed 11 years ago.
I have a stock market application which will be used frequently by millions of members at a time.
Which is the better option for retrieving data from my database, in terms of retrieval time and database load: DataReader or DataSet?
If you will have a large number of people reading the data, then you should use the DataReader. This way you can get in and out quickly, without the overhead of schema inference that a DataSet incurs. I would also recommend that once you pull the data from the server, you cache it. Even if it is only cached for one second, that will reduce the number of database connections retrieving the same data. Otherwise you could quickly saturate your connection pool if you are not careful, and you may run into locking issues as well.
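The "cache even for one second" advice amounts to a small time-to-live cache in front of the query. A minimal sketch in Python follows (the question is about ADO.NET, but the pattern is identical in C#); `run_query` is a hypothetical stand-in for the real data access call.

```python
import time

class TtlCache:
    """Cache computed values for a fixed number of seconds."""

    def __init__(self, ttl_seconds=1.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expiry_time, value)

    def get_or_compute(self, key, compute):
        now = time.monotonic()
        hit = self._store.get(key)
        if hit is not None and hit[0] > now:
            return hit[1]                       # fresh: no database round-trip
        value = compute()                       # stale or missing: hit the database
        self._store[key] = (now + self.ttl, value)
        return value

calls = 0
def run_query():
    """Hypothetical expensive database call; counts how often it really runs."""
    global calls
    calls += 1
    return [("AAPL", 150.0)]  # pretend result set

cache = TtlCache(ttl_seconds=1.0)
a = cache.get_or_compute("quotes", run_query)
b = cache.get_or_compute("quotes", run_query)  # served from the cache
print(calls)  # 1
```

With millions of readers asking for the same quotes, this turns N database round-trips per second into roughly one per TTL window per key, which is what protects the connection pool.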
Closed 12 years ago.
Are there any resources that compare the recently released (Jan 13, 2011) ASP.NET MVC 3 to Rails 3? I've looked around and couldn't find any comparisons, but I figure there must be something out there.
The last time I used ASP.NET was with MVC 1, and I'm wondering what sort of improvements they've made since then. I'm fairly new to Rails, so I'm not sure I can make a good comparison just by looking at ASP.NET MVC 3 itself. I'm hoping someone more familiar with the two frameworks has already made a comparison.
Sorry if this isn't the right place to ask but I do consider this a programming question.
EDIT:
I'd like to know specific comparisons between the frameworks: advantages and disadvantages of using one over the other, how the view engines compare, static versus dynamic language (where such comparisons apply), ease of doing TDD/BDD in each, features unique to each, available tools, performance considerations, ease of use, etc.
Judging from the lack of responses the answer is either "no" or "not yet".