Which programs do you know of for this purpose? Quick googling reveals two programs:
sqlite-manager(Firefox extension)
Not all features are available through the GUI, but it is really easy to use and open source.
SQLite Administrator: the screenshots look pretty, but it is for Windows only.
Please tell me your opinion about each program, not just a link. Thanks.
Well, I'm using Navicat Premium. It isn't free, but it is a very nice tool for working with multiple database systems, including SQLite. It has many nice features, such as working with multiple databases from one window, and importing, exporting, and synchronizing data and schemas across different databases.
There is also Navicat for SQLite only, which I think costs less.
I found this table, and maybe this information will help someone.
Also, this question is a repeat of this one. Just in time, heh.
You can try SQLiteSpy. I found it very useful. Its GUI makes it very easy to explore, analyze, and manipulate SQLite3 databases.
I am struggling to port recent SQLite sources to VxWorks 6.8. The architecture is PPC.
I made a separate topic (Crash around pthreads while integrating SQLite into RTP application on VxWorks) to provide all the details about the particular problem I am experiencing at the moment. But it looks like the problem is too specific and requires a certain amount of experience (porting C code to different platforms, pthreads, SQLite, knowledge of VxWorks).
So, I decided to just get confirmation that it is doable at all. I mean, it is surely doable, but I need to know that someone has actually succeeded with it within a reasonable time frame.
Please respond only if you have accomplished this yourself. No general suggestions like: "VxWorks is POSIX, SQLite is C, so it should not be a problem".
To moderators: I don't mean to duplicate my question. I am just narrowing it down, and I intend to close it if no constructive answer(s) appear.
Thanks in advance
OK, I have figured it out: it was a problem in dosFS. I formatted the flash to HRFS and was able to run SQLite.
I had a few porting issues along the way, things that are really platform-dependent. Anyone familiar with POSIX functions should be able to figure them out.
I guess there is a problem with dosFS, so for now I will stick with HRFS.
If you have any questions about the porting itself, just contact me; maybe I hit the same issue and fixed it.
Just for information, I am using PPC and VxWorks 6.8.
Regards
I've tried Araxis Merge and it's good to use. However, it is too costly.
I need only file and folder diff. I also need merge for two files.
Although this Wikipedia page lists all of the free tools, it is really difficult to conclude which tool would be best.
I'm curious which is the most recommended free merge tool for Drupalians!
I'm not sure Drupalians have specific needs merging-wise compared to other web developers :D
Try KDiff3 (http://kdiff3.sourceforge.net/), which is dead simple.
Sorry, I was talking about opendiff, which I use on my Mac; it doesn't seem to be available for Windows. But if you are on a Mac, it is part of the original install.
Command-line diff (together with colordiff) and vimdiff work well, and Emacs has quite a good implementation too...
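If you just need a quick unified diff of two files and have Python around, the standard-library difflib produces the same format as `diff -u`. A minimal sketch (the file names and contents are made up for the demo):

```python
import difflib

old = ["line one\n", "line two\n", "line three\n"]
new = ["line one\n", "line 2\n", "line three\n"]

# unified_diff yields the familiar ---/+++/@@ unified-diff format
diff = list(difflib.unified_diff(old, new, fromfile="a.txt", tofile="b.txt"))
print("".join(diff))
```

For real files you would read each one with `readlines()` and pass the resulting lists in the same way.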
I'm working on a website running ASP.NET with an MS SQL database. I realized that the number of rows in several tables can get very high (possibly something like a hundred million rows). I think this would make the website run very slowly; am I right?
How should I proceed? Should I base it on a multi-database system, so that users are separated into different databases and each database stays smaller? Or is there a different, more effective and easier approach?
Thank you for your help.
Oded's comment is a good starting point and you may be able to just stop there.
Start by indexing properly and only returning relevant result sets. Consider archiving unused (or rarely accessed) data.
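To illustrate the indexing point with something self-contained (this sketch uses Python's built-in sqlite3 module rather than SQL Server, and the table and column names are invented), an index lets the engine find matching rows without scanning the whole table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway database for the demo
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(100_000)])

# Without this index the query below would scan all 100,000 rows;
# with it, only the matching rows are touched.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?", (42,)
).fetchone()
print(plan[-1])  # the plan mentions idx_orders_customer, not a full scan

rows = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE customer_id = ?", (42,)
).fetchone()
print(rows[0])  # 100
```

The same principle applies in SQL Server; the planner only uses an index when the query's WHERE clause matches the indexed columns, which is why "indexing properly" depends on the queries you actually run.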
However, if that isn't enough, partitioning or sharding is your next step. This is better than a "multi-database" solution because your logical entities remain intact.
Finally, if that doesn't work, you could introduce caching. Jesper Mortensen gives a nice summary of the options that are out there for SQL Server:
Sharedcache -- open source, mature.
Appfabric -- from Microsoft, quite mature despite being "late beta".
NCache -- commercial, I don't know much about it.
StateServer and family -- commercial, mature.
Try partitioning the data. This should make each query faster, and the website shouldn't be as slow.
I don't know what kind of data you'll be displaying, but try to give users the option to filter it. As someone has already commented, partitioned data will make everything faster.
I have considered SQLite, but from what I've read, it is very unstable at sizes bigger than 2 GB. I need a database that in theory can grow up to 10 GB.
It would be best if it were stand-alone, since that is easier for non-technical users, instead of having the extra step of installing something like MySQL, which will most likely require assistance.
Any recommendations?
SQLite should handle your file sizes just fine. The only caveat worth mentioning is that SQLite is not suited to highly concurrent environments, since the entire database file is exclusively locked during writes.
So if you are writing an application that needs to handle several users concurrently, a better choice would be PostgreSQL.
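The locking behaviour is easy to see with Python's built-in sqlite3 module (the file name here is arbitrary): while one connection holds a write transaction, a second writer fails with "database is locked":

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

# isolation_level=None puts the connections in autocommit mode, so the
# explicit BEGIN/COMMIT below are the only transactions in play;
# timeout=0 makes a blocked writer fail immediately instead of waiting.
writer = sqlite3.connect(path, timeout=0, isolation_level=None)
other = sqlite3.connect(path, timeout=0, isolation_level=None)

writer.execute("CREATE TABLE t (x INTEGER)")

writer.execute("BEGIN EXCLUSIVE")           # take the database-wide write lock
writer.execute("INSERT INTO t VALUES (1)")

try:
    other.execute("INSERT INTO t VALUES (2)")  # second writer is locked out
    locked = False
except sqlite3.OperationalError as exc:
    locked = "locked" in str(exc)

writer.execute("COMMIT")                    # release the lock
other.execute("INSERT INTO t VALUES (2)")   # now it goes through

print(locked)  # True
```

For a single-user desktop application this is rarely a problem; it only bites when several processes write to the same file at once.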
I believe SQLite will actually work fine for you with large databases, especially if you index them appropriately. Considering SQLite's popularity it seems unlikely that it would have fundamental bugs.
I would suggest that you revisit the decision to rule out SQLite, and you might try to compensate for the selection bias of negative reports. That is, people tend to publicize bug reports, not non-bug reports, and if SQLite were the most popular embedded database then you might expect to see more negative experiences than with less popular packages even if it were superior.
I stumbled across the WB on-disk B-tree library:
http://people.csail.mit.edu/jaffer/WB
It seems like it could be useful for my purposes (swapping data to disk during very large statistical calculations that do not fit in memory), but I was wondering how stable it is. Reading the manual, it seems worryingly 'researchy': there are sections labelled [NOT IMPLEMENTED], etc. But maybe the manual is just out of date.
So, is this library usable? Or am I better off looking at Tokyo Cabinet, MemcacheDB, etc.?
By the way, I am working in Java.
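For comparison, here is what the "swap intermediate results to disk" pattern looks like with a plain disk-backed key-value store. This sketch uses Python's standard-library shelve (the question is about Java, so treat it purely as an illustration of the access pattern, not a drop-in solution; the path is made up):

```python
import os
import shelve
import tempfile

path = os.path.join(tempfile.mkdtemp(), "partials")

# Write intermediate results to disk instead of holding them all in memory.
with shelve.open(path) as store:
    for i in range(10_000):
        store[str(i)] = i * i   # keys must be strings; values are pickled

# Reopen later (possibly in another process) and read back only a slice.
with shelve.open(path) as store:
    total = sum(store[str(i)] for i in range(100))

print(total)  # 328350, the sum of the first 100 squares
```

Any of the stores mentioned in the question (WB, Tokyo Cabinet, MemcacheDB) supports the same put/get-by-key workflow; the differences are in durability, concurrency, and how well the on-disk structure handles ordered scans.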
I have looked at the WB B-Tree Database, but SQLite might be a better fit. It handles extremely large datasets in a single file and is a lightweight, fully functional database.
http://www.sqlite.org/
Info on using SQLite with Java is here:
Java and SQLite
Yup, I gave it the good old college try in Java. The jar file was easy to find, as was the documentation. I think it was written in Scheme or something similar and then translated to be usable in Java.
The documentation describes the functions you ought to use, but not which objects they reside on. Sadly, there is no Javadoc to help me out... There are no working examples, and after two hours of trying I finally gave up. I did not find it very useful at all.
I hope others have better luck using it.