I'm not very experienced with DBMSs, so I need some advice about which DBMS to use for storing RSS feeds.
The DBMS must be available on Linux and be free. I have some experience with MySQL, but I am unsure whether it performs fast enough to handle the storage and updating of hundreds of thousands of XML documents.
You will have to provide more information, but almost any DBMS could do a good job at this.
SQL Server Express is free and easy to use though, so that should be a starting point.
Any DBMS can be sufficient for storing RSS feeds; the questions should be:
What DBMS do you have experience with?
What DBMS can you afford?
What DBMS has client libraries for your platform?
I have a Xamarin.Forms app that uses a local SQLite database as its source for data. The data is proprietary, so I want to protect it so that if someone gets access to the database file, they would have to decrypt it to access the data.
I also want to limit the number of queries users can make against the database so that at a certain point they have to purchase the ability to use more of the data (in-app purchase).
I want to avoid making network calls as much as possible, to minimize the impact on the user's data plan and allow the app to work well where there is poor or no connectivity. So I want the data stored in a local database (perhaps in SQLite).
I'm curious how different people would approach this problem to protect the data and at the same time minimize network usage.
Here is kind of what I was thinking (if it's possible):
1) Let the user download/install the app.
2) On first load, the app will upload a key based on the device ID and the user's current purchase information. Then it will download a SQLite database file that has been encrypted using the uploaded key.
3) When the user reaches their limit of queries, the database file is deleted. If they purchase more data, then a new key is uploaded and a new encrypted database is downloaded to be used.
Thoughts? Is there a better way?
I would suggest SQLCipher! It is available as a Xamarin component (http://components.xamarin.com/view/sqlcipher-for-xamarin-ios), but it can also be built from source, as it is open source (https://www.zetetic.net/sqlcipher/open-source/).
That will totally secure your database :)
UPDATE 8/2/2018 - SQL Cipher is now free and easy to implement thanks to the greatness of Frank Krueger. sqlite-net (https://github.com/praeclarum/sqlite-net) is the de facto SQLite library for Xamarin now (if you're still using the Sqlite.Net fork, I recommend going back to sqlite-net as soon as possible, as Sqlite.Net has been abandoned), and it now includes SQL Cipher support completely free of charge.
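As a rough illustration of what that looks like in practice, here is a minimal sketch assuming the sqlite-net-sqlcipher NuGet package; the Title model, the database path and the passphrase are placeholders, and the exact constructor overloads may differ between package versions, so check the package documentation:

```csharp
// Sketch only: assumes the sqlite-net-sqlcipher package; Title, the path and
// the passphrase are placeholders.
using SQLite;

public class Title
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class EncryptedDb
{
    public static SQLiteConnection Open(string databasePath, string passphrase)
    {
        // The key is supplied through the connection string; SQLCipher then
        // encrypts and decrypts the file transparently once the connection is keyed.
        var options = new SQLiteConnectionString(databasePath, storeDateTimeAsTicks: true, key: passphrase);
        var connection = new SQLiteConnection(options);
        connection.CreateTable<Title>();
        return connection;
    }
}
```

Opening the same file without the key, or with a different key, fails, which is what protects the data at rest if the file is copied off the device.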
As clb mentioned, SQLCipher is open source. So if you don't want to pay for the component you can download and build the source yourself, then wrap it for use in Xamarin. This is, admittedly, a technically challenging task.
If that's not an option, I would recommend two other options:
Reevaluate your need to store data locally. It's extremely unlikely that you need to transfer enough data to even cause a blip on a user's data plan. And between cellular and wifi, it's not that common anymore for users to be without a connection. It certainly does happen, and there are certain apps where this is very important, but you may have to make concessions if the data is that sensitive.
If you absolutely have to store the data locally, and you can't use SQLCipher, your last real option is to use a cryptography library and encrypt the data itself, rather than the database file. This is typically less than ideal for a variety of reasons, but it may be your last resort. PCLCrypto is a PCL-compatible crypto library that you can look into.
https://github.com/aarnott/pclcrypto
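To give a sense of what encrypting the values themselves involves, here is a minimal sketch using the standard System.Security.Cryptography AES APIs; PCLCrypto exposes a comparable portable surface, so treat this as an illustration of the approach rather than as PCLCrypto's exact calls. How the key is obtained and protected on the device is the hard part and is deliberately left out:

```csharp
// Sketch only: encrypt individual values rather than the database file.
// The key handed to these methods is assumed to come from your own key
// management (e.g. delivered by your server), which is not shown here.
using System.Linq;
using System.Security.Cryptography;
using System.Text;

public static class FieldCrypto
{
    // Returns the 16-byte IV followed by the ciphertext.
    public static byte[] Encrypt(string plaintext, byte[] key)
    {
        using (var aes = Aes.Create())          // CBC + PKCS7 padding by default
        {
            aes.Key = key;
            aes.GenerateIV();
            using (var encryptor = aes.CreateEncryptor())
            {
                byte[] data = Encoding.UTF8.GetBytes(plaintext);
                byte[] cipher = encryptor.TransformFinalBlock(data, 0, data.Length);
                return aes.IV.Concat(cipher).ToArray();
            }
        }
    }

    public static string Decrypt(byte[] payload, byte[] key)
    {
        using (var aes = Aes.Create())
        {
            aes.Key = key;
            aes.IV = payload.Take(16).ToArray();   // the IV prepended above
            using (var decryptor = aes.CreateDecryptor())
            {
                byte[] cipher = payload.Skip(16).ToArray();
                byte[] plain = decryptor.TransformFinalBlock(cipher, 0, cipher.Length);
                return Encoding.UTF8.GetString(plain);
            }
        }
    }
}
```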
I'm new to ASP.Net, MVC and the Entity Framework.
I'd like to understand the best practice for small databases. For example, say at Contoso University we know there are only going to be a few hundred or a few thousand students and courses, so all the data would comfortably fit in memory. Is it then better to use an in-memory collection and avoid potentially high-latency database operations?
I am thinking of small-scale production web sites deployed to Windows Azure.
To be more specific, the particular scenario I am thinking of has a few thousand records that are read-only, although users can create their own items too. Think of a collection of movies, albums or song lyrics that has been assembled offline from a list of a few thousand popular titles. The user can browse the collection (read-only), and most of the time they find what they are looking for there. However the user can also add their own records.
Since the popular titles fit in memory, and these are read-only, is it maybe better not to use a database for the popular titles? How would you organize data and code for this scenario?
Thanks for any thoughts and pointers.
I think a database is a good place to store your information.
However, you are concerned about database latency.
You can mitigate that with caching - the data is stored in memory.
In short, it isn't an either/or scenario...
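As a rough sketch of what that caching might look like in a full-framework ASP.NET app, using System.Runtime.Caching; the Movie type and IMovieRepository interface here are hypothetical stand-ins for your own data access code:

```csharp
// Sketch only: cache a read-mostly lookup table so the database is hit once
// per hour rather than on every request. Movie and IMovieRepository are
// placeholder types standing in for your own data layer.
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class Movie
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public interface IMovieRepository
{
    IReadOnlyList<Movie> GetAll();
}

public static class PopularTitlesCache
{
    private const string CacheKey = "popular-titles";

    public static IReadOnlyList<Movie> Get(IMovieRepository repository)
    {
        var cache = MemoryCache.Default;
        var cached = cache.Get(CacheKey) as IReadOnlyList<Movie>;
        if (cached != null)
            return cached;

        // Cache miss: one database round-trip, then keep the result in memory.
        IReadOnlyList<Movie> titles = repository.GetAll();
        cache.Set(CacheKey, titles,
            new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddHours(1) });
        return titles;
    }
}
```

Because MemoryCache.Default is shared across the application, the data is stored once rather than once per user.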
You should definitely store your data in some persistent storage medium (SQL, Azure Tables, XML file, etc). The issues with storing items in memory are:
You have to find a way to store them once for the application and not once per user. Otherwise you will potentially have several copies of a 2-5 MB dataset floating around your memory space.
Users can add records; are these for everyone to see, or just them? How would you handle user-specific data?
If your app pool recycles, the server gets moved by the Azure engineers, etc., you have to repopulate that data.
As said above, caching can really help to alleviate any SQL Azure latency (which, by the way, is not that high; we use SQL Azure and web roles and have not had any issues).
Complex queries. Sure, you can use LINQ to process in-memory lists, but SQL is literally built to perform relational queries in a fast, efficient, data-safe manner.
Thread-safe operations on an in-memory collection could be troublesome.
Edit/Addendum
The key, we have found, to working with SQL Azure is to not issue tons of little tiny queries, but rather, get the data you need in as few queries as possible. This is something all web applications should do, but it becomes much more apparent when using SQL Azure rather than a locally hosted database. Lastly, as far as performance/caching/etc, don't prematurely optimize! Get your application working, then identify bottlenecks. More often than not, it will be a code solution to fix the bottleneck and not necessarily a hardware/infrastructure issue.
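To make the "few big queries" point concrete, here is a minimal Entity Framework 6 sketch; the SchoolContext, Student and Enrollment types are hypothetical, loosely borrowed from the Contoso University example mentioned in the question:

```csharp
// Sketch only: one eager-loading query instead of N+1 lazy-loading queries.
// All entity and context types here are placeholders.
using System;
using System.Collections.Generic;
using System.Data.Entity;   // brings in the Include() extension method
using System.Linq;

public class Student
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Enrollment> Enrollments { get; set; }
}

public class Enrollment
{
    public int Id { get; set; }
    public int StudentId { get; set; }
}

public class SchoolContext : DbContext
{
    public DbSet<Student> Students { get; set; }
    public DbSet<Enrollment> Enrollments { get; set; }
}

public static class ReportService
{
    public static void PrintEnrollmentCounts(SchoolContext db)
    {
        // One round-trip: students and their enrollments come back together.
        var students = db.Students
            .Include(s => s.Enrollments)
            .ToList();

        // This loop touches only in-memory data. Without Include(), lazy
        // loading would issue one extra query per student (the N+1 problem).
        foreach (var student in students)
        {
            Console.WriteLine("{0}: {1} enrollments", student.Name, student.Enrollments.Count);
        }
    }
}
```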
I am trying to create an app that receives an SQLite database from a server, for offline use but with cloud synchronization. The server has a Postgres database with information from many clients.
1) Is it better to delete the SQLite database and create a new one from a query, or to try to synchronize and update the existing separate SQLite files (or is there another, better solution)? The refreshes will happen a few times a day per client.
2) If it is the latter, could you give me any leads to resources on how I could do this?
I am pretty new to database applications so please excuse my ignorance and let me know if there is any way I could clarify.
There is no one size fits all approach here. You need to carefully consider exactly what needs to be done, what you are replicating, how much data is involved, and what your write models are, all before you build a solution. Along the way you have to decide how to handle write conflicts and more.
In general, the one thing I would say is that such synchronization works best with append-only write models (i.e. inserts only, no deletes, no updates), and one way to do it is to log the changes that need to be made and replicate those changes.
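A minimal sketch of that idea on the client side, assuming the server hands out an ordered change log and the local copy lives in SQLite via the Microsoft.Data.Sqlite package; the ChangeRecord shape and the items table are assumptions, not a prescribed protocol:

```csharp
// Sketch only: replay an ordered, append-only change log into the local
// SQLite copy. Table name and record shape are placeholders.
using System.Collections.Generic;
using Microsoft.Data.Sqlite;

public class ChangeRecord
{
    public long Id { get; set; }          // monotonically increasing id assigned by the server
    public string ClientRef { get; set; }
    public string Payload { get; set; }
}

public static class ChangeLogApplier
{
    public static void Apply(string databasePath, IEnumerable<ChangeRecord> changes)
    {
        using (var connection = new SqliteConnection("Data Source=" + databasePath))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction())
            {
                foreach (var change in changes)
                {
                    var command = connection.CreateCommand();
                    command.Transaction = transaction;
                    // Append-only model: only inserts are replayed, and
                    // INSERT OR IGNORE makes re-applying the same change harmless.
                    command.CommandText =
                        "INSERT OR IGNORE INTO items (id, client_ref, payload) " +
                        "VALUES ($id, $client_ref, $payload)";
                    command.Parameters.AddWithValue("$id", change.Id);
                    command.Parameters.AddWithValue("$client_ref", change.ClientRef);
                    command.Parameters.AddWithValue("$payload", change.Payload);
                    command.ExecuteNonQuery();
                }
                transaction.Commit();
            }
        }
    }
}
```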
However, master-master replication is difficult on the best of days and with the best of tools available. Jumping between databases with very different capabilities will introduce a number of additional problems. You are in for a big job.
Here's an open source product that claims to solve this for many database types including Postgres. I have no affiliation or commercial interest in this company.
https://github.com/sqlite-sync/SQLite-sync.com
http://sqlite-sync.com/
If you're able and willing to step outside relational databases to use an object store, you might want to have a look at CouchDB and perhaps PouchDB, which use an MVCC-based replication protocol designed to support multi-master replication, including conflict resolution. Under the covers, PouchDB uses adapters for SQLite, IndexedDB, local storage or a remote CouchDB instance to persist client-side data. It automatically selects the best client-side storage option for the given desktop or mobile browser. The SQLite engine can be either WebSQL or a Cordova SQLite plugin.
http://couchdb.apache.org/
https://pouchdb.com/
We are going to store some sensitive information about our customers in the db model res_partners. However, we don't want to store this information in a simple text field. We would prefer some basic encryption, if possible, for those fields. We do not want someone who has access to the db to be able to read these fields.
Is there a way we can get this done in OpenERP or Postgres?
Thank you,
Vishal Khialani
There is no such thing as "basic" encryption. ROT13 is not going to get you anywhere here. If your data is sensitive enough to deserve protection, then you need to use state-of-the-art ciphers such as Blowfish. I advise you to take a good, long look at Bruce Schneier's book Applied Cryptography.
The easy (and insecure) way to achieve this is to override the write and read methods of your model to encrypt before writing and decrypt after reading.
The tricky part is storing the encryption key. You could store it in a file on the computer running the OpenERP server (assuming the database is running on another server). This is still pretty weak, as the key will be available in the clear on that server, but it could still be useful if you don't trust your database server admin but do trust your OpenERP server admin. It's still much easier to keep the database server in a secure and trusted place and, if required, to encrypt offline copies of the database (such as backups).
If you want more security, you'll have to send the data encrypted to the client application and let the decryption happen there, using a user-supplied key. I'm not knowledgeable enough about this part of OpenERP to say whether that is easily feasible.
What is the best way to encrypt columns in SQL Server 2005 Express edition so that no one can steal our database design?
Thanks
There is no best way to do this.
If you obfuscate them, you give yourself a lot of pain when debugging. You'll have to change all the queries, and there's no good tool for this.
Not to mention that a DBA who might have to look at the database and tune it is going to be lost.
It's probably hard to accept, but your database design isn't something brilliant and new. I'm sure someone else has done it before. So there's no real need to protect it.
I suggest you set permissions for data access, not obfuscate the schema. The schema isn't important; the data is.
You could encrypt certain db objects like sprocs and views, but this is also of limited use because there are not-too-hard ways of cracking it.
Better to secure your database than to encrypt your table schema:
Give your application user only minimal rights to your database.
Secure your connection string: http://msdn.microsoft.com/en-us/library/ff648340.aspx#paght000010_step2
Looks like you can only obfuscate Stored Procedures, Functions, Triggers and Views.
Link To MSDN
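For reference, that obfuscation is applied with the WITH ENCRYPTION option when the module is created, and it hides the module text in the catalog views rather than providing strong protection. Here is a rough sketch of creating such a procedure from C#; the connection string, the dbo.Customers table and the procedure body are placeholders:

```csharp
// Sketch only: deploy a stored procedure whose definition is obfuscated with
// WITH ENCRYPTION. The procedure body and connection string are placeholders;
// note that tools exist to recover WITH ENCRYPTION'd text, so this is
// obfuscation, not real protection.
using System.Data.SqlClient;

public static class ProcDeployer
{
    public static void CreateObfuscatedProcedure(string connectionString)
    {
        const string createProc = @"
            CREATE PROCEDURE dbo.GetCustomerCount
            WITH ENCRYPTION
            AS
            BEGIN
                SELECT COUNT(*) AS CustomerCount FROM dbo.Customers;
            END";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(createProc, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```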