Remote storage for SQLite database

I'd like to upload my SQLite database to some remote storage so I can access it programmatically from various computers and mobile devices.
Is there a solution that is secure (so the data can't be stolen), offers good information privacy, and provides a programming interface for various languages (e.g. Python, C, Java on Android, etc.)?

SQLite has an Encryption Extension (SEE).
The SQLite Encryption Extension (SEE) is an add-on to the public domain version of SQLite that allows an application to read and write encrypted database files.
It is a commercial product, not public domain like SQLite itself.

SQLite is an embedded database that must be stored on a filesystem accessible to the client application. SQLite doesn't natively support multiple concurrent clients, remote access, access control, or encryption. The requirements you list are much better served by a traditional database server, such as MySQL or PostgreSQL. You can easily export SQLite data and import it into one of these databases.
If you are dead set on using SQLite, you can try storing the database on a shared remote filesystem, such as Dropbox. You'll still have to worry about concurrent access and you'll lose many of the speed advantages of using SQLite, but the database will be accessible from multiple machines.
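If you do go the export route, SQLite can emit its schema and data as SQL text. A minimal sketch using Python's built-in sqlite3 module (the table and data here are just placeholders; dialect differences such as AUTOINCREMENT and column types usually need manual adjustment before importing into MySQL/PostgreSQL):

```python
import sqlite3

# Throwaway example database in memory.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO notes (body) VALUES (?)", ("hello",))
conn.commit()

# iterdump() yields the schema and data as SQL statements,
# which can then be adapted for the target database server.
dump = "\n".join(conn.iterdump())
print(dump)
```

This is the same output you'd get from the `.dump` command in the sqlite3 shell.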

Related

CosmosDB Multi-Model read/write on a single database

In Build session #BRK3060, Mark Russinovich demos some code that uses both the SQL and Graph APIs on the same database (starting at 45:27):
https://www.youtube.com/watch?v=S2zguwKvlQk
Does anyone have any insight into reading and writing with multiple APIs on the same database?

Any problems accessing sqlite files directly from Azure file storage?

We have a legacy system we're planning on migrating to Azure. The system uses sqlite files to store the data we need to access. After bouncing around with numerous solutions, we've decided to store the sqlite files in Azure File Storage and access them via a UNC path from a cloud worker role (we can't use Azure Functions or App Services, as they don't have the ability to use SMB).
This all seems to work ok, but what I'm nervous about is how sqlite is likely to react when trying to access a large file (effectively over a network) this way.
Does anyone have experience with this sort of thing and if so did you come across any problems?
The alternative plan was to use a web worker role and to store the sqlite files in blob storage. In order to access the data though, we'd have to copy the blob to a temp file on the web server machine.
You can certainly use Azure File Storage, since it's effectively an SMB share, backed by blob storage (which means it's durable). And also, since it's an SMB share, you can then access it from your various worker role instances.
As for your alternate choice (storing in blob and copying to temporary storage) - that won't work, since each worker role instance is independent, and you'd then have multiple, unsynchronized copies of your database on each VM. And if a VM rebooted, you would immediately lose all data on that temporary drive.
Note: With Web/worker role instances, as well as VM's, you can attach a blob-backed disk and store content durably there. However, you'd still have the issue of dealing with multiple instances (because attached disks cannot be attached to multiple VMs).
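If you do open an SQLite file over SMB, two settings are worth knowing about: WAL journal mode relies on shared memory and is documented not to work across network filesystems, and a busy timeout helps with lock contention from multiple instances. A minimal sketch of the relevant connection settings in Python (the file path is a placeholder; in the scenario above it would be a UNC path into the Azure file share):

```python
import sqlite3

# Placeholder path; over SMB this would be a UNC path into the share.
conn = sqlite3.connect("app.db", timeout=30)  # wait up to 30s on locks

# WAL requires shared memory between all connecting processes, so it is
# unsuitable on a network filesystem; use a rollback journal instead.
conn.execute("PRAGMA journal_mode=DELETE")
conn.execute("PRAGMA busy_timeout=30000")  # milliseconds
```

Even with these settings, SQLite's own documentation warns that file locking on network filesystems can be unreliable, so concurrent writers remain a risk.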

Local database with later opt-in

I wrote an app that contains data that is sensitive for certain users, who do not want it to end up online. I want to allow the app to be used with Firebase offline only, with the option to sync at a later time. Is this possible with the current iOS and Android Firebase implementations, as a replacement for an SQLite database?
The Firebase Database is primarily an online database that can handle intermittent and medium-term lack of connectivity.
While the user is not connected, Firebase will keep a queue of pending write operations. It will aggregate those operations locally when it loads the data from disk into memory. This means that the larger the number of write operations while the user is offline, the longer loading will take and the more memory the database will use.
This is not a problem in the intended use-case: online apps that need to handle short/medium term lack of connectivity. But it is not a suitable database for long-term offline databases.
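The pending-write behavior described above can be illustrated with a toy model (pure Python, not the Firebase SDK): each offline write is appended to a queue, and on the next load the queue is replayed over the persisted snapshot, so both load time and memory grow with the number of queued writes.

```python
# Toy model of an offline write queue; an illustration of the behavior
# described above, not Firebase's actual implementation.
persisted = {"profile/name": "Ada"}   # data already on disk
pending = []                          # writes made while offline

def offline_set(path, value):
    pending.append((path, value))     # queued, not yet synced

offline_set("profile/name", "Grace")
offline_set("settings/theme", "dark")

# On the next load, the queue is replayed over the persisted snapshot,
# so the in-memory view reflects the offline writes.
view = dict(persisted)
for path, value in pending:
    view[path] = value

print(view["profile/name"])   # the queued write wins: "Grace"
```

Months of offline use means a long `pending` list to load and replay every time the app starts, which is why this works for short-term outages but not as a long-term offline database.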

Using redis with SQL server

I am developing a web app and came across Redis for key-value storage. I already have a relational database (SQL Server), but since I have a multi-tenant system, there will be a separate schema for each customer.
I was wondering how viable it would be to use both Redis and SQL Server together. I was thinking of storing user IDs and schema names in Redis, so the app can then connect to the SQL Server database for that user.
It's perfectly viable to use both Redis and SQL Server together.
With more details about the kinds of schema differences you expect, we might be able to provide more insight.
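One common shape for the setup described: cache the user-to-schema mapping in Redis so each request can resolve its tenant's schema without a round trip to SQL Server. A minimal sketch (the key convention and schema names are assumptions, and a plain dict stands in for a redis-py client, which exposes the same get/set interface):

```python
# Stand-in for a Redis client; with redis-py you would instead do:
#   import redis
#   cache = redis.Redis(host="localhost", decode_responses=True)
class FakeRedis:
    def __init__(self):
        self._data = {}
    def set(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

cache = FakeRedis()

def schema_key(user_id):
    # Hypothetical key convention: one schema name per user.
    return f"tenant:{user_id}:schema"

def resolve_schema(cache, user_id):
    schema = cache.get(schema_key(user_id))
    if schema is None:
        # Cache miss: in a real app, look the schema up in SQL Server
        # (e.g. a Tenants table) and cache the result here.
        schema = f"customer_{user_id}"   # placeholder lookup
        cache.set(schema_key(user_id), schema)
    return schema

print(resolve_schema(cache, 42))   # -> customer_42
```

The usual caveat applies: the mapping must be invalidated or updated in Redis whenever a tenant's schema assignment changes in SQL Server.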

confused about local data storage for occasionally connected application in .NET

Can I use a SQL Server Express database as my local database for an occasionally connected application (OCA) written using Visual Studio? Would that require SQL Server to be installed on the client machine? It looks like the default architecture for OCAs in .NET is to use SQL Server Compact. However, SQL Server Compact doesn't permit the use of stored procedures. I use stored procedures for all the data access in my application, so I am confused about the best way to create an occasionally connected client to extend the functionality.
I currently have an ASP.NET web application that connects to a web service (WCF). The web service connects to the DB and calls stored procedures to get data and submit changes to data. Now, I am trying to write a desktop application that can connect to the web service when a connection is available, and work locally when a connection is not available, using the MS Sync Framework. I don't quite understand how to do the architecture for this bit.
Yes, the local data cache works with SQL CE 3.5, and you cannot use stored procedures on the cache. Once you add a Local Data Cache item to your project, it automatically prepares all the necessary MS Sync Framework code for data synchronization with the main data source, plus all the necessary SQL scripts for the local database, and it will also offer to create either typed datasets or an entity data model to access the cache from your application.
The item doesn't work with SQL Server Express - it doesn't offer any data provider other than SQL Compact 3.5. If you want to use SQL Server Express anyway, you will have to either install it on the client machine or use another machine as the DB server, which defeats the whole purpose of the local data cache.
Btw, I think the local data cache works only against a database as the main data source, so you cannot use it if you want WCF services as the data source; you would have to write the storage and synchronization yourself.
