This may be a super simple question, but for some reason I'm hesitating on the obvious solution. Should I be storing a user's local device filesystem path (for an image or video) in my remote database?
In my mobile application, users can take photos and videos, and those are stored locally on their devices until the user decides to upload them at a later time. I currently track the local filesystem locations only in the local app state, not in my database; I only update my database with an image URL after it is uploaded. This works well, as long as it works! In the event of a crash, some users lose their images, because the local app state is lost and there's no remote state to recover from.
The easy, obvious solution is to just save the user's local file path to the images in my database - then even if the local state is lost they can still recover the path to those images.
Example:
{ "file_path": "/users/user1/.my_app_data/media/img.jpg" }
However, it just feels improper to store internal, device-specific information like that in my remote database, since it's meaningless without that specific device (even for the same user). In my mind, data on the server should be as device-agnostic as possible.
What is considered best practice, if there is such a thing, for this situation?
There is no singular best practice here. It all depends on your exact use-case, your tolerance for problems, and the amount of effort you're willing to spend on things.
There is no harm in storing these local paths, as long as you can recognize them as device-specific and never try to use them on a different device.
But if the local state of the device is lost, aren't the local files themselves likely to be affected too? If not, a cloud backup of the paths may be a good aid in restoring the state. But if the files are likely to be affected as well, storing the paths in the cloud probably isn't very helpful, and I'd skip the effort.
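If you do store them, here is a minimal sketch of a record that stays recognizable as device-specific (the schema and field names are my own assumption, not a prescription):

// Hypothetical shape for a media record in the remote database.
// deviceId lets the app ignore paths that were written by a different device.
interface MediaRecord {
  userId: string;
  deviceId: string;         // e.g. an install-scoped UUID generated by the app
  localPath: string | null; // device-specific; only meaningful on deviceId
  remoteUrl: string | null; // filled in once the upload completes
}

// Only trust a stored local path when it was written by this installation.
function usableLocalPath(record: MediaRecord, currentDeviceId: string): string | null {
  return record.deviceId === currentDeviceId ? record.localPath : null;
}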
I'm trying to determine the best way to reference a Firebase Storage (Google Cloud Storage) file in a direct-read database like Realtime Database or Cloud Firestore. Since a read operation to this database does not benefit from a backend that can issue tokens and cache image URLs, it is not clear to me what the most performant way is to store these references.
I have come up with a few options, and none of them is a clear winner (I sketch options 1 and 3 in code below).
Store a path like /images/foo.jpg in the database, and use the Storage client SDK to generate a tokenized URL with storage.bucket().getDownloadURL("/images/foo.jpg").
Pros: Secure & simple.
Cons: Network call for every single image you want to display hurts performance considerably.
Store a tokenized path like https://firebasestorage.googleapis.com/v0/b/storage-bucket-823743.appspot.com/o/images%2Ffoo.jpg?alt=media&token=c6da1e33-f3ff-41e2-a6f0-bdb475a2f6d9 with a super long TTL.
Pros: No extra fetch on the client.
Cons: A long string stored in the (comparatively expensive) RTDB. And what if that token is revoked by mistake? The database is now broken.
Store a path like /images/foo.jpg in the database and use public storage rules. Reconstruct it by hand into a static URL like https://firebasestorage.googleapis.com/v0/b/storage-bucket-823743.appspot.com/o/images%2Ffoo.jpg?alt=media
Pros: Tiny database use, no extra client fetch, explicit public access, no token to lose.
Cons: The URL encoding is fragile, and Storage could change its URL format, in which case we'd be out of luck.
So, those are the options I've come up with, and there may be more. Any suggestions for how to solve this? The issue is specific to Firebase because its databases don't have the benefit of a custom server that could handle a token/caching layer.
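For concreteness, here is roughly what options 1 and 3 look like in the modular JavaScript SDK (the bucket name is a placeholder, and this assumes initializeApp has already been called):

import { getStorage, ref, getDownloadURL } from "firebase/storage";

const storage = getStorage();

// Option 1: resolve the stored path at display time.
// Costs one network round-trip per image before the image itself loads.
async function resolveViaSdk(path: string): Promise<string> {
  return getDownloadURL(ref(storage, path));
}

// Option 3: reconstruct a public URL from the stored path.
// Relies on public storage rules and on the URL format not changing.
function reconstructPublicUrl(path: string): string {
  const bucket = "storage-bucket-823743.appspot.com"; // placeholder
  return "https://firebasestorage.googleapis.com/v0/b/" + bucket + "/o/"
    + encodeURIComponent(path) + "?alt=media";
}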
There is no single "best way" to store these paths. It all depends on your use-case, your preferences, and the context in which you're implementing it.
Here is what I typically use:
If I need access to the files to be secured, I store the image path (like in your #1), and then use the Firebase SDK to access the file.
If I don't need access to the files to be secured, I store the image path and the download URL. This way I can find the image easily based on the path, and use the download URL in non-secured clients.
The cons you mention for these simply don't affect me. I'd recommend you take a similar approach, and report back when a problem actually occurs.
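As a rough sketch of the second approach, using the modular JavaScript SDK and a Firestore collection I'm calling images (the names are mine, not a convention):

import { getStorage, ref, uploadBytes, getDownloadURL } from "firebase/storage";
import { getFirestore, doc, setDoc } from "firebase/firestore";

// On upload, persist both the storage path and the resolved download URL:
// secured clients can use the path, non-secured clients the URL.
async function uploadAndRecord(id: string, blob: Blob): Promise<void> {
  const path = `images/${id}.jpg`;
  const fileRef = ref(getStorage(), path);
  await uploadBytes(fileRef, blob);
  const url = await getDownloadURL(fileRef);
  await setDoc(doc(getFirestore(), "images", id), { path, url });
}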
I have an application that needs to send data to a cloud database (DynamoDb).
The app runs on a computer that can lose internet connectivity or be switched off at any time, but I must ensure that all data eventually gets to the cloud database.
I can assume the application will eventually be switched on, and will eventually get internet access back.
The app is written in VB.NET.
What are some schemes for achieving this, and are there any ready-made products that already achieve this?
You could implement a write-through cache using a local DynamoDB instance (or even SQLite). But without specific details about what kind of data you'd be storing in the database, and which data should be available "offline", it's hard to say exactly how you should structure your application. You definitely won't want to keep everything local unless the overall volume of data is really small.
Then there is the problem of resolving conflicts that may occur during network partitions (i.e., a client goes offline and makes some database modifications while other clients also modify the database; these changes need to be reconciled, and it's up to you, and your users, to determine how).
It's not a simple problem to solve.
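To make the idea concrete, here is a sketch of the local outbox pattern (the app in question is VB.NET; this TypeScript version, using better-sqlite3, is only to illustrate the shape of it, and sendToCloud stands in for a DynamoDB PutItem call):

import Database from "better-sqlite3";

const db = new Database("outbox.db");
db.exec(
  "CREATE TABLE IF NOT EXISTS outbox (" +
  "  id INTEGER PRIMARY KEY AUTOINCREMENT," +
  "  payload TEXT NOT NULL," +
  "  sent INTEGER NOT NULL DEFAULT 0)"
);

// The app always writes locally first, even when online.
function enqueue(record: object): void {
  db.prepare("INSERT INTO outbox (payload) VALUES (?)").run(JSON.stringify(record));
}

// A background loop pushes unsent rows to the cloud and marks them sent
// only on success; if the network is down, the rows survive a restart.
async function flush(sendToCloud: (payload: string) => Promise<void>): Promise<void> {
  const rows = db.prepare("SELECT id, payload FROM outbox WHERE sent = 0").all() as
    { id: number; payload: string }[];
  for (const row of rows) {
    await sendToCloud(row.payload); // throws while offline; just retry later
    db.prepare("UPDATE outbox SET sent = 1 WHERE id = ?").run(row.id);
  }
}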
I have a Xamarin.Forms app that uses a local SqLite database as its source for data. The data is proprietary, so I want to protect it so that if someone gets access to the database file, they would have to decrypt it to access the data.
I also want to limit the number of queries users can make against the database so that at a certain point they have to purchase the ability to use more of the data (in-app purchase).
I want to avoid making network calls as much as possible to minimize impact to the user's data plan and allow the app to work well in conditions where there is poor or no connectivity. So, I want the data stored in a local database (perhaps in SqLite).
I'm curious how different people would approach this problem to protect the data and at the same time minimize network usage.
Here is kind of what I was thinking (if it's possible):
1) Let the user download/install the app.
2) On first load, the app will upload a key based on the device id and the user's current purchase information. Then it will download a SqLite database file that has been encrypted using the uploaded key.
3) When the user reaches their limit of queries, the database file is deleted. If they purchase more data, then a new key is uploaded and a new encrypted database is downloaded to be used.
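Roughly, I imagine steps 2 and 3 working like the sketch below; every endpoint and field name in it is a placeholder, not a real API:

// Hypothetical provisioning flow for steps 2 and 3.
async function provisionEncryptedDb(deviceId: string, purchaseInfo: string): Promise<ArrayBuffer> {
  // Upload a key basis derived from the device ID and purchase state...
  const res = await fetch("https://api.example.com/provision", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ deviceId, purchaseInfo }),
  });
  if (!res.ok) throw new Error(`provisioning failed: ${res.status}`);
  // ...and download a SqLite file encrypted with the key the server derived.
  return res.arrayBuffer();
}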
Thoughts? Is there a better way?
I would suggest SQLCipher! It is available as a Xamarin component (http://components.xamarin.com/view/sqlcipher-for-xamarin-ios), but it can also be built from source, as it is open source (https://www.zetetic.net/sqlcipher/open-source/).
That will totally secure your database :)
UPDATE 8/2/2018 - SQLCipher is now free and easy to implement thanks to the greatness of Frank Krueger. sqlite-net (https://github.com/praeclarum/sqlite-net) is the de facto SQLite library for Xamarin now (if you're still using the Sqlite.Net fork, I recommend going back to sqlite-net as soon as possible, as Sqlite.Net has been abandoned), and it now includes SQLCipher support completely free of charge.
As clb mentioned, SQLCipher is open source. So if you don't want to pay for the component you can download and build the source yourself, then wrap it for use in Xamarin. This is, admittedly, a technically challenging task.
If that's not an option, I would recommend two other options:
Reevaluate your need to store data locally. It's extremely unlikely that you need to transfer enough data to even cause a blip on a user's data plan. And between cellular and wifi, it's not that common anymore for users to be without a connection. It certainly does happen, and there are certain apps where this is very important, but you may have to make concessions if the data is that sensitive.
If you absolutely have to store the data locally and you can't use SQLCipher, your last real option is to use a cryptography library and encrypt the data itself, rather than the database file. This is typically less than ideal, for a variety of reasons, but it may be your last resort. PCLCrypto is a PCL-compatible crypto library that you can look into.
https://github.com/aarnott/pclcrypto
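As a language-agnostic illustration of encrypting the data itself rather than the file, here is a sketch using AES-GCM from Node's crypto module (key management, the hard part, is out of scope here; in .NET you'd reach for PCLCrypto instead):

import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Encrypt a serialized row before writing it to the (unencrypted) database.
// key must be 32 bytes, e.g. derived from the user's purchase credentials.
function encryptRow(plaintext: string, key: Buffer): Buffer {
  const iv = randomBytes(12); // unique per row
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const body = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), body]); // iv | tag | ciphertext
}

function decryptRow(blob: Buffer, key: Buffer): string {
  const iv = blob.subarray(0, 12);
  const tag = blob.subarray(12, 28);
  const body = blob.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  // final() throws if the row was tampered with
  return Buffer.concat([decipher.update(body), decipher.final()]).toString("utf8");
}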
I have an application which works well on a single server. My customer now wants to move it to a load-balanced environment. Which things are likely to bite me when doing this?
Currently I know of
Session state, and
Machine key.
Both of these are described here, for example, so I'm looking for things additional to this.
There are similar questions, but the first addresses load balancing in general, whereas I'm looking for a migration guide, and the second addresses a specific problem.
One thing that you might experience is an increased load on your database server. If you implement any serverside caching of data, you will be used to your site hitting the database once for a given dataset, caching the data and then not going to the database again until the cache times out. This is a good strategy for commonly accessed data.
In a load-balanced environment, subsequent requests might go to different servers and the database will be hit twice for the same data. Or more if you have more than 2 servers. This is not bad in itself, but you would be well advised to keep an eye on it. If the database is queueing, the benefits of running a webfarm might be negated. As always, benchmark, profile and run tests.
One way around this is to use sticky sessions. This is a router-based approach: once a user session is established, all requests from that user are routed to the same server. This has its drawbacks, most notably a potential decrease in the efficiency of the load balancing, not to mention the problems when you lose a server. Furthermore, it only helps when you are caching mostly user-specific data, such as paged search results, which are only cached for a short time.
Another solution is to have a distributed, in memory cache, such as memcached, Velocity or NCache. There are others as well, but these are the ones that I have worked with.
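In sketch form, the cache-aside pattern against a shared cache looks like this (CacheClient is my stand-in interface for whichever client library you end up using):

// Minimal stand-in for a distributed cache client (memcached, NCache, ...).
interface CacheClient {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}

// Cache-aside: every server in the farm checks the shared cache first,
// so a dataset is fetched from the database roughly once per TTL,
// not once per server.
async function getDataset(
  cache: CacheClient,
  key: string,
  loadFromDb: () => Promise<string>,
): Promise<string> {
  const cached = await cache.get(key);
  if (cached !== null) return cached;
  const fresh = await loadFromDb();
  await cache.set(key, fresh, 300); // 5-minute TTL; tune to your data
  return fresh;
}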
Another thing to look out for, unrelated to the above, is how you deal with file uploads from your users. Many sites allow users to upload files. If such files have not previously been saved to a central store, such as a database or a common file share, they will need to be.
Take a look at this article, which describes some tips on preparing an ASP.NET application for load balancing.
I want to download a file from the server and save it in a particular location on disk, without user interaction. I want to do all of this in Flex. If anyone has a solution, please help me.
You cannot do that. You cannot download a file and save it onto a user's machine without the user's knowledge and permission. If that were possible, it would be a big security issue; I could overwrite any file on your machine while you enjoy the fancy animation on my home page - how does that sound?
If it is a small piece of data, you can store it as a shared object, aka Flash cookies:
The SharedObject class is used to read and store limited amounts of data on a user's computer or on a server. Shared objects offer real-time data sharing between multiple client SWF files and objects that are persistent on the local computer or remote server. Local shared objects are similar to browser cookies and remote shared objects are similar to real-time data transfer devices. To use remote shared objects, you need Adobe Flash Media Server.