I'm developing an app for storing private user data. I use an SQL Server CE database. The data can contain images. In the future, the app will be able to synchronize with SkyDrive. So I have a few questions about data encryption:
What is the best way to encrypt the data? Encrypt the whole database (with the Password option), or just the data (with AES-128, as in the database)?
Where should I store the symmetric key (for AES)? Can I store it as a constant in code (I guess not: Silverlight - Hardcoding private key), or should I use the ProtectedData class? As I understand it, ProtectedData items are tied to the current device, so I wouldn't be able to synchronize the generated key to another device?
How should I store the images? As separate encrypted files, or as a Blob column in the database? Across the whole app there will be about 50 full-resolution images. I would like to store them in the database, but will that impact performance? (E.g., I want to show all 50 thumbnails, which are stored in another Blob column, but with LINQ I select whole rows, so will all the full-resolution blobs be loaded into memory too?)
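On the thumbnail question, one way to avoid pulling the full-resolution blobs into memory is to project only the thumbnail column instead of selecting whole rows. A minimal sketch of the idea using Python's sqlite3 purely for illustration (table and column names are invented; LINQ over SQL CE would express the same projection by selecting only the thumbnail member rather than the whole entity):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE images (id INTEGER PRIMARY KEY, thumb BLOB, full_res BLOB)")
conn.execute(
    "INSERT INTO images (thumb, full_res) VALUES (?, ?)",
    (b"small-thumb-bytes", b"x" * 1_000_000),  # pretend full_res is a large image
)
conn.commit()

# Selecting only the thumbnail column keeps the large full-res blobs
# out of memory entirely; only the small thumb bytes are materialized.
thumbs = [row[0] for row in conn.execute("SELECT thumb FROM images")]
```

Whether this helps in practice depends on the provider: some ORMs materialize the whole row regardless, so it's worth checking the generated SQL.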
What I do
I am reading a CSV file from an Azure storage account and copying the same data into a table in an Azure SQL database. The CSV file contains PII such as first name, last name, phone, and email. The column mapping to the sink database is one-to-one.
Want solution for:
There is no issue; all the data is copied properly. On top of that, I want to encrypt the PII, so that after encryption only the encrypted values are visible in the database.
I tried:
I converted the PII values with HASHBYTES('SHA2_256'), but I could not get the actual values back, so hashing is not a solution for me.
NOTE: I am using ADF V2
Thank you in advance for your time. Any input from you will be valuable to me.
There are multiple ways to handle PII in an Azure SQL database, such as encryption or dynamic data masking.
Is there any specific requirement why you want to encrypt the data through ADF?
You can use data flows within ADF to do the necessary encryption if needed.
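To make the difference from HASHBYTES concrete: SHA-256 is one-way by design, while what's needed here is reversible symmetric encryption, where the same key recovers the plaintext. A minimal Python illustration (the key is hypothetical and would live in something like Azure Key Vault, not in code; a toy XOR keystream stands in for a real cipher such as AES, purely to show reversibility):

```python
import hashlib

key = b"demo-key"  # hypothetical key; keep real keys in a vault, never in code
pii = "john.doe@example.com"

# Hashing (what was tried): deterministic, but one-way by design --
# there is no function that recovers the plaintext from the digest.
digest = hashlib.sha256(pii.encode()).hexdigest()

# Reversible symmetric encryption: applying the same keyed transform
# twice round-trips. The XOR keystream is a stand-in for a real cipher.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ciphertext = xor_cipher(pii.encode(), key)
recovered = xor_cipher(ciphertext, key).decode()  # back to the original PII
```

In production you would reach for Always Encrypted or column-level encryption in SQL itself rather than hand-rolling a cipher in the pipeline.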
I want to query encrypted data from my SQLite database.
For each row, I'm applying an XOR operation to every value, converting it to Base64, and then INSERTing it into the database.
Now I need to find a way to SELECT the encrypted values.
For example:
SELECT *
FROM table
WHERE name_column BETWEEN 'value1' AND 'value2'
Considering the huge amount of information in my database, how can I do that without having to decrypt the whole table to find the wanted rows?
It's impossible as asked. You are using BETWEEN 'value1' AND 'value2', but the database only sees the XORed, Base64-encoded strings, so BETWEEN will not behave as expected. Even if you found a way to decrypt the strings on the fly in SQLite (remember, applying XOR again decrypts), it would be inefficient and resource-consuming when there are thousands of entries.
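You can see why the range query breaks: the lexicographic order of the Base64(XOR(...)) strings bears no relation to the order of the plaintexts, and BETWEEN compares the stored strings. A quick Python illustration (toy key and made-up values):

```python
import base64

key = b"secret"  # toy key, for demonstration only

def encrypt(value: str) -> str:
    xored = bytes(b ^ key[i % len(key)] for i, b in enumerate(value.encode()))
    return base64.b64encode(xored).decode()

names = ["alice", "bob", "carol", "zoe"]
encrypted = [encrypt(n) for n in names]

# Sorting the ciphertexts does NOT reproduce the plaintext order, so a
# BETWEEN over the stored column compares meaningless strings.
ciphertext_order = [names[encrypted.index(e)] for e in sorted(encrypted)]
```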
So, to work around your problem, you could have a look at this extension list. SQLite also provides a very basic encryption module, which can XOR the whole database with a key you define (not recommended):
This file describes the SQLite Encryption Extension (SEE) for SQLite.
The SEE allows SQLite to read and write encrypted database files. All
database content, including the metadata, is encrypted so that to an
outside observer the database appears to be white noise.
This file contains the complete source code to the SEE variant that
does weak XOR encryption. Do not take this file seriously. It is for
demonstration purposes only. XOR encryption is so weak that it hardly
qualifies as "encryption".
The way you want to do it won't work; you would have to read all values of the column into your Qt program, decrypt them, and check whether each value is BETWEEN A and B.
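If you keep the XOR scheme, that decrypt-then-filter step has to happen in the application. A sketch using Python's stdlib sqlite3 module for illustration (the Qt SQL classes would follow the same shape; key and values are made up):

```python
import base64
import sqlite3

key = b"secret"  # toy key for illustration only

def xor_bytes(data: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt(value: str) -> str:
    return base64.b64encode(xor_bytes(value.encode())).decode()

def decrypt(stored: str) -> str:
    return xor_bytes(base64.b64decode(stored)).decode()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name_column TEXT)")
conn.executemany(
    "INSERT INTO people VALUES (?)",
    [(encrypt(n),) for n in ["alice", "bob", "carol", "zoe"]],
)

# Read every stored value, decrypt it, and apply the BETWEEN check in the app.
matches = []
for (stored,) in conn.execute("SELECT name_column FROM people"):
    plain = decrypt(stored)
    if "alice" <= plain <= "carol":  # the range check, done client-side
        matches.append(plain)
```

This is exactly the full-table scan the question hoped to avoid, which is why schemes like deterministic encryption (equality only) or an encrypted-at-rest database file (SEE, SQLCipher) are the usual answers when range queries are needed.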
I'm an MSSQL developer who was recently tasked with building a new application using DynamoDB, since we use AWS and wanted a highly scalable database service.
My biggest concern is data integrity. For example, I have a table for all my users where every row needs to have username, email, and name fields (all strings), plus a verified field that's an int. Is there any way to require all entries in that table to have those fields, and for them to be of those particular types?
Since the application is in PHP, I'm using Kettle as my ORM, which should prevent me from breaking data integrity, but another developer raised a concern: what if we ever add another application, or someone manually changes some types via the console?
https://github.com/inouet/kettle
Currently, no: you are responsible for maintaining the integrity of your items with respect to the existence of non-key attributes on the base table. However, you can use LSIs and GSIs to enforce the data types of attributes (notwithstanding my qualm that this is not a recommended pattern, as it could cause partition heat, especially for attributes whose range of values is small). For example, verified seems like it might take only 0 or 1 as a value, so if you create a GSI with PK=verified where verified is a Number, writes to the base table may get throttled by the verified GSI.
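Until the table itself can enforce this, the usual fallback is validating items in application code before every write, in each application that touches the table. A minimal sketch in Python (field names come from the question; the validator itself is hypothetical application code, not a DynamoDB feature):

```python
# Expected schema for the users table, per the question:
# three string fields plus an integer verified flag.
REQUIRED = {"username": str, "email": str, "name": str, "verified": int}

def validate_user(item: dict) -> None:
    """Raise if a required attribute is missing or has the wrong type."""
    for field, expected in REQUIRED.items():
        if field not in item:
            raise ValueError(f"missing attribute: {field}")
        value = item[field]
        # bool is a subclass of int in Python, so reject it explicitly.
        if not isinstance(value, expected) or isinstance(value, bool):
            raise TypeError(f"{field} must be {expected.__name__}")

validate_user({"username": "jdoe", "email": "j@x.com", "name": "Jo", "verified": 1})
```

This doesn't protect against console edits, of course; for that you'd need something like restricting console write access via IAM.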
A small-scale web site, with content pages generated from a small amount of data: at most six integers.
I'm wondering how to optimally design the URLs that point to these pages, so that users can't manually change the URL to change the displayed data.
This is a node.js + mongoDB project, with an MVC structure (actually via sails.js, even if it's probably not relevant here).
Let's say the pattern looks like http://domain.com/xxxxxx
• Approach 1: encryption.
xxxxxx carries the six integers plus a control key, encrypted via a simple library. The server decrypts the URL and serves the page with its embedded data.
Pros: fast; smaller disk footprint, since there's no need to maintain a database collection for this.
• Approach 2: database.
I define a model to keep track of all data sets (the six integers) that are created by users.
In the URL, xxxxxx is the model's primary key. Data is retrieved from accessing the database.
Pros: analytics, i.e. by enriching the model it's possible to track past usage and statistics.
For now, I think analytics are the only justification for Approach 2. If I don't need them, is there a risk in choosing Approach 1, and are its listed advantages meaningful?
I'm currently developing a mobile application which uses AJAX request to get data from a server.
To enable offline navigation in my application, I need to store all data collected.
My application is quite data-heavy, because there's a section where the user can see charts (powered by Highcharts).
I'm wondering about the best way to cache the collected data, which is in JSON format.
Is it light or efficient enough to JSON.stringify the data array into local storage like:
localStorage.setItem("graph_1_datas", JSON.stringify(json_data_array));
Or would it be better to create a database, and a table like that:
TABLE
-----
id
graphId
blockId
x
y
I have 3 graphIds per blockId, and about 10 blockIds...
Storing the JSON strings in local storage should be fairly fast and efficient. Just store a separate entry for each request; that gives you clear, simple code for getting the data from either local storage or the web service.
If you are likely to want to edit the data offline then you may wish to consider an SQLite database as it will make it easier/more efficient to add code to track changes.
You may also want to consider an SQLite database if your object graph gets more complicated and fits a relational database model.
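If you do go the database route, the table from the question maps directly onto a schema like the one below, sketched here with Python's sqlite3 purely to show the shape (in the mobile app itself this would be a native SQLite binding or IndexedDB; the row counts follow the "3 graphIds per blockId, ~10 blockIds" estimate):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE points (
        id      INTEGER PRIMARY KEY,
        graphId INTEGER NOT NULL,
        blockId INTEGER NOT NULL,
        x       REAL NOT NULL,
        y       REAL NOT NULL
    )
""")
# 3 graphIds per blockId, ~10 blockIds, 5 sample points per graph.
rows = [(g, b, float(i), float(i * 2))
        for b in range(10) for g in range(3) for i in range(5)]
conn.executemany(
    "INSERT INTO points (graphId, blockId, x, y) VALUES (?, ?, ?, ?)", rows
)
conn.commit()

# The payoff over one big JSON blob: fetch just one chart's points.
series = conn.execute(
    "SELECT x, y FROM points WHERE blockId = ? AND graphId = ? ORDER BY x",
    (0, 1),
).fetchall()
```

The trade-off is as the answer says: per-graph queries and easy change tracking, versus the simpler stringify/parse round-trip of local storage.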