Database with a blob field in an unknown format. How to extract it? - sqlite

The story behind this is a bit funny. I had a diary app on iOS; the app isn't available anymore, so I switched to another app. I was able to back up the database.
I know the column where the text is stored. Unfortunately, it is in a blob format. The database is an SQLite file.
Is there a way to find out what the true format is and somehow convert it back to text? At first I thought it would just be a matter of a SQL command, but I wasn't successful.
Do you have any idea or an approach for how to solve this?
PS:
This is the beginning of one blob: 'É' ¥I±00n'.
And here is the same in hex: 14C9271E2005A549B130308E8F6ECF69A74A
I could only put the beginning here, since I really don't know what is written in there.
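One way to approach this is to dump the first bytes of each blob and compare them against known file signatures. Here is a minimal sketch in Python; the database file, table, and column names (diary_backup.sqlite, diary_entries, body) are hypothetical placeholders for whatever your backup actually uses. Your prefix 14 C9 27 ... doesn't look like any of the common compression signatures, which suggests the data may be encrypted or in an app-specific binary format.

```python
import sqlite3
import zlib

# Hypothetical file/table/column names; replace with the real ones from your backup.
conn = sqlite3.connect("diary_backup.sqlite")

for rowid, blob in conn.execute("SELECT rowid, body FROM diary_entries LIMIT 5"):
    head = blob[:16]
    print(rowid, head.hex(), sep="  ")

    # A few common signatures to compare against:
    #   1f 8b              -> gzip
    #   78 01 / 78 9c      -> zlib/deflate
    #   62 70 6c 69 73 74  -> "bplist" (Apple binary property list, common in iOS apps)
    if head.startswith(b"bplist"):
        print("  looks like an Apple binary property list")
    elif head[:2] == b"\x1f\x8b":
        print("  looks like gzip")
    elif head[:1] == b"\x78":
        try:
            print("  zlib?", zlib.decompress(blob)[:60])
        except zlib.error:
            pass

conn.close()
```

If none of the signatures match, the old app may have encrypted its entries, in which case no SQL-level conversion will help.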

Related

SQLite database shows question marks (???) instead of these Unicode characters (தமிழ்)

I imported a CSV file containing Unicode text into an SQLite database, but instead of seeing the text, all I see are question marks, like this: "???". The encoding is UTF-8 (I've mentioned below what happened when I tried UTF-16). The SQLite manager I'm using is DB Browser for SQLite.
This is the Unicode that I typed: தமிழ்
Now, according to this answer on Stack Overflow, SQLite stores text data as Unicode, so the fact that my text is Unicode can't be the problem.
The characters I'm trying to use belong to the Tamil language. According to Wikipedia, an encoding for Tamil is called TACE16; it's a 16-bit Unicode-based character encoding.
So then I set the encoding to UTF-16 when I imported the CSV file. But when I do that, the data doesn't even show up in the database after importing, even though it says the import was successful.
Then I tried importing the CSV file with UTF-8 encoding as usual. But after importing, I right-clicked the row header, selected "Set Encoding", and set it to UTF-16. Now it doesn't show question marks, but something like Chinese characters instead. This is what it shows now: 㼿㼿.
I tried setting TACE16 while importing. I also tried setting it manually. But it said it's either an incorrect encoding or it is not supported.
Further searching online didn't turn up anything. Could someone tell me how I can fix this issue? Basically, I want this text "தமிழ்" to show in the SQLite database after importing the CSV file which has the text.
Thank you so much. I would really appreciate your help.
I had a similar issue once, but in my case the problem was only in the DB software I used to view the tables. Have you tried retrieving your data from the database? Is it correct when you retrieve it?
Anyway, unless you tell us exactly which tools you are using and for what, it is impossible to find a solution for your specific case.
OK, it turns out the issue was my CSV file. I edited it in Excel, and I guess Excel saved it using another encoding. I'm still not sure what the exact issue was, but I'll just describe how I fixed it.
I opened Notepad and typed out the data separated by commas. I saved the file with the .csv extension. Here's the important thing: you have to change the encoding to Unicode. There's a drop-down menu just left of the Save button; use that. Here's a link to a YouTube video that shows you how.
Also, you don't need to type everything in Notepad; that can get tedious.
Type everything out in Google Sheets and download it as a CSV file. It works. If you have to use Notepad, type the data in Excel, concatenate everything in each row using a formula, and copy-paste it into Notepad. Don't forget to add a comma between each cell's contents in the Excel formula.
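If the GUI import keeps fighting you, you can also sidestep it entirely and load the CSV yourself. A minimal sketch in Python, assuming a made-up file name and a two-column table (words.csv / words): it reads the file explicitly as UTF-8 and inserts the text, and SQLite stores TEXT values as Unicode, so "தமிழ்" should round-trip intact.

```python
import csv
import sqlite3

# Hypothetical file/table names -- adjust to your data.
conn = sqlite3.connect("words.db")
conn.execute("CREATE TABLE IF NOT EXISTS words (english TEXT, tamil TEXT)")

# Open the CSV explicitly as UTF-8 (use "utf-8-sig" if Excel added a BOM).
with open("words.csv", encoding="utf-8", newline="") as f:
    rows = list(csv.reader(f))          # assumes two columns per row

conn.executemany("INSERT INTO words (english, tamil) VALUES (?, ?)", rows)
conn.commit()

# Verify the round trip: this should print தமிழ், not question marks.
for english, tamil in conn.execute("SELECT english, tamil FROM words"):
    print(english, tamil)

conn.close()
```

If this prints the Tamil correctly but the GUI still shows "???", the problem is the viewer's display encoding, not the stored data.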

serializing file via R, for insertion as blob

I know there are at least a few reasons one might not want to do this, but I'm going to want to do it anyway. I want to take a lot of PDFs and store them as blobs (so nodes can read them from the main SQL server).
What would be the best practice for an R user to read the PDF content as a string, serialize it, and insert it into MySQL as a blob?
I assume all I really need is a text representation of the PDF, which can then be BLOB'ed.
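For what it's worth, the blob doesn't need to be a text representation at all; you can insert the raw bytes of the file directly and let the driver bind them as a parameter. A rough sketch in Python (the question asks about R, but the idea is the same) using the mysql-connector-python driver; the connection details and the documents table schema here are assumptions, not anything from the question.

```python
import mysql.connector  # pip install mysql-connector-python

# Assumed schema:
#   CREATE TABLE documents (id INT AUTO_INCREMENT PRIMARY KEY,
#                           name VARCHAR(255), content LONGBLOB);
conn = mysql.connector.connect(
    host="localhost", user="me", password="secret", database="docs"
)
cur = conn.cursor()

with open("report.pdf", "rb") as f:      # read the PDF as raw bytes, not text
    pdf_bytes = f.read()

# Bind the bytes as a blob parameter; no escaping or string building needed.
cur.execute(
    "INSERT INTO documents (name, content) VALUES (%s, %s)",
    ("report.pdf", pdf_bytes),
)
conn.commit()
conn.close()
```

The same idea should carry over to R's DBI-based drivers: read the file as raw bytes (readBin()) and bind it as a query parameter rather than pasting it into the SQL string.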

Core Data SQLite File

I have an app that saves and retrieves your data under an ID number of your choice! The only thing is, I have people asking for an Excel document of all their saved data! Does anyone know how I would go about this? I am using Swift and Xcode 6.1, and a SQLite file that Core Data has made for me.
Thanks,
AppSwiftGB
Creating native Excel files is likely a PITA. You should export to CSV, though that format can be another PITA, since one consumer likes commas, another semicolons, and a third tabs :-(
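As a rough idea of the CSV route, here is a sketch in Python; the table and column names are placeholders, not the ones Core Data actually generates (Core Data usually prefixes its tables and columns with "Z", e.g. ZENTRY, Z_PK).

```python
import csv
import sqlite3

# Placeholder names; inspect the Core Data store to find the real Z-prefixed ones.
conn = sqlite3.connect("Model.sqlite")
cur = conn.execute("SELECT ZIDNUMBER, ZDATA FROM ZENTRY ORDER BY ZIDNUMBER")

with open("export.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)                              # comma-separated; Excel opens it
    writer.writerow([col[0] for col in cur.description])  # header row from column names
    writer.writerows(cur)

conn.close()
```

If your users' Excel expects semicolons, pass delimiter=";" to csv.writer; that is exactly the locale headache mentioned above.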

What is causing the corruption of text fields with ¿ characters?

We have a very strange problem in our application: all of a sudden we started noticing upside-down question marks being saved along with other text typed into the fields on the screen. These upside-down question marks were not originally entered by the users, and it is unclear where they come from. We are using Oracle 10g with ASP.NET.
Here is an example of the issue: "140, 141) ¿ 16-Oct-07". If anyone has seen this before and found a way to fix it, please let me know how.
This sounds like a character-encoding issue. Please check what encoding your database (and tables) is set to, and what encoding the objects or strings that pass data into the database use. If there is a mismatch (DB in ANSI, app in UTF-8), these sorts of issues can appear.
Greg, you should check the NLS_CHARACTERSET setting, not NLS_NCHAR_CHARACTERSET. And I bet you it's WE8ISO8859P1 or something similar, not Unicode. The problem occurs when the submitted data is Unicode, probably UTF-8, and Oracle tries to map the characters to the WE8ISO8859P1 character set. It does fine for most of them but fails for characters above the ASCII range, like 140.
So yes, I have seen the same issue in our application, and in our case it was caused by special quote marks (“example”, ‘example’) that were copied from MS Word. Word automatically converts straight quotes into these curly quotes. The solution was to convert the database to UTF-8.
If your users are copying from MS Word, you can turn that feature off. It's part of the AutoCorrect/AutoFormat functionality. If you uncheck the replace options for quotes and apostrophes, you should be OK. Be sure to turn off the replacements in both AutoFormat and AutoFormat As You Type.
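A quick way to see the mismatch described above, as a sketch rather than Oracle itself: Word's curly quotes simply don't exist in the ISO-8859-1 (WE8ISO8859P1) repertoire, so any forced conversion has to substitute a replacement character, which in Oracle typically shows up as an upside-down question mark.

```python
# Sketch: what happens when UTF-8 text meets a Latin-1-only column.
word_text = "\u201cexample\u201d \u2018example\u2019"   # curly quotes pasted from Word

try:
    word_text.encode("iso-8859-1")
except UnicodeEncodeError as e:
    print("not representable in Latin-1:", e.reason)

# Forcing the conversion substitutes a replacement character -- the same kind of
# silent damage Oracle does when mapping UTF-8 data into WE8ISO8859P1.
print(word_text.encode("iso-8859-1", errors="replace").decode("iso-8859-1"))
# -> ?example? ?example?
```

Converting the database character set to UTF-8, as suggested above, removes the lossy step entirely.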

Undelete accidentally deleted records in Sqlite3

As the title says: is this possible? I accidentally deleted a record due to my ugly HTML interface in Firefox. The bad thing is that the deleted record is a root folder, and the program automatically cascade-deleted everything under it :(
Take a look at undark; I have used it. It can export the rows (deleted or not) from an SQLite db file if the records have not been overwritten. Last version here.
The SQLite-Deleted-Records-Parser does not give the same type of output, but can be useful.
And there are also some products like the SQLite Forensic Explorer, SQLite Repair, Sqlite Database Recovery and SQLiteDoctor.
If you are a developer, you can avoid having the same problem again by using litereplica. It adds single-master replication to SQLite.
But remember to enable point-in-time recovery, because the transactions are replicated to the replicas, so an accidental command like DROP TABLE or DELETE FROM will be replicated too. With PITR you will be able to go back to a previous point in time.
Or use the Backup API regularly, although it transfers the entire DB on each backup.
And remember: if you copy an SQLite file, or use a regular backup approach, while a transaction is active, the copy can be corrupted.
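For the regular-backup route, here is a minimal sketch using SQLite's online backup API, which Python's sqlite3 module exposes as Connection.backup() (Python 3.7+). Unlike a plain file copy, it copies the database page by page inside the library, so the backup stays consistent even if other connections are writing. The file names are placeholders.

```python
import sqlite3

src = sqlite3.connect("app.db")          # live database
dst = sqlite3.connect("app-backup.db")   # backup target

# Online backup: safe to run while the source database is in use.
src.backup(dst)

dst.close()
src.close()
```

Run it on a schedule (cron, launchd, etc.) and keep a few dated copies so an accidental DELETE does not immediately overwrite your only backup.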
Sorry -- nope. Backups are the only option I know of.
In the future, consider never issuing DELETE queries, especially from user-accessible forms (let only the DB admin do it, if anyone) -- just include a field in your tables that marks a record as inactive, and then factor that into the WHERE clause of your queries.
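A sketch of that soft-delete pattern in Python with sqlite3; the table and column names (folders, deleted) are illustrative only.

```python
import sqlite3

conn = sqlite3.connect("app.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS folders (
        id      INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        deleted INTEGER NOT NULL DEFAULT 0   -- 1 = hidden; rows are never physically removed
    )
""")

# "Delete" from the UI just flips the flag...
conn.execute("UPDATE folders SET deleted = 1 WHERE id = ?", (42,))

# ...and every normal query filters it out.
rows = conn.execute("SELECT id, name FROM folders WHERE deleted = 0").fetchall()

# Undo is now a one-liner for the DB admin.
conn.execute("UPDATE folders SET deleted = 0 WHERE id = ?", (42,))
conn.commit()
conn.close()
```

Cascading "deletes" then become UPDATEs of the flag on child rows, which are just as easy to reverse.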
Unfortunately I don't know of a way, either. However, until you do a VACUUM on the SQLite database file the deleted data is generally not technically removed. Perhaps you might be able to still recover some of the data using some sort of hex editor on the file.
It might be possible to go in and see the data via a hex editor. The only info I could find said that the metadata is gone, so the records aren't going to come back, but the data itself might still be there. It comes down to how important the data is; I suspect it's not important enough for you to dig out a hex editor.
The data isn't always removed from the file straightaway. If there's lots of it and you're desperate, you could use the UNIX command strings on the file. This may help you to recover various bits and pieces of human-readable data, but it'll be a hard and inaccurate process.
No way. Without a working backup you won't be able to restore this.
