Why is SQLite implicitly inserting on update of non-existing row? - sqlite

So I'm using SQLite with .NET and the System.Data.SQLite library (which shouldn't be relevant to the question). I'm executing a simple UPDATE statement with a WHERE clause such that 0 rows should be affected. Instead, SQLite inserts a new row satisfying the provided WHERE clause.
I've read through the SQLite documentation on the UPDATE statement but couldn't find anything about this behaviour. Googling around doesn't seem to help either. Does anyone know what's going on?
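For illustration, the statement is along these lines (the table and column names here are placeholders, not my actual schema), and no row matches the WHERE clause beforehand:

-- Hypothetical table; no row with id = 42 exists before the update
UPDATE users SET name = 'Alice' WHERE id = 42;
-- Expected: 0 rows affected. Instead, a row with id = 42 appears afterwards.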

Related

c++ builder with SQLite: data presentation shows WIDEMEMO in DBGrid

I use C++ Builder with an SQLite database and connect to it through DBExpress. The problem is that the DBGrid shows (WIDEMEMO) in every field instead of the actual values stored in the database.
I don't have this problem with other databases, such as Firebird.
Why this behaviour? Am I missing something with SQLite, and how can it be fixed?
Would you suggest I use SQLite, Firebird, or another embedded database for a small standalone application?
Thanks in advance.
I am using FireDAC in Delphi and had the same problem.
I resolved the issue by changing the dataset's TWideMemoField DisplayValue property from dvClass to dvFull. Now my DBGrid works as expected.
I know this question is old but since it doesn't have any answer selected I will throw in my two cents. I recently came across the same problem and found a solution that might work for you as well.
This problem lies in how your C++ dataset and SQLite interact. As it turns out, when a field has no defined size, especially a text field, that field will be treated as a WideMemo or Memo in your DBGrid. So what you need to do is define your text field with a specific number of characters, e.g. VARCHAR(10).
That's what I did and it worked for me. I am using MySQL and Delphi with DBGrid.
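For example (table and column names here are just placeholders), the difference is in the declared column type. SQLite itself won't enforce the VARCHAR length, but the data-access layer reads the declared type when deciding how to map the field:

-- Unsized text column: typically mapped to a (Wide)Memo field by the dataset
CREATE TABLE customer_memo (notes TEXT);

-- Text column with a declared size: mapped to a plain string field instead
CREATE TABLE customer_sized (notes VARCHAR(100));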
This is a problem with the DBGrid. It doesn't handle the WIDEMEMO. See link for help with this.
Displaying and editing MEMO fields in Delphi's TDBGrid

SQLite insert performance

I need to write single lines to a file placed on a file server. I am considering utilizing SQLite to ensure that the writing of the line was successful and not just partially completed. However, insert performance is really essential here. My question is: what is the exact process (as in, read this, write that, read this, and so on) that SQLite goes through when it inserts a row into a table? The table does not have any indexes, primary keys, constraints, or anything else.
This is the most common bottleneck in my experience:
http://www.sqlite.org/faq.html#q19
Except for that, SQLite is really fast. :)
You should use transactions, so that you avoid an fsync() on every INSERT. Take a look here for some benchmarks.
Also, be sure to properly set the two important pragmas:
synchronous (NORMAL)
journal_mode (WAL)
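A minimal sketch of those settings plus a batched insert (the log table here is a made-up example):

-- journal_mode is persisted in the database file; synchronous is per connection
PRAGMA journal_mode = WAL;
PRAGMA synchronous = NORMAL;

-- Hypothetical table for the lines being written
CREATE TABLE IF NOT EXISTS log_lines (line TEXT);

-- One transaction per batch, so the inserts share a single commit
-- rather than paying the commit cost per INSERT
BEGIN;
INSERT INTO log_lines (line) VALUES ('first line');
INSERT INTO log_lines (line) VALUES ('second line');
COMMIT;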
You can use EXPLAIN to see the details of what happens when you execute a statement: http://www.sqlite.org/lang_explain.html
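For example, against the hypothetical log_lines table from the sketch above:

-- Low-level VDBE program SQLite will run for the insert
EXPLAIN INSERT INTO log_lines (line) VALUES ('example');

-- Higher-level summary of how a query will be executed
EXPLAIN QUERY PLAN SELECT * FROM log_lines WHERE line = 'example';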

Performing SQLite Queries

I'm having problems finding examples of sqlite3 queries. The diagrams in the documentation are useless. I want to say...
delete from food_post where ip_address=190.17.107.106;
I keep getting a syntax error, though. :/
Possibly because of the multiple periods in your number: unquoted, 190.17.107.106 isn't a valid numeric literal. Quote the address as a string instead (single quotes are the standard SQL string-literal syntax):
delete from food_post where ip_address = '190.17.107.106';

Asp.net - Trouble with updating LINQ Classes

We just added a new field to a database table, so we deleted the table from the LINQ Class and re-inserted it. The new database field appears in the LINQ Class in the diagram. However, when we use the field, we get an error saying the table does not contain a definition for the field.
Any ideas on how we can solve this? Thanks!
UPDATE: What steps are required to update the LINQ to SQL Class? Maybe we're doing something wrong.
UPDATE 2: Picture of our problem - LINQ - http://img99.imageshack.us/img99/6033/usertable.png | Code - http://img43.imageshack.us/img43/5145/linqerror.png
Check the table def side of your mapping documents, either using properties in the designer, or by closing Studio and examining the XML. I recommend the designer.
Make sure the field name matches the field name in the database.
I've had problems with a few reserved keywords when using Linq2Entities, and I'd recommend you avoid reserved words in names (even though the [] handle them).
While this doesn't necessarily answer your question, it may help to solve it - I've been a long-time fan of the LINQ to SQL and Entities tools by Huagati. The re-sync aspect alone has saved me so much time that it's well worth the $50 (for the standard version), IMO.
http://www.huagati.com/dbmltools/
Hope it helps...
Edit:
In order to update the LINQ to SQL classes, you can either do it manually (blurgh) or remove them from the designer and drag-and-drop them again from the Data Connections node in the Server Explorer.
I had to delete the entire LINQ Class, recreate it, and re-add the tables for my problem to go away. Simply deleting a single table and re-adding it, or deleting all the tables in the class and re-adding them, did not work.
I was having exactly the same issue, but I found that deleting the problematic tables in the Object Relational Designer and re-adding them (and re-adding the associations as well) solved the issue. I did not have to delete the entire DataContext, nor did I have to delete any of the tables that were still working properly. I would recommend trying this first before doing anything more drastic.

LINQ to SQL performance with "SELECT TOP {x}" queries

In looking up how to perform an equivalent to SELECT TOP 5 with LINQ-to-SQL, all the answers I've seen suggest using .Take(), like so:
var myObjects = (
    from myObject in repository.GetAllMyObjects()
    select myObject)
    .Take(10);
I don't yet understand most of how LINQ works behind the scenes, but to my understanding of C-like languages this would resolve by first building a temporary array containing ALL the records and then copying the first 10 elements of that array into the variable. Not such a problem if you're working on a small dataset or without any performance constraints, but it seems horribly inefficient to me if you're, for example, selecting the most recent 5 log entries from a table which can contain millions of records.
Is my understanding of how this works wrong? If so, could someone explain what actually happens? Otherwise, what (if any) better (ie more efficient) way is there of only selecting x records through LINQ-to-SQL?
[edit]
I now have the hypothetical myObject class sending its LINQ-to-SQL output to the debug output, as per the suggestion in the accepted answer. I ended up using the DebuggerWriter from here: http://www.u2u.info/Blogs/Kris/Lists/Posts/Post.aspx?ID=11
Your assumption is incorrect. With LINQ to SQL, the query evaluates to an expression tree (an Expression<Func<...>>), from which the proper SQL is generated; the Take(10) becomes part of that SQL. You do not need to worry about it loading all the records.
Also, see the following question. You can attach a TextWriter to your DataContext and see the SQL that gets generated.
How to get the TSQL Query from LINQ DataContext.SubmitChanges()
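With the log attached, the statement generated for a Take(10) query is roughly of this shape (the table and column names below are placeholders; the real ones come from your mapping). The TOP (10) is part of the SQL sent to the server, so only ten rows ever leave the database:

SELECT TOP (10) [t0].[Id], [t0].[Name]
FROM [dbo].[MyObjects] AS [t0]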
LINQ uses deferred execution, and, for LINQ-to-SQL, expression trees.
No query will be executed until you enumerate the result of the Take call, so you don't need to worry about anything.
I just went through this last week! I opened the SQL profiler on my dev database and stepped through the code. It was very interesting to see the generated SQL for the various queries. I recommend you do the same. It may not be an exact answer to your question, but it was certainly enlightening to see how your various components generate entirely different SQL statements depending on the contents of the call.
I believe the MSDN reading on "deferred query execution" (or whatever it's called) would be enlightening as well.
