I've just installed MariaDB 10.5.5 on Windows Server 2019 and have enabled the file_key_management plugin (the keyfile is not encrypted yet, but will be). I need to provide proof that the data is in fact encrypted, and I have tried the recommendations from the two other posts I found here (First post and second post), specifically running strings C:/[path to my database data]/testdb.ibd | grep "abc", but without success (I get syntax errors saying there is a problem and to check the documentation, which I did). I did check the INNODB_TABLESPACES_ENCRYPTION table, and it shows my table has ENCRYPTION_SCHEME = 1, so I'm assuming I am OK, but I need to provide screenshots / proof of the encryption. Any help or ideas would be appreciated.
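In case it's useful, here is the check that strings | grep pipeline is meant to perform, written as a small Python sketch so it runs on Windows without Unix tools (the path and the probe value "abc" are placeholders for my real ones):

# Scan the raw .ibd file for a value known to be stored in the table.
# On an encrypted tablespace the plaintext should NOT be found.
needle = b"abc"  # placeholder: a string actually inserted into the table
path = r"C:\path\to\datadir\testdb\testdb.ibd"  # placeholder path

with open(path, "rb") as f:
    data = f.read()

if needle in data:
    print("plaintext found: tablespace does not look encrypted")
else:
    print("plaintext not found: consistent with an encrypted tablespace")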
Chris
Situation: MS Access (happens to be 2010) using SQLite ODBC driver (0.997) to link to tables in a SQLite (3.x) database.
Problem: data values in all columns in all rows display as "#Deleted".
Solution: This is an "answer my own question" kind of post, with the solution below.
Earlier, I searched Stack Overflow and found a similar question (sqlite linked tables in Access give #deleted values) with a good answer that turned out to be inapplicable in my case. So I'm adding some info here.
Half of the problem is explained here: http://support.microsoft.com/kb/128809 '"#Deleted" errors with linked ODBC tables.'
The above link was no longer available as of July 2021. However, you may find a good explanation for '#DELETED# Records Reported by Access' at https://dev.mysql.com/doc/connector-odbc/en/connector-odbc-errors.html
This explains that Access (Jet) wants a table to have a unique index in order to be able to insert/update the table if necessary.
If your SQLite table doesn't have a unique index (or primary key), then Access will only allow read access to the table -- you can't edit the table's data in Access, but the data displays fine.
To make the table updateable, you might revise your SQLite code (or use a SQLite tool) to add a unique index to the table.
If your PK/unique index happens to use a TEXT field, that's fine for SQLite. However, when you link to it in Access, Access will show the #Deleted indications.
The chain of events appears to be:
Access/Jet notices the unique index and tries to use it. However, SQLite TEXT fields are variable length and possibly BLOBs. This apparently doesn't fulfill Access's requirements for a unique index field, hence the #Deleted indication.
To avoid that problem, the index has to be on a SQLite field type that Access will accept. I don't know the complete list of acceptable types, but INTEGER works.
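For illustration, a minimal sqlite3 sketch of that fix, with made-up table and column names: give the table an INTEGER PRIMARY KEY so Access/Jet gets a unique index it will accept:

import sqlite3

con = sqlite3.connect("example.db")  # placeholder database file

# A unique TEXT key is fine for SQLite, but Access shows #Deleted for it.
# An INTEGER PRIMARY KEY satisfies Access/Jet's unique-index requirement.
con.execute("""
    CREATE TABLE IF NOT EXISTS contacts (
        id   INTEGER PRIMARY KEY,  -- Access-friendly unique key
        name TEXT NOT NULL
    )
""")
con.commit()
con.close()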
Hope this helps someone.
I just started using KNIME. It is supposed to manage a huge amount of data, but it doesn't: it's slow and often unresponsive. And I will need to manage even more data than I'm using now. What am I doing wrong?
I set this in my configuration file knime.ini:
-XX:MaxPermSize=1024m
-Xmx2048m
I also read data from a database node (millions of rows), but I can't limit it with SQL (not that I really mind; I need all this data anyway):
SELECT * FROM foo LIMIT 1000
error:
WARN Database Reader com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'LIMIT 0' at line 1
I had the same issue... and was able to solve it really simply. KNIME has a knime.ini file, which holds the parameters KNIME uses when it executes.
The real issue is that the JDBC driver is set to a fetch size of 10. By default, when Oracle JDBC runs a query, it retrieves the result set 10 rows at a time from the database cursor. This is the default Oracle row fetch size, so whenever you read from the database you wait a long time while all the rows are retrieved.
The fix is simple: go to the folder where KNIME is installed, look for the file knime.ini, open it, and add the following lines at the bottom. They override the default JDBC fetch size, and then you will get the data in seconds.
-Dknime.database.fetchsize=50000
-Dknime.url.timeout=9000
Hope this helps!
See http://tech.knime.org/forum/knime-users/knime-performance-reading-from-a-database for the rest of this discussion and solutions.
I'm not sure if your question is about the performance problem or the SQL problem.
For the former, I had the same issue and only found a solution when I started searching for Eclipse performance fixes rather than KNIME performance fixes. It's true that increasing the Java heap size is a good thing to do, but my performance problem (and perhaps yours) was caused by something bad going on in the saved workspace metadata. Solution: Delete contents of the knime/workspace/.metadata directory.
As for the latter, note that the error message complains about 'LIMIT 0', which you didn't write: it looks like the Database Reader appends LIMIT 0 to your statement when it probes the result's metadata. Combined with your own LIMIT 1000 (or a trailing semicolon) that produces invalid SQL, so try removing the LIMIT clause from the query you enter in the node.
Again I come to you guys for your expertise and advice on an issue I'm having. Does anyone know how to detect whether a web page has been modified, using VB.NET? I need to set up a task that periodically (say once a week) scans the user-entered web pages; if a page's content has changed, I need to fire off an email to an individual saying that it changed (not the exact location of the change on the page). I'll be storing the HTTP status, the page data itself, and the date it was last modified. Of course this needs to be very fault tolerant, since it could be another week before the check runs again. Any help would be great. Thank you.
EDIT
New twist on this question, sorry. I had more time to think about what we wanted. Detecting ANY change on a web page would be kind of silly, since time-dependent elements of the page change every so often. Instead, what I would like to do is detect the documents on the page: for instance, Excel, Word, or PDF files that get changed on that page. I'd run a hash on these documents, then on some sort of schedule check whether new documents have been added or old documents have been modified. Any suggestions on how to detect the documents embedded on the page and run the hash? Thanks again!
As I mentioned in a comment, this sort of job is what checksums (also known as hash functions) were designed for.
Your code will look something like this:
- for each webpage of interest
    - pull webpage
    - calculate checksum of contents
    - is the current checksum different from the last checksum?
        - if yes, send email
    - store the new checksum and other appropriate data
The .NET Framework has a number of checksum implementations available. The two most popular are MD5 and SHA1.
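A minimal sketch of that loop, shown in Python for brevity (in VB.NET the same pieces live in System.Net and System.Security.Cryptography). The URL list and the in-memory store are placeholder assumptions; a real job would persist the checksums between runs:

import hashlib
import urllib.request

urls = ["http://example.com/page1"]  # placeholder pages to watch
last_checksums = {}                  # placeholder store; persist this in practice

def page_checksum(url):
    # Pull the page and hash its contents.
    with urllib.request.urlopen(url, timeout=30) as resp:
        return hashlib.sha1(resp.read()).hexdigest()

for url in urls:
    current = page_checksum(url)
    previous = last_checksums.get(url)
    if previous is not None and previous != current:
        print(f"{url} changed; send the notification email here")
    # Store the new checksum and other appropriate data.
    last_checksums[url] = current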
In addition to the checksum option, there are also various diff functions that achieve this and provide much more information than changed=true/false. This question has more info:
How to tell when a web page has changed by x% in VB.net?
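For the edit about embedded documents: the same checksum idea applies per file. A rough sketch (again in Python for brevity; the extension list, the page URL, and the link handling are simplifying assumptions) that pulls the page, collects links that look like Office/PDF files, and hashes each one:

import hashlib
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

DOC_EXTENSIONS = (".pdf", ".doc", ".docx", ".xls", ".xlsx")  # assumed list

class DocLinkParser(HTMLParser):
    """Collect href targets that look like document files."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.doc_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href.lower().endswith(DOC_EXTENSIONS):
                self.doc_urls.append(urljoin(self.base_url, href))

page_url = "http://example.com/downloads"  # placeholder page
with urllib.request.urlopen(page_url, timeout=30) as resp:
    parser = DocLinkParser(page_url)
    parser.feed(resp.read().decode("utf-8", errors="replace"))

# Hash each linked document; compare each digest against the stored one.
for doc_url in parser.doc_urls:
    with urllib.request.urlopen(doc_url, timeout=60) as resp:
        print(doc_url, hashlib.sha1(resp.read()).hexdigest())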
I want to make a small plugin for awesome WM that will show the number of unread messages pending in Thunderbird. I want to fetch that number by querying Thunderbird's SQLite database directly. The question is: which database, table, and fields should I query?
There are at least 15 databases under ~/.thunderbird/profile/, including ./global-messages-db.sqlite. In that one I tried the messageAttributes table, but without much success. I could not find developer documentation describing the attributes...
Any help here?
You will find what you need in the global-messages-db.sqlite file. If you look at the messages table, you will find a column jsonAttributes. It contains a JSON object mapping attribute ids to their values. The key 58 is the read status of a message, so if you find something like {"58": false} in this column, the message is still unread. But this database is not updated immediately when a new message is received. (It might even be updated only when you close Thunderbird; I am not sure about that.)
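For what it's worth, a sketch of that check in Python; the attribute id 58 is taken from the description above, the profile path is a placeholder, and (as noted) the database may lag behind the real mailbox state:

import json
import sqlite3

# Placeholder path; adjust to your actual ~/.thunderbird/<profile>/ directory.
db_path = "/home/you/.thunderbird/profile/global-messages-db.sqlite"

con = sqlite3.connect(db_path)
unread = 0
for (attrs,) in con.execute("SELECT jsonAttributes FROM messages"):
    if not attrs:
        continue
    # Attribute id 58 holds the read status; false means unread.
    if json.loads(attrs).get("58") is False:
        unread += 1
con.close()
print(unread, "unread messages (according to the Gloda database)")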
So, as you can see, finding unread messages that way is doing it the hard way. I would recommend instead creating a plugin that checks the server directly via IMAP or POP3.
For IMAP servers there already exists an awesome plugin in the Delightful extensions. I don't know of any POP3 plugin, and POP3 libraries for Lua also seem to be rare.
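If you go the IMAP route, the check itself is small. A Python sketch (host and credentials are placeholders; an awesome plugin would do the equivalent in Lua, or shell out to a script like this):

import imaplib

HOST, USER, PASSWORD = "imap.example.com", "you@example.com", "secret"  # placeholders

imap = imaplib.IMAP4_SSL(HOST)
imap.login(USER, PASSWORD)
imap.select("INBOX", readonly=True)

# SEARCH UNSEEN returns the ids of the unread messages.
status, data = imap.search(None, "UNSEEN")
unread_ids = data[0].split() if status == "OK" else []
print(len(unread_ids), "unread messages")

imap.logout()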
I have a Plone site that has a lot of data in it, and I would like to query the database for usage statistics; e.g., how many calendars have more than one entry, how many blogs per group have entries after a given date, etc.
I want to run the script from the command line... something like so:
bin/instance [script name]
I've been googling for a while now but can't find out how to do this.
Also, can anybody provide some help on how to get user-specific information, such as last login time and items created?
Thanks!
Eric
In general, you can query the portal_catalog to locate content by searching various indexes. See http://plone.org/documentation/manual/developer-manual/indexing-and-searching/querying-the-catalog and http://docs.zope.org/zope2/zope2book/SearchingZCatalog.html for an introduction to the catalog.
In some cases the built-in indexes will allow you to do the query you want. In other cases you may need to write some Python to narrow down the results after doing an initial catalog query.
If you put your querying code in a file called foo.py, you can run it via:
bin/instance run foo.py
Within foo.py, you can refer to the root of the database as 'app'. The catalog would then be found at app.site.portal_catalog, where 'site' is the id of your Plone site.
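A minimal sketch of what such a foo.py might look like; the site id 'site', the portal_type, and the date are assumptions for illustration:

# Run with: bin/instance run foo.py
# 'app' (the Zope application root) is injected by the run command.
from DateTime import DateTime

site = app.site                # 'site' is the id of your Plone site
catalog = site.portal_catalog

# Example: count items of one type created after a given date.
results = catalog(portal_type='News Item',  # assumed content type
                  created={'query': DateTime('2011/01/01'),
                           'range': 'min'})
print('%d matching items' % len(results))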
Finding information about users happens via a separate API (for the Pluggable Auth Service). I'd suggest asking a separate question about that.