JDBC SQLite support for allowMultiQueries - SQLite

I would like to demonstrate SQL injection using Java and SQLite. I'm attempting to execute two queries at the same time via SQL injection: the user prematurely ends the statement with ;, then adds another row using an INSERT statement.
MySQL's JDBC driver supports this via allowMultiQueries=true in the connection string.
How can I do this using SQLite?
TIA

allowMultiQueries is a MySQL-specific connection parameter.
I do not know of any SQLite JDBC driver that would allow multiple commands in one query.
Therefore, this kind of SQL injection attack is not possible.
Your best bet would be to construct a query like this:
SELECT * FROM Users WHERE Name = 'admin'--' AND Password = 'whatever'
or this:
SELECT *
FROM Users
WHERE Name = 'admin'
AND Password = 'whatever' or Name='admin'
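To make the mapping explicit, here is the kind of query template a vulnerable application would fill in by string concatenation, together with the user inputs that produce the two statements above (the template itself is an assumption about the application code; table and column names follow the examples):
-- Query template built by concatenating raw user input:
SELECT * FROM Users WHERE Name = '<name input>' AND Password = '<password input>'
-- Name input:     admin'--                    (everything after -- is ignored, so the password check disappears)
-- Password input: whatever' or Name='admin    (the OR clause matches the admin row regardless of password)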

Related

specify a database name in databricks sql connection parameters

I am using airflow 2.0.2 to connect with databricks using the airflow-databricks-operator. The SQL Operator doesn't let me specify the database where the query should be executed, so I have to prefix the table_name with database_name. I tried reading through the doc of databricks-sql-connector as well here -- https://docs.databricks.com/dev-tools/python-sql-connector.html and still couldn't figure out if I could give the database name as a parameter in the connection string itself.
I tried setting database/schema/namespace in the **kwargs, but no luck. The query executor keeps saying that the table is not found, because the query keeps getting executed in the default database.
Right now it's not supported. The primary reason is that if you have multiple statements, the connector could reconnect between their execution, and the effect of the USE would be lost. databricks-sql-connector also doesn't allow setting a default database.
Right now you can work around that by adding an explicit USE <database> statement to the list of SQLs to execute (the sql parameter can be a list of strings, not only a single string), as sketched below.
P.S. I'll take a look; maybe I'll add setting of the default catalog/database in the next versions.
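For example, each line below would be one string in the list passed to the sql parameter, so the USE runs first on the same connection as the actual query (database and table names are hypothetical):
USE my_database
SELECT * FROM my_table LIMIT 10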

Invalid column definition error when using four part name to access Oracle DB as SQL Server linked server

I have setup a linked server in SQL Server 2008 R2 in order to access an Oracle 11g database. The MSDASQL provider is used to connect to the linked server through the Oracle Instant Client ODBC driver. The connection works well when using the OPENQUERY with the below syntax:
SELECT *
FROM OPENQUERY(LINKED_SERVER, 'SELECT * FROM SCHEMA.TABLE')
However, when I try to use a four part name using the below syntax:
SELECT *
FROM LINKED_SERVER..SCHEMA.TABLE
I receive the following error:
Msg 7318, Level 16, State 1, Line 1
The OLE DB provider "MSDASQL" for linked server "LINKED_SERVER" returned an invalid column definition for table ""SCHEMA"."TABLE"".
Does anyone have any insight on what may be causing the four part name query to fail while the OPENQUERY one works without any problems?
The correct path to follow is to use the OPENQUERY function, because your linked server is Oracle: the four part name syntax works fine for MSSQL servers, essentially because they understand T-SQL.
With very simple queries a four part name can accidentally work, but rarely in a real scenario. In your case the SELECT * returns all the columns, and one of the column definitions is not compatible with SQL Server. Try another table, or try selecting a single simple column (e.g. a CHAR or a NUMBER); it may work without problems.
In any case, distributed queries can be tricky. The database does some optimization before executing commands, so it is important for it to know what it can and cannot do. If it thinks the linked server is MSSQL, it may take actions that do not work against Oracle.
When using the four part name syntax with a linked server other than MSSQL you will run into other problems as well, for example with built-in database functions (the Oracle to_date() function will not work, because MSSQL would expect its own convert() function, and so on).
So again, if the linked server is not MSSQL, the right choice is to use OPENQUERY and pass it a query whose syntax is valid in the linked server's SQL dialect.
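For example (table and column names here are made up), an Oracle-specific call such as to_date() stays valid as long as it travels inside the OPENQUERY text, because the whole inner query is executed by Oracle:
SELECT *
FROM OPENQUERY(LINKED_SERVER,
    'SELECT ORDER_ID, ORDER_DATE
     FROM SCHEMA.ORDERS
     WHERE ORDER_DATE >= to_date(''2013-01-01'', ''YYYY-MM-DD'')')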
If you use the OLE DB provider for Oracle, you can query without using OPENQUERY.

specify default schema for a database in db2 client

Do we have any way to specify a default schema for cataloged DBs in the DB2 client on AIX?
The problem is that when connecting to the DB, it takes the user ID as the default schema, and that's where it fails.
We have too many scripts doing transactions against the DB without specifying a schema in their DB2 SQL statements, so changing the scripts is not feasible.
Also, we can't create users to match the schema.
You can try issuing SET SCHEMA = <your schema> before executing your queries.
NOTE: I'm not sure whether this works (I don't have a DB2 database at hand at the moment, but it appears to), and it may depend on your DB2 version.
You can create a stored procedure that just changes the current schema and then set that SP as the connect procedure. You can test some conditions before making the schema change, for example whether the procedure is being executed directly from the AIX server with a given user.
You configure the database to run this SP each time a connection is established by setting connect_proc; a sketch follows the links below.
http://pic.dhe.ibm.com/infocenter/db2luw/v10r5/topic/com.ibm.db2.luw.admin.config.doc/doc/r0057371.html
http://pic.dhe.ibm.com/infocenter/db2luw/v10r5/topic/com.ibm.db2.luw.admin.dbobj.doc/doc/c0057372.html
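A minimal sketch of such a connect procedure, assuming hypothetical names (MYADMIN.CONNECT_PROC, schema APPSCHEMA, database MYDB); adjust the statement terminator (@ here) to whatever your CLP session uses:
CREATE PROCEDURE MYADMIN.CONNECT_PROC()
LANGUAGE SQL
BEGIN
  -- make every new connection resolve unqualified names against APPSCHEMA
  SET CURRENT SCHEMA = 'APPSCHEMA';
END@
-- then register it as the connect procedure (DB2 CLP):
-- db2 UPDATE DB CFG FOR MYDB USING CONNECT_PROC MYADMIN.CONNECT_PROC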
You can create aliases in the new user's schema that point to the tables in the other schema (example after the links). Refer to these links:
http://pic.dhe.ibm.com/infocenter/db2luw/v10r5/topic/com.ibm.db2.luw.sql.ref.doc/doc/r0000910.html
http://bytes.com/topic/db2/answers/181247-do-you-have-always-specify-schema-when-using-db2-clp
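For example (user, schema, and table names are hypothetical), an alias created in the connecting user's schema forwards unqualified references to the real table:
CREATE ALIAS NEWUSER.ORDERS FOR APPSCHEMA.ORDERS
-- a script connected as NEWUSER that refers to ORDERS now resolves to APPSCHEMA.ORDERS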

SP Not Found When It Clearly Exists

I copied a database from a live MSSQL server to my local one and was able to log in correctly. I am having a problem, however, in that when it is time to call a stored procedure, the ASP.NET application keeps telling me the SP does not exist, when it clearly does.
I am using Windows authentication locally, but on the server I was using credentials; could this be the problem?
Also, all of the SPs have my online username attached to their name, as in username.StoredProcedureName.
Please help I have been trying to fix this for hours.
I just noticed that when I attempt to run the SP from the SQL Management Studio it works, but it appends the username to the SP such as:
USE [DBNAME]
GO
DECLARE @return_value int
EXEC @return_value = [username].[SPNAME]
SELECT 'Return Value' = @return_value
GO
If I remove the username, it says the same thing (SP not found). How do I get around this?
I suspect you are calling your stored procedure without specifying the schema. When calling a stored procedure (or accessing a table, view, etc.) that is not in the default schema your account is configured for (usually dbo), you need to explicitly include the schema, as in the SQL command below:
SqlCommand cmd = new SqlCommand("username.StoredProcedureName", mySqlConnection);
It's likely what Jason said. The solution has to do with rights and ownership. When you see the SP in the SQL Management Studio, under Programmability->Stored Procedures, your SP should have a prefix like "dbo." or "GateKeeper."
If the SP has "dbo." as the prefix, the user account with which you're connecting to the DB must be part of the database owners (dbo) group, otherwise you won't have access to it. So you can either add the user to that group, or create the stored procedure ("create procedure spBlahBlah as ..") using the account you plan to run the program under; when you call it you use "exec GateKeeper.spBlahBlah" to stipulate the Schema.StoredProcedureName.
Those are your two choices.

Diagnostics SQL which both Oracle and MSSQL understand

We envision a diagnostics process in an ASP.NET WebForms application (.NET 4, C#): we dispatch a diagnostics signal end-to-end from the UI into the database to verify that all layers of our web architecture are alive and well. Until now we supported Oracle and invoked
SELECT * FROM DUAL
at the end of the chain. Going forward we will also support MSSQL, where we will invoke
SELECT GETDATE()
Does anyone know a universal SQL statement that would work on any Oracle and MSSQL instance out of the box?
If all you are after is for a SQL statement to execute successfully, then you can use something benign like
SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES
See this link on INFORMATION_SCHEMA support
To use this query on Oracle, you would first have to create the schema and table, even if it only has one column and no data, just to get COUNT(*) working. If you are going that route, it may be even better to just create a dummy table and count from it rather than from INFORMATION_SCHEMA.TABLES. For example:
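A minimal sketch of that dummy-table approach, with a made-up table name; both the DDL and the probe query run unchanged on Oracle and on SQL Server:
-- one-time setup (any harmless, empty table will do)
CREATE TABLE diagnostics_ping (id INT)
-- the diagnostics query issued by the application
SELECT COUNT(*) FROM diagnostics_ping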
