Google AppMaker: data import not working from sheet

I am trying to import data from a sheet into an AppMaker app, and all I can get is the following error:
JDBC backend failure. More information:
Error while executing SQL statement:
Incorrect string value: '\xC2\xA0' for column
'business_name' at row 1. 0 records imported
I know this has something to do with UTF-8 or character sets, but I have no clue how to solve it!
The spreadsheet is dead simple: just address info columns.

I had to recreate the Cloud SQL database and specifically choose the character set:
utf8mb4
Cloud SQL defaults to utf8. I could then import the data.
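For context, '\xC2\xA0' is the UTF-8 encoding of a non-breaking space, which Sheets often inserts in place of a regular space. An alternative to recreating the database is cleaning the data before import. A minimal sketch (the column names and row values here are hypothetical, not from the original sheet):

```python
def clean_cell(value: str) -> str:
    """Replace non-breaking spaces (U+00A0) with regular spaces and trim."""
    return value.replace("\u00a0", " ").strip()

# Hypothetical row mirroring the 'business_name' column from the error.
row = {"business_name": "Acme\u00a0Corp", "city": "Springfield"}
cleaned = {k: clean_cell(v) for k, v in row.items()}
print(cleaned["business_name"])  # Acme Corp
```

This sidesteps the charset mismatch entirely, though switching to utf8mb4 is still the more robust fix since it also covers emoji and other characters outside MySQL's legacy utf8 subset.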

Related

Dump to Storage Container failed ingestion in ADX

I'm having problems when I ingest data from an IoT Hub into my Azure Data Explorer cluster.
When I run the command on ADX to see the errors:
.show ingestion failures | where FailedOn >= make_datetime('2021-09-09 12:00:00')
I get the error message:
BadRequest_NoRecordsOrWrongFormat: The input stream produced 0 bytes. This usually means that the input JSON stream was ill formed.
There is a column with IngestionSourcePath, but it seems to be an internal URI of the product itself.
I read in another Stack Overflow question that there is a command in ADX to dump the failed ingestion blob into a container, with this syntax:
.dup-next-failed-ingest into TableName to h#'Path to Azure blob container'
The problem is that this command is not covered in Microsoft's documentation.
The questions are:
What is the full syntax of this command?
Can you show me some examples?
What permissions are needed to run this command on ADX and on the blob container?
Is there another command to remove this dump after fixing the ingestion errors?
The full syntax of the command is:
.dup-next-failed-ingest into TableName to h#'[Path to Azure blob container];account-key'
or
.dup-next-failed-ingest into TableName to h#'[Path to Azure blob container]?SAS-key'
We will add the documentation for this command.
The error you encountered most likely indicates that the JSON flavor you are using does not match the flavor you specified for the data connection, or that the JSON objects are not syntactically valid. My recommendation would be to make sure you use "MultiJSON" as the data connection format for any JSON payloads.
When looking at the interim blobs created using this command, please keep in mind that you will not be looking at the original events sent into the IoT Hub, but batches of these events, created by ADX internal batching mechanism.
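To illustrate why the format choice matters, here is a sketch (not ADX's actual parser, just an analogy) of how a strict one-object-per-line "JSON" parser miscounts a batched payload that a "MultiJSON" parser would accept:

```python
import json

def count_records_linedelimited(payload: str) -> int:
    """Count records assuming one JSON object per line (strict 'JSON' format)."""
    count = 0
    for line in payload.splitlines():
        line = line.strip()
        if not line:
            continue
        try:
            json.loads(line)
            count += 1
        except json.JSONDecodeError:
            return 0  # any bad line -> treat the whole stream as ill formed
    return count

# One event per line: counted correctly.
print(count_records_linedelimited('{"deviceId": "d1"}\n{"deviceId": "d2"}'))  # 2

# A batched payload (an array of events, as ADX batching may produce):
# parsed line by line it reads as one value, not two records.
print(count_records_linedelimited('[{"deviceId": "d1"}, {"deviceId": "d2"}]'))  # 1
```

The same mismatch in the real pipeline can surface as "0 bytes / ill formed" rather than a partial count, which is why switching the data connection format to MultiJSON is the first thing to try.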

How to create Hive connection in airflow to specific DB?

I am trying to create a Hive connection in Airflow that points to a specific database. I tried to find the params in HiveHook and tried the below in the extra options:
{"db":"test_base"}, {"schema":"test_base"} and {"database":"test_base"}
But it looks like nothing works and it always points to the default db.
Could someone help me point out what parameters we can pass in extra_options?
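One thing worth checking (an assumption on my part, not confirmed in this thread): Airflow's Hive hooks generally read the database from the connection's schema field rather than from the extras JSON. A hedged sketch using the Airflow 2.x CLI, where the connection id, host, and port are placeholders for your environment:

```shell
# Put the database name in the connection's schema field instead of extras.
# Connection id, host and port below are hypothetical examples.
airflow connections add hive_test_base \
    --conn-type hiveserver2 \
    --conn-host hive.example.com \
    --conn-port 10000 \
    --conn-schema test_base
```

If the hook still lands in the default db, compare against what `airflow connections get hive_test_base` reports for the schema field.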

Why are the output value lengths getting reduced when using DB links in Oracle and ASP.NET?

We are retrieving output from a table over a DB link by executing a stored procedure with input parameters. This worked previously, and we got the output in the ASP.NET application. But now we've noticed that outputs coming through the DB link are getting trimmed: for example, if the status is 'TRUE', we get 'TRU'. Why are the output values getting trimmed? The only change we made recently was changing the type of one input parameter from NUMBER to VARCHAR on the receiving remote side, but I don't think that is the issue. When we execute the stored procedure directly on the remote table, it gives the proper output; it is only through the DB link that the outputs are trimmed. Does anyone have any idea about this issue?
My Oracle client was having issues. I reinstalled it and it worked fine. Only my system was having issues, so I decided to reinstall.

ORA-14102: only one LOGGING or NOLOGGING clause may be specified

While importing an Oracle schema from a dump file, I am getting the below error while creating tables:
ORA-14102: only one LOGGING or NOLOGGING clause may be specified.
I see this error for several tables created from the dump file.
How can I enable or disable LOGGING/NOLOGGING at the schema level before I start the import?
When performing an Oracle database export with the expdp of Oracle 11gR2 (11.2.0.1) and then importing it into the database with impdp, the following error messages appear in the import log file:
ORA-39083: Object type INDEX failed to create with error:
ORA-14102: only one LOGGING or NOLOGGING clause may be specified
This is a known Oracle 11gR2 issue. The problem is that DBMS_METADATA.GET_DDL returns invalid syntax for an index it creates, so during index creation both the NOLOGGING and LOGGING keywords appear in the DDL. Download and apply Patch 8795792 from Oracle to resolve this issue.
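If applying the patch is not immediately possible, a commonly cited workaround (my addition, not part of the original answer) is to have impdp strip segment attributes, including the LOGGING/NOLOGGING clause, from the generated DDL. Credentials, directory, and dump file names below are placeholders:

```shell
# TRANSFORM=SEGMENT_ATTRIBUTES:n drops storage, tablespace and
# LOGGING/NOLOGGING clauses from the imported DDL, so the doubled
# LOGGING keyword never reaches the database.
impdp scott/tiger DIRECTORY=dump_dir DUMPFILE=schema.dmp \
      TRANSFORM=SEGMENT_ATTRIBUTES:n
```

Note that this also discards tablespace and storage settings, so objects will be created with the target schema's defaults; the patch remains the cleaner fix.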

What connection string to use to read sqlite db from powerpivot using SQLite ODBC driver

I'd like to import the data contained in a SQLite file into PowerPivot. I downloaded an ODBC driver for SQLite (http://www.ch-werner.de/sqliteodbc/) to accomplish this. In PowerPivot I selected "Home" > "Get External Data" > "From Other Sources", scrolled down to "Others (OLEDB/ODBC)", selected it, and clicked Next.
For the connection string, I found this website: http://www.connectionstrings.com/sqlite and tried the string suggested at the bottom for the SQLite3 ODBC Driver:
DRIVER=SQLite3 ODBC Driver;Database=c:\Chinook_Sqlite.sqlite;LongNames=0;Timeout=1000;NoTXN=0;
SyncPragma=NORMAL;StepAPI=0;
(I'm using a sample database that I put at the root of my c:. The db is from here: http://chinookdatabase.codeplex.com/releases/view/55169 )
With this connection string, when I test the connection I get the following error message:
The test connection failed because the provider could not be initialized. If you contact Microsoft support about this error, provide the following message: Failed to connect to server. Reason: Provider information is missing from the connection string. Add the provider information and try again.
I understand that the driver I installed is not found, but I don't know how to correct the connection string to point to the driver dll.
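Before fighting the connection string, it can also help to rule out the file itself. A minimal sketch using Python's built-in sqlite3 module to check that a file opens as a real SQLite database (the Chinook path is the sample from the question; any path works):

```python
import sqlite3

def is_valid_sqlite(path: str) -> bool:
    """Return True if the file at `path` opens as a SQLite database."""
    try:
        with sqlite3.connect(path) as conn:
            # Any valid database answers this pragma; a non-SQLite file
            # raises sqlite3.DatabaseError ("file is not a database").
            conn.execute("PRAGMA schema_version;").fetchone()
        return True
    except sqlite3.DatabaseError:
        return False

# Example: is_valid_sqlite(r"c:\Chinook_Sqlite.sqlite")
```

If this returns True, the file is fine and the problem really is the ODBC/OLEDB layer, as the accepted answer below addresses.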
This solution came after many hours of research and trial-and-error. Though it came 2 years late, I am putting it up to help others trying to import information to Power Pivot 2013 from SQLite.
Step 1: Install SQLite ODBC Driver from here.
Step 2: Create a DSN by opening Windows' 'ODBC Data Sources Administrator' (you can find it under Windows > Administrative Tools). See here and here for more information. I have tried creating the DSN under both 'User DSN' and 'System DSN'; both work fine with Power Pivot.
Step 3: Open Power Pivot and do the following:
Click 'From other Sources' > 'Others (OLEDB/ODBC)' > Click on 'Build' button >
Under the 'Provider' tab > Select 'MS OLE DB Provider for ODBC Sources' > In 'Use Data Source Name', select the DSN you created in Step 2 and add any other parameters. At this point, you can test the connection and it should say 'Test Connection Succeeded'.
Once you click 'OK', you should see the Connection String automatically generated. Mine was: 'Provider=MSDASQL;Persist Security Info=False;DSN=SQLiteTest'.
Follow the next few steps to import your data from SQLite.
You need something like this:
Provider=MSDASQL.1;Persist Security Info=False;Mode=ReadWrite;Initial Catalog=C:\XXX.db;DSN=SQLite3 Datasource