FastExport script for Teradata - teradata

I am trying to pull data from Teradata and put it into Hadoop, and I have written a script to do so.
This is not a direct process: the data is first staged on the Hadoop node's local filesystem and then put into HDFS (a rough sketch of this two-step flow follows the error output below).
While running the script I get the following error:
0002 .LOGTABLE log_1;
**** 16:06:28 UTY1006 CLI error: 235, MTDP: EM_GSSINITFAIL(235): call to
gss_init failed.
**** 16:06:28 UTY2410 Total processor time used = '0 Seconds'
. Start : 16:06:28 - TUE AUG 20, 2013
. End : 16:06:28 - TUE AUG 20, 2013
. Highest return code encountered = '12'.
Can anyone tell me what the mistake is here? What does this error mean?
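The staged flow looks roughly like this (a rough sketch only; the script, file, and HDFS path names are illustrative):
# 1) run the FastExport job, which writes the extract to a local staging file
fexp < export_job.fx
# 2) copy the staged file from the local filesystem into HDFS
hadoop fs -put /tmp/export.dat /user/staging/export.dat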

The system you are running the FastExport script from is either missing the Teradata GSS libraries, has a corrupt installation of them, or has an incorrect PATH entry for them. The ICU and GSS libraries are security components used by the Teradata providers (CLI, ODBC, .NET, and JDBC). Without them you will not be able to connect to the Teradata system.
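A few quick checks on the machine running FastExport can narrow this down (a sketch assuming a Linux client; the package names and the /opt/teradata/client location are illustrative and vary by TTU version):
# Are the Teradata ICU/GSS packages installed at all?
rpm -qa | grep -iE 'tdicu|gss'
# Does the client library directory exist, and is it on the library search path?
ls /opt/teradata/client/*/lib* 2>/dev/null
echo "$LD_LIBRARY_PATH"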

Related

Error when launching Kibana: cannot execute binary file - undefined error

I'm very new to the ELK stack and was trying to add some security settings (username and password) to access Kibana following the instructions from the link below:
https://www.elastic.co/blog/getting-started-with-elasticsearch-security
At Step 4: Security in Kibana, once the yml file is modified, I try to launch Kibana from the terminal with the command ./bin/kibana, but it displays the following errors:
./bin/kibana: line 24: /usr/local/var/homebrew/linked/kibana-7.6.2-linux-x86_64/bin/../node/bin/node: cannot execute binary file
./bin/kibana: line 24: /usr/local/var/homebrew/linked/kibana-7.6.2-linux-x86_64/bin/../node/bin/node: Undefined error: 0
I think I've followed all the previous steps carefully and everything else worked so far.
I'm using a Mac and the error seems to be very basic. Any clue?
Thanks for the help.
Looks like you've downloaded the wrong architecture (Linux) of Kibana on your Mac.
This generally happens when the downloaded build does not match the system architecture, for example when running a 64-bit binary on a 32-bit machine.
Simple solution:
Download the macOS version of Kibana from here - https://www.elastic.co/downloads/kibana
Once downloaded, run ./bin/kibana in the Kibana directory.
This will start a local Kibana server on localhost:5601.
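To confirm the diagnosis before re-downloading, you can check the architecture of the bundled Node binary with file (the path is taken from the error message above):
# A Linux build reports an ELF executable; the macOS (darwin) build reports Mach-O
file /usr/local/var/homebrew/linked/kibana-7.6.2-linux-x86_64/node/bin/node
# Compare with your machine's OS and architecture
uname -sm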

BigQuery Crashlytics dataset schedule interval

We're currently looking into Firebase<>BigQuery (not sandboxed) for monitoring purposes. We've hooked up one of our projects using the Firebase integration and have gathered a few days worth of data.
The only issue is that the data is always a day behind, which makes sense since the transfer only runs every 24 hours. But trying to change the schedule through the bq CLI:
bq update --transfer_config \
--target_dataset='crashlytics' \
--schedule='every 2 hours' \
projects/p/locations/l/transferConfigs/c
results in a 400 error:
Bigquery service returned an invalid reply in update operation: Error reported by server with missing error fields. Server returned: {u'error': {u'status': u'INVALID_ARGUMENT',
u'message': u'Request contains an invalid argument.', u'code': 400}}.
Please make sure you are using the latest version of the bq tool and try again. If this problem persists, you may have encountered a bug in the bigquery client. Please file a bug
report in our public issue tracker:
https://issuetracker.google.com/issues/new?component=187149&template=0
Please include a brief description of the steps that led to this issue, as well as any rows that can be made public from the following information:
========================================
== Platform ==
CPython:2.7.16:Darwin-19.2.0-x86_64-i386-64bit
== bq version ==
2.0.53
== Command line ==
['/path/bq/bq.py', '--application_default_credential_file', '/path/e#mail.com/adc.json', '--credential_file', '/path/e#email.com/singlestore_bq.json', '--project_id=tde-psv-app', 'update', '--transfer_config', '--target_dataset=crashlytics', '--schedule=every 2 hours', 'projects/p/locations/l/transferConfigs/c']
== UTC timestamp ==
2020-02-24 08:47:23
== Error trace ==
Traceback (most recent call last):
File "/path/bq/bq.py", line 1116, in RunSafely
return_value = self.RunWithArgs(*args, **kwds)
File "/path/bq/bq.py", line 4615, in RunWithArgs
schedule_args=schedule_args)
File "/path/bq/bigquery_client.py", line 3984, in UpdateTransferConfig
x__xgafv='2').execute()
File "/path/bq/bigquery_client.py", line 810, in execute
BigqueryHttp.RaiseErrorFromHttpError(e)
File "/path/bq/bigquery_client.py", line 788, in RaiseErrorFromHttpError
BigqueryClient.RaiseError(content)
File "/path/bq/bigquery_client.py", line 2385, in RaiseError
raise BigqueryError.Create(error, result, [])
BigqueryInterfaceError: Error reported by server with missing error fields. Server returned: {u'error': {u'status': u'INVALID_ARGUMENT', u'message': u'Request contains an invalid argument.', u'code': 400}}
========================================
Unexpected exception in update operation: Bigquery service returned an invalid reply in update operation: Error reported by server with missing error fields. Server returned:
{u'error': {u'status': u'INVALID_ARGUMENT',
u'message': u'Request contains an invalid argument.', u'code': 400}}.
Please make sure you are using the latest version of the bq tool and try again. If this problem persists, you may have encountered a bug in the bigquery client. Please file a bug
report in our public issue tracker:
https://issuetracker.google.com/issues/new?component=187149&template=0
Please include a brief description of the steps that led to this issue, as well as any rows that can be made public from the following information:
We're getting the impression that this is simply not possible for this kind of dataset / Firebase project, but we can't seem to find a definitive answer on that.
Right now the data export is only available once per 24 hours. We are looking into changing this behavior. Please stay up to date on the Firebase blog for any announcements.
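For reference, the transfer configuration that Firebase created, including its current schedule, can be inspected with bq show (a sketch; substitute your real project, location, and config IDs for the placeholders from the question):
# Show the existing transfer config and its schedule
bq show --format=prettyjson --transfer_config projects/p/locations/l/transferConfigs/c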

Unable to communicate with the runtime for 'R' script in SQL Server 2017

I'm having trouble getting R to work on SQL Server 2017 on one server (I've successfully installed it on about 8 other servers). I've already installed the latest cumulative update.
When I execute a stored procedure that runs a simple hello world R script, I can see that LaunchPad.exe and rterm.exe are both running. After 60 seconds, however, I get the following error:
Msg 39012, Level 16, State 1, Line 0
Unable to communicate with the runtime for 'R' script. Please check the requirements of 'R' runtime.
STDERR message(s) from external script: Fatal error: creation of tmpfile failed -- set TMPDIR suitably?
This is the script that fails:
EXEC sp_execute_external_script
    @language = N'R', @script = N'print("hello")';
Any ideas on what I need to do to resolve this error?
The problem was that Named Pipes wasn't enabled for SQL Server. Enabling it and restarting the services solved my issue.
My assumption is that you applied the CU after the installation of Machine Learning Services? If so, the CU somehow messes up the folder permissions.
I wrote a blog post about how to fix it here. The blog post is about CU7, but it should apply to any CU.
I do not guarantee that it works, as I have seen other issues where the ML Services stop working; in those cases, what fixes it is a repair of the SQL Server installation.
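If you hit the same symptom, two quick checks from a command prompt may help confirm the state of the instance (a sketch using sqlcmd; add -S/-U/-P connection options for your instance as needed):
:: Is the instance listening on Named Pipes? Look for the
:: "Server named pipe provider is ready to accept connection" message.
sqlcmd -Q "EXEC sp_readerrorlog 0, 1, N'named pipe';"
:: Is external script execution enabled? value_in_use should be 1.
sqlcmd -Q "SELECT name, value_in_use FROM sys.configurations WHERE name = 'external scripts enabled';"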

Installing SQL Server R Services - error

I'm trying to install SQL Server R Services. I'm using SQL Server 2016 RC1, and I'm following this step-by-step tutorial: https://msdn.microsoft.com/en-us/library/mt604883.aspx
Everything seems to install fine, but I get the following error when testing an R script.
Msg 39021, Level 16, State 1, Line 1
Unable to launch runtime for 'R' script. Please check the configuration of the 'R' runtime.
Msg 39019, Level 16, State 1, Line 1
An external script error occurred:
Unable to launch the runtime. ErrorCode 0x80070490: 1168(Element not found.).
Msg 11536, Level 16, State 1, Line 1
EXECUTE statement failed because its WITH RESULT SETS clause specified 1 result set(s), but the statement only sent 0 result set(s) at run time.
I'm using the following code:
exec sp_execute_external_script @language = N'R',
    @script = N'OutputDataSet <- InputDataSet',
    @input_data_1 = N'select 1 as hello'
with result sets (([hello] int not null));
go
Any ideas as to what may be going wrong?
Thank You
I had the same issue initially. I had mistakenly skipped the post-installation steps, specifically the step to register the R runtime with SQL Server. See MSDN post: https://msdn.microsoft.com/en-us/library/mt590536.aspx
Try uninstalling with RegisterRExt first and then installing again. Only this worked for me:
"C:\Program Files\Microsoft SQL Server\130\R_SERVER\library\RevoScaleR\rxLibs\x64\RegisterRExt" /uninstall
Then
"C:\Program Files\Microsoft SQL Server\130\R_SERVER\library\RevoScaleR\rxLibs\x64\RegisterRExt" /install
Set User Account Control to never notify when apps try to install software or make changes to your computer.
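For completeness, the post-installation step referenced in the first answer boils down to enabling external script execution and then restarting the SQL Server and Launchpad services (a sketch via sqlcmd; add connection options for your instance as needed):
:: Enable external script execution, then restart the SQL Server and Launchpad services
sqlcmd -Q "EXEC sp_configure 'external scripts enabled', 1; RECONFIGURE WITH OVERRIDE;"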

Unable to create a table using ore.create

I executed my R program, and when I try to push the result to a table using
ore.create(score, table="xyz")
I'm getting the following error:
Error in .oci.GetQuery(conn, statement, data = data, prefetch = prefetch, :
ORA-12801: error signaled in parallel query server P007, instance XY.ab.dc.cd:abc (2)
ORA-06520: PL/SQL: Error loading external library
ORA-06522: /app/oracle/product/11.2.0/dbhome_1/lib/librqe.so: cannot open shared object file: No such file or directory
ORA-06512: at "RQSYS.RQROWEVALIMPL", line 20
ORA-06512: at "RQSYS.RQROWEVALIMPL", line 16
ORA-06512: at line 4
Please help me solve this issue; I have been trying to fix it for the past week without success, as I am new to this.
Any help is much appreciated.
This looks like a problem with your Oracle R Enterprise (ORE) installation.
The message indicates you are running on 11gR2. ORE requires 11.2.0.3 or higher, or 11.2.0.1 with a specific patch applied. Check this OTN Forum thread for details.
You need an Oracle Support contract to get hold of these patches. If you don't have a contract you will need to migrate to database 12c in order to use R.
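To confirm the diagnosis, check on the database server whether the library named in the error actually exists, and which exact patch level the database is at (a sketch; the library path comes from the error message above):
# Does the ORE server library from the error exist?
ls -l /app/oracle/product/11.2.0/dbhome_1/lib/librqe.so
# What exact database version/patch level is installed?
sqlplus -s / as sysdba <<'EOF'
SELECT banner FROM v$version;
EXIT;
EOF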
