YQL "The current table has been blocked” - web-scraping

I'm trying to query my self-written YQL table. If I run the table from the YQL console, everything works fine. But if I call the table via URL from a browser or an application, the following error appears:
The current table 'yahoo.finance.quant' has been blocked. It exceeded the allotted quotas of either time or instructions
The documentation says 1,000 queries per hour are allowed. I definitely didn't exceed that limit. Does anyone have an idea how to resolve this?
And if not, is there any good alternative to YQL?
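For reference, a self-made or community table is normally called through the public REST endpoint in a form roughly like this (shown unencoded for readability; the symbol and the env store URL are placeholders for wherever your table definition lives):

https://query.yahooapis.com/v1/public/yql
  ?q=select * from yahoo.finance.quant where symbol='AAPL'
  &env=store://datatables.org/alltableswithkeys
  &format=json

Note that the error text mentions quotas "of either time or instructions", which sounds like the per-execution resource limits of the table itself rather than the hourly call quota, so the table's own fetching logic may be what is being throttled.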

Authenticating Google Cloud Storage in R Studio

I know a similar question has been asked (link), but the response didn't work for me.
TLDR: I keep running into errors when trying to authenticate Google Cloud Storage in RStudio. I'm not sure what is going wrong and would love advice.
I have downloaded both the GCS_AUTH_FILE (I created a service account with service admin privileges and downloaded the key associated with it) and the GAR_CLIENT_WEB_JSON (I created an OAuth 2.0 Client ID and downloaded the associated JSON file).
I've tried authenticating my Google Cloud Storage in several ways and hit different errors.
Way 1 - automatic setup:
library(googleCloudStorageR)
gcs_setup()
Then I select any one of the options and get:
Error in if (file.exists(local_file)) { : argument is of length zero
That error happens no matter which of the three options I select.
Way 2 - basic, following manual setup instructions from the package:
Sys.setenv("GCS_DEFAULT_BUCKET" = "my-default-bucket",
           "GCS_AUTH_FILE" = "/fullpath/to/service-auth.json")
gcs_auth()
In this case, GCS_AUTH_FILE is the file that I mentioned at the beginning of this post, and the GCS_DEFAULT_BUCKET is the name of the bucket. When I run the first line, it seems to be working (nothing goes awry and it runs just fine), but when I run gcs_auth() I get taken to a web browser page that states:
"Authorization Error
Error 400: invalid_request
Missing required parameter: client_id"
Way 3 - following the method from the post linked above:
This way involves manually setting the .Renviron file with the GCS_AUTH_FILE and GAR_CLIENT_WEB_JSON locations, and then running gar_auth(). And yet again, I get the exact same error as in Way 2.
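For reference, the .Renviron entries for this method look roughly like this (a sketch following the googleCloudStorageR conventions; the paths and bucket name are placeholders, and R needs a restart after editing so they are picked up):

GCS_AUTH_FILE="/fullpath/to/service-auth.json"
GAR_CLIENT_WEB_JSON="/fullpath/to/client-web.json"
GCS_DEFAULT_BUCKET="my-default-bucket"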
Any ideas about what could be going wrong? Thanks for your help. I wasn't sure how to put in totally reproducible code in this case, so if there is a way I should do that, please let me know.

How to solve a data source error when loading Google Analytics data in Power BI?

I would like to load data from Google Analytics into Power BI.
After transforming the data in the Query Editor, I apply the changes.
At first, I see the message 'Waiting for www.googleapis.com' and the number of rows increases.
After a while, I get the following error message:
Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: [DataSource.Error] There was an internal error..'
Rows with errors have been removed in one of the steps and I have a stable Internet connection.
Does anyone have suggestions on how to solve this?
I was also facing this kind of refresh issue. First, go to the Query Editor and verify the data types, changing them where needed. If you still face the error after that, keep app.powerbi.com open while refreshing your Power BI dashboard. I followed the above steps and my issue is resolved now.
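For reference, verifying a column's type in the Query Editor corresponds to a step like this in the generated M code (the column names here are hypothetical):

= Table.TransformColumnTypes(Source, {{"Date", type date}, {"Sessions", Int64.Type}})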

Why is Bing Custom Search API throwing an error when I make a call through node-js?

I had been using the Bing Custom Search API for the past week with no problems thanks to the free trial, but today I tried upgrading to the S1 plan, since the API was sending error messages. I tried regenerating the key as well, but despite doing both of these things I was still getting errors and unable to use the API.
However, I was able to make calls using https://www.customsearch.ai/applications, where I could use my API key to test endpoints and get the results I expected. What baffles me is that my node.js code, which hasn't been modified aside from the subscription key, should still work with the upgraded plan, but it doesn't.
I should be able to help here. Let's do a little troubleshooting:
First, go to https://www.customsearch.ai/applications -> click on your instance name -> click on the "Production" tab at the top -> try making an API call on that page by providing the query and the subscription key you got. If this works, go to the next step.
On the page mentioned above you will have seen a Custom Configuration ID and a Subscription Key. Make sure those two are the same ones used in your node.js code (see the sketch below). Ideally this should work.
If it still doesn't work, please share your error code so that I can get a better understanding of the error you are getting.
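For step 2, a minimal Node.js (18+) sketch of the call looks roughly like this; the endpoint and parameter names follow the Bing Custom Search v7 docs, and the key, config ID and query are placeholders:

// Hedged sketch of a Bing Custom Search v7 request (key and config ID are placeholders).
const subscriptionKey = "YOUR_SUBSCRIPTION_KEY"; // from the Azure portal
const customConfig = "YOUR_CUSTOM_CONFIG_ID";    // from customsearch.ai
const query = "example query";

const url = "https://api.cognitive.microsoft.com/bingcustomsearch/v7.0/search"
  + `?q=${encodeURIComponent(query)}&customconfig=${customConfig}`;

fetch(url, { headers: { "Ocp-Apim-Subscription-Key": subscriptionKey } })
  .then((res) => {
    // A non-2xx status here surfaces key or quota problems (401/403).
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return res.json();
  })
  .then((data) => console.log(data))
  .catch((err) => console.error(err));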

Failing to upload JSON file through Chrome to Firebase Database

This is really frustrating. I have a 104 MB JSON file that I want to upload to my Firebase database through the web front end, but after a random period of time (I've timed it; it's not constant, anywhere from 2 to 20 seconds) I get the error:
There was a problem contacting the server. Try uploading your file again.
So I do try again, and it just keeps failing. I've uploaded files nearly this big before, and the limit for stored data in the Realtime Database is 1 GB, so I'm not even close to that. Why does it keep failing to upload?
This is the error I get in chrome dev tools:
Failed to load resource: net::ERR_CONNECTION_ABORTED
https://project.firebaseio.com/.upload?auth=eyJhbGciOiJIUzI1NiIsInR5cCI6…Q3NiwiYWRtaW4iOnRydWUsInYiOjB9.CihvjvLSlx43nOBynAJeyibkBRtygeRlG4Yo1t3jKVA
Failed to load resource: net::ERR_CONNECTION_ABORTED
If I click on the link that shows up in the error, it's a page with the words POST request required.
Turns out the answer is to ignore the web importer entirely and use firebase-import. It worked perfectly the first time and took only a minute to upload the whole JSON. It also has merging capabilities.
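For reference, a typical invocation looks roughly like this (flags as in the firebase-import README; the database URL and file name are placeholders, and depending on the version you may also need to pass a service account key via --service_account):

firebase-import --database_url https://project.firebaseio.com --path / --json data.json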
Using firebase-import as the accepted answer suggested, I get the error:
Error: WRITE_TOO_BIG: Data to write exceeds the maximum size that can be modified with a single request.
However, with the firebase-cli I was successful in deleting my entire database:
firebase database:remove /
It seems like it automatically traverses down your database tree to find requests that are under the size limit, then issues multiple delete requests. It takes some time, but it definitely works.
You can also import via a json file:
firebase database:set / data.json
I'm unsure if firebase database:set supports merging.
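For what it's worth, the CLI also has a partial-update command that merges the file's top-level keys into the existing data instead of replacing it (assuming a reasonably recent firebase-tools; check firebase database:update --help on your version):

firebase database:update / data.json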

Long query: Does anyone know what this error is all about?

Currently, I am querying a SQL Server database which has 6 million records.
A date range is specified in the query in order to filter the result. When the date range is short, e.g. 2 hours, the application displays the result with no problems.
But if the date range is a bit longer, e.g. a week, the application displays the following errors:
Finally, after I have accepted the two previous errors, and I click on any other section of the application, I get the following error:
Strangely, this behaviour only happens on the live server (running on IIS 7), whereas on localhost (Cassini) the application displays the query results regardless of the date range value.
Any thoughts on how to get around the problem will be greatly appreciated.
For your first problem, read the following article here:
When an error occurs on the server while the request is being processed, an error response is returned to the browser and a PageRequestManagerServerErrorException object is created by using the Error.create function. To customize error handling and to display more information about the server error, handle the AsyncPostBackError event and use the AsyncPostBackErrorMessage and AllowCustomErrorsRedirect properties. For an example of how to provide custom error handling during partial-page updates, see Customizing Error Handling for ASP.NET UpdatePanel Controls.
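A minimal sketch of that advice (assuming a ScriptManager with ID "ScriptManager1" on the page; wire the event in markup, then set the message in the code-behind):

<asp:ScriptManager ID="ScriptManager1" runat="server"
    OnAsyncPostBackError="ScriptManager1_AsyncPostBackError" />

protected void ScriptManager1_AsyncPostBackError(object sender, AsyncPostBackErrorEventArgs e)
{
    // Pass the real server-side exception text back to the client
    // instead of the generic PageRequestManager error message.
    ScriptManager1.AsyncPostBackErrorMessage = e.Exception.Message;
}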
For the second problem, maybe you can find a solution here:
Solution: Our web server could not resolve the URL of the back-end website. We needed to add a hosts file entry on our server to resolve the issue.
