I'm new to using APIs so bear with me!
I am trying to scrape journal articles from Scopus using the "rscopus" package in R (session info below). I have a working API key and institutional key, but when I run my query, it gets about 50% of the way through before I get this error:
"Error in get_results(query, start = init_start, count = count, verbose = verbose, :
Gateway Timeout (HTTP 504)"
Any idea on how to resolve this?
Session Info
R version 4.2.0 (2022-04-22 ucrt)
Running under: Windows 10 x64 (build 22000)
It could be that your request rate or the number of requests is exceeding the quotas set for your account.
Please try adjusting your settings.
If the problem persists, please contact Elsevier Data Support (datasupport@elsevier.com), who may be able to help you.
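A 504 midway through a long harvest often clears on retry, so one workaround is to fetch in smaller pages and retry failed requests with a backoff. Below is a minimal sketch: `scopus_search()` and its `count`/`max_count` arguments come from rscopus, while the retry helper and the example query are illustrative, not part of the package.

```r
# Retry helper: re-runs `fn()` up to `max_tries` times, sleeping with
# exponential backoff between failed attempts. Illustrative code, not
# part of the rscopus API.
with_retry <- function(fn, max_tries = 5, base_wait = 2) {
  for (i in seq_len(max_tries)) {
    result <- tryCatch(fn(), error = function(e) e)
    if (!inherits(result, "error")) return(result)
    if (i == max_tries) stop(result)
    Sys.sleep(base_wait^i)  # wait 2, 4, 8, ... seconds
  }
}

# Hypothetical rscopus usage: smaller pages make any single request
# less likely to hit the gateway timeout.
# library(rscopus)
# res <- with_retry(function()
#   scopus_search("TITLE-ABS-KEY(ecology)", count = 25, max_count = 200))
```

If the failures persist even with retries, throttling further (a `Sys.sleep()` between pages) is the next thing to try before contacting support.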
Calling a request (Request/Response, not Batch Execution) against an Azure Machine Learning web service fails with the error in the title. I updated the RCurl library to solve the problem, but that didn't help.
The R code for the request is taken from the "Request Response API Documentation for XXXXX".
This R code worked fine until January 2020 (it had worked for about 2 years).
What should I try next?
An R script to contact webservices deployed with Azure ML is available at the following link
https://studio.azureml.net/apihelp/workspaces/26d0e94d818b46c099320b38c0495493/webservices/ba6fb51adb48479dbda75cff0fd2c05a/endpoints/92c6c73f1dbc4c0baf57b075d6ea7f50/score
The "RCurl" library is deprecated for contacting Azure ML web services. Instead, the "curl" and "httr" R libraries should be used.
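A minimal httr-based sketch of such a call is below. The endpoint URL, API key, and input schema are placeholders; the body shape follows the generic `Inputs`/`GlobalParameters` format that classic Azure ML request-response services expect, so check your service's own documentation page for the exact schema.

```r
# Sketch of calling an Azure ML request-response endpoint with httr
# instead of RCurl. URL, key, and input schema are placeholders.
library(httr)
library(jsonlite)

# Build the request-response body as a JSON string.
build_request_body <- function(inputs) {
  toJSON(list(Inputs = inputs,
              GlobalParameters = structure(list(), names = character(0))),
         auto_unbox = TRUE)
}

call_azureml <- function(url, api_key, inputs) {
  resp <- POST(url,
               add_headers(Authorization = paste("Bearer", api_key),
                           `Content-Type` = "application/json"),
               body = build_request_body(inputs))
  stop_for_status(resp)
  fromJSON(content(resp, as = "text", encoding = "UTF-8"))
}
```

`stop_for_status()` turns HTTP errors into R errors, which makes failures like expired keys or moved endpoints easier to diagnose than RCurl's silent failures.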
After updating to macOS Sierra my fairly old and previously ok application has stopped sending NSLog messages directly to Console. Instead I have to view the messages in system.log.
Is this expected behaviour, or is there some migration I need to perform to restore the old behaviour?
I'm having trouble with the googleway package in R.
I am attempting to get driving distance for 159,000 records.
I am using a paid google cloud account and have set all quotas as unlimited.
I've attempted to use server keys and browser keys.
After multiple attempts the service returns a timeout message:
Error in open.connection(con, "rb") : Timeout was reached
It successfully returned x results before the timeout:
1) x ~= 5,000
2) x ~= 7,000
3) x ~= 3,000
4) x ~= 12,000
All tried on different days.
As you can see, none of these are anywhere near the 100,000/day quota.
We've checked firewall rules and made sure that the cause of the time out is not at our end.
For some reason the Google API service is cutting off the requests.
We have had no response from Google and we are currently on the bronze support package so we don't get any real support from them as a matter of course.
The creator of the googleway package is certain that there are no impediments coming from the package.
We're hoping someone out there may know why this is happening and how we could avoid it, so that we can run the distance matrix over our full list of addresses.
Using R version "Supposedly Educational".
Using Googleway package.
CHARSET=cp1252
DISPLAY=:0
FP_NO_HOST_CHECK=NO
GFORTRAN_STDERR_UNIT=-1
GFORTRAN_STDOUT_UNIT=-1
NUMBER_OF_PROCESSORS=4
OS=Windows_NT
PROCESSOR_ARCHITECTURE=AMD64
PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 60 Stepping 3, GenuineIntel
PROCESSOR_LEVEL=6
PROCESSOR_REVISION=3c03
R_ARCH=/x64
R_COMPILED_BY=gcc 4.9.3
RS_LOCAL_PEER=\\.\pipe\37894-rsession
RSTUDIO=1
RSTUDIO_SESSION_PORT=37894
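Whatever the cause of the cutoff, losing several thousand fetched results to a timeout is avoidable. One approach is to process the 159,000 records in small chunks and checkpoint each chunk to disk, so a timeout only costs the current chunk and the run can resume. A sketch, assuming `google_distance()` from googleway; the helper function, file layout, and `addr` data frame are illustrative:

```r
# Checkpointing sketch: fetch in small chunks and save each chunk to
# disk, so a mid-run timeout only costs the current chunk and the job
# can be resumed. Illustrative helper, not part of googleway.
run_chunks <- function(ids, fetch, chunk_size = 100,
                       out_dir = "chunks", pause = 1) {
  dir.create(out_dir, showWarnings = FALSE, recursive = TRUE)
  chunks <- split(ids, ceiling(seq_along(ids) / chunk_size))
  for (nm in names(chunks)) {
    f <- file.path(out_dir, sprintf("chunk_%06d.rds", as.integer(nm)))
    if (file.exists(f)) next          # resume: skip finished chunks
    saveRDS(fetch(chunks[[nm]]), f)
    Sys.sleep(pause)                  # breathe between batches
  }
  lapply(list.files(out_dir, full.names = TRUE), readRDS)
}

# Hypothetical googleway usage (key and data frame are placeholders):
# library(googleway)
# fetch <- function(idx) google_distance(origins = addr$from[idx],
#                                        destinations = addr$to[idx],
#                                        key = "YOUR_KEY")
# results <- run_chunks(seq_len(nrow(addr)), fetch)
```

Re-running the same call after a timeout picks up where it left off, because already-saved chunk files are skipped.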
I have developed a different implementation for connecting Google Maps and R:
install.packages("gmapsdistance")
You can try this one. However, take into account that in addition to the daily limits, there are limits on the number of queries even if you have a premium account (625 elements per request, 1,000 elements per second server-side, etc.):
https://developers.google.com/maps/documentation/distance-matrix/usage-limits
I think this might be the issue.
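Note that the per-request cap counts elements, i.e. origins × destinations, so the number of destinations you can send per request depends on how many origins you include. A small helper to illustrate the arithmetic (the 625 figure is taken from the usage-limits page above; verify the current limits for your plan):

```r
# How many destinations fit in one request given n origins, under an
# element cap of origins x destinations <= element_cap. The 625 value
# is from Google's documented premium-plan limit at the time.
max_destinations <- function(n_origins, element_cap = 625) {
  max(1, floor(element_cap / n_origins))
}

# e.g. with 25 origins per request, at most 25 destinations fit:
# max_destinations(25)
```

Splitting the 159,000 records into batches sized this way, with a pause between requests to stay under the per-second element limit, should keep each call inside the documented caps.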
At first I was getting an error that involved my signing identity. After signing out and then signing back in, I'm now getting the following errors. I think it has something to do with the certificates or the provisioning profile.
error: A cryptographic verification failure has occurred.
*** error: Couldn't codesign /Users/x/Library/Developer/Xcode/DerivedData/SeedDemo-elhrfqwwnjtcgucmplbbubsbcjcd/Build/Products/Debug-iphoneos/SeedDemo.app/Frameworks/libswiftCore.dylib:
codesign failed with exit code 1
This is a macOS Sierra problem that is affecting many other developers. Your options right now are to revert to El Capitan, or to wait and hope for a resolution in the next Sierra / Xcode 8 beta. I wish there were a better answer, but at this time (6/26/2016), those are the only known solutions.
Some developers are reporting this fixed in macOS Sierra Beta 2 and Xcode 8 Beta 2, but I have not yet confirmed on my spare Mac that this fixes the problem.
Also, some developers have reported that resetting the keychain fixes it. In the Keychain Access app, go to "Preferences..." and click "Reset My Default Keychain".
I have an ASP.NET MVC application running on Mono 4.0.5 under Ubuntu 15.04. The application works as expected while internet access is available, but if the OS is restarted on a network without an internet connection, I get the following error:
I have tried updating machine and user certificate stores without any success using "mozroots --import --sync --machine".
It should be noted that this error only occurs on the Login page (using Forms Authentication with MySQL provider with "requireSSL" set to "false").
I don't use SSL on any of my pages and don't have it enabled/configured in Apache/Mod_Mono configuration. The LoginController doesn't make any (e.g. HTTPS) requests either.
Also, I've tried running the application through XSP4, which produced exactly the same behavior.
Any help would be much appreciated...
After inspecting the logs, I noticed that the system date was set to 01/01/1970. After updating the date and restarting Apache, everything worked. I guess in my case NTP was updating the date/time on every boot and, without an internet connection, the clock was falling back to the Unix epoch.