library(Quandl)
library(xts)
Quandl.api_key("your_api_key")
cl <- Quandl("CME/CLU2022", type = "xts")
This data ends on June 30, 2022. As of today (7/14/2022), it should have data through 7/13/2022. Thank you!
I experienced the same issue. I have seen no price updates past June 30, 2022.
Unfortunately, it looks like Nasdaq has acquired Quandl and they are in the process of deprecating the CME EOD price database/API.
From Nasdaq data support:
Kindly be advised that the free CME futures database has been deprecated. Moving forward, the data will no longer be updated and may no longer be accessed. We're very sorry for any inconvenience caused by the deprecation of the free CME futures database, and we don't have an equivalent database available on Nasdaq Data Link at the moment.
Please note that free data feeds may get deprecated at any time for various reasons, e.g. they may no longer meet our quality standards, they get copyrighted, technical issues prevent us from obtaining the data, etc.
Unfortunately, we do not have a similar database on Nasdaq Data Link that may serve as an alternative.
I found this site which allows me to scrape the announcement title and date. However, the number of announcements is limited to 20. See the image below. Does anyone know where I can scrape the title and date of each announcement for the fiscal year 2021?
https://www.asx.com.au/asx/1/company/CBA/announcements?count=50&market_sensitive=false
The API is limited to 20 results. It is an undocumented API; other than spending time experimenting with it to find a way around the limit, they don't provide any method of getting results beyond those 20.
If you want more, you're going to have to find another service, which likely involves paying or writing more code.
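In case it helps, here is a minimal sketch (Node.js 18+ with the built-in fetch) of the kind of request involved; the response field names are assumptions and may differ, so inspect the JSON before relying on them. Whatever count you pass, the endpoint returns at most 20 announcements.

```js
// Minimal sketch: query the undocumented ASX announcements endpoint.
// Assumption: the response is JSON with a `data` array of announcements;
// the field names below are illustrative guesses, not documented.
const url = "https://www.asx.com.au/asx/1/company/CBA/announcements" +
            "?count=50&market_sensitive=false";

async function fetchAnnouncements() {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const body = await res.json();
  // Even with count=50 in the query string, at most 20 items come back.
  for (const item of body.data ?? []) {
    console.log(item.document_release_date, item.header_text);
  }
}

fetchAnnouncements().catch(console.error);
```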
I'm getting campaigns/adGroups reports from the Sponsored Brands/Sponsored Products Amazon Advertising API. When I select a reportDate older than 60 days, I get the error "Report date is too far in the past. Reports are only available for 60 days." (code 406). Is it really not possible to get older reports? Or do older reports need to be queried differently? Also, is it possible to get a report for a time period longer than one day in one request?
The documentation describes the reportDate parameter as "The date for which to retrieve the performance report in YYYYMMDD format. The time zone is specified by the profile used to request the report. If this date is today, then the performance report may contain partial information. Reports are not available for data older than 60 days." - but does this apply to all reports, always?
It seems strange to me, as other services normally offer more than 2 months of stats, and there is also a note in the documentation that "Note: New-to-brand metrics are calculated from November 1, 2018. If a report date is requested earlier than this date, the metrics will be calculated from November 1, 2018."
Thank you for the explanation and your help!
Ela
No, you cannot get data older than 60 days through the API. The data does exist within Amazon's databases, but cannot be accessed via API.
If you have a vendor manager contact or something similar, it's theoretically possible to request this data from them, but they'd probably only do it as a one-off for a large client.
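As a small illustration of working within that constraint, here is a hedged sketch (the helper name isReportDateAvailable is purely illustrative) that checks whether a YYYYMMDD reportDate falls inside the 60-day window before you send the request. And since the quoted parameter description covers a single date, a longer period generally means one request per day.

```js
// Illustrative helper: check that a YYYYMMDD reportDate is within the last
// 60 days before requesting a report from the Amazon Advertising API.
function isReportDateAvailable(reportDate, maxAgeDays = 60) {
  const year = Number(reportDate.slice(0, 4));
  const month = Number(reportDate.slice(4, 6)) - 1; // JS months are 0-based
  const day = Number(reportDate.slice(6, 8));
  const requestedUtc = Date.UTC(year, month, day);
  const ageDays = (Date.now() - requestedUtc) / (1000 * 60 * 60 * 24);
  return ageDays >= 0 && ageDays <= maxAgeDays;
}

console.log(isReportDateAvailable("20220701")); // true/false depending on today's date
```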
I'm scratching my head over a specific setup for member-only content (currently using Woo Memberships and Woo Subscriptions).
The project has a large number of downloadable products which are available for purchase as one-offs. So far so good :)
However, those downloads should be available for free to paid members - a bit more tricky.
Besides that, only downloads published during the membership period should be free - every product has a custom field containing the month and year the product belongs to.
For example, if I'm an active member since Jan 2019 - I get free access to downloads for Jan 2019, Feb 2019 and so on.
If I used to be a member from Apr 2018 till Dec 2018 - I get free access to downloads for Apr 2018, May 2018 and so on.
Any tips on how to build such a setup? Should I consider EDD (or maybe something else) instead of Woo? I'm almost ready to begin modifying https://woocommerce.com/products/woocommerce-subscription-downloads/ for this, but thought maybe someone else has done something similar before?
A little late to the party on this one, but I think that a plugin called S2Member might be really good for this case.
I've been banging my head against the keyboard for about a day now trying to figure out this time variance with momentjs.
I have an API endpoint that sets the time the request came in by calling
var now = moment().valueOf();
Then, it finds the beginning of the year by simply calling
moment(now).startOf('year');
After about 48 hours of a run seeding our DB with test data, I noticed that the timestamp set in the database is December 31, 2015 (1451624400000).
When I run the program locally, the timestamp is January 1, 2016 (1451635200000).
When I log into our servers, create a test script to find the variance, and log the output, the timestamp is January 1, 2016.
Nothing is touching the time except for this one method in the API.
Is there some reason why momentjs would think the start of the year is December 31, 2015? If so, how do I change it? Haven't been able to find any help on this so far, and the moment.js docs explicitly say the start of the year is January 1 of the year.
I know this question has been around for quite some time.
I faced a similar issue, and the solution was to format the result:
var startOfThisYear = moment().startOf('year').format();
Thanks for the help, all.
It was a timezone issue, after all. I was overlooking a few things and had myself convinced I was testing consistently.
I had to configure the timezone on our Ubuntu machines with
sudo dpkg-reconfigure tzdata
We will also be setting up NTP servers to help prevent this kind of variance in the future.
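For anyone hitting the same variance, here is a small sketch using moment-timezone that reproduces both values from the question. The specific zones (US Eastern for the server, US Pacific for the local machine) are assumptions for illustration; any pair of zones at UTC-5 and UTC-8 on January 1 gives the same numbers.

```js
// Sketch showing how the machine's timezone changes what startOf('year') returns.
// Zone names are illustrative assumptions, not taken from the original setup.
const moment = require('moment-timezone');

// Start of 2016 as seen from a machine configured for US Eastern time:
const eastern = moment.tz('2016-06-15', 'America/New_York').startOf('year');
console.log(eastern.valueOf()); // 1451624400000 (Jan 1 2016 00:00 EST)

// The same call on a machine configured for US Pacific time:
const pacific = moment.tz('2016-06-15', 'America/Los_Angeles').startOf('year');
console.log(pacific.valueOf()); // 1451635200000 (Jan 1 2016 00:00 PST)

// Reading the Eastern value back on the Pacific machine shows December 31, 2015:
console.log(moment.tz(1451624400000, 'America/Los_Angeles').format());
// 2015-12-31T21:00:00-08:00
```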
It means I can't see the traffic I got today. Also, is this specific to the API, or does it apply to the overall Analytics system?
The reason Google recommends this is because for most of the data, there is about a 24 hour delay before you see it in reports or have it available for pulling with the API. The extra 24 hours on top of that is a buffer for insurance.
So if you look at a report or pull data with the API from, say, 12 hours ago, and then wait an hour or so and pull the data with the same ranges/metrics/etc., the numbers won't match up, because by then more data will have become available. But it's data that was already there (people didn't take a time machine into the past and visit your site, obviously)... it just hadn't yet been processed and made available through the report/API.
A delay in data for reports (or through an API) is not unique to GA. Different reporting tools have different "lags" in data availability, depending on how their databases are set up, how they process the data, how much you are paying for the service, etc. For instance (these are the 4 major tools I've used):
Yahoo Web Analytics data is more or less real-time
Adobe/Omniture SiteCatalyst is, they say, real-time, but in practice I've seen it take anywhere from instant to an hour
WebTrends has a 24 hour delay
GA has a 24 hour delay
But this isn't as big a deal as you might think. Most companies look at reports by the week, month, quarter, year, so really the delay isn't a problem for the people that matter. The only people that really feel it are the code implementers who have to sit there and wait to see data come in when they are trying to QA an implementation or debug when there is a potential problem.
But even then there are a lot of tools out there that let you see in real-time what is physically being sent to the tool (like firebug, charles proxy, etc...), which greatly helps in QAing. It doesn't really help as far as QAing stuff that requires settings/alterations within the tool's interface, but still, it's a big help.