Exact time for activities monitored by Microsoft Band

I want to know the exact times during the night when I was awake or was experiencing deep sleep.
The data that Microsoft exports in CSV format is summed up, while the data in the graph on the dashboard is not time-stamped.
Is there a way to get the exact times when I was awake from the Microsoft Band?

You may be able to get more raw data about sleep activities by pulling it from the Cloud API, though that's certainly not as convenient as using the dashboard.
You can find samples and documentation about the (preview) API here.
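For illustration, here is a minimal sketch of what such a pull might look like against the preview Microsoft Health Cloud API. The endpoint, parameter names, and response fields follow the preview samples and may have changed since; the OAuth token flow is omitted and the token is a placeholder.

```python
# Minimal sketch: pull sleep activities from the (preview) Microsoft Health
# Cloud API. Endpoint and parameter names follow the preview docs and may
# have changed; ACCESS_TOKEN is a bearer token obtained via Microsoft
# Account sign-in (flow not shown).
import requests

ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"  # placeholder

resp = requests.get(
    "https://api.microsofthealth.net/v1/me/Activities",
    params={
        "activityTypes": "Sleep",
        "activityIncludes": "Details",  # ask for per-segment detail
        "startTime": "2015-11-01T00:00:00Z",
        "endTime": "2015-11-02T00:00:00Z",
    },
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# Each sleep activity should carry timestamped segments (asleep/awake),
# which is the detail the dashboard graph does not export.
for activity in resp.json().get("sleepActivities", []):
    print(activity.get("startTime"), activity.get("endTime"),
          activity.get("sleepDuration"))
```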

Related

Azure Time Series Insights Gen2 slower than preview?

We have a couple of environments still running the Time Series Insights Preview version. It is really fast and we are very satisfied with it. However, new environments seem a lot slower with the official release. Warm-path extraction is a lot slower, but still doable, while cold-path extraction becomes unbearable.
EDIT: We need to add &storeType=WarmStore if we want to query warm data. Cool! This works really fast again! The question about the cold store still persists:
It is hard to compare the different environments, because the datasets are not exactly the same, but for our new environment we have about 4.5 TB of sensor data imported in TSI.
The following screenshot shows a query that tries to retrieve one minute of data for one device (each device only sends data every 10 seconds) in the far past of 2018. However, the server returns the call after 30 seconds with a continuation token, saying it couldn't retrieve all six values in time. Sometimes it manages to return all six values, but it still takes 30 seconds.
My internet download speed, while performing the query, was over 80 Mb per second, so that shouldn't be an issue either.
Is this something we should be worried about in the new release?
Please submit a support ticket through the Azure portal with all of these details and the product team will investigate.
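For reference, this is roughly what a warm-store query looks like with the parameter from the EDIT above. The environment FQDN, token, and time-series ID are placeholders; note that the warm store only holds data within its retention period, so a 2018 range like the one in the screenshot would still have to come from the cold store (the default when storeType is omitted).

```python
# Sketch of a TSI Gen2 Query API call with storeType=WarmStore.
# ENV_FQDN, TOKEN, and the time-series ID are placeholders.
import requests

ENV_FQDN = "YOUR_ENV.env.timeseries.azure.com"  # placeholder
TOKEN = "YOUR_AAD_TOKEN"  # Azure AD bearer token for the TSI resource

body = {
    "getEvents": {
        "timeSeriesId": ["device-001"],  # placeholder device ID
        "searchSpan": {
            # Must fall within the warm store's retention window.
            "from": "2021-06-01T00:00:00Z",
            "to": "2021-06-01T00:01:00Z",  # one minute, as in the question
        },
    }
}

resp = requests.post(
    f"https://{ENV_FQDN}/timeseries/query",
    params={"api-version": "2020-07-31", "storeType": "WarmStore"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=body,
)
resp.raise_for_status()
print(resp.json())
```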

Google Analytics Real-time + historical data

I work for a non-profit that needs to see how our fundraising efforts are going in 'real-time'.
We look at results in blocks of about a half hour - so we need to report on how we finished the last 24 hours or so and also where we're at in the current half-hour. We're accomplishing this through google analytics, as we have multiple fundraising streams all pointing to a common GA account.
I have tried using Data Studio to report against the GA API, but that connector does not seem to refresh at a reliable rate - sometimes it'll pull fresh data within a minute, sometimes it can take twenty minutes to report on recent transactions. I believe the 'real-time' API could be used to get fresher GA data, but as far as I can tell, it will only report 'live' data, not prior/historical data (say, from four hours ago). Does anyone know what API, if any, I could use to pull all data from history through the current datetime?
I apologize if this request is vague, but I'm just looking for a conceptual approach at this point to get the freshest data - preferably in one fell swoop (one API call). There is more complexity after data intake (I then have to compare it to goals we've set for each half-hour, amongst other nuances to the transactions themselves), so I wanted to start with this fundamental piece/question.
Thanks!
Given the context provided, I believe that the API solution would not be feasible. Among other reasons:
The real time API only offers a limited amount of dimensions and metrics. For example, e-commerce data is not available.
https://ga-dev-tools.appspot.com/dimensions-metrics-explorer/
https://developers.google.com/analytics/devguides/reporting/realtime/dimsmets
The standard intraday processing SLA for the Core Reporting API is < 24 hours for standard properties. The processing occurs on a best-effort basis, meaning that hourly availability can occur from time to time but cannot be guaranteed.
https://support.google.com/analytics/answer/7084038?hl=en
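As a quick illustration of the first point, this is roughly what a Real Time Reporting API (v3) call looks like; the view ID and key file are placeholders. Note that the rt: namespace has no transaction-revenue metric, which is why it doesn't fit a fundraising report.

```python
# Sketch: the Real Time Reporting API (v3) only exposes a small set of
# rt: dimensions/metrics; e-commerce data is not among them.
from googleapiclient.discovery import build
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    "key.json",  # placeholder service-account key
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analytics", "v3", credentials=creds)

result = analytics.data().realtime().get(
    ids="ga:12345678",          # placeholder view ID
    metrics="rt:activeUsers",   # no rt: equivalent of transaction revenue
    dimensions="rt:medium",
).execute()
print(result.get("totalsForAllResults"))
```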
As an alternative approach to the API solution, you could consider the use of an App + Web property which would allow you to stream event data in real time to BigQuery. However, this solution has some cost implications and would introduce you to a new tracking paradigm.
https://developers.google.com/analytics/devguides/collection/app-web/tag-guide
https://support.google.com/firebase/answer/6318765?hl=en
https://www.simoahava.com/analytics/getting-started-with-google-analytics-app-web/
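If you do go the BigQuery route, a query over the streaming intraday export might look roughly like this. The project and dataset names are placeholders; the schema shown (event_name, event_timestamp, ecommerce.purchase_revenue) is the standard App + Web / GA4 export schema.

```python
# Sketch of the BigQuery alternative: with an App + Web property linked to
# BigQuery, purchase events stream into an events_intraday_* table that can
# be queried within seconds of collection.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

sql = """
SELECT
  TIMESTAMP_MICROS(event_timestamp) AS event_time,
  ecommerce.purchase_revenue        AS revenue
FROM `my-project.analytics_123456789.events_intraday_*`
WHERE event_name = 'purchase'
  AND event_timestamp >=
      UNIX_MICROS(TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 MINUTE))
ORDER BY event_time
"""
for row in client.query(sql):
    print(row.event_time, row.revenue)
```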

How to extract Google Analytics historical data using APIs. Pros and cons?

I'm doing a quick proof of concept to understand the procedure for extracting historical data from Google Analytics, to be used later for offline data stitching to generate a holistic view of the data and its analysis. I have not found any detailed online documentation covering the pros and cons.
Would like to know any limitations on:
The time period for which data can be extracted or any limitation for max. calendar days?
Whether all dimensions/metrics can be extracted or any specific ones?
Will the data be real-time or sampled?
Can all data be pulled into a single table or separate ones?
Will it be available for both freeware and premium version?
The time period for which data can be extracted or any limitation for max. calendar days?
The start date cannot be before the launch of Google Analytics on 2005-01-01. Due to processing lag, extracting data that is newer than two days old can result in incomplete data; I recommend checking the isDataGolden flag on the response.
Requesting large date ranges can result in sampling, which cannot be prevented. It's best to request the data in small chunks.
Whether all dimensions/metrics can be extracted or any specific ones?
A list of the dimensions and metrics you can extract can be found here. Each request can contain a maximum of 7 dimensions and 10 metrics.
Will the data be real-time or sampled?
The Real-time API and the Reporting API are two different APIs. The Real-time API is not, to my knowledge, sampled, but since it only covers about five minutes of data, I find it hard to imagine anyone but really big websites hitting that problem even if it is.
Will it be available for both freeware and premium version?
Accessing the Google Analytics APIs is free; there is no charge. There are, however, limits on how much data you can extract in a given day.
By default your application can make a maximum of 50k requests per day. This can be extended.
Each view you are extracting from can receive a maximum of 10k requests per day. This cannot be extended.
See: limits and quotas for more info.
Note: I am a developer on a business intelligence application that extracts Google Analytics data. I can tell you that it's definitely doable.
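To make the above concrete, here is a sketch of pulling data in one-day chunks with the Reporting API v4, staying under the 7-dimension/10-metric limit and checking isDataGolden. The view ID and key file are placeholders.

```python
# Sketch: pull GA history in one-day chunks (to reduce sampling) and check
# the isDataGolden flag before trusting the numbers.
from datetime import date, timedelta

from googleapiclient.discovery import build
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    "key.json",  # placeholder service-account key
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

day = date(2020, 1, 1)
while day <= date(2020, 1, 7):
    d = day.isoformat()
    resp = analytics.reports().batchGet(body={
        "reportRequests": [{
            "viewId": "12345678",  # placeholder view ID
            "dateRanges": [{"startDate": d, "endDate": d}],
            # Well under the 7-dimension / 10-metric per-request limit.
            "dimensions": [{"name": "ga:date"}, {"name": "ga:medium"}],
            "metrics": [{"expression": "ga:sessions"}],
        }]
    }).execute()
    data = resp["reports"][0]["data"]
    # isDataGolden == True means the numbers will not change on a later pull.
    print(d, "golden:", data.get("isDataGolden"), "totals:", data.get("totals"))
    day += timedelta(days=1)
```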

Google Analytics API recommends running queries on dates that are at least 48 hours in the past for consistent results

Does that mean I can't see the traffic I got today? Also, is this specific to the API, or does it apply to the overall Analytics system?
The reason Google recommends this is that, for most of the data, there is about a 24-hour delay before you see it in reports or have it available for pulling with the API. The extra 24 hours on top of that is a buffer for insurance.
So if you look at a report or pull data with the API from, say, 12 hours ago, then wait an hour and pull the data with the same ranges/metrics/etc., the numbers won't match up, because by then more data will have become available. But it's data that was already there (people didn't take a time machine into the past and visit your site, obviously); it was just not yet processed and available for viewing through the report/API.
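In practical terms: if you need numbers that won't shift between pulls, cap your query range two days back, e.g.:

```python
# Compute the latest "safe" end date per the 48-hour recommendation.
from datetime import date, timedelta

safe_end = date.today() - timedelta(days=2)
print("Query ranges ending on or before:", safe_end.isoformat())
```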
A delay in data for reports (or through an API) is not unique to GA. Different reporting tools have different "lags" in data availability, depending on how their databases are setup, how they process the data, how much you are paying for the services, etc... for instance (these are the 4 major tools I've used):
Yahoo Web Analytics data is more or less real-time
Adobe/Omniture SiteCatalyst is... they say real-time, but in practice I've seen it take anywhere from instant to an hour
WebTrends has a 24 hour delay
GA has a 24 hour delay
But this isn't as big a deal as you might think. Most companies look at reports by the week, month, quarter, year, so really the delay isn't a problem for the people that matter. The only people that really feel it are the code implementers who have to sit there and wait to see data come in when they are trying to QA an implementation or debug when there is a potential problem.
But even then there are a lot of tools out there that let you see in real time what is physically being sent to the tool (like Firebug, Charles Proxy, etc.), which greatly helps in QAing. It doesn't really help as far as QAing stuff that requires settings/alterations within the tool's interface, but still, it's a big help.

How to update SQL Server database every 1 minute?

I have a SQL Server database which contains stock market quotes and other related data.
This database needs to be updated at regular interval, say 1 minute.
My question is:
How do I get stock quotes every 1 minute and update it to database?
I really appreciate your help.
Thanks!
You know, you're seriously asking the question from the wrong side. Like "I have a car, a Mercedes coupe - how can I find the best road from A to B?" Totally unrelated to the car.
Same with your question - this is not a SQL or even an ASP.NET question to start with. The solution is independent of both the SQL Server used and your web technology. Your main question is:
How do I get stock quotes every 1 minute and update the database?
Here we go. I assume you (a) talk of US stocks and (b) mean all of them, not a handful. One minute is too small an interval to make scraping things like yahoo.com feasible - the main problem here is that there are thousands of stocks (actually more in the tens of thousands), and you don't want to go to Yahoo scraping thousands of pages per minute.
At the same time, an end-retail-user data feed provider will not work. They support X symbols at a time, with X typically in the low hundreds, sometimes upgradable to 500 or so.
If you need stock data every minute, for all US stocks, then this is technically identical to "real-time prices", which ends up costing money. In addition you need a commercial higher-end data feed, of which I know of... one. Sorry. The cost is going to be near or at four digits, without (!) publication rights.
And that is NxCore - their system has a data offer covering US stocks (all exchanges) in real time: a complete feed with all corrections etc. Native and C# wrapper APIs, so you can take the real-time data feed, update your current pricing in memory, and write it out to SQL Server every minute. Preferably not from ASP.NET (a baaaaad choice for something that should run 24/7 without interruption unless you do heavy setup changes etc.) but from an installed Windows service. It takes some bandwidth - no real idea how much (I am getting four exchanges from them, but no stocks, only the CME Group futures: CME, CBOT, NYMEX and COMEX).
Note that with this setup you can go faster, too, but if you go fully real-time you need a serious server. We're talking about a billion updates or so per day...
An end-user SQL Server setup (i.e. little RAM and a few slow disks) won't work.
Too expensive? There are plenty of data feeds around for a lower price, but they will not give you "stocks" as in "all of them", just "a selection".
If you are OK with non-real-time data - i.e. pulling stuff down at the end of the day - eoddata.com has a decent offer. You could also then pull things up via an ASP.NET page, but again... you will not have the data during the day, just - well - after the close. The smallest granularity is 1 minute. Republication rights are again a no - but you can probably talk to them.
This isn't really SQL Server specific; a typical solution is that you run a process that polls an external source (a web service or the like) at regular intervals and uses this information to update the database. You can either implement this as a simple command-line program that gets executed every minute from the task scheduler, or you can make it a Windows service that sleeps most of the time and only wakes up once a minute to do its processing. Once you have that, writing to the database is as usual.
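A rough sketch of that polling approach is below. The quote URL and its JSON shape are hypothetical stand-ins for whatever feed you license; the table and connection string are placeholders too.

```python
# Sketch: fetch quotes from an external source and upsert them into
# SQL Server. Schedule this script once a minute via Task Scheduler,
# or wrap the same fetch-and-upsert in a loop inside a Windows service.
import pyodbc
import requests

QUOTE_URL = "https://example.com/api/quotes?symbols=MSFT,AAPL"  # hypothetical feed

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=Market;Trusted_Connection=yes;"
)

# Assumed response shape: [{"symbol": "MSFT", "price": 123.45}, ...]
quotes = requests.get(QUOTE_URL, timeout=10).json()

with conn:  # pyodbc commits on clean exit from the block
    cur = conn.cursor()
    for q in quotes:
        # Upsert: update the row if the symbol exists, insert otherwise.
        cur.execute(
            """
            MERGE Quotes AS t
            USING (SELECT ? AS Symbol, ? AS Price) AS s
            ON t.Symbol = s.Symbol
            WHEN MATCHED THEN
                UPDATE SET Price = s.Price, UpdatedAt = SYSUTCDATETIME()
            WHEN NOT MATCHED THEN
                INSERT (Symbol, Price, UpdatedAt)
                VALUES (s.Symbol, s.Price, SYSUTCDATETIME());
            """,
            q["symbol"], q["price"],
        )
```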
