Google Analytics: Get all App Data

I'm trying to fetch all app data, number of installs, new installs, etc., but I want overall results:
I have the following:
https://www.googleapis.com/analytics/v3/data/ga?dimensions=ga%3AmobileDeviceInfo%2Cga%3AappVersion%2Cga%3AappName%2Cga%3AappInstallerId%2Cga%3AappId&metrics=ga%3Ausers%2Cga%3AnewUsers%2Cga%3ApercentNewSessions&start-date=2014-04-30&end-date=2014-05-14&max-results=50
But it gives me many rows broken down over the date interval. Isn't there a way to grab just:
Downloaded apps / OS
Active apps (installs that are still on the phone)
Uninstalls
That's it, I don't need anything more. Can I get this back in one result set?
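For illustration, here is the same Core Reporting API call narrowed to a single dimension, which collapses the per-device/per-version breakdown into one row per operating system. This is only a sketch: the OAuth access token, the ga:XXXXXXXX view id, and the dates are placeholders, and it does not cover uninstall numbers.

// Sketch: narrowed v3 Core Reporting API query (one row per OS).
// ACCESS_TOKEN and the ga:XXXXXXXX view id are placeholders.
const ACCESS_TOKEN = "<oauth2-access-token>";

async function usersByOs(): Promise<void> {
  const params = new URLSearchParams({
    ids: "ga:XXXXXXXX",               // your view (profile) id
    dimensions: "ga:operatingSystem", // drop the finer-grained dimensions
    metrics: "ga:users,ga:newUsers",
    "start-date": "2014-04-30",
    "end-date": "2014-05-14",
    "max-results": "50",
  });

  const res = await fetch(
    `https://www.googleapis.com/analytics/v3/data/ga?${params}`,
    { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } }
  );
  const report = await res.json();
  console.log(report.rows); // e.g. [["Android", "1234", "567"], ["iOS", "890", "321"]]
}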

Related

Does remoteconfig fetch all the keys again?

When calling FetchAsync() in Unity, does it download the complete data set again, or only the changes, i.e. the changed key-value pairs?
I tried using Fiddler to capture the response, but surprisingly no sessions were logged for Firebase. I then checked the data usage tab on Android and saw a spike of about 1 MB for a change to a single key-value pair.
When a fetch can no longer be satisfied by local cache, it will download the entire set of keys and values. There are no incremental or delta updates like there are for the Firebase database options.
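The same cache-or-full-download behaviour is exposed explicitly in the Firebase web SDK through a minimum fetch interval. The sketch below is only an analogy to the Unity FetchAsync() case (it uses the web SDK, not the Unity one), and the config object and parameter key are placeholders.

import { initializeApp } from "firebase/app";
import { getRemoteConfig, fetchAndActivate, getValue } from "firebase/remote-config";

// While the cached values are younger than the interval below, a fetch is served
// locally; once it expires, the next fetch downloads the whole parameter set again.
const app = initializeApp({ /* your Firebase config (placeholder) */ });
const remoteConfig = getRemoteConfig(app);
remoteConfig.settings.minimumFetchIntervalMillis = 12 * 60 * 60 * 1000; // 12 hours

async function refreshConfig(): Promise<string> {
  await fetchAndActivate(remoteConfig);                        // full key/value set, no deltas
  return getValue(remoteConfig, "welcome_message").asString(); // "welcome_message" is a placeholder key
}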

Using the Nexus3 API how do I get a list of artifacts in a repository

We are migrating from Nexus Repository Manager 2.1.4 to Nexus 3.1.0-04. With version 2 we have been able to use the API to get a list of artifacts by repository; however, we are struggling to find a way to do this with the Nexus 3 API.
Having read chapter 16 of https://books.sonatype.com/nexus-book/reference3/scripting.html, we have been able to get artifact information for a specific blob using a Groovy script like:
import org.sonatype.nexus.blobstore.api.BlobId
def properties = blobStore.blobStoreManager.get("default").get(new BlobId("7f6379d32f8dd78f98b5b181166703b6")).getProperties()
return [headers: properties.headers, metrics: properties.metrics]
However we can't find a way to iterate over the contents of a blob store. We can get a blob store object:
blobStore.blobStoreManager.get("default")
however the API does not appear to give us a way to get a list of all blobs within that store. We need to get a list of the blobIDs within a blob store.
Is there a way to do this via the Nexus 3 API?
One of our internal team members put this together. It doesn't use the blobStore but, I believe, accomplishes what you are trying to do (and a bit more): https://gist.github.com/kellyrob99/2d1483828c5de0e41732327ded3ab224
For some background, think of a blobStore as just where we store the bits, with no information about them. OrientDB holds the Component/Asset records and stores all the information about them, so you'll generally want to use that instead of the blobStore for asset information.
Once your migration is done, it may be worth looking into updating your version of Nexus.
That way, you will be able to use the new (still in beta) REST API for Nexus. It is available by default from version 3.3.0 onwards: http://localhost:8082/swagger-ui/
Basically, you retrieve the JSON output from this URL: http://localhost:8082/service/siesta/rest/beta/assets?repositoryId=YOURREPO
Only 10 records will be returned at a time, and you will have to use the continuationToken provided to request the next 10 records for your repository by calling: http://localhost:8082/service/siesta/rest/beta/assets?continuationToken=46525652a978be9a87aa345bdb627d12&repositoryId=YOURREPO
More information here: http://blog.sonatype.com/nexus-repository-new-beta-rest-api-for-content
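As a rough sketch of walking that pagination from a script: the base URL, port and YOURREPO below are the placeholders from the answer above, and the { items, continuationToken } response shape is an assumption based on the beta API.

// Sketch: page through the Nexus 3 beta assets API using continuationToken.
const BASE = "http://localhost:8082/service/siesta/rest/beta/assets";
const REPO = "YOURREPO";

async function listAllAssets(): Promise<unknown[]> {
  const assets: unknown[] = [];
  let token: string | null = null;

  do {
    const url = new URL(BASE);
    url.searchParams.set("repositoryId", REPO);
    if (token) url.searchParams.set("continuationToken", token);

    const res = await fetch(url.toString());
    const page = await res.json();      // expected: { items: [...], continuationToken: string | null }

    assets.push(...page.items);
    token = page.continuationToken;     // null once the last page has been returned
  } while (token);

  return assets;
}

listAllAssets().then(a => console.log(`Found ${a.length} assets`));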

storage().ref().putString() limitations?

I'm using the Firebase 3.4.1 JavaScript API and trying to persist several pieces of string content into the storage engine.
The first ones are successfully persisted, but quickly I start getting lots of errors:
{"code":"storage/retry-limit-exceeded","message":"Firebase Storage: Max retry time for operation exceeded, please try again.","serverResponse":null,"name":"FirebaseError"}
Is there any storage limitation on allowed writes per second? I tried browsing the website/FAQ but didn't see anything about this.
Note: I recently migrated to the Blaze plan, but it doesn't seem to change anything.
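In case a burst of parallel uploads is what exhausts the retry window (an assumption, not a confirmed cause), one thing to try is serialising the uploads. A sketch with the same v3 JavaScript SDK, where the config object, path scheme and contents array are placeholders:

import * as firebase from "firebase";

// Sketch: upload the strings one at a time instead of firing all putString()
// calls at once, so a slow or throttled upload doesn't hit the retry limit.
firebase.initializeApp({ /* your Firebase config (placeholder) */ });

async function persistAll(contents: string[]): Promise<void> {
  const storageRef = firebase.storage().ref();
  for (let i = 0; i < contents.length; i++) {
    // putString() returns an UploadTask, which is thenable and can be awaited
    await storageRef.child(`strings/item-${i}.txt`).putString(contents[i]);
  }
}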

What is the best way to get (stream) data from BigQuery to R (RStudio Server in Docker)?

I have a number of large tables in Google BigQuery, containing data to be processed in R. I am running RStudio via Docker on Google Cloud Platform using the Container Engine.
I have tested a few routes with a table of 38 million rows (three columns) with a table size of 862 MB in BigQuery.
The first route I tested was using the R package bigrquery. This option was preferred, as data can be queried directly from BigQuery and data acquisition can be incorporated in R loops. Unfortunately, this option is very slow: it takes close to an hour to complete.
The second option I tried was exporting the BigQuery table to a csv file on Google Cloud Storage (approx 1 minute), and using the public link to import it into RStudio (another 5 minutes). This route entails quite a lot of manual handling, which is undesirable to say the least.
In Google Cloud Console I noticed VM instances can be granted access to BigQuery. Also, RStudio can be configured to have root access in its Docker container.
So finally my question: Is there a way to use this backdoor to enable fast data-transfer from BigQuery into an R dataframe in an automated way? Or are there other ways to achieve this goal?
Any help is highly appreciated!
Edit:
I have loaded the same table into a MySQL database hosted in Google Cloud SQL, this time it took only about 20 seconds to load the same amount of data. So some kind of translation from BigQuery to SQL is an option too.
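If the export-to-Cloud-Storage route stays the fastest, the manual step can be scripted with any BigQuery client. The sketch below uses Google's Node.js clients purely as an illustration (bigrquery or the bq CLI could do the same from R), with placeholder dataset, table and bucket names.

import { BigQuery } from "@google-cloud/bigquery";
import { Storage } from "@google-cloud/storage";

// Sketch: export a BigQuery table to a CSV file on Google Cloud Storage,
// i.e. the manual step from the second option, automated.
const bigquery = new BigQuery();
const storage = new Storage();

async function exportTableToGcs(): Promise<void> {
  const destination = storage.bucket("my-export-bucket").file("my_table.csv");
  await bigquery
    .dataset("my_dataset")
    .table("my_table")
    .extract(destination, { format: "CSV" });
  console.log("Extract finished; my_table.csv is now in the bucket");
  // The CSV can then be read into R, e.g. with read.csv on a signed or public URL.
}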

Data is not coming from the database while running the Windows Phone 8 application

This is Basina, new to Windows Phone development.
I'm using SQLite in my Windows Phone 8 application, following the article below:
http://www.developer.nokia.com/Community/Wiki/How_to_use_SQLite_in_Windows_Phone
When I debug the application, the database and tables are created successfully.
After that I successfully inserted records into the table, retrieved the results and showed them in a list.
Then I stopped debugging the application and ran it normally.
When I retrieve the data from the database and try to show it in the list, the list is empty, i.e. the data is not added to the list.
I don't understand what the problem is.
While debugging the application everything works fine, but while running the app I get no data and no errors are shown either.
I'm looking forward to your response.
Thanks & Regards,
Basina.
Actually, this is a SQLite connection-timing issue. When you debug your app there is enough time for SQLite to establish the connection, but when you actually run it that time is much shorter, and when you make the query the connection has not yet been established, so there is no data in the list. What you can do is make a static connection at the start of your app and use that connection throughout the app lifecycle.
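The original app would do this in C# with sqlite-net; purely as an illustration of the pattern (open the connection once at startup and reuse it everywhere, instead of connecting right before the query), here is a sketch using Node's better-sqlite3 with a placeholder database path and table name.

import Database from "better-sqlite3";

// Open the single shared connection once, at application startup.
// "app.db" and the "items" table are placeholders.
const db = new Database("app.db");

// Every query reuses the already-open connection, so reads never race the
// connection setup the way a connect-then-query-immediately flow can.
export function listItems(): unknown[] {
  return db.prepare("SELECT * FROM items").all();
}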

Resources