I need to transfer a Nexus snapshot package to our test servers for testing, so I need to identify the URL of the latest snapshot package.
I have tried using the maven-metadata.xml file, but it only includes the timestamp. To rebuild the full URL we need to identify both the timestamp and the trailing build number.
EX:- foo-1.0-20110506.1100-1.jar
We can extract everything up to foo-1.0-20110506.1100, but we cannot identify the trailing build number (the -1).
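For reference, the artifact-level maven-metadata.xml in a snapshot directory normally carries both pieces under versioning/snapshot: a timestamp and a buildNumber element. A minimal Python sketch of rebuilding the file name from that metadata (the sample XML below is constructed to match the foo-1.0 example above, not taken from a real repository):

```python
import xml.etree.ElementTree as ET

def latest_snapshot_name(metadata_xml, artifact_id, version):
    """Build the timestamped snapshot file name from artifact-level maven-metadata.xml."""
    root = ET.fromstring(metadata_xml)
    snap = root.find("./versioning/snapshot")
    timestamp = snap.findtext("timestamp")        # e.g. 20110506.1100
    build_number = snap.findtext("buildNumber")   # e.g. 1
    base = version.replace("-SNAPSHOT", "")
    return f"{artifact_id}-{base}-{timestamp}-{build_number}.jar"

# Example metadata matching the foo-1.0 example
metadata = """
<metadata>
  <groupId>com.example</groupId>
  <artifactId>foo</artifactId>
  <version>1.0-SNAPSHOT</version>
  <versioning>
    <snapshot>
      <timestamp>20110506.1100</timestamp>
      <buildNumber>1</buildNumber>
    </snapshot>
  </versioning>
</metadata>
"""
print(latest_snapshot_name(metadata, "foo", "1.0-SNAPSHOT"))
# foo-1.0-20110506.1100-1.jar
```

The resulting name can then be appended to the snapshot directory URL to download the artifact.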
I'm using Firebase for my iOS app. For users on old versions of the app I want to be able to force them to update from the App Store before they can continue using it. For this I check a remote config value in the app; if it's true, the user gets a full-screen message telling them to update. In the Firebase Console's Remote Config I check the version of the app to set the remote config value to true/false.
The problem is that I want to be able to use "where version < X" to set the value or if that is not possible at least be able to pick more than one version (where version is X or Y or Z).
This must be a common use case, but I can't figure out how to do it. Does anyone know how? Can I use a regex, and if so, how?
Add a remote config value that declares the minimum required version (for example, a parameter holding a version string like "1.2.0").
Then you implement the "is old" logic on client side by checking the client's version number against the provided remote config value. If the check fails, then you display the "update your app" screen.
Make sure to set the default/fallback value on the client to a version number that does not force an update (version 0.0.0 for example).
You can configure remote config conditions for different platforms and version number values if you don't have a synchronized version numbering across your platforms.
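The client-side check described above is just a component-wise comparison of the app's version against the remote minimum. A sketch of that logic in Python (on iOS you would write the equivalent in Swift; the three-part version format is an assumption):

```python
def needs_update(current, minimum):
    """True if the client's version is older than the minimum required version."""
    def to_tuple(v):
        # "1.9.3" -> (1, 9, 3); tuple comparison handles 1.10 > 1.9 correctly
        return tuple(int(part) for part in v.split("."))
    return to_tuple(current) < to_tuple(minimum)

# The fallback value "0.0.0" never forces an update:
assert not needs_update("2.1.0", "0.0.0")
assert needs_update("1.9.3", "2.0.0")
assert not needs_update("2.0.0", "2.0.0")
```

Comparing numeric tuples rather than raw strings avoids the classic "1.10" < "1.9" string-comparison bug.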
We are trying to move some of our monitoring away from Grafana/Graphite into another system. Is there any way to pull the full DB data into a SQL database?
You can use the utilities shipped with Whisper to extract data from the .wsp files and then upload it into your DB, e.g.:
[boss@DPU101 gn_bytes_total_card0]$ whisper-fetch gauge.wsp | head
1499842740 51993482526.000000
1499842800 51014501995.000000
1499842860 51011637567.000000
1499842920 51301789613.000000
1499842980 50994189020.000000
1499843040 50986821344.000000
This tool also allows you to extract data in JSON:
$ whisper-fetch --help
[...]
--json Output results in JSON form
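Since whisper-fetch prints one "timestamp value" pair per line, loading the dump into a SQL database is mostly a parsing exercise. A hedged Python sketch using sqlite3 (the table and column names are made up; whisper-fetch prints None for empty slots, which we skip):

```python
import sqlite3

def load_whisper_dump(conn, metric, lines):
    """Parse 'timestamp value' lines as printed by whisper-fetch and insert them."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS samples (metric TEXT, ts INTEGER, value REAL)"
    )
    rows = []
    for line in lines:
        ts, value = line.split()
        if value != "None":  # skip slots with no data
            rows.append((metric, int(ts), float(value)))
    conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
dump = ["1499842740 51993482526.000000", "1499842800 None"]
print(load_whisper_dump(conn, "gauge", dump))  # 1
```

In practice you would pipe `whisper-fetch gauge.wsp` into the script (or read its --json output) and point `conn` at your target database.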
You can use the Whisper utilities provided with Graphite. You need to install them separately with the following command, e.g. on Ubuntu 14.04:
apt-get install python-whisper
The whisper-fetch.py program will let you dump the data in JSON format (or in a pretty, tab-separated format).
In this case the data points are one per 60 seconds.
Whisper Link
We are migrating from Nexus Repository Manager 2.1.4 to Nexus 3.1.0-04. With version 2 we have been able to use the API to get a list of artifacts by repository, however we are struggling to find a way to do this with the Nexus 3 API.
Having read https://books.sonatype.com/nexus-book/reference3/scripting.html chapter 16 we have been able to get artifact information for a specific blob using a groovy script like:
import org.sonatype.nexus.blobstore.api.BlobId
def properties = blobStore.blobStoreManager.get("default").get(new BlobId("7f6379d32f8dd78f98b5b181166703b6")).getProperties()
return [headers: properties.headers, metrics: properties.metrics]
However we can't find a way to iterate over the contents of a blob store. We can get a blob store object:
blobStore.blobStoreManager.get("default")
however the API does not appear to give us a way to get a list of all blobs within that store. We need to get a list of the blobIDs within a blob store.
Is there a way to do this via the Nexus 3 API?
One of our internal team members put this together. It doesn't use the blobStore, but I believe it accomplishes what you are trying to do (and a bit more): https://gist.github.com/kellyrob99/2d1483828c5de0e41732327ded3ab224
For some background, think of a blobStore as just where we store the bits, with no information about them. OrientDB has Component/Asset records and stores all the info about them. You'll generally want to use that instead of the blobStore for Asset information as a result.
Once your migration is done, it may be worth investigating an upgrade of your Nexus version.
That way you will be able to use the new (still in beta) REST API for Nexus. It's available by default from version 3.3.0 onward: http://localhost:8082/swagger-ui/
Basically, you retrieve the json output from this URL: http://localhost:8082/service/siesta/rest/beta/assets?repositoryId=YOURREPO
Only 10 records will be displayed at a time and you will have to use the continuationToken provided to request the next 10 records for your repository by calling: http://localhost:8082/service/siesta/rest/beta/assets?continuationToken=46525652a978be9a87aa345bdb627d12&repositoryId=YOURREPO
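The pagination described above can be sketched in Python. The items/continuationToken field names match the beta assets API response, and the base URL follows the localhost example above; the stubbed fetch function at the bottom is purely illustrative so the sketch runs without a server:

```python
import json
from urllib.request import urlopen

BASE = "http://localhost:8082/service/siesta/rest/beta/assets"

def list_assets(repository, fetch=None):
    """Collect all asset records by following continuationToken pages."""
    if fetch is None:
        fetch = lambda url: json.load(urlopen(url))
    items, token = [], None
    while True:
        url = f"{BASE}?repositoryId={repository}"
        if token:
            url += f"&continuationToken={token}"
        page = fetch(url)
        items.extend(page["items"])
        token = page.get("continuationToken")
        if not token:  # last page has no continuation token
            return items

# Offline demonstration with a stubbed fetch (no server needed):
def fake_fetch(url):
    if "continuationToken=abc" in url:
        return {"items": ["bar.jar"], "continuationToken": None}
    return {"items": ["foo.jar"], "continuationToken": "abc"}

print(list_assets("YOURREPO", fetch=fake_fetch))  # ['foo.jar', 'bar.jar']
```

Against a real server you would call `list_assets("YOURREPO")` and let the default fetch hit the REST endpoint.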
More information here: http://blog.sonatype.com/nexus-repository-new-beta-rest-api-for-content
I'm trying to fetch all App data, number of installs, new installs, etc, but I want overall results:
I have the following:
https://www.googleapis.com/analytics/v3/data/ga?dimensions=ga%3AmobileDeviceInfo%2Cga%3AappVersion%2Cga%3AappName%2Cga%3AappInstallerId%2Cga%3AappId&metrics=ga%3Ausers%2Cga%3AnewUsers%2Cga%3ApercentNewSessions&start-date=2014-04-30&end-date=2014-05-14&max-results=50
But it gives me many rows across the date interval. Isn't there a way to grab just:
Downloaded apps / OS
Active apps (installs that is still in phone)
uninstalls
That's it. I don't need anything more, just need this info. Can I get it back in one resultset?
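As an aside on the query itself: the fewer dimensions you request, the more the metrics collapse toward overall totals, and the long encoded URL above is easier to maintain if you build it programmatically. A sketch using Python's urllib.parse (dimension/metric names taken from the query above; authentication is omitted):

```python
from urllib.parse import urlencode

# A trimmed-down query: dropping dimensions aggregates the metrics further
params = {
    "dimensions": "ga:appName,ga:appVersion",
    "metrics": "ga:users,ga:newUsers,ga:percentNewSessions",
    "start-date": "2014-04-30",
    "end-date": "2014-05-14",
    "max-results": "50",
}
url = "https://www.googleapis.com/analytics/v3/data/ga?" + urlencode(params)
print(url)
```

urlencode takes care of percent-encoding the colons and commas (ga:users becomes ga%3Ausers), which is what makes the hand-written URL hard to read.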
Is it possible to have R connect to Gmail's POP server and read/download the messages in a specific folder of mine? I have been storing emails and would like to go back and start analyzing subject lines, etc.
Basically, I need a way to export a folder in my Gmail account, and I would like to do this programmatically if at all possible.
Thanks in advance!
I am not sure this can be done with a single command. Maybe there is a package out there that I am not aware of that can accomplish it, but as long as you haven't found one, the following process could be a solution ...
Consider got-your-back (http://code.google.com/p/got-your-back/wiki/GettingStarted#Step_4%3a_Performing_A_Backup) which "is a command line tool that backs up and restores your Gmail account".
You can invoke it like this (given that python is available on your machine):
python gyb.py --email foo@bar.com --search "from:pip@pop.com" --folder "mail_from_pip"
After completion you'll find all the emails matching the --search in the specified --folder, along with a sqlite database. (posted by dukedave, Dec 4 '11)
So depending on your OS you should be able to invoke the above command from within R and then access the downloaded mails in the respective folder.
GotYourBack is a good backup utility, but for downloading metadata for analysis, you might want something that doesn't first require you to fetch the entire content of all your email.
I've recently used the gmailr package to do a similar analysis.
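For the subject-line analysis the question mentions, one further option (outside R) is to export the mailbox, e.g. as an mbox file via Google Takeout, and decode the headers with Python's standard library. A sketch, assuming an mbox export; the decoding step handles RFC 2047 encoded subjects:

```python
import mailbox
from email.header import decode_header, make_header

def clean_subject(raw_subject):
    """Decode an RFC 2047 encoded Subject header into a plain string."""
    return str(make_header(decode_header(raw_subject)))

def subjects(mbox_path):
    """Yield decoded subject lines from an mbox export (e.g. Google Takeout)."""
    for msg in mailbox.mbox(mbox_path):
        yield clean_subject(msg.get("Subject", ""))

print(clean_subject("=?utf-8?q?Quarterly_report?="))  # Quarterly report
```

Once the subjects are plain strings, they can be loaded into R (or pandas) as a simple text column for analysis.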