Kaa: Is there a step by step video demo on how to setup Data collection Demos? - kaa

I'm stuck at the bundleId=datacollection_demos. I ended up entering the Administration UI and I think I'm supposed to set up the schemas for Data Collection, but there are lots of options there and I'm not sure which to choose or what to do next. I tried to find step-by-step guides online, but most of them skip past the steps I'm stuck on...

There is a step-by-step tutorial on each Kaa feature (including Data Collection) in the Kaa 0.10 Sandbox. Also, the 'Your first Kaa application' guide in the Kaa documentation resembles the Data Collection demo and its tutorials.

Related

Postman-Jenkins-Jira Integration

I have a Postman collection which in turn is integrated with Jenkins via newman.
I need to integrate my Jenkins results with Jira via the Xray plugin.
I tried using the newman junitxray reporter, but this report treats each request as a test case.
In my collection I always need to run a series of requests before running the actual request that contains pm.test.
But the junitxray report is also counting that series of requests as test cases.
I want only a specific request to be treated as a test case.
Can someone please help me with this?
You can use different reporters with newman, and what ends up in the JUnit XML report depends on which one you choose. Some time ago I prepared a tutorial showing the differences.
If you're using Xray on Jira Cloud, please check this tutorial and related code.
If you're using Xray on Jira Server/Data Center, please check this tutorial and related code instead.
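As a minimal sketch (the collection, environment, and report file names below are placeholders), a typical newman invocation that produces a JUnit-style XML report for a CI step looks like this:

```shell
# Install newman plus the reporter you want to try (community reporters
# such as junitxray are separate npm packages):
npm install -g newman

# Run the collection and export a JUnit XML report for the CI/Xray step:
newman run my-collection.json \
  -e my-environment.json \
  -r cli,junit \
  --reporter-junit-export results/junit-report.xml
```

Since each reporter decides which requests show up as test cases, it is worth generating a report with each candidate reporter and inspecting the XML before wiring it into Jira.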

Having difficulty getting the API key and other relevant information in Firebase

I just started with Firebase and I am having some difficulty getting the API keys and other relevant data required. I am building a web app.
Here is a screenshot of the current screen that I got in Firebase.
I clicked on Add project and got directed to another page, where I clicked on Develop, then Database, and finally created a Realtime Database. But I am stuck on how to get the relevant API key and other data.
I am following this tutorial
tutorial link
to get my hands on Firebase and React.
I believe you are new to Firebase, so I'm going to help you with screenshots. As Tomka mentioned, please read the documentation, which covers all the required information.
1. Go to the Console and select your project (in my case, TestApp).
2. After you have selected the project, select Project Settings.
3. As you are configuring a web app (React), click the option pointed out in the image.
4. You should then have the details you are looking for, something like the screenshot below.
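For reference, the details you copy from that settings page end up in a small config object in your web app. This is only a sketch with placeholder values; your real values come from Project Settings > Your apps in the console:

```javascript
// Placeholder Firebase web config - every value below must be replaced
// with the one shown for your own project in the Firebase console.
const firebaseConfig = {
  apiKey: "YOUR_API_KEY",
  authDomain: "your-project.firebaseapp.com",
  databaseURL: "https://your-project.firebaseio.com",
  projectId: "your-project",
  storageBucket: "your-project.appspot.com",
  messagingSenderId: "000000000000",
};

// In the app you would then pass this object to
// firebase.initializeApp(firebaseConfig) before using the database.
console.log(Object.keys(firebaseConfig));
```

The `apiKey` here is the value the tutorial asks for; despite the name, it identifies the project to Google's servers rather than acting as a secret.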

Database settings in Google App Maker

Google's App Maker setup instructions seem to be inaccurate!
I am closely following the instructions found on https://developers.google.com/appmaker/models/cloudsql
For: Create a custom Google Cloud SQL database for your app,
Second Generation.
Step 6a is not valid:
there is no database found in the Settings.
Note that I have followed all of the previous steps without issue.
Does anyone know what I can do to follow the instructions listed at the URL above if step 6a appears to be incorrect?
Note: This is my first ever post on Stack Overflow; please be kind to my ignorance if it is obvious.
Thanks!
This is how to set it up:
Connection from App Maker to Cloud SQL is locked
Basically, you need to assign the default instance within the Admin Console.
Mike

How to get Crashlytics events from BigQuery [Firebase project]

We have enabled the BigQuery feature for our Firebase project. Last week the Firebase team announced that Crashlytics has moved from beta to production release, so I was thinking this data should be available in BigQuery in some form. But I was not able to see any crash events in my BigQuery tables, even though the app crashed a couple of times. Does anybody know how to extract the Crashlytics report from Firebase for a custom reporting solution?
Crashlytics data is not currently available in BigQuery, but we are looking into adding it in the near future. Please stay tuned :D
I know this question is a bit old now, but for the benefit of those who end up here when searching the web, please see below:
It seems that Crashlytics data is now available for export into BigQuery. You will need to link your Crashlytics account with your Firebase project, which requires admin rights on the Crashlytics side. Have a look at the following links as a starting point:
https://firebase.googleblog.com/2018/08/exporting-crashlytics-data-to-bigquery.html
https://cloud.google.com/solutions/mobile/mobile-firebase-analytics-bigquery#working_with_firebase_crashlytics_data_in_bigquery
On the second link, you may need to scroll back to the top and read the 'Before you begin' section.
Hope that helps.
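Once the link is in place, a query along these lines can pull crash events for a custom report. The dataset and table names below are placeholders; check the actual names BigQuery shows for your linked app:

```sql
-- Hypothetical example: count crash events per issue over the last week.
-- Replace your_project / your_app_ANDROID with your own dataset and table.
SELECT issue_id, issue_title, COUNT(*) AS events
FROM `your_project.firebase_crashlytics.your_app_ANDROID`
WHERE event_timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY issue_id, issue_title
ORDER BY events DESC;
```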

Azure Data Factory Test Framework

Are there any automated testing mechanisms available for Azure Data Factory pipelines? Does the Azure Data Factory Visual Studio project come with a test suite of its own? Any help is highly appreciated.
Thanks
EDIT after comment:
You could use a GitHub repository (gbrueckl - Azure.DataFactory.LocalEnvironment) for running custom pipelines locally, etc. His repository provides some tools which make it easier to work with Azure Data Factory (ADF). It mainly contains two features:
1. Debug custom .NET activities locally (within VS and without deployment to the ADF service!)
2. Export existing ADF Visual Studio projects as an Azure Resource Manager (ARM) template for deployment
In addition, the repository also contains various samples to demonstrate how to work with the ADF Local Environment.
https://github.com/gbrueckl/Azure.DataFactory.LocalEnvironment is the link to the repository for doing that. You can use it to debug your pipelines in your local environment at least, and it could also help to test them...
Not that I'm aware of, but happy to be told otherwise.
I suggest you post this on Microsoft's user voice page as a feedback idea. Then people searching will come here, go to that link and vote to get something developed.
https://feedback.azure.com/forums/270578-data-factory/filters/my_feedback?query=Unit%20Testing%20for%20ADF%20Projects
Hope this helps.
The only related project / sample code I'm aware of is the ability to step into and debug a "DotNetActivity" outside of Azure Batch, by using your pipeline variables.
This could effectively be used as a test runner of some description.
If you have your data factory setup to be automatically deployed, you could deploy to an alternative QA environment.
Then you could probably (I haven't dug into the SDK in that area enough to know for sure) use the SDK to run the slice and check whether it ran successfully. It would be more of an integration test / end-to-end smoke test at this point.
ADFCustomActivityRunner on Github
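The deploy-to-QA-then-verify idea above can be sketched independently of any particular SDK. This is a hypothetical helper, not part of any official library: `getStatus` stands in for whatever call your SDK exposes for fetching the current run status of a deployed pipeline.

```javascript
// Hypothetical smoke-test helper: poll a deployed pipeline run until it
// reaches a terminal state, failing the test if it ends badly or times out.
async function waitForRun(getStatus, { intervalMs = 5000, maxPolls = 60 } = {}) {
  for (let i = 0; i < maxPolls; i++) {
    const status = await getStatus();
    if (status === "Succeeded") return status;
    if (status === "Failed" || status === "Cancelled") {
      throw new Error(`Pipeline run ended with status: ${status}`);
    }
    // Not terminal yet (e.g. "InProgress"); wait and poll again.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for pipeline run");
}
```

In a QA environment you would pass in a function that queries the deployed factory; in a unit test you can inject a fake status function, which keeps the polling logic itself testable without touching Azure.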