I want to upload a JSON report generated in R with the lintr package to my SonarQube server. I'm making a POST request to the api/ce/submit endpoint (you can find it at https://next.sonarqube.com/sonarqube/web_api/api/ce?internal=true). To do this I'm using Postman with these params:
projectKey: XX
projectName: XXname
report: lintr_out.json
projectBranch: testing-1.0
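For clarity, this is roughly the curl equivalent of what I'm sending from Postman (the server URL and token are placeholders, and I attach the report as a multipart file):

# Rough curl equivalent of the Postman request described above.
# SONAR_URL and SONAR_TOKEN are placeholders for my server and auth token.
curl -u "$SONAR_TOKEN:" -X POST "$SONAR_URL/api/ce/submit" \
  -F "projectKey=XX" \
  -F "projectName=XXname" \
  -F "projectBranch=testing-1.0" \
  -F "report=@lintr_out.json"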
This call creates the project in SonarQube, but the information from the report never shows up.
Does anybody know how I can get the results of the report to appear properly in SonarQube? Thanks!
The WS api/ce is for internal use (as marked). It is not a public API, and the report it expects may change its format at any time.
To submit issues produced by a third-party linter, I advise you to look at the generic issue import feature. You simply have to convert your JSON file to the format we expect.
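As a minimal sketch of what that conversion can look like, assuming a SonarQube version that supports the sonar.externalIssuesReportPaths analysis parameter (the file names, rule ID and severity below are illustrative, not taken from your project):

# Sketch only - check the generic issue import documentation for your SonarQube version.
# Convert lintr's output into the generic issue format; a single issue looks roughly like this:
cat > lintr-generic.json <<'EOF'
{
  "issues": [
    {
      "engineId": "lintr",
      "ruleId": "line_length_linter",
      "severity": "MINOR",
      "type": "CODE_SMELL",
      "primaryLocation": {
        "message": "Lines should not be more than 80 characters.",
        "filePath": "R/my_script.R",
        "textRange": { "startLine": 12 }
      }
    }
  ]
}
EOF

# Then run a normal analysis and point the scanner at the converted file,
# instead of calling api/ce/submit directly:
sonar-scanner \
  -Dsonar.projectKey=XX \
  -Dsonar.externalIssuesReportPaths=lintr-generic.json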
I have a Postman collection which is integrated with Jenkins via newman.
I need to integrate my Jenkins results with Jira via the Xray plugin.
I tried using the newman junitxray reporter, but this report treats each request as a test case.
In my collection I always need to run a series of requests before the actual request that contains pm.test.
But the junitxray report counts those setup requests as test cases too.
I want only a specific request to be treated as a test case.
Can someone please help me with this?
You can use different reporters with newman; depending on which one you pick, the contents of the JUnit XML report will differ. Some time ago I prepared a tutorial showing the differences.
If you're using Xray on Jira cloud, please check this tutorial and related code.
If you're using Xray on Jira server/datacenter, please check this tutorial and related code instead.
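As a rough illustration of what the server/datacenter flow can look like (the reporter, Jira URL, credentials and project key are placeholders, and the exact Xray endpoint may differ in your setup):

# Sketch only - adjust the reporter, endpoint and authentication to your Xray flavour.
# Run the collection with newman's built-in JUnit reporter:
newman run my-collection.json -r cli,junit --reporter-junit-export results/newman.xml

# Import the JUnit XML into Xray on Jira server/datacenter via its REST API:
curl -u "$JIRA_USER:$JIRA_PASS" \
  -H "Content-Type: multipart/form-data" \
  -F "file=@results/newman.xml" \
  "https://jira.example.com/rest/raven/1.0/import/execution/junit?projectKey=MYPROJ"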
I am writing a UI automation script using Karate DSL. At a certain point I need to get a value from a network call in Chrome. I want to intercept one of the web service calls in the Chrome DevTools network tab and get the JSON response of that web service.
I need this because I have to extract the value from that particular call and pass it on to the next step in my automation script.
I have seen the related question about sessionStorage (Is there a way of getting a sessionStorage using Karate DSL?), but I wonder how to do the same for a network call, using the script command or some other way.
The first thing I would say is: don't forget that Karate is an API testing tool at its core. Maybe all you need to do is make that call manually and get the response. You should be able to scrape the HTML and get the host and parameters needed.
That said - there's a new feature (only for Chrome) which is documented here: https://github.com/intuit/karate/tree/develop/karate-core#intercepting-http-requests - and is available in 0.9.6.RC2
It may not directly solve what you want, but in a Karate mock you should be able to store a value for later use, e.g. by using a Java singleton or writing to a temp file.
If there is something oddly more specific you need, please contribute code to Karate. Finally, there is an experimental way in which you can actually make raw requests to the Chrome DevTools session: https://github.com/intuit/karate/tree/develop/examples/ui-test#devtools-protocol-tips - it is for advanced users, but maybe you are one :)
I want to create a custom report. The response format of the SonarQube web service API /api/issues/search is JSON or XML. How can I use that response to create an HTML or CSV file using the "unix shell without using command line tools", so that I can use it as a report? Or is there a better way to achieve this?
You can generate an HTML file if you run an analysis in preview mode: http://docs.sonarqube.org/pages/viewpage.action?pageId=6947686
It looks as if the SonarQube team has been working hard to not allow people to do this. They appear to want people to purchase an Enterprise Subscription in order to export reports.
An old version of sonar-runner (now called sonar-scanner) had an option to allow local report output. But that feature is "no more supported".
ERROR: The preview mode, along with the 'sonar.analysis.mode' parameter, is no more supported. You should stop using this parameter.
Looks like version 2.4 of Sonar Runner does what you want. If you can find it. Of course they only have 2.5RC1 available on the site now.
Using the following command should work on version 2.4:
sonar-runner -Dsonar.analysis.mode=preview -Dsonar.issuesReport.html.enable=true
There are at least two open-source projects that query the SonarQube API to generate reports in various formats.
https://github.com/cnescatlab/sonar-cnes-report/tree/dev (Java)
https://github.com/soprasteria/sonar-report (JavaScript/Node)
At the time of writing both are active.
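If all you need is a quick CSV rather than a full report, something along these lines can also work (this sketch does use curl and jq, so it sidesteps the "no command line tools" constraint; the server URL, token and project key are placeholders):

# Sketch only - dumps the first 500 issues of one project into a CSV file.
# Add &p=2, &p=3, ... to page through larger result sets.
SONAR_URL="https://sonar.example.com"   # placeholder
TOKEN="my-user-token"                   # placeholder
PROJECT="my:project"                    # placeholder project key

curl -s -u "$TOKEN:" \
  "$SONAR_URL/api/issues/search?componentKeys=$PROJECT&ps=500" |
  jq -r '.issues[] | [.key, .severity, .component, (.line // ""), .message] | @csv' \
  > issues.csv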
App Services is a great place to store data, but now that I have a lot of critical info in there I've realized there isn't a way to create a backup or roll back to an earlier state (in case I do something stupid like -X DELETE /users).
Any way to back up this data either online or offline?
Apart from using the API to fetch records batch by batch and storing them locally, there is no solution at the moment. The team is planning an S3 integration (export data to S3), but no completion date has been set for it yet.
It looks like the only way is to query the data using e.g. curl and save the results to a local file. I don't believe there is a way to export natively.
http://apigee.com/docs/app-services/content/working-queries
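As a rough sketch of that approach (the org, app, collection and token below are placeholders; paging relies on the cursor field returned in each response):

# Sketch only - pulls a collection page by page and saves each page locally.
BASE="https://api.usergrid.com/my-org/my-app"   # placeholder org/app
TOKEN="my-access-token"                         # placeholder token

URL="$BASE/users?access_token=$TOKEN&limit=1000"
PAGE=0
while : ; do
  RESP=$(curl -s "$URL")
  echo "$RESP" > "users_page_$PAGE.json"
  CURSOR=$(echo "$RESP" | jq -r '.cursor // empty')
  [ -z "$CURSOR" ] && break
  URL="$BASE/users?access_token=$TOKEN&limit=1000&cursor=$CURSOR"
  PAGE=$((PAGE + 1))
done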
As of the 2014/2015 Usergrid versions, it is possible to run exports and imports using the "Usergrid tools".
This page explains how to install them:
https://github.com/apache/incubator-usergrid/tree/master/stack/tools
Basically, once you run
$ java -jar usergrid-tools.jar export
your data will be exported as JSON files in an export directory.
There are several export and import tools available; the best way to see them is to visit this page:
https://github.com/apache/incubator-usergrid/tree/6d962b7fe1cd5b47896ca16c0d0b9a297df45a54/stack/tools/src/main/java/org/apache/usergrid/tools
I am using Tridion 5.3.
I have a web page that has over 100 PDF links attached to it. When I publish that page, not all of the PDFs get published, even though I get a URL for each PDF like "/pdf/xyzpdfname_tcm8-912.pdf". When I click on those links I get a 404 error. For the same PDF components that give the error, if I publish them 5 to 10 PDFs at a time they get published, there is no 404 error and everything works fine. But that's not the workflow I need. Does anyone know why Tridion is not able to deploy the binary content when I publish the PDFs in bulk?
I am using engine.PublishingContext.RenderedItem.AddBinary(pdfComponent).Url to get the PDF URL.
Could this be to do with the naming of your PDF?
Tridion has a mechanism in place to prevent you from accidentally overwriting a binary file, with a different binary file that is named the same.
I can see the Binary you are trying to deploy has the ID:
tcm:8-755-16
and you are naming it as follows:
/www.mysite.com/multimedia/pdfname_tcm8-765.pdf
Using the Variant Id:
variantId=tcm:8-755
Is it possible you are also publishing the same binary from a different template? Perhaps with the same filename, but with a different variant ID?
If so, Tridion assumes you are trying to publish two 'variants' of the same binary (for example a resized image, which is obviously not relevant for PDFs).
The deployer is therefore throwing an error to prevent you from accidentally overwriting the binary that is published first.
You can get around this in two ways:
1. Use the same variant ID when publishing both binaries.
2. If you do want to publish a variant, change the filename to something different.
I hope this helps!
Have a look at the log files for your transport service and deployer. If those don't provide clarity, set Cleanup to false in cd_transport_conf.xml, restart the transport service and publish again. Then check if all PDFs ended up in your transport package.
engine.PublishingContext.RenderedItem.AddBinary(pdfComponent).Url gives you the URL at which the item will be available if publishing succeeds; it is not a guarantee that the item will be published.
Pretty sure you're just hitting a maximum size limit on your transport package.
PS: Check the status of your transaction in the publishing queue; it might give you a hint.
After you updated the question:
There's something terribly wrong with the template and/or your environment. The published URL says "tcm8-765.pdf" (note the 6) but the item URI is "tcm:8-755" (with a 5).
Can you double check what's happening in here?