How to set test URL in LoadTest in LoadAndPerformanceTestProject - Basic - asp.net

This question is probably very basic if you've used these tools before, but I've just spent two hours on it and haven't figured it out, so please help.
This is the first time I've tried to load test a REST service to see how many concurrent users it can handle. While LoadStorm looks like the best product for something like this, I want something free.
I started with a LoadAndPerformanceTestProject in Visual Studio and created a load test, as you can see in the image below, but I have no idea how to set the URL I want to test.
Questions:
1) Where do I set the URL I want to test and the request parameters for my REST service?
2) Is this tool just a program that runs on my machine and makes requests to the server?
3) What can I do to run it in the cloud so I can get more accurate results? I have an MSDN subscription, and the URL I'm testing is my project hosted in Azure.
These are a lot of questions, but I can't find any tutorial online on how to use this feature. I found a video, but unfortunately when I go to TEST in the menu bar I have different options from the video (see the picture below). In the video it seems like he doesn't have to add a LoadAndPerformanceTestProject at all.

First you need to create one or more Web Performance Tests. It's in these tests that you define which URLs should be hit and how (GET, POST, loops, conditions, headers, ...). You can add a new Web Performance Test by right-clicking your project and choosing Add > Web Performance Test.
After you've created your tests, simply add them to a scenario in your load test:
This should get you started. To get the most out of load testing, I suggest you do some more reading on the topic, because there's a lot more to it.

For running load tests in the cloud, you can follow the guide here:
http://www.visualstudio.com/get-started/load-test-your-app-vs
Do let me know if you need any help.

Related

Customising Mobius Forms

I'm really keen to use the 2sxc environment on my website for a number of applications.
I'm currently looking at the Mobius forms.
What I'm wanting to do is create a ticket in ConnectWise rather than send an email, using the ConnectWise REST API.
Some of these questions might have obvious answers to someone who has been trained in these technologies, but I'm self-taught. When I went to school I learnt COBOL!
There is C# code in the application, but I can't see how you build it and incorporate it into the application. I forked the code and it seems to be just code with no build step.
There are live and staging folders with the same cshtml files. However, it seems a bit random whether the live or staging version is actually used. For example, I made a quick fix to _Contact Form.cshtml to fix the typo that meant it always displayed the ReCaptcha warning; I changed the live version, which didn't do anything, so I had to change the staging version.
I need to update the settings to configure the ConnectWise API, but I haven't been able to find where I can do this. I am still looking, though.
I also need to store a private key in the settings. Is there a secure way I can do this?
PS. When I get my head around all this I'm happy to be a contributor
Welcome to Stack Overflow.
I'll try to give you some guidance to help you figure it out.
Live and staging are folders meant to let you make changes while users see the unmodified output. A host-user sees the files from staging; everyone else sees what's in live. When you're done and everything is tested, you copy from staging to live. This is what we call Polymorphism.
Polymorphism applies to the cshtml as well as the API. So as a host-user, you'll be using staging/api/FormController to save/send.
There is no build process; everything is hot-compiled. That's one of the things that makes 2sxc so amazing: no Visual Studio, no DLLs, no restarting the application ;) You'll love it.
Secure keys: there is no special secure key storage. We usually put them in the App Settings, just like the MailChimp key you'll see there. We split it into two fields for very technical reasons: we publish our code on GitHub, and that causes trouble when the code contains API keys. But you can just use one field, assuming you don't plan on publishing your code on GitHub.

Posting graphite events to Hosted Graphite

I'm using Hosted Graphite and trying to add deploy events to my Grafana dashboard. I first attempted to use the method described here.
The metric is added to Graphite with a simple line at the end of the deploy script:
echo "$HOSTEDGRAPHITE_API_KEY.events.$ENVIRONMENT.api.deploy 1" \
| nc -uw0 carbon.hostedgraphite.com 2003
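(For reference, a rough Python equivalent of that line, assuming the same HOSTEDGRAPHITE_API_KEY and ENVIRONMENT environment variables, would send the same plaintext metric over UDP to carbon.hostedgraphite.com:2003:)
import os
import socket

api_key = os.environ["HOSTEDGRAPHITE_API_KEY"]
environment = os.environ["ENVIRONMENT"]

# Carbon plaintext format with no timestamp, matching the nc line above.
message = f"{api_key}.events.{environment}.api.deploy 1\n"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP, like nc -u
sock.sendto(message.encode("ascii"), ("carbon.hostedgraphite.com", 2003))
sock.close()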
I can show those data points in a simple graph, but using the annotations feature with the "regular metric query" doesn't seem to add anything to the graphs.
I'm more interested in using real events, based on http://obfuscurity.com/2014/01/Graphite-Tip-A-Better-Way-to-Store-Events. This should allow us to tag the event with, for example, the commit hash or git tag. Unfortunately, I can't find anything in the Hosted Graphite documentation about how to get this data into Graphite. I also can't find anything about it in the Graphite docs.
Despite the lack of docs, I tried posting to a few endpoints, just hoping to get lucky. All of these returned 404:
https://${HOSTEDGRAPHITE_API_KEY}@www.hostedgraphite.com/api/v1/events
https://${HOSTEDGRAPHITE_API_KEY}@www.hostedgraphite.com/api/v1/sink/events
https://${HOSTEDGRAPHITE_API_KEY}@www.hostedgraphite.com/XXXXXX/graphite/events
where XXXXXX is the path prefix I have when accessing the graphite dashboard at /XXXXXX/graphite/dashboard.
I also tried contacting Hosted Graphite support but the "Support" link seems to go nowhere.
Hosted Graphite employee here.
UPDATE: We support Graphite Events and Annotations now: http://docs.hostedgraphite.com/advanced/annotations-and-events.html
We don't currently support events, but that is in development, which is why there is no mention of this functionality in our documentation.
We do support annotations based on metrics.
Which support link didn't work for you? I'll get that fixed :)
You can email us at support+so# or reach us on Twitter, as you already discovered.
I'm sorry I don't have a better solution for tagging deploys right now (it's something we want to be able to do too), but it should be available soon.
Please get in touch via email if there's anything else we can help with.
EDIT: We're using Intercom for support; do you have something like NoScript or Disconnect that might stop that from working?
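For reference, a Graphite-style event (per the obfuscurity article linked in the question) is just a small JSON document with what/tags/data fields. A rough sketch of posting one is below; the endpoint URL and basic-auth scheme are assumptions, so check the annotations-and-events documentation linked in the update for the real details.
import os
import requests

api_key = os.environ["HOSTEDGRAPHITE_API_KEY"]

event = {
    "what": "deploy",                      # short event title
    "tags": ["deploy", "api", "staging"],  # tags to filter on in Grafana
    "data": "git tag v1.2.3",              # free-form details, e.g. commit hash or git tag
}

# The endpoint and auth below are assumptions, not taken from the official docs.
resp = requests.post(
    "https://api.hostedgraphite.com/api/v1/events/",
    auth=(api_key, ""),
    json=event,
    timeout=10,
)
resp.raise_for_status()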

Script to automatically log in to webpage and click button

I have a time clock system that employees log in to via a browser to punch their time, but I find that the process takes our less tech-savvy employees 4-5 minutes. Is there a way to make a script that would auto-login for them, load the punch page, select the clock function from the menu, and then click the 'punch' button? Ideally I'd like to make two shortcuts on the desktop linking to these scripts: one for clocking in and one for clocking out. The button and menu both have IDs, so I know it is possible to assign those values via JavaScript, but I'm unsure how to do the auto-login / page redirection.
I see several ways you could go with this.
Coding scripts using WebDriver. WebDriver is a browser automation library that can interact with most page elements, so it could likely do what you're looking for, and it works in a variety of programming languages, which gives you flexibility. Here's the getting started guide for Java. (There's a rough sketch after these options.)
Use a macro recorder like AutoHotKey. Later versions come complete with COM support and you could hook up to Internet Explorer. More details are here. If you don't know AutoHotKey, you'll likely want to go through some of the initial tutorials before digging through that post though.
The third way would be to look for an API or web service or even a tool like curl (a command line tool to fetch URLs). Depending on how your time card application is coded, you might be able to create a batch script that never actually renders the page but just calls the URL in succession. This is likely to be the fastest solution for the users but may prove difficult if there's a lot of asynchronous script calls or PUT http requests in your app. A curl tutorial is available here.
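To make the WebDriver option above concrete, here is a rough Python/Selenium sketch; the URL, credentials, and element IDs are placeholders, so substitute the real ones from your time clock pages.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    # 1. Log in (placeholder URL, credentials, and element IDs).
    driver.get("https://timeclock.example.com/login")
    driver.find_element(By.ID, "username").send_keys("jdoe")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "loginButton").click()

    # 2. Wait for the punch page, pick the clock function from the menu,
    #    then click the punch button.
    wait = WebDriverWait(driver, 10)
    wait.until(EC.element_to_be_clickable((By.ID, "clockMenu"))).click()
    wait.until(EC.element_to_be_clickable((By.ID, "punchButton"))).click()
finally:
    driver.quit()
Wrapped in two small scripts (one clicking the clock-in control, one the clock-out control), each desktop shortcut would simply launch the matching script.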
Another solution: Sikuli (free, open source, Linux/Mac OS X/Windows). It lets you write Python scripts that click on the screen based on screenshots you provide.
You might also be interested in Selenium IDE.

Is there a way to extract the sections and notes from OneNote using the web API

I am trying to write a sync program so that someone can link with OneNote from inside my ASP.NET application. I would like to read all the section names and page names from OneNote (via the API) and show them in my application. I would also like to download the notes in HTML format so that users can preview them. If they want to edit or add new notes, I would like to send them to the OneNote web application to make the changes.
I could not find an API for this. Is it possible?
The current APIs only allow you to create pages. We don't have support for reading pages, sections or notebooks back at the moment, but we are actively working on those APIs. Follow us on Twitter (@onenotedev) for future updates. We expect to make some of these APIs available over the next several months.
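For what it's worth, a minimal sketch of the one operation described as supported (page creation) is below. The v1.0 pages endpoint, the token handling, and the response shape are assumptions here; check the current OneNote API documentation before relying on any of it.
import os
import requests

access_token = os.environ["ONENOTE_ACCESS_TOKEN"]  # obtained separately via OAuth

page_html = """<!DOCTYPE html>
<html>
  <head><title>Created from my ASP.NET app</title></head>
  <body><p>Placeholder content.</p></body>
</html>"""

resp = requests.post(
    "https://www.onenote.com/api/v1.0/pages",   # assumed v1.0 endpoint
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "text/html",
    },
    data=page_html.encode("utf-8"),
    timeout=10,
)
resp.raise_for_status()

# The response is said to include links back to the new page in the OneNote
# web/client apps, which is what you'd hand to users for editing (assumed shape).
print(resp.json().get("links", {}))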

How to profile my production web application?

I have a blog I made. Recently, I've been noticing some performance problems: I'm getting about 400ms waiting times for the index page, which I think is quite high. When I first deployed it (with fewer features, but still), I recall index load times of something like 80ms.
Now I would profile it, but the problem is that this only happens in my production environment. In my test environment, the index page only takes 10ms.
How do I go about profiling my production application? I use Apache + Mono + mod_mono on Arch Linux with MongoDB. I have a similar test environment, except I use xsp.
I'm unsure of where to look: my code, Apache's configuration, or MongoDB? How can I profile my production server to figure out why it's so much slower than my development environment?
Tough to be specific without details, but here's a shot at a general guide:
First I would recommend using something like Firebug for Firefox - there are equivalents in other browsers, but this is my old go-to tool for this kind of thing. Enable it and go to the Net Panel view for a waterfall diagram that shows every object loading on the page (you might have to refresh); it will also have a blue line marking the render event (when the page becomes visible).
The waterfall should make it pretty obvious where the slow pieces of the page are and armed with that information you can go to the next stage - figuring out why particular pieces are slow.
If plugins are not your thing, or you suspect that it could be something local to your machine causing the issue, then take a look at: http://www.webpagetest.org/
That will give you the ability to remote test from different locations, different browsers, speeds etc. and give you similar detailed results.
If it is a static file being fetched, look at network problems or Apache as the cause. If it is dynamically generated, then look at Apache, ASP.NET, MongoDB, etc.
For Apache, what do the access logs say the response time for the index page is? Assuming Apache 2 or newer, make sure you have %D (and %T if you like) being logged so you can see the time taken to serve the page (from the Apache perspective) at the required level of detail. For more info, take a look at the LogFormat directive.
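As a sketch, a log format along these lines would capture the serve time (the format name and log path are placeholders; adapt them to your existing LogFormat/CustomLog setup):
# %D = time taken to serve the request in microseconds, %T = whole seconds.
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %D" combined_with_time
CustomLog /var/log/httpd/access_log combined_with_time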
I can't help on the ASP/Mono side, not my thing, but adding debug statements at various points to track the index page generation (assuming it is dynamically generated) would be a pretty standard approach.
For the database, MongoDB by default logs only "slow" queries that take >100ms - if you are trying to track down a sub-100ms response time issue via the logs you will need to adjust that or you will likely get very little. That can be done as follows:
> db.setProfilingLevel(0,20) // leave profiling off, slow threshold=20ms
You can also adjust it as a startup parameter (--slowms) to the mongod process. More information on profiling, which may also help but has overheads, can be found here:
http://www.mongodb.org/display/DOCS/Database+Profiler
I'd suggest you have a look at Sam Saffron's MiniProfiler. If you use it in your site, it allows you to turn on profiling in production.
By adding sufficient instrumentation to your code, you should then be able to identify which bit is taking the time and then focus your efforts there.
