We would like to know if it is possible to send parameters to the WebDAV server (for example as a query string in the path: http://server:8080/WebDavItHit/Notes.txt?param=value...) using the IT Hit WebDAV Server Library for Java + JS Client.
We would like to validate individual users through another application, and being able to send some parameters would be really useful.
We would appreciate any way of doing this, or any alternative, with your library.
Some WebDAV clients, such as MS Office, truncate the query string when saving a document. So to pass parameters you would typically encode them in the file path, for example:
https://server/[SessionID1234567890]/path/file.ext
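On the server side you would then extract that segment from the path before resolving the file. Below is a minimal sketch of the idea, assuming a bracketed path segment like the one above; the segment format, class, and method names are illustrative, not part of the IT Hit API:

    // Hypothetical sketch: pull a "[SessionID...]" segment out of a WebDAV request path.
    // The bracketed-segment convention and helper names are assumptions for illustration,
    // not part of the IT Hit WebDAV Server Library API.
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class PathToken {
        private static final Pattern TOKEN = Pattern.compile("/\\[([^/\\]]+)\\]");

        // Returns the token embedded in the path, or null if none is present.
        public static String extractToken(String path) {
            Matcher m = TOKEN.matcher(path);
            return m.find() ? m.group(1) : null;
        }

        // Returns the path with the token segment removed, so it maps to the real file.
        public static String stripToken(String path) {
            return TOKEN.matcher(path).replaceFirst("");
        }

        public static void main(String[] args) {
            String path = "/[SessionID1234567890]/path/file.ext";
            System.out.println(extractToken(path)); // SessionID1234567890
            System.out.println(stripToken(path));   // /path/file.ext
        }
    }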
From https://en.wikipedia.org/wiki/Query_string
A web server can handle a Hypertext Transfer Protocol request either by reading a file from its file system based on the URL path or by handling the request using logic that is specific to the type of resource. In cases where special logic is invoked, the query string will be available to that logic for use in its processing, along with the path component of the URL.
What does the quote mean by the two methods by which a web server can handle an HTTP request:
"by reading a file from its file system based on the URL path"
"by handling the request using logic that is specific to the type of resource"?
Can you give specific examples to explain the two methods?
Is the query string used in both methods?
Thanks.
by reading a file from its file system based on the URL path
^ The web site uses a generic mapping mechanism to convert a URL path to a local filesystem path, and then returns the file located at that path. This is common with static files like .css. In this case the query string is normally ignored: it does not change which file is returned.
by handling the request using logic that is specific to the type of resource
^ The web site turns control over to a web application, which contains code written by a developer. The code reads the query string and decides what to do. The logic for deciding what to do is completely customizable, and there does not need to be a static file in the local filesystem that matches the URL.
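As a concrete, hypothetical illustration of the two methods, here is a small sketch using the JDK's built-in HTTP server: one context serves files straight from disk based on the path, the other runs resource-specific logic that reads the query string. The port, paths, and the wwwroot folder are made up for the example:

    // Minimal sketch using the JDK's built-in HTTP server to show both styles.
    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class TwoStyles {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

            // Method 1: map the URL path to a file on disk; the query string plays no role.
            server.createContext("/static", exchange -> {
                String rel = exchange.getRequestURI().getPath().replaceFirst("^/static/?", "");
                byte[] body = Files.readAllBytes(Path.of("wwwroot", rel));
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
            });

            // Method 2: resource-specific logic; the query string drives the response.
            server.createContext("/greet", exchange -> {
                String query = exchange.getRequestURI().getQuery();   // e.g. "name=Alice"
                String name = (query != null && query.startsWith("name=")) ? query.substring(5) : "world";
                byte[] body = ("Hello, " + name).getBytes(StandardCharsets.UTF_8);
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
            });

            server.start();
        }
    }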
I am working with a legacy API which grants access via a key/secret combo, which the original API designer specified should be passed as the username & password in an HTTP Basic auth header, e.g.:
curl -u api_key:api_secret http://api.example.com/....
Now that our API client base is going to be growing, we're looking to use 3scale to handle authentication, rate limiting, and other functions. As per 3scale's instructions and advice, we'll be using an Nginx proxy in front of our API server, which authenticates against 3scale's services to handle all the access control.
We'll be exporting our existing clients' keys and secrets into 3scale and keeping the two systems in sync. We need our existing app to continue to receive the key & secret in the existing manner, as some of the returned data is client-specific. However, I need to find a way of converting that HTTP Basic auth request, which 3scale doesn't natively support as an authentication method, into the rewritten custom headers which it does.
I've been able to set up the proxy using the Nginx and Lua configs that 3scale configures for you. This allows the -u key:secret to be passed through to our server, and correctly processed. At the moment, though, I need to additionally add the same authentication information either as query params or custom headers, so that 3scale can manage the access.
I want my Nginx proxy to handle that for me, so that users provide one set of auth details, in the pre-existing manner, and 3scale can also pick it up.
In a language I know, e.g., Ruby, I can decode the HTTP_AUTHORIZATION header, pick out the Base64-encoded portion, and decode it to find the key & secret components that have been supplied. But I'm an Nginx newbie, and don't know how to achieve the same within Nginx (I also don't know if 3scale's supplied Lua script can/will be part of a solution)...
Reusing the HTTP Authorization header for the 3scale keys can be supported with a small tweak in your Nginx configuration files. As you rightly point out, the Lua script that you download is the place to do this.
However, I would suggest a slightly different approach regarding the keys that you import to 3scale. Instead of using the app_id/app_key authentication pattern, you could use the user_key mode (which is a single key). Then what you would import to 3scale for each application would be the base64 string of api_key+api_secret combined.
This way, the changes you will need to make to the configuration files will be fewer and simpler.
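For reference, the value you would import as each application's user_key is presumably the same Base64 string that Basic auth already sends for api_key:api_secret. A quick sketch of producing it (the key and secret values are placeholders):

    // Sketch: build the Base64 credential that Basic auth sends for "api_key:api_secret".
    // This single string is what you would import into 3scale as the application's user_key.
    // The key and secret values below are placeholders.
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public class UserKey {
        public static void main(String[] args) {
            String apiKey = "my_api_key";
            String apiSecret = "my_api_secret";
            String userKey = Base64.getEncoder()
                    .encodeToString((apiKey + ":" + apiSecret).getBytes(StandardCharsets.UTF_8));
            System.out.println(userKey); // same token curl -u api_key:api_secret puts after "Basic "
        }
    }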
The steps you will need to follow are:
in your 3scale admin portal, set the authentication mode to API key (https://support.3scale.net/howtos/api-configuration/authentication-patterns)
go to the proxy configuration screen (where you set your API backend, mappings and where you download the Nginx files).
under "Authentication Settings", set the location of the credentials to HTTP headers.
download the Nginx config files and open the Lua script
find the following line (should be towards the end of the file):
local parameters = get_auth_params("headers", string.split(ngx.var.request, " ")[1] )
replace it with:
local parameters = get_auth_params("basicauth", string.split(ngx.var.request, " ")[1] )
finally, within the same file, replace the entire function named "get_auth_params" for the one in this gist: https://gist.github.com/vdel26/9050170
I hope this approach suits your needs. You can also contact support@3scale.net if you need more help.
I have an ASP.NET MVC application which allows uploading a company structure using a CSV file. I was asked about the possibility of automating this function with a PowerShell script. Creating the CSV in PowerShell is easy, but I have no idea how to upload it to the ASP.NET application.
My first choice was to use WebClient, but I have a problem with authentication: in MVC we are using forms authentication. I read here that it is possible, but if my login form changes I will have to send an updated script to the client. I would like to avoid maintaining code on the client side.
The second option is to create a separate controller and use an authorization token in it, but that looks like reinventing the wheel, because I would need to write all the code responsible for authentication.
Can I improve one of the above options? Or is there a better choice?
You might be able to use the existing web service using the Invoke-RestMethod cmdlet, but it could take two separate invocations. For both invocations you'll probably need to say -Method Post. It's possible to do all this with WebClient, but the cmdlet may be easier to describe. I haven't actually tried this, but it could look something like this:
Invoke-RestMethod -Method Post $loginPage -SessionVariable webSession -Body "..."
Invoke-RestMethod -Method Post $uploadPage -WebSession $webSession -Body "..."
For the first invocation, you specify the URL of the login page, and would simulate a web forms login by providing a username and password in the -Body parameter, and use -SessionVariable to capture and store context for making the next request(s) authenticated.
In the second request, you use your data upload URL, the -WebSession parameter to supply the context established by the first request, and -Body to upload your CSV. Note that you need the dollar sign on the webSession variable in the second one, but not the first.
Of course, you'll need to store the username/password somewhere for the automation to use, but that's always needed for unattended automation. A slicker approach to mitigate this would be to use client certificate-based credentials rather than web form authentication.
How about using a side channel?
There are two approaches. You either send the data to the customer's web server every now and then, or the web server downloads the data from you.
For sending data, FTP over SSL should be secure enough. Here is an example of how to script FTP with PowerShell. FTP/SSL is easy enough to configure on IIS.
For receiving data, just publish the CSV on your own web site. Set up a script on the customer's web server that downloads CSV every now and then. If the CSV should not be accessible to anyone but the customer, require a client certificate.
I would probably do it like this:
Step 1: Create an HTTPS page on the ASP.NET server to receive the CSV
Step 2: Create a PowerShell script that calls curl with the -F option to post the file to it, appending any metadata you need to the call
Step 3: Upon receiving the file, store it using the metadata provided in the form and append clientid/date/etc. to the file
I want to send a bunch of XML files from my client (an iPad) to my application server (web). Is there any way I can pass them to the server using HTTP POST? I assume HTTP POST only allows embedding strings, not attaching files. We don't want to use FTP for security reasons. We even thought of a web service, but we're not sure whether attachments are possible. Please advise if you know of any way of transferring files from client to server.
The maximum length of a POST body is massive, so no worries there: you can send XML fine. POST can send any type of data; just make sure you set the Content-Type header correctly or you may get unexpected results.
It is no less (or more) secure than FTP, however.
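For example, here is a minimal sketch of POSTing one XML file with an explicit Content-Type header using Java's standard HTTP client; the upload URL and file name are placeholders, and the same request shape applies whatever client platform you use:

    // Sketch: POST an XML file as the request body with an explicit Content-Type.
    // The URL and file name are placeholders for illustration.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Path;

    public class XmlUpload {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.com/api/upload"))
                    .header("Content-Type", "application/xml")
                    .POST(HttpRequest.BodyPublishers.ofFile(Path.of("notes.xml")))
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
        }
    }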
Use-case: Upload a simple image file to a server, which clients could later retrieve
Designate an FTP server for the job.
HTTP PUT: can upload files directly to a server without needing a server-side component to handle the byte stream.
HTTP POST: the byte stream is handled by a server-side component.
I think to safely use PUT on a public website requires even more effort than using POST (and is less commonly done) due to potential security issues. See http://bitworking.org/news/PUT_SaferOrDangerous.
OTOH, I think there are plenty of resources for safely uploading files with POST and checking them in the server side script, and that this is the more common practice.
PUT is only appropriate when you know the URL you are putting to.
You could also do:
4) POST to obtain a URL to which you then PUT the file.
edit: how are you going to get the HTTP server to decide whether it is OK to accept a particular PUT request?
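To make option 4) concrete, here is a hedged sketch in Java: the client first POSTs to ask the server for an upload URL, then PUTs the file bytes to the URL it received. The endpoint paths and the assumption that the server replies with a bare URL are made up for illustration:

    // Sketch of option 4): POST to obtain an upload URL, then PUT the file to that URL.
    // Endpoint paths and the plain-text URL response are assumptions for illustration.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Path;

    public class PostThenPut {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();

            // Step 1: ask the server where to upload; assume it answers with a bare URL.
            HttpRequest reserve = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.com/uploads"))
                    .POST(HttpRequest.BodyPublishers.noBody())
                    .build();
            String uploadUrl = client.send(reserve, HttpResponse.BodyHandlers.ofString()).body().trim();

            // Step 2: PUT the image bytes to the URL the server handed back.
            HttpRequest put = HttpRequest.newBuilder()
                    .uri(URI.create(uploadUrl))
                    .header("Content-Type", "image/png")
                    .PUT(HttpRequest.BodyPublishers.ofFile(Path.of("photo.png")))
                    .build();
            System.out.println(client.send(put, HttpResponse.BodyHandlers.discarding()).statusCode());
        }
    }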
What I usually do (via PHP) is HTTP POST, and then employ PHP's move_uploaded_file() to get the file to whatever destination I want.