We consume a third-party service and use the code below to call it. After adding its service reference, we get the proxy objects.
client objClient = new client();
info objEntities = new info();
objEntities.reginfo = new reginfoRequest();
response objResp = objClient.registerInfo(objEntities);
I have enabled WCF diagnostics and it records the XML in a .svclog file.
But what I want to do is record each request and response XML during processing and save each one as a separate file in a specific folder.
How can I do that?
What about using Fiddler?
There you can see all requests and responses.
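If you also need the application itself to write each message to a folder, one common approach is a WCF client message inspector. The following is only a sketch under assumptions: the class names and the C:\Logs\Xml folder are mine; only IClientMessageInspector and IEndpointBehavior are standard WCF types.

using System;
using System.IO;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Description;
using System.ServiceModel.Dispatcher;

public class XmlFileLoggingInspector : IClientMessageInspector
{
    private readonly string _folder;

    public XmlFileLoggingInspector(string folder)
    {
        _folder = folder;
        Directory.CreateDirectory(_folder); // make sure the target folder exists
    }

    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        request = Save(request, "request");
        return null;
    }

    public void AfterReceiveReply(ref Message reply, object correlationState)
    {
        reply = Save(reply, "reply");
    }

    private Message Save(Message message, string kind)
    {
        // Buffer the message so it can be written to disk and still be processed afterwards.
        MessageBuffer buffer = message.CreateBufferedCopy(int.MaxValue);
        string fileName = Path.Combine(_folder,
            DateTime.Now.ToString("yyyyMMdd_HHmmss_fff") + "_" + kind + ".xml");
        File.WriteAllText(fileName, buffer.CreateMessage().ToString());
        return buffer.CreateMessage();
    }
}

public class XmlFileLoggingBehavior : IEndpointBehavior
{
    public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
    {
        clientRuntime.ClientMessageInspectors.Add(new XmlFileLoggingInspector(@"C:\Logs\Xml"));
    }

    public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection bindingParameters) { }
    public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher) { }
    public void Validate(ServiceEndpoint endpoint) { }
}

Before calling the service, attach the behavior to the generated proxy, e.g. objClient.Endpoint.Behaviors.Add(new XmlFileLoggingBehavior()); every request and reply XML then lands in the folder as its own file.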
I'm creating a Blazor Server app. I use external file storage with a REST API.
I want to create a download button to get a file from the storage. This may seem easy, but it isn't.
From the file storage I download the HttpContent like this:
var request = new HttpRequestMessage(HttpMethod.Get, _url);
request.Headers.Add("auth-token", token);
request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/octet-stream"));
HttpResponseMessage response = await _Http.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
response.EnsureSuccessStatusCode();
var content = response.Content;
Next I follow this tutorial: https://learn.microsoft.com/en-us/aspnet/core/blazor/file-downloads?view=aspnetcore-6.0
var fileStream = content.ReadAsStream();
using (var streamRef = new DotNetStreamReference(fileStream))
{
await JS.InvokeVoidAsync("downloadFileFromStream", "file.txt", streamRef);
}
For small files everything works great. But if I try to download a large file (100 MB), the whole file is first downloaded into the server's memory (RAM) and only afterwards saved to the client's local disk.
Ideally, when I click the download button, the file from the external storage would just start downloading (with a progress bar), like a plain physical file served by an HTTP server, e.g. https://www.example.com/file.txt, without being buffered in a stream on the server. Of course it should still go through my Blazor Server application, with authorization, authentication and all the necessary services.
I have a solution:
1. Create a service that talks to the file storage API.
2. Create a controller to avoid cross-origin errors (a sketch follows below).
3. Use the Microsoft tutorial to create the download button: https://learn.microsoft.com/en-us/aspnet/core/blazor/file-downloads?view=aspnetcore-6.0
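A minimal sketch of the controller from step 2, assuming a named HttpClient ("storage") registered with the storage API's base address, an auth-token header like the one in the question, and the token kept in configuration; the important part is HttpCompletionOption.ResponseHeadersRead plus returning File(stream, ...), so the body is streamed through instead of being buffered in the server's RAM:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using System.Net.Http;
using System.Threading.Tasks;

[ApiController]
[Route("api/files")]
public class FilesController : ControllerBase
{
    private readonly IHttpClientFactory _factory;
    private readonly IConfiguration _config;

    public FilesController(IHttpClientFactory factory, IConfiguration config)
    {
        _factory = factory;
        _config = config;
    }

    [HttpGet("{fileName}")]
    public async Task<IActionResult> Download(string fileName)
    {
        var http = _factory.CreateClient("storage"); // BaseAddress points at the storage API
        var request = new HttpRequestMessage(HttpMethod.Get, fileName);
        request.Headers.Add("auth-token", _config["Storage:Token"]); // wherever you keep the token

        // ResponseHeadersRead: do not buffer the body on the server, just open the stream.
        var response = await http.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();

        var stream = await response.Content.ReadAsStreamAsync();
        return File(stream, "application/octet-stream", fileName); // streamed to the browser
    }
}

With the controller in place, the download button can either feed the tutorial's JavaScript helper or simply navigate to the endpoint (for example NavigationManager.NavigateTo($"api/files/{fileName}", forceLoad: true)), in which case the browser shows its usual download progress.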
I'm a Java developer, absolutely new to the BMC Remedy system, but I have just one quick task to solve.
Our Remedy installation uses a Java applet to upload files from the Remedy browser UI to an FTP server. I should replace it with JavaScript (upload the files via HTTP to the server side, which then uploads them to the FTP server).
In a general web application I could add a servlet which receives the multipart file, connects to FTP, uploads it and responds with parameters. Piece of cake.
But is that the right way to solve this problem in Remedy? I've read the documentation, but it is all about some sort of plugins for the Remedy Mid-Tier, and there is nothing about simple servlets.
What is the right way to solve my task? Any source samples would be really helpful.
Thank you.
If you are doing it via the API, you could just get the record ID and the field ID and do this:
//First, we retrieve the form
int[] fieldIds = {1};
String formName = "My:Form:Name";
//Request ID. Field ID = 1. Always 14 chars long.
String requestID = "00000000000001";
Entry entry = arsConnection.getEntry(formName, requestID, fieldIds);
//add the attachment
AttachmentValue attachment = new AttachmentValue("name_of_file.ext", "path/to/file.ext");
entry.put(550000011, new Value(attachment));
arsConnection.setEntry(formName, requestID, entry, null, 0);
To do this, you need the request ID. This code uses the Java API.
How can I download a file from an SVN repository from my web application in C#.NET?
I want to download the file programmatically. When a button is clicked, it should download the file from the URL given in a TextBox.
First off, you should be able to access (read-only) the file directly on the repository without the need for an SVN client (though you may need authentication). In that case, it's just like downloading any file from any URL:
using (WebClient client = new WebClient())
{
// In case you need authentication...
// client.Credentials = new NetworkCredential("username", "password");
client.DownloadFile(fileUrl, localDestinationPath);
}
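For the button-click case from the question, that call might be wired up roughly like this (txtUrl, btnDownload and the App_Data destination are assumed names; note that DownloadFile saves the file on the web server, not on the visitor's machine):

// needs: using System.IO; using System.Net;
protected void btnDownload_Click(object sender, EventArgs e)
{
    string fileUrl = txtUrl.Text;
    string localDestinationPath = Server.MapPath("~/App_Data/" + Path.GetFileName(fileUrl));

    using (WebClient client = new WebClient())
    {
        // client.Credentials = new NetworkCredential("username", "password");
        client.DownloadFile(fileUrl, localDestinationPath);
    }
}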
However, like Vinayak said, SharpSvn is not a bad way to go; it's pretty good for handling SVN interactions. So if the standard WebClient doesn't work, you can do the following with SharpSvn to get the file stream:
MemoryStream stream = new MemoryStream();
using (SvnClient client = new SvnClient())
{
client.Authentication.DefaultCredentials = new NetworkCredential("username", "password");
client.Write(SvnTarget.FromUri(svnFilePath), stream);
}
stream.Position = 0; // rewind so callers can read the stream from the start
return stream;
Hope this helps!
Edit:
I've also found another SharpSvn tutorial which I hope you'll find helpful. It has everything from an introduction to SVN operations (checkout, commit, etc.).
Also, read about SharpSvn and check out this blog; there are a few related examples for checkout, update and commit in C#.
The source code contains an example project as well.
I have a site set up in IIS. It allows users to download files from a remote cloud to their own local desktop. HOWEVER, the context seems to be mixed up: when I access the website externally via its IP and run the download, it saves the file to the server hosting the site, not to my local machine. What's going on?
My relevant lines of code:
using (var sw2 = new FileStream(filePath,FileMode.Create))
{
try
{
var request = new RestRequest("drives/{chunk}");
RestResponse resp2 = client.Execute(request);
sw2.Write(resp2.RawBytes, 0, resp2.RawBytes.Length);
}
}
Your code is writing a file to the local filesystem of the server. If you want to send the file to the client, you need to do something like
Response.BinaryWrite(resp2.RawBytes);
The Response object is what you use to send data back to the client who made the request to your page.
I imagine the code snippet you posted is running in some sort of code-behind somewhere. That runs on the server; it's not going to run on the client. You will need to write those bytes to the Response object, specify the content type and so on, and let the user save the file himself, as sketched below.
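For example, in a Web Forms code-behind the complete response for a download usually looks something like this (resp2 is the RestSharp response from the question; fileName is whatever name you want the browser to suggest):

Response.Clear();
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
Response.BinaryWrite(resp2.RawBytes);
Response.End();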
I have a remote RSS feed which has to be transformed into Notes documents using LotusScript.
I've looked through the documentation, but I can't find how to open a remote URL in order to retrieve its contents. In other words, some sort of wget- or curl-like functionality. Can anyone shed some light on how to do this? Using Java is not an option.
Thanks.
Check out the NotesDOMParser class - available in LotusScript - which lets you (indirectly) pull XML from a remote URL and process it in an XML DOM object.
You can pull the XML into a string using the MSXMLHTTP COM object, then use NotesStream to send the XML to the NotesDOMParser.
I have not tested, but the code would look something like this:
...
Set objXML = CreateObject("Microsoft.XMLHTTP")
objXML.open "GET", sURL, False, "", ""
objXML.send("")
sXMLAsText = Trim$(objXML.responseText)
Set inputStream = session.CreateStream
Call inputStream.WriteText(sXMLAsText)
Set outputStream = session.CreateStream
Set domParser = session.CreateDOMParser(inputStream, outputStream)
domParser.Process
...
Documentation: http://publib.boulder.ibm.com/infocenter/domhelp/v8r0/index.jsp?topic=/com.ibm.designer.domino.main.doc/H_NOTESDOMPARSER_CLASS.html
You can't open a remote URL (whether it's HTTP or some other protocol) using native LotusScript: the object library simply doesn't support it. If you're running on a Windows server, you should be able to use the MS XMLHttp DLLs to get a handle on your remote file via a URL, as described in the previous answer. (Alternatively, this link explains how to parse and open a UNC path with LotusScript; again, Windows only.)
All that said, if I understand you correctly, you're not using HTTP to access the remote file at all. If the RSS file is just on a simple path, why can't you open the file for parsing in the normal way with LotusScript?