I have a remote RSS feed which has to be transformed into Notes documents using LotusScript.
I've looked through the documentation, but I can't find how to open a remote URL in order to retrieve its contents. In other words, some sort of wget- or curl-like functionality. Can anyone shed some light on how to do this? Using Java is not an option.
Thanks.
Check out the NotesDOMParser class - available in LotusScript - which lets you (indirectly) pull XML from a remote URL and process it in an XML DOM object.
You can pull the XML into a string using the MSXMLHTTP COM object, then use NotesStream to send the XML to the NotesDOMParser.
I have not tested, but the code would look something like this:
...
' Fetch the feed over HTTP using the MSXML COM object
Set objXML = CreateObject("Microsoft.XMLHTTP")
objXML.open "GET", sURL, False, "", ""
objXML.send("")
sXMLAsText = Trim$(objXML.responseText)
' Write the XML text into a NotesStream and feed it to the NotesDOMParser
Set inputStream = session.CreateStream
Call inputStream.WriteText(sXMLAsText)
Set outputStream = session.CreateStream
Set domParser = session.CreateDOMParser(inputStream, outputStream)
domParser.Process
...
Documentation: http://publib.boulder.ibm.com/infocenter/domhelp/v8r0/index.jsp?topic=/com.ibm.designer.domino.main.doc/H_NOTESDOMPARSER_CLASS.html
You can't open a remote URL (whether it's HTTP or some other protocol) using native LotusScript: the object library simply doesn't support it. If you're running on a Windows server, you should be able to use the MS XMLHttp DLLs to get a handle on your remote file via a URL, as described in the previous answer. (Alternatively, this link describes how to parse and open a UNC path with LotusScript, again Windows only.)
All that said, if I understand you correctly, you're not using HTTP to access the remote file at all. If the RSS file is just on a simple path, why can't you open the file for parsing in the normal way with LotusScript?
I'm a Java developer, absolutely new to the BMC Remedy system, but I have just one quick task to solve.
Our Remedy installation uses a Java applet to upload files from the Remedy browser UI to an FTP server. I need to replace it with JavaScript (upload files via HTTP to the server side, which then uploads them to the FTP server).
In a general web application, I could add a servlet which would receive the multipart file, connect to FTP, upload it and respond with parameters. Piece of cake.
But is that the right way to solve this problem in Remedy? I've read the documentation, and it is all about plugins for the Remedy Mid-Tier; there is nothing about simple servlets.
What is the right way to solve my task? Any source samples would be really helpful.
Thank you.
If you are doing it via the API, you can just get the record ID and field ID and do this:
// First, we retrieve the entry
int[] fieldIds = {1};
String formName = "My:Form:Name";
// Request ID. Field ID = 1. Always 14 chars long.
String requestID = "00000000000001";
Entry entry = arsConnection.getEntry(formName, requestID, fieldIds);
// Add the attachment to the attachment field (field ID 550000011 here)
AttachmentValue attachment = new AttachmentValue("name_of_file.ext", "path/to/file.ext");
entry.put(550000011, new Value(attachment));
// Write the updated entry back to the form
arsConnection.setEntry(formName, requestID, entry, null, 0);
To do this, you need the request ID. This code uses the AR System Java API.
I've tried using
string path = @"\\abc\wof\TY044-12";
bool exist = System.IO.Directory.Exists(path);
but the check returns true on localhost and false on the server side.
Also, I've searched for answers, but it is difficult to reconfigure the IIS permissions.
Can I use FileWebRequest/HttpWebRequest for this? I don't understand how it works:
FileWebRequest request = (FileWebRequest)System.Net.WebRequest.Create(@"\\abc\wof\TY044-12");
FileWebResponse response = (FileWebResponse)request.GetResponse();
I hope someone can help me. Thanks!
That looks more like a UNC path.
To me it seems the app pool account on IIS which is hosting your application doesn't have permission to access that UNC folder. Please refer to the link below to set it:
http://technet.microsoft.com/en-us/library/cc771170(v=ws.10).aspx
The link below says you can access files over the network using pre-registered reserved types like http, https, file, etc.
http://msdn.microsoft.com/en-us/library/bw00b1dc(v=vs.110).aspx
...And if you pass UNC paths to the Uri class during construction, you get the required URI scheme:
var uri = new Uri(@"\\abc\folder\file.jpg");
Console.WriteLine(uri.ToString()); //outputs - file://abc/folder/file.jpg
However the recommended approach is using classes in the System.IO namespace like you originally started out with.
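Since the question also asks about FileWebRequest, here is a minimal sketch of reading a file over a UNC path that way; the file name is a placeholder, and note that this still runs under the same app pool identity, so it does not get around the permission issue described above.
using System;
using System.IO;
using System.Net;

// Placeholder UNC path; the account running this code needs read access to the share.
var uri = new Uri(@"\\abc\wof\TY044-12\example.txt");

// WebRequest.Create returns a FileWebRequest for file:// URIs.
var request = (FileWebRequest)WebRequest.Create(uri);

using (var response = (FileWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    Console.WriteLine(reader.ReadToEnd());
}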
How do I decode the rdweb/feed/webfeed.aspx content from a Microsoft remote desktop(RDP) server?
I am having difficulty identifying the encoding of webfeed.aspx, or more specifically the https://<RDP server>/rdweb/feed/webfeed.aspx URL of the RDP client. In Microsoft's RDP client, the data resolves to references to directories and applications that can be used as shortcuts for the RDP connection.
The file that I get appears to be a base64 encoded file. From what I have read, this should be an XML file that describes the resources, but it seems to be compressed or encoded somehow. I am having no issue getting the data. I can read it via a browser (though not understand it) and Microsoft's RDP client is pulling the data appropriately, so the data is good. I need to decode/process the data because I am extending an open source RDP tool to do the same as Microsoft's RDP client.
Here is an example, from the text file returned by a test server's rdweb/feed/webfeed.aspx:
46672D19C141995BFAA3317324E7595B8AF001B09CF315A3668E2335F383079AA7397E6E8ADF56379306F18DCCFFB4A542CC4C8B81609D5E9D738F8347BC0372EB7513DD797EF0BFA921F7D6E2A108C6A12F44712D57D6191FB068AF1733256291BC0BD7429AD585DA9E6ECC3D1380562A091E980D6908E2E0EF4184689329686AD132E2D63945810D93F88ECAEC6A0B9460F23B9ABF229F974D3B32D0D7415CD8EAF1B6B93678718C9E658F0CEDA604D5294FF3458FB2ABD798A668E8E6714939C8115EC00A13354F8EF22563CF65F5C6D053306D4C3276032D045752412BA760C683C5
Have you tried something like this?
HttpWebRequest httpWebRequest = (HttpWebRequest)WebRequest.Create("https://RDPurl/rdweb/feed/webfeed.aspx");
HttpWebResponse httpWebResponse = (HttpWebResponse)httpWebRequest.GetResponse();
string connectionXml;
using (StreamReader streamReader = new StreamReader(httpWebResponse.GetResponseStream()))
{
connectionXml = streamReader.ReadToEnd();
}
More detailed code is here.
The resulting connectionXml string should be in Resource List syntax.
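As a quick sanity check, you could load the response into an XmlDocument; if it parses, you are dealing with plain XML rather than a compressed or encrypted blob. The element and attribute names below (Resource, Title) are assumptions for illustration only, not a confirmed schema.
using System;
using System.Xml;

// connectionXml is the string read from webfeed.aspx above.
var doc = new XmlDocument();
doc.LoadXml(connectionXml); // throws XmlException if the payload is not well-formed XML

Console.WriteLine(doc.DocumentElement.Name); // root element name of the feed

// Hypothetical element/attribute names, purely for illustration:
foreach (XmlElement resource in doc.GetElementsByTagName("Resource"))
{
    Console.WriteLine(resource.GetAttribute("Title"));
}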
I'm dynamically creating HTML files on my local system (using HtmlTextWriter, then saving them with StreamWriter to the local file system). I want to copy these files to my remote server without user interaction, so that my users can read them. I use C#.
For instance, I want to copy d:\myfile.html to mysite.com\myfile.html. How can I do it?
I have used this and it worked; it may be useful.
For holding the path:
rPath = "\\" & Request.UserHostAddress & "\c$\temp\"
For the output file:
rOutput = Session.SessionID & "_" & Format(Date.Now(), "ddMMyyhhmmss") & ".pdf"
The report will then be created in c$\temp on that host.
You can't use the System.IO classes for this (unless you have access to the remote server as a network drive), but you can programmatically POST the file from the client to the remote server over HTTP using System.Net.
Here's a snippet using the WebRequest class:
WebRequest request = WebRequest.Create( url );
request.Timeout = 1000; // milliseconds; pick an appropriate value
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
// ContentLength is filled in from the buffered request body by default; set it explicitly only if you disable buffering
using( StreamWriter requestStream = new StreamWriter( request.GetRequestStream(), System.Text.Encoding.UTF8) ) {
    // write to the stream here using requestStream.Write();
}
More info for HTTP: http://msdn.microsoft.com/en-us/library/debx8sh9.aspx
Alternatively, you could use a protocol designed for transferring files like FTP (or something more secure) which isn't that hard to do in code.
FTP options: http://msdn.microsoft.com/en-us/library/ms229718
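For the FTP route, a minimal sketch using FtpWebRequest might look like the following; the server address, credentials and file paths are placeholders for your own values.
using System;
using System.IO;
using System.Net;

// Placeholder address and credentials; adjust to your FTP account.
var request = (FtpWebRequest)WebRequest.Create("ftp://mysite.com/myfile.html");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential("username", "password");

byte[] contents = File.ReadAllBytes(@"d:\myfile.html");
request.ContentLength = contents.Length;

using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(contents, 0, contents.Length);
}

using (var response = (FtpWebResponse)request.GetResponse())
{
    Console.WriteLine(response.StatusDescription);
}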
Is your remote server based on Windows and in the same workgroup or domain as your working machine? If so, you can turn on Windows file sharing on the server. Then you can copy your file from cmd like this:
copy c:\test.txt \\mysite.com
The path "\\mysite.com" can also be used with File.Copy in C#.
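A minimal sketch of that approach, assuming a share name of wwwroot (a placeholder) and that the account running the code has write access to it:
using System.IO;

// Placeholder share and file names; the running account needs write access to the share.
File.Copy(@"d:\myfile.html", @"\\mysite.com\wwwroot\myfile.html", overwrite: true);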
Otherwise, you need to set up an FTP environment on your server and use the FTP-related APIs in C#.
You could set up an FTP server and copy the files programmatically via FTP.
An example can be found here or here.
There are three ways in which you can copy the file to the remote server.
1. Using normal file copy. Here you need to have access to the web server's shared path. If the web server is on the same network as your application, you can share the webroot and grant write access to the user running the application. That user can then use File.Copy("source.txt", "\\Servername\SharedFolderName\target.txt").
2. The second approach is to use FTP to copy the file to the remote server. This MSDN example shows how to do this. It will work with most shared hosting providers.
3. You can use HTTP POST as noted by Tim, but this would let any user perform the post. You may have to take care of user provisioning, authentication and authorization. IMO, keep this as the last option, as provisioning users and granting rights to certain paths may become cumbersome.
Issue
Msxml2.ServerXMLHTTP keeps returning 401 - Unauthorised errors each time we attempt to read the contents of a file (ASP) from a web server.
The source server is running IIS 6, using NTLM integrated login.
This process has been used successfully before, but only insofar as extracting XML files from external websites, not internal ones.
The proxy settings in the registry of the server on which the script is run have also been updated to bypass the website in question, but to no avail.
All paths identified in the VBScript have been checked and tested, and are correct.
The user running the script has the correct read/write permissions for all locations referenced in the script.
Solution needed
To identify the cause of the HTTP 401 Unauthorised messages, so that the script will work as intended.
Description
Our organisation operates an intranet, where the content is replicated to servers at each of our remote sites. This ensures these sites have continued fast access to important information, documentation and data, even in the event of losing connectivity.
We are in the middle of improving the listing and management of Forms (those pesky pieces of paper that have to be filled in for specific tasks). This involves establishing a database of all our forms.
However, as the organisation hasn't been smart enough to invest in MSSQL Server instances at each site, replication of the database and accessing it from the local SQL server isn't an option.
To work around this, I have constructed a series of views (ASP pages) which display the required data. I then intend to use Msxml2.ServerXMLHTTP from VBScript, so I can read the resulting pages and save the output to a static file back on the server.
From there, the existing replication process can stream these files out to the site - with users having no idea that they're looking at a static page that just happened to be generated from database output.
Code
' Forms - Static Page Generator
' Implemented 2011-02-15 by Michael Harris
' Purpose: To download the contents of a page, and save that page to a static file.
' Target category: 1 (Contracts)
' Target Page:
' http://sharename.fpc.wa.gov.au/corporate/forms/generator/index.asp
' Target path: \\servername\sharename\corporate\forms\index.asp
' Resulting URL: http://sharename.fpc.wa.gov.au/corporate/forms/index.asp
' Remove read only
' Remove read only flag on file if present to allow editing
' If file has been set to read only by automated process, turn off read only
Const READ_ONLY = 1
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.GetFile("\\server\sharename\corporate\forms\index.asp")
If objFile.Attributes AND READ_ONLY Then
objFile.Attributes = objFile.Attributes XOR READ_ONLY
End If
Dim webObj, strURL
Set webObj = CreateObject("Msxml2.ServerXMLHTTP")
strURL = "http://sharename.fpc.wa.gov.au/corporate/forms/generator/index.asp"
webObj.Open "GET", strURL
webObj.send
If webObj.Status=200 Then
Set objFso = CreateObject("Scripting.FileSystemObject")
Set txtFile = objFso.OpenTextFile("file:\\servername.fpc.wa.gov.au\sharename\corporate\forms\index.asp", 2, True)
txtFile.WriteLine webObj.responseText
txtFile.close
ElseIf webObj.Status >= 400 And webObj.Status <= 599 Then
MsgBox "Error Occurred : " & webObj.Status & " - " & webObj.statusText
Else
MsgBox webObj.ResponseText
End If
Replace your line:
webObj.Open "GET", strURL
With:
webObj.Open "GET", strURL, False, "username", "password"
In most cases 401 Unauthorized means you haven't supplied credentials. Also, you should specify False to indicate you don't want async mode.
It sounds like the O.P. got this working with the correct proxy settings in the registry (http://support.microsoft.com/kb/291008 explains why proxy configuration will fix this). Newer versions of ServerXMLHTTP have a setProxy method that can be used to set the necessary proxy configuration in your code instead.
In the O.P. code above, after webObj is created, the following line of code would set up the proxy correctly:
webObj.setProxy 2, "0.0.0.0:80", "*.fpc.wa.gov.au"
ServerXMLHTTP will pass on the credentials of the user running the code if it is configured with a proxy, and if the target URL bypasses that proxy. Since you are bypassing the proxy anyway, you can make it a dummy value "0.0.0.0:80", and make sure your target url is covered by what you specify in the bypass list "*.fpc.wa.gov.au"
I would first test whether you can reach your URL through a normal browser on the same server X you run your code on (A). I would then try to reach the URL from another PC, one that has never been used to reach that URL but is on the same network as server X (B).
If B works but A doesn't, I would suspect that your source server (i.e. the one that serves the URL) blocks server X for some reason. Check the security settings of IIS 6 and of NTLM.
If neither A nor B works, there is something more generally wrong with your source server (i.e. it blocks everything or NTLM doesn't let you in).
If A works (B doesn't matter then), the problem has to be somewhere in your code. In that case, I would recommend Fiddler. This tool can show you the HTTP requests of both your browser and your code in real time, and you can then compare the two. That should give you at least a very strong hint about (if not immediately give you) the solution.