Why is the file path invalid in an MQ configurable service? - mq

I have a configurable service (CDServer). When I try to deploy my WMB flow that uses a CDInput node, which references the configurable service, I get the following error:
BIP7962E: File path '\\192.168.45.91\myfolder' specified for the property 'brokerPathToInputDir' in the CDServer configurable service is not valid.
The file path is accessible from my Windows Explorer. The folder "myfolder" is shared on the remote computer.
I don't know where the error could be. I've tried changing the file path to a different format (192.168.45.91\myfolder), but it still doesn't work.
I'm using:
WMB 8.0.0.1
MQ 7
Sterling Connect Direct 4.6
Any help on this issue is much appreciated.

I'm up against the same problem. While I continue to hope for a more direct solution, my manager told me yesterday that he had wrestled with the problem a year ago, and the only solution he found was to put an SFTP server in the middle. For his message flow, he used Attachmate Reflection.
The FileInput node has hooks for remote access. On the FTP tab, click Remote Transfer and fill in the Attachmate server, port, and other settings. Attachmate in turn is configured with a virtual folder, which accesses the actual remote server.
It seems like more machinery than is necessary, but you can't argue with the fact that it works and has been in production for over a year.

Related

Unable to get temp directory for .NET web site hosted in Azure App Service

We're working on validating our Loupe service to run as an Azure App Service and have run into a showstopper we can't figure out. Anything that attempts to resolve a temp directory fails with the exception:
mscorlib : System.IO.IOException
The directory name is invalid.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.__Error.WinIOError()
at System.IO.Path.InternalGetTempFileName(Boolean checkHost)
The stack trace places the failure inside the .NET method for generating a temp file name, and it is common to pretty much all the areas where we see the failure. For a while it seemed that forcing the site to restart and/or forcing the underlying App Service Plan to rescale made the problem go away until we next updated the site, but that no longer helps.
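For context, the failure reproduces with nothing more than asking .NET for a temp file name; a minimal sketch of the call that ends up on that stack is:
using System;
using System.IO;

class TempFileRepro
{
    static void Main()
    {
        // Path.GetTempFileName() resolves the temp directory (from the TEMP/TMP
        // environment variables) and creates a zero-byte file there. When that
        // directory doesn't exist it throws IOException: "The directory name is invalid."
        string tmp = Path.GetTempFileName();
        Console.WriteLine("Created temp file: " + tmp);
    }
}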
Since the only search results we could find say this error happens when impersonation is enabled and the user the site impersonates doesn't have access to the IIS App Pool user's temp directory, we've dug into that. First, we can confirm from our logging that the thread is not impersonating at the time the failing request is made. Second, just to be doubly sure, we added this to the web.config:
<system.web>
<identity impersonate="false"/>
</system.web>
All to no avail. If this were a generic problem with Azure App Services I would expect it to break many systems, so I have to conclude we've done something fascinating and wrong to cause it.
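For reference, the "not impersonating" check in our logging amounts to something like this (just a sketch; WindowsIdentity.GetCurrent(true) returns null when the calling thread holds no impersonation token):
using System.Security.Principal;

static class ImpersonationCheck
{
    // Sketch of the check: GetCurrent(true) returns null unless the calling
    // thread is impersonating, so a null result means "not impersonating".
    public static bool IsImpersonating()
    {
        return WindowsIdentity.GetCurrent(true) != null;
    }
}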
This might not be the exact answer you're looking for but it might help point you in the right direction.
I had similar issues a while back using the Azure App Services. I found that accessing the local file system was somewhat problematic. Sometimes it worked fine and other times it didn't.
Eventually, I discovered that when an Azure App Service is instantiated, it doesn't always use the same drive letters for the system behind it. In some cases, this can cause the environment variables to be blatantly incorrect. They "think" they are set properly, but that's not always the case.
Generating a temp filename will use that environment variable for the path, and if it's set to C: but the machine only has a D: drive, it will fail: the C: drive doesn't exist, so the path to the temp file can't exist either.
To identify whether this is the problem, you need to enable RDP so you can log into the instance directly. https://learn.microsoft.com/en-us/azure/cloud-services/cloud-services-role-enable-remote-desktop
It's the only way I was able to eventually figure it out.
If you open up the Kudu instance for your App Service Web App you'll be able to see what the local Temp directory is on the Managed VM underneath. You can access Kudu by going to "Advanced Tools" on the App Service blade in the Azure Portal, or by navigating to the https://{web app name}.scm.azurewebsites.net domain for your Web App.
Once in Kudu, click on Environment in the top navigation. The Temp directory is usually D:\local\Temp and that path is stored in the "TEMP" environment variable made accessible to your Web App.
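If you'd rather check from inside the site than through RDP or Kudu, a small diagnostic along these lines (a sketch; send the output to whatever logging you have) shows whether the temp path .NET will use actually exists:
using System;
using System.IO;

class TempPathDiagnostic
{
    static void Main()
    {
        // Path.GetTempPath() is what Path.GetTempFileName() builds on; it is
        // derived from the TMP/TEMP environment variables of the hosting process.
        string tempVar  = Environment.GetEnvironmentVariable("TEMP");
        string tempPath = Path.GetTempPath();

        Console.WriteLine("TEMP variable : " + tempVar);
        Console.WriteLine("GetTempPath() : " + tempPath);
        Console.WriteLine("Exists on disk: " + Directory.Exists(tempPath));
    }
}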

SharePoint Server 2010 unable to map network drive

We have a SharePoint Server 2010 machine to which we used to copy files by connecting to a mapped network drive. For the past few days we have been unable to map the network drive. We can still make a remote connection to the server, and we can ping it and access the shared files from the server itself. It's an annoying situation: the share is reachable over remote access and the network, but not through a mapped network drive. Can anyone help with what the reason could be?
What is the exact problem with the mapping?
Can't you find the server, or does it report an error message when you try to map the drive?
First thing I would check is the mapping table.
Open a command line (cmd.exe) with admin rights and type:
net use
That should give you a list of mapped drives.
If your server is listed, delete this mapping:
net use \\your_server\share /delete
Now, or if your share was not listed, try to map it:
net use drive: \\your_server\share
If it works all is fine, if not, give us the error message.
Often mapping problems are caused by (WINS) browsing problems. Try to use
net use drive: \\your_server_ip\share
instead of
net use drive: \\your_server_name\share
too.
If this does not work, check your firewall settings and make sure file and printer sharing is allowed and your desired folder/drive is shared correctly.

Deployment not happening in the publishing process

I am trying to publish to the local file system; however, publishing is not completing properly and deployment fails in my 2011 GA VM environment.
I am getting "Polling for notification for destination: YTnMgU6u5Vh09cOGUG7ouA== has exceeded polling attempts for transaction: tcm:0-121257-66560" error in "Preparing Deployment" stage.
I have used the "Local File System" protocol in my publication target and provided a path like d:\tridion\publish.
I have provided the same path in cd_storage_conf.xml under the <storage type="filesystem"> element. All other storage types are commented out.
In cd_deployer_conf.xml, the Queue location path is c:\tridion\incoming.
When I publish any page to my publication target, the zipped package is placed in d:\tridion\publish, but it is not deployed.
Do I need to do anything else to deploy the zipped package?
The path provided in cd_deployer_conf.xml (the one you specify in Queue/Location!) needs to be the same one you provide in your publication target; in your case the publication target points to a path on the D: drive while the deployer configuration points to one on the C: drive. You also need to make sure that your deployer is initialized. You can easily determine whether the deployer is initialized by checking whether the meta.xml file is regenerated in the deployer's incoming folder.
Not sure if this is relevant, but you might also be interested in how to install the deployer: as a .NET website, as a Java website, or as a Windows service.
Hope this helps.
You say your working sites use HTTP sender/deployer. In that scenario your deployer is triggered by the HTTP servlet which receives the transport package.
When you use the local file system, you MUST configure your deployer to work in a different way: it has to run as some form of background service. Typically on a Windows box this means installing the deployer as a Windows service. Keep in mind that it will then probably have additional config files for the deployer and broker/storage.

How can I have a file appearing on a WebDav server trigger a BizTalk event?

I have a legacy system which can create files visible in WebDav as an output. I'd like to trigger a BizTalk orchestration receive port when a file matching a filter appears - so a lot like the standard File adapter, but for WebDav.
I found the BizTalk Scheduled Task Adapter, which can pull in a file by HTTP, but it looks abandoned, poorly documented, and out of date.
So, how is it done? Can I use the standard HTTP adapter perhaps?
If you're able to access the WebDAV folder via a UNC path from the BizTalk server, the File adapter should do the trick.
Have you tried to assign a drive letter to the WebDav folder?
http://en.wikipedia.org/wiki/WebDAV
We've had to go with a workaround on this: a completely separate process makes a copy of the file from the legacy system appear in a Samba share, which we in turn attach to with an ordinary FILE adapter.

Remote File Read

How can I read a text file that resides on a remote machine? No share exists on that machine, and I am not allowed to create any share or file on it. I am also not allowed to run any client program on the remote machine. My program is an ASP.NET application in C# running on an IIS web server. For Linux machines we used SSH connections and file reads were easy. Is there something similar available by default in Windows?
Thanks,
Sreejith
The first question to ask is whether there's a good business reason to read that file. If yes, the IT people will have to allow you a reasonable solution to the problem.
I have frequently used SFTP (secure FTP) for this kind of problem. Unfortunately SFTP is not part of Windows, but there are free and low-cost SFTP servers available. Here's a list from Wikipedia
Explain to IT why you need access to that file and discuss options including SFTP. If you have a valid business reason for this and they will "not let you because of policy", it's the job of your project manager or boss to clear out that roadblock. Ask them to help.
Finally, consider whether it's practical for the file on the remote machine to be pushed to you instead of you pulling it. If you can setup a file share on your PC, ask them to setup a job on the remote server that copies the file to your file share every time it is changed.
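Once an SFTP server is in place, reading the file from C# is only a few lines with a client library. The sketch below assumes the third-party SSH.NET (Renci.SshNet) package, which isn't mentioned above, and uses placeholder host, credentials, and path:
using System;
using Renci.SshNet;   // third-party SSH.NET library (an assumption, not named in this thread)

class SftpFileRead
{
    static void Main()
    {
        // Placeholder server, account and path - substitute your own values.
        using (var sftp = new SftpClient("remote.example.com", "svc_account", "password"))
        {
            sftp.Connect();
            string contents = sftp.ReadAllText("/data/report.txt");
            Console.WriteLine(contents);
        }
    }
}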
You could try accessing the Admin share of the machine. Windows by default creates a share for each disk (named C$, D$, etc.). But in that case the application you write should run with the credentials of a user who has rights to that share ((local) administrators have sufficient rights).
If that doesn't work you need to create a share or install software (like FTP) to get files from that machine. This is all because of security; it's a good thing you are not able to just read a file from any machine...
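A sketch of the admin-share approach (assuming the identity running the code has rights on the remote machine; the machine name and file path below are placeholders):
using System;
using System.IO;

class AdminShareRead
{
    static void Main()
    {
        // The administrative share exposes each disk as <drive>$, so C:\logs\app.txt
        // on the remote machine is reachable as \\REMOTEHOST\C$\logs\app.txt.
        string path = @"\\REMOTEHOST\C$\logs\app.txt";   // placeholder machine and file
        string text = File.ReadAllText(path);
        Console.WriteLine(text);
    }
}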
I have done this many times with the Remote File protocol, port 34:
http://en.wikipedia.org/wiki/List_of_TCP_and_UDP_port_numbers
