WebDAV - Microsoft Excel 2016 not able to save back changes

I have set up the ITHit WebDAV server on our company website, and it works great in Office 2007/2010/2013 with PowerPoint, Word and Excel. However, I have recently updated to Office 2016 and found that Excel no longer works, while Word and PowerPoint work OK.
In Excel I get the error message below:
In Word and PowerPoint I get the dialog below, which I can skip:
Is there a known issue with the ITHit WebDAV server in Excel 2016?
There are no exceptions thrown when I'm attached in Visual Studio 2015. Also, when I've checked Fiddler, I can see the last thing the WebDAV server tries to do is lock the document, which it seems to do without any exceptions. It locks and unlocks the document twice, then on the lock where I try to save it comes back with the error message in Excel (see pic 1).
The lock requests are shown below:
First Lock OK:
Unlock:
Locks the document and stops:
The only thing I can see that is different is a field in the Miscellaneous section of the header:
I've exhausted all options and I've got no idea why this is happening with just Excel in Office 2016.
Any help would be greatly appreciated.

This error has been around for a while and I found an old related Microsoft post:
http://answers.microsoft.com/en-us/msoffice/forum/msoffice_excel-mso_other/upload-failed-server-file-updated-were-sorry/30b69218-2cc1-40e2-8ede-69ac8bd55ba6
Microsoft never came up with a solution, but a user named Berend Engelbrecht did post two workarounds. The first was a code fix, which did not work for me but is worth trying. The assumption was that Excel incorrectly handles the modified date, so don't bother returning it. This is how I implemented it for IT Hit WebDAV: setting Modified to DateTime.MinValue in my implementation of IHierarchyItemAsync prevented the modified date from being added to the response.
// Suppress the modified date only for requests coming from Excel.
if (context.Request.UserAgent != null &&
    context.Request.UserAgent.IndexOf("Microsoft Office Excel", StringComparison.InvariantCultureIgnoreCase) >= 0)
{
    Modified = DateTime.MinValue;
}
The second solution did work, but was not practical for our end users. Each would have to turn off Protected View for internet files: File > Options > Trust Center > Trust Center Settings > Protected View.
Also note that when debugging locally, I never hit the issue because my local WebDAV didn't trigger Protected View as an 'internet location'.
Hope one of these helps and if anyone finds a different solution please post it here.

I have double-checked that MS Excel 2016 works with no problems with the IT Hit WebDAV Server Engine. Here is what may cause this issue:
Incorrect Modified date property implementation. Make sure this property returns a correct UTC date.
Incorrect ETag implementation. Make sure to change this property every time the document is updated (see the sketch after this list).
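For illustration, here is a minimal C# sketch of both properties. The class and member names are mine and do not reflect the engine's exact interface; only the behavior matters.

using System;

// Illustrative sketch only - not the IT Hit engine's actual interface.
public class DavFileItem
{
    private long serialNumber;    // incremented on every content update
    private DateTime modifiedUtc; // always stored as UTC

    // Must change every time the document is updated.
    public string Etag
    {
        get { return serialNumber.ToString(); }
    }

    // Must return a correct UTC date.
    public DateTime Modified
    {
        get { return modifiedUtc; }
    }

    // Call this whenever the document content is written.
    public void OnContentUpdated()
    {
        serialNumber++;
        modifiedUtc = DateTime.UtcNow;
    }
}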
I have checked two server configurations: the https://ajaxbrowser.com website (anonymous auth) running Server Engine v3.9.2075, and a sample server running on localhost generated by the WebDAV wizards for Visual Studio v4.5.2958 (with Basic auth enabled in the registry).
As a test client environment I used MS Office 2016 on Win 8.1 and MS Office 2013 on Win 10. In both cases the Excel document opened with no problem and I was able to save it back to the server. The MS Office Protected View options were set to defaults on the test machines - all checked.

I know this post is old, but for anyone who's having the same issue, here's the solution:
This error occurs because Excel requests the file a second time when the user selects "Enable Editing".
If the server sends the file again (with code 200) instead of returning a 304 (Not Modified), Office seems to believe that somebody has changed the document, even if this is not the case.
One quick solution is to send an ETag header and validate If-None-Match on subsequent requests to decide whether to return 304 or 200; a rough sketch follows.
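As an illustration, here is how that check could look in a plain ASP.NET handler. This is a sketch, not the WebDAV engine's API; deriving the ETag from the file's last write time is a simplification (a version counter stored with the document is more robust).

using System;
using System.IO;
using System.Web;

// Sketch: serve a file with an ETag and honor If-None-Match, so that a
// re-request for an unchanged file gets 304 instead of a second 200.
public class FileDownloadHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        string filePath = context.Server.MapPath(context.Request.Path);
        string etag = GetCurrentETag(filePath);

        // If the client's cached copy is still current, answer 304 with no
        // body so Office does not think the document changed on the server.
        string ifNoneMatch = context.Request.Headers["If-None-Match"];
        if (ifNoneMatch != null && ifNoneMatch == etag)
        {
            context.Response.StatusCode = 304;
            return;
        }

        // Otherwise send the file along with its current ETag.
        context.Response.AppendHeader("ETag", etag);
        context.Response.WriteFile(filePath);
    }

    // Simplified: an ETag derived from the file's last write time.
    private static string GetCurrentETag(string path)
    {
        return "\"" + File.GetLastWriteTimeUtc(path).Ticks + "\"";
    }
}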

Related

Inconsistent Cognos errors

I am trying to do a couple of things within Cognos:
Load Framework Manager and view/modify SQL behind existing models and create new models
Modify existing reports through Report Studio via Cognos Connection
I was given an account on the Cognos application server and I installed Framework Manager. I was given the gateway URL and dispatcher URL from the System Admin and then transferred all of the project files to the server so that I could load the project in question. I'm able to open the .cpf file; however, when going into any models, I get the error:
Unable to access service at URL:
https://xxx.cognos.xxx.xxx:443/ibmcognos/cgi-bin/cognos.cgi?b_action=xts.run&m=portal/close.xts
Please check that your gateway URI information is configured correctly and that the service is available.
For further information please contact your service administrator.
I then contacted the system admin and he indicated that the URL was correct.
Furthermore, now when I try to access Cognos Connection (which worked fine last week), I receive the error:
CM-REQ-4159
Content Manager returned an error in the response header. The error "cmAuthenticateFailed CM-CAM-4005 Unable to authenticate. Check your security directory server connection and confirm the credentials entered at login." can be found in the response SOAP header.
The odd thing is, another member of my team receives this error:
AAA-AUT-0016:
The function call to 'Method.invoke(cmServiceInstance, queryRequest)' failed.
Details:
CM-SYS-5192 An error occurred with Content Manager.
I've done some research (I'm not really familiar with Cognos or even networking) and found that the errors I receive are usually seen when trying to run a single report; however, I can't even access FM models or Cognos Connection in general. I also don't understand how we can receive two different errors when accessing the same URLs from the same network.
Any guidance would be greatly appreciated. We are using Cognos 10.2.2.
http://www-01.ibm.com/support/docview.wss?uid=swg21624136
One possible reason is that the user does not have the required "Import relational metadata" capability.
Or maybe it has something to do with the registry.
Note: Make sure you back up the registry before making any changes.
See http://www-01.ibm.com/support/docview.wss?uid=swg22015730
Open cmd and type "regedit".
Navigate to HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Main\FeatureControl.
Right-click the "BMT.exe" = dword:00002af9 value.
Select Delete.
Re-launch Framework Manager.

Error:1411809D:SSL routines - When trying to make https call from inside R module in AzureML

I have an experiment in AzureML which has a R module at its core. Additionally, I have some .RData files stored in Azure blob storage. The blob container is set as private (no anonymous access).
Now, I am trying to make an https call from inside the R script to the Azure blob storage container in order to download some files. I am using the httr package's GET() function and have properly set up the URL, authentication, etc. The code works in R on my local machine, but the same code gives me the following error when called from inside the R module in the experiment:
error:1411809D:SSL routines:SSL_CHECK_SERVERHELLO_TLSEXT:tls invalid ecpointformat list
Apparently this is an error from the underlying OpenSSL library (which was fixed a while ago). Some suggested workarounds I found here were to set sslversion = 3 and ssl_verifypeer = 1, or to turn off verification with ssl_verifypeer = 0. Both of these approaches returned the same error.
I am guessing that this has something to do with the internal Azure certificate / validation...? Or maybe I am missing or overlooking something?
Any help or ideas would be greatly appreciated. Thanks in advance.
Regards
After a while, an answer came back from the support team, so I am going to post the relevant part as an answer here for anyone who lands here with the same problem.
"This is a known issue. The container (a sandbox technology known as "drawbridge" running on top of Azure PaaS VM) executing the Execute R module doesn't support outbound HTTPS traffic. Please try to switch to HTTP and that should work."
They also said that a solution is on the way:
"We are actively looking at how to fix this bug. "
Here is the original link as a reference.
hth

Reporting in ASP.NET

I have an SQL Database and an ASP.NET website built to put data into the database.
One of the project requirements is to build a system that lets the user upload a Crystal Report to the server and run it as needed. This way, the user can create a customized report (for handing to management or customers) without having to go through a developer.
I'm looking for suggestions on how to accomplish this goal.
Currently, I'm looking for a way to redirect the database connection in the Crystal Report from the database it was developed against to the database it will eventually run on. However, there doesn't seem to be a simple way to do this.
I'm also investigating the ReportViewer object. However, all the code I have seen involves specifying the query for the report in the code, which isn't acceptable.
One option (which I don't like at all) is to let them write their own queries so they can copy the results into Excel. This would mean a blank textbox and information about the structure of the database. Not a good idea for multiple reasons.
Another option is to create one report for each table (and maybe a few extras), let the user copy the data they want into Excel, and go on their merry way.
tl;dr How do I build a flexible reporting system?
=========================================
Continuation: 08/20/2012
I have decided to go the route of b.pell's extension methods. So far, it has gotten me closer than anything else. My code to bind to the CrystalReportViewer is below:
// Load the report the user selected from the ReportFiles folder.
CrystalReportSource rs = new CrystalReportSource();
rs.Report.FileName = Server.MapPath("ReportFiles/") + Request["reportname"];

// Repoint the report at the target database using b.pell's extension methods.
string connstring = System.Web.Configuration.WebConfigurationManager.ConnectionStrings["myConnectionString"].ConnectionString;
rs.ReportDocument.ApplyCredentialsFromConnectionString(connstring);
rs.ReportDocument.ApplyNewDatabaseName("myDBName", "mySchemaName");
rs.ReportDocument.Refresh();
CrystalReportViewer1.ReportSource = rs;
This comes very close to working. It works fine on my dev machine, but when I run the code on the server, it gives the following error:
Logon failed.Error in File CrystalReport2 {5D2E82E5-783E-4DFD-A770-C8AE72A51E4E}.rpt:
Unable to connect: incorrect log on parameters. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Runtime.InteropServices.COMException: Logon failed.Error in File CrystalReport2 {5D2E82E5-783E-4DFD-A770-C8AE72A51E4E}.rpt: Unable to connect: incorrect log on parameters. Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
The error is in this line in the code:
crTable.Location = String.Format("{0}{1}", prefix, crTable.Location.Substring(crTable.Location.LastIndexOf(".") + 1))
When I remove the call to ApplyNewDatabaseName, I am asked to enter the Server Name, the Database name, the Username and the Password or to select Integrated Security. I can't enter the Database Name or the Server Name (those fields are disabled).
Any thoughts?
I think what you're looking for is Reporting Services, part of SQL Server Business Intelligence.
Or maybe you can set up a UI that lets the users pick the tables and columns they need for the report (this way you can limit the information they can access) and write a dynamic query builder function or something like that.
I answer the changing-Crystal-Reports-connection question a lot (it's something I'd think Crystal would make easier, but I wonder if they don't because that's what their server product does). :D Anyway, you can set the database credentials at runtime. Crystal is very particular about the order in which it's done, but I have some code that I turned into extension methods that do the trick. This code will go through the main report and all subreports and change the connection information. It assumes that all subreports connect to the same database that the main report does (if not, you'll need to modify it to handle multiple connections, but this rarely comes up, at least with what I do).
Extension methods to change connection info: http://www.blakepell.com/2012-05-22-crystal-reports-extension-methods
It would be used something like this (although you're probably binding to a viewer and not exporting, so you can ignore that part; this is just an example).
Using rd As New ReportDocument
    rd.Load("C:\Temp\CrystalReports\InternalAccountReport.rpt")
    ' ApplyNewServer and ApplyParameters are the extension methods from
    ' the link above: they repoint the report and fill its parameters.
    rd.ApplyNewServer("serverName or DSN", "databaseUsername", "databasePassword")
    rd.ApplyParameters("AccountNumber=038PQRX922;", True)
    rd.ExportToDisk(ExportFormatType.PortableDocFormat, "c:\temp\test.pdf")
    rd.Close()
End Using

System.Diagnostics.Process.Start("c:\temp\test.pdf")
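For reference, the core of what such extension methods do looks roughly like this in C#, using the standard CrystalDecisions API. The method name is mine, and it assumes every table and subreport shares one connection, as described above.

using CrystalDecisions.CrystalReports.Engine;
using CrystalDecisions.Shared;

public static class ReportConnectionHelper
{
    // Apply one set of connection details to every table in the main
    // report and in each subreport. Crystal requires this per table.
    public static void ApplyConnection(ReportDocument report,
        string server, string database, string user, string password)
    {
        var connInfo = new ConnectionInfo
        {
            ServerName = server,
            DatabaseName = database,
            UserID = user,
            Password = password
        };

        foreach (Table table in report.Database.Tables)
        {
            TableLogOnInfo logOnInfo = table.LogOnInfo;
            logOnInfo.ConnectionInfo = connInfo;
            table.ApplyLogOnInfo(logOnInfo);
        }

        // Assumes subreports connect to the same database as the main report.
        foreach (ReportDocument sub in report.Subreports)
        {
            foreach (Table table in sub.Database.Tables)
            {
                TableLogOnInfo logOnInfo = table.LogOnInfo;
                logOnInfo.ConnectionInfo = connInfo;
                table.ApplyLogOnInfo(logOnInfo);
            }
        }
    }
}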
You could use the Crystal viewer at this point to deliver the reports, store the reports in a database or on the file system (with a DB metadata table), and have some predefined connections the user could select from that would be applied when a report is run.
You also have the option to write your own front end. In this scenario a user would select a report from your metadata (you could put whatever security on it you wanted; I use AD). Then you can read the report parameters in and lay them out on the web form. When the user fills them in, you sanitize the values and pass them to the report via these extensions, and you can output Excel, PDF, Word Doc, RTF, etc. A little more overhead and no nice preview, but it can work well (I've done something like this in the past). A rough sketch of the parameter step is below. Hope this helps.
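A minimal sketch of that parameter step (reportPath and sanitizedValue are placeholders; SetParameterValue is the standard ReportDocument call):

using System;
using CrystalDecisions.CrystalReports.Engine;

public static class ReportParameterHelper
{
    // Sketch: list a report's parameters so a form can be generated for
    // them, then apply the user's sanitized input before running it.
    public static void ShowAndApplyParameters(string reportPath, string sanitizedValue)
    {
        var rd = new ReportDocument();
        rd.Load(reportPath); // path taken from your report metadata

        foreach (ParameterFieldDefinition p in rd.DataDefinition.ParameterFields)
        {
            // p.Name and p.ValueType tell you which input control to render.
            Console.WriteLine("{0} ({1})", p.Name, p.ValueType);
        }

        // After the user submits the form and the values are sanitized:
        rd.SetParameterValue("AccountNumber", sanitizedValue);
    }
}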
About "...let them write their own queries" part of your question.
The solution can be to use a query builder component with a friendly user interface that hides the complexity of your database from users and avoids any possible SQL injection.
There are a few such products on the market. One of them is called EasyQuery; another one is built by Aspose, if I'm not wrong. Try searching Google for "query builder for asp.net" or ".net query builder component".

Windows Azure Cache Preview

I'm having some trouble using Windows Azure Cache Preview.
I've added the NuGet package from http://nuget.org/packages/Microsoft.WindowsAzure.Caching and have configured my role for storing ASP.NET session state as per the info on windowsazure.com.
Problem is, I get "No connection could be made because the target machine actively refused it 127.255.0.0:20004" when debugging. The dev IP used by Azure is 127.0.0.1:81.
I'm not sure why, even the sample Windows Azure Caching (Preview) Session State and Output Caching Sample does the exact same thing.
Update: error from log:
w3wp.exe Error: 0 : ERROR: <SimpleSendReceiveModule> b4551065-941b-4bdb-9487-57d9207af308:Request - 1, result - Status=ChannelOpenFailed[System.Net.Sockets.SocketException (0x80004005): No connection could be made because the target machine actively refused it 127.255.0.0:20004
at Microsoft.ApplicationServer.Caching.AsyncResultNoResult.EndInvoke()
at Microsoft.ApplicationServer.Caching.TcpClientChannel.ConnectionCallback(IAsyncResult result)] for end point [net.tcp://127.255.0.0:20004]
Also, more testing shows it is not just session state storage, but all cache-related tasks.
Found the answer: in the web role or worker role properties, set the number of instances to a minimum of 2. This was the issue for me; I got it resolved after 8 hours of debugging.
I haven't seen this error personally, so I'm taking a couple of stabs in the dark...
Did you enable the caching preview in the role properties tab? That step sets up the "server" side of the caching solution.
You also need to make very sure that the names in the various configurations are consistent, or the caching client won't be able to find the service. You should find one of these in the web.config, in the dataCacheClient section - specifically the identifier attribute. A rough example is below.
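For reference, that section of web.config looks roughly like this; "MyCacheRole" is a placeholder and must match the name of the role where caching was enabled:

<dataCacheClients>
  <dataCacheClient name="default">
    <!-- identifier must match the role that hosts the cache -->
    <autoDiscover isEnabled="true" identifier="MyCacheRole" />
  </dataCacheClient>
</dataCacheClients>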
For others that have the same problem, there are two other possible solutions in case the resolution proposed by Sundara Prabu does not work:
simply reboot your computer, as suggested in this thread on MSDN forums;
in the Regional Settings of Windows, change the Long Time format to HH:mm:ss. As I discovered myself by reading this SO answer, the cache emulator calls logman.exe for logging purposes and in particular uses the cnf parameter, which requires a duration in the format HH:mm:ss. The cache emulator formats this duration using the Long Time format found in the Regional Settings; in my case, using Italian settings under Windows 8, the format used was HH.mm.ss, thus causing the problem described in the question.

Documents.Add fails on ASP.NET (VB.NET)

I am having an issue with opening a document using Microsoft Word from ASP.NET MVC.
This works perfectly on my developer machine, but not when deployed to IIS.
Dim word = New Microsoft.Office.Interop.Word.Application
'This line is failing to return a document object
Dim letter = word.Documents.Add(letter_doc_path)
'This line then fails due to [letter] being null
letter.MailMerge.OpenDataSource(csvPath)
I have added permissions in "Component Services" (dcomcnfg) to the NETWORK SERVICE user which allows the creation of the Word object in the first place, but I am completely stuck as what to do with this one.
I have also tried suppressing Word dialogs with the following line just in case
word.DisplayAlerts = Microsoft.Office.Interop.Word.WdAlertLevel.wdAlertsNone
The issue isn't helped by not having an error (apart from the null object reference obviously) - maybe there's a way to query Word for a specific error message?
Word requires the normal.dot template file when opening any document. The problem was occurring because the IIS user didn't have anywhere to create normal.dot, so Word was failing in the background.
This was fixed by setting the UserTemplate path for the newly created word instance (immediately after creating it).
The path must be writeable by the IIS user (NETWORK SERVICE in my case).
word.Options.DefaultFilePath(Microsoft.Office.Interop.Word.WdDefaultFilePath.wdUserTemplatesPath) = working_folder
So just for completeness, here's the original example with the winning line included:
Dim word = New Microsoft.Office.Interop.Word.Application
'this line fixed it
word.Options.DefaultFilePath(Microsoft.Office.Interop.Word.WdDefaultFilePath.wdUserTemplatesPath) = working_folder
Dim letter = word.Documents.Add(letter_doc_path)
I was having the same problem, and the settings that wheelibin suggested weren't enough to create documents using the NETWORK SERVICE account.
What I ended up doing is:
Create a user account for this process to run under.
Login as the user and run Word (this does various setup tasks in Word so the application doesn't try putting up modal dialogs when running as a service).
Create a new application pool and set the pool to run as the user account.
If you're using Windows Authentication, and your server is Windows 2003 (or 2000, presumably), then this issue applies, and you need to either change the SPN of the server, which will break Windows Authentication for any application running under a different user account, or you have to switch the authentication provider over to NTLM instead of Kerberos.
IIS 7 can use Kernel Mode Authentication to avoid the issue.
I am not sure how you are catching the errors.
Please take a look at the following pages to see if you can find some clues in them:
Error while using Microsoft Office 2003 in web application
Error while calling MS-Word from ASP.NET
"There is insufficient memory or disk space. Save the document now" - Opening MS Word from ASP.NET
