I'm facing a problem and don't know if anybody has solved this issue.
I'm using Tridion WebDAV (Tridion 2011 SP1 HR1) to insert structured content (folders + binary files) into a publication.
All goes well when the folder doesn't exist, but if the folder exists, an error appears:
You do not have permission to access this Web Folder location
Watching the log files, I can see the problem: when I drag & drop an existing folder, Tridion tries to create a new folder and rename it to the folder title, which gives this error:
Name must be unique for items of type: Folder/Virtual Folder within this Folder and its BluePrint context.
I tried using Events to avoid saving the folder when it already exists (throwing an exception in the Initiated phase), but then the permission error appears when updating the content inside the folder.
Has anyone tried to do something similar?
Does anyone have any suggestions I can try?
Thank you all in advance.
This has nothing to do with Tridion, as you are using Windows Explorer to access WebDAV; if anything, it's Explorer that is to blame for sending the wrong requests. But if you take a look at the WebDAV specification here: http://www.webdav.org/specs/rfc4918.html, you can see all the possible calls in chapter 9.
Tridion Folders are treated as collections, so there's an MKCOL call to create a folder; but if you check section 9.7.2, PUT for Collections, you will see:
9.7.2 PUT for Collections
This specification does not define the behavior of the PUT method for existing collections. A PUT request to an existing collection MAY be treated as an error (405 Method Not Allowed).
The MKCOL method is defined to create collections.
So there's no way to update existing folders; it really is create, then upload contents.
You can also check this article if you are interested in the WebDAV protocol: http://amarchuk.blogspot.nl/2011/10/heres-c-webdav-client-that-works-with.html
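If you want to script the upload instead of going through Explorer, a minimal C# sketch (the URL and credentials are placeholders, and the error handling is deliberately naive) could first probe for the collection and only issue MKCOL when it is missing:

    // Sketch: create a WebDAV collection (a Tridion Folder) only if it
    // does not exist yet, so we never hit the "name must be unique" error.
    // The URL and credentials are placeholders.
    using System;
    using System.Net;

    class WebDavFolders
    {
        static void EnsureCollection(string url, ICredentials creds)
        {
            // PROPFIND with Depth: 0 asks about the collection itself (RFC 4918, 9.1).
            var probe = (HttpWebRequest)WebRequest.Create(url);
            probe.Method = "PROPFIND";
            probe.Headers.Add("Depth", "0");
            probe.Credentials = creds;
            try
            {
                probe.GetResponse().Close();
                return; // the collection exists: just upload its contents
            }
            catch (WebException)
            {
                // Typically 404 Not Found: fall through and create it.
            }

            var mkcol = (HttpWebRequest)WebRequest.Create(url);
            mkcol.Method = "MKCOL"; // RFC 4918, 9.3: create a collection
            mkcol.Credentials = creds;
            mkcol.GetResponse().Close();
        }
    }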
In the past, I was able to view the contents of my raw repository, to which I upload my javadoc sites (through dav:http:// using mvn site-deploy):
e.g. Accessing from the browser: http://myserver/nexus/content/sites/my-raw-repository/ would list the subdirectories.
Now (after our admin upgraded to Nexus 3.20) directory listing does not work, and I have to explicitly request a resource page, e.g.:
http://myserver/nexus/content/sites/my-raw-repository/artifact-site-name/1.0.0/apidocs/index.html
Is there a way to bring back the directory listing?
This closed issue says that Nexus 3 supports it, but I cannot figure out how.
I found the issue. The repository browser URL is now different.
content/sites has become: service/rest/repository/browse.
(The old path is still available, but with no directory listing. Based on the findings below, I guess it is deprecated.)
I went to the Browse option in the left menu -> selected the raw repository -> clicked on the HTML View link, and it took me to the correct location.
The new link is now: http://myserver/nexus/service/rest/repository/browse/my-raw-repository/
Something notable: when you reach the actual resource file (e.g. an HTML page), the resource file's link is different: http://myserver/nexus/repository/my-raw-repository/path/to/the/htmlpage.html
So instead of service/rest/repository/browse/my-raw-repository/ it is repository/my-raw-repository/ (which also does not support directory listing).
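For completeness, a small sketch (using the placeholder host and repository names from above) that fetches the listing programmatically; only the browse path returns an HTML index:

    // Sketch: fetch the HTML directory listing from the Nexus 3 browse endpoint.
    // "myserver" and "my-raw-repository" are the placeholders used above.
    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class NexusListing
    {
        static async Task Main()
        {
            using (var http = new HttpClient())
            {
                // Browsable HTML index (new path):
                string listing = await http.GetStringAsync(
                    "http://myserver/nexus/service/rest/repository/browse/my-raw-repository/");
                Console.WriteLine(listing);

                // Direct asset download (no directory listing on this path):
                // http://myserver/nexus/repository/my-raw-repository/path/to/the/htmlpage.html
            }
        }
    }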
I already made a pipeline, which was working fine. Suddenly it gives an error like:
[2015-12-18 02:39:08.091 GMT] ERROR system.core ISH-CORE-2368 Sites-SiteGenesis-Site core Storefront [uuid] [request-id]-0-00 [timestamp] "Error executing pipeline: Hello com.demandware.beehive.core.capi.pipeline.PipelineExecutionException:Pipeline not found (Hello) for current domain (Sites-SiteGenesis-Site)"
Does anybody know how to solve this?
In the event that your pipeline cannot be found for the selected domain, please go through and verify all of the following:
Double check Pipeline-Node naming
Pipeline URLs are generated from the pipeline name and your desired entry node. In this scenario, I would expect a file named Hello.xml in your cartridge's pipeline directory, with a start node named Start, to be accessed via {instanceURL}/on/demandware.store/Sites-mySite-Site/Hello-Start (see the sketch after this list).
Try and force upload of your cartridges
Occasionally the files on the server will not be updated correctly when a save is made; to force an update, right click your project, click Demandware > Upload Cartridges
Check your Cartridge Path
If you are using a shared instance, or your instance is re-provisioned, you may need to check your cartridge path to be sure your custom cartridge(s) are still there.
Check your Code Versions
Occasionally you may increment / change your code version - if you do, make sure that the path you select in Studio is the one that you have selected in Business Manager.
Tech Support
Should you still have issues following the four steps above, please file a support ticket and the tech-support team will be able to provide you with more assistance.
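To make the naming rule from the first step concrete, here is a hypothetical helper (nothing Demandware-specific, just string composition) showing how such a URL is built:

    // Hypothetical illustration of the URL pattern described above:
    // {instanceURL}/on/demandware.store/{siteID}/{pipelineName}-{startNode}
    static string PipelineUrl(string instanceUrl, string siteId,
                              string pipeline, string startNode)
    {
        return string.Format("{0}/on/demandware.store/{1}/{2}-{3}",
            instanceUrl, siteId, pipeline, startNode);
    }

    // PipelineUrl("https://example.demandware.net", "Sites-SiteGenesis-Site", "Hello", "Start")
    //   -> "https://example.demandware.net/on/demandware.store/Sites-SiteGenesis-Site/Hello-Start"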
I followed the backup and restore method in Alfresco Share instead of import/export. It is now working as I expected in the new Alfresco: I can see the content in sites, view files in the site document library, and view events, workflows, users, groups and so on. Everything goes fine except that the repository is not loading. When I search for files in the repository, it shows "3 result(s) found in Quality site.", but it does not display those files.
In my old Alfresco I had set permissions on folders in the repository... will that cause any error loading the repository in my new Alfresco?
It shows the following error when I shut down my server...
log4j:ERROR LogMananger.repositorySelector was null likely due to error in class reloading, using NOPLoggerRepository.
Kindly look into my issue and give some suggestions.
That error means that log4j tried to log something to the webapp's log file, but Tomcat had already shut down. Do you have sufficient/correct permissions on the new restored Alfresco installation?
If you followed the backup/restore procedure from the wiki correctly, the permissions on the repository nodes come along as well. But if you want to reset and rebuild all the permissions, you can perform a FULL reindex by appending this line to alfresco-global.properties:
index.recovery.mode=FULL
As part of a CMS, I have created a custom VirtualPathProvider which is designed to serve a single file in place of an actual file structure. I have it set up such that if a file actually exists on the server, that file will be served. If the file does not exist, the virtual content stored for that address will be served instead. This is similar to the concept of serving a website from files stored in a database, though in this case the content is stored in XML files on the server.
This setup works perfectly when a request is made to a specific page. For example, if I ask for "www.mysite.com/foobar.aspx", the content that is stored for "foobar.aspx" will be served. Further, if I ask for "www.mysite.com/subdir/foobar.aspx", the appropriate content will also be served.
The problem is this: If I ask for something like "www.mysite.com/foobar", things begin to fall apart. If the directory exists on disk (and doesn't have a configured default page in IIS, such as index.aspx), I will get a "Directory Listing Denied" error. If the directory does not exist, I'll simply get a 404 - Resource Not Found.
I've tried several things, and so far nothing I've done has made a bit of difference. It seems as though IIS is simply noting the nonexistence of a directory (or default file in an existing directory) and serving up its own error code, without ever asking my application what to do with the request. If it ever did get to the application, I would be able to solve the problem, but as it stands, I'm quite lost. Does anyone know if there is some setting in IIS that is causing this?
I've looked for every resource I can find on the subject, and am coming up empty. I know this should be possible, because I have read tutorials on serving content from both databases and ZIP files. HELP!
p.s., I am running IIS6 and .NET 3.5
IIS will only pass a request to the ASP.NET process if it is configured to do so for the particular extension. The defaults are .aspx, .ascx, etc. In other words, if you request a .html file, ASP.NET will never see that HTTP request; likewise for an empty extension.
To change this behavior, add a wildcard mapping to the ASP.NET process. Load IIS Manager, go to the Properties for your web site and look at the Home Directory tab. Click on "Configuration" and there you will see the extension-to-application mappings. Add a wildcard map pointing to aspnet_isapi.dll, and make sure "Verify that file exists" is unchecked so that requests for nonexistent paths still reach your application.
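Once the wildcard mapping is in place, every request reaches ASP.NET and your VirtualPathProvider gets a chance to answer. For reference, a fall-back provider of the kind described in the question might look roughly like this (VirtualContentStore and CmsVirtualFile are hypothetical stand-ins for your XML-backed store), registered from Application_Start via HostingEnvironment.RegisterVirtualPathProvider:

    // Rough sketch: serve the physical file when it exists, otherwise serve
    // the stored virtual content. VirtualContentStore is a hypothetical
    // stand-in for the XML-backed content store described in the question.
    using System.IO;
    using System.Web.Hosting;

    public class CmsPathProvider : VirtualPathProvider
    {
        public override bool FileExists(string virtualPath)
        {
            // Claim the path if either a real file or stored content exists.
            return base.FileExists(virtualPath) || VirtualContentStore.Exists(virtualPath);
        }

        public override VirtualFile GetFile(string virtualPath)
        {
            return base.FileExists(virtualPath)
                ? base.GetFile(virtualPath)
                : new CmsVirtualFile(virtualPath);
        }
    }

    public class CmsVirtualFile : VirtualFile
    {
        public CmsVirtualFile(string virtualPath) : base(virtualPath) { }

        public override Stream Open()
        {
            // Return the stored content for this address.
            return VirtualContentStore.Open(VirtualPath);
        }
    }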
I have an application with 2 directories (books and export).
If we create a book or a page of a book in the application a directory is added with the id of the page (this is for uploading resources).
If we delete a page, the page (and its directory) is removed from the database and the file system.
However, this results in a session loss (actually a full application restart). I looked some things up on Google and found the following link.
It seems to be a problem in ASP.NET 2.0 (and 3.5).
We are now thinking about writing a service that will clean up the directories at night.
But there has got to be another solution for this, no?
Oh and putting the directory outside the virtual directory is not an option.
Try disabling the monitoring of the file system. This will keep your session alive.
This article may be useful for you.
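A widely circulated approach from the ASP.NET 2.0/3.5 era (before the fcnMode setting existed) disables the subdirectory change monitor via reflection on non-public runtime members. The member names are internal and version-dependent, so treat this strictly as a sketch, called once from Application_Start:

    // Fragile sketch (ASP.NET 2.0/3.5 era): stop ASP.NET from tearing the
    // app domain down when a subdirectory is deleted, by stopping the
    // internal directory monitor via reflection. Internal member names
    // may differ between framework versions.
    using System.Reflection;
    using System.Web;

    static class FcnHack
    {
        public static void StopSubdirMonitoring()
        {
            PropertyInfo monitorProp = typeof(HttpRuntime).GetProperty(
                "FileChangesMonitor", BindingFlags.NonPublic | BindingFlags.Static);
            object fcm = monitorProp.GetValue(null, null);

            FieldInfo subdirsField = fcm.GetType().GetField(
                "_dirMonSubdirs", BindingFlags.NonPublic | BindingFlags.Instance);
            object dirMon = subdirsField.GetValue(fcm);

            MethodInfo stop = dirMon.GetType().GetMethod(
                "StopMonitoring", BindingFlags.NonPublic | BindingFlags.Instance);
            stop.Invoke(dirMon, new object[0]);
        }
    }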
"Oh and putting the directory outside the virtual directory is not an option."
Putting the directory outside the virtual directory is the only solution I have found (so far). What you can do is create a link (junction) in the file system so that the directory appears to be inside the virtual directory, e.g.:
Our web site (virtual directory) is located at C:\projectX\website
the data directory (where we create/delete files and folders) is located at C:\projectX\data
then we create a link which makes the data folder available as C:\projectX\website\data
The link is created using the program Linkd.exe (available in the Windows Resource Kit), with the following command:
linkd c:\projectX\website\data c:\projectX\data
Now c:\projectX\website\data is a link/junction which points to the real data directory. You can work with the link as if it were a physical directory.
E.g. in your web site you can access it using this code:
Server.MapPath("~/data")
And you can also use the Windows file explorer and browse to C:\projectX\website\data; it appears just like a real directory.
There seems to be a supported hotfix which achieves the same as the article Sachin mentioned (turning off the file change notifications for a web site).
Check this article in the Microsoft KB for more information.
But since you mentioned in a comment, that you do not have access to the server, I guess this will also not help in your case.
For storing data files that are frequently updated, created and deleted, you should use the App_Data folder in the root of the web site. MSDN states for the App_Data folder:
Contains application data files including MDF files, XML files, as well as other data store files. The App_Data folder is used by ASP.NET 2.0 to store an application's local database, which can be used for maintaining membership and role information.
Also check Q&A section for App_Data folder usage: App_Data folder question
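A minimal sketch of what that could look like for the books/pages scenario from the question (pageId is a hypothetical identifier for the page being created or deleted):

    // Sketch: keep volatile per-page resource folders under ~/App_Data so
    // that creating and deleting them does not restart the application.
    // "pageId" is a hypothetical identifier for the affected page.
    using System.IO;
    using System.Web.Hosting;

    static class PageResources
    {
        static string DirFor(int pageId)
        {
            return Path.Combine(HostingEnvironment.MapPath("~/App_Data"),
                                pageId.ToString());
        }

        public static void OnPageCreated(int pageId)
        {
            Directory.CreateDirectory(DirFor(pageId)); // holds uploaded resources
        }

        public static void OnPageDeleted(int pageId)
        {
            Directory.Delete(DirFor(pageId), true); // remove resources with the page
        }
    }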
I had the same problem. The solution is to externalize the session handling by using the ASP.NET State Service. The only drawback is that every object you place in the session needs to be serializable, as it is transferred to the state service and back.
I currently cannot provide further links, but Google will help you now that you know what to search for.
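For reference, the switch itself is a small web.config change; mode="StateServer" and port 42424 are the standard values for the ASP.NET State Service, not anything specific to this application:

    <system.web>
      <!-- Keep session state out of process in the ASP.NET State Service
           (default port 42424), so app-domain restarts don't wipe sessions. -->
      <sessionState mode="StateServer"
                    stateConnectionString="tcpip=127.0.0.1:42424" />
    </system.web>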