Here's the rundown:
- We have a web site built with a commercial CMS (Sitefinity)
- The web site is .NET
- We have a test server and a live server set up for development (HTML, CSS, JavaScript).
I'm not a .NET programmer, but I do a lot of frontend work. My question is this:
When I make CSS changes on the test server, is it normal practice for the entire web site to be "pushed" over to the live server? In other words, can I not just move over the CSS file from development to the live server? (It does not work when I do this.)
I ask this because every time CSS changes are made, the entire web site has to shut down for 10-20 minutes while the whole development site is "pushed" to the live server. This seems like an unusual practice for something as small as a few CSS changes, and it heavily slows down my work. Shutting down an entire web site to publish one basic CSS file just seems unreasonable.
Can someone please educate me about your processes for .NET and CSS changes? What are the best practices in the industry? I would like to improve my understanding of this.
Thank you. Your insight is appreciated.
We use Sitefinity as well, and you can just copy or FTP the CSS file or files instead of redeploying the entire site. I like to use Beyond Compare.
There are a few ways to push changes from our development server to the live server using Sitefinity:
You can choose the synchronization option in Sitefinity and use it to push content and other changes from one server to the other.
http://www.sitefinity.com/documentation/gettingstarted/getting-started-synchronizing-data-between-two-servers
If you are only updating CSS files, you do not need to push all of the code every time. If you cannot see your changes reflected after editing, publish the page once and the changes should appear.
Please let me know if you want to know more.
It's possible there is some sort of caching going on. Perhaps you can check the settings for static content with your host or in IIS to see when static files like CSS expire.
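For reference, here is a minimal sketch of what that static-content setting can look like in the site's web.config, assuming IIS 7 or later is serving the static files (the one-hour max-age is just an example value):

```xml
<system.webServer>
  <staticContent>
    <!-- Controls how long browsers may cache static files (CSS, JS, images).
         A shorter max-age means updated CSS shows up sooner after a deploy. -->
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="0.01:00:00" />
  </staticContent>
</system.webServer>
```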
In addition, you can restart Sitefinity by going to Administration > Settings > Basic > Languages and clicking Save (or by installing the Falafel Dashboard, which has a handy restart button you can put on the home page).
Restarting the site should clear the cache and show you the changes. I hope this is helpful!
I have SharePoint 2013 at my office, and I want to write some ASP code in a page. The issue is that SharePoint is blocking the code and I get the error "Code blocks are not allowed in this file". I searched Google and found several links that say to fix the issue by making some changes to the web.config file.
Now my question is: how do I find that file? Where is it actually located?
All I have is SharePoint itself; I don't have SharePoint Designer, and I only have admin access to this site. Can someone please guide me?
I know there are several entries here on Stack Overflow, but none of them explain where to find the file.
Please help me.
My apologies if this happens to be a duplicate; in that case, please point me to the right post. Thank you.
By default, injecting server-side code (ASP.NET) directly into SharePoint pages from within sites is not allowed, for performance reasons, and it should stay that way.
If you have never approached SharePoint development and are not an administrator of your company's farm, I strongly advise you to first see whether you can solve your needs with client-side development (JavaScript) instead of going server-side (ASP.NET).
SPS2013 comes with the "Script Editor WebPart", which you can use to inject your custom JS into pages. If you need your customization on all pages, consider adding your JS to the site's master page.
From JS you can use the SharePoint REST API to interact with your site: https://learn.microsoft.com/en-us/sharepoint/dev/sp-add-ins/get-to-know-the-sharepoint-rest-service
If you need heavy customization of your site, you can move to the add-in model (client side), which will require the Visual Studio IDE.
The last option: if you explicitly require server-side code and/or need to develop a scalable, enterprise-grade solution, you will need to build a "SharePoint full trust solution package".
PS: You may see articles around about the "SharePoint Framework" (aka SPFx); unfortunately, it is not available for SPS2013.
I have an Ajax .NET website which follows this structure:
Controls (ascx): TopMenu, LeftPanel, RightPanel, Footer; all are very simple controls and don't require any database connection or server-side code!
One div body (Ajax)
Every time the website starts, the four controls load first, then the Ajax body follows. Performance is pretty good in the development environment.
But when I uploaded the precompiled site to the host, it always takes quite a long time to start up; after the first load, the performance is good.
What I can't understand is this: as far as I know, the four ascx controls are rendered first, which means the page is loaded to the client, and the Ajax content comes after that. So what's causing the slow performance on start-up?
P/S:
I did set the compilation key to false in web.config.
I compiled the site using the Publish tool in VS 2010 (Release mode, not allowing the site to be updatable ...).
I have no images on the site; it's a very simple site.
I've checked similar topics, and even posted a question about this not long ago, but still without success.
my site: http://iketqua.net
From your site, running the network analysis in Google Chrome, what is blocking the render is a huge delay caused by a lot of calculations on page load; it takes a long time before the page even starts getting data.
Also, the Google Analytics script should be placed at the bottom of your page, together with the other external scripts for Google Plus, the Facebook Like button, etc.
There are also two fonts in this CSS that cannot be loaded, and that adds almost 3 seconds of delay:
http://iketqua.net/Styles/Fonts/MyriadPro/font.css
If you are referring to the very first request after deployment to production, I don't think there's much you can do about it. The first request to an ASP.NET site will always be slow, even for a pre-compiled site, because the server still needs to spin up the application and load resources on the server side.
But if you are talking about the first load from the client-side perspective, just by running Chrome Developer Tools I can see that your site's home page is quite heavy (44 requests, ~4 seconds to load), which explains why the first load takes some time and subsequent requests are quicker...mainly because most of those 44 requests get cached by the browser. In your dev environment it happens quickly because there is no significant network latency and there are no connection hops; once you move to production, network latency and connection hops play a big role in performance...that's why many sites use CDNs.
Suggestions
Make your site lighter. There are many things you can avoid. For example:
The background image (http://iketqua.net/img/header_bg.png) is unnecessary because it is a plain color, which you can easily achieve with CSS. That translates to one less request.
Use bundling and minification tools to minify and merge style sheets and JS files (see the sketch after this list).
Optimize your CSS. Take the time to review it and clean it up. Such a simple page should not need to request 9 CSS files...most of them are probably coming from open-source frameworks (jQuery UI, DatePick, etc.).
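As a minimal sketch of the bundling suggestion, assuming the Microsoft.AspNet.Web.Optimization NuGet package is added to the site (the bundle names and file paths below are made up for illustration):

```csharp
using System.Web.Optimization;

public class BundleConfig
{
    // Call this once from Application_Start in Global.asax:
    //   BundleConfig.RegisterBundles(BundleTable.Bundles);
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Merge and minify all stylesheets into a single request
        bundles.Add(new StyleBundle("~/bundles/css").Include(
            "~/Styles/reset.css",
            "~/Styles/layout.css",
            "~/Styles/theme.css"));

        // Same idea for the script files
        bundles.Add(new ScriptBundle("~/bundles/js").Include(
            "~/Scripts/jquery-ui.js",
            "~/Scripts/site.js"));

        // Minify and merge even when <compilation debug="true"> is set
        BundleTable.EnableOptimizations = true;
    }
}
```

Pages then reference Styles.Render("~/bundles/css") and Scripts.Render("~/bundles/js") instead of the individual link and script tags, which cuts those 9+ static-file requests down to a couple.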
I lack the permissions to post this as a comment, but if it's fine in the development environment, it may be something as simple as the capacity of the host or the connection to the host.
After the first load, the performance is good
I'd be inclined to think this is due to the site being cached.
How can I make the Apache web server dynamically load new CSS and images? The purpose is to dynamically change the look and feel of portals in a cluster of load-balanced web server instances. I get a list of changed files and push them to the web server instances. The web servers should then serve the new CSS and images instead of old or cached copies.
Please let me know what changes I have to make to the HTML, Apache server settings, cache settings, etc. to make this happen.
Thanks in advance.
Ranjith
Build up a list of the changes, then flush the entire cache. This is the least-headache way to do it.
I might end up having to build this, but it would be nice if there is a solution already...
I need to add functionality to a client's web page to allow them to upload files, and then to view and download them. We also need some form of authentication mechanism to restrict who has access to which files. I have used Neat Upload in the past and found that it works pretty well, but it only handles uploads. If there is a control that does everything, that would be pretty nice. Has anyone seen or used anything like that? I am working in ASP.NET. Our server is IIS 6, but I cannot confirm which version of IIS the client is using.
I did some more searching, and found this. I think it will fit our needs perfectly
edit: The link didn't come through. The solution is FileVista, at http://www.gleamtech.com/products/filevista/web-file-manager
I would recommend using Neat Upload or some other upload component and integrating ASP.NET membership services to manage permissions.
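As a rough sketch of that combination, assuming a membership role named "FileUsers" and an upload folder under ~/App_Data/Uploads (both made up for illustration), a simple download page could check the user's role before streaming the file:

```csharp
using System;
using System.IO;
using System.Web.UI;

// Download.aspx.cs: serves a previously uploaded file only to users in an allowed role.
public partial class Download : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Reject anonymous users and users outside the allowed role
        if (!User.Identity.IsAuthenticated || !User.IsInRole("FileUsers"))
        {
            Response.StatusCode = 403;
            Response.End();
            return;
        }

        // Path.GetFileName strips any directory part, preventing path traversal
        string fileName = Path.GetFileName(Request.QueryString["file"] ?? "");
        string fullPath = Server.MapPath("~/App_Data/Uploads/" + fileName);

        if (fileName.Length == 0 || !File.Exists(fullPath))
        {
            Response.StatusCode = 404;
            Response.End();
            return;
        }

        // Stream the file to the browser as an attachment
        Response.ContentType = "application/octet-stream";
        Response.AppendHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
        Response.TransmitFile(fullPath);
        Response.End();
    }
}
```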
I've recently started embedding JavaScript and CSS files into our common library DLLs to make deployment and versioning a lot simpler. I was just wondering if there is any reason one might want to do the same thing with a web application, or if it's always best to just leave them as regular files in the web application, and only use embedded resources for shared components?
Would there be any advantage to embedding them?
I had to make this same decision once. The reason I chose to embed my JavaScript/CSS resources into my DLL was to prevent tampering of these files (by curious end users who've purchased my web application) once the application's deployed.
I'm doubting and questioning the validity of Easement's comment about how browsers download JavaScript files. I'm pretty sure that the embedded JavaScript/CSS files are recreated temporarily by ASP.NET before the page is sent to the browser, so that the browser is able to download and use them. I'm curious about this and I'm going to run my own tests. I'll let you know how it goes....
-Frinny
Of course, anyone who knew what they were doing could use an assembly reflector and extract the JS or CSS. But that would be a heck of a lot more work than just using something like Firebug to get at this information. A regular end user is unlikely to have the desire to go to all of this trouble just to mess with the resources; anyone interested in doing that is likely to be a malicious user, not the end user. And you probably have a lot of other security problems if a user is able to run a tool like an assembly reflector on your DLL, because by that point your server has already been compromised. Security was not the factor in my decision to embed the resources.
The point was to keep users from doing something silly with these resources, like deleting them thinking they aren't needed, or otherwise tampering with them.
It's also a lot easier to package the application for deployment because there are fewer files involved.
It's true that the DLL (class library) used by the pages is bigger, but this does not make the pages any bigger. ASP.NET generates the content that needs to be sent down to the client (the browser). There is no more content being sent to the client than what is needed for the page to work. I do not see how the class library helping to serve these pages will have any effect on the size of data being sent between the client and server.
However, Rjlopes has a point, it might be true that the browser is not able to cache embedded JavaScript/CSS resources. I'll have to check it out but I suspect that Rjlopes is correct: the JavaScript/CSS files will have to be downloaded each time a full-page postback is made to the server. If this proves to be true, this performance hit should be a factor in your decision.
I still haven't been able to test the performance differences between using embedded resources, .resx files, and single files because I've been busy with my own endeavors. Hopefully I'll get to it later today, because I am very curious about this and about the browser-caching point Rjlopes has raised.
Reason for embedding: Browsers don't download JavaScript files in parallel. You have a locking condition until the file is downloaded.
Reason against embedding: You may not need all of the JavaScript code. So you could be increasing the bandwidth/processing unnecessarily.
Regarding the browser cache, as far as I've noticed, the response from WebResource.axd says "304 Not Modified", so I guess the files are being taken from the cache.
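For context, here is a minimal sketch of how an embedded file is typically wired up so that WebResource.axd can serve it; the assembly, namespace, and file names are made up, and the .js file's Build Action is assumed to be set to Embedded Resource:

```csharp
using System;
using System.Web.UI;

// Registers the embedded file with ASP.NET so it can be served through WebResource.axd.
// The resource name is the library's default namespace plus the folder path and file name.
[assembly: WebResource("MyCompany.WebControls.Scripts.common.js", "text/javascript")]

namespace MyCompany.WebControls
{
    public class ScriptedControl : Control
    {
        protected override void OnPreRender(EventArgs e)
        {
            base.OnPreRender(e);

            // Emits a <script src="/WebResource.axd?..."></script> tag pointing at the embedded file
            Page.ClientScript.RegisterClientScriptResource(
                typeof(ScriptedControl), "MyCompany.WebControls.Scripts.common.js");
        }
    }
}
```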
You know that if somebody wants to tamper with your JS or CSS, they just have to open the assembly with Reflector, go to the resources, and edit what they want (it probably takes a lot more work if the assemblies are signed).
If you embed the JS and CSS on the page, you make the page bigger (more KB to download on each request) and the browser can't cache the JS and CSS for subsequent requests. The good news is that you make fewer requests (at least 2 fewer if you are like me and combine multiple JS and CSS files into one), plus JavaScript files have the problem of being downloaded serially.