Optimize and best practice for an image called on every page - asp.net

I maintain the code for a client's large project. Their logo is on the master page, and the client updates it weekly.
Each time, I have to update the title, image path, etc. on every server, and also upload the new image to every server; now the client wants to automate this process.
What is the best way to accomplish this? I have thought about storing the information (including the image) in the database as a single record, which the client updates whenever they want. But then, on each page, the logo image will come from the database. Is that OK, or should we cache it or use some other option to reduce the load on each request? Alternatively, can we update the page (.aspx) file using a FileStream, the way we can update a text file? If I updated the .aspx from code, we wouldn't have to call SQL at all - we would just upload an image and update the title and other information in the .aspx page.

Very simple. If the client wants to update something on their own without requiring developer intervention, it needs to go in the database. You can of course add any caching or whatever you like so that it's not necessary to query the database on every request.
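For example, a minimal sketch of that caching idea, assuming a hypothetical single-row settings record (the SiteSettings class and LoadFromDatabase helper are placeholders, not an existing API):

    using System;
    using System.Web;
    using System.Web.Caching;

    public class SiteSettings
    {
        public string Title { get; set; }
        public string LogoUrl { get; set; }
    }

    public static class SiteSettingsProvider
    {
        // Serve the settings from cache; only hit the database once per hour.
        public static SiteSettings GetSettings()
        {
            var settings = HttpRuntime.Cache["SiteSettings"] as SiteSettings;
            if (settings == null)
            {
                settings = LoadFromDatabase();
                HttpRuntime.Cache.Insert("SiteSettings", settings, null,
                    DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration);
            }
            return settings;
        }

        private static SiteSettings LoadFromDatabase()
        {
            // Placeholder: your single-row SELECT (ADO.NET, EF, whatever you use).
            return new SiteSettings { Title = "Example", LogoUrl = "~/images/logo.png" };
        }
    }

The master page can then set its title and logo from GetSettings() without touching the database on most requests.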
The only other viable alternative would be giving the client access to the directory that has the logo and insist on a convention for the logo filename. The client can then simply upload a new logo with the same filename and nothing needs to change on the site.
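One wrinkle with that approach is that browsers cache the old image under the unchanged URL. A common workaround, sketched here with a hypothetical LogoImage control in the master page's code-behind, is to append the file's last-write time as a version query string:

    using System;
    using System.IO;

    // Version the logo URL so browsers refetch it as soon as the client
    // uploads a replacement file with the same name.
    protected void Page_Load(object sender, EventArgs e)
    {
        string physicalPath = Server.MapPath("~/images/logo.png");
        LogoImage.ImageUrl = ResolveUrl("~/images/logo.png")
            + "?v=" + File.GetLastWriteTimeUtc(physicalPath).Ticks;
    }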
Making changes to the page itself automatically is a hugely bad idea. Avoid that like the plague.
I suppose a final though not bloody likely option is to get the client to realize that a logo shouldn't freaking be changed weekly or even yearly. It's a freaking logo; it's your identifying mark to the public.

Related

Reading a PDF back from an iFrame?

I have a PDF document that is generated on the fly and rendered, also on the fly, to an iframe within a RadWindow. Basically, the document is already largely prepopulated; however, the user will still have a chunk of information that they are required to enter. I've found a good amount of information about sending a PDF TO an iframe, but not much information about going the other way. I have a button within the RadWindow that can access the iframe object; however, I'm somewhat lost as to where to go from there.
EDIT: The PDF is an editable form. I'm trying to pull back the entire PDF document as is, after the client side makes their entries to the form.
I think you'll need to send the file to the user so they can edit it locally, and then instruct them to upload it back.
The Content-Disposition header with the value "attachment" can help with the first task, and you can use RadAsyncUpload for the upload: http://demos.telerik.com/aspnet-ajax/asyncupload/examples/overview/defaultcs.aspx.
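For the first task, a minimal sketch from a Web Forms code-behind; GeneratePdfBytes is a placeholder for whatever builds the PDF on the fly:

    protected void DownloadButton_Click(object sender, EventArgs e)
    {
        byte[] pdfBytes = GeneratePdfBytes(); // placeholder for your generation code

        // Force a download prompt so the user saves and edits the form locally.
        Response.Clear();
        Response.ContentType = "application/pdf";
        Response.AddHeader("Content-Disposition", "attachment; filename=form.pdf");
        Response.BinaryWrite(pdfBytes);
        Response.End();
    }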
I am not aware of any way to tap into the PDF viewer plugin that browsers use to show the PDF. Perhaps there is an API from Adobe or some other third-party plugin, but that would rely on them and is out of your control.
Perhaps the JS PDF viewer from Firefox (pdf.js) has something: https://mozillalabs.com/en-US/pdfjs/ - but I don't know how stable and usable it is.
As per what was described in the comments, I ended up using postbacks through the PDFs themselves, along with 1-pixel fields to store the data required to identify the documents. It's a little hacky, but functional. I'm leaving this as an actual answer, as it is as close to a real solution to my original problem as I found. It has been up and running this way for close to 4 years, and thus far hasn't caused any issues.

Show a list of pages that are currently open

I have a task: list all pages that are open at a given moment and show how many people are on each page.
I am looking for a way to make that happen without keeping any db records or saving information in a text file or something like that. (Not necessarily, then; of course I am going to save that info to a db. I just wanted the logic of catching opened page addresses.)
I can of course keep track of every page that has been opened up to that point, but I want a page's address to appear on the list when someone opens that address and disappear when no user is browsing it any longer.
Can you give me some ideas how to make that happen using ASP.NET?
Note: I am using web forms with asp.net 4.5
Thanks!
"I just wanted to the logic of catching opened page addresses"
Use JavaScript in a timed loop (onload and then every 30 seconds, perhaps) on every page to asynchronously post to a page on your server. It should send information identifying the page. This will give you a good idea of how many people are currently on each page.
Store this information in a db in your code-behind, and use this information to report as you wish.
Of course, if a user leaves their browser open on one of these pages or opens another tab, it will still be reported as 'open'.
To get the current url in javascript you can use:
var pathname = window.location.pathname;
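On the server side, a rough sketch of what those timed posts could target. The handler name and the in-memory store are assumptions; in practice you'd swap the dictionary for the db writes mentioned above:

    using System;
    using System.Collections.Concurrent;
    using System.Linq;
    using System.Web;

    // Registered as e.g. PageHeartbeat.ashx (hypothetical name).
    public class PageHeartbeatHandler : IHttpHandler
    {
        // Page path -> (visitor key -> time of last ping).
        private static readonly ConcurrentDictionary<string, ConcurrentDictionary<string, DateTime>> Pings =
            new ConcurrentDictionary<string, ConcurrentDictionary<string, DateTime>>();

        public void ProcessRequest(HttpContext context)
        {
            string path = context.Request.QueryString["path"] ?? "unknown";
            // Crude visitor key; a cookie or generated id would be more accurate.
            string visitor = context.Request.UserHostAddress + context.Request.UserAgent;

            var visitors = Pings.GetOrAdd(path, _ => new ConcurrentDictionary<string, DateTime>());
            visitors[visitor] = DateTime.UtcNow;

            // A visitor counts as "on the page" if they pinged within the last
            // minute, i.e. they missed at most one 30-second heartbeat.
            DateTime cutoff = DateTime.UtcNow.AddMinutes(-1);
            int open = visitors.Count(kv => kv.Value > cutoff);

            context.Response.ContentType = "text/plain";
            context.Response.Write(open);
        }

        public bool IsReusable { get { return true; } }
    }

Each page's timed loop would then request PageHeartbeat.ashx?path= plus the pathname from the snippet above.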
In Google Analytics you can see which pages are being used in near-real time.
Why not use that to solve this issue? It's easy to set up.

Getting Google Spreadsheet in the Background

We have a Google Spreadsheet from which we wish to load data into our webpage.
I started by using the Google Spreadsheets API via C# and the Google API .NET libraries to read the spreadsheet and load it into an HTML unordered list.
The spreadsheet has about 200 rows, but could have more, as it will be updated frequently. So the problem is that users have to wait until the spreadsheet data is retrieved and parsed before they can see anything on the webpage (the page is white whilst loading).
How can I load this data in the background whilst the page loads?
I've already written my code in C# and don't much want to spend the time switching to JavaScript, but I will if I have to.
Could I use the AJAX Control Toolkit to do this? I know it can load HTML, but can I use it to fetch the Google data?
What can I do here that would be fast and easy?
[Edit]
The account that hosts the Google spreadsheet is inside a Google domain, so its documents can't be shared with the public as a whole - only with individuals. The C# libraries allow me to use the account's username and password to log in and get the spreadsheet data, so the spreadsheet doesn't need to be shared at all. Even if I went with a JavaScript/AJAX solution, I would still need this functionality.
Well, this probably isn't the BEST answer, but it IS a solution. I'd like to see if y'all have a better one.
Anyway, I found this, which is an example of how to use an asp:Timer to delay calling a function for a certain amount of time - in my case, long enough for the page itself to load. At least this way the user gets to see the page, and can watch a nice loading gif until the actual content arrives.
It is an AJAXy approach that lets me keep my C# code without having to add any JavaScript.
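For anyone after the shape of that approach, a minimal sketch; FetchSpreadsheetRows stands in for the existing Google API call:

    <asp:ScriptManager ID="ScriptManager1" runat="server" />
    <asp:Timer ID="LoadTimer" runat="server" Interval="100" OnTick="LoadTimer_Tick" />
    <asp:UpdatePanel ID="ListPanel" runat="server">
        <Triggers>
            <asp:AsyncPostBackTrigger ControlID="LoadTimer" EventName="Tick" />
        </Triggers>
        <ContentTemplate>
            <asp:Image ID="LoadingGif" runat="server" ImageUrl="~/images/loading.gif" />
            <asp:BulletedList ID="RowList" runat="server" />
        </ContentTemplate>
    </asp:UpdatePanel>

And in the code-behind:

    // Fires once, shortly after the page has rendered with the loading gif.
    protected void LoadTimer_Tick(object sender, EventArgs e)
    {
        LoadTimer.Enabled = false;                   // don't fire again
        LoadingGif.Visible = false;
        RowList.DataSource = FetchSpreadsheetRows(); // the existing spreadsheet call
        RowList.DataBind();
    }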

Web server instances?

Very newbie question, please forgive me:
I'm creating an ASP.NET website. I assume that when multiple people request the page, each person gets a new instance of the site. However, if the site uses a .jpg image on the server and manipulates the image, does each person get their own instance of the image as well, or do they share the image somehow? I think I know the answer, and this is probably a dumb question, but I wanted to ask.
As an example: A user logs into the site, and adds times to a schedule. Depending on the schedule, a blue line is drawn on an image (grid.jpg), which depicts a daily timeline. The image is then saved as newgrid.jpg and displayed to the user. Is there a way for each user to get an instance of the image that only they can see?
A great way to generate dynamic images in ASP.NET is by using a Handler. The answer over here offers a good, simple example. In this scenario, the generated image never touches the local file system, it's just generated in memory, and returned to the client.
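A rough sketch of such a handler for the grid.jpg scenario above; the line position comes from the query string here purely for brevity, where in practice it would come from the user's saved schedule:

    using System.Drawing;
    using System.Drawing.Imaging;
    using System.IO;
    using System.Web;

    // Registered as e.g. ScheduleImage.ashx (hypothetical name). Each request
    // draws on its own in-memory copy, so users never share or overwrite a file.
    public class ScheduleImageHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string basePath = context.Server.MapPath("~/images/grid.jpg");

            int x;
            if (!int.TryParse(context.Request.QueryString["x"], out x)) x = 0;

            using (Bitmap bitmap = new Bitmap(basePath))
            using (Graphics g = Graphics.FromImage(bitmap))
            using (MemoryStream ms = new MemoryStream())
            {
                g.DrawLine(Pens.Blue, x, 0, x, bitmap.Height); // the per-user line

                bitmap.Save(ms, ImageFormat.Jpeg);
                context.Response.ContentType = "image/jpeg";
                ms.WriteTo(context.Response.OutputStream);
            }
        }

        public bool IsReusable { get { return true; } }
    }

The page then references it with something like <img src="ScheduleImage.ashx?x=120" />, and no newgrid.jpg ever has to be written to disk.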
Well, usually the site itself is the same for all users, but a session is created for each user, with specific settings that you can set. So yes, by default all users will see the same changes (and the same content).

Flex 3: Project Architecture & SEO

I've got a Flex 3 project. One of the problems I have is that not very much of its content is indexed by Google. Currently, I pull data from a MySQL database, so the Googlebot doesn't see most of the site.
My goal is to increase the amount of content indexed by Google, improve the SEO, and improve SERPs.
I thought that instead of pulling the data from the database, I would change the project's architecture and create separate "pages". So, in my case, I would compile each puzzle separately and upload it to the server in its own directory. This way the info in each puzzle would get indexed.
The negative is that if I add a puzzle, I'd have to add a link to it in all of the puzzles already on the server: add the link, re-compile each puzzle, and upload it again. Is there a way to get around this problem? Also, if I wanted to communicate some data from one puzzle to another in the future, I wouldn't be able to do so.
Any suggestions?
Thank you.
-Laxmidi
The usual way to achieve this goal is to develop a hidden parallel site in HTML.
On the first page you will have your Flash and, hidden by JavaScript, a list of links to the other pages. These links will be parsed by the robots. Ideally, the href pages are virtual (look up "URL rewriting"). On each "fake" page, your server-side language will print onto the page the content or links from your database AND the Flash. The Flash will be provided with a string explaining where it is and what it's supposed to show.
Ex: http://www.mysite.com/category1/content7 - URL rewriting sends this request to http://www.mysite.com/index.php?uri=category1/content7. The page should display the Flash with the FlashVar "uri=category1/content7". The Flash then knows which content it has to display, so when a user arrives from Google by following this link, he finds the content he was looking for.
All linking and content intended for SEO should be in HTML; don't trust robots' ability to read Flash.
Have a look at Adobe's reference on deep-linking.
You can generate the website's sitemap.xml with a cron process (daily), such that the URLs encode the state of the application you need. Each URL encodes whatever content you need to retrieve from the db, with just one index.html page.
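A rough sketch of such a job, written in C#; the URL pattern is taken from the example above, and LoadPuzzles is a hypothetical stand-in for the real database query:

    using System;
    using System.Collections.Generic;
    using System.Xml.Linq;

    // A daily job (cron/scheduled task) that rebuilds sitemap.xml from the db,
    // so every piece of content gets a crawlable URL without recompiling anything.
    class SitemapBuilder
    {
        static void Main()
        {
            XNamespace ns = "http://www.sitemaps.org/schemas/sitemap/0.9";
            var urlset = new XElement(ns + "urlset");

            foreach (var puzzle in LoadPuzzles())
            {
                string loc = "http://www.mysite.com/" + puzzle.Key + "/" + puzzle.Value;
                urlset.Add(new XElement(ns + "url", new XElement(ns + "loc", loc)));
            }

            new XDocument(urlset).Save("sitemap.xml");
        }

        // Hypothetical: load (category, slug) pairs from the database.
        static IEnumerable<KeyValuePair<string, string>> LoadPuzzles()
        {
            yield return new KeyValuePair<string, string>("category1", "content7");
        }
    }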
Good luck!
