We deliver micro-site content for our client. Our content is injected into a wrapper that is supplied by another developer.
To deliver our content, we host both the wrapper and the content. The user can access this at
http://fundcentre.[redacted].ie/ (try a search for '[redacted]')
For the other content that is not ours, the other developer hosts a similar (though slightly different) wrapper and delivers that content. The user accesses it here:
http://www.[redacted].ie/ (try a search for '[redacted]')
The wrapper contains a search box, which works for the other developer but not for us. I took a look at the network traffic with Firebug, and it appears that when I run the search from the wrapper we're hosting, I get a "407 Proxy Access Denied" error. My guess is that their proxy has a problem with the search being conducted from a page hosted outside its scope.
It was also suggested that there were JavaScript errors on the page preventing the search from executing, but I can't see any. Also, I don't think the request would get as far as the proxy error if that were the case.
I don't really understand this stuff too well though, so could somebody with a bit more experience please take a look and maybe shed some light on this for me? Thanks.
The problem appears to be that the search box and the button next to it (the magnifying glass) both cause the whole-page form to submit after they try to set the page URL to the search URL. When you type into the search field and hit Enter, the outer form that wraps the entire page is submitted. When the magnifying glass is clicked, it tries to load the search results, but because it's an image button, the click also submits the outer form.
I'm not exactly sure how best to fix it, partly because I think the entire page design should be thrown out. But if you're stuck with it, you might get it working by ditching the inline JavaScript on the button (since it's not working anyway) and wrapping the search controls in their own <form> directed at the search page. A <form> within a <form> is bad mojo, but that's hard to avoid in a design that puts the whole page in a <form> to start with.
Alternatively, you could handle keypress events on the search input to detect Enter, and have that handler and the code on the button both return false to stop the outer form submission.
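A minimal sketch of that second approach, assuming made-up ids ('searchBox', 'searchButton') and a placeholder search URL in place of whatever the wrapper actually uses:

    // Hypothetical ids and URL -- substitute the wrapper's real ones.
    var input = document.getElementById('searchBox');
    var button = document.getElementById('searchButton');

    function runSearch() {
        window.location.href = '/Search.aspx?q=' + encodeURIComponent(input.value);
        return false; // stop the outer page-wide form from submitting
    }

    // Enter key in the search field
    input.onkeypress = function (e) {
        e = e || window.event;
        if ((e.keyCode || e.which) === 13) {
            return runSearch();
        }
    };

    // Click on the magnifying-glass image button
    button.onclick = runSearch;

The important part is the return false in both places; without it, the outer form submits and the navigation to the search URL is lost.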
edit — as to why it works on the other site: it appears that there, the outer form really is the "search" functionality somehow, as they don't have the click handler on the search button at all, so all a click does is submit the outer form anyway.
edit again — also, I never see that "proxy" issue. The search from your page works fine for me once I fix the inline JavaScript on the button so that it ends with ; return false. That may actually be all you need to do.
It could be a problem that your forms' action attributes point to different scripts. One points to "Home.aspx" and the other to "/Default.aspx".
The two pages are on different subdomains, so you may want to change the action so it contains the full URL (e.g. "http://www.newireland.ie/Default.aspx").
The problem can be seen on this page: http://ignitingthesixthsense.com/pre-launch-1
The issue is with the plugin called Social Discount Press. The purpose of this plugin is to prompt the user to share your webpage via social media; if they do so, a link then appears giving them access to restricted content.
There are actually 2 problems, but I am not sure if they stem from the same issue or not.
1) The first issue is that I have placed the social share buttons on the page twice using this shortcode:
[social_sharing_discount index="2"]
And the second instance of the share buttons (towards the bottom of the page) does not work properly. The Facebook and Twitter buttons activate the share box when clicked, but after sharing, the "instant access" button does not appear beneath the share buttons as it should. And the Google button does nothing at all when clicked. I have found that if I remove the first instance of the share buttons (at the top of the page), the second instance starts working; in other words, it only seems to work properly with one instance on the page at a time.
2) The second and much smaller issue is that when the Google share button is clicked, the access button appears before the person actually shares.
Any assistance with this would be greatly appreciated.
I found that the main cause of the issue was that the plugin is built mainly on IDs, which need to be unique across the page. Having multiple instances of the same ID was causing many errors.
The solution was to rewrite some of the PHP and JS to use classes instead of IDs. This allowed two instances of the plugin to work on the same page, thus solving problem number one.
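The gist of the change, as a sketch (the selector names here are made up, not the plugin's actual ones):

    // Before: only the first element with this ID can ever be found.
    // var accessButton = document.getElementById('instant-access');

    // After: every instance is found and wired up independently.
    var buttons = document.querySelectorAll('.instant-access');
    for (var i = 0; i < buttons.length; i++) {
        buttons[i].style.display = 'block'; // reveal each instance's access button
    }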
The reason the Google one is so inaccurate is that the author simply opens the Google share page in a popup, which doesn't let us see JS events such as the moment the user shares.
On the other hand, because the Twitter and Facebook JS SDKs are loaded into the page, we know the exact moment when either of those has been shared, and can thus accurately display the instant access button.
Because of this, the author of the plugin is just making a best guess as to when the user is likely to have shared. In this case the author is guessing that when the JavaScript onunload event fires, the user has logically shared on Google. Now, the onunload event can be fired in multiple ways: one is when the user closes the popup, i.e. the best-guess scenario for whether the user has shared. However, the onunload event also fires when the window loses focus, the user navigates away, or a link is clicked.
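A sketch of that best-guess pattern (the URL and helper are illustrative, not the plugin's actual code; and since a cross-origin popup's onunload isn't reliably reachable from the opener, the sketch polls popup.closed, which is the usual approximation of the same guess):

    // Open the Google share page in a popup -- no SDK, so no share callback.
    var popup = window.open(
        'https://plus.google.com/share?url=' + encodeURIComponent(location.href),
        'share', 'width=600,height=400');

    // Best guess: treat "popup closed" as "user shared".
    // In reality the user may have closed it without sharing at all.
    var timer = setInterval(function () {
        if (popup.closed) {
            clearInterval(timer);
            revealInstantAccessButton(); // hypothetical helper
        }
    }, 500);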
Furthermore, the onunload event is only properly supported in IE, Firefox and Safari, not Chrome and Opera, which in the end gives it differing behaviour in different browsers.
All of this can lead to unpredictable behaviour, such as what you are noticing.
A better solution would be for Google to provide a JS SDK for the Google Plus share that lets us create a share box on the fly; however, it lacks that functionality as of now.
I would like to know at which stage it is okay to start manipulating HTML elements/content using JavaScript so as not to impair SEO.
I have read somewhere that HTML content hidden using the CSS property display:none is often penalized by Google crawlers, with good reason from what I'm led to believe. I ask because I intend to have some div panels that are initially hidden but shown once the user clicks an appropriate link. My intention is therefore not to hide content from users entirely, just initially, to give them a better user experience. I'm afraid Google may not see it that way!
My reason for doing this is to prevent the split second (or in some cases, a full 2 seconds) of ghastly unstyled HTML elements (positioning) before my JavaScript comes in to position, hide and neaten everything up. So adding display:none up front and then using JavaScript to toggle visibility would have been ideal, but is apparently a no-no with Google's search engine bot.
Do you experts have any advice? Thank you!
Google can now crawl AJAX sites using a simple URL-substitution trick; you might be able to take advantage of this to let Googlebot see a plain HTML version of the page for indexing instead of your load-optimized page; see http://code.google.com/web/ajaxcrawling/docs/getting-started.html
If the content in question exists in the page's HTML and is accessible to the user by the time the page finishes loading initially, then you are okay. You want to make sure Google can lead a user to your page and that the user can see the content in question without requiring further interaction. Adding new content to the HTML after the initial load (i.e. content from the server) can be problematic for SEO. However, if all content is in the HTML by the end of the page load, you shouldn't get docked. Keep in mind, good SEO strategy dictates using standard methods of usability so the web crawler can access your content.
Also, each page should follow a content theme. Example: Don't abuse users by hiding five different unrelated blocks of content "medical devices, kazoos, best diners, motorcycles, toxic waste" on one page. Theoretically you could take all of your site's content and lay it out on one page using javascript and 'display:none' waiting for an 'onClick', but that smells like spam.
EDIT, additional info pertaining to the original question:
The search-engine-friendly way to display content dynamically is to load it into the page first, then hide it from the user.
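A minimal sketch of that order of operations (the class names are illustrative): the panels are present and visible in the served HTML, and the script hides them only once it runs, so a crawler without JavaScript still sees everything.

    // The panels exist, fully populated, in the served HTML.
    // Hide them only after the script runs; crawlers without JS see them all.
    var panels = document.querySelectorAll('.collapsible-panel'); // hypothetical class
    for (var i = 0; i < panels.length; i++) {
        panels[i].style.display = 'none';
    }

    // Reveal a panel when its toggle link is clicked.
    document.querySelector('.panel-toggle').onclick = function () {
        panels[0].style.display = 'block';
        return false; // keep the link from navigating
    };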
I have a website running with a "report abuse" option. This works by clicking a link; after the link is clicked, the webmaster gets a report of the location of the content that was reported as abuse. I have added rel="nofollow" to the href of the links in question, but this is not helping. It seems that this attribute is only used to decide whether a page should be ranked or not. How can I exclude Googlebot from following those abuse links?
This is what a link looks like:
Click me
The way I would go is to put the report-abuse email behind a POST form of some kind: for example, a drop-down box to select the issue, or a text box to write a comment about the abuse. Another method would be to style the form's submit button so that it looks like a link, and use that in place of your current link if you didn't want to add another step.
I'd do this in two parts:
By default, I'd make the link take you to a page where you report the abuse via a (very short, friendly) form, one where if you don't want to, you don't even have to choose anything, just click the Report button. Clicking the Report button (or a cancel link) takes you back where you were.
I'd include JavaScript that tests whether the user has modern browser features (DOM node creation and such) and, if so, changes the action of the link so that when they click it, the form appears right there (in a small overlay box) rather than taking them to a separate page. That makes for a less intrusive user experience. Either way, though, the end result is a form being submitted rather than simply a link being followed, as sketched below.
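A rough sketch of that progressive enhancement, with made-up ids and endpoint:

    // Only runs if JS is available; otherwise the plain link still works.
    var link = document.getElementById('report-abuse-link'); // hypothetical id
    if (link && document.createElement) {
        link.onclick = function () {
            // Build a tiny form in place, instead of navigating away.
            var form = document.createElement('form');
            form.method = 'POST';
            form.action = '/report-abuse'; // hypothetical endpoint
            var button = document.createElement('input');
            button.type = 'submit';
            button.value = 'Report';
            form.appendChild(button);
            link.parentNode.insertBefore(form, link.nextSibling);
            return false; // don't follow the link
        };
    }

Since crawlers generally don't submit POST forms, the report only ever goes out on a deliberate user action.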
We are adding Openfire Fastpath chat to our site. It will determine when live chat is available and display an appropriate image to indicate the current status, with links for each state.
The JavaScript call hits a function that is on another box, and this function uses document.write to output the HTML to the page. I know there is a delay because it is making a request to another server and waiting for the result to come back. The pause is about half a second, but it holds up the rest of the page load.
Has anyone experienced a similar issue, or can anyone offer tips for getting this to load asynchronously somehow? I tried putting it into an ASPX AJAX panel, but that seemed to cause other issues.
I used an iframe; that's the only thing I could find that seemed to work.
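A sketch of the iframe workaround (the container id and script URL are placeholders): because the third-party script uses document.write, it can't simply be deferred, but it can be isolated in an iframe so its document.write only blocks the iframe, not the main page.

    // Create the iframe dynamically so the main page keeps loading.
    var frame = document.createElement('iframe');
    frame.style.border = 'none';
    frame.width = '200';
    frame.height = '50';
    document.getElementById('chat-status').appendChild(frame); // hypothetical container

    // Write a minimal document into the iframe that pulls in the remote script.
    var doc = frame.contentWindow.document;
    doc.open();
    doc.write('<script src="http://chat.example.com/fastpath/status.js"><\/script>');
    doc.close();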
This is actually a follow-up on my previous question (link)
I've created the HttpHandler and it works fine for now; I'll add flexibility by using the query string and session to point the post I'm making in the right direction.
The next question is as follows.
Now that I have the old page iframed as it should be, there's still the trouble of handling the postbacks (or actions) these pages trigger.
Every button action (ASP form post) refers to a page that is not there (it's on the other server from which I am importing functionality).
I've tried using a URL mapping to the other server, but I get an error telling me the external link is not a valid virtual directory, so I discarded this option.
Is there any way to keep the functionality going inside the iframe?
Please ask for clarification if you need it.
I got a solution from a colleague.
Before passing the response string from the handler to the iframe, I use a string Replace to adjust the URLs in the old site. This way they point to the old site and everything works again :)
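For illustration, the rewriting step might look something like this (shown as a plain JavaScript function, though the original is an ASP.NET HttpHandler using string.Replace; the host name is made up):

    // Rewrite root-relative actions/links/sources in the fetched HTML so they
    // point back at the server that actually hosts the old pages.
    function fixUrls(html) {
        return html
            .replace(/action="\//g, 'action="http://oldserver.example.com/')
            .replace(/href="\//g, 'href="http://oldserver.example.com/')
            .replace(/src="\//g, 'src="http://oldserver.example.com/');
    }

    // Usage: var page = fixUrls(responseString); // then hand it to the iframe

With the form actions absolute again, the postbacks land on the old server instead of 404ing against the new one.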