Can I prevent spam bots from submitting a form if it's hidden? - css

If I apply display:none to a form wrapper on my website's contact page, will a spam bot still be able to submit the form?
If not, would a possible solution be to place a link saying "send us an email" which, when clicked, reveals the wrapper with the form, thus preventing spam?
Thanks,
-O

A spam bot is usually a script that is executed and run automatically. It's not an actual human being, so the bot doesn't care whether the form is hidden by a style or not. If you want to keep a spam bot from abusing the form, you could instead add it to the DOM only at the moment the user clicks a button.
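A minimal sketch of that idea (the button and wrapper IDs here are invented for illustration); since the form markup is not in the initial HTML at all, a simple scraper finds nothing to submit:

    <button id="show-contact">Send us an email</button>
    <div id="contact-wrapper"></div>

    <script>
      // Build the form only when a human clicks the button.
      document.getElementById('show-contact').addEventListener('click', function () {
        document.getElementById('contact-wrapper').innerHTML =
          '<form method="post" action="/contact">' +
          '<input type="email" name="email" required> ' +
          '<textarea name="message" required></textarea> ' +
          '<button type="submit">Send</button>' +
          '</form>';
      });
    </script>

Bear in mind that a bot driving a real browser, or one that executes your JavaScript, can still reach the form; this only filters out the dumb markup scrapers.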

There are many different types of spam bot out there, and many different approaches used in filling in your form (and thus: sending you spam).
For the most part these are automated scripts which don't actually 'see' the page at all but simply use the markup they find on the page. To that end, using CSS to hide a form won't stop them at all.
In the same vein: you'll find that there are a lot of bots out there which will scrape your email address and send you spam directly if you leave a mailto: link on your site.
There's no sure-fire way of stopping spambots, I'm afraid. There are a lot of techniques you can employ to reduce the volume, though.
The most common is to use a CAPTCHA service like reCAPTCHA, although even this isn't a concrete guarantee that some spam won't still filter through.
The other trick I tend to employ is a honeypot input, which is relatively easy to implement and seems to do a fairly good job of keeping the spammers out.

Spambots are meant to be fast, and most of them don't parse CSS or JavaScript. If you want to protect against them you can include a CAPTCHA, even a simple one, like asking the user to write the sum or product of two small numbers (obviously phrase the request as a sentence, using words rather than digits). A cheap solution like this should prevent most spam, but it's easily defeated if someone wants to target you specifically.
I wouldn't suggest creating the form dynamically, since users who can't execute JavaScript wouldn't be able to see it.
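A bare-bones sketch of such a question, with an invented field name; note that the expected answer has to be checked on the server, since anything verified only in the browser can be skipped by a bot that posts directly to your endpoint:

    <form method="post" action="/contact">
      <!-- The question is written in words so a bot can't simply
           extract the digits and evaluate the expression. -->
      <label for="spamcheck">What is three plus four? (anti-spam question)</label>
      <input id="spamcheck" name="spamcheck" autocomplete="off" required>
      <!-- ...the real contact fields go here... -->
      <button type="submit">Send</button>
    </form>

On the server, discard any submission where spamcheck is not 7 (or, better, store a randomly picked question and its answer in the session and compare against that).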

If using a honeypot, which is the most user-friendly option and is simple and extremely effective, try not to hide the input field with display:none or visibility:hidden; more sophisticated bots can detect that.
Instead, position the field absolutely at the top of the form, to the left or right, give it no border, and make its background and text colour the same as the element the form sits within.
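A sketch of that honeypot, with invented field names; the decoy is blended into the form instead of being hidden with display:none, and a real user never sees or fills it, so any submission that arrives with it populated can be discarded server-side:

    <form method="post" action="/contact" style="position: relative; background: #fff;">
      <!-- Decoy field: positioned in a corner, borderless, and coloured
           to match the form background rather than hidden outright. -->
      <input type="text" name="website" tabindex="-1" autocomplete="off"
             style="position: absolute; top: 0; right: 0; border: 0;
                    background: #fff; color: #fff;">
      <!-- The real fields follow as normal. -->
      <input type="email" name="email" required>
      <button type="submit">Send</button>
    </form>

The name "website" is deliberately chosen to look like a field a bot would want to fill in.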

Related

Is there a way to see what accessibility standards a Marketing Cloud email meets?

We have a project at my work at the moment around ensuring we meet accessibility standards on our website.
Our emails are built using Salesforce Marketing Cloud layouts. Does anyone know how we can see or test how 'accessible' they are?
I can test using ReturnPath to see how they render on various devices, and that gives me results for colour blindness, but I'm not sure how to test how well they would or wouldn't work with a screen reader, for example.
Emails are a tough one from an accessibility perspective. We are still stuck using tables for layout, even in 2021!
We can't use WAI-ARIA, and CSS support is very limited.
As such I have a smaller checklist for emails that covers the important stuff we can control.
The main things I would look at are:
Reading order
Make sure that the email reads left to right and then downwards (assuming the language is a left-to-right language; otherwise reverse it).
Use headings
Email marketers often just style normal text instead of using proper headings. Make sure headings are actual <h2> to <h6> elements, with a single <h1> at the start explaining the purpose of the email.
Also make sure that the headings do not skip levels (so don't go from <h2> to <h4>, for example).
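A stripped-down illustration of that structure inside a typical table-based email layout (the content is invented):

    <table role="presentation" width="100%">
      <tr><td>
        <h1>Your March order summary</h1> <!-- single h1 stating the email's purpose -->
        <h2>Items shipped</h2>
        <h2>Delivery details</h2>
        <h3>Tracking</h3> <!-- h3 directly under an h2: no skipped levels -->
      </td></tr>
    </table>

As a bonus, role="presentation" on the layout table stops screen readers from announcing it as a data table.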
Colour contrast
The exact same rules as for websites apply to emails. I would recommend running your colours through the WebAIM colour contrast checker: text against the background around it, button text against button backgrounds, and so on.
Alt Attributes
Alt attributes on images are the big one to check, especially in emails, where an image might be the only content within a hyperlink. As you can't use aria-label or visually hidden text in an email, alt attributes are the only way to give a link meaning if it contains nothing but an image (plus, as email clients often block images, it means there is meaningful text for everyone else, not just screen reader users).
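For example, an image-only link with meaningful alt text (the URL and filename are invented):

    <!-- The alt text is what a screen reader announces for the whole link,
         and what everyone sees while the email client blocks images. -->
    <a href="https://example.com/sale">
      <img src="sale-banner.png" width="600"
           alt="Shop the spring sale: 20% off everything">
    </a>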
Meaningful Link Text
Along the same lines, make sure links do not just say "Read more". Instead, make the link text meaningful, e.g. "Read our article on X Y Z".
Use a descriptive subject line
This one is the only one that is "difficult" as a marketer. You want subject lines that intrigue people into opening the email; however, for people with cognitive impairments, cryptic subject lines can be confusing or distressing.
Getting the balance between "giving the game away" and a meaningful subject line is difficult. If unsure, err towards meaningful (it may help your open rate / conversion rate anyway, so A/B test it!)
View in Browser
Due to the limitations of email clients, the best way to ensure accessibility is to include a custom "view in browser" link in the email (Salesforce etc. are very unlikely to do a good job of the browser versions of their emails).
That way you can use WAI-ARIA, visually hidden text, etc., and mark the page up properly, complying with WCAG 2.1 (and very soon WCAG 2.2) requirements.
Obviously I am aware of the amount of work this entails, but once you have an accessible template and components built, it becomes much easier.
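As an aside, the visually hidden text you gain access to in the browser version is usually done with a utility class along these lines (the class name is just a common convention):

    /* Visually hidden, but still read out by screen readers. */
    .visually-hidden {
      position: absolute;
      width: 1px;
      height: 1px;
      padding: 0;
      margin: -1px;
      overflow: hidden;
      clip: rect(0, 0, 0, 0);
      white-space: nowrap;
      border: 0;
    }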
Testing
Personally I would just test manually, but I am sure an email testing service exists somewhere out there, similar to the Axe accessibility checker.
Given the length of a typical email, though, a manual check will only take 2 minutes once you know what to look for, so a service may not be worth it.
You could always copy and paste the email HTML into a file, save it with a .html extension, and then open and test it in a browser / accessibility checker. You might get a load of issues you can't resolve due to the use of tables for layout, though.
Finally: learning to use a screen reader takes less than an hour. Grab NVDA or VoiceOver and test the email yourself; if you can understand it and access all the same information as everyone else, then send it!

Using JavaScript to get around SEO concerns

I would like to know at which stage it is okay to start manipulating HTML elements/content using JavaScript so as not to impair SEO.
I have read somewhere that HTML content hidden using the CSS property display:none is often penalized by Google's crawlers, with good reason from what I'm led to believe... I ask this as I intend to have some div panels that are initially hidden but shown once the user clicks an appropriate link. My intention is therefore not to hide content from users entirely, just initially, to give them a better user experience; I'm afraid Google may not see it that way!
My reason for doing this is to prevent the split second (or in some cases, a full 2 seconds) of ghastly unstyled, mispositioned HTML elements before my JavaScript comes in to position, hide and neaten everything up. So adding display:none up front and then using JavaScript to toggle visibility would have been ideal, but is apparently a no-no with the Google search engine bot.
Do you experts have any advice? Thank you!
Google can now crawl AJAX sites using a simple URL-substitution trick; you might be able to take advantage of this to let Googlebot see a plain HTML version of the page for indexing, instead of your load-optimized page; see http://code.google.com/web/ajaxcrawling/docs/getting-started.html
If the content in question exists in the page's HTML, and is accessible to the user by the time the page finishes loading initially, then you are okay. You want to make sure Google can lead a user to your page and see the content in question without requiring further interaction. Adding new content to the HTML after the initial load (i.e. content from the server) can be problematic for SEO. However, if all content is in the HTML by the end of the page load, you shouldn't get docked. Keep in mind, good SEO strategy dictates using standard methods of usability so the web crawler can access your content.
Also, each page should follow a content theme. For example: don't abuse users by hiding five different unrelated blocks of content ("medical devices, kazoos, best diners, motorcycles, toxic waste") on one page. Theoretically you could take all of your site's content and lay it out on one page using JavaScript and display:none, waiting for an onClick, but that smells like spam.
EDIT, additional info pertaining to the original question:
The search engine friendly way to display content dynamically is to load it, then hide it from the user.
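A sketch of that pattern (the class name is invented): the panel content is present in the initial HTML, so crawlers and users without JavaScript see it, and the script only hides it once the DOM is ready:

    <div class="toggle-panel">
      This content is in the initial HTML, so crawlers can index it.
    </div>

    <script>
      // Hide the panels only after the DOM is ready; a click handler
      // (not shown) toggles them back on. Without JS they stay visible.
      document.addEventListener('DOMContentLoaded', function () {
        var panels = document.querySelectorAll('.toggle-panel');
        for (var i = 0; i < panels.length; i++) {
          panels[i].style.display = 'none';
        }
      });
    </script>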

Security implications of allowing framing?

I notice that when I try to access Stack Overflow through the reddit toolbar, I get a popup that says "For security reasons, framing is not allowed". See here for an example.
What exactly are those security reasons?
I realize that this might be a question for Meta, but it is really more of a general web security question, so I'm giving it a shot here.
Thanks.
You can check the story on that here.
EDIT:
OK, so quoting from the link: the problem with framing is that it's the first step to clickjacking. How is that accomplished? You can have an apparently harmless page with links, and on top of it a fully transparent frame, carefully positioned so that when you click the links on the page you are actually clicking links or buttons in the framed page. Although you can't see the frame (due to the full transparency), your clicks are caught by it. The result is that while the user thinks he's just navigating a random page, he may actually be changing his Twitter status, sending emails, doing something on Facebook, clicking a PayPal "Yes, please donate it all" button... imagination is the limit.
To protect its users from clickjacking attacks. In simple words, clickjacking works like this:
The attacker hosts a malicious HTML file.
This file loads the 'attacked' website in the background using a frame, and by overlaying elements on top of the 'attacked' website it tries to trick users into clicking something they didn't want to click.
If an evil website decides it's going to frame your website, you will be framed. Period.
Wrong. Mechanisms like the one implemented here on Stack Overflow protect a website from being loaded inside another, possibly malicious, page. This way the site protects its users against clickjacking attacks.
If that is the case, why do it at all? Furthermore, the target of the attack is not necessarily the site being framed; it could be any site. So again, why bother busting the frame?
The frame is used to load the victim's website inside a page that will try to trick its users. Busting the frame means that the site is blocking these possible clickjacking attacks, or at least adding an extra layer of security, since these 'filters' can also be bypassed.
Read the original research paper about clickjacking.
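For reference, a typical frame-busting snippet of the era looks roughly like this; modern sites would send the X-Frame-Options response header instead, which the browser enforces and which page scripts can't disable:

    <script>
      // If this page is not the top-level document, it has been framed:
      // break out by pointing the top-level location at ourselves.
      if (top != self) {
        top.location = self.location;
      }
    </script>

Naive busters like this can themselves be defeated (for instance by the framing page trapping navigation), which is why the answer above calls it an extra layer rather than a complete fix.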
Apparently there is a tiny chance of a possible clickjacking attack, as demonstrated here:
http://dsandler.org/wp/archives/2009/02/12/dontclick
So I guess it kind of makes sense, but it is awfully inconvenient.

Bust iFrames accurately when implementing DiggBar or FacebookBar?

Understanding all the security and UI concerns with iframes, I am implementing a toolbar similar to the DiggBar or FacebookBar.
A top bar persists across the top 30 pixels of the screen, and an iframe displaying external content fills the remainder of the page.
When users close the toolbar, and thereby leave my little site to go directly to the third-party site, how can I bust the iframe properly and display the right page? If the user has clicked even one link in the iframe, I end up showing the wrong page.
Given my understanding of browser security, and given how the DiggBar and FacebookBar fail to do this accurately, I'm guessing it cannot be done.
But I was hoping the Stack Overflow coders are smarter and might have an answer? :)
Thanks!
You can't. Because of the browser's cross-site security restrictions (the same-origin policy), your bar, which sits in its own frame, cannot reach into other frames to determine their URLs.
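You can see the restriction for yourself; assuming a cross-origin iframe with an invented id of "content", any attempt to read its current location throws a security error:

    <iframe id="content" src="https://example.com/"></iframe>
    <script>
      window.onload = function () {
        var frame = document.getElementById('content');
        try {
          // This read only works while the framed document
          // is same-origin with our page.
          alert(frame.contentWindow.location.href);
        } catch (e) {
          // Cross-origin: the browser blocks the read.
          alert('Blocked: ' + e.message);
        }
      };
    </script>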
Not to mention that you'll be sued by website owners for numerous things, and that you'll piss off every hacker out there.
This is the last thing you want to do if you'd like to NOT end up known in your office as that one guy who wanted to include everyone else's web site in his own without the owners' permission.
I wouldn't speak up at any of the conventions either.
I've also added the question "Have you ever written code or worked on code that frames other sites?" to my list of questions used to weed out job applicants.

Is there a generic way to implement a ConfirmIfDirty feature for a web page?

On Stack Overflow, and on other websites, if you start making changes to form elements and then try to navigate away from the page, you get a confirmation message asking if you are sure you want to discard your changes.
This seems relatively easy to do by hand, but impractical to apply across an entire site. Is there a generic solution that can be dropped onto a page as a control (or even a jQuery plugin) which will track IsDirty for all fields (without having to specify each field by hand)?
You can use the window.onbeforeunload event.
See also How can I override the OnBeforeUnload dialog and replace it with my own?
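A minimal sketch of the idea using jQuery (a simple boolean flag; the next answer describes a more thorough comparison-based approach):

    var isDirty = false;

    $(function () {
      // Flag the page as dirty whenever any field changes.
      $('input, select, textarea').change(function () { isDirty = true; });
      // Don't warn when the user is deliberately submitting the form.
      $('form').submit(function () { isDirty = false; });
    });

    window.onbeforeunload = function () {
      if (isDirty) {
        // Modern browsers show a generic message and ignore this string,
        // but returning a value still triggers the confirmation prompt.
        return 'You have unsaved changes.';
      }
    };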
A possibility would be to clone a selection of all your inputs when the page is loaded (and their data as well).
You could then do a compare as described here:
http://chris-barr.com/entry/comparing_jquery_objects/
A word of warning though: this may be costly, performance-wise.
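A lighter-weight variation on the same idea: instead of cloning elements, snapshot each form's serialized state on load and re-serialize on unload, which avoids most of the cost of comparing cloned elements:

    var snapshot;

    // Capture the combined state of every form once the page has loaded.
    $(function () {
      snapshot = $('form').serialize();
    });

    window.onbeforeunload = function () {
      // Any difference from the snapshot means there are unsaved changes.
      if ($('form').serialize() !== snapshot) {
        return 'You have unsaved changes.';
      }
    };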
