ionCube HTML encoder issue

Using the encoder, it displays this message:
<!--
Page protected by ionCube - HTML/JavaScript Encoder
Copyright (c) 2003 RWJD.Com and ionCube Ltd. All Rights Reserved.
Any analysis of this source code, embedded data or file by any means and by
any entity whether human or otherwise to including but without limitation to
discover details of internal operation, to reverse engineer, to de-compile
object code, or to modify for the purposes of modifying behavior or scope of
their usage is forbidden.
-->
Is it possible to change that message to include a policy, i.e. to edit that comment? It's really annoying because it needs to be changed to the notice my lawyer wrote.

This might be better asked at the ionCube helpdesk, but that said, I can advise as I am associated with ionCube. Modifying the paragraph below the copyright message would be permitted in this case; however, as the obfuscation is generated dynamically at runtime and the script does not have a feature for providing a replacement legal notice, this isn't entirely trivial. You could try installing an output handler after the obfuscator is included so as to catch the output it produces. Your handler could then do a search and replace on that output.
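As a rough sketch of that idea (untested, and assuming the obfuscator is a PHP script pulled in with include/require; the file name and the replacement text below are placeholders):

```php
<?php
// Buffer everything the obfuscator prints, then rewrite the stock legal
// notice before the page is sent to the browser.
ob_start();

require 'html_encoder.php'; // placeholder: however you currently include the obfuscator

$page = ob_get_clean();

// The needle should match the full paragraph the encoder emits below the
// copyright line; it is shortened here for readability.
$stockNotice = 'Any analysis of this source code, embedded data or file';
$myNotice    = 'Replacement notice supplied by your lawyer.';

echo str_replace($stockNotice, $myNotice, $page);
```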
Please be aware of the hopefully obvious limitations of client-side protection and of trying to hide HTML and JavaScript in this way. The obfuscator, which was donated by an early ionCube customer, is quite clever (and a free example of an encoded file), and a decade ago it was also very effective. As browsers evolved, plugins and then native features for DOM browsing became standard, making it easy to regenerate the HTML from the DOM, i.e. after any deobfuscation has run.
While most visitors to a site would not know how to recreate the HTML in this way, it's equally the case that they're not going to care about doing so; most visitors just want a site to work, be responsive, and not have navigation and other features broken by tricks such as disabling right-click. Those who do want to steal HTML will know how to do so via DOM inspection; they will also know how to pull data from browser caches, and will take screenshots if necessary to steal images rather than needing right-click "save image", and so on. There are techniques to defeat some of those approaches, but at a cost to every visitor that's probably not worth it. Also note that obfuscation in this way should ensure exclusion of the pages from Google, just in case that matters.
That said, and noting IANAL, protecting the HTML with the obfuscator may have benefits from a legal perspective: the HTML cannot, in general, be discovered unwittingly or by mistake, so if someone did steal content, they would have had to try, albeit perhaps not that hard.
Hope this helps!

Related

Protecting web content

I am writing a website that will be publishing content with high IP value, and we would like people to pay for it. I know there are limitations to how consistently JavaScript + Flash + HTML can prevent screen capture.
I have discovered ArtistScope, which seems to make it impossible to do anything of that nature. I am happy to inconvenience users as they view my webpage in order to lock it down.
Does anyone have any experience with this framework? I understand all users will have to install a plugin that some antivirus software has flagged, and I'll just need to add some markup to the article page.
Does anyone know anything about the ArtistScope solution, what is involved in implementing it, or how well it works?
If it's only a few users, I'm guessing you'll require registration? If so, you could use legal copyright to protect the intellectual property: use Creative Commons, trademark your site name, and use registered post to send the content to yourself before you publish it online; that way you can prove in court that it was plagiarized and that you held the copyright first. This sort of reminds me of this article: http://thedailywtf.com/Articles/Lock-and-Key-.aspx and maybe your site should be carved in stone, kept in a safe, or presented as a Magic Eye. As Greg mentioned, there is no bulletproof solution; people like us will come along, write auto-OCR readers to scan your site, and get someone else to run the app. If you had a legal notice, I'd at least think twice.
Edit: maybe you could even get creative with CAPTCHAs too, to deter people (when you detect copyright infringement); here's an idea or two: Is there an efficient algorithm for segmentation of handwritten text?
I have also used ArtistScope's solutions, but when it comes to screenshots, they are not enough.
I've just written a post about protection against screenshots and other content-grabbing methods like the snipping tool. I'll update it soon with other protection methods that I followed on my blog.
Here's a general description of my approach:
It only works for restricted content; content that needs registration to view it.
It requires continuous monitoring by the administrator because...
It detects the Print Screen key and sends you an email with the username and other details of the person who has already captured your content (if you are aware of any method that bans the user automatically, I'd be glad to hear it).
It covers your content with an overlay if the user tries to capture it while outside the browser window (a rough sketch of both behaviors follows below).
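As a rough illustration of those two behaviors (this is not the blog author's actual code; the overlay element, reporting endpoint, and username source below are made-up placeholders):

```javascript
// Assumes the protected page contains an element with id="content-overlay"
// styled to cover the content, and a server route that records capture attempts.
var overlay = document.getElementById('content-overlay');
var username = document.body.getAttribute('data-username') || 'unknown'; // placeholder

// Cover the content whenever the window loses focus, e.g. when the user
// switches to a snipping tool outside the browser.
window.addEventListener('blur', function () { overlay.style.display = 'block'; });
window.addEventListener('focus', function () { overlay.style.display = 'none'; });

// Report use of the Print Screen key; most browsers only expose it on keyup.
document.addEventListener('keyup', function (e) {
  if (e.key === 'PrintScreen') {
    navigator.sendBeacon('/report-capture', JSON.stringify({ user: username }));
  }
});
```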

Is it OK if everything looks fine but the X/HTML and CSS are not valid, for a CMS's admin/control panel?

Is it OK if everything looks fine but the HTML and CSS are not valid, for a CMS admin/control panel?
Should we only consider web standards for the site itself, and not for site-management tools?
For example:
http://example.com/wp-admin
http://example.com/admin/
Well, the point of standards compliance is to make everything work correctly for every user. Even though admin areas are only accessible to a few select users per site, if you are building a CMS you have to consider that many, many people might use your script, which adds further users who will need access to those admin panels. It's best to make everything standards-compliant; that's why standards exist. If an admin can't get the admin panel working properly, they'll ditch the script.
I agree it may not be worthwhile to make everything valid. As long as you've done your testing and it's working, then it's probably not worth the time.
Some validation errors matter, some don't. The spec is a bit ridiculous in its requirements in places. What is important is that all open tags are closed and nested correctly, because if this is not done there can be many subtle errors, both now and in the future. As for non-encoded entities, or using rels and targets when you shouldn't: it doesn't matter so much.
Since you can (usually) control access to the admin area (rather than having every single device and platform access it, as with the public site), it certainly matters less. The time would probably be better spent adding more features and fixing real bugs than aiming for 100% compliance. Don't tell any standards advocates I said that, though. :)

Is it worth the time to care about users who have Javascript disabled in their browsers? [duplicate]

Possible Duplicate:
Is it worth it to code different functionality for users with javascript disabled?
I've just learned in this question that an ASP.NET WebForms application will have issues running properly in a browser with JavaScript disabled (unless I avoid certain controls and features).
So this is a kind of follow-up question. For my current web application, which will only have a small number of users, I can simply require that JavaScript be enabled.
But how should this question be dealt with in general? As far as I can see (with my rather limited knowledge of web development so far), JavaScript is everywhere, especially the more "dynamic" or RIA-like a web site is.
Is it worth it at all, during web app development, to cater to the few users who disable JavaScript in their browsers?
Are they really "few"? (I actually have no clue; I just don't know anyone who has JavaScript disabled.)
I'm a bit inclined to tell them: "If you want a nice interactive page, don't disable JavaScript," or "You cannot enter this website without JavaScript. Go away!" Because I don't want to double the coding and development time for some mavericks (if they are mavericks). Does it indeed add a lot more time to get a website working both with and without JavaScript?
For what reason does someone disable JavaScript in their browser at all? Well, I've heard: "Security!" How insecure is JavaScript actually? And does that mean, in the end, that millions of pages are insecure and almost every website I use is a risk for me? Are there other reasons besides security?
Are there any branches or environments where it is usual or even mandatory to disable JavaScript for security (or other) reasons? (For instance, I have security services, offices in defense ministries, or banks in mind.)
Are there any trends in development toward adding more JavaScript to web sites, making it more and more necessary to leave JavaScript enabled in the browser? Or is there rather a counter-movement?
Thank you for some feedback in advance!
Whether or not to care about users who turn off JavaScript should be decided on a case-by-case basis. If you believe that it is OK to turn away users who do not have it enabled, then that is a decision you can make, and one made by many apps.
Keep in mind that it is not necessarily a conscious decision to have JavaScript disabled or available only at a limited capacity. Screen readers, for example, have a very stripped-down version of JavaScript, and a site that relies on it throughout will often be inaccessible. Depending on the website, this may actually be illegal.
If a website is properly constructed with progressive enhancement from the beginning, then making it work without JavaScript should not be too much additional work. Therein lies one of the major issues with WebForms: it is difficult to gain control over the markup, and the JavaScript tends to be very tightly coupled to it.
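A tiny example of what progressive enhancement looks like in practice (the URLs and element ids are placeholders): the link works as a normal page load without JavaScript, and script, where available, upgrades it in place:

```html
<div id="messages"><!-- server-rendered list of messages --></div>
<a href="/messages?page=2" id="next-page">Next page</a>

<script>
  // With JavaScript available, fetch the next page of messages in place
  // instead of doing a full page reload; without it, the link still works
  // as an ordinary navigation.
  document.getElementById('next-page').addEventListener('click', function (e) {
    e.preventDefault();
    fetch(this.href)
      .then(function (res) { return res.text(); })
      .then(function (html) {
        document.getElementById('messages').insertAdjacentHTML('beforeend', html);
      });
  });
</script>
```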
My personal view is that the number of people who disable JavaScript so completely that .NET web sites stop functioning is very small. For some government sites I have been responsible for, I don't recall getting any complaints from "non-JavaScript" users.
My concern would be more about making sure your site is XHTML compliant, with valid markup (which earlier versions of Visual Studio did not generate), valid CSS, and intelligent use of JavaScript.
Having a disclaimer somewhere on your site, accessible to those few with JavaScript disabled, telling people that JavaScript is required for the site to function correctly would be a good thing.
Depends on your audience. For one thing, if the site is completely nonfunctional without JavaScript, accessibility (e.g. to those who must use a screen reader) may be an issue, so if you expect any blind users, you might need to consider that.
In most situations, I'd say, you're probably fine just using <noscript> tags to drop in a quick disclaimer along the lines of "JavaScript is required to use this web site."
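For instance, something as small as this does the job (the wording is only an example):

```html
<noscript>
  <p>JavaScript is required to use this web site. Please enable it in your
     browser and reload the page.</p>
</noscript>
```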
While I have no solid numbers -- and I presume I'm in the distinct minority -- I, and many of the more savvy users I know, disable JavaScript by default (a la NoScript). I enable it on websites on a case by case basis. Most novice users (I'm ignoring the 25% in the upper/middle of experience) don't even know what "JavaScript" means.
As a developer, I see the cost/benefit of supporting JS-less users as boiling down to one question:
Do your users need the site more, or do you need your users more?
One of my current projects makes heavy use of JavaScript and Flash and does not function at all without it. But as it's installed at the employer's site and the visitors are the employees using it for their job, that requirement is completely reasonable.
However, if I were working on a revenue-generating site where losing users meant losing money, I'd seriously think about crafting the site to work w/o JS -- albeit less quickly with more page reloads. (Perhaps by asking advice here.)
As a random thought -- you could probably devise a special method to determine how many of your current users do and don't have JS enabled.
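One simple way to measure that (the /stats/beacon URL is a made-up placeholder for any request your server logs) is to fire one request from script and a different one from a <noscript> fallback, then compare the two counts in your access logs:

```html
<script>
  // Requested only by visitors with JavaScript enabled.
  (new Image()).src = '/stats/beacon?js=1';
</script>
<noscript>
  <!-- Requested only by visitors without JavaScript (or with it blocked). -->
  <img src="/stats/beacon?js=0" alt="" width="1" height="1">
</noscript>
```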

What might my user have installed that's going to break my web app?

There are probably thousands of applications out there like 'Google Web Accelerator', and all kinds of popup blockers. Then there are header-blocking personal firewalls, full site blockers, and paranoid cookie monsters.
Fortunately Web Accelerator is now defunct (I suggest you read the above article; it's actually quite funny what issues it caused), but there are so many other plugins and third-party apps out there that it's impossible to test them all with your app until it's out in the wild.
What I'm looking for is advice on the most important things to remember when writing a web app (whatever the technology) with respect to ensuring that the user's environment isn't going to break it. Kind of like a checklist.
What's the craziest thing you've experienced?
P.S. I may have linked to net-nanny above, but I'm not trying to make a porn site.
The best advice I can give is to program defensively. For example, don't assume that all of your scripts have been loaded. I've seen cases where AdBlocker Plus will block 1/10 scripts included in a page just because one has the word "ad" in its name or path. While you can work around this by renaming the file, it's still good to check that a particular object exists before using it.
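Something along these lines, where AdTracker is a made-up stand-in for whatever global the blocked script would normally define:

```javascript
// Only call into the third-party script if it actually loaded; if an ad
// blocker stripped it out, degrade quietly instead of throwing an error.
if (typeof window.AdTracker !== 'undefined') {
  window.AdTracker.recordPageView();
}
```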
The weirdest thing I've seen wasn't so much a browser plugin as a firewall/proxy configuration at a user's workplace. They were using a Squid proxy that tried to remove ads by replacing any image HTTP request it thought was an ad with a single-pixel GIF. Unfortunately it did this for non-GIF images too, so when our iPhone application was expecting a PNG image and got a GIF, it would crash.
Internet Explorer 6. :)
No, but seriously: Firefox plugins like NoScript and Greasemonkey, for one, though those users are likely to be a very small minority.
Sometimes the user's environment means a screen reader (or even a braille interface like this). If your layout is in any way critical to the content being delivered as intended, you've got a problem right there.
Web pages break; that's a fact of life. The closer you have kept your coding and design to the standards, the less it is your fault.
Something I have checked in the past is loading some of the more popular toolbars that people tend to install (Google, Yahoo, MSN, etc.) and seeing how they affect the user's experience.
To a certain extent it is difficult to preempt which of the products you mentioned will be used by your users, since there are so many. I would say your best bet is to test against the most common products your user base may employ and roll with the punches for the rest. If you have the time to test other possible scenarios, by all means do.
Also, making it easy for your users to report possible issues helps lessen the time it takes to get a fix in place, should it be something you can work around.

Is it worth the development time to output valid HTML?

Developing websites is time-consuming. To improve productivity, I would code a prototype to show to our clients. I don't worry about making the prototype conform to the standard. Most of the time, our clients would approve the prototype and give an unreasonable deadline. I usually end up using the prototype in production (hey, the prototype works; no need to make my job harder).
I could refactor the code to output valid HTML. But is it worth the effort to output valid HTML?
It is only worth the effort if it gives you a practical benefit. Sticking to standards might make it easier to build a website that works across most browsers. Then again, if you're happy with how a website displays on the browsers you care about (maybe one, maybe all), then going through hoops to make it pass validation is a waste of time.
Also, the difference in SEO between an all-valid html website and a mostly-valid html website is negligible.
So always look for the practical benefit, there are some in some situations, but don't do it just for the sake of it.
Yes. It's hard enough trying to deal with how different browsers will render valid HTML, never mind trying to predict what they'll do with invalid code. Same goes for search engines - enough problems in the HTML may lead to the site not being indexed properly or at all.
I guess the real answer is "it depends on what is invalid about the HTML". If the invalid parts relate to accessibility issues, you might even find your customer has legal problems if they use the site on a commercial basis.
Probably not if you have a non-complying site to begin with and are short on time.
However, and you won't believe me because I didn't believe others to begin with, but it is easier to make a site compliant from the start - it saves you headaches in terms of browser compatibility, CSS behaviour and even JavaScript behavior and it is typically less markup to maintain.
Site compliance (at least to Transitional) is pretty easy.
Producing compliant HTML is similar to ensuring that you have no warnings during a compilation: the warnings are there for a reason. You may not realise what that reason is, but ignore the warnings and, before you know where you are, there are so many that you can't spot the one that's relevant to the problem you're trying to fix.
If you use Firefox to view your web pages, you'll get a helpful green tick or red cross in the bottom right-hand corner, quickly showing you whether you've complied or not. Clicking on a red cross will show you all of the places where you goofed.
Some of the warnings/errors may seem a bit pedantic, but fix them and you'll benefit in many ways.
Your page is much more likely to work with a wider range of browsers.
Accessibility compliance will be easier (You'll have 'alt' attributes on your images, for example)
If you choose XHTML as a standard, your markup will be more likely to be useful in an AJAX environment.
Failure to do this results in unpredictability.
One of the biggest problems with web browsers is that they have perpetuated bad habits (And still do, in some cases) by silently correcting certain markup problems, such as failure to close table cells and/or rows. This single fact has resulted in thousands of web pages that are not compliant but 'work', lulling their developers into a false sense of security.
When you consider how many things there are that can go wrong with a website, being lazy when it comes to compliance is just adding more problems to your workload.
EDIT: having read your original post again, I notice that you say you don't bother with compliance when working on a prototype, then you go on to say that you usually use the prototype in production - this means that it's not strictly a prototype, but a candidate.
The normal situation in such circumstances is that once the customer accepts a candidate, no time is allocated for bug fixing or tidying up, thus strengthening the argument for making the markup compliant in the first place.
If you won't be given time later, do it now.
If you are given time later, then you had the time to do it anyway.
If you want your site to be accessible to people with and without disabilities, as well as to external systems, then yes, you should definitely make sure you output valid HTML.
It's easy to test your HTML with automatic validators.
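For example, a short script (a sketch assuming Node 18+ and the W3C Nu checker's public JSON interface; the file name is a placeholder) can post a page to the validator and print whatever it reports:

```javascript
const fs = require('fs');

async function validate(file) {
  const html = fs.readFileSync(file, 'utf8');
  // Send the raw HTML to the Nu checker and ask for a JSON report.
  const res = await fetch('https://validator.w3.org/nu/?out=json', {
    method: 'POST',
    headers: { 'Content-Type': 'text/html; charset=utf-8' },
    body: html,
  });
  const report = await res.json();
  for (const msg of report.messages) {
    console.log(`${msg.type}: ${msg.message}`);
  }
}

validate('index.html'); // placeholder file name
```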
I'll add to what Mike Edwards said about legal ramifications and remind you that you have a moral obligation too :)
Why not write the prototype in valid (X)HTML in the first place? I've never found that to be more of an effort than using invalid HTML. Producing valid XHTML should be a trivial task. (On the other hand, producing semantically meaningful XHTML might be more taxing.)
In short, I see no advantage whatsoever in using invalid HTML for prototypes.
I honestly don't know why it would be extra effort to produce standards-based HTML. It's not as if it's hard, and you should be doing it as a matter of professionalism.
If you paid someone to build you a house and he cut corners out of laziness that you didn't notice at the time, but cracks appeared in your walls ten years later, would you be happy?
Valid HTML just to be able to have a badge on your site - no.
Having "valid HTML" in the sense of "HTML that works on every major browser or browser engine" - yes.
Absolutely. Invalid code can cause all sorts of weird behaviors, and the errors that don't cause problems obscure the ones that do when you get a validation report.
Case in point:
A yellow background was spilling out of a list of messages and over the heading for the next list of messages - but only in Internet Explorer.
Why? The background was applied to a list item, but the person who wrote the page had written it as a single list with a heading in the middle. Headings are not allowed between list items and different browsers attempted to recover from it in different ways. Internet Explorer ended the list item (with the background colour) when it saw the start of the following item (after the heading), while other browsers ended it when they saw the end tag for the first list item.
It was the only validity error on the page, so it took only a couple of minutes to track down the problem and fix it.
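The situation was roughly like the first snippet below (illustrative, not the original page's markup); moving the heading out of the list, as in the second snippet, is both valid and unambiguous:

```html
<!-- Invalid: a heading is not allowed between list items, so browsers
     recover differently and the background can "spill" in some of them. -->
<ul>
  <li class="highlight">Newest message
  <h2>Older messages</h2>
  <li>Another message
</ul>

<!-- Valid: each item is closed and the heading sits between two lists. -->
<ul>
  <li class="highlight">Newest message</li>
</ul>
<h2>Older messages</h2>
<ul>
  <li>Another message</li>
</ul>
```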
Because if you stick to standards, your work will remain compatible in the future. User agents will strive for standards compliance, and their quirks (non-compliance) mode will always be subject to change. This is the way it is supposed to be.
Unless you're into that whole IE8 broken-standards-perpetuation thing that they want to enable by default -- but that's another argument.
WebKit, Gecko, Presto (is that Opera's engine?), and the others will always become more compliant with every release.
If your HTML work only ever renders in an embedded IE browser control, then there's really no reason to output valid HTML as long as it renders.
In my opinion the key criterion is "fit for purpose": if your clients want something for a small or internal market (and don't care if that alienates potential customers who have disabilities or use less common browsers), then that's their choice.
At the same time, I think it's our responsibility as developers to make sure they know the implications of their decisions. Some organisations will be bound by legislative requirements that websites be usable by screen readers, which typically means standards-compliant HTML.
I believe outputting valid HTML won't hurt your development time that much if you've trained yourself to code valid HTML from the start. For one, it's not that hard to know which tags are not allowed within an element, and the required attributes of a tag are sometimes the ones you'd really need anyway; I believe these are the main errors that make your HTML invalid, so why not just learn them as early as now if you plan to stay on the web for long? Plus, outputting valid HTML can help boost your site's ranking.
There are two rules for writing websites:
The site must work for your users.
The site must work for your users.
To meet the first rule, you have to code such that your site renders correctly when using Internet Explorer. Unless you have the freedom to alter your site design to use only those features that IE renders correctly, this means writing invalid HTML.
To meet the second rule, you have to code such that your site renders correctly when using screen-readers and braille screens. Although some newer screen readers can work with IE-targeted sites, in general this means writing valid HTML.
If you're working on a small project, or you're part of a large team, you can code a site that outputs IE-targeted HTML for IE, and valid HTML otherwise. But if you're taking on a medium-to-large project on your own, you have to decide which rule you're going to follow and which one you're going to ignore.
UPDATE:
This is getting voted down by users who think you can always get away with valid HTML in IE. That may be true if you have the flexibility to change your design to get around IE's shortcomings, but if a client has given you a design and you have to get it working, you may have to resort to invalid HTML. It's sad, but it's true, whatever they might think.
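One common way to get the "IE-targeted markup for IE, valid markup for everyone else" split, at least for style-level differences, is IE's conditional comments (a sketch; the file names are placeholders):

```html
<link rel="stylesheet" href="site.css">
<!--[if IE]>
  <link rel="stylesheet" href="site-ie-fixes.css">
<![endif]-->
```

Other browsers treat everything between <!--[if IE]> and <![endif]--> as an ordinary comment, so only IE loads the second stylesheet and the main page can stay valid.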
