Get available languages in ASP.NET

I am trying to get the languages installed on visitors' PCs.
The problem is that I don't want to get the languages from the web browser.
Any suggestions, please?

The only (standard) way is to look at the 'Accept-Language' HTTP header; see the standard. It would be a security hole if you could get access to more information than that without asking permission.
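In classic ASP.NET (System.Web) that header is exposed directly on the request object, so a minimal sketch for reading it looks like this (the "en-US" fallback is just an illustrative default):

    // Accept-Language values as sent by the browser, e.g. "en-GB", "fr;q=0.8".
    // Request is the current System.Web.HttpRequest (available in a page or controller).
    string[] languages = Request.UserLanguages ?? new string[0];
    string preferred = languages.Length > 0 ? languages[0].Split(';')[0] : "en-US";

Note this still only tells you what the browser advertises, not which languages are actually installed on the visitor's machine.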
You could run some ActiveX component to inspect the users' computers, but you'd have to get their permission first, and I suspect that would just put people off using your website. It would also only work on Windows. I wouldn't recommend doing this.
Of course, you can always ask your users to tell you via some settings page. If changing this setting would help them to use your site, they would probably not mind doing that.
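If you do add such a settings page, a hedged sketch of wiring the stored choice into classic ASP.NET might look like the following; the "lang" cookie name is an assumption, not a required pattern:

    // Global.asax.cs - apply the user's chosen culture on every request, if one is stored.
    using System;
    using System.Globalization;
    using System.Threading;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // A hypothetical settings page writes a "lang" cookie such as "fr-FR".
            HttpCookie cookie = Request.Cookies["lang"];
            if (cookie == null || String.IsNullOrEmpty(cookie.Value))
                return; // No stored preference: keep the default / Accept-Language behaviour.

            try
            {
                CultureInfo culture = CultureInfo.CreateSpecificCulture(cookie.Value);
                Thread.CurrentThread.CurrentCulture = culture;   // dates, numbers, currency
                Thread.CurrentThread.CurrentUICulture = culture; // resource (.resx) lookups
            }
            catch (CultureNotFoundException)
            {
                // Ignore malformed cookie values and keep the server default.
            }
        }
    }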

Related

Protect Website Against Piracy

I have a membership website where I sell video content, but I have found out that users are downloading it. I tried Amazon with CloudFront and a firewall, and have now moved to Vimeo Pro, but users are still able to download the content using various extensions for Chrome or Firefox.
Is there a way that the website can detect such extensions and prevent the user from accessing the website? Maybe an overlay with a message would do the trick.
The website is in WordPress, so any plugin or code would be highly appreciated.
Thanks for your help!
The simple answer is that there is really no effective way to stop people downloading your videos, if you want them to be able to actually view them.
You can authenticate users and control access that way but even this does not stop authenticated users copying and sharing the video.
The usual approach is to accept that it will be downloaded and use an encryption mechanism along with a key-exchange mechanism, so that only people with the proper rights can view it; this is what the common DRM systems do.
Even with this, your protection level will depend on what you need to protect. If the video is an entertainment video and you just don't want people viewing it for free, then this is likely a good enough solution for you. If your video contains sensitive information, e.g. company data, that you don't want anyone to see at all, then even DRM won't stop someone simply pointing a camera at the screen and getting a copy (albeit a low-quality one).
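As an illustration of the "only people with the proper rights get a working link" half of that approach (full DRM additionally encrypts the media and exchanges decryption keys), here is a hedged sketch of issuing short-lived, HMAC-signed video URLs. It is written in C# for concreteness; the same idea translates to PHP on a WordPress site, and the method name and query-string format are purely illustrative, not any particular product's API:

    // Sketch: give an authenticated user a video link that expires and cannot be tampered with.
    using System;
    using System.Security.Cryptography;
    using System.Text;

    static class VideoLinks
    {
        public static string SignVideoUrl(string baseUrl, string videoId, string secret, TimeSpan lifetime)
        {
            long expires = DateTimeOffset.UtcNow.Add(lifetime).ToUnixTimeSeconds();
            string payload = videoId + "|" + expires;

            using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(secret)))
            {
                byte[] hash = hmac.ComputeHash(Encoding.UTF8.GetBytes(payload));
                string sig = Uri.EscapeDataString(Convert.ToBase64String(hash));
                return string.Format("{0}/{1}?expires={2}&sig={3}",
                                     baseUrl.TrimEnd('/'), videoId, expires, sig);
            }
        }
    }

The video endpoint recomputes the HMAC for the video ID and expiry, and rejects expired or mismatching links. That deters hot-linking and casual link sharing, but, as said above, it cannot stop a legitimate viewer from capturing the stream.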

My iframe won't open?

I can't open the following page in an iframe. Can someone work out why, and if you know the fix, please send me the code? https://www.mcjukebox.net/client?server=1792
This is the website.
I'm the lead developer of MCJukebox. We intentionally block all embedding of the client, because embedding would let users place the client on their own site without official support. That could cause issues which are hard for us to work on, and we also prefer users coming onto our domain, as the project is free and we need a way of attracting new users.
Feel free to email us if you have any more issues.
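For context, the usual way a site blocks embedding (whether or not this is exactly how MCJukebox does it) is by sending anti-framing response headers, which make the browser refuse to render the page inside an iframe:

    X-Frame-Options: DENY
    Content-Security-Policy: frame-ancestors 'none'

If the page you are trying to frame sends either of these, no code on your side will make the iframe load; linking to the page directly is the only option.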

Google Analytics shows me weird links for one of my visitors

I have a website which is registered with Google Analytics so I can see its statistics. The problem is that sometimes it shows me this link:
website.com/www.bndv521.cf/
or:
website.com/admin
I do not know if this is a hacker trying to hack me or something, but I assume nobody tries to access my admin page with good intentions.
Can you help me work out what this link refers to?
Consider checking for malicious code included in your pages. And yes, it's likely that someone is trying to access those pages, but the request may not do anything because it's an invalid path. You should consider blocking such IP addresses after checking your logs.
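If you do decide to block addresses and your site happens to run on IIS, a hedged web.config sketch looks like this (it requires the IIS "IP and Domain Restrictions" feature, and the address shown is a documentation-range placeholder to be replaced with the IPs from your logs; on Apache the equivalent is a Deny/Require rule in .htaccess):

    <system.webServer>
      <security>
        <ipSecurity allowUnlisted="true">
          <!-- Placeholder address: substitute the offending IPs from your logs. -->
          <add ipAddress="203.0.113.45" allowed="false" />
        </ipSecurity>
      </security>
    </system.webServer>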
Although trying to reach an admin page seems like a suspicious action, on our website we come across this issue roughly once in every ten thousand requests.
We think that a browser extension or a virus-like program tries to change the URL, or to append this keyword to it, not for hacking purposes but to redirect visitors to their advertising website.
Very similar issue here: Weird characters in URL

How to completely hide a website from search engines?

What's the best recommended way to hide my staging website from search engines? I Googled it and found that some say I should add a meta tag, and some say I should put a text file inside my website's directory. I want to know the standard way.
My current website is in ASP.NET, though I believe there must be a common way for any website, whatever its programming language.
Use a robots.txt file.
See here: http://www.robotstxt.org/robotstxt.html
You could also use your server's robots.txt:
User-agent: *
Disallow: /
Google's crawler actually respects these settings.
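The meta tag that the question mentions is the other standard option; it goes in the <head> of every page you want kept out of the index:

    <meta name="robots" content="noindex, nofollow">

robots.txt asks well-behaved crawlers not to fetch the pages at all, while the meta tag lets them fetch a page but asks them not to index it. For a staging site either will do, but neither is real protection, which is the point of the next answer.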
Really easy answer: password-protect it. If it's a staging site then it quite likely is not intended to be publicly facing (private audience only, most likely). Trying to keep it out of search engines is only treating a symptom when the real problem is that you haven't appropriately secured it.
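Since the question says the site is ASP.NET, a minimal web.config sketch for putting the whole staging site behind a login (Forms authentication here; Basic or Windows authentication at the IIS level works just as well, and the login page path is a placeholder) would be:

    <system.web>
      <authentication mode="Forms">
        <forms loginUrl="~/Login.aspx" />
      </authentication>
      <authorization>
        <!-- "?" means anonymous users: everyone must log in. -->
        <deny users="?" />
      </authorization>
    </system.web>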
Keep in mind that you can't hide a public-facing unprotected web site from a search engine. You can ask that bots not index it (through the robots.txt that my fine colleagues have brought up), and the people who write the bots may choose not to index your site based on that, but there's got to be at least one guy out there who is indexing all the things people ask him not to index. At the very least one.
If keeping automated crawlers out is a big requirement, some kind of CAPTCHA solution might work for you.
http://www.robotstxt.org/robotstxt.html
There are search engines / bookmarking services which do not respect robots.txt. If you really don't want the site to turn up, ever, I'd suggest requiring a CAPTCHA just to navigate to it.
What's the best recommended way to hide my staging website from search engines
Simple: don't make it public. If that doesn't work, then only make it public long enough to validate that it is ready to post live and then take it down.
However, all that said, a more fundamental question is, "Why care?". If the staging site is really supposed to be the live site one step before pushing live, then it shouldn't matter if it is indexed.

What does it mean when I see some IPs look at hundreds of pages on my website?

What should I do when I see some IP in my logs scrolling through hundreds of pages on my site? I have a WordPress blog, and it seems like this isn't a real person. This happens almost daily with different IPs.
UPDATE: Oh, I forgot to mention, I'm pretty sure it's not a search engine spider. The hostname is not a search engine's, but some random host from India (it ends in '.in').
What I am concerned with is: if it is a scraper, is there anything I can do? Or could it possibly be something worse than a scraper, e.g. a hacker?
It's a spider/crawler. Search engines use these to compile their listings, researchers use them to figure out the structure of the internet, the Internet Archive uses them to download the contents of the Internet for future generations, spammers use them to search for e-mail addresses, and many more such situations.
Checking out the user agent string in your logs may give you more information on what they're doing. Well-behaved bots will generally indicate who/what they are - Google's search bots, for example, are called Googlebot.
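Bear in mind that the user-agent string can be faked. The check Google documents is a reverse DNS lookup on the IP followed by a forward lookup to confirm the hostname really belongs to googlebot.com or google.com; a hedged sketch (in C#, for concreteness; the helper name is illustrative):

    // Sketch: verify that an IP claiming to be Googlebot really resolves to Google.
    using System;
    using System.Linq;
    using System.Net;
    using System.Net.Sockets;

    static class BotCheck
    {
        public static bool LooksLikeGooglebot(string ipAddress)
        {
            try
            {
                string host = Dns.GetHostEntry(ipAddress).HostName;
                if (!host.EndsWith(".googlebot.com", StringComparison.OrdinalIgnoreCase) &&
                    !host.EndsWith(".google.com", StringComparison.OrdinalIgnoreCase))
                    return false;

                // Forward-confirm: the hostname must resolve back to the original IP.
                return Dns.GetHostAddresses(host).Any(a => a.ToString() == ipAddress);
            }
            catch (SocketException)
            {
                return false; // No reverse DNS record at all.
            }
        }
    }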
If you're concerned about script kiddies, I suggest checking your error logs. The scripts often look for things you may not have; e.g. on one system I run, I don't have ASP, so I can tell when a script kiddie has probed the site because I see lots of attempts to find ASP pages in my error logs.
Probably some script kiddie looking to take advantage of an exploit in your blog (or server). That, or some web crawler.
It's probably a spider/bot indexing your site. The "User-Agent" header might give it away. It's easy to end up with hundreds of GET requests for a dynamically generated WordPress site if you count not just blog pages but also things like CSS, JS, and images.
