DNS authentication for blocked sites - networking

We're looking to set up restrictions so employees won't have access to social media sites on their workstations. However, some workstations need to have access. One method I've tried was a DNS zone, but I'm not sure how to authenticate the users who need this access. Any solution would be greatly appreciated.

What you are trying to achieve is not possible with DNS; what you are looking for is a proxy server. Depending on your environment you can use either Squid (Linux) or Microsoft Internet Security & Acceleration Server (Windows). Each has its own way of restricting access to certain sites based on users and user groups, so you would need to review the documentation for the specific product, but this should give you a starting point and point you in the right direction.
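As a rough illustration, per-group restrictions in Squid are expressed as ACLs in squid.conf. This is a minimal sketch only; the LDAP helper path, base DN, user-list file and domain list are placeholders you would adapt to your environment:

```
# Authenticate all users (helper path and base DN vary by distro/directory)
auth_param basic program /usr/lib/squid/basic_ldap_auth -b "dc=example,dc=com" -h ldap.example.com
acl authenticated proxy_auth REQUIRED

# Hypothetical list of users who are allowed onto social media
acl social_ok proxy_auth "/etc/squid/social-allowed-users.txt"

# The sites to restrict
acl social_sites dstdomain .facebook.com .twitter.com

http_access allow social_sites social_ok
http_access deny social_sites
http_access allow authenticated
```

With rules in this order, only the users listed in the file reach the social media domains, while everyone who authenticates gets general web access.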

Related

How to make my PC work as a host server?

I have an ASP.NET web application that is hosted in IIS on my local machine.
My question is:
Is there any free or paid method that allows this web application to be browsed from the internet, with my PC as the host server?
Thanks
The easiest way is to publish it directly onto the internet. You do run the risk of attackers then being able to attack your machine, so you will need to brush up on your security skills. It might be worth looking into one of the free hosting tiers from AWS, Azure or Google Cloud.
To use your local machine as a web server, first configure it to use a static IP. It's been a while since I've done it on Windows, but this looks about right: http://www.howtogeek.com/howto/19249/how-to-assign-a-static-ip-address-in-xp-vista-or-windows-7/.
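If you prefer the command line, a static address can also be set from an elevated Command Prompt. A sketch only; the interface name, addresses, gateway and DNS server below are placeholders for your own network:

```shell
:: Run in an elevated Command Prompt; adjust the interface name and addresses
netsh interface ip set address name="Ethernet" static 192.168.1.50 255.255.255.0 192.168.1.1
netsh interface ip set dns name="Ethernet" static 8.8.8.8
```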
Next you will need to configure port forwarding on your modem. You want to send all traffic on port 80 to your machine, using its new fixed IP address. If you're using HTTPS as well, configure port 443 to go to your machine. There are too many different modem brands, all of which handle this slightly differently, to offer step-by-step help here; you will need to read up on your particular modem.
If your internet connection is using a fixed IP, then you can stop here.
If not, or if you just want a domain name, then it's worth signing up for a dynamic DNS service. I use No-IP; it's free, it integrates with my modem and I haven't had any problems with it in the last few years. Once this is in place, you will be able to hit your web server just like a real one, using something like "http://mypc.no-ip.biz/mydemoapp/".
But again, be warned about exposing your machine on the internet. There are nasty people out there who love to hijack other people's computers.
Update:
This should give you some guidance on port forwarding
http://www.howtogeek.com/66214/how-to-forward-ports-on-your-router/
Try http://www.noip.com. I just logged in and it seemed happy. Otherwise, click through all the settings in your modem looking for DDNS or dynamic DNS. There is usually a drop-down of all the providers it will talk to. Some providers also have apps that you run on your PC, which is easier than working with the modem for some people (or for modems that don't support DDNS).

What is the best solution to prevent malicious IPs from accessing my hosting server?

Just to explain my setup: I have a few websites hosted on a shared server (Lunarpages) and I use Google Apps (with modified MX records in Lunarpages) so the Google Apps emails work.
Now, I've noticed occasionally that a mail script on one of my sites gets triggered without any content, though it includes the IP information that the form collects. I looked up a couple of those IP addresses with AbuseIPDB, and they are known hacking IPs. So I want a good way to block all access to my server from known bad IPs.
I see an option in cPanel in Lunarpages to turn on CloudFlare for security, and looking into it a little, it does appear that they block bad IPs. But I'm a little concerned about whether that would risk breaking how my site works, how email works, or how my analytics and email forms collect IP address information, and whether there would be anything for me to do besides just turning it on and having the bad IPs blocked. I'm not looking to get myself into a lot of troubleshooting.
Is CloudFlare a good solution, or are there other good alternatives?
Regarding AbuseIPDB, they look like they have an API that I might be able to use to block IPs, but if I understand right, I would have to modify all my sites, and that still wouldn't block direct access to a lot of files. Unless I'm mistaken.
You can use ipset to block a list of IP addresses efficiently, and you can populate the set from a spam or abuse IP database.
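A minimal sketch of the ipset approach, assuming a Linux box where you have root and a blocklist.txt with one IP address per line (the file name and set name are placeholders). Note that on shared hosting like Lunarpages you typically don't have root, so this applies to a VPS or dedicated server:

```shell
# Create a hash set for bad IPs and load it from the blocklist file
ipset create blacklist hash:ip
while read -r ip; do
    ipset add blacklist "$ip"
done < blocklist.txt

# One iptables rule then drops anything coming from an address in the set
iptables -I INPUT -m set --match-set blacklist src -j DROP
```

The advantage over individual iptables rules is that the set is a hash lookup, so it stays fast even with tens of thousands of entries.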

Subdomains. How do you do development with subdomains?

I am currently building a web app which also utilizes WebSockets (Rails for the web server and Node.js for socket.io).
I have structured my application to use subdomains to separate between connection to the Nodejs server and the Rails webserver. I have "socket.mysite.com" redirected to the Node server and everything else to the webserver.
I am able to test this functionality on localhost. I simply modified my /etc/hosts to include the following:
127.0.0.1 socket.mysite.com
127.0.0.1 mysite.com
I know that in production I simply have to create a CNAME record for socket.mysite.com, and this will also work on my users' computers.
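For reference, in a BIND-style zone file for mysite.com that record is a single line (this assumes the Node server runs on the same host as the main site; otherwise point the CNAME at the Node host's name instead):

```
socket   IN   CNAME   mysite.com.
```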
However, I am accustomed to testing my application by passing an IP address around. My team typically sets up the server on our own machines for development. When we want to test our individual servers, we just pass around an IP like "http://123.45.123.45".
With the new subdomain scheme, this is no longer possible without modifying each of my testers' /etc/hosts files. I honestly don't expect my testers to modify their /etc/hosts on the spot. What I could do is have each member of my team use their own domain and create the appropriate CNAME records for each individual team member.
Is there an easier way to allow me to run my app on an IP and just pass that IP around?
It sounds like your needs have scaled beyond the days of simply editing a hosts file. While you could continue to have everyone on your team edit hosts files, there are two main risks that I see here:
With your idea of just using IP addresses, you risk missing something in testing that you wouldn't see until production, as the issue may depend on something in the domain configuration.
With hosts entries, you introduce a lot of complexity and unnecessary changes to each developer's and tester's configuration, which of course leaves the door open for mistakes, and it also takes time that adds up over the long term.
Setting up a DNS server may be helpful in your case. You could map a set of domains for each developer that match a certain pattern so that your application still runs correctly. This would allow you to share the URLs without having to constantly reconfigure each person's computer. Additionally, marketing and sales stakeholders can easily view product demos as well, without needing to learn what the elusive hosts file is for.
If you have an IT department, they can help you set up the DNS. However, if you are a small team without a real IT department, some teams have found success using DNS servers designed for home or small office networks.
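For example, dnsmasq (a lightweight DNS server commonly used on small networks) can map an entire per-developer domain to one machine with a single line; the domain and IP address below are placeholders:

```
# /etc/dnsmasq.conf: resolve dev1.mysite.test and every subdomain of it
# (including socket.dev1.mysite.test) to one developer's machine
address=/dev1.mysite.test/123.45.123.45
```

Point the team's machines (or the office router) at this DNS server and the subdomain scheme works without anyone touching /etc/hosts.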

Internal access to company website in dmz blocked as best practice

My client's network security person is setting up their new website in a DMZ for security. This makes total sense to me. However, she proceeded to say that it's a best practice that company employees not be able to access the site internally. For example, to check whether the site is up, she suggested they use their phones.
Is this a new thing? Does it even make sense? I've never heard of not allowing company employees to access the company website over their internal network before. I'm not a security person, I'm a developer, so if this is right on the money please let me know, it just seemed unusual to me.
Is this a best practice that companies are implementing now? Is it the advised way to go?
Any information is greatly appreciated. I'm just confused and a little stunned.
Thanks!
They should be blocking the Windows domain, directory services and unused ports from the inside network, but should allow the necessary web ports for management. The purpose of the DMZ is to protect your internal network from the public server, not the other way around. You shouldn't have to tell the network security person that the risk is too low to justify the extra costs associated with monitoring the server from the outside. Anyone with experience in network security will know that allowing internal access is standard practice. If not, take it to management and tell them they either need to pay for another internet connection to monitor your servers, or have the security person make one access-list change in the firewall.
A machine in DMZ should not be able to 'connect in' to any machine in your internal network. Machines from your internal network can always connect to the machines in DMZ.
Generally employees have access to the websites (and other services) running in the DMZ, so there is no reason why you should be restricting employees from connecting to your own DMZ machine.
So to answer your question:
Is this a best practice that companies are implementing now? No
Is it the advised way to go?
This doesn't make you any more secure than you already are.
If the rationale behind this restriction is to prevent possible infection of internal machines by malware distributed through your own website, then how is that any more secure than getting infected by malware distributed by a random website?

Setting up 2 factor authentication

We are in the process of building a new website which we want to lock down so that only specific computers are allowed access; once the PC is authenticated, we will do our own built-in user authentication.
Also, when a PC is known, we don't really want anything on the PC which can be easily transferred (by the client) onto another PC in order to gain access to the website.
Please can anyone give us an idea of the best way to achieve this 'lock down'? We don't really want to go down the AD route and have loads of extra user data to maintain.
Thanks in advance.
Richard
IP and MAC addresses are trivial to spoof. Without Trusted Computing, there is nothing you can really trust to authenticate a PC. What you need to figure out is what you can do that gets you an acceptable level of trust. Here's what we have done with our "locked" tokens: they take some info from the PC, hash it, and send that hash to the auth server. Any request for an OTP then needs to be accompanied by that hash. It's not perfect, but it also handles mutual HTTPS authentication, so it thwarts network-based MITM attacks too. If the token is stolen, the attacker must also know what info to spoof, and spoof it. Again, it's not perfect, but better than nothing given the current state of PC security. See http://www.wikidsystems.com/downloads/token-clients and our SourceForge page: http://sourceforge.net/projects/wikid-twofactor/
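The general idea of hashing machine attributes into a fingerprint can be sketched like this (this is not WiKID's actual scheme; which identifiers to include is your choice, and the MAC address here is hard-coded purely for illustration):

```shell
# Build a fingerprint from stable machine identifiers, then hash it.
# A real client would read the MAC address from the NIC; it is hard-coded here.
machine_info="$(hostname)-00:11:22:33:44:55"
fingerprint=$(printf '%s' "$machine_info" | sha256sum | cut -d' ' -f1)

# The client would send this hash alongside each OTP request so the
# server can check it is talking to the same machine as before.
echo "$fingerprint"
```

The server only ever stores and compares the hash, so the raw machine details never leave the PC in cleartext.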
specific computers on your network?
set some IP restrictions in IIS; this assumes your DHCP server hands out fixed (reserved) addresses to those machines.
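In IIS 7 and later, those restrictions can live in the site's web.config (the IP and Domain Restrictions feature must be installed; the addresses below are placeholders for your allowed workstations):

```xml
<configuration>
  <system.webServer>
    <security>
      <ipSecurity allowUnlisted="false">
        <!-- deny everyone except the listed workstation addresses -->
        <add ipAddress="192.168.1.50" allowed="true" />
        <add ipAddress="192.168.1.51" allowed="true" />
      </ipSecurity>
    </security>
  </system.webServer>
</configuration>
```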
The only way a user could "transfer" the authentication is to take their NIC with them, or clone its MAC address.
Install Helicon Ape (free) and put .htaccess and .htpasswd files in the root of the site you are trying to protect.
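Helicon Ape understands Apache-style directives, so the classic Basic-auth pair looks like this (the path and realm name are placeholders; generate the .htpasswd file with an htpasswd tool):

```
# .htaccess
AuthType Basic
AuthName "Restricted site"
AuthUserFile C:\inetpub\wwwroot\mysite\.htpasswd
Require valid-user
```

Note this authenticates users with a password rather than locking access to specific machines, so it only covers part of the original requirement.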
