FreeRADIUS and WHMCS for VPN: your opinion needed

I am using WHMCS and bought the FreeRADIUS plugin in order to manage a VPN service.
I am trying to assess the pros and cons of using FreeRADIUS vs. a custom solution.
In my opinion, FreeRADIUS brings the following cons:
Huge overhead.
Complex and not well coded: difficult to improve.
In case of a bug, very hard to fix; no maintainability.
Impossible, or complex and expensive, to add functionality or evolve the code.
Centralized authentication system, which will add between 500 ms and 5 s of lag to authentication (client connection).
If the central server is down, no client can connect to any server.
Do you think it is a good idea to choose not to use FreeRADIUS and code my own solution for WHMCS to manage authentication and user management?
I am trying to assess the pros and cons.
Thanks for your opinion.

Don't get too caught up on it adding 500 ms to authentication. That's nothing, and won't be noticeable.
You mention a single point of failure should RADIUS go down. FreeRADIUS works with MySQL as well as a built-in flat-file database (the WHMCS FreeRADIUS module uses the MySQL database anyway), so you can have a master-master configuration for your database, with two RADIUS nodes set up in failover.
I fell in love with DigitalOcean, which makes this even easier with floating IPs, all controlled via an API :-)
FreeRADIUS is very widely used in this industry, so that should say something about its stability.
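To make the failover point concrete, here is a minimal client-side sketch using the pyrad library: it tries the primary RADIUS node and falls back to the second one on timeout. The hostnames, shared secret, and local FreeRADIUS dictionary file are assumptions for illustration, not anything the WHMCS module ships (in practice the NAS itself would do this failover, not a script):

```python
# Minimal failover sketch using pyrad (pip install pyrad).
# Assumes a local copy of the FreeRADIUS "dictionary" file and that both
# RADIUS nodes share the same secret; hostnames and secret are placeholders.
from pyrad.client import Client, Timeout
from pyrad.dictionary import Dictionary
import pyrad.packet

RADIUS_NODES = ["radius1.example.com", "radius2.example.com"]
SECRET = b"testing123"

def authenticate(username, password):
    for host in RADIUS_NODES:
        client = Client(server=host, secret=SECRET, dict=Dictionary("dictionary"))
        client.timeout = 3   # seconds per attempt before falling back
        req = client.CreateAuthPacket(code=pyrad.packet.AccessRequest,
                                      User_Name=username)
        req["User-Password"] = req.PwCrypt(password)
        try:
            reply = client.SendPacket(req)
            return reply.code == pyrad.packet.AccessAccept
        except Timeout:
            continue  # this node is down, try the next one
    raise RuntimeError("no RADIUS node reachable")

if __name__ == "__main__":
    print(authenticate("demo", "demo"))
```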

Related

How to spin up proxies on demand to use for webscraping?

I've been using web scraping services for a bit but I need such a high volume that I can't pay for those services anymore.
I was thinking to build my own proxy pool to mimic these services for myself but didn't know how to do it.
My solutions:
My first thought was to use Kali Linux's proxychains to get the same result, but I don't know if that would work since I can't find the list of proxies in the code.
I also thought I could create a server config with Ansible and use Terraform to spin up cloud servers, either on demand or until they get blacklisted (roughly the idea sketched below), but I'm not sure it would be fast and reliable, or whether cloud providers would block me.
Lastly, I thought about using VMs or containers, since you can give them their own IPs (bridge-networked VMs get their own addresses), so if one gets banned you just delete it and spin up a new one.
Does anyone know which of the three approaches makes the most sense to achieve what I'm looking for?
I don't know what is required to get a pool reliable enough that companies such as ScrapingBee have built businesses offering them for web scraping (just the reliable proxy pool side), because all the free proxy pools online are not very good.
I imagine that if it was public knowledge, those companies wouldn't exist, but I'm sure I'm missing something there.
Thank you.
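To make the "spin up on demand, replace when blacklisted" idea concrete, here is a rough sketch. provision_proxy() and destroy_proxy() are hypothetical placeholders for whatever Terraform or cloud-provider API call actually creates or destroys a server running a proxy; the rest is plain requests usage:

```python
# Sketch of a self-managed rotating proxy pool: use a random proxy, and if it
# looks blocked, destroy it and provision a replacement before retrying.
import random
import requests

def provision_proxy() -> str:
    """Placeholder: create a server running a proxy, return 'http://ip:port'."""
    raise NotImplementedError("wire this to Terraform or your cloud provider's API")

def destroy_proxy(proxy: str) -> None:
    """Placeholder: tear down the server behind this proxy."""

class ProxyPool:
    def __init__(self, size: int = 5):
        self.proxies = [provision_proxy() for _ in range(size)]

    def fetch(self, url: str, retries: int = 3) -> requests.Response:
        proxy = random.choice(self.proxies)
        try:
            resp = requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=10)
            if resp.status_code not in (403, 429):
                return resp
        except requests.RequestException:
            pass
        # Treat errors and 403/429 as "this proxy is burned": replace it and retry.
        self.proxies.remove(proxy)
        destroy_proxy(proxy)
        self.proxies.append(provision_proxy())
        if retries <= 0:
            raise RuntimeError(f"could not fetch {url} through any proxy")
        return self.fetch(url, retries - 1)
```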

How can I get data from a scale into a web application?

*If you think I should ask this question elsewhere, please let me know.
Background:
I need to build an application for converting weights into piece counts. The weights currently come from scales that are connected to PCs via serial ports. I am replacing PC-based applications that connect to the scales via a serial connection. I am considering the feasibility of making the next generation of these applications a web-based solution. However, I do not want to do this if it is not a better solution than building an application that runs on the client. In addition, I do not want to use any sort of browser-specific technology (ActiveX).
FYI, we currently run a Windows based environment.
What I have so far:
I am currently thinking that I will need some sort of client side “service” to allow the scale data to be retrieved by the web application. I have looked into creating a WCF service for this task and have determined that it would probably work. This would require that the scale be connected to some sort of Windows based computer that is on the network. I would then interface the WCF service (running as a Windows Service on the PC) from an ASP.NET web application running on an IIS web server. This would minimize the footprint on the client and allow us to use a web application.
I am looking for any constructive thoughts and ideas. I am open to reviewing any feasible option that would make this solution as simple and reliable as possible.
Answering my own question, as requested.
I discovered two viable options for this purpose. Following are high-level overviews of the techniques we leveraged.
Develop a scale reader to be run on a PC connected to the weigh scale device via an RS-232 connection. This reader will forward any information received from the scale into a database. Combined with technologies like change notifications and server-side push notifications, this option will allow data from a weigh scale to be pushed into a web page with little effort and no additional cost. (This option has performed well during testing but is not yet in production)
Invest in converting weigh scale devices to use ethernet connections and connect them to the network. Use an OPC server with a driver that can connect to the weigh scales you are using to read the data from these devices. Consider KEPWare's offering for this purpose. Use KEPWare's tools to forward this data to a database or wherever it is needed. Once again, you can leverage change notifications and server-side push technologies to push this data into web applications in near real-time without polling. (This option is currently working in a critical, production environment)
The second option is probably better in the long-term, but this may vary based on your specific situation. It has some up front costs and would be better suited to new implementations. For my system, I am using the first option because it will ease the transition between the new and old systems.
Note: I am not in any way associated with KEPWare. I am only suggesting their product because it is the only one I am aware of that supports this functionality. I am sure there are other OPC servers that support this type of device.
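For anyone implementing the first option, the PC-side reader can be very small. Here is a sketch in Python using pyserial, posting each reading to an HTTP endpoint rather than writing to the database directly; the COM port, baud rate, one-number-per-line format, and the /api/weights URL are all assumptions about your particular scale and backend:

```python
# Sketch of option 1: a small reader on the PC attached to the scale that
# forwards each weight to a backend the web application can query.
import requests
import serial   # pip install pyserial

PORT = "COM3"
BAUD = 9600
API_URL = "https://intranet.example.com/api/weights"   # hypothetical endpoint

def run():
    with serial.Serial(PORT, BAUD, timeout=1) as scale:
        while True:
            line = scale.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue  # read timed out with no data
            try:
                weight = float(line)   # assumes the scale emits one numeric value per line
            except ValueError:
                continue               # skip status or garbage lines
            requests.post(API_URL, json={"scale": PORT, "weight": weight}, timeout=5)

if __name__ == "__main__":
    run()
```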

Web browser as web server

Sorry if this is a dumb question that's already been asked, but I don't even know what terms to best search for.
I have a situation where a cloud app would deliver a SPA (single page app) to a client web browser. Multiple clients would connect at once and would all work within the same network. An example would be an app a business uses to work together - all within the same physical space (all on the same network).
A concern is that the internet connection could be spotty. I know I can store the client changes locally and then push them all to the server once the connection is restored. The problem, however, is that some of the clients (display systems) will need to show up-to-date data from other clients (mobile input systems). If the internet goes down for a minute or two it would be unacceptable.
My current line of thinking is that the local network would need some kind of "ThinServer" that all the clients would connect to. This ThinServer would then work as a proxy for the main cloud server. If the internet breaks then the ThinServer would take over the job of syncing data. Since all the clients would be full SPAs the only thing moving around would be the data - so the ThinServer would really just need to sync DB info (it probably wouldn't need to host the full SPA - though, that wouldn't be a bad thing).
However, a full dedicated server is obviously a big hurdle for most companies to setup.
So the question is, is there any kind of tech that would allow a web page to act as a web server? Could a business be instructed to go to thinserver.coolapp.com in a browser on any one of their machines? This "webpage" would then say, "All clients in this network should connect to 192.168.1.74:2000" (which would be the IP:port of the machine running this page). All the clients would then connect to this new "server" and that server would act as a data coordinator if the internet ever went down.
In other words, I really don't like the idea of a complicated server setup. A simple URL to start the service would be all that is needed.
I suppose the only option might be a binary program that would need to be installed? It's not an ideal solution, but perhaps the only one? If so, are there any programs out there that are single-click web servers? I've tried MAMP, LAMP, etc., but all of them are designed for the developer. Any others that are more streamlined?
Thanks for any ideas!
There are a couple of fundamental ways you can approach this. The first is to host a server in a browser as you suggest. Some example projects:
http://www.peer-server.com
https://addons.mozilla.org/en-US/firefox/addon/browser-server/
Another is to use WebRTC peer-to-peer communication to allow the browsers to share information with each other (you could have them all share data, or have one act as a 'master', etc., depending on the architecture you want). It's likely not going to be that different under the skin, but your application design may be better suited to a more 'peer to peer' model or a more 'client server' one depending on what you need. An example 'peer to peer' project:
https://developer.mozilla.org/en-US/docs/Web/Guide/API/WebRTC/Peer-to-peer_communications_with_WebRTC
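To show the shape of that handshake, here is a sketch using the Python aiortc library (in the browser you would use the native WebRTC API, but the offer/answer/data-channel flow is the same). The in-process "signalling" below stands in for whatever channel the peers would really use to exchange session descriptions, e.g. your cloud server:

```python
# Two peers in one process open a WebRTC data channel and exchange a message.
import asyncio
from aiortc import RTCPeerConnection

async def main():
    offerer = RTCPeerConnection()
    answerer = RTCPeerConnection()

    channel = offerer.createDataChannel("sync")

    @channel.on("open")
    def on_open():
        channel.send("hello from the offerer")

    @answerer.on("datachannel")
    def on_datachannel(remote):
        @remote.on("message")
        def on_message(message):
            print("answerer received:", message)

    # In a real app these descriptions travel over your signalling channel.
    await offerer.setLocalDescription(await offerer.createOffer())
    await answerer.setRemoteDescription(offerer.localDescription)
    await answerer.setLocalDescription(await answerer.createAnswer())
    await offerer.setRemoteDescription(answerer.localDescription)

    await asyncio.sleep(2)          # give the channel time to open and deliver
    await offerer.close()
    await answerer.close()

asyncio.run(main())
```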
I have not used any of the above personally but I would say, from using similar browser extension mechanisms in the past, that you need to check the browser requirements before you decide if they can do what you want. The top one above is Chrome based (I believe) and the second one is Firefox. The peer to peer one contains a list of compatible browser functions, but is effectively Firefox and Chrome based also (see the table in the link). If you are in an environment where you can dictate the browser type and plugins etc then this may be ok for you.
The concept is definitely very interesting (peer-to-peer web servers) and it is great if you have the time to explore it. However, if you have an immediate business requirement, a simple on-site server based approach may actually be more reliable, support a wider variety of browsers, and be easier to maintain (as the skills required are quite commonly available).
BTW, I should have said - 'WebRTC' is probably a good search term for you, in answer to the first line of your question.
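If you do go the simple on-site server route, the "ThinServer" in the question does not have to be much. A minimal sketch using only the Python standard library, where display clients GET the latest state and input clients POST updates; the port, route, and JSON shape are arbitrary choices, not part of any existing product:

```python
# Single-file sync endpoint that LAN clients can poll/post to when the
# internet connection is down.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

STATE = {}   # latest data, keyed by whatever the clients send

class SyncHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(STATE).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        update = json.loads(self.rfile.read(length) or b"{}")
        STATE.update(update)          # merge the client's latest values
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    # Clients on the LAN connect to http://<this-machine>:2000/
    HTTPServer(("0.0.0.0", 2000), SyncHandler).serve_forever()
```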
httprelay.io vs. WebRTC
Pros:
Simple to use
Fast
Supported by all browsers and HTTP clients
Can be used on an unstable network
Open source and cross-platform
Cons:
Need to run a server instance
No data streaming is supported (yet)

Sending broadcast with Chrome Extensions

I'm coding an extension for a customer, and one of the requirements is that the extension also works offline, because internet services are not that reliable. My customer's business can't stop, but it can deal with "stale" data; that's a nice tradeoff, I guess.
Therefore, I want to code some kind of distributed cache as an extension, to synchronize local data among the N connected nodes running the same application and then synchronize with the real database, hosted on the internet.
To achieve that, I imagined I would need to send a network broadcast and listen for incoming broadcasts; every node that starts running my application would broadcast its IP address and become available as a new node for the distributed cache. Failover is very important here.
I googled the possibilities I initially thought of, but none of them will work, I guess. The first was to do it just with HTTP; the second was to use Google Native Client to write C++ code that could run network code and thus do the broadcast, but it has limitations. Right now I'm thinking of using Java applets, but I don't really know if they have limitations related to networking, or if Chrome extensions have any limitations with Java applets.
Any ideas on how to do it? Using some of the stuff I suggested or another approach?
You could create an NPAPI extension, which would not be restricted by Chrome at all.
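For reference, the broadcast-and-listen discovery step itself is straightforward once you are outside the browser sandbox (for example, in a small native helper the extension talks to). A concept sketch, with the port and message format as arbitrary choices:

```python
# Each node broadcasts a hello on the LAN and listens for hellos from others,
# building up the peer list for the distributed cache.
import socket
import threading

PORT = 50000
HELLO = b"cache-node-hello"

def announce():
    """Broadcast this node's presence so peers can record its address."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    s.sendto(HELLO, ("255.255.255.255", PORT))
    s.close()

def listen(on_peer):
    """Listen for other nodes' announcements and report their addresses."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    while True:
        data, (ip, _port) = s.recvfrom(1024)
        if data == HELLO:
            on_peer(ip)

if __name__ == "__main__":
    # Keep listening for new nodes (non-daemon thread), then announce ourselves.
    threading.Thread(target=listen, args=(lambda ip: print("peer:", ip),)).start()
    announce()
```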

How to scale my Stratus application?

Stratus is currently in beta!
So, if I create a simple app with Stratus technology and I get millions of users, how can I scale the application?
How has Chatroulette resolved the scaling issue?
Stratus itself is [probably] a redundant, distributed system of Adobe-owned servers accepting connections so rapidly that it is not an issue at all. On top of that, remember that Stratus simply distributes peer identifiers, and all other communication that is at all bandwidth-hungry is peer-to-peer, which obviously does not suffer from scaling problems.
Chatroulette uses an array of Red5 servers that stream video in case peers cannot communicate directly (behind a firewall/NAT, etc.), and they HAVE used a PHP/MySQL database backend for peer discovery (Stratus does not discover peers, it simply shares them on demand). I say "HAVE" because that's what it was at a very early point in time; they would be wise not to use PHP/MySQL now, because that would be suicide with their amount of traffic.
UPDATE
It seems I was talking out of my a$$ when I mentioned Chatroulette using Red5. I have no evidence whatsoever for that, and may have misremembered or confused it with some other service. I do have evidence that it used FMS 3.5.2 as of the time of writing.
