Closed 6 years ago.
I was wondering where I could learn more about decentralized sharing and P2P networks. Ideally, I'd like to create something to help students share files with one another over their university's network, so they could share without fear of outside entities.
I'm not trying to build the next Napster here, just wondering if this idea is feasible. Are there any open source P2P networks out there that could be tweaked to do what I want?
Basically you need a server (well, you don't NEED a server, but it would make things much simpler) that stores user IPs along with other things like file hash lists.
That server can live in any environment you want (which is very convenient).
Then each client connects to the server (it should have a DNS name; a free one works, I've used no-ip.com once) and first sends basic information (such as its IP and a file hash list), then sends something every now and then (say every 5 minutes or less) to report that it's still reachable.
When a client searches for files or users, it just asks the server.
This is a centralized network, but the file sharing itself happens over p2p client-to-client connections.
The reason to do it like this is that you can't know which IP to connect to without some reference.
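Just as an illustration of that flow, here is a minimal sketch of such an index server using only the JDK's built-in HTTP server; the /announce and /search endpoints, the port and the plain-text responses are assumptions for illustration, not an existing protocol. A real client would re-announce every few minutes and then download files directly from the returned peer IPs.

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class TrackerServer {
    // file hash -> peers that announced they have it
    private static final Map<String, Set<String>> peersByHash = new ConcurrentHashMap<>();

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // Clients call /announce?hash=<file hash> periodically (the "I'm still reachable" ping).
        server.createContext("/announce", exchange -> {
            String hash = queryValue(exchange, "hash");
            String peer = exchange.getRemoteAddress().getAddress().getHostAddress();
            peersByHash.computeIfAbsent(hash, h -> ConcurrentHashMap.newKeySet()).add(peer);
            respond(exchange, "registered " + peer);
        });

        // Clients call /search?hash=<file hash> and get back the known peer IPs, one per line.
        server.createContext("/search", exchange -> {
            String hash = queryValue(exchange, "hash");
            Set<String> peers = peersByHash.getOrDefault(hash, Set.of());
            respond(exchange, String.join("\n", peers));
        });

        server.start();
    }

    private static String queryValue(HttpExchange exchange, String key) {
        // Very naive query parsing, good enough for a sketch: expects "?key=value".
        String query = exchange.getRequestURI().getQuery();
        return query == null ? "" : query.replaceFirst("^" + key + "=", "");
    }

    private static void respond(HttpExchange exchange, String body) throws IOException {
        byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
        exchange.sendResponseHeaders(200, bytes.length);
        try (var os = exchange.getResponseBody()) {
            os.write(bytes);
        }
    }
}
```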
Just to clear this server thing up:
- Torrents use trackers.
- eMule's ED2K uses Lugdunum servers.
- eMule's "true p2p" Kademlia uses known nodes (clients), most of the time bootstrapped from servers like these.
Tribler is what you are looking for!
It's a fully decentralized BitTorrent client from the Delft University of Technology. It's open source and written in Python, so it's also a great starting point for learning.
Use DC++
What is wrong with BitTorrent?
Edit: There is also a pre-built P2P networking stack in Microsoft operating systems that could be an interesting basis to build on. http://technet.microsoft.com/en-us/network/bb545868.aspx
Closed 7 months ago.
I am designing the architecture for a website that will receive 5,000+ concurrent hits, with each user possibly requiring heavy background processing. What are the guidelines for this? Which technologies are recommended?
This is primarily for Java/Spring Boot.
Try to achieve horizontal scalability.
Use JPA with an L2 cache backed by Redis if you're doing heavy processing.
Use cloud infrastructure to get managed Redis/serverless databases (saves you the operational headache).
When implementing your web tier, use reactive programming platforms like Project Reactor; this lets you scale better with fewer resources (see the sketch after this list).
Offload scheduled processing away from your main app cluster, since the scheduler usually runs as a single instance.
Don't put your background processing on the same nodes as your main app cluster.
Offload UI responsibility to the client (i.e., just expose APIs).
Avoid request/response (except for authentication) and focus on subscribing to events to update the local client data. Or use something like CouchDB to synchronize data between server and device.
Leverage caching on the device.
Do not "proxy" large content, instead use a direct upload to an object store like S3 (or better use Minio to avoid vendor lock to Amazon).
leverage different types of data store technologies
RDBMS (stable, easily understood, less vendor lock-in if using ORMs, easy to back up and restore, not as scalable for writes)
Elasticsearch (efficient searching of data, but only use it for search; some vendor lock-in)
Kafka (stable, harder to understand, but much more scalable; some vendor lock-in)
Hazelcast/Memcached/Redis (in-memory key-value stores: not durable, but very fast, super scalable, useful for sharing and caching data)
I intentionally didn't list others like Cassandra or MongoDB, as these would mean major vendor lock-in and less transferable skills.
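To make the reactive web tier suggestion concrete, here is a minimal sketch, assuming Spring WebFlux and Project Reactor are on the classpath; the endpoint path and the simulated blocking call are placeholders rather than a prescribed design.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

@RestController
public class ReportController {

    // Hypothetical endpoint standing in for "heavy background processing per user".
    @GetMapping("/reports/{id}")
    public Mono<String> report(@PathVariable("id") String id) {
        // Wrap the blocking work and shift it off the event-loop threads onto
        // Reactor's bounded elastic pool, so one slow request cannot stall the
        // small number of I/O threads serving the other concurrent users.
        return Mono.fromCallable(() -> {
                    Thread.sleep(500); // placeholder for a DB query, PDF rendering, etc.
                    return "report-" + id;
                })
                .subscribeOn(Schedulers.boundedElastic());
    }
}
```

The design point is simply that the event-loop threads never block; the heavy work runs on a separate scheduler (or, better still, in a dedicated worker service as suggested above).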
Closed 7 years ago.
I have a small question about LANs. I have some PCs in my office that I want to connect internally with cabling, but I don't want them to have any internet access (the purpose is to share data internally only, with no internet use).
I also have a few selected people who need internet access, so I want to give those selected laptops special access to the internet.
I'm a beginner in networking and have no idea how to start or proceed with this project; suggestions are very welcome.
You can do the following:
Set up a small LAN in your office consisting of the selected PCs that are not supposed to have internet access at all. You can build that LAN with ordinary switches (and a router if you like). Then run a small web-server-like service on one of the PCs, which acts as the server while the rest act as clients. You could even set up a distributed server that also takes care of synchronisation, but that is not advisable for a beginner.
Simply put, multiple clients and one server over the LAN, plus network file-sharing permissions on all the systems, is what you need to establish (a minimal sketch of such a shared-folder server follows below). There are also several tools for transferring files and communicating internally, such as small mail servers, for operating systems like Windows, Linux, etc.
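As a deliberately minimal illustration of that server-plus-clients idea, the JDK itself can serve a shared folder to the rest of the LAN; the sketch below assumes Java 18+ and a folder named shared in the working directory, both of which are placeholders.

```java
import com.sun.net.httpserver.HttpServer;
import com.sun.net.httpserver.SimpleFileServer;
import java.net.InetSocketAddress;
import java.nio.file.Path;

public class OfficeFileShare {
    public static void main(String[] args) {
        // Serve the contents of ./shared (the folder must already exist) on port 8000
        // to any machine on the office LAN; no internet access is involved.
        HttpServer server = SimpleFileServer.createFileServer(
                new InetSocketAddress(8000),
                Path.of("shared").toAbsolutePath(),
                SimpleFileServer.OutputLevel.INFO);
        server.start();
        System.out.println("Sharing ./shared at http://<this-pc's-LAN-IP>:8000/");
    }
}
```

The other machines then only need a browser pointed at that address.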
For the laptops that you do want to connect to the internet, first get an internet connection from an ISP, and then set up a separate small LAN for just those machines.
The next step is simply to configure the DNS settings, the ISP-assigned IP address, the subnet mask and the gateway, which is easy: you add them to the settings of the router through which all the internet-enabled systems are connected. If you also want these machines to use the internal sharing described above, simply make the local web or mail server reachable within the office network.
Another possibility:
Set up the server with all the PCs connected as above, and connect all the PCs and laptops to the routers and switches as desired. Keep a note of the IP address of every PC and laptop. Now install a web filter/firewall that restricts users from accessing the internet based on their hostname and IP address. Remember that for this to work, all the systems must use static IP address allocation, not DHCP.
These are some of the possible approaches, but there are certainly other effective ones too.
Best wishes!
Closed 5 years ago.
When I review the hardware requirements of many database-backed enterprise solutions, I find requirements for the application server (OS, processor, RAM, disk space, etc.), for the database server (versions, RAM, etc.) and requirements for the client.
Here is an example of Oracle Fusion Middleware.
I cannot find requirements on network speed or architecture (switch speed, SAN IOPS, RIOPS, etc.). I do not want a bad user experience with my application that is actually caused by network latency in the client's environment.
When sending clients the required hardware specifications, how do you note the requirements in these areas? What are the relevant measures of network performance? (Or is it simply a matter of requiring IOPS = x?)
Generally, there is more than one level of detail for requirements. You'd typically differentiate the levels of detail into a range from 0 (rough mission statement) to 4 (technical details), for example.
So if you specify that your SAN shall operate with at least a bandwidth capacity of x, that would be a high number on that scale. Make sure to break down your main ideas ("The system shall be responsive, in order to prevent clients from becoming impatient and leaving for competitors...") into more measurable aims like the one above.
Stephen Withall has written down good examples in his book "Software Requirement Patterns". See chapter 9, page 191 ff.; it is not that expensive.
He breaks it down into recommendations on, and I quote, Response Time, Throughput, Dynamic Capacity, Static Capacity and Availability.
Of course, that's the software side. Basically, you'd probably be well advised to begin by defining what the whole system asserts under specified circumstances: When do we start to measure (e.g., when the client request arrives at the network gateway)? What average network delay do we assume to be beyond our influence? From how many different clients do we measure, and from how many different autonomous systems do they connect? Exactly what kind of task(s) do they execute, and for which kind of resource will that be exceptionally demanding? When do we stop measuring? Do we really run a complete system test with all hardware involved? What kinds of network monitoring will we provide at runtime? And so on.
That should help you more than just assigning a value to a unit like transfer rate or IOPS, which might not even solve your problem. If you find the network hardware performing below your expectations later, it's rather easy to exchange, especially if you have outsourced your hosting to an external partner. The software, however, is not easy to exchange.
Be sure to differentiate between what is a requirement or constraint that you have to meet, and what is actually part of the technical solution you offer. There might be more than one solution. Speed is a requirement (a vague one, though); a hardware architecture is a solution.
Closed 1 year ago.
Basically, I just need a simple app that frequently pings external IP addresses and web addresses to make sure the sites are up. Does anyone know of a good one?
I started to make one myself, but wanted to know if someone else has already done the work.
It just needs to track multiple external addresses with the status codes returned, at potentially different intervals.
I did see this post on "How do you monitor the availability of multiple websites", but it seems a bit like overkill for what I need. I need a KISS app! Thanks!
Ok, second attempt. What about Website Monitor (seen in this list: Monitor and Check Web Site or Server Uptime and Availability for Free)? Your dog should be able to use it.
I'm not sure if this fits your needs but
http://aremysitesup.com/
May be a simple way to go.
The free version supports up to five sites.
This can be done with Cacti, which is a great app. See:
Http Response Time monitoring and Alerting on the Cacti forums
How do you Monitor a https website and graph uptime/latency? on the Cacti forums
Cacti: Using Cacti to monitor web page loading blog post series
Use Cacti to Monitor HTTP Status Codes of Request Responses? here on SO
Unless you are the network admin of those sites, it is a colossal waste of resources, what I call "ping-then-do".
Use the command prompt if you are on a Windows system.
Type in:
ping (website host name)
and then press Enter; it will ping the website and show you the time the website took to respond as well as the TTL.
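Note that ping only tells you the host answers ICMP; it says nothing about the HTTP status codes the question asks about. If you do end up rolling your own checker after all, a minimal sketch along these lines (assuming Java 11+; the URLs and intervals below are placeholders) covers the basics of polling several addresses on different schedules and recording the status code:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.Map;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class UptimeChecker {
    private static final HttpClient CLIENT = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(5))
            .build();

    public static void main(String[] args) {
        // Each site gets its own polling interval in seconds (placeholder values).
        Map<String, Long> sites = Map.of(
                "https://example.com", 60L,
                "https://example.org/health", 300L);

        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);
        sites.forEach((url, intervalSeconds) ->
                scheduler.scheduleAtFixedRate(() -> check(url), 0, intervalSeconds, TimeUnit.SECONDS));
    }

    private static void check(String url) {
        try {
            HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                    .timeout(Duration.ofSeconds(10))
                    .GET()
                    .build();
            // Only the status code matters here, so the response body is discarded.
            HttpResponse<Void> response = CLIENT.send(request, HttpResponse.BodyHandlers.discarding());
            System.out.println(url + " -> " + response.statusCode());
        } catch (Exception e) {
            System.out.println(url + " -> DOWN (" + e.getMessage() + ")");
        }
    }
}
```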
Closed 2 years ago.
I wanted to know if such a system already exists for the average open-source user. With all of the net neutrality arguments around, and with the cost of broadband likely to go up in the future, it seems like a good idea to have an open-source protocol that lets standard consumer routers cooperate and form a mesh network with other consumer routers close by.
It seems likely that with enough nodes in close proximity and a good abstraction, we could get something good going.
You could always use WDS nodes (like a repeater, kind of).
I use it on my Buffalo AirStation with DD-WRT installed (any router that can run DD-WRT would work).
www.dd-wrt.com
I'm not sure about its scalability, and the APs would have to be within reach of each other. They could run on separate SSIDs, though.
Edit: here's the DD-WRT wiki page about WDS: http://www.dd-wrt.com/wiki/index.php/WDS
WDS is not meant for, and will not scale to, more than a few nodes.
There has been extensive work on mesh routing protocols such as BATMAN-ADV, OLSR, BMX and 802.11s. These are all supported on OpenWrt, which runs on a very large number of consumer wireless routers.
There are also many large-scale deployments, such as Freifunk and the deployments by The Village Telco.
Just to add more info: batmand (layer 3) or batman-adv (layer 2) can run on almost anything with a semblance of Linux. I have managed to get it working on Android devices (mostly running CyanogenMod), Raspberry Pis, laptops, Foneras... basically anything that has, or accepts, a wireless card with ad-hoc mode and a Linux-based operating system.
Freifunk Luebeck uses D-Link 300 routers with batman-adv.