It has been itching me for a long time to know the historical reason for calling background programs or threads "daemons".
Lat. daemon, the Latin form of the Greek "δαίμων" ("godlike power, fate, god"): a god; a subordinate deity, as the genius of a place or a person's attendant spirit.
There are numerous questions clarifying what daemons are and how they behave, but none explaining the origin of the term "daemon" for programs that run in the background, like sshd.
Why do we call programs that run in the background "daemons"?
See the wiki
According to Fernando J. Corbató, who worked on Project MAC in 1963, his team was the first to use the term daemon. The usage was inspired by Maxwell's demon, the imaginary agent from physics and thermodynamics that helped to sort molecules:
"We fancifully began to use the word daemon to describe background
processes which worked tirelessly to perform system chores."
In the Unix System Administration Handbook, Evi Nemeth states the
following about daemons:
"Many people equate the word "daemon" with the word "demon", implying
some kind of satanic connection between UNIX and the underworld. This
is an egregious misunderstanding. "Daemon" is actually a much older
form of "demon"; daemons have no particular bias towards good or evil,
but rather serve to help define a person's character or personality.
The ancient Greeks' concept of a "personal daemon" was similar to the
modern concept of a "guardian angel"—eudaemonia is the state of being
helped or protected by a kindly spirit. As a rule, UNIX systems seem
to be infested with both daemons and demons."
According to Wikipedia:
The term was coined by the programmers of MIT's Project MAC. They took the name from Maxwell's demon, an imaginary being from a thought experiment that constantly works in the background, sorting molecules.
Unix systems inherited this terminology. Maxwell's Demon is consistent with Greek mythology's interpretation of a daemon as a supernatural being working in the background, with no particular bias towards good or evil. However, BSD and some of its derivatives have adopted a Christian demon as their mascot rather than a Greek daemon.
As part of the recent "rowhammer" exploit proof of concept, a suid-root executable ("ping") that is readable as well as executable was used to create a more finely tuned proof of concept.
And so my question: why do various distributions ship suid (especially suid-root) executables as readable as well as executable?
My speculations include:
1. Convenience for use with "ldd"
2. To allow tripwire or package-update-checking software to run as non-root
3. It doesn't matter, since most distributions are public and anyone can obtain the ELF binary (by installing into a VM, etc.)
4. selinux can be used to make this irrelevant
5. Lazy developers
With (3), hiding the binary of a public distribution offers only a fig-leaf of security - and (5) is pretty much name calling.
Not a complete answer, but I found that I needed to make setuid-root programs readable if they were stored on an NFS server.
To be explicit: on a local file system, chmod 4711 was enough for setuid-root programs, but over NFS the required mode was 4755.
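For reference, here is a small sketch of what those two modes look like programmatically, written in Go to match the later answers in this thread; the target path is a hypothetical placeholder:

    package main

    import (
    	"fmt"
    	"os"
    )

    func main() {
    	// 4711: setuid bit plus rwx for owner, execute-only for group/other.
    	local := os.ModeSetuid | 0o711
    	// 4755: setuid bit plus rwx for owner, read+execute for group/other;
    	// the variant that was needed over NFS.
    	nfs := os.ModeSetuid | 0o755

    	fmt.Println(local, nfs) // prints urwx--x--x urwxr-xr-x

    	// Applying the NFS-friendly mode (requires root; path is a placeholder).
    	if err := os.Chmod("/usr/local/bin/mytool", nfs); err != nil {
    		fmt.Fprintln(os.Stderr, err)
    	}
    }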
It's a mixture of "it doesn't matter" (3) and "lazy developers" (5).
It's good practice to turn off unnecessary permissions such as read access on SUID executables, because it reduces the attack surface generally, but in many cases it doesn't make much difference.
As you say for (3), hiding the program data doesn't stop attackers searching for ROP gadgets etc. because the data is typically visible in the public distribution that the binary came from.
Note that that doesn't apply to the rowhammer-based exploit described in the Project Zero blog post. For that, the exploit doesn't want to read the data in the SUID executable, it just wants to use /proc/self/pagemap to learn which physical addresses contain the executable's data.
However, as the blog post says, if the attacker can't open() the SUID executable, it can just open() a library it uses, such as /lib64/ld-linux-x86-64.so.2, and apply the exploit to that. So restricting read permissions on the SUID executable doesn't help, and we can't remove read permission from those libraries, because that would make them unusable.
I'm trying to do basically this in Go:
netstat -an | grep -c 2375
I need to count the number of connections to the Docker daemon in my regression test for a connection-leak bug. However, because I run this in multiple places on different OSes (local dev box, CI, etc.), I cannot rely on the netstat tool, so I wonder: how can I do this in a more programmatic way in Go?
I looked around the net package and could not find anything that would help. There are some libraries that basically replace netstat:
https://github.com/drael/GOnetstat
https://github.com/dominikh/netstat-nat
But they are not cross-platform compatible (Mac and *nix). Any idea how I can achieve this?
On Linux, this info is exposed in the /proc filesystem.
Use os.Getpid and inspect the entries in /proc/<pid>/fd. Most likely a simple count is good enough here; if you need more, see the proc man page.
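For example, here is a minimal Linux-only sketch of that idea (the package and function names are hypothetical; matching a specific remote port such as 2375 would additionally require parsing /proc/net/tcp, which is omitted here):

    // conncount_linux.go - Linux-only by filename convention.
    package conncount

    import (
    	"os"
    	"strings"
    )

    // OpenSockets counts the current process's open socket descriptors.
    // Each entry in /proc/self/fd is a symlink; for sockets the link
    // target has the form "socket:[inode]".
    func OpenSockets() (int, error) {
    	entries, err := os.ReadDir("/proc/self/fd")
    	if err != nil {
    		return 0, err
    	}
    	n := 0
    	for _, e := range entries {
    		target, err := os.Readlink("/proc/self/fd/" + e.Name())
    		if err != nil {
    			continue // fd may have closed between ReadDir and Readlink
    		}
    		if strings.HasPrefix(target, "socket:[") {
    			n++
    		}
    	}
    	return n, nil
    }

In the regression test, you could record OpenSockets() before and after the code suspected of leaking, and assert that the two counts are equal.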
Cross-platform compatibility for this kind of thing is going to be roll-your-own, as the ways of identifying a process's open fds are very platform-specific. If you simply need to compile and pass some tests for this on non-Linux platforms, you can use Go's per-platform file support to make this a no-op elsewhere (see the sketch below), or implement an appropriate solution per platform.
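The no-op variant for other platforms could be as small as this (again with hypothetical file and package names; the build tag keeps it out of Linux builds, where the version above applies):

    // conncount_other.go
    //go:build !linux

    package conncount

    // OpenSockets is a stand-in on platforms without /proc;
    // tests can skip the leak assertion when it returns -1.
    func OpenSockets() (int, error) {
    	return -1, nil
    }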
I am struggling to integrate the network stack of Linux Kernel 2.6.35 with the Network Simulation Cradle (http://www.wand.net.nz/~stj2/nsc/).
Has anyone done it before? If so, please reply.
I am getting an error saying : fatal error: when writing output to : Broken pipe.
Well, I cannot proceed further with explaining what I have done so far, as that would make no sense if no one here has worked with the Network Simulation Cradle.
So, if anyone has worked with this, please reply.
Regards
You're probably better off asking on the ns-3 mailing list, or emailing the author of the NSC directly.
Either way, you'll need to include more information in your question!
I suppose this question is a variation on a theme, but different.
Torrents will never replace HTTP, or even FTP download options. This said, why aren't there torrent links next to those options on more websites?
I'm imagining a web system whereby files can be downloaded via HTTP, say from http://example.com/downloads/files/myFile.tar.bz2; torrents are cheaply autogenerated and stored at /downloads/torrents/myFile.tar.bz2.torrent; and the tracker might live at /downloads/tracker/.
Trackers are a well-defined problem, not incredibly difficult to implement, and there are many drop-in alternatives out there already. I imagine it wouldn't be difficult to customise one to do what is needed here.
The autogenerated torrent file can include the normal HTTP server as a permanent seed; the extensions to do this are very well supported by most, if not all, of the major torrent clients, and require no reconfiguration or special handling on the server end (they use stock-standard HTTP Range headers).
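To illustrate how cheap the autogeneration could be: a .torrent file is just a bencoded dictionary, and the web-seed extension (BEP 19, the GetRight style) is a single extra "url-list" key pointing back at the plain HTTP copy. Below is a minimal Go sketch with hand-rolled bencoding and a single-piece layout for brevity; a real generator would split the file into fixed-size pieces and hash each one separately. The paths and URLs are the hypothetical ones from the question:

    // maketorrent.go - minimal single-file .torrent generator with an HTTP web seed.
    package main

    import (
    	"crypto/sha1"
    	"fmt"
    	"os"
    	"sort"
    )

    // bencode serializes strings, int64s, and string-keyed maps
    // according to the BitTorrent encoding rules.
    func bencode(v interface{}) string {
    	switch x := v.(type) {
    	case string:
    		return fmt.Sprintf("%d:%s", len(x), x)
    	case int64:
    		return fmt.Sprintf("i%de", x)
    	case map[string]interface{}:
    		keys := make([]string, 0, len(x))
    		for k := range x {
    			keys = append(keys, k)
    		}
    		sort.Strings(keys) // bencoded dictionary keys must be sorted
    		s := "d"
    		for _, k := range keys {
    			s += bencode(k) + bencode(x[k])
    		}
    		return s + "e"
    	}
    	panic("unsupported bencode type")
    }

    func main() {
    	data, err := os.ReadFile("downloads/files/myFile.tar.bz2")
    	if err != nil {
    		panic(err)
    	}
    	hash := sha1.Sum(data) // a single piece, for brevity
    	torrent := map[string]interface{}{
    		"announce": "http://example.com/downloads/tracker/",
    		// BEP 19 web seed: clients treat this URL as a permanent HTTP seed.
    		"url-list": "http://example.com/downloads/files/myFile.tar.bz2",
    		"info": map[string]interface{}{
    			"name":         "myFile.tar.bz2",
    			"length":       int64(len(data)),
    			"piece length": int64(len(data)),
    			"pieces":       string(hash[:]),
    		},
    	}
    	err = os.WriteFile("downloads/torrents/myFile.tar.bz2.torrent",
    		[]byte(bencode(torrent)), 0o644)
    	if err != nil {
    		panic(err)
    	}
    }

A client that understands "url-list" falls back to ordinary HTTP Range requests whenever no peers are available, which is exactly the permanent-seed behaviour described above.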
Personally, if I setup such a system, I would then speed limit the /downloads/files/ directory to something reasonable, say maybe 40-50kb/s, depending on what exactly you were trying to serve.
Does such a file delivery system exist? Would you use it if it did: for your personal, company, or other website?
First of all: http://torrent.ubuntu.com/ for torrents on Ubuntu.
Second of all: Opera has a built-in torrent client.
Third: I agree there is a stigma attached to p2p. So much so that we have sites that need to be called "legaltorrents" and the like, because by default a torrent would be an illegal thing, and let us not kid ourselves, it is.
Getting torrents into the mainstream is an excellent idea. You can't tamper with the files you are seeding, so there is no risk there.
The big reason is not really stigma. The big reason is analytics, and protecting them. With torrents, these people (companies like Microsoft and the like) would not be able to gather important information about who is doing the downloads (not personally identifiable information, and quickly aggregated away). With torrents, other people would be able to see this information, at least partially. A company would love to seed the torrent of an evaluation version of a competing company's product, just to get an idea of how popular it is and where it is being downloaded from. It is not as good as hosting the download on your own web servers, but it is the next best thing.
This is possibly the reason why the Vista download on Microsoft's sites, and its many service packs and SDKs, are not offered as torrents.
Another thing is that people just won't participate, and it is not difficult to figure out why: the number of hoops you have to jump through. You have to figure out the firewall, the NAT thing, then the UPnP thing, and then maybe your ISP is throttling your bandwidth, and so on.
Again, I would (and I do) seed to 1.5 times or beyond for the torrents that I download, but that is because these are Linux, OpenOffice, that sort of thing. I would probably feel funny seeding Adobe Acrobat, or some evaluation version or something, because those guys are making profits and I am not fool enough to save money for them. Let them pay for HTTP downloads.
edit: (based on the comment by monoxide)
For the freeware out there and for SF.net downloads, the problem is that they cannot rely on seeders and will need mirrors as a fallback anyway, so for them torrents just add to the expense. One more reason that comes to mind is that even in software shops, Internet access is now thoroughly controlled, and the ports on which torrents rely, plus the upload requirement, are an absolute no-no. Since most people who need these sites and their downloads are in these kinds of offices, they will continue to use HTTP.
BUT even that is not the answer. These people have restrictions on redistribution in their licensing terms. And so their problem is this: if you are seeding their software, you are redistributing it. That is a violation of their licensing terms, so if they host a torrent download and allow you to seed it, that is entrapment and they can be sued (I am not a lawyer; I learn from watching TV). They would then have to delicately change their licensing to allow distribution by seeding torrents but not otherwise. This is an easy enough concept for most of us, but the vagaries of the English language and the dumb hard look on the face of the judge make it a very tricky thing to do. The judge may personally understand torrents, but sitting up there in the court he has to frown and pretend not to, because it is not documented in legalese.
That there is the ditch they have dug, and there they fall into it. Let us laugh at them and their misery. Yesterday's smart is today's stupid.
Cheers!
I'm wondering if part of it is the stigma associated with torrents. The only software that I see providing torrent links is Linux distros, and not all of them (for example, the Ubuntu website does not provide torrents to download Ubuntu). However, if I said I was going to torrent something, most people would associate it with illegal downloads (music, video, TV shows, etc.).
I think this might come from the top. An engineer might propose using a torrent system to provide downloads, yet management shudders when they hear the word "torrent".
That said, I would indeed use such a system. Although I doubt I would be able to seed at home (I found that the bandwidth kills the connection for everyone else in the house). However, at school, I probably would not only use such a system, but seed for it as well.
Another problem, as mentioned in the other question, is that torrent software is not built into browsers. Until it is, you won't see widespread use of it.
Kontiki (which is very similar to BitTorrent) makes up about 10% of all internet traffic by volume in the UK, and is used exclusively for legal distribution of "big media" content.
There are people who won't install a torrent client because they don't want the RIAA sending them extortion letters and running up legal fees in court when they (the RIAA) break into your computer and see MP3 files that are completely legal backup copies of legally purchased CDs.
There's a lot of fear about torrents out there and I'm not comfortable with any of the clients that would allow even limited access to my PC because that's the "camel's nose in the tent".
The other posters are correct. There is a huge stigma against torrent files in general, due to their use by hackers and people who violate copyright law. Look at The Pirate Bay: torrent files are all they "serve". A lot of cable companies in the US have started traffic-shaping torrent traffic on their networks as well, because it is such a bandwidth hog.
Remember that torrents are not a download accelerator. They are meant to offload someone who cannot afford (or maybe just doesn't desire) to pay for all the bandwidth themselves. The users who are seeding take the majority of the load. No one seeding? You get no files.
The torrent protocol is also horrible for being so darn chatty: as much as 40% of your communications on the wire can be control-flow messages and chatter between clients asking for pieces. This is why cable companies hate it so much. There are also problems with the torrent endgame (where a client asks many peers for the final parts in an attempt to complete the torrent, but can sometimes end up with 0 available parts, so you are stuck at 99% and seeding for everyone).
HTTP is also pretty well established and plays nicely with traffic shaping, load balancers, etc. So most legitimate companies that serve their own content can afford to host it, or use someone like Akamai to replicate the data and load-balance it.
Perhaps it's the ubiquity of HTTP-enabled browsers; you don't see as many FTP download links anymore, so ease of use for the end user could be the biggest factor.
Still, I think torrent downloads are a valid alternative, even if they won't be the primary download.
I even suggested SourceForge auto-generate torrents for downloads, and they agreed it was a good idea... but they haven't implemented it (yet). Here's hoping they will.
Something like this actually exists at speeddemosarchive.com.
The server hosts a Metroid Prime speedrun and provides a permanent seed for it.
I think that it's a very clever idea.
Contrary to your idea, you don't need an HTTP URL.
I think one of the reasons is that (currently) torrent links are not fully supported inside web browsers... you have to fire up the torrent client and so on.
Maybe it's time for a little Firefox extension/plugin? Damn, now I am at work! :)