I am working on measuring the performance parameters of a TCP connection, and one of these parameters is bandwidth. I intend to use the tcp_info structure, supported from Linux 2.6 onwards, which holds metadata about a TCP connection. The information can be retrieved with a getsockopt() call using the TCP_INFO option. I have spent a lot of time looking for good documentation that explains all the parameters in that structure, but couldn't find any.
I also tested a small program that retrieves the values from tcp_info for a TCP connection, and found the measured MSS values to be zero most of the time. To make a long story short: is there a link with complete details on tcp_info, and is it reliable to use these values?
Here is a fairly comprehensive write-up of the structure and use of the Linux tcp_info by René Pfeiffer, but there are a couple of things worth noting:
The author needed to look at these data repeatedly over time because there are no aggregate stats in that structure.
The author directs you to the tcp.c source as the final authority on the meaning of any of those data.
I'm not sure what you were hoping to get from the Maximum Segment Size, but I expect you thought it meant something else.
If you are truly interested in exact measurements of bandwidth, you need to use a measurement device outside the system being tested, as even pulling the ioctls will affect the phenomenon you want to observe. A passive wire sniffer is the only way to get truly accurate results. Finally, depending on your application, "bandwidth" is a really broad umbrella that flattens many measurements (e.g. latency, round-trip time, variability, jitter) into one category.
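That said, if you just want to poke at the structure yourself, here is a minimal sketch (Python, Linux only) of reading a few leading fields of struct tcp_info via getsockopt(). The field layout mirrors linux/tcp.h; the fallback option number 11 is an assumption for Python builds that do not expose socket.TCP_INFO:

```python
import socket
import struct

# TCP_INFO is socket option 11 on Linux; not all Python builds expose the constant.
TCP_INFO = getattr(socket, "TCP_INFO", 11)

def get_tcp_info(sock):
    # Ask for more bytes than we parse; the kernel copies out at most
    # sizeof(struct tcp_info).
    raw = sock.getsockopt(socket.IPPROTO_TCP, TCP_INFO, 192)
    # Leading layout per linux/tcp.h: seven __u8 fields, one reserved/bitfield
    # byte, then __u32 rto, ato, snd_mss, rcv_mss.
    names = ("state", "ca_state", "retransmits", "probes", "backoff",
             "options", "wscales", "reserved",
             "rto_us", "ato_us", "snd_mss", "rcv_mss")
    return dict(zip(names, struct.unpack("8B4I", raw[:24])))

# Usage note: some fields (e.g. rcv_mss) may read zero until traffic has flowed.
s = socket.create_connection(("example.com", 80))
s.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
s.recv(4096)
print(get_tcp_info(s))
s.close()
```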
I have to write the "assumptions" part of a pentest report and I am having trouble understanding what I should write. I checked multiple pentest reports (from https://github.com/juliocesarfort/public-pentesting-reports) but none of them had this paragraph. I also found this explanation: "In case there are some assumptions that the pen-tester considers before or during the test, the assumptions need to be clearly shown in the report. Providing the assumption will help the report audiences to understand why penetration testing followed a specific direction." Still, what I have in mind seems better suited to the "attack narrative".
Can you provide me a small example (for one action, situation) so I can see exactly how it should be written?
I would think the "assumptions" paragraph and the "attack narrative" paragraph overlap somewhat. I would use the "assumptions" paragraph to state a couple of high-level decisions made before starting the attack, with whatever little information the pentester had at that point, and expand on the tools and techniques used in the "attack narrative" paragraph.
For example, an assumption could be:
"The pentester is carrying out the exercise against the infrastructure of a SOHO company with fewer than 5 people. It is common for SOHO companies to use consumer networking equipment, which is often insecure and left in its default configuration. For this reason the attacker focused on scanning for HTTP and SSH using a database of vendors' default usernames and passwords."
I'm looking for an in-house geocoding tool to geocode millions of addresses. I've tried the TIGER database, but it placed only about 60% of addresses at rooftop accuracy, and some results are far away from the actual address. My needs are:
1. fast enough to process millions of addresses in days
2. rooftop accuracy - results shouldn't be too far off (I'd say less than a 100-foot error)
3. an in-house service - so it is free for our internal staff
4. ideally open source, but a one-time setup cost is acceptable
Currently I'm looking at application-level infrastructure, and I'm open to a dedicated map server or something like that. I just don't have enough information to start researching.
Feel free to throw me any ideas, thoughts, comments. I'd love to hear them!
There are two pieces to this problem:
1. the geocoder, and how well it parses addresses and matches them to the reference data set
2. the reference data
For 1, I have extracted the parser/standardizer from PAGC into a PostgreSQL stored procedure (which is open source) and then built a couple of geocoders using that as the heart of the engine.
For 2, and the accuracy you are looking for, you will likely need high-quality commercial data like Navteq, or parcel data. TIGER is good for the cost and gets you near the location, but Title 13 requires the Census Bureau to fuzz the address ranges so that no single address can be matched to a Census form. So, as you found out, TIGER will not do the job.
I have written a lot of geocoders and have one that works with Navteq and should give you results close to your requirements. Check out http://imaptools.com/ and contact me if you're interested.
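To make the stored-procedure approach in point 1 concrete, here is a hypothetical batch-driver sketch in Python; the geocode() SQL function, the connection string, and the addresses table are assumptions, not PAGC's actual interface:

```python
import psycopg2

conn = psycopg2.connect("dbname=gis user=geocoder")  # assumed connection string
read = conn.cursor(name="addr_stream")  # server-side cursor: streams rows
write = conn.cursor()

# Assumed schema: addresses(id, full_address, lon, lat, score)
read.execute("SELECT id, full_address FROM addresses WHERE lon IS NULL")
for addr_id, full_address in read:
    # geocode() is assumed to return (lon, lat, score) for one address string
    write.execute("SELECT * FROM geocode(%s)", (full_address,))
    row = write.fetchone()
    if row is not None:
        write.execute(
            "UPDATE addresses SET lon = %s, lat = %s, score = %s WHERE id = %s",
            (row[0], row[1], row[2], addr_id))
conn.commit()
conn.close()
```

For millions of rows, keeping the whole loop inside the database (a single UPDATE that calls geocode() per row) would avoid the client round-trips and is usually faster.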
We are getting ready to start testing an enrollment site that has several pages and fields, and we have generally used employees to do the testing.
The problem is, nobody tests for the same things.
I was wondering if there is a program that would run through the site and try to break it, or is there no way to do that?
WebDriver, Watir, Robot Framework, and other open source tools can help you with that.
I'd also recommend Telerik's Test Studio, but I work for them so I'm biased about how great a tool it is.
Aside from tooling, Pavel's advice is a sound start. If you're looking to automate your tests, you also have to understand that carefully choosing what to automate is critical. Don't waste time automating look-and-feel tests; focus on the high-value/high-risk areas of your system.
You'll also need to allow yourself some potentially significant ramp-up time to learn how to do automation well in your particular environment. UI automation is a difficult problem domain, so you've got to dive into it with proper expectations.
Although the question is a bit general, here is some advice:
Common advice is to first write down your tests. If every employee tests differently, it means you have no test scenarios. This step should be in human language, with steps and expected results. This will give you an idea of what you actually need.
After writing down the tests, think about test data and even the test environment - what data (user accounts, user roles, input files, output files...) you are going to need.
Then think about automated testing. My personal favorite is Selenium. It's not the only possibility, and maybe not even the best one for you, because everyone has different needs.
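To make that concrete, here is a minimal Selenium sketch in Python of one written-down scenario; the URL, field names, and expected text are placeholders to replace with your own:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
try:
    # Scenario: a new user completes the first enrollment page.
    driver.get("https://example.com/enroll")  # placeholder URL

    # Steps: fill in the required fields (placeholder field names).
    driver.find_element(By.NAME, "first_name").send_keys("Test")
    driver.find_element(By.NAME, "last_name").send_keys("User")
    driver.find_element(By.NAME, "email").send_keys("test.user@example.com")
    driver.find_element(By.ID, "submit").click()

    # Expected result: a confirmation message appears.
    assert "Thank you" in driver.page_source, "confirmation text not found"
finally:
    driver.quit()
```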
Does anyone know of a program that can take a Wireshark (pcap) trace and turn it into a visual network topology?
I have 3 pcap files with a LOT of data and I really want to see if I can make sense of some things.
I played with tools like NetworkMiner, but found nothing that gives a visual cue to the data. For instance...
You are in fact asking two questions:
How to discover the network topology from network traces
How to visualize the discovered topology
Topology Discovery
This is the hard part. The community has not yet developed reliable tools, because network traffic exhibits so much hard-to-deal-with crud. The most useful tool that comes to mind in this space is Bro, which creates quality connection logs.
It is straightforward to extract communication graphs, i.e., graphs that show who communicates with whom. By weighting the edges with some metric (number of packets/bytes/connections), you can get an idea of the relative contribution of a given node.
For more sophisticated analyses, you will have to develop some heuristics. For example, detecting routers may involve looking at packet-forwarding behavior or extracting default gateways from DHCP ACK messages. Bro ("the Python for the network") allows you to codify such analyses in a very natural form.
Graph Visualization
The low-key approach involves generating GraphViz output. Afterglow offers some wrapping that makes the output more digestible. For inspiration, check out http://secviz.org/ where you can find many examples of such graphs. Most of them were created with Afterglow.
There is also Gephi, a fancier graph visualization engine, which supports a variety of graph input formats. The generated graphs look quite polished and can also be explored interactively.
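As a starting point for the GraphViz route, a sketch like the following (assuming the scapy package is installed; the file names are placeholders) turns a pcap into a dot file, weighting each IP-pair edge by packet count:

```python
from collections import Counter
from scapy.all import PcapReader, IP

# Count packets per (src, dst) IP pair; PcapReader streams packets,
# so large traces do not have to fit in memory.
edges = Counter()
with PcapReader("trace.pcap") as pcap:
    for pkt in pcap:
        if IP in pkt:
            edges[(pkt[IP].src, pkt[IP].dst)] += 1

# Emit a GraphViz digraph with packet counts as edge labels.
with open("topology.dot", "w") as out:
    out.write("digraph topology {\n")
    for (src, dst), count in edges.most_common():
        out.write(f'  "{src}" -> "{dst}" [label="{count}"];\n')
    out.write("}\n")
```

Render the result with dot -Tpng topology.dot -o topology.png, or import the dot file into Gephi for interactive exploration.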
Can you suggest sites where I can generate dummy data, based on my requirements, for testing purposes in my project?
Note: I need dummy data for usage of VMs and physical servers in terms of memory, CPU, disk, and I/O utilization in percentages. Is there any site which provides a utility to generate this kind of data?
Check out InfoChimps; they may have the sort of data you are after. But if you're just looking for numbers, it should be exceptionally trivial to just generate them yourself.
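To illustrate the generate-it-yourself option, here is a minimal Python sketch; the host names, the 5-minute sampling interval, and the random distributions are arbitrary assumptions:

```python
import csv
import random
from datetime import datetime, timedelta

hosts = [f"vm-{i:03d}" for i in range(1, 21)]  # 20 fake machines
start = datetime(2024, 1, 1)

with open("utilization.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["timestamp", "host", "cpu_pct", "mem_pct", "disk_pct", "io_pct"])
    for step in range(288):  # one day at 5-minute intervals
        ts = start + timedelta(minutes=5 * step)
        for host in hosts:
            w.writerow([ts.isoformat(), host,
                        round(random.betavariate(2, 5) * 100, 1),  # CPU skews low
                        round(random.uniform(20, 90), 1),          # memory
                        round(random.uniform(10, 95), 1),          # disk
                        round(random.betavariate(2, 8) * 100, 1)]) # I/O skews low
```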
Maybe you can try http://www.generatedata.com/#generator
Obviously late to the discussion, but in case anyone finds their way here: the baseball database contains some moderately large datasets (160,000+ records in the Fielding table, I believe).
Check out my MySQL Datagenerator. Perhaps that is what you are looking for...