User story for web page [closed] - scrum

I'm creating a web site for my girlfriend and I'm using it to get my head around user story writing, Scrum, and VSO. I'm trying to write a user story for the homepage, which will be quite simple but graphical. It's my understanding that I should write a user story in the first person as a quick description of what that user wants or expects, and that the acceptance criteria are where the specifics go. With this in mind, how would you go about writing a user story for a home page? So far I have the following.
User Story
As the site owner I would like a homepage that shows my completed work as a set of full-size images that change periodically, so that visitors instantly see what services I offer.
Acceptance Criteria
Can I have 4 full-size background images that change periodically?
Can the background images resize depending on which device they are viewed on?
Can the background images load in a timely manner?
Would you say that this user story and the acceptance criteria are adequate?

The user story is in the voice of the end user, not the person delivering the service.
So in your case, it might be something like:
As a visitor to the site I would like to instantly see what services are offered so that I can easily select a service
Notice that it doesn't say anything about images. The images and their size are implementation details. The user story is about what the user wants, not how they will get it.
The acceptance criteria look fine. Be careful of statements like:
Can the background images load in a timely manner?
Generally you would expect an acceptance criterion to define what 'a timely manner' means.
Something like:
Can the background images load in under 3 seconds when running on our production server under normal load?

How do I set up auto refresh on a smart window? [closed]

I would like to know how to set up an auto-refresh facility on a smart window, so that it refreshes automatically whenever new records are added to the database.
I could set up a refresh button and, inside it, call the program that does the retrieve. But the concern is that new records are added to the database every minute, so I would have to keep clicking the button to see them, since the browse only holds the records fetched when the window was opened.
So my question is: does OpenEdge have a real-time update facility, i.e. will the window refresh automatically once new records are added to the specific database table?
I am new to this and don't know how to write a query for it. Any pointers would be appreciated.
Sorry if I have got anything wrong.
There is no automatic, built-in auto refresh capability for OpenEdge.
Your idea of coding a refresh button is a good start. If you start with that and get a manual refresh working to your satisfaction, you can then focus on automating the button press.
There are a number of ways that you might choose to do that. The specifics are dependent on your platform and the application framework that you are using. You mention "smart window" so I am guessing you must be on Windows and using the really old and crusty stuff.
For that environment you probably just want to add a "pstimer" ActiveX. That's not really the modern way of going about things but it might be the best fit for the world that you actually live in.
There are many Progress kbase articles on how to do that. This might be a good start: https://knowledgebase.progress.com/articles/Article/19064
There are probably perfectly good .NET equivalents too. The key is to first code your "refresh button" so that the critical logic just needs to be triggered from time to time.

How easy is it to check out the plugins being used by a WordPress website? [closed]

I was going through a site and saw that it is developed in WordPress. However, when I land on the page below, I am unable to tell whether they have used a WordPress plugin or an API to build the functionality:
https://legaldesk.com/affidavits/affidavit-for-change-of-name
The link above shows me a form. When I select a city and click "create agreement", it asks for my details and shows a preview on the right.
If this question should not be asked here, please let me know the right site on which to ask it. I like learning new technologies.
The forms themselves seem to be from "TemplatesForm" sourceforge.net/projects/kobject/files/kobject-java/tests/tests/…
Generally, finding which plugins and scripts are in use is a kind of forensic detection: on the one hand you follow direct links to scripts' sources, which lead back to the publisher's URL; on the other, you look for class names, function calls, and other identifiers in the HTML.
Apart from obvious signs like "Powered by WordPress" or similar lines which some sites will have clearly displayed on them, you can find out a lot from the HTML page source.
Right-click the page, select "view source", and search for, say, "plugins" in the paths you find. One plugin used here is MegaMainMenu (http://codecanyon.net/item/mega-main-menu-wordpress-menu-plugin/6135125), along with WooCommerce, LayerSlider (http://codecanyon.net/item/layerslider-responsive-wordpress-slider-plugin-/1362246) and FlexSlider (https://www.woothemes.com/flexslider/).
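To make that concrete, here is a small, purely heuristic snippet you can paste into the browser console on the page you are inspecting. It just greps the rendered HTML for wp-content paths, so plugins that load nothing on that particular page will not show up:
// Scan the rendered HTML for wp-content paths and list the unique
// plugin and theme slugs found. Heuristic only.
(function listWordPressSlugs() {
  var html = document.documentElement.innerHTML;
  var pattern = /wp-content\/(plugins|themes)\/([a-z0-9_-]+)/gi;
  var found = { plugins: new Set(), themes: new Set() };
  var match;
  while ((match = pattern.exec(html)) !== null) {
    found[match[1].toLowerCase()].add(match[2].toLowerCase());
  }
  console.log('Plugins:', Array.from(found.plugins).sort());
  console.log('Themes:', Array.from(found.themes).sort());
})();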
In this case, this is the theme - http://themeforest.net/item/envor-fully-multipurpose-wordpress-theme/9251688
They are also using the Visual Composer WYSIWYG plugin to help build it: codecanyon.net/item/visual-composer-page-builder-for-wordpress/…
You can open scripts as text by clicking the links to them in the HTML itself (at your own risk, though they usually just open as text in the browser), and you will often find the licence text at the top of the script.
You can also check which natural language the script has traces of to establish where it comes from (for example, the forms scripts have traces of French in their variable names, which match those on this site). This helps confirm whether you have found a script produced in the country you expect.
You can also see what programming language is involved, to help in your search and understanding of how the site works.
Website traffic in this case is monitored by SumoMe (https://sumome.com/) and Google Analytics. The Yoast WordPress SEO plugin is also used. The URLs and names here are fairly obvious in the page's HTML and in HTML comments.
The part that pops up the overlays (the myModalLabel element, see wordpress.stackexchange.com/questions/95661/…) is a Bootstrap modal. There is more on modal popups at http://www.tutorialspoint.com/bootstrap/bootstrap_modal_plugin.htm, which is a clear tutorial that explains how they are applied.
More on modal dialogue boxes: there is plenty you can do with those, and they have been used extensively on this site (en-gb.wordpress.org/plugins/modal-dialog). You may find other bits on this site as well, including the usual jQuery, Bootstrap, and fonts/CSS.
Another script, imgAreaSelect (odyniec.net/projects/imgareaselect), is used for selecting image areas.
As this question is a little off topic for Stack Overflow, you should probably try WordPress Stack Exchange (wordpress.stackexchange.com), though they don't recommend specific plugins as a matter of policy. I am not recommending any here either, only answering your question about what has been used on this site, and I hope the focus on what actually makes it work will be of use and interest to programmers who read this answer.
I can't see any others indicated in your example page; there may be some I have not noticed. This answer is also intended partly as an example of the forensic analysis involved in working out what powers a site, which is at least partly within the scope of Stack Overflow, and I hope you can use it to complete the discovery of any I missed. This started out as a collection of comments but seemed to develop a life of its own.

Could my site being viewed in iframes hurt my SEO? [closed]

I have studied most of the posts here about web pages being viewed in an iframe, but I was wondering whether this can hurt the SEO of the framed site. I own a niche blog, let's call it mynieceblog.com, and I recently found out that my content, mynieceblog.com/mypostname.html, is being shown in an iframe by a site acting as a blog aggregator. A toolbar sits on top (with a close button) and the URL looks like aggregator.com/content/myposttitle.html. The visitor can view my entire site through this iframe and can also visit related posts from other aggregated blogs. Here are my questions:
a. When a user visits mynieceblog.com/mypostname.html who gets to see visits/impressions on his google analytics?
b. Do I get incoming links from aggregator.com? Could this be possible only if the user closes down the toolbar?
c. Does this hurt the ranking of mynieceblog.com since I both see mynieceblog.com/mypostname.html and aggregator.com/content/myposttitle.html in search engine results for some keywords?
Viewing my blog content through this aggregator does not hurt my site's reputation. I have read that bandwidth use can be an issue too, but I am more concerned about my rankings and page views.
It can't harm you and probably gives you some credit. You found it yourself so it's getting traffic.
Your own Google Analytics code will still run inside the frame, so you will see those visitors. You can also usually tell who is framing your website: from inside an iframe, the framing page shows up as the referrer.
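If you want to log this yourself, here is a minimal client-side sketch; the /log-framer endpoint is hypothetical, and sendBeacon needs a reasonably modern browser:
// If this page is not the top-level window, it is being framed.
// Report the framing page's URL (document.referrer inside an iframe
// is the parent page) to a hypothetical logging endpoint.
if (window.top !== window.self) {
  var framer = document.referrer || 'unknown';
  navigator.sendBeacon('/log-framer', JSON.stringify({ framer: framer }));
}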
Google does see the link but how much ranking you get from that is unknown. Somewhere between 0 and 100%! I have recently read a test where someone believed some framed content was indexed.
It cannot hurt your ranking. Worst case is that their page ranks higher for a keyword, so Google presents their page instead of yours directly.
If you're really worried about it then you could implement some JavaScript code to make your page break out of the frame. Something like this:
// If this page is not the top-level window, it is being framed:
// replace the framing page with this page's own URL.
if (window.top !== window.self) {
  window.top.location.href = window.location.href;
}
If your viewers view your website through aggregator.com, then it surely won't help your SEO. For good SEO, viewers need to click through from aggregator.com and visit your site directly.
It's not a question of hurting your site's reputation - it won't; however, will it benefit your site? I'm unsure, but if you get any benefit, I imagine it would be less than if your site were accessed directly.
As this article suggests, the SEs may be able to spider your content through the aggregator, but the aggregator won't gain from your content (framed content is rightly considered to be outside the site), and given the dynamic architecture of many aggregators, you may also not gain much/anything.
I would imagine that exposure of your site through an aggregator could be considered an inbound link, but it is unclear whether the search engines would agree.

How can I prevent my asp.net site from being screen scraped? [closed]

How can I prevent my asp.net 3.5 website from being screen scraped by my competitor?
Ideally, I want to ensure that no webbots or screenscrapers can extract data from my website.
Is there a way to detect that there is a webbot or screen scraper running?
It is possible to try to detect screen scrapers:
Use cookies and timing; this will make it harder for the out-of-the-box screen scrapers. Also check for JavaScript support, since most scrapers do not have it, and check the browser metadata (the User-Agent and related headers) to verify it is really a web browser.
You can also check the number of requests per minute: a user driving a browser can only make a small number of requests per minute, so server-side logic that detects too many requests per minute can presume that screen scraping is taking place and block the offending IP address for some period of time. If this starts to affect legitimate crawlers, log the IPs that get blocked and start allowing them as needed.
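The question is about ASP.NET 3.5, but since the only code elsewhere on this page is JavaScript, here is the requests-per-minute idea as a framework-free JavaScript sketch; the threshold and window are illustrative, and in ASP.NET the same logic would sit in an HttpModule or handler:
// Sketch of the requests-per-minute check described above.
// hits maps an IP address to the timestamps of its recent requests.
var WINDOW_MS = 60 * 1000;   // look at the last minute
var MAX_HITS = 30;           // illustrative threshold

var hits = new Map();

function isProbablyScraping(ip, now) {
  var recent = (hits.get(ip) || []).filter(function (t) {
    return now - t < WINDOW_MS;
  });
  recent.push(now);
  hits.set(ip, recent);
  return recent.length > MAX_HITS;
}

// In the request handler: block (or CAPTCHA) offenders, and log the
// IP so legitimate crawlers can be whitelisted later, e.g.
// if (isProbablyScraping(requestIp, Date.now())) { /* respond with 429 */ }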
You can also use http://www.copyscape.com/ to protect your content; it will at least tell you who is reusing your data.
See this question also:
Protection from screen scraping
Also take a look at
http://blockscraping.com/
Nice doc about screen scraping:
http://www.realtor.org/wps/wcm/connect/5f81390048be35a9b1bbff0c8bc1f2ed/scraping_sum_jun_04.pdf?MOD=AJPERES&CACHEID=5f81390048be35a9b1bbff0c8bc1f2ed
How to prevent screen scraping:
http://mvark.blogspot.com/2007/02/how-to-prevent-screen-scraping.html
Unplug the network cable to the server.
Paraphrase: if the public can see it, it can be scraped.
Update: upon second look, it appears that I am not answering the question. Sorry. Vecdid has offered a good answer.
But any half-decent coder could defeat the measures listed. In that context, my answer could be considered valid.
I don't think it is possible without authenticating users to your site.
You could use a CAPTCHA.
Also, you can mitigate it instead by throttling their connection. It won't completely prevent them from screen scraping but it will probably prevent them from getting enough data to be useful.
First, for cookied users, throttle connections so that they see at most one page view per second; once the one-second timer is up there is no throttling at all. No impact on normal users, lots of impact on screen scrapers (at least if you have a lot of pages they're targeting).
Next, require cookies to see the data-sensitive pages.
They'll be able to get in, but as long as you don't accept bogus cookies, they won't be able to screen scrape much with any real speed.
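One way to make bogus cookies easy to reject is to sign the cookie value so that only cookies your server issued are accepted. A Node-style illustration (ASP.NET has its own mechanisms, such as forms authentication tickets, that achieve the same thing):
// Illustrative signed-cookie helper: the server only accepts cookies
// it issued itself, so scrapers cannot fabricate them.
var crypto = require('crypto');
var SECRET = 'replace-with-a-real-secret';

function issueCookie(userId) {
  var mac = crypto.createHmac('sha256', SECRET).update(userId).digest('hex');
  return userId + '.' + mac;
}

function isValidCookie(cookie) {
  var parts = cookie.split('.');
  if (parts.length !== 2) return false;
  var expected = crypto.createHmac('sha256', SECRET).update(parts[0]).digest('hex');
  return parts[1] === expected;
}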
Ultimately you can't stop this.
You can make it harder for people to do by setting up a robots.txt file, etc. But you've got to get the information onto legitimate users' screens, so it has to be served somehow, and if it is, your competitors can get to it.
If you force users to log in you can stop the robots all the time, but there's nothing to stop a competitor registering for your site anyway. This may also drive potential customers away if they can't access some information for "free".
If your competitor is in the same country as you, have an acceptable use policy and terms of service clearly posted on your site. Mention the fact that you do not allow any sort of robots or screen scraping. If it continues, get an attorney to send them a friendly cease-and-desist letter.
I don't think that's possible. But whatever you'll come up with, it'll be as bad for search engine optimization as it will be for the competition. Is that really desirable?
How about serving up every bit of text as an image? Once that is done, either your competitors will be forced to invest in OCR technology, or you will find that you have no users - so the question will be moot.

Is there a way to know if someone has bookmarked your website? [closed]

I want to make stats for my website. One thing I want to do is to know how many people bookmark my website. What's the best way to do that without a survey?
There is no way to tell.
A proportion of people who arrive at the page without sending referrer information will have bookmarked it - but they might also have come from a link in an email, typed the URL, dragged it from their history, turned referrers off, etc.
Your best bet is to have a Javascript "Bookmark us" link that bookmarks the site and makes an AJAX call to a backend script to store info about a new bookmark in your db. This won't catch people who bookmark your site directly using their browser, but it will give you some idea about the stickiness of your site.
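A rough sketch of that idea (the /track-bookmark endpoint and the bookmark-us element are hypothetical; note that modern browsers no longer let a script add a bookmark directly, so the link can only prompt the user and record the click):
// "Bookmark us" link: prompt the user to bookmark the page and record
// the intent against a hypothetical backend endpoint.
document.getElementById('bookmark-us').addEventListener('click', function (e) {
  e.preventDefault();
  alert('Press Ctrl+D (Cmd+D on a Mac) to bookmark this page.');
  fetch('/track-bookmark', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ page: location.pathname })
  });
});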
As David said there's no way to tell how many people bookmark it in their browser.
But I do all my bookmarking with Delicious.com, so you could look at getting some sort of stats from the various third-party bookmarking sites.
It's not 100% accurate, but you can try setting a cookie when a visitor first arrives at your site. If a later request carries that cookie but has no referrer information in the Request object, you can assume the user has added your site to their bookmarks (a very optimistic assumption, but the worst case is that the user is loyal enough to type the URL directly, which is about as good as bookmarking, I believe).
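Sketched client-side (the /track-visit endpoint is hypothetical), the heuristic looks something like this:
// Heuristic from the answer above: returning visitor + empty referrer
// is treated as a likely bookmark or typed-in visit.
var isReturning = document.cookie.indexOf('seen=1') !== -1;
if (!isReturning) {
  // Mark the first visit; the cookie expires in roughly a year.
  document.cookie = 'seen=1; max-age=31536000; path=/';
} else if (document.referrer === '') {
  navigator.sendBeacon('/track-visit', JSON.stringify({ kind: 'likely-bookmark' }));
}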
I think the answers given are over-complicated. Just use AddThis.com. It gives you an analytics report that shows how many people bookmarked the link.
You can put a link that adds your website to the user's bookmarks and notifies you when someone does so.
You can also monitor the number of people who come directly to your website; that usually means they have you in their bookmarks, or better, that they know your site's name so well that they just type it.
Edit: using Google Analytics, you can get a good overview of the proportion and number of people coming "directly" to your website.
No other way I can think of, except polls.
This is not useful information. Bookmarking is meaningless in isolation. I currently have hundreds of bookmarks, most of them for articles that I tagged as "looks interesting, but I don't have time/energy to read and understand it right now, so I should come back later"... and then never got around to going back to. On the other hand, I have about a dozen bookmarks that I visit daily. Even if you knew I had your site bookmarked, you wouldn't know which group you're in (but it's overwhelmingly likely that you'd be in the "never used" bookmark pile).
The only way to determine which category you're in is to count actual visits to your site. This also has the added advantage of telling you about people who subscribe to RSS feeds, which are at least as "sticky" as bookmarks, regardless of whether or not they bookmark in addition to subscribing.
It sounds like the actual information you want may be how many "loyal" visitors you have - people who keep coming back. Counting bookmarks won't tell you that. Counting visits, along with some simple cookie and/or IP address based code to identify repeat visitors, will. If you don't want to write the code to manage that visit tracking yourself (and there probably isn't any reason why you should), you can get it free and easy from Google Analytics.
