I have a website for my soccer team where I want to show the standings. The organisation (dbu.dk) publishes the standings, but as far as I know there is no API to extract them for your own website. So I thought about using web scraping.
Is there a way that I can, every time a person visits the site (edit: or every day at 8 am), scrape the standings into my own site? I would consider using JS if possible.
I don't know anything about web scraping.
I stumbled upon PHP cURL, which does what I want.
Reference: http://php.net/manual/en/book.curl.php
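Something like the following seems to be the idea. The dbu.dk URL and the XPath selector are placeholders I'd have to adapt after inspecting the real standings page, and the file cache stands in for the "once a day at 8 am" refresh without needing a cron job:

<?php
// Rough sketch: fetch the standings page with cURL, cache it for a day,
// and pull the standings table out with DOM/XPath.
$url   = 'http://www.dbu.dk/path-to-standings';   // placeholder URL
$cache = __DIR__ . '/standings.html';

// Re-fetch at most once every 24 hours (stands in for the 8am refresh)
if (!file_exists($cache) || time() - filemtime($cache) > 86400) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html = curl_exec($ch);
    curl_close($ch);
    if ($html !== false) {
        file_put_contents($cache, $html);
    }
}

if (!file_exists($cache)) {
    exit('Could not fetch the standings.');
}

// Extract the standings table and echo it into your own page
$doc = new DOMDocument();
@$doc->loadHTML(file_get_contents($cache));       // @ hides warnings on messy HTML
$xpath = new DOMXPath($doc);
$table = $xpath->query('//table[@class="standings"]')->item(0);  // placeholder selector
if ($table !== null) {
    echo $doc->saveHTML($table);
}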
I'm new to this site, and my question is about web scraping. With minimal to no coding, can a web app or desktop app scrape the web, starting from any search engine, for pages containing a particular type of job listing, so that the pages you output contain only the listings for the type of job you are looking for, for instance, Sales Agent? And if so, what would the steps be?
Thank you.
Let's say I want to go to a specific web page and track its users' activity (for example, get their location, how many times they logged on, the links they clicked, etc.). It would be easy to implement this if it were my own website; however, I want to do it for any website.
Is it technically doable? Do you have any idea how I can start to implement this?
If the website exposes user data publicly (with or without authentication), you can scrape it.
But the data you mentioned are site statistics, which can be tracked only by the website itself or by its web server. Unless you have access to the server logs, you can't get them.
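To make the distinction concrete: this is roughly what analysing activity from server logs looks like, assuming an Apache "combined" format log at a hypothetical path, and it only works because the site owner has that file:

<?php
// Sketch: count hits per visitor IP from an Apache "combined" access log.
// The path and the log format are assumptions; the point is that you
// need this file, which only the site operator has.
$pattern = '/^(\S+) \S+ \S+ \[[^\]]+\] "\S+ \S+ [^"]*" \d{3}/';

$hitsPerIp = array();
foreach (file('/var/log/apache2/access.log') as $line) {  // hypothetical path
    if (preg_match($pattern, $line, $m)) {
        $ip = $m[1];
        $hitsPerIp[$ip] = isset($hitsPerIp[$ip]) ? $hitsPerIp[$ip] + 1 : 1;
    }
}
arsort($hitsPerIp);
print_r(array_slice($hitsPerIp, 0, 10, true));  // top 10 visitors by hit count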
As the title says. I have an ASP.NET web application that I published to my Azure account. I did a little SEO, so it should show up somewhere in the search engines, but it doesn't.
It doesn't even show up if I type the address into the search field, although it works fine when I type the URL into the address bar.
My Azure subscription is "Pay-as-you-go".
Any tips or answers are appreciated!
Thanks!
My answer mainly pertains to Google. How long have you waited? In my experience it takes a few days to a week, minimum, to start showing up. (If you're targeting Google, sign up for their Webmaster Tools; when you submit your site you can see when it's indexed and which pages are indexed, which is important because they may skip content they deem duplicated elsewhere, whether it is or not.) It's also my experience (using Azure) that subdomains on "azurewebsites.net" end up with poor SEO, but if I put a full domain on my site it ranks much higher.
I also assumed that you submitted the site to the search engines; if you haven't, sign up for a webmaster account and do that (Bing and Google both have these):
http://www.bing.com/toolbox/webmaster
https://www.google.com/webmasters/tools/home?hl=en
In Google you can also search specifically for your site to see what comes back, which will indicate that others can get to your stuff (even if it's buried 100 pages deep in other searches):
site:[your site].azurewebsites.net
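If you haven't already done so through those webmaster accounts, submitting a sitemap gives the crawlers an explicit list of pages to index. A minimal sitemap.xml looks something like this (the URL is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://[your site].azurewebsites.net/</loc>
  </url>
  <!-- one <url> entry per page you want indexed -->
</urlset>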
Two days ago I published a new article on my website, Racebooking.net, and taking a look at Google Analytics, I found something amazing!
I had 10 times more visitors than average over these 3 days, which means somebody shared my article on some website.
What I would love to know is:
- On which website the article was shared. For instance, a user sees the link to my article posted on somewebsite.com -> the user clicks the link -> the user arrives on my website. This way Google Analytics should be able to tell me that one user came from somewebsite.com, right?
- If possible, the exact page on which the link was shared. For example, if it was shared on a forum, I want to go to that page in order to "spy" on the comments and see what people think about my article.
Is Google Analytics capable of doing this? If not, how can I get what I want?
I really need this information to improve my website!
Thanks guys
You should be able to go to Acquisition > All Referrals to see an overview of the websites that referred traffic to your site. Once you find a site you'd like to see more info on, click that site and you can see the exact page the referral came from.
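For what it's worth, that report is built from the HTTP Referer header the visitor's browser sends along with each request. If you want the raw referring pages independently of Analytics, a sketch like this (writing to a hypothetical referrers.log next to the script) captures them yourself, though note that browsers are free to omit the header:

<?php
// Sketch: append the Referer header of each visit to a log file so you
// can see the exact page each visitor came from.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '(none / direct)';
$line = sprintf("%s\t%s\t%s\n", date('c'), $_SERVER['REQUEST_URI'], $referer);
file_put_contents(__DIR__ . '/referrers.log', $line, FILE_APPEND);  // hypothetical file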
Is it advisable to implement URL routing for an ASP.NET (WebForms) website which is one year old? What are the factors to be considered before implementing it?
Edit:
It is a web-based product website developed by my company, and users pay to use it.
Some of the factors I can think of off the top of my head:
- Does your boss/sponsor/client/guy-who-pays-the-bill understand the importance & want it done?
- How large is the user base? If it is an internal site with few users, it might not be a big deal to ask them to update their links, but for a large public-facing site it might not be so simple, as users might have many bookmarks etc. pointing to the content.
- What kind of site is it? If it is a news site, people probably visit it for the new content rather than very old articles, but if it is a knowledge base of some kind (read: MSDN-like) you can expect people to keep a lot of bookmarks etc. handy.
- Is the site SEO'd, & how important is not losing the traffic that reaches the site through the old URLs?
- What is the plan to ensure that web search engines re-index your site pages & that the old URLs are given a permanent (301) move?
- There is a great advantage in having a good, simple URL, but are your users tech-savvy enough to use it, or is it just a "next-shiny-thing" initiative pushed by the developers?
HTH.
User-friendly URLs will improve your site from an SEO point of view. If your site is public and present in search engines, it will benefit from this change.
I have to disagree with Sunny regarding old URLs. It's not true that users won't be able to access the old URLs; normally, you can create redirect rules that send users hitting the previous format to the new one (a sketch of such a rule follows the list below).
So the factors I would evaluate are:
- How important it is to improve the site from an SEO point of view.
- Whether the old URLs can be translated to the new URL format through redirect rules, and how important that is.
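To illustrate those redirect rules: the mechanism is just an HTTP 301 (permanent) response. Here is a minimal sketch of the idea in PHP; the old/new URL scheme below is hypothetical, and on an ASP.NET site you would express the same thing with its routing or rewrite rules rather than like this:

<?php
// Sketch of a permanent (301) redirect from an old URL format to a new one.
$path  = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$query = parse_url($_SERVER['REQUEST_URI'], PHP_URL_QUERY);

// e.g. hypothetical mapping: /article.php?id=42  ->  /articles/42
if ($path === '/article.php' && preg_match('/(?:^|&)id=(\d+)/', (string) $query, $m)) {
    header('Location: /articles/' . $m[1], true, 301);  // 301 = moved permanently
    exit;  // search engines treat a 301 as a permanent move and update their index
}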