WordPress custom query string not indexed by Google

I have a WordPress site (www.AgingSafely.com), and on it I have built a plugin to show the “Details” about various Adult Family Homes (AFHs). All of the details are retrieved from a database table via a query string (?asi_id=WA_Af_nnnnn), where the n’s are the AFH’s license number. I have created a “Site Map” page (https://www.agingsafely.com/asi-site-map/) that lists an overview and has links to the Details page for each AFH, so that Google can find and crawl them. They are also listed in sitemap.xml.
Google isn’t indexing them, but is indexing the more normal pages on my site.
I figure that I need to change my URLs from https://www.agingsafely.com/adult-family-home/?asi_id=WA_Af_751252 to something like https://www.agingsafely.com/adult-family-home/AFH/751252 to make Google happy. To add a little more complication, the “Af” in the query string stands for “Adult Family Home”. The plugin also handles “Boarding Homes” (“Bf”) and “Nursing Facilities” (“Nf”).
How do I get the URL with ?asi_id=WA_Af_751252 rewritten to AFH/751252?
This appears to have two parts: change the links in the plugin to the /AFH/nnnn format, which should be easy, and add a rewrite rule that converts the new URL format back into the query string.
What is the best way to do this?
Does Google ignore query strings?

Are you planning on a lot of people entering that particular string in a Google search? Possibly some will, but probably not. If you want people to be able to easily find your products/homes via Google searches, then yes, I would change the links to something like https://www.agingsafely.com/adult-family-home/AFH/751252. Literally spell out as much as you can, unless a string is a popular part number that people search for, or something like that.
Also, is your site integrated with Google Analytics and Google Search Console? I would definitely do that if you haven't.
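As for actually wiring up the rewrite itself, one option is a rule that maps the prettier path back onto the existing query string before WordPress handles the request. A rough sketch, assuming an Apache setup and the adult-family-home page slug, placed above WordPress's own block in .htaccess:
# Hypothetical rule: /adult-family-home/AFH/751252 -> /adult-family-home/?asi_id=WA_Af_751252
RewriteEngine On
RewriteRule ^adult-family-home/AFH/([0-9]+)/?$ /adult-family-home/?asi_id=WA_Af_$1 [L,QSA]
Boarding Homes and Nursing Facilities would need similar rules (or a second captured group for the facility type). Inside the plugin, the more WordPress-native route is add_rewrite_rule() plus a registered query var, so the mapping survives permalink changes; either way, remember to update the links on the Site Map page to the new format so Google discovers the clean URLs.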

Related

The Google Analytics Vote For Trump Analytics Spam

We were checking the newly implemented Google Analytics for our mobile app, and surprisingly there are a lot of visitors from multiple countries, but in actuality we haven't released the app to any store; it's just a beta among 5 main users.
After checking the Google Analytics report in detail, we found that it got spammed by a bot called "Trump's Bot". When this happens to your account, you can see the following line in your language section:
“Secret.ɢoogle.com You are invited! Enter only with this ticket URL. Copy it. Vote for Trump!”
There are a lot of solutions available for keeping this data out of your reports using filters, but I was wondering whether there is any concrete way to permanently remove this data from my reports, and also whether there is anything we can do to avoid such data in the future, as it is seriously affecting our business strategy.
Due to the technology used by Google Analytics, the only way to eliminate this referral is with a filter: check for one common point among all of these hits. In this case it is a hard one, because all the parameters change except for the language, for a well-known reason: so that you see the spam.
So try to use this one; in my case it works.
I highly recommend you read the community policy; this could be considered an off-topic question.
Analytics spammers are always trying to find new ways of getting attention, and with this one, this spammer hit it big.
It is not possible to permanently remove it unless you delete the whole property, but you can create an advanced segment to get a clean view.
But the most important part is blocking it so it doesn't pollute your data. For this particular type of spam, you should create a custom exclude language filter with this expression:
\s[^\s]*\s|.{15,}|\.|,
That expression will block any hit that doesn't use a proper language. That combined with a valid hostname filter should prevent most of the current spam and save you a lot of headaches.
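As a quick sanity check outside of the GA interface (just a sketch; the real filter is configured in the admin UI), you can test what that expression matches against sample language values in the browser console:
// Hypothetical test: a legitimate language code should NOT match, the spam payload should.
var badLanguage = /\s[^\s]*\s|.{15,}|\.|,/;
console.log(badLanguage.test('en-us'));                              // false - kept
console.log(badLanguage.test('Secret.ɢoogle.com You are invited!')); // true - filtered out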
If you need help, you can check this step by step guide for building these filters and creating the advanced segment to remove it from your historical data.
Here is also a related question.
Log in to your Google Analytics account
Select the ADMIN section
Click on All Filters, then Add Filter
Give the filter a name, such as "Include only website traffic"
In the Predefined section, select Include Only

Capturing hash tag values in Google Analytics

We want to capture aggregated, anonymous search query history for analytic purposes to improve our internal search engine performance and metadata practices.
I found this article: https://support.google.com/analytics/answer/1012264?hl=en
Unfortunately, our search engine uses a hash tag instead of a question mark (nonstandard query string).
For example: http://www.site.com/search#q=search%20term
Is there a way to configure Google Analytics to recognize hash tag values in the URLs and capture these given a defined pattern?
Thanks
Sorry to say this, but hash fragments won't "make it" into the reports at all, so no site search reports for hash-based queries.
There is a simple workaround, though: use virtual pageviews that emulate the request with a regular query parameter using the ? sign.
_gaq.push(['_trackPageview', '/search?q=search%20term']);
However, this virtual pageview will generate a second pageview for the given page, which isn't preferable, so I would recommend setting up a new view specifically for site search reports (or try playing around with advanced filters, which might get the job done). Also, don't forget to turn on site search within the view settings as you would otherwise.
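A minimal sketch of that workaround, assuming the classic ga.js tracker shown above and hash fragments of the form #q=...:
// Hypothetical: turn /search#q=search%20term into a virtual pageview for /search?q=search%20term
// so the site search report can pick up the q parameter.
if (window.location.hash.indexOf('#q=') === 0) {
  var query = window.location.hash.substring(3); // still URL-encoded
  _gaq.push(['_trackPageview', '/search?q=' + query]);
}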
Have you tested putting a hash into the Query Parameter field?

Google analytics and dynamic pages

I have a (Symfony based) website. I would LIKE to analyze the site traffic using Google Analytics. My site is divided into several (i.e. N) categories, each of which may have 0 to M sub categories.
Schematically, the taxonomy of the site breaks down into something like this:
N major categories
Each major category may have 0 to M sub categories
further nesting is possible, but I have just kept it simple for the purpose of illustration.
I need to know which sections of the website are generating more traffic, so that I can concentrate my efforts on those sections. My question is:
Is there any way to identify the data that is being generated from the different sections of my site?
Put another way, is there a code or 'tag' that I can generate dynamically (in each page that is being monitored) and pass to GA, so that I can identify which section of the website the traffic came from?
The documentation I found on Google about this topic was not very useful (at least it did not answer this question).
You can pass a URI to _trackPageview that permits you to log the request in whatever format you'd like, regardless of how your user is requesting the page.
Remove/replace the original call to pageTracker._trackPageview with the following:
pageTracker._trackPageview('/topcategory/subcategory');
You'd just need to plug in the top category and subcategory info. If the info is available in the URL, you could parse it out with JS on the fly.
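A minimal sketch of that parsing idea, assuming URLs shaped like /majorcategory/subcategory/... and the classic pageTracker object:
// Hypothetical: take the first two path segments as category/subcategory
// and log them as the tracked page instead of the raw URL.
var segments = window.location.pathname.split('/').filter(function (s) { return s.length > 0; });
var section = '/' + (segments[0] || 'home') + (segments[1] ? '/' + segments[1] : '');
pageTracker._trackPageview(section);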

Best way to generate an automated report in Google Analytics for a specific collection of URLs

We are currently using Google Analytics as a supplement to our paid tracking software, but neither of them is giving us exactly what we need.
I have a list of about 60 or so URLs (out of about 1500) on the site that I wish to set up a monthly report for, which can be emailed to multiple recipients. I can't seem to figure out how to create a report showing just the hits on these 60 URLs. I can apply advanced filters on the content page, but those disappear after a while and sometimes error out when adding too many URLs.
Is there a method I'm missing in Google Analytics to achieve this goal, or am I better off running an SSIS package to pull the URLs from the API and formatting a document that way?
Yeah, advanced filters are not really designed for this kind of thing.
Here are some things which may work for you:
Try setting up a new GA Profile with an Include filter to filter only the URLs that you want to report on. You can use a regular expression to identify the 60 URLs (see the sketch after this list). Then these will be the only URLs tracked in that particular profile.
Try setting up an Advanced Segment to select the Pages using a very long "OR" filter.
You could set up a new GA account and log the URLs into that account with additional tracking code. This is not really recommended as the 2 accounts will share tracking cookies.
Use Excellent Analytics to pull down data into Excel for the URLs in question using the GA Export API.
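For the Include-filter option above, the regular expression is just an anchored alternation over the pages you care about; with 60 URLs it gets long, but it is mechanical to build. A sketch with hypothetical paths:
^/pricing/$|^/features/overview/$|^/blog/launch-post/$
Apply it to the Request URI field of the profile's Include filter, and only hits matching one of the alternatives will show up in that profile.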

How to provide multiple search functionality on a website?

I am developing a web application in which I have the following types of search functionality:
Normal search: the user enters a search keyword to search the records.
Popular: this is not exactly a search; it displays the popular records on the website, much as Digg and other social bookmarking sites do.
Recent: here I display recently added records on my website.
City Search: here I present city names to the user, like "Delhi", "Mumbai", etc., and when the user clicks such a link, all records from that particular city are displayed.
Tag Search: same as city search, I have tag links; when the user clicks a tag, all records marked with that tag are displayed.
Alphabet Search: same as city and tag search, this functionality has links for letters like "A", "B", etc., and when the user clicks a letter link, all records starting with that particular letter are displayed.
Now, my problem is that I have to provide the searches listed above, but I am not able to decide whether to go with one page (result.aspx) that displays the records for every search and uses the query string to figure out which search the user is running and what data to display. For example, if I am searching for the city Delhi and the tag delhi-hotels, the URLs for both would be:
For City: www.example.com/result.aspx?search_type=city&city_name=delhi
For Tags: www.example.com/result.aspx?search_type=tag&tag_name=delhi-hotels
For Normal Search: www.example.com/result.aspx?search_type=normal&q=delhi+hotels+and+bar&filter=hotlsOnly
Now, I feel the above idea of using a single page for all searches is messy, so I thought of another, cleaner idea, which is using separate pages for each type of search, as follows:
For City: www.example.com/city.aspx?name=delhi
For Tags: www.example.com/tag.aspx?name=delhi-hotels
For Normal Search: www.example.com/result.aspx?q=delhi+hotels+and+bar&filter=hotlsOnly
For Recent: www.example.com/recent.aspx
For Popular: www.example.com/popular.aspx
My new idea is cleaner: it tells the user specifically which page is for what, and it also gives them an idea of where they are now and what records they are seeing. But the new idea has one problem: if I have to change anything in my search result display, then I have to make changes in all the pages one by one. I thought of a solution for that problem too, which is using a user control inside a repeater control; I'll pass all my values one by one to the user control to render the HTML for each record.
Everything is fine with the new idea, but I am still not able to decide which idea I should go with. Can anyone share your thoughts on this problem?
I want to implement an idea that will be easy to maintain, SEO-friendly (gives my website a good ranking), and user-friendly (easy for users to use and understand).
Thanks.
One thing to mention on the SEO front:
As a lot of the "results" pages will be linking through to the same content, there are a couple of advantages to appearing* to have different URLs for these pages:
Some search engines get cross if you appear to have duplicate content on the site, or if there's the possibility of almost infinite lists.
Analysing traffic flow.
So for point 1, as an example, you'll notice that SO has numerous ways of finding questions, including:
On the home page
Through /questions
Through /tags
Through /unanswered
Through /feeds
Through /search
If you take a look at the robots.txt for SO, you'll see that spiders are not allowed to visit (among other things):
Disallow: /tags
Disallow: /unanswered
Disallow: /search
Disallow: /feeds
Disallow: /questions/tagged
So the search engine should only find one route to the content rather than three or four.
Having them all go through the same page doesn't allow you to filter like this. Ideally you want the search engine to index the list of Cities and Tags, but you only need it to index the actual details once - say from the A to Z list.
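Applied to your own site, the equivalent robots.txt (with hypothetical paths borrowed from the URL scheme suggested in the other answers) might look something like:
User-agent: *
Disallow: /search.aspx
Disallow: /recent/
Disallow: /popular/
Which listings you leave open (say, the A-to-Z list) depends on which single route you want the details indexed through; the point is simply to block the redundant routes to the same content.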
For point 2, when analysing your site traffic, it will be a lot easier to see how people are using your site if the URLs are meaningful and the results aren't hidden in the form header - many decent stats packages allow you to report on query string values, and if you have "nice" URLs, this is even easier. Having this sort of information will also make selling advertising easier, if that's what you're interested in.
Finally, as I mentioned in the comments to other responses, users may well want to bookmark a particular search - having the query baked into the URL one way or another (query strings or a rewritten URL) is the simplest way to allow this.
*I say "appearing" because as others have pointed out, URL rewriting would enable this without actually having different pages on the server.
There are a few issues that need to be addressed to properly answer your question:
You do not necessarily need to redirect to the Result page before being able to process the data. The page or control that contains the search interface on submitting could process the submitted search parameters (and type of search) and initiate a call to the database or intermediary webservice that supplies the search result. You could then use a single Results page to display the retrieved data.
If you must pass the submitted search parameters via querystring to the result page, then you would be much better off using a single Result page which parses these parameters and displays the result conditionally.
Most users do not rely on the url/querystring information in the browser's address bar to identify their current location in a website. You should have something more visually indicative (such as a Breadcrumbs control or header labels) to indicate current location. Also, as you mentioned, the maintainability issue is quite significant here.
I would definitely not recommend the second option (using separate result pages for each kind of search). If you are concerned about SEO, use URL rewriting to construct URL "slugs" to create more intuitive paths.
I would stick with the original result.aspx result page. My reasoning for this from a user point of view is that the actual URL itself communicates little information. You would be better off creating visual cues on the page that states stuff like "Search for X in Category Y with Tags Z".
As for coding and maintenance, since everything is so similar besides the category it would be wise to just keep it in one tight little package. Breaking it out as you proposed with your second idea just complicates something that doesn't need to be complicated.
Ditch the query strings and use URL rewriting to handle your "sections": much better for SEO and clearer from a bookmark/user readability standpoint.
City: www.example.com/city/delhi/
Tag: www.example.com/tag/delhi-hotels/
Recent: www.example.com/recent/
Popular: www.example.com/popular/
Regular search can just go to www.example.com/search.aspx or something.
