For a project I am building, I use a route that looks something like this: collection/id. The server returns a 404 error when the given ID does not exist.
When this happens I would like to push the 404 error page that I have in my router.
{
name: 'notfound',
path: '/:pathMatch(.*)*',
component: () => import('../views/PageNotFoundView.vue'),
}
Now this is achieved by calling the following code: this.$router.push('/notfound');. However, this also changes the URL in the browser, which is not what I want. When reloading the page, you should still end up at collection/id.
Is there any way to push a navigation route (or component?) without changing the URL displayed?
Example: https://google.com/ajdfkasf doesn't go to https://google.com/404; instead you keep the link and can refresh it. I could include the component in the view instead of solving this with routes, but that would add overhead to all kinds of views.
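For reference, a commonly suggested sketch for this in Vue Router 4 (assuming the catch-all route above): push the named route with the current path split into the pathMatch param, which renders the 404 component while the URL stays at collection/id.

```javascript
// Hypothetical helper: turn the current path into the pathMatch param
// expected by the '/:pathMatch(.*)*' catch-all route.
function pathMatchParams(path) {
  // '/collection/123' -> ['collection', '123']
  return path.substring(1).split('/');
}

// Usage inside the component, after the server responds with 404:
// this.$router.push({
//   name: 'notfound',
//   params: { pathMatch: pathMatchParams(this.$route.path) },
//   query: this.$route.query,
//   hash: this.$route.hash,
// });
```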
I am using the graphql-request npm package. It works fine for all page and post requests, but I am not able to get the menu data.
When I run the query in the IDE it returns the data as expected; however, when I call it from the Next.js client it returns null.
This is the request:
const response = await graphcms.request(`
  query MyQuery {
    menu(idType: NAME, id: "navbar") {
      id
      name
      menuItems {
        nodes {
          url
          title
          label
        }
      }
    }
  }
`);
Any ideas what could be causing this to work in the IDE but return null for an external request? I did not see any relevant settings.
I have discovered this is because menus require locations. Since I had created a theme that simply redirected to my Next deployment, it did not include any menu locations.
You can follow these instructions to add menu locations to a theme:
https://wordpress.org/support/topic/adding-a-new-menu-location/
I'm looking for a solution (I guess it has to be a WordPress plugin) to the following problem.
I'm publishing lots of sites with WP; some of them have internal links (already inserted via HTML) to pages which aren't published yet.
My goal is that these links are not "active" from the moment the page is published (because then they would result in a 404, since the target page is not online yet). I'd rather have them inactive or disabled until the link's target is published.
I tried broken link checker but it doesn't work.
Regards
Something like this should work. Ideally you would have some way to change the wrapping class once you know all the links work, so that this check doesn't keep running forever. Replace #content and www.yourdomain.com with the appropriate values. I'm also assuming that you already have jQuery loaded, since this is a WordPress site. If you're using ES6, you can convert to let/const and arrow functions if you want.
jQuery(function ($) {
$(document).ready(function() {
$('#content').find('a[href*="www.yourdomain.com"]').each(function () { checkLinkStatus(this); });
});
function checkLinkStatus(linkObject) {
var link = $(linkObject).attr('href');
$.ajax({
type: 'HEAD',
url: link,
success: function () {
// page exists
},
error: function () {
$(linkObject).attr('href', '#');
}
});
}
});
My current analytics setup looks like this:
this.context.router.listen(location => {
analytics.page(location.pathname, {
title: document.title,
url: `${config.HOST_URL}${this.context.router.createHref(location)}`,
path: this.context.router.createHref(location),
referrer: document.referrer,
state: location.state,
});
});
The amount of data in the location object is pretty minimal. The question is, how do I get information about an item that is loaded in a componentDidMount block into the page tracking?
In this particular instance, I am trying to add information about an artwork that a user is looking at into Google analytics as a custom dimension. The link below shows how it would normally be added.
https://support.google.com/analytics/answer/2709828?hl=en#example-hit
I'm using react-router v2
From the information given, I'd say your best bet is this: on the pages where you want to add more data, put a condition in the listener so it does NOT log when loading those pages. Then, in the componentDidMount where you have the data, call the analytics page tracking yourself.
Basically, just override it for the pages that need more data.
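A minimal sketch of that idea, with hypothetical names (SELF_TRACKED_PREFIXES, dimension1, this.props.artwork are assumptions, not from your code):

```javascript
// Path prefixes whose pages log their own (richer) page view in
// componentDidMount, so the global listener should skip them.
const SELF_TRACKED_PREFIXES = ['/artwork/'];

function shouldAutoTrack(pathname) {
  return !SELF_TRACKED_PREFIXES.some(function (prefix) {
    return pathname.indexOf(prefix) === 0;
  });
}

// In the listener:
// this.context.router.listen(location => {
//   if (!shouldAutoTrack(location.pathname)) return;
//   analytics.page(location.pathname, { /* ...as before... */ });
// });

// In the artwork component, once the item has loaded (custom dimension
// per the GA docs linked above):
// componentDidMount() {
//   analytics.page(window.location.pathname, {
//     title: document.title,
//     dimension1: this.props.artwork.id,
//   });
// }
```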
I am trying to export specific pages' child pages into some kind of list. The reason is that I am upgrading a web site from EPiServer 5 to EPiServer 7. The links should be updated as well: I want to redirect all the current child pages to the new parent link. And I don't want to use a catch-all, since I want to show the 404 page when the child page really doesn't exist.
Example:
Parent page, current web:
http://localhost/newssection/
Parent page, new web:
http://localhost/news/
All child pages from the current web should redirect 301 to the parent page for the new web.
Example of child pages:
http://localhost/newssection/what-is-going-on/ should redirect 301 to http://localhost/news/ and so on...
I want to build a script that generates all these redirects/rewrites for all the parent pages I select. What is the best way to achieve this?
Thanks in advance,
Mike
There are obviously nearly endless ways of achieving this, but two viable options are:
1) Use rewrite rules in IIS (if you can use a common pattern or have a limited number of redirects)
2) Create a custom redirect, potentially as an HTTP module or an action filter on your page controller base class, and redirect to a URL set on each page
For option 2) you would probably add a new property called something like "LegacyUrl" on your base page type. It would be used to specify which old URL the new page is replacing.
For each page request resulting in a 404, look up the URL among the legacy URLs and redirect with "permanently moved" status code.
I would create a simple page and do something like:
var children = DataFactory.Instance.GetChildren(new PageReference(IdToParentPage));
foreach (var child in children) {
    Response.Write(child.URLSegment + "<br />"); // URLSegment is only the segment for the page, not the full URL
}
And from that, create rewrite rules in web.config, or add them to some 404 module if you have one.
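For reference, a generated rule for the IIS URL Rewrite module in web.config could look like this (a sketch; the rule name and paths match the example URLs above and would be filled in by your script):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="newssection-to-news" stopProcessing="true">
        <!-- Match every child page under the old parent -->
        <match url="^newssection/.*" />
        <!-- 301 to the new parent page -->
        <action type="Redirect" url="/news/" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```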
If you need the full URL you can get it with the following code:
EPiServer.UrlBuilder url = new EPiServer.UrlBuilder(child.LinkURL);
EPiServer.Global.UrlRewriteProvider.ConvertToExternal(url, child.PageLink, System.Text.Encoding.UTF8);
I'm currently on a project where I want to have:
WordPress for easy content management.
AngularJS for the UX (the goal is no page reloads + nice animations between page loads) + further functionality.
And care about SEO.
To that end, I'm using Angular's route module to give the user a smoother experience, and Angular's HTML5 "pretty URLs" mode to hook the page switching (no hashbang, natural links).
I don't want to generate hashbangs, because that is harder to maintain (HTML snapshots with a phantom.js server, etc.) than just letting WordPress generate the content, which it does well.
So my intention was to let AngularJS control the user's navigation, and WordPress generate the content when the user hits F5 and for SEO bots (no JS).
But I can't find a clean and clear solution to this problem, because either the Angular way works or the "PHP" way works, not both.
Any ideas will be welcome ! :)
WordPress already provides you with the wp_ajax_ hook for AJAX requests. (link)
Example:
mysite.com/my-test-page
Wordpress
In this simple case we need our wp_ajax_ hook to retrieve a page by its slug.
One easy way is to use get_page_by_path($page_path, $output, $post_type), to get the page we want where $page_path is the slug.
Then output the page data as JSON: echo json_encode($pageArray); wp_die(); (an admin-ajax handler must echo its response and die, not return it).
AngularJS
Route: Do a simple GET:
.when('/:page_slug', {
  templateUrl: 'views/page.html',
  controller: 'PageController',
  resolve: {
    page: function ($route, $http) {
      return $http.get(wp_ajax_url, {
        params: {
          action: 'the_ajax_hook',
          data: $route.current.params.page_slug
        }
      });
    }
  }
})
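To make the request above concrete: with the params config, $http appends the action and slug as query-string parameters to admin-ajax.php. A small sketch of the resulting URL (wp_ajax_url is assumed to point at /wp-admin/admin-ajax.php):

```javascript
// Hypothetical helper showing the URL the $http.get above produces.
function buildAjaxUrl(base, action, slug) {
  const params = new URLSearchParams({ action: action, data: slug });
  return base + '?' + params.toString();
}

// buildAjaxUrl('/wp-admin/admin-ajax.php', 'the_ajax_hook', 'my-test-page')
// -> '/wp-admin/admin-ajax.php?action=the_ajax_hook&data=my-test-page'
```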
SEO
Google recently announced they are updating Webmaster Tools to show you how a JavaScript-generated site renders and to provide tips on making your site crawlable.
http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html
Apart from that you can use other services to make your site SEO-friendly today:
getseojs.com
brombone.com
prerender.io