I managed to set up my website through the gh-pages repository, but I need to make changes to various elements, such as responsiveness, and the changes are not updating. My page still looks the same as it did before the changes.
What am I doing wrong?
My repository is https://github.com/MannyLerma/portfolio-main
I tried updating, copy-pasting builds and other files, and I've tried merging pull requests and such, but I'm failing to understand where the issue lies. Am I just not being patient enough? I read online that changes usually take about 10 minutes to take effect, but I've waited well past that.
Previously working code that downloads a CSV file from our site now fails. Chrome, Safari and Edge don't display anything helpful except "Blob Blocked", but Firefox shows a stack trace:
Uncaught TypeError: Location.href setter: Access to 'blob:http://oursite.test/7e283bab-e48c-a942-928c-fae0907fdc82' from script denied.
Then comes a stack dump from googletagmanager.
This appears to be a fault in the Tag Manager code introduced in the last couple of weeks.
The fault appears in all browsers and is resolved immediately by commenting out the tag manager. The problem was reported by a customer on the production system, and then reproduced both on staging and locally. The customer advised they had used the export function successfully two weeks ago.
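For context, the failing export probably looks something like the sketch below (function and variable names are assumptions, not taken from the actual site). The `toCsv` helper is pure; the download uses an `<a download>` element with `URL.createObjectURL`, which is generally more robust than assigning the blob URL to `location.href`, the pattern the Firefox error message suggests is being blocked:

```javascript
// Hypothetical reconstruction of a CSV export (names are assumptions).
// Serialize rows of cells into CSV text.
function toCsv(rows) {
  return rows.map(function (row) { return row.join(','); }).join('\n');
}

// Trigger a client-side download via a temporary <a download> element
// instead of setting location.href to the blob URL.
function downloadCsv(rows, filename) {
  var blob = new Blob([toCsv(rows)], { type: 'text/csv' });
  var url = URL.createObjectURL(blob);
  var a = document.createElement('a');
  a.href = url;
  a.download = filename;
  document.body.appendChild(a);
  a.click();
  a.remove();
  URL.revokeObjectURL(url); // release the blob once the click has fired
}
```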
The question, really, is: do Google maintain a public-facing issues log for things like the tag manager?
It's not really about GTM as a library; it's about poor user implementation. It's not up to Google to check for user-introduced conflicts with the rest of the site's functionality.
What you can do is go into GTM and see what has been released in the past two weeks. Inspect everything and look for anything that could interfere with the site's functionality. At the same time, do the opposite: review all the front-end changes introduced during the same time frame by the web-dev team.
The main thing to watch for is un-closured JS deployed in custom HTML tags. Junior GTM implementation specialists like to use the global scope, assigning to global variables, often after the page has loaded, thereby overwriting the front-end's variables.
Sometimes people also deploy minified, un-closured code to the DOM, which grabs short variable names chaotically, with the same effect.
This is probably the easiest and most common way for GTM to break a front-end, though there are certainly many other ways.
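To illustrate the closure point, here is a minimal sketch (the variable names are invented for the example): wrapping a custom HTML tag's code in an IIFE keeps its variables local, so a name like `config` used inside the tag cannot clobber a front-end global of the same name.

```javascript
// The site's own global that a careless tag could overwrite.
var config = { currency: 'USD' };

// Custom HTML tag code wrapped in an IIFE: its local `config` shadows,
// rather than replaces, the site's `config`, so nothing leaks out.
var tagResult = (function () {
  var config = { event: 'export_click' }; // local to the closure
  return config.event;
})();
```

Without the IIFE (and without `var`), the inner assignment would land on the global object and silently break the site.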
If this doesn't help, there's another easy way to debug it: create a new workspace from Default (or whatever is live), go into preview mode and confirm that the issue still happens. Then start disabling the most recently created tags one by one until you pinpoint which one causes the issue.
Let us know what it was.
The solution was to replace the previous tag manager code with the latest recommended snippet.
My WordPress website is suddenly slow to respond when I navigate to another page. For example, if I click a link in the menu, it takes around 10-15 seconds for the website to move to the linked page. However, once the website responds and moves to the page, the content loads fast. So I don't think it's the loading speed of my website itself. Correct me if I am wrong.
Is there any solution for this?
Thanks
It is likely a TTFB (time to first byte) problem: it takes a long time for your server to generate the HTML of the page and send it back in response to the request. Once the HTML has been sent, all the JS files and images load fast enough, as you described; there's just this delay between switching pages.
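You can confirm this yourself from the browser's Navigation Timing data. A small helper (a sketch; the field names come from the `PerformanceNavigationTiming` interface) computes TTFB as the gap between the request being sent and the first response byte arriving:

```javascript
// Compute TTFB in milliseconds from a PerformanceNavigationTiming-like
// entry: time from sending the request to receiving the first byte.
function ttfbMs(entry) {
  return entry.responseStart - entry.requestStart;
}

// In a browser console you could run:
//   ttfbMs(performance.getEntriesByType('navigation')[0])
```

If that number is in the multi-second range while the resources load quickly afterwards, the bottleneck is server-side page generation, not the front-end.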
This can be caused by plenty of factors; I recommend reading an in-depth article to understand all the nuances, such as this one: https://kinsta.com/learn/page-speed/
But in short, the first and surest way to cut down the delay in the situation you describe is page caching. You can use the free https://uk.wordpress.org/plugins/w3-total-cache/ plugin as a first step.
Managed WP hosting with built-in caching would be even better, but prices start from around $30 per month, if that works for you.
Please open this link and follow the steps accordingly
https://www.codeinwp.com/blog/ways-to-speed-up-wordpress/
A client of mine is adding content to his WordPress site. The site is www.airsolid.ca.
He uses "portfolios" to add his different boat models. All seems fine except that when we click "all boat models", the section that lists all items takes up to 30 seconds to load.
Here is the direct link to the section:
http://www.airsolid.ca/bateaux/
Any idea what I could change to make it load in under 3-5 seconds? I have a feeling it loads all the images at once... and since there are many, it takes far too much time. Ironically, he doesn't even want the images to show when he lists them.
Use https://tools.pingdom.com to monitor what's loading, how long it's loading, etc. You can see if images or scripts are holding it up.
Since the screen is white while it's loading, I'd imagine it's a query issue. You can use the Query Monitor plugin to help determine the cause.
I used pingdom and got these results: https://tools.pingdom.com/#!/ekJpVY/http://www.airsolid.ca/bateaux/
It had 1 request until the ~22-second mark, at which point the CSS/JS/image requests came in, which means it's not being held up by scripts or images.
The page is only ~2 MB, which means it's not loading all the images either. I'd start with Query Monitor; it's definitely something server-side, probably a faulty WP_Query or another issue in a PHP loop.
I have recently developed four websites in WordPress using the rtpanel theme framework.
When I put the websites live, I noticed that on a couple of them, clicking through to the blog page takes up to 25 seconds to load (see link).
http://tools.pingdom.com/fpt/#!/brQN7J/www.exactabacussoftware.com/blog
Can anyone tell me what is causing this long wait? If I change my theme back to Twenty Twelve it loads fine, and the same applies to the other sites, e.g.: http://www.exactabacusfulfilment.com/blog
Both examples are running on the same server using the same theme, but I cannot work out what is slowing the software site down so much.
Any help would be greatly appreciated!
It seems that PHP execution is taking a lot of time. The analysis of your site shows that it takes around 22 seconds to generate the HTML.
There could be a few reasons why PHP execution is taking so long:
You have activated a plugin that is slowing your site down.
A theme component may be causing your site to load slowly.
Your database queries are taking too long to execute. (If this is the case, check why, and consider enabling Memcache to cache MySQL query results.)
Install and activate P3 (Plugin Performance Profiler) on the website to find out which component is dragging the site's performance down. To debug in more detail, you can also try the Query Monitor plugin.
Once the issue is tracked down and resolved, you may activate PHP APC on your server, provided you are not making frequent changes to the code.
There are a couple of really easy ways to investigate the reasons behind slow loads:
Google PageSpeed. Basically, you feed it the URL and you get a list of suggestions for improving the page load speed. Here is the URL for testing your site:
http://developers.google.com/speed/pagespeed/insights/?url=www.exactabacussoftware.com%2Fblog
The first thing I notice is a very high server response time (in my tests, between 0.5 s and 1.6 s). This means it takes at least 0.5 s plus download time to load every image, script, and so on; with 100 resources fetched sequentially, that alone would be around 50 seconds, which is a lot (browsers parallelize requests, but the per-request overhead still adds up). So you might want to look for hosting alternatives.
Google PageSpeed will give you more details on what could be fixed and improved; look through it and try to resolve those issues. It should improve your speed quite a bit.
Another option is Google Chrome Developer Tools, Firefox Firebug or similar tools. Just open the Network tab and reload the page; you will be able to see how long each resource on your page takes to load.
Building on that.
It looks like there are 2 seconds of latency before your server even answers the first GET request, followed by another 2 seconds that contain 84 more GET requests.
Now, a 4-second load time isn't awful, but if you want it to go faster, the best things you can do are:
1. Combine all of your JavaScript files into one file, making sure jQuery and other dependencies come first.
2. Combine all of your PNGs into one file (a sprite) or, alternatively, Base64-encode them all.
3. A lot of those PNGs could be compressed further: 5 KB for an icon is a bit big, and 66 KB for an image is certainly too big.
4. Do the same with your CSS: combine it all, and there will be fewer requests.
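Steps 1 and 4 can be automated in the build. A minimal Gruntfile sketch (assuming grunt-contrib-concat and grunt-contrib-uglify are installed; all paths are placeholders for your own layout):

```javascript
// Hypothetical Gruntfile fragment: bundle all JS into one file and all
// CSS into one file, so the page makes two requests instead of dozens.
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      js: {
        // jQuery and other dependencies must come first in the bundle.
        src: ['js/vendor/jquery.js', 'js/vendor/*.js', 'js/app/*.js'],
        dest: 'build/app.js'
      },
      css: {
        src: ['css/*.css'],
        dest: 'build/app.css'
      }
    },
    uglify: {
      // Minify the combined JS bundle.
      js: { src: 'build/app.js', dest: 'build/app.min.js' }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('default', ['concat', 'uglify']);
};
```

Whatever tool you use, the principle is the same: fewer requests means fewer round trips paying that per-request latency.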
The default behavior of grunt-rev is to hash the content of the given resource and put the hash in the file name, so /images/sprites/rewards.png becomes /images/sprites/f936712a.rewards.png.
I don't want that. I've got several revisions being served simultaneously, and I want to be able to delete old revs at will, so I'd like to rename it to /2013-06-10-e75607/images/sprites/rewards.png (where e75607 is the commit id for the whole revision and has nothing to do with the individual file).
Is this a possibility with grunt-rev and grunt-usemin? Is there an equivalent tool that would do it?
Edit
Several people have asked me why not use the hash of each file. Let me explain:
In old-fashioned websites, in response to pretty much any user input, the browser reloads, the back-end generates a brand-new page from the input, and the HTML is sent to the browser, which displays the page. It's a slow, CPU- and bandwidth-intensive process, but it does have one advantage: all the assets that are ever loaded are loaded together, within a few seconds.
In a more modern website, the page that is loaded describes the whole application. When the user makes some input, the JavaScript on the page renders new DOM elements and loads new assets as needed. The page as a whole is rarely or never reloaded. As a consequence the site is enormously more responsive (and much easier to develop, more secure, cheaper to run, and so on), but it has a corresponding disadvantage: an asset might be loaded hours or days after the page is loaded.
Say you visit the site at 10 am, when the site is running version cd1d0906. At 10:30, the site is upgraded to version 4b571377. At 11 am, you press a button that causes a popup rendered with an image called sprite.png. Obviously, you need the cd1d0906 version of sprite.png, not the 4b571377 version.
Therefore, a well-maintained site will continue to offer old versions of all assets for several days after the versions have changed over. The easiest way to do that is to keep all the assets in a directory that is named after the version.
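If grunt-rev itself doesn't support this, one workaround is to sidestep it and build the versioned directory yourself. A sketch using grunt-contrib-copy (assumed to be installed; the `dist/` and `build/` paths are placeholders, and rewriting asset references in the HTML would still need usemin or a search-and-replace step):

```javascript
// Hypothetical Gruntfile fragment: copy built assets into a directory
// named after the date and commit id, e.g. build/2013-06-10-e75607/...
module.exports = function (grunt) {
  // Short commit id of the whole revision, read from git.
  var commit = require('child_process')
    .execSync('git rev-parse --short HEAD').toString().trim();
  var version = new Date().toISOString().slice(0, 10) + '-' + commit;

  grunt.initConfig({
    copy: {
      release: {
        expand: true,        // preserve the directory tree under cwd
        cwd: 'dist/',
        src: ['**'],
        dest: 'build/' + version + '/'
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-copy');
  grunt.registerTask('release', ['copy:release']);
};
```

Old revisions can then be deleted at will simply by removing their directories.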
The complaint that this "unnecessarily" discards cache entries for unchanged files is rather unconvincing. Most deployed assets are CSS files, JS files, and sprites, all of which are compilations of many smaller files. It's a rare deployment that doesn't change at least one CSS file, one JS file, and one sprited image. The cache is rarely valuable after a version change.