I'd like to know the amount of data that is going over the wire when someone is first opening my Meteor app.
Pingdom is useful but I'd like something I can run locally on my own machine.
Ideally I'd also like to see a breakdown per package so I can decide whether I want to keep or ditch a specific package.
You can just use your browser's developer tools. For example, in Chrome, open the developer tools (right-click -> Inspect Element) and go to the Network tab. Refresh and you'll see all of the JavaScript files and their sizes, one per package. You can filter for Scripts only and then sort by size (you may have to do a full refresh to clear the cache for this to work). jQuery will probably be one of the biggest packages, if not the biggest.
You can also run Meteor with the --production flag, and the server will send one concatenated and minified JS file. This is much smaller than the total size of the individual package files and shows you the actual amount of data that will be sent in production.
You also need to be aware of how much data you are publishing/subscribing. If you add the meteorhacks:fast-render package, the initial published set of data is added as a script tag to the HTML. You should also keep an eye on how much data you publish while the user browses and uses your application; something like Kadira is helpful with that.
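For illustration, here is a minimal sketch of trimming a publication so less data crosses the wire (and ends up in fast-render's inline script tag); the collection and field names are hypothetical:

// server/publications.js
// Publish only the fields the client actually renders, and cap the count,
// so the initial data set stays small. "Posts" is a hypothetical collection.
Meteor.publish('posts.list', function () {
  return Posts.find(
    {},
    { fields: { title: 1, createdAt: 1 }, limit: 20 }
  );
});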
Previously working code that downloads a CSV file from our site now fails. Chrome, Safari and Edge don't display anything helpful except "Blob Blocked", but Firefox shows a stack trace:
Uncaught TypeError: Location.href setter: Access to 'blob:http://oursite.test/7e283bab-e48c-a942-928c-fae0907fdc82' from script denied.
Then follows a stack dump from googletagmanager.
This appears to be a fault in the tagmanager code introduced in the last couple of weeks.
The fault appears in all browsers and is resolved immediately by commenting out the tag manager. The problem was reported by a customer on the production system, and then reproduced both on staging and locally. The customer advised that they had used the export function successfully two weeks ago.
The question really is: do Google maintain a public-facing issues log for things like the tag manager?
It's not really about GTM as a library; it's about poor user implementation. It's not up to Google to check for user-introduced conflicts with the rest of the site's functionality.
What you could do is go to GTM and see what has been released in the past two weeks. Inspect things and look for anything that could interfere with the site's functionality. At the same time, do the opposite: review all the front-end changes introduced during this time frame by the web-dev team.
The main thing to watch for is unclosured JS deployed in custom HTML tags. Junior GTM implementation specialists like to use the global space, claiming global variables, often after the page has loaded, thus overwriting the front-end's variables. Sometimes people also deploy minified, unclosured code to the DOM, which chaotically grabs short variable names, to the same effect.
This is likely the easiest and most common way for GTM to break a front-end, though there are definitely many other ways to do so.
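To illustrate (this example is mine, not from the original answer): wrapping custom HTML tag code in an IIFE keeps its variables out of the global scope, so nothing on the page gets clobbered.

// Unclosured: "status" is created on window and can overwrite
// a front-end global of the same name.
var status = 'loaded';

// Closured: the same code wrapped in an IIFE keeps every var local.
(function () {
  var status = 'loaded';
  // ... the rest of the tag's logic ...
})();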
If this doesn't help, there's another easy way to debug it: make a new workspace from Default (or whatever is live), go into preview mode and confirm that the issue still happens. Now start disabling the most recently created tags one by one to pinpoint which one causes the issue.
Let us know what it was.
The solution was to replace the previous tag manager code with the latest recommended snippet.
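For reference, the currently recommended container snippet (as shown in GTM's install instructions; GTM-XXXX stands in for your container id) looks like this:

<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXX');</script>
<!-- End Google Tag Manager -->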
I was just about to set up a second GA property that I would implement in my staging environment. I figured I'd do the same with GTM and just export/import containers from staging to production whenever necessary. I also figured I'd populate the Tracking-ID dynamically based on hostname. No big deal.
But then I stumbled across Environments for GTM. The first bit I read said that using this feature would solve the problem of moving code across environments. To me this implied that the snippet code would remain the same in all environments and that there would be no need to change any values (dynamically, via build script, manually or otherwise)... that GTM was smart enough to deploy the right container(s) to the right place(s) at the right time(s). That sounds great, I'll do it.
Now that I'm getting into that process, I'm learning (if I'm understanding correctly) that each environment does in fact have to have a separate snippet. So now I'm back where I started: having to dynamically add values to the snippets based on domain name (which determines staging or production). Without that, every time the file containing the snippet is pushed between environments, it will contain the wrong values. I guess using Environments still does away with the export/import process for containers (which, don't get me wrong, is nice), but having to change those values is a pain.
Is this the long and short of it - do I have this right? Is there any way around having to change code in the web page (or template) by doing it somehow through GTM instead? I'm guessing not, since the snippet is the base of GTM's functionality, but I figured I'd ask.
Further complicating things, I was planning to use a WordPress plugin, Google Tag Manager for WordPress, to add the GTM code. In this case, all I can change is the Tracking-ID, which actually stays the same... it's the other values, which do change, that I have no control over with the plugin. Is anyone aware of a way to inject new values into the snippet that the plugin writes to the page?
The snippet for an environment has the same GTM id, but a token for the environment name is attached to the gtm.js URL. If you use any kind of build system, it should be possible to set or change the token according to the server you deploy to. Personally, I am not convinced that environments are really useful.
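As a hedged illustration of what that could look like (hostnames and token values are made up, and the parameter names are as I recall them from the environment snippets):

// Build-time substitution: pick the environment token for the deploy
// target and splice it into the container URL.
var envTokens = {
  'staging.example.com': '&gtm_auth=STAGING_TOKEN&gtm_preview=env-5&gtm_cookies_win=x',
  'www.example.com': '' // the live environment needs no extra token
};
var deployHost = process.env.DEPLOY_HOST || 'www.example.com';
var gtmUrl = 'https://www.googletagmanager.com/gtm.js?id=GTM-XXXX' +
  (envTokens[deployHost] || '');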
If all you need is different values for tracking IDs, you can implement a lookup table variable that takes the hostname variable as input and returns the respective tracking ID for live or staging. Then use that instead of hardcoding the tracking ID into your tag.
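If you prefer code over the lookup table UI, a Custom JavaScript variable in GTM does the same job; the property IDs and hostname below are placeholders:

// GTM Custom JavaScript variable: resolve the tracking ID per hostname.
function () {
  // {{Page Hostname}} is GTM's built-in hostname variable.
  var host = {{Page Hostname}};
  return host === 'staging.example.com'
    ? 'UA-XXXXXXX-2'  // staging property (placeholder)
    : 'UA-XXXXXXX-1'; // live property (placeholder)
}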
Specifically, how does it manage to serve different versions of the same site, with no access to the server or anything, just a script in the head?
All client-side testing platforms work by executing JS on top of the existing HTML of the page to apply the changes.
Basically, these platforms provide a WYSIWYG editor that allows you to make changes on any site. These changes can range from simple color/text/layout tweaks to more complex changes where you modify the HTML content of an element altogether.
Every change made via the visual editor generates corresponding JS code that gets executed on the fly when someone participates in one of the variants.
To summarize, the flow will be:
Inside the platform
Place the platform's JS snippet on the site (it should be inside the head tag to avoid any flickering).
Create the test and the variants in the platform using the visual editor or by writing your own code inside the code editor.
Run the test.
On the website
The user visits the site and the platform's JS snippet executes.
The snippet connects to the nearest CDN and brings back the test configuration along with the platform's library.
The library executes quickly and applies the changes to the respective elements by firing the JS generated during variant creation.
The library sends a hit to track the user, along with the variant info, into the platform's reporting.
You get the stats in real time and can see which variant performed best.
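A stripped-down sketch of those client-side mechanics (the config endpoint, payload shape and tracking URL are all invented for illustration):

// Fetch the test config, bucket the user into a variant,
// apply the variant's DOM change, and report the assignment.
fetch('https://cdn.testing-platform.example/config.json')
  .then(function (res) { return res.json(); })
  .then(function (config) {
    var variant = config.variants[Math.floor(Math.random() * config.variants.length)];
    document.querySelectorAll(variant.selector).forEach(function (el) {
      el.textContent = variant.text; // e.g. swap the button copy
    });
    // Tell the reporting backend which variant this user saw.
    navigator.sendBeacon(config.trackUrl, JSON.stringify({ variant: variant.id }));
  });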
Whenever we make changes to the CSS, it generally takes 24 hours for those changes to show up on my site. I have tried clearing the server cache and the browser cache, but that doesn't help either. Is there any other way to make CSS changes appear immediately after updating?
It happens in all browsers. When I check in the browser, I can access my CSS file via two paths. For example, I store my CSS in a folder named "Cssfolder" and the file is called 135.css.
When I access the two folder paths, Cssfolder/135.css and cssfolder/135.css, one path shows me the latest CSS whereas the other shows me the old CSS. Notice the "c" is capital in one path and lowercase in the other.
Thanks.
I've found this to be a pretty common problem in a lot of my projects. I would suggest two things...
If it's just an app that you are working on you can use the CSS Cachebuster during development.
Following the idea behind the Cachebuster, I have found that adding the timestamp of the CSS file as a query string on the CSS link will often help tell the browser that the file is different... something like whatever.css?12212009035543
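A minimal sketch of generating such a link at render time (a Node example; the file name comes from the question, everything else is illustrative):

// Use the stylesheet's last-modified time as the cache-busting query string.
var fs = require('fs');
var mtime = fs.statSync('Cssfolder/135.css').mtime.getTime();
var link = '<link rel="stylesheet" href="/Cssfolder/135.css?' + mtime + '">';
// Each new mtime makes a brand-new URL, so stale cached copies are never reused.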
You might want to use a monitoring tool, like Live HTTP Headers for Firefox, to see the requests and responses to and from the server. This usually solves a lot of problems for me. Take a look at the "Expires" headers and conditional requests (like "If-Modified-Since"). That said, also compare server and client local times and timezones: it might be that they differ significantly and conditional GET requests only "seem to be" handled correctly because of future or otherwise mangled timestamps.
You can force the current CSS to load directly from the server by appending a random unique value to the URL, like http://example.com/Cssfolder/135.css?983274928374 and http://example.com/cssfolder/135.css?08973249827. There's no way this would ever get cached unless you use the same random value twice.
This way you learn where to look further for the solution to your problem: at the server, at the ISP/a proxy, or in your browser.
You really need to see whether this is server side or client side. If the server is still serving the old CSS then clearly you've got no chance on the client side.
I've occasionally seen cases where I've had to open the CSS file itself in the browser, and the next time I went to the real page, it used that new CSS. Usually just hitting refresh does it.
Do you have any web caches like Akamai involved anywhere?
If you try to go to the CSS page from a computer which has never seen the old version, which version does it show?
EDIT: Changed answer to reflect edits in question.
I dealt with this issue in the past and ended up writing an HttpModule to handle it.
It's pretty simple: it finds all script/CSS links in the head tag (they now need to have runat=server) and appends the assembly version number to the link, in the same way as Tim K describes. This way I'm sure my clients always fetch the newest CSS/scripts when my app is updated in production, and I never have to deal with this issue again.
Maybe it's an Internet Service Provider cache, as in this case?
I was perplexed by this issue then someone said Ctrl+F5. Worked for me :)
When I am developing and need to be sure that I am seeing changes as I work, I stick the CSS in the page, i.e.
<style type="text/css">
/* your css */
</style>
Or you could keep changing the name of the CSS file itself; not very useful in a production environment, but perhaps okay while developing.
I know it doesn't solve the problem, but for developing it is okay.
I have an ASP.Net application which has a desired feature: users would like to be able to take a screenshot. While I know this can be simulated, it would be really great to have a way to take a URL (or the currently rendered page) and turn it into an image which can be stored on the server.
Is this crazy? Is there a way to do it? If so, any references?
I can tell you right now that there is no way to do it from inside the browser, nor should there be. Imagine that your page embeds GMail in an iframe. You could then steal a screenshot of the person's GMail inbox!
This could be made safe by having the browser "black out" all iframes and embeds that would violate cross-domain restrictions.
You could certainly write an extension to do this, but be aware of the security considerations outlined above.
Update: You can use a canvas utility function to get a screenshot of a page on the same origin as your code. There's even a lib to allow you to do this: http://experiments.hertzen.com/jsfeedback/
You can find other possible answers here: Using HTML5/Canvas/JavaScript to take screenshots
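For completeness, a minimal usage sketch of that lib, html2canvas (recent versions return a Promise; the upload endpoint is hypothetical):

// Render the current page into a canvas, then grab it as a PNG data URL.
html2canvas(document.body).then(function (canvas) {
  var dataUrl = canvas.toDataURL('image/png');
  // Hypothetical endpoint: post the image to the server for storage.
  fetch('/api/screenshots', { method: 'POST', body: dataUrl });
});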
Browsershots has an XML-RPC interface and available source code (in Python).
I used the free assembly UrlScreenshot.dll which you can download here.
Works nicely!
There is also WebSiteScreenShot but it's not free.
You could try a browser plugin like IE7 Pro for Internet Explorer, which allows you to save a screenshot of the current site to a file on disk. I'm sure there is a comparable plugin for Firefox out there as well.
If you want to do something like you described, you need to call an external process that prints the IE output, as described here.
Why don't you take another approach?
If users need to be able to view the same content over again, then that sounds like a business requirement for your application, and so you should be building it into your application.
Structure the URL so that when the same user (assuming you have sessions and the application shows different things to different users) visits the same URL, they always see same thing. They can then bookmark the URL locally, or you can even have an application feature that saves it in a user profile.
Part of this would mean making "clean urls", eg, site.com/view/whatever-information-needed-here.
If you are dealing with time-based data, where it changes as it gets older, there are a couple of possible approaches.
If your data is not changing on a regular basis, then you could simply make the "current" page date-stamped, e.g., site.com/view/2008-10-20 (adding hour/minute/second as appropriate).
If it is refreshing and/or updating more regularly, have the "current" page as site.com/view, but allow specifying the exact time afterwards. In this case, you'd have to have a "link to this page" type of function, which would link to the permanent URL with the full date/time. Look to Google Maps for inspiration here: if you scroll across a map, you can always click "link to here" and it will provide a link that includes the GPS coordinates, objects on the map, etc. In that case it's not a very friendly URL, but it does work quite well. :)