Xamarin.Forms MasterDetailPage menu freezing - xamarin.forms

I started a project following the example of "https://github.com/jamesmontemagno/Hanselman.Forms", but when I begin to navigate between the menu items I get the following messages:
[Mono] [0xba013240] hill climbing, change max number of threads 4
[Mono] [0xba894098] hill climbing, change max number of threads 20
and I also noticed that the menu freezes.

Hill climbing is the thread pool increasing its number of threads to improve throughput when it is under heavy load.
All that is happening here is that some intensive work is going on in the background. You can look at the code, debug to where it is performing a lot of work, and optimize those functions.
On Android I also find that large images will cause this issue rather quickly. Reduce the size of the images (it must be the dimensions, not the quality, as lowering quality alone doesn't make any difference).

Related

How do I improve my Largest Contentful Paint when all that remains to be improved are the optimized CSS files?

I'm optimizing my website and attempting to improve the Largest Contentful Paint, but the only item that appears to require work is the set of optimized CSS files made with W3Speedster; here is a link to the Google speed test I performed. I want the LCP to be less than 2.5 seconds.
Any advice would be greatly welcomed; thank you!
That does not mean it takes over 3 seconds to render the content.
The rendering of your page from start render to document complete is only about 0.100 seconds.
Your server is your biggest problem: it's slow.
The best thing you could do for your pages is either eliminate shareaholic.js or get it to load sooner. There appears to be some sort of lazy load; it does not start loading until 1.8 seconds in, which is likely the reason page rendering starts so late. That, and too many font and CSS files: the browser cannot begin rendering until it has all the CSS and font files. shareaholic.js is your 15th file being loaded, and there are 5 images being prefetched before this JS.
You need to get the page to start rendering sooner. You could join some of the CSS files together to reduce the number of files being retrieved.
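To illustrate the joining idea, here is a minimal sketch of a Node build step; the stylesheet names are hypothetical, and a real setup would also minify the combined output:

    // concat-css.js - a sketch of a build step that joins stylesheets.
    // The file names here are hypothetical; substitute your own.
    const fs = require('fs');

    const files = ['reset.css', 'layout.css', 'theme.css'];
    const combined = files
      .map(function (f) { return '/* ' + f + ' */\n' + fs.readFileSync(f, 'utf8'); })
      .join('\n');

    fs.writeFileSync('combined.css', combined);
    console.log('Wrote combined.css, ' + Buffer.byteLength(combined) + ' bytes');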
It's just your ads. :)
You'll most likely not be able to make it any faster. Maybe a little bit. I reduced it from 4 seconds on Cloudways Nginx servers to under 2 seconds on WPX Apache servers.
The problem is that when your ads start loading, a bidding auction takes place first; only after it completes can the ads themselves load. So it will always take time.
If I were you, I'd change the host to something like WPX; they will also help you make the site as optimized as possible with ads. If you don't change the host, I'd focus on the Cumulative Layout Shift (CLS) instead: it's a much bigger factor and easier to do something about.

Website with a lot of animate gifs runs high CPU

I'm having trouble finding a solution to this problem. I have a site much like http://giphy.com that runs a lot of animated GIFs. The site currently runs at a high CPU of 40-50%. I need to find a solution to lower the CPU usage without removing images, because the images are chosen by the users. The site can have between 20 and 30 small animated GIFs at a time. Any help would be greatly appreciated, thanks
Unless you can combine all the GIFs inside a single file and display it as one, you're out of luck.

Size, resolution and quality recommendation for images

I'm looking for good articles around image resolutions, size and quality for web pages, especially around how this affects web sites currently.
I'm working on a web site for a client. As an honours graduate in arts and design, the client is insistent that her 7 MB - 10 MB images are suitable for her web site, totalling almost 400 MB. I've tried arguing bandwidth limitations and performance, but these arguments are not holding ground.
The usual guidance for web images is 72 dpi, dimensions no larger than a standard screen resolution (1024x768), and no more than 1 MB in size (which is already too large in my opinion). My argument is that loading 7 MB+ files into a gallery on page load will seriously hinder the user's experience if they have to wait a long time for 7 - 10 images to stream into the cache before the page finishes loading, and even testing this with lazy-loading plug-ins (we don't want to go Flash) and late loading showed performance penalties.
Does anyone on here have any recommendations around image size, resolution and quality? We don't want to lose the HD quality of the images when users navigate the gallery (obviously we'll have to thumbnail them first).
I have read guidelines before (when we still used 1 Mbps connections or less) and have been following these until now:
High-resolution images should not be bigger than 1.5 - 2 MB; at that size a single image is bigger than the rest of the page content itself. Try checking http://deviantart.com to see how they place big photos on their site, and check the image properties using the EXIF data if any.
Dimensions should be small enough to be viewable in the browser (and avoid scrolling).
Compression is to be tested case by case; no setting is the same for everyone. High quality with high compression and no visible quality loss is standard practice in web design.
JPEG is best for photos, PNG for the layout, and GIF for icons.
Try loading images in the background when the browser is idle using JavaScript; that way, they are in the cache before the user knows it (see the sketch after this list).
As for the page design itself, avoid heavy graphics so the site is fast and the images are the only thing the user waits for.
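To illustrate the background-loading tip from the list above, here is a minimal JavaScript sketch; the image URLs are hypothetical:

    // A sketch of idle-time preloading. The URLs are hypothetical;
    // once fetched, the images sit in the browser cache.
    var urls = ['photos/1.jpg', 'photos/2.jpg', 'photos/3.jpg'];

    function preload(list) {
      for (var i = 0; i < list.length; i++) {
        var img = new Image();
        img.src = list[i]; // starts the download; no need to add to the DOM
      }
    }

    // Wait for the page's own assets first, then fill the cache.
    window.addEventListener('load', function () { preload(urls); });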
If you really boil it down you don't have a choice.
You are talking about HUGE file sizes which are not realistic.
You need to serve a smaller version first. After that you can progressively download versions with increased quality, or offer the full image on mouseover or click.
Some general guidelines:
Thumbnails (of course)
Offer multiple image sizes (small, medium, large). While I understand the UX implications of giant images, some people do have fast connections and large displays and/or will be willing to wait for a high-resolution version. But it shouldn't be the only option.
Try different compression levels to see what works best for different sizes. Using one compression level across the board doesn't always work. Again (depending on the source material), there may be a need for near-lossless compression at the high end. For example, images for print, CAD drawings with fine detail, etc.
Use sequential loading techniques if applicable. For example, if you have ten images to load (optimized or not), make sure that the first visible one is the first one actually requested from the server.
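Here is a minimal JavaScript sketch of that sequential-loading idea; the URLs and the container id are hypothetical:

    // A sketch of strictly sequential loading: image N+1 is requested
    // only after image N finishes, so the first visible image is always
    // the first one fetched.
    var urls = ['gallery/1.jpg', 'gallery/2.jpg', 'gallery/3.jpg'];

    function loadInOrder(list, index) {
      if (index >= list.length) return;
      var img = new Image();
      img.onload = img.onerror = function () { loadInOrder(list, index + 1); };
      img.src = list[index];
      document.getElementById('gallery').appendChild(img);
    }

    loadInOrder(urls, 0);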
When it comes down to it, your client is under the impression that being asked to shrink her images represents a 'compromise' that only results in damaging the quality of the image the user receives.
The truth is, of course, that an 8-10MB image is so large that it would take most users many seconds to download, creating a horrible user experience that will increase bounce rates.
Show your client a side-by-side demo of her website loading a handful of web-optimized images, and show her a site loading 8-10MB images, then let her decide. Ultimately, your job as a professional is to assist her in making good choices, but she's free to make bad ones if she insists upon it (it's her brand, money, and right).
Something else you can potentially do is detect the size of the window and load larger images if the user is on an ultra-high-resolution monitor or if the window appears to be especially large.
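A minimal JavaScript sketch of that detection, with a hypothetical breakpoint and file-naming scheme (modern markup would usually use srcset/sizes instead):

    // A sketch of picking an image variant from window size and pixel
    // density. Breakpoint and file names are hypothetical.
    function pickImageUrl(base) {
      var wide = window.innerWidth >= 1600;
      var hidpi = (window.devicePixelRatio || 1) > 1.5;
      return base + (wide || hidpi ? '-large.jpg' : '-medium.jpg');
    }

    document.getElementById('hero').src = pickImageUrl('/images/hero');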
Best of luck!

Page Performance in IE7 with a large page

OK, so I'm writing a fairly complex ASP.NET page that has quite a bit of JavaScript attached to it. The problem is the page has a lot going on, but the browser just acts unresponsive a lot of the time and lags, while the JavaScript itself seems to perform fine.
In this page, I send down an array of available items for the user to select from. When this list grows to 1000+ items, the page just sucks, for lack of better words. If I don't have that many items to select from, the page acts fine. The JavaScript performance is OK, but the page lags: the scroll bars are laggy, and it just feels horrible. Of course none of this happens in Chrome or Firefox.
To give you a little more perspective on the issue: the page has about 150 KB of unminified, uncompressed CSS, about 10,000 lines of JavaScript (including frameworks, controls, and business rules specific to the page), and the array object, saved to a text document, is about 200 KB.
Any help on the matter would be greatly appreciated, as I'm about five months into trying to make this faster...
One of the Yahoo performance rules is "Reduce the number of DOM elements". They say this for a reason.
When you start to go into the range of "thousands" of DOM elements, IE bogs down pretty rapidly. Every interaction with the page becomes slow. The only "solution" is to use fewer DOM elements.
For example, I recently made a web app containing 4 grids of 100 rows each with around 10 columns, all visible at the same time. Those 4,000 cells were making IE really slow. I solved this by using a buffered view grid that only renders the visible rows and removes the rows outside of the visible scroll area from the DOM (using the ExtJS grid, if it interests you).
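The core of that buffered-view idea, as a minimal standalone sketch; the element ids, row height, and record shape are hypothetical (ExtJS handles all of this for you):

    // Keep every record in a plain array, but only create DOM rows for
    // the slice that is scrolled into view. Assumed CSS:
    //   #viewport { overflow: auto; position: relative; }
    //   #rows { position: absolute; left: 0; right: 0; }
    var ROW_HEIGHT = 24; // px, fixed so the math stays simple
    var records = [];
    for (var i = 0; i < 1000; i++) records.push({ label: 'Item ' + i });

    var viewport = document.getElementById('viewport'); // scrollable div
    var spacer = document.getElementById('spacer');     // gives full height
    var rows = document.getElementById('rows');         // holds live rows

    spacer.style.height = (records.length * ROW_HEIGHT) + 'px';

    function render() {
      var first = Math.floor(viewport.scrollTop / ROW_HEIGHT);
      var visible = Math.ceil(viewport.clientHeight / ROW_HEIGHT) + 1;
      var html = '';
      for (var i = first; i < first + visible && i < records.length; i++) {
        html += '<div class="row">' + records[i].label + '</div>';
      }
      rows.style.top = (first * ROW_HEIGHT) + 'px'; // keep slice aligned
      rows.innerHTML = html;
    }

    viewport.onscroll = render;
    render();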
Of course none of this happens in Chrome or Firefox.
The bane of my little web app developer existence once pages get this big. There's usually no answer other than to simplify the page.
The concepts behind pagination apply to other application pieces. Active Directory won't display every record in a single list once it's large--and it's a desktop app.
Cut it back and then use the interface to get things gradually (usually through JSON requests for me).
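A minimal sketch of that gradual approach; the endpoint and response shape are hypothetical, and XMLHttpRequest is used because fetch() does not exist in IE7 (older IE also needs json2.js for JSON.parse):

    // Fetch the item list in pages instead of sending all 1000+
    // entries at once.
    function loadPage(pageNumber, onDone) {
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/items?page=' + pageNumber + '&size=50', true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          onDone(JSON.parse(xhr.responseText)); // expects a JSON array
        }
      };
      xhr.send();
    }

    loadPage(1, function (items) {
      // render just these 50 items; request the next page on demand
    });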

Sprite/PNG graphics-heavy site, oh my!

I need to create a site that is very graphics-heavy (torn paper backgrounds with transparent shadows over textured graphics, etc.) One way that I was thinking of saving on file size was to drop all my background elements into one PNG. The issue is that this file is now 180k. If I break it up into various GIFs and a couple PNGs then it would be closer to 70k.
Does it really matter? What is "too large" these days for file size? Will anyone notice if the file is 180 or 70k?
If your users have fast access to your site (like, in an intranet), 180k is hardly a problem. If, on the other hand, the site is used by The Generic Older Person With A Humorously Slow Connection, it's probably going to be a problem. If your users use GPRS, but have endless patience, it's probably not going to be a problem. If the site gives out a million dollars to whoever has the patience to wait out the load time, transfer speeds are not an issue. And so on.
What I'm saying is that it really depends on your requirements and constraints. This requires you to know (and subsequently tell us, so we can be more helpful) many things before you can get it close to right.
To avoid those pesky downvotes for very-valid-answers-but-simply-doesn't-please-someone, here's my answer:
180k divided by a standard ADSL modem transfer rate = 180kB / 100kB/s = 1.8s = endurable.
Is there a reason not to use the smaller images? It sounds like you've already broken it up, so why not go with the smaller, faster method?
In purely relative terms, 70k will take only about 39% of the download time that 180k would. If you're expecting high traffic or want fast load times, every bit helps.
You have to compare the time it takes to request all the separate images and the time it takes to download the one large one. The issue is with HTTP requests.
I suggest you run some tests with Google's Firefox extension, Page Speed, to see if there is a huge difference between the large PNG and the separate images.
One benefit I can think of, besides fewer HTTP requests is that your site will load all at once instead of gradually as all the graphics are downloaded. The bottom line however, as Henrik said is that it depends on your requirements.
I'm sure you're aware that splitting into multiple images means additional connections to the server to retrieve them, with associated lag on each, and the additional size of the request and response headers.
Since browsers restrict the number of active connections to each server (browser version dependent) this may end up taking longer than retrieving a single image. The usual workaround to lift the limit is to use a separate "images" server, or a DNS alias that maps to the same host.
And unless you require animation, I'd always recommend PNG over GIF.
Make sure that the site looks fine with images disabled first (alt tags, widths and heights set, correct colours used), and then split the images up into groups. Group all of your buttons into one image if possible (using CSS sprite sheets, as sketched below), and all of the borders into another. Keep large images in separate files (site background, headers).
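A sprite sheet is normally wired up in plain CSS; here is the same idea as a minimal JavaScript sketch, with a hypothetical layout of 32x32 icons in a single row:

    // Every button shares one sprite image; only the background offset
    // changes per icon. Sheet path, icon map, and element id are
    // hypothetical.
    var ICON_SIZE = 32;
    var icons = { save: 0, open: 1, print: 2 }; // column index per icon

    function applyIcon(el, name) {
      el.style.background = 'url(/images/toolbar-sprite.png) no-repeat';
      el.style.backgroundPosition = (-icons[name] * ICON_SIZE) + 'px 0';
      el.style.width = el.style.height = ICON_SIZE + 'px';
    }

    applyIcon(document.getElementById('save-button'), 'save');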
The more images you have, the more the browser can parallelize the requests. However, if you split them up too much, different images will load at different times, making parts of the site pop in. It's a bit of a trade-off, but that's the joy of programming :)
The better your site looks before the images are visible, the less the user will mind the speed of downloading the images.
