How to properly render Latin-2 characters using Google Fonts? - google-font-api

I have to render Slavic letters (đ, č, ć, ...) with a Google Font, but I always get degraded rendering with a fallback font, so the words look really ugly...
How can I encode these letters to get proper rendering? I have tried a few entities without success.

Not sure if this is still relevant, but you need to add &subset=all to the URL from which you load the font. For example:
<link href="http://fonts.googleapis.com/css?family=Open+Sans:400,400italic,600,600italic,700,700italic&subset=all"
rel="stylesheet" type="text/css" />

OK, got it: the letters that were not rendered are simply not defined in the font... That's kind of misleading when almost all the expected letters are defined but a few are missing!
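For reference, the letters in question live in the Latin Extended-A block (U+0100 to U+017F), outside the basic latin subset; a quick Python check (purely illustrative, nothing Google Fonts-specific) shows their code points:

```python
# The default "latin" subset covers roughly U+0000-U+00FF (Basic Latin
# plus Latin-1 Supplement); these letters fall outside that range,
# in Latin Extended-A, so the latin-ext subset must be requested.
for ch in "đčć":
    print(f"{ch} -> U+{ord(ch):04X}")

# đ -> U+0111
# č -> U+010D
# ć -> U+0107
```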

Related

How to load the glyph for a character that Google Fonts does not serve by default?

I'm using "Noto Serif" font for normal text, and it is working fine. The Google Fonts URL is https://fonts.googleapis.com/css?family=Noto+Serif
However, that font face request does not return references for all the characters actually supported by "Noto Serif". It returns only latin, cyrillic, greek, and vietnamese subsets.
If I add the infinity symbol ("∞", U+221E) to the text, it gets displayed by a fallback font, even though "Noto Serif" does have a glyph for that character.
I tried to enforce a subset. I'm not sure which subset would include "∞" — maybe "math", if Google Fonts has such a thing. In any case, the public CSS API does not seem to support the subset parameter (anymore?); it has no effect on the font request.
I see that a JSON API call with some access key supposedly supports parameter "subset". Is that the only way, if it even works for such case?
There is also an API parameter "text". https://fonts.googleapis.com/css?family=Noto+Serif&text=∞ does in fact return the "∞" glyph, however it returns nothing else. Is there a way to still return all the default characters and, in addition, also this "text" character?
I cannot just make both font requests, because the text-limited one comes without unicode-range descriptors, so the two @font-face rules would override each other.
I could add a manual @font-face CSS definition, but that would void the Google Fonts dynamic benefit of serving different replies to different browsers in the format each understands best.
I could list all the characters in the text parameter... resulting in a huge query to fetch the font, and I might still miss some characters.
I could download the TTF font and host it locally, but that would void the benefits of not loading all the font files, containing a lot of glyphs that will never be used.
What is the best way to do this?
It turned out that loading the same font a second time doesn't actually override the first one; browsers seem to be smart enough to merge them correctly.
So, the solution I used is:
<link href="https://fonts.googleapis.com/css2?family=Noto+Serif:ital,wght@0,400;0,700;1,400;1,700&display=block" rel="stylesheet">
<link href="https://fonts.googleapis.com/css2?family=Noto+Serif:ital,wght@0,400;0,700;1,400;1,700&display=block&text=%E2%88%9E" rel="stylesheet">
First, load all the usual characters Google Fonts gives you by default; then load the font a second time, requesting the specific extra characters via the text parameter.
Of course, this works nicely only if you don't need too many of those extra characters...
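The %E2%88%9E in the second URL is just the percent-encoded UTF-8 form of "∞". A small Python sketch (the helper name is my own; the URL shape follows the snippet above) for building such a URL for arbitrary extra characters:

```python
from urllib.parse import quote, quote_plus


def extra_glyphs_url(family: str, chars: str) -> str:
    """Build a Google Fonts CSS2 URL requesting only the given characters."""
    return (
        "https://fonts.googleapis.com/css2"
        f"?family={quote_plus(family)}&display=block&text={quote(chars)}"
    )


print(extra_glyphs_url("Noto Serif", "∞"))
# https://fonts.googleapis.com/css2?family=Noto+Serif&display=block&text=%E2%88%9E
```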

::after content changes to weird characters when compiling less file to css

I have a Less file that contains the line &::after{content: '▼';}, but when it is compiled to CSS with the Less compiler (lessc) I get &::after{content: 'Ôû╝';} in the CSS file.
On the website, these weird characters are displayed instead of the down arrow.
How can I keep this content unchanged through the compilation process?
The code you need to replace this with is:
content: '\25BC';
Not every toolchain will preserve the literal character, but the escape code will survive every time.
If you want to use the character "▼" directly, you just need to save and open your file using the same agreed-upon encoding; otherwise you get the garbled symbols. The best encoding to use is UTF-8.
You can do this in your HTML
<head>
<meta charset="UTF-8">
</head>
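Two quick Python checks (illustrative only, not part of lessc) make both points concrete: the escape code is just the character's hexadecimal code point, and 'Ôû╝' is exactly what you get when the UTF-8 bytes of '▼' are reinterpreted in a legacy console encoding such as code page 850:

```python
# The CSS escape is the character's Unicode code point in hex.
print(f"\\{ord('▼'):04X}")  # \25BC

# The mojibake: the three UTF-8 bytes of '▼' misread as cp850.
garbled = "▼".encode("utf-8").decode("cp850")
print(garbled)  # Ôû╝
```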

CSS Text transform character

I am using text-transform: lowercase; to lowercase my users' posts. But when I use it, characters like "i, ş, ç, ü, ğ" become something else. How can I fix this?
The declaration text-transform: lowercase leaves lowercase letters intact. It is very unlikely that any browser has a problem with this. If you remove the CSS declaration, you will most probably see the letters already as “something else”.
The odds are that the problem is elsewhere, in the transfer of user input to web page content. It is easy to go wrong here, due to character encoding problems. More information about the situation (a URL would be good start, and so would a description of “something else”) is needed to analyze them, and the issue would fall under a different heading.
Regarding lowercasing, it should normally be performed server-side, not in CSS. Note that text-transform: lowercase cannot properly handle Turkish or Azeri text, as it unconditionally maps both "I" and "İ" to "i", not to "ı" and "i". Proper support for them has been promised for Firefox 14, presumably to be used when the content language has been suitably declared with the lang attribute, but it will take a long time before such processing is common across browsers. In server-side processing, it is usually easy to handle this as a special case.
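To illustrate the server-side special case mentioned above, here is a minimal Python sketch (the function name and mapping table are my own) that applies the Turkish casing rules before lowercasing:

```python
# Turkish has dotted and dotless i: 'I' must lowercase to 'ı', and 'İ' to 'i'.
# A plain str.lower() (like text-transform: lowercase) gets this wrong.
_TURKISH_MAP = {ord("I"): "ı", ord("İ"): "i"}


def turkish_lower(text: str) -> str:
    """Lowercase text using Turkish casing rules for I/İ."""
    return text.translate(_TURKISH_MAP).lower()


print(turkish_lower("DİYARBAKIR"))  # diyarbakır
print("DİYARBAKIR".lower())         # plain lower() gets the i's wrong
```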
It doesn't seem like text-transform should cause any issues:
http://jsfiddle.net/m9fpX/
Are you setting
<meta http-equiv="content-type" content="text/html; charset=UTF-8"/>

QtWebKit not rendering Japanese (Shift_JIS charset)

I have an HTML file which I want to load in a QWebView. The header looks something like:
<head>
<meta http-equiv="Content-Type" content="text/html; charset=Shift_JIS">
</head>
The body text is mixed Latin and Japanese characters.
The page displays perfectly in Chrome, but all of the Japanese characters are replaced with □ when the page is displayed in a QWebView.
QtWebKit seems to use the same system as QTextCodec to handle conversions between Unicode and other charsets (please correct me if I'm wrong on this), so I'm working on the assumption that QtWebKit can support Shift_JIS.
As a test, I tried adding the numeric character reference for a kana character (e.g. &#12353; to display ぁ) to my HTML file. The character renders properly in Chrome, but it also displays as □ in a QWebView. I'm not sure whether this means I can trust the Shift_JIS-to-Unicode conversion in Qt, but it certainly means I can't assume the conversion is the cause of the problem.
I'm not sure where to go from here; any suggestions as to solutions or other areas to investigate would be much appreciated.
Turns out I've been over-thinking this one, there is in fact a pretty simple solution:
When confronted with Kanji characters which the current font is unable to display, Chrome is clever enough to fall back to a font which does support those characters (on my Win 7 PC the default Kanji font is MS Gothic).
QtWebKit does not have this feature, and hence it is necessary to explicitly specify (in CSS) a Kanji-capable font for the areas which need it.
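As a sanity check (in Python, independent of Qt), the Shift_JIS mapping for that test character round-trips cleanly, which supports the conclusion that font fallback, not charset conversion, was the problem:

```python
# U+3041 HIRAGANA LETTER SMALL A round-trips through Shift_JIS intact.
ch = chr(0x3041)                # 'ぁ'
encoded = ch.encode("shift_jis")
print(encoded)                  # b'\x82\x9f'
assert encoded.decode("shift_jis") == ch
```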

Website doesn't find stylesheet in Safari/IE

No idea why this is happening... this is what it looks like in firefox:
And this is what it looks like in Safari:
A quick inspection with Firebug for Safari shows that it's not picking up any stylesheets except transmenu.css (for the menu, which isn't even being used). I can't find ANY reason why this would happen.
Any ideas?
website: http://tradartsteam.co.uk
Thanks
@Thomas Clayson: Remove the extra </script> on line 35 of your source code.
Edit: It's two lines after <script type="text/javascript" src="/_common/js/mootools-1.2-more.js"></script>
Update:
@Thomas Clayson: The only way I could get the page to display just fine was to comment out the entire <script> element with $.noConflict(); inside, up to just before <script language="javascript" type="text/javascript" src="/js/swfobject.js"></script>. Even $.noConflict(); on its own breaks the page again. I'll do some more digging and update my answer (unless someone else finds the solution before me).
Super Massive Update: @Thomas Clayson: After going a little batty because I couldn't find the problem, I set about making the document compliant with its DOCTYPE using the W3C Validator... that finally led me to the offending code hidden deep within the events of $('#calendar').fullCalendar!
<!--[if gte mso 9]><xml> <w:WordDocument> <w:View>Normal</w:View> <w:Zoom>0</w:Zoom> <w:PunctuationKerning /> <w:ValidateAgainstSchemas /> <w:SaveIfXMLInvalid>false</w:SaveIfXMLInvalid> <w:IgnoreMixedContent>false</w:IgnoreMixedContent> <w:AlwaysShowPl...
Not only is it horrible, proprietary MS Word markup; it also got truncated, and it severely messed with your site in some ways I (at least) did not expect. Your comment below, pointing out what I should have suspected from the start (having dealt with this very issue far too many times myself), came a little late, but your question has ultimately given me valuable experience, so +1 for that. ;-)
Safari's Web Inspector shows errors. Ignore the favicon one, but I suppose the others might have something to do with it.
In fact, having looked more closely at your source code, I found a great number of mismatched opening and closing tags, especially within the abbreviated text strings in your jQuery JavaScript (the ones ending in "...") - whatever you used to truncate those even cut some of the tags in half.
I don't know how Firefox tolerates this (quite amazing, actually), but you might want to check your document syntax from top to bottom and make sure your code is correct. I'm quite sure Safari is not the only browser bound to have problems with this page.
Link to your stylesheet before your scripts (even better - put your scripts at the bottom of the page.)
I'm not sure exactly what the problem is but you do have an awful lot of javascript and the errors look to be interfering with the page load.
