Is it possible to specify a font version in the Google Fonts API? - google-font-api

A Google font we use (Pacifico) recently had a rather radical redesign of the capital 'L'. As we use this font for our product, whose name begins with 'L', this caused an equally radical change in the look of our product.
Digging into it, I noticed that the URL for the font had changed from
fonts.gstatic.com/s/pacifico/v7/Q_Z9mv4hySLTMoMjnk_rCXYhjbSpvc47ee6xR_80Hnw.woff2
to
fonts.gstatic.com/s/pacifico/v8/Q_Z9mv4hySLTMoMjnk_rCXYhjbSpvc47ee6xR_80Hnw.woff2
but that the previous v7 URLs still worked. So, for now, I have added hardcoded font references to that version.
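Concretely, that means a @font-face rule along these lines (a sketch; the style and weight descriptors are what I'd expect the generated stylesheet to contain):

@font-face {
  font-family: 'Pacifico';
  font-style: normal;
  font-weight: 400;
  /* pinned to the v7 file so the old capital 'L' keeps being served */
  src: url(https://fonts.gstatic.com/s/pacifico/v7/Q_Z9mv4hySLTMoMjnk_rCXYhjbSpvc47ee6xR_80Hnw.woff2) format('woff2');
}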
Is it possible, though, to instead link to a particular version of a font?
<link href='https://fonts.googleapis.com/css?family=Pacifico' rel='stylesheet' type='text/css'>
So, somehow specify a version in the above link?
I have tried various guesses, such as:
https://fonts.googleapis.com/css?family=Pacifico&version=7
https://fonts.googleapis.com/css?family=Pacifico&version=v7
https://fonts.googleapis.com/css?family=Pacifico:v7
but to no avail.

The recommended way to stick with a particular version is to self-host it.
Google does not and will not have a versioning option.
The API v7 hack you're using is officially not recommended, and may break.
A Google Fonts employee, “davelab6”, has been responding to several questions like yours on the Google Fonts GitHub. This question is much like yours and has a brief answer from him. It also links to several other issues with spirited discussion about a similar change.
The upshot is that Google has considered offering a versioning feature and rejected it for several reasons. They say the vast majority of users will be fine with the latest version of a font, and the rest can self-host the version they like.
As for how to self-host, I'm out of my depth, but there's a question about it here with several answers.
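In outline, though, it comes down to downloading the font files you want to keep and serving them from your own server with a @font-face rule (a minimal sketch, assuming you have saved the v7 files under /fonts/):

@font-face {
  font-family: 'Pacifico';
  font-style: normal;
  font-weight: 400;
  /* these files live on your own server, so they never change out from under you */
  src: url('/fonts/pacifico-v7.woff2') format('woff2'),
       url('/fonts/pacifico-v7.woff') format('woff');
}

Your pages keep using font-family: 'Pacifico' exactly as before; only where the files come from changes.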

Related

Can custom web fonts affect SEO?

A web site I'm developing needs to use two custom font families. There are close matches to these fonts on Google Fonts, but they aren't exact matches.
I have the TTF files for the two fonts, so I can easily enough create my own custom web fonts from them, but I am wondering if using my own custom web fonts (i.e. rather than Google Fonts) may have an adverse effect on SEO, as there is far less chance a browser would have my custom fonts cached, which would increase the average page load time.
Although my concern seems valid, I'm wondering if it is significant enough to actually be taken into account by search engines and, therefore, have an adverse effect on the site rankings?
Yes, custom fonts affect loading speed, which in turn can lower page ranking. Refer to the links below:
http://www.webilogy.com/2013/11/tips-uploading-custom-fonts-website/
http://blog.futtta.be/2011/01/07/website-performance-impact-of-web-fonts/
Well, if you look at the top 10k sites from Alexa, you can see how many of them use web fonts. It's an overwhelming majority, including not just copy fonts but also icon fonts like FontAwesome, which is pretty much THE most popular web font (excluding OS fonts like Arial, Helvetica and Georgia). See the data for yourself here:
http://bonfx.com/fonts-of-the-world/
If there were penalties, which translate into lost revenue, we would not see widespread adoption. I would look for performance gains everywhere else to offset any potential slow down from using web fonts, but definitely keep your web fonts.
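One concrete example of such a gain: preloading the font file lets it download in parallel with your CSS instead of after it (a sketch; the path and filename are placeholders):

<link rel="preload" href="/fonts/myfont.woff2" as="font" type="font/woff2" crossorigin>

Note that the crossorigin attribute is required when preloading fonts, even for same-origin files.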
The short answer is: no.

Is there a standard web-based font that is similar to Malgun (Korean font)?

A client needs to have Malgun as the font whenever hangul characters are present. I'm trying to find something to use in CSS that is close to it. I was thinking Verdana. Anyone else have a suggestion?
Verdana does look close to Malgun. I think you should also try Google Fonts: http://www.google.com/webfonts
There are no "standard web-based fonts", only fonts that are more or less likely to be installed on the computer where the browser is running. You can build a font stack that comes close to the one you want, e.g. the Verdana-based font stack from this Sitepoint article, and then use a font-loading method like Google Web Fonts to load your preferred font in browsers that support it.
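Such a stack might look like this (a sketch; the Korean fallbacks are my suggestion, not from the Sitepoint article):

body {
  /* use Malgun Gothic where installed (Windows), then other common Korean fonts */
  font-family: Verdana, "Malgun Gothic", Dotum, AppleGothic, sans-serif;
}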
Do not try to give each visitor the same experience, but the best experience possible. Tell your customer that a website is not an application that looks the same everywhere, but more like a TV program that must also be viewable on a black-and-white TV; see this video.
Have you thought about using the Font Squirrel @font-face generator? Also, for hangul, you might be interested in reading this.

Using schema.org or RDFa microformatting with Wordpress?

WordPress tends to strip out all kinds of code in VISUAL mode, including microformatting. The current WP (3.2.1) no longer seems compatible with earlier widgets, including wp-RDFa (which I thought showed promise). As discussed in this Google group post (https://groups.google.com/forum/#!topic/schemaorg-discussion/E72kDkuguk4/discussion), clients often need to use VISUAL, hence the problem in using any kind of microformatting with WordPress. Yet we want to start using microformats of some kind. (Damn you WordPress!)
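To be concrete, even a simple bit of schema.org microdata like the following tends to get mangled as soon as someone switches to VISUAL mode (a generic illustration, not taken from any plugin):

<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Jane Doe</span> is a
  <span itemprop="jobTitle">web developer</span>.
</div>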
Suggestions please?
Still early days for this plugin, but give it a try...
http://schemaforwordpress.com/

What technique was being used to generate fonts on this website?

There was a site, Web Design From Scratch (archived link, no longer has the following behavior), where the browser would render the headings in an ordinary font at first; then the header text quickly flickered and became an image.
Does anyone know what technique was being used here?
They are using Cufón:
<script type="text/javascript" src="/assets/cufon.js"></script>
<script src="/assets/Delicious_500.font.js" type="text/javascript"></script>
<script type="text/javascript">Cufon('h1');</script>
That last line applies the library to all h1 elements, I would imagine.
Another popular technique to achieve this is sIFR, however that requires Flash. The Cufón website says:
Cufón aims to become a worthy alternative to sIFR, which despite its merits still remains painfully tricky to set up and use. To achieve this ambitious goal the following requirements were set:
No plug-ins required – it can only use features natively supported by the client
Compatibility – it has to work on every major browser on the market
Ease of use – no or near-zero configuration needed for standard use cases
Speed – it has to be fast, even for sufficiently large amounts of text
And now, after nearly a year of planning and research we believe that these requirements have been met.
Looks like Cufón.
I've used 'Dynamic Text Replacement' as described in this article from A List Apart: http://www.alistapart.com/articles/dynatext/
The site you mention uses Cufón.

Automated link-checker for system testing [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.
Closed 8 years ago.
I often have to work with fragile legacy websites that break in unexpected ways when logic or configuration are updated.
I don't have the time or knowledge of the system needed to create a Selenium script. Besides, I don't want to check a specific use case - I want to verify every link and page on the site.
I would like to create an automated system test that will spider through a site and check for broken links and crashes. Ideally, there would be a tool that I could use to achieve this. It should have as many as possible of the following features, in descending order of priority:
Triggered via script
Does not require human interaction
Follows all links including anchor tags and links to CSS and js files
Produces a log of all found 404s, 500s etc.
Can be deployed locally to check sites on intranets
Supports cookie/form-based authentication
Free/Open source
There are many partial solutions out there, like FitNesse, Firefox's LinkChecker and the W3C link checker, but none of them do everything I need.
I would like to use this test with projects using a range of technologies and platforms, so the more portable the solution the better.
I realise this is no substitute for proper system testing, but it would be very useful if I had a convenient and automatable way of verifying that no part of the site was obviously broken.
We use and really like Linkchecker:
http://wummel.github.io/linkchecker/
It's open-source, Python, command-line, internally deployable, and outputs to a variety of formats. The developer has been very helpful when we've contacted him with issues.
We have a Ruby script that queries our database of internal websites, kicks off LinkChecker with appropriate parameters for each site, and parses the XML that LinkChecker gives us to create a custom error report for each site in our CMS.
I use Xenu's Link Sleuth for this sort of thing: a quick check for dead links etc. on any site. Just point it at any URI and it'll spider all links on that site.
Description from the site:
Xenu's Link Sleuth (TM) checks Web sites for broken links. Link verification is done on "normal" links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts and java applets. It displays a continuously updated list of URLs which you can sort by different criteria. A report can be produced at any time.
It meets all your requirements apart from being scriptable, as it's a Windows app that must be started manually.
What part of your list does the W3C link checker not meet? That would be the one I would use.
Alternatively, twill (python-based) is an interesting little language for this kind of thing. It has a link checker module but I don't think it works recursively, so that's not so good for spidering. But you could modify it if you're comfortable with that. And I could be wrong, there might be a recursive option. Worth checking out, anyway.
You might want to try using wget for this. It can spider a site including the "page requisites" (i.e. the images, style sheets and scripts a page needs) and can be configured to log errors. I don't know if it will give you enough information, but it's free and available on Windows (Cygwin) as well as Unix.
InSite is a commercial program that seems to do what you want (haven't used it).
If I was in your shoes, I'd probably write this sort of spider myself...
I'm not sure whether it supports form-based authentication, but it will handle cookies if you can get it going on the site, and otherwise I think Checkbot will do everything on your list. I've used it as a step in a build process before, to check that nothing was broken on a site. There's an example output on the website.
I have always liked linklint for checking links on a site. However, I don't think it meets all your criteria, particularly the aspects that may be JavaScript dependent. I also think it will miss the images called from inside CSS.
But for spidering all anchors, it works great.
Try SortSite. It's not free, but seems to do everything you need and more.
Alternatively, PowerMapper from the same company has a similar-but-different approach. The latter will give you less information about detailed optimisation of your pages, but will still identify any broken links, etc.
Disclaimer: I have a financial interest in the company that makes these products.
Try http://www.thelinkchecker.com. It is an online application that checks the number of outgoing links, page rank and anchors. I think this is the solution you need.
