I’m a little confused by your write-up here; the behavior you describe is different from your example. In your example the @font-face rule containing ‘local(Arial)’ follows the one containing ‘local(Baskerville)’. This renders the same in Firefox 8, Safari 5, Chrome 17 and IE9: it always uses Arial. The behavior you describe (Firefox showing Baskerville) actually occurs if you use the reverse of the order shown in your example. Here’s a testpage showing the variations.
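For reference, here’s a bare-bones sketch of the ordering in question (the family name is a placeholder, and I’m omitting whatever unicode-range and other descriptors your actual example uses):

@font-face {
  font-family: TestFamily;
  src: local(Baskerville);
  /* your example's unicode-range, if any, would go here */
}
@font-face {
  font-family: TestFamily;
  src: local(Arial);
}

With the local(Arial) rule last, as above and as in your example, the browsers listed all use Arial; reversing the two rules is what produces the Baskerville rendering you describe in Firefox.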
The unicode-range descriptor is really intended to help better control load behavior for subsetted fonts designed to support a wide variety of scripts and languages. Especially for CJK fonts, these can be rather large. An author can include a set of subsetted fonts to support Japanese and, via unicode-range, ensure that those fonts are downloaded only when Japanese characters are actually used on the page.
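For example, a hypothetical pair of rules along these lines (family and file names are made up for illustration) lets the browser skip downloading the Japanese subset entirely on pages that contain no Japanese text:

@font-face {
  font-family: "Example Sans";
  src: url(example-sans-latin.woff) format("woff");
  unicode-range: U+0000-00FF;  /* Basic Latin and Latin-1 */
}
@font-face {
  font-family: "Example Sans";
  src: url(example-sans-japanese.woff) format("woff");
  unicode-range: U+3000-303F,  /* CJK symbols and punctuation */
                 U+3040-309F,  /* Hiragana */
                 U+30A0-30FF,  /* Katakana */
                 U+4E00-9FFF;  /* CJK unified ideographs */
}
body { font-family: "Example Sans", sans-serif; }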
With multiple @font-face rules for the same style settings, the CSS3 Fonts spec defines how these rules are referenced:

“If the unicode ranges overlap for a set of @font-face rules with the same family and style descriptor values, the rules are ordered in the reverse order they were defined; the last rule defined is the first to be checked for a given character.”
So the last @font-face rule defined is the first one matched. The reason it’s specified this way is so that an @font-face rule defined later can override a specific range. If the ordering worked the other way, a given @font-face rule that didn’t specify a unicode-range descriptor couldn’t be overridden by later rules.
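As an illustration (the names are placeholders): the first rule below has no unicode-range descriptor, so it defaults to covering U+0-10FFFF; the second rule, defined later, then overrides just the currency-symbol block:

@font-face {
  font-family: "Body Text";
  src: url(body-text.woff) format("woff");
  /* no unicode-range, so this rule defaults to U+0-10FFFF */
}
@font-face {
  font-family: "Body Text";
  src: url(currency-symbols.woff) format("woff");
  unicode-range: U+20A0-20CF;  /* Currency Symbols block only */
}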
This isn’t a case of Firefox not wanting to implement this feature; we’ve just been focused on better support for the rich typographic controls that are also part of CSS3 Fonts. It’s just a matter of priorities, and we have every intention of implementing all the features described in the spec.
As for “opentype” vs. “truetype” format hints, the spec defines them to be synonymous. That’s because the terms are used ambiguously and OpenType is a superset of TrueType.
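So, for instance, either hint in a hypothetical rule like this one (the file name is a placeholder) is matched against the same font file:

@font-face {
  font-family: "Example Serif";
  src: url(example-serif.otf) format("opentype");
  /* format("truetype") would behave identically here, since the two
     hints are defined as synonyms */
}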
John Daggett
Mozilla Japan
CSS3 Fonts editor