
[fea syntax] OpenType Variation support?

twardoch opened this issue 8 years ago · 47 comments

@readroberts @brawer etc.,

Do we have any plan in place to extend the FEA syntax to support OpenType Variation fonts?

This would be a major change, of course. Correct me if I'm wrong, but there are two major aspects to be revised:

FeatureVariations table

fontTools already supports FeatureVariations, so using .ttx syntax one could build an rvrn feature. But how would we express this in .fea syntax?

ValueRecord variations.

Before 2001, AFDKO did support CFF-based Multiple Master OpenType fonts, so there must have been some syntax for "per-master" data such as metrics, kerning etc. Also, the FEA syntax has some provisions for device-specific adjustments, and as we know, the GPOS/GDEF/JSTF data in OTVar is inspired by the device adjustments.

General approach

Currently, my understanding is that Google's fontmake uses a method where OT features in each master are compiled using feaLib or AFDKO, and then the fontTools/varLib code is called to add variation data to GPOS.
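
For orientation, here's a rough sketch of that merge step as I understand it, using fontTools' varLib (file names are hypothetical, and the designspace sources are assumed to point at already-compiled master binaries; otherwise a master_finder callback is needed to map source paths to binaries):

# Sketch: merge per-master compiled fonts into a variable font.
from fontTools import varLib

# varLib.build() loads the designspace, opens the master fonts referenced
# there, and merges their data into the variation tables (fvar, gvar/CFF2,
# HVAR, GDEF variation store etc.), relative to the default master.
varfont, model, master_ttfs = varLib.build("MyFamily.designspace")
varfont.save("MyFamily-VF.ttf")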

This generally opens up a legitimate question:

  1. Do we want to go the fontmake path, where a complete or partial FEA file would be provided per master, and then code would somehow "glue" it together? This would allow for re-using most or all of the existing FEA code, and require smaller changes to the building libraries and FEA syntax. Of course this is always possible, i.e. this path can be used without changing the syntax at all.
  2. Do we want to express the ValueRecord variations in some list, akin to the current device metrics spec in the FEA syntax? This would get a bit ugly and require serious restructuring of the FEA syntax for existing projects.
  3. Do we want to go yet another path, adding a set of keywords similar to languagesystem, script and language, but referring to OTVar?

I can think of something like:

In the prolog section of the FEA file, we could have new keywords that could be used to compile the "fvar" table (perhaps), and to "set up" the general rules for how variations work. This would be somewhat analogous to how languagesystem works:

varmaster wght 300.0 400.0 700.0;
varmaster wdth 75.0 100.0 120.0;
languagesystem latn dflt; 
languagesystem cyrl dflt; 

Then, inside any lookup, there could be some syntax that defines the values for the variation. Roughly, it could look like:

feature kern { 
  script latn;
  language dflt; 
  lookup kern1 { 
    # Variation data for the default master
    pos A A -50;
    # Variation data for the wght -1 wdth -1 master
    # Note: this data will need to be calculated against the default data to obtain the deltas
    # If the default data does not contain an entry, then the value 0 is assumed in the default data.
    var wght -1 wdth -1; 
    pos A V -300;
    pos V A -250;
    # Variation data for the wght 1 wdth 1 master
    var wght 1 wdth 1;
    pos A V -20;
    pos V A -15; 
  } kern1; 
  script cyrl;
  language dflt; 
  lookup kern1;
} kern;

With an approach like this, users could be flexible in using the include() statements for various masters, i.e. the whole thing could be expressed as:

feature kern { 
  script latn;
  language dflt; 
  lookup kern1 { 
    var wght -1 wdth -1; 
    include(kern_ThinCondensed);
    var wght 1 wdth 1;
    include(kern_BlackExtended);
  } kern1; 
  script cyrl;
  language dflt; 
  lookup kern1;
} kern;

This way, "old-style" FEA files, such as the Source Sans Pro project's ExtraLight kern.fea and ExtraBold kern.fea, could be used.

Note: The syntax I'm proposing is just an "idea".
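
As for the delta computation mentioned in the comments above, a tool could lean on fontTools' VariationModel; a minimal sketch using the illustrative values from the example (locations in normalized coordinates):

from fontTools.varLib.models import VariationModel

locations = [
    {},                        # default master
    {"wght": -1, "wdth": -1},  # ThinCondensed
    {"wght": 1, "wdth": 1},    # BlackExtended
]
model = VariationModel(locations)

# Kern value for "A V" in each master; the default has no entry, so 0 is assumed.
master_values = [0, -300, -20]
deltas = model.getDeltas(master_values)  # default value first, then per-region deltas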

twardoch avatar Oct 17 '16 12:10 twardoch

I've been thinking about this, and it is not yet obvious to me which way to go. At the moment, I lean towards the approach used by fontmake: use current tools to build a full OpenType font, then run a script to merge the master designs into a variable font. This appeals to me because a high-level design idea behind the feature file syntax is to not have to specify data which can be derived from the sources, and much of the new variable font structures can be derived from the master designs. Also, the way a variable font is expressed lends itself to this approach: basically, as a single normal OpenType font, with extra tables containing the differences between that and the other master designs (well, not exactly, but close enough). With current workflows, I don't think we need an extension of the feature format for most of the variable font data.

This is actually separate from the variable font issue, but I have noticed that all the major font development tools now have readable source data formats, which would allow fontmake to generate the kern data and design space data directly from the sources. The AFDKO doesn't do this because when it was developed, source font data was proprietary, and the only place human-readable and editable kern data was stored was the feature files. This is no longer the case.

I've discussed this with Miguel Sousa and Frank Grießhammer, and to us it looks like a good workflow would be to require the default font to be a fully produced OpenType font, with all the GSUB and non-kern GPOS features, and then draw on the master design sources to extract the kern data and all the other delta data. Another advantage of this approach is that it avoids having to re-work the feature files whenever you make blend design space changes. Just in working for the last few weeks with some CFF2 test fonts, I have found myself often changing which master is the default, and playing with the design space positions of the masters. If any of the metrics had been encoded in the feature files, this would have been a lot of work.

Feature file extensions are then needed only for data which is not in the masters. The STAT table will need to be supported, as it currently does not have any external expression.

All that said, my experience is that we will still need to come up with feature file syntax for almost everything in order to allow overriding source-derived values.

For kern and other metric data, I still like the old MM syntax, where individual values were simply replaced by the list of equivalent values from each master design font, in angle brackets:

feature kern {
  pos A V <-25  -22 -27>;
} kern;

However, I think I like Adam's suggestion even more, as it not only lends itself to easy adaptation of the current workflow, but also avoids the problem of having to regenerate all the pos statements whenever the designer makes changes to the master positions in design space; only the master references need to be edited. Let's have a lot more suggestions about alternatives!

readroberts avatar Oct 17 '16 16:10 readroberts

Read,

I actually do like the fontmake approach, and I think having "per-master" positioning syntax is sensible. However, we still have the problem of FeatureVariations (rvrn feature and any other feature which changes its implementation at some variation space section): https://www.microsoft.com/typography/otspec/chapter2.htm#featvartable

I think we'll need some way to specify these conditions. The "script" and "language" keywords have been a kind of condition in both FEA and binary OTL forever. With FeatureVariations, we essentially get a new kind of condition. A ConditionTable Format 1 needs a variation axis index and two values: min and max. Multiple such conditions are expressed collectively in a ConditionSet (which allows defining conditions that kick in only in certain combinations of axis segments).

So at least for this, we'd need some way to specify these conditions.
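
For comparison, this is roughly how such a condition can be expressed programmatically with fontTools' varLib.featureVars (a sketch; the font and glyph names are hypothetical, and the ranges are in normalized coordinates):

from fontTools.ttLib import TTFont
from fontTools.varLib.featureVars import addFeatureVariations

font = TTFont("MyVariable.ttf")  # a variable font that already has an fvar table

# Each dict is one ConditionSet: axis tag -> (min, max). The tuple pairs a
# list of such boxes with the substitutions to apply when any box matches.
conditional_substitutions = [
    ([{"wght": (0.5, 1.0), "wdth": (0.5, 1.0)}], {"Oslash": "Oslash.nocounter"}),
]
addFeatureVariations(font, conditional_substitutions, featureTag="rvrn")
font.save("MyVariable.rvrn.ttf")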

I think in FEA, the axis should be expressed as a tag rather than index.

So we need an idea how to express the condition. For example, if my general variation source is 2D and has axes "wdth" and "wght" going from -1 to 1, I might want to apply a different feature set if wdth >= 0.5 and wght >= 0.5.

Even though the implementation talks about exchanging the whole FeatureSet, in practice it'll be more like feature sets for the languagesystems "latn dflt", "cyrl dflt" and "cyrl BGR", which would normally have the same dozen features, except that "cyrl BGR" would also have "locl".

FEA has a decent mechanism to control the languagesystem conditions. You set up the general rule using "languagesystem" and then, within feature definitions, you can fine-tune it via "script" and "language".

I think something like:

varcondsys dflt; 
varcondsys wght 0.5 1 wdth 0.5 1; 

feature liga {
  varcond dflt;
  lookup liga1 {
    sub f i by fi;
  } liga1;
  lookup liga2 {
    sub f l by fl;
  } liga2;
  varcond wght 0.5 1 wdth 0.5 1;
  lookup liga2;
} liga;

The example above would define at the beginning the "varcondsys" entries -- a list of all possible ConditionSets. Any feature definition that does not explicitly use the "varcond" keyword within it would get registered in the FeatureSets for all the ConditionSets.

But in the case of liga, we would have two behaviors: "varcond dflt" would define the implementation for the default conditions (two lookups are applied), while "varcond wght 0.5 1 wdth 0.5 1;" would define the implementation if wght >= 0.5 and wdth >= 0.5.

Ideally, the coordinates in the conditions should be expressed in user coordinate space, not normalized coordinate space, because "regular" font developers would not know much about normalized coordinate space. But that would require much more knowledge on the part of the FEA compiler (it would need to process "avar" etc.), so I'd settle for normalized space.

Overall, the above mechanism is rather similar to how LangSys definitions work in FEA, so it should be possible to reuse some logic.
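
And for reference, if the compiler did accept user coordinates, the default user-to-normalized conversion (before any "avar" mapping) is simple enough; a minimal sketch of what the OT spec prescribes, equivalent to fontTools' varLib.models.normalizeValue:

def normalize(value, minimum, default, maximum):
    # Clamp to the axis range, then scale each side of the default to -1..0..1.
    value = max(minimum, min(maximum, value))
    if value < default:
        return (value - default) / (default - minimum) if default != minimum else 0.0
    if value > default:
        return (value - default) / (maximum - default) if maximum != default else 0.0
    return 0.0

print(normalize(550, 300, 400, 700))  # a 300..400..700 wght axis: user 550 -> 0.5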

twardoch avatar Oct 18 '16 23:10 twardoch

Given that it is more common than not these days to separate GSUB and GPOS, perhaps it might be sensible to extend FEA with these keywords:

table GSUB [variationspace] { 
} GSUB;

table GPOS [variationspace] { 
} GPOS;

This, again, would simplify working with includes, especially if the optional conditions section could allow variation master definitions, e.g.

table GSUB { 
} GSUB;

table GPOS varmaster wdth -1 wght -1 { 
} GPOS;

table GPOS varmaster wdth 1 wght -1 { 
} GPOS;

Or perhaps

table GSUB { 
} GSUB;

table GPOS <-1 -1> { 
} GPOS;

table GPOS <1 -1> { 
} GPOS;

Inside the new "table GSUB" and "table GPOS" blocks would be the traditional FEA contents.

twardoch avatar Oct 19 '16 00:10 twardoch

With the "table GSUB" and "table GPOS" proposal comes a new idea, for makeotf specifically: a "keep layout tables" option. If my font already has a GSUB table compiled and I only want to compile a GPOS and my FEA only has GPOS stuff, I should be able to do it without destroying the existing GSUB. In other words, I'd love to be able to run makeotf in multiple passes.

twardoch avatar Oct 19 '16 00:10 twardoch

@twardoch the conditions should be defined in the designspace file, no? I think Erik is on it: https://github.com/LettError/MutatorMath/issues/55

miguelsousa avatar Oct 19 '16 00:10 miguelsousa

If my font already has a GSUB table compiled and I only want to compile a GPOS and my FEA only has GPOS stuff, I should be able to do it without destroying the existing GSUB. In other words, I'd love to be able to run makeotf in multiple passes.

Sounds more like a job for fontTools and its feaLib.
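
Something along these lines, I imagine (a sketch; the tables argument restricts which tables feaLib is allowed to build, and the file names are hypothetical):

from fontTools.ttLib import TTFont
from fontTools.feaLib.builder import addOpenTypeFeatures

font = TTFont("AlreadyHasGSUB.otf")                      # font with a compiled GSUB
addOpenTypeFeatures(font, "kern.fea", tables=["GPOS"])   # build GPOS only
font.save("AlreadyHasGSUB.newGPOS.otf")                  # existing GSUB is left untouched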

miguelsousa avatar Oct 19 '16 00:10 miguelsousa

@miguelsousa AFAIK, the Superpolator rules are very simplistic; they only allow simple substitutions. FeatureVariations can define conditional ligatures, contextual substitutions, positioning etc.

Of course it would make sense for a tool to convert from .designspace rules to some FEA code that expresses FeatureVariations.

Also: I'm talking here about the FEA syntax. Many workflows use the FEA syntax but don't rely on .designspace. I think building the OTL tables should be possible as a separate step from building the actual font file.

The FEA syntax has been implemented by MakeOTF, feaLib and FontForge. I agree the syntax discussion is a bit separate from the tool, but only a bit.

In the past, FEA and MakeOTF could be used to completely define sources for GSUB, GPOS, GDEF and BASE. There were higher-level expressions for some of the data, such as group kerning within UFO, but overall, FEA has proven itself to be very popular.

twardoch avatar Oct 19 '16 06:10 twardoch

I've mentioned several ideas in this thread, and I think all those that do not affect the logic of the OTL features for a given instance can be offloaded to a different process, e.g. the creation of the positioning deltas across variation masters. This can indeed be done by providing several separate FEA files which a tool "blends", like Google's fontmake. We don't have to change the FEA syntax for that.

But to get reasonable support for FeatureVariations, I don't see how this can be done easily without extending FEA.

Imagine an Arabic font which, on the "wdth" axis, only adjusts the width of the kashida stretching glyphs and of stretchable Arabic letters that have horizontal descenders. In addition, the designer might want to gradually switch on stacking ligatures, so glyph sequences that join horizontally when there is ample horizontal space would start joining vertically when there is less space. Such a strategy could be used for other cursive scripts, and even for Latin fonts.

Similarly, we might want to enable other contextual substitutions when a certain width or weight threshold is exceeded.

For that, proper support for the FeatureVariations conditions would be necessary, especially in GSUB.

Implementing conditional switching of GPOS adjustments in kern/mark and combining it with the blending of GPOS may indeed be tricky and perhaps out of scope, though the syntax I proposed would still be compatible and allow it in theory.

Whether it'll be MakeOTF or feaLib or FontForge that implements the syntax changes first, or ever, is another story.

twardoch avatar Oct 19 '16 13:10 twardoch

FeatureVariations cont'd

An alternative approach to the FeatureVariations conditions syntax, inspired a bit by how named lookups and featureNames work:

feature rvrn {
  sub O by O.alt;
  featvar varEuro {  
    ifvar wght 0.7 1;
    sub Euro by Euro.bold;
  } varEuro; 
  featvar varOslash {  
    ifvar wght 0.5 1;
    ifvar wdth 0.5 1; 
    sub Oslash by Oslash.nocounter;
  } varOslash; 
} rvrn;

In this syntax, we would assume that if a feature does not have a featvar block, the feature gets included in all FeatureVariations FeatureSets (collected from all definitions in the FEA file), but if a feature does have a featvar block, then that feature gets split. Whatever is outside the featvar block is included as the default (which may be empty). Then we have featvar blocks, which get names (each named featvar block corresponds to a ConditionSet). Inside the block, each ifvar keyword uses the syntax

  ifvar <axisTag> <minValue> <maxValue>;

Of course, all ifvar conditions within one featvar spec are taken with a logical AND, as in the OT spec. After the ifvar conditions are specified, the alternative implementation of a given feature follows, which may include lookups etc.

As with named lookups, I should be able to define featvar blocks outside of feature blocks, and then reference them from inside the feature blocks, even across different features. Example:

featvar varEuro {  
    ifvar wght 0.7 1;
    sub Euro by Euro.bold;
} varEuro; 
featvar varOslash {  
    ifvar wght 0.5 1;
    ifvar wdth 0.5 1; 
    sub Oslash by Oslash.nocounter;
} varOslash; 

feature rvrn {
  featvar varEuro; 
  featvar varOslash;
} rvrn;

feature ss04 { 
  sub O by O.alt;
  featvar varEuro; 
  featvar varOslash;
} ss04; 

Of course instead of featvar, we could use featureVar or featureVariation, and instead of ifvar we could use ifVar or ifVariation.

twardoch avatar Oct 20 '16 12:10 twardoch

The conditions need to, somehow, combine information from two different types of data:

  • Design space geometry is the first: something has to happen when weight > 0.5. These values relate to how the masters are placed in the designspace. That means establishing these values requires design decisions ("No, the oslash needs to kick in when it is bolder"), so they need to be flexible during the whole design process. Also, a single rule can consist of a condition for each axis. As we appear to have a lot of potential axes, let's find a compact definition.
  • The other type of information is when a rule is expected to be applied and which glyphs are involved.

It would be useful to be able to change weight > 0.5 to weight > 0.4 without having to edit the whole feature file. Maybe not all weight > 0.5 statements are the same. The decision to add a specific conditional switch somewhere would not need to rely on knowing the exact axis values.

Maybe there is a way to separate the definition of the condition (with its geometric references) and its application within the feature blocks. A set of named rules, each with any number of conditions, to be defined somewhere in the beginning of the feature text. Then whenever it is needed in the feature text, a rule can be invoked by name. The rule remains active until the end of the block, or until a new rule is called.

In fantasy feature text:

# before other code, define a rule with a set of conditions:
rule oslashcounter {
     wght 0.7 1;
     wdth 0.5 1;
} oslashcounter;

# then inside a feature block, call the name of the rule to have it evaluated
feature aaaa {
     sub @something by @somethingelse;
     rule oslashcounter;
          sub Oslash by Oslash.nocounter;
          sub oslash by oslash.nocounter;
} aaaa;
  • easier to cluster substitutions
  • single place to modify a designspace value
  • allows the designer to experiment with rules and substitutions (and record results) even when the features are not ready.
  • different tools can be used to edit and proof, and can be done by different people at different times.
  • let's avoid having to use normalised coordinates in the feature text.

LettError avatar Nov 29 '16 12:11 LettError

Perhaps even allow the rules to be nested:

rule heavy {
     wght 0.8 1;
} heavy;

rule narrow {
     wdth 0 0.3;
} narrow;

rule dollarbar {
     rule heavy;
     rule narrow;
} dollarbar;

feature aaaa {
     rule heavy;
        sub w by w.compact;
     rule dollarbar;
        sub dollar by dollar.nostroke;
} aaaa;

LettError avatar Nov 29 '16 13:11 LettError

This appeals to me. This format also makes it easy to define the default lookups as well as several different rule-based lookups:

feature rvrn {
     sub w.compact by w;
     sub dollar.nostroke by dollar;

     rule heavy;
        sub w by w.compact;

     rule dollarbar;
        sub dollar by dollar.nostroke;
} rvrn;

When no rules apply, you get the default substitutions. When any rule applies, you do not get the default substitutions, and you do get the substitutions whose rules are satisfied. In the font data, the default lookups are written in a regular GSUB feature, and the rule-based substitutions are defined in ConditionSets in the GSUB FeatureVariations table. If there are no default substitutions, the feature will be written as a regular GSUB feature without any lookups.

readroberts avatar Nov 29 '16 17:11 readroberts

@LettError I like your proposal a lot!

twardoch avatar Nov 29 '16 18:11 twardoch

In addition to this, note that rules can become more complex than a simple boolean equation. If the $-bar strike-through switch happens at a different weight value for condensed width than for extended width, the connecting line is not vertical or horizontal. It may not even be a straight line in the 2-dimensional design space. In that case the rules must define a "staircase" approximation of the "watershed" line, by combining a number of weight & width rules. That complexity favours Erik's nesting proposal. And it may even need to be extended with more boolean and/or/bracket operators to define the complete rules. Just summing up "ifvar + condition" is not enough. How do we combine a set of "and" rules with a set of "or" rules? With 3 axes this "watershed" between glyph shapes can become a Minecraft wall of cubes, approximating a doubly curved surface. Hard to imagine or visualize, especially if there are multiple breaks for alternative glyph shapes in the same design space. But at least a readable rule syntax helps there. As these rules need to be applied "per glyph", the re-use of rules and conditions for multiple glyphs is also important.

petrvanblokland avatar Nov 29 '16 23:11 petrvanblokland

Extending the syntax to include 'and' and 'or' in the if clause would make it possible to define a set of regions in the blend design space to approximate a watershed line that is not orthogonal to all axes. Because of the underlying data structures, I don't think that full boolean logic can be supported: the FeatureVariations table supports only 'or's between sets of 'ands'. For the implementation, each 'and'ed rule would be added to the current ConditionSet, and each 'or' would trigger the start of a new ConditionSet. I don't see a way to support an 'and' between two parenthetical clauses that contain an 'or'. However, the current support is enough to do what Petr is describing, which certainly is useful.
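
To illustrate, here is a rough sketch (values hypothetical, ranges normalized to 0..1 for readability) of how a watershed like wght + wdth > 1.2 could be approximated: each box is one ConditionSet (an AND of per-axis ranges), and the list of boxes is the OR:

def staircase_boxes(threshold=1.2, steps=4):
    # Slice the wdth axis and, for each slice, require enough wght so that the
    # union of boxes approximates the diagonal watershed as a staircase.
    boxes = []
    for i in range(steps):
        wdth_min = i / steps
        wdth_max = (i + 1) / steps
        wght_min = max(0.0, threshold - wdth_max)
        boxes.append({"wght": (wght_min, 1.0), "wdth": (wdth_min, wdth_max)})
    return boxes  # apply the rule if ANY box matches (OR of ANDs)

print(staircase_boxes())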

readroberts avatar Nov 30 '16 17:11 readroberts

Any news on this? How can we move this forward?

anthrotype avatar Jan 05 '18 17:01 anthrotype

I still like the proposal. Let's do it.

readroberts avatar Jan 05 '18 22:01 readroberts

+1 :)

twardoch avatar Jan 24 '18 21:01 twardoch

I just realized — this syntax could actually be extended even more, to support variable positioning:

  1. The rule keyword basically defines regions in the variation space.
  2. Variable fonts use regions to define master positions (something not really supported in designSpace, which is another cause for concern).
  3. Either the rule conditions could be extended to allow 1, 2 or 3 values, or another keyword (like master) should be created to allow conditions with 1 value (a simple master location) or 3 values (min, peak and max).
  4. The extended rule syntax or the new rule plus master syntax would be used to create named locations and regions.
  5. Then, they (in either form) could be used to define interpolable positioning statements. It’d be possible to do something like this:
# Implicitly, positioning statements which are not placed in a master (or extended rule) are placed in the neutral position, so all axes are 0

master Regular {
     wght 0;
     wdth 0;
} Regular; 

# If a master does not define a location for an axis, the neutral (0) is assumed, here wdth 0;

master Bold {
     wght 1;
} Bold;

# A region or set of regions could also be defined

master Semibold {
     wght 0.3 0.5 0.7;
} Semibold;

# Then we could write GPOS

feature kern {
master Regular;
    pos A V -10;
master Semibold;
    pos A V -20;
} kern;

twardoch avatar Aug 06 '20 06:08 twardoch

Should we make a PR? What was the consensus?

moyogo avatar Jan 04 '21 14:01 moyogo

I’ve been thinking about this this morning. I’m now convinced of the idea of having one feature file which contains the rules for the whole variable font, rather than trying to interpolate between different masters’ files. (Too easy for them to go out of sync.) Adam’s latest idea looks good but I wonder if there is a way to better scope variation information to a rule. The danger of

feature kern {
master Regular;
    pos A V -10;
master Semibold;
    pos A V -20;
} kern;

is - what does this do?

feature kern {
master Regular;
    pos A V -10;
    pos A Y -10;
master Semibold;
    pos A V -20;
} kern;

Throw an error, I guess, but it's annoying for tooling to have to match up the rules.

Glyphs 3's tokens approach looks quite interesting. You define a set of numbers for each master: padding might be 10 in Regular and 20 in Bold, then say pos W W $padding;. Obviously that wouldn't be pleasant for an entire kern table, but it's a useful approach for other kinds of positioning rules.

Another downside of Adam's approach is that it incorporates axis definition information which really should be part of designspace or similar. Why not have the tools read a designspace file and pick up the master definitions from there?

simoncozens avatar Jan 15 '21 12:01 simoncozens

Another thought to go alongside my last point: how about seeing it the other way around? With a sufficiently good master statement, you don't need a designspace file, but can generate fvar and avar tables from the feature file. This is consistent with the idea that we're using feature files to generate other OT tables beyond GSUB and GPOS.

In fact, you could even define your source files inside the master statement too. Hmmm.
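
For what it's worth, generating fvar from such master statements would boil down to something like this (a sketch using fontTools' FontBuilder; the axis values are hypothetical user-space coordinates):

from fontTools.fontBuilder import FontBuilder

fb = FontBuilder(unitsPerEm=1000)
fb.setupGlyphOrder([".notdef"])
fb.setupNameTable({"familyName": "Test", "styleName": "Regular"})
# Each axis is (tag, minimum, default, maximum, name); named instances omitted here.
fb.setupFvar(
    axes=[
        ("wght", 300, 400, 700, "Weight"),
        ("wdth", 75, 100, 120, "Width"),
    ],
    instances=[],
)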

simoncozens avatar Jan 15 '21 12:01 simoncozens

But how do you then generate static instances? The main problem with feature files is that they're blobs you have to parse into structure. DS files are structure right there.

madig avatar Jan 15 '21 13:01 madig

I'm not sure the parsing issue is relevant. If you have an XML parser you can parse XML and if you have a FEA parser you can parse FEA.

simoncozens avatar Jan 15 '21 14:01 simoncozens

Yes, but the XML is data, FEA is code.

madig avatar Jan 15 '21 14:01 madig

I still don't see why that matters. (And I don't agree - everything in section 9 is definitely using FEA as a data format, not a programming language.) You have to parse it into structure? Thankfully we have computers which can do this for us. It's not a big deal.

simoncozens avatar Jan 15 '21 15:01 simoncozens

The more I think about this, the more I think Adam's proposed syntax isn't sufficient, and I worry that we're doing the standard OpenType thing of thinking of a solution that works reasonably well in simple cases and then hacking the hard cases into it later, instead of thinking of the hardest and weirdest (plausible) needs first and working out a syntax that works for them, at which point the simple cases will be obvious.

I had a good conversation with @tamirhassan yesterday and he said "Why are you talking about varying rules per master? Surely you want to vary them based on areas of the design space?", and I realised that he's dead right.

Here's my hard but plausible case: I have an Arabic font with a weight axis and two masters, wght=400 and wght=1000. I want to make a contextual vertical kern for the glyph sequence "[lam-ar.init lam-ar.medi] beh-ar.medi [twodotshorizontalabove-ar threedots-horizontalabove-ar]' [alef-ar.fina lam-ar.fina lam-ar.medi ...]" (e.g. لتا) so that once the weight goes over 800 and the dots get too fat to fit inside the two vertical strokes, they are raised up by an additional 350 units. How should I express this? I don't have a good answer, but here's a proposal to get things started.

pos @tallglyph beh-ar.medi @widedots' <0 350 0 0 (wght>800)> @tallglyph;

(I also realised we need to be able to specify both rules that interpolate and rules that operate "at a point" - I don't want this kern to interpolate from 0 to 350 as we go up the weight axis.)

This syntax so far is cumbersome but I want us to focus our attention on the fact that it's the value record which varies, not the rule.

simoncozens avatar Mar 02 '21 07:03 simoncozens

Here's what my case looks like: https://twitter.com/simoncozens/status/1366784085780217860

I had to hand-hack the TTX file to get it working...

simoncozens avatar Mar 02 '21 16:03 simoncozens

I'm basically doing the same thing right now, relying on hand-edited TTX, but one big difference is I am applying this to entire features and not just one pos rule. One thing I ran into is that with mark positioning it works to actually swap out a lookup in the FeatureTableSubstitution and the position changes to what's in the new lookup, but with features that add or subtract advance width the effect is cumulative. For example at wght=700 I want to switch from

position \1401 <0 26 0 -87>; to position \1401 <0 36 0 -120>;

but that will add the values instead of just swapping the lookup (at least in places I could test). So the second lookup needs to be rewritten as only the deltas: position \1401 <0 10 0 -33>; In this case I'm looking at the entire feature so it would be nice to say

feature vpal {
    variation wght<=699 lookup 1;
    variation wght>=700 lookup 2;
} vpal;
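
For the "rewrite as only the deltas" step above, the arithmetic is just an element-wise difference of the two value records (FEA value record order <xPlacement yPlacement xAdvance yAdvance>); a trivial sketch:

default_record = (0, 26, 0, -87)
heavy_record = (0, 36, 0, -120)
delta_record = tuple(h - d for h, d in zip(heavy_record, default_record))
print(delta_record)  # (0, 10, 0, -33)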

Like @simoncozens says we need to allow both "at a point" and "between points", but also maybe we want to interpolate from 100-500 and then jump at a point to a new value and continue interpolating there from 500-900. That would mean we'd want something like 4 values to allow interpolation from 1-2, jump to 3, interpolate from 3 to 4.

punchcutter avatar Mar 03 '21 09:03 punchcutter

Yes, I meant to mention that there are two ways we need to vary layout. (And actually what we first need to do is sit down and read through the OT spec for all the ways things can vary, and then make sure we have syntax which covers all those potential cases):

  • Creating a FeatureVariations table to apply different lookups in different circumstances.
  • Specifying how device records and anchors vary
  • ...?

simoncozens avatar Mar 03 '21 11:03 simoncozens