As described in the introduction, the underlying hypothesis for this research is that Gutenberg and his colleagues developed a standardized and unitized system for the ‘design’ and casting of Textura type, and that this system was extrapolated to roman (and later italic) type. Humanistic handwriting was thus literally modeled into predetermined standardized proportions.
In line with my illustrious predecessor and tutor at the KABK, Gerrit Noordzij, I consider writing a good starting point for exploring matters like construction, contrast sort, contrast flow, and contrast. However, translating handwriting into type is not a very straightforward process. Although students are trained to work directly from their own writing, they often begin to define grids before they draw letters, and they usually look at existing typefaces for the ‘correct’ proportions. Obviously, patterning is a requirement for designing type, and it is difficult to distill these patterns from handwriting. Could it therefore be that type also finds its origin in patterning intrinsically related to and/or artificially added to writing, and that the latter even influenced writing after the invention of movable type?
There seems to be no Humanistic handwriting predating movable type that shows such a clear standardization as roman type. My measurements of incunabula seem to prove that character widths were standardized during the Renaissance. The written Textura Quadrata made it relatively easy for Gutenberg and his peers to standardize and systematize their movable Gothic type. Once this was achieved, it was an obvious step to apply the same system to the new roman type (and, decades later, to italic type). The clear morphological relationship between Textura and Humanistic minuscule made this possible.
The shared underlying structure of Textura Quadrata and Humanistic minuscule made an organic standardization of the handwritten models possible. It was always there, but it was not necessary to record it literally before movable type was produced. Standardized widths were also a natural extension of the handwritten model, although side bearings were not (after all, a calligrapher does not mind where the space that belongs to a character starts or ends). This standardization is captured in the DTL LetterModeller (LeMo) application.
Nowadays it is common to first design characters and then apply side bearings. It is quite plausible that during the early days of typography the proportions and widths of the characters were defined first, and that the details were subsequently adapted to those widths.
As mentioned, the step from handwriting to type design is not simple, even for me as an experienced calligrapher. I set up a calligraphy course for Dutch television and wrote an accompanying book at the end of the 1980s. Noordzij was very positive about it in Letterletter 12 from June 1991: ‘Frank Blokland has succeeded in bringing the literature on calligraphy on a higher level; his book makes better reading and is a more reliable guide than any other book on the subject.’
The question is how to combine the outcomes of my measurements with calligraphy in type-design education. Well, one can make a template with LeMo, such as this one for a Pilot Parallel Pen of 6 mm. In the case of a translation over 30 degrees, the stem thickness is the pen width × sin 60° ≈ 0.87 × the pen width. The x-height here is five times the stem thickness, approximating what I measured in Jenson’s type (which is actually slightly bolder).
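The template arithmetic above can be sketched in a few lines. This is a minimal illustration, not part of LeMo itself; the pen width and angle are the values named in the text:

```python
import math

# Assumed values from the text: Pilot Parallel Pen of 6 mm,
# translation (constant pen angle) of 30 degrees.
pen_width_mm = 6.0
pen_angle_deg = 30.0

# With a translation over 30 degrees, the stem thickness is the pen width
# projected perpendicular to the stem: pen width × sin(90° − 30°) = sin 60°.
stem_mm = pen_width_mm * math.sin(math.radians(90.0 - pen_angle_deg))
print(round(stem_mm, 2))        # 5.2 (i.e. 0.87 × the pen width)

# The x-height is taken as five times the stem thickness,
# approximating the proportions measured in Jenson's type.
x_height_mm = 5.0 * stem_mm
print(round(x_height_mm, 2))    # 25.98
```

The same two lines of arithmetic give a template for any broad nib: only the pen width and the translation angle change.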
The template can then be used to trace with a broad nib (additionally, one can try to apply some subtle details). The outcome can be auto-traced and converted into a font. As mentioned, spacing is an intrinsic part of the system, so the letters should form words automatically.
This basis can be used for further formalization and refinement. For digital type it is, of course, not necessary to standardize widths. This clearly differs from what was required in the practice of the Renaissance punchcutter.
The image above shows the transformation of the digitized handwritten letters into formal variants, using the original LeMo-based standardization. The stem interval has been maintained, in particular by adjusting the lengths of the serifs to achieve an equilibrium of white space. This way the /n, for example, is measurably centered in its character width, which preserves the equal distances between all stems. It is plausible that Jenson applied asymmetrical serifs to, for example, the /n for this very reason. The /o looks circular but is an ellipse (and is, of course, as wide as its handwritten origin).
For typesetting foundry type – especially for the justification of lines – it is useful if the widths of characters and spaces are defined in the same units. The simplest system has been applied in the examples here, using the stem thickness as the basic value. The original spacing was rounded to this grid.
The original character proportions are preserved here. The fitting becomes a bit tighter, but the word spaces in this case become a bit wider (three units). Up to this point, the character relationships and their widths have been generated ‘artificially’, that is, not optically, throughout the entire process.
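The unitization described above can be sketched as a simple rounding step: each width is snapped to the nearest whole number of units, with one unit equal to the stem thickness. The widths below are hypothetical values for illustration only, not measurements:

```python
# One unit = one stem thickness (5.2 mm for the 6 mm pen template above).
stem = 5.2

# Hypothetical raw (handwritten) advance widths in mm.
raw_widths = {"n": 26.5, "o": 24.0, "i": 10.9, "space": 14.8}

# Snap every width to the nearest whole number of units (at least one unit).
unitized = {g: max(1, round(w / stem)) for g, w in raw_widths.items()}
print(unitized)  # {'n': 5, 'o': 5, 'i': 2, 'space': 3}
```

With these assumed values the word space indeed lands on three units, and the fitting of narrow letters tightens slightly, since fractions of a unit are rounded away.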
The grid can then be doubled for refinement.
And this process can be repeated.
A more refined grid also makes it possible to redefine the proportions of certain letters on it. Here one enters the world of cadencing, but that is a different story.