Typewriters, Script, and Modernity in India

Karthik Malli, 2025

In 1971, the government of the Indian state of Kerala issued an announcement. The script used for the local language, Malayalam, would henceforth be written and taught with a simplified and standardized set of letter forms, reducing the complexity of the script. The focus of the order was on the vowel markers for the sounds u and ū, which combined with some consonant letters in unpredictable ways. 

Much of the complexity of the script, dating all the way back to age-old manuscripts in Malayalam, was suddenly flattened out. The impetus for this change? The development of a typewriter for Malayalam. 

The 1971 Malayalam reform is one of the more dramatic examples of the changes imposed on writing systems by newer, more industrial writing technologies—primarily hot metal typesetting and typewriters—changes that inspired linguistic communities across South Asia (particularly in India) to reimagine how their languages were written and read. The timing of these reforms often coincided with local struggles for independence and state-building. Modern writing in these regions bears the imprint of these reforms, but what went on behind the scenes?

Sketch for Malayalam letter bha
Sketch for Malayalam letter ba, intended as a schematic for a typewriter letter punch.

Typewriter Reform?

Through my past research, I became aware of the role typewriters and hot metal typesetting played in catalyzing script reform in Indian scripts and the promise of linguistic and political modernity they held for nationalist leaders. Vaibhav Singh’s work on Devanagari typewriter development in colonial India was a revelation that helped shape much of this interest.1 Titus Nemeth’s work on mechanized Arabic typography was also very instructive.2 I wanted to explore this dynamic between technological development and script reform to incorporate it into my ongoing research on the development of linguistic nationalism and language standardization in peninsular India.

A conversation at the Unicode Technical Workshop (October 22-23, 2024) with Anushah Hossain and Debbie Anderson from the SEI Team on script politics helped kindle this spark into something more. Anushah’s work on the legacy of Malayalam typewriter reforms in Unicode encodings revealed how elements of this process could be traced back to earlier, pre-digital script standardization efforts. We also spoke of how type designers and historians of typography knew all too well how scripts were reimagined in the shift to industrial writing technologies, but there was no formal scholarship documenting the breadth of these efforts. Somehow, they had not been studied in any substantial way. Our conversation ended with a tantalizing question: how could we trace the threads of connection between these two different periods—mechanized and digital—in writing technology history? 

To answer that, we decided that I would have to start by identifying what the nature of these reforms was (and by extension, what it wasn’t), and what the reforms looked like in their form and process. This phenomenon, which had largely eluded scholarship, presented an opportunity for me to work together with SEI over the summer as a research intern. 

Luckily, my research focus on India proved especially fruitful for these questions. Indian scripts are complex and feature many moving parts. Consonant letters take vowel markers to represent syllables of sound, and consonants can combine with other consonants in complicated ways. The rules governing this behavior require a certain dynamism that stands almost in opposition to the predictability and uniformity that newer writing technologies like hot metal typesetting and typewriters promised. Arguably, these technologies faced their biggest challenges in India, but they also offered the most potential to change the very nature of writing.
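These "moving parts" can be made concrete at the level of code points. As a minimal sketch (using standard Devanagari code points from the Unicode era, which postdates the reforms discussed here), note how a vowel sign is stored after its consonant but rendered before it, and how a virama chains consonants into clusters:

```python
# A minimal look at how Indic syllables are built from parts,
# using Devanagari code points from the Unicode standard.
KA = "\u0915"        # क  consonant ka
I_SIGN = "\u093F"    # ि  vowel sign i
VIRAMA = "\u094D"    # ्  virama (joins consonants into clusters)
TA = "\u0924"        # त  consonant ta

# ka + vowel sign i: stored consonant-first, but the i sign is
# rendered to the LEFT of the consonant — a reordering the
# rendering system must handle.
ki = KA + I_SIGN      # कि

# ka + virama + ta: a consonant cluster; how it is displayed
# (ligature or half-form) is left to the font.
kta = KA + VIRAMA + TA

print(ki, [f"U+{ord(c):04X}" for c in ki])
print(kta, [f"U+{ord(c):04X}" for c in kta])
```

This reordering and clustering behavior is exactly the kind of dynamism that a fixed typewriter keyboard could not easily accommodate.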

Reading Between the Margins

To start, I met with the SEI Team over Zoom to take stock of the sources we had at our disposal. I drew from some of my earlier work from 2021 to 2024 with Typotheque, a type foundry based in The Hague. I worked with them on a series of research projects—including on Devanagari and Malayalam—centered around readers’ letter form and orthographic preferences in different Indic scripts. The same block of text in a script can often be “presented” differently, drawing from different rules in spelling and visual form, changing the social and graphical identity of the text while keeping its linguistic content intact. 

We created a shared Zotero repository for our sources, adding to it whenever we thought of something to include. We stuck to secondary sources, but we quickly realized that the lack of direct references to typewriters and language reform meant that we had to look across disciplines and read between them to make the connections we needed. The list was shaping up into something with no disciplinary center of gravity, so to speak.

I had encountered some primary sources in my earlier fieldwork in archives across India and had some familiarity with research on linguistic nationalism and literary culture, but it was hard to put the pieces together.

It was important to situate these disparate sources within a larger framework connecting the nodes of script, technology, and society. SEI’s expertise proved invaluable in the process of giving shape to these ideas. First, they prompted me by asking questions that tied these themes together and would give the research a coherent narrative: what were typewriter reforms, and how were they introduced and spread? What scripts were affected, and what technologies were involved? What were the lasting legacies of typewriter reforms, continuing into the digital era?

Photo of Bapurao S. Naik’s comprehensive three-part series, Typography of Devanagari
Bapurao S. Naik’s comprehensive three-part series, Typography of Devanagari, was an extremely rich source of information. While it focuses on Devanagari, it offers a framework to understand the place of script reform across major Indian languages.
Screenshot from Riccardo Olocco’s thesis at the University of Reading, titled Linotype Bengali and the digital Bengali typefaces
From Riccardo Olocco’s thesis at the University of Reading, titled Linotype Bengali and the digital Bengali typefaces. Having visual information side by side with the analysis helped us picture these changes better.
1967 manual for a newly devised keyboard for Devanagari issued by the Government of India
A 1967 manual for a newly devised keyboard for Devanagari issued by the Government of India.

The SEI Team helped broaden the scope of the literature by introducing me to current scholarship on these themes, as well as older work that could be read against the grain to offer the information we needed. As I went through the team’s readings, I found myself revisiting the original sources I had worked with and finding gaps in what they spoke of. Much of the literature on linguistic nationalism focused on language communities deciding on one standard script for their language, but did not have much to say about how letterforms and script behavior themselves were reformed and reimagined.

This meant that it was important for us to also keep our eye on the letters themselves, so to speak. To that end, we were very fortunate that Gerry Leonidas at the University of Reading, an institution known for its cutting-edge program on typographic histories of non-Latin scripts, graciously offered to grant us access to theses written by students on some of the scripts we wanted to examine. Each thesis essentially served as a case study of graphic evolution and variation in a particular script (usually narrowed down to a certain period in time), helping us visualize how writing itself evolved.

I met with the SEI team every week to discuss what stood out to me from the readings, making notes of important points in the literature. With every meeting, our notes opened up new questions and directions. The literature review was underway, but we still had to give shape to our findings.

Tracing an Outline

Synthesizing the sources involved reading across these disciplines and genres, and looking critically at the patterns of script standardization processes. Poring over letterforms and letterform schematics could be as important as reading a chapter in a book. As a non-practitioner, I had to expand my toolkit to look, and not just read.

Take the example in the figure below. Both rows show ligatures for the consonant cluster kta. The form above is the “traditional” variant, and the form below is “simplified”. Although the individual ligatures display a similar degree of graphical complexity, the “traditional” form obscures the nature of the constituent letters क [ka] and त [ta], while the “simplified” form is easier to untangle and features a half-form for क [ka].

Kta consonant cluster shown in traditional (top) vs. simplified (bottom) forms in Mukta and Calibri fonts, respectively
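As an aside from the digital era (not part of the historical reforms themselves), the two forms in the figure share a single underlying encoding in Unicode. A sketch, following the Unicode standard's rules for Devanagari, of how joiner characters let text request one rendering or the other:

```python
# Both rows of the figure encode the SAME letters: ka + virama + ta.
# Joiner characters hint at which rendering a font should prefer.
KA, VIRAMA, TA = "\u0915", "\u094D", "\u0924"
ZWJ, ZWNJ = "\u200D", "\u200C"   # zero-width joiner / non-joiner

kta_default = KA + VIRAMA + TA          # font chooses; often the full ligature
kta_half    = KA + VIRAMA + ZWJ + TA    # request the half-form of ka
kta_visible = KA + VIRAMA + ZWNJ + TA   # request an explicit (visible) virama

for label, s in [("default", kta_default), ("half-form", kta_half),
                 ("visible virama", kta_visible)]:
    print(f"{label:>15}: {s}  {[f'U+{ord(c):04X}' for c in s]}")
```

The distinction that typewriter-era reformers settled by decree thus survives today as a rendering preference layered over one shared encoding.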

I identified specific instances of typewriter reforms undertaken in different languages, the mechanisms deployed, and the results. For example, the 1953 Lucknow conference for Devanagari “rationalized” the script by picking one representative form of each letter for the typewriter and doing away with variant letter forms. This was undertaken by the Government of India to standardize Devanagari, a script used for Hindi and Marathi and earmarked as India’s “national” script. The results of the reform are still current in Hindi today.

I went through the typographic case studies to examine text from before and after the reforms. It was also important to identify the different ways in which type reform manifested in scripts, although in many cases the secondary literature was not sufficient. After going over findings from the case studies, the SEI team suggested developing a typology of script reform mechanisms. This typology gave us a vocabulary to describe script reforms in clearer terms, identifying the parameters and the extent to which different mechanisms of script reform were used.

Based on the information in the readings, I used a browser-based tool to develop a timeline to help visualize the progress of typewriter reforms alongside technical milestones and political events. The timeline allowed us to see typewriter reforms as an unfolding process linked to the world outside.

A snapshot of the timeline we developed to visualize typewriter reforms across languages
A snapshot of the timeline we developed to visualize typewriter reforms across languages. Created using a JavaScript tool developed by Northwestern University Knight Lab.
A rough illustration of the various levels of Malayalam script simplification
A rough illustration of the various levels of Malayalam script simplification, with simplified elements marked.
Results of the 1953 Lucknow conference, held by the Government of India to decide a standard letter set for Devanagari
Results of the 1953 Lucknow conference, held by the Government of India to decide a standard letter set for Devanagari. Modern Hindi follows the recommendations of this conference.

Along the way, we wrestled with multiple conceptual questions. A big one was what came first: changes in print technologies that forced script reform, or political mobilization that created demand for it? Answering these questions required us to go back to our literature review, and even revisit some of the texts. 

By the end of the internship, we had a blueprint for understanding typewriter reforms. We had more than just a definition; we could say how, when, and why the process happened, even if only in a broad sense.

Reflections

The research showed me the demands made on complex scripts by the rigid frames of keyboard matrices, and how deeply invested the nation was in typewriters.

I can now speak of typewriter reforms as a concept, something that happened under certain technological, linguistic, and socio-political conditions. Nationalist leaders and literary figures in South and West Asia wished to harness the possibilities offered by mechanized writing technologies such as the Linotype machine to print newspapers and government texts in large numbers, but were hindered by the complexity inherent to their writing systems. 

Typewriter reforms were then technolinguistic negotiations (to use Thomas S. Mullaney’s phrase3) that made complex scripts amenable to use on these new technologies, through the simplification and standardization of letter forms and combination rules. Such reforms represented a process of “translation” of linguistic forms and their graphical representations into discrete units on a keyboard, essentially a process of mechanized script encoding.

These efforts set the stage for the world of digital encoding, where the nuts and bolts of standardization and encoding are replaced by code points and digital typography paired with rendering algorithms. 
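To make this continuity concrete, here is a minimal sketch of the Malayalam case in today's terms, using code points from the Unicode Malayalam block: the traditional (conjoined) and reformed (detached) renderings of the u and ū vowel signs share the same underlying encoding, so the choice the 1971 order once imposed now lives in the font and its rendering rules rather than in the text itself.

```python
# The u/ū vowel signs at the center of the 1971 Malayalam reform,
# seen from the Unicode side. Traditional (ligated) and reformed
# (detached) renderings share the same code points; the distinction
# is carried by the font, not the encoded text.
KA_ML   = "\u0D15"   # ക  Malayalam consonant ka
U_SIGN  = "\u0D41"   # ു  vowel sign u
UU_SIGN = "\u0D42"   # ൂ  vowel sign ū

ku  = KA_ML + U_SIGN    # കു — reformed fonts detach the sign; traditional fonts fuse it
kuu = KA_ML + UU_SIGN   # കൂ

for s in (ku, kuu):
    print(s, [f"U+{ord(c):04X}" for c in s])
```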

We have only begun to answer our research questions. There is more work to be done, beyond the conceptualization and sketching out I focused on over the summer. So far the research has been limited to secondary sources, but the many references to primary sources—including documentation of keyboard schematics, coverage of typewriters in contemporary Indian language newspapers, and records of government correspondence—have been encouraging, and they mirror the eclectic mix of disciplines the scholarship covers. Some primary materials lie in libraries, others in government archives, and some even in corporate offices!

The relationship between modernity, writing technology, and linguistic nationalism is a subject I wish to return to and explore more in both academic and professional capacities. I see myself incorporating my learnings with SEI into my intended Master's thesis at the University of Washington, where I hope to explore the socio-political environment of technology-driven reforms harnessed by growing nationalist groups.

Thanks to SEI’s mentorship, I feel more equipped to wrestle with the initial question that connected me and SEI—how looking at typewriter reforms helps shape our understanding of script and politics in the digital era. I also believe that the findings from my research can help inform digital script encoding by offering new theoretical perspectives on the relationship between script, technology, and nationalism.

  1. Singh, Vaibhav. “The Machine in the Colony: Technology, Politics, and the Typography of Devanagari in the Early Years of Mechanization.” Philological Encounters, vol. 3, no. 4, 2018, pp. 469–95.
  2. Nemeth, Titus. Arabic Type-Making in the Machine Age: The Influence of Technology on the Form of Arabic Type, 1908–1993. Brill, 2017.
  3. Mullaney, Thomas S. The Chinese Typewriter: A History. The MIT Press, 2017.
Karthik Malli interned with the Script Encoding Initiative in Summer 2025. Karthik is an independent researcher and writer interested in the intersection of linguistic nationalism, script reform, language standardization, and typography. His focus is on India and Indic scripts, spanning both metal type printing and contemporary digital language planning. He is currently pursuing a Master's in South Asian Studies (2024–26) at the University of Washington, Seattle. In the past, he worked with Typotheque to identify and document critical junctures in the typographic histories of Devanagari, Malayalam, and Urdu, and wrote for a wide range of publications on language, writing, and identity in south India.
Headshot of Karthik Malli