Interview with Neil Patel: Surfacing the Market for African Fonts

SEI Team, 2025

We are excited to continue our exploration of African scripts in digital environments with an interview with font designer Neil Patel. Neil is one half of the acclaimed design studio JamraPatel, alongside Mark Jamra.

JamraPatel has drawn attention for producing high quality fonts for African scripts – a notably underserved market – but also for their extensive community engagement, research, and digital infrastructure work. SEI has been lucky to work with them on recent projects such as the Bété digital inclusion project and our broader stream of research on modern “neographies,” or newly-invented scripts.

This interview dives into the calculus behind betting on African scripts as a business and why we should have hope for the future.


Could you tell us a little about your background and the start of JamraPatel?

We thought it would be a good idea to keep working together since we had complementary skills.

We were also probably the only two type designers in Portland, Maine.

My background is kind of funny. Before I started working on font development, I was in the semiconductor industry for ten years. I worked at National Semiconductor as a photolithography process engineer. After a decade, things were not going great at the fab. There were layoffs happening, and while I didn’t get laid off, I was moved from process development back to manufacturing engineering, where I got my start. As a result, I decided to leave and change my career path.

My wife is a graphic designer, and she suggested that I try designing type since it was technical and something I could do on my own. I started teaching myself how to make fonts by reading books and articles and practicing on my own. Font engineering is a whole different thing—it’s not well documented. You don’t really know how to do it until you start doing it. As I worked with clients, they’d request specific things, and I’d learn on the go. 

At some point, I met Mark Jamra. He’s been making typefaces for much longer—about twenty years. At that time, he was working on a Cherokee typeface project and needed some technical assistance. He brought me in to help him engineer the font to release with Adobe. After we completed the project, we thought it would be a good idea to keep working together since we had complementary skills. We were also probably the only two type designers in Portland, Maine. That’s how we started JamraPatel.

Neil Patel and Mark Jamra with a kigelia africana tree
Neil and Mark with a Kigelia africana tree, the Kigelia font’s namesake (Source)

How did you then find your niche working on African scripts?

From the beginning, we knew we wanted to focus on non-Latin scripts. Mark was frustrated with the over-saturation of Latin fonts. There were so many Latin fonts and not many fonts for other scripts. 

We realized that African scripts, in particular, hadn’t been well supported, so we decided to make that our initial focus. The first thing we did was look at what was encoded. While investigating each of those scripts, we realized that some did not have any available fonts, and others that did were not always particularly well-executed. From this dataset, we created a list of scripts to target. We then prioritized them based on the population of people using each language and associated script. 

The first big script we worked on was N’ko. We made a connection with someone in the community, and they provided us with guidance, information, and more contacts. For N’ko, there’s a culture of script activists who help promote the script. A lot of the initial effort is working your way into that network of activists.

The process for Adlam was different. Really, each case is a little different. Because Ibrahima and Abdoulaye Barry, Adlam’s script founders, both live in the States, it was easy to make a connection with them. 

The scripts we ultimately included in what became the Kigelia font family are N’ko, Adlam, Vai, Ge’ez, Tifinagh, Osmanya, and Arabic in a distinctively African style. A few scripts remain on our list that we have not worked on yet, either because they serve smaller populations or because they were only recently encoded.

Kigelia font specimen showing range of scripts supported
Kigelia font specimen showing range of scripts supported
Image of the “Kigelia – A Typeface for Africa” font booklet
“Kigelia – A Typeface for Africa” font booklet

Did you feel confident there would be a market for Kigelia?

Mark and I often joke that what we do doesn’t make sense as a business at all. In fact, we worked on Kigelia for five years with no specific client in mind. 

From the very beginning, we knew we had to work on a font family with an assortment of writing systems included to have any hope of viability in the market. Selling a license to a font for an individual African script is challenging. Given the difficulty in assessing the number of users of a given script, the market potential is unknown. But if you bundle a handful of scripts together, then that makes it more of an attractive prospect to vendors. 

And as we were interacting with script communities, we also realized that releasing our fonts and keyboards in a retail market was not feasible. This is due to a number of factors: the price point for a font being too high for many users, the lack of credit card infrastructure needed to accept payment, and the inability to install fonts on mobile devices. We decided that if we can get our fonts into the hands of Original Equipment Manufacturers (OEMs) to bundle on devices, then everyone would get access for free, and that would ultimately be better for incentivizing usage.

During the development of Kigelia, we did a variety of presentations about the work we were doing. This raised awareness about the scope of the project, and through organic industry connections, we ended up talking to someone at Microsoft, which led to the first license of the font family. It might have taken a while to achieve it, but our plan of releasing our fonts through OEMs worked. So even to this day, we primarily release our fonts directly through OEMs.

In name, JamraPatel is a design studio, but you do much more than design fonts.

We started out just designing fonts, but as we worked with communities and gathered feedback, we realized that there were a lot of infrastructural gaps. The lack of fonts was just the initial problem. There was also a lack of keyboards, a lack of support in many programs, and other issues. This is how our scope of work broadened. If a script can’t effectively be used on devices, then it doesn’t really matter how many fonts are available.

That being said, everything we do outside of licensing fonts or designing commissioned fonts is unpaid; we just do it. The business itself is in the fonts, but developing keyboards, doing advocacy, and submitting bug reports and Unicode proposals are all things we do on the side. That’s just because it is work that has to happen, and giving script communities agency in digital spaces is what matters most to us.

These days, communities reach out to us for help with scripts we have not even been looking into. It is a positive feedback loop because once a community with a nascent script sees the successes of other African scripts, they see the possibilities for their own. This has led to a handful of ongoing projects for unencoded scripts. Some of these projects started about four years ago; others, like Gbékoun and Songhai, have started this year.

When a community comes to us for assistance with a new script, we typically look to see if the script has gained some traction and if there is a group dedicated to developing and promoting the script. These are usually good indicators that a writing system has potential for broad adoption.

Neil with Abdoulaye & Ibrahima Barry, creators of the Adlam script, at their home in Portland, Oregon
Neil meeting Abdoulaye & Ibrahima Barry, creators of the Adlam script, at their home in Portland, Oregon in 2018 to work on a font for Adlam (Source)

What is the process like for working on unencoded scripts?

An encoded script has all the necessary character properties needed for an operating system to effectively render the script. This makes font and keyboard development relatively straightforward. 

But for unencoded scripts, we need to use Private Use Area (PUA) code points, and in some cases, code points assigned to other scripts.1 These code points are not set up to handle the shaping needs of an unencoded script, so they often require a lot of hacky workarounds to render a font properly.
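To make the idea concrete, here is a minimal sketch in Python of how an unencoded script’s letters might be assigned PUA code points. The letter names and the assignments are hypothetical; real projects coordinate their own PUA layouts (and shaping behavior is exactly what this simple mapping cannot provide).

```python
# Sketch: assigning characters of a hypothetical unencoded script to
# Unicode Private Use Area (PUA) code points. The Basic Multilingual
# Plane PUA spans U+E000..U+F8FF; the assignments below are
# illustrative, not any project's actual layout.
PUA_BASE = 0xE000

# Hypothetical letter names for an unencoded script, in alphabet order.
LETTERS = ["ka", "la", "ma", "na"]

# Each letter takes the next free PUA code point.
ENCODING = {name: chr(PUA_BASE + i) for i, name in enumerate(LETTERS)}

def transliterate(names):
    """Turn a sequence of letter names into a PUA-encoded string."""
    return "".join(ENCODING[n] for n in names)

text = transliterate(["ka", "ma", "na"])
print([hex(ord(c)) for c in text])  # all code points fall in the PUA range
```

A font bundled with a keyboard app can then draw glyphs at these code points, but because the PUA carries no character properties, any joining or reordering behavior has to be faked inside the font itself.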

Regardless of whether a script is encoded or not, getting it to work on a mobile device is challenging. Since fonts cannot be installed on mobile devices to function at the system level, we are wholly reliant on there being a Noto font that is deployed with the devices. Google’s Noto project has helped close gaps for under-supported scripts by being licensed cross-platform. If a newly-encoded script is fortunate enough to have a Noto font available, then only a keyboard is required for full functionality.

Screenshot of the Noto Adlam mobile keyboard
Noto Adlam designed by JamraPatel in 2019, released in Android 11 in 2021
Screenshot of the Noto Tifinagh mobile keyboard
Noto Tifinagh designed by JamraPatel in 2019, released in Android 11 in 2021

For unencoded scripts, however, getting something to work on a phone is really difficult. We handle this by building a workaround within our keyboard apps. The way it works is that we bundle a PUA font within the app, and then the user can type a message inside the app and share it as an image to other apps. There’s no live text being sent, but the recipients can at least read the message. It’s clunky and not fun to use because it adds a step to what should be a quick way to communicate. However, if a user is willing to go through the effort to use it, then it is an indication that they are passionate about the writing system. At the same time, this can be a deterrent for many users.
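The share-as-image pipeline described above can be sketched as follows. This is not JamraPatel’s implementation: real keyboard apps rasterize the message with the bundled PUA font and share a bitmap; here an SVG document stands in as a minimal image container, and the font name (“BundledPUAFont”) and message are illustrative.

```python
# Sketch of the text-to-image sharing workaround: the app keeps its PUA
# font private, renders the typed message, and shares a picture of it
# rather than live text. An SVG file stands in for the rasterized image.
from xml.sax.saxutils import escape

def message_to_svg(text, font_family="BundledPUAFont", size=32):
    """Wrap a PUA-encoded message in a shareable SVG image document."""
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" '
        f'width="480" height="{size * 2}">'
        f'<text x="10" y="{size}" font-family="{font_family}" '
        f'font-size="{size}">{escape(text)}</text></svg>'
    )

# A message typed with hypothetical PUA code points (U+E000..):
msg = "\ue000\ue001\ue002"
svg = message_to_svg(msg)
print(svg.startswith("<svg"))  # a self-contained document ready to share
```

The key property is that the recipient never needs the font or any script support: they receive an image, which is exactly why the workaround survives on platforms that know nothing about the script.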

The first keyboard we implemented with this workaround was for Adlam, though we didn’t invent the workaround. There used to be an app called N’ko Pad that did basically the same thing. We stumbled upon it while we were working on Kigelia and adapted the approach to work with a keyboard extension. Keyman, I believe, can do something similar, and it is free and open to use.

Even for recently-encoded scripts, we find that we still need to use this workaround often. When a Noto font is created, it will only be bundled in devices running the latest version of Android. Older devices are not provided with the font. In Africa, the phones are typically two or three generations behind on operating systems. As a result, many users can’t use their writing system even after it has been encoded into Unicode. It can take a few years before the community can see the benefits of encoding. This was the case for Adlam. But with our keyboard, people were able to use the image sharing workaround in the interim. In the early days, we would see the images of Adlam text from our app on Facebook quite regularly.

Would you say you have a fairly standard toolkit now for all of the African scripts on your radar?

The specific strategy and goals actually vary by the stage of development a script is in. We can loosely break them down as newly-created, pre-encoded, and post-encoded. The specifics will also depend on the ambitions that each community has for their script. Some just want a culturally-meaningful way to write their language; others are looking far ahead and striving for their script to operate in parity with Latin script, meaning that it can handle math, science, and legal tasks.

In the newly-created stage, the primary goal is developing a digital means to disseminate the script to a broader audience more efficiently. Writing out textbooks by hand is tedious, and the results are expensive to reproduce. Here we are primarily concerned with developing fonts and desktop keyboards, and ensuring all functions are there in basic desktop publishing software. Depending on the complexity of the script, this can be relatively straightforward or require a lot of iterations. Once the initial assets are delivered, the resulting improvement in efficiency creates more time for the community to refine, improve, and add capabilities to the script, which we continue to support.

Handwritten documents produced in the N’ko script
Handwritten documents produced in the N’ko script (Source)

In the pre-encoded stage, the script has matured a bit, and sights are set on attaining encoding. As a result, the need for the script to be usable online becomes more important. In Africa, language communities are often dispersed across multiple countries. As a result, having the script circulated on the internet is a primary concern. Without an online presence, a script is confined to small geographic regions. To grow the user base, we need to provide mobile keyboard apps that include the text-to-image sharing feature. Script promoters also need to be able to engage the potential user base with regular and interesting content. We contribute to building excitement by opting to make custom keyboard apps for each script rather than building a universal app for all the scripts we are supporting. It’s less efficient for us, but we have found that people prefer tools that speak to them directly. Using one’s own writing system is an act of claiming one’s own cultural identity. We like to reinforce that sentiment. 

Sons of Frédéric Bruly Bouabré looking at the Bété Keyboard app
Sons of Frédéric Bruly Bouabré, the creator of the Bété script, looking at the progress of the Bété keyboard, designed by JamraPatel (Photo by Adam Yeo)

Post-encoding, the needs shift more towards advocacy and market-building. Encoding is a big milestone, but with it comes a greater expectation that the script will work everywhere. Unfortunately, this does not happen quickly. Many popular programs are slow to support new scripts and oftentimes have bugs, which can be difficult to get resolved. However, every additional program that supports a script effectively helps demonstrate to other software suppliers that there is a market for African scripts. During this phase, we may also get involved in propagating CLDR data2 and coordinating translations for machine learning—it all depends on what the community is looking to pursue next.

Neil Patel presenting the timeline of Adlam script support at Typecon 2024
Walking through the timeline of Adlam script support at Typecon 2024 in Portland, Oregon (Source)

Overall, it’s a considerable amount of work to onboard a script. I do find it interesting that we live in a world where a lot of people essentially carry a computer in their pocket that makes it readily possible for them to communicate and create works of artistic expression. Yet writing one’s thoughts in a form of one’s choosing, a fundamental form of human expression, is something that needs to be proven and justified before being enabled. Our current framework works well to maintain the technology and standards side of the equation, but perhaps at the cost of human expression.

In the context of the Unicode Standard, I would love it if nascent scripts that showed some promise could be assigned a provisional code range with all the functionality of encoded scripts. That way, we could have live text on a phone instead of workarounds like text-to-image sharing, whose friction hinders learning cycles and deters adoption. Perhaps after a few years of provisional usage, if there is good evidence of stability and breadth of use, the script could then be officially encoded in that code point range. A framework like this would reduce the lag in support that typically follows encoding, when all the existing tools need to be updated to match the standard.

Zooming out, where would you say new script inventions fit within Africa’s language policy landscape?

What we’ve seen in Africa is that state sponsorship of writing systems other than Latin is rare. There is a lot of ethnic diversity within any given country, which means you have a lot of different communities speaking different languages. There are some colonial underpinnings behind this along with current political complications. In general, though, it’s difficult for a government to say they’re going to support a language and a writing system, because then it opens the door to entertaining all languages in the country equally. Practically, that creates a lot of administrative overhead.

As a result, all of the work in promoting a script ends up being driven by grassroots community efforts. In Guinea, all formal education is conducted in French rather than in a community’s own language. For Adlam, and the same goes for N’ko, the communities operate their own schools. Children go to state-run school during the day, and then they stay at school into the evening and study topics in Adlam or N’ko. It’s a lot of effort for the kids, not to mention all the teachers that have been recruited to volunteer their time. That is a serious commitment.

That being said, there are some shifts starting to happen in West Africa where countries are realizing that maybe they need to get away from French. I think the first country to move in this direction is Mali, which changed the constitution after the most recent coup to say that the mother tongues are the official languages of the country, and French is just a working language. What the government needs to resolve now is how they handle it—the actual day-to-day mechanics of whether to utilize Bambara for all official communications or to use all languages equally. Then, they also need to decide what writing system to use.

Another project that we’re working on is the N’ko Phonetic Extensions, which is basically adding a set of new characters to the existing N’ko script to support all the languages of Mali. The creator, Dr. Boubcar Diakite, has developed characters to support all the languages in Africa should other countries want to adopt it, but for now, the primary focus is seeing if it can be deployed to cover the languages of Mali. This endeavor is still playing out.

Image of the N'ko extensions from the Unicode document registry
N’ko extensions highlighted in yellow for various Malian languages (L2/25-081)

However, we are seeing neighboring regions starting to think about the same idea. I believe there’s a similar sentiment taking hold in Burkina Faso now; even in Guinea, they’re starting to really consider making Adlam and N’ko official writing systems.

Who knows? In 20 years, if there are some successes in this arena and there are models to follow from one country to another, maybe we’ll see more of this happening across the continent. We don’t know quite yet where things will land, but I believe things are changing.

Mark Jamra and Neil Patel at the annual conference of the World Organization for the Advancement of N’ko in 2023
Mark and Neil at the annual conference of the World Organization for the Advancement of N’ko in 2023 (Source)

How would you describe the progress that has been made since you first started in this space?

In the long run, though, I think real progress will be made when homegrown software options are created by African developers.

I think software support for African scripts has definitely come a long way. At first glance, African scripts don’t seem to have much of a presence online. Even for scripts that we know are heavily used, you might only find a couple of websites here and there. If you’re working at a software company, you’re not seeing all the content using the script. It would be easy to ask the question, “Is this something that we need to invest in at all?”

I think there is a discrepancy between what people on the outside see and what is happening on the ground. African scripts are used quite a bit, but online content is mostly on social media and peer-to-peer messaging apps. Printed content is available, but none of the books are catalogued in library systems.3 For unencoded scripts, people will write out books and documents by hand, make photocopies, and sell them in markets. From the outside, it would be natural to assume that there isn’t a compelling reason that these scripts need to be supported.

When we first started, it all seemed like an impossible hill to climb. But it has improved. Collaboration has become easier. The industry at-large is more aware and mindful of the disparities, and more importantly, script communities are aware of entities and individuals like us that want to focus on closing these gaps. The networks of interested parties have gotten larger and more robust. 

Overall, the process is more streamlined, with basic font and keyboard support happening in less time and some bugs being resolved more quickly. In the past, we had to report issues many times before resolution. There are still problems, but there has been a lot of improvement, especially with the large tech companies.

However, when we look at smaller tech companies, we see support for African scripts lacking. This includes a range of productivity tools and social media apps. Much of this software is available for free or as an affordable alternative to mainstream counterparts. I am sure economics plays a role in why these companies have resisted internationalizing their software. However, I also believe there is a lack of awareness about the libraries and tools that Unicode provides to facilitate and simplify software localization. I would like to hope that we see improvement in this area, but I think it would require internationalization to be a focus area in computer science education. Until you achieve the complete localization of your computer and interface, you are always on the periphery. In the long run, though, I think real progress will be made when homegrown software options are created by African developers.

Finally, is there a moment in your work so far that you are especially proud of?

When we made the ADLaM Display font for Microsoft, the reception from the Adlam community was really great.4 It was the first display face they had ever had, which is kind of crazy to think about.5 It was rewarding to see the font being embraced so enthusiastically.

That year, we traveled with Ibrahima and Abdoulaye to Gambia and Guinea to attend some conferences and visit schools. The font had just been released, and it was already everywhere. The event banners were all using ADLaM Display. We were seeing it on T-shirts and product labels. It was amazing to see in use so quickly. Later, Ibrahima told me that after seeing the font, some people that were on the fence about learning Adlam changed their minds because they felt a connection to it.

Four women in Guinea holding Pulaar-language newspapers printed with the Adlam display font
ADLaM Display font being used for titles in a Pulaar-language newspaper in Guinea (Source)

Sometimes, I get frustrated from being mired in the weeds of technical issues, but I am motivated by seeing the difference that fonts and tools make in script communities. The hope is that everything builds momentum. As more people use these tools to create content, it will demonstrate that there is a robust market in supporting African writing systems. 

  1. The Private Use Area refers to several ranges of code points within the Unicode Standard that are left unassigned. Any third party can coordinate and use these blocks for discretionary purposes. Read more here. ↩︎
  2. CLDR refers to the Common Locale Data Repository, an open data project under the Unicode Consortium that contains language (as opposed to script) information to help with software localization. ↩︎
  3. Read more about the challenges of updating library systems in our interview with Charles Riley here. ↩︎
  4. Microsoft produced an in-depth feature following the release of the ADLaM Display font: “Can an Alphabet Save a Culture?”, available here. ↩︎
  5. Display fonts are specialized fonts used in large sizes for headlines or posters. Users often want a range of display fonts for different stylistic needs. Text fonts, in contrast, are typically multi-purpose fonts used in small sizes, where function – that is, legibility and neutrality – are valued most.

    Most minority scripts are lucky to have a single text font available, let alone a display font, whereas Latin script users are spoiled for choice in both categories. ↩︎