Architecture and Hardware

Pen-Based Computing: Still Looking For The Write App

A basic configuration for pen computing.
"I believe we were in a kind of 'sketch Winter,' like there was an AI Winter. And the reason I did this paper was because I believed we were exiting the 'sketch Winter' and entering the'sketch Spring' or 'pen-based Spring'."

In 2004, a story in The New York Times quoted Andy van Dam, Thomas J. Watson, Jr. University Professor of Technology and Education and professor of computer science at Brown University, about likely audiences for widespread adoption of pen-based computing. Van Dam, a computer graphics and human-computer interaction research pioneer, as well as an ACM Fellow, suggested pen-based interfaces might become popular among educators, architects, and graphics and interface designers.

Brown University subsequently became a hub of pen-based computing research. In 2006, it became the site of the Microsoft Center for Research on Pen-Centric Computing, and after that project's funding ran its course, the university rolled its activities into its computer science graphics group.

Alas, van Dam recently told Communications, research into pen-based computing has become moribund at Brown, as it has in research labs generally; wider adoption outside the lab has also languished.

"We have an 84-inch pen-and-touch display," van Dam said. "Microsoft does not maintain it. We don't know how to maintain it. It would probably be very simple for the people who built it and supported it to fix it, but it's sort of broken and we use it as a big TV screen at this point. And we have gone back basically to what I think is a much poorer interaction device, which is the mouse and keyboard.

"Part of it I blame on Steve Jobs, whom I knew quite well. He was passionately opposed to pens and killed any idea of pens on iPads and iPhones, and they were the standard-setters. I think he held back what could have been a revolution."

Van Dam's colleague at Brown, research director Robert Zeleznik, said the recent lack of activity doesn't mean pen-based computing is dead, but rather that the direction in which research needs to go should be re-calibrated. "We have shied away from using the pen in the last four years not so much because we are not interested in it, but because I think it works best in niche areas," he said, explaining that pen-based computing will be an evolutionary discipline-by-discipline undertaking, rather than a revolutionary mass way of communicating with devices.

Joseph LaViola, Jr. was van Dam's Ph.D. student at Brown and did intensive research into pen-based computing. LaViola said he has also noticed a distinct slowdown in research in the area, but that there are disciplines for which a pen could still indeed be mightier than the keyboard, particularly in education.

"There is research to be done," LaViola said. "The research that needs to be done is more in trying to understand how we can utilize pen-based computing from an application perspective. Pen-based computing is ideal for educational applications, because a lot of the things we do in education deal with two-dimensional languages, which are a lot easier to write than to type—mathematics, chemistry, and music, for example. You can actually just draw it out on the screen as you would with pen and paper, and the system recognizes that and does something with it."

LaViola, now a professor of computer science at the University of Central Florida, pioneered pen-based educational applications while at Brown; his doctoral work included the development of an interactive sketch-based mathematical instruction platform called MathPad. The platform recognizes handwritten equations and drawings and subsequently animates the equations, bringing static concepts to life. The technology was commercialized in a company called Fluidity.
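MathPad's actual recognizer is, of course, far more elaborate than anything that fits here, but the recognize-then-animate pipeline it embodies can be sketched in a few lines. In this toy Python sketch, `recognize_ink` is a hypothetical stand-in for the real handwriting recognizer, and the free-fall equation is an invented example of what a user's ink might be recognized as:

```python
# Toy recognize-then-animate pipeline in the spirit of MathPad.
# recognize_ink is a hypothetical stand-in for a real handwriting
# recognizer; it returns the equation the ink was "recognized" as.

def recognize_ink(strokes):
    # Pretend the strokes were recognized as free fall from 100 m:
    # y(t) = y0 - 0.5 * g * t^2
    return lambda t: 100.0 - 0.5 * 9.8 * t * t

def animate(position, steps=5, dt=0.5):
    # Sample the recognized equation at successive times to drive
    # an on-screen animation of the user's sketch.
    return [round(position(i * dt), 2) for i in range(steps)]

frames = animate(recognize_ink(strokes=None))
```

The point of the design is the separation: recognition turns ink into an executable expression, and animation is then just repeated evaluation of that expression over time.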

Pen-based winter?

LaViola said he currently is not pursuing any pen-based research, adding that he thinks interest in it has cooled based on the small number of papers presented at conferences. As an example, he said a recent computer-human interaction conference had just one pen-based paper presentation, while previous conferences might feature three or four.

Yet pen-based computing is not dead by any means; it may just have been in a "winter" akin to dormant periods in AI research, according to Ian Arawjo, who worked with colleagues from Cornell University to create Notate, a pen-based project that integrates pen-based programming and textual coding in Jupyter notebooks. Arawjo and his colleagues presented their findings on Notate at the 2022 ACM Symposium on User Interface Software and Technology (UIST 2022). The paper, on which Arawjo was first author, compared the efficiency of integrating hand-drawn elements of quantum programming within lines of traditional code against typing the same commands; it received an honorable mention.

The publication of the Notate paper represented 10 years' worth of work for Arawjo, who will join the faculty of the University of Montreal in January 2024, where he will conduct additional human-computer interaction research. Arawjo said numerous factors had held his vision back; when he first purchased a Microsoft Surface tablet and pen in 2013, the pen simply was not advanced enough for rigorous research. He also felt the larger research community had perhaps turned its attention away from pen-based research prematurely.

"A lot of work on pen-based computing and what we call sketch recognition was done in the late 1990s and early 2000s and a lot of prominent human-computer interaction researchers got their start on pen-based interfaces," Arawjo said. "And when I tried to talk to people who had worked in that area around 2010, they would say things like 'I don't work on that much anymore because we have already done that.'

"It always bothered me, because the hardware was not really there yet. We didn't have the Surface or the Apple Pencil. Right now, we take things for granted, but there had to be a lot of hardware improvements in order to get there."

Additionally, he said, recent achievements in deep learning have made it possible to advance beyond the delicate architectures of early pen-based experiments: "If you read those papers closely, you'd realize that what they were talking about sounded very great, but in fact, you had to write a very particular way, you had to use a very particular system setup. It was much harder to use than these papers would suggest.

"Now, with deep learning and AI systems, the software is there. For example, it's now possible to do offline sketch recognition, whereas before you had to do online sketch recognition, and it was pretty wonky. You had to write a symbol a very certain way in order for it to get recognized. There are many fewer problems like that now."

In essence, Arawjo said, basic hardware and software capabilities are now robust enough that domain-specific research in pen-based computing could be easier to implement and for others to understand.

"I believe we were in a kind of 'sketch Winter,' like there was an AI Winter. And the reason I did this paper was because I believed we were exiting the 'sketch Winter' and entering the'sketch Spring' or 'pen-based Spring'. This kind of work needs another look by human-computer interaction researchers who might have stopped looking at it because they looked at it in the 2000s and said it was over."

Culture and subculture

The Notate paper is as much a review of the culture of programming as it is a technical description of pen-based coding, which Arawjo and his co-authors termed 'notational programming' in the context of their users' sketches interacting with text-based code. The method is nothing new, as they pointed out: "At the advent of programming, the earliest computer programming notations were handwritten, not typed," they wrote in their introduction to the paper. "In the celebrated 1945 First Report on the EDVAC, for instance, John von Neumann equated diagrams to text and vice-versa."

For Arawjo, a resurgence of pen-based programming would be no less than a return to a more equitable ecosystem, in which people who are more comfortable communicating with devices visually are accommodated to a degree the typed paradigm does not reach.

"When someone is a member of a dominant culture, they often don't perceive that it can be not as good for other people," he said. "So that's the key thing with pen-based computing that I am really excited about; we can get back to a more naturalistic mode of being, where we are not caught up in these distinctions so much."

Indeed, the Notate paper found that pen-based programming matched and sometimes exceeded the performance of a typed coding approach to the same task.

"That which is 'better' depends on the task at hand, how the design of the notation affords or resists encoding a particular solution, and the background and preferences of who is trying to apply it," Arawjo and his colleagues found. "Taken collectively, these findings support our 'heterogenous'(sic) vision of notational programming—for designs that mix modalities, instead of demanding one for all time."

A promise, but no 'penacea'

Though Arawjo said he can envision domains in which pen-based computing could emerge as a viable programming option—data visualization programming and game design came immediately to mind—he also said the larger infrastructure that would support such an interface is sorely lacking; the system he used to test Notate, which included retraining a machine learning model that worked in concert with traditional computer vision techniques, took a year to build.

"Don't get me wrong, developing this notational programming system, this deep learning model, was hard," he said. "And there were errors in it. And I think there is a way to go to make the process more streamlined. There is infrastructure that has to be built, to make building sketch recognition applications easier, for instance. We need more powerful programming tools, that could be textual to be honest, just to build these new sketch recognition or pen-based interfaces." Specifically speaking, within the bounds of Notate, Arawjo and his colleagues outlined debugging tools and designs that manage or mitigate the mode-switching between keyboard and pen as examples of needed infrastructural improvements.

Ultimately, according to both van Dam and Arawjo, the future of pen-based computing may come down to a combination of a user base more mindful of drawing precision and artificial intelligence platforms that can help mitigate the need for precision beyond the capabilities of all but a few users.

"Suppose there was an alternative reality where we had to sketch really precisely in order to communicate with a computer," Arawjo said. "Then almost everyone in the world by necessity would get very good at drawing, and they are not because that is not how the power of computing has traditionally been afforded to them. They can type. But typing did not come naturally either. Once there is a critical mass with sketching, then it becomes more okay to say you should probably be better – and the onus isn't on the computer but on you, because that's how you unlock the power of the computer."

Van Dam, who worked on optical character recognition applications decades ago, said machine learning's capabilities have helped advance the field beyond what was possible when Zeleznik and LaViola were working on the early versions of LaViola's math platform, and may play into the future of pen-based computing.

"In the 1950s and 1960s, we had to pick features, and some of the early algorithms Bob and Joe used for the math recognition had to use features that were baked in or had to be added on, but machine learning does away with all that. It finds features that make no sense to humans. It just works and no one knows why," Van Dam said.

He advised, "Don't give up on pens. Don't give up on gesture recognition. Just accept they won't be the universal solvent.

"It's just not going to happen, and part of it, interestingly, is kind of a cultural thing. If (Apple's Steve) Jobs, with his kingmaker ability, had said 'pen is one of the natural input modalities, as well as touch', a lot of kids and teenagers and young adults would have said, 'okay, we follow you, you are our thought leader, here are a bunch of apps that make good use of that', and it would have become part of the technology user ecosystem. But because he was so passionately against it, the opposite happened."


Gregory Goth is an Oakville, CT-based writer who specializes in science and technology.
