The Two Cultures of Computing
User Culture Versus Programmer Culture
There are now two main cultures in computing: most computer users treat software as a tool for getting tasks done, while programmers hold conversations with their software. One big challenge when teaching programming, regardless of language, is getting students accustomed to a conversation-oriented programmer culture, which is very different from a tool-oriented user culture.
The Two Cultures originally referred to C. P. Snow's famous schism between the sciences and the humanities. However, I've noticed a similar schism in computing between users and programmers, one that makes it hard to teach programming to beginners.
In computer user culture, each piece of software is a tool for getting something done, like a virtual notepad or paintbrush. For example, Microsoft Word is for writing reports, Excel is for managing budgets, Spotify is for listening to music, and the iPhone Camera App is for taking selfies.
Each app on a user's computer, tablet, or smartphone is a self-contained digital tool with a specific purpose.
And like physical tool design, a good graphical user interface (GUI) design provides affordances to help users understand how to interact with that particular interface. For instance, Spotify (shown above) provides affordances for clicking on song titles and scrubbing the volume and playhead sliders.
In contrast, in computer programmer culture, each piece of software is an agent with whom one can hold a conversation, via either programming or chaining together shell commands. Sure, each piece of software is still a tool, but programmers “talk” to their software by issuing textual commands rather than clicking with their mouse or swiping with their fingers. Thus, these interfaces are usually text-based:
For instance, a programmer can tell Git (shown above) to save a new version of a file by typing a command such as `git commit`.
Challenges in teaching beginners
When I volunteered to help teach programming to a group of librarians earlier this year (read Teaching Librarians Programming for details), I witnessed this cultural schism firsthand.
The 35 students all came from user culture, whereas the 6 instructors all came from programmer culture. Not only did the students need to learn new technical skills, but they also had to get accustomed to the ways of programmer culture. And some instructors – who had been living firmly in that culture for decades – had a hard time relating to the students, despite their most sincere efforts.
As a result, it was sometimes hard to motivate students to learn programming-related skills, since they felt like they could accomplish similar things with the tools that they already knew. And it's not just this one isolated event: When I've witnessed programmers teaching beginners in several other contexts, this same Two Cultures schism keeps cropping up again and again.
I'll now dive into specific examples:
Graphical versus command-line interfaces
The first shock to students was that they had to toss aside decades of advances in graphical user interfaces (culminating in Beyonce-on-demand) ...
... to work in a plain-text command-line interface from the 1960s:
It's comically absurd that the most sophisticated computer programmers nowadays are still using an interface that's remained unchanged since before humans landed on the moon. What the duck?!?
“So you programmers spend all of your days typing in that boring thing?” Ummm, we might customize our shells to have pretty colors or weird custom keyboard shortcuts to make our computers even less usable for others, but basically, yes. (A related problem – jargon – then arises: What's a shell?)
When the first instructor of the session plugged his Linux laptop into the projector, it immediately crashed (“because Linux”), so he had to use another instructor's Mac laptop for his lesson. However, that other instructor's command-line shell had all sorts of weird-ass customizations that the first instructor had no idea how to use. And the best part was that the students were all using the default terminals on either Mac or Windows, which looked nothing like the souped-up shell that the instructor was projecting on the screen and struggling to control.
The instructor then tried to extol the power of the command-line interface by showing a canonical UNIX trick – piping together multiple commands. For instance, how would you search for the word “widgets” in a bunch of Python files? A ha, presto!
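The command in the original lesson was shown on screen; a representative reconstruction (the instructor's exact flags may have differed, and the sample file here is invented so the pipeline has something to find) looks like this:

```shell
# Set up a sample Python file containing the word "widgets"
mkdir -p demo
printf 'print("widgets are fun")\n' > demo/example.py

# find lists every .py file under demo/; the pipe hands that list
# to xargs, which runs grep to search each file for "widgets"
find demo -name "*.py" | xargs grep "widgets"
```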
Each token in that command needs an entire hour of explanation on its own (what's “minus name”?!?), but let's just suspend disbelief for one second. Then this question inevitably arises, “Why can't I type that term into the search bar at the top of my screen?”
BAM! The Mac OS Spotlight search instantaneously finds all occurrences of the word “widgets” and also sorts by file types.
Of course, uber-geeks know that the UNIX pipeline is infinitely more flexible, more customizable, and more programmable than Mac Spotlight search. But how can they convey those advantages to students who have never seen a command-line interface before? Just waving their arms and shouting “because, because UNIX!!!” isn't going to cut it.
I don't have a magical solution to this issue. I just want to highlight the fundamental cultural disconnect between programmers, who routinely talk to their software via decades-old command-line interfaces, and ordinary users, who exclusively use modern GUIs. Programmers often forget how massively bizarre it is that we're still typing text into a monochrome box when we now have more computing power in our pockets than entire countries had back in the 1960s.
Rich-text versus plain-text editing
To start programming, students need to first open a text editor. You mean like Microsoft Word? Can I embed my favorite doge picture, write a caption, and highlight it in bold?
No! Instead, you've gotta use something that, again, looks like it's straight out of the 1960s ...
Hmmm, not nearly as exciting.
(Also, cue the text editor religious war amongst programmers. Computer users are befuddled at how programmers can get so worked up over a text editor. It's just a tool for printing letters on the screen!)
Wait, why are we ignoring decades of advances in modern word processors to use something that simply lets you type in plain text without any pretty pictures or formatting?!? There's not even a spell checker. (Sure there is ... just activate M-x Alt Ctrl-J Esc qq ::spell-check mode!)
Students are starting to grow suspicious: These instructors are supposed to be expert programmers, but their tools look pretty primitive to me.
Excel versus Python
Next comes some actual programming. How about we use Python to process real-world data and then draw a few charts?
Okay sure, let's fire up our trusty 1960s-era text editor (not Microsoft Word) and write some code. Wait, first we need to install the proper add-on libraries such as NumPy and Matplotlib. [an hour of troubleshooting later, especially for Windows users ...] Okay, let's write some code. [type, type, type] Yeah, isn't this fun and intuitive? Python makes it all so easy ...
“Uhh, why can't I just use Microsoft Excel? I've used Excel for years to analyze data and draw charts. See, look, I can do a few clicks, and BAM!”
At this point, the instructor is thinking, “but, but, but Python rules!!!” Okay, I love Python just as much as the next Python fanatic, but right now, this isn't looking so hot:
The more general form of the student's question is, “Why should I spend all of this effort to learn programming – not just a new language but loads of add-ons and libraries, whatever those are – when I can just use my existing GUI-based tools such as Excel, SPSS, or Stata to do all of my work?”
Every instructor who aspires to teach pragmatic programming to working professionals needs to develop a compelling answer to this question. Potential arguments in favor of programming include scalability (what happens when you have a million data points?), flexibility (what happens when you need to reformat your data in quirky ways?), and automation (what happens when you need to re-run variants of your analysis on a hundred related data sets?).
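That last point, automation, is the easiest one to make concrete: a short shell loop applies one analysis to any number of data sets. Here's a toy sketch (the file names and the awk one-liner standing in for "the analysis" are invented for illustration):

```shell
# Two tiny "data sets", each a single column of numbers
mkdir -p data results
printf '1\n2\n3\n' > data/jan.csv
printf '10\n20\n' > data/feb.csv

# Re-run the same analysis (here, a column average) on every file;
# adding a hundred more data sets requires zero extra clicks
for f in data/*.csv; do
    awk '{ sum += $1 } END { print sum / NR }' "$f" \
        > "results/$(basename "$f" .csv).txt"
done

cat results/jan.txt   # 2
cat results/feb.txt   # 15
```

In Excel, re-running a hundred variants of an analysis means a hundred rounds of clicking; here it means changing nothing at all.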
Copy-rename-email versus version control
Next comes version control. All (good) programmers need to learn how to use a version control system for backup and collaboration. So let's learn how to talk to Git:
[hours of jargon-filled confusion and typo-laden frustration later ...] Okay, so you're finally able to save multiple versions of your sample file and then restore old versions from the command line. Cool!
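A minimal session matching what the students eventually accomplished might look like this (the file name, commit messages, and identity settings are invented for illustration):

```shell
# Start a fresh repository
mkdir project && cd project
git init -q
git config user.email "student@example.com"   # identity Git requires for commits
git config user.name  "Student"

printf 'draft one\n' > report.txt
git add report.txt
git commit -qm "First draft"                  # save version 1

printf 'draft two\n' > report.txt
git commit -qam "Second draft"                # save version 2

git log --oneline                             # list both saved versions
git checkout HEAD~1 -- report.txt             # restore the previous version
cat report.txt                                # back to "draft one"
```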
“Wait, but why can't I just make copies of my files and rename them like I've always done?”
“... and then email those files to my colleagues? We use the Track Changes feature within Microsoft Word to see what we've edited, and it works fine for us.”
Now every single programmer will begin gouging their eyes out. ERGHHHHHHHHH!!! SO ... MUCH ... WRONG ... WITH ... THAT ... SENTENCE!!!
But take a step back and think about it. From the perspective of an ordinary computer user, which is easier?
Again, the burden is on the instructor to show the longer-term benefits of version control, which hopefully justify its steep learning curve. But of course, version control only works well for text-based file formats, which finally brings us to ...
Microsoft Word versus LaTeX
An example that the instructor used to teach version control was tracking changes to his resume, which was formatted in LaTeX. After an hour of Git-and-LaTeX-fu, he paused for a break. Then a group of students flagged me down and quietly asked, “What's LaTeX?” I made my best attempt at an explanation, but they just looked at me blankly and asked, “So why not use Microsoft Word with Track Changes?” I was stumped.
But what about the magic of version control, GitHub, pull requests, forking clones, cloning forks, diffing forks, forking diffs, etc., etc., etc.? None of that matters for someone who works with binary file formats (i.e., user culture) rather than plain-text formats (i.e., programmer culture). There was such a disconnect between the two cultures that it was hard for me to come up with responses to these sorts of “why use X rather than Y” questions without sounding either incomprehensible or patronizing.
Call to action
Okay, so as an experienced programmer, what can you do to help beginners learn?
I think the most important first step is to acknowledge that students are often coming from a vastly different culture than the one you inhabit, and that programmer culture just seems weird to them.
The tools that you use every day at work look like they're straight out of the 1960s. And to students, it seems like you're typing gibberish – code – into the computer, and it's spitting more gibberish back at you.
You need to gently introduce students to why these tools will eventually make them more productive in the long run, even though there is a steep learning curve at the outset. Start slow, be supportive along the way, and don't disparage the GUI-based tools that they are accustomed to using, no matter how limited you think those tools are. Bridge the two cultures.
Best wishes, and happy teaching!
Note that by “programmer culture,” I mean a UNIX-style programmer culture. There are lots of programmers who work completely within proprietary IDEs (e.g., Matlab, Xcode, MS Visual Studio) with specialized workflows for version control, rich searching, and task automation. However, the high-quality, widely available free software that is most likely to get beginners hooked on programming – to turn users into programmers – is almost always written in a UNIX style. That's why my article focuses on this culture.
Joel Spolsky explores these nuances further in his Biculturalism essay on Joel on Software.
Last modified: 2013-12-24