2012 was the year that I learned to code for real. I started with the Stanford 106 courses and worked my way through a few online tutorials on Python, then Ruby on Rails. But I didn't really feel that I had a tangible skill until I started making things of my own. A few weeks ago, after a code review of my work, a former software engineer told me that I could probably get an entry-level software engineering job now if I wanted to. It all feels kind of incredible, yet not. What I have learned from this experience most of all is that coding is a lot of ongoing hard work, but it's hard primarily because most people have not had the proper cognitive exposure to it early enough in their lives. Like language or arithmetic, it demands a different type of pattern recognition from your brain, which you learn to adjust to with time. It is not magic or talent; it is simply learning to think differently. That always takes time.
For one of her top ed-tech trends for 2012, Audrey Watters noted that learning to code became a bit of a media and Silicon Valley favorite. It started with Codecademy's genius marketing plan of the Code Year. But the most interesting part of her post was the link to Jeff Atwood's blog post, "Please Don't Learn to Code." The gist of Atwood's argument is that one should not learn to code simply because of a perception that learning to code automatically equates to solving problems or a big paycheck. There are parts of his argument that I agree with; for example, that Michael Bloomberg would be better off attending to his mayoral duties than learning about variables and functions (incidentally, there is great irony in this example, which I will get to below). Then there are parts of it that I don't, which Mark Guzdial put in much more eloquent terms than I ever could:
- Most people who write code are not trying to create code solutions. Most people who write code are trying to find solutions or create non-code solutions. By “most people,” I do mean quantitatively and I do mean all people, not just professional programmers. We know that there are many more people who write code to accomplish some task, as compared to professional programmers….
- Most people who program are not and don’t want to be software developers. Most of the people that I teach (non-CS majors, high school teachers) have zero interest in becoming programmers. They don’t want to be “addicted to code.” They don’t want a career that requires them to code. They want to use coding for their own ends…
The problem is that we in computer science often have blinders on when it comes to computing — we only see people who relate to code and programming as we do, as people in our peer group and community do. There are many people who code because of what it lets them do, not because they want the resulting code.
“You should be learning to write as little code as possible. Ideally none.” And people who want to do interesting, novel things with computers should just wait until a software developer gets around to understanding what they want and coding it for them? I could not disagree more. That’s like saying that the problem with translating the Bible is that it made all that knowledge accessible to lay people, when they should have just waited for the Church to explain it to them. “Please don’t learn to code” can be interpreted as “Please leave the power of computing to us, and we’ll let you know when we’ll make some available to you.”
The last part of Guzdial’s argument sums it up perfectly for me. Programming is a power for solving problems; it should not be reserved for only certain segments of the population that have deemed themselves worthy. Programming is also only the beginning and, in a sense, a rather small part of any problem-solving process. It wasn’t until I built my first web application prototype that I understood quite vividly that getting something that works and responds to you is only a small part of the puzzle; there is the larger question of what the purpose of the software is and how you get it to be actually useful in the world, not simply resting on your server somewhere. The problem does not end with the code. In that sense Atwood is right to advocate for a greater understanding of the problem that you are trying to solve.
But one can’t ignore the fact that, at the end of it, code is still what makes it possible. You can design all the solutions that you want, but if you can’t make them a tangible working concept, it stops there. Code, even imperfect snippets of it, enables that first inkling of possibility. Going back to my side comment on Michael Bloomberg: he was an electrical engineering major whose fortune was founded on a financial software company! You do not have to be a genius to appreciate that, surely, his technical background gave him an advantage in understanding what possibilities existed. He probably did not code a thing himself, but I find it hard to believe that his background was not at least a bit helpful.
In the end, I understand what Atwood is trying to get at. People shouldn’t assume that coding is easy; they should make at least a respectable effort at it if they want to pursue it seriously. But I do believe that everyone should at least try to learn to code; most people never realize they might enjoy it, be good at it, or be able to use it for other purposes, simply because they never had exposure to it. I strongly wish now that I had majored in CS back in college; I simply didn’t understand enough about it at the time.