Don’t Learn to Code – Learn to Automate

source link: https://daedtech.com/dont-learn-to-code-learn-to-automate/

Does anyone remember a few years ago, when the mayor of New York decided to learn to program? It was a heady time, because it wasn’t just him. I remember these surreal commercials where Miami Heat forward Chris Bosh was encouraging children to learn to code for the good of humanity or something. There was this sudden, overwhelming sentiment that humanity should abandon the folly of any non-programming pursuit and learn them some Ruby or whatever. Andy Warhol, were he alive in 2012, no doubt would have said, “In the future, everyone will write code for 15 minutes.”

Jeff Atwood wrote an interesting rebuttal to this zeitgeist, entitled “Please Don’t Learn to Code.” The post covers a good bit of ground and makes some interesting points, but the overarching thesis seems to be, “avoid thinking of writing code as the goal and learn to solve problems.” I think this is an excellent, philosophical point, but I’d like to add a bit of nuance.

I’ve written in the past about how important it is to be a problem solver, to the point where I wrote a post about liking the title “problem solver.” So please don’t think I disagree with his take that a lot of programmers get too hung up on the particulars of code. I don’t — I think that’s a very common issue. But, at the same time, I think the mayor of New York and Chris Bosh and others have a point that Jeff doesn’t really address, per se. Specifically, the world is getting dramatically more technical, which means that a lot of pursuits are being automated out of existence, while other pursuits require an increasing degree of technical savvy. My fiancée, a professional copy editor, finds aspects of her job easier because she knows a bit of HTML and CSS.

So while I wince alongside Jeff at the thought of people randomly learning programming languages because they think it’ll make them rich or because they want to be a person that writes lots of code, I don’t think we can simply say, “stay out unless you’re serious and willing to spend years getting good.” The rapidly evolving technical landscape has created this black hole of technical savvy that’s sucking in even people well past the event horizon.

The advice that I’d offer on this subject creates a pretty fine distinction. I don’t think that everyone needs to learn to code by any stretch. What I think that everyone needs to start learning about and understanding is how to automate. Or, if not how to do it themselves, at least how to recognize things that could be automated and have meaningful discussions about whether the effort is worth it or not.

Automation in Real Life

It might be best to offer an example here. Tonight, I was working with a friend on some course materials to be communicated via PowerPoint. We were making slides that were largely images with a few large-font words scattered here and there. We’d paste the pictures into the slide deck, resize a little, and then laboriously use my Mac’s trackpad to center the images. At one point, he said, “there should be some shortcut key in PowerPoint that you can hit and it just auto-centers the thing vertically and horizontally.”

That, my friends, is automation at the moment of conception. Centering something on a PowerPoint slide is just the kind of brainless, maddening, time-consuming task that makes a good candidate for automation. You know what I’m talking about, where you drag the thing around using movements as microscopic as possible, hoping to see that line appear that indicates that it’s in the middle. It’s so hard to find and you’ve got to go so — wait, there! Crap! Missed it. Back up, try it again.


Of course, the conversation doesn’t end here, by any stretch. The talent for which I’m advocating — the talent of savvy automation — involves an assessment like the following, and a potential automation at the end.

Does PowerPoint already provide this capability and we just don’t know about it?

Doesn’t seem like it after a bit of quick googling. If it did, a blog about PowerPoint secrets would probably need just two lines to describe it: “Step 1: hit the magic key sequence. Step 2: there is no step 2.” But still, it might exist.

Is it worth doing more research?

Not right now. We’re almost done with the slide deck. On a broader scale, it might be worth doing.

Would it be worth it to us over the long haul?

Hard to say. I’m not entirely sure how much time in my life is wasted on this problem. It’s probably some, but I’m not the most avid PowerPoint user, so you figure it might take years for it to add up to hours of my time. And who knows how long fixing this problem would take?

Would it be worth it for humanity over the long haul?

Hmmm… almost certainly. Even if I waste hours on this only after years of PowerPoint use, humanity probably wastes hours on this after microseconds go by. Lots of people somewhere are swearing at PowerPoint right this very instant.

Could we sell it if we made it?

Meh, doubtful. But maybe for a pittance here and there. Or maybe we could just do it, donate it to the greater good, and reap a reputation benefit.

Okay, so what’s the next action?

Let’s set aside a time box of an hour to research this at a good time for low-priority tasks, such as sitting in an airport. A nice interim goal might be to see if we can get it going in a limited scenario. On just a blank slide, does the “ctrl-E” shortcut do the job for horizontal centering? If so, can we find a way to do vertical centering and then somehow chain those shortcuts? If yes, then, bam, problem solved with just a bit of configuration. We’ll know we’ve succeeded when we can hit a single shortcut and have a single image centered horizontally and vertically on a blank slide. If that little test works, we can write a blog post, bask in some win, and then move on to more scenarios.
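And if the shortcut route fizzles, the fallback is a few lines of scripting. Here’s a rough sketch of what the “auto-center” automation could look like, assuming the third-party python-pptx library (Presentation, slide_width, and the shape position attributes are its real API); “deck.pptx” is a hypothetical file name. The key insight is that PowerPoint stores positions as plain numbers (English Metric Units, where 914,400 EMU equals one inch), so centering is just arithmetic — exactly the precision work computers are good at.

```python
def centered_position(slide_w, slide_h, shape_w, shape_h):
    """Return the (left, top) offsets that center a shape on a slide.

    All values are in EMU (English Metric Units); 914400 EMU = 1 inch.
    """
    return (slide_w - shape_w) // 2, (slide_h - shape_h) // 2


def center_pictures(path):
    """Center every picture on every slide of a .pptx file (sketch)."""
    # Imported here so the pure math above runs without the library.
    from pptx import Presentation
    from pptx.enum.shapes import MSO_SHAPE_TYPE

    prs = Presentation(path)
    for slide in prs.slides:
        for shape in slide.shapes:
            if shape.shape_type == MSO_SHAPE_TYPE.PICTURE:
                shape.left, shape.top = centered_position(
                    prs.slide_width, prs.slide_height,
                    shape.width, shape.height)
    prs.save(path)


# Sanity check on the math with a default 10" x 7.5" slide
# and a 1" x 1" image:
print(centered_position(9144000, 6858000, 914400, 914400))
```

Running `center_pictures("deck.pptx")` would be the whole automation; the point isn’t this particular script so much as that the problem reduces to one subtraction and one division once you stop dragging a trackpad.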

Poor Man’s Coding?

Okay, so that was a little hokey, but I’d like to point some things out here. First up was identification of a crappy task and the recognition of an automation candidate. Figuring out the exact coordinates for centering something is the perfect job for a computer. After that came a sequence of questions contemplating the possibility that there might be an existing solution, that coming up with a new one might not be worth it, or that coming up with a new one might be time-consuming enough to offset any gains. In the world of software development, that goes by various names such as “discovery,” “requirements analysis,” and “sprint 0.”

After that came a tentative plan of action along with some risk mitigation. Let’s invest a bit of effort in seeing what we can do, but let’s cap that amount of effort so that we don’t run off tilting at windmills. And then there was a concrete strategy that involved carving a larger potential effort down into the smallest slice that might provide some incremental value. This is known in the software development biz as an agile (or, if you want, “lean”) approach to a software project. And then, finally, despite no actual code being written, came implementation and a clear, verifiable description of what success looks like. In the biz, that’s called “Acceptance Test Driven Development” (ATDD).

But forget the terms and the parallels to programmer shop-talk. The more important point is that successful software development projects — projects that involve code and IDEs and compilers and whatnot — are just a special case of successful automation projects. You can automate all manner of things, even some very non-trivial things, without actually writing any code. In this vein, Jeff’s point is absolutely spot on. Coders like to code, but writing code ought not to be a first class goal when there are problems to be solved.


Throughout human history, there’s been a sort of “pain is gain” approach to the repetitive. There was value in putting your head down, getting into a rhythm, and working hard at menial tasks. But throughout most of that human history, we didn’t have computers. It turns out that computers are really, really good at doing repetitive, menial tasks — tasks that involve precision and not judgment. They’re far better at them than we are, so it makes sense to let computers do them.

It’s obtuse to suppose that a prerequisite for every job in the future will be the ability to implement sophisticated, specialized computer applications. But it’s not at all obtuse to suppose that, given the ubiquity of computing, a prerequisite for every job in the future will be the ability to recognize which tasks are better suited for humans and which for computers. Learn at least to recognize which parts of your job are a poor use of your time. After that, perhaps learn to use your ingenuity and creativity to automate using the tools that you know (such as googling for solutions, leveraging apps, etc.). And, if you’ve come that far, maybe it’s time to roll up your sleeves and take the plunge into learning to code a little bit to help you along. Because, while there’s no need for the mayor of New York to write any code, it couldn’t hurt him to be ready to jump if something opens up in the city’s IT department.
