source link: http://rachelbythebay.com/w/2011/11/12/learning/

Our old 8-bit machines were not that easy, either

My post on asking permission to learn generated a bit of feedback. I got a couple of comments which basically said that "where should I start" is still a valid question. I'll try to continue down that line and see if we can come up with a hypothesis which works for everyone.

The common sentiment seems to be that computers now are complicated and they used to be simple. By extension, they are now complicated to program and they used to be simple to program. Further, this can be explained by the fact that you used to drop into a text-based environment where you had to type commands, and now you usually don't.

Who really remembers back to those early days of using such machines? I remember poking at the computer in my neighbor's bedroom. She was a bit older than me and had gotten one from her step-dad. I guess they turned it on for me and then had to duck out before showing me how to do anything with it.

What did I see? This.

[Image: VIC-20 BASIC ready screen]

Granted, it was fuzzier and the cursor blinked since it was the real thing on a TV set instead of a screen shot, but that's it. That was the entire interface as far as I knew. I probably typed a few English words at it and had nothing happen except the usual "?SYNTAX ERROR READY.", and that was that.
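To make that concrete, here's roughly what such an exchange looks like at a Commodore BASIC prompt (a reconstruction of the general behavior, not a transcript of that actual session): you type something the interpreter can't parse, press RETURN, and it complains.

    HELLO COMPUTER
    ?SYNTAX ERROR
    READY.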

Some time later, she upgraded to a C-64 and I got that VIC-20. Yep, I was incredibly lucky. My family's first computer was all mine. Now, it could have been just a lifeless box, as it was when I first saw it. Or maybe someone could have shown me how the cartridges work and then it might have been nothing but a game machine.

I did get cartridges with it, and I did play them, but I didn't stop there. The reason I went beyond that stuff with my machine is this:

[Image: VIC-20 manual]

This manual starts by telling you how to hook everything up in the preface. Then the first thing it has you do is type in a simple two-line program. From there, it builds up slowly, showing off new things: editing what's already on the screen, then colors and sounds.
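The classic two-line opener is a print-and-loop program. Here's a sketch in Commodore BASIC of that general style, not necessarily the manual's exact listing:

    10 PRINT "HI THERE"
    20 GOTO 10

Type RUN and the screen fills with the message until you press RUN/STOP.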

Eventually it gets into simple question-and-response programs, and this is where it captured my attention. One demo does temperature conversion. I realized that I could teach the computer to do things and then have it do them for other people. All of my early programs reflect this.
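In the same spirit (again, a sketch rather than the book's exact code), a question-and-response temperature converter in Commodore BASIC might look like this:

    10 PRINT "DEGREES CELSIUS";
    20 INPUT C
    30 F = C * 9 / 5 + 32
    40 PRINT C; "C IS"; F; "F"

Type RUN, answer the question, and the machine does the arithmetic for you, which is exactly that "teach it once, then let it do the work for other people" moment.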

Now we can start putting the pieces together. It's not the interface alone which encouraged me to learn how to program. Let's face it. Just because we can wrangle BASIC and still type in a quick display hack on a 30-year-old machine doesn't mean we knew any of that stuff the first time around. Those machines were totally black boxes until you latched onto a manual.

Even then, the manual could have just talked about graphics and music and stuff like that, and it wouldn't have done anything for me. It had to come up with an interesting use case for me to stick with it.

It's a combination of things. First, I needed access to the machine. Second, I needed to know what it could do. Third, I needed to see something that I wanted to do with it that actually required learning BASIC in order to deliver it. Miss any of those and it falls apart.

These days, access to computers is far better for much of the world. You can assume that the people we're talking about here who would ask the "where do I start" question have access to one already. Okay, that's part one. You can probably also guess that most of those same people are going to think that a computer is in fact capable of doing whatever it is they want to do.

It's not like before, where the idea of a box which could ask your name and tell you how to hook up your TV and VCR for stereo sound was totally far-out, man. These days, I'd think most people would acknowledge that these devices can in fact do mundane tasks like that.

That's one and two. What about number three, which is finding a use case that actually requires programming skills to resolve? Now we're getting somewhere. Think about all that computers already do for people right now. They probably make most users pretty happy straight out of the box and do just about everything they really want.

So really, it's just down to those corner cases where someone comes up with a new idea and then wants to implement it. They should then go off and do the research and discover that programming may be involved here. Maybe this is something which can't be solved with existing tools or products.

At that point, they have a goal, and they have some idea of what's standing between them and it, and that's a great start. It also provides a perfect answer to my proposed question from last time:

What have you tried so far?

This time, the answer might be "Well, I want to listen to all of the SJC frequencies at once but can't do it with my normal scanner so I decided to try to write a program to do it for me". Okay! Now we're getting somewhere!

This is why wanting to "learn how to program" just because it's there makes no sense to me. If there's nothing you want to do which requires it, why do you care? Why even bother? Direct that energy into something else.

For what it's worth, I'd probably respond the same way to anyone who wants to "learn how to X" just because it's there and not because they want to accomplish things with it. Programming is just a shiny topic. X could also be "chop up a /24 into smaller subnets". Knowing how is all well and good, but if you're not going to use it, what's the point?

I'm a pragmatist.

