everything wrong with free software

 "obedience breeds foolishness"

### beyond-computerphobia

other pages: [[computing-fundamentals]] | beyond-computerphobia | [[treating-computerphobia]] | [[computerphobia]] | [[letters-to-a-computer-student]]

*originally posted:* jan 2021

moving past computerphobia is a process of addressing the myths and fears the learner has accumulated. they may not be entirely aware of these fears, which include a general fear of making mistakes as well as of harming a complex machine.

one of the ways to address this is to simplify the machine. removing features, and even physical parts of the machine, makes it simpler to explain and teach about. computers and their software have many layers, though as explained in previous sections, they were a lot simpler years ago. an old 8-bit computer can be as straightforward as a game console (even with a builtin keyboard), particularly if you dont bother with the floppy drive or peripherals other than a screen. compared to a more modern pc, an 8-bit machine that is every bit as much a computer, even with its "operating system" stored on and loaded from a rom chip, may look like a toy to someone judging it by its outward appearance.

the perception of a computer as a toy, rather than a sophisticated instrument, is beneficial to someone who has yet to fall in love with computers. the goal is to instill comfort with the machine, regardless of whether the learner "loves" computers or not. the assumption that the computer is just a toy will most likely help towards this goal.

as much as we want to move away from the idea that the system is very complicated, we also want to move away from the idea that the system is very rigid in its design. it isnt-- programmability is a feature generally inherent to "modern" computers (its sort of the point) and programmability is a type of extreme flexibility. *coding lets you literally define and redefine what the computer does*. to let a user remain unaware or barely aware of this fact is to basically deny them the point of computing itself.
we certainly want to get the point across that "most of the things the computer does, can be done differently *if you want*". if the computer does what the user wants, they may use it; *if the user only does what the computer wants, the computer uses them*.

applications have made this worse in many regards; they give the impression that a particular program or application is "what the computer does" and that the computer works the way the application is designed. *this is completely backwards*-- the application borrows its abilities from the machine; the machine cannot do anything with a program that it was not already capable of before the program was written. whether it puts text or pictures on the screen, whether it makes or records sounds, whether it talks to other devices electronically, the program does not instill these abilities in the computer-- it accesses them.

to some degree this is a quibble, because a sophisticated program can certainly "expand" on the basics of the design. but even if we design some adapter that, say, allows you to connect an old cassette drive interface to a modern hard drive or high-speed network (presumably with a good deal of buffering and maybe even a small computer like an sbc as part of the interface between the two) the older computer is still doing what it was designed to do: communicate data over a port. new software can provide new applications for this ability, but it still borrows a feature that the computer had prior to software being added to the system.

a useful analogy for this is used in arguments against software patents: if you own the patents on a car, and someone "invents" a way to use a car for delivering pizza, they havent really added anything to the car that the design wasnt already capable of. yes, you can be the first to deliver a pizza that way, but it is not a separate invention from what the car could already do. a computer is, fundamentally, just a fancy calculator.
instead of inputting numbers directly, it accepts all sorts of data-- pictures, sounds, text, movement-- all of which are converted to numeric data by some device, which the computer processes (like any other math problem) and then outputs as numeric data, which the devices convert back to pictures, sounds, text, etc.

the really amazing thing about this is that thanks to lower-level programming, most programmers (or users) dont even have to worry about a great deal of math. you really dont have to be good at math to write a program or use a computer-- just as a graphics chip takes care of converting numeric data to pictures on the screen, programming languages take care of converting words and punctuation to (and from) numeric data. in plain english, this means that if you want to tell the computer "hello", you dont have to toggle or key in 104, 101, 108, 108, 111-- you can just say "*hello*" and the computer will turn that data into numbers *for you*. the word of course means nothing to the computer really-- but, because the computer will always convert "hello" to numbers, you can very easily get it to recognise that pattern and respond to it *as if* it knew what "hello" actually meant.

an easy way to think of this is to imagine that youre teaching a command to a dog. if you ask an untrained dog what the word "sit" means, they arent going to use sign language or try to pantomime an explanation of sitting. if they know what "sit" means, theyre just going to sit, or not sit. similarly, it is mostly trivial to get a computer to respond to "hello" with whatever its supposed to do when you say hello: "*hello, computer*" you type on the keyboard.
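to make the letters-to-numbers idea concrete, heres a tiny sketch in python (an arbitrary modern language, not one the text assumes)-- the language converts between characters and their numeric codes for you:

```python
# "hello" as the computer stores it: one numeric code per character.
text = "hello"
codes = [ord(c) for c in text]         # letters in, numbers out
print(codes)                           # [104, 101, 108, 108, 111]

# and the same trip in reverse: numbers in, letters out.
back = "".join(chr(n) for n in codes)
print(back)                            # hello
```

the point isnt the particular syntax-- its that the conversion happens automatically, so you never have to key in the numbers yourself.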
and depending on what its currently programmed to do in response to that, it could say "i dont understand what that means" (if you dont speak chinese, you might not know what "ni hao" means, though idiomatically it means "hello") or it could start doing a series of tasks its meant to begin when greeted, or it could simply respond by outputting the word "hello", or: "hi."

one of the "funny" things about computers is that it seems to take so much *more* computing and sophistication to implement a "simple" interface where you point and click at things, while having the computer say "hello" in response to you saying it can be accomplished in one or two lines of code. the reason for this is twofold-- first, its not *really* "one or two lines of code"; even when it is, those one or two lines are being converted, by software, to several native instructions that the computer can understand more easily than the person. *this is what most computer languages do-- they take code that is easier to write and convert it to code that is more tedious to write.* second, in order to have the computer know where youre pointing and clicking, that involves keeping track of changing coordinates, comparing them to the thing being "clicked", drawing the thing being "clicked", having a visual response to that, and so on. its far less work for the computer to just recognise a series of letters (which the keyboard has already converted to numeric codes) and respond accordingly. in this sense, a simple "text based" program is much easier to make (and easier for the computer to run) than a virtual 2-dimensional or 3-dimensional graphical environment.

plus, training the computer to recognise a word is really closer to what a pocket calculator does. to the computer, your "hello" is 104, 101, 108, 108, 111. it can simply run in circles until exactly those keys are pressed (at least thats one way to do it) and then respond in the way it is set up to respond.
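the "run in circles until exactly those keys are pressed" idea can be sketched in python (the key codes and the naive matching loop are the illustration here; a real keyboard driver works differently):

```python
# "hello" as the key codes the keyboard would deliver, one per keypress.
TARGET = [104, 101, 108, 108, 111]

def feed_keys(codes):
    """loop over incoming key codes; return True once the target
    sequence has been typed in order. (a naive matcher for illustration.)"""
    matched = 0
    for code in codes:
        if code == TARGET[matched]:
            matched += 1
            if matched == len(TARGET):
                return True    # pattern recognised: respond here
        else:
            # wrong key: start over (unless this key restarts the pattern)
            matched = 1 if code == TARGET[0] else 0
    return False

print(feed_keys([ord(c) for c in "say hello"]))  # True
print(feed_keys([ord(c) for c in "goodbye"]))    # False
```

all the "training" is a comparison against a list of numbers-- much closer to what a pocket calculator does than to anything resembling understanding.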
but on a programming level it looks something like this:

    get keyboardtext
    if keyboardtext = "hello" then say "hello"

thats a really simple program. but the computer can do that. the thing people tend to do when they see something like that is think "okay, so thats how the computer works-- thats what it understands." but theyre only half right. *applications give the (false) impression that the computer works the way the application does*. the truth is that *applications can be changed*, so really *its only the application* that works the way the application does. if you dont like the application, it doesnt dictate how *the computer* works-- you can change how it works so it works *the way you want*. and thats what makes the computer so cool: the incredible flexibility.

so lets say you want the computer to respond to "hello" differently-- maybe you want it to do chinese instead of english. this is a slight oversimplification, but:

    get keyboardtext
    if keyboardtext = "ni hao" then say "ni hao"

right? so we changed the *data* of the program. but the computer is more flexible than that!

    text = readkeys
    if text says, "ni hao" then say, "ni hao"

what we are illustrating here is that the computer is not only capable of responding to different types of data-- it is so flexible that even its instructions can be written in a different form or grammar. so the idea that you have to "learn the language of the computer" or "learn how an application works" is only half-true. you can actually change the entire language its own behaviour is coded in. you can create *your own instructions*, that work *the way you want*.

so when someone says "you should learn to code" and you find some tutorial that involves lots of specific punctuation and thousands of different special words-- what you need to know is that *those are all optional*. theres very little the computer does that cant be done differently.
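the pseudocode above isnt any real language; heres roughly the same program in python (an arbitrary choice-- the names `say`, `says`, `respond` are made up for this sketch), including the part where we define our own words and change the grammar:

```python
# the hello program, using words we chose ourselves.
def say(text):
    print(text)
    return text

def respond(keyboardtext):
    # do whatever the machine is "supposed to do" when greeted
    if keyboardtext == "hello":
        return say("hello")
    return None

respond("hello")  # prints: hello

# the computer doesnt care what we call things-- so define new
# instructions, and the "language" itself changes shape:
def says(text, word):
    # our own instruction for "does the text say this word?"
    return text == word

def respond_chinese(text):
    if says(text, "ni hao"):
        return say("ni hao")
    return None

respond_chinese("ni hao")  # prints: ni hao
```

both the data ("hello" vs "ni hao") and the instructions themselves (`says` instead of `==`) were swapped out-- which is the whole point: the program, not the computer, is what "works a certain way".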
practically everything that "works a certain way" on your computer can *actually* work a *different* way if you want. almost anything the computer can possibly do can be customised. a car works a certain way-- but you can (technically speaking, maybe not legally speaking) drive it anywhere that has a road, if the car has enough traction and fuel and the design has enough power. a computer is like that, except you can even change the way in which you instruct the car where to go.

there are some hard rules built into the computer, to be certain. if you remove the screen, the keyboard, the usb devices, everything that isnt bolted into the machine (plus even the hard drive, which actually is bolted in) those rules are still there. but if you look at a random person who is a skilled computer user, even someone who can reprogram the machine-- most of the people who can do that much dont even *use* the rules that cant be changed. they do everything using only rules that are piled on top of those rules. they can change everything the computer does, including the rules it goes by-- without even knowing the rules that cant be changed. thats almost like magic. you, too, can make your own rules for the computer that way. and that is the essence of personal computing.

home: https://wrongwithfreesw.neocities.org