Psychology of Computer Programming
According to a paper I recently read (http://www.cs.mdx.ac.uk/research/PhDArea/saeed/paper1.pdf), there are, apparently, three kinds of people in the world: people who "are capable of seeing mathematical calculation problems in terms of rules, and can follow those rules wheresoever they may lead," those who "look for meaning where it is not," and others who know that they are "looking at meaninglessness, and refuse to deal with it."
What's really interesting to me, as a programmer and recreational mathematician, is that apparently some people simply cannot wrap their heads around programming. This could be attributed to engineers teaching classes pitched at people who were basically born engineers, but given the paper's hypothesized reason, it seems even math people would have trouble.
In the bizarro land we call programming, a=a can be either a command or a question, depending sometimes on context and sometimes on symbol choice. In C derivatives, if you want to know whether a is equivalent to a, then you use this snippet:
a==a;
which does essentially nothing. In order for this to be useful, you have to store the truth value of the result somewhere. What if you need a constant? That's simple. In most languages, storing a number in memory is just this:
a=10
or, in C derivatives (except in special fancy cases as we shall see shortly)
a=10;
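Putting the two together, here is a minimal C sketch (the variable names are just for illustration):
int a = 10;           /* assignment: a command, "make a equal to 10" */
int same = (a == a);  /* comparison: a question, "is a equal to a?"; same is now 1 */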
If you've followed along so far, you might have started to form some kind of mental model of how these things work, one that you could apply to further operations. So what is the final value of a after these lines:
a+=a+++a+a++a+++a++a++;
a=a==a=a==a=a==a==a=a;
Hmm? I've seen several questions like this on first-year computer science tests (often as extra credit), even though you would have to be either insane or some kind of paranoid obfuscation freak to ever write a line like that. If you think about it for a bit, you may come to the realization that (if this even compiles) a will be 1 (1 meaning true) no matter what a was.
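Stripped of the obfuscation, the second line boils down to something like this (assuming a C-family language, and setting aside whether the original even compiles):
a = (a == a);   /* a == a is true, i.e. 1, so a ends up as 1 regardless of its old value */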
So, should I be surprised that some people's brains simply refuse to operate on these things? Is it the teacher's fault for being obtuse? Something more than either of those?
No, programming is either the way your brain kind of works already, or it will change your brain. I've had a few different "career paths" in my short life, and the most mind-altering has been programming.
Oh, and the answer to both those questions for the value of a is "You're fucking fired."
Speaking of which: why do new computer science graduates always want to show you how clever they are by writing completely indecipherable code?
Saint Will: no gyration without funkstification.
Nonsense! I see nothing solid enough to make it worth my while to take on this fear. - Terence
if (c++)
{
    exit(0);
}
I've found that the atheists that are closest to me in philosophy and thinking style tend to be programmer/engineering types. I'm thinking specifically of BobSpence1 on RRS, and a guy called inmendham (love him or hate him) on YouTube.
Programming definitely is a mind-bending experience, and I agree, it does change your brain and thinking style, much like learning formal logic does.
The biggest benefit I've found is that the more I learn about programming, the more I'm able to understand, visualize, and predict the outcomes of complex systems. Not just computer systems, but systems of people: governments, corporations, the voting public, etc. It also helps me understand biological systems and ecosystems more deeply.
If you're already a programmer, and have got that initial mind-bender out of the way, the thing I would suggest you do next is learn as many different programming languages as you can, especially ones that emphasize a different style of programming. For example, most programmers already know procedural programming. So learn an object-oriented language (Java or C# these days). Then learn a functional language (like Lisp or Scheme), then a pragmatic OO scripting language (like Ruby or Python), then some other style, etc. Take them one at a time and study them. Get a good book on each. If there's a form or style that boggles your mind, take the time to learn it and understand it. The forms that have modified my thinking skills the most are Lisp macros, continuations, OO patterns like Double Dispatch, the Command object, Ruby/Smalltalk-style blocks.... Even learning about various data types and algorithms can be mind-expanding. I'm thinking of heaps, graphs, and genetic algorithms, among others.
Take each of these things that you may not currently understand, apply your mind to it, study it, get 'into' it. It trains your thinking and allows you to see the world differently. It's really quite amazing, now that I think back on all the stuff I've learned, both in university and in my working years. During my first 4-5 years of working, at any point I could look back on what I'd learned in the past 6 months and say to myself, "Holy crap, I feel like a completely different person than I was 6 months ago." It was a period of intense mental change. Things that were confusing before were suddenly simple and ho-hum. Spent a lot of money on books, I'll tell ya that much.
Side note: Now I'm taking the same basic learning method and applying it to learning about the mind itself, and it's a whole new mind-bending experience again. Whereas learning about programming can make you a master of logical thinking, learning about the mind can expand your intuitive thinking skills too.
Wonderist on Facebook — Support the idea of wonderism by 'liking' the Wonderism page — or join the open Wonderism group to take part in the discussion!
Gnu Atheism Facebook group — All gnu-friendly RRS members welcome (including Luminon!) — Try something gnu!
Agreed, but how could you leave out Eiffel?
Considering there are literally thousands of languages I haven't learned, and Eiffel is one of them, it's not surprising that I didn't mention it. Besides, everyone knows that Dylan is better than Eiffel. If I were to mention a language I don't know but am impressed by, it would be Dylan. Bertrand Meyer's philosophy of programming is too heavy for me; I like more dynamic languages. If I were to name a personal favourite among the languages I do know, it would be Ruby: not the most powerful, but definitely the most enjoyable, for my tastes.
With or without Rails?
char *FindAnswers(char *RelevantData, char processor) {
    // ...
    if (processor == HUMAN_BRAIN) {
        if (Answers->Cause == GOD) {
            exit(HALT_PROCESSOR);    /* stop searching altogether */
        } else if (Answers->Cause == NOT_FOUND) {
            return CONTINUE_SEARCH;  /* keep looking */
        } else {
            return FOUND_ANSWER;
        }
    }
    return NULL;  /* other processors not handled here */
}
Taxation is the price we pay for failing to build a civilized society. The higher the tax level, the greater the failure. A centrally planned totalitarian state represents a complete defeat for the civilized world, while a totally voluntary society represents its ultimate success. --Mark Skousen
Although it's common to use = for assignment, several notable programming languages use other symbols.
Some languages, like Pascal, use := instead of =.
Pascal's designer, Niklaus Wirth, was a notable computer-science professor, so it's not surprising that he was a bit pedantic.
However, using = for assignment dates back to the original version of Fortran, the first high-level programming language, from the 1950s; the convention is over 50 years old.
Some other languages like to write it out as a word, like Lisp and related languages, with "set" and "setq" (set quoted).
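For instance, here's the same assignment of 10 to a in each of those notations (a quick sketch; exact details vary by dialect):
a := 10;        { Pascal }
a = 10          ! Fortran
(setq a 10)     ; Lisp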
And COBOL offers these choices:
ADD A TO B GIVING C
COMPUTE C = A + B
The second form was added, I guess, because the first was too Mickey Mouse for many programmers; the English-like syntax was intended to be easy for non-programmers to understand.
I have also noticed this. (Sorry to necro an old thread...)
After interacting with the government bureaucracy a few times, I've noticed how much it resembles a badly written piece of software. The bureaucracy cannot cope with a situation it wasn't designed for. Given a choice of A, B, or C on the form, the bureaucracy simply cannot comprehend that you may be neither A, B, nor C but are in fact X. It just falls over, or waits for you to lie and say 'OK, C then'.
Given this, it would probably be possible to write a Bureaucracy Modelling Language and start designing government systems with the same formal testing and quality control that is used for software.
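For fun, here's a minimal C sketch of that failure mode (all the names are hypothetical):
#include <stdio.h>

/* Hypothetical form options; X is the case the designers never imagined. */
enum FormChoice { CHOICE_A, CHOICE_B, CHOICE_C };

void process_form(enum FormChoice answer) {
    switch (answer) {
        case CHOICE_A: puts("Routed to department A."); break;
        case CHOICE_B: puts("Routed to department B."); break;
        case CHOICE_C: puts("Routed to department C."); break;
        /* No default case: anything outside A, B, or C is silently dropped,
           which is roughly what the bureaucracy does with X. */
    }
}

int main(void) {
    process_form((enum FormChoice)42);  /* the citizen who is actually X */
    return 0;                           /* nothing happened; tick C and lie */
}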
Yeah, I like that idea.
Though, unlike many "pure" sciences, the phenomena and environments studied in Computer Science are often arbitrary (C++: one foot is object-oriented and the other is purely procedural, and you usually have to shoot one off to get anything done) or outright malicious (Brainfuck, or your favorite Turing tar-pits). The next closest thing to it (and to bureaucracy) would probably be biology, though at least evolution can be studied through its results. In Computer Science you may receive a particularly arcane and undocumented piece of code and find yourself with a craving to take up random acts of violence as a hobby.
Still, it could work with the right people/person doing the designing.