Lots of good stuff here, thanks for the A2A.
In grad school, a friend of mine was working on a Psy.D. across town at the University of Houston. His thesis? That people went into computer science because they were maladapted, and found the binary thinking, intractable logic, and complete control that went into creating digital approximations of "real life" preferable to interacting with "real people".
That's a very popular view, and his research was making it look true. Then I had a look at his survey instrument and found the built-in bias. I pointed it out to him and his adviser, and told them that the reasoning needed to find it was a product of my CS education...
There is an element of truth to the stereotype, but the reality is much more complex.
I've always been a tinkerer, taking things apart and putting them back together. Some didn't turn out so well (the alarm clock when I was five); some did. The experience of doing it always improved my understanding of how things worked, and of what might make them work better.
If I'd been born 100 years ago, all things being equal, I would have been taking stuff apart, improving it, and making new stuff. Just not as fast, nor with as much reach.
I consumed a vast amount of science fiction growing up, and the world in which things "automatically happened as if by magic" - but were clearly done by some kind of technology that was just "hand-waved" - fascinated me. How do you make doors open like on the Starship Enterprise? How would you make a remote-acting arm like the ones Waldo used to overcome his weaknesses? How would you make the perfect woman? Can we really make a man better... stronger... faster?
Computers were clearly key to all that.
Programming wasn't something widespread when I started. There weren't a lot of things to "take apart". What there was taught volumes. And when Apple published the entirety of the OS for the Apple II, and IBM the entire BIOS for the first IBM PC, that was like gold...
For me, programming was more satisfying than working in the physical world. Think. Code. Compile. Build. BOOM! New thing. Easily shared with my friends, or the whole world. Or no one, depending.
Doing that in metal? Takes days, months even (try making something with an air hammer, an English wheel, and a gas torch sometime). And when done, you have exactly one. Want a hundred? A hundred thousand? Not possible. With code, write once, ship a million times.
I still take code apart. I'll see a super cool app, a great web site, or an awesome game, and say "I wonder how they did that?" Sometimes I'll try to make it myself first. Sometimes I just open it up in a debugger and have a look to see what I can learn. Yeah, yeah - TOS, no reverse engineering, blah blah blah. I'm not taking it apart to rip you off; I'm taking it apart to see what makes it tick. And I frequently get back to the author with a note about some change they could make to improve it.
I can't count how many times I've sat down and said "I wonder if..." and moments later had an answer.
Now, with the advent of 3D printing, things get even more interesting.