Tuesday, November 27, 2012

Just What Is A Duckroll?

You've probably heard of rick rolling - if not, you aren't missing much. In fact, it's probably better that you don't know firsthand - it's meant to be an unpleasant experience. Basically, it's a bait-and-switch: you're lured into clicking a link that promises a gameplay video (one that couldn't possibly exist yet anyway) or something else many people would obviously be searching for, given the season or particular circumstances.

Instead, you're greeted (after the fact) with Rick Astley's "Never Gonna Give You Up" music video. Why was this particular video chosen? you might wonder. That is, if you weren't fuming with frustration or deluged in disappointment over the deception.

However, far fewer people have heard of the elusive "duckroll" and still fewer know of its relation to rick rolling. I find the story to be a fascinating and amusing one, and thus, felt obliged to share it.

It started with a rogue forum parser bot. I won't even mention the forum name as it's one of those that is better left unmentioned. This bot apparently took a liking to the word "duck" and an intense dislike to the word "egg" and decided, of course, to replace all instances of the latter with the former.

Somewhere on this forum that day, there must have been some instances of the word "eggroll," which were then changed to "duckroll" purely by mistake. Or as much of a mistake as you can call something as inane as replacing "egg" with "duck" in the first place. Regardless, it did happen, and some people found this funny.

Not only did they find this funny, they felt compelled to spread the glee, but not in the way that it would occur to most of us. No, whoever began this strange meme (admittedly, the phrase "strange meme" is actually quite redundant; forgive me an unrestrained chuckle) decided that they should load up MS Paint and hack together a picture of a duck on wheels.

The craziest part is that it didn't end there - far from it. That picture was only the beginning. For most of us simpletons, the mere image of a duck on wheels would be enough to warrant placing it as our desktop background and laughing about it for several months, if not more. However, rather than be content with its value purely as humor, the image's creator imagined for it an even greater purpose - irony. Just as discretion is the better part of valor, irony is the better part of humor.

But I digress. This potentially-amusing image was then used for something very un-amusing: displaying it whenever someone clicked a link to a hot topic or something nearly everyone would be quite interested in. Some time later, the idea was changed to use Rick Astley's "Never Gonna Give You Up" music video instead of an image of a duck on wheels, and of course, the name changed along with it.

Whether you like egg rolls, hate rick rolls, or both, I hope I've helped enlighten you somewhat to the nature of this odd meme, and perhaps made you more wary of it. My goal was to brighten as well; I try to present things in as interesting or funny a manner as possible, particularly on this blog. If I accomplished that with this post, even better. If not? Then move along. These clearly aren't the droids you're looking for.

Tuesday, November 6, 2012

The Most Sour Drink In The World

I thought I'd post my first food-centered topic: the most sour drink in the world!

One possible downside is that I may be the only person who would ever actually enjoy drinking this. It originated many years ago, when my brother and I were kids and came up with an early, less flavorful version. Surprisingly, that version was even MORE sour than the finished recipe I'm about to share. However, since lemon juice (particularly in the quantities we were using at that time) can wear the enamel off your teeth, I decided the original was in dire need of revision.

There is an alcoholic version and a non-alcoholic version of this drink, and it can be mixed to suit your own taste (as far as how sour to make it). Beware: if you don't like grapefruit juice, there is probably no way you will enjoy this drink, as it consists mostly of grapefruit juice.

This was formerly known as 'Wake-Up Juice,' after the scene in Back to the Future Part III where Doc needs to be woken up with some. I still call it that, but I fear the name could be trademarked! Therefore let it be known henceforth as battery acid (or, the most sour drink in the world).

The Most Sour Drink in the World (non-alcoholic)

2 tbsp. lemon juice
1 tsp. lemon lime salt (the kind you can get at a 7/11 - Twang is fine, but any variety will do)
3 - 5 oz. grapefruit juice

Add lemon juice and lemon lime salt. Stir until salt is dissolved. Add grapefruit juice to taste.

The more grapefruit juice, the less sour it will be. The less grapefruit juice, the more flavorful it will be.

Battery Acid (alcoholic)

What better name could there be for a drink so ridiculously sour and salty?

2 tbsp. lemon juice
1 tsp. lemon lime salt (Twang or similar beer salt works)
2 oz. grapefruit juice
1-2 shots tequila; añejo or reposado varieties work best
3 dashes grapefruit or other bitters
dash of Patron (any flavor) to taste

Add lemon juice and lemon lime salt. Stir until salt is dissolved. Add grapefruit juice, bitters, and Patron (if desired) and stir well. Finally, add the tequila but do not stir. Garnish with lime slice and/or a salted rim. Amount of tequila can be adjusted to suit taste.

This one has more bite and flavor, but is not quite as sour. For less of a bite, try any brand of blanco/silver tequila. Blanco is not aged and thus has a milder, smoother flavor, much closer to the true flavor of blue agave. Reposado and añejo are aged, which gives them a more robust flavor and more kick.

Bottoms up! Here's hoping I'm not the only one with a "sour" tooth (like a sweet tooth but for everything salty and sour).

Tuesday, October 23, 2012

Help! I'm becoming a spambot!

It starts with slowly losing your ability to read those CAPTCHA things in online forms. Then gradually you send shorter, more frequent emails, which increase in annoyance and spelling errors and decrease in meaningful content. Then you begin feeling a strong urge to look at the Google keyword statistics page every few minutes. You may develop an obsession with collecting vast numbers of valid email addresses. The last step is when you begin writing your blog posts in raw HTML and the meta tags contain more data than the post itself.

Do any of these symptoms sound familiar? If they do, you may have a severe infection of spambotitis. It is a degenerative disease that slowly morphs you into a spambot until you fade into the world wide web as just another useless email / tweet factory.

Consider this post your guide to survival in a world of targeted ads and social media. Here are some pointers on how you can avoid becoming a spambot!

1. Always use your real name in your email address.

As long as your account is secure, there is really no problem with this. Change your password often, and never use the same password for two or more different accounts. If you have trouble remembering passwords, either write them down and keep them near your computer, store them in a password manager, or come up with a way to construct them from an idea or a phrase that you can easily memorize. It helps if the letters look random but still carry some kind of meaning that only you would know - particularly something that relates to the account or website you're creating the password for.
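
Purely as an illustration of that last idea, here's a little Python sketch (the phrase and the site hint are made up, and this is a toy, not a security tool): squash a phrase you'll never forget down to its first letters, mix the case, and tack on a hint that only means something to you.

# A toy illustration of building a memorable-but-random-looking password
# from a phrase plus a site-specific twist. Not a security tool - just the idea.
def phrase_password(phrase, site_hint):
    # take the first letter of each word, alternating upper and lower case
    letters = [w[0] for w in phrase.split()]
    mixed = "".join(c.upper() if i % 2 == 0 else c.lower()
                    for i, c in enumerate(letters))
    # append a hint that only means something to you
    return mixed + "!" + site_hint

print(phrase_password("my brother still owes me three paddle balls", "blgr12"))
# -> MbSoMtPb!blgr12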

Using your real name will distance you from sockpuppetry, the first step toward becoming a spambot. Spambots can send emails at will because they are completely anonymous, and multiple bots may be controlled by a single person or computer. Even though your real name is more traceable, as long as you don't attach your email address to any real details of where you live or who you are, the chances of an identity thief claiming your personal information are slim to none. Just remember that anything you don't post online can't be stolen online!

An added bonus is that it looks much better to have john.smith24@gmail.com on a resume than johnny_baby_margaritas@gmail.com.

FYI - Sockpuppetry is the act of managing multiple accounts on the same website while pretending to be different people (especially on forums or chat boards). It's a bad practice, usually engaged in by trolls or flamers who don't want to be recognized as such by people who have previously banned them. A troll/flamer is someone who goes around the internet looking for ways to start arguments and incite drama, tension, or anger on a forum or chat board. Flame wars are just a fancy name for online arguments, though the name comes from the fact that they can be (and are intended to be) quite heated and antagonistic, usually involving swearing and belittling on both sides. You have to be very, very careful not to buy into these arguments and keep feeding the fire.

2. Remain ignorant of SEO.

The second huge step in becoming a spambot is learning about SEO, or search engine optimization. The less you know about it, the better. SEO is a spambot's world and entire life. They eat, breathe, and sleep SEO, or they aren't good spambots. Steer clear and be free.

3. Don't read or reply to spam email.

Spambots are essentially a network of computers and email servers that churn out emails day and night, generating them on the fly based on SEO, ad targeting, and information stored in each user's browser about what they look at (called cookies). Cookies are harmless little bits of info that a website uses to, say, show you ads that are more relevant to the purchases and product choices you've made in the past, or shown an interest in by remaining on certain pages longer than others. Ignoring spam email puts fellow spambots' hard work into the dumpster where it belongs.

4. Learn to write clean and concise emails/chats/tweets.

Don't skimp on your writing skill online, even for "unimportant" emails. Any email has the potential to be forwarded to someone you never intended to read it. You would be surprised how impressed people can be by a well-written email or chat with no spelling or punctuation errors - just think of how rare those are in today's world! Often we communicate more through what we don't say (albeit via email) than through actual verbal exchanges. Make sure your writing skill is up to par and doesn't scream "spambot" all by itself. There is no reason not to.

5. Don't waste time on social media.

Spambots are all about wasting time. Don't cultivate a habit of spending hours trudging through tweets and Facebook, and you'll be better off. Just think of all that extra time you'll have to spend actually being productive! I suppose you could make the argument that spambots are quite efficient and productive; however, the endless spam emails they send, and the fact that 95% of them are ignored completely, makes them some of the most inefficient and unproductive programs ever written.

Disclaimer: This post is intended to be humorous, but honestly I've seen some spam emails that looked more intelligently put together than emails I knew were from an actual person. Do yourself a favor and apply some security and discipline to your online communications. You'd be surprised where it might take you!

Disclaimer Disclaimer: On the other hand, instant messaging is a different beast. It's okay to "lol" and omit punctuation in instant messages, though the general principle still applies that neater is always better. In this case I find it is more a matter of personal opinion, with Twitter falling closer to IM/chat than to email. Ironically, Twitter's 140-character limit can actually help with point 4, since you have to learn to say a lot with a little.

Tuesday, August 28, 2012

Do You Want What You've Got?

Kids want everything. Candy, toys, or anything that appears at first glance to bring happiness and fun. Then when they get home, instead of actually having fun, they fight over the toys. During this fight, one or more of the toys likely breaks, and the real crying begins. Then, if one of the toys was a paddle ball (and it's not one of the broken ones, or the eldest child has cleverly thought to hide it) they are all likely to be swatted with it for misbehaving. At this point, clearly no one has had any fun at all. However, from the kids' point of view, everything would be perfect as an adult since you have your own money and can buy as much candy and as many toys as you want. And as an added bonus, you never get swatted.

Adults want everything. Cars, energy drinks, anything that society claims will lower your monthly payments or make your kids settle down (or preferably, go to sleep) and stop fighting over their toys. So you take out a huge loan on a nice house and a spiffy car. The kids wreck the house fighting over their toys, and the car gets hail damage in a freak ice storm. When everything thaws and you can finally head to the grocery store, you realize that after paying all your bills, the only thing you've got enough money left for is energy drinks. Then, you remember that they're not good for you anyway, so you decide to buy your kids another paddle ball (since the last one is either broken or hidden, and possibly both). Clearly, kids have all the fun. Man, wouldn't it be great to be young again?

There's something lacking in both these pictures, isn't there? It's peace and contentment. Despite innate desires apparently being fulfilled, neither group is happy with what they have. Instead, it's all too easy to continue buying into the lie that obtaining more stuff will bring joy. Since the first thing didn't work, why not try another? There's never an end of new "things" to choose from.

Yet it really doesn't work that way, does it? How many times has a new car, or a piece of candy, ever brought anyone true happiness or contentment? Most candy goes down in a quick gulp, and new cars eventually turn into old ones, which eventually fall apart. No, lasting contentment and peace can be found only in the act of giving, rather than getting. How many times in the past month have you stopped to reflect on what you already have? And more importantly, how grateful are you to have it? The answer to that question is a very good indicator of how happy the next "thing" that comes along will make you once you obtain it.

I will just leave you with a quote:

"Happiness isn't getting what you want, it's wanting what you've got."

-Garth Brooks

Tuesday, August 21, 2012

Taxonomy of Music

As a disclaimer, some of this post is intended to be humorous. Don't take it too seriously and don't take any of this as my personal opinion for or against any of these genres. I've tried (maybe unsuccessfully) to keep my opinion of each genre as separate from this as possible.

This is basically my definition for each of the genres of music. I won't claim to be knowledgeable about this so if I've gotten some wrong, it's because I probably don't listen to that genre very much and/or don't understand completely what it is. Again, some of these are more facetious than realistic.

Alternative - Incorporates such variety or such a wide range of styles that it doesn't fit neatly into any of them. In one word, unique.

Ambient - Atmospheric, ethereal. Few instruments, limited but just interesting enough to keep you listening. In one word, minimalist.

Bluegrass - Like Mardi Gras, but for music, not food. Jazz, but not quite. In one word, indulgent.

Blues - A fairly standard range of chord progressions which are used to evoke sad, melancholy, or depressed emotions, though those are not the only choices, nor are they always the most prominent. They are simply the most common, and hence, the name. In one word, well, blue.

Classical - Instrumental, symphonic music. Solo piano and orchestral are staples of this genre. If it could be described with one word, I would have attempted to come up with one. But it can't.

Country - A good ol' boy's got a dog, a truck, a woman, and some booze, and he sings about them regularly. The story changes from time to time, but not the subjects of his affection. Not to discount female country singers, of course - the only difference is, no dogs or trucks. They only sing about their feelings, men, and their feelings about men. And sometimes booze. In one word, trailer hitch (that's intentionally neither one word, nor an adjective).

Dubstep - Like techno, but even more repetitive; usually one vocal sample from a famous song repeated every 5 to 10 seconds, with some odd electronic/techno noises in the background. Often combined with other genres. In one word, alien.

Easy Listening - Just simple music, quite a wide variety but usually softer instruments. Jazz is often associated with this genre and the two do go hand-in-hand quite well. In one word, cool.

Electronic - Almost no real instruments or real people playing them; mostly synthesized with lots of effects. In one word, futuristic.

Funk - Rock, with a lot more sweat and head-banging. In one word, loud.

Hip-hop - Party/dance music, stuff that instantly makes you want to start jumping up and down or waving your hands in the air. Basic tenets are: an infectious beat; lots of different instruments and sounds happening all at once; and a touch of crazy. In one word, contagious.

Jazz - Swing, a certain vibe and character to the music that can be achieved with very few instruments and even the simplest of melodies. The moods can vary, but the basic idea is smooth, flowing, and glossy. Jazz works well as either fast and energetic, or slow and pensive. In one word, sleek.

Opera - Always singing, commonly with an orchestra. It is an acquired taste (meaning you can't just listen to it the first time and like it, it has to grow on you) but once you "get it" there is nothing better. Singers that can do opera justice have the strongest voices there are. In one word, grandiose.

Pop - Halfway between hip-hop and rock, with a greater variety of sound/instrument choices and more interesting melodies. Usually has a very intense, pulsating drum beat. In one word, sharp.

R&B - Rock & Blues? Reggae & Bluegrass? Your guess is as good as mine. In one word: soul.

Ragtime - Music to work to? Always a lively tune and a catchy melody, but simple, and with an air of light-heartedness. Of course, if you're working, then who's playing the music? And if you're playing the music, then who's working? In one word: funny.

Rap - Poetry with a beat and an attitude; slightly musical poetry delivered in an intense way. Sometimes quite repetitive, and often with darker themes and moods. In one word: happenin'.

Reggae - I haven't the foggiest idea what this is (nor why I've included it).

Rock - The most standard or recognizable type of modern music; usually has guitars, a solid drum beat, and vocals, with a very high-energy vibe. Think guitar hero. In one word, energetic.

Rock & Roll - Like rock but less intense and more dramatic. (The two are very similar)

Techno - Like electronic but a lot less interesting; the whole song is basically the 30-second sample on iTunes, repeated several times, with a minor change here or there to spice things up. In one word: repetitive.

Tuesday, August 14, 2012

A Matter of Taste

Taste is one of the most interesting sensations, because there are so many different flavors and possibilities, which seem to stem from just four basic building blocks: sweet, sour, bitter, and salty. I find it amazing that such variety is possible simply with these four flavors in different amounts.

One common misconception about the tongue is that each flavor sensation is limited to a certain part of the tongue. On the contrary, every part of the tongue can sense every flavor; the myth seems to have sprung up because the sensations are typically more powerful in certain areas. Either the taste sensors in those areas are more developed/sensitive, or there are simply more of them.

A great example of the complexity of flavor is champagne. While it's made mostly like wine, the primary difference is that many different wines are combined to make a more interesting flavor, and the yeast and fermentation byproducts are removed, which makes the flavor sweeter. What is so fascinating about this is that there are teams of taste-testers dedicated to ensuring that each new batch of champagne has the exact same flavor as the old one. Why is this necessary? Each crop of grapes will naturally have ever-so-slight variances in flavor, and this is something only such world-class taste-testers would notice!

One pro tip for tasting anything liquid, particularly wine - smell it first with both your mouth and nose open. This imprints the scent and part of the taste (since smell is apparently 70% of the taste anyway) so that when you actually taste it, you get more out of it. In addition, while tasting it, swirl it around and savor it, rather than swallowing immediately. Keeping your nose open here helps as well. Most people tend to unconsciously close their nasal cavity when drinking - it's a reflex designed to prevent you from inhaling instead of imbibing. Just suppress the reflex long enough to get more out of the taste, but only before you swallow. When you swallow, this reflex is a very good thing!

I imagine these tips might also help for solid foods, but are probably not as effective, especially when it comes to drier foods like bread. Scent is transmitted via microscopic particles and/or vapors, and vapor is produced more readily by liquids, especially at higher temperatures. Higher temperature simply means more movement in the molecules and thus a greater chance of particles vaporizing.

Why then does a bakery smell so good, if dry foods don't produce as many particles? An increase in heat causes motion in the air. Heat wants to move and rise up, and this includes air from the oven. As the bread bakes, yeast in the bread produces gas, causing the dough to rise. Some of this gas will escape the bread, and since it is very hot, it will act just like any other hot gas and try to escape the oven, building up pressure if it cannot do so. When it does escape, wind will then carry this hot air, which then brings the smell of baking bread along with it.

Another fascinating mystery in the realm of taste is the fruit known simply as the miracle berry (the molecule responsible for the effect is literally named miraculin). Apparently, eating one beforehand changes the flavor of oranges and other citrus fruits, making them taste sweet instead of sour. I've never tried it, but I imagine it would be quite interesting and stimulating. One also wonders at its effect on other foods!

Now, taste gets really interesting when you combine two flavors together, even if you are not experiencing them at the same time. The effect is much stronger in that case, however. For instance, if you take a bite of something, then after you've swallowed that, take a bite of something else, the taste from the first food is still imprinted in your memory. Some of the liquids or particles from that food may still be on your tongue, as well. Either way, upon tasting the second food, you experience the flavor of the new food, the flavor of the first food, and most importantly, the difference between the flavors. This is what really makes things interesting, as you'll see more of in a minute.

In this case, you've experienced what I like to call a one-way flavor delta. Delta is the Greek character that looks like a triangle, and is used in mathematics to represent change or transition; more specifically, the amount of change. In this case, you have now tasted both flavors at the same time, though a lesser amount of one than the other. Due to this discrepancy (tasting less of the first flavor), this is only a one-way delta, meaning you experience the difference between the flavors in one direction. You taste more of the second flavor, so you can tell more about how that flavor differs from the first, but not vice versa. The first flavor is now all but gone, though for a brief moment both of the tastes were there. After a couple more bites of the second food, the first flavor is wiped out completely, and the taste of the second food becomes far less interesting.

Taste the foods in reverse, and you will understand why I call it a one-way flavor delta. The difference between food A and food B is not the same thing as the difference between food B and food A. You have to experience this personally to know what I'm talking about, but I'm certain that any good taste tester would agree with me here. The second flavor is always stronger (unless you try them both at once) and this creates a discrepancy in which your taste buds must tell you more about one flavor than they can about the other, causing this one-way difference.

When you have two different foods at the same time, that is when things get really magical. Not only are you now tasting flavor A and flavor B at the same time (in roughly equal amounts), but you are also tasting what I call a two-directional delta. You can now taste the complete difference between both flavors, and this is one of those cases where the whole is greater than the sum of its parts. That's why a sandwich tastes so much better than just eating the individual things one at a time, even though the parts are all the same. The average sandwich has (I would estimate) about 10 different flavors in it, so this means you are actually tasting ten flavors as well as C(10,2) two-way flavor deltas! That's an incredible combination of 55 flavors and flavor deltas at once. I bet you never knew that 5 plus 5 on a sandwich equals 55! (Heh, actually if you write '5' on the crust of both pieces of bread and look at it sideways like the spine of a book, 5 plus 5 DOES equal 55!)

For those mathematically-inclined folks, C(n,k) is a combination of n items taken k at a time; in this case we have 10 distinct flavors taken 2 at a time, giving us 45 flavor delta combinations (plus the ten individual flavors themselves for a total of 55). A combination is very similar to a permutation, except order does not matter, similar to a hand of cards, but unlike a race where the racers finish in a certain order.
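
If you'd like to check the counting yourself, Python's math module (version 3.8 or newer) will happily do it - the ten flavors are just my sandwich estimate from above:

import math

flavors = 10
deltas = math.comb(flavors, 2)   # two-way flavor deltas: C(10,2) = 45
total = flavors + deltas         # the individual flavors plus the deltas: 55
print(deltas, total)             # -> 45 55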

Tuesday, August 7, 2012

Good Code

No one appreciates good coding, because only coders understand what coding is to begin with. For anyone else, if the thing works then you only did your job - that doesn't make you a good coder. Well, it does, but to anyone who doesn't understand it, computer code that works the first time is just normal, not great. If anything, it only reflects badly on the quality of the testing that was done.

On the other hand, bugs are high visibility and high priority, unlike coding. Only when it's broken do people seem entirely willing to acknowledge that code actually does exist. If you fix a lot of bugs in a few hours (despite it being far simpler than the actual coding, in most cases) then people understand that you've done a great deal of work. Coding nearly always goes unnoticed and unappreciated. If the code works fine, then it's just ignored, like it doesn't even exist.

To get some perspective, apply the same argument to writing in general. Writing software is in many respects very similar to writing prose or poetry. Programming languages have much the same semantics and syntax as spoken languages; it's just that their purposes are quite different.

Computers never use programming languages to communicate with us; it's only us communicating to them the instructions that we want performed, the order we want them performed in, and the logic that holds the rules together and the system in place.

With writing, absolutely no one expects you to get the first draft right. What makes you a great writer is being as good at revising as you are at writing (or better).

The same applies to coding, except that the different underlying purpose gives the revisions a different purpose as well. In both cases, you are revising in order to make the written medium (words, code) better. Better writing is hard to define, since writing is a little more of an art form than computer programming. Better code, by contrast, is well defined - if it doesn't work, it's not good code. I will now go through some of the differences between good writing and good code, and how the writing systems themselves differ.

Prose writing is more about the context than the actual words, though being a good wordsmith certainly earns you extra points. The context is normally the story elements (plot) which includes character development, something that takes a good deal of time to do outside of the actual writing. The characters must be alive in your head and have their own motives, goals, aspirations, and personality, before you can plug them into the time frame and particular circumstances of your actual story and have it come out sounding realistic. In the case of non-fiction, the context is your particular topic or subject, and the structure of your argument or points. You remember this from high school English class: the first sentence of each paragraph should be the theme or thesis of that paragraph, followed by support, evidence, and further argument about that same point. The first paragraph is normally an introduction and the last is normally a conclusion, with at least three body paragraphs between them.

Poetry, on the other hand, is all about word selection, diction, and imagery. Poetic devices, such as rhyme, alliteration, meter, and various other tools, are of paramount importance. The actual message is usually only a result of the deeper themes and moods created by the specific words and their connotations. I find that rhyme is often a quite underestimated tool, not used as often as you might think in poetry. It can be used to emphasize - the words involved in the rhyme are usually the focal point of the entire phrase or sentence. Thus, it is also important to choose which words are going to rhyme, and as such, this often requires some grammatical flexibility to rearrange the parts of the sentence in a way that doesn't sound archaic or confusing.

Lastly, computer code is like neither. The entire purpose of computer code is logic - that is both its foundation and its end. The building blocks are simply logical constructs, such as a loop that executes a portion of the code over and over (to avoid having to write many instructions that do mostly the same thing). Most logic boils down to conditions - if this, do that, or if this other condition is true, do this other thing. If this condition is false, skip this part of the code. This logic tells computers how, when, and what to do, in a way that bears no interpretation (heh, at least not the kind you're probably thinking!) Here is where we get into the revision.
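
Before getting to revision, here's what those building blocks look like in practice - a trivial sketch in Python (any language would do) of a loop and a few conditions:

numbers = [3, 8, 15, 4, 23]

for n in numbers:        # loop: repeat the same instructions for each item
    if n % 2 == 0:       # condition: if this, do that...
        print(n, "is even")
    elif n > 10:         # ...or if this other condition is true, do this other thing
        print(n, "is odd and big")
    else:                # otherwise, fall through to this part instead
        print(n, "is odd and small")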

At this point, I'd like to mention one minor historical anecdote: one of the first documented computer bugs actually was a bug! A moth had gotten caught in a relay of one of the first mainframe (room-sized) computers and was causing a malfunction in the circuitry. What then constituted the 'software' was actually hardware, in the form of vacuum tubes and switches. The switches would be set to input the instructions, and the computer would run through whatever instructions had been set in these switches. A programmer's job back then would have been to manually go to each switch and move it to the right setting, according to a long (and probably quite boring) sheet of numbers. This bug probably took a while to find, as the chances of a programmer losing his focus and mis-setting even a single switch were quite high!

Revising computer code is simple, yet not straightforward. This is because most often, you don't know what specifically is wrong with the code or how the problem is being caused. If you knew, you wouldn't have written the wrong code in the first place! The first step is called 'debugging,' which means going through the code, one instruction at a time, watching the computer perform it, then examining the current state of the computer and the resulting output at that point. Once the problem appears, the instruction last executed is the one most likely to be causing it. Now that this is known, it is normally a simple matter to determine where the error in the logic lies, and rewrite the code accordingly. So until programmers can code perfectly, we are stuck with bugs for the time being.
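
Here's roughly what that looks like with Python's built-in debugger, pdb (the function and its bug are invented for the example, but any debugger works the same way in spirit): pause the program, step one instruction at a time, and inspect the state until the output stops matching what you expected.

import pdb

def average(values):
    total = 0
    for v in values:
        total += v
    return total / (len(values) - 1)   # the bug: should divide by len(values)

pdb.set_trace()            # execution pauses here; 's' steps into the call,
                           # 'n' steps line by line, 'p total' inspects a variable
print(average([2, 4, 6]))  # prints 6.0 instead of the expected 4.0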

Now, the existence of bugs is no reason to knock computers themselves! The great thing about computers is that they are seldom at fault for the problems we face. It is normally operator error, either on the programming side or the user side. If the programming is wrong, we call it buggy or glitchy. If the user is wrong, it is known as a PEBKAC (problem exists between keyboard and chair). Computers execute their instructions correctly 99.9% of the time. Whether those instructions are right or wrong is a different matter. Readers more interested in the subject of computers writing their own instructions should have a look into aspect-oriented programming. It is the newest programming paradigm, a step up from object-oriented (warning for the non-techies: highly advanced technical terms may cause head to explode). See my other post for more information about the different programming paradigms.

Tuesday, July 31, 2012

Wanted: Schrodinger's Cat, Dead AND Alive

I've always been fascinated with the idea of Schrodinger's Cat. For those who don't know what that is, I'll elaborate:

Picture a cat. In a sealed box. A box with an odd-looking machine inside. This machine will randomly dispense poisonous gas, which would instantly kill the cat. Well, not quite randomly. There's a nifty little gadget called a Geiger counter hooked up to the gas chamber and pointed at a tiny radioactive sample. A Geiger counter is a device that detects certain particles passing through it - particles so tiny or energetic they can pass through matter; X-rays, for instance. If the counter detects the sample decaying (a genuinely random event), the gas is released. The idea is that when the box is opened, the cat will be either dead or alive.

However, the interesting bit is that before the box is opened, the cat is said to be both dead and alive. This is because, due to the random nature of the machine, we cannot determine for sure whether the cat is dead or alive without opening the box. But ah - whoops! If we open the box, we may well change the state of the cat, and therefore, still be none the wiser about whether the cat was actually dead or alive before we opened the box.

It can be likened to the Heisenberg Uncertainty Principle, one of the most interesting facets of modern physics and/or science in general. It states that certain quantities - a particle's position and momentum, for instance - cannot both be measured to perfect accuracy, not only because we have no method of recording a measurement of infinite precision, but also because the act of measuring itself, on such a microscopic scale, changes the quantity being measured.

Insert random Heisenberg joke here:

Heisenberg is out for a drive and is going quite fast, when a cop pulls him over.

"Do you have any idea how fast you were going?" the cop asks him.

Heisenberg replies, "I might have, but blast you, you just had to go and measure it!"

Another conundrum that provides great food for thought is quantum physics. Without going into much detail, the idea is that small packets of matter (the smallest known, in fact) called quanta are constantly going in and out of existence. This changing state is governed solely by probability, and the act of observing a quantum forces it to collapse into one of its probable states. In other words, until you look directly at something, it may or may not be where you think it is. It's probably reasonably close - but its exact shape and the details of its existence (such as viscosity, temperature, and structure - things all determined by its atoms and hence its subatomic quanta) are not set in stone until you actually observe it. When observed, each particle must collapse into one of the possible states, according to the probability of each.

This probability works just like, say, rolling a six-sided die. Half of the time, you roll a 4, 5, or 6. One third of the time, you roll a 1 or a 2. Lastly, a 3 comes up only one sixth of the time. If these three outcomes have the same probabilities as the different states of a certain quantum particle, then the first state (the one matching a roll of 4, 5, or 6) appears half of the time - three times as often as the rarest state. The second state (rolling a 1 or 2) occurs twice as often as the third state (rolling a 3). So each time you view this quantum particle, you've just rolled the six-sided die, and the particle you see (its location and properties) is determined by rules of probability much like these.
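
If you'd like to watch those proportions emerge, here's a quick simulation in Python - an ordinary die roll standing in for the "collapse":

import random
from collections import Counter

# Roll a six-sided die many times and group the outcomes into the three "states".
states = Counter()
for _ in range(60000):
    roll = random.randint(1, 6)
    if roll >= 4:
        states["state A (4-6)"] += 1   # expected about 1/2 of the time
    elif roll <= 2:
        states["state B (1-2)"] += 1   # expected about 1/3 of the time
    else:
        states["state C (3)"] += 1     # expected about 1/6 of the time

print(states)   # roughly 30000, 20000, and 10000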

Tuesday, July 24, 2012

Premeditated Toxicity

One of the most fascinating paradoxes I've ever come across is that of the poison gambit. For lack of a more standard name, I've chosen that one. Or, if you like, premeditated toxicity. Or perhaps better yet, "Who Wants To Die A Millionaire?"

It's a sort of contest. You will win a million dollars (or pounds or the currency of your choice) if you can simply intend to drink a toxic substance - a poison guaranteed to take your life. Note carefully the specific phrasing used - you don't have to actually drink it. You only have to intend to drink it. The contest might therefore involve some sort of waiver for legal purposes saying that you agree to drink the poison and that if you do, the host company cannot be held liable for the consequences of your own intent and actions based on that intent. The exact details of this are unimportant.

The paradox is, can you intend to drink the poison, collect the money, and then later change your mind and not drink it, thus living on and being able to enjoy your reward? It's a very interesting thought experiment into free will. I believe I've resolved the paradox.

The answer is this: NO. You cannot actually intend to drink the poison, then later, decide not to. If you did this, your intention to drink the poison to begin with would not be valid, and your winnings would be forfeit. On a positive note, you wouldn't have to drink the poison and could continue on happily with your life (one million dollars poorer, nevertheless). Assuming there was a lie detector or that the contest had some other way to verify your real intent, you would be completely unable to "cheat the system" and lie about your intent to drink the poison.

However! It is actually possible to intend to drink the poison, claim the reward, and later, not drink the poison and live on to celebrate it. Yes it is! And this does not contradict what I have said in the previous paragraph. There is a subtle difference. It's true you can't actually decide on your own to not drink the poison, later. To claim the reward, you have to truly intend to drink the poison, and with every fiber of your being, know that you are going to drink it and die from it. However, there are two ways to get out of it - one improbable, and one reliable but dependent on the rules of the contest.

First, the unlikely way: something unforeseen has to happen, something you could not have predicted and that you had no hope of stumbling into. This event must somehow change your mind about whether you wanted to drink the poison or not. For instance, perhaps after going through a divorce, you felt alone and became severely depressed. You thought you could at least give your kids a better future by providing well for them, though you wouldn't live to see it. Then, after the contest is over, but before you drink the poison, along comes a stunning, single woman into your life and you fall in love and are no longer depressed, and you actually don't want to die anymore. You find a reason to keep on living. This would not break the rules of the contest, and your finding love would be a triple bonus: you now have the promise of a happy relationship, a lot of money, and a bright future with no obligation to actually drink the poison. You were sincere in your intent, because you had no idea such a great event would happen to you and so drastically change your outlook on life.

The other method is the cheap shot. This may not be possible depending on the specific rules of the contest, but let's assume that you are not told the specific date on which you must actually drink the poison. When you intend to drink the poison, you must fully and completely agree to drink it at some point in your future, barring any event that kills you in another manner first - in which case the winnings would not be forfeit. Remember, you only need to intend to drink the poison; the actual drinking is merely an extremely likely consequence of your intent to drink it.

Have you already spotted the loophole in your mind's eye? You can intend to drink the poison, claim your winnings, and then indefinitely postpone the actual drinking of it. You fully intend to drink it, but since the actual date was not specified, you simply intend on drinking it so near the end of your life that its fatal result means almost nothing. Then again, if you wait long enough, you may actually die unexpectedly before you even had a chance to drink the poison. Either way, you're practically scot free. The only real caveat is that you do have to truly intend to drink the poison at some later date!

This loophole makes an interesting question of intent - how do you define an intention? Is it immutable as first conceived in your mind, or are intentions a malleable substance that can be formed and shaped on a whim? Food for thought, or at least for another post.

Tuesday, July 17, 2012

There Was A Stink Here

A strange smell enters your nose. At first it seems pleasant, but at some point it hits you that the smell doesn't normally belong there. You then recognize that it's most likely an air freshener, and your brain supplies the needed context; namely, that there was a stink here. This knowledge makes the air freshener somewhat less effective, doesn't it? Moreover, I think that air fresheners are mostly ineffective, and I will gladly explain why.

All air fresheners smell pleasant, and yet most of the time when you smell them, the scent actually comes off as annoying, or at least less-than-pleasant. Why is this the case? Doesn't a pleasant smell always smell pleasant? In fact, no. The connotation that comes with the smell indicates that something else likely happened to make the air freshener necessary.

Moreover, when would you choose an air freshener over fresh air? Only on a summer's day when it's too hot to open the windows, or a winter's day when it's too cold to do the same. Depending on where you live, these may vary in frequency. Either way, there will be some days where the air freshener just smells out of place, even if it wasn't there to cover up a stench.

In many cases the air freshener cannot even completely cover the previous odor. This is arguably worse than not using any air freshener at all. The rank smell mingles with the more pleasant aroma, creating a confusing and irritating environment. So either the air freshener couldn't fully eliminate the other scent (and was therefore pointless), or the other scent was already gone and the air freshener only serves as an ironic reminder of the incident (again, pointless). It's ironic because the pleasant smell hints at something unpleasant. In either case, we see that the use of the air freshener, at least as a remedy, was completely futile.

Also there is something to be said about the oils. If given a choice, I'd prefer fresh air over freshened-air any time. Most air fresheners use oils which either drift down to the floor, or float around. Either kind introduces particles into the environment (albeit, pleasant-smelling ones) that would not have been there otherwise - at least, certainly not in the quantity dispersed. This may even have some negative health effects, but I've not researched the subject. I simply know that when I inhale directly from a thick fog of such particles, the effect is not exactly stimulating, but rather, cough- and sneeze-inducing.

Don't get me wrong; I'm all for the use of air fresheners. However, I don't kid myself into thinking that they are effective. At the very least, all they do is kick your olfactory into gear and remind you that "there was a stink here."

Tuesday, July 10, 2012

Music For Nerds

In a similar vein as the "For Dummies" series, this post resides in the highly underdeveloped genre known as "For Nerds." The topic today is music! I am no expert at music, but I have my fair share of qualifications, particularly for someone who never majored in it, or otherwise, studied it intensely. I have played trumpet for over 10 years, piano for about 2 years, and dabbled in guitar until my cheapo First Act guitar broke (which wasn't very long). I also compose music, primarily solo piano and electronic. For those who are curious, I do my mixes in FL Studio Producer Edition (FL = FruityLoops), and my piano stuff in Finale. This was all self-taught during high school and college, but I've done little in that area since my career in software development officially began. As far as music theory, for seven out of eight of my college semesters, my roommate was a music education major. I absorbed no small amount of musical knowledge this way, second-hand.

Added to that, my own studies (going through all the lessons on musictheory.net in one night) provided a somewhat unstable basis for learning the actual theory. I compose primarily by ear, which may not be the best way, but it works for me. All that aside, I'd like to share some simple tricks for just about anyone. If you're a nerd like me, you will hopefully find these tricks a bit easier to remember than the old-school methods. The three tricks are how to quickly read a key signature, how to quickly read a time signature, and how to quickly read bass clef. A fourth tip is the simplest explanation I could muster for the idea of concert pitch, something that confused me to no end until recently.

Overview

Here is an overview of how to read music, what all the clef symbols and notes mean, and some of the other basic symbols on the notes.

Pitch and Clefs

Pitch is determined by the "height" of the note on the staff, and further, the staff's clef and location relative to other clefs. The higher the dot on the note, the higher the pitch. Don't look at the stems because they can point up or down, so that may be confusing (the stem is the line emerging from the dot).

For instance, on a piano score, you will see the treble clef, connected by a long vertical bar on the left side to the bass clef. The notes at the top of the treble clef are highest, and the notes at the bottom of the bass clef are lowest. The treble clef looks like a cursive letter S with a big dot at one end, but the important part is that the curl in the middle ends on the note line for G. The bass clef looks like a backwards C with two small dots to the upper right. The main shape also curls into a larger dot in the middle, a dot which would be the note F. Thus, these two clefs are often nicknamed G-clef and F-clef. Looking at their dots is one quick way to identify those particular notes.

The remaining clef is actually even easier to read, because its position on the staff tells you exactly where middle C is. It looks like a double thick bar on the left, with a curly bracket thing on the right (the entire thing looks like the letter B). The bracket has a sharp point going left, and this sharp point will always be on a line. That line is always middle C. Thus, this is the C-clef. It is used mostly for individual vocal parts, except where hymns are concerned, since hymns normally have the same four voices (soprano, alto, tenor, and bass) and are written all on one page, using only the treble and bass clefs. When a single vocal part (say, for opera) is written on its own page, the C clef is normally used to indicate which voice part it is.

Rhythm

The rhythm is largely determined by the type of dot in each note. The single 'beat' - the most basic unit in music - is normally the quarter-note, shown as a filled dot. Half notes and whole notes are unfilled dots. A half note is the one with the stem, and it gets two beats, while the whole note has no stem, and gets four beats. In common time or any time signature with a number 4 on the bottom, there are four beats in each measure. The small vertical lines in the staff divide the measures for easy reading.

If a note's stem has a tail on it, then it is an eighth note (if there is a tail, the dot will always be filled). A double tail indicates an even shorter note, the sixteenth. A triple tail indicates a thirty-second note, and so on. Generally, the number tells you what fraction of a measure the note represents, at least in common time. Thus, a thirty-second note is so fast, you could fit 32 of them into a single measure!

If two notes with a tail (eighths or shorter) are next to each other, the tails are connected, forming bars instead. Similarly, if four such notes are found together, all four tails are connected into one long bar. If the tails were double tails (for sixteenth notes) then there will be double bars instead.

A note with a smaller dot just to its right is a tricky one. You can think of this type of note or rhythm as working overtime - it gets time and a half! A dotted note gets its own value, plus half of that same value. Half of a quarter is an eighth, so a dotted quarter gets a quarter note's worth plus another eighth note's worth - a beat and a half in common time. It makes a lot more sense when you hear it, but the simplest way to think of it is two separate notes tied together; it sounds exactly the same (a tie is the curvy line connecting two adjacent notes of the same pitch). Ties are normally used to connect notes across measures, since it's usually cleaner to combine them into a single dotted note when they are in the same measure.
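
For the nerds, the dot really is just arithmetic - here's a tiny Python sketch of note values in beats (assuming common time, where the quarter note gets the beat):

# Note values in beats, assuming common time.
beats = {"whole": 4, "half": 2, "quarter": 1, "eighth": 0.5, "sixteenth": 0.25}

def dotted(note):
    # a dotted note gets its own value plus half of that value
    return beats[note] + beats[note] / 2

print(dotted("quarter"))   # 1.5 beats: a quarter plus an eighth
print(dotted("half"))      # 3.0 beats: a half plus a quarter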

A slur is just a tie between notes of different pitches. This indicates that the second note isn't as distinct as the first, giving a blurred or slurred sound between them.

Finally, the accidentals - the things that make music look the most confusing! They are certainly the trickiest part since they occur when there is a modulation (when the notes or key at that spot don't match the overall key of the song). There are only three different symbols - one looking like a lowercase letter B, one that is the pound sign or hash (#) and a third that looks like a box with lines extending from two of its corners (almost like the hash with six of its 'arms' cut off). The first is the flat, causing the note to its right to go down one half-step. The second is the sharp, which does the opposite - the note it modifies goes up a half-step. Finally, the most strange looking one is the natural - it means the note is neither sharp nor flat, and assumes its normal pitch for that location in the staff. There are also double flats and double sharps, but I think that's getting a bit advanced for this post.

Quick Read Tips

Time Signature

Now, here is the quick-read tip for the time signature. In case you can't find the time signature, it is usually on the very left side of each staff, just to the right of the clef symbol. It is two small numbers, one on top of the other, both wedged inside the lines of the staff. Sometimes (if the time signature changes during the song) you may also see a time signature just before a double bar.

The top number tells you the length of each measure. If it says 4, there are four beats between each of the small vertical lines dividing the measures. If it says 8, there are eight beats instead. However, the beat is defined by the bottom number, which tells you what note is considered a single beat.

I still haven't gotten to the actual trick yet. I find that examples work best - let's say you want to understand how 3/4 time relates to 6/8 time. The two are actually very similar!

Think of it in terms of math (because that's what musical rhythm boils down to). Reduce fractions, and 6/8 actually equals 3/4. What does that tell you? The ratio between the measure length and the beat is the same! And what does that tell you? Well, for one, you can conduct 6/8 in much the same way as 3/4, waving your arms in a triangle-like shape. Second, you can tell that 6/8 is 3/4 times 2/2 which means everything is doubled. Where in 3/4 you only have three quarter-notes per measure, in 6/8 you have six eighth notes. Hey, that matches up - 3/4 = three quarter-notes, 6/8 = six eighth-notes. BAM! There's your trick. 2/4 equals two quarters (two quarter notes per measure). 9/4 equals nine quarter-notes.

Time signatures ARE fractions. Treating them like anything else only makes things confusing. Treat them like fractions, and everything will make sense from now on. But wait, that's math, not music! Oh, right - for nerds math is cool, so this actually works out better! Win.
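
Since time signatures really are fractions, Python's fractions module makes the comparison literal (with the caveat that reducing throws away the bottom number, which still tells you which note gets the beat):

from fractions import Fraction

six_eight = Fraction(6, 8)
three_four = Fraction(3, 4)

print(six_eight)                # 3/4 - it reduces automatically
print(six_eight == three_four)  # True: the measure-to-beat ratio is the same
print(Fraction(9, 4))           # 9/4 - nine quarter notes per measure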

Key Signature

Now, key signatures are a little more complicated, as you may know. Since there are twelve half-steps (distinct notes, including sharps and flats) in an octave, the combinations of which sharps and flats you have can become quite complex. One thing is on your side - except for accidentals, you will always have either all sharps or all flats. There is no key signature with both sharps and flats. The only "odd man out" is the key signature with neither sharps nor flats.

Another thing that adds to the complexity is that each key signature can be either major or minor, and it may not be obvious at a glance which one is being used. In addition to that, there are flat and sharp versions of each major and minor scale. This can all be tabulated and calculated quite easily by counting the number of sharp or flat symbols in the key signature, and then using the following chart:

0  2  4  -1  1  3  5

At first, these numbers probably seem like they could serve no useful purpose. The fact is, they already represent the seven natural keys! And the simplest part is that the keys begin with C and proceed in alphabetical order. In music, note names never go past G (after G you wrap back around to A), so starting from C you have:

C = 0
D = 2
E = 4
F = -1
G = 1
A = 3
B = 5

Now, what do the numbers mean? It's quite simple: positive numbers represent sharps, and negative numbers represent flats, while zero means no sharps or flats.

For example, if you see a key signature with two sharps, it's either D major, or the relative minor for that same key (which also has two sharps).

Determining the relative minor is just as easy! Add three. D is 2, so D major's relative minor is 2 plus 3 or 5, which is B. So the key signature with 2 sharps is either D major or B minor.

What about sharp and flat keys - G sharp (G#) major, A flat (Ab) major, and so forth? Amazingly, it gets easier yet! Simply subtract or add seven. Since flats are negative, you subtract seven to get the flat version of the key, and you add seven to get the sharp one. Ab major is 3 minus 7, which is -4, so that key signature has four flats.

You can also calculate minor keys directly by subtracting three. For instance, what would C minor be? Take the value for C (zero) and subtract three. You get -3, which indicates that C minor has three flats.

Time for another example: suppose you want to look at G# major. Here, an exception occurs. We end up with 1 + 7 or 8 sharps. However, the key of G# is never used; it is purely theoretical. If a composer were to want that exact key, he or she would use the enharmonic equivalent key of Ab major instead, rather than using a double sharp and six sharps to indicate the proper eight-sharp key of G#.

In some cases you may have to add or subtract twice. For instance, what about A# minor? First, you add seven to the value of A (3) to give you A# major, which is 10. Then, you subtract three for the minor, which gives you 7. Thus, A# minor has seven sharps.

One last example: if you see three flats, well this isn't on the chart, is it? But you know one possibility - add three and you end up with zero, which is C. So the relative minor is C minor, which could be the key. Another way to get -3 is subtract 7 from 4, which is how you calculate Eb major. The key is thus either Eb major or C minor.

Credit for this system goes to musictheory.net, though it doesn't mention the detail that you can add three to get the relative minor.
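
For the fellow nerds, here's the whole system boiled down to a few lines of Python, using the same convention as the chart above (positive numbers are sharps, negative numbers are flats):

# Number of sharps (positive) or flats (negative) in a key signature.
NATURALS = {"C": 0, "D": 2, "E": 4, "F": -1, "G": 1, "A": 3, "B": 5}

def key_signature(letter, accidental="", mode="major"):
    n = NATURALS[letter]
    if accidental == "#":
        n += 7    # sharp keys: add seven
    elif accidental == "b":
        n -= 7    # flat keys: subtract seven
    if mode == "minor":
        n -= 3    # a minor key has three fewer sharps than the major key on the same note
    return n

print(key_signature("D"))                  # 2  -> D major has two sharps
print(key_signature("B", mode="minor"))    # 2  -> B minor, the relative minor, has the same two
print(key_signature("A", "b"))             # -4 -> A flat major has four flats
print(key_signature("A", "#", "minor"))    # 7  -> A sharp minor has seven sharps
print(key_signature("C", mode="minor"))    # -3 -> C minor has three flats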

Staves and Clefs

Lastly, understanding where all the notes are, independent of the clefs and staves, will greatly aid your ability to read different clefs. This section focuses only on the treble and bass clefs.

I'll assume that if you do know how to read music, you probably grew up playing a musical instrument. Unless you picked the baritone, tuba, or trombone, you are probably already familiar with the treble clef, and completely unfamiliar with the bass clef. Either way, the following should help you read the other clef a bit easier.

First, note that middle C is exactly that - it sits in the middle - as far as the treble and bass clefs are concerned. Treble-pitched instrument players will know that middle C resides on the first ledger line below the treble staff. Bass-pitched instrument players will know that it resides on the first ledger line above the bass staff. I bet both types were surprised to learn this about the other clef! In fact, suppose you stacked the two staves together with consistent spacing, the treble staff flowing straight down into the bass staff. There would then be exactly one ledger line between the two staves, with a single space on either side of it. Notes on that line are, of course, middle C! How much simpler could it get?

One major helpful thing to note is that, for most notes, line notes in the treble clef become space notes in the bass clef, and vice versa. So the note G in the treble clef, a line note on the second-lowest line, becomes a space note in the bass clef, in the topmost space. Don't think about this too hard, as there are cases where it fails, but it's another handy thing to remember.

The major tip I can give you here is that the space notes (going from bottom to top) in the treble clef spell out the word FACE, while in the bass clef they form the acronym ACEG ("all cows eat grass"). If you can just remember these two mnemonics (FACE = treble, all cows eat grass = bass), then you can use the space notes as anchors and work out the other notes from there.

Tuning Systems

Concert Pitch

Concert pitch is the system in which most music is written today. Let's say I have a trumpet tuned to B flat and a cornet tuned to C; those tunings themselves are named in concert pitch. The primary goal of this system is to prevent the difficulties that would arise if the same written note needed a different fingering depending on what key your instrument is in. With concert pitch, as long as you know the fingering for each written note, you don't have to worry about what key you're in when reading the music (other than observing the key signature and any accidentals).

If I am playing the same piece of music written in concert C, it will sound different depending on whether I play it on the trumpet or cornet. If I play it on the trumpet, I am actually playing in concert B flat. I wouldn't hear the difference if playing by myself, but if I played along with someone else who was playing a C cornet, it would sound quite awful indeed. So why doesn't a concert full of instruments all pitched differently sound terrible?

Relative Pitch

The trick here is all in the writing. Solo pieces can be written in relative pitch, meaning your only limitation is to write within that instrument's range. What you hear may not always be the right key, but it will sound good by itself because you have no other parts behind it for a reference. It is often written out of key (transposed) to put more of the notes within the staff and make it easier to read, but without a tuning fork or a highly trained ear, you wouldn't know that it's out of key. With no other instruments playing (no reference), all the pitches would still sound correct, relative to each other. Being out of key only makes the overall pitch of the song higher or lower than normal.

Transposing to Concert Pitch

By contrast, for ensembles and concert pieces, each part is normally written transposed to suit its instrument, using the concert pitch system. Thus, B flat trumpet parts are written a whole step (two half-steps) above concert pitch, key signature and all. If the song is in concert C, for instance, the trumpet part is written in D. Now, by playing these transposed notes with the usual fingerings, a B flat tuned instrument sounds the song in concert C. Likewise, for an F-tuned instrument the notes are written a perfect fifth (seven half-steps) above concert pitch. Even if the key is concert B flat, a B flat trumpet part is still written a whole step up, in C. A notation such as "horn in F" indicates that the part is written for an instrument tuned to F. Incidentally, F is the standard tuning for horns, as B flat is for clarinets and trumpets, and so forth.
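
To make the direction of the transposition concrete, here is a little Python sketch (my own illustration; the written-part offsets are the usual ones for these instruments, and octaves are ignored):

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# How far above concert pitch each part is written, in half-steps.
WRITTEN_OFFSET = {
    "Bb trumpet": 2,   # written a whole step above concert pitch
    "F horn": 7,       # written a perfect fifth above concert pitch
    "C cornet": 0,     # non-transposing
}

def written_note(concert_note, instrument):
    """The note a player must read in order to sound the given concert pitch."""
    index = NOTE_NAMES.index(concert_note)
    return NOTE_NAMES[(index + WRITTEN_OFFSET[instrument]) % 12]

print(written_note("C", "Bb trumpet"))   # D
print(written_note("C", "F horn"))       # G
print(written_note("A#", "Bb trumpet"))  # C -- concert B flat, spelled A# here

Real parts also shift the key signature accordingly, as described above; the offsets are all this sketch cares about.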

Transposing Instruments


Some instruments, on the other hand, have such high or low ranges that their music always needs to be transposed simply to keep the notes from sitting far off the staff and requiring many ledger lines. Such instruments can be kept in the same key but transposed up or down by one or more octaves. Examples of instruments that must always be transposed due to their range are the piccolo and the contrabassoon.

Besides extreme ranges, there are other reasons why instruments might be written transposed from concert pitch, but all such instruments written this way are known as transposing instruments. For these, concert pitch is not used as the standard. Instruments that use concert pitch are known as non-transposing instruments.

Transposed Reading

There are certain skilled individuals who can read and play music written for an instrument in a different key than their own. This means they have to transpose on the fly, using a different fingering than the written note shows, in order to land on the correct concert pitch. If I want to play a part labeled "trumpet in B flat" on an instrument tuned to concert C, for instance, I have to do the transposition myself as I read, playing each note a whole step lower than what is written. It would be far easier, however, just to obtain a "trumpet in C" part for that song!

Tuesday, July 3, 2012

Will Program For Twenty Cents

The subject of this post is actually programming, and another word after it which makes the post title a geeky pun. Bonus points for those who can figure it out! (Hint: You won't find the word anywhere in this post. I took extra effort not to use it).

From the first time I looked at computer code, I was fascinated by it. Ironically, during elementary school, I didn't think I'd ever be smart enough to program computers. This may have been because I wasn't smart enough at that point - the only intellectual limit of childhood seems to be one's inability to think farther than a week ahead. Better understanding your own potential for growth is part of maturing into an adult, I have found.

When personal computers first became available, their use was not widespread. Rather, the internet was simply a network of the major computers that existed at that time, and primarily between research facilities. Programming at that time would have been a headache compared to what it is today. Since binary (or more generally, anything digital) is nearly the opposite of how the human brain works (everything is analog), computer instructions then were written in hexadecimal, a number system on the other end of the scale from binary. Binary uses two digits; hexadecimal uses sixteen. In comparison, the number system we write with uses ten digits, and is simply called decimal.

Even in hexadecimal, computer code is all just numbers, but in a form more easily usable by humans. Because each hexadecimal digit stands for exactly four binary digits, a very long number in binary becomes a much shorter number in hexadecimal, meaning a lot more code can be shown with very few digits. This was the first form of software programming; hardware programming previously used switches that had to be set by hand.
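
A quick illustration of the shrinkage, using Python's built-in number formatting (my own throwaway example):

value = 0b1101_1110_1010_1101   # sixteen binary digits, grouped four at a time
print(bin(value))               # 0b1101111010101101
print(hex(value))               # 0xdead -- the same number in just four hex digits

Every group of four binary digits maps to exactly one hexadecimal digit, which is the whole trick.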

Next came spaghetti code. The mental picture is fairly accurate - instructions just thrown in wherever they were deemed necessary. There was no real structure or organization at all. Each instruction was directly mapped to an address; the address was actually part of the programming code. The addresses had to be in order, but you could skip ones you didn't need, or decided not to use. You can see how it got its name with this kind of ad-hoc arrangement! However, the one improvement was that actual words could be used instead of codes. This introduced the need for another program, called a compiler, to come along after you write the code and turn the words into the hexadecimal and/or binary instructions that the computer can execute.

The first real structure came with the invention of - you guessed it - 'structured' code. The new idea here was to cut up the spaghetti into logical segments. Each segment, also known as a 'routine' or 'subroutine' was given a name, usually one that described what that portion of the instructions did. For instance, you might have a routine to display text on the screen, and another one to ask the user for input. In this way, instructions were organized by function, rather than being all thrown together in one big monolithic mess. In addition, this introduced the idea of parameters. In the case of a routine that displayed text on the screen, a parameter could be the text to be displayed. Whenever you invoke or call the routine (so that it performs the instructions contained therein) these parameters are given. This way, you need not reinvent the wheel and write the same code over and over each time you want to display text on the screen. You just call the routine that does it for you, and pass the text you want displayed as a parameter. This is also known as functional or procedural programming, because it is organized by function (routines are also known as functions or procedures).
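
In modern terms, a routine with a parameter might look like this (a minimal Python sketch of the idea, not any particular historical language):

def display_text(text):
    """A routine whose parameter is the text to put on the screen."""
    print(text)

def ask_user(prompt):
    """Another routine: ask the user a question and hand back the answer."""
    return input(prompt)

display_text("Hello!")                      # call the routine, passing the text
name = ask_user("What's your name? ")
display_text("Nice to meet you, " + name)   # reuse the routine instead of rewriting it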

This phase lasted quite a while until the next revolution: object-oriented programming. This provided not only further structure, but also several important, new concepts that would change the way programming was thought of, and what it was capable of doing. These powerful new tools created quite a stir and made computer code far more elegant and interesting. The three primary concepts are: encapsulation, inheritance, and polymorphism. All three fall under the umbrella term "abstraction" since they all give us new ways to represent abstract ideas (or objects) such as bank accounts, transactions, and other such things managed by computers, using computer code. This means the code is structured in a way that more accurately represents these objects, and therefore, more accurately handles and manages them.

Encapsulation is the idea of the black box. Think of a car engine, for instance. Many people haven't the foggiest notion of how a combustion engine works (perhaps a better example is how an airplane stays up in the air, since even fewer seem to understand that secret). However, that isn't a very big problem, unless of course, your car breaks down (or the airplane crashes). As you drive the car, it doesn't (and shouldn't) concern you what happens when you press the gas pedal, or the brakes. When you turn the steering wheel, you don't care how the engine responds and decides which way to turn the car. It doesn't matter to you, because it's a black box system. As long as it does its job, the black box can remain a mystery, and there is absolutely no problem with that.

We can do precisely this same thing with computer software. We can now write a portion of code that can interact in a well-defined way (known as an API) with other code. We can bundle up the code we wrote, sell it to someone else, and then they can write code on top of it that turns the steering wheel and pushes the pedals, so to speak. They don't care how our code works; it just works. When they turn the steering wheel, the car turns, and when they push the gas pedal, the car moves forward.

Encapsulation is accomplished in the software world by defining the scope of program elements. The scope tells us where in the program (and outside) we can see those elements. Functions, as mentioned earlier, are one such element. Stored data is the other primary element. We can define something as public (viewable by everyone, everywhere) or private (viewable only within the black box). This allows us to share the information we want, and protect the information that shouldn't be shared within the black box.
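
In Python, for example, it might look like this (a small sketch of my own; Python marks private members by convention with a leading underscore rather than a keyword, and the BankAccount class is just a made-up illustration):

class BankAccount:
    def __init__(self):
        self._balance = 0           # private: the inside of the black box

    def deposit(self, amount):      # public: the well-defined way in (the API)
        if amount > 0:
            self._balance += amount

    def balance(self):              # public: the well-defined way to peek inside
        return self._balance

account = BankAccount()
account.deposit(50)
print(account.balance())            # 50 -- we never touch _balance directly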

Inheritance is a lot simpler; you are already familiar with one form of it - genetics. In programming, inheritance works much the same way. We can write generic code that acts like an Animal - it has behaviors (defined by functions) such as speak, play, sleep, and so on. Then, we can write more specific code that acts like a Dog, but inherits the more generic aspects that it shares with all Animals. All animals can speak, but when a Dog speaks, the behavior can be defined specifically as "Bark." We could then write a Cat which inherits this same behavior (speaking) but again, when we invoke the Cat's 'speak' function, instead we receive a "Meow" in response.

Finally, polymorphism is the most complex of the three. It's quite a difficult concept to wrap your mind around, even if you're a programmer. However, the simplest way to explain it was already done in the last paragraph. It is closely related to inheritance. When a Cat speaks and we hear a "Meow" then a Dog speaks and we hear a "Bark," this is an example of polymorphism. In either case, we are simply invoking the inherited "speak" function - but the behavior is different depending on the subclass (Cat or Dog). This is polymorphism - the ability to define a specific response to a generic behavior.
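
Here is the Dog and Cat example as a short Python sketch (mine, just to make the last two paragraphs concrete):

class Animal:
    def speak(self):
        return "..."               # generic behavior shared by all Animals

class Dog(Animal):
    def speak(self):
        return "Bark"              # the Dog's specific response

class Cat(Animal):
    def speak(self):
        return "Meow"              # the Cat's specific response

for pet in (Dog(), Cat()):
    print(pet.speak())             # same call, different result: Bark, then Meow

The loop never asks which animal it has; invoking the inherited speak behavior and getting a different answer for each subclass is polymorphism in action.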

In essence, these abstractions give us two dimensions in which to program. With structured design, a function always does the same thing every time you call it. With object-oriented design, polymorphism gives us a different response based on both the function/behavior and the object/idea. Invoking the same function on a different object normally produces different results.

Now, prepare your mind for some extreme warping - we are now in the age of subject-oriented programming, where we can wield three such dimensions. The result or response we get from a function can now be defined by the name of the function, the object on which it is invoked, and the subject related to the invocation. For instance, my Dog might have a different bark depending on whether he was happy to see me, or whether he was trying to fend off an intruder. This constitutes the subject, or aspect, in which the behavior is invoked.
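
Python has no built-in notion of a "subject," so the following is only a rough sketch of the idea (my own analogy, not how a real subject-oriented system is implemented), but it shows the third dimension - function, object, and context all shaping the result:

class Dog:
    def speak(self, subject):
        # The response depends on the behavior (speak), the object (this Dog),
        # and the subject or context in which it is invoked.
        if subject == "greeting":
            return "Happy bark!"
        if subject == "intruder":
            return "Fierce bark!"
        return "Bark"

rex = Dog()
print(rex.speak("greeting"))   # Happy bark!
print(rex.speak("intruder"))   # Fierce bark!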

Aspect-oriented programming is very similar to subject-oriented, but to spare your mind further warpage, I won't go into any detail on the differences between the two. Instead, I will just say that programming has come a long way from using hexadecimal codes and command line interfaces. We now have the power to determine the software's behavior based on the desired function, the object performing the action, and the context in which the interaction occurs. This gives incredible potential even just for artificial intelligence. Computer code that can update and edit itself is now just around the corner.

And yet, DNA has been doing exactly that for thousands of years. Is that something that occurred by random chance? I think it's about as likely as computers assembling and programming themselves out of thin air. It takes a mind more intelligent than a computer to design and build a computer. By the same token, it takes a mind more intelligent than a human mind to create such an incomprehensibly vast universe, and to populate it with beings whose bodies themselves contain technology far more advanced than computers; least of all, the human mind itself.

Tuesday, June 26, 2012

Piracy Is Wrong, Period

The following was actually meant for a previous post (the one on being regarded as a computer genius) – however, I rambled so much in my initial write-up that I simply couldn't leave this all in the same post. It had become far too big, and a post all its own. As you may know, that other post was already long enough! This one is about media sharing and piracy. Touchy topic, I know. There are also a few smaller bits about CD visors and generosity.

To get things rolling, I would like to point out that music, movies, and, in general, things you think shouldn't be free, are never actually free. Sure, you can get them for free online - illegally. If your local computer genius promises you a free copy of Photoshop, for instance, be very skeptical and ensure that they are not doing something that is against the law. I don't care what edition it is; enterprise, professional, teacher, student, or hobo - if it's free, there is a 90% chance of something illegal going on in obtaining it. I'm a computer genius - I should know, right?

The worst part that only adds insult to injury is that most of the sites claiming to offer such “free” goods are the worst sites to visit, if purely from a safety standpoint. They are often bloated with malware, and links to gambling and porn sites galore. It’s simply a nightmare, and it’s not made any better by the fact that the people who, often unknowingly, access these sites have no virus protection on their computer. They end up getting something for nothing, certainly - a free computer meltdown.

Just because the RIAA, or the standards bureau for whatever industry is involved, can't prosecute 99% of the cases of media piracy (or unauthorized reproduction and distribution, whatever you want to call it) that occur online is no reason to condone it. I certainly don't, and you shouldn't, either. Don't claim that the distribution system is flawed - come up with one that isn't. Don't claim ignorance - I've now rid you of that excuse. Just pay up.

Getting something for nothing is a ridiculous idea that is taught by a society addicted to gambling, massive debt (or rather, living beyond one's means) and other such vices. Don't be taken in by the lie. Anything worth having is worth earning or buying. Getting it for free may feel great now, but it destroys your character. Free things are temporary and will only decay and wear out with use. Character is powerful and eternal. It's easy to see which matters most.

Burning CDs and DVDs is illegal too. That's not necessarily piracy - piracy in the strictest sense is copying and then selling something you don't have explicit permission to copy (much less sell). However, unauthorized copying and distribution is just as illegal as piracy. That, and the fact that "piracy" is a lot less of a mouthful than "unauthorized reproduction and distribution," leads to the popular belief that the two are the same thing. It doesn't really matter, though - one is just as illegal as the other.

Now, with CDs I admit there is certainly a valid reason why so much of this copying goes on, perhaps more so than in other markets. Notice, I didn't say there is a valid reason why it is legitimate. It isn't legitimate in any sense. However, many people seem to think it's okay because it helps generate interest, and that any "woe" over lost sales is ludicrous compared to the buzz factor gained. I won't get into that debate; but again, I must draw the line clear and simple, black and white: it's illegal. Does that word even mean anything these days? If it means something to you, or if you have any desire to consider yourself a decent citizen, don't do it.

Technically, copying (a.k.a. 'ripping') any copyrighted CD to your computer (yes, even one you paid for and own) is an unauthorized copy; so is burning a new CD so you can have one to play in your car. I feel this is going a bit far, myself. If I paid money and legitimately own my own copy of the media, have no intention to make it available to anyone else, and have the means and the know-how to prevent that from ever happening*, I don't see what the big deal is. It's the same music, I own it, and it's up to me how I decide to use it, as long as I'm the only one doing so.

*All you really need to do to prevent this media from being taken without your consent is to always lock your car - and if you leave CDs in your car, they are likely to get fried anyway. I recently bought a CD visor for my car, and it has a warning label that says not to use it in any "closed cars" - I'm not joking! Am I mistaken, or would that be referring to all non-convertibles? And if a convertible is not closed - that is, if the top is down - where would one put the visor?

However, though I do think that prohibiting personal CD burning is going a bit far, it's still the law. I would hardly be justified in this anti-piracy rant if I myself were guilty of it. In the past, I certainly was, no denying that – but it's something I have been working hard to set right. My current copy of Photoshop is legitimate, for instance – something I couldn't have claimed even two years ago. Do I think Photoshop is worth $700? Absolutely not! I settled for Photoshop CS2 on eBay and saved five hundred bucks. Oddly enough, I like my legitimate version better – and not just because my pirated copy was the much-older Photoshop 7 (though mostly for that reason).

The same can be said about the Producer’s Edition of FL Studio (Fruity Loops, a music producer’s flagship to you non-nerds). Unlike Photoshop, I believe it was worth every penny, and that’s why I paid for it. I have no interest in obtaining such a great piece of software, which no doubt cost many other software developers like myself untold hours of labor and effort, for free. That labor and effort is worth something to me – namely, precisely the amount I paid for it.

That’s the very idea of a market. You render services and obtain goods, or you render goods and obtain goods, or some other such interaction, where the value of the items exchanged is estimated to be the same by both parties. You never render nothing and get something; the very idea is absurd. How many things on which you would place a high personal value have you given away for free in the last year? Case in point.

Giving for the sake of giving is the ONLY way anyone ever gets anything for free, and by virtue of the idea, YOU are never on the receiving end. People give to the less fortunate because they're just that - needy. Maybe they ask for help, maybe they don't. If you have a roof over your head, running water, air conditioning, and even mediocre health, you are more blessed than probably 90% of the world's population.

It’s the problem of give versus get. Getting something for free is irrational, while giving something for free is quite rational, and the mark of any stable society - a virtue which is quite rare these days known as generosity. The modern idea of generosity seems limited to a certain time of year, and is far less affordable that way.

Piracy is not wrong just because making an unauthorized copy of a CD or DVD financially or even mentally hurts someone else, even indirectly. I ran across an article recently that seems to indicate the very opposite - that piracy actually helps music sales, of all things. No, piracy is wrong because the law says it’s wrong. Piracy is wrong because the United States has a moral code that everyone is subject to uphold - namely, the Constitution.

Constitutional law used to be absolute, meaning there was no reasoning around it or changing it on a whim, or because “times have changed.” These days the term 'obsolete' has replaced 'absolute.' But back when the U.S. was founded, the law was the law, like it or not. If you don’t live by it, don’t be surprised when no one else does, either. Know what that’s called? Anarchy. Set an example and start developing some character. Rid your life of the plague of piracy and the lie that you can get something for nothing. Start giving something for nothing and see where that gets you – see where that gets our society.

Tuesday, June 19, 2012

Entertaining Ideas

No, this post is not about ideas that one might find entertaining. Some of these ideas may be entertaining (by accident, mind you!), but that's not the point.

The point is this:

"It is the mark of an educated mind to entertain an idea without accepting it."
-Aristotle

I simply love this quote. People would get along much better on the whole if they were, by Aristotle's standard, educated - or at least displayed this mark of education by acting accordingly.

Why? It seems to me that the primary reason most arguments begin (other reasons notwithstanding) is that neither person can entertain (is willing to consider) the idea that the other person is trying to get across, or the idea that the other person might be correct. Another idea many who commonly get into arguments cannot seem to entertain is the idea that more than one person can be right. I admit personally that these are two very difficult ideas to entertain! You have to dance a jig and tell a joke, and even then it's a tough audience.

I have noticed (in hindsight) that in roughly 80% of the arguments I have been in, both participants were correct, and neither was willing to admit this possibility, resulting in only frustration and anger, and prolonging the argument.

This is an important concept to wrap your mind around (to entertain, if you will) and one that I believe even goes beyond the Aristotle quote above. Because it was stated so concisely and eloquently, I am including a portion here from an article in 2600 magazine (The Hacker Quarterly) that explains this very subject, in terms of the stigma and confusion surrounding the hacker group Anonymous:

"Because we have a culture where there are good guys and bad guys, we demand that those labels be used, and that people be lumped into either one or the other, preferably those who agree with us and those who don't. The problem is that when we do that without understanding why it doesn't actually work that way, we unfairly prosecute people who were doing the "right" thing, and wind up having to deal with people who have been mislabelled. [...] [Y]ou can't really destroy an idea unless you consider it. The problem is, once you open your mind and consider it, you may no longer disagree with it.

And that is the bottom line which creates and perpetuates both the fear and the paranoia [about Anonymous]: a sense that we might just be wrong. When you only ascribe to the "good" things with which you agree, you leave no place for learning from your mistakes. Thus, when we discover we have made mistakes, rather than being honest, meeting sympathetic eyes, and moving on, we must run and hide, begging forgiveness, or morph the mistakes into shell statements of what they actually were, devoid of any meaning, and shedding any potential lesson we could have learned. With this pattern, we learn to brush things we don't understand under the table, hoping they will go away and leave us alone." [1]

The drama about Anonymous aside, the author's point is a great one. When fully understood, it is quite profound, and opens your mind to a new way of thinking - a way of entertaining ideas, as Aristotle put it, without accepting them. If you can truly do this, you have not only proven yourself to have the mark of an educated mind - you've made a big step down the never-ending but enlightening road of education outside the classroom.

[1] aestetix, "Who Is Anonymous?" 2600 Magazine. Spring 2012.

Tuesday, June 12, 2012

On Being A "Computer Genius"

Whenever I meet anyone, the moment that person becomes aware that I might know a little something about computers, they come to regard me instantly and irrevocably as a "computer genius." I've always been bemused by this, and perhaps a little bit frustrated. I would therefore like to elaborate (and quite elaborately) on this subject, particularly for those who might think they know such a "computer genius" - and show you what is wrong with this concept. It's not that complicated, I promise. In fact, this diagram pretty well explains it, for the most part. (Slight warning: there is quite a lot of vulgar material on XKCD, so I can't recommend browsing into it very far. Do so at your own risk. There is none on the particular comic found at the link above, however.)

Personal Disclaimer

Before I get into the meat of this post, I will say that this is not written with any bitterness or anger towards people that regard me as a computer genius. I will happily devote my time to helping you solve your computer troubles; this goes for anyone. If you know me, you know this is true. I have no reservations about making technology work better for people, and any way I can serve others is a great benefit to me personally on many different levels. Again, this post is written only to educate and inform, not to spread discontent in any way. I would like nothing better than for you to continue regarding me as a computer genius (even after having read this post) if you are so inclined. It won't hurt my feelings one bit - though it might inflate my ego a tad, something I could definitely do without. I'll leave your reaction up to you. Just don't come away from this post more hesitant than before to ask me for computer help - that's not my purpose in writing this at all. My computer knowledge is almost entirely useless if I can't use it for the good of others.

Overview

First I will discuss the most common misconceptions about "computer geniuses," then I will move on to refute the false idea of computer illiteracy. Third, I'll explain why a true computer genius does not exist, and finally, I will show that even if computer geniuses did exist, neither I, nor most nerds, would qualify as one. Believe it or not, I'll do this all without using any technical terms or saying anything that might go over your head. Yes, I admit, that will take some effort (not using any technical terms). Well, okay, there might be one or two scattered around, but you won't need to understand them to get what I'm saying.

You may have noticed that this post is quite large. It is therefore broken into sub-sections to make both reading and browsing somewhat easier.

What Is A "Computer Genius"?

A computer genius, as defined by just about everyone who regards me as one, is a person who can call up any bit of knowledge whatsoever about a computer at any given moment. They can fix any computer problem, provide advice for any situation that even remotely involves a computer, and are always the best candidate to look at your computer and make it go faster and work better, in every way possible. They know all the latest facts about any and every computer model, operating system, hardware, and software, and can provide you near-free* access to any of it given a moment's notice, and a flash drive.

*Here I would like to note (I may write a future post just for this topic) that music, movies, and, in general, things you think shouldn't be free, are never actually free. Sure, you can get them for free online - illegally. If your local computer genius promises you a free copy of Photoshop, for instance, be very skeptical and ensure that they are not doing something that is against the law. I don't care what edition it is; enterprise, professional, teacher, student, or hobo - if it's free there is a 90% chance of something illegal going on in obtaining it. I'm a computer genius - I should know, right? (Though if there were any 'hobo editions' I imagine they'd be free!)

I admit that to someone who considers him or herself computer illiterate, us nerds may actually seem to possess all this knowledge and such in regards to computers, and that's totally understandable. In fact, that's the myth this post is here to dispel. Not that you should stop regarding me/us as computer geniuses, but I'll be the first to say we certainly don't deserve to be put on a pedestal. In fact, most of the time when I have helped someone do just about anything on a computer, I don't really feel I've done anything extraordinary at all. Most of the computer things I end up helping others with are things I routinely do without the slightest notion that someone else might not know how to do it.

Guess What? You're Not Computer Illiterate

By the way, a 'computer illiterate person' is a figment of your imagination. I'm sorry, but that's just a lame excuse not to learn anything more about computers, or put forth any effort in doing so. As stated in my disclaimer above, I'm totally fine with bearing all your technological burdens. That being said, is it too much to ask that you put forth at least as much effort in learning about computers as you do with anything else?

Computer technology is really not all that difficult to learn, particularly in this age where you don't have to know the commands to run programs, for instance. A program is now just a little icon that you click on. It's been made as simple as possible so anyone can use it - the least you can do is have some self esteem and stop calling yourself illiterate. If you can read, you can use a computer. Reading takes far more mental acuity than using a computer - though I admit that I learned to use a computer at an age remarkably close to the age at which I learned to read. If you can't say the same, I'll give you a little bit of philosophical slack there.

If you're over 50, you get a lot more slack due to growing up before the age of what we know as modern computer technology. This is because the human mind generally loses its ability to quickly and readily absorb new information at about age 26. This puts the pioneering age of personal computers, or at least commonplace usage of them, well past your years of optimal computer-learnage, to turn a phrase.

Still, the "you can learn anything" principle always applies. Ask anyone who you would consider an expert at anything - besides computers. Think of the one person you know who is best at a given skill or talent. Collectively, these people would tell you that when they began doing it, they were terrible at it and thought they would never get any better. Had the subject in question been computers, they might have even labelled themselves as computer illiterate. The key is that they didn't let that stop them for a second.

How To Approach "Computer Geniuses"

Calling yourself computer illiterate in front of a computer genius means basically nothing, for starters. Nearly everyone that person meets is by comparison computer illiterate, except perhaps people working in the computer industry. But think a little more deeply about what you are really trying to tell the person - wouldn't "I need help, can you teach me?" come across as a slightly more inviting attitude? When you've basically said "you'll have to do everything for me, and I won't even try to pay attention," an awkward smile just won't cut it (though smiles are generally never frowned upon, at least not in my book).

If you can't be sincere, at least act like you want to learn. I can't imagine that anyone would enjoy having to ask how to do every single tiny thing on a computer, particularly asking more than once about the same exact thing. So do you, on the flip side, imagine we like teaching these same basic things over and over, to people who refuse to put forth any effort in trying to learn?

Don't just make the motions - follow through. Bring a pencil and paper so you can write down instructions, if necessary. Don't force your local computer genius to write it down for you. Not only is their writing seldom self-explanatory (or even legible), but it would be like asking a scientist to help you do advanced science by writing down the instructions using basic formulas, ones they work with every day. Most scientists, if you were to do that, would simply hand you a copy of their calculus textbook. With computers, there is no such manual, or at least one with universal formulas like mathematics has.

Anyone who has been through high school should be competent enough at math to at least understand an equation, if not solve one. Computers, on the other hand, are all about how to use the mouse, and how to interact with the buttons and other elements displayed on the screen. To a lesser extent, it also helps to have a basic understanding of files and how data is organized and handled by the computer. Nearly every computer-usage instruction would begin with "move the mouse" and include several "click once here" or "click twice there" phrases. To be blunt, using a computer is far easier than math. Now programming computers, on the other hand...

As an aside, did you know that the first mice (for mainframe computers) were so large that you had to sit down on them and drive around, like a golf cart with a tail?

Just kidding! But the very idea sure is amusing!

There Are No Computer Geniuses

I'm going to assume that by now, you've realized you might actually be able to learn how to use a computer, and begun to move out of your self-imposed "computer illiterate" shadow. Great work so far! You're roughly halfway to learning something about computers. The next step is to realize that, no matter how much computer knowledge is possessed by any person you know, there will always be something that person simply can't help you with, no matter how much they would like to. If that person happens to be me, I will happily utilize my well-developed Googling skills* and at least help you try to find an answer somewhere, even if I don't have it.

*Yes, Googling is actually a skill. It seems to me that if there's something you're interested in finding, it is most likely "out there somewhere" in the world wide web. If it really is out there, chances are that I can find it. I figure it's a skill because 99% of the time, I found it, and the person who asked me to find it did a considerable amount of searching for it themselves. Here's a hint: the key words are the key! Even changing a single word in your search can bring up drastically different search results. For instance, "iPad keyboard case" versus "iPad keyboard cover."

The point is, a true computer genius simply does not exist, because it's a stereotype. Computers are a vast collection of highly advanced digital circuitry that no single person in the entire world knows everything about. Just understanding how they work, from beginning to end, is a monumental goal in and of itself, which very few people, even in the computer world, have ever achieved. Computers (to be more precise, personal computers) represent decades and even centuries of continuous research and development. Would you really expect a single person to know, for instance, everything there is to know about any other field? Artistry? Music? Food? Architecture? Then how is it reasonable to impose this same stereotype on someone who, at best, actually happens to be the world's leading authority on some tiny portion of computer technology, and at worst, knows a bit more about computers than you do? Odds are it's the latter, and in most cases, you've barely even met this person!

The answer is, it's no more reasonable than any other stereotype - quite unreasonable, really. Expecting their help simply because they are the "local computer genius" - and because you haven't invested any time or energy into your own knowledge of computers - is even more unreasonable. When you want to know something about flowers, do you drive down to the local flower shop and bombard the cashier with questions you already know she can't answer, calling yourself "flower illiterate" - or do you head to Wal-Mart, buy a gardening magazine and some seeds, then head home and grab your hoe and shovel? Why should your approach be any different with computers?

I can't blame you if some hardware breaks and you want someone else to install it. Hardware is a somewhat different story, since that is far more specialized knowledge. However, most non-hardware issues can be solved by moving the mouse to the right spot and clicking. This involves no digging of holes or planting of flowers, and certainly nothing anywhere near as advanced as basic math or even reading and grammar - so why harbor the ridiculous notion that "I will never be able to understand or use computers"? In 90% of computer problems, simply Googling your problem and then following the first set of instructions you find will get you halfway to solving the problem, and by then it wouldn't be rocket science to figure out the rest.

What We 'Computer Geniuses' DO Know

Not to sound like I'm working backwards, but now that I've covered all that - many computer geniuses may actually know quite a bit about computers! Some more than others. This shouldn't surprise you, but in light of everything I've said, it's still something to consider. It becomes pertinent, then, to understand more about the different categories of computer knowledge (again, as with any other field) so that you can begin to pinpoint the areas you need expertise in, and therefore, target those individuals who will actually be able to help you. Why do I say this? Odds are if you consider someone a computer genius, he or she is quite likely to try and help you regardless of your request, even despite a complete lack of knowledge in that area. It's true! I do it all the time.

Is it just to keep up the 'computer genius' image? Is it out of sheer helpfulness? I can't say for sure, but it would definitely make things less confusing if more people read this post! Even just guessing the area your problem falls under might save your computer genius the effort of figuring that part out. Even if you guess wrong, you will have impressed them just by trying to apply yourself. For instance, instead of saying "I'm computer illiterate," say something like "I think I'm having a network problem, can you help me?" By contrast, when I hear someone tell me "I'm computer illiterate" by implication they are expecting me to know everything and fix everything with little to no involvement from them.

Now, I certainly wouldn't offer to help anyone else if I didn't think I knew something about computers. In fact, I've studied computers quite a bit just for the sake of learning about them - what I know about computers is not all 'intuitive' and it did not all enter my head the first time I laid hands on a keyboard and mouse. It doesn't always come easily, either. Still, one might wonder, why don't I consider myself a computer genius?

Let me answer this question in a somewhat odd fashion. I will attempt to categorize all computer knowledge. This should open your mind somewhat to just how vast is the world of personal computers. Then I will estimate my level of knowledge for each category.

In my experience, here are the basic areas of computer expertise, with a brief description of what each one covers:

Software: the available programs, or lists of instructions (computer code), that tell computers what to do
Hardware: the physical components that make up a computer
Networking: the wires, cables, airwaves, satellite transmissions, and other such things that computers use to communicate with each other, along with the hardware, software, and communication protocols involved
Digital Electronics: the super-tiny components that make up the hardware
Programming: the art of writing, testing, and distributing software (computer instructions) that performs meaningful tasks
Web Design: the art of creating websites
Server Architecture: the computers that run websites and the internet
Security: keeping data safe as it travels over the internet, and keeping computers from being infected with viruses (malicious programs)

Here's the kicker - this is so high-level we have barely scratched the surface. Each of these areas is in itself an entire field, about which no one person could possibly know everything. Are you starting to get the picture?

Now, I will be using a percentage scale: 100% representing all available and possible knowledge about any one particular category, and 0% representing absolutely no knowledge in that area. Here is a breakdown of my own estimation of my computer knowledge and skill:

Software: 36%
Hardware: 3%
Networking: 4%
Digital Electronics: 7%
Programming: 22% (I am a software developer, mind you!)
Web Design: 11%
Server Architecture: 1%
Security: 18%

I think that should pretty well answer the question. My extreme lack of computer knowledge speaks for itself! I would define a true 'computer genius' as someone who has even 25% to 50% in each of these categories.

Don't assume by these low numbers that I am trying to be modest. This is the most accurate data that I could come up with. Think about this for a moment: of all the programming knowledge, even with me being a programmer - I believe I know less than a fourth of all knowledge about programming. That's in just one category, and my second highest!

What should really help drive the point home is when you consider the average computer problem. Here is another breakdown estimating what percentage of all computer problems (adding to 100%) occur in each category:

Software: 22% (not bad programming, just installation issues)
Hardware: 24%
Networking: 14%
Digital Electronics: 1%
Programming: 31% (this is basically the cause of most software problems)
Web Design: 5%
Server Architecture: 1%
Security: 2%

The average computer problem can be solved with a slightly-above-average knowledge of software, hardware, and programming - exactly what any nerd often pegged as a computer genius is likely to have - and that explains how we appear to have "all knowledge." Networking accounts for most of the rest; however, it ends up being the most common unsolved category, since very few nerds actually know enough about networking to solve most such problems. The problems that find their way to computer geniuses from outside typically range from simple to medium in difficulty, but the networking category is a bit different. Because networking sits at a much lower level of the overall computer architecture, the configuration (and often the mathematics) involved is usually too difficult for your average computer genius, myself included. In my case, I have less of an excuse for this than you do for computer knowledge in general - I graduated from college with a double major in computer science and math. For me, the gap is simply a lack of detailed study (or more precisely, interest) in the networking field.

'Computer Geniuses' Have Problems With Computers Too

Finally I would like to briefly refute the myth that computer geniuses never have computer problems of their own (since, obviously, they already know how to solve them all.) This is not only quite untrue, but the opposite is actually the case - computer geniuses, or those regarded as such, typically have far more problems, and more difficult ones at that, than people that consider themselves 'computer illiterate.' Why does this occur? It isn't just because computer geniuses can usually solve the easier problems on their own - it's primarily because they do more advanced things on a regular basis and change settings a lot more frequently. As such, they are far more likely to break something in these advanced settings.

As an example, one of my first experiences with a computer was in the Windows Registry. I won't go into any detail here about what that is - just know that it's an important part of Windows, the primary collection of software (called the OS or operating system) that gives you all those little icons, buttons, windows, scroll bars, and other nice things to click on. Well, I was messing around in there without understanding what this Registry was for or how it worked. I began changing some things and found that now my programs would not run. I really thought I'd broken that laptop for good! However, I played around with it some more and was luckily able to get it back to normal. I admit it did require some deductive skill and intuition about what was going on with this Registry program, but it still taught me what it does and why it is there. This is a problem 70% of computer users would never have, because 70% of computer users don't have the foggiest idea that the Windows Registry even exists, much less any desire to open it or start changing things.

Whew! This will likely be my longest post for quite a while - maybe the entire blog. I guess I had a lot to say about this. It's certainly plenty of food for thought. I'll leave you with one major tip about solving nearly ANY computer problem. If all else fails...throw it out the window! Forget versions, OS levels, and actually having to know anything about computers to begin with. This solution works! And it gives you a great excuse to go buy a new computer. Although, to avoid any legal controversies and/or murder charges, you may want to ensure there is no one walking around on the sidewalk below said window before you go and chunk a computer out of it.